I have been looking at the AI Dev Kit, and something stood out to me in the databricks-lakebase-autoscale skill: Current Limitations - Stateful AI agents (LangChain memory)
ai-dev-kit/databricks-skills/databricks-lakebase-autoscale/SKILL.md at main · databricks-solutions/…
The limitation is pretty vague, and I believe we have gotten LangChain memory to work with Lakebase Autoscaling; also, this seems to be one of the primary use cases for the product. I did more research in the Databricks documentation; here is the response from Claude:
_"The skill's claim is overstated. Here's what the official docs actually say:
The real, specific limitation (from the official agent-langgraph-advanced template README):
"If your Lakebase instance is autoscaling (not provisioned), the postgres resource is not yet supported as a resource dependency in databricks.yml."
"Note that databricks bundle deploy will overwrite app resources, so you must re-add the postgres resource after each bundle deploy."
LangGraph checkpointers work fine at runtime against Autoscaling Lakebase. The gap is narrower: Autoscaling can't be declared as a postgres resource in databricks.yml, so bundle deploy wipes the resource binding and you have to re-attach it manually after every deploy."_
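For what it's worth, the runtime pattern we have working looks roughly like the sketch below. It builds a Postgres connection string for a Lakebase instance (Lakebase is Postgres-compatible and accepts an OAuth token as the password) and hands it to a LangGraph Postgres checkpointer. The `lakebase_dsn` helper and its parameter names are illustrative, not an official API:

```python
import os


def lakebase_dsn(host: str, database: str, user: str, token: str) -> str:
    """Build a Postgres DSN for a Lakebase instance.

    Assumptions (hypothetical, not from official docs): standard
    Postgres port 5432, TLS required, and the OAuth token passed
    as the Postgres password.
    """
    return f"postgresql://{user}:{token}@{host}:5432/{database}?sslmode=require"


# At runtime the LangGraph checkpointer can then be attached directly,
# independent of whether the instance is provisioned or autoscaling:
#
#   from langgraph.checkpoint.postgres import PostgresSaver
#
#   dsn = lakebase_dsn(
#       host=os.environ["LAKEBASE_HOST"],        # hypothetical env vars
#       database=os.environ["LAKEBASE_DATABASE"],
#       user=os.environ["LAKEBASE_USER"],
#       token=os.environ["LAKEBASE_TOKEN"],
#   )
#   with PostgresSaver.from_conn_string(dsn) as checkpointer:
#       checkpointer.setup()  # create checkpoint tables if missing
#       graph = builder.compile(checkpointer=checkpointer)
```

The deploy-time gap described above is orthogonal to this code path: the DSN works at runtime either way, but for autoscaling instances the binding cannot yet be declared in databricks.yml, so it must be re-attached after each bundle deploy.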
Does the current limitation need clarification?
Thanks.