
databricks-lakebase-autoscale skill current limitations - Stateful AI agents (LangChain memory) #522

@jericksonclinicaloptions

Description

I have been looking at the AI Dev Kit, and something stood out to me in the databricks-lakebase-autoscale skill: Current Limitations - Stateful AI agents (LangChain memory)

ai-dev-kit/databricks-skills/databricks-lakebase-autoscale/SKILL.md at main · databricks-solutions/…

The limitation is pretty vague, and I believe we have gotten LangChain memory to work with Lakebase Autoscaling; also, this seems to be one of the primary use cases for the product. I did more research in the Databricks documentation, and here is the response from Claude:

_"The skill's claim is overstated. Here's what the official docs actually say:

The real, specific limitation (from the official agent-langgraph-advanced template README):

"If your Lakebase instance is autoscaling (not provisioned), the postgres resource is not yet supported as a resource dependency in databricks.yml."
"Note that databricks bundle deploy will overwrite app resources, so you must re-add the postgres resource after each bundle deploy."

LangGraph checkpointers work fine at runtime against Autoscaling Lakebase. The gap is narrower: Autoscaling can't be declared as a postgres resource in databricks.yml, so bundle deploy wipes the resource binding and you have to re-attach it manually after every deploy."_
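For context, here is a minimal sketch of the runtime path that appears to work, using LangGraph's `PostgresSaver` checkpointer pointed at a Lakebase Postgres endpoint. The host, database name, and credential handling below are placeholders, not values from the skill or the template:

```python
# Hypothetical sketch: LangGraph checkpointing against a Lakebase Postgres endpoint.
# Host, database, and token below are placeholders.
from langgraph.checkpoint.postgres import PostgresSaver
from langgraph.graph import StateGraph, MessagesState, START, END

def respond(state: MessagesState):
    # Placeholder node; a real agent would call an LLM here.
    return {"messages": [("ai", "ack")]}

builder = StateGraph(MessagesState)
builder.add_node("respond", respond)
builder.add_edge(START, "respond")
builder.add_edge("respond", END)

# Standard Postgres DSN; for Lakebase the password is typically a short-lived
# OAuth token fetched at runtime (placeholder below).
DB_URI = "postgresql://CLIENT_ID:OAUTH_TOKEN@<lakebase-host>:5432/databricks_postgres?sslmode=require"

with PostgresSaver.from_conn_string(DB_URI) as checkpointer:
    checkpointer.setup()  # creates checkpoint tables on first run
    graph = builder.compile(checkpointer=checkpointer)
    graph.invoke(
        {"messages": [("user", "hello")]},
        config={"configurable": {"thread_id": "demo-thread"}},
    )
```

If this matches what others are seeing, the limitation note could be narrowed to the databricks.yml / bundle deploy re-binding issue rather than stateful agents in general.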

Does the current limitation need clarification?

Thanks.
