
Add databricks-genie-bedrock-agentcore skill#520

Open
antonyprasad-db wants to merge 1 commit into databricks-solutions:main from antonyprasad-db:add-databricks-genie-bedrock-agentcore-skill

Conversation

@antonyprasad-db
Contributor

Summary

Adds a new skill databricks-genie-bedrock-agentcore covering the integration between Databricks Genie and Amazon Bedrock agents through AgentCore Gateway. Exposes Genie spaces as a governed MCP tool to Bedrock agents — no data movement into Knowledge Bases, no parallel metric definitions, Unity Catalog governance preserved end-to-end.
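As context for what "exposed as a governed MCP tool" means in practice, here is an illustrative sketch (not taken from the skill) of an MCP-style tool definition wrapping a Genie space. The tool name, description, and schema fields are hypothetical placeholders, not the names the gateway actually registers.

```python
def genie_mcp_tool_descriptor(space_id: str) -> dict:
    """Build an MCP-style tool definition wrapping a Genie space.

    All field values here are illustrative; the actual descriptor is
    produced by AgentCore Gateway when the target is registered.
    """
    return {
        "name": f"genie_space_{space_id}",
        "description": (
            "Ask governed analytics questions against a "
            "Databricks Genie space."
        ),
        "inputSchema": {
            "type": "object",
            "properties": {
                "question": {
                    "type": "string",
                    "description": "Natural-language analytics question",
                }
            },
            "required": ["question"],
        },
    }


tool = genie_mcp_tool_descriptor("01ef-example-space")
print(tool["name"])  # → genie_space_01ef-example-space
```

Because the data never leaves Databricks, the agent invokes this tool through the gateway and Genie answers against Unity Catalog-governed tables directly.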

What's in the skill

Following the TEMPLATE/ structure with three reference files:

  • SKILL.md — overview, two auth modes (OBO and M2M) with disclosure guidance, two IaC paths (Terraform and CloudFormation), quick-start recipe, common patterns, troubleshooting matrix
  • 1-architecture-and-auth.md — end-to-end identity flow, OAuth credential provider configuration, governance validation steps, identity-flow gotchas
  • 2-deployment-tf-vs-cfn.md — deployment decision matrix, Terraform path (uses awscc provider), CloudFormation path with the pre-blessed exec-role pattern + the full IAM policy needed
  • 3-quickstart-and-scripts.md — end-to-end deployment recipe, helper-script reference, expected outputs
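To ground the reference files above, a minimal sketch of the Databricks Genie Conversation API call that the gateway target ultimately fronts. The host, space ID, and token are placeholders, and the endpoint path should be verified against the current Databricks REST API docs before use.

```python
def build_genie_request(host: str, space_id: str,
                        question: str, token: str) -> dict:
    """Assemble the HTTP pieces for starting a Genie conversation.

    Returns a dict of request components rather than sending it, so the
    shape is easy to inspect; pass these to any HTTP client.
    """
    return {
        "method": "POST",
        "url": (f"https://{host}/api/2.0/genie/spaces/"
                f"{space_id}/start-conversation"),
        "headers": {"Authorization": f"Bearer {token}"},
        "json": {"content": question},
    }


req = build_genie_request(
    "example.cloud.databricks.com",   # placeholder workspace host
    "my-space-id",                    # placeholder Genie space ID
    "What was revenue by region last quarter?",
    "<token>",                        # OBO or M2M OAuth token
)
print(req["url"])
```

Under OBO, the bearer token carries the end user's identity, which is what preserves per-user Unity Catalog governance through the gateway.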

Why this is useful

The databricks-skills/ catalog has no AWS Bedrock AgentCore skills today. Customers asking "can my Bedrock agent answer governed analytics questions from Databricks?" currently lack a guided path — this fills that gap.

The skill cross-references the existing databricks-genie skill (Databricks-side Genie management) and databricks-agent-bricks skill (Databricks-native multi-agent orchestration), so users coming from those skills are routed correctly.

Honest disclosures baked into the skill

  • Auth-mode honesty: every section that mentions M2M says explicitly "do not claim user-level governance in this mode." OBO is the recommended production posture; M2M is a labeled booth-demo quick start.
  • Schema-name caveat: AgentCore CFN/awscc resources were authored against the 2026-05 registry. The skill flags that customers may need to patch attribute names if AWS renamed them post-preview.
  • Two-pass OAuth redirect URI: documented as a known setup wrinkle, with a script that closes the loop automatically.
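The two-pass redirect-URI wrinkle can be sketched as follows: AgentCore Gateway only issues its OAuth callback URL after the gateway is created, so that URL must then be patched back into the Databricks custom OAuth app integration. This is a hedged illustration of what the skill's sync script automates; the account-API path and `redirect_urls` field name follow the Databricks Account OAuth custom-app-integration API as I understand it, and should be verified before use.

```python
def build_redirect_sync_request(account_id: str, integration_id: str,
                                existing_urls: list, callback_url: str) -> dict:
    """Build the PATCH that appends the gateway's callback URL to a
    Databricks custom OAuth app integration's redirect URLs.

    Deduplicates while preserving order, so re-running the sync is
    idempotent. Endpoint and payload shape are assumptions to verify.
    """
    urls = list(dict.fromkeys(existing_urls + [callback_url]))
    return {
        "method": "PATCH",
        "url": (f"https://accounts.cloud.databricks.com/api/2.0/accounts/"
                f"{account_id}/oauth2/custom-app-integrations/{integration_id}"),
        "json": {"redirect_urls": urls},
    }


req = build_redirect_sync_request(
    "acct-placeholder", "integration-placeholder",
    ["https://example.com/existing-callback"],
    "https://gateway.example.amazonaws.com/oauth/callback",  # issued post-create
)
print(len(req["json"]["redirect_urls"]))  # → 2
```

Running the same sync twice leaves the redirect list unchanged, which is why the skill can safely close the loop automatically.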

Reference implementation

The skill points users at the deployable reference repo at databricks-field-eng/databricks-aws-integrations/genie_with_bedrock_agentcore (merged 2026-05-07), which provides the working Terraform + CloudFormation IaC, helper scripts, and a sample Genie space setup script.

Test plan

  • Reviewer reads SKILL.md and confirms the description triggers correctly for "Genie + Bedrock" / "AgentCore Gateway" / "MCP tool" prompts
  • Reviewer skims the three reference files for tone alignment with sibling skills (databricks-agent-bricks, databricks-genie)
  • Optional: install the skill locally (./install_skills.sh databricks-genie-bedrock-agentcore) and verify it lands at .claude/skills/databricks-genie-bedrock-agentcore/

This pull request and its description were written by Isaac.

Adds a new skill covering the Databricks Genie + Amazon Bedrock AgentCore
Gateway integration — exposing Genie spaces as a governed MCP tool to
Bedrock agents without data movement, while preserving Unity Catalog
governance.

Skill structure follows the TEMPLATE pattern with three reference files:

- SKILL.md: overview, two auth modes (OBO and M2M) with disclosure
  guidance, two IaC paths (Terraform and CloudFormation), quick-start
  recipe, common patterns, common-issue troubleshooting matrix
- 1-architecture-and-auth.md: end-to-end identity flow, OAuth credential
  provider configuration, identity-flow gotchas (redirect URI mismatch,
  ARN confusion, schema naming), governance validation steps
- 2-deployment-tf-vs-cfn.md: deployment decision matrix, Terraform path
  using awscc provider, CloudFormation path using the pre-blessed
  exec-role pattern (originated by Ioannis Papadopoulos for the
  Agent Bricks <-> Bedrock/AgentCore demo), full IAM policy for the
  deployer role
- 3-quickstart-and-scripts.md: end-to-end deployment recipe, helper-
  script reference (Genie space creation, gateway target registration,
  agent-gateway association, OAuth redirect URI sync), expected outputs,
  local-development bridge

Cross-references the existing databricks-genie skill (Databricks-side
Genie management) and databricks-agent-bricks skill (Databricks-native
multi-agent orchestration). Points users at the deployable reference
implementation in databricks-field-eng/databricks-aws-integrations under
the genie_with_bedrock_agentcore folder.

The skill always discloses the auth-mode caveat (claiming user-level
governance under M2M is incorrect) and the AgentCore CFN schema-name
caveat (authored against the 2026-05 registry; attribute names may need
patching after preview-to-GA renames).
@antonyprasad-db
Contributor Author

@dustinvannoy-db — would appreciate your review when you have a moment. This is a follow-up to #511 (the Kiro IDE installer support you merged); this PR adds a new skill covering the Genie + Bedrock AgentCore Gateway integration.

@dustinvannoy-db
Collaborator

I think this should go as part of databricks-field-eng/databricks-aws-integrations/genie_with_bedrock_agentcore and be kept internal, then provided to customers as needed. It seems quite specific to this particular setup that we haven't had many requests for.

@antonyprasad-db
Contributor Author

Thanks Dustin — appreciate the read, and keeping the catalog tight is a fair instinct. I want to share some context I should have led with in the PR description, then get your take before we land on the right path:

Why I'd advocate for publishing this one:

  • This sits inside the active Databricks ↔ AWS partnership investment, and we have an upcoming DAIS moment for the Genie + Bedrock AgentCore story. A public skill ahead of DAIS would let SAs and customers self-serve the integration narrative rather than every conversation routing through me.
  • The "we haven't had many requests" signal is partly a discoverability gap on our side — AgentCore Gateway is recent, and the only current path is "ask the right person who knows the right internal repo exists." Customers don't tend to request what they don't know is supported, which makes inbound volume a lagging indicator for net-new partnership integrations.
  • The pattern itself (Genie via OAuth credential provider → AgentCore Gateway) is really the blueprint for any "expose Databricks governed analytics to a Bedrock agent" use case, not a single bespoke setup — though I hear you that it can read that way from the PR alone.

Happy to compromise:
If a full skill feels like too much surface area, I can scope this down to a thin router skill — a short page that explains when this integration applies and points back to the internal genie_with_bedrock_agentcore repo for the deployable IaC. That keeps the public footprint small while still solving the discoverability gap. Would that land better?

What would help me calibrate:

  • What's the catalog's general bar for partnership-driven integrations like this one? Want to make sure I'm submitting future ones the right way.
  • Any concerns beyond specificity / request volume — e.g., maintenance expectations once it's public, support implications — that I should address head-on?

Happy to jump on a quick call if easier than threading. Either way, want to land on something that works for both the catalog hygiene side and the partnership side.
