48 changes: 37 additions & 11 deletions README.md
@@ -62,17 +62,43 @@ To maintain high-quality code and reduce technical debt, we enforce the following
- Testing against both the latest Chainlit release and the main branch
- Documentation for all added or changed functionality

## 🔧 Features and Integrations

This repository hosts a variety of community-maintained features and integrations, including but not limited to:

- Additional LLM framework integrations
- Custom authentication implementations
- Specialized UI components
- Utility functions and helpers
- Domain-specific extensions

For a full list of available features and integrations, please check our [Features Directory](FEATURES.md).
## Component Architecture

Chainlit Community components follow a modular architecture with two main component types:

### 1. Data Layers
**Role**: Persistent, structured storage for conversation data (users, threads, steps)
**Interactions**:
- Direct integration with Chainlit's data layer system
- Optional integration with Storage Providers for file attachments

| Package | Description | README |
|---------|-------------|--------|
| `dynamodb` | Amazon DynamoDB implementation with cloud storage integration | [docs](packages/data_layers/dynamodb/README.md) |
| `sqlalchemy` | SQL database support (PostgreSQL/SQLite) with storage provider integration | [docs](packages/data_layers/sqlalchemy/README.md) |
| `literalai` | Official Literal AI observability platform integration | [docs](packages/data_layers/literalai/README.md) |

### 2. Storage Providers
**Role**: File storage and management for attachments/media
**Interactions**:
- Used by Data Layers through dependency injection
- Handle upload/delete operations and URL generation

| Package | Cloud Provider | README |
|---------|----------------|--------|
| `azure` | Azure Data Lake | [docs](packages/storage_clients/azure/README.md) |
| `azure-blob` | Azure Blob Storage | [docs](packages/storage_clients/azure_blob/README.md) |
| `gcs` | Google Cloud Storage | [docs](packages/storage_clients/gcs/README.md) |
| `s3` | AWS S3 | [docs](packages/storage_clients/s3/README.md) |

## Typical Data Flow
```mermaid
graph LR
A[Chainlit App] --> B{Data Layer}
B -->|Persists metadata| C[(Database)]
B -->|Delegates files| D[[Storage Provider]]
D -->|Stores objects| E[(Cloud Storage)]
```
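
The delegation shown above can be sketched in a few lines of Python. This is an illustrative model only; the class and method names are hypothetical, not the actual package API:

```python
class InMemoryStorageProvider:
    """Stand-in for a real S3/GCS/Azure client: stores bytes, returns a URL."""

    def __init__(self):
        self.objects = {}

    def upload(self, object_key: str, data: bytes) -> str:
        self.objects[object_key] = data
        return f"https://storage.example/{object_key}"


class DataLayer:
    """Persists structured metadata; delegates file bytes to the injected provider."""

    def __init__(self, storage_provider):
        self.storage_provider = storage_provider  # dependency injection
        self.records = {}

    def save_element(self, element_id: str, data: bytes) -> None:
        # File bytes go to object storage; only metadata (with the URL) is persisted.
        url = self.storage_provider.upload(f"elements/{element_id}", data)
        self.records[element_id] = {"id": element_id, "url": url}


layer = DataLayer(InMemoryStorageProvider())
layer.save_element("img-1", b"...")
```

Swapping `InMemoryStorageProvider` for one of the storage packages above changes where files land without touching the data layer's logic.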

## 📚 Documentation

141 changes: 140 additions & 1 deletion packages/data_layers/dynamodb/README.md
@@ -2,4 +2,143 @@

DynamoDB data layer for [Chainlit](https://chainlit.io/).

## DynamoDB Data Layer

This data layer supports Amazon DynamoDB with optional cloud storage integration for elements.

Key features:
- Single-table design with efficient query patterns
- Supports storage clients for attachments (S3, Azure Blob)
- User/thread/step/element/feedback storage in DynamoDB
- Built-in pagination and sorting

## Setup Example (DynamoDB + Cloud Storage)

1. Create a DynamoDB table using the [CloudFormation template](#table-structure)
2. Install required dependencies:
```bash
# Core requirements
pip install chainlit-dynamodb

# With cloud storage (choose one):
pip install chainlit-dynamodb[s3] # AWS S3
pip install chainlit-dynamodb[azure-blob] # Azure Blob Storage
pip install chainlit-dynamodb[gcs] # Google Cloud Storage
pip install chainlit-dynamodb[azure] # Azure Data Lake
```

3. Configure in your Chainlit app:
```python
import os
import chainlit as cl
from chainlit.data.dynamodb import DynamoDBDataLayer
from chainlit.data.storage_clients import (
S3StorageClient,
AzureBlobStorageClient,
GCSStorageClient,
AzureStorageClient
)

# Security Note: Always store secrets in environment variables
# Never commit credentials to source control
# Consider using secret managers like AWS Secrets Manager

@cl.data_layer
def get_data_layer():
# Choose one storage provider:

# AWS S3 Example
storage_client = S3StorageClient(
bucket="<your-bucket>",
region_name="<your-region>",
aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"]
)

# Azure Blob Example
# storage_client = AzureBlobStorageClient(
# container_name="<your-container>",
# storage_account="<your-account>",
# storage_key="<your-key>"
# )

# Google Cloud Storage Example
# storage_client = GCSStorageClient(
# project_id="<your-project>",
# client_email="<your-email>",
# private_key="<your-key>",
# bucket_name="<your-bucket>"
# )

# Azure Data Lake Example
# storage_client = AzureStorageClient(
# account_url="https://<account>.dfs.core.windows.net",
# credential="<your-credential>",
# container_name="<your-container>"
# )

return DynamoDBDataLayer(
table_name="<your-table-name>",
storage_provider=storage_client,
user_thread_limit=10
)
```

## Table Structure
```yaml
# CloudFormation template for required table structure
AWSTemplateFormatVersion: 2010-09-09
Resources:
DynamoDBTable:
Type: AWS::DynamoDB::Table
Properties:
TableName: "<YOUR-TABLE-NAME>"
AttributeDefinitions:
- AttributeName: PK
AttributeType: S
- AttributeName: SK
AttributeType: S
- AttributeName: UserThreadPK
AttributeType: S
- AttributeName: UserThreadSK
AttributeType: S
KeySchema:
- AttributeName: PK
KeyType: HASH
- AttributeName: SK
KeyType: RANGE
GlobalSecondaryIndexes:
- IndexName: UserThread
KeySchema:
- AttributeName: UserThreadPK
KeyType: HASH
- AttributeName: UserThreadSK
KeyType: RANGE
Projection:
ProjectionType: INCLUDE
NonKeyAttributes: [id, name]
BillingMode: PAY_PER_REQUEST
```

## Logging
```python
import logging
from chainlit import logger

# Enable debug logging for DynamoDB operations
logger.getChild("DynamoDB").setLevel(logging.DEBUG)
```

## Limitations
- Feedback filtering not supported
- The boto3-based implementation uses blocking I/O (not async)
- Decimal types in feedback values require special handling
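
The Decimal caveat comes from DynamoDB's number mapping: boto3 deserializes every number attribute as `decimal.Decimal`. A minimal normalization sketch (the helper name is illustrative, not part of the package):

```python
from decimal import Decimal


def normalize_feedback_value(value):
    """Convert DynamoDB Decimals back to plain int/float for application code."""
    if isinstance(value, Decimal):
        # Whole-number scores (e.g. 0 or 1) come back as Decimal('1').
        return int(value) if value == value.to_integral_value() else float(value)
    return value


normalize_feedback_value(Decimal("1"))    # -> 1
normalize_feedback_value(Decimal("0.5"))  # -> 0.5
```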

## Design
Uses single-table design with entity prefixes:
- Users: `USER#{identifier}`
- Threads: `THREAD#{thread_id}`
- Steps: `STEP#{step_id}`
- Elements: `ELEMENT#{element_id}`

Global Secondary Index (UserThread) enables efficient user thread queries.
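
The key layout these prefixes imply can be sketched as follows. This is a plausible reading of the design, not the package's internal code (helper names are illustrative):

```python
def user_key(identifier: str) -> dict:
    """Users are stored under their own partition."""
    return {"PK": f"USER#{identifier}", "SK": f"USER#{identifier}"}


def thread_key(thread_id: str) -> dict:
    return {"PK": f"THREAD#{thread_id}", "SK": f"THREAD#{thread_id}"}


def step_key(thread_id: str, step_id: str) -> dict:
    # Placing steps in the thread's partition lets a single Query on
    # PK = THREAD#{thread_id} fetch a whole conversation.
    return {"PK": f"THREAD#{thread_id}", "SK": f"STEP#{step_id}"}
```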
20 changes: 19 additions & 1 deletion packages/data_layers/dynamodb/pyproject.toml
@@ -1,20 +1,38 @@
[project]
name = "chainlit-dynamodb"
version = "0.1.0"
description = "DynamoDB data layer for Chainlit"
readme = "README.md"
requires-python = ">=3.10"
dependencies = [
"aiohttp>=3.11.10",
"boto3>=1.34.73,<2",
]

[project.optional-dependencies]
s3 = [
"chainlit-s3",
]
azure-blob = [
"chainlit-azure-blob",
]
gcs = [
"chainlit-gcs",
]
azure = [
"chainlit-azure",
]

[dependency-groups]
dev = [
"pytest-chainlit",
]

[tool.uv.sources]
chainlit-s3 = { workspace = true }
chainlit-azure-blob = { workspace = true }
chainlit-gcs = { workspace = true }
chainlit-azure = { workspace = true }
pytest-chainlit = { workspace = true }

[build-system]
49 changes: 47 additions & 2 deletions packages/data_layers/literalai/README.md
@@ -1,5 +1,50 @@
# chainlit-literalai

[Literal AI](https://www.literalai.com/) integration for [Chainlit](https://chainlit.io/) applications.

## Overview

An official data persistence layer connecting Chainlit with Literal AI's LLM observability platform. It enables production-grade monitoring, evaluation, and analytics while preserving Chainlit's conversation structure.

**Key Features**:
- Full conversation history preservation (threads, steps, elements)
- Multimodal logging (text, images, audio, video)
- User feedback tracking
- Automated performance metrics
- Collaborative prompt versioning & A/B testing

## Setup

1. **Install package**:
```bash
pip install chainlit-literalai
```

2. **Configure environment**:
```bash
# .env file
# Security Best Practices:
# - Restrict .env file permissions
# - Never commit .env to version control
# - Use CI/CD secret management
LITERAL_API_KEY="your-api-key-from-literal-ai"
```

3. **Run your app**:
```bash
chainlit run app.py
```

## Documentation

- [Literal AI Documentation](https://docs.literalai.com)
- [Chainlit + Literal AI Integration Guide](https://docs.chainlit.io/llmops/literalai)

## Data Privacy

- Data retention policy: [literalai.com/security](https://www.literalai.com/security)
- Contact: <contact@chainlit.io>

> [!NOTE]
> Developed by the Chainlit team for seamless integration.
> Literal AI is SOC 2 Type 2 compliant.
64 changes: 63 additions & 1 deletion packages/data_layers/sqlalchemy/README.md
@@ -2,4 +2,66 @@

SQLAlchemy data layer for [Chainlit](https://chainlit.io/).

## SQLAlchemy Data Layer

This data layer supports PostgreSQL and other SQL databases via SQLAlchemy.

Key features:

- Supports storage clients for attachments (currently Azure Data Lake, Azure Blob, GCS, and S3)
- User/thread/step/element/feedback storage in SQL
- Async operations

## Setup Example (PostgreSQL + Azure Blob)

1. Load [schema.sql](schema.sql) into your database.
2. Install required dependencies:

```bash
# For PostgreSQL
pip install chainlit-sqlalchemy[postgres]

# For SQLite
pip install chainlit-sqlalchemy[sqlite]

# With cloud storage
pip install chainlit-sqlalchemy[postgres,gcs] # PostgreSQL + Google Cloud Storage
pip install chainlit-sqlalchemy[sqlite,azure-blob] # SQLite + Azure Blob
```

3. Configure in your Chainlit app:

```python
import chainlit as cl
from chainlit.data.sql_alchemy import SQLAlchemyDataLayer
from chainlit.data.storage_clients import AzureBlobStorageClient

@cl.data_layer
def get_data_layer():
storage_client = AzureBlobStorageClient(
container_name="<your_container>",
storage_account="<your_account>",
storage_key="<your_key>"
)

return SQLAlchemyDataLayer(
conninfo="postgresql+asyncpg://user:password@host/dbname",
storage_provider=storage_client
)
```

> [!NOTE]
> - Add `+asyncpg` to PostgreSQL connection strings for async support
> - See SQLAlchemy docs for other database connection formats
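
Assembling the connection string programmatically avoids typos in the driver suffix. A small sketch using plain string formatting (the helper name is illustrative; for proper quoting of special characters in passwords, prefer `sqlalchemy.engine.URL.create`):

```python
def make_conninfo(user: str, password: str, host: str, dbname: str,
                  use_async: bool = True) -> str:
    """Build a PostgreSQL SQLAlchemy URL, adding +asyncpg for async support."""
    driver = "postgresql+asyncpg" if use_async else "postgresql"
    return f"{driver}://{user}:{password}@{host}/{dbname}"


make_conninfo("user", "password", "host", "dbname")
# -> "postgresql+asyncpg://user:password@host/dbname"
```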

## Dependencies

- Core: `SQLAlchemy`
- Database Drivers (optional):
- PostgreSQL: `asyncpg`, `psycopg2-binary`
- SQLite: `aiosqlite`
- Cloud Storage (optional):
- Azure (Data Lake): `chainlit-azure`
- Azure Blob: `chainlit-azure-blob`
- Google Cloud: `chainlit-gcs`
- AWS S3: `chainlit-s3`