Commit d9abcf1

tomasonjo and Erick Friis authored
Neo4j conversation cypher template (langchain-ai#12927)
Adding custom graph memory to Cypher chain

Co-authored-by: Erick Friis <erick@langchain.dev>
1 parent 2287a31 commit d9abcf1

8 files changed: +1770 -0 lines changed
Lines changed: 92 additions & 0 deletions
@@ -0,0 +1,92 @@
# neo4j-cypher-memory

This template allows you to have conversations with a Neo4j graph database in natural language, using an OpenAI LLM.
It transforms a natural language question into a Cypher query (used to fetch data from Neo4j databases), executes the query, and provides a natural language response based on the query results.
Additionally, it features a conversational memory module that stores the dialogue history in the Neo4j graph database.
The conversation memory is maintained separately for each user session, ensuring personalized interactions.
To facilitate this, please supply both the `user_id` and `session_id` when using the conversation chain.

## Environment Setup

Define the following environment variables:

```
OPENAI_API_KEY=<YOUR_OPENAI_API_KEY>
NEO4J_URI=<YOUR_NEO4J_URI>
NEO4J_USERNAME=<YOUR_NEO4J_USERNAME>
NEO4J_PASSWORD=<YOUR_NEO4J_PASSWORD>
```
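
As a quick sanity check (a minimal sketch, assuming the variables above are already exported; `Neo4jGraph` reads the credentials from `NEO4J_URI`, `NEO4J_USERNAME`, and `NEO4J_PASSWORD`), you can confirm the database connection works before continuing:

```python
from langchain.graphs import Neo4jGraph

# Neo4jGraph() picks up NEO4J_URI, NEO4J_USERNAME and NEO4J_PASSWORD from the environment
graph = Neo4jGraph()

# A trivial query; bad credentials raise here instead of printing [{'ok': 1}]
print(graph.query("RETURN 1 AS ok"))
```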

## Neo4j database setup

There are a number of ways to set up a Neo4j database.

### Neo4j Aura

Neo4j AuraDB is a fully managed cloud graph database service.
Create a free instance on [Neo4j Aura](https://neo4j.com/cloud/platform/aura-graph-database?utm_source=langchain&utm_content=langserve).
When you initiate a free database instance, you'll receive credentials to access the database.

## Populating with data

If you want to populate the DB with some example data, you can run `python ingest.py`.
This script will populate the database with sample movie data.
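
After the script runs, you can check that the sample graph is in place (a minimal sketch reusing the same `Neo4jGraph` connection; the `Actor`, `Movie`, and `ACTED_IN` names match what `ingest.py` creates):

```python
from langchain.graphs import Neo4jGraph

graph = Neo4jGraph()

# List the actors and movies created by ingest.py
print(
    graph.query(
        "MATCH (a:Actor)-[:ACTED_IN]->(m:Movie) RETURN a.name AS actor, m.name AS movie"
    )
)
```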

## Usage

To use this package, you should first have the LangChain CLI installed:

```shell
pip install -U langchain-cli
```

To create a new LangChain project and install this as the only package, you can do:

```shell
langchain app new my-app --package neo4j-cypher-memory
```

If you want to add this to an existing project, you can just run:

```shell
langchain app add neo4j-cypher-memory
```

And add the following code to your `server.py` file:

```python
from neo4j_cypher_memory import chain as neo4j_cypher_memory_chain

add_routes(app, neo4j_cypher_memory_chain, path="/neo4j-cypher-memory")
```
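
For context, if you are starting from a fresh `langchain app new` project, the surrounding `server.py` typically looks roughly like this (a sketch, not the generated file verbatim; `add_routes` is imported from `langserve`):

```python
from fastapi import FastAPI
from langserve import add_routes

from neo4j_cypher_memory import chain as neo4j_cypher_memory_chain

app = FastAPI()

# Expose the template under /neo4j-cypher-memory
add_routes(app, neo4j_cypher_memory_chain, path="/neo4j-cypher-memory")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```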

(Optional) Let's now configure LangSmith.
LangSmith will help us trace, monitor and debug LangChain applications.
LangSmith is currently in private beta; you can sign up [here](https://smith.langchain.com/).
If you don't have access, you can skip this section.

```shell
export LANGCHAIN_TRACING_V2=true
export LANGCHAIN_API_KEY=<your-api-key>
export LANGCHAIN_PROJECT=<your-project>  # if not specified, defaults to "default"
```

If you are inside this directory, then you can spin up a LangServe instance directly by running:

```shell
langchain serve
```

This will start the FastAPI app with a server running locally at
[http://localhost:8000](http://localhost:8000)

We can see all templates at [http://127.0.0.1:8000/docs](http://127.0.0.1:8000/docs)
We can access the playground at [http://127.0.0.1:8000/neo4j-cypher-memory/playground](http://127.0.0.1:8000/neo4j-cypher-memory/playground)

We can access the template from code with:

```python
from langserve.client import RemoteRunnable

runnable = RemoteRunnable("http://localhost:8000/neo4j-cypher-memory")
```
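
To call the chain, pass the question together with the `user_id` and `session_id` that key the conversation memory (a small sketch mirroring the example in `main.py`; the IDs are arbitrary placeholders):

```python
result = runnable.invoke(
    {
        "question": "Who played in Top Gun?",
        "user_id": "user_123",
        "session_id": "session_1",
    }
)
print(result)

# A follow-up question in the same session reuses the stored dialogue history
follow_up = runnable.invoke(
    {
        "question": "Did they play in any other movies?",
        "user_id": "user_123",
        "session_id": "session_1",
    }
)
print(follow_up)
```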
Lines changed: 16 additions & 0 deletions
@@ -0,0 +1,16 @@
from langchain.graphs import Neo4jGraph

graph = Neo4jGraph()

# Create a small sample movie graph: one movie with four actors,
# plus a second movie for Tom Cruise
graph.query(
    """
MERGE (m:Movie {name:"Top Gun"})
WITH m
UNWIND ["Tom Cruise", "Val Kilmer", "Anthony Edwards", "Meg Ryan"] AS actor
MERGE (a:Actor {name:actor})
MERGE (a)-[:ACTED_IN]->(m)
WITH a
WHERE a.name = "Tom Cruise"
MERGE (a)-[:ACTED_IN]->(:Movie {name:"Mission Impossible"})
"""
)
Lines changed: 23 additions & 0 deletions
@@ -0,0 +1,23 @@
from neo4j_cypher_memory.chain import chain

if __name__ == "__main__":
    original_query = "Who played in Top Gun?"
    print(
        chain.invoke(
            {
                "question": original_query,
                "user_id": "user_123",
                "session_id": "session_1",
            }
        )
    )
    follow_up_query = "Did they play in any other movies?"
    print(
        chain.invoke(
            {
                "question": follow_up_query,
                "user_id": "user_123",
                "session_id": "session_1",
            }
        )
    )
Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
from neo4j_cypher_memory.chain import chain

__all__ = ["chain"]
Lines changed: 146 additions & 0 deletions
@@ -0,0 +1,146 @@
from typing import Any, Dict, List

from langchain.chains.graph_qa.cypher_utils import CypherQueryCorrector, Schema
from langchain.chat_models import ChatOpenAI
from langchain.graphs import Neo4jGraph
from langchain.memory import ChatMessageHistory
from langchain.prompts import ChatPromptTemplate, MessagesPlaceholder
from langchain.pydantic_v1 import BaseModel
from langchain.schema.output_parser import StrOutputParser
from langchain.schema.runnable import RunnablePassthrough

# Connection to Neo4j
graph = Neo4jGraph()

# Cypher validation tool for relationship directions
corrector_schema = [
    Schema(el["start"], el["type"], el["end"])
    for el in graph.structured_schema.get("relationships")
]
cypher_validation = CypherQueryCorrector(corrector_schema)

# LLMs
cypher_llm = ChatOpenAI(model_name="gpt-4", temperature=0.0)
qa_llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0.0)


def convert_messages(input: List[Dict[str, Any]]) -> ChatMessageHistory:
    history = ChatMessageHistory()
    for item in input:
        history.add_user_message(item["result"]["question"])
        history.add_ai_message(item["result"]["answer"])
    return history


def get_history(input: Dict[str, Any]) -> List[Any]:
    input.pop("question")
    # Lookback conversation window
    window = 3
    # Fetch up to `window` previous question/answer pairs for this user and
    # session, oldest first, so they can be replayed as chat history
    data = graph.query(
        """
    MATCH (u:User {id:$user_id})-[:HAS_SESSION]->(s:Session {id:$session_id}),
          (s)-[:LAST_MESSAGE]->(last_message)
    MATCH p=(last_message)<-[:NEXT*0.."""
        + str(window)
        + """]-()
    WITH p, length(p) AS length
    ORDER BY length DESC LIMIT 1
    UNWIND reverse(nodes(p)) AS node
    MATCH (node)-[:HAS_ANSWER]->(answer)
    RETURN {question:node.text, answer:answer.text} AS result
    """,
        params=input,
    )
    history = convert_messages(data)
    return history.messages


def save_history(input):
    input.pop("response")
    # store history to database
    # If this is the first message of the session, create the Session node and
    # the first Question/Answer pair; otherwise append the new pair after the
    # previous last message and move the LAST_MESSAGE pointer to it.
    graph.query(
        """MERGE (u:User {id: $user_id})
    WITH u
    OPTIONAL MATCH (u)-[:HAS_SESSION]->(s:Session{id: $session_id}),
                   (s)-[l:LAST_MESSAGE]->(last_message)
    FOREACH (_ IN CASE WHEN last_message IS NULL THEN [1] ELSE [] END |
      CREATE (u)-[:HAS_SESSION]->(s1:Session {id:$session_id}),
             (s1)-[:LAST_MESSAGE]->(q:Question {text:$question, cypher:$query, date:datetime()}),
             (q)-[:HAS_ANSWER]->(:Answer {text:$output}))
    FOREACH (_ IN CASE WHEN last_message IS NOT NULL THEN [1] ELSE [] END |
      CREATE (last_message)-[:NEXT]->(q:Question
                {text:$question, cypher:$query, date:datetime()}),
             (q)-[:HAS_ANSWER]->(:Answer {text:$output}),
             (s)-[:LAST_MESSAGE]->(q)
      DELETE l) """,
        params=input,
    )

    # Return LLM response to the chain
    return input["output"]


# Generate Cypher statement based on natural language input
cypher_template = """This is important for my career.
Based on the Neo4j graph schema below, write a Cypher query that would answer the user's question:
{schema}

Question: {question}
Cypher query:"""  # noqa: E501

cypher_prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "Given an input question, convert it to a Cypher query. No pre-amble.",
        ),
        MessagesPlaceholder(variable_name="history"),
        ("human", cypher_template),
    ]
)

cypher_response = (
    RunnablePassthrough.assign(schema=lambda _: graph.get_schema, history=get_history)
    | cypher_prompt
    | cypher_llm.bind(stop=["\nCypherResult:"])
    | StrOutputParser()
)

# Generate natural language response based on database results
response_template = """Based on the question, Cypher query, and Cypher response, write a natural language response:
Question: {question}
Cypher query: {query}
Cypher Response: {response}"""  # noqa: E501

response_prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "Given an input question and Cypher response, convert it to a "
            "natural language answer. No pre-amble.",
        ),
        ("human", response_template),
    ]
)

chain = (
    RunnablePassthrough.assign(query=cypher_response)
    | RunnablePassthrough.assign(
        response=lambda x: graph.query(cypher_validation(x["query"])),
    )
    | RunnablePassthrough.assign(
        output=response_prompt | qa_llm | StrOutputParser(),
    )
    | save_history
)

# Add typing for input


class Question(BaseModel):
    question: str
    user_id: str
    session_id: str


chain = chain.with_types(input_type=Question)
