
Commit 985f36c

alsakhaev and ethernian authored

feat: introduce Aigency (One Trillion Agents Hackathon) (#61)

* feat: store parsed contexts in postgresql
* feat: execute agents for each new context [WiP]
* feat: introduce queue for agent runner
* feat: install pgvector deps
* feat: use pgvector for similarity search
* feat: introduce generated openfaas nodejs client
* feat: run agents in openfaas
* feat: sentiment analysis agent example
* feat: save agent response to db
* feat: get jobs method
* feat: measure context indexing performance
* feat: use nltk instead of textblob for sentiment analysis
* refactor: rename simple-agent to sentiment-analysis
* refactor: extract ContextEdge to the separate file
* feat: store relationship types
* feat: invoke agent api
* feat: introduce fake detector agent
* feat: introduce associative summarizer agent [WiP]
* build: deploy to kubernetes with helm chart
* refactor: move openfaas agents to the separate directory
* feat: introduce near ai agent fake detector
* feat: run near ai agents in the cloud
* build: add nearai api env var
* build: simplify deploy helm chart to localhost
* feat: own queues for each agents
* refactor: move agents to one level higher
* feat(fake-detector): answer in structured json
* feat: introduce crawler agent
* feat: run crawler agent by orders
* feat: filter by metadata in similarity search
* feat: create Layout for pages; redesign Apps page; add App pages; fix Header markup
* feat: associative summarizer
* feat: get agents api
* feat: add meaningful agent icons
* feat: count consumed time
* feat: rename orders to jobs
* feat: add job times
* feat: add aigency page; add `useJobs` hook
* docs: add repository banner

Co-authored-by: ethernian <dima@ethernian.com>

1 parent ebad8f2 commit 985f36c

File tree

127 files changed: +8359, −734 lines

Some file names and file contents are hidden by default in this large-commit view.

.gitignore

Lines changed: 2 additions & 0 deletions
```diff
@@ -29,3 +29,5 @@ typedoc-docs/
 build
 
 dependency-graph.svg
+
+.env
```

README.md

Lines changed: 4 additions & 16 deletions
```diff
@@ -1,19 +1,7 @@
-# Mutable Web
+![Aigency banner](docs/readme-dark.png#gh-dark-mode-only)
+![Aigency banner](docs/readme-light.png#gh-light-mode-only)
 
-Mutable Web is a browser extension, that instantly adds insights, new workflows and automated AI agents to someone else's websites, improving users' personal productivity, their community experience and overcoming the limitations set by website owners.
+# Aigency
 
-## Use Cases
-Here is an example of what is possible:
-- When a user visits a website, an AICrawler agent stores the parsed content in storage and prepares it for further analysis and AI training. If the data is used, the user is paid.
-- When a user visits a prediction market website, an AI agent highlights markets worth investing in.
-- When a user visits a forum website, another AI agent checks posts and discussions for fallacies, validates source links, adds trust ratings, and helps create bets that challenge dubious content.
-- Send and receive crypto to and from social accounts.
-- Create custom message timelines, even including deleted posts.
-- Create web guides to navigate users through complex websites.
+The project created at the One Trillion Agents Hackathon January 31 – March 2nd, 2025
 
-## The Greater Vision
-Mutable Web is completely decentralised, context-aware, permissionless and runs over existing websites without asking the owners for permission. Any user or community can now customise any website on the web. Anyone is free to create applications and agents that process content on websites visited by users and share it within the community.
-
-Communities can now gain control over the UI of the websites they use, creating their own workflows, earning their own money and finally becoming self-sovereign.
-
-Mutable Web is a primary choice for integrating personal AI agents into the websites users visit on the fly.
```
Lines changed: 1 addition & 0 deletions
```
__pycache__
```
Lines changed: 66 additions & 0 deletions
```dockerfile
FROM --platform=${TARGETPLATFORM:-linux/amd64} ghcr.io/openfaas/of-watchdog:0.10.7 AS watchdog
FROM --platform=${TARGETPLATFORM:-linux/amd64} pytorch/pytorch AS build

COPY --from=watchdog /fwatchdog /usr/bin/fwatchdog
RUN chmod +x /usr/bin/fwatchdog

ARG UPGRADE_PACKAGES=false

ARG ADDITIONAL_PACKAGE
# Alternatively use ADD https:// (which will not be cached by Docker builder)

# RUN if [ "${UPGRADE_PACKAGES}" = "true" ] || [ "${UPGRADE_PACKAGES}" = "1" ]; then apk --no-cache upgrade; fi && \
#     apk --no-cache add openssl-dev ${ADDITIONAL_PACKAGE}

# Add non root user
RUN addgroup --system app && adduser app --system --ingroup app
RUN chown app /home/app

USER app

ENV PATH=$PATH:/home/app/.local/bin

WORKDIR /home/app/

COPY --chown=app:app index.py .
COPY --chown=app:app requirements.txt .

USER root
RUN pip install --no-cache-dir -r requirements.txt

# Build the function directory and install any user-specified components
USER app

RUN mkdir -p function
RUN touch ./function/__init__.py
WORKDIR /home/app/function/
COPY --chown=app:app function/requirements.txt .
RUN pip install --no-cache-dir --user -r requirements.txt

# Install function code
USER root

COPY --chown=app:app function/ .


FROM build AS test
ARG TEST_COMMAND=tox
ARG TEST_ENABLED=true
RUN [ "$TEST_ENABLED" = "false" ] && echo "skipping tests" || eval "$TEST_COMMAND"

FROM build AS ship
WORKDIR /home/app/

# Configure WSGI server and healthcheck
USER app

ENV fprocess="python index.py"

ENV cgi_headers="true"
ENV mode="http"
ENV upstream_url="http://127.0.0.1:5000"

HEALTHCHECK --interval=5s CMD [ -e /tmp/.lock ] || exit 1

CMD ["fwatchdog"]
```
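
For context, the `ship` stage runs of-watchdog in `http` mode with `fprocess="python index.py"` and proxies requests to `upstream_url` on port 5000, so `index.py` has to start an HTTP server on that port and hand request bodies to the function handler. The repo's actual `index.py` is hidden in this view; the snippet below is only a minimal Flask stand-in illustrating that contract, and the module and route names are assumptions.

```python
# Minimal stand-in for an of-watchdog "http" mode upstream (not the repo's actual index.py).
# The watchdog starts this process (fprocess) and forwards requests to upstream_url on :5000.
from flask import Flask, request

from function import handler  # assumes the function/handler.py layout used in this commit

app = Flask(__name__)


@app.route("/", defaults={"path": ""}, methods=["GET", "POST"])
@app.route("/<path:path>", methods=["GET", "POST"])
def main(path):
    # Pass the raw request body to the function handler and return its JSON string response.
    return handler.handle(request.get_data(as_text=True))


if __name__ == "__main__":
    # Bind to the port the Dockerfile's upstream_url points at.
    app.run(host="127.0.0.1", port=5000)
```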
Lines changed: 24 additions & 0 deletions
````markdown
# Simple Agent Example

This agent is based on the OpenFaaS [`python3-flask`](https://github.com/openfaas/python-flask-template/blob/master/template/python3-flask/) template.

Install deps:

```sh
pip install -r requirements.txt
```

Run:

```sh
python index.py
```

Make an HTTP request to `http://localhost:5000` to call your function.

Build and push the agent:

```sh
docker buildx create --use # if you haven't done this before
docker buildx build --platform linux/amd64,linux/arm64 . --tag ghcr.io/dapplets/associative-summarizer-agent:latest --push
```
````
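
To make the `http://localhost:5000` call above concrete, here is a hedged sketch of invoking the agent while `python index.py` is running locally. The payload mirrors the input structure documented in the handler further down; the field values are placeholders, and note that the handler reaches out to the Aigency API and downloads the BART model on first use.

```python
# Sketch: call the locally running agent over HTTP (assumes `python index.py` is listening on :5000).
import requests

payload = {
    "context": {
        "namespace": "dapplets.near/parser/twitter",
        "contextType": "post",
        "id": "1234567890123456789",
        "parsedContext": {
            "text": "Text to search in the vector database.",
            "authorUsername": "john_doe",
            "createdAt": "2025-02-19T20:33:29.000Z",
        },
    }
}

response = requests.post("http://localhost:5000", json=payload, timeout=120)
# The handler returns a JSON string with "results" and "summary" under parsedContext.
print(response.json()["context"]["parsedContext"]["summary"])
```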

agents/associative-summarizer/function/__init__.py

Whitespace-only changes.
Lines changed: 111 additions & 0 deletions
```python
import json
import requests
from transformers import pipeline

# Initialize the summarization pipeline with the explicitly specified model
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

aigencyBaseUrl = "https://api.aigency.augm.link"


def search_vector_db(context):
    url = f"{aigencyBaseUrl}/context/similar"
    headers = {"Content-Type": "application/json"}
    data = {"context": context, "limit": 5}
    response = requests.post(url, headers=headers, data=json.dumps(data))
    results = response.json()
    return results["contexts"]


def handle(req):
    """
    Process the input JSON, search for similar documents in the vector database,
    concatenate all similar texts, summarize them using an LLM model, and return the results.

    Args:
        req (str): A JSON string with the following structure:
            {
                "context": {
                    "namespace": "dapplets.near/parser/twitter",
                    "contextType": "post",
                    "id": "1234567890123456789",
                    "parsedContext": {
                        "text": "Text to search in the vector database.",
                        "authorFullname": "John Doe",
                        "authorUsername": "john_doe",
                        "authorImg": "https://example.com/image.png",
                        "createdAt": "2025-02-19T20:33:29.000Z",
                        "url": "https://twitter.com/john_doe/status/1234567890123456789"
                    }
                }
            }

    Returns:
        str: A JSON string with the following structure:
            {
                "context": {
                    "namespace": "dapplets.near/agent/vector-search",
                    "contextType": "similarity",
                    "id": "<same as input id>",
                    "parsedContext": {
                        "results": [<list of similar documents>],
                        "summary": "<summarized text>"
                    }
                }
            }
    """
    # Deserialize the input JSON
    data = json.loads(req)

    # Extract the text to search from the parsed context
    context = data["context"]

    # Perform the search in the vector database.
    similar_contexts = search_vector_db(context)

    # Concatenate all similar document texts into one continuous string
    combined_text = " ".join([doc["parsedContext"]["text"] for doc in similar_contexts])

    # Use the summarization model (LLM) to summarize the combined text
    # Adjust max_length and min_length as needed
    summary_result = summarizer(
        combined_text, max_length=150, min_length=30, do_sample=False
    )
    summary_text = summary_result[0]["summary_text"] if summary_result else ""

    # Build the output JSON with the search results and the summary
    output = {
        "context": {
            "namespace": "dapplets.near/agent/associative-summarizer",
            "contextType": "similarity",
            "id": data["context"]["id"],
            "parsedContext": {"results": similar_contexts, "summary": summary_text},
        }
    }

    # Serialize and return the output JSON
    return json.dumps(output)


# if __name__ == "__main__":
#     # Test input JSON
#     test_input = {
#         "context": {
#             "namespace": "dapplets.near/parser/twitter",
#             "contextType": "post",
#             "id": "1234567890123456789",
#             "parsedContext": {
#                 "text": "Example text to search for similar associations in the vector database.",
#                 "authorFullname": "John Doe",
#                 "authorUsername": "john_doe",
#                 "authorImg": "https://example.com/image.png",
#                 "createdAt": "2025-02-19T20:33:29.000Z",
#                 "url": "https://twitter.com/john_doe/status/1234567890123456789",
#             },
#         }
#     }
#
#     # Serialize the test input and call the handle function
#     test_req = json.dumps(test_input)
#     result = handle(test_req)
#     print(result)
```
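
Once deployed to OpenFaaS (the commit adds the Dockerfile and Helm-based deployment for this), the same handler is reachable through the gateway's standard `/function/<name>` route. A minimal sketch, assuming a local gateway on port 8080 and a function named `associative-summarizer`; both values are deployment-specific and not shown in this commit:

```python
# Sketch: call the deployed agent through an OpenFaaS gateway.
# The gateway URL and function name below are assumptions; adjust to your deployment.
import requests

GATEWAY = "http://127.0.0.1:8080"    # assumed: local OpenFaaS gateway
FUNCTION = "associative-summarizer"  # assumed: deployed function name

payload = {
    "context": {
        "namespace": "dapplets.near/parser/twitter",
        "contextType": "post",
        "id": "1234567890123456789",
        "parsedContext": {"text": "Text to search in the vector database."},
    }
}

resp = requests.post(f"{GATEWAY}/function/{FUNCTION}", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["context"]["parsedContext"]["summary"])
```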
Lines changed: 10 additions & 0 deletions
```python
from .handler import handle

# Test your handler here

# To disable testing, you can set the build_arg `TEST_ENABLED=false` on the CLI or in your stack.yml
# https://docs.openfaas.com/reference/yaml/#function-build-args-build-args

def test_handle():
    # assert handle("input") == "input"
    pass
```
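
The stub above leaves the test body empty. A slightly fuller, offline test could patch out the Aigency API call and the summarizer; this is only a sketch, and it assumes the handler lives in `function/handler.py` (as the relative import above suggests) and that the package is importable as `function` (as the Dockerfile layout suggests):

```python
# Sketch of an offline test: patch the Aigency API call and the BART summarizer.
# Assumes the handler module is function/handler.py and the package imports as "function".
import json
from unittest.mock import patch

from .handler import handle


def fake_summarizer(text, **kwargs):
    # Stand-in for the transformers pipeline; ignores its arguments.
    return [{"summary_text": "stubbed summary"}]


def test_handle_returns_summary():
    fake_contexts = [{"parsedContext": {"text": "NEAR agents coordinate work on-chain."}}]

    with patch("function.handler.search_vector_db", return_value=fake_contexts), \
         patch("function.handler.summarizer", fake_summarizer):
        req = json.dumps({"context": {"id": "1", "parsedContext": {"text": "anything"}}})
        result = json.loads(handle(req))

    assert result["context"]["parsedContext"]["summary"] == "stubbed summary"
    assert result["context"]["id"] == "1"
```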
Lines changed: 4 additions & 0 deletions
```text
sentence-transformers==3.4.1
transformers
numpy
torch
```
Lines changed: 41 additions & 0 deletions
```ini
# If you would like to disable
# automated testing during faas-cli build,

# Replace the content of this file with
# [tox]
# skipsdist = true

# You can also edit, remove, or add additional test steps
# by editing, removing, or adding new testenv sections


# find out more about tox: https://tox.readthedocs.io/en/latest/
[tox]
envlist = lint,test
skipsdist = true

[testenv:test]
deps =
    flask
    pytest
    -rrequirements.txt
commands =
    # run unit tests with pytest
    # https://docs.pytest.org/en/stable/
    # configure by adding a pytest.ini to your handler
    pytest

[testenv:lint]
deps =
    flake8
commands =
    flake8 .

[flake8]
count = true
max-line-length = 127
max-complexity = 10
statistics = true
# stop the build if there are Python syntax errors or undefined names
select = E9,F63,F7,F82
show-source = true
```
