38 changes: 38 additions & 0 deletions docs/advanced/chat_functions/info.md
# Chat Functions - More information

Chat functions are the microservices that Lambda Feedback calls to provide the underlying functionality of a chatbot. Students can chat with the chatbots and ask for help or further explanations regarding the Question that they are working on. Each chatbot has its own personality and approach to assisting the students.

The chatbots are built on a [Large Language Model (LLM)](https://en.wikipedia.org/wiki/Large_language_model) which receives information regarding:

- the raw markdown content of the question the student is on currently, including:
- the question name, number and content
- the final answer, structured tutorial, and worked solutions of the question
- the guidance (blurb and time estimate) from the teacher for the question
- the set name, number and description
- all parts with their number, content and done status (current part emphasised)
- all response areas and their respective expected answers
- the progress of the student on all parts of the Question, including:
- the total number of responses and the number of wrong responses the student has made for each response area
- the last responses the student has made for each response area and the received feedback
- the time the student has spent that day on the question and on its current part
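As a concrete illustration, this information reaches the chatbot as part of the request payload alongside the student's message. The minimal Python sketch below uses only the fields that appear in the chat function examples elsewhere in these docs (`message`, `conversation_id`, `conversation_history`); the real schema additionally carries the question and progress context listed above:

```python
import json

# Minimal request payload; the platform additionally attaches the
# question content and student progress described above.
payload = {
    "message": "Can you explain this part?",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [
            {"type": "user", "content": "Can you explain this part?"}
        ],
    },
}

# The payload must serialise cleanly to JSON for the HTTP request.
body = json.dumps(payload)
```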

---

## Available Chat functions

Currently, students have access to the following chat functions, each hosting its own specific chatbot. Many others are in development.

Click on the links below for information on each chatbot:

[1. Informational Chatbot](https://github.com/lambda-feedback/informationalChatFunction/blob/main/docs/user.md)


[2. Concise Chatbot](https://github.com/lambda-feedback/conciseChatFunction/blob/main/docs/user.md)


[3. Reflective Chatbot](https://github.com/lambda-feedback/reflectiveChatFunction/blob/main/docs/user.md)


## Chat Function Development

Are you interested in developing your own chatbot? Then check out the [Quickstart guide](quickstart.md) to develop and deploy your own AI chat function for Lambda Feedback.
20 changes: 8 additions & 12 deletions docs/advanced/chat_functions/local.md
# Running and Testing Chat Functions Locally

You can run the Python function for your chat function itself by writing a `main()` function, or you can call the [`testbench_prompts.py`](https://github.com/lambda-feedback/lambda-chat/blob/main/src/agents/utils/testbench_prompts.py) script that runs a similar pipeline to the `module.py`.

```bash
python src/agents/utils/testbench_prompts.py
```

You can also use the `test_prompts.py` script to test the chat function with example inputs from Lambda Feedback questions and synthetic conversations.
```bash
python src/agents/utils/test_prompts.py
```

## Testing using the Docker Image [:material-docker:](https://www.docker.com/)

You can also build and run the Docker pipeline for the chat function. The chatbot associated with the chat function is deployed onto an AWS Lambda serverless cloud function using the Docker image, so we recommend completing these steps for final testing of your chatbot.

#### Build the Docker Image

```bash
docker run --env-file .env -it --name my-lambda-container -p 8080:8080 llm_chat
```

This will start the chat function and expose it on port `8080`, where it can be called with `curl`:

```bash
curl --location 'http://localhost:8080/2015-03-31/functions/function/invocations' \
--header 'Content-Type: application/json' \
--data '{"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"}'
```

### Call Docker Container From Postman
URL:

```
http://localhost:8080/2015-03-31/functions/function/invocations
```
Body:

```JSON
{"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"}
```

Note that the body must be a stringified JSON payload, not a nested JSON object.
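Rather than hand-escaping the quotes, the stringified body can be generated programmatically. A short Python sketch (the inner payload mirrors the example above; nothing beyond it is assumed):

```python
import json

inner = {
    "message": "hi",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [{"type": "user", "content": "hi"}],
    },
}

# The Lambda invocation endpoint expects "body" to be a JSON *string*,
# so the inner payload is serialised once and then wrapped.
event = {"body": json.dumps(inner)}
print(json.dumps(event))
```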

Body with optional Params:
57 changes: 41 additions & 16 deletions docs/advanced/chat_functions/quickstart.md
# Developing Chat Functions: Getting Started

## What is a Chat Function?

A chat function is a function that calls Large Language Models (LLMs) to respond to students' messages given contextual data:

- question data
- user data such as past responses to the problem
See [Chat Functions - More information](info.md) for a full listing of the contextual data provided.


Chat functions host a chatbot. Chatbots capture and automate the process of assisting students in their learning outside the classroom.

## Getting Setup for Development

1. Get the code on your local machine (Using github desktop or the `git` cli)

- For new functions: clone the template repo for [chat-function-boilerplate](https://github.com/lambda-feedback/chat-function-boilerplate). **Make sure the new repository is set to public (it needs access to organisation secrets)**.
- For existing functions: please make your changes on a new separate branch

2. _If you are creating a new chatbot_, you can either edit the `src/agents/base_agent` or copy it and rename it based on the name of your chatbot.
3. You are now ready to start making changes and implementing features by editing each of the main function-logic files:

1. **`src/agents/{base_agent}/{base}_agent.py`**: This file contains the main LLM pipeline using [LangGraph](https://langchain-ai.github.io/langgraph/) and [LangChain](https://python.langchain.com/docs/introduction/).

- the chat function expects the following arguments when it is called:

Body with necessary Params:

```json
{
    "message": "hi",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [{ "type": "user", "content": "hi" }]
    }
}
```
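When implementing your chat function, a small guard for these required arguments can save debugging time. A hedged sketch (the `validate_params` helper is hypothetical, not part of the boilerplate):

```python
def validate_params(params: dict) -> list:
    """Return a list of missing required argument names."""
    required = ("conversation_id", "conversation_history")
    return [name for name in required if name not in params]

# Example: the body shown above passes, an empty params dict does not.
ok = validate_params({"conversation_id": "12345Test", "conversation_history": []})
bad = validate_params({})
print(ok, bad)  # → [] ['conversation_id', 'conversation_history']
```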

2. **`src/agents/{base_agent}/{base}_prompts.py`**: This is where you can write the system prompts that describe how your AI Assistant should behave and respond to the user.

3. Make sure to add your agent's `invoke()` function to the `module.py` file.
4. Update the `config.json` file with the name of the chat function.

5. Please add a `README.md` file to describe the use and behaviour of your chatbot.

4. Changes can be tested locally by running the pipeline tests using:
```bash
pytest src/module_test.py
```
[Running and Testing Chat Functions Locally](local.md){ .md-button }


5. Merging commits into the `dev` branch will trigger the `dev.yml` workflow, which builds the Docker image, pushes it to a shared `dev` ECR repository, and deploys an AWS Lambda function reachable over HTTP. To make your new chatbot available on the `dev` environment of the Lambda Feedback platform, you will have to contact the ADMINS on the platform.

6. You can now test the deployed chat function using your preferred request client (such as [Insomnia](https://insomnia.rest/) or [Postman](https://www.postman.com/) or simply `curl` from a terminal). `DEV` Functions are made available at:
```url
https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/<function name as defined in config.json>
```

!!! example "Example Request to chatFunctionBoilerplate-dev"
```bash
curl --location 'https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/chatFunctionBoilerplate-dev' \
--header 'Content-Type: application/json' \
--data '{
    "message": "hi",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [
            {
                "type": "user",
                "content": "hi"
            }
        ]
    }
}'
```
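The same request can also be issued from Python's standard library instead of curl. The sketch below only constructs the request; the URL keeps the redacted `<***>` placeholder from above, so it cannot be sent as-is:

```python
import json
import urllib.request

URL = "https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/chatFunctionBoilerplate-dev"

payload = {
    "message": "hi",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [{"type": "user", "content": "hi"}],
    },
}

# Build (but do not send) the POST request.
req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req) would send it once the real URL is substituted.
```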

7. Once the `dev` chat function is fully tested, you can merge the code into the default branch (`main`). This will trigger the `main.yml` workflow, which deploys the `staging` and `prod` versions of your chat function. Please contact the ADMINS for the URLs of the `staging` and `prod` versions of your chat function.

8. In order to make your new chat function available on any of the environments of the Lambda Feedback platform, you will have to get in contact with the ADMINS on the platform.
6 changes: 3 additions & 3 deletions docs/advanced/index.md
The fundamental idea of Lambda Feedback is that it calls external microservices.
Evaluate a student response and provide feedback:
[Evaluation functions - Quickstart Guide](evaluation_functions/quickstart.md){ .md-button .md-button--primary style="width: 400px;"}

LLM-driven chatbots to converse with students:<br>
[Chat functions - Quickstart guide ](chat_functions/quickstart.md){ .md-button .md-button--primary style="width: 400px;"}

All microservices are called over HTTP. There is complete freedom in their implementation, subject to the expected API schema. Lambda Feedback also provides families of deployed microservices, using open-source code available in our public GitHub repositories.

This section of documentation is to help developers of microservices. The documentation is written assuming you have basic developer skills.

15 changes: 10 additions & 5 deletions docs/student/getting_started_student.md
See the [Answering Questions](answering_questions.md) page for more help with answering questions.

### Using the Workspace

The Workspace provides various functionalities to assist you during your learning process. Your edits and progress in the Workspace are saved per Question, so you can return to your earlier edits for the Question you are currently on.

Here are the various functionalities:

#### Canvas:
A pane where you can write down your thought process and notes for the previewed question (handwriting, sticky notes & text).

![Canvas Interface](images/canvas_interface.png)

#### Chat:
A chat interface connecting you with helpful Chatbots. The Chatbots are AI Assistants that you can chat with to ask for help or further explanations regarding the Question that you are working on.

![Chat Interface](images/chat_interface.png)

For more information on what the chatbot knows about you and how you can use it to its full potential:

[Chatbots - More Info](../advanced/chat_functions/info.md){ .md-button .md-button--primary}
Binary file modified docs/student/images/canvas_interface.png
Binary file modified docs/student/images/chat_interface.png
2 changes: 1 addition & 1 deletion docs/student/index.md
The image above shows an example question, with numbers to indicate:
10. _Response area_, where student responses are entered and feedback is given
11. Feedback to the teacher (currently in flux regarding the design - 02/07/25)
12. Access to content 'below the line' providing extra support.
13. _Workspace_ - Opens tab with canvas and chat
14. Comments

## Below the line
2 changes: 1 addition & 1 deletion docs/terminology.md
This is an optional section, and so does not have to be included in any question.

### Workspace

On the Question page, students have access to their own Workspace tab. Here they can find the "Canvas", for handwriting notes, and the "Chat", for conversing with an LLM-driven Chatbot about the question materials.
1 change: 1 addition & 0 deletions mkdocs.yml
nav:
- Chat functions:
- Quickstart: "advanced/chat_functions/quickstart.md"
- Testing Functions Locally: "advanced/chat_functions/local.md"
- Chat Functions Information: "advanced/chat_functions/info.md"

# Configuration
theme: