diff --git a/docs/advanced/chat_functions/info.md b/docs/advanced/chat_functions/info.md new file mode 100644 index 000000000..eb671144c --- /dev/null +++ b/docs/advanced/chat_functions/info.md @@ -0,0 +1,38 @@ +# Chat Functions - More information + +Chat functions are the microservices that Lambda Feedback calls to provide the underlying functionality of a chatbot. Students can chat with the chatbots and ask for help or further explanations regarding the Question that they are working on. Each chatbot has its own personality and approach to assisting the students. + +The chatbots are built on a [Large Language Model (LLM)](https://en.wikipedia.org/wiki/Large_language_model) which receives information regarding: + +- the raw markdown content of the question the student is currently working on, including: + - the question name, number and content + - the final answer, structured tutorial, and worked solutions of the question + - the guidance (blurb and time estimate) from the teacher for the question + - the set name, number and description + - all parts with their number, content and done status (current part emphasised) + - all response areas and their respective expected answers +- the progress of the student on all parts of the Question, including: + - the total number of responses and the number of wrong responses the student has made for each response area + - the last responses the student has made for each response area and the received feedback + - the time the student has spent on the respective question and current part on that day + +--- + +## Available Chat functions + +Currently, students have access to the following chat functions, each hosting its own specific chatbot. Many others are in development. + +Click on the links below for information on each chatbot: + +[1. Informational Chatbot](https://github.com/lambda-feedback/informationalChatFunction/blob/main/docs/user.md) + + +[2. 
Concise Chatbot](https://github.com/lambda-feedback/conciseChatFunction/blob/main/docs/user.md) + + +[3. Reflective Chatbot](https://github.com/lambda-feedback/reflectiveChatFunction/blob/main/docs/user.md) + + +## Chat Function Development + +Are you interested in developing your own chatbot? Then check out the [Quickstart guide](quickstart.md) to develop and deploy your own AI chat function for Lambda Feedback. diff --git a/docs/advanced/chat_functions/local.md b/docs/advanced/chat_functions/local.md index 3f122e5a4..04644ac37 100644 --- a/docs/advanced/chat_functions/local.md +++ b/docs/advanced/chat_functions/local.md @@ -1,19 +1,19 @@ -# Running and Testing Agents Locally +# Running and Testing Chat Functions Locally -You can run the Python function for your agent itself by writing a `main()` function, or you can call the [`testbench_prompts.py`](https://github.com/lambda-feedback/lambda-chat/blob/main/src/agents/utils/testbench_prompts.py) script that runs a similar pipeline to the `module.py`. +You can run the Python function for your chat function directly by writing a `main()` function, or you can call the [`testbench_prompts.py`](https://github.com/lambda-feedback/lambda-chat/blob/main/src/agents/utils/testbench_prompts.py) script that runs a similar pipeline to `module.py`. ```bash python src/agents/utils/testbench_prompts.py ``` -You can also use the `test_prompts.py` script to test the agents with example inputs from Lambda Feedback questions and synthetic conversations. +You can also use the `test_prompts.py` script to test the chat function with example inputs from Lambda Feedback questions and synthetic conversations. ```bash python src/agents/utils/test_prompts.py ``` ## Testing using the Docker Image [:material-docker:](https://www.docker.com/) -You can also build and run the docker pipeline for the agents. The chatbot agents are deployed onto a AWS Lambda serverless cloud function using the docker image. 
Hence, for final testing of your chatbots, we recommend completing those steps. +You can also build and run the Docker pipeline for the chat function. The chatbot associated with the chat function is deployed onto an AWS Lambda serverless cloud function using the Docker image. Hence, for final testing of your chatbot, we recommend completing the steps below. #### Build the Docker Image @@ -42,7 +42,9 @@ docker run --env-file .env -it --name my-lambda-container -p 8080:8080 llm_chat This will start the evaluation function and expose it on port `8080` and it will be open to be curl: ```bash -curl --location 'http://localhost:8080/2015-03-31/functions/function/invocations' --header 'Content-Type: application/json' --data '{"message":"hi","params":{"conversation_id":"12345Test","conversation_history": [{"type":"user","content":"hi"}]}}' +curl --location 'http://localhost:8080/2015-03-31/functions/function/invocations' \ +--header 'Content-Type: application/json' \ +--data '{"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"}' ``` ### Call Docker Container From Postman @@ -56,13 +58,7 @@ http://localhost:8080/2015-03-31/functions/function/invocations Body: ```JSON -{ - "message":"hi", - "params":{ - "conversation_id":"12345Test", - "conversation_history": [{"type":"user","content":"hi"}] - } -} +{"body":"{\"message\": \"hi\", \"params\": {\"conversation_id\": \"12345Test\", \"conversation_history\": [{\"type\": \"user\", \"content\": \"hi\"}]}}"} ``` Body with optional Params: diff --git a/docs/advanced/chat_functions/quickstart.md b/docs/advanced/chat_functions/quickstart.md index 72b2c78d0..3c03ac7ea 100644 --- a/docs/advanced/chat_functions/quickstart.md +++ b/docs/advanced/chat_functions/quickstart.md @@ -1,27 +1,27 @@ -# Developing Chat Agents: Getting Started +# Developing Chat Functions: Getting Started -## What is a Chat Agent? +## What is a Chat Function? 
-It's a function which calls Large Language Models (LLMs) to respond to the student's messages given contxtual data: +A chat function calls Large Language Models (LLMs) to respond to students' messages given contextual data: - question data - user data such as past responses to the problem - Chatbot Agents capture and automate the process of assisting students during their learning process when outside of classroom. + +Each chat function hosts a chatbot. Chatbots capture and automate the process of assisting students with their learning outside the classroom. ## Getting Setup for Development 1. Get the code on your local machine (Using github desktop or the `git` cli) - - For new functions: clone the main repo for [lambda-chat](https://github.com/lambda-feedback/lambda-chat) and create a new branch. Then go under `scr/agents` and copy the `base_agent` folder. - + - For new functions: clone the template repo for [chat-function-boilerplate](https://github.com/lambda-feedback/chat-function-boilerplate). **Make sure the new repository is set to public (it needs access to organisation secrets)**. - For existing functions: please make your changes on a new separate branch -2. _If you are creating a new chatbot agent_, you'll need to set it's name as the folder name in `scr/agents` and its corresponding files. +2. _If you are creating a new chatbot_, you can either edit `src/agents/base_agent` directly or copy the folder and rename it after your chatbot. +3. You are now ready to start making changes and implementing features by editing each of the main function-logic files: - 1. **`scr/agents/{base_agent}/{base}_agent.py`**: This file contains the main LLM pipeline using [LangGraph](https://langchain-ai.github.io/langgraph/) and [LangChain](https://python.langchain.com/docs/introduction/). + 1. 
**`src/agents/{base_agent}/{base}_agent.py`**: This file contains the main LLM pipeline using [LangGraph](https://langchain-ai.github.io/langgraph/) and [LangChain](https://python.langchain.com/docs/introduction/). - - the agent expects the following inputs when it being called: + - the chat function expects the following arguments when it is called: Body with necessary Params: @@ -52,19 +52,44 @@ It's a function which calls Large Language Models (LLMs) to respond to the stude } ``` - 2. **`scr/agents/{base_agent}/{base}_prompts.py`**: This is where you can write the system prompts that describe how your AI Assistant should behave and respond to the user. + 2. **`src/agents/{base_agent}/{base}_prompts.py`**: This is where you can write the system prompts that describe how your AI Assistant should behave and respond to the user. + + 3. _If you renamed the chatbot agent file_, make sure to add your chatbot's `invoke()` function to the `module.py` file. - 3. Make sure to add your agent `invoke()` function to the `module.py` file. + 4. Update the `config.json` file with the name of the chat function. - 4. Please add a `README.md` file to describe the use and behaviour of your agent. + 5. Please add a `README.md` file to describe the use and behaviour of your chatbot. 4. Changes can be tested locally by running the pipeline tests using: ```bash pytest src/module_test.py ``` - [Running and Testing Agents Locally](local.md){ .md-button } + [Running and Testing Chat Functions Locally](local.md){ .md-button } -5. Merge commits into any branch (except main) will trigger the `dev.yml` workflow, which will build the docker image, push it to a shared `dev` ECR repository to make the function available from the `dev` and `localhost` client app. +5. Merge commits into the dev branch will trigger the `dev.yml` workflow, which will build the Docker image, push it to a shared `dev` ECR repository, and deploy an AWS Lambda function reachable via HTTP requests. 
In order to make your new chatbot available on the `dev` environment of the Lambda Feedback platform, you will have to get in contact with the ADMINS on the platform. + +6. You can now test the deployed chat function using your preferred request client (such as [Insomnia](https://insomnia.rest/), [Postman](https://www.postman.com/), or simply `curl` from a terminal). `dev` functions are made available at: + ```url + https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/ + ``` -6. In order to make your new chatbot available on the LambdaFeedback platform, you will have to get in contact with the ADMINS on the platform. + !!! example "Example Request to chatFunctionBoilerplate-dev" curl --location 'https://<***>.execute-api.eu-west-2.amazonaws.com/default/chat/chatFunctionBoilerplate-dev' \ --header 'Content-Type: application/json' \ --data '{ "message": "hi", "params": { "conversation_id": "12345Test", "conversation_history": [ { "type": "user", "content": "hi" } ] } }' + +7. Once the `dev` chat function is fully tested, you can merge the code into the default branch (`main`). This will trigger the `main.yml` workflow, which will deploy the `staging` and `prod` versions of your chat function. Please contact the ADMINS for the `staging` and `prod` URLs of your chat function. + +8. In order to make your new chat function available on any of the environments of the Lambda Feedback platform, you will have to get in contact with the ADMINS on the platform. 
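The `invoke()` entry point that the quickstart steps above wire into `module.py` can be sketched roughly as follows. This is a minimal illustrative stand-in, not the boilerplate's actual pipeline: the `message`/`params` inputs mirror the documented request body, but the echo logic and the `chatbot_response` key are assumptions made for the example.

```python
from typing import Any


def invoke(message: str, params: dict[str, Any]) -> dict[str, Any]:
    """Minimal stand-in for the pipeline in {base}_agent.py.

    A real chat function would hand the conversation to the
    LangGraph/LangChain pipeline here; this sketch just echoes the
    last user message so the module.py wiring can be exercised.
    """
    history = params.get("conversation_history", [])
    # Fall back to the top-level message if the history holds no user turns.
    last_user = next(
        (turn["content"] for turn in reversed(history) if turn.get("type") == "user"),
        message,
    )
    return {
        "chatbot_response": f"You said: {last_user}",  # assumed response key
        "conversation_id": params.get("conversation_id"),
    }
```

Replacing the echo with a call into your LangGraph pipeline, while keeping the same inputs, keeps the function compatible with the documented request body.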
diff --git a/docs/advanced/index.md b/docs/advanced/index.md index 6332786ec..d96d773d7 100644 --- a/docs/advanced/index.md +++ b/docs/advanced/index.md @@ -8,10 +8,10 @@ The fundamental idea of Lambda Feedback is that it calls external microservices Evaluate a student response and provide feedback: [Evaluation functions - Quickstart Guide](evaluation_functions/quickstart.md){ .md-button .md-button--primary style="width: 400px;"} -Dialogic conversations with students:
-[Chat functions - Quickstart guide ](chatbot_agents/quickstart.md){ .md-button .md-button--primary style="width: 400px;"} +LLM-driven chatbots to converse with students:
+[Chat functions - Quickstart guide](chat_functions/quickstart.md){ .md-button .md-button--primary style="width: 400px;"} -All microservices are called over http. There is complete freedom in their implementation. Lambda Feedback also provides families of deployed microservices, using open source code available in our public GitHub repositories. +All microservices are called over HTTP. There is complete freedom in their implementation, subject to the expected API schema. Lambda Feedback also provides families of deployed microservices, using open source code available in our public GitHub repositories. This section of documentation is to help developers of microservices. The documentation is written assuming you have basic developer skills. diff --git a/docs/student/getting_started_student.md b/docs/student/getting_started_student.md index c75f8b184..c87fb04a1 100644 --- a/docs/student/getting_started_student.md +++ b/docs/student/getting_started_student.md @@ -35,15 +35,20 @@ See the [Answering Questions](answering_questions.md) page for more help with an ### Using the Workspace -The Workspace provides you with various functionalities to assist you during your learning process: -#### 1. Canvas: +The Workspace provides you with various functionalities to assist you during your learning process. Your edits and progress in the Workspace are saved for each Question you preview, so you can return to your earlier edits for the Question you are currently on. + +Here are the various functionalities: + +#### Canvas: A pane where you can write down your thought process and notes for the previewed question (handwriting, sticky notes & text). ![Canvas Interface](images/canvas_interface.png) -#### 2. Chat: -A chat interface connecting you with helpful AI Chatbots to discuss any questions you have on the current topic you are working on. +#### Chat: +A chat interface connecting you with helpful Chatbots. 
The Chatbots are AI Assistants that you can chat with to ask for help or further explanations regarding the Question that you are working on. ![Chat Interface](images/chat_interface.png) -Your edits and progress in the Workspace are saved per each Question you preview. So, you will be able to view your old edits for the Question you are currently on. +For more information on what the chatbot knows about you and how you can use it to its full potential: + +[Chatbots - More Info](../advanced/chat_functions/info.md){ .md-button .md-button--primary} diff --git a/docs/student/images/canvas_interface.png b/docs/student/images/canvas_interface.png index bebe4e403..0ab77e5a8 100644 Binary files a/docs/student/images/canvas_interface.png and b/docs/student/images/canvas_interface.png differ diff --git a/docs/student/images/chat_interface.png b/docs/student/images/chat_interface.png index e5b484dbc..f724329cd 100644 Binary files a/docs/student/images/chat_interface.png and b/docs/student/images/chat_interface.png differ diff --git a/docs/student/index.md b/docs/student/index.md index 8d4d19b38..9ac78a237 100644 --- a/docs/student/index.md +++ b/docs/student/index.md @@ -19,7 +19,7 @@ The image above shows an example question, with numbers to indicate: 10. _Response area_, where student responses are entered and feedback is given 11. Feedback to the teacher (currently in flux regarding the design - 02/07/25) 12. Access to content 'below the line' providing extra support. -13. _Workspace_ - Opens tab with canvas and ai chatbot +13. _Workspace_ - Opens tab with canvas and chat 14. Comments ## Below the line diff --git a/docs/terminology.md b/docs/terminology.md index c23ed9aa5..e503b3ed4 100644 --- a/docs/terminology.md +++ b/docs/terminology.md @@ -56,4 +56,4 @@ This is an optional section, and so does not have to be included in any question ### Workspace -On the Question page, the students has access to their own workspace tab. 
Here they can find the "Canvas", for handwriting notes, and the "Chat", for conversing with an AI Chatbot on the question materials. +On the Question page, students have access to their own workspace tab. Here they can find the "Canvas", for handwriting notes, and the "Chat", for conversing with an LLM-driven Chatbot on the question materials. diff --git a/mkdocs.yml b/mkdocs.yml index 51098d514..949ab4959 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -76,6 +76,7 @@ nav: - Chat functions: - Quickstart: "advanced/chat_functions/quickstart.md" - Testing Functions Locally: "advanced/chat_functions/local.md" + - Chat Functions Information: "advanced/chat_functions/info.md" # Configuration theme:
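The double-encoded `{"body": ...}` payload in the updated `local.md` curl examples is easy to get wrong when escaping the inner quotes by hand. A small sketch of building it programmatically, assuming only the fields shown in the docs:

```python
import json

# Inner request exactly as documented for the chat function.
inner = {
    "message": "hi",
    "params": {
        "conversation_id": "12345Test",
        "conversation_history": [{"type": "user", "content": "hi"}],
    },
}

# The Lambda runtime interface emulator expects the request wrapped in a
# "body" field whose value is the inner request serialised as a JSON string,
# so the inner dict is encoded twice.
payload = json.dumps({"body": json.dumps(inner)})
print(payload)
# POST this payload to:
# http://localhost:8080/2015-03-31/functions/function/invocations
```

Piping the printed payload into the documented `curl --data` call should be equivalent to the hand-escaped example.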