A repository housing a mock paper creator program powered by an LLM (formerly called LLM mock paper creator). The LLM uses few-shot learning; more information about this learning style can be found here.
The LLM used in development was phi-3:mini due to its small size. My personal opinion is that small language models could be very useful.
The aim of this project is to generate mock papers from notes and render them as a PDF for printing. The idea came about while I was studying for my exams and grew tired of the papers provided to me; I did not feel that I understood the material from these short papers. Ollama is used to host a local LLM on the development machine.
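As a sketch of the few-shot approach, a prompt can be assembled from a handful of notes/question example pairs before the user's actual notes are appended. The example pairs and helper name below are hypothetical, not taken from the repository:

```python
# Hypothetical sketch: assemble a few-shot prompt for a local Ollama model.
# The example pairs and build_prompt helper are illustrative, not from this repo.

FEW_SHOT_EXAMPLES = [
    ("Notes: Photosynthesis converts light energy into chemical energy.",
     "Question: Briefly describe the role of light in photosynthesis. [3 marks]"),
    ("Notes: Newton's second law states F = ma.",
     "Question: A 2 kg mass accelerates at 3 m/s^2. Find the net force. [2 marks]"),
]

def build_prompt(notes: str) -> str:
    """Prepend worked examples so the model imitates the question style."""
    parts = [f"{note}\n{question}" for note, question in FEW_SHOT_EXAMPLES]
    parts.append(f"Notes: {notes}\nQuestion:")
    return "\n\n".join(parts)

prompt = build_prompt("Ohm's law relates voltage, current and resistance.")
# The prompt would then be sent to the local model started by `ollama serve`,
# e.g. via the ollama CLI or its HTTP API at http://localhost:11434.
```

The trailing `Question:` cue encourages the model to continue in the style of the worked examples.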
- You will need to set up a LaTeX distribution for this. The generated paper uses commands from the `exam.cls` class file. I was able to set this up with assistance from the MIT document on rendering exams with LaTeX here. You can install this class file using `sudo tlmgr install exam`, which uses `tlmgr` (the TUG package manager) to install `exam.cls`. To compile the document, `pdflatex` is used, but this can be changed and extended.
- The Python packages required can be installed using conda. You will need to have `ollama` on your system, up and running via the `ollama serve` command.
- For the web interface, `streamlit` is required. The web application can be run using `streamlit run main.py`.
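As an illustration of the rendering step, a generated paper might be written out as a minimal `exam`-class document and handed to `pdflatex`. The template and `render_exam` helper below are a hypothetical sketch, not the repository's actual code:

```python
import shutil
import subprocess

def render_exam(questions: list[str]) -> str:
    """Build a minimal LaTeX document using the exam class (hypothetical template)."""
    body = "\n".join(f"\\question {q}" for q in questions)
    return (
        "\\documentclass{exam}\n"
        "\\begin{document}\n"
        "\\begin{questions}\n"
        f"{body}\n"
        "\\end{questions}\n"
        "\\end{document}\n"
    )

tex = render_exam(["State Ohm's law.", "Define resistivity."])
with open("paper.tex", "w") as f:
    f.write(tex)

# Compile to PDF if pdflatex is available; swap in another engine if preferred.
if shutil.which("pdflatex"):
    subprocess.run(["pdflatex", "-interaction=nonstopmode", "paper.tex"], check=False)
```

The `questions` environment and `\question` command come from `exam.cls`, which is why the class must be installed before compiling.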
You can run the tests for the program using `pytest tests -q`. Note that `pytest` must be installed.
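For example, a test in the suite might exercise a small formatting helper. The helper and test below are hypothetical and inlined here so the snippet is self-contained; the real suite would import from the application:

```python
# Hypothetical example of a pytest-style test — number_questions is an
# illustrative helper, not a function from this repository.

def number_questions(questions):
    """Prefix each question with its 1-based number."""
    return [f"{i}. {q}" for i, q in enumerate(questions, start=1)]

def test_number_questions():
    assert number_questions(["Define current.", "State Ohm's law."]) == [
        "1. Define current.",
        "2. State Ohm's law.",
    ]
```

Placed in the `tests` directory, functions named `test_*` like this are discovered and run automatically by `pytest`.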
- Generate clear options for multiple choice questions.
- Add subdivision support (a, b, c) for linked questions.
- Add support for supplying example questions.