What are the principles we can use to build LLM-powered software that is actually good enough to put in the hands of production customers?
[ICML'24 Spotlight] LLM Maybe LongLM: Self-Extend LLM Context Window Without Tuning
A discovery and compression tool for your Python codebase. Creates a knowledge graph for an LLM context window, efficiently outlining your project | Code structure visualization | LLM Context Window Efficiency | Static analysis for AI | Large Language Model tooling #LLM #AI #Python #CodeAnalysis #ContextWindow #DeveloperTools
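As a rough illustration of the kind of compact outline such a tool might produce, here is a minimal sketch using only Python's standard `ast` module; the `outline_module` helper and its output format are assumptions for illustration, not the tool's actual API.

```python
import ast
from pathlib import Path

def outline_module(path: Path) -> list[str]:
    """Return a compact outline (top-level classes/functions and method names)
    of one Python module, small enough to paste into an LLM context window."""
    tree = ast.parse(path.read_text(encoding="utf-8"))
    lines = [f"# {path}"]
    for node in tree.body:
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            lines.append(f"def {node.name}(...)")
        elif isinstance(node, ast.ClassDef):
            lines.append(f"class {node.name}:")
            for item in node.body:
                if isinstance(item, (ast.FunctionDef, ast.AsyncFunctionDef)):
                    lines.append(f"    def {item.name}(...)")
    return lines

if __name__ == "__main__":
    # Walk the project and print one outline per module.
    for py_file in sorted(Path(".").rglob("*.py")):
        print("\n".join(outline_module(py_file)))
```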
A lightweight tool to optimize your C# project for LLM context windows by using a knowledge graph | Code structure visualization | Static analysis for AI | Large Language Model tooling | .NET ecosystem support #LLM #AI #CSharp #DotNet #CodeAnalysis #ContextWindow #DeveloperTools
[ICLR 2025] Official code repository for "TULIP: Token-length Upgraded CLIP"
A discovery and compression tool for your Java codebase. Creates a knowledge graph for an LLM context window, efficiently outlining your project #LLM #AI #Java #CodeAnalysis #ContextWindow #DeveloperTools #StaticAnalysis #CodeVisualization
Building Agents with LLM structured generation (BAML), MCP Tools, and 12-Factor Agents principles
Information on LLMs: context window token limits, output token limits, pricing, and more.
A tool that analyzes your content to determine whether you need a RAG pipeline or whether modern language models can handle your text directly. It compares your content's token requirements against model context windows to help you make an informed architectural decision.
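The underlying comparison is simple to sketch. The snippet below is an assumption-laden illustration, not the tool itself: it uses a crude 4-characters-per-token estimate, a small hand-maintained table of context-window sizes, and a hypothetical `docs.txt` input.

```python
# Approximate context-window sizes in tokens; illustrative values, not exhaustive.
CONTEXT_WINDOWS = {
    "gpt-4o": 128_000,
    "claude-3-5-sonnet": 200_000,
    "gemini-1.5-pro": 1_000_000,
}

def estimate_tokens(text: str) -> int:
    # Crude heuristic: roughly 4 characters per token for English prose.
    return len(text) // 4

def needs_rag(text: str, model: str, headroom: float = 0.5) -> bool:
    """Suggest a RAG pipeline when the content would consume more than
    `headroom` of the model's context window, leaving room for the
    prompt and the model's response."""
    budget = int(CONTEXT_WINDOWS[model] * headroom)
    return estimate_tokens(text) > budget

if __name__ == "__main__":
    corpus = open("docs.txt", encoding="utf-8").read()  # hypothetical input file
    print("Use RAG" if needs_rag(corpus, "gpt-4o") else "Fit directly in context")
```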
A visualization website for comparing LLMs' long context comprehension based on the FictionLiveBench benchmark.
Tezeta is a Python package designed to optimize memory in chatbots and Language Model (LLM) requests using relevance-based vector embeddings. In essence, it supports conversations and text requests much longer than the context window would otherwise allow.
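A minimal sketch of the general idea behind relevance-based trimming (not Tezeta's actual API): embed each prior message, rank by cosine similarity to the current query, and keep the most relevant messages until a token budget is exhausted. The `embed` callable is a placeholder for whatever embedding model you use.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def select_relevant(history: list[str], query: str, embed, token_budget: int) -> list[str]:
    """Keep the most query-relevant messages that fit within `token_budget`.
    `embed` is any callable mapping text -> vector (placeholder here)."""
    q_vec = embed(query)
    ranked = sorted(history, key=lambda m: cosine(embed(m), q_vec), reverse=True)
    kept, used = [], 0
    for message in ranked:
        cost = len(message) // 4            # crude token estimate
        if used + cost > token_budget:
            continue
        kept.append(message)
        used += cost
    # Restore chronological order before building the prompt.
    return [m for m in history if m in kept]
```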
A collection of LLM utility functions useful in LLM application development.
A Python tool for combining text documents into consolidated files for Large Language Model processing. Creates organized document stacks with configurable sorting and formatting options.
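A hedged sketch of the document-stacking idea; the file pattern, section headers, and sort options below are illustrative assumptions, not the tool's actual configuration.

```python
from pathlib import Path

def stack_documents(src_dir: str, out_file: str, pattern: str = "*.md",
                    sort_key=lambda p: p.name) -> None:
    """Concatenate matching files into one consolidated file, each section
    prefixed with a header so an LLM can tell the source documents apart."""
    paths = sorted(Path(src_dir).glob(pattern), key=sort_key)
    with open(out_file, "w", encoding="utf-8") as out:
        for path in paths:
            out.write(f"\n===== {path.name} =====\n")  # assumed header convention
            out.write(path.read_text(encoding="utf-8"))
            out.write("\n")

if __name__ == "__main__":
    stack_documents("notes", "stack.txt")
```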
A language model that generates text based on a given prompt.
An autonomous context window management implementation.