Linguist Pro is a high-performance, containerized FastAPI service for precision language detection. It is specifically optimized to handle short sentences and single words where traditional statistical models often fail.
- Precision Detection: Powered by Meta's FastText (`lid.176.bin`) model for industry-standard accuracy.
- Short-Text Optimized: Handles single words and short phrases (like "Hola" or "Hi") with high confidence.
- Premium UI Playground: Built-in glassmorphism web interface with real-time animated confidence bars.
- Developer First: Fully documented REST API with Swagger (OpenAPI) support.
- Cloud Ready: Pre-configured Dockerfile with cold-start optimization.
- Python 3.11 or higher
- [Optional] Docker
1. Clone and set up:

   ```bash
   # Ensure you have a virtual environment
   python3 -m venv .venv
   source .venv/bin/activate  # On Windows use `.venv\Scripts\activate`

   # Install dependencies
   pip install -r requirements.txt
   ```
2. Run the server:

   ```bash
   uvicorn main:app --host 0.0.0.0 --port 8000 --reload
   ```
3. Access the Playground: Open http://localhost:8000 in your browser.
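Once the server is running, the endpoint can be exercised from any HTTP client; a minimal sketch using only Python's standard library (the URL and payload follow the API section below — adjust host/port if you changed them):

```python
import json
import urllib.request

def detect_language(text: str, base_url: str = "http://localhost:8000") -> dict:
    """POST text to /api/detect and return the parsed JSON response."""
    payload = json.dumps({"text": text}).encode("utf-8")
    req = urllib.request.Request(
        f"{base_url}/api/detect",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Usage (with the server from the quickstart running):
#   result = detect_language("Bonjour tout le monde")
#   print(result["primary"]["lang"], result["primary"]["confidence"])
```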
Linguist Pro is optimized for containerization. The build process pre-downloads the required 125MB AI model to ensure fast startup.
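The repository ships its own Dockerfile; the sketch below only illustrates the pre-download idea, using Docker's `ADD` to bake the model into the image at build time (the `lid.176.bin` URL is the official FastText download location; the file layout and `CMD` are assumptions):

```dockerfile
FROM python:3.11-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Bake the ~125MB model into the image so the container never
# downloads it at startup (the cold-start optimization)
ADD https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.bin /app/lid.176.bin

COPY . .
EXPOSE 8000
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000"]
```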
```bash
# Build the image
docker build -t linguist-pro .

# Run the container
docker run -p 8000:8000 linguist-pro
```

`POST /api/detect`
Request:

```json
{
  "text": "Bonjour tout le monde"
}
```

Response:
```json
{
  "primary": {
    "lang": "fr",
    "name": "French",
    "confidence": 0.9946
  },
  "all_matches": [
    {
      "lang": "fr",
      "name": "French",
      "confidence": 0.9946
    }
  ],
  "error": null
}
```

- Backend: FastAPI (Python)
- Engine: Fast-Langdetect & Meta's FastText
- Frontend: Vanilla HTML5/CSS3 (Glassmorphism design)
- DevOps: Docker
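For reference, a few lines of standard-library Python are enough to pick the top match out of a response shaped like the sample above (values hard-coded here so the snippet stands alone):

```python
import json

# Same shape as the sample /api/detect response above
sample = """{
  "primary": {"lang": "fr", "name": "French", "confidence": 0.9946},
  "all_matches": [{"lang": "fr", "name": "French", "confidence": 0.9946}],
  "error": null
}"""

data = json.loads(sample)
if data["error"] is None:
    # Take the highest-confidence entry from all_matches
    best = max(data["all_matches"], key=lambda m: m["confidence"])
    print(f"{best['name']} ({best['lang']}): {best['confidence']:.2%}")  # French (fr): 99.46%
```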
This project is open-source and available under the MIT License.