A REST API for real-time sentiment classification built with FastAPI.
The API accepts text input and returns a predicted sentiment label (positive, negative, or neutral) with a confidence score.
The project demonstrates how machine learning models can be deployed as scalable APIs.
Client Request -> FastAPI Endpoint -> Sentiment Model -> Prediction Output -> JSON Response
- FastAPI for the REST API
- Hugging Face `transformers` for inference
- PyTorch as the model backend
- Pydantic for request and response validation
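Pydantic validation might look like the sketch below. The field names match the example payloads later in this README, but the `min_length` constraint is an assumption, not necessarily the project's actual schema.

```python
from pydantic import BaseModel, Field

class SentimentRequest(BaseModel):
    # Reject empty input at the validation layer (hypothetical constraint).
    text: str = Field(..., min_length=1)

class SentimentResponse(BaseModel):
    text: str
    sentiment: str
    confidence: float
```

With models like these, FastAPI rejects invalid bodies with a 422 response before the model is ever invoked.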
The API loads cardiffnlp/twitter-roberta-base-sentiment-latest at startup. That model supports three sentiment classes, which makes it a better fit than the default binary sentiment pipeline.
You can override the model with the SENTIMENT_MODEL_NAME environment variable.
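The override described above can be sketched as follows; `resolve_model_name` is a hypothetical helper name, but the environment variable and default model match this README.

```python
import os

# Default matches the model loaded at startup; override by setting
# SENTIMENT_MODEL_NAME before launching the server.
DEFAULT_MODEL = "cardiffnlp/twitter-roberta-base-sentiment-latest"

def resolve_model_name() -> str:
    """Pick the Hugging Face model to load at startup."""
    return os.getenv("SENTIMENT_MODEL_NAME", DEFAULT_MODEL)

# At startup the app would then do something like:
#   transformers.pipeline("sentiment-analysis", model=resolve_model_name())
```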
- Move into the project folder:

```bash
cd sentiment-analysis-fast-api
```

- Install dependencies:

```bash
pip install -r requirements.txt
```

- Start the API server:

```bash
python -m uvicorn app.main:app --reload
```

- Open the API in your browser:
  - http://127.0.0.1:8000/
  - http://127.0.0.1:8000/health
  - http://127.0.0.1:8000/docs
The first startup can take longer because the model may need to be downloaded from Hugging Face.
Open:
http://127.0.0.1:8000/docs
Then:
- Expand `POST /predict`
- Click `Try it out`
- Paste a request body like this:

```json
{
  "text": "I love this movie so much"
}
```

- Click `Execute`
PowerShell:

```powershell
Invoke-RestMethod `
  -Method Post `
  -Uri "http://127.0.0.1:8000/predict" `
  -ContentType "application/json" `
  -Body '{"text":"This project is amazing"}'
```

curl:

```bash
curl -X POST "http://127.0.0.1:8000/predict" \
  -H "Content-Type: application/json" \
  -d "{\"text\":\"This project is amazing\"}"
```

Run the API tests with:

```bash
python -m pytest -q
```

Expected result:

```
5 passed
```
Returns a basic service message.
Example:
```json
{
  "message": "Sentiment Analysis API is running"
}
```

Returns the service health and whether the model is loaded.
Example response:
```json
{
  "status": "ok",
  "model_loaded": true,
  "model_name": "cardiffnlp/twitter-roberta-base-sentiment-latest"
}
```

Request body:
```json
{
  "text": "This movie was surprisingly good"
}
```

Example response:
```json
{
  "text": "This movie was surprisingly good",
  "sentiment": "positive",
  "confidence": 0.997
}
```

Another example:
Request:
```json
{
  "text": "I hate this class"
}
```

Possible response:
```json
{
  "text": "I hate this class",
  "sentiment": "negative",
  "confidence": 0.98
}
```

- The model is loaded once during application startup and reused for all requests.
- The first startup may take longer because the model has to be downloaded from Hugging Face.