Transforming tender data into actionable business intelligence! This AWS Lambda function serves as the analytical brain behind the Tender Tool's dynamic dashboard, delivering real-time insights and personalized analytics that power data-driven procurement decisions across South Africa's business landscape.
- Overview
- Features
- Architecture & Workflow
- Setup & Deployment
- Configuration
- Usage
- Troubleshooting
- API Response Examples
Welcome to the data science command center! This serverless analytics engine transforms raw tender information from our comprehensive database into intelligent, actionable insights. Whether you're a public user exploring market trends or a power user managing complex watchlists, this function delivers personalized analytics that drive smart procurement decisions!

What makes it analytically awesome?

- Multi-Persona Intelligence: Adapts analytics based on user roles and permissions
- Real-Time Processing: Live calculations from massive tender databases
- API Integration: Seamlessly combines database analytics with external user services
- Smart Fallbacks: Graceful degradation ensures users always get valuable insights
Our analytics engine delivers three distinct intelligence levels, each tailored to specific user needs:
Perfect for market researchers and newcomers exploring the procurement landscape (a query sketch follows this list):

- Total Tender Count: Real-time calculation of all opportunities in the system
- Live Opportunity Tracker: Current count of open tenders ready for bidding
- Historical Analysis: Count of closed tenders for trend analysis
- Market Ratio Intelligence: Open-to-closed ratio for market health assessment
- Status Breakdown: Visual representation of opportunity distribution
- Geographic Intelligence: Provincial breakdown across South Africa's procurement landscape
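The heavy lifting for these metrics happens in `db_handler.py`. As a rough illustration of how they could be derived with `pymssql`, here is a minimal sketch: only the table names (`dbo.BaseTender`, `dbo.TenderStatus`, `dbo.Province`) appear elsewhere in this README, while the column and join-key names are assumptions you would replace with the real schema.

```python
def build_public_analytics(conn):
    """Derive the public dashboard metrics from the tender tables.

    conn is an open pymssql connection. Column names (StatusName, ProvinceName,
    the *Id join keys) are illustrative assumptions, not the confirmed schema.
    """
    cur = conn.cursor(as_dict=True)

    # Status breakdown: one row per tender status with its count
    cur.execute("""
        SELECT s.StatusName AS label, COUNT(*) AS total
        FROM dbo.BaseTender t
        JOIN dbo.TenderStatus s ON s.TenderStatusId = t.TenderStatusId
        GROUP BY s.StatusName
    """)
    status_breakdown = {row["label"]: row["total"] for row in cur}

    # Provincial breakdown for the geographic view
    cur.execute("""
        SELECT p.ProvinceName AS label, COUNT(*) AS total
        FROM dbo.BaseTender t
        JOIN dbo.Province p ON p.ProvinceId = t.ProvinceId
        GROUP BY p.ProvinceName
    """)
    province_breakdown = {row["label"]: row["total"] for row in cur}

    open_count = status_breakdown.get("Open", 0)
    closed_count = status_breakdown.get("Closed", 0)
    return {
        "totalTenders": sum(status_breakdown.values()),
        "openTenders": open_count,
        "closedTenders": closed_count,
        # Guard against division by zero when nothing has closed yet
        "openToClosedRatio": round(open_count / closed_count, 3) if closed_count else None,
        "statusBreakdown": status_breakdown,
        "provinceBreakdown": province_breakdown,
    }
```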
Enhanced insights for registered users managing their procurement portfolios (see the watchlist sketch after this list):

- All Public Analytics (comprehensive market view)
- Personal Watchlist Intelligence:
  - Total tenders under surveillance
  - Open opportunities in your portfolio
  - Personal opportunity ratio analysis
  - Deadline proximity alerts (closing soon vs. future opportunities)
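To show how the watchlist metrics in the sample response later in this README could be computed, here is a minimal sketch. The field names (`status`, `closingDate`) and the seven-day "closing soon" window are assumptions, since the watchlist API's response shape is not documented here.

```python
from datetime import datetime, timedelta

CLOSING_SOON_WINDOW = timedelta(days=7)  # assumed threshold for "closing soon"


def build_watchlist_analytics(watched_tenders):
    """Summarise a user's watched tenders into the dashboard metrics.

    Expects a list of dicts from the watchlist API; "status" and "closingDate"
    (naive ISO-8601 string) are assumed field names.
    """
    now = datetime.utcnow()
    open_watched = [t for t in watched_tenders if t.get("status") == "Open"]
    closing_soon = [
        t for t in open_watched
        if datetime.fromisoformat(t["closingDate"]) - now <= CLOSING_SOON_WINDOW
    ]
    total = len(watched_tenders)
    return {
        "totalWatchedTenders": total,
        "openWatchedTenders": len(open_watched),
        "watchedOpenRatio": round(len(open_watched) / total, 3) if total else 0,
        "tendersClosingSoon": len(closing_soon),
        "tendersClosingLater": len(open_watched) - len(closing_soon),
    }
```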
Executive-level intelligence for platform administrators and power users (a source-rollup sketch follows this list):

- All Public Analytics (complete market overview)
- Source Distribution Analysis: Tender counts by provider (Eskom, Transnet, SANRAL, SARS, eTenders)
- Platform Administration Intelligence:
  - Total platform user count
  - Standard user demographics
  - Super user population
  - Monthly registration growth metrics
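The per-source rollup is another straightforward aggregate; a sketch with assumed column names follows (only `dbo.BaseTender` and `dbo.TenderSource` are named elsewhere in this README). The platform user counts, by contrast, come from the external user-management API rather than the tender database.

```python
def build_source_distribution(conn):
    """Count tenders per provider (Eskom, Transnet, SANRAL, SARS, eTenders).

    conn is an open pymssql connection; SourceName / TenderSourceId are assumed
    column names for illustration only.
    """
    cur = conn.cursor(as_dict=True)
    cur.execute("""
        SELECT s.SourceName AS source, COUNT(*) AS total
        FROM dbo.BaseTender t
        JOIN dbo.TenderSource s ON s.TenderSourceId = t.TenderSourceId
        GROUP BY s.SourceName
    """)
    return {row["source"]: row["total"] for row in cur}
```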
Our analytics engine follows an intelligent, adaptive processing flow:
- API Gateway Trigger: HTTP requests hit our `/analytics` endpoint with lightning-fast response times
- User Intelligence Detection: Smart header analysis to determine user context and permissions level
- Adaptive Processing Logic (a handler sketch follows this list):

  ```
  Request Analysis
  ├── No User ID → Public Analytics Pipeline
  └── User ID Present
      ├── Super User Check     → External API Call
      ├── Super User Confirmed → Full Intelligence Suite
      ├── Standard User        → Watchlist Intelligence
      └── Fallback             → Public Analytics
  → Response Generation
  ```
- Database Intelligence: Lightning-fast SQL queries against our comprehensive RDS database using optimized `pymssql` connections (a connection sketch also follows this list)
- External API Orchestration: Seamless integration with user management and watchlist services for personalized insights
- Response Optimization: JSON payload optimization for maximum dashboard performance
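Putting that flow together, here is a condensed, hypothetical sketch of the routing in `lambda_function.py`. The header casing, the `isSuperUser` flag, and the external API response shapes are assumptions; `build_public_analytics`, `build_source_distribution`, and `build_watchlist_analytics` refer to the query sketches in the Features section, and `get_connection` is sketched just below.

```python
import json
import os

import requests  # provided by the analytics layer


def lambda_handler(event, context):
    """Route each /analytics request to the right pipeline for the caller."""
    headers = event.get("headers") or {}
    user_id = headers.get("X-User-ID") or headers.get("x-user-id")

    conn = get_connection()                 # db_handler helper, sketched below
    body = build_public_analytics(conn)     # everyone gets the market baseline

    if user_id:
        try:
            user = fetch_user(user_id)
            if user.get("isSuperUser"):      # assumed flag name in the user API
                # ...plus platform user counts from the user API (omitted here)
                body["sourceBreakdown"] = build_source_distribution(conn)
            else:
                body["standardUserAnalytics"] = build_watchlist_analytics(
                    fetch_watchlist(user_id)
                )
        except (requests.RequestException, KeyError, ValueError):
            # Smart fallback: trouble with the external services degrades
            # gracefully to public analytics instead of failing the request
            pass

    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps(body),
    }


def fetch_user(user_id):
    """Look the caller up via USER_FETCH_API_URL; '{}' carries the user ID."""
    resp = requests.get(os.environ["USER_FETCH_API_URL"].format(user_id), timeout=5)
    resp.raise_for_status()
    return resp.json()


def fetch_watchlist(user_id):
    """Fetch the caller's watched tenders via WATCHLIST_API_URL."""
    resp = requests.get(os.environ["WATCHLIST_API_URL"].format(user_id), timeout=5)
    resp.raise_for_status()
    return resp.json()
```

The database side of that flow only needs a small `pymssql` helper in `db_handler.py`, driven by the environment variables configured in the next section (the function name and timeout values here are illustrative):

```python
import os

import pymssql  # shipped in the analytics layer


def get_connection():
    """Open a SQL Server connection using the Lambda's environment variables."""
    return pymssql.connect(
        server=os.environ["DB_ENDPOINT"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        database=os.environ["DB_NAME"],
        login_timeout=10,  # fail fast so the 60 s function timeout is not consumed
        timeout=30,        # per-query timeout
    )
```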
This section covers three deployment methods for the Analytics Query Handler Lambda Function. Choose the method that best fits your workflow and infrastructure preferences.
Before deploying, ensure you have:
- AWS CLI configured with appropriate credentials
- AWS SAM CLI installed (`pip install aws-sam-cli`)
- Python 3.9 runtime support in your target region
- Access to AWS Lambda, RDS, API Gateway, and CloudWatch Logs services
- Analytics layer dependencies for database connectivity
- VPC configuration for RDS access (if applicable)
Deploy directly through your IDE using the AWS Toolkit extension.
- Install AWS Toolkit in your IDE (VS Code, IntelliJ, etc.)
- Configure AWS Profile with your credentials
- Open Project containing `lambda_function.py` and `db_handler.py`
- Right-click on `lambda_function.py` in your IDE
- Select "Deploy Lambda Function" from AWS Toolkit menu
- Configure Deployment:
  - Function Name: `AnalyticsQueryHandler`
  - Runtime: `python3.9`
  - Handler: `lambda_function.lambda_handler`
  - Memory: `128 MB`
  - Timeout: `60 seconds`
- Add Layers manually after deployment:
  - analytics-layer (for database connectivity)
- Set Environment Variables:

  ```
  DB_ENDPOINT=tender-tool-db.c2hq4seoidxc.us-east-1.rds.amazonaws.com
  DB_NAME=tendertool_db
  DB_USER=AnalyticsAppUser
  DB_PASSWORD=T3nder$Tool_DB_2025!
  USER_FETCH_API_URL=https://api.example.com/dev/tenderuser/fetch/{}
  WATCHLIST_API_URL=https://api.example.com/dev/watchlist/{}
  ```

- Configure IAM Permissions for RDS, VPC, and CloudWatch Logs
- Set up API Gateway manually and connect to the Lambda function
- Test the function using the AWS Toolkit test feature
- Monitor logs through CloudWatch integration
- Verify database connectivity and API Gateway integration
- Test analytics endpoints with different user roles
Use AWS SAM for infrastructure-as-code deployment with the provided template.
```bash
# Install AWS SAM CLI
pip install aws-sam-cli

# Verify installation
sam --version
```

Since the template references an analytics layer not included in the repository, create it:

```bash
# Create analytics layer directory
mkdir -p analytics-layer/python

# Install required database and HTTP connectivity packages
pip install pymssql -t analytics-layer/python/
pip install sqlalchemy -t analytics-layer/python/
pip install requests -t analytics-layer/python/
pip install urllib3 -t analytics-layer/python/
```

```bash
# Build the SAM application
sam build

# Deploy with guided configuration (first time)
sam deploy --guided

# Follow the prompts:
# Stack Name: analytics-query-handler-stack
# AWS Region: us-east-1 (or your preferred region)
# Confirm changes before deploy: Y
# Allow SAM to create IAM roles: Y
# Save parameters to samconfig.toml: Y
```

The template already includes the required database environment variables:
```yaml
# Already configured in template.yml
Environment:
  Variables:
    DB_ENDPOINT: tender-tool-db.c2hq4seoidxc.us-east-1.rds.amazonaws.com
    DB_NAME: tendertool_db
    DB_PASSWORD: T3nder$Tool_DB_2025!
    DB_USER: AnalyticsAppUser
```

```bash
# Add external API URLs after initial deployment
aws lambda update-function-configuration \
  --function-name AnalyticsQueryHandler \
  --environment Variables='{
    "DB_ENDPOINT":"tender-tool-db.c2hq4seoidxc.us-east-1.rds.amazonaws.com",
    "DB_NAME":"tendertool_db",
    "DB_USER":"AnalyticsAppUser",
    "DB_PASSWORD":"T3nder$Tool_DB_2025!",
    "USER_FETCH_API_URL":"https://api.example.com/dev/tenderuser/fetch/{}",
    "WATCHLIST_API_URL":"https://api.example.com/dev/watchlist/{}"
  }'
```

```bash
# Quick deployment after initial setup
sam build && sam deploy
```

```bash
# Test function locally with API Gateway simulation
sam local start-api

# Test specific analytics endpoint
curl http://localhost:3000/analytics

# Test with user headers
curl -H "X-User-ID: user-12345" http://localhost:3000/analytics
```

- Complete infrastructure management
- Automatic layer creation and management
- API Gateway integration included
- Environment variables defined in template
- IAM permissions and VPC configuration
- Easy rollback capabilities
- CloudFormation integration
Automated deployment using GitHub Actions workflow for production environments.
- GitHub Repository Secrets:
  - `AWS_ACCESS_KEY_ID`: Your AWS access key
  - `AWS_SECRET_ACCESS_KEY`: Your AWS secret key
  - `AWS_REGION`: us-east-1 (or your target region)
- Pre-existing Lambda Function: The workflow updates an existing function, so deploy initially using Method 1 or 2.
- Create Release Branch:

  ```bash
  # Create and switch to release branch
  git checkout -b release

  # Make your changes to lambda_function.py or db_handler.py

  # Commit changes
  git add .
  git commit -m "feat: update analytics query processing logic"

  # Push to trigger deployment
  git push origin release
  ```
- Automatic Deployment: The workflow will:
  - Checkout the code
  - Configure AWS credentials
  - Create deployment zip with `lambda_function.py` and `db_handler.py`
  - Update the existing Lambda function code
  - Maintain existing configuration (layers, environment variables, API Gateway, etc.)
You can also trigger deployment manually:
- Go to Actions tab in your GitHub repository
- Select "Deploy Python Lambda to AWS" workflow
- Click "Run workflow"
- Choose the `release` branch
- Click "Run workflow" button
- Automated CI/CD pipeline
- Consistent deployment process
- Audit trail of deployments
- Easy rollback to previous commits
- No local environment dependencies
Regardless of deployment method, verify the following:
Ensure these environment variables are properly set:
```bash
# Verify environment variables via AWS CLI
aws lambda get-function-configuration \
  --function-name AnalyticsQueryHandler \
  --query 'Environment.Variables'
```

Expected output:

```json
{
  "DB_ENDPOINT": "tender-tool-db.c2hq4seoidxc.us-east-1.rds.amazonaws.com",
  "DB_NAME": "tendertool_db",
  "DB_USER": "AnalyticsAppUser",
  "DB_PASSWORD": "T3nder$Tool_DB_2025!",
  "USER_FETCH_API_URL": "https://api.example.com/dev/tenderuser/fetch/{}",
  "WATCHLIST_API_URL": "https://api.example.com/dev/watchlist/{}"
}
```

Ensure the analytics database user exists and has proper permissions:
```sql
-- Connect to your SQL Server RDS instance
-- Create the analytics user if not exists
CREATE LOGIN AnalyticsAppUser WITH PASSWORD = 'T3nder$Tool_DB_2025!';

USE tendertool_db;
CREATE USER AnalyticsAppUser FOR LOGIN AnalyticsAppUser;

-- Grant required permissions for analytics queries
GRANT SELECT ON dbo.BaseTender TO AnalyticsAppUser;
GRANT SELECT ON dbo.TenderSource TO AnalyticsAppUser;
GRANT SELECT ON dbo.Province TO AnalyticsAppUser;
GRANT SELECT ON dbo.TenderStatus TO AnalyticsAppUser;
-- Add other necessary table permissions as needed
```

Check that API Gateway is properly configured:
```bash
# List API Gateway APIs
aws apigatewayv2 get-apis

# Get specific API configuration
aws apigatewayv2 get-api --api-id [your-api-id]

# Test the analytics endpoint
curl https://[api-id].execute-api.[region].amazonaws.com/analytics
```

After deployment, test the function thoroughly:
```bash
# Test public analytics (no headers)
curl https://[api-id].execute-api.[region].amazonaws.com/analytics

# Test standard user analytics
curl -H "X-User-ID: user-12345" \
  https://[api-id].execute-api.[region].amazonaws.com/analytics

# Test super user analytics
curl -H "X-User-ID: superuser-67890" \
  https://[api-id].execute-api.[region].amazonaws.com/analytics

# Test direct Lambda invocation
aws lambda invoke \
  --function-name AnalyticsQueryHandler \
  --payload '{"httpMethod":"GET","path":"/analytics","headers":{}}' \
  response.json
```

Expected response:

```json
{
  "statusCode": 200,
  "headers": {
    "Content-Type": "application/json",
    "Access-Control-Allow-Origin": "*"
  },
  "body": "{\"totalTenders\":15847,\"openTenders\":342,\"closedTenders\":15505,\"openToClosedRatio\":0.022}"
}
```

- Function executes without errors
- CloudWatch logs show successful database connections
- API Gateway returns 200 status codes
- Analytics data is properly formatted JSON
- Different user types receive appropriate data levels
- External API integrations work (when configured)
- Duration: Function execution time for analytics queries
- Error Rate: Failed analytics requests
- Memory Utilization: RAM usage during complex queries
- API Gateway Metrics: Request counts and latency
- Database Connection Health: RDS connection metrics
```bash
# View recent logs
aws logs tail /aws/lambda/AnalyticsQueryHandler --follow

# Search for successful analytics queries
aws logs filter-log-events \
  --log-group-name /aws/lambda/AnalyticsQueryHandler \
  --filter-pattern "Analytics query completed"

# Search for database connection issues
aws logs filter-log-events \
  --log-group-name /aws/lambda/AnalyticsQueryHandler \
  --filter-pattern "Database connection"

# Monitor API Gateway access logs
aws logs filter-log-events \
  --log-group-name /aws/apigateway/[api-id] \
  --filter-pattern "/analytics"
```

Analytics Layer Dependencies Missing
Issue: Database connectivity or HTTP request packages not available
Solution: Ensure analytics layer is properly created and attached:
```bash
# For SAM: Verify layer directory exists and contains packages
ls -la analytics-layer/python/
ls -la analytics-layer/python/pymssql/
ls -la analytics-layer/python/requests/

# For manual deployment: Create and upload layer separately
```

Database Connection Failures
Issue: Cannot connect to RDS SQL Server for analytics
Solution: Verify database configuration and network access (a quick connectivity check is sketched after this list):
- Check DB_ENDPOINT points to correct RDS instance
- Verify AnalyticsAppUser exists and has correct password
- Ensure Lambda is in same VPC as RDS or configure VPC peering
- Check RDS security groups allow Lambda subnet access
- Verify database is accessible and not in maintenance mode
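A quick way to separate networking problems from credential problems is a plain TCP check against the RDS endpoint on the SQL Server port; run something like the sketch below from inside the Lambda (or any host in the same VPC). The helper name and timeout are illustrative.

```python
import os
import socket


def check_db_reachable(port=1433, timeout=5):
    """Confirm the RDS endpoint accepts TCP connections on the SQL Server port."""
    host = os.environ["DB_ENDPOINT"]
    try:
        with socket.create_connection((host, port), timeout=timeout):
            print(f"{host}:{port} is reachable")
            return True
    except OSError as exc:
        print(f"Cannot reach {host}:{port} - {exc}")
        return False
```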
API Gateway Integration Issues
Issue: API Gateway not properly connected to Lambda
Solution: Verify API Gateway configuration:
- Check API Gateway has correct Lambda integration
- Verify Lambda permissions allow API Gateway invocation
- Test API Gateway deployment and stage configuration
- Check CORS settings if accessing from web applications
VPC and Security Group Configuration
Issue: Lambda cannot access RDS due to VPC restrictions
Solution: Configure VPC properly:
- Ensure Lambda and RDS are in same VPC
- Configure security group rules for database port (1433 for SQL Server)
- Verify subnet routing and NAT gateway configuration
- Check network ACLs allow database traffic
External API Integration Failures
Issue: User fetch or watchlist APIs not responding
Solution: Implement robust error handling (a smoke-test sketch follows this list):
- Verify external API endpoints are accessible
- Check authentication tokens and API keys
- Implement graceful fallback to public analytics
- Monitor external API rate limits and quotas
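A small smoke test like the sketch below can confirm whether the two external endpoints respond at all; the sample user ID is a placeholder, and any authentication headers the real APIs require are not shown here.

```python
import os

import requests


def check_external_apis(sample_user_id="user-12345"):
    """Print the HTTP status of each external integration for a sample lookup."""
    for var in ("USER_FETCH_API_URL", "WATCHLIST_API_URL"):
        url = os.environ[var].format(sample_user_id)
        try:
            resp = requests.get(url, timeout=5)
            print(f"{var}: HTTP {resp.status_code}")
        except requests.RequestException as exc:
            print(f"{var}: unreachable ({exc})")
```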
Environment Variables Not Set
Issue: Missing database or API configuration
Solution: Set environment variables using AWS CLI:
```bash
aws lambda update-function-configuration \
  --function-name AnalyticsQueryHandler \
  --environment Variables='{
    "DB_ENDPOINT":"tender-tool-db.c2hq4seoidxc.us-east-1.rds.amazonaws.com",
    "DB_NAME":"tendertool_db",
    "DB_USER":"AnalyticsAppUser",
    "DB_PASSWORD":"T3nder$Tool_DB_2025!",
    "USER_FETCH_API_URL":"https://api.example.com/dev/tenderuser/fetch/{}",
    "WATCHLIST_API_URL":"https://api.example.com/dev/watchlist/{}"
  }'
```

Workflow Deployment Fails
Issue: GitHub Actions workflow errors
Solution:
- Check repository secrets are correctly configured
- Verify the target Lambda function exists in AWS
- Ensure workflow has correct function ARN
- Check that both lambda_function.py and db_handler.py exist in repository
Choose the deployment method that best fits your development workflow and infrastructure requirements. SAM deployment is recommended for development environments, while workflow deployment excels for production analytics services requiring high availability and consistent updates.
```bash
# Public analytics
curl https://your-api-id.execute-api.region.amazonaws.com/analytics

# Standard user analytics
curl -H "X-User-ID: user-12345" \
  https://your-api-id.execute-api.region.amazonaws.com/analytics

# Super user analytics
curl -H "X-User-ID: superuser-67890" \
  https://your-api-id.execute-api.region.amazonaws.com/analytics
```

Database Connection Timeouts
Issue: Lambda timing out on database connections.
Solution: Ensure your Lambda is in the same VPC as your RDS instance, or configure appropriate security groups. Database analytics requires reliable connectivity!
External API Integration Failures
Issue: User fetch or watchlist APIs returning errors.
Solution: Implement robust fallback logic - users should always receive at least public analytics. Check API endpoints and authentication tokens!
Performance Optimization
Issue: Slow response times for complex analytics queries.
Solution: Optimize your SQL queries, consider database indexing, and implement connection pooling (a connection-reuse sketch follows). Analytics should be lightning-fast!
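Lambda has no built-in database pool, but caching one `pymssql` connection at module level gives warm invocations a similar benefit. A minimal sketch of the pattern, with an illustrative helper name and timeout:

```python
import os

import pymssql

# A module-level connection survives between invocations of a warm Lambda container
_CONNECTION = None


def get_reusable_connection():
    """Return a cached pymssql connection, reconnecting if the cached one went stale."""
    global _CONNECTION
    if _CONNECTION is not None:
        try:
            _CONNECTION.cursor().execute("SELECT 1")  # cheap liveness check
            return _CONNECTION
        except pymssql.Error:
            _CONNECTION = None  # stale or dropped connection; rebuild below
    _CONNECTION = pymssql.connect(
        server=os.environ["DB_ENDPOINT"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],
        database=os.environ["DB_NAME"],
        login_timeout=10,
    )
    return _CONNECTION
```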
Layer Compatibility Issues
Issue: pymssql layer not working with Lambda runtime.
Solution: Ensure your layer was built on Linux x86_64 architecture matching your Lambda runtime. Use Docker for consistent builds!
Public analytics response:

```json
{
  "statusCode": 200,
  "body": {
    "totalTenders": 15847,
    "openTenders": 342,
    "closedTenders": 15505,
    "openToClosedRatio": 0.022,
    "statusBreakdown": {
      "Open": 342,
      "Closed": 15505
    },
    "provinceBreakdown": {
      "Gauteng": 4521,
      "Western Cape": 3102,
      "KwaZulu-Natal": 2876
    }
  }
}
```

Standard user analytics response:

```json
{
  "statusCode": 200,
  "body": {
    "totalTenders": 15847,
    "openTenders": 342,
    "standardUserAnalytics": {
      "totalWatchedTenders": 23,
      "openWatchedTenders": 8,
      "watchedOpenRatio": 0.348,
      "tendersClosingSoon": 3,
      "tendersClosingLater": 5
    }
  }
}
```

Built with love, bread, and code by Bread Corporation