
Feature/azure-sql-integration: Add Azure SQL integration support (#2)

Merged: mandeeps merged 6 commits into main from feature/azure-sql-integration on Mar 8, 2026

Conversation

SYCqi (Collaborator) commented Mar 7, 2026

This pull request adds Azure SQL support to the API.

Changes:

  • Added pyodbc dependency
  • Installed Microsoft ODBC Driver 18 in Dockerfile
  • Updated main.py to support SQL backend
  • Kept existing local file storage logic

This allows switching between local JSONL storage and Azure SQL.

SYCqi added 3 commits on March 7, 2026 at 14:25:

  • Add Azure SQL integration to save occupancy data
  • Add pyodbc dependency for Azure SQL integration
  • Add Azure SQL support and ODBC driver installation
    - Installed Microsoft ODBC Driver 18
    - Added unixodbc dependencies
    - Combined RUN commands for proper Docker build layering
    - Prepared backend for Azure SQL integration

Copilot AI left a comment


Pull request overview

Adds Azure SQL persistence support to the Occupancy API so occupancy results can be stored in an Azure SQL table in addition to the existing local/Blob storage paths.

Changes:

  • Add pyodbc dependency for ODBC connectivity.
  • Add Docker image steps to install Microsoft ODBC Driver 18 (and unixODBC deps).
  • Add SQL connection + insert logic in main.py and call it during /api/thermal ingestion.

Reviewed changes

Copilot reviewed 2 out of 3 changed files in this pull request and generated 7 comments.

  • requirements.txt: Adds the pyodbc dependency needed for SQL connectivity.
  • main.py: Introduces the SQL connection helper and the occupancy_data insert on each ingest.
  • Dockerfile: Installs unixODBC and Microsoft ODBC Driver 18 to support pyodbc in-container.


Comment on lines +322 to +328

```python
def _get_sql_connection():
    """Create Azure SQL connection. Returns None if not configured."""
    if not SQL_CONNECTION_STRING:
        return None
    try:
        return pyodbc.connect(SQL_CONNECTION_STRING, timeout=10)
    except Exception as e:
```

Copilot AI Mar 8, 2026


A new ODBC connection is created on every call, which can become a significant overhead under load. Consider reusing a cached connection (similar to the blob client pattern), enabling/confirming pyodbc pooling explicitly, and adding a simple backoff/"disable after failure" flag to avoid repeated connection attempts on persistent misconfiguration/outage.

Comment on lines +362 to +386
cursor = conn.cursor()
cursor.execute(
"""
INSERT INTO occupancy_data
([timestamp], sensor_id, occupancy, room_temperature,
people_clusters, fever_count, any_fever)
VALUES (?, ?, ?, ?, ?, ?, ?)
""",
entry["timestamp"],
entry["sensor_id"],
entry["occupancy"],
entry["room_temperature"],
entry["people_clusters"],
entry["fever_count"],
1 if entry["any_fever"] else 0,
)
conn.commit()
cursor.close()
conn.close()
except Exception as e:
print(f"Error saving occupancy data to Azure SQL: {e}")
try:
conn.close()
except Exception:
pass

Copilot AI Mar 8, 2026


If an exception occurs after creating the cursor, the cursor is never explicitly closed (only the connection is closed in the except). Use a finally block (or context managers) to guarantee both cursor and connection are closed, and consider calling rollback() on failures before closing to avoid leaving open transactions in pooled connections.

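The finally-plus-rollback structure the reviewer suggests could look like the sketch below. It is illustrative only: `sqlite3` stands in for `pyodbc` (both follow DB-API 2.0, so the cleanup pattern is identical), and the two-column schema and conn-as-parameter signature are simplifications, not the PR's actual code.

```python
import sqlite3  # stand-in for pyodbc in this sketch


def save_occupancy_data_sql(entry, conn):
    """Insert one occupancy row, guaranteeing cursor cleanup and rollback."""
    cursor = None
    try:
        cursor = conn.cursor()
        cursor.execute(
            "INSERT INTO occupancy_data (sensor_id, occupancy) VALUES (?, ?)",
            (entry["sensor_id"], entry["occupancy"]),
        )
        conn.commit()
    except Exception as e:
        print(f"Error saving occupancy data to SQL: {e}")
        try:
            conn.rollback()  # don't leave an open transaction in a pooled connection
        except Exception:
            pass
    finally:
        if cursor is not None:
            cursor.close()  # closed even when execute/commit raises
```

With pyodbc specifically, `with conn.cursor() as cursor:` context managers can replace the explicit `finally` block and achieve the same guarantee.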
Comment on lines 614 to +616

```python
save_thermal_data(compact_data, latest_thermal_data, sensor_id)
save_occupancy_data(occupancy_result)
save_occupancy_data_sql(occupancy_result, timestamp_iso=now_iso)
```

Copilot AI Mar 8, 2026


PR description mentions "switching" between local JSONL storage and Azure SQL, but the API still always writes local occupancy history when SAVE_THERMAL_DATA is true, and all history/stats endpoints still read only from local files. If Azure SQL is intended to be the backend, either add SQL-backed read paths (or a clear fallback), or update the description/env flags to reflect that SQL is an additional sink rather than a full backend.

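A SQL-backed read path with a local fallback, as the reviewer proposes, might be sketched as below. Everything here is hypothetical: the function name, the `occupancy_history.jsonl` file name, the two-column schema, and the sqlite-style `rowid` ordering are assumptions for illustration, not the repository's actual code.

```python
import json
from pathlib import Path


def load_occupancy_history(conn=None, jsonl_path="occupancy_history.jsonl", limit=100):
    """Return recent occupancy entries, preferring SQL over local JSONL."""
    if conn is not None:
        cursor = conn.cursor()
        try:
            cursor.execute(
                "SELECT sensor_id, occupancy FROM occupancy_data "
                "ORDER BY rowid DESC LIMIT ?",
                (limit,),
            )
            return [
                {"sensor_id": sid, "occupancy": occ}
                for sid, occ in cursor.fetchall()
            ]
        finally:
            cursor.close()
    # Fallback: read the tail of the local JSONL history file.
    path = Path(jsonl_path)
    if not path.exists():
        return []
    lines = path.read_text().splitlines()[-limit:]
    return [json.loads(line) for line in lines if line.strip()]
```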
Dockerfile (outdated)
Comment on lines +8 to +11

```dockerfile
RUN apt-get update && apt-get install -y curl gnupg unixodbc unixodbc-dev && \
    curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && \
    curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list && \
    apt-get update && ACCEPT_EULA=Y apt-get install -y msodbcsql18 && \
```

Copilot AI Mar 8, 2026


Using apt-key add - is deprecated on modern Debian and may break in future base images. Import the Microsoft key into /etc/apt/keyrings/ and reference it via signed-by= in the repo list instead. Also consider --no-install-recommends and cleaning /var/lib/apt/lists/* after installs to keep the image smaller and reduce attack surface.

Suggested change:

```diff
-RUN apt-get update && apt-get install -y curl gnupg unixodbc unixodbc-dev && \
-    curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && \
-    curl https://packages.microsoft.com/config/debian/12/prod.list > /etc/apt/sources.list.d/mssql-release.list && \
-    apt-get update && ACCEPT_EULA=Y apt-get install -y msodbcsql18 && \
+RUN apt-get update && apt-get install -y --no-install-recommends curl gnupg ca-certificates unixodbc unixodbc-dev && \
+    mkdir -p /etc/apt/keyrings && \
+    curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor -o /etc/apt/keyrings/microsoft.gpg && \
+    chmod 644 /etc/apt/keyrings/microsoft.gpg && \
+    curl https://packages.microsoft.com/config/debian/12/prod.list | sed 's#^deb #deb [signed-by=/etc/apt/keyrings/microsoft.gpg] #' > /etc/apt/sources.list.d/mssql-release.list && \
+    apt-get update && ACCEPT_EULA=Y apt-get install -y --no-install-recommends msodbcsql18 && \
+    rm -rf /var/lib/apt/lists/* && \
```
main.py (outdated)
Comment on lines +8 to +10

```python
import json
import os
import pyodbc
```

Copilot AI Mar 8, 2026


Importing pyodbc at module import time can prevent the API from starting in deployments where the ODBC runtime libraries aren’t present (common outside the Docker image), even if SQL saving is disabled. To keep SQL support optional, consider moving the import inside _get_sql_connection() and handling ImportError by disabling SQL saving with a clear (sanitized) log message.

Dockerfile (outdated)

```diff
@@ -5,7 +5,11 @@ WORKDIR /app

 # Install dependencies (no build tools needed for current deps)
```

Copilot AI Mar 8, 2026


The comment says "no build tools needed for current deps", but this layer now installs build/runtime dependencies (unixODBC, msodbcsql18). Update the comment to reflect the new rationale (ODBC driver + pyodbc support) to avoid confusion for future maintenance.

Suggested change:

```diff
-# Install dependencies (no build tools needed for current deps)
+# Install system deps for ODBC/SQL Server (unixODBC, msodbcsql18) and Python requirements
```

Comment on lines +326 to +330

```python
    try:
        return pyodbc.connect(SQL_CONNECTION_STRING, timeout=10)
    except Exception as e:
        print(f"Azure SQL connection failed: {e}")
        return None
```

Copilot AI Mar 8, 2026


Avoid printing raw SQL connection exceptions here. pyodbc error strings can include server/user details (and in some cases parts of the connection string), which can leak sensitive info into logs. Prefer structured logging and a sanitized message (e.g., log an error code/class, or gate detailed errors behind a debug flag).

mandeeps (Member) commented Mar 8, 2026

@copilot open a new pull request to apply changes based on the comments in this thread

Copilot AI (Contributor) commented Mar 8, 2026

@mandeeps I've opened a new pull request, #3, to work on those changes. Once the pull request is ready, I'll request review from you.

Copilot AI and others added 2 commits on March 8, 2026 at 19:45:

  • …lazy import, and error handling
    (Co-authored-by: mandeeps <3266584+mandeeps@users.noreply.github.com>)
  • Fix Azure SQL integration: connection caching, lazy import, safe error handling, and Dockerfile hardening
@mandeeps mandeeps merged commit 84200de into main Mar 8, 2026