Add caching to instrument and live-data endpoints #622
Draft
Codecov Report

❌ Patch coverage is

```
@@            Coverage Diff             @@
##             main     #622      +/-   ##
===========================================
- Coverage   96.37%   79.53%   -16.85%
===========================================
  Files          48       48
  Lines        1961     2008      +47
===========================================
- Hits         1890     1597     -293
- Misses         71      411     +340
```
Closes #
Description
Adds Valkey caching to four endpoints that previously hit the database on every request despite returning data that rarely changes. This follows the same cache-aside pattern already established for the job list/count endpoints in `jobs.py`.

Cached endpoints
How it works
Each GET endpoint checks Valkey for a cached response before querying PostgreSQL. On a cache miss, the DB result is stored in Valkey with a TTL. When the corresponding PUT endpoint is called (e.g. a staff member updates a live data script), the cache entry is immediately invalidated by writing `None` with a 1-second TTL, so the next GET fetches fresh data from the database.

For endpoints where the value can legitimately be `None` (live data script, instrument specification), the cached value is wrapped in a dict (e.g. `{"script": value}`) so that a cache miss (`None` from Valkey) can be distinguished from a cached `None` value.

Why this is useful
`instruments` table.

Configuration
All TTLs are configurable via environment variables and default to sensible values:
Setting any of these to `0` disables caching for that endpoint (matching the existing pattern for `JOB_LIST_CACHE_TTL_SECONDS`).

Testing locally
```shell
# Run the new cache-hit tests
pytest test/e2e/test_endpoint_cache.py -v
```

The new tests mock `cache_get_json` to return a cached payload and assert that the underlying DB service function is never called, confirming the cache-hit path works correctly. There is also a test for the edge case where a `None` script is cached.

Manual verification with a local Valkey instance
If you want to test the caching behaviour against a real Valkey server rather than relying on the mocked unit tests:
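The PR's exact setup commands are not preserved in this scrape; one minimal way to stand up a throwaway server, assuming Docker and the official `valkey/valkey` image, is:

```shell
# Start a disposable Valkey server on the default port 6379
# (--rm removes the container when it stops)
docker run --rm -d --name valkey-dev -p 6379:6379 valkey/valkey:latest
```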
You can then watch cache keys being set and expiring:
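A `MONITOR` session streams every command the server receives, which makes the cache traffic visible (the container name `valkey-dev` here is an assumption, not taken from the PR):

```shell
# Stream commands hitting the server; cache writes appear as SETEX, reads as GET
docker exec -it valkey-dev valkey-cli MONITOR
```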
Calling `GET /live-data/instruments` twice should show a `SETEX` on the first call and a `GET` hit on the second. Calling the corresponding `PUT` endpoint should show the key being invalidated.

Test plan
- `test/e2e/test_endpoint_cache.py` (5 tests)
- Setting a TTL to `0` disables caching for that endpoint
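For reference, the cache-aside read path and PUT-time invalidation described under "How it works" can be sketched as below. Only `cache_get_json` is named in the PR text; the other helper names, the key format, the TTL value, and the in-memory Valkey stand-in are illustrative assumptions, not the actual implementation:

```python
import json
import time
from typing import Any, Optional


class FakeValkey:
    """In-memory stand-in for a Valkey client (illustrative only)."""

    def __init__(self) -> None:
        self._store: dict[str, tuple[str, float]] = {}

    def setex(self, key: str, ttl: int, value: str) -> None:
        self._store[key] = (value, time.monotonic() + ttl)

    def get(self, key: str) -> Optional[str]:
        entry = self._store.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if time.monotonic() >= expires_at:
            del self._store[key]  # expired: behave like a miss
            return None
        return value


SCRIPT_CACHE_TTL = 300  # assumed default; the real value comes from an env var
valkey = FakeValkey()
db = {"MARI": "print('reduce')"}  # stand-in for the PostgreSQL table


def cache_get_json(key: str) -> Optional[Any]:
    raw = valkey.get(key)
    return None if raw is None else json.loads(raw)


def cache_set_json(key: str, value: Any, ttl: int) -> None:
    valkey.setex(key, ttl, json.dumps(value))


def get_live_data_script(instrument: str) -> Optional[str]:
    """GET handler: cache-aside, with the {"script": ...} wrapper so a
    cached None script is distinguishable from a cache miss."""
    key = f"live_data_script:{instrument}"
    cached = cache_get_json(key)
    if cached is not None:       # cache hit (possibly wrapping a None script)
        return cached["script"]
    script = db.get(instrument)  # cache miss: query the database
    if SCRIPT_CACHE_TTL > 0:     # TTL of 0 disables caching entirely
        cache_set_json(key, {"script": script}, SCRIPT_CACHE_TTL)
    return script


def put_live_data_script(instrument: str, script: str) -> None:
    """PUT handler: update the DB, then invalidate by writing None
    with a 1-second TTL so the next GET re-reads the database."""
    db[instrument] = script
    cache_set_json(f"live_data_script:{instrument}", None, 1)
```

The `None`-with-short-TTL invalidation works because `cache_get_json` returns `None` for it, which the GET handler treats as a miss and immediately overwrites with fresh data.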