# onair-gpu-plugin

GPU-accelerated inference plugin for the NASA OnAIR cognitive framework.
This plugin enables OnAIR to consume results from GPU-based geospatial foundation model inference running on NVIDIA Jetson hardware. It implements the Complex Reasoner interface, correlating inference outputs with satellite telemetry to produce prioritized action queues.
The plugin follows OnAIR's external plugin model:
- Inference pipeline processes imagery on the GPU independently
- Results (classifications, confidence scores, coordinates) are published to Redis
- OnAIR's RedisAdapter ingests results as telemetry data products
- This plugin (Complex Reasoner) correlates results with operational context and triages actions
No images flow through OnAIR's data pipeline. No modifications to OnAIR's codebase are required.
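To make the correlate-and-triage step concrete, here is a minimal sketch of the reasoner logic, written independently of OnAIR's actual Complex Reasoner base class (check the OnAIR repository for the real interface and hook names). The class name, priority formula, and confidence threshold are illustrative assumptions; field names follow the JSON message fields listed below.

```python
# Sketch only: not OnAIR's real plugin base class. The update()/
# render_reasoning() method names mirror OnAIR's plugin style, but the
# priority formula and confidence threshold are assumptions.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Action:
    priority: float                          # smaller = more urgent
    detection_class: str = field(compare=False)
    coordinates: tuple = field(compare=False)

class GPUInferenceReasoner:
    """Correlates GPU inference results with telemetry to triage actions."""

    def __init__(self, confidence_floor=0.5):
        self.confidence_floor = confidence_floor
        self.queue = []  # min-heap ordered by priority

    def update(self, detections, telemetry):
        """Ingest one batch of inference results plus telemetry context.

        Telemetry correlation is elided in this sketch; only the
        detection-side triage is shown.
        """
        for d in detections:
            if d["confidence"] < self.confidence_floor:
                continue  # drop low-confidence detections
            # Higher severity and confidence => more urgent (more negative).
            priority = -(d["severity"] * d["confidence"])
            heapq.heappush(
                self.queue,
                Action(priority, d["detection_class"], tuple(d["coordinates"])),
            )

    def render_reasoning(self):
        """Drain the queue, returning actions most-urgent-first."""
        return [heapq.heappop(self.queue) for _ in range(len(self.queue))]
```

A min-heap keyed on a negated severity-times-confidence score keeps the action queue prioritized without re-sorting on every batch; the real weighting would come out of the design review.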
## Status

Design phase. See docs/DESIGN.md for the full design document, currently under review.
## Architecture

```
GPU Inference Pipeline ──► Redis ──► OnAIR RedisAdapter ──► GPU Plugin (CR)
  (PyTorch/TensorRT)                     (existing)           (this repo)
```
Each result is published as a JSON message with the following fields:
- detection_class
- confidence
- coordinates
- severity
- gpu_health
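A hypothetical example of one such message, with illustrative values (the exact schema is pending the design review, and the Redis channel name shown in the comment is a placeholder, not a confirmed interface):

```python
import json

# Hypothetical inference result; field names follow the list above,
# values are illustrative only.
message = {
    "detection_class": "wildfire",    # model output label
    "confidence": 0.94,               # score in [0, 1]
    "coordinates": [34.05, -118.24],  # detection lat/lon (assumed convention)
    "severity": 3,                    # triage hint from the pipeline
    "gpu_health": "nominal",          # Jetson health summary
}
payload = json.dumps(message)

# The inference pipeline would publish this to Redis, e.g. with redis-py
# (channel name "gpu_inference" is a placeholder):
#   redis.Redis().publish("gpu_inference", payload)
```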
## Requirements

- NASA OnAIR (tested against the latest `main` branch)
- Python 3.9+
- Redis server
- GPU inference pipeline publishing to Redis (e.g., GodelEDGE)
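OnAIR wires complex reasoner plugins in through the ComplexPluginDict entry of its config.ini (docs/DESIGN.md Section 5 covers the details for this plugin). A hedged sketch of the relevant fragment, assuming the section and key names used in OnAIR's default configuration, with an example plugin name and path:

```ini
; Sketch only -- verify against your OnAIR version's config.ini layout.
[PLUGINS]
ComplexPluginDict = {'gpu_inference_reasoner': 'plugins/gpu_inference_reasoner'}
```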
## Installation

```shell
# Clone this repo into your OnAIR plugins directory
git clone https://github.com/godel-space/onair-gpu-plugin.git plugins/gpu_inference_reasoner

# Configure OnAIR to use the plugin (see docs/DESIGN.md Section 5)
# Point ComplexPluginDict to the plugin path in your config.ini
```

## License

Apache License 2.0
Godel Space -- Autonomous AI agents for satellite Earth observation.