This project provides a local emulation server for legacy Efergy Engage hubs (v2 and v3). It allows you to intercept and log your home's energy data to a local SQLite database, completely bypassing the decommissioned Efergy cloud servers.
This is designed for anyone who wants to keep their devices from becoming e-waste.
The Efergy Hub is hard-coded to send its data to sensornet.info over HTTPS.
Unfortunately, these old devices use the deprecated SSLv3 protocol, which modern web servers and Python libraries will not accept.
This project solves the problem with a two-service system managed by Docker Compose:
- **`legacy-nginx` service**: A custom-built Nginx server acts as a reverse proxy. It uses an old version of OpenSSL (1.0.2u), compiled specifically to accept the hub's SSLv3 connection. It then terminates the SSL and forwards the decrypted HTTP traffic to the Python application.
- **`hub-server` service**: A lightweight Python 3 server (`hub_server.py`) that listens for the forwarded requests. It receives the plain HTTP request from the Nginx proxy, emulates the Efergy API, and logs the data to a SQLite database (`readings.db`) using the `db.py` script.
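Conceptually, the proxy's server block looks something like this (a sketch, not the shipped config — the upstream name, port, and certificate paths are assumptions):

```nginx
server {
    listen 443 ssl;
    server_name sensornet.info;

    # OpenSSL 1.0.2u still allows re-enabling the long-deprecated SSLv3
    ssl_protocols SSLv3 TLSv1;
    ssl_certificate     /etc/nginx/server.crt;
    ssl_certificate_key /etc/nginx/server.key;

    location / {
        # forward the decrypted HTTP request to the Python hub-server
        proxy_pass http://hub-server:8080;
        proxy_set_header Host $host;
    }
}
```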
The legacy-nginx service requires SSL certificates to run. A helper script is provided to generate self-signed certificates.
- Optionally, open `generate-certs.sh` in a text editor to review or adjust the certificate details.
- Run the script from the project's root directory:

```shell
./generate-certs.sh
```

This will create `server.key` and `server.crt` inside the `legacy-nginx` directory, where the `docker-compose.yml` file expects to find them.
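The helper script boils down to a single `openssl` invocation along these lines (a sketch — the key size, validity period, and subject here are assumptions; the shipped script may differ):

```shell
# Ensure the output directory exists
mkdir -p legacy-nginx

# Generate a self-signed key/cert pair for the legacy-nginx proxy.
# The CN matches the hostname the hub dials (sensornet.info).
openssl req -x509 -newkey rsa:2048 -nodes -days 3650 \
  -subj "/CN=sensornet.info" \
  -keyout legacy-nginx/server.key \
  -out legacy-nginx/server.crt
```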
With the certificates in place, you can start both services using Docker Compose.
```shell
# Build and start the containers in detached mode
docker-compose up --build -d
```

This will:

- Build the `hub-server` image from its Dockerfile.
- Build the `legacy-nginx` image from its Dockerfile.
- Start both containers. The `legacy-nginx` service is exposed on port 443.
- Mount the `readings.db` file from the project root into the `hub-server` container.
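The steps above correspond to a compose file of roughly this shape (a sketch — the build contexts and the in-container database path are assumptions; consult the project's `docker-compose.yml` for the real values):

```yaml
services:
  hub-server:
    build: ./hub-server                    # Python 3 app: hub_server.py + db.py
    volumes:
      - ./readings.db:/app/readings.db     # SQLite DB mounted from the project root

  legacy-nginx:
    build: ./legacy-nginx                  # Nginx built against OpenSSL 1.0.2u
    ports:
      - "443:443"                          # the hub connects here over SSLv3
    depends_on:
      - hub-server
```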
Finally, you must trick your Efergy Hub into sending data to your new server instead of sensornet.info.
The easiest way to do this is with DNS spoofing on your router (e.g., using dnsmasq, Pi-hole, or similar):
Create a DNS entry that maps sensornet.info to the local IP address of the machine running your Docker container
(e.g., 10.0.0.213).
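With plain dnsmasq (which Pi-hole also uses under the hood), a single `address` line covers the domain and all of its subdomains, using the example IP above:

```
address=/sensornet.info/10.0.0.213
```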
Once the hub is rebooted, it will contact sensornet.info, be directed to your legacy-nginx proxy, and your
hub-server should start logging data to readings.db.
In Pi-hole, navigate to Settings -> Local DNS Records and add the following:
| Domain | IP |
|---|---|
| [device mac].[h2/h3].sensornet.info | [server ip] |
| 41.0a.04.001ec0.h2.sensornet.info | 10.0.0.213 |
If your router runs Unbound (e.g., pfSense), navigate to Services -> DNS Resolver -> Custom Options and add the following:

```
server:
  local-zone: "sensornet.info" redirect
  local-data: "sensornet.info 86400 IN A 10.0.0.213"
```
If you run Home Assistant (e.g., on HA OS), you can integrate the hub-server's readings into Home Assistant via MQTT.
- Configure Environment Variables for MQTT
Update your environment variables in the docker-compose.yml file:
```shell
# Optional: logging level (DEBUG, INFO, WARN, ERROR)
LOG_LEVEL=INFO

# Enable MQTT (true/false)
MQTT_ENABLED=true

# MQTT broker details
MQTT_BROKER=homeassistant.local
MQTT_PORT=1883
MQTT_USER=mqtt-broker-username-here
MQTT_PASS=your-password-here

# Home Assistant MQTT Discovery
HA_DISCOVERY=true
```
- Home Assistant Auto-Discovery
With HA_DISCOVERY=true, the hub-server will automatically publish Home Assistant MQTT discovery payloads. This creates two sensors per Efergy device:
| Sensor | Topic | Unit | Device Class | State Class |
|---|---|---|---|---|
| `sensor.efergy_hub_live_power_usage_SID` | `home/efergy/<sensor_label>/power` | kW | power | measurement |
| `sensor.efergy_hub_energy_consumption` | `home/efergy/<sensor_label>/energy` | kWh | energy | total_increasing |
Home Assistant will pick up these sensors automatically, making them available for dashboards, automations, and the Energy Dashboard.
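As an illustration of what such a discovery message looks like, the sketch below builds the `(topic, payload)` pair for the energy sensor (the exact topics and field names `hub_server.py` publishes may differ — this only shows the JSON shape Home Assistant's MQTT discovery expects):

```python
import json


def discovery_payload(sensor_label: str) -> tuple[str, str]:
    """Build an MQTT discovery (topic, payload) pair for the energy sensor."""
    # Discovery topics follow HA's <prefix>/sensor/<object_id>/config convention
    topic = f"homeassistant/sensor/efergy_{sensor_label}_energy/config"
    payload = {
        "name": "Efergy Hub Energy Consumption",
        "state_topic": f"home/efergy/{sensor_label}/energy",  # where readings appear
        "unit_of_measurement": "kWh",
        "device_class": "energy",
        "state_class": "total_increasing",
        "unique_id": f"efergy_{sensor_label}_energy",
    }
    return topic, json.dumps(payload)


topic, payload = discovery_payload("kitchen")
print(topic)  # homeassistant/sensor/efergy_kitchen_energy/config
```

Publishing this payload (retained) to the discovery topic is all it takes for the sensor to appear in Home Assistant.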
- Add Sensors to Energy Dashboard
Once discovered, the `sensor.efergy_hub_energy_consumption` sensor can be added to Home Assistant's Energy Dashboard under Grid Consumption, allowing you to track daily, weekly, and monthly usage.
You can integrate your local energy data into Home Assistant using the SQL Sensor
integration.
This allows Home Assistant to directly query the readings.db file.
The provided sensors.yaml file is a configuration snippet you can add to your Home Assistant setup.
- Ensure Home Assistant can access the database. Make sure your `readings.db` file is located somewhere Home Assistant can read it (e.g., in your `/config` directory).
- Add the SQL integration to your `configuration.yaml` if you haven't already.
- Add the sensor configuration. You can copy the contents of `sensors.yaml` into your Home Assistant's `configuration.yaml` (under a `sql:` key) or, if you have a split configuration, `!include` it.
`configuration.yaml` example:

```yaml
# Loads default set of integrations. Do not remove.
default_config:

# Load frontend themes from the themes folder
frontend:
  themes: !include_dir_merge_named themes

automation: !include automations.yaml
script: !include scripts.yaml
scene: !include scenes.yaml

sql: !include sensors.yaml
```

- Update the `db_url` in `sensors.yaml`.
- Restart Home Assistant.
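For reference, an SQL sensor entry in `sensors.yaml` has roughly this shape (a hypothetical sketch — the real file ships with the project, and the table and column names below are placeholders, not the actual schema from `db.py`):

```yaml
- name: Efergy Hub Energy Consumption
  db_url: sqlite:////config/readings.db
  # Placeholder query — adjust table/column names to the real schema
  query: "SELECT round(sum(kwh), 3) AS total_kwh FROM readings"
  column: total_kwh
  unit_of_measurement: kWh
  device_class: energy
  state_class: total_increasing
```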
You will now have two sensors:
- `sensor.efergy_hub_live_power_usage_SID`: the instantaneous power reading in kW.
- `sensor.efergy_hub_energy_consumption`: a running total of energy consumed in kWh, which can be added directly to your Home Assistant Energy Dashboard.
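Outside Home Assistant, the database can also be inspected directly with Python's built-in `sqlite3` module. A minimal sketch (the table and column names here are hypothetical — check `db.py` for the real schema):

```python
import sqlite3

# Hypothetical schema for illustration; db.py defines the real one.
# Point sqlite3.connect at "readings.db" to query the real file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts INTEGER, sensor TEXT, watts REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?, ?)",
    [(1700000000, "hub", 450.0), (1700000060, "hub", 550.0)],
)

# Average power over the stored readings
(avg_watts,) = conn.execute("SELECT avg(watts) FROM readings").fetchone()
print(avg_watts)  # 500.0
```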
Documentation about the known data formats is in the Wiki.