DevOldSchool/powermeter_hub_server

 
 

Efergy Hub Local Data Logger

This project provides a local emulation server for legacy Efergy Engage hubs (v2 and v3). It allows you to intercept and log your home's energy data to a local SQLite database, completely bypassing the decommissioned Efergy cloud servers.

This is designed for anyone who wants to keep their devices from becoming e-waste.

How it Works

The Efergy Hub is hard-coded to send its data to sensornet.info over HTTPS. Unfortunately, these old devices use the deprecated SSLv3 protocol, which modern web servers and Python libraries will not accept.

This project solves the problem with a two-service system managed by Docker Compose:

  1. legacy-nginx Service: A custom-built Nginx server acts as a reverse proxy. It uses an old version of OpenSSL (1.0.2u), compiled specifically to accept the hub's SSLv3 connection. It then terminates the SSL connection and forwards the decrypted HTTP traffic to the Python application.
  2. hub-server Service: A lightweight Python 3 server (hub_server.py) that listens for the forwarded requests. It receives the plain HTTP request from the Nginx proxy, emulates the Efergy API, and logs the data to a SQLite database (readings.db) using the db.py script.
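The hand-off ends with the Python service writing each decoded reading to SQLite. A minimal sketch of that last step is below; the table layout and the `store_reading` helper are illustrative assumptions, not the actual schema or API defined in db.py:

```python
import sqlite3
import time

def init_db(conn):
    # Hypothetical table layout; the real schema is defined in db.py.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS readings"
        " (sensor_id TEXT, watts REAL, ts INTEGER)"
    )

def store_reading(conn, sensor_id, watts, ts=None):
    # Called once per decoded hub payload forwarded by legacy-nginx.
    conn.execute(
        "INSERT INTO readings (sensor_id, watts, ts) VALUES (?, ?, ?)",
        (sensor_id, watts, int(ts if ts is not None else time.time())),
    )
    conn.commit()
```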

Setup with Docker Compose

1. Generate SSL Certificates

The legacy-nginx service requires SSL certificates to run. A helper script is provided to generate self-signed certificates.

  1. Open generate_cert.sh in a text editor and review the certificate details, adjusting them if needed.
  2. Run the script from the project's root directory:
./generate_cert.sh

This will create server.key and server.crt inside the legacy-nginx directory, where the docker-compose.yml file expects to find them.

2. Run the Services

With the certificates in place, you can start both services using Docker Compose.

# Build and start the containers in detached mode
docker-compose up --build -d

This will:

  • Build the hub-server image from its Dockerfile.
  • Build the legacy-nginx image from its Dockerfile.
  • Start both containers. The legacy-nginx service is exposed on port 443.
  • Mount the readings.db file from the project root into the hub-server container.
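Before re-pointing the hub, it can be useful to confirm the proxy is actually listening on port 443. A quick reachability check (the host IP here is just the example address used in the DNS section below):

```python
import socket

def port_open(host, port, timeout=3.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("10.0.0.213", 443) once the containers are up
```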

3. Redirect the Efergy Hub

Finally, you must trick your Efergy Hub into sending data to your new server instead of sensornet.info. The easiest way to do this is with DNS spoofing on your router (e.g., using dnsmasq, Pi-hole, or similar):

Create a DNS entry that maps sensornet.info to the local IP address of the machine running your Docker container (e.g., 10.0.0.213).

Once the hub is rebooted, it will contact sensornet.info, be directed to your legacy-nginx proxy, and your hub-server should start logging data to readings.db.

Pi-hole example

Navigate to Settings -> Local DNS Records and add the following:

| Domain | IP |
| --- | --- |
| [device mac].[h2/h3].sensornet.info | [server ip] |
| 41.0a.04.001ec0.h2.sensornet.info | 10.0.0.213 |

pfSense example

Navigate to Services -> DNS Resolver -> Custom Options and add the following:

server:
    local-zone: "sensornet.info" redirect
    local-data: "sensornet.info 86400 IN A 10.0.0.213"
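Once the record is in place, you can verify the override from any machine that uses that resolver; `resolves_to` is just a convenience wrapper around the standard library resolver:

```python
import socket

def resolves_to(hostname, expected_ip):
    """Check whether hostname resolves to the expected IPv4 address."""
    try:
        return socket.gethostbyname(hostname) == expected_ip
    except socket.gaierror:
        return False

# e.g. resolves_to("sensornet.info", "10.0.0.213") should return True
# once the DNS override is active.
```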

Home Assistant Integration

Home Assistant Operating System hosted

If your Efergy Hub server is running on HA OS, you can integrate the readings into Home Assistant via MQTT.

  1. Configure Environment Variables for MQTT

Update your environment variables in the docker-compose.yml file:

# Optional: logging level (DEBUG, INFO, WARN, ERROR)
LOG_LEVEL=INFO

# Enable MQTT (true/false)
MQTT_ENABLED=true

# MQTT broker details
MQTT_BROKER=homeassistant.local
MQTT_PORT=1883
MQTT_USER=mqtt-broker-username-here
MQTT_PASS=your-password-here

# Home Assistant MQTT Discovery
HA_DISCOVERY=true
  2. Home Assistant Auto-Discovery

With HA_DISCOVERY=true, the hub-server will automatically publish Home Assistant MQTT discovery payloads. This creates two sensors per Efergy device:

| Sensor | Topic | Unit | Device Class | State Class |
| --- | --- | --- | --- | --- |
| sensor.efergy_hub_live_power_usage_SID | home/efergy/<sensor_label>/power | kW | power | measurement |
| sensor.efergy_hub_energy_consumption | home/efergy/<sensor_label>/energy | kWh | energy | total_increasing |

Home Assistant will pick up these sensors automatically, making them available for dashboards, automations, and the Energy Dashboard.
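The shape of a discovery message can be sketched as follows; the discovery topic layout and field values here are illustrative assumptions, not a dump of what hub_server.py actually publishes:

```python
import json

def discovery_payload(sensor_label, sid):
    """Build a Home Assistant MQTT discovery config for a live-power sensor.

    Topic and naming conventions are assumptions for illustration; the
    real payloads are generated by hub_server.py.
    """
    topic = f"homeassistant/sensor/efergy_{sid}_power/config"
    payload = {
        "name": f"Efergy Hub Live Power Usage {sid}",
        "state_topic": f"home/efergy/{sensor_label}/power",
        "unit_of_measurement": "kW",
        "device_class": "power",
        "state_class": "measurement",
        "unique_id": f"efergy_{sid}_power",
    }
    return topic, json.dumps(payload)
```

Publishing this JSON (retained) to the config topic is what lets Home Assistant create the sensor without any manual YAML.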

  3. Add Sensors to Energy Dashboard

Once discovered, the sensor.efergy_hub_energy_consumption sensor can be added to Home Assistant’s Energy Dashboard under Grid Consumption, allowing you to track daily, weekly, and monthly usage.

Container hosted

You can integrate your local energy data into Home Assistant using the SQL Sensor integration. This allows Home Assistant to directly query the readings.db file.

The provided sensors.yaml file is a configuration snippet you can add to your Home Assistant setup.

  1. Ensure Home Assistant can access the database. Make sure your readings.db file is located somewhere Home Assistant can read it (e.g., in your /config directory).
  2. Add the SQL integration to your configuration.yaml if you haven't already.
  3. Add the sensor configuration. You can copy the contents of sensors.yaml into your Home Assistant's configuration.yaml (under a sql: key) or, if you have a split configuration, !include it.

configuration.yaml example:

# Loads default set of integrations. Do not remove.
default_config:

# Load frontend themes from the themes folder
frontend:
  themes: !include_dir_merge_named themes

automation: !include automations.yaml
script: !include scripts.yaml
scene: !include scenes.yaml
sql: !include sensors.yaml
  4. Update the db_url in sensors.yaml.
  5. Restart Home Assistant.

You will now have two sensors:

  • sensor.efergy_hub_live_power_usage_SID: The instantaneous power reading in kW.
  • sensor.efergy_hub_energy_consumption: A running total of energy consumed in kWh, which can be added directly to your Home Assistant Energy Dashboard.
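The queries behind these two sensors might look like the sketch below; the table and column names are assumptions (the actual schema is defined by db.py and the actual queries live in sensors.yaml):

```python
import sqlite3

def latest_power_kw(conn):
    # Instantaneous power: most recent row, converted from watts to kW.
    row = conn.execute(
        "SELECT watts FROM readings ORDER BY ts DESC LIMIT 1"
    ).fetchone()
    return row[0] / 1000.0 if row else None

def total_energy_kwh(conn, interval_s=60):
    # Energy: approximate each reading as constant over its reporting
    # interval, then convert watt-seconds to kWh.
    total_ws = conn.execute(
        "SELECT COALESCE(SUM(watts), 0) * ? FROM readings", (interval_s,)
    ).fetchone()[0]
    return total_ws / 3_600_000.0
```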

Other

Efergy Data Format

Documentation about the known data formats is available in the project's Wiki.

About

Keep your legacy Efergy Engage Hub alive! This project emulates the Efergy API to log power data locally.
