From e7be64a7c11f061e97115d4d52e05bf6757448f8 Mon Sep 17 00:00:00 2001
From: "Yan Xue, Tan (Francis)" <118970371+francis-tanyx@users.noreply.github.com>
Date: Thu, 6 Mar 2025 01:00:09 +0800
Subject: [PATCH 1/4] Refactor Issue 51: Migrate documentation from individual
 services to a Services section under Automated Self Checkout in the
 documentation repo

- created a `Services` section under Use Cases/Automated Self Checkout
- renamed the files based on the service (common-service, pipeline-server, retail-data-visualization)
- updated mkdocs.yml with the new files under a section called Services
- ensured that the documentation renders correctly

Issue: intel-retail#51

Co-authored-by: Voon Yong Shing
Signed-off-by: Tan Yan Xue
Signed-off-by: Ong Jing Hong
Signed-off-by: Marcus Khaw Chin Rui
Signed-off-by: Low Yu Zhe
---
 .../services/common-service.md             | 129 ++++++++++++++++++
 .../services/pipeline-server.md            |  69 ++++++++++
 .../services/retail-data-visualization.md  |  61 +++++++++
 mkdocs.yml                                 |   6 +-
 4 files changed, 264 insertions(+), 1 deletion(-)
 create mode 100644 docs_src/use-cases/automated-self-checkout/services/common-service.md
 create mode 100644 docs_src/use-cases/automated-self-checkout/services/pipeline-server.md
 create mode 100644 docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md

diff --git a/docs_src/use-cases/automated-self-checkout/services/common-service.md b/docs_src/use-cases/automated-self-checkout/services/common-service.md
new file mode 100644
index 0000000..8cad0e4
--- /dev/null
+++ b/docs_src/use-cases/automated-self-checkout/services/common-service.md
@@ -0,0 +1,129 @@
# Common Service: LiDAR & Weight Sensor Microservice

This microservice manages **both LiDAR and Weight sensors** in a single container. It publishes sensor data over **MQTT**, **Kafka**, or **HTTP** (or any combination), controlled entirely by environment variables.

## Overview

- **Sensors**
    - LiDAR & Weight support in the same codebase.
    - Configuration for each sensor (e.g., ID, port, mock mode, intervals).

- **Publishing**
    - `publisher.py` handles publishing to one or more protocols:
        - **MQTT**
        - **Kafka**
        - **HTTP**

- **Apps**
    - Two main modules:
        - `lidar_app.py`
        - `weight_app.py`
    - Each uses shared methods from `publisher.py` & `config.py`.

## Environment Variables

All settings are defined in `docker-compose.yml` under the `asc_common_service` section. Key variables include:

### LiDAR

| Variable | Description | Example |
| --- | --- | --- |
| LIDAR_COUNT | Number of LiDAR sensors | 2 |
| LIDAR_SENSOR_ID_1 | Unique ID for first LiDAR sensor | lidar-001 |
| LIDAR_SENSOR_ID_2 | Unique ID for second LiDAR sensor (if any) | lidar-002 |
| LIDAR_MOCK_1 | Enable mock data for first LiDAR sensor (true/false) | true |
| LIDAR_MQTT_ENABLE | Toggle MQTT publishing | true |
| LIDAR_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker or mqtt-broker_1 |
| LIDAR_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| LIDAR_KAFKA_ENABLE | Toggle Kafka publishing | true |
| KAFKA_BOOTSTRAP_SERVERS | Kafka bootstrap server addresses | kafka:9093 |
| LIDAR_KAFKA_TOPIC | Kafka topic name for LiDAR data | lidar-data |
| LIDAR_HTTP_ENABLE | Toggle HTTP publishing | true |
| LIDAR_HTTP_URL | HTTP endpoint URL for LiDAR data | http://localhost:5000/api/lidar_data |
| LIDAR_PUBLISH_INTERVAL | Interval (in seconds) for LiDAR data publishing | 1.0 |
| LIDAR_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

### Weight

| Variable | Description | Example |
| --- | --- | --- |
| WEIGHT_COUNT | Number of Weight sensors | 2 |
| WEIGHT_SENSOR_ID_1 | Unique ID for first Weight sensor | weight-001 |
| WEIGHT_SENSOR_ID_2 | Unique ID for second Weight sensor (if any) | weight-002 |
| WEIGHT_MOCK_1 | Enable mock data for first Weight sensor (true/false) | true |
| WEIGHT_MQTT_ENABLE | Toggle MQTT publishing | true |
| WEIGHT_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker_1 |
| WEIGHT_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| WEIGHT_MQTT_TOPIC | MQTT topic name for Weight data | weight/data |
| WEIGHT_KAFKA_ENABLE | Toggle Kafka publishing | false |
| WEIGHT_HTTP_ENABLE | Toggle HTTP publishing | false |
| WEIGHT_PUBLISH_INTERVAL | Interval (in seconds) for Weight data publishing | 1.0 |
| WEIGHT_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

> **Note:** Set each `*_ENABLE` variable to `"true"` or `"false"` to enable or disable that protocol. Adjust intervals, logging levels, or sensor counts as needed.
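To make the tables above concrete, the sketch below shows one way such per-sensor settings can be read from the environment. It is a minimal illustration only: the variable names come from the tables, while the dataclass and loader function are hypothetical and not the actual `config.py` API.

```python
import os
from dataclasses import dataclass


@dataclass
class SensorConfig:
    sensor_id: str
    mock: bool
    mqtt_enable: bool
    mqtt_host: str
    mqtt_port: int
    publish_interval: float


def load_sensor_configs(prefix: str) -> list[SensorConfig]:
    """Build one config per sensor from <PREFIX>_* environment variables."""
    count = int(os.environ.get(f"{prefix}_COUNT", "1"))
    configs = []
    for i in range(1, count + 1):
        configs.append(SensorConfig(
            sensor_id=os.environ.get(f"{prefix}_SENSOR_ID_{i}", f"{prefix.lower()}-{i:03d}"),
            mock=os.environ.get(f"{prefix}_MOCK_{i}", "false").lower() == "true",
            mqtt_enable=os.environ.get(f"{prefix}_MQTT_ENABLE", "false").lower() == "true",
            mqtt_host=os.environ.get(f"{prefix}_MQTT_BROKER_HOST", "mqtt-broker"),
            mqtt_port=int(os.environ.get(f"{prefix}_MQTT_BROKER_PORT", "1883")),
            publish_interval=float(os.environ.get(f"{prefix}_PUBLISH_INTERVAL", "1.0")),
        ))
    return configs


# e.g. lidar_configs = load_sensor_configs("LIDAR")
```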
## Usage

1. **Build and Run**

    ```bash
    make run-demo
    ```

    This spins up the `asc_common_service` container (and related services such as Mosquitto or Kafka, depending on your configuration).

2. **Data Flow**

    - By default, LiDAR publishes to `lidar/data` (MQTT, if enabled) or `lidar-data` (Kafka), or to an HTTP endpoint if configured.
    - The Weight sensor similarly publishes to `weight/data` or `weight-data`.

3. **Mock Mode**

    - Setting `LIDAR_MOCK_1="true"` (or `WEIGHT_MOCK_1="true"`) forces the sensor to generate **random** data rather than reading from actual hardware.

## Testing

### A. MQTT

- **Grafana**: A pre-loaded dashboard named *Sensor-Analytics* is available at [http://localhost:3000](http://localhost:3000/) (default credentials `admin`/`admin`).
- Check that the MQTT data source in Grafana points to `tcp://mqtt-broker_1:1883` (or `tcp://mqtt-broker:1883`, depending on the network).

### B. Kafka

- Enable Kafka for LiDAR/Weight by setting `LIDAR_KAFKA_ENABLE="true"` and/or `WEIGHT_KAFKA_ENABLE="true"`.
- Test from inside the container:

    ```bash
    docker exec asc_common_service python kafka_publisher_test.py --topic lidar-data
    ```

    You should see incoming messages in the console.

### C. HTTP

1. **Local Test (Inside Docker)**

    - Set `LIDAR_HTTP_URL="http://localhost:5000/api/lidar_data"` in the environment.
    - Run `make run-demo` and wait for all containers to start.
    - Once up, execute:

        ```bash
        docker exec asc_common_service python http_publisher_test.py
        ```

    - This triggers the HTTP publisher and displays the received data inside the container.

2. **Using an External Webhook Service**

    - Visit [Webhook.site](https://webhook.site/) and get a unique URL.
    - Set `LIDAR_HTTP_URL` to this URL.
    - Run `make run-demo`, and you should see the HTTP requests arriving on the Webhook.site dashboard.

## Contributing & Development

- **Code Structure**
    - `publisher.py`: Core publishing logic (MQTT, Kafka, HTTP).
    - `config.py`: Loads environment variables and configures each sensor.
    - `lidar_app.py` and `weight_app.py`: Sensor-specific logic.
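The publishing layer can be pictured as a simple fan-out: each protocol enabled through the environment contributes one publish function, and every sensor reading is handed to all of them. The sketch below is a stdlib-only illustration of that pattern; the helper names are hypothetical and assume nothing about the real `publisher.py` beyond what is described above.

```python
import json
import os
import urllib.request


def _http_publish(url: str, payload: dict) -> None:
    """POST the reading as JSON to the configured HTTP endpoint."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req, timeout=5)


def build_publishers(prefix: str) -> list:
    """Return one publish callable per protocol enabled via <PREFIX>_*_ENABLE."""
    publishers = []
    if os.environ.get(f"{prefix}_HTTP_ENABLE", "false").lower() == "true":
        url = os.environ[f"{prefix}_HTTP_URL"]
        publishers.append(lambda payload: _http_publish(url, payload))
    # The MQTT and Kafka branches would append paho-mqtt / kafka-python
    # senders here in the same way, keyed off <PREFIX>_MQTT_ENABLE and
    # <PREFIX>_KAFKA_ENABLE.
    return publishers


def publish_all(publishers: list, payload: dict) -> None:
    for publish in publishers:  # fan out to every enabled protocol
        publish(payload)
```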
\ No newline at end of file
diff --git a/docs_src/use-cases/automated-self-checkout/services/pipeline-server.md b/docs_src/use-cases/automated-self-checkout/services/pipeline-server.md
new file mode 100644
index 0000000..392ad71
--- /dev/null
+++ b/docs_src/use-cases/automated-self-checkout/services/pipeline-server.md
@@ -0,0 +1,69 @@
# Pipeline Server

## Prerequisites

* Prepare models

    Use the model downloader [available here](https://github.com/dlstreamer/pipeline-server/tree/main/tools/model_downloader) to download new models. Point `MODEL_DIR` to the directory containing the new models. The following section assumes that the new models are available under `$(pwd)/models`.

    ```bash
    $ export MODEL_DIR=$(pwd)/models
    ```

    For example, for YOLOv5s the models directory would contain:

    - `object_detection/yolov5s/yolov5s.json`
    - `object_detection/yolov5s/FP32`

* Prepare pipelines

    Use [these docs](https://github.com/dlstreamer/pipeline-server/blob/main/docs/defining_pipelines.md) to get started with defining new pipelines. Once the new pipelines have been defined, point `PIPELINE_DIR` to the directory containing the new pipelines. The following section assumes that the new pipelines are available under `$(pwd)/pipelines`.

    ```bash
    $ export PIPELINE_DIR=$(pwd)/pipelines
    ```

* Run the image with the new models and pipelines mounted into the container

    ```bash
    $ docker run -itd \
        --privileged \
        --device=/dev:/dev \
        --device-cgroup-rule='c 189:* rmw' \
        --device-cgroup-rule='c 209:* rmw' \
        --group-add 109 \
        --name evam \
        -p 8080:8080 \
        -p 8554:8554 \
        -e ENABLE_RTSP=true \
        -e RTSP_PORT=8554 \
        -e ENABLE_WEBRTC=true \
        -e WEBRTC_SIGNALING_SERVER=ws://localhost:8443 \
        -e RUN_MODE=EVA \
        -e DETECTION_DEVICE=CPU \
        -e CLASSIFICATION_DEVICE=CPU \
        -v ./models:/home/pipeline-server/models \
        -v ./src/pipelines:/home/pipeline-server/pipelines \
        dlstreamer:dev
    ```

## Starting pipelines

* Pipelines can be triggered through the *pipeline server's* REST endpoints. Here is an example cURL command; the output is available as an RTSP stream at `rtsp://<host>:8554/pipeline-server`.

    ```bash
    $ curl localhost:8080/pipelines/object_detection/yolov5 -X POST -H \
    'Content-Type: application/json' -d \
    '{
        "source": {
            "uri": "rtsp://192.168.1.141:8555/camera_0",
            "type": "uri"
        },
        "destination": {
            "metadata": {
                "type": "file",
                "path": "/tmp/results.jsonl",
                "format": "json-lines"
            },
            "frame": {
                "type": "rtsp",
                "path": "pipeline-server"
            }
        },
        "parameters": {
            "detection-device": "CPU",
            "network": "FP16-INT8"
        }
    }'
    ```
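The same request can be issued from Python. This is a convenience sketch that mirrors the cURL payload above; it assumes the `requests` package and the server listening on `localhost:8080`.

```python
import requests

payload = {
    "source": {"uri": "rtsp://192.168.1.141:8555/camera_0", "type": "uri"},
    "destination": {
        "metadata": {
            "type": "file",
            "path": "/tmp/results.jsonl",
            "format": "json-lines",
        },
        "frame": {"type": "rtsp", "path": "pipeline-server"},
    },
    "parameters": {"detection-device": "CPU", "network": "FP16-INT8"},
}

# On success the response body carries the ID of the new pipeline instance.
resp = requests.post(
    "http://localhost:8080/pipelines/object_detection/yolov5",
    json=payload,
    timeout=10,
)
resp.raise_for_status()
print("pipeline instance:", resp.text.strip())
```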
\ No newline at end of file
diff --git a/docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md b/docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md
new file mode 100644
index 0000000..07fae8e
--- /dev/null
+++ b/docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md
@@ -0,0 +1,61 @@
# Retail Data Visualization

## Features

- **LIDAR Data:** Simulated LIDAR sensor data including length, width, height, and size.
- **Weight Data:** Simulated weight sensor data with the weight of items and item IDs.
- **Barcode Data:** Simulated barcode scanner data including item code, item name, category, and price.
- **Periodic Data Updates:** Sensor data is automatically regenerated and updated every 5 seconds.

## API Endpoints

1. **Get LIDAR Data**

    - **Endpoint:** `/lidar`
    - **Method:** `GET`
    - **Description:** Fetches the simulated LIDAR sensor data. Data is regenerated each time this endpoint is called.
    - **Response:** A JSON array containing sensor data, including length, width, height, size, and timestamp.

2. **Get Weight Data**

    - **Endpoint:** `/weight`
    - **Method:** `GET`
    - **Description:** Fetches the simulated weight sensor data. Data is regenerated each time this endpoint is called.
    - **Response:** A JSON array containing weight sensor data with weight, item ID, and timestamp.

3. **Get Barcode Data**

    - **Endpoint:** `/barcode`
    - **Method:** `GET`
    - **Description:** Fetches the simulated barcode scanner data. Data is regenerated each time this endpoint is called.
    - **Response:** A JSON array containing barcode data, including item code, item name, category, price, and timestamp.

## Requirements

To run this Flask application, you will need the following:

- **Docker** (for containerization)

## Setup and Installation

1. Build the application

    Navigate to `src/retail-data-visualization` and run:

    ```
    make build
    ```

2. Run the application

    ```
    make run
    ```

3. Stop the application

    ```
    make down
    ```

    While the application is running, navigate to http://localhost:3000, open the dashboards tab, select the **Retail Analytics Dashboard**, and log in with the default credentials (`admin`/`admin`) if prompted (these can be changed later if required).

    The data shown here is fetched from `dummy_data_load.py`. To visualize different data, modify the `dashboard.json` file under `retail-data-visualization/grafana/dashboards`.
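As a quick smoke test of the three endpoints above, the sketch below polls each one and prints the returned JSON. It assumes the `requests` package and that the Flask service is reachable on `localhost:5000`; the port is an assumption, so adjust it to match your deployment.

```python
import requests

BASE_URL = "http://localhost:5000"  # assumed Flask port; adjust to your deployment

# Each endpoint regenerates its simulated data on every GET.
for endpoint in ("/lidar", "/weight", "/barcode"):
    resp = requests.get(BASE_URL + endpoint, timeout=5)
    resp.raise_for_status()
    print(endpoint, "->", resp.json())
```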
\ No newline at end of file
diff --git a/mkdocs.yml b/mkdocs.yml
index 4d09d36..e9508d8 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -56,6 +56,10 @@ nav:
       - Catalog:
         - Overview: 'use-cases/automated-self-checkout/catalog/Overview.md'
         - Getting Started: 'use-cases/automated-self-checkout/catalog/Get-Started-Guide.md'
+      - Services:
+        - Common Service: 'use-cases/automated-self-checkout/services/common-service.md'
+        - Pipeline Server: 'use-cases/automated-self-checkout/services/pipeline-server.md'
+        - Retail Data Visualization: 'use-cases/automated-self-checkout/services/retail-data-visualization.md'
       - AI Connect for Scientific Data (AiCSD):
         - Overview: 'use-cases/AiCSD/aicsd.md'
         - GRPC Yolov5s Pipeline: 'use-cases/AiCSD/pipeline-grpc-go.md'
@@ -85,4 +89,4 @@ markdown_extensions:
   - tables
 extra:
   version:
-    provider: 'mike'
+    provider: 'mike'
\ No newline at end of file

From a4ad2e8184e5b35006ed027e3ce3ce306775f3f9 Mon Sep 17 00:00:00 2001
From: "Yan Xue, Tan (Francis)" <118970371+francis-tanyx@users.noreply.github.com>
Date: Mon, 14 Apr 2025 13:03:04 +0800
Subject: [PATCH 2/4] del: remove retail data visualization

Removing the Retail Data Visualization Markdown because the service was removed in another PR.
---
 .../services/retail-data-visualization.md | 61 ------------------
 1 file changed, 61 deletions(-)
 delete mode 100644 docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md

diff --git a/docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md b/docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md
deleted file mode 100644
index 07fae8e..0000000
--- a/docs_src/use-cases/automated-self-checkout/services/retail-data-visualization.md
+++ /dev/null
@@ -1,61 +0,0 @@
-# Retail Data Visualization
-
-## Features
-
-- **LIDAR Data:** Simulated LIDAR sensor data including length, width, height, and size.
-- **Weight Data:** Simulated weight sensor data with the weight of items and item IDs.
-- **Barcode Data:** Simulated barcode scanner data including item code, item name, category, and price.
-- **Periodic Data Updates:** Sensor data is automatically regenerated and updated every 5 seconds.
-
-## API Endpoints
-
-1. **Get LIDAR Data**
-
-    - **Endpoint:** `/lidar`
-    - **Method:** `GET`
-    - **Description:** Fetches the simulated LIDAR sensor data. Data is regenerated each time this endpoint is called.
-    - **Response:** A JSON array containing sensor data, including length, width, height, size, and timestamp.
-
-2. **Get Weight Data**
-
-    - **Endpoint:** `/weight`
-    - **Method:** `GET`
-    - **Description:** Fetches the simulated weight sensor data. Data is regenerated each time this endpoint is called.
-    - **Response:** A JSON array containing weight sensor data with weight, item ID, and timestamp.
-
-3. **Get Barcode Data**
-
-    - **Endpoint:** `/barcode`
-    - **Method:** `GET`
-    - **Description:** Fetches the simulated barcode scanner data. Data is regenerated each time this endpoint is called.
-    - **Response:** A JSON array containing barcode data, including item code, item name, category, price, and timestamp.
-
-## Requirements
-
-To run this Flask application, you will need the following:
-
-- **Docker** (for containerization)
-
-## Setup and Installation
-
-1. Build the application
-
-    Navigate to `src/retail-data-visualization` and run:
-
-    ```
-    make build
-    ```
-
-2. Run the application
-
-    ```
-    make run
-    ```
-
-3. Stop the application
-
-    ```
-    make down
-    ```
-
-    While the application is running, navigate to http://localhost:3000, open the dashboards tab, select the **Retail Analytics Dashboard**, and log in with the default credentials (`admin`/`admin`) if prompted (these can be changed later if required).
-
-    The data shown here is fetched from `dummy_data_load.py`. To visualize different data, modify the `dashboard.json` file under `retail-data-visualization/grafana/dashboards`.
\ No newline at end of file

From 961f972a419c5cb12c3def65d3b5a4dd6a7997ce Mon Sep 17 00:00:00 2001
From: "Yan Xue, Tan (Francis)" <118970371+francis-tanyx@users.noreply.github.com>
Date: Mon, 14 Apr 2025 13:04:36 +0800
Subject: [PATCH 3/4] del: remove retail data visualization

Removal of the Retail Data Visualization documentation entry, matching its removal in PR https://github.com/intel-retail/automated-self-checkout/pull/684/.
---
 mkdocs.yml | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/mkdocs.yml b/mkdocs.yml
index e9508d8..e4bf89b 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -59,7 +59,6 @@ nav:
       - Services:
         - Common Service: 'use-cases/automated-self-checkout/services/common-service.md'
         - Pipeline Server: 'use-cases/automated-self-checkout/services/pipeline-server.md'
-        - Retail Data Visualization: 'use-cases/automated-self-checkout/services/retail-data-visualization.md'
       - AI Connect for Scientific Data (AiCSD):
         - Overview: 'use-cases/AiCSD/aicsd.md'
         - GRPC Yolov5s Pipeline: 'use-cases/AiCSD/pipeline-grpc-go.md'
@@ -88,4 +88,4 @@ markdown_extensions:
   - tables
 extra:
   version:
-    provider: 'mike'
\ No newline at end of file
+    provider: 'mike'

From bc3cca0761e7bce86b7bbb5a5981d892f88895a5 Mon Sep 17 00:00:00 2001
From: "Yan Xue, Tan (Francis)" <118970371+francis-tanyx@users.noreply.github.com>
Date: Mon, 14 Apr 2025 13:06:03 +0800
Subject: [PATCH 4/4] mod: Modified common-service.md

Modified common-service.md to match the changes made in PR https://github.com/intel-retail/automated-self-checkout/pull/684/.
---
 .../services/common-service.md | 114 ++++++++++--------
 1 file changed, 66 insertions(+), 48 deletions(-)
diff --git a/docs_src/use-cases/automated-self-checkout/services/common-service.md b/docs_src/use-cases/automated-self-checkout/services/common-service.md
index 8cad0e4..42feff8 100644
--- a/docs_src/use-cases/automated-self-checkout/services/common-service.md
+++ b/docs_src/use-cases/automated-self-checkout/services/common-service.md
@@ -1,30 +1,31 @@
-# Common Service: LiDAR & Weight Sensor Microservice
-
-This microservice manages **both LiDAR and Weight sensors** in a single container. It publishes sensor data over **MQTT**, **Kafka**, or **HTTP** (or any combination), controlled entirely by environment variables.
-
-## Overview
+# Common-Service: LiDAR & Weight Sensor Microservice
+
+This microservice manages **Barcode, LiDAR, and Weight sensors** in a single container. It publishes sensor data over **MQTT**, **Kafka**, or **HTTP** (or any combination), controlled entirely by environment variables.
+
+## 1. Overview
 
 - **Sensors**
-    - LiDAR & Weight support in the same codebase.
+    - Barcode, LiDAR, & Weight support in the same codebase.
     - Configuration for each sensor (e.g., ID, port, mock mode, intervals).
 
 - **Publishing**
     - `publisher.py` handles publishing to one or more protocols:
         - **MQTT**
         - **Kafka**
         - **HTTP**
 
 - **Apps**
-    - Two main modules:
+    - Three main modules:
+        - `barcode_app.py`
         - `lidar_app.py`
         - `weight_app.py`
     - Each uses shared methods from `publisher.py` & `config.py`.
 
-## Environment Variables
+## 2. Environment Variables
 
 All settings are defined in `docker-compose.yml` under the `asc_common_service` section. Key variables include:
 
 ### LiDAR
 
 | Variable | Description | Example |
@@ -60,29 +61,46 @@ All settings are defined in `docker-compose.yml` under the `asc_common_service`
 | WEIGHT_PUBLISH_INTERVAL | Interval (in seconds) for Weight data publishing | 1.0 |
 | WEIGHT_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |
 
+### Barcode
+
+| Variable | Description | Example |
+| --- | --- | --- |
+| BARCODE_COUNT | Number of Barcode sensors | 2 |
+| BARCODE_SENSOR_ID_1 | Unique ID for first Barcode sensor | barcode-001 |
+| BARCODE_SENSOR_ID_2 | Unique ID for second Barcode sensor (if any) | barcode-002 |
+| BARCODE_MOCK_1 | Enable mock data for first Barcode sensor (true/false) | true |
+| BARCODE_MQTT_ENABLE | Toggle MQTT publishing | true |
+| BARCODE_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker_1 |
+| BARCODE_MQTT_BROKER_PORT | MQTT broker port | 1883 |
+| BARCODE_MQTT_TOPIC | MQTT topic name for Barcode data | barcode/data |
+| BARCODE_KAFKA_ENABLE | Toggle Kafka publishing | false |
+| BARCODE_HTTP_ENABLE | Toggle HTTP publishing | false |
+| BARCODE_PUBLISH_INTERVAL | Interval (in seconds) for Barcode data publishing | 1.0 |
+| BARCODE_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |
+
 > **Note:** Set each `*_ENABLE` variable to `"true"` or `"false"` to enable or disable that protocol. Adjust intervals, logging levels, or sensor counts as needed.
 
-## Usage
+## 3. Usage
 
 1. **Build and Run**
 
-    ```bash
-    make run-demo
-    ```
+    ```bash
+    make build-sensors
+    make run-sensors
+    ```
 
     This spins up the `asc_common_service` container (and related services such as Mosquitto or Kafka, depending on your configuration).
 
 2. **Data Flow**
 
     - By default, LiDAR publishes to `lidar/data` (MQTT, if enabled) or `lidar-data` (Kafka), or to an HTTP endpoint if configured.
     - The Weight sensor similarly publishes to `weight/data` or `weight-data`.
 
 3. **Mock Mode**
 
     - Setting `LIDAR_MOCK_1="true"` (or `WEIGHT_MOCK_1="true"`) forces the sensor to generate **random** data rather than reading from actual hardware.
 
-## Testing
+## 4. Testing
 
 ### A. MQTT
 
-- **Grafana**: A pre-loaded dashboard named *Sensor-Analytics* is available at [http://localhost:3000](http://localhost:3000/) (default credentials `admin`/`admin`).
+- **Grafana**: A pre-loaded dashboard named *Retail Analytics Dashboard* is available at [http://localhost:3000](http://localhost:3000/) (default credentials `admin`/`admin`).
 - Check that the MQTT data source in Grafana points to `tcp://mqtt-broker_1:1883` (or `tcp://mqtt-broker:1883`, depending on the network).
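+
+For a quick check outside Grafana, a small subscriber can print whatever arrives on the sensor topics. This is a sketch that assumes the `paho-mqtt` package (v1.x callback API) and a broker reachable on `localhost:1883`; adjust host, port, and topics to match your configuration.
+
+```python
+import paho.mqtt.client as mqtt
+
+def on_message(client, userdata, msg):
+    # Print every reading published by the common service.
+    print(msg.topic, msg.payload.decode())
+
+client = mqtt.Client()  # paho-mqtt 1.x constructor
+client.on_message = on_message
+client.connect("localhost", 1883, 60)
+client.subscribe([("lidar/data", 0), ("weight/data", 0), ("barcode/data", 0)])
+client.loop_forever()
+```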
 
 ### B. Kafka
 
 - Enable Kafka for LiDAR/Weight by setting `LIDAR_KAFKA_ENABLE="true"` and/or `WEIGHT_KAFKA_ENABLE="true"`.
 - Test from inside the container:
 
     ```bash
     docker exec asc_common_service python kafka_publisher_test.py --topic lidar-data
     ```
 
     You should see incoming messages in the console.
 
 ### C. HTTP
 
 1. **Local Test (Inside Docker)**
 
     - Set `LIDAR_HTTP_URL="http://localhost:5000/api/lidar_data"` in the environment.
-    - Run `make run-demo` and wait for all containers to start.
+    - Run `make run-sensors` and wait for all containers to start.
     - Once up, execute:
 
         ```bash
         docker exec asc_common_service python http_publisher_test.py
         ```
 
     - This triggers the HTTP publisher and displays the received data inside the container.
 
 2. **Using an External Webhook Service**
 
     - Visit [Webhook.site](https://webhook.site/) and get a unique URL.
     - Set `LIDAR_HTTP_URL` to this URL.
-    - Run `make run-demo`, and you should see the HTTP requests arriving on the Webhook.site dashboard.
+    - Run `make run-sensors`, and you should see the HTTP requests arriving on the Webhook.site dashboard.
 
-## Contributing & Development
+## 5. Contributing & Development
 
 - **Code Structure**
     - `publisher.py`: Core publishing logic (MQTT, Kafka, HTTP).
     - `config.py`: Loads environment variables and configures each sensor.
-    - `lidar_app.py` and `weight_app.py`: Sensor-specific logic.
\ No newline at end of file
+    - `barcode_app.py`, `lidar_app.py`, and `weight_app.py`: Sensor-specific logic.