147 changes: 147 additions & 0 deletions docs_src/use-cases/automated-self-checkout/services/common-service.md
# Common-Service: Barcode, LiDAR & Weight Sensor Microservice
This microservice manages **Barcode, LiDAR, and Weight sensors** in a single container. It publishes sensor data over **MQTT**, **Kafka**, or **HTTP** (or any combination), controlled entirely by environment variables.
## 1. Overview

- **Sensors**
    - Barcode, LiDAR, and Weight support in the same codebase.
    - Configuration for each sensor (e.g., ID, port, mock mode, intervals).

- **Publishing**
    - `publisher.py` handles publishing to one or more protocols:
        - **MQTT**
        - **Kafka**
        - **HTTP**

- **Apps**
    - Three main modules:
        - `barcode_app.py`
        - `lidar_app.py`
        - `weight_app.py`
    - Each uses shared methods from `publisher.py` and `config.py`.

## 2. Environment Variables
All settings are defined in `docker-compose.yml` under the `asc_common_service` section. Key variables include:
### LiDAR
| Variable | Description | Example |
| --- | --- | --- |
| LIDAR_COUNT | Number of LiDAR sensors | 2 |
| LIDAR_SENSOR_ID_1 | Unique ID for first LiDAR sensor | lidar-001 |
| LIDAR_SENSOR_ID_2 | Unique ID for second LiDAR sensor (if any) | lidar-002 |
| LIDAR_MOCK_1 | Enable mock data for first LiDAR sensor (true/false) | true |
| LIDAR_MQTT_ENABLE | Toggle MQTT publishing | true |
| LIDAR_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker or mqtt-broker_1 |
| LIDAR_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| LIDAR_KAFKA_ENABLE | Toggle Kafka publishing | true |
| KAFKA_BOOTSTRAP_SERVERS | Kafka bootstrap server addresses | kafka:9093 |
| LIDAR_KAFKA_TOPIC | Kafka topic name for LiDAR data | lidar-data |
| LIDAR_HTTP_ENABLE | Toggle HTTP publishing | true |
| LIDAR_HTTP_URL | HTTP endpoint URL for LiDAR data | http://localhost:5000/api/lidar_data |
| LIDAR_PUBLISH_INTERVAL | Interval (in seconds) for LiDAR data publishing | 1.0 |
| LIDAR_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

### Weight
| Variable | Description | Example |
| --- | --- | --- |
| WEIGHT_COUNT | Number of Weight sensors | 2 |
| WEIGHT_SENSOR_ID_1 | Unique ID for first Weight sensor | weight-001 |
| WEIGHT_SENSOR_ID_2 | Unique ID for second Weight sensor (if any) | weight-002 |
| WEIGHT_MOCK_1 | Enable mock data for first Weight sensor (true/false) | true |
| WEIGHT_MQTT_ENABLE | Toggle MQTT publishing | true |
| WEIGHT_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker_1 |
| WEIGHT_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| WEIGHT_MQTT_TOPIC | MQTT topic name for Weight data | weight/data |
| WEIGHT_KAFKA_ENABLE | Toggle Kafka publishing | false |
| WEIGHT_HTTP_ENABLE | Toggle HTTP publishing | false |
| WEIGHT_PUBLISH_INTERVAL | Interval (in seconds) for Weight data publishing | 1.0 |
| WEIGHT_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

### Barcode
| Variable | Description | Example |
| --- | --- | --- |
| BARCODE_COUNT | Number of Barcode sensors | 2 |
| BARCODE_SENSOR_ID_1 | Unique ID for first Barcode sensor | barcode-001 |
| BARCODE_SENSOR_ID_2 | Unique ID for second Barcode sensor (if any) | barcode-002 |
| BARCODE_MOCK_1 | Enable mock data for first Barcode sensor (true/false) | true |
| BARCODE_MQTT_ENABLE | Toggle MQTT publishing | true |
| BARCODE_MQTT_BROKER_HOST | MQTT broker host | mqtt-broker_1 |
| BARCODE_MQTT_BROKER_PORT | MQTT broker port | 1883 |
| BARCODE_MQTT_TOPIC | MQTT topic name for Barcode data | barcode/data |
| BARCODE_KAFKA_ENABLE | Toggle Kafka publishing | false |
| BARCODE_HTTP_ENABLE | Toggle HTTP publishing | false |
| BARCODE_PUBLISH_INTERVAL | Interval (in seconds) for Barcode data publishing | 1.0 |
| BARCODE_LOG_LEVEL | Logging level (DEBUG, INFO, etc.) | INFO |

> **Note:** Set each `*_ENABLE` variable to `"true"` or `"false"` to enable or disable the corresponding protocol. Adjust intervals, logging levels, and sensor counts as needed.
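
Once the stack is up (see the Usage section below), a quick way to confirm which values the container actually received is to inspect its environment; a minimal sketch, assuming the container is running under its default name:

```bash
# List the effective sensor configuration inside the running container.
docker exec asc_common_service env | grep -E '^(LIDAR|WEIGHT|BARCODE)_' | sort
```
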
## 3. Usage

1. **Build and Run**

```bash
make build-sensors
make run-sensors
```
This spins up the `asc_common_service` container (and related services such as Mosquitto or Kafka, depending on your configuration); a quick way to confirm everything started is shown after this list.

2. **Data Flow**
- By default, LiDAR publishes to `lidar/data` (MQTT, if enabled), to `lidar-data` (Kafka), or to an HTTP endpoint if configured.

- The Weight sensor similarly publishes to `weight/data` (MQTT) or `weight-data` (Kafka).

3. **Mock Mode**
- Setting `LIDAR_MOCK_1="true"` (or the equivalent `WEIGHT_MOCK_1`/`BARCODE_MOCK_1` variables) forces the corresponding sensor to generate **random** data rather than reading from actual hardware.
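
To confirm the stack from step 1 came up cleanly, check the container status and follow its logs; a minimal sketch:

```bash
# Verify the sensor container is up and follow its startup output.
docker ps --filter "name=asc_common_service"
docker logs -f asc_common_service
```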

## 4. Testing

### A. MQTT

- **Grafana** : A pre-loaded dashboard named *Retail Analytics Dashboard* is available at [http://localhost:3000](http://localhost:3000/) (default credentials `admin`/`admin`).

- Check that the MQTT data source in Grafana points to `tcp://mqtt-broker_1:1883` (or `tcp://mqtt-broker:1883`, depending on the network).
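
Besides Grafana, you can watch the raw messages directly with an MQTT client; a minimal sketch, assuming the `mosquitto-clients` tools are installed on the host and the broker's port 1883 is published:

```bash
# Print every message on the LiDAR and Weight topics, prefixed with its topic name.
mosquitto_sub -h localhost -p 1883 -t 'lidar/data' -t 'weight/data' -v
```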

### B. Kafka

- Enable Kafka for LiDAR/Weight by setting `LIDAR_KAFKA_ENABLE="true"` and/or `WEIGHT_KAFKA_ENABLE="true"`.

- Test from inside the container:

```bash
docker exec asc_common_service python kafka_publisher_test.py --topic lidar-data
```
You should see incoming messages in the console.
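
You can also consume the topic straight from the broker; a sketch, assuming the broker container is named `kafka` and ships the standard console tools (the script name and path vary between Kafka images):

```bash
# Read the lidar-data topic from the beginning using the broker's console consumer.
docker exec -it kafka \
  kafka-console-consumer.sh \
    --bootstrap-server kafka:9093 \
    --topic lidar-data \
    --from-beginning
```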

### C. HTTP

1. **Local Test (Inside Docker)**

- Set `LIDAR_HTTP_URL="http://localhost:5000/api/lidar_data"` in the environment.
- Run `make run-sensors` and wait for all containers to start.
- Once up, execute:

```bash
docker exec asc_common_service python http_publisher_test.py
```

- This will trigger the HTTP publisher and display the received data inside the container.

2. **Using an External Webhook Service**

- Visit [Webhook.site](https://webhook.site/) and get a unique URL.
- Set `LIDAR_HTTP_URL` to this URL.
- Run `make run-sensors`, and you should see the HTTP requests arriving on the Webhook.site dashboard.
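
Either way, you can check the endpoint independently of the service by posting a hand-crafted payload with cURL; a sketch with a purely illustrative body (the real schema is produced by `publisher.py`):

```bash
# Send a single test payload to whatever LIDAR_HTTP_URL points at.
# Replace the URL with your local endpoint or unique Webhook.site address;
# the JSON fields below are illustrative only.
curl -X POST "https://webhook.site/<your-unique-id>" \
  -H "Content-Type: application/json" \
  -d '{"sensor_id": "lidar-001", "distance_m": 1.23}'
```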



## 5. Contributing & Development

- **Code Structure**
    - `publisher.py`: Core publishing logic (MQTT, Kafka, HTTP).
    - `config.py`: Loads environment variables and configures each sensor.
    - `barcode_app.py`, `lidar_app.py`, and `weight_app.py`: Sensor-specific logic.

69 changes: 69 additions & 0 deletions docs_src/use-cases/automated-self-checkout/services/pipeline-server.md
# Pipeline Server

## Prerequisites

* Prepare models

Use the model downloader [available here](https://github.com/dlstreamer/pipeline-server/tree/main/tools/model_downloader) to download new models. Point `MODEL_DIR` to the directory containing the new models. The following section assumes that the new models are available under `$(pwd)/models`.
```bash
$ export MODEL_DIR=$(pwd)/models
```
For example, the YOLOv5s model used later in this guide would appear as:
```
object_detection/yolov5s/yolov5s.json
object_detection/yolov5s/FP32
```

* Prepare pipelines

Use [these docs](https://github.com/dlstreamer/pipeline-server/blob/main/docs/defining_pipelines.md) to get started with defining new pipelines. Once the new pipelines have been defined, point `PIPELINE_DIR` to the directory containing the new pipelines. The following section assumes that the new pipelines are available under `$(pwd)/pipelines`.
```bash
$ export PIPELINE_DIR=$(pwd)/pipelines
```

* Run the image with new models and pipelines mounted into the container
```bash
$ docker run -itd \
--privileged \
--device=/dev:/dev \
--device-cgroup-rule='c 189:* rmw' \
--device-cgroup-rule='c 209:* rmw' \
--group-add 109 \
--name evam \
-p 8080:8080 \
-p 8554:8554 \
-e ENABLE_RTSP=true \
-e RTSP_PORT=8554 \
-e ENABLE_WEBRTC=true \
-e WEBRTC_SIGNALING_SERVER=ws://localhost:8443 \
-e RUN_MODE=EVA \
-e DETECTION_DEVICE=CPU \
-e CLASSIFICATION_DEVICE=CPU \
-v $MODEL_DIR:/home/pipeline-server/models \
-v $PIPELINE_DIR:/home/pipeline-server/pipelines \
dlstreamer:dev
```
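
Before starting a pipeline, it is worth checking that the server is responding and has loaded the mounted pipelines; a quick check, assuming the standard pipeline-server REST API on port 8080:

```bash
# List the pipelines the server has loaded; the object_detection pipelines
# mounted above should appear in the response.
$ curl localhost:8080/pipelines
```
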
## Starting pipelines
* Pipelines can be triggered through the *pipeline server's* REST endpoints. Below is an example cURL command; the resulting output is available as an RTSP stream at *rtsp://<host ip>:8554/pipeline-server*.
```bash
$ curl localhost:8080/pipelines/object_detection/yolov5 -X POST -H \
'Content-Type: application/json' -d \
'{
"source": {
"uri": "rtsp://192.168.1.141:8555/camera_0",
"type": "uri"
},
"destination": {
"metadata": {
"type": "file",
"path": "/tmp/results.jsonl",
"format": "json-lines"
},
"frame": {
"type": "rtsp",
"path": "pipeline-server"
}
},
"parameters": {
"detection-device": "CPU",
"network": "FP16-INT8"
}
}'
```
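
Once the pipeline is running, the two destinations from the request above can be inspected directly; a sketch, assuming `ffplay` (or another RTSP-capable player) is installed on the host:

```bash
# Follow the JSON-lines metadata that the pipeline writes inside the container.
$ docker exec evam tail -f /tmp/results.jsonl

# View the rendered frames over RTSP; substitute the host's IP address.
$ ffplay rtsp://<host ip>:8554/pipeline-server
```
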
3 changes: 3 additions & 0 deletions mkdocs.yml
nav:
- Catalog:
- Overview: 'use-cases/automated-self-checkout/catalog/Overview.md'
- Getting Started: 'use-cases/automated-self-checkout/catalog/Get-Started-Guide.md'
- Services:
- Common Service: 'use-cases/automated-self-checkout/services/common-service.md'
- Pipeline Server: 'use-cases/automated-self-checkout/services/pipeline-server.md'
- AI Connect for Scientific Data (AiCSD):
- Overview: 'use-cases/AiCSD/aicsd.md'
- GRPC Yolov5s Pipeline: 'use-cases/AiCSD/pipeline-grpc-go.md'