Commit af7d264

Integrations: Slightly improve / re-organize Kafka and Node-RED sections
1 parent c1af1ca commit af7d264

2 files changed (+22 / -18 lines)

docs/integrate/kafka/docker-python.md

Lines changed: 7 additions & 5 deletions
@@ -50,7 +50,7 @@ docker compose up -d
 * CrateDB Admin UI: `http://localhost:4200`
 * Kafka broker (inside-compose hostname): kafka:9092
 
-### Create a demo table in CrateDB
+### Create a CrateDB table
 
 The easiest way to do this is through the CrateDB Admin UI at `http://localhost:4200` and execute this using the console:
 
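For reference, the statement executed in the console is the same one shown in the `curl` variant below in the same file, formatted for readability:

```sql
CREATE TABLE IF NOT EXISTS sensor_readings (
  device_id TEXT,
  ts TIMESTAMPTZ,
  temperature DOUBLE PRECISION,
  humidity DOUBLE PRECISION,
  PRIMARY KEY (device_id, ts)
);
```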

@@ -69,7 +69,7 @@ But this can also be done using `curl`:
 curl -sS -H 'Content-Type: application/json' -X POST http://localhost:4200/_sql -d '{"stmt":"CREATE TABLE IF NOT EXISTS sensor_readings (device_id TEXT, ts TIMESTAMPTZ, temperature DOUBLE PRECISION, humidity DOUBLE PRECISION, PRIMARY KEY (device_id, ts))"}'
 ```
 
-### Create a Kafka topic and send a couple of messages
+### Create a Kafka topic
 
 Creating a Kafka topic can be done in several ways, we are selecting to use
 `docker exec` in this way:
@@ -79,18 +79,20 @@ docker exec -it kafka kafka-topics.sh --create --topic sensors --bootstrap-serve
 
 ## Process events
 
+### Submit events to Kafka
 ```bash
 docker exec -it kafka kafka-console-producer.sh --bootstrap-server kafka:9092 --topic sensors <<'EOF'
 {"device_id":"alpha","ts":"2025-08-19T12:00:00Z","temperature":21.4,"humidity":48.0}
 {"device_id":"alpha","ts":"2025-08-19T12:01:00Z","temperature":21.5,"humidity":47.6}
 {"device_id":"beta","ts":"2025-08-19T12:00:00Z","temperature":19.8,"humidity":55.1}
 EOF
 ```
+Events (messages) are newline-delimited JSON for simplicity.
 
-Messages are newline-delimited JSON for simplicity.
+### Consume events into CrateDB
 
-
-Create a simple consumer using Python.
+Create a simple consumer application using Python. It consumes events from the
+Kafka topic and inserts them into the CrateDB database table.
 
 ```python
 # quick_consumer.py
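The excerpt ends before the body of `quick_consumer.py`. A minimal sketch of such a consumer, assuming the `kafka-python` and `crate` packages and the hostnames from the compose setup (the actual file in the commit may differ), could look like this:

```python
# Hypothetical sketch; not the quick_consumer.py from the commit.
# Assumes `pip install kafka-python crate` and the compose services above.
import json

INSERT_STMT = (
    "INSERT INTO sensor_readings (device_id, ts, temperature, humidity) "
    "VALUES (?, ?, ?, ?)"
)

def to_row(raw: bytes) -> tuple:
    """Decode one newline-delimited JSON event into an INSERT parameter tuple."""
    event = json.loads(raw)
    return (event["device_id"], event["ts"], event["temperature"], event["humidity"])

def run() -> None:
    # Imports kept local so the parsing helper stays usable without Kafka installed.
    from kafka import KafkaConsumer  # kafka-python
    from crate import client         # CrateDB Python driver

    consumer = KafkaConsumer(
        "sensors",
        bootstrap_servers="localhost:9092",  # kafka:9092 from inside the compose network
        auto_offset_reset="earliest",
    )
    connection = client.connect("http://localhost:4200")
    cursor = connection.cursor()
    for message in consumer:
        # One INSERT per event; a production consumer would batch.
        cursor.execute(INSERT_STMT, to_row(message.value))
```

Call `run()` to start consuming; it blocks and inserts each event as it arrives.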

docs/integrate/node-red/mqtt-tutorial.md

Lines changed: 15 additions & 13 deletions
@@ -23,19 +23,7 @@ You need:
 2. The [node-red-contrib-postgresql](https://github.com/alexandrainst/node-red-contrib-postgresql) module installed.
 3. A running MQTT broker. This tutorial uses [HiveMQ Cloud](https://www.hivemq.com/).
 
-## Producing data
-
-First, generate data to populate the MQTT topic with Node-RED. If you already
-have an MQTT topic with regular messages, you can skip this part.
-![Screenshot 2021-09-13 at 14.58.42|690x134, 50%](https://us1.discourse-cdn.com/flex020/uploads/crate/original/1X/5722946039148ca6ce69702d963f9f842c4f972c.png){width=480px}
-
-The `inject` node creates a JSON payload with three attributes:
-![Screenshot 2021-09-13 at 14.56.42|690x293, 50%](https://us1.discourse-cdn.com/flex020/uploads/crate/original/1X/8084a53e544d681e79f85d780c621a340a7d0d30.png){width=480px}
-
-In this example, two fields are static; only the timestamp changes.
-Download the full workflow definition: [flows-producer.json](https://community.cratedb.com/uploads/short-url/eOvAk3XzDkRbNZjcZV0pZ0SnGu4.json) (1.3 KB)
-
-## Consuming and ingesting data
+## Provision CrateDB
 
 First of all, we create the target table in CrateDB:
 ```sql
@@ -49,6 +37,20 @@ Store the payload as CrateDB’s {ref}`OBJECT data type
 <crate-reference:type-object>` to accommodate an evolving schema.
 For production, also consider the {ref}`partitioning and sharding guide <sharding-partitioning>`.
 
+## Publish messages to MQTT
+
+First, generate data to populate the MQTT topic with Node-RED. If you already
+have an MQTT topic with regular messages, you can skip this part.
+![Screenshot 2021-09-13 at 14.58.42|690x134, 50%](https://us1.discourse-cdn.com/flex020/uploads/crate/original/1X/5722946039148ca6ce69702d963f9f842c4f972c.png){width=480px}
+
+The `inject` node creates a JSON payload with three attributes:
+![Screenshot 2021-09-13 at 14.56.42|690x293, 50%](https://us1.discourse-cdn.com/flex020/uploads/crate/original/1X/8084a53e544d681e79f85d780c621a340a7d0d30.png){width=480px}
+
+In this example, two fields are static; only the timestamp changes.
+Download the full workflow definition: [flows-producer.json](https://community.cratedb.com/uploads/short-url/eOvAk3XzDkRbNZjcZV0pZ0SnGu4.json) (1.3 KB)
+
+## Consume messages into CrateDB
+
 To ingest efficiently, group messages into batches and use
 {ref}`multi-value INSERT statements <inserts-multiple-values>`
 to avoid generating one INSERT per message:
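The batching idea this hunk describes can be sketched outside Node-RED as well. A hypothetical Python helper (table and column names are illustrative, not taken from the tutorial, which implements batching in Node-RED nodes) that folds a batch of MQTT payloads into one multi-value INSERT:

```python
# Illustration of multi-value INSERT batching; names are hypothetical.
import json

def build_batch_insert(table: str, payloads: list) -> tuple:
    """Fold a batch of JSON payloads into one parameterized multi-value
    INSERT statement plus its flat argument list."""
    rows = [json.loads(p) if isinstance(p, str) else p for p in payloads]
    placeholders = ", ".join("(?, ?)" for _ in rows)
    stmt = f"INSERT INTO {table} (ts, payload) VALUES {placeholders}"
    args = []
    for row in rows:
        args.append(row["timestamp"])   # assumed timestamp field in the payload
        args.append(json.dumps(row))    # full payload stored as OBJECT column
    return stmt, args

# One statement for the whole batch instead of one INSERT per message.
stmt, args = build_batch_insert(
    "mqtt_raw",
    ['{"timestamp": 1631538000000, "value": 10}',
     '{"timestamp": 1631538060000, "value": 11}'],
)
```

Sending one such statement per batch, rather than one INSERT per message, is what keeps the ingest path efficient.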
