Commit d7b619f

[DSM] Update Kafka dashboard with new layout and updated widgets (DataDog#20950)
* [DSM] Update Kafka dashboard with new layout and updated widgets
* [DSM] Make more changes to kafka dashboard
* [DSM] Add integration links
* [DSM] Update kafka readme
* [DSM] Update kafka readme
* [DSM] Minor updates to kafka dashboard
* [DSM] Add new image for kafka dashboard
* [DSM] Update copy
1 parent aeafb8b commit d7b619f

File tree: 3 files changed (+953, −278 lines)


kafka/README.md

Lines changed: 13 additions & 8 deletions
@@ -4,26 +4,31 @@
 
 ## Overview
 
-View Kafka broker metrics collected for a 360-view of the health and performance of your Kafka clusters in real time. With this integration, you can collect metrics and logs from your Kafka deployment to visualize telemetry and alert on the performance of your Kafka stack.
+View Kafka broker metrics and logs for a 360-view of the health and performance of your Kafka clusters in real time.
 
-**Note**:
-- This check has a limit of 350 metrics per instance. The number of returned metrics is indicated in the Agent status output. Specify the metrics you are interested in by editing the configuration below. For more detailed instructions on customizing the metrics to collect, see the [JMX Checks documentation][2].
+Add [Data Streams Monitoring][24] to your producers and consumers to visualize the application topology, root cause issues across services, and measure end to end latency, throughput and lag.
+
+**Note**:
+
+- This check has a limit of 350 metrics per instance. The number of returned metrics is indicated in the Agent status output. Specify the metrics you are interested in by editing the configuration below. For more detailed instructions on customizing the metrics to collect, see the
+  [JMX Checks documentation][2].
 - This integration attached sample configuration works only for Kafka >= 0.8.2.
-  If you are running a version older than that, see the [Agent v5.2.x released sample files][22].
-- To collect Kafka consumer metrics, see the [kafka_consumer check][3].
+  If you are running a version older than that, see the [Agent v5.2.x released sample files][22].
 
-Consider [Data Streams Monitoring][24] to enhance your Kafka integration. This solution enables pipeline visualization and lag tracking, helping you identify and resolve bottlenecks.
+To collect Kafka consumer metrics, see the [kafka_consumer check][3].
 
 ## Setup
 
 ### Installation
 
-The Agent's Kafka check is included in the [Datadog Agent][4] package, so you don't need to install anything else on your Kafka nodes.
+The Agent's Kafka check is included in the [Datadog Agent][4] package, no additional installation is needed on your Kafka nodes.
 
 The check collects metrics from JMX with [JMXFetch][5]. A JVM is needed on each kafka node so the Agent can run JMXFetch. The same JVM that Kafka uses can be used for this.
 
 **Note**: The Kafka check cannot be used with Managed Streaming for Apache Kafka (Amazon MSK). Use the [Amazon MSK integration][6] instead.
 
+Add [Data Streams Monitoring][24] to your producers and consumers to visualize the application topology, root cause issues across services, and measure end to end latency, throughput and lag.
+
 ### Configuration
 
 <!-- xxx tabs xxx -->
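
For context beyond this diff: the paragraph added in this hunk points producers and consumers to Data Streams Monitoring, which is enabled in the application's tracing library rather than in this Agent check. A minimal sketch of container environment variables for a JVM-based Kafka service running with dd-java-agent, assuming the tracer's `DD_DATA_STREAMS_ENABLED` toggle; the service and environment names are illustrative:

```yaml
# Sketch only: env vars for a Java Kafka producer/consumer traced with dd-java-agent.
env:
  - name: DD_DATA_STREAMS_ENABLED   # opt the tracer into Data Streams Monitoring
    value: "true"
  - name: DD_SERVICE                # illustrative service name
    value: "orders-consumer"
  - name: DD_ENV                    # illustrative environment
    value: "prod"
```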
@@ -62,7 +67,7 @@ _Available for Agent versions >6.0_
    [%d] %p %m (%c)%n
    ```
 
-   Clone and edit the [integration pipeline][10] if you have a different format.
+   Clone and edit the [integration pipeline][10] if you have a different format.
 
 3. Collecting logs is disabled by default in the Datadog Agent, enable it in your `datadog.yaml` file:
 
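
For context beyond this diff: the step shown in this hunk enables log collection in `datadog.yaml` and the Kafka check's own config supplies the log source. A minimal sketch of the two pieces typically involved, with the log path and service name as illustrative placeholders:

```yaml
# datadog.yaml -- log collection is disabled by default in the Agent
logs_enabled: true
```

```yaml
# conf.d/kafka.d/conf.yaml -- sketch only; path and service are illustrative
logs:
  - type: file
    path: /var/log/kafka/server.log
    source: kafka
    service: kafka
```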
