Commit 83c3bf5

ETL/CDC: Implement suggestions by CodeRabbit

1 parent 997c81a
File tree: 7 files changed, +23 −22 lines


docs/_include/links.md

Lines changed: 5 additions & 3 deletions
```diff
@@ -1,3 +1,5 @@
+<!-- markdownlint-disable MD053 -->
+
 [Amazon DynamoDB Streams]: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html
 [Amazon Kinesis Data Streams]: https://docs.aws.amazon.com/streams/latest/dev/introduction.html
 [Apache Airflow]: https://airflow.apache.org/
@@ -21,9 +23,9 @@
 [Datashader]: https://datashader.org/
 [Dynamic Database Schemas]: https://cratedb.com/product/features/dynamic-schemas
 [DynamoDB]: https://aws.amazon.com/dynamodb/
-[DynamoDB CDC Relay]: https://cratedb-toolkit.readthedocs.io/io/dynamodb/cdc.html
-[DynamoDB CDC Relay with AWS Lambda]: https://cratedb-toolkit.readthedocs.io/io/dynamodb/cdc-lambda.html
-[DynamoDB Table Loader]: https://cratedb-toolkit.readthedocs.io/io/dynamodb/loader.html
+[DynamoDB CDC Relay]: inv:ctk:*:label#dynamodb-cdc
+[DynamoDB CDC Relay with AWS Lambda]: inv:ctk:*:doc#io/dynamodb/cdc-lambda
+[DynamoDB Table Loader]: inv:ctk:*:label#dynamodb-loader
 [Executable stack with Apache Kafka, Apache Flink, and CrateDB]: https://github.com/crate/cratedb-examples/tree/main/framework/flink/kafka-jdbcsink-java
 [Geospatial Data Model]: https://cratedb.com/data-model/geospatial
 [Geospatial Database]: https://cratedb.com/geospatial-spatial-database
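The replacement targets above use MyST cross-project (`inv:`) links instead of hard-coded URLs. As a sketch, assuming the `ctk` inventory key is registered in the project's Sphinx `intersphinx_mapping`, such a link definition reads:

```markdown
<!-- Assumed setup: "ctk" is an intersphinx inventory key, "*" matches any
     Sphinx domain, and "label"/"doc" narrow the match to that object type. -->
[DynamoDB CDC Relay]: inv:ctk:*:label#dynamodb-cdc
```

Resolving through the inventory lets intersphinx rewrite the URL at build time, so the link stays valid if the target project reorganizes its pages.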

docs/_include/styles.html

Lines changed: 2 additions & 4 deletions
```diff
@@ -47,10 +47,8 @@
 }
 
 /* On tiled link overview index pages, give ul/li elements more space */
-.ul-li-wide {
-  ul li {
-    margin-bottom: 1rem;
-  }
+.ul-li-wide ul li {
+  margin-bottom: 1rem;
 }
 
 </style>
```
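This change replaces a nested rule with an equivalent flat descendant selector. A minimal before/after sketch, assuming the motivation is compatibility (native CSS nesting is only understood by newer browsers):

```css
/* Nested form: requires native CSS Nesting support in the browser. */
.ul-li-wide {
  ul li {
    margin-bottom: 1rem;
  }
}

/* Flat descendant selector: same effect, understood by all browsers. */
.ul-li-wide ul li {
  margin-bottom: 1rem;
}
```

Both forms match `li` elements inside a `ul` within any element carrying the `ul-li-wide` class.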

docs/ingest/cdc/index.md

Lines changed: 4 additions & 5 deletions
```diff
@@ -13,7 +13,7 @@ features.
 This documentation section lists CDC applications,
 frameworks, and solutions, which can be used together with CrateDB,
 and outlines how to use them optimally.
-Please also take a look at support for {ref}`generic ETL <etl>` solutions.
+Additionally, see support for {ref}`generic ETL <etl>` solutions.
 :::
 
 
@@ -67,7 +67,7 @@ kinds of databases.
 ::::{grid-item-card} Debezium
 :link: debezium
 :link-type: ref
-Use, Debezium an open source distributed platform for change data capture for
+Use Debezium, an open source distributed platform for change data capture for
 loading data into CrateDB.
 It is used as a building block by a number of downstream third-party projects and products.
 ::::
@@ -92,9 +92,8 @@ Python interface. It is available for on-premises and as a managed service.
 ::::{grid-item-card} StreamSets
 :link: streamsets
 :link-type: ref
-Use the StreamSets Data Collector Engine to ingest and transform data from a variety
-of sources into CrateDB. It runs on-premises or in any cloud.
+Use the StreamSets Data Collector Engine to ingest and transform data from many
+sources into CrateDB. It runs on-premises or in any cloud.
 ::::
 
 :::::
-
```
docs/ingest/etl/index.md

Lines changed: 3 additions & 3 deletions
```diff
@@ -17,7 +17,7 @@ features.
 This documentation section lists ETL applications and
 frameworks which can be used together with CrateDB, and outlines how
 to use them optimally.
-Please also take a look at support for {ref}`cdc` solutions.
+Additionally, see support for {ref}`cdc` solutions.
 :::
 
 
@@ -162,8 +162,8 @@ Load data from database systems.
 - {ref}`streamsets`
 
 The StreamSets Data Collector is a lightweight and powerful engine that allows you
-to build streaming, batch and change-data-capture (CDC) pipelines that can ingest
-and transform data from a variety of sources.
+to build streaming, batch, and change-data-capture (CDC) pipelines that can ingest
+and transform data from many sources.
 
 +++
 Load data from streaming platforms.
```

docs/integrate/apache-airflow/index.md

Lines changed: 2 additions & 2 deletions
````diff
@@ -11,9 +11,9 @@
 
 ```{div}
 :style: "float: right"
-[![](https://19927462.fs1.hubspotusercontent-na1.net/hub/19927462/hubfs/Partner%20Logos/392x140/Apache-Airflow-Logo-392x140.png?width=784&height=280&name=Apache-Airflow-Logo-392x140.png){w=180px}](https://airflow.apache.org/)
+[![Apache Airflow logo](https://19927462.fs1.hubspotusercontent-na1.net/hub/19927462/hubfs/Partner%20Logos/392x140/Apache-Airflow-Logo-392x140.png?width=784&height=280&name=Apache-Airflow-Logo-392x140.png){w=180px}](https://airflow.apache.org/)
 
-[![](https://logowik.com/content/uploads/images/astronomer2824.jpg){w=180px}](https://www.astronomer.io/)
+[![Astronomer logo](https://logowik.com/content/uploads/images/astronomer2824.jpg){w=180px}](https://www.astronomer.io/)
 ```
 :::{div}
 [Apache Airflow] is an open source software platform to programmatically author,
````

docs/integrate/aws-lambda/index.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -9,8 +9,8 @@
 
 :::{div}
 [AWS Lambda] is a serverless compute service that runs your code in response to
-events and automatically manages the underlying compute resources for you. These
-events may include changes in state or an update.
+events and automatically manages the underlying compute resources for you.
+Events can include state changes and updates.
 :::
 
 :::{rubric} Learn
```

docs/integrate/n8n/index.md

Lines changed: 5 additions & 3 deletions
```diff
@@ -5,14 +5,16 @@
 :::
 
 [n8n] is a fair-code licensed workflow automation tool that combines AI capabilities
-with business process automation. It helps you to connect any app with an API with
+with business process automation. It helps you connect any app with an API to
 any other, and manipulate its data with little or no code.
 
 :::{rubric} Learn
 :::
 
-- https://cratedb.com/integrations/cratedb-and-n8n
-- https://n8n.io/integrations/cratedb/
+- [CrateDB and n8n integration]
+- [n8n CrateDB integration]
 
 
+[CrateDB and n8n integration]: https://cratedb.com/integrations/cratedb-and-n8n
 [n8n]: https://docs.n8n.io/
+[n8n CrateDB integration]: https://n8n.io/integrations/cratedb/
```
