A pinging project.
__,---,
.---. /__|o\ ) .-"-. .----.""".
/ 6_6 `-\ / / / 4 4 \ /____/ (0 )\
\_ (__\ ,) (, \_ v _/ `--\_ /
// \\ // \\ // \\ // \\
(( )) {( )} (( )) {{ }}
=======""===""=========""===""======""===""=========""===""=======
jgs ||| ||||| ||| |||
| ||| | '|'ASCII Copyright
$ pip install -e git+https://github.com/byashimov/pingeon.git#egg=pingeon
1. Producer
===========
+---------------+ +--------------+ +-----------------+
| | func() | | Log | |
| Regular job | -----> | Producer | ----> | Kafka producer |
| | | | | |
+---------------+ +--------------+ +-----------------+
^
| func() -> dict
+-------------+
| +-------+ | Create Log
| | check | | with unique Log.uid
| +-------+ |
| |
| +-------+ |
| | check | |
| +-------+ |
+-------------+
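The producer flow above can be sketched roughly like this. All names and signatures here are illustrative assumptions, not the project's actual API; the real `producer` hands the resulting logs to a Kafka producer instead of returning them:

```python
import asyncio
import uuid

async def check_homepage() -> dict:
    # a "checker" is just an async function returning a dict
    return {"url": "https://example.com", "status": 200}

def make_log(check_name: str, result: dict) -> dict:
    # a Log is labeled and carries a unique uid, so the consumer can
    # deduplicate when Kafka redelivers (at-least-once semantics)
    return {"uid": uuid.uuid4().hex, "name": check_name, "result": result}

async def producer(checks) -> list[dict]:
    # run every checker concurrently and wrap each result as a Log
    results = await asyncio.gather(*(c() for c in checks))
    return [make_log(c.__name__, r) for c, r in zip(checks, results)]

logs = asyncio.run(producer([check_homepage]))
```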
---------------------------------
2. Consumer
===========
+---------------+ +--------------+ +-----------------+
| | func() | | Log | |
| Regular job | -----> | Consumer | <---- | Kafka consumer |
| | | | | |
+---------------+ +--------------+ +-----------------+
| Log
V
+-----------------+
INSERT | |
ON CONFLICT UID | Postgres |
DO NOTHING | client |
PARTITION BY RANGE | |
+-----------------+

The project contains several components:
- A regular job `worker`. Runs a given `func` every given `interval` in seconds.
- `producer`, which runs any number of `checkers` and saves their results as a
  labeled `Log` with a unique uid to Kafka. "Checkers" are just regular functions.
- `consumer` is also run by `worker`; it reads the Kafka topic and saves data to a
  partitioned table in Postgres. Since Kafka guarantees at-least-once delivery, it
  doesn't fail on an already existing log uid.
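The consumer's idempotent write can be sketched as follows. The SQL in the comment is illustrative (table and column names are assumptions, not the project's actual schema), and the `save` function below is an in-memory stand-in for the same "insert or silently skip" behaviour:

```python
# Illustrative Postgres side (assumed table/column names):
#
#   CREATE TABLE logs (
#       uid text NOT NULL,
#       created_at timestamptz NOT NULL,
#       payload jsonb,
#       PRIMARY KEY (uid, created_at)
#   ) PARTITION BY RANGE (created_at);
#
#   INSERT INTO logs (uid, created_at, payload)
#   VALUES (%s, %s, %s)
#   ON CONFLICT DO NOTHING;

def save(storage: dict, log: dict) -> bool:
    # in-memory stand-in for ON CONFLICT DO NOTHING:
    # a redelivered log with a known uid is silently skipped
    if log["uid"] in storage:
        return False
    storage[log["uid"]] = log
    return True

db = {}
first = save(db, {"uid": "abc", "result": {"status": 200}})
again = save(db, {"uid": "abc", "result": {"status": 200}})  # Kafka redelivery
```

This is what makes at-least-once delivery safe: a duplicate insert becomes a no-op instead of an error.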
Every component is well isolated by simple interfaces:
- `worker` runs any async function
- `producer` runs any async function, which however must return a dictionary
- `checker` does whatever it does until it returns a dictionary
- both `consumer` and `producer` use "repositories" to read or save data. All of
  them use the `Log` object as a contract.
See `tests/test_integration.py` for a usage example.