# rust-rdkafka

[![crates.io](https://img.shields.io/crates/v/rdkafka.svg)](https://crates.io/crates/rdkafka)
[![docs.rs](https://docs.rs/rdkafka/badge.svg)](https://docs.rs/rdkafka/)
[![Build Status](https://travis-ci.org/fede1024/rust-rdkafka.svg?branch=master)](https://travis-ci.org/fede1024/rust-rdkafka)
[![coverage](https://codecov.io/gh/fede1024/rust-rdkafka/graphs/badge.svg?branch=master)](https://codecov.io/gh/fede1024/rust-rdkafka/)
[![Join the chat at https://gitter.im/rust-rdkafka/Lobby](https://badges.gitter.im/rust-rdkafka/Lobby.svg)](https://gitter.im/rust-rdkafka/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)

A fully asynchronous, [futures]-enabled [Apache Kafka] client
library for Rust based on [librdkafka].

## The library

`rust-rdkafka` provides a safe Rust interface to librdkafka. The master
branch is currently based on librdkafka 1.4.2.

### Documentation

- [Latest release](https://docs.rs/rdkafka/)
- [Changelog](https://github.com/fede1024/rust-rdkafka/blob/master/changelog.md)

### Features

The main features provided at the moment are:

- Support for all Kafka versions since 0.8.x. For more information about
  broker compatibility options, check the [librdkafka
  documentation][broker-compat].
- Consume from single or multiple topics.
- Automatic consumer rebalancing.
- Customizable rebalance, with pre- and post-rebalance callbacks.
- Synchronous or asynchronous message production.
- Customizable offset commit.
- Create and delete topics; add and edit partitions.
- Alter broker and topic configurations.
- Access to cluster metadata (list of topic-partitions, replicas, active
  brokers, etc.).
- Access to group metadata (list groups, list members of groups, hostnames,
  etc.).
- Access to producer and consumer metrics, errors, and callbacks.

### One million messages per second

`rust-rdkafka` is designed to be easy and safe to use thanks to the
abstraction layer written in Rust, while at the same time being extremely
fast thanks to the librdkafka C library.

Here are some benchmark results using the [`BaseProducer`],
sending data to a single Kafka 0.11 process running on localhost (default
configuration, 3 partitions). Hardware: Dell laptop, with Intel Core
i7-4712HQ @ 2.30GHz.

- Scenario: produce 5 million messages, 10 bytes each, wait for all of them to be acked
  - 1045413 messages/s, 9.970 MB/s  (average over 5 runs)

- Scenario: produce 100000 messages, 10 KB each, wait for all of them to be acked
  - 24623 messages/s, 234.826 MB/s  (average over 5 runs)

For more numbers, check out the [kafka-benchmark] project.

### Client types

`rust-rdkafka` provides low-level and high-level consumers and producers.

Low level:

* [`BaseConsumer`]: a simple wrapper around the librdkafka consumer. It
  must be `poll()`ed periodically in order to execute callbacks, handle
  rebalances, and receive messages.
* [`BaseProducer`]: a simple wrapper around the librdkafka producer. As in
  the consumer case, the user must call `poll()` periodically to execute
  delivery callbacks.
* [`ThreadedProducer`]: a `BaseProducer` with a separate thread dedicated to
  polling the producer.

High level:

* [`StreamConsumer`]: a [`Stream`] of messages that takes care of
  polling the consumer automatically.
* [`FutureProducer`]: a [`Future`] that will be completed once
  the message is delivered to Kafka (or delivery fails).

For more information about consumers and producers, refer to their
module-level documentation.
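
As a minimal sketch of the low-level API (not taken from the library's own
examples), producing a single message with a [`BaseProducer`] looks roughly
like this; the broker address and topic name are placeholders:

```rust
use std::time::Duration;

use rdkafka::config::ClientConfig;
use rdkafka::producer::{BaseProducer, BaseRecord, Producer};

fn main() {
    // Placeholder broker address; adjust for your cluster.
    let producer: BaseProducer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092")
        .create()
        .expect("producer creation failed");

    // Enqueue a message; actual delivery happens asynchronously.
    if let Err((error, _record)) =
        producer.send(BaseRecord::to("example-topic").key("key").payload("payload"))
    {
        eprintln!("failed to enqueue message: {}", error);
    }

    // A BaseProducer must be polled (or flushed) so that delivery
    // callbacks run and the internal queue drains.
    producer.flush(Duration::from_secs(5));
}
```

The high-level [`FutureProducer`] and [`StreamConsumer`] expose the same
functionality through `async`-friendly interfaces.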

*Warning*: the library is under active development and the APIs are likely
to change.

### Asynchronous data processing with Tokio

[Tokio] is a platform for fast processing of asynchronous events in Rust.
The interfaces exposed by the [`StreamConsumer`] and the [`FutureProducer`]
allow rust-rdkafka users to easily integrate Kafka consumers and producers
within the Tokio platform, and write asynchronous message processing code.
Note that rust-rdkafka can be used without Tokio.

To see rust-rdkafka in action with Tokio, check out the
[asynchronous processing example] in the examples folder.
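
A rough sketch of such a consumer loop is shown below. It assumes that
`tokio` (with the `macros` feature) and `futures` are also declared as
dependencies; the broker address, group id, and topic name are placeholders,
and the method used to obtain the message stream has been renamed across
releases, so check the [`StreamConsumer`] documentation for your version.

```rust
use futures::{pin_mut, StreamExt};
use rdkafka::config::ClientConfig;
use rdkafka::consumer::{Consumer, StreamConsumer};
use rdkafka::message::Message;

#[tokio::main]
async fn main() {
    let consumer: StreamConsumer = ClientConfig::new()
        .set("group.id", "example-group")
        .set("bootstrap.servers", "localhost:9092")
        .create()
        .expect("consumer creation failed");

    consumer
        .subscribe(&["example-topic"])
        .expect("subscription failed");

    // `start()` returns a `Stream` of messages; newer releases expose
    // `stream()`/`recv()` instead.
    let stream = consumer.start();
    pin_mut!(stream);
    while let Some(message) = stream.next().await {
        match message {
            Ok(m) => println!("received: {:?}", m.payload_view::<str>()),
            Err(e) => eprintln!("kafka error: {}", e),
        }
    }
}
```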

### At-least-once delivery

At-least-once delivery semantics are common in many streaming applications:
every message is guaranteed to be processed at least once; in case of
temporary failure, the message can be re-processed and/or re-delivered,
but no message will be lost.

In order to implement at-least-once delivery, the stream processing
application has to carefully commit the offset only once the message has
been processed. Committing the offset too early might instead cause
message loss, since upon recovery the consumer will start from the next
message, skipping the one where the failure occurred.
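
The sketch below illustrates this commit-after-processing pattern with a
[`BaseConsumer`] and manual, synchronous commits; the broker address, group
id, and topic name are placeholders, and the complete approach (including
the producer side) is covered by the example linked below.

```rust
use std::time::Duration;

use rdkafka::config::ClientConfig;
use rdkafka::consumer::{BaseConsumer, CommitMode, Consumer};

fn main() {
    let consumer: BaseConsumer = ClientConfig::new()
        .set("group.id", "at-least-once-example")
        .set("bootstrap.servers", "localhost:9092")
        // Disable automatic commits: offsets are committed explicitly,
        // and only after a message has been processed.
        .set("enable.auto.commit", "false")
        .create()
        .expect("consumer creation failed");

    consumer
        .subscribe(&["example-topic"])
        .expect("subscription failed");

    loop {
        match consumer.poll(Duration::from_secs(1)) {
            Some(Ok(message)) => {
                // ... process the message here ...

                // Commit only once processing has succeeded. Committing
                // before this point could lose the message on failure.
                consumer
                    .commit_message(&message, CommitMode::Sync)
                    .expect("commit failed");
            }
            Some(Err(e)) => eprintln!("kafka error: {}", e),
            None => {} // No message within the timeout; keep polling.
        }
    }
}
```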

To see how to implement at-least-once delivery with `rdkafka`, check out the
[at-least-once delivery example] in the examples folder. To learn more about
delivery semantics, check the [message delivery semantics] chapter in the
Kafka documentation.

### Users

Here are some of the projects using rust-rdkafka:

- [timely-dataflow]: a distributed data-parallel compute engine. See also
  the [blog post][timely-blog] announcing its Kafka integration.
- [kafka-view]: a web interface for Kafka clusters.
- [kafka-benchmark]: a high-performance benchmarking tool for Kafka.

*If you are using rust-rdkafka, please let us know!*

## Installation

Add this to your `Cargo.toml`:

```toml
[dependencies]
rdkafka = { version = "0.24", features = ["cmake-build"] }
```

This crate will compile librdkafka from sources and link it statically to
your executable. To compile librdkafka you'll need:

* the GNU toolchain
* GNU `make`
* `pthreads`
* `zlib`: optional, but included by default (feature: `libz`)
* `cmake`: optional, *not* included by default (feature: `cmake-build`)
* `libssl-dev`: optional, *not* included by default (feature: `ssl`)
* `libsasl2-dev`: optional, *not* included by default (feature: `gssapi`)
* `libzstd-dev`: optional, *not* included by default (feature: `zstd-pkg-config`)

Note that using the CMake build system, via the `cmake-build` feature, is
encouraged if you can take the dependency on CMake.

By default, a submodule with the librdkafka sources pinned to a specific
commit will be used to compile and statically link the library. The
`dynamic-linking` feature can be used to instead dynamically link rdkafka to
the system's version of librdkafka. Example:

```toml
[dependencies]
rdkafka = { version = "0.24", features = ["dynamic-linking"] }
```

For a full listing of features, consult the [rdkafka-sys crate's
documentation][rdkafka-sys-features]. All rdkafka-sys features are
re-exported as rdkafka features.

### Asynchronous runtimes

Some features of the [`StreamConsumer`] and [`FutureProducer`] depend on
Tokio, which can be a heavyweight dependency for users who only intend to
use the low-level consumers and producers. The Tokio integration is
enabled by default, but can be disabled by turning off default features:

```toml
[dependencies]
rdkafka = { version = "0.24", default-features = false }
```

If you would like to use an asynchronous runtime besides Tokio, you can
integrate it with rust-rdkafka by providing a shim that implements the
[`AsyncRuntime`] trait. See the [smol runtime example] for an example
integration with [smol].

## Examples

You can find examples in the [`examples`] folder. To run them:

```bash
cargo run --example <example_name> -- <example_args>
```

## Debugging

rust-rdkafka uses the [`log`] and [`env_logger`] crates to handle logging.
Logging can be enabled using the `RUST_LOG` environment variable, for
example:

```bash
RUST_LOG="librdkafka=trace,rdkafka::client=debug" cargo test
```

This will set the log level of librdkafka to trace, and the level of the
`client` module of the Rust client to debug. To actually receive logs from
librdkafka, you also have to set the `debug` option in the producer or
consumer configuration (see the librdkafka
[configuration][librdkafka-config]).

To enable debugging in your project, make sure you initialize the logger
with `env_logger::init()`, or the equivalent for any `log`-compatible
logging framework.
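
For illustration, a consumer configured for debugging might be set up as in
the sketch below; the chosen `debug` contexts are just an example, and
`env_logger` must be declared as a dependency of your project.

```rust
use rdkafka::config::ClientConfig;
use rdkafka::consumer::BaseConsumer;

fn main() {
    // Route `log` records (including those emitted by librdkafka)
    // to stderr, filtered by the RUST_LOG environment variable.
    env_logger::init();

    // The `debug` option selects which librdkafka subsystems emit
    // debug-level logs; without it, librdkafka stays mostly silent.
    let _consumer: BaseConsumer = ClientConfig::new()
        .set("group.id", "debug-example")
        .set("bootstrap.servers", "localhost:9092")
        .set("debug", "consumer,cgrp,topic,fetch")
        .create()
        .expect("consumer creation failed");
}
```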

[`AsyncRuntime`]: https://docs.rs/rdkafka/*/rdkafka/util/trait.AsyncRuntime.html
[`BaseConsumer`]: https://docs.rs/rdkafka/*/rdkafka/consumer/base_consumer/struct.BaseConsumer.html
[`BaseProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/base_producer/struct.BaseProducer.html
[`Future`]: https://doc.rust-lang.org/stable/std/future/trait.Future.html
[`FutureProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/future_producer/struct.FutureProducer.html
[`Stream`]: https://docs.rs/futures/*/futures/stream/trait.Stream.html
[`StreamConsumer`]: https://docs.rs/rdkafka/*/rdkafka/consumer/stream_consumer/struct.StreamConsumer.html
[`ThreadedProducer`]: https://docs.rs/rdkafka/*/rdkafka/producer/base_producer/struct.ThreadedProducer.html
[`log`]: https://docs.rs/log
[`env_logger`]: https://docs.rs/env_logger
[Apache Kafka]: https://kafka.apache.org
[asynchronous processing example]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/asynchronous_processing.rs
[at-least-once delivery example]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/at_least_once.rs
[smol runtime example]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/smol_runtime.rs
[broker-compat]: https://github.com/edenhill/librdkafka/blob/master/INTRODUCTION.md#broker-version-compatibility
[`examples`]: https://github.com/fede1024/rust-rdkafka/blob/master/examples/
[futures]: https://github.com/rust-lang/futures-rs
[kafka-benchmark]: https://github.com/fede1024/kafka-benchmark
[kafka-view]: https://github.com/fede1024/kafka-view
[librdkafka]: https://github.com/edenhill/librdkafka
[librdkafka-config]: https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
[message delivery semantics]: https://kafka.apache.org/0101/documentation.html#semantics
[rdkafka-sys-features]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/README.md#features
[rdkafka-sys-known-issues]: https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys/README.md#known-issues
[smol]: https://docs.rs/smol
[timely-blog]: https://github.com/frankmcsherry/blog/blob/master/posts/2017-11-08.md
[timely-dataflow]: https://github.com/frankmcsherry/timely-dataflow
[Tokio]: https://tokio.rs/

## rdkafka-sys

See [rdkafka-sys](https://github.com/fede1024/rust-rdkafka/tree/master/rdkafka-sys).

## Contributors

Thanks to:
* Thijs Cadier - [thijsc](https://github.com/thijsc)

## Alternatives

* [kafka-rust]: a pure Rust implementation of the Kafka client.

[kafka-rust]: https://github.com/spicavigo/kafka-rust