# ![logo](fluentbit_logo.png)

### CI Status

| CI Workflow | Status |
|-------------------|--------------------|
| Unit Tests (master) | [![CI/Unit Tests](https://github.com/fluent/fluent-bit/actions/workflows/unit-tests.yaml/badge.svg?branch=master)](https://github.com/fluent/fluent-bit/actions/workflows/unit-tests.yaml) |
| Integration Tests (master) | [![CI/Integration Tests](https://github.com/fluent/fluent-bit/actions/workflows/integration-run-master.yaml/badge.svg)](https://github.com/fluent/fluent-bit/actions/workflows/integration-run-master.yaml) |
| Docker images (master) | [![CI/Docker Images](https://github.com/fluent/fluent-bit/actions/workflows/integration-build-master.yaml/badge.svg)](https://github.com/fluent/fluent-bit/actions/workflows/integration-build-master.yaml) |
| Latest release build | [![CI/Build](https://github.com/fluent/fluent-bit/actions/workflows/build-release.yaml/badge.svg)](https://github.com/fluent/fluent-bit/actions/workflows/build-release.yaml) |

## Project Description

[Fluent Bit](http://fluentbit.io) is a fast Log Processor and Forwarder for Linux, Windows, Embedded Linux, macOS and BSD family operating systems. It is part of the Graduated [Fluentd](http://fluentd.org) Ecosystem and a [CNCF](https://cncf.io) sub-project.

Fluent Bit allows you to collect log events or metrics from different sources, process them and deliver them to different backends such as [Fluentd](http://fluentd.org), Elasticsearch, Splunk, DataDog, Kafka, New Relic, Azure services, AWS services, Google services, NATS, InfluxDB or any custom HTTP endpoint.

Fluent Bit comes with full SQL [Stream Processing](https://docs.fluentbit.io/manual/stream-processing/introduction) capabilities: data manipulation and analytics using SQL queries.

Fluent Bit runs on x86_64, x86, arm32v7 and arm64v8 architectures.
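As a rough illustration of the stream processing capability, the stream processor accepts SQL queries of the following shape. This is a sketch only: the stream name `cpu_data` and the field `cpu_p` are illustrative, and the exact syntax is covered in the [Stream Processing](https://docs.fluentbit.io/manual/stream-processing/introduction) documentation.

```sql
-- Promote records tagged cpu.* into a named stream (names are illustrative)
CREATE STREAM cpu_data AS SELECT * FROM TAG:'cpu.*';

-- Aggregate over a 5-second tumbling window
SELECT AVG(cpu_p) FROM STREAM:cpu_data WINDOW TUMBLING (5 SECOND);
```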
## Features

- High Performance
- Data Parsing
  - Convert your unstructured messages using our parsers: [JSON](https://docs.fluentbit.io/manual/pipeline/parsers/json), [Regex](https://docs.fluentbit.io/manual/pipeline/parsers/regular-expression), [LTSV](https://docs.fluentbit.io/manual/pipeline/parsers/ltsv) and [Logfmt](https://docs.fluentbit.io/manual/pipeline/parsers/logfmt)
- Reliability and Data Integrity
  - [Backpressure](https://docs.fluentbit.io/manual/administration/backpressure) handling
  - [Data Buffering](https://docs.fluentbit.io/manual/administration/buffering-and-storage) in memory and file system
- Networking
  - Security: built-in TLS/SSL support
  - Asynchronous I/O
- Pluggable Architecture and [Extensibility](https://docs.fluentbit.io/manual/development): Inputs, Filters and Outputs
  - More than 70 built-in plugins available
  - Extensibility
    - Write any input, filter or output plugin in the C language
    - Write [Filters in Lua](https://docs.fluentbit.io/manual/filter/lua) or [Output plugins in Golang](https://docs.fluentbit.io/manual/development/golang-output-plugins)
- [Monitoring](https://docs.fluentbit.io/manual/administration/monitoring): expose internal metrics over HTTP in JSON and [Prometheus](https://prometheus.io/) format
- [Stream Processing](https://docs.fluentbit.io/manual/stream-processing/introduction): perform data selection and transformation using simple SQL queries
  - Create new streams of data using query results
  - Aggregation windows
  - Data analysis and prediction: time series forecasting
- Portable: runs on Linux, macOS, Windows and BSD systems

## Fluent Bit in Production

[Fluent Bit](https://fluentbit.io) is used widely in production environments. In 2020 Fluent Bit was deployed more than **220 million** times, and it continues to be deployed over **1 million times a day**.
The following is a preview of who uses Fluent Bit heavily in production:

> If your company uses Fluent Bit and is not listed, feel free to open a GitHub issue and we will add the logo.

![users](documentation/fluentbit_users.png)

## [Documentation](https://docs.fluentbit.io)

Our official project documentation for [installation](https://docs.fluentbit.io/manual/installation), [configuration](https://docs.fluentbit.io/manual/administration/configuring-fluent-bit), deployment and development topics is located here:

- [https://docs.fluentbit.io](https://docs.fluentbit.io)

### Quick Start

#### Build from Scratch

If you want to build Fluent Bit from sources, start with the following commands:

```bash
cd build
cmake ..
make
bin/fluent-bit -i cpu -o stdout -f 1
```

If you are interested in more details, please refer to the [Build & Install](https://docs.fluentbit.io/manual/installation/sources/build-and-install) section.

#### Linux Packages

We provide packages for the most common Linux distributions:

- [Debian](https://docs.fluentbit.io/manual/installation/linux/debian)
- [Raspbian](https://docs.fluentbit.io/manual/installation/linux/raspbian-raspberry-pi)
- [Ubuntu](https://docs.fluentbit.io/manual/installation/linux/ubuntu)
- [CentOS](https://docs.fluentbit.io/manual/installation/linux/redhat-centos)

#### Linux / Docker Container Images

Our Linux container images are the most common deployment model; thousands of new installations happen every day. Learn more about the available images and tags [here](https://docs.fluentbit.io/manual/installation/docker).

#### Windows Packages

Fluent Bit is fully supported on Windows environments; get started with [these instructions](https://docs.fluentbit.io/manual/installation/windows).
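The `cpu`-to-`stdout` command line from the Quick Start can also be expressed as a configuration file. The snippet below is a minimal sketch using the classic configuration format; the file name `fluent-bit.conf` is just a convention, and see the [configuration documentation](https://docs.fluentbit.io/manual/administration/configuring-fluent-bit) for the full set of keys.

```ini
[SERVICE]
    # Flush buffered records every second (same as -f 1 above)
    Flush        1
    Log_Level    info

[INPUT]
    # Collect CPU usage metrics (same as -i cpu above)
    Name   cpu
    Tag    cpu.local

[OUTPUT]
    # Print all records to the terminal (same as -o stdout above)
    Name   stdout
    Match  *
```

You would then run it with `bin/fluent-bit -c fluent-bit.conf`.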
### Plugins: Inputs, Filters and Outputs

[Fluent Bit](http://fluentbit.io) is based on a pluggable architecture where different plugins play a major role in the data pipeline:

#### Input Plugins

| name | title | description |
| :--- | :--- | :--- |
| [collectd](https://docs.fluentbit.io/manual/pipeline/inputs/collectd) | Collectd | Listen for UDP packets from Collectd. |
| [cpu](https://docs.fluentbit.io/manual/pipeline/inputs/cpu-metrics) | CPU Usage | Measure total CPU usage of the system. |
| [disk](https://docs.fluentbit.io/manual/pipeline/inputs/disk-io-metrics) | Disk Usage | Measure disk I/O. |
| [dummy](https://docs.fluentbit.io/manual/pipeline/inputs/dummy) | Dummy | Generate dummy events. |
| [exec](https://docs.fluentbit.io/manual/pipeline/inputs/exec) | Exec | Execute external programs and collect event logs. |
| [forward](https://docs.fluentbit.io/manual/pipeline/inputs/forward) | Forward | Fluentd forward protocol. |
| [head](https://docs.fluentbit.io/manual/pipeline/inputs/head) | Head | Read the first part of files. |
| [health](https://docs.fluentbit.io/manual/pipeline/inputs/health) | Health | Check the health of TCP services. |
| [kmsg](https://docs.fluentbit.io/manual/pipeline/inputs/kernel-logs) | Kernel Log Buffer | Read Linux Kernel log buffer messages. |
| [mem](https://docs.fluentbit.io/manual/pipeline/inputs/memory-metrics) | Memory Usage | Measure the total amount of memory used on the system. |
| [mqtt](https://docs.fluentbit.io/manual/pipeline/inputs/mqtt) | MQTT | Start an MQTT server and receive publish messages. |
| [netif](https://docs.fluentbit.io/manual/pipeline/inputs/network-io-metrics) | Network Traffic | Measure network traffic. |
| [proc](https://docs.fluentbit.io/manual/pipeline/inputs/process) | Process | Check the health of a process. |
| [random](https://docs.fluentbit.io/manual/pipeline/inputs/random) | Random | Generate random samples. |
| [serial](https://docs.fluentbit.io/manual/pipeline/inputs/serial-interface) | Serial Interface | Read data from the serial interface. |
| [stdin](https://docs.fluentbit.io/manual/pipeline/inputs/standard-input) | Standard Input | Read data from the standard input. |
| [syslog](https://docs.fluentbit.io/manual/pipeline/inputs/syslog) | Syslog | Read syslog messages from a Unix socket. |
| [systemd](https://docs.fluentbit.io/manual/pipeline/inputs/systemd) | Systemd | Read logs from Systemd/Journald. |
| [tail](https://docs.fluentbit.io/manual/pipeline/inputs/tail) | Tail | Tail log files. |
| [tcp](https://docs.fluentbit.io/manual/pipeline/inputs/tcp) | TCP | Listen for JSON messages over TCP. |
| [thermal](https://docs.fluentbit.io/manual/pipeline/inputs/thermal) | Thermal | Measure system temperature(s). |

#### Filter Plugins

| name | title | description |
| :--- | :--- | :--- |
| [aws](https://docs.fluentbit.io/manual/pipeline/filters/aws-metadata) | AWS Metadata | Enrich logs with AWS metadata. |
| [expect](https://docs.fluentbit.io/manual/pipeline/filters/expect) | Expect | Validate that records match certain structural criteria. |
| [grep](https://docs.fluentbit.io/manual/pipeline/filters/grep) | Grep | Match or exclude specific records by patterns. |
| [kubernetes](https://docs.fluentbit.io/manual/pipeline/filters/kubernetes) | Kubernetes | Enrich logs with Kubernetes metadata. |
| [lua](https://docs.fluentbit.io/manual/pipeline/filters/lua) | Lua | Filter records using Lua scripts. |
| [parser](https://docs.fluentbit.io/manual/pipeline/filters/parser) | Parser | Parse records. |
| [record\_modifier](https://docs.fluentbit.io/manual/pipeline/filters/record-modifier) | Record Modifier | Modify records. |
| [rewrite\_tag](https://docs.fluentbit.io/manual/pipeline/filters/rewrite-tag) | Rewrite Tag | Re-emit records under a new tag. |
| [stdout](https://docs.fluentbit.io/manual/pipeline/filters/standard-output) | Stdout | Print records to the standard output interface. |
| [throttle](https://docs.fluentbit.io/manual/pipeline/filters/throttle) | Throttle | Apply rate limits to the event flow. |
| [nest](https://docs.fluentbit.io/manual/pipeline/filters/nest) | Nest | Nest records under a specified key. |
| [modify](https://docs.fluentbit.io/manual/pipeline/filters/modify) | Modify | Apply modifications to records. |

#### Output Plugins

| name | title | description |
| :--- | :--- | :--- |
| [azure](https://docs.fluentbit.io/manual/pipeline/outputs/azure) | Azure Log Analytics | Ingest records into Azure Log Analytics. |
| [bigquery](https://docs.fluentbit.io/manual/pipeline/outputs/bigquery) | BigQuery | Ingest records into Google BigQuery. |
| [counter](https://docs.fluentbit.io/manual/pipeline/outputs/counter) | Count Records | Simple records counter. |
| [datadog](https://docs.fluentbit.io/manual/pipeline/outputs/datadog) | Datadog | Ingest logs into Datadog. |
| [es](https://docs.fluentbit.io/manual/pipeline/outputs/elasticsearch) | Elasticsearch | Flush records to an Elasticsearch server. |
| [file](https://docs.fluentbit.io/manual/pipeline/outputs/file) | File | Flush records to a file. |
| [flowcounter](https://docs.fluentbit.io/manual/pipeline/outputs/flowcounter) | FlowCounter | Count records. |
| [forward](https://docs.fluentbit.io/manual/pipeline/outputs/forward) | Forward | Fluentd forward protocol. |
| [gelf](https://docs.fluentbit.io/manual/pipeline/outputs/gelf) | GELF | Flush records to Graylog. |
| [http](https://docs.fluentbit.io/manual/pipeline/outputs/http) | HTTP | Flush records to an HTTP endpoint. |
| [influxdb](https://docs.fluentbit.io/manual/pipeline/outputs/influxdb) | InfluxDB | Flush records to the InfluxDB time series database. |
| [kafka](https://docs.fluentbit.io/manual/pipeline/outputs/kafka) | Apache Kafka | Flush records to Apache Kafka. |
| [kafka-rest](https://docs.fluentbit.io/manual/pipeline/outputs/kafka-rest-proxy) | Kafka REST Proxy | Flush records to a Kafka REST Proxy server. |
| [nats](https://docs.fluentbit.io/manual/pipeline/outputs/nats) | NATS | Flush records to a NATS server. |
| [null](https://docs.fluentbit.io/manual/pipeline/outputs/null) | NULL | Throw away events. |
| [s3](https://docs.fluentbit.io/manual/pipeline/outputs/s3) | S3 | Flush records to Amazon S3. |
| [stackdriver](https://docs.fluentbit.io/manual/pipeline/outputs/stackdriver) | Google Stackdriver Logging | Flush records to the Google Stackdriver Logging service. |
| [stdout](https://docs.fluentbit.io/manual/pipeline/outputs/standard-output) | Standard Output | Flush records to the standard output. |
| [splunk](https://docs.fluentbit.io/manual/pipeline/outputs/splunk) | Splunk | Flush records to a Splunk Enterprise service. |
| [tcp](https://docs.fluentbit.io/manual/pipeline/outputs/tcp-and-tls) | TCP & TLS | Flush records to a TCP server. |
| [td](https://docs.fluentbit.io/manual/pipeline/outputs/treasure-data) | [Treasure Data](http://www.treasuredata.com) | Flush records to the [Treasure Data](http://www.treasuredata.com) cloud service for analytics. |

## Contributing

[Fluent Bit](https://fluentbit.io) is an open project; several individuals and companies contribute in different forms such as coding, documenting, testing and spreading the word at events, among others. If you want to learn more about contributing opportunities, please reach out to us through our [Community Channels](https://fluentbit.io/community/).

If you are interested in contributing to Fluent Bit with bug fixes, new features or coding in general, please refer to the [CONTRIBUTING](CONTRIBUTING.md) guidelines.
You can also refer to the Beginners Guide to contributing to Fluent Bit [here](DEVELOPER_GUIDE.md).

## Community & Contact

Feel free to join us on our Slack channel, Mailing List or IRC:

- [Slack](http://slack.fluentd.org) (#fluent-bit channel)
- [Mailing List](https://groups.google.com/forum/#!forum/fluent-bit)
- [Discourse Forum](https://discuss.fluentd.org)
- [Twitter](http://twitter.com/fluentbit)
- IRC: irc.freenode.net, #fluent-bit channel

## License

This program is under the terms of the [Apache License v2.0](http://www.apache.org/licenses/LICENSE-2.0).

## Authors

[Fluent Bit](http://fluentbit.io) was originally made by, and is currently sponsored by, [Treasure Data](http://treasuredata.com) among other [contributors](https://github.com/fluent/fluent-bit/graphs/contributors).