Configuring event streams
The event streaming plugin for Apache Kafka is configured using command line options.
Ensure a Kafka system is available to receive streams from Hyperledger Besu.
Hyperledger Besu is compatible with Kafka versions 2.5.0 and above.
Multiple topics are created for each event stream, in the format `<stream_prefix><domain_type>`, where the domain types are:

- `block`
- `transaction`
- `smart-contract`
- `node`
- `log`

`<stream_prefix>` is defined using the `--plugin-kafka-stream` option.
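As a concrete sketch, the topic names that result from a given stream prefix can be enumerated in shell. The prefix `my-besu-stream` is an assumed example value; the names follow the `<stream_prefix><domain_type>` format above:

```shell
# Hypothetical prefix, as passed to --plugin-kafka-stream
STREAM_PREFIX="my-besu-stream"

# One topic is created per domain type: <stream_prefix><domain_type>
for DOMAIN in block transaction smart-contract node log; do
  echo "${STREAM_PREFIX}${DOMAIN}"
done
```

For example, the topic carrying block events would be named `my-besu-streamblock`.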
Configuring a Kafka event stream
Configure a Kafka event stream in the command line by enabling the plugin and setting the appropriate options.
```bash
besu --plugin-kafka-enabled \
  --plugin-kafka-stream=my-besu-stream \
  --plugin-kafka-url=127.0.0.1:9090 \
  --plugin-kafka-producer-config-override-enabled \
  --plugin-kafka-producer-property=sasl.mechanism=PLAIN
```
The command line:

- Enables the Kafka plugin using the `--plugin-kafka-enabled` option.
- Sets the prefix that identifies the stream in Kafka using the `--plugin-kafka-stream` option.
- Sets the URL endpoint of the Kafka stream using the `--plugin-kafka-url` option.
- Restricts broadcast event topics with the `--plugin-kafka-enabled-topic` option.
- Enables overriding Kafka producer configuration properties using the `--plugin-kafka-producer-config-override-enabled` option.
- Sets a Kafka producer configuration property using the `--plugin-kafka-producer-property` option. If multiple properties are required, specify the `--plugin-kafka-producer-property` option multiple times.
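As an illustrative sketch of repeating `--plugin-kafka-producer-property`, the command below overrides two producer properties. `sasl.mechanism` and `security.protocol` are standard Kafka producer configuration keys; the values shown are placeholders, not working credentials:

```shell
# Command-line fragment (assumes besu is on the PATH):
# repeat --plugin-kafka-producer-property once per property to override.
besu \
  --plugin-kafka-enabled \
  --plugin-kafka-stream=my-besu-stream \
  --plugin-kafka-producer-config-override-enabled \
  --plugin-kafka-producer-property=sasl.mechanism=PLAIN \
  --plugin-kafka-producer-property=security.protocol=SASL_PLAINTEXT
```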
If `--plugin-kafka-url` is not specified, the plugin attempts to connect to a local Kafka broker at `127.0.0.1:9092`.
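To check that events are arriving, one option is to read a topic with the standard Kafka console consumer. This sketch assumes the Kafka CLI tools are installed, the plugin is running against the default local broker, and the stream prefix is `my-besu-stream`:

```shell
# Read block events from the beginning of the topic.
# The topic name assumes the <stream_prefix><domain_type> format
# described above, with the example prefix my-besu-stream.
kafka-console-consumer.sh \
  --bootstrap-server 127.0.0.1:9092 \
  --topic my-besu-streamblock \
  --from-beginning
```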
Filtering smart contract event logs
You can configure the event streaming plugin to filter event logs from specified smart contracts, using the plugin's log-filtering CLI options.
To display the filtered events in a more readable format, create a schema file to decode the events. Specify the schema file location using the `--plugin-kafka-log-schema-file` option.