Using Apache Kafka as a DataHub Source
In addition to sending data to Apache Kafka through DataHub's External Historian connector, DataHub can also source data from Apache Kafka for use with other DataHub features.
Apache Kafka provides distributed, fault-tolerant, and scalable event streaming capabilities, often used for real-time data integration, event-driven architectures, and building data pipelines.
DataHub supports flexible security options for connecting to Apache Kafka:
- SaslSsl, Ssl, SaslPlaintext, and Plaintext security protocols with Gssapi, Plain, ScramSha256, ScramSha512, and OAuthBearer SASL mechanisms
- Kerberos credentials
- SSL certificate credentials
- Reading from Azure Event Hubs over the Kafka protocol
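DataHub's Kafka connection is configured through its own interface, but the security options above correspond to standard Kafka client properties. As an illustrative sketch only (the broker addresses, credentials, and group name below are placeholders, not DataHub defaults), a SASL_SSL connection with the ScramSha256 mechanism maps onto librdkafka-style settings like these:

```python
# Illustrative consumer settings for a SASL_SSL connection using
# SCRAM-SHA-256. All values are placeholders for this sketch.
consumer_config = {
    "bootstrap.servers": "broker.example.com:9093",  # placeholder broker address
    "security.protocol": "SASL_SSL",                 # encrypted, authenticated connection
    "sasl.mechanism": "SCRAM-SHA-256",               # one of the supported SASL mechanisms
    "sasl.username": "datahub-client",               # placeholder credential
    "sasl.password": "change-me",                    # placeholder credential
    "ssl.ca.location": "/etc/ssl/certs/ca.pem",      # CA certificate used to verify the broker
    "group.id": "datahub-source",                    # placeholder consumer group
    "auto.offset.reset": "earliest",                 # start from the oldest retained events
}

# Azure Event Hubs exposes a Kafka-compatible endpoint that uses the
# same shape with the PLAIN mechanism; the username is the literal
# string "$ConnectionString" and the password is the namespace's
# connection string (placeholder values shown).
eventhubs_config = {
    "bootstrap.servers": "mynamespace.servicebus.windows.net:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "$ConnectionString",
    "sasl.password": "Endpoint=sb://mynamespace.servicebus.windows.net/;...",
    "group.id": "datahub-source",
}
```

The same property names apply when Kerberos is used instead, with the mechanism set to GSSAPI and keytab or ticket-cache details supplied in place of the username and password.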