Event Hubs is a fully managed, real-time data ingestion service that can stream millions of events per second from any source, enabling dynamic data pipelines that respond immediately to business challenges. Geo-disaster recovery and geo-replication keep data processing running during outages, and integration with other Azure services helps unlock insights. Existing Apache Kafka clients and applications can talk to Event Hubs without code changes, providing a managed Kafka experience without operating your own clusters.
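The Kafka compatibility mentioned above typically requires only a connection-configuration change on the client, not code changes. A minimal sketch, assuming an Event Hubs Kafka endpoint on port 9093 with SASL PLAIN over TLS; the namespace name and connection-string value are hypothetical placeholders, and the resulting dict uses kafka-python-style keys:

```python
def event_hubs_kafka_config(namespace: str, connection_string: str) -> dict:
    """Build kafka-python-style settings for pointing an existing Kafka
    client at an Event Hubs namespace (producer/consumer code unchanged)."""
    return {
        # Event Hubs exposes its Kafka endpoint on port 9093.
        "bootstrap_servers": f"{namespace}.servicebus.windows.net:9093",
        # Authentication: SASL PLAIN over TLS, with the literal username
        # "$ConnectionString" and the connection string as the password.
        "security_protocol": "SASL_SSL",
        "sasl_mechanism": "PLAIN",
        "sasl_plain_username": "$ConnectionString",
        "sasl_plain_password": connection_string,
    }

# Hypothetical namespace; pass this dict to e.g. kafka.KafkaProducer(**cfg).
cfg = event_hubs_kafka_config("contoso-ns", "<your-connection-string>")
```

Only the bootstrap server and SASL settings differ from a self-hosted Kafka setup; topic, produce, and consume calls stay the same.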
The service can be used to:
- Ingest millions of events per second - Continuously ingest data from hundreds of thousands of sources with low latency and configurable retention.
- Enable real-time and micro-batch processing concurrently - Send data to Blob storage or Data Lake Storage for long-term retention or micro-batch processing with Event Hubs Capture.
- Get a managed service with elastic scale - Scale from streaming megabytes of data to terabytes while keeping control over when and how much to scale.
- Connect with the Apache Kafka ecosystem - Connect Event Hubs with Kafka applications and clients with Azure Event Hubs for Apache Kafka®.
- Build a serverless streaming solution - Natively connect with Stream Analytics to build an end-to-end serverless streaming solution.
- Ingest events on Azure Stack Hub and realize hybrid cloud solutions - Ingest and process data locally at scale on Azure Stack Hub, and build hybrid cloud architectures that use Azure services for further processing, visualization, or storage.
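Ingesting from many sources at once relies on partitioning: events that carry the same partition key are routed to the same partition, which preserves their relative order. The service's actual hash function is internal, so the sketch below uses SHA-256 purely to illustrate the stable key-to-partition mapping; the function and key names are hypothetical:

```python
import hashlib

def pick_partition(partition_key: str, partition_count: int) -> int:
    """Illustrative stand-in for the service-side hash: a stable mapping
    from a partition key to one of N partitions."""
    digest = hashlib.sha256(partition_key.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % partition_count

# Events from the same source (same key) always land on the same
# partition, so their relative order is preserved per source.
assert pick_partition("sensor-42", 4) == pick_partition("sensor-42", 4)
```

Events sent without a partition key are instead distributed across partitions, trading per-key ordering for load balancing.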