Apache Kafka is an open-source stream-processing platform developed by the Apache Software Foundation and written in Scala and Java. The Kafka event streaming platform is used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications.
We used BizTalk Server because all of our other integrating applications were developed on .NET and used the Microsoft development environment. Kafka is best when the integration is between non-Microsoft applications. We had a few adapters developed using the Microsoft .NET Framework. BizTalk is well …
Apache Kafka is well suited for most data-streaming use cases. Unless you have a specific use case that calls for a cloud PaaS such as Amazon Kinesis or Azure Event Hubs for your data lakes, Apache Kafka, once set up well, will take care of everything else in the background. Azure Event Hubs is good for cross-cloud use cases; with Amazon Kinesis I have no real-world experience, but I believe it is much the same.
BizTalk is well suited as middleware, where you wish to translate an input file into an output file and send it to some endpoint. In our case, we used it to convert and send files to SAP. It is very flexible, and you can do almost anything you want with it. In many ways, it's a better solution than SAP XI or PI as middleware, since it's much less expensive and allows you to interface with non-SAP systems.
Really easy to configure. I've used other message brokers such as RabbitMQ, and compared to them, Kafka's configuration is very easy to understand and tweak.
Very scalable: easily configured to run on multiple nodes, allowing for ease of parallelism (assuming your queues/topics don't have to be consumed in the exact same order the messages were delivered).
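To illustrate the parallelism point, here is a minimal sketch (using the standard Kafka Java client) of a consumer that joins a consumer group; running several copies of it spreads the topic's partitions across the instances, and ordering is preserved only within each partition. The broker address, topic name, and group id are placeholders, not taken from the reviews above.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class WorkerConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "order-workers");           // all instances share this group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("orders")); // placeholder topic
            while (true) {
                // Each instance in the group is assigned a subset of the topic's partitions;
                // ordering is guaranteed only within a partition, not across the whole topic.
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```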
Not exactly a feature, but I trust Kafka will be around for at least another decade because active development has continued to be strong and there's a lot of financial backing from Confluent and LinkedIn, and probably from many of the other companies using it (which, anecdotally, are many).
It is very user-friendly. Users can change rules at run time and modify workflows.
Huge capacity for queueing messages. It supports all types of adapters, such as Oracle, Salesforce, SMTP, FTP, etc. Users can also build custom adapters.
If users want to deploy their solution dynamically without any downtime, BizTalk is a perfect fit, especially for public-facing websites.
Well proven in the market. I used it when developing a website for Virgin Trains, serving more than 800K user requests per day.
Sometimes it becomes difficult to monitor our Kafka deployments. We've largely been able to overcome this by using AWS MSK, a managed service for Apache Kafka, but a separate monitoring dashboard would have been great.
Simplify the process for local deployment of Kafka and provide a user interface to get visibility into the different topics and the messages being processed.
The learning curve around creating brokers and topics could be reduced.
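For what it's worth, topic creation can also be scripted rather than done by hand. The sketch below is a rough illustration with an assumed broker address, topic name, and example partition/replication counts; it uses the Kafka Java Admin client to create a topic and then list what exists on the cluster.

```java
import java.util.List;
import java.util.Properties;
import java.util.Set;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

public class CreateTopic {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker

        try (AdminClient admin = AdminClient.create(props)) {
            // 6 partitions for parallelism, replication factor 3 for fault tolerance (example values)
            NewTopic orders = new NewTopic("orders", 6, (short) 3);
            admin.createTopics(List.of(orders)).all().get();

            // Quick visibility into which topics exist on the cluster
            Set<String> topics = admin.listTopics().names().get();
            System.out.println("Topics: " + topics);
        }
    }
}
```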
Apache Kafka is highly recommended for developing loosely coupled, real-time processing applications. Apache Kafka also provides property-based configuration; the producer, consumer, and broker each have their own separate property file.
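As a rough illustration of that property-based setup (the file name, its contents, and the topic are assumptions for the example), a producer can be constructed directly from its own .properties file:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class FileConfiguredProducer {
    public static void main(String[] args) throws Exception {
        // A hypothetical producer.properties might contain, for example:
        //   bootstrap.servers=localhost:9092
        //   key.serializer=org.apache.kafka.common.serialization.StringSerializer
        //   value.serializer=org.apache.kafka.common.serialization.StringSerializer
        //   acks=all
        Properties props = new Properties();
        try (InputStream in = Files.newInputStream(Path.of("producer.properties"))) {
            props.load(in);
        }

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "order-123", "{\"qty\":2}"),
                    (metadata, exception) -> {
                        if (exception != null) {
                            exception.printStackTrace(); // delivery failed
                        } else {
                            System.out.printf("sent to partition %d at offset %d%n",
                                    metadata.partition(), metadata.offset());
                        }
                    });
            producer.flush();
        }
    }
}
```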
Support for Apache Kafka (if you're willing to pay) is available from Confluent, which includes the same team that created Kafka at LinkedIn, so they know this software inside and out. Moreover, Apache Kafka is well known, and best-practice documents and deployment scenarios are readily available for download, for example from eBay, LinkedIn, Uber, and NYTimes.
BizTalk Server has been supported for more than 15 years. It is well proven in the market. Microsoft has provided excellent support for technical issues.
I have used other messaging/queueing solutions that are a lot more basic than Confluent Kafka, as well as another solution that is no longer on the market called Xively, which was bought and "buried" by Google. In comparison, these solutions offer far less functionality and address different needs.
BizTalk was selected here mainly because it is easy to integrate with .NET applications (most of them are Web Services, WCF SOAP, WCF REST, and Web API) and many backend databases are Microsoft SQL Server. Another benefit is that the monitoring job is easy to set up and centralize with other .NET application monitoring jobs.
Positive: we got a quick and reliable pub/sub model implemented - data flows easily across components.
Positive: it's scalable, so we can develop small and scale up for real-world scenarios.
Negative: it's easy to get into a confusing situation if you are not experienced yet or something strange has happened (rare, but it does happen). Troubleshooting such situations can take time and effort.
A positive impact has been the quicker turnaround time between a part request and that part showing up in SAP, using BizTalk as middleware.
A somewhat negative impact has been the insufficient error logging/message capture settings that BizTalk provides. This has caused occasional delays when attempting to create parts for the business.
Another somewhat negative impact has been the need for a specialized developer who understands BizTalk to troubleshoot issues with the BizTalk-SAP interaction when creating parts and when adding new fields to them.