Airflow is well suited for data engineering pipelines, creating scheduled workflows, and working with various data sources. You can implement almost any kind of DAG for any use case using the different operators, or easily write your own logic using the Python operator. Airflow's MLOps features could be enhanced to match MLflow-like functionality, which would make Airflow the go-to solution for all workloads, from data science to data engineering.
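As a minimal sketch of the pattern this review describes, custom business logic can be wrapped in a PythonOperator inside a scheduled DAG. The function and DAG names here are hypothetical, not from the review:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_clean():
    # Placeholder for whatever custom business logic the pipeline needs.
    print("running custom business logic")

with DAG(
    dag_id="example_custom_logic",       # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",          # run once per day
    catchup=False,
) as dag:
    PythonOperator(
        task_id="extract_and_clean",
        python_callable=extract_and_clean,
    )
```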
BMC FootPrints is well suited to keeping documentation easy to read and find, and likewise to categorize. You can find specific documentation for an audit very quickly and export a report using the exact criteria you need to satisfy your boss or an audit. As I said before, BMC FootPrints needs to be friendlier to end users, because they often get lost trying to track a ticket or type up documentation.
Apache Airflow is one of the best orchestration platforms and a go-to scheduler for teams building data platforms or pipelines.
Apache Airflow supports multiple operators, such as the Databricks, Spark, and Python operators. All of these give us the functionality to implement almost any business logic.
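A sketch of mixing provider operators in one DAG, assuming the apache-airflow-providers-apache-spark and apache-airflow-providers-databricks packages are installed; the job path, connection IDs, and notebook settings are illustrative, not from the review:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
from airflow.providers.databricks.operators.databricks import DatabricksSubmitRunOperator

with DAG(
    dag_id="example_provider_operators",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    spark_job = SparkSubmitOperator(
        task_id="spark_job",
        application="/jobs/transform.py",  # hypothetical PySpark job path
        conn_id="spark_default",
    )
    databricks_job = DatabricksSubmitRunOperator(
        task_id="databricks_job",
        json={
            "new_cluster": {
                "spark_version": "13.3.x-scala2.12",
                "node_type_id": "i3.xlarge",
                "num_workers": 2,
            },
            "notebook_task": {"notebook_path": "/Shared/aggregate"},  # hypothetical
        },
    )
    # Run the Databricks step only after the Spark job succeeds.
    spark_job >> databricks_job
```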
Apache Airflow is highly scalable, and we can run a large number of DAGs with ease. It provides high availability and replication for workers. Maintaining Airflow deployments is easy, even for smaller teams, and we also get plenty of metrics for observability.
Documentation. We try to reduce the amount of paperwork needed for staff to do their job, so by automating certain tasks, we are able to speed up the resolution process for trouble tickets.
Reporting. We use the reporting tool to get the number of tickets opened and response times, and we can drill down into more granular reports.
Surveys. When tickets are closed, we automatically send out surveys to end users to get valuable feedback on how we did and what we can improve.
UI/Dashboard could be updated to be customisable, with jobs summarized in groups by errors/failures/successes instead of listed individually, so that the summary of errors can be used as a starting point for reviewing them.
Navigation - It's a bit dated and could do with more modern web navigation UX, e.g. sidebar navigation instead of relying on the browser's back/forward buttons.
Again, a reorganization of core functionality in terms of UX: navigation to core functions could be improved as well, rather than leaving them to discovery.
Purpose-based configuration - It would be beneficial to see more purpose-based, out-of-the-box configuration options. For example, if you need PCI compliance, more intuitive reporting would make managing compliance much easier.
Initial design and implementation - Don't assume that your experience as an IT professional will allow you to stand this system up on your own. To properly configure FootPrints and set yourself up for success down the road, get Professional Services with this one.
Somewhat behind the times - Service Core is making a huge leap forward with the latest version, 12, but Asset Core is far behind. There are quite a few quirks in how the application works and how it is used.
It was the business decision to go with them, and that is what we will do. Going back, this would not have been the choice, but nothing can be done about it now. We are stuck with this application for years to come. I wish there were other possibilities.
For its capability to connect with multicloud environments. Access control management is something we don't get in all schedulers and orchestrators. But although it provides so much flexibility and so many options thanks to Python, some level of Python knowledge is needed to build workflows.
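As a minimal sketch of the multicloud connectivity mentioned above, a transfer operator can copy objects from AWS S3 to Google Cloud Storage, assuming the amazon and google provider packages are installed; the bucket names and connection IDs here are hypothetical:

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.s3_to_gcs import S3ToGCSOperator

with DAG(
    dag_id="example_multicloud_copy",    # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    S3ToGCSOperator(
        task_id="s3_to_gcs",
        bucket="my-source-bucket",        # hypothetical S3 bucket
        prefix="exports/",                # copy only objects under this prefix
        dest_gcs="gs://my-dest-bucket/",  # hypothetical GCS destination
        aws_conn_id="aws_default",
        gcp_conn_id="google_cloud_default",
    )
```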
It's so simple to use and customize however you want. You can create new workspaces and workflows with ease, set up new users and incoming email rules, customize the layout of the forms, and even change the colors and logos. It's just very easily customizable overall. It's also really straightforward to figure out how to use; you almost don't have to show somebody how. If you just sit them down in front of it and let them look it over, they could figure it out themselves easily.
I've had no issues with the support for FootPrints. We haven't really had to use them all that much over the years, but when needed they have always been prompt and knowledgeable at dealing with any issue. I've worked with a lot of different support teams over the years, and they have been one of my favorites to work with.
Multiple DAGs can be orchestrated simultaneously at varying times, and runs can be reproduced or replicated with relative ease. Overall, Apache Airflow is easier to use than other solutions now on the market: integration is simple, workflows can be monitored, and scheduling can be set up quickly. We advocate using this tool for automating data pipelines and processes.
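A sketch of how scheduled runs can be reproduced, as the review notes: with catchup enabled, Airflow creates one run per schedule interval between the start date and now, and any past run can be cleared and re-executed. The DAG id and dates are illustrative:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.empty import EmptyOperator

with DAG(
    dag_id="example_reproducible_runs",  # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=True,  # backfill a run for every missed interval since start_date
) as dag:
    EmptyOperator(task_id="placeholder")
```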
I was not involved in the selection process, but in my opinion either SQL or Access databases would have worked just as well without the same cost. These two systems would have been much easier to manage and would have tracked the same information with a less convoluted process and less expense.
The impact depends on the number of workflows. If there are a lot of workflows, the use case is stronger and the implementation is justified, since it needs resources that come at a cost: dedicated VMs and a database.