Based on our records, Azkaban seems to be more popular. It has been mentioned 3 times since March 2021. We are tracking product recommendations and mentions on various public social media platforms and blogs. They can help you identify which product is more popular and what people think of it.
Not sure if https://azkaban.github.io/ would fit your use case. Source: about 2 years ago
I used this once, was pretty nice: https://azkaban.github.io/. Source: about 2 years ago
Azkaban is a batch workflow job scheduler created to help developers run Hadoop jobs. The open-source platform “resolves ordering through job dependencies” and offers an intuitive web interface to help users maintain and track workflows. Source: about 2 years ago
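To illustrate the job-dependency model described in the mention above, here is a minimal sketch of an Azkaban Flow 2.0 definition (a YAML `.flow` file). The job names and commands are illustrative assumptions, not taken from any real project:

```yaml
# example.flow — a minimal Azkaban Flow 2.0 sketch (hypothetical jobs)
nodes:
  - name: extract
    type: command
    config:
      command: echo "extract raw data"

  - name: transform
    type: command
    # Azkaban resolves ordering through these dependencies:
    # "transform" runs only after "extract" succeeds.
    dependsOn:
      - extract
    config:
      command: echo "transform data"

  - name: load
    type: command
    dependsOn:
      - transform
    config:
      command: echo "load into warehouse"
```

Packaged in a project zip together with a `flow20.project` file, a definition like this produces the extract → transform → load ordering automatically, which is what the quote means by resolving ordering through job dependencies.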
Apache Airflow - Airflow is a platform to programmatically author, schedule, and monitor data pipelines.
Striim - Striim provides an end-to-end, real-time data integration and streaming analytics platform.
Metaflow - Framework for real-life data science; build, improve, and operate end-to-end workflows.
HVR - Your data. Where you need it. HVR is the leading independent real-time data replication solution that offers efficient data integration for cloud and more.
Luigi - Luigi is a Python module that helps you build complex pipelines of batch jobs.
Bryteflow Data Replication and Integration - Bryteflow is a popular platform that offers many services, including data replication and integration.