Recommended and mentioned products
-
Informatica PowerCenter is a scalable, high-performance solution for integrating enterprise data that supports the entire data integration lifecycle.
-
Airflow is a platform to programmatically author, schedule, and monitor data pipelines (a minimal DAG sketch follows this entry).
.NET Modern Task Scheduler about 7 days ago:
A few years ago, I opened a GitHub issue with Microsoft telling them that I think the .NET ecosystem needs its own equivalent of Apache Airflow or Prefect. Fast forward 'til now, and I still don't think we have anything close to these frameworks. -
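As a rough illustration of that programmatic model, here is a hedged sketch of an Airflow DAG, assuming Airflow 2.x; the DAG name and the extract/load callables are hypothetical placeholders:

    # Minimal Airflow DAG sketch: two Python tasks, run daily, load after extract.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("extracting")  # placeholder for pulling rows from a source

    def load():
        print("loading")     # placeholder for writing rows to a target

    with DAG(
        dag_id="example_etl",            # hypothetical name
        start_date=datetime(2023, 1, 1),
        schedule_interval="@daily",      # one run per day
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task        # ordering, visible in the Airflow UI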
Extract, transform, and load (ETL) data across multiple systems, with support for extended metadata management and big data enterprise connectivity.
-
Oracle Data Integrator is a data integration platform that covers everything from high-volume batch loads to trickle-feed integration processes.
-
Learn about SQL Server Integration Services, Microsoft's platform for building enterprise-level data integration and data transformation solutions.
-
Connect to any data source in batch or real-time, across any platform. Download Talend Open Studio today to start working with Hadoop and NoSQL.
-
Hitachi Vantara brings Pentaho Data Integration, an end-to-end platform for all data integration challenges that simplifies the creation of data pipelines and provides big data processing.
-
Open-source software for reliable, scalable, distributed computing
5 Best Practices For Data Integration To Boost ROI And... about 14 days ago:
There are different ways to implement parallel dataflows, such as using parallel data processing frameworks like Apache Hadoop, Apache Spark, and Apache Flink, or using cloud-based services like Amazon EMR and Google Cloud Dataflow. It is also possible to use parallel dataflow frameworks to handle big data and distributed computing, like Apache NiFi and Apache Kafka. -
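To make the parallel-dataflow idea concrete, here is a minimal sketch using PySpark, one of the frameworks named above; the S3 paths and column names are hypothetical:

    # Spark splits the input into partitions and processes them in parallel
    # across the cluster; the same script runs unchanged on Amazon EMR.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("parallel-etl").getOrCreate()

    orders = spark.read.parquet("s3://example-bucket/orders/")  # hypothetical input
    daily = orders.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
    daily.write.mode("overwrite").parquet("s3://example-bucket/daily_revenue/")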
Fully managed extract, transform, and load (ETL) service
Deploying a Data Warehouse with Pulumi and Amazon Redshift about 4 months ago:
So in the next post, we'll do that: We'll take what we've done here, add a few more components with Pulumi and AWS Glue, and wire it all up with a few magical lines of Python scripting. -
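A hedged sketch of what those few lines of Python can look like with Pulumi's AWS provider; the resource names, bucket path, and IAM role ARN are hypothetical:

    # Provision a Glue Data Catalog database plus a crawler that scans raw
    # S3 data and registers its schema, using the pulumi_aws package.
    import pulumi_aws as aws

    db = aws.glue.CatalogDatabase("warehouse-db", name="warehouse")

    crawler = aws.glue.Crawler(
        "orders-crawler",
        database_name=db.name,
        role="arn:aws:iam::123456789012:role/glue-crawler-role",  # hypothetical role
        s3_targets=[aws.glue.CrawlerS3TargetArgs(path="s3://example-bucket/raw/")],
    )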
AWS Data Pipeline is a cloud-based data workflow service that helps you process and move data between different AWS services and on-premises data sources.
Ingestion of live data about 1 year ago
Also, if you're doing this for an employer, and they have some deeper pockets, there is also AWS Data Pipeline. -
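For completeness, a hedged boto3 sketch of creating and activating a pipeline; the names and the lone "Default" object are hypothetical, and a real definition would add activities and data nodes:

    import boto3

    dp = boto3.client("datapipeline")

    # Create an empty pipeline shell, then push a minimal definition into it.
    created = dp.create_pipeline(name="example-pipeline", uniqueId="example-pipeline-1")
    pipeline_id = created["pipelineId"]

    dp.put_pipeline_definition(
        pipelineId=pipeline_id,
        pipelineObjects=[{
            "id": "Default",
            "name": "Default",
            "fields": [{"key": "scheduleType", "stringValue": "ondemand"}],
        }],
    )
    dp.activate_pipeline(pipelineId=pipeline_id)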
Learn more about Azure Data Factory, the easiest cloud-based hybrid data integration solution at an enterprise scale. Build data factories without the need to code.
(Recommend) Fun Open Source Tool for Pushing Data Around about 12 months ago
You might want to look at Azure Data Factory https://azure.microsoft.com/en-us/services/data-factory/ to extend SSIS EDIT: Yes, I missed the "open source" part :). -
Google Cloud Dataflow is a fully-managed cloud service and programming model for batch and streaming big data processing.
How do you implement CDC in your organization about 2 months ago:
Imo if you are using the cloud and not doing anything particularly fancy the native tooling is good enough. For AWS that is DMS (for RDBMS) and Kinesis/Lambda (for streams). Google has Data Fusion and Dataflow. Azure has Data Factory if you are unfortunate enough to have to use SQL Server or Azure. Imo the vendored tools and open source tools are more useful when you need to ingest data from SaaS platforms, and... -
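The Kinesis/Lambda half of that comment reduces to a small consumer; a hedged sketch, assuming DMS (or any CDC producer) writes JSON change records to the stream:

    import base64
    import json

    def handler(event, context):
        # Standard Kinesis trigger payload: records arrive base64-encoded.
        for record in event["Records"]:
            payload = base64.b64decode(record["kinesis"]["data"])
            change = json.loads(payload)
            # A DMS change record typically carries the row image plus
            # metadata such as operation type and table; apply it here.
            print(change)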
Consolidate your customer and product data in minutes
-
SAP Data Services provides functionality for data integration, quality, cleansing, and more.
-
Hevo Data is a no-code, bi-directional data pipeline platform specially built for modern ETL, ELT, and reverse ETL needs. Get near real-time data pipelines for reporting and analytics up and running in just a few minutes. Try Hevo for free today!
Quick tip: Replicating a MongoDB Atlas database to... about 5 months ago
In a previous article, we used open-source Airbyte to create an ELT pipeline between SingleStoreDB and Apache Pulsar. We have also seen in another article several methods to ingest MongoDB JSON data into SingleStoreDB. In this article, we’ll evaluate a commercial ELT tool called Hevo Data to create a pipeline between MongoDB Atlas and SingleStoreDB Cloud. Switching to SingleStoreDB has many benefits, as described... -
Improve the return on your data lake investment and simplify the ETL process by automating manual and repetitive aspects of data warehouse automation with Attunity Compose.