1. Apache HDFS is a distributed file system that makes it possible to scale a single Apache Hadoop cluster to hundreds (and even thousands) of nodes. A minimal Python read/write sketch appears after this list.

  2. Fully managed extract, transform, and load (ETL) service

  3. Google Cloud Dataflow is a fully managed cloud service and programming model for batch and streaming big data processing. A minimal Apache Beam pipeline sketch, which Dataflow can execute, appears after this list.

  4. Amazon Elastic MapReduce (EMR) is a web service that makes it easy to quickly process vast amounts of data using frameworks such as Apache Hadoop and Spark. A sketch that launches a one-step EMR Spark job appears after this list.

  5. AWS Data Pipeline is a cloud-based data workflow service that helps you process and move data between different AWS services and on-premises data sources.

  6. A fully managed data warehouse for large-scale data analytics.

  7. Xplenty gives you the power of Hadoop data processing without installing hardware or software, and without Hadoop programming skills. It's all in the cloud.

  8. The Starfish ETL (Extract Transform Load) Suite is a CRM integration and migration tool.

  9. Free cloud data platform for data integration, backup & management

  10. Quickly and securely import, export and delete unlimited amounts of data for your enterprise.

  11. Automate manual processes without spreadsheets or code

  12. Databricks provides a Unified Analytics Platform, built around Apache Spark, that accelerates innovation by unifying data science, engineering, and business. A minimal PySpark sketch appears after this list.
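
For item 1 above, here is a minimal sketch of reading and writing a file on HDFS from Python. It is only an illustration: it assumes the third-party `hdfs` (HdfsCLI) package, a WebHDFS endpoint at `http://namenode:9870`, and placeholder user and paths.

```python
# Minimal HDFS read/write sketch using the third-party `hdfs` (HdfsCLI) package.
# The namenode URL, user, and paths are placeholder assumptions.
from hdfs import InsecureClient

client = InsecureClient("http://namenode:9870", user="hadoop")

# Write a small text file into HDFS.
client.write("/data/example/greeting.txt", data="hello from HDFS\n",
             overwrite=True, encoding="utf-8")

# Read it back.
with client.read("/data/example/greeting.txt", encoding="utf-8") as reader:
    print(reader.read())

# List the directory contents.
print(client.list("/data/example"))
```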
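
For item 3, here is a minimal Apache Beam word-count pipeline in Python, the programming model that Dataflow executes. The input and output paths are placeholders, and the default local runner is used; passing `--runner=DataflowRunner` plus project, region, and staging options would target Dataflow itself.

```python
# Minimal Apache Beam pipeline sketch (the model Google Cloud Dataflow runs).
# Paths are placeholders; pipeline options decide whether it runs locally or on Dataflow.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # e.g. add --runner=DataflowRunner --project ... --region ...

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("input.txt")
        | "Split" >> beam.FlatMap(lambda line: line.split())
        | "Pair" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Format" >> beam.Map(lambda kv: f"{kv[0]}\t{kv[1]}")
        | "Write" >> beam.io.WriteToText("word_counts")
    )
```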
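
For item 4, here is a hedged boto3 sketch that launches a transient EMR cluster, runs a single Spark step, and terminates. The release label, instance types, S3 URI, and IAM role names are placeholder assumptions that would need to match your own account.

```python
# Sketch: transient EMR cluster that runs one Spark step, then terminates.
# Release label, instance types, S3 path, and role names are placeholder assumptions.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="example-spark-job",
    ReleaseLabel="emr-6.15.0",
    Applications=[{"Name": "Spark"}],
    Instances={
        "InstanceGroups": [
            {"Name": "Primary", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "Core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,  # shut down once the step finishes
    },
    Steps=[{
        "Name": "spark-step",
        "ActionOnFailure": "TERMINATE_CLUSTER",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "s3://my-bucket/jobs/process.py"],
        },
    }],
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster ID:", response["JobFlowId"])
```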
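
For item 12, here is a minimal PySpark sketch of the kind of Spark code a Databricks workspace runs. The CSV path and column name are placeholder assumptions; inside a Databricks notebook a ready-made `spark` session is provided, so the explicit builder is only needed when running elsewhere.

```python
# Minimal PySpark sketch; the input path and "category" column are placeholder assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# A Databricks notebook already exposes `spark`; build one only for local runs.
spark = SparkSession.builder.appName("example-analysis").getOrCreate()

df = spark.read.csv("events.csv", header=True, inferSchema=True)

# Simple aggregation: event counts per category, largest first.
(
    df.groupBy("category")
      .agg(F.count("*").alias("events"))
      .orderBy(F.desc("events"))
      .show()
)

spark.stop()
```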