
Amazon EC2 Auto Scaling vs Apache Spark comparison

 


Executive Summary (updated May 21, 2025)

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

ROI

Sentiment score
6.4
Organizations saved resources and infrastructure costs with Amazon EC2 Auto Scaling, achieving over 50% ROI in high-load applications.
Sentiment score
6.6
Apache Spark enhances machine learning and can cut operational costs by up to 50%, though efficiency depends on available resources and expertise.
 

Customer Service

Sentiment score
7.2
Amazon EC2 Auto Scaling is praised for its responsive support, though third-party integration challenges suggest room for better vendor collaboration.
Sentiment score
5.9
Apache Spark support feedback varies, with mixed reviews on community forums, vendor support, and documentation adequacy.
 

Scalability Issues

Sentiment score
7.6
Amazon EC2 Auto Scaling offers flexible, scalable solutions, efficiently managing resources to meet varied demands and simplifying operations.
Sentiment score
7.5
Apache Spark excels in scalability, efficiently handling large data workloads with ease, though it requires skilled infrastructure management.
The scaling feature appears to be embedded in the Amazon EC2 Auto Scaling price.
 

Stability Issues

Sentiment score
8.3
Amazon EC2 Auto Scaling is stable and reliable, with efficient scaling and minor regional issues, but SLAs need examination.
Sentiment score
7.5
Apache Spark is generally stable, trusted by companies; newer versions enhance reliability, though memory issues may arise without proper configuration.
The stability of Amazon EC2 Auto Scaling rates a 10.
Amazon EC2 Auto Scaling should automatically scale out systems during high demand and scale in (remove) instances when demand decreases, as the policy sketch after these excerpts illustrates.
Apache Spark resolves many of the problems with MapReduce and Hadoop, such as the difficulty of running Python or machine learning workloads effectively.
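A minimal sketch of how that scale-out/scale-in behavior is typically configured, assuming a hypothetical Auto Scaling group named web-tier-asg and a target tracking policy via boto3; group names, metrics, and target values will differ per environment.

import boto3

autoscaling = boto3.client("autoscaling")

# Hypothetical group name. The policy adds instances when average CPU rises
# above the target and removes them again when demand falls.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-tier-asg",
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 60.0,
    },
)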
 

Room For Improvement

Amazon EC2 Auto Scaling struggles with scalability, support, billing complexity, limited automation, connectivity issues, and inadequate training materials.
Apache Spark requires improvements in scalability, usability, documentation, memory efficiency, real-time processing, and broader language support for better performance.
In enterprise environments such as healthcare or banking with numerous instances running different applications, customizable policies allow appropriate scaling.
Integration with LLMs would be beneficial, as many services are adding this functionality.
Amazon should provide more detailed training materials for people who are just starting to work with Amazon EC2 Auto Scaling.
 

Setup Cost

Amazon EC2 Auto Scaling offers pay-as-you-go pricing, with costs varying by usage and region; competitor pricing varies as well.
Apache Spark is cost-effective but may incur expenses from hardware, cloud resources, or commercial support, impacting deployment costs.
In some projects, incorrect decisions were made by not consulting them first, resulting in higher setup and maintenance costs.
It operates on a pay-as-you-go model, meaning if a machine is used for only an hour, the pricing will be calculated for that hour only, not the entire month.
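A rough illustration of that per-hour billing; the hourly rate below is hypothetical, and real prices depend on instance type and region.

# Hypothetical on-demand hourly rate; actual prices vary by instance type and region.
HOURLY_RATE_USD = 0.10

def usage_cost(hours_used: float) -> float:
    # Pay-as-you-go: you pay only for the hours the instance actually ran.
    return hours_used * HOURLY_RATE_USD

print(usage_cost(1))        # one hour of use          -> 0.10
print(usage_cost(24 * 30))  # left running all month   -> 72.00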
 

Valuable Features

Amazon EC2 Auto Scaling offers seamless scalability, automatic provisioning, integration, security, cost-efficiency, and flexible management for diverse business needs.
Apache Spark offers fast in-memory processing, scalable analytics, MLlib for machine learning, SQL support, and seamless integration with languages.
This pre-configuration refines on-demand scaling, and the configuration includes automatic traffic distribution: when the first system is overloaded, new incoming traffic is redirected to the newly created systems.
The service offers 99.9999% availability.
The most valuable feature of Amazon EC2 Auto Scaling is its scalability.
Few solutions can deliver this data fast enough to be usable; Apache Spark Structured Streaming is one of the exceptions.
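A minimal Apache Spark Structured Streaming sketch of the kind of low-latency aggregation reviewers describe, using the built-in rate source so it runs without external systems; the source, window size, and console sink are placeholders for real feeds and targets.

from pyspark.sql import SparkSession
from pyspark.sql.functions import window, col

spark = SparkSession.builder.appName("streaming-sketch").getOrCreate()

# The built-in "rate" source generates timestamped rows, standing in for a real feed.
events = spark.readStream.format("rate").option("rowsPerSecond", 10).load()

# Incrementally count events per one-minute window as new data arrives.
counts = events.groupBy(window(col("timestamp"), "1 minute")).count()

# Print each updated result to the console; a real job would write to a table or topic.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination(30)  # run for ~30 seconds in this sketch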
 

Categories and Ranking

Amazon EC2 Auto Scaling
Ranking in Compute Service: 5th
Average Rating: 9.0
Reviews Sentiment: 7.6
Number of Reviews: 49
Ranking in other categories: none

Apache Spark
Ranking in Compute Service: 3rd
Average Rating: 8.4
Reviews Sentiment: 7.3
Number of Reviews: 67
Ranking in other categories: Hadoop (1st), Java Frameworks (2nd)
 

Mindshare comparison

As of September 2025, in the Compute Service category, the mindshare of Amazon EC2 Auto Scaling is 9.3%, down from 13.7% compared to the previous year. The mindshare of Apache Spark is 11.9%, up from 11.5% compared to the previous year. It is calculated based on PeerSpot user engagement data.
Compute Service Market Share Distribution
Apache Spark: 11.9%
Amazon EC2 Auto Scaling: 9.3%
Other: 78.8%
 

Featured Reviews

Karthikeyan Ganesan - PeerSpot reviewer
Offers automatic scaling features but can improve user interface and setup guidance
Customizable policies help us determine how scaling should occur. In enterprise environments such as healthcare or banking with numerous instances running different applications, customizable policies allow appropriate scaling. For critical servers, we can set up a higher number of new instances to scale out to prevent downtime. For less critical servers that perform simple tasks such as file copying, we can use customizable policies to scale out minimal instances to avoid unnecessary expenses or cloud costs. Regarding integration, there are some disadvantages in AWS where certain availability zones or regions experience glitches. This can cause production halts because of problems on Amazon's side. In particular regions, when integrating Amazon EC2 Auto Scaling or other services, there might be delays in creating new Amazon EC2 instances, sometimes becoming inefficient.
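A sketch of what such criticality-based policies might look like with boto3; the group names, capacities, and step sizes are hypothetical, and the CloudWatch alarm that triggers the step policy is configured separately and not shown here.

import boto3

autoscaling = boto3.client("autoscaling")

# Hypothetical groups: capacity limits differ with how critical the workload is.
autoscaling.update_auto_scaling_group(
    AutoScalingGroupName="critical-app-asg", MinSize=2, MaxSize=10
)
autoscaling.update_auto_scaling_group(
    AutoScalingGroupName="file-copy-asg", MinSize=1, MaxSize=2
)

# A larger step adjustment for the critical tier, so it scales out faster
# when the associated alarm breaches.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="critical-app-asg",
    PolicyName="scale-out-aggressively",
    PolicyType="StepScaling",
    AdjustmentType="ChangeInCapacity",
    StepAdjustments=[{"MetricIntervalLowerBound": 0.0, "ScalingAdjustment": 3}],
)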
Omar Khaled - PeerSpot reviewer
Empowering data consolidation and fast decision-making with efficient big data processing
I can improve the organization's functions by taking less time to make decisions. To make the right decision, you need the right data, and a solution can provide this by hiring talent and employees who can consolidate data from different sources and organize it. Not all solutions can make this data fast enough to be used, except for solutions such as Apache Spark Structured Streaming. To make the right decision, you should have both accurate and fast data. Apache Spark itself is similar to the Python programming language. Python is a language with many libraries for mathematics and machine learning. Apache Spark is the solution, and within it, you have PySpark, which is the API for Apache Spark to write and run Python code. Within it, there are many APIs, including SQL APIs, allowing you to write SQL code within a Python function in Apache Spark. You can also use Apache Spark Structured Streaming and machine learning APIs.
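A small PySpark sketch of the pattern the reviewer describes, running SQL inside an ordinary Python function via the SQL API; the table and columns are made up for illustration.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql-in-python").getOrCreate()

# Hypothetical sample data standing in for consolidated source systems.
orders = spark.createDataFrame(
    [("2024-01-01", "EMEA", 120.0), ("2024-01-01", "APAC", 80.0)],
    ["order_date", "region", "amount"],
)
orders.createOrReplaceTempView("orders")

def revenue_by_region():
    # SQL written inside an ordinary Python function through the SQL API.
    return spark.sql(
        "SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region"
    )

revenue_by_region().show()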
 

Top Industries

By visitors reading reviews

Amazon EC2 Auto Scaling
Financial Services Firm: 19%
Computer Software Company: 12%
Retailer: 7%
University: 7%

Apache Spark
Financial Services Firm: 26%
Computer Software Company: 11%
Manufacturing Company: 7%
Comms Service Provider: 7%
 

Company Size

By reviewers

Amazon EC2 Auto Scaling
Small Business: 12
Midsize Enterprise: 9
Large Enterprise: 25

Apache Spark
Small Business: 27
Midsize Enterprise: 15
Large Enterprise: 32
 

Questions from the Community

What do you like most about Amazon EC2 Auto Scaling?
The solution removes the need for hardware. We can easily create servers or machines. Just by clicking or specifying our requirements, like memory size or disk space, it's set up for us. The tool e...
What is your experience regarding pricing and costs for Amazon EC2 Auto Scaling?
The costing for Amazon EC2 Auto Scaling is equivalent to creating Amazon EC2 servers. For instance, if one dollar is paid for a single Amazon EC2 machine, the same cost applies for Amazon EC2 Auto ...
What needs improvement with Amazon EC2 Auto Scaling?
Amazon should provide more detailed training materials for people who are just starting to work with Amazon EC2 Auto Scaling. Documentation should be provided according to region. For example, in I...
What do you like most about Apache Spark?
We use Spark to process data from different data sources.
What is your experience regarding pricing and costs for Apache Spark?
Apache Spark is open-source, so it doesn't incur any charges.
What needs improvement with Apache Spark?
Regarding Apache Spark, I have only used Apache Spark Structured Streaming, not the machine learning components. I am uncertain about specific improvements needed today. However, after five years, ...
 

Also Known As

Amazon EC2 Auto Scaling: AWS RAM
Apache Spark: No data available
 

Overview

 

Sample Customers

Amazon EC2 Auto Scaling: Expedia, Intuit, Royal Dutch Shell, Brooks Brothers
Apache Spark: NASA JPL, UC Berkeley AMPLab, Amazon, eBay, Yahoo!, UC Santa Cruz, TripAdvisor, Taboola, Agile Lab, Art.com, Baidu, Alibaba Taobao, EURECOM, Hitachi Solutions