
CrossBrowserTesting vs HeadSpin comparison

 

Comparison Buyer's Guide

Executive Summary (updated on Dec 18, 2024)


Categories and Ranking

CrossBrowserTesting
Ranking in Functional Testing Tools: 28th
Average Rating: 9.0
Reviews Sentiment: 7.6
Number of Reviews: 19
Ranking in other categories: none

HeadSpin
Ranking in Functional Testing Tools: 33rd
Average Rating: 8.0
Reviews Sentiment: 6.8
Number of Reviews: 6
Ranking in other categories: Mobile APM (6th), Mobile App Testing Tools (8th)
 

Mindshare comparison

As of October 2025, in the Functional Testing Tools category, the mindshare of CrossBrowserTesting is 1.0%, up from 0.8% a year earlier. The mindshare of HeadSpin is 0.6%, essentially unchanged from the previous year. Mindshare is calculated from PeerSpot user engagement data.
Functional Testing Tools Market Share Distribution
CrossBrowserTesting: 1.0%
HeadSpin: 0.6%
Other: 98.4%
 

Featured Reviews

CN
Knowledgeable support, scalable, and stable
We use CrossBrowserTesting for testing our web-based applications. We had some issues with the onboarding process, and the cloud connectivity could improve. I have used CrossBrowserTesting within the past 12 months. CrossBrowserTesting is stable. I have found CrossBrowserTesting to be scalable.…
Saorabh Singh - PeerSpot reviewer
It fulfills everything from automation to manual performance
The most valuable features of the product are the performance parameters it gives us, as well as the seamless connectivity with our automation suites. I am also pleased with the continuous enhancements made to HeadSpin. There have been many features added since we started using the product, and all of them are useful.

Quotes from Members

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Pros

"The screen shot portal is essential for an easy way to run tests across hundreds of browsers and retrieve screenshots which then indicate success or failure."
"Each new session started with the live testing feature allows for a cleared browser and new experience to be able to not only see these attributes on the page clearly but also pass clean data."
"The support team is top-notch. I have a great relationship with them. They are extremely honest and responsive."
"At the moment, all our deploys depend on results of automation. If the tests are failing, then we know that something is wrong at the early stages of development."
"The extensive range of products available to simulate is something I have come to appreciate as it has resulted in an ability to broaden the scope of our tests."
"When I started to work on testing automation, I was very excited about how easy it is to run tests on different browsers. It was just a matter of configuration."
"The CrossBrowserTesting Selenium API and live test features have greatly improved our team's ability to quickly and effectively perform QA."
"SmartBear has excellent, informative webinars, so keep an eye out for those."
"The most valuable feature is that this is the first connected intelligence all-in-one platform."
"The technical support is really helpful because we can set up direct calls with them if we want to. We can use Zoom or Google Meet to interact with them directly, and if there is an issue in our system, they will help us by reproducing the issue in their machines and trying to figure out a solution. The support is really smooth, and we like that they're very supportive."
"It has an interesting feature called AV box testing. A lot of companies that are in the OTT segment don't really understand what their streaming is like. They can't test for streaming quality. There are restrictions where you cannot simulate live streaming. For example, on Netflix, you can't simulate how a movie is being streamed on a remote device. That's why HeadSpin has got this AV box testing feature. It is a patented feature. They send an AV box to your location, and you can test live streaming, which is something that no other company does."
"The most valuable features of the product are the performance parameters it gives us."
"The initial setup of HeadSpin was very easy and user-friendly. It was easy to configure and write a script."
"The most valuable feature of HeadSpin it's the integration with other solutions. It is great. I can search for an element or do a quick debugging on the application right on HeadSpin. It's very useful."
 

Cons

"A problem that we are facing quite often is related to the network connection. Tests can fail if the remote CrossBrowserTesting's VM has connection problems. This happens mostly with browsers of Internet Explorer family which work on Windows OS."
"Elements of 'real' mobile/tablet testing could be sped up."
"The "Getting Started" documentation for Selenium testing could be improved."
"The speed connection in mobile devices could be improved, because sometimes the load time is uncertain."
"It would be useful if we can run the live-testing test cases on multiple platforms at the same time, instead of waiting for one session to finish."
"This solution would benefit from faster testing and support for more devices."
"Sometimes the testing is slow."
"Being able to test on real devices via the virtual connection is wonderful, but it can cause some lag and load time issues while testing."
"They should automate their onboarding. A lot of things are still manual. They can create a video assistant or something like that to completely automate the entire process."
"HeadSpin could improve on the user interface because it is very poor. The checks that are done on the iOS devices are very difficult, but for Android, it runs great. For all iOS devices, the user interface and how it interacts with the device are very poor."
"Sometimes, devices go offline and some features are not functioning on some devices, specifically on iOS."
"If you want to do some testing or check the devices manually or check the application in a particular device manually, it is really laggy. That's a disappointment because sometimes we would like to do manual testing when our local devices are not available."
"HeadSpin needs to improve the hardware. With the mobile, the battery life reduces and must be continuously charged."
"Support and pricing could be improved."
 

Pricing and Cost Advice

"SmartBear offers bundles of products that work together."
"A few intermediary pricing options for small QA teams would be nice, e.g., unlimited screenshots, "as you need it" parallel tests, etc."
"The lowest price point is very reasonable. It is also useful if only one person in the company needs to check on the browser display."
"It is worth the pricing as the product is supported on multiple platforms and browsers."
"CrossBrowserTesting offered the best value for its price."
"We have a yearly license for 16 devices."
"It has a yearly license. There is no other option. It is expensive. There are a lot of other cheaper players in the market, but it is like a Mercedes. You pay an extra premium for it, but you get the benefits. I would love to see them come up with project-based costing. Companies that are low on funds or new-age can do with pricing that is easily digestible. They can give them a pricing model for three months. They can provide a startup package."
"It's not cheap, but there are a few different packages and different prices for enterprises with different product versions."
"I believe the licensing cost is cheap because it's a total solution, hardware, license and software."
 

Top Industries

By visitors reading reviews
CrossBrowserTesting: No data available
HeadSpin:
Computer Software Company: 17%
Financial Services Firm: 16%
Manufacturing Company: 12%
Healthcare Company: 5%
 

Company Size

By reviewers
CrossBrowserTesting:
Small Business: 9
Midsize Enterprise: 5
Large Enterprise: 10
HeadSpin: No data available
 

Overview

 

Sample Customers

St. Jude Children's Research Hospital, Accenture, Sony, Los Angeles Times, ADP, Verizon, T-Mobile, Wistia
Zynga, Tinder, Pinterest, Akamai, Microsoft, Airbnb, Jam City, TMobile, Mozilla, CNN, Cognizant, Yahoo!, ebay, Quora, Walmart, Kohls, Telstra