We compared Apica and Datadog based on real PeerSpot user reviews.
Find out in this report how the two Application Performance Monitoring (APM) and Observability solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.

"It is easy for beginners to learn and use Apica."
"It is easy to set up and configure."
"I like the transcript download feature. And with UI scripting, it's helpful that Apica handles a lot of the backend work automatically. I don't have to tag everything manually, though I can tag elements later if needed. It's really good at recording the steps."
"It uses a basic scripting language, which is easy to learn and customize as needed. Compared to LoadRunner, I found writing and customizing code much easier in Apica."
"It helps with releases because we monitor them in staging. We can tell if something is critically wrong before it gets into production, e.g., whether it was load-related or function-related, and what was different in the dev stage. It then alerts us straightaway inside our production monitors once it has been released. Therefore, it has improved how we run our systems, since we monitor multiple environments."
"Our application SREs script checks in a way that closely mimics our customers' actions on the platform. Because there are so many different ways and options to configure checks to closely mirror your applications' capabilities, it gives teams a lot of optionality to create the right type of check that can notify when there are any issues. At the end of the day, we want our monitoring tools to catch any outage before our customers do. This is where Apica Synthetic does a great job."
"There are several features that are really good. The first one is the flexibility and the advanced configuration that Apica offers when it comes to configuring synthetic checks. It provides the ability to customize how the check should be performed and it is very flexible in the number of synthetic locations that it can use. It allows us to run scripts from different locations all over the world, and they have a really good number of these locations."
"As always in the IT industry, everybody is looking to upgrade and update everything. Apica has been one of those things, but it's really hard to replace because it offers us the unique capability to see what the customer is seeing. A lot of other tools can run Selenium scripts and the like, but we rely heavily on the scenario options in Apica right now; other tools do parts of what Apica does, but none of them does everything."
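The kind of check these reviewers describe, a scripted request that mimics a user's action and flags failures or slow responses, can be sketched generically in Python. This is an illustrative sketch, not Apica's scripting language; the URL and latency threshold are made-up parameters:

```python
import time
import urllib.request


def synthetic_check(url: str, timeout: float = 10.0, max_latency: float = 2.0):
    """Minimal synthetic check: fetch a URL the way a user's browser would,
    and report pass/fail plus observed latency so an alert can fire."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            ok = resp.status == 200
    except Exception:
        # Any network/HTTP error counts as a failed check.
        ok = False
    latency = time.monotonic() - start
    # The check fails if the request errored OR was slower than the threshold.
    return ok and latency <= max_latency, latency
```

A real synthetic-monitoring product layers multi-step scenarios, geographically distributed probe locations, and alert routing on top of this basic loop.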
"Most of the features in the way Datadog does monitoring are commendable, and that is the reason we chose it. We did some comparisons before picking Datadog. Datadog was recommended based on the features provided."
"Their interface is probably one of the easiest things to use because it lets non-developers and non-engineers quickly get access to metrics and pull business value out of them. We could put together dashboards and give it to people who are non-technical, then they can see the state of the world."
"It lets us react more quickly to things going wrong. Whereas before it might have been 30 minutes to an hour before we noticed something going on, we now know within a minute or two if something is off, which lets us get services back up and running faster for our customers, and that translates to revenue."
"It brings in observability, monitoring, and alerting capabilities - all of which we need to operate at scale."
"It is great that creating an incident is possible from Slack while having all the relevant data in Datadog."
"The ability to send notifications based on metadata from the monitor is helpful."
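The metadata-driven notifications this reviewer mentions are typically configured through template variables in a Datadog monitor's message. The sketch below is illustrative, not the reviewer's actual setup; the query, threshold, tags, and Slack handle are assumptions:

```json
{
  "name": "High CPU on {{host.name}}",
  "type": "metric alert",
  "query": "avg(last_5m):avg:system.cpu.user{env:prod} by {host} > 90",
  "message": "{{#is_alert}}CPU is at {{value}}% on {{host.name}}. @slack-ops-alerts{{/is_alert}}",
  "tags": ["env:prod"]
}
```

Template variables such as `{{host.name}}` and `{{value}}` are filled in from the triggering monitor's metadata, which is what allows notifications to be routed and annotated per host or per group.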
"The application performance monitoring is pretty good."
"We've found it most useful for managing RStudio Workbench, which has its own logs that would not be picked up via CloudWatch."
"We have been focused on reducing polling times for synthetic checks. We have gone from 10 minutes down to five minutes for a pretty broad swath, but there is some appetite to reduce that further, which could be an improvement."
"If you are adding any input file, the tool fails to capture the path."
"Learning the tool has always been a little difficult from a scripting perspective because the framework is proprietary and unique. Once we became used to what it does and how to perform it, then it became easier for my team and me. I would like to see some of the testing steps be part of a more well-known language, like Java or Python. That would be a big improvement."
"We could use more detailed information in the request and response sections."
"Apica was a relatively new tool when I started using it. Although Apica had good documentation, it still felt less developed or advanced than a tool like LoadRunner."
"I have noticed that the tool isn't widely recognized outside our organization. Also, there aren't any tutorials or dedicated resources for this tool, making it challenging for newcomers to learn. It would be beneficial if someone experienced with it could provide guidance."
"The reporting part that we use for our executives needs a bit more customization capabilities. Right now, you can use only the three main templates for reporting. We would like to be able to customize them."
"The accuracy of alerts could be improved a little. It is pretty good at alerting quickly about failures or changes in response times, but the alerts come very frequently and we would like to tone them down. That's the only trouble we have: there is no setting that lets us reduce the alert volume to a minimum. As a platform, it does send us good alerts, but this could be improved a bit."
"Graph filters for logs need to be set manually, which works well for JSON but not for unstructured logs."
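The distinction this reviewer draws is that JSON logs carry their fields explicitly, so a log pipeline can index them without hand-written filters, whereas unstructured text must be parsed manually. A minimal sketch of emitting JSON logs from Python's standard `logging` module (the logger name and fields are illustrative):

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object per line, so a log
    backend can index level/logger/message fields automatically."""

    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        })


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("checkout")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("order placed")  # emitted as a single JSON line
```

With this shape, each field becomes a queryable facet; a free-text line like `INFO checkout order placed` would instead need a custom parsing rule.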
"Their security features could be improved. We looked at their Security Monitoring feature, but it was early in its development. Datadog is just getting into the security space, so I'm sure this will improve in the future."
"I would like testing for data in the future."
"It would also be nice if we had more insight into our own usage of Datadog (agents and custom metrics). They provide a usage page which does help, but it is not in real-time."
"I found the documentation can sometimes be confusing."
"I would like the tooling to have better integration in Slack, specifically sending out reminders to the relevant people to take breaks, do a retrospective, and specify with emojis which messages to log."
"Geo-data is also something very critical that we hope to see in the future."
"I'm not sure what kind of features are in the roadmap right now, but I encourage the development of features for defining your organization, and allowing the visibility of what kind of metrics you can get. Those features would be really useful for us."
Apica is ranked 45th in Application Performance Monitoring (APM) and Observability with 6 reviews while Datadog is ranked 1st in Application Performance Monitoring (APM) and Observability with 137 reviews. Apica is rated 8.2, while Datadog is rated 8.6. The top reviewer of Apica writes "Offers transcript download feature and easy to set up and configure tests but not very user friendly". On the other hand, the top reviewer of Datadog writes "Very good RUM, synthetics, and infrastructure host maps". Apica is most compared with Dynatrace, AppDynamics, Apache JMeter and OpenText LoadRunner Cloud, whereas Datadog is most compared with Dynatrace, Azure Monitor, New Relic, AWS X-Ray and Elastic Observability. See our Apica vs. Datadog report.
See our list of best Application Performance Monitoring (APM) and Observability vendors and best Log Management vendors.
We monitor all Application Performance Monitoring (APM) and Observability reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.