
OpenVINO Valuable Features

Mahender Reddy Pokala - PeerSpot reviewer
AI Developer at University of Chicago
I found OpenVINO's ability to convert custom models into its format particularly beneficial, as businesses sometimes require unique models specific to their use cases. Utilizing OpenVINO allowed me to run these custom models directly on devices, which I found quite impressive. Additionally, the Model Zoo offered by OpenVINO added value to the product.
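
The conversion workflow this reviewer describes can be sketched roughly as follows with OpenVINO's Python API (a minimal sketch, assuming OpenVINO 2023.0+ is installed; the ONNX source path and output path are hypothetical):

```python
def convert_custom_model(source_path: str, output_path: str):
    """Convert a model (e.g. an ONNX file) to OpenVINO IR and save it to disk."""
    import openvino as ov  # assumes the `openvino` package (2023.0+) is installed

    # convert_model accepts ONNX/TensorFlow/PyTorch sources and returns an ov.Model
    ov_model = ov.convert_model(source_path)
    # save_model writes the IR pair: the .xml graph plus a .bin weights file
    ov.save_model(ov_model, output_path)


# Example (hypothetical paths):
# convert_custom_model("my_custom_model.onnx", "my_custom_model.xml")
```
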
JH
Senior Data Scientist / AI Engineer at Zantaz Data Resources

What I most appreciate about OpenVINO is the ability to run models on a CPU. That was obviously the reason to choose this solution. Another valuable feature is the ability to run on non-NVIDIA GPUs. As this is a product developed by Intel, it was designed around Intel's own hardware, so inference for deep learning models and LLMs can run on Intel CPUs and also on Intel GPUs.

Before, that was not possible; the GPU market is completely dominated by NVIDIA. OpenVINO allowed me to run a model on an Intel CPU. Also, deep learning frameworks such as PyTorch and TensorFlow are primarily optimized for NVIDIA GPUs, and OpenVINO opened the door to running LLMs on non-NVIDIA GPUs.

The benefit of using OpenVINO is that NVIDIA dominates the GPU market and sets the prices; there is no competitor. If I am able to run LLM inference on commodity hardware, I am saving costs. Buying an NVIDIA GPU to run one of these models costs a minimum of 5,000 pounds, and that is a very small NVIDIA GPU. This cost is high in comparison with an Intel GPU, or even an Intel CPU with similar performance. It is a cost-saving solution that offers the possibility of running on cheaper hardware.
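
The CPU/GPU flexibility described above comes down to the device name passed at compile time. A minimal sketch using OpenVINO's Python API (the model path is a placeholder; assumes OpenVINO 2023.0+):

```python
def run_on_device(model_path: str, device: str = "CPU"):
    """Compile an IR model for a target device and return the compiled model."""
    import openvino as ov  # assumes the `openvino` package is installed

    core = ov.Core()
    # The same model can be compiled for "CPU" (any x86 host)
    # or "GPU" (Intel integrated/discrete graphics)
    compiled = core.compile_model(model_path, device_name=device)
    return compiled


# Example (hypothetical path):
# compiled = run_on_device("model.xml", "CPU")
```
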

DS
Computer Vision Engineer at Ivideon
The runtime of OpenVINO is highly valuable for running different computer vision models, and I use it primarily for this purpose. Additionally, I sometimes use the quantizer to make the models run faster. OpenVINO's cross-platform support, especially on macOS with Apple silicon, is a significant feature.
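
The speed-up this reviewer gets from the quantizer comes from mapping FP32 values to 8-bit integers. This is not OpenVINO's implementation (its NNCF tooling handles this internally); it is just a pure-Python sketch of the underlying affine quantization arithmetic:

```python
def quantize_int8(values, lo, hi):
    """Affine-quantize floats in [lo, hi] to unsigned 8-bit.

    Returns (quantized ints, scale, zero_point)."""
    scale = (hi - lo) / 255.0
    zero_point = round(-lo / scale)
    q = [max(0, min(255, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point


def dequantize_int8(q, scale, zero_point):
    """Map quantized ints back to approximate floats."""
    return [(qi - zero_point) * scale for qi in q]


weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
q, scale, zp = quantize_int8(weights, -1.0, 1.0)
restored = dequantize_int8(q, scale, zp)
# The round-trip error is bounded by about half the quantization step (scale / 2)
```
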
Buyer's Guide
OpenVINO
August 2025
Learn what your peers think about OpenVINO. Get advice and tips from experienced pros sharing their opinions. Updated: August 2025.
867,445 professionals have used our research since 2012.
IJ
Embedded & Robotics Software Developer at Unemployed

The only benefit I have seen from using OpenVINO is the GPU performance boost on the Raspberry Pi.

reviewer1530384 - PeerSpot reviewer
Systems and Solutions Architect at a tech services company with 1,001-5,000 employees

The solution's ability to stream data directly from camera inputs is the most valuable aspect for us. 

It's tailored to the Movidius chipset, which makes it a nice package. You don't have to run it on the Movidius; it runs on x86 as well. However, we'd like to use it with our Movidius-based co-processor. 

The ease of integration is fantastic. The option to run it on x86 alone, or on x86 plus an Intel accelerator, is great.

It's an open-source solution.

The initial setup is quite simple.
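
The camera-streaming setup described above could be sketched as follows, using OpenCV for capture (a hedged sketch, assuming `opencv-python` and `openvino` are installed; the model path and preprocessing are placeholders, not this reviewer's actual pipeline):

```python
def stream_camera_inference(model_path: str, device: str = "CPU", max_frames: int = 100):
    """Read frames from the default camera and run each through an OpenVINO model."""
    import cv2             # assumes opencv-python is installed
    import openvino as ov  # assumes the openvino package is installed

    core = ov.Core()
    compiled = core.compile_model(model_path, device_name=device)

    cap = cv2.VideoCapture(0)  # default camera
    try:
        for _ in range(max_frames):
            ok, frame = cap.read()
            if not ok:
                break
            # Placeholder preprocessing: a real model needs resizing,
            # layout conversion (HWC -> NCHW), and dtype changes first
            results = compiled([frame])
            # ... consume `results` here ...
    finally:
        cap.release()
```
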

ZM
Machine Learning Software Developer at freelancer

The inferencing and processing capabilities are quite beneficial for our requirements.

Compared to the Jetson Nano, Jetson TX2, or Jetson Xavier, OpenVINO is a much more cost-effective solution. Processing-wise, it is comparable to the Jetson boards, perhaps up to the Jetson Xavier NX.

CR
Freelance Engineer at Autónomo

The features for model comparison, model testing, evaluation, and deployment are very nice. It can work with almost all models. 
