The solution's pricing depends on the services you deploy. The ChatGPT service is priced by the number of tokens or requests, while machine learning services come at a different price. Azure OpenAI doesn't have a fixed price; the pricing depends on the services you deploy, the amount of data you push, and the endpoint output. For example, if you increase a virtual machine's memory, its cost increases. Azure OpenAI shows you the cost based on the services you use.
The pricing is similar. It's a token-based system, so you pay per token used by the model. The cost itself is comparable to other service providers'; it's not that one charges significantly more. It's mainly the cost of the language model and the tokens we use. So, from a pricing perspective, it's comparable, but we can configure token usage in OpenAI to potentially save costs. With Microsoft, we have an enterprise agreement covering various products, including Office 365, Azure Cloud, and Windows. With IBM and Watson XS, we have separate data insights pricing deals.
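As a rough sketch of how token-based billing accrues per request (the per-1K-token rates below are illustrative placeholders, not actual Azure OpenAI prices, and input and output tokens are assumed to be billed at different rates):

```python
# Hypothetical per-1K-token prices -- assumptions for illustration only;
# check the current Azure OpenAI pricing page for real rates.
PRICE_PER_1K_INPUT = 0.0015   # USD per 1,000 prompt tokens
PRICE_PER_1K_OUTPUT = 0.002   # USD per 1,000 completion tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimate the charge for one request under token-based pricing."""
    return ((input_tokens / 1000) * PRICE_PER_1K_INPUT
            + (output_tokens / 1000) * PRICE_PER_1K_OUTPUT)

# A request with 2,000 prompt tokens and 500 completion tokens:
cost = estimate_cost(2000, 500)  # 0.003 + 0.001 = 0.004 USD
```

This is also where the reviewer's point about configuring token usage comes in: capping the maximum completion length directly bounds the output-token term of the formula.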
Head of IT at a manufacturing company with 1,001-5,000 employees
Real User
Top 5
Dec 11, 2023
The cost is pretty high. So, hopefully, once the Turbo model reaches general availability in the market, my cost will come down. But as of now, the cost is pretty high. Even by US standards, you would find it high.
The cost structure depends on the volume of data processed and the computational resources required. There might be additional costs for private cloud usage and security considerations.
The Azure OpenAI service provides REST API access to OpenAI's powerful language models, including the GPT-3, Codex, and Embeddings model series. These models can be easily adapted to your specific task, including but not limited to content generation, summarization, semantic search, and natural language to code translation. Users can access the service through REST APIs, the Python SDK, or the web-based interface in Azure OpenAI Studio.
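The REST access described above follows a predictable endpoint shape: the Azure resource name and the deployment name are part of the URL, and an `api-version` query parameter selects the API revision. A minimal sketch of constructing that URL (the resource name, deployment name, and version string here are example values, not real credentials or endpoints):

```python
def azure_openai_url(resource: str, deployment: str, api_version: str) -> str:
    """Build the Azure OpenAI REST endpoint for a chat completions call.

    `resource` is the Azure OpenAI resource name and `deployment` is the
    model deployment name you chose when deploying -- both hypothetical here.
    """
    return (f"https://{resource}.openai.azure.com/openai/"
            f"deployments/{deployment}/chat/completions"
            f"?api-version={api_version}")

url = azure_openai_url("my-resource", "my-gpt-deployment", "2023-05-15")
```

An actual call would send this URL a POST request with an `api-key` header and a JSON body of messages; the per-deployment URL is also why pricing is tracked per deployed service, as the reviews note.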
Depending on the negotiations and the contract, payments to use the solution are made either monthly or yearly.
The pricing is acceptable, and it's delivering good value for the results and outcomes we need.