
The cost of running ChatGPT is $100,000 per day 

 

OpenAI’s ChatGPT: Here’s how much it costs to run per day, and other interesting facts

ChatGPT is an AI-powered chatbot capable of holding natural, interactive conversations. Developed by OpenAI, it is known for its human-like responses, and the best part is that it is available for free. But while you can access ChatGPT for free, OpenAI is spending a lot of money to keep it up and running.

Here are some lesser-known facts about OpenAI’s ChatGPT that give a sense of what it takes to develop and run an AI service.

 

The cost of running ChatGPT is $100,000 per day 

According to the analysis, ChatGPT is hosted on Microsoft’s Azure cloud, so OpenAI doesn’t have to buy and set up a physical server room. At current rates, Microsoft charges about $3 an hour for a single A100 GPU, which works out to roughly $0.0003 for each word ChatGPT generates.

A response from ChatGPT usually contains at least 30 words, so a single response costs the company at least 1 cent. Overall, OpenAI is estimated to be spending at least $100K per day, or $3 million per month, on running costs.
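For readers who want to sanity-check those numbers, here is a back-of-the-envelope sketch in Python. The per-word cost and response length are the estimates above; the daily response volume is a hypothetical figure chosen only to show how the total reaches roughly $100K.

```python
# Back-of-the-envelope sketch of the per-response and daily cost estimates above.
COST_PER_WORD = 0.0003        # USD per generated word (the article's estimate)
WORDS_PER_RESPONSE = 30       # the article's minimum response length

cost_per_response = COST_PER_WORD * WORDS_PER_RESPONSE
print(f"Cost per response: ~${cost_per_response:.3f}")   # ~$0.009, roughly 1 cent

# Hypothetical daily volume, chosen only to show how the quoted figure adds up.
responses_per_day = 11_000_000
print(f"Estimated daily cost: ~${cost_per_response * responses_per_day:,.0f}")  # ~$99,000
```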

A single ChatGPT query uses at least 8 GPUs

According to Goldstein, an Associate Professor at the University of Maryland, a single NVIDIA A100 GPU can run a 3-billion-parameter model in about 6ms. At that speed, a single A100 GPU would take around 350ms to generate just one word for ChatGPT.

Given that ChatGPT’s latest version, GPT-3.5, has over 175 billion parameters, it needs at least five A100 GPUs just to load the model and the text for a single query. And since ChatGPT outputs around 15-20 words per second, serving it requires a machine with at least eight A100 GPUs.
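Here is a rough sketch of where those GPU counts come from. The 16-bit weights and 80 GB of memory per A100 are assumptions of mine, not figures stated in the analysis above.

```python
# Rough arithmetic behind the GPU counts above. Assumptions (mine): fp16 weights
# and 80 GB of memory per A100; the 350 ms/word figure is the estimate quoted above.
PARAMS = 175e9                      # GPT-3.5-class parameter count
BYTES_PER_PARAM = 2                 # fp16 weights
A100_MEMORY_GB = 80

model_size_gb = PARAMS * BYTES_PER_PARAM / 1e9
print(f"Model weights: ~{model_size_gb:.0f} GB "
      f"-> ~{model_size_gb / A100_MEMORY_GB:.1f} A100s just to hold them")
# ~350 GB -> ~4.4 GPUs, i.e. at least five A100s

MS_PER_WORD_ON_ONE_GPU = 350        # single-GPU latency quoted above
GPUS = 8
# Assuming near-linear speedup across GPUs:
words_per_second = 1000 / (MS_PER_WORD_ON_ONE_GPU / GPUS)
print(f"With {GPUS} A100s: ~{words_per_second:.0f} words/second")
# ~23 words/second, in the ballpark of the observed 15-20 words/second
```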

ChatGPT doesn’t have answers to all your questions

While ChatGPT is currently the most capable AI chatbot, it was trained on data collected up to 2021. Hence, it may not be able to give you accurate responses to every query.

ChatGPT has over one million users

 

Within a few days of its official launch, ChatGPT crossed 1 million users. While most of these might not be active users, the company has clearly managed to attract a lot of users in a short time. However, it will have to do a lot more to retain those users and turn ChatGPT into a profitable AI tool.

 

In the rest of this article we’ll cover the different types of chatbot technology: linguistic (rule-based), machine learning, and hybrid approaches. We’ll also look at chatbot development and integrations.

Chatbot Technology

The majority of chatbot development tools today are based on two main types of chatbot: linguistic (rule-based) models or machine learning (AI) models.

Linguistic Based (Rule-Based Chatbots)

Linguistic-based chatbots, sometimes referred to as ‘rule-based’, deliver the fine-tuned control and flexibility that machine learning chatbots lack. It’s possible to work out in advance what the correct answer to a question is and to design automated tests that check the quality and consistency of the system.

Rule-based chatbots use if/then logic to create conversational flows.

Language conditions can be created to look at the words, their order, synonyms, common ways of phrasing a question and more, to ensure that questions with the same meaning receive the same answer. If the bot misunderstands something, a human can fine-tune the conditions directly, as in the simple sketch below.
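As a rough illustration (not any particular vendor’s implementation), a rule-based bot can be as simple as a list of keyword and synonym conditions mapped to canned answers:

```python
# Hypothetical minimal sketch of a rule-based chatbot: hand-written keyword and
# synonym conditions map a question to a canned answer, so questions with the
# same meaning get the same answer, and a human can adjust the rules directly.

RULES = [
    # ({trigger words and synonyms}, answer)
    ({"price", "cost", "pricing", "fee"}, "Our basic plan starts at $10 per month."),
    ({"hours", "open", "opening"}, "We are open Monday to Friday, 9am to 5pm."),
    ({"refund", "return", "reimbursement"}, "You can request a refund within 30 days."),
]

FALLBACK = "Sorry, I didn't understand that. Could you rephrase your question?"

def reply(message: str) -> str:
    words = set(message.lower().replace("?", "").split())
    for triggers, answer in RULES:
        if words & triggers:            # any trigger word or synonym present
            return answer
    return FALLBACK

print(reply("How much does it cost?"))        # pricing rule fires
print(reply("What are your opening hours?"))  # hours rule fires
print(reply("Tell me a joke"))                # no rule matches -> fallback
```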

However, chatbots based on a purely linguistic model can be rigid and slow to develop, due to this highly labor-intensive approach.

Though these bots use Natural Language Processing, interactions with them are quite specific and structured. They tend to resemble interactive FAQs, and their capabilities are basic.

These are the most common type of bot, and many of us have likely interacted with one, whether on a live chat, through an e-commerce website, or on Facebook Messenger.

Machine Learning (AI Chatbots)

Chatbots powered by AI software are more complex than rule-based chatbots and tend to be more conversational, data-driven and predictive.

These chatbots are generally more sophisticated, interactive and personalized than task-oriented chatbots. Over time, as they gather data, they become more contextually aware, leveraging natural language understanding and applying predictive intelligence to personalize a user’s experience.

Conversational systems based on machine learning can be impressive when the problem at hand is well-matched to their capabilities. By their nature, they learn from patterns and previous experiences, as illustrated in the toy example below.
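As a toy illustration of the difference, here is a minimal intent classifier built with scikit-learn. The messages and intent labels are invented for the example, and real systems need vastly more data.

```python
# A toy, hypothetical illustration of the machine-learning approach: a statistical
# model learns to map messages to intents from labelled examples instead of
# hand-written rules (scikit-learn is used here purely for illustration).

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny training set; real systems need far more labelled data (see below).
train_messages = [
    "how much does it cost", "what is the price", "is there a fee",
    "when are you open", "what are your opening hours", "are you open on sunday",
    "i want my money back", "how do i get a refund", "can i return this item",
]
train_intents = [
    "pricing", "pricing", "pricing",
    "hours", "hours", "hours",
    "refund", "refund", "refund",
]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(train_messages, train_intents)

# Generalizes to phrasings it never saw, within the limits of its training data.
print(model.predict(["how much will the plan cost"]))   # likely 'pricing'
print(model.predict(["do you open on weekends"]))       # likely 'hours'
```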

But to perform even at the most rudimentary level, such systems often require staggering amounts of training data and highly skilled human specialists. In addition, a machine learning chatbot functions as a black box: if something goes wrong with the model, it can be hard to intervene, let alone optimize and improve it.

The resources required, combined with the very narrow range of scenarios in which statistical algorithms are truly excellent, makes purely machine learning-based chatbots an impractical choice for many enterprises.

Hybrid Model — The Ultimate Chatbot Experience

While linguistic and machine learning models have a place in developing some types of conversational systems, taking a hybrid approach offers the best of both worlds, and offers the ability to deliver more complex conversational AI chatbot solutions.

A hybrid approach has several key advantages over both alternatives. Compared with machine learning methods, it allows conversational systems to be built even without data, provides transparency in how the system operates, enables business users to understand the application, and ensures a consistent personality whose behavior stays in line with business expectations.

At the same time, it allows machine learning integrations to go beyond the realm of linguistic rules and make smart, complex inferences in areas where a linguistics-only approach would be difficult or even impossible to build. When a hybrid approach is delivered at a native level, statistical algorithms can be embedded alongside the linguistic conditioning and maintained in the same visual interface, as in the sketch below.
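A minimal sketch of that layering, assuming a simple rule dictionary backed by an intent classifier with a confidence threshold (the names, answers and threshold here are invented for illustration):

```python
# Hypothetical sketch of the hybrid idea: deterministic linguistic rules answer
# the questions the business must handle consistently, and a statistical intent
# classifier (with a confidence threshold) covers phrasings the rules miss.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# 1. Linguistic layer: exact, auditable phrase rules.
RULES = {
    "reset password": "You can reset your password at example.com/reset.",
    "cancel subscription": "Go to Settings > Billing > Cancel plan to cancel.",
}

# 2. Statistical layer: an intent classifier trained on labelled examples.
train_messages = [
    "how much does it cost", "what is the price",
    "when are you open", "what are your opening hours",
    "how do i get a refund", "i want my money back",
]
train_intents = ["pricing", "pricing", "hours", "hours", "refund", "refund"]
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
classifier.fit(train_messages, train_intents)

INTENT_ANSWERS = {
    "pricing": "Our basic plan starts at $10 per month.",
    "hours": "We are open Monday to Friday, 9am to 5pm.",
    "refund": "You can request a refund within 30 days.",
}

def reply(message: str, threshold: float = 0.4) -> str:
    text = message.lower()
    for phrase, answer in RULES.items():          # rules run first
        if phrase in text:
            return answer
    probs = classifier.predict_proba([text])[0]   # then the ML fallback
    best = probs.argmax()
    if probs[best] >= threshold:
        return INTENT_ANSWERS[classifier.classes_[best]]
    return "Sorry, I'm not sure about that. Let me connect you to a human agent."

print(reply("I need to reset password please"))   # handled by the rule layer
print(reply("how much will it cost me"))          # likely handled by the ML layer
print(reply("tell me a joke"))                    # likely handed off to a human
```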

Building conversational applications using only linguistic or machine learning methods is hard, resource intensive and frequently prohibitively expensive. By taking a hybrid approach, enterprises have the muscle, flexibility and speed required to develop business-relevant AI applications that can make a difference to the customer experience and the bottom line.



