OpenAI (ChatGPT) CEO Unveils His Plan for the Future


ChatGPT may cost up to $3M per month

OpenAI CEO Sam Altman announced on Twitter that ChatGPT, the AI assistant based on the GPT-3.5 architecture for natural language interaction with humans, had reached one million users just a few days after its launch.

The chatbot has been well-received for its high-quality, longer output and improved ability to follow instructions, surpassing the performance of previous models.

The AI assistant may cost $100K per day, or $3M per month, to run, according to a tweet by Tom Goldstein. Here is the reasoning behind that estimate:

  • A 3-billion parameter model can generate a token in about 6 ms on an A100 GPU (using half precision + TensorRT + activation caching). If we scale that up to the size of ChatGPT, it should take about 350 ms for an A100 GPU to print out a single word.
  • Of course, you could never fit ChatGPT on a single GPU. You would need five 80 GB A100 GPUs just to load the model and text. ChatGPT cranks out about 15-20 words per second. If it uses A100s, that could be done on an 8-GPU server (a likely choice on Azure cloud).
  • So what would this cost to host? On Azure cloud, each A100 card costs about $3 an hour. That's $0.0003 per word generated. But it generates a lot of words! The model usually responds to my queries with ~30 words, which adds up to about 1 cent per query.
  • ChatGPT acquired 1M users within its first 5 days of operation. If the average user makes 10 queries per day, it's reasonable to estimate that ChatGPT serves ~10M queries per day.
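The estimate above can be sketched in a few lines of Python. All figures are assumptions from Goldstein's thread, not measured values; the 175B parameter count is my own guess, chosen only because it makes the linear scaling from 6 ms land on his 350 ms figure. Note that the thread prices each word at a single card's hourly rate, and rounds its final numbers up to ~1 cent per query and ~$100K per day.

```python
# Back-of-envelope estimate of ChatGPT's hosting cost, following the
# reasoning in Tom Goldstein's tweet. Figures are his assumptions;
# the 175B parameter count is a hypothetical chosen to match his math.

# Step 1: scale single-word latency linearly with model size.
SMALL_MODEL_PARAMS_B = 3        # 3B-parameter reference model
SMALL_MODEL_LATENCY_S = 0.006   # ~6 ms per token on one A100
CHATGPT_PARAMS_B = 175          # assumed ChatGPT size (hypothetical)

latency_per_word_s = SMALL_MODEL_LATENCY_S * (CHATGPT_PARAMS_B / SMALL_MODEL_PARAMS_B)

# Step 2: price that latency at Azure's per-card rate (~$3/hour per A100),
# as the tweet does.
COST_PER_GPU_HOUR = 3.0
cost_per_word = latency_per_word_s * COST_PER_GPU_HOUR / 3600

# Step 3: scale up to per-query and daily traffic.
WORDS_PER_QUERY = 30            # typical response length in the thread
QUERIES_PER_DAY = 10_000_000    # ~1M users x 10 queries/day

cost_per_query = cost_per_word * WORDS_PER_QUERY
cost_per_day = cost_per_query * QUERIES_PER_DAY

print(f"latency per word: {latency_per_word_s * 1000:.0f} ms")
print(f"cost per word:    ${cost_per_word:.5f}")
print(f"cost per query:   ${cost_per_query:.4f}")
print(f"cost per day:     ${cost_per_day:,.0f}")
```

Running this gives roughly $0.0003 per word and a bit under $100K per day, i.e. the same ballpark as the tweet once its rounding is taken into account.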

What's the (business) plan?

Given such costs, the founding company will surely think about monetizing its AI platform. The question is: how? Well, back in 2019, Sam Altman had already unveiled his plan to monetize the company, and it consists of two simple steps:

  • Bringing an AI to perfection
  • Delegating the mission of finding the right business plan to the AI itself.

Interesting, isn’t it?

What’s interesting is that Sam Altman had no concrete idea at the time of how to monetize the platform.

How Can OpenAI Make Revenue From ChatGPT, According to ChatGPT

Naturally, I asked the AI itself to find the right model for making revenue from ChatGPT. This is the answer:

There are several ways that OpenAI could potentially make revenue using its ChatGPT artificial intelligence (AI) and get a return on investment. Some possible options could include:

  1. Licensing: OpenAI could license its ChatGPT technology to other companies or organizations that want to use it for their own purposes, such as building chatbots or other conversational AI applications.
  2. Service-based model: OpenAI could offer ChatGPT as a service, charging a fee on a subscription basis for access to the technology.
  3. Consultancy: OpenAI could offer consulting services to companies that want to use ChatGPT or other AI technologies, helping them to understand how to implement and use the technology effectively.
  4. Partnerships: OpenAI could form partnerships with other companies or organizations to jointly develop and sell products or services using ChatGPT or other AI technologies.

Ultimately, the specific revenue model that OpenAI chooses to use for its ChatGPT technology will depend on a variety of factors, including the market demand for the technology and the competition in the space. It may be helpful for the company to consider testing and evaluating different revenue models to determine which ones are most effective for generating a return on investment.


So, these answers are generic and could apply to almost any business, not just ChatGPT; but it’s possible to dive deeper into each model with more specific questions and push ChatGPT to give more concrete answers.

What Does the Future Hold for Artificial Intelligence?

When I ask myself this question, I think of the debate between Jack Ma and Elon Musk over the future of AI and intelligent systems.

👉 Alibaba co-founder and executive chairman, Jack Ma, believes that AI will play a role in augmenting human intelligence without interfering with humans.

👉 Tesla CEO, Elon Musk, believes the future of AI is the singularity, as it could outsmart humans. “The computer will just get impatient if nothing else. It will be like talking to a tree,” Musk said.

Regardless of what the future holds, it is clear that AI will continue to play a significant role in many aspects of our lives, and it will be important for society to carefully consider the ethical and societal implications of this technology as it continues to advance.

Connect Deeper

If you're interested in ChatGPT, OpenAI, and AI/ML in general, you should join my coming-soon AI/ML Weekly Newsletter! You will not regret it.

You can also check my GitHub repository, Awesome ChatGPT: A curated list of awesome ChatGPT resources, libraries, SDKs, APIs, and more.


Aymen El Amri (@eon01), Founder of FAUN (www.faun.dev) / Startup Advisor / Indie Maker / Tech Author / DevOps / SRE / Architect