ChatGPT Enterprise Released for Lucky Big Companies

By Daniel Detlaf

One-man flea circus, writer, sci-fi nerd, news junkie and AI tinkerer.

Pssst. Would you like a quick weekly dose of AI news, tools and tips to your inbox? Sign up for our newsletter, AIn't Got The Time.

OpenAI recently announced the release of their latest LLM product, ChatGPT Enterprise. The business offering promises encrypted transfer of information, unlimited access to the 32k-context GPT-4 model, and an administrative console to help manage your users and API keys. ChatGPT Enterprise also touts Code Interpreter, recently rebranded as “Advanced Data Analysis,” as a headline feature.

Pricing for the new service wasn’t disclosed; prospective customers are asked to contact sales. The Servitor jumped right in line, of course. Unfortunately, it looks like a waiting game:

“Hello,

We’ve received your message and we’re excited to be a part of your journey!

If you contacted us about ChatGPT Enterprise:

We are currently working through the high volume interest we’ve received in ChatGPT Enterprise. We are working to get back to companies as quickly as possible.”

Email autoresponse from OpenAI sales.

We are not, properly speaking, an enterprise – but hey, they said there would be a small business version too!

The release of ChatGPT Enterprise is clearly intended to address the gap between individual ChatGPT users and the larger companies and development teams using the API to build AI applications.

By default, data that users send to ChatGPT can be used to train OpenAI’s models. Users can opt out via a toggle in the settings, but that toggle also disables conversation history, a handy feature. Having your data used for training is obviously a big concern if you don’t want your business secrets and plans spilled. Who knows when some future version of ChatGPT might regurgitate something you had it work on? In contrast, data sent over the API (or via the OpenAI playground) is not used for training.
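To make the distinction concrete, here is a minimal sketch of an API call using the openai Python package (the pre-1.0 interface current at the time of writing). The model choice and prompt are placeholders; traffic sent this way falls under the API data policy rather than the consumer ChatGPT defaults.

```python
import os
import openai  # openai<1.0 interface, current at the time of writing

# The key is read from an environment variable here. Unlike default
# ChatGPT usage, requests made through the API are not used for training.
openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-4",  # swap in "gpt-4-32k" if your account has access
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize our Q3 launch plan in three bullets."},
    ],
    temperature=0.2,
)

print(response["choices"][0]["message"]["content"])
```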

This has led to a growing ecosystem of secure LLM service providers, largely running open-source models to offer “talk to your data” services: the kind where you upload your documents and then query them through a large language model. ChatGPT Enterprise has the potential to squash these budding businesses by offering a secure option for what is still (sorry, Claude 2) the flagship LLM of the current AI mania: GPT-4.
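If the pattern is unfamiliar, here is a rough sketch of how a “talk to your data” service works under the hood: retrieve the most relevant document chunk by embedding similarity, then hand it to the chat model as context. For brevity it uses OpenAI’s embedding and chat endpoints (openai<1.0 interface); a privacy-focused provider would swap in open-source models, and the document chunks and question are hypothetical.

```python
import numpy as np
import openai  # assumes openai<1.0 and an API key already configured

# Hypothetical document chunks a customer might upload.
chunks = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Enterprise invoices are issued net-60 at the start of each quarter.",
    "Support tickets are triaged within one business day.",
]

def embed(texts):
    """Embed a list of strings with OpenAI's ada-002 embedding model."""
    resp = openai.Embedding.create(model="text-embedding-ada-002", input=texts)
    return np.array([item["embedding"] for item in resp["data"]])

chunk_vectors = embed(chunks)

question = "How fast do you respond to support requests?"
q_vector = embed([question])[0]

# Cosine similarity: pick the chunk closest to the question.
scores = chunk_vectors @ q_vector / (
    np.linalg.norm(chunk_vectors, axis=1) * np.linalg.norm(q_vector)
)
best_chunk = chunks[int(np.argmax(scores))]

# Feed the retrieved context plus the question to the chat model.
answer = openai.ChatCompletion.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context: {best_chunk}\n\nQuestion: {question}"},
    ],
)
print(answer["choices"][0]["message"]["content"])
```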

The unlimited 32k-context access is of particular note because 32k context is still thin on the ground. Developers wanting access to GPT-4-32k via the API (us!) have to join a waitlist; The Servitor has been on it for two or three months. In the meantime, GPT-4-32k can be accessed on Poe.com.
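For a feel of what 32k tokens buys you, a quick check with OpenAI’s tiktoken tokenizer shows whether a document fits in the 8k or 32k window. The file name and the headroom figure below are just placeholders.

```python
import tiktoken  # OpenAI's tokenizer library

# cl100k_base is the encoding used by the GPT-4 family.
enc = tiktoken.get_encoding("cl100k_base")

# Hypothetical document you would like the model to read in one shot.
with open("quarterly_report.txt", encoding="utf-8") as f:
    text = f.read()

n_tokens = len(enc.encode(text))
print(f"Document length: {n_tokens} tokens")

# Rough capacity check against the two GPT-4 context windows,
# leaving arbitrary headroom for the prompt and the model's reply.
for model, window in [("gpt-4", 8192), ("gpt-4-32k", 32768)]:
    fits = n_tokens < window - 1000
    print(f"{model}: {'fits' if fits else 'needs chunking'} ({window}-token window)")
```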
