The ChatGPT API offers significantly reduced pricing for developers seeking powerful language models. Compared to the text-davinci-003 model, the ChatGPT API is 10 times less expensive (a 90% reduction), making it far more accessible to developers integrating natural language processing into their applications.
ChatGPT Plus API pricing
ChatGPT Plus is available as a monthly subscription for $20, excluding taxes. It is unclear whether OpenAI will maintain this price in the future.
Note that, unlike the API, ChatGPT Plus currently limits GPT-4 usage to 25 messages every 3 hours.
ChatGPT Plus vs. ChatGPT API pricing comparison
With a budget of $20USD, you can process about 7.5 million words through the ChatGPT API, which is the equivalent of ~10 King James Bibles. To estimate the number of tokens in a text, you can use the OpenAI Tokenizer.
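The budget-to-words estimate above can be reproduced with a few lines of arithmetic. The figures below are assumptions stated in this article: the ChatGPT API rate of $0.002 per 1K tokens and the rough rule of thumb that 1,000 tokens is about 750 words.

```python
# Rough estimate of how much text a fixed budget buys on the ChatGPT API.
# Assumed rate: $0.002 per 1K tokens; ~750 words per 1,000 tokens.
BUDGET_USD = 20.0
PRICE_PER_1K_TOKENS = 0.002
WORDS_PER_1K_TOKENS = 750

tokens = BUDGET_USD / PRICE_PER_1K_TOKENS * 1000  # total tokens the budget buys
words = tokens / 1000 * WORDS_PER_1K_TOKENS

# About 10,000,000 tokens, i.e. roughly 7,500,000 words
print(f"{tokens:,.0f} tokens ≈ {words:,.0f} words")
```

For real text, the 750-words-per-1K-tokens ratio is only an approximation; the OpenAI Tokenizer gives an exact token count for a given input.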
Why is GPT-4 not free?
Running GPT-4 involves substantial costs for hardware, model updates, new features, and hosting; by one estimate, operating ChatGPT costs OpenAI about $700,000 per day. Charging for the GPT-4 language model is therefore essential for the company to cover these expenses.
GPT-4 features and capabilities
- Reliable and accurate: GPT-4 is more reliable and accurate than other OpenAI models.
- Human-level performance: GPT-4 is capable of human-level performance on several academic and professional benchmarks.
- Handling task complexity: once a task's complexity passes a sufficient threshold, GPT-4 outperforms earlier models on difficult tasks.
- Visual inputs: GPT-4 can accept image and text prompts, unlike previous models.
GPT-4 API Pricing Analysis
If you access GPT-4 via ChatGPT Plus, you pay the $20 monthly subscription. If you access GPT-4 via the API, there are two pricing options, based on context length. OpenAI has reduced the price of prompt tokens to make GPT-4 easier to adopt.
Costs for models with 8K context lengths (gpt-4 and gpt-4-0314):
- Prompt tokens cost $0.03/1k
- Completion tokens $0.06/1k
Costs for models with context lengths of 32K (gpt-4-32k and gpt-4-32k-0314):
- Prompt tokens cost $0.06/1k
- Completion tokens cost $0.12/1k
*1,000 tokens correspond to about 750 words.
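Putting the two price tiers above into a small calculator makes per-request costs concrete. The prices are the ones listed in this article; the example request sizes are hypothetical.

```python
# Cost of a single GPT-4 API call, from the per-1K-token prices above.
# "8k" and "32k" are the two context-length tiers.
PRICES = {
    "8k":  {"prompt": 0.03, "completion": 0.06},
    "32k": {"prompt": 0.06, "completion": 0.12},
}

def request_cost(prompt_tokens: int, completion_tokens: int, context: str = "8k") -> float:
    """USD cost of one request with the given token counts."""
    p = PRICES[context]
    return (prompt_tokens / 1000) * p["prompt"] + (completion_tokens / 1000) * p["completion"]

# e.g. a 1,500-token prompt with a 500-token answer on the 8K model:
print(f"${request_cost(1500, 500):.3f}")  # $0.075
```

Note that completion tokens cost twice as much as prompt tokens at both tiers, so verbose outputs dominate the bill for long answers.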
Compared to GPT-3.5, the GPT-4 API has doubled the maximum context length of tokens from 4096 to 8192. The default throughput limits for the GPT-4 API are 40,000 tokens per minute and 200 requests per minute.
To join the GPT-4 API waiting list, click here.
Cost comparison between the GPT-4 API and the ChatGPT API
With a budget of $20, you can process about 444,000 tokens (or 333,000 words) with the GPT-4 API, assuming an equal split of prompt and completion tokens. However, the most relevant comparison is between the GPT-4 API and the GPT-3.5 API:
- Prompt tokens on GPT-4 are 14 times more expensive than on the GPT-3.5 API
- Completion tokens on GPT-4 are 29 times more expensive than on the GPT-3.5 API
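The multiples in the list above follow directly from the per-token prices. Assuming the GPT-3.5 (gpt-3.5-turbo) rate of $0.002 per 1K tokens for both prompts and completions, and the GPT-4 8K rates listed earlier, the price ratios are 15x and 30x, i.e. 14 and 29 times *more* expensive:

```python
# Price ratios between GPT-4 (8K context) and GPT-3.5 per 1K tokens.
# Assumed GPT-3.5 rate: $0.002 for both prompt and completion tokens.
GPT35 = 0.002
GPT4_PROMPT, GPT4_COMPLETION = 0.03, 0.06

print(f"prompts:     {GPT4_PROMPT / GPT35:.0f}x the GPT-3.5 price")      # 15x
print(f"completions: {GPT4_COMPLETION / GPT35:.0f}x the GPT-3.5 price")  # 30x
```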
The question to ask is whether the GPT-4 API adds 14 to 29 times the value of the GPT-3.5 API for your use case. If you work in the legal field or education/tutoring in general, it probably makes sense to test GPT-4. For other use cases, extensive testing will be required to determine if the additional costs are justified over the ChatGPT API.
The choice between the ChatGPT API and the GPT-4 API will depend on a careful analysis of your project's needs and constraints. When evaluating these technologies, consider key factors such as:
- the intended application
- desired accuracy
- ethical considerations
- cost implications
- adaptability to future developments
Your final decision will reflect your vision and commitment to harnessing the revolutionary potential of artificial intelligence.