- author: All About AI
OpenAI Updates API Functions and Models: Longer Context and Lowered Prices
OpenAI has recently updated their API and function calls, bringing exciting news for AI enthusiasts and developers. The updates include more steerable API models with function calling capabilities, longer context, and of course lowered prices. In this article, we'll dive into the details of the recent updates and take a look at some examples and use cases.
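To make the function calling update concrete, here is a minimal sketch using the openai Python library as it was at the time of these updates (pre-1.0 style). The get_current_weather schema is a hypothetical example of our own, not something from OpenAI's announcement.

```python
import json
import openai  # pre-1.0 SDK style; assumes OPENAI_API_KEY is set in the environment

# A hypothetical function schema the model can choose to "call".
functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name, e.g. Oslo"},
            },
            "required": ["city"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",  # June 2023 snapshot with function calling
    messages=[{"role": "user", "content": "What's the weather in Oslo?"}],
    functions=functions,
    function_call="auto",  # let the model decide whether to call the function
)

message = response.choices[0].message
if message.get("function_call"):
    # The model returns the function name and JSON-encoded arguments;
    # your own code is responsible for actually executing the function.
    args = json.loads(message["function_call"]["arguments"])
    print("Model wants to call:", message["function_call"]["name"], args)
```

The model never runs the function itself; it only returns a structured request, which keeps your application in control of what actually gets executed.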
Updates to GPT-4 and GPT-3.5 Turbo
OpenAI has released updated and more stable versions of GPT-4 and GPT-3.5 Turbo. The updates bring a vast range of benefits to developers who use these models. A new 16k-context version of GPT-3.5 Turbo opens up new opportunities for developers: it offers four times the context length at twice the price.
Cost Reductions for Embeddings
The cost of the embeddings model has been reduced by 75%. Additionally, the price of input tokens for GPT-3.5 Turbo has also been reduced, making it more accessible for developers to use this model.
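As a quick illustration of the embeddings endpoint whose price dropped, here is a minimal sketch (again pre-1.0 openai SDK); text-embedding-ada-002 was OpenAI's standard embeddings model at the time, and the sample sentences are placeholders.

```python
import openai  # assumes OPENAI_API_KEY is set in the environment

# Embed a couple of short texts with the ada-002 embeddings model.
resp = openai.Embedding.create(
    model="text-embedding-ada-002",
    input=[
        "OpenAI lowered embedding prices",
        "Longer context windows for GPT-3.5 Turbo",
    ],
)

# Each item contains a 1536-dimensional embedding vector.
for item in resp["data"]:
    print(item["index"], len(item["embedding"]))
```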
Deprecation Timeline for GPT-4 and GPT-3.5
There is now a deprecation timeline available for the older GPT-4 and GPT-3.5 Turbo model versions. This gives developers a clear schedule for when the initial snapshots will be retired, so applications can be migrated to the newer, more capable versions in time.
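In practice, the timeline mainly affects code that pins an older model snapshot. A hedged sketch of the migration might look like this, using the snapshot names from OpenAI's June 2023 announcement:

```python
import openai  # assumes OPENAI_API_KEY is set in the environment

# Older snapshots such as gpt-3.5-turbo-0301 and gpt-4-0314 are on a
# deprecation schedule; new code should target the newer snapshots.
OLD_MODEL = "gpt-3.5-turbo-0301"   # scheduled for retirement
NEW_MODEL = "gpt-3.5-turbo-0613"   # updated June 2023 snapshot

response = openai.ChatCompletion.create(
    model=NEW_MODEL,
    messages=[{"role": "user", "content": "Summarize the latest API updates."}],
)
print(response.choices[0].message["content"])
```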
New GPT-4 32k Version
The new GPT-4 32k version of the API is now being rolled out to developers. With a 32k-token context window, it offers an even higher level of capability, providing even more opportunities for developers to work with AI technology.
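Since access to GPT-4 32k was rolled out gradually, a reasonable first step is to check whether your API key can see the model at all; a minimal sketch (pre-1.0 SDK) might look like this.

```python
import openai  # assumes OPENAI_API_KEY is set in the environment

# List the models available to this API key and check for gpt-4-32k.
available = {m["id"] for m in openai.Model.list()["data"]}

if "gpt-4-32k" in available:
    response = openai.ChatCompletion.create(
        model="gpt-4-32k",
        messages=[{"role": "user", "content": "Hello from a 32k-context model!"}],
    )
    print(response.choices[0].message["content"])
else:
    print("gpt-4-32k is not enabled for this API key yet.")
```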
What is the 16k context window?
The 16k context window is a new feature that offers four times the context length at twice the price. This addition means that developers can work with a lot more data within the context window. This feature is game-changing for AI development, giving developers more freedom to create and explore innovative AI-based solutions.
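A simple way to see whether your data actually fits is to count tokens with tiktoken, OpenAI's tokenizer library. The sketch below assumes a hypothetical transcript.txt file and leaves out the headroom you would reserve for the model's reply.

```python
import tiktoken  # OpenAI's tokenizer library

# Count tokens to check whether a document fits in the 16k context window.
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")
document = open("transcript.txt", encoding="utf-8").read()  # hypothetical file

n_tokens = len(encoding.encode(document))
print(f"{n_tokens} tokens")
print("Fits in 16k window:", n_tokens < 16_000)  # leave room for the reply in practice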
Example of the 16k context window
To understand the real-world applications of the 16k context window, we can take a look at an example. Imagine you want an AI system to identify your YouTube channel's name from a long transcript. With the previous models, it would be impossible to ask, because the context window could only fit about 3,000 words, or 4k tokens. But with the 16k context window, we could feed in 11,000 words, and the AI would provide the correct answer.
Another example is a simple chatbot. With a knowledge base of 10,000 words loaded directly into the prompt, developers can question the AI easily, opening up a vast array of potential uses. The 16k context window is a real game-changer when it comes to applying AI to larger datasets.
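A minimal sketch of how such a knowledge-base chatbot might look with gpt-3.5-turbo-16k is shown below; the knowledge_base.txt file and the question are hypothetical placeholders.

```python
import openai  # assumes OPENAI_API_KEY is set in the environment

# Load a large knowledge base (roughly 10,000 words) straight into the prompt.
knowledge_base = open("knowledge_base.txt", encoding="utf-8").read()  # hypothetical file

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-16k",
    messages=[
        {
            "role": "system",
            "content": "Answer questions using only the knowledge base below.\n\n" + knowledge_base,
        },
        {"role": "user", "content": "What is the name of the YouTube channel?"},
    ],
)
print(response.choices[0].message["content"])
```

No retrieval pipeline or vector database is needed here; the whole knowledge base simply fits in the prompt, which is exactly what the larger context window enables.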
Pricing
The 16k context model is priced at $0.003 per 1K input tokens and $0.004 per 1K output tokens, which is double the price of the standard model. However, this model's four-fold increase in context length makes it worth the extra cost, offering more opportunity for exploration and innovation.
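To put the pricing in perspective, here is a back-of-the-envelope cost calculation for a single large request; the token counts are illustrative, not from the announcement.

```python
# Published launch prices for gpt-3.5-turbo-16k (per 1K tokens).
INPUT_PRICE = 0.003
OUTPUT_PRICE = 0.004

# Illustrative request: a nearly full context window in, a short answer out.
input_tokens = 14_000
output_tokens = 500

cost = (input_tokens / 1000) * INPUT_PRICE + (output_tokens / 1000) * OUTPUT_PRICE
print(f"Estimated cost: ${cost:.4f}")  # roughly $0.044 for this request
```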
Conclusion
OpenAI's updated API and function calls bring some exciting news for developers and AI enthusiasts. The new 16k context window of GPT-3.5 Turbo and the other model updates offer new opportunities to explore and push the boundaries of what's possible with AI. With the lowered prices and longer context, developers can now work with larger datasets and create exciting new applications. These updates undoubtedly make AI a more accessible technology for developers to experiment with creative solutions.