The company also announced prompt caching, which lets developers reuse frequently repeated context between API calls, reducing costs and improving latency. According to OpenAI, cached input tokens are billed at a 50% discount, while Anthropic promises up to a 90% discount on cache reads. The feature is expected to be particularly valuable for developers whose applications send the same long prompts to OpenAI's API on every request.
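In practice, taking advantage of caching on OpenAI's side mostly comes down to prompt structure: the static portion (system prompt, reference material) goes first so that repeated calls share a common prefix. The sketch below illustrates this, assuming the official `openai` Python SDK; the model name and the cached-token field on the usage object are assumptions, not confirmed by the article.

```python
# Minimal sketch: structuring prompts so repeated calls share a cacheable prefix.
# Assumes the official openai Python SDK and an API key in OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

STATIC_CONTEXT = (
    "You are a support assistant for ExampleCo.\n"
    "Reference policy document:\n"
    "..."  # large, unchanging text would go here
)

def ask(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical choice; any caching-eligible model works
        messages=[
            {"role": "system", "content": STATIC_CONTEXT},  # identical prefix on every call
            {"role": "user", "content": question},          # only this part varies
        ],
    )
    # The usage object may report how many prompt tokens were served from cache
    # (field names assumed here, check the SDK's usage reporting).
    details = getattr(response.usage, "prompt_tokens_details", None)
    cached = getattr(details, "cached_tokens", None) if details else None
    print(f"cached prompt tokens: {cached}")
    return response.choices[0].message.content
```

Anthropic's variant works differently in that caching is opted into per content block rather than applied automatically, but the same principle holds: the savings come from keeping the large, unchanging portion of the prompt identical across requests.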