Discover the capabilities and limitations of AI models
Understand the technical capabilities of different AI models, including token limits, training data, and performance characteristics.
Tokens are the basic units of text that AI models process. In English, a token can be as short as a single character or as long as a whole word (e.g., "a" or "apple"); on average, one token corresponds to roughly four characters of text. A model's maximum token limit, its context window, determines how much text it can handle in a single request.
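As a rough illustration, here is a minimal sketch in Python that approximates a token count using the common ~4-characters-per-token heuristic for English text. The exact count depends on each model's tokenizer, so treat this as an estimate only.

```python
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    """Rough token estimate using the ~4 characters/token heuristic
    that holds for typical English text. Real tokenizers (e.g. BPE)
    produce somewhat different counts, so this is approximate."""
    return max(1, round(len(text) / chars_per_token)) if text else 0

print(estimate_tokens("AI models process text as tokens."))  # -> 8 (approximate)
```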
[Chart: maximum token limits compared across OpenAI and Anthropic models; some values are clipped to the comparison scale or shown off scale.]

Token limits can vary with the specific implementation and may change over time as models are updated. Always check the latest documentation for the model you're using.
Estimate how many tokens your text will consume with different models. Enter some text and select a model to see an estimated token count and how much of that model's capacity it uses.
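A minimal sketch of the calculation such an estimator performs, assuming illustrative model names and context-window sizes that are placeholders rather than authoritative figures; check each provider's documentation for current limits.

```python
# Illustrative context-window sizes in tokens. These are assumptions
# for the example, not authoritative figures.
MODEL_LIMITS = {
    "example-openai-model": 128_000,
    "example-anthropic-model": 200_000,
}

def capacity_used(text: str, model: str, chars_per_token: float = 4.0) -> float:
    """Return the estimated fraction of the model's context window
    that `text` would consume, using the ~4 chars/token heuristic."""
    tokens = round(len(text) / chars_per_token)
    return tokens / MODEL_LIMITS[model]

usage = capacity_used("Hello world" * 1000, "example-openai-model")
print(f"{usage:.2%} of capacity")  # e.g. "2.15% of capacity"
```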