Token
A token is the basic unit of text that AI models process. One token is roughly 3/4 of a word in English. The word "hamburger" is 3 tokens. AI model pricing and context windows are measured in tokens.
How It Works
Tokenization breaks text into pieces the model can process. Common English words are often a single token, while longer or rarer words are split into multiple tokens. Code tends to use more tokens per character than prose because of its special syntax. Token counts matter for two reasons: pricing (you pay per token) and context limits (a model can only process a fixed number of tokens per request). Input tokens (your prompt) and output tokens (the model's response) are often priced differently.
Token in Chapeta
Chapeta does not add markup to token costs. You pay the provider price per token through your OpenRouter API key. Chapeta shows actual prompt token counts from the API so you can monitor costs accurately. When switching between models with different context windows, you can see exactly how much of the context budget your conversation uses.
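The context-budget check described above can be sketched as follows. The model names and window sizes here are illustrative assumptions, not Chapeta's or any provider's actual values; the prompt token count would come from the API response.

```python
# Illustrative context-window sizes in tokens; real values vary by model.
CONTEXT_WINDOWS = {
    "small-model": 8_192,
    "large-model": 128_000,
}

def context_budget(prompt_tokens: int, model: str) -> tuple[int, float]:
    # Returns (remaining tokens, fraction of the window already used),
    # so a client can warn before the conversation overflows the window.
    window = CONTEXT_WINDOWS[model]
    return window - prompt_tokens, prompt_tokens / window
```

Switching the same conversation from "small-model" to "large-model" changes the fraction used, which is why showing the actual counts matters.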
More Terms
OpenRouter
OpenRouter is a unified API gateway that provides access to hundreds of AI models from different providers through a single API endpoint.
BYOK (Bring Your Own Key)
BYOK (Bring Your Own Key) is a setup where you supply your own API key to a third-party application, so usage is billed directly by the provider rather than through the app.
LLM (Large Language Model)
A Large Language Model (LLM) is an AI system trained on massive text datasets that can understand and generate human language.
AI Agent
An AI agent is an AI system that can take actions in the real world, not just generate text.
API Key
An API key is a unique string of characters that authenticates your identity when making requests to a web API.
Context Window
A context window is the maximum amount of text (measured in tokens) that an AI model can process in a single request.
Prompt Engineering
Prompt engineering is the practice of crafting input text (prompts) to get the best possible output from AI models.
Fine-Tuning
Fine-tuning is the process of further training a pre-trained AI model on a specific dataset to specialize it for a particular task or domain.