Category: Technical Terms
Definition
Tokens are the basic units AI models use to process text: usually parts of words, whole words, or punctuation marks.
How It Works
An AI model can't read text the way humans do; it first breaks each sentence into tokens. "Hello world!" typically becomes three tokens: "Hello", " world", and "!".
Each token is then converted to a number (a token ID) the model can work with. Common words like "the" are usually single tokens, while rarer words get split into smaller pieces.
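Here's a minimal sketch of that step using OpenAI's tiktoken library (one possible tokenizer among many; others behave similarly). It splits a sentence into tokens and shows the integer IDs the model actually receives:

```python
import tiktoken

# cl100k_base is the encoding used by several recent OpenAI models.
enc = tiktoken.get_encoding("cl100k_base")

text = "Hello world!"
token_ids = enc.encode(text)                       # one integer per token
tokens = [enc.decode([tid]) for tid in token_ids]  # the text piece behind each ID

print(tokens)     # ['Hello', ' world', '!']
print(token_ids)  # the numbers the model actually works with
```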
Why It Matters
Token limits determine how much text a model can handle at once. Early ChatGPT models worked with roughly 4,000 tokens of context (your prompt and the reply combined); newer models allow far more. Documents that exceed the limit need to be split into chunks or summarized, as in the sketch below.
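A simple way to split by token count, again using tiktoken (the 4,000 figure and the `split_by_tokens` helper are just illustrative, not a standard API):

```python
import tiktoken

def split_by_tokens(text: str, max_tokens: int = 4000, encoding: str = "cl100k_base") -> list[str]:
    """Break text into chunks of at most max_tokens tokens each."""
    enc = tiktoken.get_encoding(encoding)
    ids = enc.encode(text)
    # Decode each fixed-size slice of token IDs back into text.
    return [enc.decode(ids[i:i + max_tokens]) for i in range(0, len(ids), max_tokens)]

long_document = "Hello world! " * 3000  # stand-in for a long document
chunks = split_by_tokens(long_document)
print(len(chunks), "chunks")
```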
Understanding tokens helps you work within these limits and estimate costs, since many AI services charge per token.
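As a rough illustration, here's a small cost estimator built on the same token counting (the `estimate_cost` helper and the 0.001 rate are placeholders; check your provider's actual pricing):

```python
import tiktoken

def estimate_cost(text: str, price_per_1k_tokens: float, encoding: str = "cl100k_base") -> float:
    """Count tokens in text and return an estimated price in dollars."""
    enc = tiktoken.get_encoding(encoding)
    n_tokens = len(enc.encode(text))
    return n_tokens / 1000 * price_per_1k_tokens

prompt = "Summarize the following report..."  # hypothetical prompt
# 0.001 per 1K tokens is a placeholder rate, not any provider's real price.
print(f"~${estimate_cost(prompt, price_per_1k_tokens=0.001):.5f}")
```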