Token

Definition

The basic unit that AI models use to process text. On average, one token corresponds to about 3/4 of an English word. AI models have token limits (context windows) that cap how much text they can read and write in a single conversation.

Example

The sentence 'Hello, how are you today?' is about 7 tokens. GPT-4o has a context window of up to 128,000 tokens per conversation.
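
The 3/4-word rule above can be turned into a quick back-of-the-envelope estimator. This is a minimal sketch using a whitespace word count, not a real tokenizer; exact counts depend on the model's tokenizer (e.g. libraries like tiktoken), and the function name here is just illustrative.

```python
def estimate_tokens(text: str) -> int:
    # Heuristic: ~3/4 of a word per token, so tokens ≈ words * 4/3.
    words = len(text.split())
    return round(words * 4 / 3)

# 5 words → roughly 7 tokens, matching the example above.
print(estimate_tokens("Hello, how are you today?"))  # → 7
```

A rough estimate like this is often enough to check whether a prompt will fit within a model's context window before sending it.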