A token is a small unit of text, such as a word fragment, whole word, or punctuation mark, that AI models use as the basic unit for processing language. For example, a tokenizer might split the word "tokenization" into the two tokens "token" and "ization".
Understanding token limits matters when working with AI models like the ones powering SearchRovr, because every model caps how many tokens it can accept in a single request.
AI services also typically bill by the number of tokens processed. A system that chunks content carefully and keeps token usage lean can therefore scale without runaway cost.
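As a minimal sketch of what token-aware chunking looks like, the snippet below uses OpenAI's tiktoken library to split text into pieces that each fit within a fixed token budget. The encoding name, budget value, and function name are illustrative assumptions, not SearchRovr's actual implementation.

```python
# A rough sketch of token-aware chunking, assuming the tiktoken library
# and the cl100k_base encoding; the budget below is a hypothetical value.
import tiktoken

MAX_TOKENS_PER_CHUNK = 512  # hypothetical per-chunk budget


def chunk_by_tokens(text: str, max_tokens: int = MAX_TOKENS_PER_CHUNK) -> list[str]:
    """Split text into chunks that each stay within a token budget."""
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)  # the full text as a list of token ids
    chunks = []
    for start in range(0, len(tokens), max_tokens):
        # Decode each window of token ids back into a text chunk.
        chunks.append(enc.decode(tokens[start:start + max_tokens]))
    return chunks
```

Counting tokens this way before sending a request also makes cost easy to estimate: multiply the total token count by the provider's per-token rate.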