Awarity is an engine built on top of conventional Large Language Models (LLMs) such as OpenAI's ChatGPT, Google's Gemini, and Anthropic's Claude. Unlike those models, Awarity is built to scale to virtually unlimited datasets. While today's LLMs top out at roughly 1 million tokens of context, Awarity's Elastic Context Window (ECW) allows it to work over far larger datasets.
Why is this important? Take a long document, or a collection of documents. A novel the size of "War and Peace" runs about 1.8 million tokens, so you could upload roughly half of it at once. That sounds workable if you are querying a single document. But what if there are 10 documents, or 100, or 1,000, or more? These LLMs can only reason over whatever fits in a single context window. Imagine writing a book report when you could only look at half the book at a time!
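To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The ~4-characters-per-token ratio is a common rough heuristic (actual counts vary by tokenizer and language), and the corpus sizes are illustrative assumptions taken from the figures above, not anything specific to Awarity:

```python
# Rough back-of-the-envelope: how much of a corpus fits in one context window?
# Assumption: ~4 characters per token, a common rule of thumb for English text.

CHARS_PER_TOKEN = 4          # rough heuristic; varies by tokenizer and language
CONTEXT_WINDOW = 1_000_000   # ~1M tokens, the upper bound cited for today's LLMs

def estimate_tokens(text: str) -> int:
    """Approximate the token count of a piece of text."""
    return len(text) // CHARS_PER_TOKEN

def fraction_that_fits(corpus_tokens: int, window: int = CONTEXT_WINDOW) -> float:
    """Share of the corpus a single prompt can hold (capped at 100%)."""
    return min(1.0, window / corpus_tokens)

# Illustrative numbers: one War-and-Peace-sized novel (~1.8M tokens, per the
# figure above) versus a 100-book collection of similar size.
novel_tokens = 1_800_000
print(f"One novel:  {fraction_that_fits(novel_tokens):.0%} fits")        # ~56%
print(f"100 novels: {fraction_that_fits(100 * novel_tokens):.2%} fits")  # ~0.56%
```

The drop-off is the point: a single novel already overflows the window, and a modest library reduces the visible slice to well under one percent.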