

If this keeps up, we may be looking at a pocket-sized ChatGPT competitor before we know it.

But let's back up a minute, because we're not quite there yet. (At least not today, as in literally today, March 13, 2023.) But what will arrive next week, no one knows.

Since ChatGPT launched, some people have been frustrated by the AI model's built-in limits that prevent it from discussing topics that OpenAI has deemed sensitive. Thus began the dream, in some quarters, of an open source large language model (LLM) that anyone could run locally without censorship and without paying API fees to OpenAI.

Open source solutions do exist (such as GPT-J), but they require a lot of GPU RAM and storage space. Other open source alternatives could not boast GPT-3-level performance on readily available consumer-level hardware.

Enter LLaMA, an LLM available in parameter sizes ranging from 7B to 65B (that's "B" as in "billion parameters," which are floating point numbers stored in matrices that represent what the model "knows").
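Since parameter counts are what drive those GPU RAM and storage demands, here is a minimal back-of-the-envelope sketch in Python (ours, not from the article) of how a parameter count translates into the memory needed just to hold a model's weights. The bytes-per-parameter figures assume common precisions (16-bit floats versus 4-bit quantized weights), and the function name is made up for illustration.

```python
# Illustrative sketch: how parameter count maps to the memory needed to
# store a model's weights alone (no activations, no runtime overhead).
# Handy cancellation: "billion" and "gigabyte" are both factors of 1e9,
# so billions of parameters times bytes per parameter gives gigabytes.

def approx_weight_memory_gb(params_billions: float, bytes_per_param: float) -> float:
    """Rough gigabytes needed just to store the weights."""
    return params_billions * bytes_per_param

for size in (7, 13, 33, 65):  # LLaMA's published parameter counts, in billions
    fp16 = approx_weight_memory_gb(size, 2.0)  # 16-bit floats: 2 bytes each
    int4 = approx_weight_memory_gb(size, 0.5)  # 4-bit quantized: ~0.5 bytes each
    print(f"LLaMA-{size}B: ~{fp16:.0f} GB at 16-bit, ~{int4:.1f} GB at 4-bit")
```

By this rough math, the 65B model needs around 130 GB at 16-bit precision, far beyond any consumer GPU, which is why aggressive quantization matters so much for running these models locally.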
