
You can now run a GPT-3-level AI model on your laptop, phone, and Raspberry Pi | Ars Technica

Thanks to Meta LLaMA, AI text models may have their "Stable Diffusion moment."

Things are moving at lightning speed in AI Land. On Friday, a software developer named Georgi Gerganov created a tool called "llama.cpp" that can run Meta's new GPT-3-class AI large language model, LLaMA, locally on a Mac laptop. Soon thereafter, people worked out how to run LLaMA on Windows as well. Then someone showed it running on a Pixel 6 phone, and next came a Raspberry Pi (albeit running very slowly).

If this keeps up, we may be looking at a pocket-sized ChatGPT competitor before we know it.
