Every AI hobbyist needs to realize how soon their laptop and smartphone will be running a ChatGPT of their own.

It's happening way faster than you think.

Here's what you need to know (the non-technical version):
First, let's talk about WHY you want it on your own device.

Isn't ChatGPT online good enough?

Putting it on every device gives us:

1. Privacy
2. Speed and reliability
3. Personalization
1. Privacy

Whatever you type into Bing Chat and ChatGPT today travels to remote servers, where it can be stored and reviewed.

Firms like JP Morgan, Verizon and Amazon have all banned ChatGPT at work.

Microsoft has had to explicitly advise users never to put confidential information into Bing Chat.

Any sensitive AI use case, personal or business, will be 10x better if the model is on your device.

Your data doesn't go anywhere. All of the thinking happens on your device, not in the cloud.
2. Speed and reliability

People freaked out (in a good way) when ChatGPT Plus came out and it was a lot faster than regular ChatGPT.

People freaked out (in a bad way) when ChatGPT went down.

You get more of the first and less of the second when you have it in your pocket.
3. Personalization

We're headed to a world where everyone could have their own personalized AI model.

But it would be ridiculous to expect OpenAI to support 7 billion versions.

Putting models on our devices lets everyone customize their own model based on their preferences.
So why hasn't this happened yet?

Large language models like GPT-4 are, well, large. Like very, very big.

You'd need to tape together dozens of MacBook Pros just to 1) store something like GPT-3 in the first place and 2) run it.
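To put rough numbers on the "dozens of MacBook Pros" claim, here's a back-of-envelope sketch. It assumes the weights are stored at 16 bits each, the common format for these models, and a 16 GB laptop as the baseline:

```python
# Back-of-envelope: could GPT-3 fit on a laptop?
GPT3_PARAMS = 175e9       # GPT-3 has roughly 175 billion parameters
BYTES_PER_PARAM = 2       # 16-bit (fp16) weights = 2 bytes each

model_size_gb = GPT3_PARAMS * BYTES_PER_PARAM / 1e9
print(f"GPT-3 weights alone: ~{model_size_gb:.0f} GB")  # ~350 GB

LAPTOP_RAM_GB = 16        # a typical MacBook Pro configuration
laptops_needed = model_size_gb / LAPTOP_RAM_GB
print(f"Laptops' worth of RAM just to hold it: ~{laptops_needed:.0f}")  # ~22
```

That's just holding the weights in memory, before you generate a single word.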
That is, until recently. Enter the LLaMA experiments.

Meta released a family of GPT-3-quality models, LLaMA, to researchers at the end of February.

Since then, the weights have leaked. And people have been pushing to get the models running on smaller and smaller devices.

A quick runthrough:
Just 3 weeks after the LLaMA release, someone got it running on an M1 MacBook Pro:

Here's LLaMA on a Google Pixel 6:

Then, LLaMA on a Raspberry Pi:

Alpaca (a version of LLaMA fine-tuned to follow instructions, ChatGPT-style) on an iPhone 14:

Voice chat to Alpaca on an M1 MacBook Pro:

The same as above, but now fully offline!

In a matter of weeks, people have:

- Run LLaMA on laptops, phones and even tiny Raspberry Pis
- Turned LLaMA into a chat-ready model
- Run that chat model on a phone
- Layered on voice chat and made *that* run on a laptop

ChatGPT on your laptop/phone isn't just coming. It's already here.
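A big reason this became possible so fast is quantization: shrinking each weight from 16 bits down to around 4 bits, at a small cost in quality. That's the trick behind the llama.cpp port used in many of the demos above. A rough sketch of the math, using the 7-billion-parameter LLaMA variant:

```python
# Why quantization lets LLaMA fit on a phone (rough sketch)
LLAMA_7B_PARAMS = 7e9                      # smallest LLaMA variant

fp16_gb = LLAMA_7B_PARAMS * 2 / 1e9        # 16-bit weights: 2 bytes/param
int4_gb = LLAMA_7B_PARAMS * 0.5 / 1e9      # 4-bit weights: 0.5 bytes/param

print(f"LLaMA-7B at 16-bit: ~{fp16_gb:.0f} GB")   # ~14 GB: too big for most phones
print(f"LLaMA-7B at 4-bit:  ~{int4_gb:.1f} GB")   # ~3.5 GB: fits in a phone's RAM
```

Four times smaller, and suddenly a frontier-adjacent model fits on hardware you already own.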
So, what's next? You should have your eyes on two players:

1. Apple
2. Stability AI
1. Apple

Everyone's been waiting to see what Apple will do in this category.

Your iPhone, today, already ships with a Neural Engine: silicon built specifically to run AI models.

Siri's about to have the craziest comeback. The question is when.

(Aside: We could also see Microsoft try phones again. All because of AI.)

2. Stability AI

CEO Emad Mostaque has repeatedly stated that their vision is to put models "on the edge", i.e., in your pocket.

They're already hiring to make this happen:

This is going to be the biggest shift in our day-to-day as consumers.

We will have superpowers in our pockets.

And *it's actually happening*.
Liked this? I'd appreciate a like and RT :)

32,000+ business professionals rely on us to understand what's going on in AI.

Join them here:
