Using local AI

"Local AI" refers to the usage of AI models hosted on the user's computer.

Modern computers are powerful enough to run decently capable models, though response time varies with the graphics card. An Apple MacBook Air M1 or later, for instance, delivers fast responses from models that can handle tasks comparable to GPT-3-era ChatGPT. Older Macs work too: I tried running local AI on my 2014 MacBook Air and it worked, albeit unbearably slowly.

The advantages of using local AI are twofold:

  • It's free. You can basically let it run for days, ingesting and generating millions of words, and it will only cost you the electricity bill
  • It's private. You can work on sensitive or confidential documents, and everything runs securely on your machine, even unplugged from the network, without that data ever leaking to a server

Local AI is such a fantastic and mesmerizing ecosystem that it is worth knowing about, so I will provide some history and explanation here. You can skip the next sections and go directly to the setup guide unless you want to learn more.

The history of local AI

Open Source models

Local AI tools

The future of AI

Setup Guide