How to Run AI Locally on Your Laptop
OpenAI has some really nice APIs, but you have to pay for the service. That's the easy way.
But I wanted to run LLMs locally on my laptop. That way, I'd have much more control and could write code to learn more about how things work.
Here's how I did it.
  1. Download and install Ollama from https://ollama.ai/. It runs on Mac (ARM: M1, M2, etc.) or Linux. There's no Windows version yet, but WSL should work. (See the sketch after this list.)
  2. Check out the models they have at https://ollama.ai/library.
  3. I've had good luck with the mistral model. Go to your command line and type ollama run mistral, as shown below.
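Here's roughly what those steps look like in the terminal. The Linux one-liner is the install script from the Ollama site (on a Mac you just download the app from the homepage), so double-check it against their docs:

```
# Linux: install via the official script from the Ollama site
curl -fsSL https://ollama.ai/install.sh | sh

# Pull and chat with the mistral model (it downloads on first run)
ollama run mistral
```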
There you go. Right on your laptop—your own LLM experience.
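And if you want to write code against it, Ollama also serves a local REST API (on port 11434 by default, as of this writing). Here's a minimal sketch; check the Ollama docs for the current request format:

```
# Ask the local mistral model a question via Ollama's REST API.
# Endpoint and JSON shape per Ollama's docs at the time of writing.
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Explain what a large language model is in one paragraph."
}'
# The response streams back as one JSON object per line.
```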
There's a lot more I could add. If this is interesting, please send me a Like and ask your questions.