OpenAI has some really nice APIs, but you have to pay for their service. That's the easy way.
But I wanted to run LLMs locally on my laptop. That way, I have much more control and can write code to learn more about how to make things work.
Here's how I did it.
- Download and install Ollama (https://ollama.ai/). It runs on macOS (Apple Silicon: M1, M2, etc.) and Linux. There's no Windows build yet, but WSL should work.
- Check out the models in their library: https://ollama.ai/library
- I've had good luck with the mistral model. Go to your command line and type `ollama run mistral`.
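Beyond the interactive prompt, Ollama also serves a local HTTP API (on port 11434 by default), so you can script against the model. Here's a minimal Python sketch using only the standard library; it assumes Ollama is running and the mistral model has been pulled, and the endpoint and fields (`/api/generate`, `model`, `prompt`, `stream`) come from Ollama's API docs:

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model, prompt):
    # stream=False asks for a single JSON reply instead of streamed chunks
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model, prompt):
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # the generated text comes back in the "response" field
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(ask("mistral", "Why is the sky blue? Answer in one sentence."))
```

Swap in any model name from the library that you've pulled locally.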
There you go. Right on your laptop—your own LLM experience.
There's a lot more I could add. If this is interesting, please send me a Like and ask your questions.