Jyoti Gupta · Mar 24 in AI Technology Updates
Ollama library for running LLMs locally
Ollama is a tool for running Large Language Models locally, without the need for a cloud service. Its workflow is similar to Docker's (you pull a model, then run it), but it is designed specifically for LLMs. You can use it as an interactive shell, through its REST API, or from its Python library. Read more here - https://www.andreagrandi.it/posts/ollama-running-llm-locally/
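As a small illustration of the REST API route mentioned above, here is a minimal sketch of the JSON body you would POST to Ollama's local `/api/generate` endpoint. It assumes Ollama is serving on its default port 11434 and that a model named "llama3" has already been pulled; adjust both to your setup.

```python
import json

# Request body for Ollama's /api/generate endpoint.
# "model" and the prompt text below are placeholders for your own setup.
payload = {
    "model": "llama3",
    "prompt": "Why is the sky blue?",
    "stream": False,  # ask for a single JSON response instead of a stream
}

body = json.dumps(payload)
print(body)

# To actually send it (requires a running local Ollama server):
#   import urllib.request
#   req = urllib.request.Request(
#       "http://localhost:11434/api/generate",
#       data=body.encode(),
#       headers={"Content-Type": "application/json"},
#   )
#   with urllib.request.urlopen(req) as resp:
#       print(json.loads(resp.read())["response"])
```

The interactive shell (`ollama run llama3`) and the Python library wrap this same HTTP interface, so the payload shape is the main thing to learn.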