Ollama
[https://www.ycombinator.com/companies/ollama Ollama] was funded by [https://www.ycombinator.com/people/jared-friedman Jared Friedman] out of Y Combinator (YC). Founders Jeffrey Morgan and [https://2024.allthingsopen.org/speakers/michael-chiang Michael Chiang] wanted an easier way to run LLMs than having to do it in the cloud. In fact, they were previously founders of a startup project named Kitematic, which was the early UI for Docker. Acquired by Docker, it was the precursor to [[Docker Desktop]].
Ollama lets you get up and running with Llama 3.3, DeepSeek-R1, Phi-4, Gemma 3, Mistral Small 3.1, and other large language models.
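Once Ollama is installed (see below), models are pulled and run by name from the command line. A minimal sketch, assuming the model tags used here still match what the Ollama library publishes:

<pre>
# Download a model from the Ollama library without running it
ollama pull llama3.3

# List the models that are available locally
ollama list

# Start an interactive session with a model (pulls it first if needed)
ollama run deepseek-r1
</pre>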
== Installing it ==
I had some problems getting off the ground with Ollama. Some details are in [[Ollama/install]]. The problems are described in more detail in [[Nvidia on Ubuntu]], where we cover one of the greatest challenges of Linux computing: graphics drivers.
== Docs ==
The [https://github.com/ollama/ollama/blob/main/docs/linux.md docs] tell you how to customize the environment, and how to update or uninstall it.
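As a rough sketch of what those docs cover on Linux (check the linked page for the current instructions, since the script and paths can change):

<pre>
# Install Ollama; re-running the same script also updates it
curl -fsSL https://ollama.com/install.sh | sh

# Ollama runs as a systemd service
sudo systemctl status ollama
sudo systemctl restart ollama

# Uninstall: stop and disable the service, then remove the unit file and binary
sudo systemctl disable --now ollama
sudo rm /etc/systemd/system/ollama.service
sudo rm $(which ollama)
</pre>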
Looking at the logs with <code>journalctl -e -u ollama</code> told me what my newly generated public key was, but also that it could not load a compatible GPU, so I spent time fixing that.
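A couple of commands that help when debugging the GPU side (assuming an NVIDIA card; see [[Nvidia on Ubuntu]] for the driver details):

<pre>
# Confirm the NVIDIA driver is loaded and the GPU is visible at all
nvidia-smi

# Restart Ollama and follow its log to see whether it detects the GPU
sudo systemctl restart ollama
journalctl -u ollama -f
</pre>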
Start with the [https://github.com/ollama/ollama/blob/main/README.md README] for an intro.
== Interface ==
Although you can instantly begin using a model from the command line with something like <code>ollama run gemma3</code>,<ref>This will download and run a 4B parameter model.</ref> there are many user interfaces or front-ends that can be coupled with Ollama, such as [https://github.com/open-webui/open-webui Open-Webui].
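Front-ends like Open-Webui talk to the local Ollama server over its HTTP API, which listens on port 11434 by default. A minimal sketch of such a request, assuming the <code>gemma3</code> model has already been pulled:

<pre>
# Ask the local Ollama server for a completion (non-streaming)
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3",
  "prompt": "Why is the sky blue?",
  "stream": false
}'
</pre>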
{{References}}

[[Category:Artificial Intelligence]]