Ollama
Looking at the logs with <code>journalctl -e -u ollama</code> showed me my newly generated public key, but also that it could not load a compatible GPU, so I spent some time fixing that.
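If the service has to be restarted after installing or fixing GPU drivers, the usual systemd commands apply. A minimal sketch, assuming Ollama was installed as the <code>ollama</code> systemd service and an NVIDIA GPU is in use:

<pre>
# Check the service status and the most recent log lines
systemctl status ollama
journalctl -e -u ollama

# Confirm the GPU is visible to the system (NVIDIA example; other vendors have their own tools)
nvidia-smi

# Restart Ollama so it re-detects the GPU
sudo systemctl restart ollama
</pre>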
Start with the [https://github.com/ollama/ollama/blob/main/README.md README] for an intro.
== Interface ==
Although you can instantly begin using a model from the command line with something like <code>ollama run gemma3</code>,<ref>This will download and run a 4B parameter model.</ref> there are many user interfaces or front-ends that can be coupled with Ollama, such as [https://github.com/open-webui/open-webui Open-Webui].
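Front-ends like Open-Webui typically talk to Ollama over its local HTTP API. As a minimal sketch, assuming Ollama is listening on its default port 11434 and the <code>gemma3</code> model has already been pulled, a prompt can be sent with curl:

<pre>
# Send a single prompt to a locally running Ollama instance
curl http://localhost:11434/api/generate -d '{
  "model": "gemma3",
  "prompt": "Why is the sky blue?"
}'
</pre>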
{{References}}

[[Category:Artificial Intelligence]]