</pre>
The important part seems to be '<tt>Auto-detected mode as legacy</tt>'.
Running the image from Docker Desktop, with options set for ports and volumes, and then copying the 'run' command spits out:
<code>docker run --hostname=3f50cd4183bd --mac-address=02:42:ac:11:00:02 --env=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin --env=LD_LIBRARY_PATH=/usr/local/nvidia/lib:/usr/local/nvidia/lib64 --env=NVIDIA_DRIVER_CAPABILITIES=compute,utility --env=NVIDIA_VISIBLE_DEVICES=all --env=OLLAMA_HOST=0.0.0.0:11434 --volume=ollama:/root/.ollama --network=bridge -p 11434:11434 --restart=no --label='org.opencontainers.image.ref.name=ubuntu' --label='org.opencontainers.image.version=20.04' --runtime=runc -d ollama/ollama:latest</code>
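
Most of those flags are just defaults that Docker Desktop records; a trimmed-down equivalent (keeping only the volume and port mapping from the command above, with <tt>--name</tt> added for convenience) would presumably be something like:

<pre>
# Roughly equivalent minimal run (CPU-only); the 'ollama' volume name and
# port 11434 are taken from the generated command above, and '--name ollama'
# is added here purely for convenience.
docker run -d --name ollama \
  -v ollama:/root/.ollama \
  -p 11434:11434 \
  ollama/ollama:latest
</pre>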
http://localhost:11434/ just reveals 'Ollama is running' in 'hello world' style.
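
Beyond that check, the HTTP API on the same port can be exercised with <tt>curl</tt>; a small sketch, assuming a model called <tt>llama2</tt> (any model from the Ollama library should work the same way):

<pre>
# List the models stored in the 'ollama' volume (empty on a fresh install)
curl http://localhost:11434/api/tags

# Pull a model, then generate a completion with it ('llama2' is only an example)
curl http://localhost:11434/api/pull -d '{"name": "llama2"}'
curl http://localhost:11434/api/generate -d '{"model": "llama2", "prompt": "Say hello", "stream": false}'
</pre>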
{{References}}
[[Category:Artificial Intelligence]]