Published On Sep 9, 2024
People have been asking why Ollama and the LLM models need to be reinstalled every time they reopen a studio in Lightning AI. I wondered what was going on myself, so in this video I show how to solve the issue.
If you want a shortcut you can use the Lightning AI Studio template with the installation already done here:
https://lightning.ai/openintegrator/s...
First note the persistence guidance here:
https://lightning.ai/docs/overview/st...
Installation commands below. First download the install script:
curl -fsSL https://ollama.com/install.sh > install.sh
Edit install.sh so it installs into the persistent studio directory instead of the default prefix, which is wiped when the studio restarts:
OLLAMA_INSTALL_DIR="/teamspace/studios/this_studio/"
The binary then ends up at a path that survives restarts:
/teamspace/studios/this_studio/bin/ollama
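If you prefer not to open the script in an editor, the same change can be made with sed. This is a sketch, not the exact contents of install.sh: it assumes the script hard-codes a /usr/local install prefix, and it demonstrates the substitution on a stand-in file. Run the same sed command on the install.sh you actually downloaded.

```shell
# Stand-in for the downloaded install.sh (hypothetical contents).
printf 'OLLAMA_INSTALL_DIR="/usr/local"\n' > /tmp/demo_install.sh

# Replace the default prefix with the persistent studio directory.
sed -i 's|/usr/local|/teamspace/studios/this_studio|g' /tmp/demo_install.sh

cat /tmp/demo_install.sh
```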
Make the script executable and run it:
chmod +x install.sh
./install.sh
Add the bin directory to your PATH so the shell can find the ollama command. Open ~/.zshrc, add the export line, then reload:
vim ~/.zshrc
export PATH=$PATH:/teamspace/studios/this_studio/bin
source ~/.zshrc
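The PATH line only needs to be added once; a guarded append keeps ~/.zshrc clean if you re-run the setup. A small sketch (the path matches the install directory used above):

```shell
# Append the persistent bin dir to PATH in ~/.zshrc only if the
# exact line is not already present, so re-running is harmless.
RC="$HOME/.zshrc"
LINE='export PATH=$PATH:/teamspace/studios/this_studio/bin'
touch "$RC"
grep -qxF "$LINE" "$RC" || echo "$LINE" >> "$RC"
```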
Start the server and leave it running in one terminal:
ollama serve
Then in a second terminal pull and run a model:
ollama run llama3.1
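To confirm the server is actually up before pulling a model, you can query its HTTP API on the default port 11434. A quick check that prints a fallback message when nothing is listening:

```shell
# Hit the local Ollama server's version endpoint; if no server is
# running on port 11434 this prints a short diagnostic instead.
curl -fsS http://localhost:11434/api/version || echo "ollama server not reachable"
```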