Virtual Design Lab Server
The Virtual Design Lab Server and its services are available at https://vdl.sdq.kastel.kit.edu
Jupyter Hub
You can find a Jupyter Hub instance at https://vdl.sdq.kastel.kit.edu.
New Accounts
- Register at https://vdl.sdq.kastel.kit.edu/hub/signup with firstname.lastname as your username.
- Get authorized by an employee of SDQ.
Ollama
We host an Ollama instance that is reachable externally at https://ollama.vdl.sdq.kastel.kit.edu/ and internally from Jupyter Hub.
Using Ollama in Jupyter
You can use Ollama directly from Jupyter Hub:
from langchain_community.chat_models import ChatOllama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
# "ollama" is the internal hostname; it only resolves from within Jupyter Hub
llm = ChatOllama(model="mixtral", base_url="http://ollama:11434")
prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"topic": "Space travel"}))
Using Ollama externally
You can also use Ollama externally. For this, you need a username and password; please ask your supervisor for them. (Employees can find them in Vaultwarden.) Keep the credentials out of your code by loading them with the dotenv library, as the following snippet does.
To update all models on the Ollama server, run: ollama list | awk 'NR>1 {print $1}' | xargs -I {} ollama pull {}
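The Authorization header in the snippet below is standard HTTP Basic Auth: the string username:password, Base64-encoded. A minimal helper to build and inspect such a header (the credentials here are placeholders, not real accounts):

```python
from base64 import b64encode

def basic_auth_header(username: str, password: str) -> dict:
    """Build the HTTP Basic Auth header expected by the Ollama reverse proxy."""
    token = b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return {"Authorization": "Basic " + token}

# Placeholder credentials for illustration only:
print(basic_auth_header("alice", "secret"))
```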
from langchain_community.chat_models import ChatOllama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
import os
from base64 import b64encode
import dotenv
dotenv.load_dotenv()
username = os.environ.get("OLLAMA_USER")
password = os.environ.get("OLLAMA_PASSWORD")
# Encode the credentials for HTTP Basic Auth
headers = {"Authorization": "Basic " + b64encode(f"{username}:{password}".encode('utf-8')).decode("ascii")}
llm = ChatOllama(model="mixtral", base_url="https://ollama.vdl.sdq.kastel.kit.edu", headers=headers)
prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
chain = prompt | llm | StrOutputParser()
print(chain.invoke({"topic": "Space travel"}))
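The snippet above reads the OLLAMA_USER and OLLAMA_PASSWORD variables from a .env file next to your script. A template with placeholder values (fill in the credentials you received):

```
OLLAMA_USER=your-username
OLLAMA_PASSWORD=your-password
```

Do not commit the .env file to version control.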