Virtual Design Lab Server

From SDQ-Wiki

The Virtual Design Lab Server and its services are available at https://vdl.sdq.kastel.kit.edu.

Jupyter Hub

You can find a Jupyter Hub instance at https://hub.vdl.sdq.kastel.kit.edu.

New Accounts

  1. Register at https://hub.vdl.sdq.kastel.kit.edu/hub/signup. Use firstname.lastname as your username.
  2. Get authorized by an employee of SDQ.

Ollama / Open WebUI

We host an Ollama instance behind Open WebUI. It is available at https://chat.vdl.sdq.kastel.kit.edu/ or internally from the Jupyter Hub.

New Accounts

  1. Register at https://chat.vdl.sdq.kastel.kit.edu/auth?redirect=%2F
  2. Get authorized by an employee of SDQ.

Using Ollama in Jupyter

You can use Ollama directly from the Jupyter Hub:

from langchain_ollama.chat_models import ChatOllama
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate

# Connects to the local Ollama instance; no API key is needed inside the Jupyter Hub
llm = ChatOllama(model="llama3.1:8b")
prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
# Compose prompt, model, and output parser into a single chain
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"topic": "Space travel"}))

Using Ollama externally

You can also use Ollama externally. To do so, you need an API key for our Open WebUI instance; please ask your supervisor for one. Store the key in a .env file and load it with the dotenv library, as the following snippet does.

To update all models on the Ollama server, run:

ollama list | awk 'NR>1 {print $1}' | xargs -I {} ollama pull {}
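The snippet below reads the API key from the environment variable OPEN_WEBUI_TOKEN. A minimal .env file, placed next to your script, could look like the following sketch (the value is a placeholder, not a real key format):

```
OPEN_WEBUI_TOKEN=<your-api-key-from-open-webui>
```

Do not commit the .env file to version control.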

from langchain_openai.chat_models import ChatOpenAI
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
import os

import dotenv
# Load OPEN_WEBUI_TOKEN from a local .env file into the environment
dotenv.load_dotenv()

token = os.environ.get("OPEN_WEBUI_TOKEN")

# Open WebUI exposes an OpenAI-compatible API, so ChatOpenAI works with a custom base_url
llm = ChatOpenAI(model="llama3.1:8b", base_url="https://chat.vdl.sdq.kastel.kit.edu/api", api_key=token)
prompt = ChatPromptTemplate.from_template("Tell me a short joke about {topic}")
chain = prompt | llm | StrOutputParser()

print(chain.invoke({"topic": "Space travel"}))