Finally got it working. The config looks like this:
from langroid.language_models import OpenAIGPTConfig

ollama_config = OpenAIGPTConfig(
    chat_model="ollama/mixtral",
    chat_context_length=16_000,
    api_base="http://<your.ip>/v1",  # the /v1 suffix was necessary for my ollama LLM
)
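
For reference, this is roughly how I then used the config with an agent. Treat it as a sketch: the ChatAgent/Task setup below follows the usual langroid pattern, but the exact arguments may vary with your langroid version, and the prompt is just a placeholder.

import langroid as lr

# build an agent whose LLM is the Ollama-backed config above
agent = lr.ChatAgent(lr.ChatAgentConfig(llm=ollama_config))

# wrap it in a non-interactive task and run a single turn
task = lr.Task(agent, interactive=False)
result = task.run("Say hello in one short sentence.")
print(result.content)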