79660812

Date: 2025-06-10 16:21:39
Score: 1
Natty:
Report link

Finally got it working. The config looks like this:

from langroid.language_models import OpenAIGPTConfig

ollama_config = OpenAIGPTConfig(
    chat_model="ollama/mixtral",
    chat_context_length=16_000,
    api_base="http://<your.ip>/v1",  # the /v1 suffix was necessary for my Ollama LLM
)
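For completeness, a minimal usage sketch building on the config above. This assumes Langroid's OpenAIGPT wrapper, its chat() method, and the .message field on the returned response; the prompt is just an example, and an Ollama server must be reachable at the given address:

from langroid.language_models import OpenAIGPT

# Wrap the config above in Langroid's OpenAI-compatible client and send a test prompt.
llm = OpenAIGPT(ollama_config)
response = llm.chat("Say hello in one short sentence.", max_tokens=30)
print(response.message)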
Reasons:
  • Low length (0.5):
  • Has code block (-0.5):
  • Self-answer (0.5):
  • Low reputation (0.5):
Posted by: kpozzi90