79241134

Date: 2024-12-01 08:02:30
Score: 1
Natty:
Report link

First you need to convert the model to GGUF. You can use this Hugging Face Space, which is based on the llama.cpp repo:

https://huggingface.co/spaces/ggml-org/gguf-my-repo
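If you'd rather convert locally instead of using the Space, llama.cpp ships a conversion script you can run yourself. A minimal sketch, assuming you have the llama.cpp repo cloned and a Hugging Face model downloaded to `./my-model` (the path and output names are placeholders):

```shell
# clone llama.cpp and install the conversion dependencies
git clone https://github.com/ggerganov/llama.cpp
cd llama.cpp
pip install -r requirements.txt

# convert the HF model directory to a GGUF file
# --outtype q8_0 quantizes on the way out; use f16 to keep full precision
python convert_hf_to_gguf.py ../my-model \
    --outfile ../my-model-q8_0.gguf \
    --outtype q8_0
```

The Space linked above does essentially the same conversion for you server-side, so use whichever is more convenient.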

Then, to run it, you can use Ollama, which can run any GGUF model.
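As a sketch of the Ollama step (assuming Ollama is installed and the GGUF file path below is a placeholder): you point a Modelfile at the local GGUF file, create a model from it, and run it.

```shell
# Modelfile: a one-line config pointing Ollama at the local GGUF file
echo 'FROM ./my-model-q8_0.gguf' > Modelfile

# register the model with Ollama under a name of your choice
ollama create my-model -f Modelfile

# start an interactive session with it
ollama run my-model
```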

Reasons:
  • Whitelisted phrase (-1.5): you can use
  • Low length (1):
  • No code block (0.5):
  • Low reputation (1):
Posted by: Alvin Rachmat