Date: 2024-11-21 17:56:23
Score: 1.5
Natty:
You can also use ONNX Runtime; in fact, it may be the best of the available options. At a high level, this is what you can do:
- Train the model in scikit-learn.
- Convert the model to ONNX format.
- Save it to disk.
- Load the model in Java via ONNX Runtime.
- Run inference.
Links:
Reasons:
- Low length (0.5):
- No code block (0.5):
- Low reputation (0.5):
Posted by: Waleed