
Inference Engines
MNN-LLM
A device-side inference framework, including LLM inference on device (mobile phone/PC/IoT).
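As a rough illustration, the sketch below shows how on-device generation with MNN-LLM's C++ interface might look. The entry points (Llm::createLLM, load, response) and the config.json argument are assumptions based on the project's llm demo, so verify them against the headers shipped with your MNN build.

```cpp
// Minimal sketch (assumed API): load an exported model and answer one prompt
// with MNN-LLM. Check Llm::createLLM / load / response against your MNN version.
#include <iostream>
#include <memory>
#include "llm/llm.hpp"   // assumed header from MNN's transformers/llm module

using MNN::Transformer::Llm;

int main(int argc, char* argv[]) {
    if (argc < 2) {
        std::cerr << "usage: llm_sketch <path/to/config.json>\n";
        return 1;
    }
    // config.json is assumed to point at the exported weights and tokenizer.
    std::unique_ptr<Llm> llm(Llm::createLLM(argv[1]));
    llm->load();  // build the runtime; backend selection comes from the config

    // Generate a response for a single prompt; output streams to stdout.
    llm->response("Give a one-sentence summary of on-device inference.");
    return 0;
}
```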
Confidently evaluate, test, and ship LLM applications with a suite of observability tools to calibrate language model outputs across your dev and production lifecycle.