Inference for text-embedding models in Rust, released under the HFOIL licence.
An interactive chat project that leverages Ollama/OpenAI/MistralAI LLMs for rapid understanding and navigation of GitHub code repositories or compressed file resources.
Easily build, version, evaluate and deploy your LLM-powered apps.
A distributed multi-model LLM serving system with web UI and OpenAI-compatible RESTful APIs.
An on-device inference framework, including LLM inference on devices (mobile phone/PC/IoT).
A high-throughput and low-latency inference and serving framework for LLMs and VLMs (vision-language models).
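"OpenAI-compatible" in serving frameworks like the two above means the server can be driven with the standard OpenAI client simply by pointing it at a different base URL. The sketch below is a minimal illustration of that, assuming a local server at http://localhost:8000/v1 and a hypothetical model name; both are placeholders, not values from any specific project.

```python
# Minimal sketch: querying an OpenAI-compatible serving endpoint.
# The base URL, API key, and model name are assumptions for illustration.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # assumed local server address
    api_key="not-needed-for-local",       # many local servers ignore the key
)

response = client.chat.completions.create(
    model="my-local-model",  # hypothetical model name
    messages=[{"role": "user", "content": "Summarize this repository in one sentence."}],
)
print(response.choices[0].message.content)
```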
Test your prompts. Evaluate and compare LLM outputs, catch regressions, and improve prompt quality.
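Prompt-testing tools of this kind automate a simple loop: run each prompt variant against the same model and assert on the outputs. The sketch below is a generic illustration of that loop, not the configuration format of any particular tool; it reuses the assumed local endpoint and hypothetical model name from the previous example, and the assertion is deliberately crude.

```python
# Minimal sketch of prompt evaluation: run two prompt variants against the
# same model and flag outputs that fail a simple regression check.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-for-local")

PROMPTS = {
    "v1": "List three risks of deploying LLMs in production.",
    "v2": "Briefly list three production risks of LLM deployment.",
}

def contains_three_items(text: str) -> bool:
    # Crude check: expect at least three non-empty, list-like lines.
    return sum(1 for line in text.splitlines() if line.strip()) >= 3

for name, prompt in PROMPTS.items():
    reply = client.chat.completions.create(
        model="my-local-model",  # hypothetical model name
        messages=[{"role": "user", "content": prompt}],
    ).choices[0].message.content
    status = "PASS" if contains_three_items(reply) else "FAIL"
    print(f"{name}: {status}")
```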