An open-source GPU cluster manager for running LLMs
An interactive chat project that uses Ollama, OpenAI, or MistralAI LLMs to quickly understand and navigate GitHub code repositories or compressed archives.
Create, deploy, and operate Python-based Actions anywhere to enhance your AI agents and assistants. Batteries included, with an extensive set of libraries, helpers, and logging.
Seamlessly integrate LLMs as Python functions (a sketch of this pattern appears below).
A simple API for deploying any RAG pipeline or LLM you want, extensible with plugins (a minimal deployment sketch appears below).
Data integration platform for LLMs.
Building applications with LLMs through composability (a composition sketch appears below).
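As a rough illustration of the "LLMs as Python functions" idea above: the project itself is not named in this list, so the following is not its actual API, only a minimal sketch of wrapping a chat-completion call in an ordinary Python function. It assumes the openai package is installed and OPENAI_API_KEY is set in the environment.

```python
# Hypothetical sketch (not the listed project's API): an LLM call exposed
# as a plain Python function. Assumes `pip install openai` and OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

def summarize(text: str, max_words: int = 50) -> str:
    """Return a short summary of `text` produced by the model."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; swap in whatever you use
        messages=[
            {"role": "system", "content": f"Summarize the user's text in at most {max_words} words."},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize("GPU cluster managers schedule LLM replicas across nodes and expose a single inference endpoint."))
```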
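For the "simple API for deploying any RAG pipeline or LLM" entry, the concrete project is again not named here, so this is only a generic sketch of putting a model behind a small HTTP endpoint, assuming fastapi, uvicorn, and the openai package with an API key configured.

```python
# Hypothetical sketch (not the listed project's interface): serving an LLM
# behind a simple HTTP API. Assumes fastapi, uvicorn, openai, OPENAI_API_KEY.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()

class Query(BaseModel):
    question: str

@app.post("/chat")
def chat(query: Query) -> dict:
    """Forward the question to the model and return its answer."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": query.question}],
    )
    return {"answer": response.choices[0].message.content}

# Run with: uvicorn app:app --reload
```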
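The "composability" tagline matches LangChain's; assuming that is the project being described, a minimal composition pipes a prompt template into a model and an output parser. This requires langchain-core and langchain-openai plus an OpenAI key.

```python
# Minimal composability sketch, assuming LangChain (langchain-core,
# langchain-openai) and OPENAI_API_KEY in the environment.
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

prompt = ChatPromptTemplate.from_template("Explain {topic} in one short paragraph.")
model = ChatOpenAI(model="gpt-4o-mini")
parser = StrOutputParser()

# The pipe operator composes the three components into a single runnable chain.
chain = prompt | model | parser

if __name__ == "__main__":
    print(chain.invoke({"topic": "retrieval-augmented generation"}))
```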