Based on our records, Ollama seems to be more popular. It has been mentioned 128 times since March 2021. We track product recommendations and mentions across various public social media platforms and blogs. These can help you identify which product is more popular and what people think of it.
Open WebUI: I use it as the user interface for my running Ollama models. - Source: dev.to / 1 day ago
Use Ollama to run LLMs like Mistral, LLaMA, or OpenChat on your machine. It’s one command to load and run. - Source: dev.to / 1 day ago
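That "one command" workflow might look like the following sketch (assuming Ollama is installed and the `mistral` model is available in its library; model names should be checked against https://ollama.com/library):

```shell
# Pull a model once, then run it; `ollama run` starts an interactive chat,
# or answers a single prompt passed as an argument.
ollama pull mistral
ollama run mistral "Summarize the benefits of running LLMs locally."

# Ollama also serves a local HTTP API (default port 11434),
# so other tools can talk to the same model.
curl http://localhost:11434/api/generate -d '{
  "model": "mistral",
  "prompt": "Hello",
  "stream": false
}'
```

The HTTP API is what front-ends such as Open WebUI connect to when they act as a user interface for locally running models.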
First of all, install Ollama from https://ollama.com/. - Source: dev.to / 8 days ago
Swap OpenAI for Mistral, Mixtral, or Gemma running locally via Ollama, for: - Source: dev.to / 13 days ago
The original example uses AWS Bedrock, but one of the great things about Spring AI is that with just a few config tweaks and dependency changes, the same code works with any other supported model. In our case, we’ll use Ollama, which will hopefully let us run locally and in CI without heavy hardware requirements 🙏. - Source: dev.to / 15 days ago
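As a rough illustration of those config tweaks, the swap to Ollama in a Spring AI project typically amounts to changing the starter dependency and a few `application.properties` entries (a sketch only; the exact dependency artifact and property names vary by Spring AI version and should be verified against its documentation):

```properties
# Point Spring AI at a locally running Ollama server
# instead of AWS Bedrock (assumes the Ollama starter dependency
# has replaced the Bedrock one in the build file).
spring.ai.ollama.base-url=http://localhost:11434
spring.ai.ollama.chat.options.model=mistral
```

Because the rest of the code is written against Spring AI's model-agnostic abstractions, it runs unchanged against whichever backend the configuration selects.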
BabyAGI - A pared-down version of Task-Driven Autonomous AI Agent
LM Studio - Discover, download, and run local LLMs
Auto-GPT - An Autonomous GPT-4 Experiment
Pinokio - Pinokio is a browser that lets you install, run, and programmatically control ANY application, automatically.
AgentGPT - Assemble, configure, and deploy autonomous AI Agents in your browser
privateGPT - Interact privately with your documents using the power of GPT, 100% privately, no data leaks