Drop-in, local AI alternative to the OpenAI stack. Multi-engine (llama.cpp, TensorRT-LLM). Powers 👋 Jan
Updated Jun 13, 2024 · C++
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models locally, and with Ollama and OpenAI models remotely.
Tools for easing the handoff between AI/ML and App/SRE teams.
Making offline AI models accessible to all types of edge devices.
Gradio-based tool to run open-source LLMs directly from Hugging Face
The Natural Language Shell integrates OpenAI's GPT models, Anthropic's Claude, or local GGUF-formatted LLMs directly into the terminal, letting operators describe tasks in either POSIX commands or plain human language
A JavaScript library (with TypeScript types) for parsing the metadata of GGML-based GGUF files.
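Parsing GGUF metadata like the library above does is tractable because the format opens with a fixed header. Below is a minimal sketch in Python, assuming the documented GGUF layout (4-byte magic "GGUF", a little-endian uint32 version, then uint64 tensor and metadata key-value counts); the function name and returned field names are illustrative, not from any particular library:

```python
import io
import struct

def parse_gguf_header(stream):
    """Read the fixed-size GGUF file header from a binary stream."""
    # GGUF files begin with the 4-byte ASCII magic "GGUF".
    magic = stream.read(4)
    if magic != b"GGUF":
        raise ValueError("not a GGUF file")
    # Little-endian uint32 format version (3 is current at time of writing).
    (version,) = struct.unpack("<I", stream.read(4))
    # Two little-endian uint64 counts: tensors, then metadata key-value pairs.
    tensor_count, metadata_kv_count = struct.unpack("<QQ", stream.read(16))
    return {
        "version": version,
        "tensor_count": tensor_count,
        "metadata_kv_count": metadata_kv_count,
    }

# Synthetic header for demonstration: version 3, 2 tensors, 5 metadata pairs.
blob = b"GGUF" + struct.pack("<IQQ", 3, 2, 5)
print(parse_gguf_header(io.BytesIO(blob)))
# → {'version': 3, 'tensor_count': 2, 'metadata_kv_count': 5}
```

The metadata key-value pairs that follow the header are typed and variable-length, so a full parser walks them sequentially; the fixed header alone is enough to validate a file and size those loops.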
Search for anything using Google, DuckDuckGo, or phind.com. Also contains AI models, can transcribe YouTube videos, generate temporary email addresses and phone numbers, and has TTS support and WebAI (terminal GPT and Open Interpreter)
PyGPTPrompt: A CLI tool that manages context windows for AI models, facilitating user interaction and data ingestion for optimized long-term memory and task automation.
Local character AI chatbot with Chroma vector store memory and scripts to process documents for Chroma
Speech-to-Text for ROS 2, based on silero-vad and whisper.cpp (GGUF models)