local-llm
Here are 46 public repositories matching this topic...
Harness LLMs with Multi-Agent Programming
Updated Jun 13, 2024 - Python
A simple "Be My Eyes" web app with a llama.cpp/llava backend
Updated Nov 28, 2023 - JavaScript
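For reference, the core of a "Be My Eyes"-style flow is a single multimodal chat request. A minimal sketch in Python, assuming a local llama.cpp server running a LLaVA model and exposing its OpenAI-compatible endpoint on the default port 8080; the image path and prompt are placeholders:

```python
# Minimal sketch: describe a local image via a llama.cpp server with a LLaVA
# model. Assumes the server exposes the OpenAI-compatible chat endpoint on
# the default port 8080; adjust the URL for your setup.
import base64
import requests

with open("photo.jpg", "rb") as f:  # hypothetical input image
    image_b64 = base64.b64encode(f.read()).decode()

resp = requests.post(
    "http://localhost:8080/v1/chat/completions",
    json={
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image for a blind user."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }],
    },
    timeout=120,
)
print(resp.json()["choices"][0]["message"]["content"])
```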
Chrome extension to summarize or chat with web pages and local documents using locally running LLMs. Keep all of your data and conversations private. 🔐
Updated Jun 11, 2024 - TypeScript
Your fully proficient, AI-powered, local chatbot assistant 🤖
Updated Jun 12, 2024 - Python
A Python package for developing AI applications with local LLMs.
Updated Jun 1, 2024 - Python
OpenAI-style, fast & lightweight local language model inference with documents
Updated Mar 19, 2024 - Python
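"OpenAI-style" means any OpenAI client library can talk to the local server. A minimal sketch with the official openai Python package; the base_url, port, and model name are assumptions to replace with whatever your local server reports:

```python
# Minimal sketch: query a local OpenAI-compatible server with the official
# openai client. base_url and model name are placeholders for your setup;
# local servers typically ignore the API key.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="local-model",  # placeholder: use the name your server serves
    messages=[{"role": "user", "content": "Summarize this document in one line."}],
)
print(resp.choices[0].message.content)
```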
Structured inference with Llama 2 in your browser
Updated May 23, 2024 - TypeScript
Recipes for on-device voice AI and local LLM
Updated Jun 11, 2024 - Java
Infinite Craft, but in PySide6 and Python, with a local LLM (Llama 2 & others) via Ollama
Updated Apr 18, 2024 - Python
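Ollama's HTTP API reduces the element-combination step to a single POST. A rough sketch, assuming Ollama is running on its default port 11434 with a llama2 model pulled; the prompt wording is illustrative:

```python
# Minimal sketch: craft-style element combination against a local Ollama
# server. Uses Ollama's /api/generate endpoint; model tag and prompt are
# illustrative.
import requests

def combine(a: str, b: str) -> str:
    """Ask the local model what combining two elements yields."""
    resp = requests.post(
        "http://localhost:11434/api/generate",
        json={
            "model": "llama2",
            "prompt": f"In a crafting game, combining {a} and {b} makes what? "
                      f"Answer with a single word.",
            "stream": False,  # return one JSON object instead of a stream
        },
        timeout=60,
    )
    return resp.json()["response"].strip()

print(combine("water", "fire"))  # e.g. "steam"
```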
PalmHill.BlazorChat is a chat application and API built with Blazor WebAssembly, SignalR, and WebAPI, featuring real-time LLM conversations, markdown support, customizable settings, and a responsive design. This project supports Llama2 models and was tested with Orca2.
Updated Dec 20, 2023 - C#
LLM prompt augmentation with RAG: integrates external custom data from a variety of sources, allowing you to chat with those documents
Updated Mar 21, 2024 - Python
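The RAG pattern these projects share: embed document chunks, retrieve the ones nearest the question, and prepend them to the prompt. A minimal sketch using sentence-transformers and NumPy as stand-ins for whatever embedding stack a given repo actually uses:

```python
# Minimal RAG sketch: retrieve the most relevant chunk for a question, then
# build a context-grounded prompt for a local LLM. The embedding model and
# sample chunks are placeholders; the pattern, not the stack, is the point.
import numpy as np
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")

chunks = ["Invoices are due within 30 days.", "Refunds require a receipt."]
chunk_vecs = embedder.encode(chunks, normalize_embeddings=True)

question = "When are invoices due?"
q_vec = embedder.encode([question], normalize_embeddings=True)[0]

# Cosine similarity reduces to a dot product on normalized vectors.
best = chunks[int(np.argmax(chunk_vecs @ q_vec))]

prompt = f"Answer using only this context:\n{best}\n\nQuestion: {question}"
# `prompt` would now go to the local model of your choice.
print(prompt)
```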
Uchinoko Studio is a web application designed to facilitate real-time voice conversations with AI.
Updated Jun 12, 2024 - Svelte
An Offline Document Enquiry LLM for Everyone
Updated Jul 25, 2023 - Python
Use your open-source local model from the terminal
Updated Jun 1, 2024 - Python
Attempt to summarize text from `stdin`, using a large language model (locally and offline), to `stdout`
Updated Aug 21, 2023 - Rust
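The same pipe pattern, sketched in Python against a local Ollama server (the repository itself is Rust and may use a different backend entirely):

```python
#!/usr/bin/env python3
# Minimal sketch of a stdin -> summary -> stdout pipe using a local,
# offline model served by Ollama on its default port. Model tag and prompt
# are illustrative.
import sys
import requests

text = sys.stdin.read()
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama2",
        "prompt": f"Summarize the following text:\n\n{text}",
        "stream": False,
    },
    timeout=300,
)
sys.stdout.write(resp.json()["response"].strip() + "\n")
```

Usage: `cat article.txt | python summarize.py`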