A minimal LLM chat app that runs entirely in your browser
Web app for interacting with any LangGraph agent (Python & TypeScript) via a chat interface
The terminal client for Ollama
WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
The open source codebase powering HuggingChat
ChatGLM2-6B: An Open Bilingual Chat LLM
A Comprehensive Benchmark to Evaluate LLMs as Agents (ICLR'24)
LLM-based automatic question answering for local knowledge bases
Real-time NVIDIA GPU dashboard
Request recommended movies, TV shows, and anime via Jellyseerr/Overseerr