ChattyUI

    Run open-source LLMs locally in the browser using WebGPU

    Featured
    130 Votes

    Description

    Open-source, feature-rich Gemini/ChatGPT-like interface for running open-source models (Gemma, Mistral, Llama 3, etc.) locally in the browser using WebGPU. No server-side processing: your data never leaves your PC!
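
    To illustrate the in-browser approach, a chat UI like this typically delegates inference to a WebGPU runtime such as MLC's WebLLM, which downloads model weights into the browser cache and evaluates them client-side. The sketch below assumes the @mlc-ai/web-llm package and one of its prebuilt model IDs; whether ChattyUI uses this exact API is an assumption, not something stated in the listing.

        // Minimal sketch: client-side chat completion with WebLLM (assumed stack).
        // After the weights are downloaded and cached, inference runs entirely in a
        // WebGPU-capable browser; no prompt or response leaves the machine.
        import { CreateMLCEngine } from "@mlc-ai/web-llm";

        async function main() {
          // Model ID must match one of WebLLM's prebuilt configs (example, assumed).
          const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC", {
            initProgressCallback: (report) => console.log(report.text), // weight download progress
          });

          // OpenAI-style chat API, evaluated locally on the GPU via WebGPU.
          const reply = await engine.chat.completions.create({
            messages: [{ role: "user", content: "Summarize WebGPU in one sentence." }],
          });
          console.log(reply.choices[0].message.content);
        }

        main();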
