FrankLaskey@lemmy.ml to Open Source@lemmy.ml · re: a FOSS GenAI tool for daily usage · 11 days ago

I would check out Open WebUI, which can be self-hosted via Docker and configured with any OpenAI-compatible endpoint, so you can use a service like OpenRouter to run nearly any LLM remotely. Most of the open-weights models like Qwen 3 or Kimi K2 Thinking are great, cost pennies per query, and can be configured with Zero Data Retention (ZDR) so your data isn't stored. You could also use something like Ollama to run local LLMs if you want even more privacy and have the hardware (typically a modern Nvidia GPU with at least 16–24 GB of VRAM).
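For the remote-endpoint route, a minimal sketch of that setup, assuming Open WebUI's documented `OPENAI_API_BASE_URL` / `OPENAI_API_KEY` environment variables and OpenRouter's OpenAI-compatible API (the port mapping and volume name here are just illustrative defaults):

```shell
# Run Open WebUI in Docker, pointed at OpenRouter's OpenAI-compatible endpoint.
# Replace sk-or-... with your own OpenRouter API key.
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -e OPENAI_API_BASE_URL="https://openrouter.ai/api/v1" \
  -e OPENAI_API_KEY="sk-or-..." \
  -v open-webui:/app/backend/data \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```

Then open http://localhost:3000 and pick whichever model OpenRouter exposes through that endpoint.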