
Open WebUI Desktop



Your AI, right on your desktop. Open WebUI as a native app. Run models locally or connect to any server. No Docker, no terminal, no setup. Download, launch, chat.

Warning

Early Alpha. Things move fast and stuff might break. Report bugs or come hang out on Discord.

Download

Platform Installer
macOS (Apple Silicon) Download .dmg
macOS (Intel) Download .dmg
Windows x64 Download .exe
Linux (AppImage) Download .AppImage
Linux (Debian/Ubuntu) Download .deb
Linux (Snap) Download .snap
Linux (Flatpak) Download .flatpak

Internet required on first launch. After that, everything works offline. All releases →

How It Works

🖥️ Run locally. The app runs Open WebUI on your machine. You can optionally enable the built-in llama.cpp engine to download and run models offline. Nothing leaves your computer.

☁️ Connect remotely. Point the app at any Open WebUI server. Switch between multiple connections from the sidebar.

Use both at the same time.

Highlights

  • Spotlight. Hit Shift+Cmd+I (macOS) or Shift+Ctrl+I (Windows/Linux) to summon a floating chat bar over whatever you're doing. Drag to screenshot anything on screen.
  • 🎙️ Voice input. System-wide push-to-talk. Press the shortcut from any app to record, and your speech is transcribed and sent to your chat automatically.
  • 🧠 Local inference. Optionally run models entirely on your hardware via the built-in llama.cpp engine. Your data never leaves your machine.
  • 🎯 One-click setup. Launch and connect to a server in seconds. Local models can be enabled from the settings.
  • 🔌 Multiple connections. Juggle servers and switch between them instantly.
  • 🔄 Auto-updates. New releases land in the background.
  • 📡 Offline-ready. No internet needed after initial setup.
  • 💻 Cross-platform. macOS, Windows, and Linux.

System Requirements

          Local Models                          Remote Only
Disk      5 GB+                                 ~500 MB
RAM       16 GB+                                4 GB
OS        macOS 12+, Windows 10+, modern Linux  Same

Note

Local models need serious RAM (7B ≈ 8 GB, 13B ≈ 16 GB). Lighter machine? Connect to a remote server instead.
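The rule of thumb above can be sketched as a quick calculation. This is a rough estimate assuming roughly 8-bit quantization plus runtime overhead, not the app's actual sizing logic:

```python
def estimated_ram_gb(params_billion: float,
                     bytes_per_param: float = 1.0,
                     overhead_factor: float = 1.2) -> float:
    """Back-of-envelope RAM estimate for a quantized local model.

    bytes_per_param ~1.0 corresponds to 8-bit quantization; the
    overhead factor loosely covers the KV cache and runtime buffers.
    """
    return params_billion * bytes_per_param * overhead_factor

print(round(estimated_ram_gb(7)))   # ~8 GB for a 7B model
print(round(estimated_ram_gb(13)))  # ~16 GB for a 13B model
```

Heavier quantization (e.g. 4-bit) halves the weight footprint, so treat these numbers as an upper-ish bound for 8-bit models.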

Privacy

No telemetry. No tracking. No phone-home. Your conversations stay on your machine. Period.

Community

  • 💬 Discord - Come hang out
  • 🐛 Issues - Report bugs or request features
  • 🌐 Open WebUI - The main project
  • 📖 Docs - Full documentation

Contributing

npm install   # install dependencies
npm run dev   # launch the app in development mode
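To go from a fresh checkout to a packaged installer, the flow looks roughly like this. The repository URL and the `build` script name are assumptions — check package.json and electron-builder.yml for the exact targets:

```shell
# Clone the repository (URL assumed; use the actual repo URL)
git clone https://github.com/open-webui/desktop.git
cd desktop

# Install dependencies and run in development mode
npm install
npm run dev

# Package platform installers via electron-builder
# (script name assumed; there may be per-platform targets like build:mac)
npm run build
```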

See CHANGELOG.md for release history. Licensed under AGPL-3.0.