{open ? (
@@ -278,8 +278,8 @@ const MobileNav = ({ navItems, visible }: NavbarProps) => {
variant="outline"
className="flex cursor-pointer items-center gap-2 mt-4 w-full justify-center rounded-full dark:bg-white/20 dark:hover:bg-white/30 dark:text-white bg-gray-100 hover:bg-gray-200 text-gray-800 border-0"
>
-
- Sign in with Google
+
+ Sign in
)}
diff --git a/surfsense_web/content/docs/docker-installation.mdx b/surfsense_web/content/docs/docker-installation.mdx
index 6e64cd5..aac7cc7 100644
--- a/surfsense_web/content/docs/docker-installation.mdx
+++ b/surfsense_web/content/docs/docker-installation.mdx
@@ -82,8 +82,7 @@ Before you begin, ensure you have:
| -------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| DATABASE_URL | PostgreSQL connection string (e.g., `postgresql+asyncpg://postgres:postgres@localhost:5432/surfsense`) |
| SECRET_KEY | JWT Secret key for authentication (should be a secure random string) |
-| GOOGLE_OAUTH_CLIENT_ID | Google OAuth client ID obtained from Google Cloud Console |
-| GOOGLE_OAUTH_CLIENT_SECRET | Google OAuth client secret obtained from Google Cloud Console |
+| AUTH_TYPE | Authentication method: `GOOGLE` for OAuth with Google, `LOCAL` for email/password authentication |
| NEXT_FRONTEND_URL | URL where your frontend application is hosted (e.g., `http://localhost:3000`) |
| EMBEDDING_MODEL | Name of the embedding model (e.g., `openai://text-embedding-ada-002`, `anthropic://claude-v1`, `mixedbread-ai/mxbai-embed-large-v1`) |
| RERANKERS_MODEL_NAME | Name of the reranker model (e.g., `ms-marco-MiniLM-L-12-v2`) |
@@ -96,10 +95,21 @@ Before you begin, ensure you have:
| TTS_SERVICE | Text-to-Speech API provider for Podcasts (e.g., `openai/tts-1`, `azure/neural`, `vertex_ai/`). See [supported providers](https://docs.litellm.ai/docs/text_to_speech#supported-providers) |
| STT_SERVICE | Speech-to-Text API provider for Podcasts (e.g., `openai/whisper-1`). See [supported providers](https://docs.litellm.ai/docs/audio_transcription#supported-providers) |
-Include API keys for the LLM providers you're using. For example:
-- `OPENAI_API_KEY`: If using OpenAI models
-- `GEMINI_API_KEY`: If using Google Gemini models
+Include API keys for your chosen LLM providers:
+
+| ENV VARIABLE | DESCRIPTION |
+|--------------------|-----------------------------------------------------------------------------|
+| `OPENAI_API_KEY` | Required if using OpenAI models |
+| `GEMINI_API_KEY` | Required if using Google Gemini models |
+| `ANTHROPIC_API_KEY`| Required if using Anthropic models |
+
+### Google OAuth Configuration (if `AUTH_TYPE=GOOGLE`)
+
+| ENV VARIABLE | DESCRIPTION |
+|----------------------------|-----------------------------------------------------------------------------|
+| `GOOGLE_OAUTH_CLIENT_ID` | Client ID from Google Cloud Console |
+| `GOOGLE_OAUTH_CLIENT_SECRET` | Client secret from Google Cloud Console |
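+
+For example, a minimal backend `.env` when using Google OAuth might look like this (illustrative placeholder values; adjust for your environment):
+
+```bash
+AUTH_TYPE=GOOGLE
+GOOGLE_OAUTH_CLIENT_ID=your-client-id          # placeholder, from Google Cloud Console
+GOOGLE_OAUTH_CLIENT_SECRET=your-client-secret  # placeholder, from Google Cloud Console
+DATABASE_URL=postgresql+asyncpg://postgres:postgres@localhost:5432/surfsense
+SECRET_KEY=replace-with-a-secure-random-string
+NEXT_FRONTEND_URL=http://localhost:3000
+OPENAI_API_KEY=your-openai-key                 # only if using OpenAI models
+```
+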
**Optional Backend LangSmith Observability:**
| ENV VARIABLE | DESCRIPTION |
@@ -125,6 +135,7 @@ For other LLM providers, refer to the [LiteLLM documentation](https://docs.litel
| ENV VARIABLE | DESCRIPTION |
| ------------------------------- | ---------------------------------------------------------- |
| NEXT_PUBLIC_FASTAPI_BACKEND_URL | URL of the backend service (e.g., `http://localhost:8000`) |
+| NEXT_PUBLIC_FASTAPI_BACKEND_AUTH_TYPE | Must match the backend `AUTH_TYPE` value, i.e. `GOOGLE` for OAuth with Google or `LOCAL` for email/password authentication |
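+
+For example (illustrative values, matching the backend settings above):
+
+```bash
+NEXT_PUBLIC_FASTAPI_BACKEND_URL=http://localhost:8000
+NEXT_PUBLIC_FASTAPI_BACKEND_AUTH_TYPE=GOOGLE   # must match the backend AUTH_TYPE
+```
+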
2. **Build and Start Containers**
diff --git a/surfsense_web/content/docs/index.mdx b/surfsense_web/content/docs/index.mdx
index f3411b8..4845a73 100644
--- a/surfsense_web/content/docs/index.mdx
+++ b/surfsense_web/content/docs/index.mdx
@@ -47,9 +47,11 @@ See the [installation notes](https://github.com/pgvector/pgvector/tree/master#in
---
-## Google OAuth Setup
+## Google OAuth Setup (Optional)
-SurfSense user management and authentication works on Google OAuth. Lets set it up.
+SurfSense supports both Google OAuth and local email/password authentication. Google OAuth is optional; if you prefer local authentication, you can skip this section.
+
+To set up Google OAuth:
1. Login to your [Google Developer Console](https://console.cloud.google.com/)
2. Enable People API.
diff --git a/surfsense_web/content/docs/manual-installation.mdx b/surfsense_web/content/docs/manual-installation.mdx
index b3999dc..72492c1 100644
--- a/surfsense_web/content/docs/manual-installation.mdx
+++ b/surfsense_web/content/docs/manual-installation.mdx
@@ -53,25 +53,37 @@ Edit the `.env` file and set the following variables:
| -------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| DATABASE_URL | PostgreSQL connection string (e.g., `postgresql+asyncpg://postgres:postgres@localhost:5432/surfsense`) |
| SECRET_KEY | JWT Secret key for authentication (should be a secure random string) |
-| GOOGLE_OAUTH_CLIENT_ID | Google OAuth client ID |
-| GOOGLE_OAUTH_CLIENT_SECRET | Google OAuth client secret |
-| NEXT_FRONTEND_URL | Frontend application URL (e.g., `http://localhost:3000`) |
+| AUTH_TYPE | Authentication method: `GOOGLE` for OAuth with Google, `LOCAL` for email/password authentication |
+| NEXT_FRONTEND_URL | URL where your frontend application is hosted (e.g., `http://localhost:3000`) |
| EMBEDDING_MODEL | Name of the embedding model (e.g., `openai://text-embedding-ada-002`, `anthropic://claude-v1`, `mixedbread-ai/mxbai-embed-large-v1`) |
| RERANKERS_MODEL_NAME | Name of the reranker model (e.g., `ms-marco-MiniLM-L-12-v2`) |
| RERANKERS_MODEL_TYPE | Type of reranker model (e.g., `flashrank`) |
-| FAST_LLM | LiteLLM routed faster LLM (e.g., `openai/gpt-4o-mini`, `ollama/deepseek-r1:8b`) |
-| STRATEGIC_LLM | LiteLLM routed advanced LLM (e.g., `openai/gpt-4o`, `ollama/gemma3:12b`) |
-| LONG_CONTEXT_LLM | LiteLLM routed long-context LLM (e.g., `gemini/gemini-2.0-flash`, `ollama/deepseek-r1:8b`) |
-| UNSTRUCTURED_API_KEY | API key for Unstructured.io service |
-| FIRECRAWL_API_KEY | API key for Firecrawl service (if using crawler) |
+| FAST_LLM | LiteLLM routed smaller, faster LLM (e.g., `openai/gpt-4o-mini`, `ollama/deepseek-r1:8b`) |
+| STRATEGIC_LLM | LiteLLM routed advanced LLM for complex tasks (e.g., `openai/gpt-4o`, `ollama/gemma3:12b`) |
+| LONG_CONTEXT_LLM | LiteLLM routed LLM for longer context windows (e.g., `gemini/gemini-2.0-flash`, `ollama/deepseek-r1:8b`) |
+| UNSTRUCTURED_API_KEY | API key for Unstructured.io service for document parsing |
+| FIRECRAWL_API_KEY | API key for Firecrawl service for web crawling |
| TTS_SERVICE | Text-to-Speech API provider for Podcasts (e.g., `openai/tts-1`, `azure/neural`, `vertex_ai/`). See [supported providers](https://docs.litellm.ai/docs/text_to_speech#supported-providers) |
| STT_SERVICE | Speech-to-Text API provider for Podcasts (e.g., `openai/whisper-1`). See [supported providers](https://docs.litellm.ai/docs/audio_transcription#supported-providers) |
-**Important**: Since LLM calls are routed through LiteLLM, include API keys for the LLM providers you're using:
-- For OpenAI models: `OPENAI_API_KEY`
-- For Google Gemini models: `GEMINI_API_KEY`
-- For other providers, refer to the [LiteLLM documentation](https://docs.litellm.ai/docs/providers)
+Include API keys for your chosen LLM providers:
+
+| ENV VARIABLE | DESCRIPTION |
+|--------------------|-----------------------------------------------------------------------------|
+| `OPENAI_API_KEY` | Required if using OpenAI models |
+| `GEMINI_API_KEY` | Required if using Google Gemini models |
+| `ANTHROPIC_API_KEY`| Required if using Anthropic models |
+
+For other providers, refer to the [LiteLLM documentation](https://docs.litellm.ai/docs/providers).
+
+### Google OAuth Configuration (if `AUTH_TYPE=GOOGLE`)
+
+| ENV VARIABLE | DESCRIPTION |
+|----------------------------|-----------------------------------------------------------------------------|
+| `GOOGLE_OAUTH_CLIENT_ID` | Client ID from Google Cloud Console |
+| `GOOGLE_OAUTH_CLIENT_SECRET` | Client secret from Google Cloud Console |
+
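+For local email/password authentication, an illustrative backend `.env` fragment would be (placeholder values; adjust for your setup):
+
+```bash
+AUTH_TYPE=LOCAL
+# GOOGLE_OAUTH_CLIENT_ID and GOOGLE_OAUTH_CLIENT_SECRET can be omitted when AUTH_TYPE=LOCAL
+DATABASE_URL=postgresql+asyncpg://postgres:postgres@localhost:5432/surfsense
+SECRET_KEY=replace-with-a-secure-random-string
+NEXT_FRONTEND_URL=http://localhost:3000
+```
+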
**Optional Backend LangSmith Observability:**
| ENV VARIABLE | DESCRIPTION |
@@ -169,6 +181,7 @@ Edit the `.env` file and set:
| ENV VARIABLE | DESCRIPTION |
| ------------------------------- | ------------------------------------------- |
| NEXT_PUBLIC_FASTAPI_BACKEND_URL | Backend URL (e.g., `http://localhost:8000`) |
+| NEXT_PUBLIC_FASTAPI_BACKEND_AUTH_TYPE | Must match the backend `AUTH_TYPE` value, i.e. `GOOGLE` for OAuth with Google or `LOCAL` for email/password authentication |
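+
+For example, with local authentication (illustrative values):
+
+```bash
+NEXT_PUBLIC_FASTAPI_BACKEND_URL=http://localhost:8000
+NEXT_PUBLIC_FASTAPI_BACKEND_AUTH_TYPE=LOCAL    # must match the backend AUTH_TYPE
+```
+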
### 2. Install Dependencies