Update project configuration and structure with Docker support and environment variable adjustments

- Updated .env.example to reflect new LLM configuration with Aliyun's API.
- Expanded .gitignore to exclude additional sensitive files, directories, and build artifacts.
- Added docker-compose.yml for streamlined deployment of backend and frontend services.
- Introduced Dockerfiles for both backend and frontend to facilitate containerized builds.
- Created README.md to provide comprehensive project documentation and setup instructions.
- Established nginx configuration for frontend to support API proxying and static file serving.
This commit is contained in:
666ghj 2025-12-17 18:17:40 +08:00
parent e2b1a1554d
commit e432e223df
10 changed files with 517 additions and 36 deletions


@@ -3,8 +3,8 @@ ZEP_API_KEY=your_zep_api_key_here
# ===== General LLM configuration =====
LLM_API_KEY=your_api_key_here
LLM_BASE_URL=https://api.openai.com/v1
LLM_MODEL_NAME=gpt-4o-mini
LLM_BASE_URL=https://dashscope.aliyuncs.com/compatible-mode/v1
LLM_MODEL_NAME=qwen-plus
# ===== Boost LLM configuration (optional) =====
LLM_BOOST_API_KEY=your_boost_api_key_here

.gitignore vendored

@@ -1,24 +1,50 @@
# OS
.DS_Store
Thumbs.db
# Environment variables (protect sensitive data)
.env
.env.local
.env.development.local
.env.test.local
.env.production.local
.env.*.local
.env.development
.env.test
.env.production
.env.local
.env.development.local
.env.test.local
.env.production.local
__pycache__/
.vscode
.idea
.pytest_cache
.pytest_cache
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
.venv/
venv/
ENV/
.eggs/
*.egg-info/
dist/
build/
# Node.js
node_modules/
npm-debug.log*
yarn-debug.log*
yarn-error.log*
# IDE
.vscode/
.idea/
*.swp
*.swo
# Tests
.pytest_cache/
.coverage
htmlcov/
# Cursor
.cursor/
# Docs and test programs
mydoc/
mytest/
@@ -27,4 +53,7 @@ backend/logs/
*.log
# Uploaded files
backend/uploads/
backend/uploads/
# Docker data
data/

README.md Normal file

@@ -0,0 +1,196 @@
# MiroFish 🐟

**A simple, general-purpose swarm-intelligence engine for predicting anything**

MiroFish is a multi-agent social-media opinion simulation platform. It simulates user behavior on platforms such as Twitter/Reddit and predicts how public opinion will evolve.

## 📁 Project Structure

```
MiroFish/
├── backend/            # Flask backend service
│   ├── app/            # Core application code
│   ├── scripts/        # OASIS simulation scripts
│   ├── requirements.txt
│   └── run.py          # Backend entry point
├── frontend/           # Vue 3 frontend
│   ├── src/
│   ├── package.json
│   └── vite.config.js
├── .env.example        # Example environment variables
├── docker-compose.yml  # Docker deployment config
├── package.json        # Root-level launch scripts
└── README.md
```
---

## 🚀 Quick Start

### Prerequisites

- **Python 3.11+**
- **Node.js 18+**
- **[uv](https://docs.astral.sh/uv/)** (Python package manager)

Install uv:

```bash
# macOS/Linux
curl -LsSf https://astral.sh/uv/install.sh | sh

# Windows
powershell -c "irm https://astral.sh/uv/install.ps1 | iex"
```

### Configure Environment Variables

```bash
# Copy the example config
cp .env.example .env

# Edit the .env file and fill in the required API keys
```

Required environment variables:

```env
# LLM configuration (any OpenAI-compatible LLM)
LLM_API_KEY=your_api_key
LLM_BASE_URL=https://api.openai.com/v1
LLM_MODEL_NAME=gpt-4o-mini

# Zep Cloud configuration
ZEP_API_KEY=your_zep_api_key
```
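The backend presumably reads these variables at startup. Below is a minimal sketch of that pattern; the `load_llm_config` helper and its defaults are illustrative, not the project's actual code, though the defaults mirror the documented values.

```python
import os

# Illustrative defaults matching the documented fallback values.
DEFAULTS = {
    "LLM_BASE_URL": "https://api.openai.com/v1",
    "LLM_MODEL_NAME": "gpt-4o-mini",
}

def load_llm_config(env=os.environ):
    """Collect LLM settings, failing fast when the required key is missing."""
    api_key = env.get("LLM_API_KEY")
    if not api_key:
        raise RuntimeError("LLM_API_KEY is not configured - check your .env file")
    return {
        "api_key": api_key,
        "base_url": env.get("LLM_BASE_URL", DEFAULTS["LLM_BASE_URL"]),
        "model": env.get("LLM_MODEL_NAME", DEFAULTS["LLM_MODEL_NAME"]),
    }
```

With `python-dotenv` (which the project depends on), `load_dotenv()` would populate `os.environ` from `.env` before this runs.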
---

## 📦 Deployment Option 1: From Source (recommended for development)

Uses `concurrently` to start the backend and frontend together; **cross-platform** (Windows/macOS/Linux).

### 1. Install dependencies

```bash
# One-shot install of all dependencies (root + frontend + backend)
npm run setup:all
```

Or step by step:

```bash
# Install Node dependencies (root + frontend)
npm run setup

# Install Python dependencies (creates a virtual environment automatically)
npm run setup:backend
```

### 2. Start the services

```bash
# Start backend and frontend together (run from the project root)
npm run dev
```

Service URLs:

- Frontend: `http://localhost:3000`
- Backend API: `http://localhost:5001`

### Starting individually

```bash
# Backend only
npm run backend

# Frontend only
npm run frontend
```
---

## 🐳 Deployment Option 2: Docker (recommended for production)

### Prerequisites

- Docker 20.10+
- Docker Compose v2+

### Start the services

```bash
# Build and start all services
docker compose up -d

# Tail the logs
docker compose logs -f

# Stop the services
docker compose down
```

Service URLs:

- Frontend: `http://localhost:3000`
- Backend API: `http://localhost:5001`

### Building images only

```bash
# Build the backend image
docker build -t mirofish-backend ./backend

# Build the frontend image
docker build -t mirofish-frontend ./frontend
```
---

## 🛠 Tech Stack

### Backend

- **Framework**: Flask 3.x
- **LLM calls**: OpenAI SDK
- **Graph storage**: Zep Cloud
- **Simulation engine**: OASIS (camel-oasis)

### Frontend

- **Framework**: Vue 3 + Composition API
- **Build tool**: Vite
- **Visualization**: D3.js
- **HTTP client**: Axios

---

## ⚙️ Environment Variables

| Variable | Required | Description | Default |
|----------|----------|-------------|---------|
| `LLM_API_KEY` | ✅ | LLM API key | - |
| `LLM_BASE_URL` | ❌ | LLM API endpoint | `https://api.openai.com/v1` |
| `LLM_MODEL_NAME` | ❌ | Model name | `gpt-4o-mini` |
| `ZEP_API_KEY` | ✅ | Zep Cloud API key | - |
| `FLASK_DEBUG` | ❌ | Debug mode | `true` |
| `FLASK_HOST` | ❌ | Backend bind address | `0.0.0.0` |
| `FLASK_PORT` | ❌ | Backend port | `5001` |
---

## 🐛 FAQ

### Q: The backend fails to start with "LLM_API_KEY 未配置" (LLM_API_KEY not configured)

A: Make sure the `.env` file is in the project root and contains a valid API key.

### Q: The frontend cannot reach the backend

A: Check that the backend is running on port 5001; the frontend dev server automatically proxies `/api/*` requests.

### Q: The OASIS simulation fails to start

A: Make sure the `camel-oasis` and `camel-ai` dependencies are installed and the LLM API configuration is correct.

### Q: Activating the Python virtual environment fails on Windows

A: Use `.venv\Scripts\activate` instead of `source .venv/bin/activate`.

---

## 📄 License

MIT License
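For the OASIS dependency question in the FAQ, a quick way to verify the packages are importable from Python is sketched below. The `missing_deps` helper is illustrative, and it assumes `camel` and `oasis` are the import names of `camel-ai` and `camel-oasis` respectively.

```python
import importlib.util

def missing_deps(mods=("camel", "oasis")):
    """Return the names from `mods` that are not importable in this environment."""
    return [m for m in mods if importlib.util.find_spec(m) is None]

# Example: print any missing simulation dependencies.
if __name__ == "__main__":
    missing = missing_deps()
    if missing:
        print(f"Missing dependencies: {missing} - run npm run setup:backend")
```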

backend/Dockerfile Normal file

@@ -0,0 +1,59 @@
# MiroFish Backend Dockerfile
# Multi-stage build for a smaller image

# ============= Build Stage =============
FROM python:3.11-slim AS builder

WORKDIR /app

# Install build dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    build-essential \
    && rm -rf /var/lib/apt/lists/*

# Copy the dependency manifest
COPY requirements.txt .

# Create a virtual environment and install dependencies
RUN python -m venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
RUN pip install --no-cache-dir --upgrade pip && \
    pip install --no-cache-dir -r requirements.txt

# ============= Production Stage =============
FROM python:3.11-slim

WORKDIR /app

# Install runtime dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Copy the virtual environment from the builder
COPY --from=builder /opt/venv /opt/venv
ENV PATH="/opt/venv/bin:$PATH"

# Copy application code
COPY app/ ./app/
COPY scripts/ ./scripts/
COPY run.py .

# Create upload directories
RUN mkdir -p uploads/simulations uploads/reports

# Environment variables
ENV PYTHONUNBUFFERED=1
ENV FLASK_HOST=0.0.0.0
ENV FLASK_PORT=5001

# Expose the port
EXPOSE 5001

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:5001/health || exit 1

# Start command
CMD ["python", "run.py"]

backend/pyproject.toml Normal file

@@ -0,0 +1,53 @@
[project]
name = "mirofish-backend"
version = "1.0.0"
description = "MiroFish - a simple, general-purpose swarm-intelligence engine for predicting anything"
readme = "README.md"
requires-python = ">=3.11"
license = { text = "MIT" }
authors = [
    { name = "MiroFish Team" }
]
dependencies = [
    # Core framework
    "flask>=3.0.0",
    "flask-cors>=6.0.0",
    # LLM
    "openai>=1.0.0",
    # Zep Cloud
    "zep-cloud==3.13.0",
    # OASIS social-media simulation
    "camel-oasis==0.2.5",
    "camel-ai==0.2.78",
    # File handling
    "PyMuPDF>=1.24.0",
    # Utilities
    "python-dotenv>=1.0.0",
    "pydantic>=2.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
    "pipreqs>=0.5.0",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.uv]
dev-dependencies = [
    "pytest>=8.0.0",
    "pytest-asyncio>=0.23.0",
]

[tool.hatch.build.targets.wheel]
packages = ["app"]

backend/requirements.txt

@@ -1,31 +1,32 @@
# Flask framework
# ===========================================
# MiroFish Backend Dependencies
# ===========================================
# Python 3.11+ required
# Install: pip install -r requirements.txt
# ===========================================
# ============= Core framework =============
flask>=3.0.0
flask-cors>=4.0.0
flask-cors>=6.0.0
# Zep Cloud SDK
zep-cloud>=2.0.0
# OpenAI SDK (for LLM calls)
# ============= LLM =============
# OpenAI SDK (all LLM calls use the OpenAI-compatible format)
openai>=1.0.0
# PDF processing
# ============= Zep Cloud =============
zep-cloud==3.13.0
# ============= OASIS social-media simulation =============
# OASIS social simulation framework
camel-oasis==0.2.5
camel-ai==0.2.78
# ============= File handling =============
PyMuPDF>=1.24.0
# Environment variables
# ============= Utilities =============
# Environment variable loading
python-dotenv>=1.0.0
# Data validation
pydantic>=2.0.0
# File handling
werkzeug>=3.0.0
# OASIS social-media simulation framework
oasis-ai>=0.1.0
camel-ai>=0.2.0
# LangChain (for the Report Agent)
langchain>=0.2.0
langchain-core>=0.2.0
langchain-openai>=0.1.0

docker-compose.yml Normal file

@@ -0,0 +1,44 @@
version: '3.8'

services:
  # Backend service
  backend:
    build:
      context: ./backend
      dockerfile: Dockerfile
    container_name: mirofish-backend
    ports:
      - "5001:5001"
    environment:
      - FLASK_HOST=0.0.0.0
      - FLASK_PORT=5001
      - FLASK_DEBUG=false
    env_file:
      - .env
    volumes:
      # Persist uploaded files and simulation data
      - ./data/uploads:/app/uploads
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:5001/health"]
      interval: 30s
      timeout: 10s
      retries: 3

  # Frontend service
  frontend:
    build:
      context: ./frontend
      dockerfile: Dockerfile
      args:
        - VITE_API_BASE_URL=http://backend:5001
    container_name: mirofish-frontend
    ports:
      - "3000:80"
    depends_on:
      - backend
    restart: unless-stopped

networks:
  default:
    name: mirofish-network

frontend/Dockerfile Normal file

@@ -0,0 +1,34 @@
# MiroFish Frontend Dockerfile
# Multi-stage build: Node.js build + Nginx serve

# ============= Build Stage =============
FROM node:20-alpine AS builder

WORKDIR /app

# Copy package manifests
COPY package*.json ./

# Install dependencies
RUN npm ci

# Copy the source code
COPY . .

# Build for production
RUN npm run build

# ============= Production Stage =============
FROM nginx:alpine

# Copy the Nginx config
COPY nginx.conf /etc/nginx/conf.d/default.conf

# Copy the build output
COPY --from=builder /app/dist /usr/share/nginx/html

# Expose the port
EXPOSE 80

# Start Nginx
CMD ["nginx", "-g", "daemon off;"]

frontend/nginx.conf Normal file

@@ -0,0 +1,45 @@
server {
    listen 80;
    server_name localhost;

    root /usr/share/nginx/html;
    index index.html;

    # Gzip compression
    gzip on;
    gzip_types text/plain text/css application/json application/javascript text/xml application/xml;
    gzip_min_length 1000;

    # Vue Router history-mode support
    location / {
        try_files $uri $uri/ /index.html;
    }

    # Proxy API requests to the backend
    location /api/ {
        proxy_pass http://backend:5001;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        proxy_cache_bypass $http_upgrade;

        # Timeouts (simulations can take a while)
        proxy_connect_timeout 60s;
        proxy_send_timeout 300s;
        proxy_read_timeout 300s;
    }

    # Health-check proxy
    location /health {
        proxy_pass http://backend:5001/health;
    }

    # Static asset caching
    location ~* \.(js|css|png|jpg|jpeg|gif|ico|svg|woff|woff2)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }
}

package.json Normal file

@@ -0,0 +1,20 @@
{
  "name": "mirofish",
  "version": "1.0.0",
  "description": "MiroFish - a simple, general-purpose swarm-intelligence engine for predicting anything",
  "scripts": {
    "setup": "npm install && cd frontend && npm install",
    "setup:backend": "cd backend && uv venv && uv pip install -r requirements.txt",
    "setup:all": "npm run setup && npm run setup:backend",
    "dev": "concurrently -n \"backend,frontend\" -c \"yellow,cyan\" \"npm run backend\" \"npm run frontend\"",
    "backend": "cd backend && uv run python run.py",
    "frontend": "cd frontend && npm run dev",
    "build": "cd frontend && npm run build"
  },
  "devDependencies": {
    "concurrently": "^9.1.2"
  },
  "engines": {
    "node": ">=18.0.0"
  }
}