mirror of
https://github.com/TheBlewish/Automated-AI-Web-Researcher-Ollama.git
synced 2025-04-23 01:59:10 +00:00
Add Docker support
Related to #14: add Docker support to the project.

* **Dockerfile**:
  - Create a `Dockerfile` to build the application image using Python 3.9-slim as the base image.
  - Set the working directory to `/app` and copy the current directory contents into the container.
  - Install dependencies from `requirements.txt`.
  - Expose port 80 and set the entry point to run `Web-LLM.py`.
* **docker-compose.yml**:
  - Create a `docker-compose.yml` to manage multi-container applications.
  - Define a service for the application, map port 80, set up volumes, and configure networks.
* **README.md**:
  - Add Docker setup instructions for building and running the Docker image.
  - Add Docker Compose setup instructions for starting and stopping services.
This commit is contained in:
parent 3cc6efdd09
commit bd9f8edda3

3 changed files with 62 additions and 0 deletions
Dockerfile (new file, 20 additions)

@@ -0,0 +1,20 @@
# Use an official Python runtime as a parent image
FROM python:3.9-slim

# Set the working directory in the container
WORKDIR /app

# Copy the current directory contents into the container at /app
COPY . /app

# Install any needed packages specified in requirements.txt
RUN pip install --no-cache-dir -r requirements.txt

# Make port 80 available to the world outside this container
EXPOSE 80

# Define environment variable
ENV NAME World

# Run Web-LLM.py when the container launches
CMD ["python", "Web-LLM.py"]
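Since the Dockerfile above copies the entire build context with `COPY . /app`, a `.dockerignore` file can keep local caches and secrets out of the image. The following is a sketch, not part of this commit; the entries are assumptions about typical Python project clutter:

```
# Hypothetical .dockerignore (not included in this commit)
.git
__pycache__/
*.pyc
.venv/
.env
```

Without such a file, `docker build` ships everything in the repository directory, including any virtual environment, into the image layer created by `COPY`.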
README.md (28 additions)

@@ -81,6 +81,34 @@ ollama create research-phi3 -f modelfile

Note: this specific configuration is necessary because recent Ollama versions have reduced the context window on models such as phi3:3.8b-mini-128k-instruct, despite the name suggesting a high context limit. The modelfile step restores the larger context needed for the high amount of information used during the research process.
## Docker Setup

1. Build the Docker image:

```sh
docker build -t automated-ai-web-researcher .
```

2. Run the Docker container:

```sh
docker run -it --rm automated-ai-web-researcher
```

## Docker Compose Setup

1. Start the services using Docker Compose:

```sh
docker-compose up
```

2. Stop the services:

```sh
docker-compose down
```

## Usage

1. Start Ollama:
docker-compose.yml (new file, 14 additions)

@@ -0,0 +1,14 @@
version: '3.8'

services:
  web:
    build: .
    ports:
      - "80:80"
    volumes:
      - .:/app
    networks:
      - webnet

networks:
  webnet:
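`Web-LLM.py` is an interactive program (the README runs it with `docker run -it`), but `docker-compose up` does not attach stdin to the container by default. A sketch of the service keys that would keep a terminal attached under Compose; these lines are an assumption and are not part of this commit:

```yaml
# Hypothetical additions to the "web" service (not in this commit)
services:
  web:
    stdin_open: true   # equivalent of docker run -i
    tty: true          # equivalent of docker run -t
```

With these set, `docker-compose run --rm web` gives an interactive session comparable to the `docker run -it` command in the README.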