Mirror of https://github.com/TheBlewish/Automated-AI-Web-Researcher-Ollama.git, synced 2025-09-01 10:10:01 +00:00
Update README.md to include checkout
This commit is contained in:
parent fd0bbc750c
commit 9930fad0ca

1 changed file with 10 additions and 4 deletions:

README.md (14 changed lines)
@@ -44,20 +44,26 @@ The key distinction is that this isn't just a chatbot—it's an automated resear
 cd Automated-AI-Web-Researcher-Ollama
 ```

-2. **Create and activate a virtual environment:**
+2. **Checkout the `feature/windows-support` branch:**
+
+```sh
+git checkout -b feature/windows-support origin/feature/windows-support
+```
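A quick, optional check that the new branch is active and tracking the remote before continuing; `git branch -vv` and `git status` are standard git commands, and the exact output varies by git version:

```sh
# The current branch should be feature/windows-support, tracking origin/feature/windows-support
git branch -vv
git status
```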
+
+3. **Create and activate a virtual environment:**

 ```sh
 python -m venv venv
 source venv/bin/activate
 ```
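The activation command shown above is the POSIX-shell form. Since this branch exists to add Windows support, it is worth noting the usual Windows equivalents for a venv created with `python -m venv venv` (standard Python behaviour, not something specific to this repository):

```sh
# Windows Command Prompt
venv\Scripts\activate.bat

# Windows PowerShell (may first require: Set-ExecutionPolicy -Scope CurrentUser RemoteSigned)
venv\Scripts\Activate.ps1
```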
-3. **Install dependencies:**
+4. **Install dependencies:**

 ```sh
 pip install -r requirements.txt
 ```
-4. **Install and configure Ollama:**
+5. **Install and configure Ollama:**

 Install Ollama following the instructions at [https://ollama.ai](https://ollama.ai).
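With Ollama installed, the model this README later refers to can be downloaded and the local server started. These are standard Ollama CLI commands; the model tag is taken from the note below, so substitute whatever model you actually plan to use:

```sh
# Fetch the model named in the note below
ollama pull phi3:3.8b-mini-128k-instruct

# Start the Ollama server if it is not already running as a background service
ollama serve
```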
@@ -81,7 +87,7 @@ The key distinction is that this isn't just a chatbot—it's an automated resear

 **Note:** This configuration is necessary because recent Ollama versions have reduced the context window on models such as `phi3:3.8b-mini-128k-instruct`, despite the name suggesting a large context; the `modelfile` step restores a context window big enough for the large amount of information used during the research process.
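The `modelfile` step mentioned in the note sits outside this hunk. For orientation, the sketch below shows the general Ollama Modelfile pattern for raising a model's context window; the model tag, the `num_ctx` value, and the `research-phi3` name are illustrative placeholders, not values taken from this repository:

```sh
# Contents of a file named "modelfile" (illustrative values):
#   FROM phi3:3.8b-mini-128k-instruct
#   PARAMETER num_ctx 38000
#
# Build a named model from that file:
ollama create research-phi3 -f modelfile
```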
-5. Go to the llm_config.py file which should have an ollama section that looks like this:
+6. Go to the llm_config.py file which should have an ollama section that looks like this:

 ```sh
 LLM_CONFIG_OLLAMA = {