Mirror of https://github.com/carlrobertoh/ProxyAI.git, synced 2026-04-29 12:11:26 +00:00
docs: add nextra docs

parent 4be7ba0b44
commit 33e886860c

87 changed files with 6342 additions and 0 deletions
docs/pages/tutorials/_meta.json (new file, +4)

@@ -0,0 +1,4 @@
{
  "run-deepseek-r1-locally-mac-mini-pycharm": "Run Deepseek R1 locally on a Mac Mini in PyCharm",
  "deploy-deepseek-r1-on-runpod-and-use-it-in-pycharm": "Deploy Deepseek R1 on a RunPod and use it in PyCharm"
}
@@ -0,0 +1,9 @@

# Deploy Deepseek R1 on a RunPod and use it in PyCharm

*By Laurent Meyer*

---

Deploy DeepSeek-R1 on RunPod Serverless with Docker and vLLM: run your own private LLM that processes RAG data, auto-scales, and shuts down when idle.

[Read the full article →](https://meyer-laurent.com/deploying-deepseek-r1-on-runpod-serverless-and-use-it-in-pycharm)
@@ -0,0 +1,9 @@

# Run Deepseek R1 locally on a Mac Mini in PyCharm

*By Laurent Meyer*

---

Discover how to enhance your development workflow and protect proprietary code by setting up DeepSeek R1 locally on a Mac. This guide covers installing ProxyAI, configuring Ollama, and using a local Large Language Model for secure, AI-assisted coding without relying on public LLMs.

[Read the full article →](https://meyer-laurent.com/run-deepseek-r1-locally-mac-mini-pycharm)