Commit graph

2 commits

rcourtman
900e05025a Fix OpenAI-compatible endpoint support for chat
Two issues fixed:

1. The custom base URL wasn't being passed to the OpenAI client in
   createProviderForModel(), so requests went to api.openai.com instead
   of the configured endpoint (e.g., LM Studio, llama.cpp)

2. Tool schemas were missing the "properties" field when tools had no
   parameters. The OpenAI API requires "properties" to always be present
   as an object, even if empty.

Fixes #1154
2026-02-03 12:03:06 +00:00
rcourtman
5ff4f97a0d feat(ai): Add native chat service with streaming and tool execution
Replace the OpenCode sidecar with a native chat service that handles:
- Real-time streaming responses from AI providers
- Multi-turn conversation sessions with history
- Tool execution with automatic function calling
- Agentic workflows for autonomous task completion
- Patrol integration for automated health analysis

The chat service directly communicates with AI providers using the
new StreamingProvider interface, eliminating the need for an external
sidecar process. Sessions are managed in-memory with configurable
history limits.
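A minimal sketch of what such an interface and in-memory session might look like. The method set and type names here (StreamingProvider, Session, Chunk) are assumptions based on the commit description, not the repository's actual API; the trimming behavior shows one plausible reading of "configurable history limits":

```go
package main

import (
	"context"
	"fmt"
)

type Message struct {
	Role    string
	Content string
}

// Chunk is one streamed fragment of a provider response.
type Chunk struct {
	Delta string
	Done  bool
}

// StreamingProvider is a hypothetical shape for the interface the
// commit describes: send the conversation, receive chunks on a
// channel until it is closed.
type StreamingProvider interface {
	Stream(ctx context.Context, messages []Message) (<-chan Chunk, error)
}

// Session keeps history in memory; Append drops the oldest messages
// once the configured limit is exceeded.
type Session struct {
	History []Message
	Limit   int
}

func (s *Session) Append(m Message) {
	s.History = append(s.History, m)
	if s.Limit > 0 && len(s.History) > s.Limit {
		s.History = s.History[len(s.History)-s.Limit:]
	}
}

func main() {
	s := &Session{Limit: 2}
	s.Append(Message{Role: "user", Content: "one"})
	s.Append(Message{Role: "assistant", Content: "two"})
	s.Append(Message{Role: "user", Content: "three"})
	fmt.Println(len(s.History), s.History[0].Content) // 2 two
}
```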

Key components:
- service.go: Main chat service with provider integration
- session.go: Session management and message history
- agentic.go: Agentic loop for autonomous tool execution
- patrol.go: Patrol-specific chat context and analysis
- tools.go: Tool execution bridge to tools package
- types.go: Chat message and event type definitions
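The agentic loop in agentic.go can be sketched roughly as follows. Everything here is hypothetical (the real code streams responses and dispatches through the tools package); the shape shown is the standard pattern: ask the model, execute any requested tool, feed the result back, and stop on a final answer or a turn cap:

```go
package main

import "fmt"

// turn is a simplified model response: either a tool request or a
// final answer (illustrative, not the repository's event types).
type turn struct {
	toolName string // non-empty means the model wants a tool call
	answer   string
}

// runAgenticLoop feeds tool results back to the model until it
// produces a final answer or maxTurns is reached.
func runAgenticLoop(model func(history []string) turn, tools map[string]func() string, maxTurns int) string {
	var history []string
	for i := 0; i < maxTurns; i++ {
		t := model(history)
		if t.toolName == "" {
			return t.answer
		}
		if tool, ok := tools[t.toolName]; ok {
			history = append(history, tool())
		} else {
			history = append(history, "error: unknown tool "+t.toolName)
		}
	}
	return "max turns reached"
}

func main() {
	// A stub "model" that calls one tool, then answers using its result.
	tools := map[string]func() string{
		"check_health": func() string { return "all nodes healthy" },
	}
	model := func(history []string) turn {
		if len(history) == 0 {
			return turn{toolName: "check_health"}
		}
		return turn{answer: "Cluster status: " + history[0]}
	}
	fmt.Println(runAgenticLoop(model, tools, 5))
}
```

The turn cap matters: without it, a model that keeps requesting tools would loop forever.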
2026-01-19 19:12:04 +00:00