Replaces generic "An unexpected error occurred" messages with descriptive,
user-friendly error messages when LLM operations fail. Errors such as invalid
API keys, wrong model names, and rate limits now surface clearly in the UI.
Adds an error classification utility, global FastAPI exception handlers, and a
frontend getApiErrorMessage() helper. Bumps the version to 1.7.2.
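
For reviewers, a minimal sketch of the backend side of this change. The names `LLMErrorKind` and `classify_llm_error`, the message-text matching, and the 502 status code are illustrative assumptions, not the actual identifiers introduced here; the real utility may classify by exception type instead.

```python
from enum import Enum

from fastapi import FastAPI, Request
from fastapi.responses import JSONResponse


class LLMErrorKind(str, Enum):
    """Hypothetical error categories; the actual utility may differ."""
    INVALID_API_KEY = "invalid_api_key"
    UNKNOWN_MODEL = "unknown_model"
    RATE_LIMITED = "rate_limited"
    UNKNOWN = "unknown"


def classify_llm_error(exc: Exception) -> tuple[LLMErrorKind, str]:
    """Map a provider exception to a category and a user-friendly message."""
    text = str(exc).lower()
    if "api key" in text or "401" in text:
        return LLMErrorKind.INVALID_API_KEY, "The provider rejected the configured API key."
    if "model" in text and ("not found" in text or "does not exist" in text):
        return LLMErrorKind.UNKNOWN_MODEL, "The configured model name was not recognized by the provider."
    if "rate limit" in text or "429" in text:
        return LLMErrorKind.RATE_LIMITED, "The provider is rate limiting requests; please try again shortly."
    return LLMErrorKind.UNKNOWN, "An unexpected error occurred while contacting the model provider."


app = FastAPI()


@app.exception_handler(Exception)
async def llm_exception_handler(request: Request, exc: Exception) -> JSONResponse:
    # Global handler: return a classified, readable message instead of a bare 500.
    kind, message = classify_llm_error(exc)
    return JSONResponse(status_code=502, content={"error": kind.value, "detail": message})
```

On the frontend, getApiErrorMessage() can then prefer a structured `detail` field from such responses and fall back to a generic message only when it is absent.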
* docs: update CHANGELOG for v1.6.0 release
* fix: improve error logging for chat model configuration issues (#358)
- Add detailed error logging in provision.py when model lookup fails
- Add warning logging in models.py when default model is not configured
- Add traceback logging in the chat router exception handler (sketched below)
- Update Ollama docs with model name configuration guidance
- Update troubleshooting docs with "Failed to send message" solutions
- Bump version to 1.6.1
* chore: update uv.lock
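
To make the #358 fix concrete, a hedged sketch of the logging pattern it describes; `lookup_model`, `get_default_model`, and `log_chat_failure` are illustrative names under assumed signatures, not the actual functions in provision.py or models.py.

```python
import logging
import traceback

logger = logging.getLogger(__name__)

DEFAULT_CHAT_MODEL: str | None = None  # stand-in for the real setting


def lookup_model(name: str, available: dict[str, object]) -> object:
    """Resolve a configured chat model, logging detail when the lookup fails."""
    model = available.get(name)
    if model is None:
        logger.error("Chat model %r not found; available models: %s", name, sorted(available))
        raise KeyError(name)
    return model


def get_default_model() -> str | None:
    """Warn when no default model is configured instead of failing silently."""
    if DEFAULT_CHAT_MODEL is None:
        logger.warning("No default chat model configured; chat requests will fail.")
    return DEFAULT_CHAT_MODEL


def log_chat_failure(exc: Exception) -> None:
    """Log the full traceback so misconfiguration is diagnosable from server logs."""
    tb = "".join(traceback.format_exception(type(exc), exc, exc.__traceback__))
    logger.error("Chat request failed: %s\n%s", exc, tb)
```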