feat: Add separate model selection for code completion (#1032)

* add separate Ollama model selection for code completion

closes issue #733

* add separate selection of code completion provider

closes issue #804

* add test for Ollama code completion with separate model selection

also adds the missing clearing of the code completion cache to other tests. This fixes the ProxyAI test, which previously always passed regardless of the actual ProxyAI completion behavior

* fix saving of code completion provider option in IDE settings

* fix code style

* fix: remove expected failing test
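The first two bullets describe selecting a dedicated code completion model and provider separately from the chat model. A minimal plain-JDK sketch of that selection logic, with hypothetical class and field names (not the actual ProxyAI settings classes), where an unset completion model falls back to the chat model:

```java
// Hypothetical sketch, not the actual ProxyAI classes: a settings holder with
// separate chat and code completion models, falling back to the chat model
// when no dedicated completion model is configured.
public class OllamaSettingsSketch {
    String model = "llama3";            // model used for chat requests
    String codeCompletionModel = null;  // null means: fall back to the chat model

    String resolveCodeCompletionModel() {
        return codeCompletionModel != null ? codeCompletionModel : model;
    }

    public static void main(String[] args) {
        OllamaSettingsSketch settings = new OllamaSettingsSketch();
        System.out.println(settings.resolveCodeCompletionModel()); // llama3 (fallback)
        settings.codeCompletionModel = "codellama";
        System.out.println(settings.resolveCodeCompletionModel()); // codellama
    }
}
```

The fallback keeps existing configurations working: users who never open the new setting get the same model they already use for chat.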

---------

Co-authored-by: Carl-Robert Linnupuu <carlrobertoh@gmail.com>
Viktor Muraviev, 2025-07-04 18:16:52 +03:00, committed by GitHub
parent 846ecfeddc
commit 89a5a27586
GPG key ID: B5690EEEBB952194 (no known key found for this signature in database)
15 changed files with 235 additions and 120 deletions


@@ -39,6 +39,7 @@ settings.displayName=ProxyAI: Settings
settings.openaiQuotaExceeded=OpenAI quota exceeded.
settingsConfigurable.displayName.label=Display name:
settingsConfigurable.service.label=Selected provider:
+settingsConfigurable.service.codeCompletion.label=Code completion provider:
settingsConfigurable.service.codegpt.apiKey.comment=You can find the API key in your <a href="https://tryproxy.io/account">User settings</a>.
settingsConfigurable.service.codegpt.chatCompletionModel.comment=Choose a model optimized for conversational interactions, including assistance with general queries and explanations.
settingsConfigurable.service.codegpt.codeCompletionModel.comment=Choose a model tailored for code completion-related tasks.
@@ -169,6 +170,7 @@ settingsConfigurable.prompts.exportDialog.exportError=Error exporting prompts se
settingsConfigurable.prompts.exportDialog.title=Target File
settingsConfigurable.prompts.importDialog.importError=Error importing prompts settings
settingsConfigurable.service.ollama.models.refresh=Refresh Models
+settingsConfigurable.service.ollama.codeCompletionModel.label=Model for code completion:
advancedSettingsConfigurable.displayName=ProxyAI: Advanced Settings
advancedSettingsConfigurable.proxy.title=HTTP/SOCKS Proxy
advancedSettingsConfigurable.proxy.typeComboBoxField.label=Proxy:
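The two added keys are standard Java message-bundle entries; the plugin resolves them through IntelliJ's bundle machinery at runtime. A plain-JDK sketch of how such keys are looked up, using an in-memory fragment of the bundle (in the plugin these keys live in the `.properties` file shown in the diff):

```java
import java.io.StringReader;
import java.util.PropertyResourceBundle;

public class BundleSketch {
    // Load a message bundle from an in-memory properties fragment.
    static PropertyResourceBundle load(String props) throws Exception {
        return new PropertyResourceBundle(new StringReader(props));
    }

    public static void main(String[] args) throws Exception {
        String fragment =
            "settingsConfigurable.service.codeCompletion.label=Code completion provider:\n" +
            "settingsConfigurable.service.ollama.codeCompletionModel.label=Model for code completion:\n";
        PropertyResourceBundle bundle = load(fragment);
        // Look up the new label by its key, as the settings UI would.
        System.out.println(bundle.getString("settingsConfigurable.service.codeCompletion.label"));
        // -> Code completion provider:
    }
}
```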