feat(vscode-ide-companion): add image paste support (#1978)

* feat(vscode-ide-companion): add image paste support

  - Add clipboard image paste functionality with drag-and-drop support
  - Implement image preview component with removal capability
  - Support multimodal content in ACP session manager for text and images
  - Save pasted images to temporary .gemini-clipboard directory
  - Add image attachment display in user messages
  - Update CSP to allow data: URIs for inline image display
  - Add comprehensive image utilities with size validation (max 10MB)
  - Include tests for image processing utilities
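The 10MB cap is enforced against the decoded byte size of the pasted payload rather than the length of the base64 string. A minimal self-contained sketch of that check (the helper names mirror the `imageSupport.ts` utilities added in this PR):

```typescript
// Sketch of the decoded-size check for pasted images. Accepts either a
// "data:image/png;base64,..." URL or a bare base64 string.
const MAX_IMAGE_SIZE = 10 * 1024 * 1024; // 10MB cap on a single image

function extractBase64Payload(data: string): string | null {
  const dataUrlMatch = data.match(/^data:[^;]+;base64,(.+)$/);
  const payload = (dataUrlMatch ? dataUrlMatch[1] : data).trim();
  // Reject empty payloads or anything outside the base64 alphabet.
  return payload && !/[^A-Za-z0-9+/=]/.test(payload) ? payload : null;
}

function getDecodedByteSize(base64Payload: string): number {
  // Every 4 base64 chars encode 3 bytes; '=' padding shrinks the last group.
  const padding = base64Payload.endsWith('==')
    ? 2
    : base64Payload.endsWith('=')
      ? 1
      : 0;
  return Math.floor((base64Payload.length * 3) / 4) - padding;
}

function isWithinImageSizeLimit(data: string): boolean {
  const payload = extractBase64Payload(data);
  if (!payload) return false;
  const bytes = getDecodedByteSize(payload);
  return bytes > 0 && bytes <= MAX_IMAGE_SIZE;
}
```

Validating against decoded bytes avoids rejecting images that only look oversized because of base64's ~33% inflation.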

* refactor: simplify VS Code paste image implementation

- Remove dead code and redundant error handling
- Extract common isAuthError() helper function
- Simplify SessionMessageHandler methods (80% reduction)
- Change temp directory from .gemini-clipboard to clipboard (aligned with CLI)
- Keep multimodal image sending format (type: image + base64)

Stats:
- 6 files changed
- 367 insertions (+)
- 1176 deletions (-)
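The multimodal sending path reduces to widening `sendPrompt` from `string` to `string | ContentBlock[]` and normalizing at a single point, as the ACP connection diff in this commit does. A sketch with a local stand-in for the ACP SDK's `ContentBlock` union (illustrative only, not the SDK type):

```typescript
// Local stand-in for the ACP SDK ContentBlock union (assumption for the sketch).
type ContentBlock =
  | { type: 'text'; text: string }
  | { type: 'resource_link'; name: string; mimeType: string; uri: string };

// Plain strings are wrapped as a single text block; structured prompts
// (e.g. text plus image resource links) pass through untouched.
function toPromptBlocks(prompt: string | ContentBlock[]): ContentBlock[] {
  return typeof prompt === 'string' ? [{ type: 'text', text: prompt }] : prompt;
}
```

Normalizing once at the connection boundary keeps every caller free to send either shape without duplicating the wrapping logic.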

* refactor: align paste image handling

* chore: trim paste image diff

* refactor(vscode-ide-companion): remove unused attachments logic

- Remove unused ImageAttachment type imports
- Remove attachments field from TextMessage interface
- Remove attachments from message data sent to WebView
- Clean up debug console.log statements
- Simplify SessionMessageHandler handleSendMessage method

This removes dead code from the previous image paste implementation
that was no longer needed after switching to the @path reference approach.
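Under the @path reference approach, a pasted image is saved to a temp file and referenced as `@<path>` in the prompt text, with a matching `resource_link` block sent alongside. A hypothetical sketch of the block-building step (the shape matches the `buildPromptBlocks` expectations in this PR's SessionMessageHandler test; the function body here is an assumption):

```typescript
type ContentBlock =
  | { type: 'text'; text: string }
  | { type: 'resource_link'; name: string; mimeType: string; uri: string };

interface PromptImage {
  path: string;
  name: string;
  mimeType: string;
}

// The formatted text (already embedding @/path references) leads, followed
// by one resource_link block per successfully saved image.
function buildPromptBlocks(text: string, images: PromptImage[]): ContentBlock[] {
  const blocks: ContentBlock[] = [];
  if (text) {
    blocks.push({ type: 'text', text });
  }
  for (const image of images) {
    blocks.push({
      type: 'resource_link',
      name: image.name,
      mimeType: image.mimeType,
      uri: `file://${image.path}`,
    });
  }
  return blocks;
}
```

Keeping the @path reference inside the text block means a restored session can reconstruct the image from the saved file even without the structured block.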

* refactor(vscode-ide-companion/webview): extract image handling into dedicated hooks and utils

- extract ImagePreview and ImageMessageRenderer components from App.tsx
- create useImageAttachments hook for managing image attachments
- create useImageResolution hook for image path resolution
- add imageAttachmentHandler for saving images to temp files
- add imageMessageUtils for message expansion and resolution
- add imagePathResolver for resolving image paths in webview
- integrate image resolution in useWebViewMessages
- extract shouldSendMessage utility from useMessageSubmit
- add getLocalResourceRoots in PanelManager for resource access

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
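With images attachable, submit gating can no longer key off text alone: the extracted `shouldSendMessage` must allow an image-only submit while still blocking during streaming. A sketch of the expected shape (argument names come from the call site in App.tsx; the body is a hypothetical reconstruction):

```typescript
interface ShouldSendMessageArgs {
  inputText: string;
  attachedImages: unknown[];
  isStreaming: boolean;
  isWaitingForResponse: boolean;
}

// A message may be sent when there is non-empty text OR at least one
// attached image, and no response is currently in flight.
function shouldSendMessage(args: ShouldSendMessageArgs): boolean {
  if (args.isStreaming || args.isWaitingForResponse) {
    return false;
  }
  return args.inputText.trim().length > 0 || args.attachedImages.length > 0;
}
```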

* fix: harden vscode image handling and webview hosts

* fix: remove this alias in acp connection

* feat: add path escaping utility functions and tests

* feat: add support for image attachments and improve prompt handling

* refactor(webview): Optimize editing mode switching function

* refactor(vscode-ide-companion): move path escaping utilities to local module

- Move escapePath and unescapePath functions from qwen-code-core to local utils
- Add pathEscaping.ts with shell special characters handling
- Update imports in imageFormats.ts, imageAttachmentHandler.ts, and imageMessageUtils.ts
- Add unit tests for path escaping round-trip and browser bundle verification
- Fix browser bundling issue by avoiding node-only module dependencies in webview

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
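The moved utilities backslash-escape shell-special characters so @path references containing spaces survive inside prompt text, and `unescapePath` inverts the transform. A condensed, self-contained version of the pair (mirroring the `imageSupport.ts` implementation added in this PR):

```typescript
// Characters that must be escaped in an @path reference (per imageSupport.ts).
const SHELL_SPECIAL_CHARS = /[ \t()[\]{};|*?$`'"#&<>!~]/;

function escapePath(filePath: string): string {
  let result = '';
  for (let i = 0; i < filePath.length; i += 1) {
    const char = filePath[i];
    // Count the run of backslashes immediately before this character:
    // an odd count means it is already escaped.
    let backslashCount = 0;
    for (let j = i - 1; j >= 0 && filePath[j] === '\\'; j -= 1) {
      backslashCount += 1;
    }
    const isAlreadyEscaped = backslashCount % 2 === 1;
    result +=
      !isAlreadyEscaped && SHELL_SPECIAL_CHARS.test(char) ? `\\${char}` : char;
  }
  return result;
}

function unescapePath(filePath: string): string {
  // Drop the backslash before any escaped special character.
  return filePath.replace(
    new RegExp(`\\\\([${SHELL_SPECIAL_CHARS.source.slice(1, -1)}])`, 'g'),
    '$1',
  );
}
```

Keeping the pair in the extension's own utils (rather than qwen-code-core) is what lets the webview bundle for the browser without pulling in node-only modules.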

* refactor: consolidate image handling logic across vscode-ide-companion and webui

- Merge scattered image hooks (useImageAttachments, useImageResolution, usePasteHandler) into a unified useImage hook
- Replace image utils (imageMessageUtils, imagePathResolver, imageUtils) with imageHandler and imageSupport
- Remove clipboard image storage from core package
- Consolidate webui image components into ImageComponents.tsx
- Update imports and tests to reflect new structure

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>

* chore: drop unrelated core tool changes

* test: fix webview provider mocks and drop unrelated core diffs

* fix(cli): resolve original prompt through standard path in no_command case

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>

---------

Co-authored-by: Qwen-Coder <qwen-coder@alibabacloud.com>
易良 2026-03-20 13:47:09 +08:00, committed by GitHub
parent 85ed1a801d
commit 87f03cf2e9
No known key found for this signature in database; GPG key ID: B5690EEEBB952194
26 changed files with 2188 additions and 150 deletions


@@ -867,13 +867,12 @@ export class Session implements SessionContext {
       }
     case 'no_command':
-      // No command was found or executed, use original prompt
-      return originalPrompt.map((block) => {
-        if (block.type === 'text') {
-          return { text: block.text };
-        }
-        throw new Error(`Unsupported block type: ${block.type}`);
-      });
+      // No command was found or executed, resolve the original prompt
+      // through the standard path that handles all block types
+      return this.#resolvePrompt(
+        originalPrompt,
+        new AbortController().signal,
+      );
     default: {
       // Exhaustiveness check


@@ -6,6 +6,7 @@
 import { describe, expect, it, vi } from 'vitest';
 import { RequestError } from '@agentclientprotocol/sdk';
+import type { ContentBlock } from '@agentclientprotocol/sdk';

 // AcpConnection imports AcpFileHandler which imports vscode.
 // Mock vscode so it can be resolved without the actual VS Code runtime.
@@ -66,6 +67,43 @@ describe('AcpConnection readTextFile error mapping', () => {
       requestError,
     );
   });
+
+  it('passes structured ACP prompt blocks through without wrapping them as text', async () => {
+    const prompt = vi.fn().mockResolvedValue({});
+    const onEndTurn = vi.fn();
+    const conn = new AcpConnection() as unknown as {
+      sdkConnection: {
+        prompt: (params: {
+          sessionId: string;
+          prompt: ContentBlock[];
+        }) => Promise<unknown>;
+      };
+      sessionId: string | null;
+      onEndTurn: (reason?: string) => void;
+      sendPrompt: (prompt: string | ContentBlock[]) => Promise<unknown>;
+    };
+
+    const promptBlocks: ContentBlock[] = [
+      { type: 'text', text: 'Inspect this image' },
+      {
+        type: 'resource_link',
+        name: 'pasted image.png',
+        mimeType: 'image/png',
+        uri: 'file:///tmp/pasted image.png',
+      },
+    ];
+
+    conn.sdkConnection = { prompt };
+    conn.sessionId = 'session-1';
+    conn.onEndTurn = onEndTurn;
+
+    await conn.sendPrompt(promptBlocks);
+
+    expect(prompt).toHaveBeenCalledWith({
+      sessionId: 'session-1',
+      prompt: promptBlocks,
+    });
+    expect(onEndTurn).toHaveBeenCalled();
+  });
 });

 describe('AcpConnection.isConnected', () => {


@@ -13,6 +13,7 @@ import {
 import type {
   Client,
   Agent,
+  ContentBlock,
   SessionNotification,
   RequestPermissionRequest,
   RequestPermissionResponse,
@@ -431,14 +432,16 @@ export class AcpConnection {
     return response;
   }

-  async sendPrompt(prompt: string): Promise<PromptResponse> {
+  async sendPrompt(prompt: string | ContentBlock[]): Promise<PromptResponse> {
     const conn = this.ensureConnection();
     if (!this.sessionId) {
       throw new Error('No active ACP session');
     }
+    const promptBlocks =
+      typeof prompt === 'string' ? [{ type: 'text', text: prompt }] : prompt;
     const response: PromptResponse = await conn.prompt({
       sessionId: this.sessionId,
-      prompt: [{ type: 'text', text: prompt }],
+      prompt: promptBlocks,
     });

     // Emit end-of-turn from stopReason
     if (response.stopReason) {


@@ -7,6 +7,7 @@ import { AcpConnection } from './acpConnection.js';
 import type {
   ModelInfo,
   AvailableCommand,
+  ContentBlock,
   RequestPermissionRequest,
   SessionNotification,
 } from '@agentclientprotocol/sdk';
@@ -351,7 +352,7 @@ export class QwenAgentManager {
    *
    * @param message - Message content
    */
-  async sendMessage(message: string): Promise<void> {
+  async sendMessage(message: string | ContentBlock[]): Promise<void> {
     await this.connection.sendPrompt(message);
   }


@@ -0,0 +1,161 @@
/**
 * @license
 * Copyright 2025 Qwen Team
 * SPDX-License-Identifier: Apache-2.0
 */

import { isSupportedImageMimeType } from '@qwen-code/qwen-code-core/src/utils/request-tokenizer/supportedImageFormats.js';

// ---------- Types ----------

export interface ImageAttachment {
  id: string;
  name: string;
  type: string;
  size: number;
  data: string;
  timestamp: number;
}

export interface SavedImageAttachment {
  path: string;
  name: string;
  mimeType: string;
}

// ---------- Constants ----------

export const MAX_IMAGE_SIZE = 10 * 1024 * 1024;
export const MAX_TOTAL_IMAGE_SIZE = 20 * 1024 * 1024;

// ---------- Path escaping ----------

export const SHELL_SPECIAL_CHARS = /[ \t()[\]{};|*?$`'"#&<>!~]/;

export function escapePath(filePath: string): string {
  let result = '';
  for (let i = 0; i < filePath.length; i += 1) {
    const char = filePath[i];
    let backslashCount = 0;
    for (let j = i - 1; j >= 0 && filePath[j] === '\\'; j -= 1) {
      backslashCount += 1;
    }
    const isAlreadyEscaped = backslashCount % 2 === 1;
    if (!isAlreadyEscaped && SHELL_SPECIAL_CHARS.test(char)) {
      result += `\\${char}`;
    } else {
      result += char;
    }
  }
  return result;
}

export function unescapePath(filePath: string): string {
  return filePath.replace(
    new RegExp(`\\\\([${SHELL_SPECIAL_CHARS.source.slice(1, -1)}])`, 'g'),
    '$1',
  );
}

// ---------- Image format detection ----------

const PASTED_IMAGE_MIME_TO_EXTENSION: Record<string, string> = {
  'image/bmp': '.bmp',
  'image/heic': '.heic',
  'image/jpeg': '.jpg',
  'image/jpg': '.jpg',
  'image/png': '.png',
  'image/tiff': '.tiff',
  'image/webp': '.webp',
};

const DISPLAYABLE_IMAGE_EXTENSION_TO_MIME: Record<string, string> = {
  '.bmp': 'image/bmp',
  '.gif': 'image/gif',
  '.heic': 'image/heic',
  '.heif': 'image/heif',
  '.jpeg': 'image/jpeg',
  '.jpg': 'image/jpeg',
  '.png': 'image/png',
  '.tiff': 'image/tiff',
  '.webp': 'image/webp',
};

export function isSupportedPastedImageMimeType(mimeType: string): boolean {
  return isSupportedImageMimeType(mimeType);
}

export function getImageExtensionForMimeType(mimeType: string): string {
  return PASTED_IMAGE_MIME_TO_EXTENSION[mimeType] ?? '.png';
}

export function getDisplayableImageMimeType(
  filePath: string,
): string | undefined {
  const lowerPath = filePath.toLowerCase();
  const extensionIndex = lowerPath.lastIndexOf('.');
  if (extensionIndex === -1) {
    return undefined;
  }
  return DISPLAYABLE_IMAGE_EXTENSION_TO_MIME[lowerPath.slice(extensionIndex)];
}

export function isDisplayableImagePath(filePath: string): boolean {
  return getDisplayableImageMimeType(filePath) !== undefined;
}

// ---------- Attachment validation ----------

function extractBase64Payload(data: string): string | null {
  const dataUrlMatch = data.match(/^data:[^;]+;base64,(.+)$/);
  const payload = dataUrlMatch ? dataUrlMatch[1] : data;
  const normalized = payload.trim();
  if (!normalized || /[^A-Za-z0-9+/=]/.test(normalized)) {
    return null;
  }
  return normalized;
}

function getDecodedByteSize(base64Payload: string): number {
  const padding = base64Payload.endsWith('==')
    ? 2
    : base64Payload.endsWith('=')
      ? 1
      : 0;
  return Math.floor((base64Payload.length * 3) / 4) - padding;
}

export function normalizeImageAttachment(
  attachment: ImageAttachment,
  options?: {
    maxBytes?: number;
  },
): ImageAttachment | null {
  if (!isSupportedPastedImageMimeType(attachment.type)) {
    return null;
  }
  const payload = extractBase64Payload(attachment.data);
  if (!payload) {
    return null;
  }
  const byteSize = getDecodedByteSize(payload);
  const maxBytes = options?.maxBytes ?? MAX_IMAGE_SIZE;
  if (byteSize <= 0 || byteSize > maxBytes) {
    return null;
  }
  return {
    ...attachment,
    size: byteSize,
    data: payload,
  };
}


@@ -18,7 +18,10 @@ import { useFileContext } from './hooks/file/useFileContext.js';
 import { useMessageHandling } from './hooks/message/useMessageHandling.js';
 import { useToolCalls } from './hooks/useToolCalls.js';
 import { useWebViewMessages } from './hooks/useWebViewMessages.js';
-import { useMessageSubmit } from './hooks/useMessageSubmit.js';
+import {
+  shouldSendMessage,
+  useMessageSubmit,
+} from './hooks/useMessageSubmit.js';
 import type { PermissionOption, PermissionToolCall } from '@qwen-code/webui';
 import type { TextMessage } from './hooks/message/useMessageHandling.js';
 import type { ToolCallData } from './components/messages/toolcalls/ToolCall.js';
@@ -35,6 +38,9 @@ import {
   InterruptedMessage,
   FileIcon,
   PermissionDrawer,
+  AskUserQuestionDialog,
+  ImageMessageRenderer,
+  ImagePreview,
   // Layout components imported directly from webui
   EmptyState,
   ChatHeader,
@@ -50,7 +56,7 @@ import {
   DEFAULT_TOKEN_LIMIT,
   tokenLimit,
 } from '@qwen-code/qwen-code-core/src/core/tokenLimits.js';
-import { AskUserQuestionDialog } from '@qwen-code/webui';
+import { useImagePaste, type WebViewImageMessage } from './hooks/useImage.js';

 export const App: React.FC = () => {
   const vscode = useVSCode();
@@ -89,16 +95,10 @@ export const App: React.FC = () => {
   >([]);
   const [availableModels, setAvailableModels] = useState<ModelInfo[]>([]);
   const [showModelSelector, setShowModelSelector] = useState(false);
-  const messagesEndRef = useRef<HTMLDivElement>(
-    null,
-  ) as React.RefObject<HTMLDivElement>;
+  const messagesEndRef = useRef<HTMLDivElement | null>(null);
   // Scroll container for message list; used to keep the view anchored to the latest content
-  const messagesContainerRef = useRef<HTMLDivElement>(
-    null,
-  ) as React.RefObject<HTMLDivElement>;
-  const inputFieldRef = useRef<HTMLDivElement>(
-    null,
-  ) as React.RefObject<HTMLDivElement>;
+  const messagesContainerRef = useRef<HTMLDivElement | null>(null);
+  const inputFieldRef = useRef<HTMLDivElement | null>(null);

   const [editMode, setEditMode] = useState<ApprovalModeValue>(
     ApprovalMode.DEFAULT,
@@ -284,10 +284,18 @@ export const App: React.FC = () => {
     completion.query,
   ]);

-  // Message submission
+  const { attachedImages, handleRemoveImage, clearImages, handlePaste } =
+    useImagePaste({
+      onError: (error) => {
+        console.error('Paste error:', error);
+      },
+    });
+
   const { handleSubmit: submitMessage } = useMessageSubmit({
     inputText,
     setInputText,
+    attachedImages,
+    clearImages,
     messageHandling,
     fileContext,
     skipAutoActiveContext,
@@ -297,6 +305,13 @@ export const App: React.FC = () => {
     isWaitingForResponse: messageHandling.isWaitingForResponse,
   });

+  const canSubmit = shouldSendMessage({
+    inputText,
+    attachedImages,
+    isStreaming: messageHandling.isStreaming,
+    isWaitingForResponse: messageHandling.isWaitingForResponse,
+  });
+
   // Handle cancel/stop from the input bar
   // Emit a cancel to the extension and immediately reflect interruption locally.
   const handleCancel = useCallback(() => {
@@ -813,76 +828,86 @@ export const App: React.FC = () => {
   console.log('[App] Rendering messages:', allMessages);

   // Render all messages and tool calls
-  const renderMessages = useCallback<() => React.ReactNode>(
-    () =>
-      allMessages.map((item, index) => {
-        switch (item.type) {
-          case 'message': {
-            const msg = item.data as TextMessage;
-            const handleFileClick = (path: string): void => {
-              vscode.postMessage({
-                type: 'openFile',
-                data: { path },
-              });
-            };
-            if (msg.role === 'thinking') {
-              return (
-                <ThinkingMessage
-                  key={`message-${index}`}
-                  content={msg.content || ''}
-                  timestamp={msg.timestamp || 0}
-                  onFileClick={handleFileClick}
-                />
-              );
-            }
-            if (msg.role === 'user') {
-              return (
-                <UserMessage
-                  key={`message-${index}`}
-                  content={msg.content || ''}
-                  timestamp={msg.timestamp || 0}
-                  onFileClick={handleFileClick}
-                  fileContext={msg.fileContext}
-                />
-              );
-            }
-            {
-              const content = (msg.content || '').trim();
-              if (content === 'Interrupted' || content === 'Tool interrupted') {
-                return (
-                  <InterruptedMessage key={`message-${index}`} text={content} />
-                );
-              }
-              return (
-                <AssistantMessage
-                  key={`message-${index}`}
-                  content={content}
-                  timestamp={msg.timestamp || 0}
-                  onFileClick={handleFileClick}
-                />
-              );
-            }
-          }
-          case 'in-progress-tool-call':
-          case 'completed-tool-call': {
-            return (
-              <ToolCall
-                key={`toolcall-${(item.data as ToolCallData).toolCallId}-${item.type}`}
-                toolCall={item.data as ToolCallData}
-              />
-            );
-          }
-          default:
-            return null;
-        }
-      }),
-    [allMessages, vscode],
-  );
+  const renderMessages = useCallback<() => React.ReactNode>(() => {
+    let imageIndex = 0;
+    return allMessages.map((item, index) => {
+      switch (item.type) {
+        case 'message': {
+          const msg = item.data as TextMessage;
+          const handleFileClick = (path: string): void => {
+            vscode.postMessage({
+              type: 'openFile',
+              data: { path },
+            });
+          };
+          if (msg.kind === 'image' && msg.imagePath) {
+            imageIndex += 1;
+            return (
+              <ImageMessageRenderer
+                key={`message-${index}`}
+                msg={msg as WebViewImageMessage}
+                imageIndex={imageIndex}
+              />
+            );
+          }
+          if (msg.role === 'thinking') {
+            return (
+              <ThinkingMessage
+                key={`message-${index}`}
+                content={msg.content || ''}
+                timestamp={msg.timestamp || 0}
+                onFileClick={handleFileClick}
+              />
+            );
+          }
+          if (msg.role === 'user') {
+            return (
+              <UserMessage
+                key={`message-${index}`}
+                content={msg.content || ''}
+                timestamp={msg.timestamp || 0}
+                onFileClick={handleFileClick}
+                fileContext={msg.fileContext}
+              />
+            );
+          }
+          {
+            const content = (msg.content || '').trim();
+            if (content === 'Interrupted' || content === 'Tool interrupted') {
+              return (
+                <InterruptedMessage key={`message-${index}`} text={content} />
+              );
+            }
+            return (
+              <AssistantMessage
+                key={`message-${index}`}
+                content={content}
+                timestamp={msg.timestamp || 0}
+                onFileClick={handleFileClick}
+              />
+            );
+          }
+        }
+        case 'in-progress-tool-call':
+        case 'completed-tool-call': {
+          return (
+            <ToolCall
+              key={`toolcall-${(item.data as ToolCallData).toolCallId}-${item.type}`}
+              toolCall={item.data as ToolCallData}
+            />
+          );
+        }
+        default:
+          return null;
+      }
+    });
+  }, [allMessages, vscode]);

   const hasContent =
     messageHandling.messages.length > 0 ||
@@ -1027,11 +1052,21 @@ export const App: React.FC = () => {
             }
           }}
           onAttachContext={handleAttachContextClick}
+          onPaste={handlePaste}
           completionIsOpen={completion.isOpen}
           completionItems={completion.items}
           onCompletionSelect={handleCompletionSelect}
           onCompletionFill={(item) => handleCompletionSelect(item, true)}
           onCompletionClose={completion.closeCompletion}
+          canSubmit={canSubmit}
+          extraContent={
+            attachedImages.length > 0 ? (
+              <ImagePreview
+                images={attachedImages}
+                onRemove={handleRemoveImage}
+              />
+            ) : null
+          }
           showModelSelector={showModelSelector}
           availableModels={availableModels}
           currentModelId={modelInfo?.modelId}


@@ -7,7 +7,7 @@
  * This allows local ApprovalModeValue to work with webui's EditModeInfo
  */
-import type { FC } from 'react';
+import type { ClipboardEvent, FC, ReactNode } from 'react';
 import { InputForm as BaseInputForm, getEditModeIcon } from '@qwen-code/webui';
 import type {
   InputFormProps as BaseInputFormProps,
@@ -26,6 +26,10 @@
   extends Omit<BaseInputFormProps, 'editModeInfo' | 'onCompletionFill'> {
   /** Edit mode value (local type) */
   editMode: ApprovalModeValue;
+  /** Optional paste handler forwarded to the base input */
+  onPaste?: (e: ClipboardEvent) => void;
+  /** Optional content rendered between the input and actions */
+  extraContent?: ReactNode;
   /** Completion fill callback (Tab or equivalent) */
   onCompletionFill?: (item: CompletionItem) => void;
   /** Whether to show model selector */


@@ -0,0 +1,164 @@
/**
 * @license
 * Copyright 2025 Qwen Team
 * SPDX-License-Identifier: Apache-2.0
 */

import { beforeEach, describe, expect, it, vi } from 'vitest';

const { mockProcessImageAttachments, mockShowErrorMessage } = vi.hoisted(
  () => ({
    mockProcessImageAttachments: vi.fn(),
    mockShowErrorMessage: vi.fn(),
  }),
);

vi.mock('vscode', () => ({
  window: {
    showWarningMessage: vi.fn(),
    showErrorMessage: mockShowErrorMessage,
  },
  commands: {
    executeCommand: vi.fn(),
  },
  workspace: {
    workspaceFolders: [{ uri: { fsPath: '/workspace' } }],
  },
}));

vi.mock('../utils/imageHandler.js', async (importOriginal) => {
  const actual =
    await importOriginal<typeof import('../utils/imageHandler.js')>();
  return {
    ...actual,
    processImageAttachments: mockProcessImageAttachments,
  };
});

import { SessionMessageHandler } from './SessionMessageHandler.js';

describe('SessionMessageHandler', () => {
  beforeEach(() => {
    vi.clearAllMocks();
    mockProcessImageAttachments.mockResolvedValue({
      formattedText: '',
      displayText: '',
      savedImageCount: 0,
      promptImages: [],
    });
  });

  it('does not create conversation state or send an empty prompt when all pasted images fail to materialize', async () => {
    const agentManager = {
      isConnected: true,
      currentSessionId: 'session-1',
      sendMessage: vi.fn(),
    };
    const conversationStore = {
      createConversation: vi.fn().mockResolvedValue({ id: 'conversation-1' }),
      getConversation: vi.fn().mockResolvedValue(null),
      addMessage: vi.fn(),
    };
    const sendToWebView = vi.fn();
    const handler = new SessionMessageHandler(
      agentManager as never,
      conversationStore as never,
      null,
      sendToWebView,
    );

    await handler.handle({
      type: 'sendMessage',
      data: {
        text: '',
        attachments: [
          {
            id: 'img-1',
            name: 'pasted.png',
            type: 'image/png',
            size: 3,
            data: 'data:image/png;base64,YWJj',
            timestamp: Date.now(),
          },
        ],
      },
    });

    expect(conversationStore.createConversation).not.toHaveBeenCalled();
    expect(conversationStore.addMessage).not.toHaveBeenCalled();
    expect(agentManager.sendMessage).not.toHaveBeenCalled();
    expect(sendToWebView).toHaveBeenCalledWith(
      expect.objectContaining({
        type: 'error',
        data: expect.objectContaining({
          message: expect.stringContaining('image'),
        }),
      }),
    );
  });

  it('sends formatted prompt text so session restore can reconstruct pasted images', async () => {
    mockProcessImageAttachments.mockResolvedValue({
      formattedText: '这是什么内容\n\n@/tmp/clipboard/clipboard-123.png',
      displayText: '这是什么内容\n\n@/tmp/clipboard/clipboard-123.png',
      savedImageCount: 1,
      promptImages: [
        {
          path: '/tmp/clipboard/clipboard-123.png',
          name: 'clipboard-123.png',
          mimeType: 'image/png',
        },
      ],
    });

    const agentManager = {
      isConnected: true,
      currentSessionId: 'session-1',
      sendMessage: vi.fn().mockResolvedValue(undefined),
    };
    const conversationStore = {
      createConversation: vi.fn().mockResolvedValue({ id: 'conversation-1' }),
      getConversation: vi.fn().mockResolvedValue(null),
      addMessage: vi.fn(),
    };
    const sendToWebView = vi.fn();
    const handler = new SessionMessageHandler(
      agentManager as never,
      conversationStore as never,
      null,
      sendToWebView,
    );

    await handler.handle({
      type: 'sendMessage',
      data: {
        text: '这是什么内容',
        attachments: [
          {
            id: 'img-1',
            name: 'clipboard-123.png',
            type: 'image/png',
            size: 3,
            data: 'data:image/png;base64,YWJj',
            timestamp: Date.now(),
          },
        ],
      },
    });

    expect(agentManager.sendMessage).toHaveBeenCalledWith([
      {
        type: 'text',
        text: '这是什么内容\n\n@/tmp/clipboard/clipboard-123.png',
      },
      {
        type: 'resource_link',
        name: 'clipboard-123.png',
        mimeType: 'image/png',
        uri: 'file:///tmp/clipboard/clipboard-123.png',
      },
    ]);
  });
});


@@ -7,7 +7,12 @@
 import * as vscode from 'vscode';
 import { BaseMessageHandler } from './BaseMessageHandler.js';
 import type { ChatMessage } from '../../services/qwenAgentManager.js';
+import type { ImageAttachment } from '../../utils/imageSupport.js';
 import type { ApprovalModeValue } from '../../types/approvalModeValueTypes.js';
+import {
+  processImageAttachments,
+  buildPromptBlocks,
+} from '../utils/imageHandler.js';
 import { isAuthenticationRequiredError } from '../../utils/authErrors.js';
 import { getErrorMessage } from '../../utils/errorMessage.js';
@@ -67,6 +72,7 @@ export class SessionMessageHandler extends BaseMessageHandler {
             endLine?: number;
           }
         | undefined,
+      data?.attachments as ImageAttachment[] | undefined,
     );
     break;
@@ -280,20 +286,21 @@ export class SessionMessageHandler extends BaseMessageHandler {
       startLine?: number;
       endLine?: number;
     },
+    attachments?: ImageAttachment[],
   ): Promise<void> {
     console.log('[SessionMessageHandler] handleSendMessage called with:', text);
     // Guard: do not process empty or whitespace-only messages.
     // This prevents ghost user-message bubbles when slash-command completions
     // or model-selector interactions clear the input but still trigger a submit.
     const trimmedText = text.replace(/\u200B/g, '').trim();
-    if (!trimmedText) {
+    const hasAttachments = (attachments?.length ?? 0) > 0;
+    if (!trimmedText && !hasAttachments) {
       console.warn('[SessionMessageHandler] Ignoring empty message');
       return;
     }

-    // Format message with file context if present
-    let formattedText = text;
+    let displayText = trimmedText ? text : '';
+    let promptText = text;
     if (context && context.length > 0) {
       const contextParts = context
         .map((ctx) => {
@@ -304,7 +311,28 @@ export class SessionMessageHandler extends BaseMessageHandler {
         })
         .join('\n');
-      formattedText = `${contextParts}\n\n${text}`;
+      promptText = `${contextParts}\n\n${text}`;
     }

+    const {
+      formattedText,
+      displayText: updatedDisplayText,
+      savedImageCount,
+      promptImages,
+    } = await processImageAttachments(promptText, attachments);
+    promptText = formattedText;
+    displayText = updatedDisplayText;
+
+    if (hasAttachments && !trimmedText && savedImageCount === 0) {
+      const errorMsg =
+        'Failed to attach the pasted image. Nothing was sent. Please paste the image again.';
+      console.warn('[SessionMessageHandler]', errorMsg);
+      vscode.window.showErrorMessage(errorMsg);
+      this.sendToWebView({
+        type: 'error',
+        data: { message: errorMsg },
+      });
+      return;
+    }

     // Ensure we have an active conversation
@@ -359,7 +387,8 @@ export class SessionMessageHandler extends BaseMessageHandler {
     // Generate title for first message, but only if it hasn't been set yet
     if (isFirstMessage && !this.isTitleSet) {
-      const title = text.substring(0, 50) + (text.length > 50 ? '...' : '');
+      const title =
+        displayText.substring(0, 50) + (displayText.length > 50 ? '...' : '');
       this.sendToWebView({
         type: 'sessionTitleUpdated',
         data: {
@@ -373,7 +402,7 @@ export class SessionMessageHandler extends BaseMessageHandler {
     // Save user message
     const userMessage: ChatMessage = {
       role: 'user',
-      content: text,
+      content: displayText,
       timestamp: Date.now(),
     };
@@ -382,7 +411,6 @@ export class SessionMessageHandler extends BaseMessageHandler {
       userMessage,
     );

-    // Send to WebView
     this.sendToWebView({
       type: 'message',
       data: { ...userMessage, fileContext },
@@ -445,7 +473,9 @@ export class SessionMessageHandler extends BaseMessageHandler {
       },
     });

-    await this.agentManager.sendMessage(formattedText);
+    await this.agentManager.sendMessage(
+      buildPromptBlocks(promptText, promptImages),
+    );

     // Save assistant message
     if (this.currentStreamContent && this.currentConversationId) {


@@ -10,6 +10,10 @@ export interface TextMessage {
   role: 'user' | 'assistant' | 'thinking';
   content: string;
   timestamp: number;
+  kind?: 'image';
+  imagePath?: string;
+  imageSrc?: string;
+  imageMissing?: boolean;
   fileContext?: {
     fileName: string;
     filePath: string;


@@ -21,7 +21,7 @@ interface CompletionTriggerState {
  * Based on vscode-copilot-chat's AttachContextAction
  */
 export function useCompletionTrigger(
-  inputRef: RefObject<HTMLDivElement>,
+  inputRef: RefObject<HTMLDivElement | null>,
   getCompletionItems: (
     trigger: '@' | '/',
     query: string,


@ -0,0 +1,44 @@
/**
* @license
* Copyright 2025 Qwen Team
* SPDX-License-Identifier: Apache-2.0
*/
import { build } from 'esbuild';
import { fileURLToPath } from 'node:url';
import { describe, expect, it } from 'vitest';
import { escapePath } from '../../utils/imageSupport.js';
import { splitMessageContentForImages } from './useImage.js';
describe('splitMessageContentForImages', () => {
it('restores escaped image paths with spaces back to their original file path', () => {
const imagePath = '/tmp/My Images/pasted image.png';
const escapedImageReference = `@${escapePath(imagePath)}`;
const result = splitMessageContentForImages(
`Please inspect this screenshot.\n\n${escapedImageReference}`,
);
expect(result.text).toBe('Please inspect this screenshot.');
expect(result.imagePaths).toEqual([imagePath]);
});
});
describe('useImage browser bundle', () => {
it('bundles without resolving node-only qwen-code-core modules', async () => {
const entryPoint = fileURLToPath(new URL('./useImage.ts', import.meta.url));
await expect(
build({
entryPoints: [entryPoint],
bundle: true,
format: 'esm',
logLevel: 'silent',
platform: 'browser',
write: false,
}),
).resolves.toMatchObject({
outputFiles: expect.any(Array),
});
});
});
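The test above leans on `escapePath`/`unescapePath` from `imageSupport`, whose implementations this diff does not show. A minimal sketch of the assumed behavior, backslash-escaping spaces so an `@path` reference survives whitespace-delimited parsing (the real helpers may escape additional characters):

```typescript
// Assumed behavior only: neither implementation appears in this diff.
// escapePath backslash-escapes spaces so "@/a b.png" parses as one token;
// unescapePath reverses it.
function escapePath(p: string): string {
  return p.replace(/ /g, '\\ ');
}

function unescapePath(p: string): string {
  return p.replace(/\\ /g, ' ');
}
```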


@@ -0,0 +1,501 @@
/**
* @license
* Copyright 2025 Qwen Team
* SPDX-License-Identifier: Apache-2.0
*/
import { useCallback, useRef, useState } from 'react';
import type { ImageAttachment } from '../../utils/imageSupport.js';
import {
MAX_IMAGE_SIZE,
MAX_TOTAL_IMAGE_SIZE,
isDisplayableImagePath,
isSupportedPastedImageMimeType,
getImageExtensionForMimeType,
unescapePath,
} from '../../utils/imageSupport.js';
export type { ImageAttachment };
// ======================== Message Types ========================
export interface WebViewMessageBase {
role: 'user' | 'assistant' | 'thinking';
content: string;
timestamp: number;
fileContext?: {
fileName: string;
filePath: string;
startLine?: number;
endLine?: number;
};
}
export interface WebViewImageMessage extends WebViewMessageBase {
kind: 'image';
imagePath: string;
imageSrc?: string;
imageMissing?: boolean;
}
export type WebViewMessage = WebViewMessageBase | WebViewImageMessage;
// ======================== Message Parsing ========================
interface ParsedImageReference {
imagePath: string;
start: number;
end: number;
}
function normalizeWhitespace(value: string): string {
return value
.replace(/[ \t]+/g, ' ')
.replace(/ ?\n ?/g, '\n')
.replace(/\n{3,}/g, '\n\n')
.trim();
}
export function splitMessageContentForImages(content: string): {
text: string;
imagePaths: string[];
} {
if (!content) {
return { text: '', imagePaths: [] };
}
const imageReferences = parseImageReferences(content);
if (imageReferences.length === 0) {
return { text: content, imagePaths: [] };
}
let cleanedContent = '';
let lastIndex = 0;
for (const reference of imageReferences) {
cleanedContent += content.slice(lastIndex, reference.start);
lastIndex = reference.end;
}
cleanedContent += content.slice(lastIndex);
const cleaned = normalizeWhitespace(cleanedContent);
const imagePaths = imageReferences.map((reference) => reference.imagePath);
return { text: cleaned, imagePaths };
}
function parseImageReferences(content: string): ParsedImageReference[] {
const references: ParsedImageReference[] = [];
let currentIndex = 0;
while (currentIndex < content.length) {
let atIndex = -1;
let nextSearchIndex = currentIndex;
while (nextSearchIndex < content.length) {
if (
content[nextSearchIndex] === '@' &&
(nextSearchIndex === 0 || content[nextSearchIndex - 1] !== '\\')
) {
atIndex = nextSearchIndex;
break;
}
nextSearchIndex += 1;
}
if (atIndex === -1) {
break;
}
let pathEndIndex = atIndex + 1;
let inEscape = false;
while (pathEndIndex < content.length) {
const char = content[pathEndIndex];
if (inEscape) {
inEscape = false;
} else if (char === '\\') {
inEscape = true;
} else if (/[,\s;!?()[\]{}]/.test(char)) {
break;
} else if (char === '.') {
const nextChar =
pathEndIndex + 1 < content.length ? content[pathEndIndex + 1] : '';
if (nextChar === '' || /\s/.test(nextChar)) {
break;
}
}
pathEndIndex += 1;
}
const rawReference = content.slice(atIndex, pathEndIndex);
const unescapedReference = unescapePath(rawReference);
const imagePath = unescapedReference.startsWith('@')
? unescapedReference.slice(1)
: unescapedReference;
if (isDisplayableImagePath(imagePath)) {
references.push({
imagePath,
start: atIndex,
end: pathEndIndex,
});
}
currentIndex = pathEndIndex;
}
return references;
}
export function expandUserMessageWithImages(message: WebViewMessageBase): {
messages: WebViewMessage[];
imagePaths: string[];
} {
const { text, imagePaths } = splitMessageContentForImages(message.content);
if (imagePaths.length === 0) {
return { messages: [message], imagePaths: [] };
}
const expanded: WebViewMessage[] = imagePaths.map((imagePath) => ({
role: 'user',
content: '',
timestamp: message.timestamp,
kind: 'image',
imagePath,
}));
if (text) {
expanded.push({
...message,
content: text,
});
}
return { messages: expanded, imagePaths };
}
export function applyImageResolution(
messages: WebViewMessage[],
resolutions: Map<string, string | null>,
): WebViewMessage[] {
if (messages.length === 0 || resolutions.size === 0) {
return messages;
}
let changed = false;
const next = messages.map((message) => {
if (!('kind' in message) || message.kind !== 'image') {
return message;
}
const resolved = resolutions.get(message.imagePath);
if (resolved === undefined) {
return message;
}
const imageMissing = resolved === null;
const imageSrc = resolved ?? undefined;
if (
message.imageSrc === imageSrc &&
message.imageMissing === imageMissing
) {
return message;
}
changed = true;
return {
...message,
imageSrc,
imageMissing,
};
});
return changed ? next : messages;
}
// ======================== useImagePaste ========================
async function fileToBase64(file: File | Blob): Promise<string> {
return new Promise((resolve, reject) => {
const reader = new FileReader();
reader.onload = () => resolve(reader.result as string);
reader.onerror = reject;
reader.readAsDataURL(file);
});
}
function isSupportedImage(file: File): boolean {
return isSupportedPastedImageMimeType(file.type);
}
function isWithinSizeLimit(file: File): boolean {
return file.size <= MAX_IMAGE_SIZE;
}
function formatFileSize(bytes: number): string {
if (bytes === 0) {
return '0 B';
}
const k = 1024;
const sizes = ['B', 'KB', 'MB', 'GB'];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + ' ' + sizes[i];
}
async function createImageAttachment(
file: File,
): Promise<ImageAttachment | null> {
if (!isSupportedImage(file)) {
return null;
}
if (!isWithinSizeLimit(file)) {
return null;
}
try {
const base64Data = await fileToBase64(file);
return {
id: `img_${Date.now()}_${Math.random().toString(36).substr(2, 9)}`,
name: file.name || `image_${Date.now()}`,
type: file.type,
size: file.size,
data: base64Data,
timestamp: Date.now(),
};
} catch {
return null;
}
}
function generatePastedImageName(mimeType: string): string {
const now = new Date();
const timeStr = `${now.getHours().toString().padStart(2, '0')}${now
.getMinutes()
.toString()
.padStart(2, '0')}${now.getSeconds().toString().padStart(2, '0')}`;
return `pasted_image_${timeStr}${getImageExtensionForMimeType(mimeType)}`;
}
export function useImagePaste({
onError,
}: { onError?: (error: string) => void } = {}) {
const [attachedImages, setAttachedImages] = useState<ImageAttachment[]>([]);
const processingRef = useRef(false);
const handleRemoveImage = useCallback((imageId: string) => {
setAttachedImages((prev) => prev.filter((img) => img.id !== imageId));
}, []);
const clearImages = useCallback(() => {
setAttachedImages([]);
}, []);
const handlePaste = useCallback(
async (event: React.ClipboardEvent | ClipboardEvent) => {
if (processingRef.current) {
return;
}
const clipboardData = event.clipboardData;
if (!clipboardData?.files?.length) {
return;
}
processingRef.current = true;
event.preventDefault();
event.stopPropagation();
const imageAttachments: ImageAttachment[] = [];
const errors: string[] = [];
let runningTotal = attachedImages.reduce((sum, img) => sum + img.size, 0);
try {
for (let i = 0; i < clipboardData.files.length; i++) {
const file = clipboardData.files[i];
if (!file.type.startsWith('image/')) {
continue;
}
if (!isSupportedImage(file)) {
errors.push(`Unsupported image type: ${file.type}`);
continue;
}
if (!isWithinSizeLimit(file)) {
errors.push(
`Image "${file.name || 'pasted image'}" is too large (${formatFileSize(file.size)}). Maximum size is 10MB.`,
);
continue;
}
if (runningTotal + file.size > MAX_TOTAL_IMAGE_SIZE) {
errors.push(
`Skipping image "${file.name || 'pasted image'}" total attachment size would exceed ${formatFileSize(MAX_TOTAL_IMAGE_SIZE)}.`,
);
continue;
}
try {
// Clipboard pastes default to "image.png"; generate a timestamped name instead.
const imageFile =
file.name && file.name !== 'image.png'
? file
: new File([file], generatePastedImageName(file.type), {
type: file.type,
});
const attachment = await createImageAttachment(imageFile);
if (attachment) {
imageAttachments.push(attachment);
runningTotal += attachment.size;
}
} catch {
errors.push(
`Failed to process image "${file.name || 'pasted image'}"`,
);
}
}
if (errors.length > 0) {
onError?.(errors.join('\n'));
}
if (imageAttachments.length > 0) {
setAttachedImages((prev) => [...prev, ...imageAttachments]);
}
} finally {
processingRef.current = false;
}
},
[attachedImages, onError],
);
return { attachedImages, handleRemoveImage, clearImages, handlePaste };
}
// ======================== useImageResolution ========================
export function useImageResolution({
vscode,
}: {
vscode: { postMessage: (message: unknown) => void };
}) {
const imageResolutionRef = useRef<Map<string, string | null>>(new Map());
const pendingImagePathsRef = useRef<Set<string>>(new Set());
const imageRequestIdRef = useRef(0);
const expandMessages = useCallback(
(
messages: WebViewMessageBase[],
): { messages: WebViewMessage[]; imagePaths: string[] } => {
const expanded: WebViewMessage[] = [];
const allImagePaths: string[] = [];
for (const message of messages) {
if (message.role === 'user') {
const result = expandUserMessageWithImages(message);
expanded.push(...result.messages);
allImagePaths.push(...result.imagePaths);
} else {
expanded.push(message);
}
}
return { messages: expanded, imagePaths: allImagePaths };
},
[],
);
const applyCurrentImageResolutions = useCallback(
(messages: WebViewMessage[]): WebViewMessage[] =>
applyImageResolution(messages, imageResolutionRef.current),
[],
);
const requestImageResolutions = useCallback(
(imagePaths: string[]) => {
if (imagePaths.length === 0) {
return;
}
const pending = imagePaths.filter(
(p) =>
!imageResolutionRef.current.has(p) &&
!pendingImagePathsRef.current.has(p),
);
if (pending.length === 0) {
return;
}
for (const p of pending) {
pendingImagePathsRef.current.add(p);
}
imageRequestIdRef.current += 1;
vscode.postMessage({
type: 'resolveImagePaths',
data: { paths: pending, requestId: imageRequestIdRef.current },
});
},
[vscode],
);
const materializeMessages = useCallback(
(messages: WebViewMessageBase[]): WebViewMessage[] => {
const expanded = expandMessages(messages);
requestImageResolutions(expanded.imagePaths);
return applyCurrentImageResolutions(expanded.messages);
},
[applyCurrentImageResolutions, expandMessages, requestImageResolutions],
);
const materializeMessage = useCallback(
(message: WebViewMessageBase): WebViewMessage[] => {
const expanded =
message.role === 'user'
? expandUserMessageWithImages(message)
: { messages: [message], imagePaths: [] as string[] };
requestImageResolutions(expanded.imagePaths);
return applyCurrentImageResolutions(expanded.messages);
},
[applyCurrentImageResolutions, requestImageResolutions],
);
const mergeResolvedImages = useCallback(
(
messages: WebViewMessage[],
resolved: Array<{ path: string; src?: string | null }>,
): WebViewMessage[] => {
for (const item of resolved) {
pendingImagePathsRef.current.delete(item.path);
imageResolutionRef.current.set(
item.path,
item.src === null || item.src === undefined ? null : item.src,
);
}
return applyCurrentImageResolutions(messages);
},
[applyCurrentImageResolutions],
);
const clearImageResolutions = useCallback(() => {
imageResolutionRef.current.clear();
pendingImagePathsRef.current.clear();
}, []);
return {
materializeMessages,
materializeMessage,
mergeResolvedImages,
clearImageResolutions,
};
}
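Taken together, the hook file above implements a two-step flow: expand a stored user message into image entries plus a trailing text entry, then patch in resolvable sources once the extension host replies. A condensed, self-contained sketch of that flow (the regex parser here is a simplification of `parseImageReferences`, which also handles backslash escapes and punctuation boundaries):

```typescript
// Condensed sketch of the expand-then-resolve flow above; simplified,
// not the hook's actual implementation.
interface ImageMsg {
  kind: 'image';
  imagePath: string;
  imageSrc?: string;
  imageMissing?: boolean;
}
interface TextMsg {
  content: string;
}
type Msg = ImageMsg | TextMsg;

// Split "@/path/to/img.png" references out of a message body.
function expand(content: string): Msg[] {
  const out: Msg[] = [];
  const pattern = /@(\S+\.(?:png|jpe?g|gif|webp))/g;
  for (const match of content.matchAll(pattern)) {
    out.push({ kind: 'image', imagePath: match[1] });
  }
  const text = content.replace(pattern, '').trim();
  if (text) out.push({ content: text });
  return out;
}

// Apply host-provided resolutions: a string is a displayable src,
// null marks the file as missing, absence means still pending.
function resolve(messages: Msg[], resolutions: Map<string, string | null>): Msg[] {
  return messages.map((m) => {
    if (!('kind' in m)) return m;
    const src = resolutions.get(m.imagePath);
    if (src === undefined) return m; // still pending
    return { ...m, imageSrc: src ?? undefined, imageMissing: src === null };
  });
}
```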


@@ -7,12 +7,15 @@
 import { useCallback } from 'react';
 import type { VSCodeAPI } from './useVSCode.js';
 import { getRandomLoadingMessage } from '../../constants/loadingMessages.js';
+import type { ImageAttachment } from './useImage.js';
 interface UseMessageSubmitProps {
   vscode: VSCodeAPI;
   inputText: string;
   setInputText: (text: string) => void;
-  inputFieldRef: React.RefObject<HTMLDivElement>;
+  attachedImages?: ImageAttachment[];
+  clearImages?: () => void;
+  inputFieldRef: React.RefObject<HTMLDivElement | null>;
   isStreaming: boolean;
   isWaitingForResponse: boolean;
   // When true, do NOT auto-attach the active editor file/selection to context
@@ -31,6 +34,26 @@ interface UseMessageSubmitProps {
   };
 }
+export const shouldSendMessage = ({
+  inputText,
+  attachedImages,
+  isStreaming,
+  isWaitingForResponse,
+}: {
+  inputText: string;
+  attachedImages?: ImageAttachment[];
+  isStreaming: boolean;
+  isWaitingForResponse: boolean;
+}): boolean => {
+  if (isStreaming || isWaitingForResponse) {
+    return false;
+  }
+  const hasText = inputText.replace(/\u200B/g, '').trim().length > 0;
+  const hasAttachments = (attachedImages?.length ?? 0) > 0;
+  return hasText || hasAttachments;
+};
 /**
  * Message submit Hook
  * Handles message submission logic and context parsing
@@ -39,6 +62,8 @@ export const useMessageSubmit = ({
   vscode,
   inputText,
   setInputText,
+  attachedImages = [],
+  clearImages,
   inputFieldRef,
   isStreaming,
   isWaitingForResponse,
@@ -50,7 +75,14 @@ export const useMessageSubmit = ({
     (e: React.FormEvent) => {
       e.preventDefault();
-      if (!inputText.trim() || isStreaming || isWaitingForResponse) {
+      if (
+        !shouldSendMessage({
+          inputText,
+          attachedImages,
+          isStreaming,
+          isWaitingForResponse,
+        })
+      ) {
         return;
       }
@@ -142,6 +174,7 @@ export const useMessageSubmit = ({
           text: inputText,
           context: context.length > 0 ? context : undefined,
           fileContext: fileContextForMessage,
+          attachments: attachedImages.length > 0 ? attachedImages : undefined,
         },
       });
@@ -153,9 +186,14 @@ export const useMessageSubmit = ({
         inputFieldRef.current.setAttribute('data-empty', 'true');
       }
       fileContext.clearFileReferences();
+      if (clearImages) {
+        clearImages();
+      }
     },
     [
       inputText,
+      attachedImages,
+      clearImages,
       isStreaming,
       setInputText,
       inputFieldRef,
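`shouldSendMessage` above allows image-only submissions and strips U+200B zero-width spaces (which contentEditable inputs commonly leave behind) before testing for text. A standalone restatement for illustration:

```typescript
// Standalone restatement of the shouldSendMessage logic from the hunk
// above, for illustrating the image-only and zero-width-space cases.
interface SendCheck {
  inputText: string;
  attachedImageCount: number;
  isStreaming: boolean;
  isWaitingForResponse: boolean;
}

function shouldSendMessage(s: SendCheck): boolean {
  if (s.isStreaming || s.isWaitingForResponse) return false;
  // contentEditable inputs often leave U+200B behind; ignore it.
  const hasText = s.inputText.replace(/\u200B/g, '').trim().length > 0;
  return hasText || s.attachedImageCount > 0;
}
```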


@@ -16,6 +16,11 @@ import type { ApprovalModeValue } from '../../types/approvalModeValueTypes.js';
 import type { PlanEntry } from '../../types/chatTypes.js';
 import type { ModelInfo, AvailableCommand } from '@agentclientprotocol/sdk';
 import type { Question } from '../../types/acpTypes.js';
+import {
+  useImageResolution,
+  type WebViewMessage,
+  type WebViewMessageBase,
+} from './useImage.js';
 const FORCE_CLEAR_STREAM_END_REASONS = new Set([
   'user_cancelled',
@@ -66,23 +71,11 @@ interface UseWebViewMessagesProps {
   // Message handling
   messageHandling: {
     setMessages: (
-      messages: Array<{
-        role: 'user' | 'assistant' | 'thinking';
-        content: string;
-        timestamp: number;
-        fileContext?: {
-          fileName: string;
-          filePath: string;
-          startLine?: number;
-          endLine?: number;
-        };
-      }>,
+      messages:
+        | WebViewMessage[]
+        | ((prev: WebViewMessage[]) => WebViewMessage[]),
     ) => void;
-    addMessage: (message: {
-      role: 'user' | 'assistant' | 'thinking';
-      content: string;
-      timestamp: number;
-    }) => void;
+    addMessage: (message: WebViewMessage) => void;
     clearMessages: () => void;
     startStreaming: (timestamp?: number) => void;
     appendStreamChunk: (chunk: string) => void;
@@ -124,7 +117,7 @@ interface UseWebViewMessagesProps {
   ) => void;
   // Input
-  inputFieldRef: React.RefObject<HTMLDivElement>;
+  inputFieldRef: React.RefObject<HTMLDivElement | null>;
   setInputText: (text: string) => void;
   // Edit mode setter (maps ACP modes to UI modes)
   setEditMode?: (mode: ApprovalModeValue) => void;
@@ -164,6 +157,17 @@ export const useWebViewMessages = ({
 }: UseWebViewMessagesProps) => {
   // VS Code API for posting messages back to the extension host
   const vscode = useVSCode();
+  // Image resolution handling
+  const {
+    materializeMessages,
+    materializeMessage,
+    mergeResolvedImages,
+    clearImageResolutions,
+  } = useImageResolution({
+    vscode,
+  });
   // Track active long-running tool calls (execute/bash/command) so we can
   // keep the bottom "waiting" message visible until all of them complete.
   const activeExecToolCallsRef = useRef<Set<string>>(new Set());
@@ -422,7 +426,10 @@ export const useWebViewMessages = ({
       case 'conversationLoaded': {
         const conversation = message.data as Conversation;
-        handlers.messageHandling.setMessages(conversation.messages);
+        clearImageResolutions();
+        handlers.messageHandling.setMessages(
+          materializeMessages(conversation.messages as WebViewMessageBase[]),
+        );
         break;
       }
@@ -431,11 +438,15 @@ export const useWebViewMessages = ({
           role?: 'user' | 'assistant' | 'thinking';
           content?: string;
           timestamp?: number;
+          fileContext?: {
+            fileName: string;
+            filePath: string;
+            startLine?: number;
+            endLine?: number;
+          };
         };
-        handlers.messageHandling.addMessage(
-          msg as unknown as Parameters<
-            typeof handlers.messageHandling.addMessage
-          >[0],
+        materializeMessage(msg as WebViewMessageBase).forEach((entry) =>
+          handlers.messageHandling.addMessage(entry),
         );
         // Robustness: if an assistant message arrives outside the normal stream
         // pipeline (no explicit streamEnd), ensure we clear streaming/waiting states
@@ -864,7 +875,12 @@ export const useWebViewMessages = ({
           vscode.postMessage({ type: 'updatePanelTitle', data: { title } });
         }
         if (message.data.messages) {
-          handlers.messageHandling.setMessages(message.data.messages);
+          clearImageResolutions();
+          handlers.messageHandling.setMessages(
+            materializeMessages(
+              message.data.messages as WebViewMessageBase[],
+            ),
+          );
         } else {
           handlers.messageHandling.clearMessages();
         }
@@ -901,6 +917,7 @@ export const useWebViewMessages = ({
         handlers.messageHandling.clearMessages();
         handlers.clearToolCalls();
         handlers.sessionManagement.setCurrentSessionId(null);
+        clearImageResolutions();
         handlers.sessionManagement.setCurrentSessionTitle(
           'Past Conversations',
         );
@@ -986,6 +1003,18 @@ export const useWebViewMessages = ({
         break;
       }
+      case 'imagePathsResolved': {
+        const resolved =
+          (
+            message.data as
+              | { resolved?: Array<{ path: string; src?: string | null }> }
+              | undefined
+          )?.resolved ?? [];
+        handlers.messageHandling.setMessages((prevMessages) =>
+          mergeResolvedImages(prevMessages, resolved),
+        );
+        break;
+      }
       case 'cancelStreaming':
         // Handle cancel streaming response from extension
         // Note: The "Interrupted" message is already added by handleCancel in App.tsx
@@ -999,7 +1028,16 @@ export const useWebViewMessages = ({
         break;
       }
     },
-    [inputFieldRef, setInputText, vscode, setEditMode],
+    [
+      inputFieldRef,
+      setInputText,
+      vscode,
+      setEditMode,
+      materializeMessages,
+      materializeMessage,
+      mergeResolvedImages,
+      clearImageResolutions,
+    ],
   );
   useEffect(() => {
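The `resolveImagePaths` / `imagePathsResolved` messages above form a small request/response protocol: the webview posts the paths it needs plus a request id, and the host replies with a `src` per path (or null for missing files). A simplified sketch of the dedupe bookkeeping from `useImageResolution` (the class form is illustrative, not the hook's actual shape):

```typescript
// Simplified from useImageResolution above: ask the host only for paths
// that are neither already resolved nor already in flight.
class ImageResolutionTracker {
  private resolved = new Map<string, string | null>();
  private pending = new Set<string>();
  private requestId = 0;

  // Returns the message payload to post, or null if nothing new is needed.
  request(paths: string[]): { paths: string[]; requestId: number } | null {
    const needed = paths.filter(
      (p) => !this.resolved.has(p) && !this.pending.has(p),
    );
    if (needed.length === 0) return null;
    for (const p of needed) this.pending.add(p);
    this.requestId += 1;
    return { paths: needed, requestId: this.requestId };
  }

  // Handle an imagePathsResolved reply; null src marks a missing file.
  receive(entries: Array<{ path: string; src?: string | null }>): void {
    for (const entry of entries) {
      this.pending.delete(entry.path);
      this.resolved.set(entry.path, entry.src ?? null);
    }
  }

  get(path: string): string | null | undefined {
    return this.resolved.get(path);
  }
}
```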


@@ -5,6 +5,24 @@
  */
 import * as vscode from 'vscode';
+import { Storage } from '@qwen-code/qwen-code-core';
+export function getLocalResourceRoots(
+  extensionUri: vscode.Uri,
+  workspaceFolders: readonly vscode.WorkspaceFolder[] | undefined,
+): vscode.Uri[] {
+  const roots = [
+    vscode.Uri.joinPath(extensionUri, 'dist'),
+    vscode.Uri.joinPath(extensionUri, 'assets'),
+    vscode.Uri.file(Storage.getGlobalTempDir()),
+  ];
+  if (workspaceFolders && workspaceFolders.length > 0) {
+    roots.push(...workspaceFolders.map((folder) => folder.uri));
+  }
+  return roots;
+}
 /**
  * Panel and Tab Manager
@@ -62,10 +80,10 @@ export class PanelManager {
       {
         enableScripts: true,
         retainContextWhenHidden: true,
-        localResourceRoots: [
-          vscode.Uri.joinPath(this.extensionUri, 'dist'),
-          vscode.Uri.joinPath(this.extensionUri, 'assets'),
-        ],
+        localResourceRoots: getLocalResourceRoots(
+          this.extensionUri,
+          vscode.workspace.workspaceFolders,
+        ),
       },
     );
     // Track the group column hosting this panel
@@ -90,10 +108,10 @@ export class PanelManager {
       {
         enableScripts: true,
         retainContextWhenHidden: true,
-        localResourceRoots: [
-          vscode.Uri.joinPath(this.extensionUri, 'dist'),
-          vscode.Uri.joinPath(this.extensionUri, 'assets'),
-        ],
+        localResourceRoots: getLocalResourceRoots(
+          this.extensionUri,
+          vscode.workspace.workspaceFolders,
+        ),
       },
     );
     // Lock the group after creation
@@ -111,10 +129,10 @@ export class PanelManager {
       {
         enableScripts: true,
         retainContextWhenHidden: true,
-        localResourceRoots: [
-          vscode.Uri.joinPath(this.extensionUri, 'dist'),
-          vscode.Uri.joinPath(this.extensionUri, 'assets'),
-        ],
+        localResourceRoots: getLocalResourceRoots(
+          this.extensionUri,
+          vscode.workspace.workspaceFolders,
+        ),
       },
     );


@@ -50,7 +50,7 @@ export class WebViewContent {
 <head>
 <meta charset="UTF-8">
 <meta name="viewport" content="width=device-width, initial-scale=1.0">
-<meta http-equiv="Content-Security-Policy" content="default-src 'none'; img-src ${webview.cspSource}; script-src ${webview.cspSource}; style-src ${webview.cspSource} 'unsafe-inline';">
+<meta http-equiv="Content-Security-Policy" content="default-src 'none'; img-src ${webview.cspSource} data:; script-src ${webview.cspSource}; style-src ${webview.cspSource} 'unsafe-inline';">
 <title>Qwen Code</title>
 </head>
 <body data-extension-uri="${safeExtensionUri}">
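The CSP change above adds `data:` to `img-src` only, so inline base64 previews can render while scripts and styles stay restricted to the webview origin. A small sketch of assembling such a policy string (the `cspSource` value passed in the test is illustrative):

```typescript
// Illustrative only: in the extension, cspSource comes from webview.cspSource.
function buildCsp(cspSource: string): string {
  return (
    [
      "default-src 'none'",
      `img-src ${cspSource} data:`, // data: enables inline base64 image previews
      `script-src ${cspSource}`,
      `style-src ${cspSource} 'unsafe-inline'`,
    ].join('; ') + ';'
  );
}
```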


@@ -0,0 +1,290 @@
/**
* @license
* Copyright 2025 Qwen Team
* SPDX-License-Identifier: Apache-2.0
*/
import { beforeEach, describe, expect, it, vi } from 'vitest';
const {
mockCreateImagePathResolver,
mockGetGlobalTempDir,
mockGetPanel,
mockOnDidChangeActiveTextEditor,
mockOnDidChangeTextEditorSelection,
} = vi.hoisted(() => ({
mockCreateImagePathResolver: vi.fn(),
mockGetGlobalTempDir: vi.fn(() => '/global-temp'),
mockGetPanel: vi.fn<() => { webview: { postMessage: unknown } } | null>(
() => null,
),
mockOnDidChangeActiveTextEditor: vi.fn(() => ({ dispose: vi.fn() })),
mockOnDidChangeTextEditorSelection: vi.fn(() => ({ dispose: vi.fn() })),
}));
vi.mock('@qwen-code/qwen-code-core', () => ({
Storage: {
getGlobalTempDir: mockGetGlobalTempDir,
},
}));
vi.mock('vscode', () => ({
Uri: {
joinPath: vi.fn((base: { fsPath?: string }, ...parts: string[]) => ({
fsPath: `${base.fsPath ?? ''}/${parts.join('/')}`.replace(/\/+/g, '/'),
})),
file: vi.fn((filePath: string) => ({ fsPath: filePath })),
},
window: {
onDidChangeActiveTextEditor: mockOnDidChangeActiveTextEditor,
onDidChangeTextEditorSelection: mockOnDidChangeTextEditorSelection,
activeTextEditor: undefined,
},
workspace: {
workspaceFolders: [{ uri: { fsPath: '/workspace-root' } }],
},
commands: {
executeCommand: vi.fn(),
},
}));
vi.mock('../../services/qwenAgentManager.js', () => ({
QwenAgentManager: class {
isConnected = false;
currentSessionId = null;
onMessage = vi.fn();
onStreamChunk = vi.fn();
onThoughtChunk = vi.fn();
onModeInfo = vi.fn();
onModeChanged = vi.fn();
onUsageUpdate = vi.fn();
onModelInfo = vi.fn();
onModelChanged = vi.fn();
onAvailableCommands = vi.fn();
onAvailableModels = vi.fn();
onEndTurn = vi.fn();
onToolCall = vi.fn();
onPlan = vi.fn();
onPermissionRequest = vi.fn();
onAskUserQuestion = vi.fn();
disconnect = vi.fn();
},
}));
vi.mock('../../services/conversationStore.js', () => ({
ConversationStore: class {
constructor(_context: unknown) {}
},
}));
vi.mock('./PanelManager.js', async (importOriginal) => {
const actual = await importOriginal<typeof import('./PanelManager.js')>();
return {
...actual,
PanelManager: class {
constructor(_extensionUri: unknown, _onPanelDispose: () => void) {}
getPanel() {
return mockGetPanel();
}
},
};
});
vi.mock('./MessageHandler.js', () => ({
MessageHandler: class {
constructor(
_agentManager: unknown,
_conversationStore: unknown,
_currentConversationId: string | null,
_sendToWebView: (message: unknown) => void,
) {}
setLoginHandler = vi.fn();
setPermissionHandler = vi.fn();
setAskUserQuestionHandler = vi.fn();
setupFileWatchers = vi.fn(() => ({ dispose: vi.fn() }));
appendStreamContent = vi.fn();
route = vi.fn();
},
}));
vi.mock('./WebViewContent.js', () => ({
WebViewContent: {
generate: vi.fn(() => '<html />'),
},
}));
vi.mock('../utils/imageHandler.js', () => ({
createImagePathResolver: mockCreateImagePathResolver,
}));
vi.mock('../../utils/authErrors.js', () => ({
isAuthenticationRequiredError: vi.fn(() => false),
}));
vi.mock('../../utils/errorMessage.js', () => ({
getErrorMessage: vi.fn((error: unknown) => String(error)),
}));
import { WebViewProvider } from './WebViewProvider.js';
describe('WebViewProvider.attachToView', () => {
beforeEach(() => {
vi.clearAllMocks();
mockGetPanel.mockReturnValue(null);
mockCreateImagePathResolver.mockReturnValue((paths: string[]) =>
paths.map((entry) => ({
path: entry,
src: `webview:${entry}`,
})),
);
vi.spyOn(
WebViewProvider.prototype as unknown as {
initializeAgentConnection: () => Promise<void>;
},
'initializeAgentConnection',
).mockResolvedValue(undefined);
});
it('configures sidebar views with workspace/temp roots and resolves image paths through the attached webview', async () => {
let messageHandler:
| ((message: { type: string; data?: unknown }) => Promise<void>)
| undefined;
const postMessage = vi.fn();
const webview = {
options: undefined as unknown,
html: '',
postMessage,
asWebviewUri: vi.fn((uri: { fsPath: string }) => ({
toString: () => `webview:${uri.fsPath}`,
})),
onDidReceiveMessage: vi.fn(
(
handler: (message: { type: string; data?: unknown }) => Promise<void>,
) => {
messageHandler = handler;
return { dispose: vi.fn() };
},
),
};
const provider = new WebViewProvider(
{ subscriptions: [] } as never,
{ fsPath: '/extension-root' } as never,
);
await provider.attachToView(
{
webview,
visible: true,
onDidChangeVisibility: vi.fn(() => ({ dispose: vi.fn() })),
onDidDispose: vi.fn(() => ({ dispose: vi.fn() })),
} as never,
'qwen-code.chatView.sidebar',
);
const roots = (
webview.options as { localResourceRoots?: Array<{ fsPath: string }> }
).localResourceRoots;
expect(roots).toEqual(
expect.arrayContaining([
expect.objectContaining({ fsPath: '/extension-root/dist' }),
expect.objectContaining({ fsPath: '/extension-root/assets' }),
expect.objectContaining({ fsPath: '/global-temp' }),
expect.objectContaining({ fsPath: '/workspace-root' }),
]),
);
expect(messageHandler).toBeTypeOf('function');
await messageHandler?.({
type: 'resolveImagePaths',
data: { paths: ['clipboard/example.png'], requestId: 7 },
});
expect(mockCreateImagePathResolver).toHaveBeenCalledWith(
expect.objectContaining({
workspaceRoots: ['/workspace-root'],
toWebviewUri: expect.any(Function),
}),
);
expect(postMessage).toHaveBeenCalledWith({
type: 'imagePathsResolved',
data: {
resolved: [
{
path: 'clipboard/example.png',
src: 'webview:clipboard/example.png',
},
],
requestId: 7,
},
});
});
it('routes resolved image paths back to the requesting attached webview even when a panel exists', async () => {
let messageHandler:
| ((message: { type: string; data?: unknown }) => Promise<void>)
| undefined;
const attachedPostMessage = vi.fn();
const panelPostMessage = vi.fn();
mockGetPanel.mockReturnValue({
webview: {
postMessage: panelPostMessage,
},
});
const webview = {
options: undefined as unknown,
html: '',
postMessage: attachedPostMessage,
asWebviewUri: vi.fn((uri: { fsPath: string }) => ({
toString: () => `attached:${uri.fsPath}`,
})),
onDidReceiveMessage: vi.fn(
(
handler: (message: { type: string; data?: unknown }) => Promise<void>,
) => {
messageHandler = handler;
return { dispose: vi.fn() };
},
),
};
const provider = new WebViewProvider(
{ subscriptions: [] } as never,
{ fsPath: '/extension-root' } as never,
);
await provider.attachToView(
{
webview,
visible: true,
onDidChangeVisibility: vi.fn(() => ({ dispose: vi.fn() })),
onDidDispose: vi.fn(() => ({ dispose: vi.fn() })),
} as never,
'qwen-code.chatView.sidebar',
);
await messageHandler?.({
type: 'resolveImagePaths',
data: { paths: ['/global-temp/clipboard/example.png'], requestId: 8 },
});
expect(attachedPostMessage).toHaveBeenCalledWith({
type: 'imagePathsResolved',
data: {
resolved: [
{
path: '/global-temp/clipboard/example.png',
src: 'webview:/global-temp/clipboard/example.png',
},
],
requestId: 8,
},
});
expect(panelPostMessage).not.toHaveBeenCalled();
});
});


@ -16,10 +16,11 @@ import type {
PermissionResponseMessage, PermissionResponseMessage,
AskUserQuestionResponseMessage, AskUserQuestionResponseMessage,
} from '../../types/webviewMessageTypes.js'; } from '../../types/webviewMessageTypes.js';
import { PanelManager } from './PanelManager.js'; import { PanelManager, getLocalResourceRoots } from './PanelManager.js';
import { MessageHandler } from './MessageHandler.js'; import { MessageHandler } from './MessageHandler.js';
import { WebViewContent } from './WebViewContent.js'; import { WebViewContent } from './WebViewContent.js';
import { getFileName } from '../utils/webviewUtils.js'; import { getFileName } from '../utils/webviewUtils.js';
import { createImagePathResolver } from '../utils/imageHandler.js';
import { type ApprovalModeValue } from '../../types/approvalModeValueTypes.js'; import { type ApprovalModeValue } from '../../types/approvalModeValueTypes.js';
import { isAuthenticationRequiredError } from '../../utils/authErrors.js'; import { isAuthenticationRequiredError } from '../../utils/authErrors.js';
import { getErrorMessage } from '../../utils/errorMessage.js'; import { getErrorMessage } from '../../utils/errorMessage.js';
@@ -476,10 +477,10 @@ export class WebViewProvider {
     // Configure webview options
     webview.options = {
       enableScripts: true,
-      localResourceRoots: [
-        vscode.Uri.joinPath(this.extensionUri, 'dist'),
-        vscode.Uri.joinPath(this.extensionUri, 'assets'),
-      ],
+      localResourceRoots: getLocalResourceRoots(
+        this.extensionUri,
+        vscode.workspace.workspaceFolders,
+      ),
     };

     // Store reference so sendMessageToWebView can reach it
@ -500,6 +501,10 @@ export class WebViewProvider {
this.handleWebviewReady(); this.handleWebviewReady();
return; return;
} }
if (message.type === 'resolveImagePaths') {
this.handleResolveImagePaths(message.data, webview);
return;
}
if (this.handleNewChatByContext(message)) { if (this.handleNewChatByContext(message)) {
return; return;
} }
@@ -653,6 +658,10 @@ export class WebViewProvider {
         this.handleWebviewReady();
         return;
       }
+      if (message.type === 'resolveImagePaths') {
+        this.handleResolveImagePaths(message.data, newPanel.webview);
+        return;
+      }
       // Allow webview to request updating the VS Code tab title
       if (message.type === 'updatePanelTitle') {
         const title = String(
@@ -1229,12 +1238,42 @@ export class WebViewProvider {
    */
   private sendMessageToWebView(message: unknown): void {
     this.updateAuthStateFromMessage(message);
-    const panel = this.panelManager.getPanel();
-    if (panel) {
-      panel.webview.postMessage(message);
-    } else if (this.attachedWebview) {
-      this.attachedWebview.postMessage(message);
-    }
+    this.getActiveWebview()?.postMessage(message);
+  }
+
+  private handleResolveImagePaths(
+    data: unknown,
+    targetWebview?: vscode.Webview,
+  ): void {
+    const webview = targetWebview ?? this.getActiveWebview();
+    if (!webview) {
+      return;
+    }
+    const payload = data as
+      | { paths?: string[]; requestId?: number }
+      | undefined;
+    const paths = Array.isArray(payload?.paths) ? (payload?.paths ?? []) : [];
+    const workspaceFolders = vscode.workspace.workspaceFolders ?? [];
+    const workspaceRoots = workspaceFolders.map((folder) => folder.uri.fsPath);
+    const resolveImagePaths = createImagePathResolver({
+      workspaceRoots,
+      toWebviewUri: (filePath: string) =>
+        webview.asWebviewUri(vscode.Uri.file(filePath)).toString(),
+    });
+    const resolved = resolveImagePaths(paths);
+    webview.postMessage({
+      type: 'imagePathsResolved',
+      data: { resolved, requestId: payload?.requestId },
+    });
+  }
+
+  private getActiveWebview(): vscode.Webview | null {
+    return this.panelManager.getPanel()?.webview ?? this.attachedWebview;
   }

   /**
@@ -1380,6 +1419,10 @@ export class WebViewProvider {
         this.handleWebviewReady();
         return;
       }
+      if (message.type === 'resolveImagePaths') {
+        this.handleResolveImagePaths(message.data, panel.webview);
+        return;
+      }
       if (message.type === 'updatePanelTitle') {
         const title = String(
           (message.data as { title?: unknown } | undefined)?.title ?? '',


@@ -67,6 +67,7 @@ describe('registerChatViewProviders', () => {
       'qwen-code.chatView.sidebar',
       'qwen-code.chatView.secondary',
     ]);
+    expect(calls[0]?.[1]).not.toBe(calls[1]?.[1]);
     expect(calls[0]?.[2]).toEqual({
       webviewOptions: { retainContextWhenHidden: true },
     });


@@ -43,17 +43,18 @@ export function registerChatViewProviders(params: {
     );
   }

-  const chatViewProvider = new ChatWebviewViewProvider(createViewProvider);
+  const sidebarViewProvider = new ChatWebviewViewProvider(createViewProvider);
+  const secondaryViewProvider = new ChatWebviewViewProvider(createViewProvider);
   context.subscriptions.push(
     vscode.window.registerWebviewViewProvider(
       CHAT_VIEW_ID_SIDEBAR,
-      chatViewProvider,
+      sidebarViewProvider,
       { webviewOptions: { retainContextWhenHidden: true } },
     ),
     vscode.window.registerWebviewViewProvider(
       CHAT_VIEW_ID_SECONDARY,
-      chatViewProvider,
+      secondaryViewProvider,
       { webviewOptions: { retainContextWhenHidden: true } },
     ),
   );
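
The switch from one shared `chatViewProvider` to per-view instances matters because a webview view provider typically stores a reference to the single view it last resolved; one instance registered for both view IDs would silently drop the first. A minimal sketch of that failure mode (the `SingleViewProvider` class here is hypothetical, standing in for the real `ChatWebviewViewProvider`):

```typescript
// Hypothetical provider that keeps only the most recently resolved view,
// mimicking why one instance cannot safely back two view IDs.
class SingleViewProvider {
  private view: string | null = null;

  resolveWebviewView(viewId: string): void {
    // A second resolve overwrites the reference to the first view.
    this.view = viewId;
  }

  currentView(): string | null {
    return this.view;
  }
}

const shared = new SingleViewProvider();
shared.resolveWebviewView('qwen-code.chatView.sidebar');
shared.resolveWebviewView('qwen-code.chatView.secondary');
console.log(shared.currentView()); // 'qwen-code.chatView.secondary': the sidebar reference is gone
```

This is also what the new test assertion `expect(calls[0]?.[1]).not.toBe(calls[1]?.[1])` pins down: the two registrations must receive distinct provider instances.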


@@ -0,0 +1,219 @@
/**
* @license
* Copyright 2025 Qwen Team
* SPDX-License-Identifier: Apache-2.0
*/
import path from 'node:path';
import { afterEach, beforeEach, describe, expect, it, vi } from 'vitest';
import {
normalizeImageAttachment,
escapePath,
unescapePath,
} from '../../utils/imageSupport.js';
const mockMkdir = vi.hoisted(() => vi.fn().mockResolvedValue(undefined));
const mockWriteFile = vi.hoisted(() => vi.fn().mockResolvedValue(undefined));
const mockReaddir = vi.hoisted(() => vi.fn().mockResolvedValue([]));
const mockStat = vi.hoisted(() => vi.fn());
const mockUnlink = vi.hoisted(() => vi.fn().mockResolvedValue(undefined));
vi.mock('fs/promises', () => ({
mkdir: mockMkdir,
writeFile: mockWriteFile,
readdir: mockReaddir,
stat: mockStat,
unlink: mockUnlink,
}));
vi.mock('@qwen-code/qwen-code-core', async (importOriginal) => {
const actual =
await importOriginal<typeof import('@qwen-code/qwen-code-core')>();
return {
...actual,
Storage: { getGlobalTempDir: () => '/mock/tmp' },
};
});
vi.mock('vscode', () => ({
workspace: {
workspaceFolders: [],
},
}));
import {
processImageAttachments,
saveImageToFile,
buildPromptBlocks,
} from './imageHandler.js';
describe('imageHandler', () => {
beforeEach(() => {
vi.clearAllMocks();
});
afterEach(() => {
vi.restoreAllMocks();
});
it('decodes base64 data URL and writes correct buffer to disk', async () => {
const filePath = await saveImageToFile(
'data:image/png;base64,YWJj',
'image/png',
);
expect(filePath).toBeTruthy();
expect(mockMkdir).toHaveBeenCalledWith(
path.join('/mock/tmp', 'clipboard'),
{ recursive: true },
);
expect(mockWriteFile).toHaveBeenCalledOnce();
const [writtenPath, buffer] = mockWriteFile.mock.calls[0];
expect(buffer).toEqual(Buffer.from('abc'));
expect(path.basename(writtenPath)).toMatch(
/^clipboard-\d+-[a-f0-9-]+\.png$/,
);
});
it('decodes raw base64 (without data URL prefix)', async () => {
const filePath = await saveImageToFile('YWJj', 'image/png');
expect(filePath).toBeTruthy();
const [, buffer] = mockWriteFile.mock.calls[0];
expect(buffer).toEqual(Buffer.from('abc'));
});
it('prunes old clipboard images after saving', async () => {
mockReaddir.mockResolvedValueOnce(['clipboard-1.png', 'clipboard-2.png']);
mockStat
.mockResolvedValueOnce({ mtimeMs: 100 })
.mockResolvedValueOnce({ mtimeMs: 200 });
await saveImageToFile('data:image/png;base64,YWJj', 'image/png');
expect(mockReaddir).toHaveBeenCalled();
});
it('generates unique file names for images saved in the same millisecond', async () => {
vi.spyOn(Date, 'now').mockReturnValue(1234567890);
await saveImageToFile('data:image/png;base64,YWJj', 'image/png');
await saveImageToFile('data:image/png;base64,ZGVm', 'image/png');
const firstName = path.basename(mockWriteFile.mock.calls[0][0]);
const secondName = path.basename(mockWriteFile.mock.calls[1][0]);
expect(firstName).not.toBe(secondName);
});
it('returns null when file write throws', async () => {
mockWriteFile.mockRejectedValueOnce(new Error('disk full'));
const result = await saveImageToFile(
'data:image/png;base64,YWJj',
'image/png',
);
expect(result).toBeNull();
});
it('returns saved prompt image metadata for validated attachments', async () => {
const result = await processImageAttachments('Inspect this image', [
{
id: 'img-1',
name: 'pasted.png',
type: 'image/png',
size: 3,
data: 'data:image/png;base64,YWJj',
timestamp: Date.now(),
},
]);
expect(result.savedImageCount).toBe(1);
expect(result.promptImages).toEqual([
expect.objectContaining({
name: 'pasted.png',
mimeType: 'image/png',
path: expect.stringContaining(`${path.sep}clipboard-`),
}),
]);
expect(result.formattedText).toContain('@');
});
});
describe('buildPromptBlocks', () => {
it('builds ACP resource_link blocks from saved image attachments', () => {
expect(
buildPromptBlocks('Please inspect this screenshot.', [
{
path: '/tmp/My Images/pasted image.png',
name: 'pasted image.png',
mimeType: 'image/png',
},
]),
).toEqual([
{ type: 'text', text: 'Please inspect this screenshot.' },
{
type: 'resource_link',
name: 'pasted image.png',
mimeType: 'image/png',
uri: 'file:///tmp/My Images/pasted image.png',
},
]);
});
it('returns only resource links when the prompt has images only', () => {
expect(
buildPromptBlocks('', [
{
path: '/tmp/clipboard/pasted.webp',
name: 'pasted.webp',
mimeType: 'image/webp',
},
]),
).toEqual([
{
type: 'resource_link',
name: 'pasted.webp',
mimeType: 'image/webp',
uri: 'file:///tmp/clipboard/pasted.webp',
},
]);
});
});
describe('normalizeImageAttachment', () => {
it('rejects attachments with unsupported image mime types', () => {
expect(
normalizeImageAttachment({
id: 'img-1',
name: 'animated.gif',
type: 'image/gif',
size: 43,
data: 'data:image/gif;base64,R0lGODdhAQABAIAAAP///////ywAAAAAAQABAAACAkQBADs=',
timestamp: Date.now(),
}),
).toBeNull();
});
it('rejects attachments whose decoded payload exceeds the enforced byte limit', () => {
expect(
normalizeImageAttachment(
{
id: 'img-2',
name: 'oversized.png',
type: 'image/png',
size: 1,
data: 'data:image/png;base64,QUJDREU=',
timestamp: Date.now(),
},
{ maxBytes: 4 },
),
).toBeNull();
});
});
describe('pathEscaping', () => {
it('round-trips shell-escaped file paths', () => {
const originalPath = '/tmp/My Images/(draft) final.png';
expect(unescapePath(escapePath(originalPath))).toBe(originalPath);
});
});


@@ -0,0 +1,261 @@
/**
* @license
* Copyright 2025 Qwen Team
* SPDX-License-Identifier: Apache-2.0
*/
import * as fs from 'fs';
import * as fsp from 'fs/promises';
import * as path from 'path';
import { randomUUID } from 'node:crypto';
import type { ContentBlock } from '@agentclientprotocol/sdk';
import { Storage } from '@qwen-code/qwen-code-core';
import type {
ImageAttachment,
SavedImageAttachment,
} from '../../utils/imageSupport.js';
import {
MAX_IMAGE_SIZE,
MAX_TOTAL_IMAGE_SIZE,
getImageExtensionForMimeType,
escapePath,
normalizeImageAttachment,
} from '../../utils/imageSupport.js';
// ---------- Clipboard image storage ----------
const CLIPBOARD_DIR_NAME = 'clipboard';
const DEFAULT_MAX_IMAGES = 100;
function getClipboardImageDir(): string {
return path.join(Storage.getGlobalTempDir(), CLIPBOARD_DIR_NAME);
}
async function saveImageBufferToClipboardDir(
buffer: Buffer,
fileName: string,
): Promise<string> {
const dir = getClipboardImageDir();
await fsp.mkdir(dir, { recursive: true });
const filePath = path.join(dir, fileName);
await fsp.writeFile(filePath, buffer);
return filePath;
}
async function pruneClipboardImages(
maxImages: number = DEFAULT_MAX_IMAGES,
): Promise<void> {
try {
const dir = getClipboardImageDir();
const files = await fsp.readdir(dir);
const imageFiles: Array<{ filePath: string; mtimeMs: number }> = [];
for (const file of files) {
if (file.startsWith('clipboard-')) {
const filePath = path.join(dir, file);
const stats = await fsp.stat(filePath);
imageFiles.push({ filePath, mtimeMs: stats.mtimeMs });
}
}
if (imageFiles.length > maxImages) {
imageFiles.sort((a, b) => b.mtimeMs - a.mtimeMs);
for (const { filePath } of imageFiles.slice(maxImages)) {
await fsp.unlink(filePath);
}
}
} catch {
// Ignore errors in cleanup — directory may not exist yet
}
}
// ---------- Image saving & processing ----------
export function appendImageReferences(
text: string,
imageReferences: string[],
): string {
if (imageReferences.length === 0) {
return text;
}
const imageText = imageReferences.join(' ');
if (!text.trim()) {
return imageText;
}
return `${text}\n\n${imageText}`;
}
export async function saveImageToFile(
base64Data: string,
mimeType: string,
): Promise<string | null> {
try {
let pureBase64 = base64Data;
const dataUrlMatch = base64Data.match(/^data:[^;]+;base64,(.+)$/);
if (dataUrlMatch) {
pureBase64 = dataUrlMatch[1];
}
const buffer = Buffer.from(pureBase64, 'base64');
const timestamp = Date.now();
const ext = getImageExtensionForMimeType(mimeType);
const fileName = `clipboard-${timestamp}-${randomUUID()}${ext}`;
const filePath = await saveImageBufferToClipboardDir(buffer, fileName);
await pruneClipboardImages();
return filePath;
} catch (error) {
console.error('[ImageHandler] Failed to save image:', error);
return null;
}
}
export async function processImageAttachments(
text: string,
attachments?: ImageAttachment[],
): Promise<{
formattedText: string;
displayText: string;
savedImageCount: number;
promptImages: SavedImageAttachment[];
}> {
let formattedText = text;
let displayText = text;
let savedImageCount = 0;
let remainingBytes = MAX_TOTAL_IMAGE_SIZE;
const promptImages: SavedImageAttachment[] = [];
if (attachments && attachments.length > 0) {
const imageReferences: string[] = [];
for (const attachment of attachments) {
const normalizedAttachment = normalizeImageAttachment(attachment, {
maxBytes: Math.min(MAX_IMAGE_SIZE, remainingBytes),
});
if (!normalizedAttachment) {
console.warn(
'[ImageHandler] Rejected invalid image attachment:',
attachment.name,
);
continue;
}
const imagePath = await saveImageToFile(
normalizedAttachment.data,
normalizedAttachment.type,
);
if (imagePath) {
imageReferences.push(`@${escapePath(imagePath)}`);
promptImages.push({
path: imagePath,
name: normalizedAttachment.name,
mimeType: normalizedAttachment.type,
});
remainingBytes -= normalizedAttachment.size;
savedImageCount += 1;
} else {
console.warn('[ImageHandler] Failed to save image:', attachment.name);
}
}
if (imageReferences.length > 0) {
formattedText = appendImageReferences(formattedText, imageReferences);
displayText = appendImageReferences(displayText, imageReferences);
}
}
return { formattedText, displayText, savedImageCount, promptImages };
}
// ---------- ACP prompt builder ----------
export function buildPromptBlocks(
text: string,
images: SavedImageAttachment[] = [],
): ContentBlock[] {
const blocks: ContentBlock[] = [];
if (text || images.length === 0) {
blocks.push({ type: 'text', text });
}
for (const image of images) {
blocks.push({
type: 'resource_link',
name: image.name,
mimeType: image.mimeType,
uri: `file://${image.path}`,
});
}
return blocks;
}
// ---------- Image path resolution ----------
export function resolveImagePathsForWebview({
paths,
workspaceRoots,
globalTempDir,
existsSync,
toWebviewUri,
}: {
paths: string[];
workspaceRoots: string[];
globalTempDir: string;
existsSync: (path: string) => boolean;
toWebviewUri: (path: string) => string;
}): Array<{ path: string; src: string | null }> {
const allowedRoots = [...workspaceRoots, globalTempDir].filter(Boolean);
const root = workspaceRoots[0];
return paths.map((imagePath) => {
if (!imagePath || typeof imagePath !== 'string') {
return { path: imagePath, src: null };
}
const resolvedPath = path.isAbsolute(imagePath)
? path.normalize(imagePath)
: root
? path.normalize(path.resolve(root, imagePath))
: null;
if (!resolvedPath) {
return { path: imagePath, src: null };
}
const isAllowed = allowedRoots.some((allowedRoot) => {
const normalizedRoot = path.normalize(allowedRoot);
return (
resolvedPath === normalizedRoot ||
resolvedPath.startsWith(normalizedRoot + path.sep)
);
});
if (!isAllowed || !existsSync(resolvedPath)) {
return { path: imagePath, src: null };
}
return { path: imagePath, src: toWebviewUri(resolvedPath) };
});
}
export function createImagePathResolver({
workspaceRoots,
toWebviewUri,
}: {
workspaceRoots: string[];
toWebviewUri: (filePath: string) => string;
}) {
return function resolveImagePaths(
paths: string[],
): Array<{ path: string; src: string | null }> {
return resolveImagePathsForWebview({
paths,
workspaceRoots,
globalTempDir: Storage.getGlobalTempDir(),
existsSync: fs.existsSync,
toWebviewUri,
});
};
}
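
The containment check in `resolveImagePathsForWebview` is the security-relevant piece: a path is only handed to `toWebviewUri` when it equals an allowed root or sits strictly beneath one, and appending `path.sep` to the root before the `startsWith` comparison is what blocks sibling directories that merely share a name prefix. A standalone sketch of just that check, with made-up roots and paths for illustration:

```typescript
import * as path from 'node:path';

// Standalone sketch of the allow-list containment check used by
// resolveImagePathsForWebview. Roots and candidate paths are hypothetical.
function isUnderAllowedRoot(candidate: string, allowedRoots: string[]): boolean {
  const resolved = path.normalize(candidate);
  return allowedRoots.some((root) => {
    const normalizedRoot = path.normalize(root);
    // Appending path.sep prevents sibling-prefix escapes:
    // "/workspace-evil" must not match the root "/workspace".
    return (
      resolved === normalizedRoot ||
      resolved.startsWith(normalizedRoot + path.sep)
    );
  });
}

console.log(isUnderAllowedRoot('/workspace/img/a.png', ['/workspace'])); // true
console.log(isUnderAllowedRoot('/workspace-evil/a.png', ['/workspace'])); // false
```

A plain `startsWith(normalizedRoot)` without the separator would accept the second path, which is why the production code performs the two-part comparison.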


@@ -64,7 +64,7 @@ export interface InputFormProps {
   /** Current input text */
   inputText: string;
   /** Ref for the input field */
-  inputFieldRef: React.RefObject<HTMLDivElement>;
+  inputFieldRef: React.RefObject<HTMLDivElement | null>;
   /** Whether AI is currently generating */
   isStreaming: boolean;
   /** Whether waiting for response */
@@ -117,8 +117,14 @@ export interface InputFormProps {
   onCompletionFill?: (item: CompletionItem) => void;
   /** Completion close callback */
   onCompletionClose?: () => void;
+  /** Optional paste handler for the contentEditable input */
+  onPaste?: (e: React.ClipboardEvent) => void;
+  /** Optional content rendered between the input and actions */
+  extraContent?: ReactNode;
   /** Placeholder text */
   placeholder?: string;
+  /** Whether the current draft is eligible to submit */
+  canSubmit?: boolean;
 }

 /**
@@ -174,9 +180,14 @@ export const InputForm: FC<InputFormProps> = ({
   onCompletionSelect,
   onCompletionFill,
   onCompletionClose,
+  onPaste,
+  extraContent,
   placeholder = 'Ask Qwen Code …',
+  canSubmit,
 }) => {
   const composerDisabled = isStreaming || isWaitingForResponse;
+  const hasDraftContent =
+    canSubmit ?? inputText.replace(/\u200B/g, '').trim().length > 0;
   const completionItemsResolved = completionItems ?? [];
   const completionActive =
     completionIsOpen &&
@@ -275,10 +286,15 @@ export const InputForm: FC<InputFormProps> = ({
           onCompositionStart={onCompositionStart}
           onCompositionEnd={onCompositionEnd}
           onKeyDown={handleKeyDown}
+          onPaste={onPaste}
           suppressContentEditableWarning
         />
       </div>
+
+      {extraContent ? (
+        <div className="relative z-[1]">{extraContent}</div>
+      ) : null}
       <div className="composer-actions">
         {/* Edit mode button */}
         <button
@ -357,7 +373,7 @@ export const InputForm: FC<InputFormProps> = ({
<button <button
type="submit" type="submit"
className="btn-send-compact [&>svg]:w-5 [&>svg]:h-5" className="btn-send-compact [&>svg]:w-5 [&>svg]:h-5"
disabled={composerDisabled || !inputText.trim()} disabled={composerDisabled || !hasDraftContent}
aria-label="Send message" aria-label="Send message"
> >
<ArrowUpIcon /> <ArrowUpIcon />
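
The `canSubmit ?? inputText.replace(/\u200B/g, '').trim().length > 0` fallback exists because contentEditable fields often carry a zero-width space (U+200B) even when visually empty, and `String.prototype.trim()` does not strip it, so the old `!inputText.trim()` check could leave the send button enabled for an empty draft. A quick sketch of the difference:

```typescript
// Sketch of the hasDraftContent fallback: a contentEditable draft may
// contain a zero-width space (U+200B) while looking empty, and trim()
// only removes WhiteSpace/LineTerminator characters, not U+200B.
function hasDraftContent(inputText: string): boolean {
  return inputText.replace(/\u200B/g, '').trim().length > 0;
}

console.log('\u200B'.trim().length > 0);        // true: trim() keeps U+200B
console.log(hasDraftContent('\u200B'));         // false: treated as empty
console.log(hasDraftContent('  hello\u200B'));  // true: real text remains
```

The optional `canSubmit` prop lets a caller (for example, one tracking pending image attachments) override this purely text-based heuristic.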


@@ -0,0 +1,119 @@
/**
* @license
* Copyright 2025 Qwen Team
* SPDX-License-Identifier: Apache-2.0
*/
import type { FC } from 'react';
import { CloseSmallIcon } from '../icons/NavigationIcons.js';
// ======================== ImagePreview ========================
export interface ImagePreviewItem {
id: string;
name: string;
data: string;
}
export interface ImagePreviewProps {
images: ImagePreviewItem[];
onRemove: (id: string) => void;
}
export const ImagePreview: FC<ImagePreviewProps> = ({ images, onRemove }) => {
if (images.length === 0) {
return null;
}
return (
<div className="image-preview-container flex gap-2 px-2 pb-2">
{images.map((image) => (
<div key={image.id} className="image-preview-item relative group">
<div className="relative">
<img
src={image.data}
alt={image.name}
className="w-14 h-14 object-cover rounded-md border border-gray-500 dark:border-gray-600"
title={image.name}
/>
<button
type="button"
onClick={() => onRemove(image.id)}
className="absolute -top-2 -right-2 w-5 h-5 bg-gray-700 dark:bg-gray-600 text-white rounded-full flex items-center justify-center opacity-0 group-hover:opacity-100 transition-opacity hover:bg-gray-800 dark:hover:bg-gray-500"
aria-label={`Remove ${image.name}`}
>
<CloseSmallIcon />
</button>
</div>
</div>
))}
</div>
);
};
// ======================== ImageMessageRenderer ========================
export interface ImageMessageLike {
kind: 'image';
imagePath: string;
imageSrc?: string;
imageMissing?: boolean;
}
export interface ImageMessageRendererProps {
msg: ImageMessageLike;
imageIndex: number;
}
export const ImageMessageRenderer: FC<ImageMessageRendererProps> = ({
msg,
imageIndex,
}) => {
if (msg.kind !== 'image' || !msg.imagePath) {
return null;
}
const label = `[Image #${imageIndex}]`;
const showImage = Boolean(msg.imageSrc) && !msg.imageMissing;
return (
<div className="qwen-message user-message-container flex gap-0 my-1 items-start text-left flex-col relative">
<div
className="inline-block relative whitespace-pre-wrap rounded-md max-w-full overflow-x-auto overflow-y-hidden select-text leading-[1.5]"
style={{
border: '1px solid var(--app-input-border)',
borderRadius: 'var(--corner-radius-medium)',
backgroundColor: 'var(--app-input-background)',
padding: '6px 8px',
color: 'var(--app-primary-foreground)',
}}
>
<div
style={{
fontSize: '12px',
color: 'var(--app-secondary-foreground)',
marginBottom: '4px',
}}
>
{label}
</div>
{showImage ? (
<img
src={msg.imageSrc}
alt={msg.imagePath}
className="max-w-full rounded-md border border-gray-600"
/>
) : (
<div
style={{
fontSize: '12px',
color: 'var(--app-secondary-foreground)',
}}
>
@{msg.imagePath}
</div>
)}
</div>
</div>
);
};


@@ -92,6 +92,16 @@ export type {
   Question,
   QuestionOption,
 } from './components/messages/AskUserQuestionDialog';
+export {
+  ImagePreview,
+  ImageMessageRenderer,
+} from './components/messages/ImageComponents';
+export type {
+  ImagePreviewProps,
+  ImagePreviewItem,
+  ImageMessageRendererProps,
+  ImageMessageLike,
+} from './components/messages/ImageComponents';

 // ChatViewer - standalone chat display component
 export {