feat: add commit attribution with per-file AI contribution tracking (#3115)

* feat: add commit attribution with per-file AI contribution tracking via git notes

Track character-level AI vs human contributions per file and store
detailed attribution metadata as git notes (refs/notes/ai-attribution)
after each successful git commit. This enables open-source AI disclosure
and enterprise compliance audits without polluting commit messages.

* feat: enhance commit attribution with real AI/human ratios and generated file exclusion

- Replace line-based diff with a prefix/suffix character-level algorithm
  for precise contribution calculation (e.g. "Esc"→"esc" counts as 1 char,
  not a whole changed line)
- Compute real AI vs human contribution percentages at commit time by analyzing
  git diff --stat output: humanChars = max(0, diffSize - trackedAiChars)
- Add generated file exclusion (lock files, dist/, .min.js, .d.ts, etc.)
  ported from an existing generatedFiles.ts
- Add file deletion tracking via recordDeletion()
- Update git notes payload format: {aiChars, humanChars, percent} per file
  with real percentages instead of hardcoded 100%
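
The prefix/suffix algorithm can be sketched as follows (a minimal illustration; `changedCharCount` is a hypothetical name, not the actual export):

```typescript
// Character-level change size via common prefix/suffix trimming.
function changedCharCount(oldText: string, newText: string): number {
  const minLen = Math.min(oldText.length, newText.length);
  // Walk forward over the shared prefix.
  let prefix = 0;
  while (prefix < minLen && oldText[prefix] === newText[prefix]) prefix++;
  // Walk backward over the shared suffix, without overlapping the prefix.
  let suffix = 0;
  while (
    suffix < minLen - prefix &&
    oldText[oldText.length - 1 - suffix] === newText[newText.length - 1 - suffix]
  ) {
    suffix++;
  }
  // The changed region is whatever remains in the longer of the two middles.
  return Math.max(
    oldText.length - prefix - suffix,
    newText.length - prefix - suffix,
  );
}

console.log(changedCharCount('Esc', 'esc')); // 1 — only the first char differs
```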

* feat: add surface tracking, prompt counting, session persistence, and PR attribution

Align with the full attribution feature set:
- Surface tracking: read QWEN_CODE_ENTRYPOINT env var (cli/ide/api/sdk),
  include surfaceBreakdown in git notes payload
- Prompt counting: incrementPromptCount() hooked into client.ts message
  loop, tracks promptCount/permissionPromptCount/escapeCount
- Session persistence: toSnapshot()/restoreFromSnapshot() for serializing
  attribution state; ChatRecordingService.recordAttributionSnapshot()
  writes to session JSONL; client.ts restores on session resume
- PR attribution: addAttributionToPR() in shell.ts detects `gh pr create`
  and appends "🤖 Generated with Qwen Code (N-shotted by Qwen-Coder)"
- Session baseline: saves content hash on first AI edit of each file
  for precise human/AI contribution detection
- generatePRAttribution() method for programmatic access

* fix: audit fixes — initial commit handling, cron prompt exclusion, failed commit counter preservation

- Handle initial commit (no HEAD~1) by detecting parent with rev-parse
  and falling back to --root for first commit in repo
- Exclude Cron-triggered messages from promptCount (not user-initiated)
- Add commitSucceeded parameter to clearAttributions() so failed/disabled
  commits don't reset the prompts-since-last-commit counter
- Add test for clearAttributions(false) behavior

* fix: cross-platform and correctness fixes from multi-round audit

- Normalize path.relative() to forward slashes for Windows compatibility
- Use diff-tree --root for initial commits (git diff --root is invalid)
- Replace String.replace() with indexOf+slice so `$&` and other special
  replacement patterns in the inserted text aren't expanded
- Fix clearAttributions(false→true) when co-author disabled but commit succeeded
- Use real newlines instead of literal \n in PR attribution text
- Add surface fallback in restoreFromSnapshot for version compatibility
- Fix single-quote regex to not assume bash supports \' escaping
- Case-insensitive directory matching in generated file detection
- Handle renamed file brace notation in parseDiffStat

* fix(attribution): also snapshot on ToolResult turns so resume keeps tool edits

Previously, recordAttributionSnapshot() only ran at the start of UserQuery
and Cron turns — before the tools for that turn had executed. A session
that wrote a file in turn 1 and committed in turn 2 (across process
boundaries via --resume) lost the tracked edit: the last persisted
snapshot was the turn-1-start snapshot (empty fileStates), so on resume
the attribution service restored empty state and no git notes were
attached to the commit.

Move the snapshot call out of the UserQuery/Cron conditional and run it
on every non-Retry turn. ToolResult turns are scheduled right after
tools execute, so their start-of-turn snapshot now captures any edits
those tools made. Retry turns are skipped since the state is unchanged
from the prior turn.

Added unit tests asserting the snapshot fires for ToolResult/UserQuery
turns and skips Retry turns.

Verified end-to-end in a scratch repo: write-file in turn 1 (no commit)
→ exit → --resume → commit in turn 2 → git notes now contain the
recorded file with correct aiChars and promptCount: 2.
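
The turn-level rule above, as a tiny predicate (the MessageType values are assumptions standing in for the real client.ts enum):

```typescript
enum MessageType { UserQuery, Cron, ToolResult, Retry }

function shouldSnapshotAttribution(messageType: MessageType): boolean {
  // Retry turns replay the previous turn; attribution state is unchanged,
  // so re-persisting the same snapshot would be pure noise.
  return messageType !== MessageType.Retry;
}
```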

* refactor(attribution): merge duplicate retry guard and update stale doc

Collapse the two back-to-back messageType !== Retry blocks in
sendMessageStream into one, and refresh chatRecordingService's
recordAttributionSnapshot doc comment to reflect that snapshots fire
on every non-retry turn (not just after user prompts).

* feat(attribution): split gitCoAuthor into independent commit and pr toggles

Matches the shape used upstream in Claude Code's `attribution.{commit,pr}`
so users can disable the PR body line without losing the commit-message
Co-authored-by trailer (or vice versa). The previous boolean forced both
to move together, which conflated two different surfaces.

- settingsSchema: gitCoAuthor becomes an object with nested commit/pr
  booleans, each `showInDialog: true` so both appear in /settings.
- Config constructor accepts legacy boolean (coerced to { commit: v, pr: v })
  so stored preferences from the pre-split schema carry over.
- shell.ts: attachCommitAttribution and addCoAuthorToGitCommit read .commit;
  addAttributionToPR reads .pr.
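
The legacy-boolean coercion might look like this (a hedged sketch; the real normalizer lives in the Config constructor and its exact shape is assumed):

```typescript
type GitCoAuthorSetting = boolean | { commit?: boolean; pr?: boolean } | undefined;

function normalizeGitCoAuthor(
  value: GitCoAuthorSetting,
): { commit: boolean; pr: boolean } {
  if (typeof value === 'boolean') {
    // Pre-split schema: one boolean drove both surfaces.
    return { commit: value, pr: value };
  }
  // Post-split schema: fill missing sub-toggles with the default (enabled).
  return { commit: value?.commit ?? true, pr: value?.pr ?? true };
}
```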

* feat(settings): add v3→v4 migration for gitCoAuthor shape change

Legacy gitCoAuthor was a single boolean and shipped ~4 months ago; the
previous commit split it into { commit, pr } sub-toggles. Without a
migration, users who had set gitCoAuthor: false would see the settings
dialog show the default (true) for both sub-toggles — misleading and
likely to flip their preference on the next save because getNestedValue
returns undefined when asked for .commit on a boolean.

- New v3-to-v4 migration expands boolean → { commit: v, pr: v },
  preserves already-object values, resets invalid values to {} with a
  warning.
- SETTINGS_VERSION bumped 3 → 4; existing integration assertions use the
  constant so the next bump is a single-line change.
- Regenerate vscode-ide-companion settings.schema.json to reflect the
  new nested shape.
- Docs: split the single gitCoAuthor row into .commit and .pr.
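
The migration's branching can be sketched like this (function name and exact shape are assumptions; only the described behavior is taken from the commit):

```typescript
function migrateGitCoAuthorV3toV4(value: unknown): { commit?: boolean; pr?: boolean } {
  if (typeof value === 'boolean') {
    return { commit: value, pr: value }; // legacy boolean drives both toggles
  }
  if (value !== null && typeof value === 'object' && !Array.isArray(value)) {
    // Already-object values (including partial ones) pass through untouched;
    // defaults are filled later at the Config boundary, not here.
    return value as { commit?: boolean; pr?: boolean };
  }
  // null / array / number / string all count as invalid.
  console.warn('gitCoAuthor: invalid value, resetting to {}');
  return {};
}
```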

* test(migration): cover null/array/number and partial object for v3-to-v4

The migration already treats any non-boolean, non-object value as invalid
(reset to {} with warning), but the existing test only exercised the
string "yes" branch. Add parameterized cases for null, array, and number
so a future regression that accepts these in the valid bucket gets caught.
Also cover partial objects — the migration must not fill in missing
defaults; that responsibility lives in normalizeGitCoAuthor at the
Config boundary.

* fix(shell): address PR review for compound commits and PR body escaping

Two critical issues called out in review:

1. attachCommitAttribution treated the final shell exit code as proof
   that `git commit` itself failed. For compound commands like
   `git commit -m "x" && npm test`, the commit can succeed and a later
   step can fail; the previous code then cleared attribution without
   writing the git note. Now we snapshot HEAD before the command (via
   `git rev-parse HEAD` through child_process.execFile, kept independent
   of the mockable ShellExecutionService) and detect commit creation by
   HEAD movement, so attribution lands whenever a new commit was created
   regardless of later steps.

2. addAttributionToPR spliced the configured generator name into the
   user-approved `gh pr create --body "..."` argument verbatim. A name
   containing `"`, `$`, a backtick, or `'` could break the command or be
   evaluated as command substitution. Now we shell-escape the appended
   text per the surrounding quote style before splicing.

Tests cover the new escape paths for both double- and single-quoted
bodies, including a generator name designed to break interpolation
(`$(rm -rf /) "danger" \`eval\``) and one with an apostrophe.
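
The quote-style-aware escaping can be sketched like this (helper names are hypothetical; bash quoting rules are the only assumption):

```typescript
function escapeForDoubleQuotes(text: string): string {
  // Inside "...", backslash, double quote, dollar, and backtick stay live.
  return text.replace(/[\\"$`]/g, (ch) => '\\' + ch);
}

function escapeForSingleQuotes(text: string): string {
  // Inside '...', nothing is live except the closing quote itself:
  // end the string, emit an escaped quote, and reopen.
  return text.split("'").join("'\\''");
}

console.log(escapeForDoubleQuotes('$(rm -rf /) "danger"'));
```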

* fix(attribution): address Copilot review on shell, schema, and totals

Six items called out on PR #3115 by Copilot:

- shell.ts: addAttributionToPR's bash quote escaping doesn't apply to
  cmd.exe / PowerShell, where `\$` and `'\''` aren't honored. Skip the
  PR body rewrite entirely on Windows — losing PR attribution there is
  preferable to corrupting the user-approved `gh pr create` command.

- attributionTrailer.ts + shell.ts call site: buildGitNotesCommand used
  bash-style single-quote escaping on the JSON note, which is broken on
  Windows. Switched to argv form (`{ command, args }`) and routed the
  invocation through child_process.execFile so shell quoting is bypassed
  entirely. Tests updated to assert the argv shape.

- commitAttribution.ts: when a tracked file's aiChars exceeded the diff
  --stat-derived diffSize (long-line edits where diffSize ≈ lines * 40),
  humanChars clamped to 0 but aiChars stayed inflated, leaving aiChars +
  humanChars > the committed change magnitude. Clamp aiChars to diffSize
  so the totals stay consistent.

- shell.ts parseDiffStat: only normalized rename brace notation
  (`{old => new}`). Cross-directory renames emit `old/path => new/path`
  without braces, leaving diffSizes keyed by the full string. Added a
  second normalization step.

- shell.ts: addAttributionToPR docstring claimed `(X% N-shotted)` but
  the implementation only emits `(N-shotted by Generator)`. Updated the
  docstring to match the actual behavior.

- settingsSchema.ts + generator: gitCoAuthor went from boolean to object
  in the V4 migration. The exported JSON Schema now wraps the field in
  `anyOf: [boolean, object]` (via a new `legacyTypes` hint on
  SettingDefinition) so users with a stored boolean don't see a spurious
  IDE warning before their next launch runs the migration.

* fix(attribution): parse binary diffs, source generator from model, sync schema $version

Three follow-up review items from Copilot:

- parseDiffStat now handles git's binary-diff format (`path | Bin A ->
  B bytes`) using the byte delta with a floor of 1. Without this,
  binary edits arrived at the attribution payload as diffSize=0 and
  were silently dropped. Also extracted the parser to a top-level
  exported function so the binary path is unit-testable; added five
  targeted cases (text/binary/rename normalisation/summary skip).

- attachCommitAttribution now passes `this.config.getModel()` into
  generateNotePayload instead of the user-configurable
  `gitCoAuthor.name`. The note's `generator` field reflects which
  model produced the changes — and CommitAttributionService's
  sanitizeModelName() actually has the codename to scrub now.

- generate-settings-schema.ts imports SETTINGS_VERSION instead of
  hardcoding `default: 3`, so a future bump propagates to the emitted
  JSON schema in one place. Regenerated settings.schema.json bumps
  $version's default from 3 to 4 to match the V4 migration.

* fix(attribution): repo-root baseDir, escape co-author trailer, switch to numstat

Three Critical items called out by wenshao:

- attachCommitAttribution was passing config.getTargetDir() as `baseDir`
  to generateNotePayload, but getCommittedFileInfo returns paths
  relative to `git rev-parse --show-toplevel`. When the working
  directory was a subdirectory of the repo, path.relative produced
  `../...` keys that never matched in the AI-attribution lookup,
  silently zeroing out attribution for every file outside getTargetDir.
  StagedFileInfo now carries an optional `repoRoot` (filled in by
  getCommittedFileInfo via `git rev-parse --show-toplevel`) and the
  caller prefers it over the target dir.

- addCoAuthorToGitCommit interpolated `gitCoAuthorSettings.name` and
  `.email` into the rewritten command without escaping. A name
  containing `$()`, backticks, or `"` could be evaluated as command
  substitution under double quotes, or break the user-approved
  `git commit -m "..."` quoting. Now escapes per the surrounding quote
  style with the same helpers addAttributionToPR uses, gates on
  non-Windows for the same shell-quoting reason, and fixes the regex
  to accept `-m"msg"` shorthand (no space) so users who type the
  bash-shorthand form aren't silently denied a trailer.

- parseDiffStat used `git diff --stat` output and approximated each
  line as ~40 chars by parsing a graphical text bar. Replaced with
  `git diff --numstat` which gives unambiguous integer
  additions+deletions per file; the heuristic remains but the parser
  is no longer fooled by the visual `++--` markers. Binary entries
  fall back to a fixed estimate so they still land in the map (rather
  than dropping out as diffSize=0).

Suggestions also addressed: stale duplicate JSDoc on
addCoAuthorToGitCommit removed, misleading `clearAttributions`
comments rewritten to describe what the boolean argument actually
does. Tests cover the new shorthand path, escape behavior, and
numstat parsing (text/binary/rename/malformed).
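
A minimal `--numstat` parser along these lines (tab-separated `added<TAB>deleted<TAB>path` lines, `-` marking binary entries; the function name, rename normalization, and the binary fallback constant are illustrative assumptions):

```typescript
const BINARY_SIZE_ESTIMATE = 64; // hypothetical fixed estimate for binary edits

function parseNumstat(output: string): Map<string, number> {
  const diffSizes = new Map<string, number>();
  for (const line of output.split('\n')) {
    const [added, deleted, ...rest] = line.split('\t');
    if (rest.length === 0) continue; // malformed or empty line
    let path = rest.join('\t');
    // Normalize both rename notations: "src/{old => new}/f" and "old => new".
    path = path.replace(/\{([^}]*) => ([^}]*)\}/, '$2').replace(/^.* => /, '');
    if (added === '-' || deleted === '-') {
      // Binary entry: no line counts, but keep it in the map.
      diffSizes.set(path, BINARY_SIZE_ESTIMATE);
    } else {
      diffSizes.set(path, Number(added) + Number(deleted));
    }
  }
  return diffSizes;
}
```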

* fix(shell): shell-aware git-commit detection and apostrophe-escape handling

Two more Critical items called out by wenshao plus the matching Copilot
quote-handling notes:

- attachCommitAttribution and addCoAuthorToGitCommit now go through a
  shell-aware `looksLikeGitCommit` helper instead of a raw
  `\bgit\s+commit\b` regex. The helper splits the command on shell
  separators (`splitCommands`) and checks each segment, so `echo "git
  commit"` no longer triggers attribution clearing or trailer
  injection. The same helper bails on any segment that contains `cd`
  or `git -C <path>`, since either could redirect the commit into a
  different repo than our cwd — writing notes or capturing HEAD there
  would corrupt unrelated state.

- The post-command attribution call now runs regardless of whether the
  shell wrapper aborted. `git commit -m "x" && sleep 999` could move
  HEAD and then time out, leaving the new commit without its
  attribution note while the stale per-file attribution stayed around
  for a later unrelated commit. attachCommitAttribution still gates on
  HEAD movement, so it's a no-op when no commit was actually created.

- The `-m '...'` and `--body '...'` regexes used to match only the
  first quote segment, so a command like `git commit -m 'don'\''t'`
  (bash's standard apostrophe-escape form) would have the trailer
  spliced mid-message and break the command's quoting. The single-
  quote patterns now use a negative lookahead / inner alternation to
  either skip those messages entirely (commit path) or match the
  whole escape-aware body (PR path).

Tests cover the new behavior: quoted "git commit" is left alone, the
`cd && git commit` and `git -C` patterns get no trailer, and the
apostrophe-escape form passes through unchanged for both `-m` and
`--body`.

* fix(attribution): drop magic 100 fallback for empty deletions

Deleted files with no AI tracking now use diffSize directly. With
numstat as the input source, diffSize is an exact count, and an
empty-file deletion legitimately reports zero — a magic fallback would
only inflate totals.

* fix(shell): broaden git-commit detection, gate background, drop dead helpers

Five Copilot follow-ups:

- looksLikeGitCommit now strips leading env-var assignments
  (`GIT_COMMITTER_DATE=now git commit ...`) and a small allowlist of
  safe wrappers (`sudo`, `command`) before matching. The previous
  exact-prefix match silently skipped trailer injection on common
  real-world commit forms.

- A new looksLikeGhPrCreate (same shell-aware shape) replaces the raw
  `\bgh\s+pr\s+create\b` regex in addAttributionToPR, so quoted text
  like `echo "gh pr create --body \"x\""` no longer triggers a
  command-string rewrite.

- executeBackground refuses to run `git commit` and tells the user to
  re-run foreground. The BackgroundShellRegistry lifecycle has no
  hook for the post-command pre/post-HEAD comparison or git-notes
  write, so allowing the commit through would create the new commit
  without notes and leak stale per-file attribution into the next
  foreground commit.

- recordDeletion was unused outside its own test — removed (and the
  test). When AI-driven deletions need tracking we'll add it with an
  actual integration point rather than carrying dead API surface.

- generatePRAttribution was likewise unused; addAttributionToPR
  builds the trailer string inline. The two formats had already
  diverged. Removed the helper and its tests; reviving from git
  history is straightforward if a future caller needs it.

Tests: env-var and sudo prefixes now produce trailers; quoted
"gh pr create" leaves the command unchanged; existing 81 shell tests
still pass alongside the trimmed 25 commitAttribution tests.

* fix(shell): unified git-commit detection split by intent

Six items called out across CodeQL, Copilot, and wenshao:

- The earlier `looksLikeGitCommit`/`stripCommandPrefix` returned a
  single yes/no and rejected ANY `cd` in the chain. That fixed the
  wrong-repo case but also disabled attribution for `git commit -m
  "x" && cd ..` (commit already landed safely in our cwd; the cd
  came after). It also conflated three distinct decisions onto one
  predicate.

  New `gitCommitContext` returns both `hasCommit` and
  `attributableInCwd`, walking segments in order so that a `cd`
  AFTER the commit doesn't invalidate it. Callers now pick the right
  arm:
  - background-mode refusal uses `hasCommit` (refuses even
    `cd /elsewhere && git commit` since we can't attribute it
    afterward either way)
  - HEAD snapshot, addCoAuthorToGitCommit, and the
    attachCommitAttribution gate use `attributableInCwd`

- Tokenisation switches from a regex while-loop to `shell-quote`'s
  `parse`. Quoted env values like `FOO="a b" git commit` now skip
  correctly (the old `\S*\s+` form would cut after the opening
  quote). Eliminates the CodeQL polynomial-regex alert at the same
  time since the `\S*\s+` pattern is gone.

- attachCommitAttribution now snapshots prompt counters via
  `clearAttributions(true)` whenever a commit lands, even if no
  per-file attributions were tracked. Previously the early-return
  on `hasAttributions() === false` meant `promptCountAtLastCommit`
  never advanced, so a later `gh pr create` reported an inflated
  N-shotted count spanning multiple commits.

Tests: env-var and sudo prefixes still produce trailers; quoted
"git commit" / "gh pr create" leave commands unchanged; cd BEFORE
commit suppresses the rewrite while cd AFTER commit does not; `git
-C <path> commit` is treated as a commit (refused in background)
but not as attributable.
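
The two-flag shape can be sketched as follows (tokenisation here is naive whitespace splitting purely for illustration; the real code uses shell-quote's parse and parseGitInvocation):

```typescript
interface GitCommitContext {
  hasCommit: boolean;         // some segment runs `git commit`
  attributableInCwd: boolean; // a commit ran before any cwd shift
}

function gitCommitContext(command: string): GitCommitContext {
  let hasCommit = false;
  let attributableInCwd = false;
  let cwdShifted = false; // one-way latch: a `cd` BEFORE a commit disqualifies it
  for (const segment of command.split(/&&|\|\||;/)) {
    const tokens = segment.trim().split(/\s+/);
    if (tokens[0] === 'cd') cwdShifted = true;
    if (tokens[0] === 'git' && tokens.includes('commit')) {
      hasCommit = true;
      // `-C <path>` commits elsewhere: a commit, but not attributable here.
      if (!cwdShifted && !tokens.includes('-C')) attributableInCwd = true;
    }
  }
  return { hasCommit, attributableInCwd };
}
```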

* fix(shell): position-independent git subcommand detection + bash-shell guard

Six review items, two of them critical:

- gitCommitContext was checking fixed-position tokens (`arg1`, `arg3`)
  and missed every git invocation that puts a global flag between
  `git` and the subcommand: `git -c user.email=x@y commit`,
  `git --no-pager commit`, `git -C /p -c k=v commit`, etc. In
  background mode these would slip past the refusal guard; in
  foreground they got no co-author trailer, no git note, and no
  prompt-counter snapshot. New `parseGitInvocation` walks past
  git's global flags (with their values) before reading the
  subcommand, and reports `changesCwd` for `-C` / `--git-dir` /
  `--work-tree`.

- The Windows guard on addCoAuthorToGitCommit and addAttributionToPR
  used `os.platform() === 'win32'`, which incorrectly skipped Windows
  + Git Bash (`getShellConfiguration().shell === 'bash'`). Switched
  both to gate on `getShellConfiguration().shell !== 'bash'` so Git
  Bash users keep the feature.

- attachCommitAttribution was re-parsing `gitCommitContext(command)`
  even though `execute()` already gates on `commitCtx.attributableInCwd`.
  Removed the redundant re-parse — drift between the two checks would
  silently diverge trailer injection from git-notes writes.

- tokeniseSegment (formerly tokeniseProgram) now logs via debugLogger
  on parse failure instead of swallowing silently. Easier to debug
  if shell-quote ever throws on something unusual.

- Added a comment on `cwdShifted` documenting that it's a one-way
  latch — `cd src && cd ..` will still skip attribution. The
  trade-off matches the wrong-repo guard's "better miss than corrupt
  unrelated repos" intent.

- Stale `--stat` reference in the aiChars-clamp comment updated to
  `--numstat` to match the actual git command in
  ShellToolInvocation.getCommittedFileInfo.

Tests: `git -c key=val commit` and `git --no-pager commit` now
produce a trailer; existing 82 shell tests still pass.
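
A simplified sketch of the flag-walking subcommand detection (the flag lists are illustrative, not the real module constants):

```typescript
const GIT_FLAGS_WITH_VALUE = new Set(['-c', '-C', '--git-dir', '--work-tree', '--exec-path']);

function parseGitInvocation(
  tokens: string[],
): { subcommand?: string; changesCwd: boolean } {
  let changesCwd = false;
  let i = 1; // tokens[0] is assumed to be 'git'
  while (i < tokens.length) {
    const tok = tokens[i];
    if (!tok.startsWith('-')) return { subcommand: tok, changesCwd }; // first non-flag token wins
    // -C (separate, attached -C/path, or =value), --git-dir, --work-tree all shift the repo.
    if (tok.startsWith('-C') || tok.startsWith('--git-dir') || tok.startsWith('--work-tree')) {
      changesCwd = true;
    }
    // A value-taking flag in its separate form consumes the next token too.
    i += GIT_FLAGS_WITH_VALUE.has(tok) ? 2 : 1;
  }
  return { changesCwd };
}
```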

* fix(shell): refuse multi-commit attribution; misc review follow-ups

Five follow-ups from the latest review pass:

- attachCommitAttribution now refuses to write a single git note for
  shell commands that produce more than one commit (e.g.
  `git commit -m a && git commit -m b`). The singleton's per-file
  attribution map can't be partitioned across the individual commits,
  so attaching the combined note to HEAD would mis-attribute earlier
  commits' changes to the last one. Walks `preHead..HEAD` via
  `git rev-list --count`; on multi-commit detection it snapshots the
  prompt counters and bails with a debug warning instead of writing
  a misleading note.

- parseGitInvocation now recognises the attached `-C/path` form
  (e.g. `git -C/path commit -m x`). shell-quote tokenises that as a
  single `-C/path` token which previously fell to the generic flag
  branch with `changesCwd = false`, leaving an out-of-cwd commit
  classified as attributable.

- attachCommitAttribution dropped its unused `command` parameter
  (the caller already gates on `commitCtx.attributableInCwd`, so
  re-parsing was removed earlier; the parameter became dead).

- Added wiring guards in edit.test.ts and write-file.test.ts:
  AI-originated edits/writes hit `CommitAttributionService.recordEdit`,
  `modified_by_user: true` skips, and write-file's distinction
  between a true new file and an overwritten empty file (`null` vs
  `''` old content) is now pinned by `aiCreated` assertions.

* fix(attribution): partial-commit clear, symlink baseDir, gh/git flag handling

Two Critical items, two Copilot, and five wenshao Suggestions:

- attachCommitAttribution's `finally` block used to call
  `clearAttributions()` unconditionally, wiping per-file tracking
  for files the AI had edited but the user excluded from this
  commit. Added `clearAttributedFiles(committedAbsolutePaths)` to
  the service and the call site now passes only the paths that
  actually landed in this commit; entries for un-`add`ed files stay
  pending for a later commit.

- generateNotePayload now runs both `baseDir` and each tracked
  absolute path through `fs.realpathSync` before `path.relative`.
  On macOS in particular `/var` symlinks to `/private/var`, so the
  toplevel from `git rev-parse --show-toplevel` and the absolute
  path captured by edit/write-file tools could diverge — producing
  `../../actual/path` keys in the lookup that never matched and
  silently zeroed all per-file AI attribution.

- tokeniseSegment now consumes value-taking sudo flags (`-u`,
  `-g`, `-h`, `-D`, `-r`, `-t`, `-C`, plus the long forms). Without
  this, `sudo -u other git commit` left `other` standing in for
  the program name and skipped the trailer entirely.

- A duplicate JSDoc block above `countCommitsAfter` (a leftover
  from the earlier extraction of `getGitHead`) was removed; both
  helpers now have one accurate comment each.

- attachCommitAttribution's multi-commit guard now also runs when
  `preHead === null` (brand-new repo), via `git rev-list --count
  HEAD`. A compound `git init && git commit -m a && git commit -m b`
  no longer slips through and mis-attributes combined data to the
  last commit.

- addCoAuthorToGitCommit's `-m` matching switched to `matchAll` and
  takes the LAST match. `git commit -m "title" -m "body"` puts the
  trailer at the end of the body so `git interpret-trailers`
  recognises it; the previous first-match behaviour stuffed the
  trailer in the title where git treats it as plain message text.

- addAttributionToPR's `--body` regex accepts both space and
  `=` separators (`--body "..."` and `--body="..."`); the `=` form
  is common with gh.

- New `parseGhInvocation` walks past gh's global flags
  (`--repo`, `-R`, `--hostname`) so `gh --repo owner/repo pr
  create ...` is detected. The earlier fixed-position check at
  tokens[1]/tokens[2] missed any command with a global flag.

- getCommittedFileInfo now fans out the two `rev-parse` calls and
  the three diff calls with `Promise.all`. They're independent and
  serialising them was paying spawn latency 5× per commit.

Tests: sudo with `-u user`, multi `-m`, `gh --repo owner/repo`,
`--body="..."`, plus the existing 84 shell tests still pass.
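
The last-match selection can be sketched with matchAll (double-quoted form only; the real code also handles single quotes and compares match indices across quote styles):

```typescript
const DOUBLE_QUOTED_M = /(?:-[a-zA-Z]*m|--message)\s*=?\s*"((?:[^"\\]|\\.)*)"/g;

function lastMessageMatch(command: string): RegExpMatchArray | undefined {
  let last: RegExpMatchArray | undefined;
  for (const m of command.matchAll(DOUBLE_QUOTED_M)) last = m; // keep the LAST -m
  return last;
}

// The trailer then splices after the final body, where
// `git interpret-trailers` will recognise it.
```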

* fix(attribution): canonicalize file paths centrally in CommitAttributionService

Two related Copilot follow-ups:

- recordEdit/getFileAttribution/clearAttributedFiles now run input
  paths through fs.realpathSync before storing/looking up, so a
  symlinked path (e.g. macOS /var ↔ /private/var) resolves to the
  same key regardless of which form the caller passes. Previously
  edit.ts/write-file.ts handed in non-realpath'd absolute paths
  while generateNotePayload tried to realpath only inside its
  lookup loop, leaving partial-clear and clear-on-finally paths
  unable to find entries when the forms diverged.

- restoreFromSnapshot also canonicalises on the way in so a
  session resumed from a pre-fix snapshot (where keys may not
  have been canonical) ends up with the same shape as newly
  recorded entries — otherwise a single file could end up with
  two parallel records.

- generateNotePayload's lookup loop dropped its per-entry realpath
  call (now redundant since keys are canonical at write time),
  keeping only the realpath of `baseDir` (which still comes from
  `git rev-parse --show-toplevel` and may be a symlink).

- Updated `clearAttributedFiles` doc to describe the new semantics:
  callers can pass either the resolved repo-relative path or an
  already-canonical absolute path, and either will match.
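
The canonicalise-at-the-boundary idea, with an injectable resolver so the sketch is testable without a filesystem (the real service calls fs.realpathSync directly; the class and method names here are hypothetical):

```typescript
class AttributionStore {
  private fileStates = new Map<string, number>(); // canonical path → aiChars
  constructor(private realpath: (p: string) => string) {}

  recordEdit(absPath: string, aiChars: number): void {
    const key = this.realpath(absPath); // canonical at write time
    this.fileStates.set(key, (this.fileStates.get(key) ?? 0) + aiChars);
  }

  getAiChars(absPath: string): number {
    return this.fileStates.get(this.realpath(absPath)) ?? 0; // canonical at read time
  }
}

// macOS-style symlink: /var → /private/var. Either form hits the same key.
const store = new AttributionStore((p) => p.replace(/^\/var\//, '/private/var/'));
store.recordEdit('/var/tmp/a.ts', 10);
console.log(store.getAiChars('/private/var/tmp/a.ts')); // 10
```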

* fix(attribution): canonicalize-from-root cleanup; fix mixed-quote -m / gh -R=

Five review items, one Critical:

- attachCommitAttribution now canonicalises via the repo *root* (one
  realpath call) and resolves committed paths against that canonical
  root, rather than per-leaf realpath inside clearAttributedFiles.
  At cleanup time the leaf for a just-deleted file no longer exists,
  so per-leaf fs.realpathSync would fail and silently fall back to a
  non-canonical path that misses the stored canonical key — leaving
  stale attributions for deleted files.
  clearAttributedFiles drops its internal realpath and now documents
  the canonical-paths-required precondition explicitly.

- addCoAuthorToGitCommit picks the LAST `-m` regardless of quote
  style. Previously `doubleMatch ?? singleMatch` always preferred
  the last double-quoted match, so `git commit -m "Title" -m
  'Body'` injected the trailer into the title where git
  interpret-trailers would silently ignore it. Now compares match
  indices, and the escape helper follows the actually-selected
  match's quote style.

- parseGhInvocation handles `-R=value` (the equals form of the
  short `--repo` alias). `--repo=...` and `--hostname=...` were
  already covered; `-R=...` previously fell through to the generic
  flag branch and skipped the value.

- New tests for the symlink-aware canonicalisation: macOS-style
  `/var` ↔ `/private/var` mapping is mocked via vi.mock on
  node:fs, with cases for record-then-look-up under either form,
  generateNotePayload with a symlinked baseDir, partial clear via
  the canonical-root-derived path (deleted leaf), and snapshot
  restore canonicalisation.

- Doc-only: integration-test header comments updated from
  "V1 -> V2 -> V3" / "migration to V3" to reflect the actual V4
  end state (assertions already used the literal `4`).

* fix(shell): scope -m rewrite to commit segment, reject nested matches

Two Critical findings on addCoAuthorToGitCommit, plus a Copilot
maintainability nit:

- The `-m` regex used to scan the whole compound command, so
  `git commit -m "fix" && git tag -a v1 -m "release"` would target
  the LATER tag annotation (last -m wins) and splice the trailer
  there instead of the commit message. The rewrite now scopes to
  the actual `git commit` segment via a new
  findAttributableCommitSegment(): same shell-aware walk
  gitCommitContext does, but returning the segment's character
  range so the regex can be run on a slice and spliced back into
  the original command.

- Within the segment, a literal `-m '...'` *inside* a quoted body
  was treated as a real later -m. For
  `git commit -m "docs mention -m 'flag' for completeness"`, the
  inner single-quoted -m sits at a higher index than the real
  outer -m, and the previous index comparison would have it win —
  splicing the trailer mid-message and corrupting the quoting.
  The new code checks whether the candidate is nested inside the
  other quote-style's range (start/end containment) and prefers
  the outer match when so.

- Hoisted three constant Sets (sudo flag list, git global flags
  taking values, git global flags shifting cwd, gh global flags)
  out of the per-call scope to module constants. Functional
  no-op, but keeps the parsing helpers easier to read and avoids
  re-allocating the Sets on every command.

Two regression tests added for the cases above:
- inner `-m '...'` inside the outer message body is preserved
  literally and the trailer lands after the body
- `git tag -a v1 -m "release notes"` after a real
  `git commit -m "fix"` is left untouched, with the trailer
  appended to "fix" only
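
The segment-range idea can be sketched like this (separator handling reduced to `&&` for illustration; the real findAttributableCommitSegment reuses the shell-aware walk):

```typescript
function findCommitSegmentRange(
  command: string,
): { start: number; end: number } | null {
  let start = 0;
  for (const part of command.split('&&')) {
    const end = start + part.length;
    const tokens = part.trim().split(/\s+/);
    if (tokens[0] === 'git' && tokens[1] === 'commit') {
      // Run the -m regex on command.slice(start, end), then splice the
      // rewritten slice back into the original command.
      return { start, end };
    }
    start = end + 2; // skip the '&&' separator
  }
  return null;
}
```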

* fix(attribution): cd-leak, numstat partial failure, $() bailout, gh pr new alias

Five Critical/Suggestion items:

- `cd subdir && git commit` (or any non-attributable commit chain
  whose HEAD movement still happens in our cwd, e.g. cd into a
  subdirectory of the same repo) used to skip attribution AND fail
  to clear pending per-file entries. Those entries then leaked into
  the next foreground commit, inflating its AI percentage. New
  `else if (commitCtx.hasCommit)` branch in execute() compares pre-
  and post-HEAD; if HEAD moved we drop the per-file state. preHead
  is now snapshotted whenever ANY commit was attempted, not only
  attributable ones.

- getCommittedFileInfo's three diff calls run in `Promise.all`. If
  `--numstat` failed while `--name-only` succeeded, every file's
  diffSize would be 0 and generateNotePayload would clamp aiChars
  to 0 — emitting a structurally valid note with all-zero AI
  percentages. Detect the partial-failure shape (files non-empty,
  diffSizes empty) and return empty so no note is written.

- addCoAuthorToGitCommit and addAttributionToPR now bail when the
  captured `-m`/`--body` value contains `$(`. The tool description
  recommends `git commit -m "$(cat <<'EOF' ... EOF)"` for
  multi-line messages, but the regex's `(?:[^"\\]|\\.)*` body group
  stops at the first interior `"` from a nested shell token —
  splicing the trailer there breaks the command before it reaches
  the executor.

- looksLikeGhPrCreate now accepts `gh pr new` as well — it's a
  documented alias for `gh pr create` and was silently skipped.

- Removed `incrementPermissionPromptCount` / `incrementEscapeCount`
  and their getters: they had no production callers, so the backing
  fields just round-tripped through snapshots as 0. The four
  snapshot fields are now optional so pre-fix snapshots that carry
  non-zero values still load cleanly and just get ignored.

Three regression tests added: heredoc-style `-m "$(cat <<EOF...)"`
preserved literally, heredoc-style `--body` likewise, `gh pr new
--body "..."` rewritten with attribution.
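
The `$(` bailout can be sketched as a guard over the captured body (regex simplified to the double-quoted `-m` form; the guard's name is hypothetical):

```typescript
const DOUBLE_QUOTED_BODY = /-m\s*"((?:[^"\\]|\\.)*)"/;

function safeToSpliceTrailer(command: string): boolean {
  const match = DOUBLE_QUOTED_BODY.exec(command);
  if (!match) return false;
  // Heredoc-style messages (`-m "$(cat <<'EOF' ... EOF)"`) contain command
  // substitution; the body group can stop at a nested token's quote, so
  // bail rather than splice mid-token.
  return !match[1].includes('$(');
}
```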

* fix(attribution): --amend, --message/-b aliases, .d.ts over-exclusion

Four Copilot follow-ups, three of them user-visible coverage gaps:

- `git commit --amend` was diffing `HEAD~1..HEAD` for attribution,
  which spans the entire amended commit (parent → amended) rather
  than the actual amend delta. A message-only amend would emit a
  note attributing every file in the original commit to this
  amend. New `isAmendCommit` helper detects the flag and
  getCommittedFileInfo switches to `HEAD@{1}..HEAD` (the pre-amend
  HEAD vs the amended HEAD); if the reflog is GC'd we bail with a
  warning rather than over-attribute.

- `git commit --message "..."` and `--message="..."` were silently
  skipped because the regex only recognised the short `-m` form.
  The flag prefix now matches both alternatives via
  `(?:-[a-zA-Z]*m|--message)\s*=?\s*` (non-capturing inner group
  so the existing `[full, prefix, body]` destructure still works).
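The widened flag prefix can be exercised in isolation — a sketch with the capture group simplified to plain double-quoted bodies (the real pattern also handles escapes and other quoting styles):

```typescript
// Sketch: `-[a-zA-Z]*m` covers `-m` and bundled forms like `-am`;
// `--message` covers the long flag; `\s*=?\s*` accepts both
// `--message "..."` and `--message="..."`.
const MESSAGE_FLAG = /(?:-[a-zA-Z]*m|--message)\s*=?\s*"([^"]*)"/;

function captureCommitMessage(command: string): string | null {
  const match = command.match(MESSAGE_FLAG);
  return match ? match[1] : null;
}
```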

- `gh pr create -b "..."` (the short alias for `--body`) was the
  same gap on the PR side; `(?:--body|-b)[\s=]+` now covers both
  forms.


- `.d.ts` was an over-broad blanket exclusion in
  EXCLUDED_EXTENSIONS — declaration files are commonly hand-authored
  (ambient declarations, asset shims that declare modules for
  `import './x.svg'`); the repo even contains
  `packages/vscode-ide-companion/src/assets.d.ts`. Removed `.d.ts`
  from the extensions Set and adjusted the test to assert the new
  behavior. Auto-generated `.d.ts` (e.g. `tsc --declaration`
  output) still gets caught by the build-directory rules.

Tests added: `--amend` plumbing covered by the new branch in
getCommittedFileInfo (no targeted unit test — the diff invocation
goes through ShellExecutionService and is exercised by the existing
post-command path); `--message "..."`, `--message="..."`, `-b "..."`,
and `-b="..."` all have positive trailer-injection assertions; the
`.d.ts` test is split into "hand-authored" (negative) and "in dist"
(positive).

* fix(attribution): cd-subdir, scope --body, multi-commit count guard, /clear reset

Four bugs flagged this round:

- gitCommitContext / findAttributableCommitSegment used a blanket
  "any cd shifts cwd" gate, breaking the very common
  `cd subdir && git commit -m "..."` flow even though the commit
  lands in the same repo. New `cdTargetMayChangeRepo` heuristic:
  treat relative paths that don't escape upward (no leading `..`,
  no absolute path, no `~`/`$VAR` expansion, no bare `cd`/`cd -`)
  as in-repo and let attribution proceed. Conservative on anything
  it can't statically verify.

- addAttributionToPR was running the `--body`/`-b` regex against
  the FULL compound command string. In
  `curl -b "session=abc" && gh pr create --body "summary"` the
  regex would match curl's `-b` cookie flag and inject attribution
  into the cookie value, corrupting the curl call. Added
  `findGhPrCreateSegment` (analog of `findAttributableCommitSegment`)
  and scoped the body regex to that segment, splicing back into
  the original command via offsetting the in-segment match index.

- The multi-commit guard treated `runGitCount === 0` as "single
  commit" and bypassed itself. After `commitCreated === true`, a
  count of 0 is impossible in normal operation — it means
  rev-list errored or timed out. Now we bail on `commitCount !== 1`
  with a tailored message: anything other than exactly 1 commit
  is suspicious and refuses the note.

- The CommitAttributionService singleton survives across
  `Config.startNewSession()` (the `/clear` and resume paths). New
  `CommitAttributionService.resetInstance()` call alongside the
  existing chat-recording / file-cache resets in startNewSession
  prevents pending attributions from a prior session attaching to
  a commit in the new one.

Three regression tests added: `cd src && git commit` produces a
trailer (in-repo cd), `cd .. && git commit` does not (could escape
repo root), and `curl -b "..." && gh pr create --body "..."` leaves
curl's cookie value untouched while attribution lands in gh's body.

* fix(attribution): cd embedded .., env wrapper, Windows ARG_MAX, segment-locator warn

Four review items, all small but real:

- cdTargetMayChangeRepo missed embedded `..` traversal — `cd
  foo/../../escape` and similar would slip past the leading-`..`
  check and be treated as in-repo. Added an `includes('/..')` /
  `includes('\\..')` check (catches both POSIX and Windows separators
  without false-positiving on `..` sequences inside ordinary names
  like `notes..md`, which only escape upward when they form a full
  path segment).
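Putting this commit's embedded-`..` check together with the earlier gates, the heuristic can be sketched roughly as follows (a hypothetical reconstruction, not the actual implementation):

```typescript
// Sketch of cdTargetMayChangeRepo: true means "can't statically verify
// the cd stays inside the repo" — attribution is skipped conservatively.
function cdTargetMayChangeRepo(target: string): boolean {
  const t = target.trim();
  if (t === '' || t === '-') return true;                   // bare `cd` / `cd -`
  if (t.startsWith('/') || t.startsWith('~')) return true;  // absolute / home
  if (t.includes('$')) return true;                         // $VAR expansion
  if (t.startsWith('..')) return true;                      // leading upward escape
  if (t.includes('/..') || t.includes('\\..')) return true; // embedded escape
  return false;                                             // plain subdir: in-repo
}
```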

- tokeniseSegment now recognises `env` as a safe wrapper alongside
  `sudo`/`command`, so `env GIT_COMMITTER_DATE=now git commit ...`
  resolves to `git`. After the wrapper detection we also skip any
  `KEY=VALUE` argv entries (env's own argument syntax for setting
  vars before the program).

- buildGitNotesCommand's MAX_NOTE_BYTES dropped from 128 KB to
  30 KB. Windows' CreateProcess lpCommandLine is capped around
  32,768 UTF-16 chars including the executable path and other argv
  entries; a 128 KB note would still fail to spawn even though
  the function returned a command instead of null. 30 KB leaves
  ~2 KB of headroom for the rest of the argv on Windows and is
  larger than any real commit's metadata in practice.

- findAttributableCommitSegment / findGhPrCreateSegment now log a
  debugLogger.warn when `command.indexOf(sub, cursor)` returns -1
  — splitCommands strips line continuations (`\<newline>`), so a
  multi-line command can have the trimmed segment text fail to
  match its source. Previously the segment was silently skipped
  with no signal; the warn makes the failure observable when
  QWEN_DEBUG_LOG_FILE is set.

Two regression tests added: `cd foo/../../escape && git commit`
gets no trailer (embedded-`..` heuristic catches it), and
`env GIT_COMMITTER_DATE=now git commit` does (env wrapper skipped).

* fix(attribution): scope isAmendCommit to attributable segment only

`git -C ../other commit --amend && git commit -m x` would previously
flag the second (fresh) commit as an amend, causing
attachCommitAttribution to diff `HEAD@{1}..HEAD` against an unrelated
reflog entry. Mirror findAttributableCommitSegment's cd/cwd tracking
so only the first commit segment that runs in the original cwd
determines amend status.

* fix(attribution): last-match --body, symlink leaf canonicalisation, scoped prompt count

- addAttributionToPR: use matchAll/last-match for `--body`/`-b` so the
  trailer lands in the gh-honoured (final) body when multiple flags are
  present. Mirrors addCoAuthorToGitCommit. Adds regression test.
- attachCommitAttribution: also fs.realpathSync the per-file resolved
  path (not just the repo root) so files behind intermediate symlinks
  are matched against canonical keys recordEdit stored, instead of
  silently zeroing attribution and leaking entries past commit.
- incrementPromptCount: scope to SendMessageType.UserQuery — ToolResult,
  Retry, Hook, Cron, Notification are model/background re-entries of
  the same logical turn. Tracking them all inflated the "N-shotted"
  trailer (one user message could become 10-shotted with 10 tool calls).
- AttributionSnapshot: add `version: 1` field; restoreFromSnapshot now
  refuses incompatible versions and validates per-field types so a
  partially-written snapshot can't seed `Math.min(undefined, n) === NaN`
  into git-notes payloads.
- Drop unused permission/escape counters (declared, persisted, never
  read or incremented) — fields, snapshot tolerance, and clear-method
  bookkeeping all removed; AttributionSnapshot interface simplifies.
- isGeneratedFile: switch directory rule from substring `.includes('/dist/')`
  to segment-boundary check (split on `/`) so project dirs like
  `my-dist/` or `xbuild/` don't match. `.lock` removed from the blanket
  extension exclusion — well-known lockfiles already covered by
  EXCLUDED_FILENAMES; hand-authored `.lock` files (e.g. `.terraform.lock.hcl`)
  now stay attributable.
- getClientSurface: document `QWEN_CODE_ENTRYPOINT` as the embedder
  override hook so the always-`'cli'` default is intentional.
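The isGeneratedFile segment-boundary rule can be sketched as follows (the directory and filename sets here are illustrative, not the real lists):

```typescript
const EXCLUDED_DIRECTORY_SEGMENTS = new Set(['dist', 'build', 'out', 'node_modules']);
const EXCLUDED_FILENAMES = new Set(['package-lock.json', 'yarn.lock', 'pnpm-lock.yaml']);

function isGeneratedFile(relPath: string): boolean {
  const segments = relPath.split('/');
  const basename = segments[segments.length - 1];
  if (EXCLUDED_FILENAMES.has(basename)) return true;
  // Directory rules check whole segments, so `my-dist/` no longer matches
  // the way a substring `.includes('/dist/')` would.
  return segments.slice(0, -1).some((seg) => EXCLUDED_DIRECTORY_SEGMENTS.has(seg));
}
```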

* fix(attribution): skip values for env -u NAME and -S string

`env`'s value-taking flags (`-u`/`--unset`, `-S`/`--split-string`) were
not in the wrapper's flag-skip allowlist, so `env -u FOO git commit ...`
left FOO as the next token and the parser treated it as the program —
masking the real `git commit` from attribution detection. Add an
ENV_FLAGS_WITH_VALUE table mirroring the sudo allowlist. Regression
test added.
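A rough sketch of the env-wrapper resolution after this fix (hypothetical helper; the real tokeniser also handles `sudo`/`command` and quoting):

```typescript
// Value-taking env flags whose next token must be skipped.
const ENV_FLAGS_WITH_VALUE = new Set(['-u', '--unset', '-S', '--split-string']);

// Resolve the effective program behind an `env` wrapper by skipping
// flags, their values, and KEY=VALUE assignments.
function resolveEnvWrapped(argv: string[]): string | undefined {
  if (argv[0] !== 'env') return argv[0];
  let i = 1;
  while (i < argv.length) {
    const tok = argv[i];
    if (ENV_FLAGS_WITH_VALUE.has(tok)) { i += 2; continue; } // flag + its value
    if (tok.startsWith('-')) { i += 1; continue; }           // other flags
    if (tok.includes('=')) { i += 1; continue; }             // KEY=VALUE
    return tok;                                              // the real program
  }
  return undefined;
}
```

(`--unset=FOO`-style inline values are deliberately omitted from this sketch.)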

* fix(attribution): submodule leak, PR body nesting, shallow-clone bail, schema default

- attachCommitAttribution: when HEAD didn't move in our cwd, leave
  pending attributions alone instead of dropping them. The case can be
  a failed commit, `git reset HEAD~1`, OR `cd submodule && git commit`
  (inner repo's HEAD moves, ours doesn't). Dropping was overly
  aggressive and silently lost outer-repo edits in the submodule case.
- addAttributionToPR: mirror addCoAuthorToGitCommit's nested-match
  rejection so `gh pr create --body "docs mention -b 'flag'"` picks the
  outer `--body`, not the inner literal `-b`. Splicing into the inner
  match would corrupt the body. Regression test added.
- getCommittedFileInfo: when `rev-parse --verify HEAD~1` fails, also
  check `rev-list --count HEAD === 1` to confirm HEAD is the true
  root commit. In a shallow clone, HEAD~1 is unreadable but the commit
  has a parent recorded — falling back to `diff-tree --root` would
  diff against the empty tree and over-attribute the entire commit.
  Bail with a debug warning instead.
- generate-settings-schema: lift `default` (and `description`) out of
  the inner `anyOf[N]` schema to the outer level when wrapping with
  `legacyTypes`. Most JSON-schema-driven editors only surface
  top-level defaults; burying the default under `anyOf` lost the
  "enabled by default" hint. Also extend the default filter to
  publish non-empty plain objects (so `gitCoAuthor`'s default can
  appear). gitCoAuthor's source default updated to the runtime shape
  `{commit: true, pr: true}` to match `normalizeGitCoAuthor`.

* fix(attribution): drop unsafe full-clear, tag analysis-failure with null

ju1p (Copilot): the `else if (commitCtx.hasCommit)` branch fully
cleared the singleton on `cd /abs/same-repo/subdir && git commit`
(or `git -C . commit`), losing pending AI edits the user hadn't
staged. We can't tell which files were in the commit from this
branch, and the next attributable commit's partial-clear handles
cleanup correctly anyway. Drop the branch entirely.

ju2D (Copilot): `getCommittedFileInfo` returned the same empty
StagedFileInfo for both "could not analyze" (shallow clone, --amend
without reflog, --numstat partial failure, exception) and
"intentionally empty" (--allow-empty). The caller couldn't tell them
apart, so the partial clear became a no-op on analysis failure and
the just-committed AI edits leaked to the next commit. Switch the
return type to `StagedFileInfo | null` and have the caller treat
null as "fall back to full clear" while empty StagedFileInfo
(--allow-empty) leaves attributions intact for the next real commit.

* fix(attribution): dedup snapshot writes, cap excludedGenerated, doc commit toggle scope

rsf- (Copilot): recordAttributionSnapshot wrote a full snapshot to
the JSONL on every non-retry turn, even when the tracked state was
unchanged. Long-running sessions accumulated thousands of identical
snapshot copies, inflating session size and slowing /resume hydrate.
Dedup by JSON-equality with the prior write — first write always
goes through, identical successors are no-ops.

rsgo (Copilot): excludedGenerated path list was unbounded. A commit
churning thousands of generated artifacts (large dist/ rebuild)
could push the JSON note past MAX_NOTE_BYTES (30KB) and lose
attribution for the real source files in the same commit. Cap the
serialized sample at MAX_EXCLUDED_GENERATED_SAMPLE (50) and add
excludedGeneratedCount for the true total.

rsg9 + rshM (Copilot): the gitCoAuthor.commit description claimed
the toggle only controlled the Co-authored-by trailer, but
attachCommitAttribution also gates the per-file git-notes payload
on the same flag. Update both the schema description and the
settings.md table to mention both effects so disabling the option
isn't a silent surprise.

* fix(attribution): depth-1 shallow detection, snapshot dedup post-rewind/post-failure

sfGz (Copilot): rev-list --count HEAD === 1 cannot distinguish a
true root commit from a depth-1 shallow clone — both report 1
because rev-list only walks locally available objects. Switch to
git log -1 --pretty=%P HEAD which reads the parent SHA directly
from commit metadata: empty means a real root, non-empty means a
parent is recorded (whether or not its object is local). The
shallow-clone bail is now reliable.
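The root-vs-shallow distinction reduces to inspecting the `%P` field — a sketch where `parentField` is the stdout of `git log -1 --pretty=%P HEAD`, with null standing in for a failed git call:

```typescript
// %P is empty for a true root commit; non-empty means a parent SHA is
// recorded in the commit object even when the parent's object isn't
// locally available (the depth-1 shallow-clone case).
function isTrueRootCommit(parentField: string | null): boolean | null {
  if (parentField === null) return null; // git call failed: caller bails
  return parentField.trim() === '';
}
```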

sfIm (Copilot): the dedup key persisted across rewindRecording, so
the previous snapshot living on the now-abandoned branch would
match the next post-rewind snapshot and silently skip the write,
leaving /resume on the rewound session with no attribution state.
Reset lastAttributionSnapshotJson when rewindRecording fires.

sfJE (Copilot): dedup key was committed before the async write
settled. A transient write failure would update the key, then
permanently suppress all future identical snapshots even though
nothing was ever persisted. Switch to optimistic-set then rollback
on appendRecord rejection — synchronous identical calls dedup
cleanly, but a failed write clears the key so the next identical
snapshot retries. appendRecord now returns the per-record write
promise (writeChain still has its swallow-catch for chain liveness)
so callers needing per-write success can react to it. Tests added
in chatRecordingService.test.ts for both rewind-reset and
rollback-on-failure paths.

* fix(attribution): preHead race, regex apostrophe-escape, surface failures, dead code

t2G0 (deepseek-v4-pro): addCoAuthorToGitCommit single-quote regex now
matches the bash close-escape-reopen apostrophe form using
`((?:[^']|'\\'')*)` — the same pattern bodySinglePattern uses for
gh pr create. Input like `git commit -m 'don'\''t'` was previously
silently un-rewritten because the negative lookahead bailed; the
trailer now lands at the FINAL closing quote. Test updated.
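A simplified single-quote-only version of the pattern (the real regex also anchors to the commit-flag context and other quoting styles):

```typescript
// Inside bash single quotes, an apostrophe is written as '\''
// (close quote, escaped apostrophe, reopen quote). The body group
// accepts either a non-quote char or that literal four-char sequence,
// so the greedy `*` carries the match through to the FINAL close quote.
const SINGLE_QUOTED_MESSAGE = /-m\s+'((?:[^']|'\\'')*)'/;

function captureSingleQuotedMessage(command: string): string | null {
  const match = command.match(SINGLE_QUOTED_MESSAGE);
  return match ? match[1] : null;
}
```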

tMBP (gpt-5.5): preHead capture switched from concurrent async
getGitHead to a synchronous getGitHeadSync (execFileSync) BEFORE
ShellExecutionService.execute spawns the user's command. A fast
hot-cached git commit could move HEAD before the async rev-parse
resolved, leaving preHead === postHead and silently skipping the
attribution note. Trade ~10–50 ms event-loop block per
commit-shaped command for correctness of the post-command HEAD
comparison.
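A sketch of the synchronous capture, assuming Node's `execFileSync` (the real helper's name and options may differ):

```typescript
import { execFileSync } from 'node:child_process';

// Capture HEAD synchronously BEFORE the user's command spawns, so a
// fast hot-cached commit can't move HEAD under a still-pending async
// rev-parse. The timeout bounds the deliberate event-loop block.
function getGitHeadSync(cwd: string): string | null {
  try {
    return execFileSync('git', ['rev-parse', 'HEAD'], {
      cwd,
      encoding: 'utf8',
      timeout: 5000,
    }).trim();
  } catch {
    return null; // not a repo, no commits yet, or git unavailable
  }
}
```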

t2Gv (deepseek-v4-pro): attribution write failures (note exec
non-zero, payload too large, diff-analysis exception, shallow
clone / amend-without-reflog) are now surfaced on the shell tool's
returnDisplay AND llmContent so the user and agent both see when
their commit succeeded but the per-file git note didn't land.
attachCommitAttribution now returns string | null (warning text or
null for intentional skips like no-tracked-edits). Co-authored-by
trailer is unaffected — only the note is gated by these failures.

t2Gy (deepseek-v4-pro): committedAbsolutePaths now matches against
the canonical keys already stored in fileAttributions
(matchCommittedFiles iterates by relative path against the
canonical repo root) instead of re-resolving each diff path
on the fly. realpathSync(resolved) failed for deleted files and
didn't follow intermediate symlinks, leaving stale per-file
attribution alive past commit and inflating AI percentages on
subsequent commits.

t2HI (deepseek-v4-pro): removed dead sessionBaselines /
FileBaseline / contentHash / computeContentHash infrastructure
(~40 lines). The fields were written, persisted, and restored but
never read for any computation or decision. AttributionSnapshot
schema stays at version 1 — restore tolerates pre-fix snapshots
that carried the now-ignored baselines field.

t2HM (deepseek-v4-pro): extracted the duplicated lastMatch helper
in addCoAuthorToGitCommit and addAttributionToPR into a single
module-level lastMatchOf so future fixes can't be applied to only
one copy.

* chore(schema): regenerate settings.schema.json to match gitCoAuthor.commit description

The settingsSchema.ts source for `gitCoAuthor.commit.description` was
updated in 3c0e3293b but the JSON schema only picked up the OUTER
description rewrite and missed this inner property's. The Lint check
("Check settings schema is up-to-date") fails on that drift; this
commit re-runs `npm run generate:settings-schema` to sync them.

* fix(attribution): preserve unstaged AI edits across cleanup branches

uxU5 + uxVQ + uxUO (Copilot): every cleanup branch in
attachCommitAttribution that called clearAttributions(true) was
wholesale-erasing pending AI edits for files the user never staged
in this commit. Reviewer scenarios:
- multi-commit chain (`commit a && commit b`) bails out without
  writing a note, but unstaged edits to file Z (touched by neither
  commit) get cleared along with the chain's committed files.
- attribution toggle off: same — toggling the flag wipes pending
  unstaged work.
- analysis failure (shallow clone, --amend without reflog, partial
  diff failure): the finally-block fallback wholesale-cleared
  every pending file, consuming unrelated AI edits.
- 0%-AI commit: when no file in the commit was AI-touched,
  generateNotePayload was emitting an "0% AI" note attached to a
  commit that legitimately had no AI involvement — actively
  misleading metadata.

Add `noteCommitWithoutClearing()` to the service: snapshots the
prompt counter as the new "at last commit" but leaves the per-file
map alone. Use it in the multi-commit, no-tracked-edits,
toggle-off, and analysis-failure paths. The committed-files
partial-clear (clearAttributedFiles) still runs in the success
path. The 0%-AI no-match case now skips the note write entirely.

* fix(attribution): runGit null-on-failure, versionless v3→v4 migration

z54M (Copilot): runGit returned '' on both successful-empty-output
and silent failure, so a `--name-only` that errored mid-way through
the diff fan-out aliased to a real `--allow-empty` commit. The
empty-commit branch then preserved pending attributions, leaving
the just-committed file's tracked AI edit alive to re-attribute on
the next commit. Switch runGit to `Promise<string | null>`,
distinguishing exit code 0 (any output, including '') from non-zero
(null). The diff-stage fan-out and ancillary probes now treat null
as analysis failure and bail with `return null` instead of falling
into the empty-commit path.

z539 (Copilot): the v3→v4 `shouldMigrate` only fired on
`$version === 3`. A versionless settings file carrying the legacy
`general.gitCoAuthor: false` boolean would skip every migration
(gitCoAuthor isn't in V1_INDICATOR_KEYS — it post-dates V2), get
its `$version` normalized to 4 by the loader, and leave the
boolean in place. The settings dialog then reads the V4
`{commit, pr}` shape, sees missing keys, defaults both to true, and
silently overwrites the user's opt-out on the next save. Also fire
when `$version` is absent AND the value at `general.gitCoAuthor`
is a boolean. Tests cover the new path and confirm the existing
versioned/object-shape paths are untouched.

* fix(attribution): toggle-off partial clear, normalizeGitCoAuthor type-check, terraform lockfile

0oAK (Copilot): the gitCoAuthor.commit toggle-off branch returned
before computing the committed file set, leaving the just-committed
files' tracked AI work in the singleton. Re-enabling the toggle and
committing the same file again would re-attribute earlier (already-
committed) AI edits to the new commit. Move the toggle gate AFTER
matchCommittedFiles so the finally block does a proper partial clear
of the just-committed files even when the note write is skipped.

0oAg (Copilot): normalizeGitCoAuthor copied value?.commit / value?.pr
without type-checking. settings.json is hand-editable; a stored
`{ commit: "false" }` reached runtime as a truthy string and behaved
as if attribution were enabled. Add a per-field bool coercion that
falls back to the schema default (true) for any non-boolean,
matching what the dialog and IDE schema already imply. Tests cover
the string / number / null cases.

0oAo (Copilot): v3→v4 shouldMigrate only special-cased versionless
legacy booleans — versionless files with invalid gitCoAuthor values
(`"off"`, `[]`, etc.) skipped the migration and the loader stamped
`$version: 4` over the bad value. Runtime normalization then
silently re-enabled attribution. Extend shouldMigrate to fire on ANY
versionless non-object value at general.gitCoAuthor; the existing
migrate() body's drop-and-warn path resets it. Already-object
shapes (hand-edited to v4) still skip cleanly. Tests added.

0oAt (Copilot): `.terraform.lock.hcl` got dropped from generated-file
exclusion when `.lock` was removed from the blanket extension list
in 3c0e3293b. It's a generated provider lockfile in the same class
as `package-lock.json` and dominates Terraform-repo commits. Re-add
to EXCLUDED_FILENAMES and add a regression test covering both
repo-root and module-nested locations.

* fix(attribution): harden restoreFromSnapshot against corrupt payloads

1KMY (Copilot): snapshot.surface was copied without type validation.
A corrupted/partially-written snapshot with a non-string surface
(e.g. {}, 42, null) would later be serialized into the git note as
"[object Object]" and used as a Map key downstream, breaking the
expected payload shape. Type-check and fall back to the current
client surface for any non-string (or empty-string) value.

1KLq (Copilot): per-field sanitiseCount enforced
`promptCount >= 0` and `promptCountAtLastCommit >= 0` independently,
but never the cross-field invariant. A snapshot with
promptCountAtLastCommit > promptCount would surface a negative
getPromptsSinceLastCommit() and propagate as a "(-N)-shotted"
trailer into PR text. Clamp atLastCommit to total on restore.

1KL_ (Copilot): when a snapshot carried both the symlinked and
canonical paths for the same file (a session straddling the
canonicalisation fix), `set(realpathOrSelf(k), ...)` overwrote the
first entry with the second, silently dropping the AI contribution
the first form had accumulated. Merge instead: sum aiContribution
and OR aiCreated when collapsing duplicate keys.

Tests cover all three branches: non-string surface fallback,
promptCount clamp, and duplicate-key merge.
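The duplicate-key merge from 1KL_ can be sketched as follows (hypothetical names; `canonicalise` stands in for realpath resolution):

```typescript
interface FileAttribution {
  aiContribution: number;
  aiCreated: boolean;
}

// Collapse symlinked + canonical duplicates by merging — summing
// aiContribution and OR-ing aiCreated — instead of letting the second
// entry overwrite the first.
function mergeRestoredEntries(
  entries: Array<[string, FileAttribution]>,
  canonicalise: (p: string) => string,
): Map<string, FileAttribution> {
  const out = new Map<string, FileAttribution>();
  for (const [key, value] of entries) {
    const k = canonicalise(key);
    const prev = out.get(k);
    out.set(k, prev
      ? { aiContribution: prev.aiContribution + value.aiContribution,
          aiCreated: prev.aiCreated || value.aiCreated }
      : value);
  }
  return out;
}
```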

* fix(attribution): roll back snapshot dedup key on sync appendRecord failure

1UMh (Copilot): appendRecord can throw synchronously before returning
a promise — e.g. when ensureConversationFile() rethrows a non-EEXIST
writeFileSync error. The async .catch() handler attached to the
promise never runs in that case, so the optimistic dedup-key set
sticks on a write that never landed and permanently suppresses
identical retries. Roll back lastAttributionSnapshotJson in the outer
catch too. Regression test forces writeFileSync to throw EACCES on
the first invocation, then asserts the second identical snapshot
attempt fires a fresh write rather than getting deduped.

* docs(attribution): align cleanup-branch comments with noteCommitWithoutClearing

Three doc/test-fixture stale-after-refactor cleanups (Copilot
4MDx / 4MEI / 4MEa):

- shell.ts:1944 (around the stagedInfo === null branch): the comment
  still claimed the finally block "falls back to a full clear", but
  1ece87438 switched analysis-failure cleanup to
  noteCommitWithoutClearing(). Update the comment so the reasoning
  matches what the code actually does (and so a future reader doesn't
  reintroduce the wholesale clear thinking it's already there).

- shell.ts: getCommittedFileInfo docstring carried the same stale
  "full clear" claim for the `null` return value. Update to describe
  the noteCommitWithoutClearing() fallback and the smaller-evil
  trade-off for the just-committed file.

- chatRecordingService.test.ts: baseSnapshot fixture for the
  recordAttributionSnapshot tests still carried `baselines: {}`,
  even though that field was removed from AttributionSnapshot in
  296fb55ae's dead-code purge. Structural typing let it compile,
  but the fixture didn't reflect the production shape — drop it.

* fix(attribution): restore fire-and-forget appendRecord, route rollback via callback

6OcJ (Copilot): refactor in 715c258fb returned a Promise from
appendRecord so the snapshot dedup-key path could chain rollback —
but recordUserMessage / recordAssistantTurn / recordAtCommand /
recordSlashCommand / rewindRecording all call appendRecord without
await or .catch(). A transient jsonl.writeLine rejection on any of
those would surface as an unhandled-promise-rejection (warning, or
crash on --unhandled-rejections=throw).

Restore the original fire-and-forget semantics: appendRecord again
returns void and internally swallows async failures (logging via
debugLogger). Per-record failure reactions are routed through an
optional onError callback — recordAttributionSnapshot uses this to
roll back lastAttributionSnapshotJson when the write that set it
ends up rejecting.

Tests: add a fire-and-forget regression that mocks writeLine to
reject and asserts no unhandledRejection events fire while the
existing snapshot rollback tests (sync + async) still pass via the
new callback path.

* fix(attribution): GIT_DIR repo-shift bail, snapshot envelope validation, narrow legacyTypes

80ME (gpt-5.5 /review, [Critical]): tokeniseSegment unconditionally
stripped every leading KEY=value token. `GIT_DIR=elsewhere/.git git
commit ...` was therefore treated as an in-cwd commit, picked up the
Co-authored-by trailer, and produced a per-file note that landed
against our cwd's HEAD even though the actual commit went to a
different repo. Define a GIT_ENV_SHIFTS_REPO set (GIT_DIR,
GIT_WORK_TREE, GIT_COMMON_DIR, GIT_INDEX_FILE, GIT_NAMESPACE) and
have tokeniseSegment refuse to parse any segment whose leading env
block (including the env-wrapper's KEY=VALUE block) carries one of
these. Identity / date variables (GIT_AUTHOR_*, GIT_COMMITTER_*) are
deliberately NOT in the set — they tweak metadata but don't relocate
the repo. Tests cover plain prefix, env-wrapped prefix, and a
GIT_COMMITTER_DATE positive control that should still get the trailer.
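A sketch of the repo-shifting env gate (hypothetical helper name):

```typescript
// Env vars that relocate the repo a git command operates on.
// Identity/date vars (GIT_AUTHOR_*, GIT_COMMITTER_*) are deliberately
// absent — they tweak metadata, not the repo location.
const GIT_ENV_SHIFTS_REPO = new Set([
  'GIT_DIR', 'GIT_WORK_TREE', 'GIT_COMMON_DIR', 'GIT_INDEX_FILE', 'GIT_NAMESPACE',
]);

// Scan the leading KEY=VALUE block (including an env-wrapper's block)
// and refuse the segment if any repo-shifting var is assigned.
function leadingEnvShiftsRepo(argv: string[]): boolean {
  let i = argv[0] === 'env' ? 1 : 0;
  for (; i < argv.length; i++) {
    const eq = argv[i].indexOf('=');
    if (eq <= 0) break; // first non-assignment token ends the block
    if (GIT_ENV_SHIFTS_REPO.has(argv[i].slice(0, eq))) return true;
  }
  return false;
}
```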

8EeQ (Copilot): restoreFromSnapshot received `snapshot as
AttributionSnapshot` from a structural cast off `unknown` (the
resume path), so its TS-typed shape was only a hint. A corrupted
JSONL line (non-object / array / wrong type discriminator / missing
type) would skip past the version check straight into
Object.entries(snapshot.fileStates) — and a non-object fileStates
(an array, say) seeded fileAttributions with numeric-string keys.
Add envelope-level shape gates (isPlainObject + type discriminator)
and a fileStates plain-object check before iterating; both bail to a
clean reset rather than poisoning the singleton. Tests added.
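Envelope validation can be sketched as follows (the `type` discriminator value here is a stand-in, not the real one):

```typescript
function isPlainObject(v: unknown): v is Record<string, unknown> {
  return typeof v === 'object' && v !== null && !Array.isArray(v);
}

// Gate the envelope shape before trusting any field, so a corrupted
// JSONL line (array, null, wrong discriminator, array fileStates)
// bails to a clean reset instead of poisoning the singleton.
function isValidSnapshotEnvelope(snapshot: unknown): boolean {
  if (!isPlainObject(snapshot)) return false;
  if (snapshot['type'] !== 'attribution_snapshot') return false; // hypothetical discriminator
  if (snapshot['version'] !== 1) return false;
  return isPlainObject(snapshot['fileStates']);
}
```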

8Eej (Copilot): SettingDefinition.legacyTypes was typed as
SettingsType[] which includes 'enum' and 'object' — JSON Schema's
`type` keyword doesn't accept those values. Adding
`legacyTypes: ['enum']` would silently produce an invalid
settings.schema.json. Narrow the field's type to
ReadonlyArray<'boolean' | 'string' | 'number' | 'array'> (the
JSON-Schema-primitive subset). Future complex-shape legacy support
should land its own branch in convertSettingToJsonSchema.

* docs(attribution): correct legacyTypes / EXCLUDED_DIRECTORY_SEGMENTS comments

9Ta_ (Copilot): the JSDoc on legacyTypes claimed JSON Schema's
`type` keyword does not accept `'object'` — that's wrong; `'object'`
IS a valid JSON Schema type. Reword to reflect the actual rationale:
`'enum'` is not a valid JSON Schema `type` value at all (enum
constraints use the `enum` keyword), and a bare `{type: 'object'}`
would accept any object regardless of what the field's pre-expansion
shape actually allowed. The narrowed `boolean | string | number |
array` set is exactly what the one-liner generator can faithfully
emit; richer legacy shapes belong in their own branch of
convertSettingToJsonSchema.

9Tbs (Copilot): the comment in generatedFiles.ts referenced
`EXCLUDED_DIRECTORIES`, but the constant is `EXCLUDED_DIRECTORY_SEGMENTS`
(renamed during the segment-boundary refactor). Update the
reference so a future maintainer scanning for the rule doesn't
chase a non-existent identifier.

* fix(attribution): SHA-pin git notes, on-disk hash divergence detection, env -C cwd-shift

tanzhenxin review #1 — Note targets symbolic HEAD, not captured SHA:
buildGitNotesCommand hard-coded 'HEAD' as the target; postHead was
captured at commit-detection time but only used for the !== preHead
diff. Between that capture and the execFile, three more awaited git
calls run — anything that moves HEAD in the same cwd (post-commit
hook, chained `commit && tag -m`, parallel process) silently lands
the note on the wrong commit because of `-f`. Thread postHead
through buildGitNotesCommand as a required `targetCommit` arg.
Test asserts the targeted SHA, not the symbolic ref.
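The SHA-pinned command builder, roughly (a hypothetical shape combining the earlier 30 KB cap with the new required targetCommit):

```typescript
// Windows' CreateProcess lpCommandLine caps around 32,768 UTF-16 chars;
// 30 KB of payload leaves headroom for the rest of the argv.
const MAX_NOTE_BYTES = 30 * 1024;

// Pin the note to the captured post-commit SHA instead of symbolic
// HEAD, so HEAD movement between capture and execFile (post-commit
// hook, chained `commit && tag`) can't retarget the `-f` write.
function buildGitNotesCommand(payload: string, targetCommit: string): string[] | null {
  if (Buffer.byteLength(payload, 'utf8') > MAX_NOTE_BYTES) return null;
  return ['git', 'notes', '--ref=ai-attribution', 'add', '-f', '-m', payload, targetCommit];
}
```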

tanzhenxin review #2 — Accumulator has no baseline:
recordEdit was monotonic per-path with no reset for out-of-band
mutations. Re-instate FileAttribution.contentHash and:
- recordEdit hashes the input `oldContent` and resets the per-file
  accumulator if it doesn't match what AI's last write recorded
  (catches paste-replace via external editor, manual save, etc.
  WHEN AI subsequently edits the same file again).
- New validateOnDiskHashes() rehashes every tracked file's CURRENT
  on-disk content and drops entries whose hash diverged. Called
  from attachCommitAttribution before matchCommittedFiles so a
  commit can never credit AI for a human-only diff. Deleted files
  (readFileSync throws) are left alone — the commit's deletion
  record is what the note should reflect.
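validateOnDiskHashes, roughly — a sketch in which the hash algorithm and per-entry shape are assumptions:

```typescript
import { createHash } from 'node:crypto';
import { readFileSync } from 'node:fs';

// Rehash each tracked file's CURRENT on-disk content and drop entries
// whose hash diverged from what the AI last wrote — a commit can then
// never credit AI for a human-only diff. Deleted files (readFileSync
// throws) are left alone: the commit's deletion record covers them.
function validateOnDiskHashes(tracked: Map<string, { contentHash: string }>): void {
  for (const [path, entry] of tracked) {
    let current: string;
    try {
      current = createHash('sha256').update(readFileSync(path)).digest('hex');
    } catch {
      continue; // deleted file: keep the entry for the deletion record
    }
    if (current !== entry.contentHash) tracked.delete(path);
  }
}
```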

tanzhenxin review #4 — Failed-commit / staleness leak:
The recordEdit divergence check above + commit-time
validateOnDiskHashes together catch tanzhenxin's exact scenario
(AI edits a.ts → hook rejects → user manually edits a.ts → user
commits → no AI credit because validateOnDiskHashes drops the
stale entry). The !commitCreated branch still preserves
attributions to keep the submodule case working — the staleness
problem is now solved at the next commit's validation step.

Self-review item — env -C / --chdir treated as repo-shifting:
Added ENV_FLAGS_SHIFT_CWD set covering -C / --chdir. tokeniseSegment
returns null for `env -C DIR git commit ...` segments — same
contract as a leading GIT_DIR=... assignment. Without this we'd
either misidentify /elsewhere as the program (silently dropping
attribution) or, worse if -C went into the value-skip set,
trailer-inject onto a commit that lands in /elsewhere's repo. Tests
added alongside the existing GIT_DIR repo-shift cases.

339 tests pass; typecheck clean.

* fix(attribution): pickBool intent-aware, shouldClear gate, ETIMEDOUT surface, drop dead exports

-wgA + -wg0 (deepseek): pickBool defaulted non-boolean to true,
turning a hand-edited `{ commit: "false" }` into enabled
attribution. Replace with intent-aware parsing: "true"/"yes"/"on"/
"1" → true, "false"/"no"/"off"/"0"/"" → false, anything else
(unknown strings, non-1 numbers, objects, arrays, null) → false.
Genuinely-absent sub-fields still default to true (schema default).
Migration test scenarios covered. Tests now cover ~17 input cases
across the string/number/null/object/unknown forms.
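The intent-aware parsing can be sketched as follows (hypothetical signature — the real helper may receive the schema default explicitly, and case-insensitivity here is an assumption):

```typescript
// Intent-aware boolean coercion for hand-editable settings values.
function pickBool(value: unknown): boolean {
  if (value === undefined) return true;           // genuinely absent: schema default
  if (typeof value === 'boolean') return value;
  if (typeof value === 'string') {
    // Only explicit affirmatives enable; "false"/"no"/"off"/"0"/"" and
    // unknown strings all disable.
    return ['true', 'yes', 'on', '1'].includes(value.trim().toLowerCase());
  }
  if (typeof value === 'number') return value === 1;
  return false;                                   // objects, arrays, null, unknown
}
```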

-wgq (deepseek): when buildGitNotesCommand returned null (oversized
payload) or git notes itself failed, the finally block called
clearAttributedFiles(committedAbsolutePaths) — irreversibly
deleting per-file attribution data the user might need to amend &
retry. Introduce a separate `shouldClear` set that's only assigned
on successful note write OR explicit toggle-off. Failure paths
(oversized, exitCode != 0, exception, analysis failure) leave
shouldClear null so the finally block calls noteCommitWithoutClearing
instead — preserving per-file state for the user's recovery.
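The gate boils down to one decision, sketched here with simplified types (names are illustrative; the real code drives a finally block around the git-notes subprocess):

```typescript
// Only a successful note write yields a clear set; oversized payloads,
// nonzero exits, and exceptions all return null so per-file state survives.
type NoteResult = { exitCode: number } | 'oversized' | 'threw';

function decideClear(
  committedPaths: string[],
  result: NoteResult,
): string[] | null {
  if (result === 'oversized' || result === 'threw') return null;
  return result.exitCode === 0 ? committedPaths : null;
  // Caller's finally block: non-null -> clear the attributed files;
  // null -> note the commit WITHOUT clearing, preserving amend-&-retry.
}
```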

9p7W (Copilot): execFile callback coerced ETIMEDOUT / SIGTERM
(timeout) into a generic exitCode=1 warning. Detect both
`error.code === 'ETIMEDOUT'` and `error.killed === true &&
error.signal === 'SIGTERM'` so the user-visible warning correctly
names "timed out after 5s" instead of "exited 1".
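The detection amounts to checking both shapes Node's execFile error can take on timeout (field names are Node's; the message strings are illustrative):

```typescript
// Node's execFile reports its own timeout as killed=true + signal=SIGTERM;
// some platforms surface code='ETIMEDOUT' instead. Anything else falls
// back to the exit-code message.
function describeFailure(error: {
  code?: string | number;
  killed?: boolean;
  signal?: string | null;
}): string {
  const timedOut =
    error.code === 'ETIMEDOUT' ||
    (error.killed === true && error.signal === 'SIGTERM');
  return timedOut ? 'timed out after 5s' : `exited ${error.code ?? 1}`;
}
```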

-wg7 (deepseek): formatAttributionSummary and getAttributionNotesRef
were exported but had zero production callers (only tests). Remove
the dead exports + their tests (~40 LOC). If/when a logging surface
needs them, they can be re-introduced.

-wgb (deepseek): tokeniseSegment doesn't recursively unwrap
`bash -c '...'` / `sh -c` / `zsh -c`, so addCoAuthorToGitCommit
won't splice the trailer into a wrapped command. The background
refusal AND the post-commit note path DO catch the wrapped commit
because stripShellWrapper at the top of execute peels the wrapper
before gitCommitContext / getGitHead run — so the worst-case
("background bash -c 'git commit' bypasses the guard") doesn't
materialize. The remaining gap (no Co-authored-by trailer for
bash -c-wrapped commits) requires recursively splicing into the
inner script with proper bash single-quote re-quoting; significant
enough that it's worth its own PR. Documented as a partial-coverage
limitation.

339 → 325 tests pass after the dead-export removal; typecheck clean.

* fix(attribution): committed-blob validation, deleted-leaf canonicalisation, sudo/env shifts, dir-stack

gpt-5.5 review (issue 4389405179):

1. realpathOrSelf falls back to the non-canonical input when the
   leaf doesn't exist (deleted file). recordEdit stored the entry
   under the canonical path; lookup post-deletion misses on macOS
   where /var ↔ /private/var. Canonicalise the parent and rejoin
   the basename for missing leaves so deleted-file getFileAttribution
   still resolves the canonical key. Test updated to assert the
   lookup-after-unlink path explicitly.

2. validateOnDiskHashes read the LIVE working-tree, so a user who
   `git add`'d AI's content and then made additional unstaged edits
   would have the entry dropped on a commit whose blob still matched
   AI's hash. Replace with `validateAgainst(getContent)` that takes
   a caller-supplied reader; attachCommitAttribution now passes a
   reader that fetches the COMMITTED blob via `git show HEAD:<rel>`.
   Working-tree validation kept as `validateAgainstWorkingTree` for
   code paths without a committed ref. Returns null = no comparison
   signal (entry preserved). Tests cover all three readers
   (committed-blob via stub, working-tree, null-passthrough).
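Fix 1's shape can be sketched as follows (assumed names; the real helper shares the same realpathSync call for existing leaves):

```typescript
import * as fs from 'node:fs';
import * as path from 'node:path';

// Sketch of realpathOrSelf with the missing-leaf fix: when the leaf no
// longer exists (deleted file), canonicalise the PARENT and rejoin the
// basename so lookups still resolve the canonical key (/var vs
// /private/var on macOS).
function realpathOrSelf(p: string): string {
  try {
    return fs.realpathSync(p);
  } catch {
    try {
      return path.join(fs.realpathSync(path.dirname(p)), path.basename(p));
    } catch {
      return p; // parent missing too: fall back to the input unchanged
    }
  }
}
```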

deepseek-v4-pro review #1: sanitiseAttribution defaults missing
contentHash to '' on legacy-snapshot restore. recordEdit's
divergence check would then trip on every subsequent edit and
silently reset all the AI work. Skip the divergence check when
existing.contentHash is empty — we have no baseline to compare
against, so don't drop. Test added covering legacy-snapshot
preservation through validateAgainst.

deepseek #4: validateAgainst now logs every entry drop via
debugLogger.debug so a 3am operator can see WHICH entry got
dropped and which canonical key it was tied to.

deepseek #8: GIT_NAMESPACE removed from GIT_ENV_SHIFTS_REPO. It
prefixes ref names within the same repo but doesn't redirect git
to a different on-disk repository, so a commit underneath it still
lands in our cwd's repo. Doc comment explains the distinction.

deepseek #9: pushd/popd treated as cwd-shifting alongside cd in
gitCommitContext / isAmendCommit / findAttributableCommitSegment.
pushd reuses cdTargetMayChangeRepo (relative-no-escape stays
in-repo); popd unconditionally flips cwdShifted because we don't
track the bash dir-stack.

deepseek #10: sudo's value-taking flag table now has a parallel
SUDO_FLAGS_SHIFT_CWD set covering -D / --chdir (Linux sudo 1.9.2+).
Any segment whose sudo wrapper sees one of those flags returns null
from tokeniseSegment — same contract as env -C / --chdir and
GIT_DIR=...

328 tests pass; typecheck clean both packages.

* fix(attribution): scope validateAgainst to committed set, SHA-pin reader, intent-aware migration

Round 1 of multi-pass audit on b3a06a7c4. Three correctness fixes:

1. validateAgainst was iterating ALL fileAttributions but the
   committed-blob reader (git show HEAD:<rel>) returns HEAD's
   pre-AI content for files NOT in the just-made commit. Result:
   pending unstaged AI work was silently wiped on every commit
   because the divergence check ran against the wrong baseline
   for unrelated files. Fix: build the committed scope first via
   matchCommittedFiles, scope the reader to that set (return null
   for everything else), validate, then RE-run matchCommittedFiles
   to pick up dropped entries. The validateAgainstWorkingTree
   wrapper had no production caller — removed it and its test.

2. The committed-blob reader used symbolic `HEAD` instead of the
   captured postHead SHA — same TOCTOU concern buildGitNotesCommand
   already addressed. A post-commit hook moving HEAD between
   capture and the reader's `git show` would silently compare
   against the wrong commit's content and trip the divergence
   check spuriously. Pin the reader to `git show <postHead>:<rel>`.

3. v3→v4 migration's invalid-string fallback used to reset to {}.
   Combined with the runtime pickBool's "absent → schema default
   true" rule, that silently re-enabled attribution for users who
   hand-edited `"gitCoAuthor": "off"` to disable. Migration now
   recognises enable-intent strings (true/yes/on/1/enabled) and
   disable-intent strings (false/no/off/0/disabled/'') and maps
   them to {commit, pr} explicitly. Unrecognised strings fall to
   {commit: false, pr: false} with a warning — same safer-by-default
   contract as runtime pickBool. Test grid covers all 11 cases.
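The migration's expansion step can be sketched as (names assumed; the real code routes the warning through the settings logger):

```typescript
// Sketch of the v3 -> v4 gitCoAuthor string expansion. Recognised intent
// strings expand to the object form; anything unrecognised falls to
// fully-disabled with a warning, matching runtime pickBool's
// safer-by-default contract.
const ENABLE_INTENT = new Set(['true', 'yes', 'on', '1', 'enabled']);
const DISABLE_INTENT = new Set(['false', 'no', 'off', '0', 'disabled', '']);

function migrateGitCoAuthor(value: string): { commit: boolean; pr: boolean } {
  const v = value.trim().toLowerCase();
  if (ENABLE_INTENT.has(v)) return { commit: true, pr: true };
  if (!DISABLE_INTENT.has(v)) {
    console.warn(
      `Unrecognised gitCoAuthor value ${JSON.stringify(value)}; disabling.`,
    );
  }
  return { commit: false, pr: false };
}
```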

Also tidied the FileAttribution.contentHash JSDoc to reference
the renamed `validateAgainst` (was still pointing at the dropped
`validateOnDiskHashes` name).

1085 tests pass; typecheck clean both packages.

* chore(attribution): extract pickOuterLastMatch, log unrecognised pickBool inputs

Round 2 of multi-pass audit. Two cleanups, no behaviour changes:

1. addCoAuthorToGitCommit and addAttributionToPR each carried their
   own copy of the matchRange / isInside / "pick LAST non-nested
   match" logic (~25 LOC duplicated). Extracted to module-level
   helpers `matchSpan`, `isMatchInside`, and `pickOuterLastMatch<T>`
   so a future bug fix can't apply to only one of the two
   rewriters. Behaviour identical — same algorithm, same edge cases.

2. normalizeGitCoAuthor's pickBool silently maps unrecognised
   strings to false (safer-by-default vs the old "default-to-true
   on mismatch" policy, but a user who hand-edited
   `{ commit: "maybe" }` had no signal that their setting was being
   ignored). Add a `gitCoAuthorLogger.warn` listing the accepted
   forms so a debug-mode user can see the actual coercion. Known
   disable-intent strings (false/no/off/0/empty) stay silent —
   they're explicit user intent. Also pass the field name so the
   warning identifies which sub-toggle (commit vs pr) was bad.

1101 tests pass; typecheck clean.

* fix(attribution): canonicalise BOM and CRLF before hashing

Round 3 of multi-pass audit. One real correctness fix.

Edit and WriteFile preserve the file's BOM and CRLF line-ending
choice when writing back, so the on-disk bytes can include a leading
U+FEFF and CRLFs even when AI's recordEdit input was given with LF
and no BOM. The committed-blob reader's `git show <sha>:<rel>`
returns those raw bytes verbatim, and computeContentHash hashed them
as-is — so a UTF-8 BOM file or a CRLF-line-ending file would always
have a mismatch between AI's recorded hash and the on-disk hash, and
validateAgainst would drop the entry on every commit.

Add `canonicaliseForHash`: strips a leading U+FEFF and normalises
CRLF→LF before computing the SHA-256. Both sides (recordEdit when
storing the post-write hash, and validateAgainst when comparing to
the on-disk read) flow through computeContentHash, so the
canonicalisation is symmetric. The hash is metadata used only for
divergence detection — collapsing these visual differences is the
right comparison semantics.
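The canonicalisation is small enough to show whole (a sketch; in the real code both recordEdit's stored hash and validateAgainst's comparison flow through the same function):

```typescript
import { createHash } from 'node:crypto';

// Strip a leading U+FEFF BOM and normalise CRLF -> LF before hashing, so
// the same logical content hashes identically regardless of the file's
// preserved BOM / line-ending choice.
function canonicaliseForHash(content: string): string {
  const noBom = content.charCodeAt(0) === 0xfeff ? content.slice(1) : content;
  return noBom.replace(/\r\n/g, '\n');
}

function computeContentHash(content: string): string {
  return createHash('sha256')
    .update(canonicaliseForHash(content), 'utf8')
    .digest('hex');
}
```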

Three regression tests added: BOM-only, CRLF-only, and BOM+CRLF
combined. All exercise the typical case where AI's recordEdit input
is LF + no BOM but the on-disk content (post-writeTextFile) has the
file's preserved BOM/lineEnding choice.

* fix(attribution): reset accumulator when re-creating a deleted tracked file

Round 4 of multi-pass audit + Copilot finding from review 4236842362
(I missed it in the previous refresh).

recordEdit's existing prior-state check was symmetric on diverged
oldContent but ASYMMETRIC on a fresh file lifetime: when AI creates
`foo.ts` (oldContent=null), then user `rm foo.ts`, then AI
re-creates `foo.ts` (oldContent=null again), the second recordEdit
saw `existing` (from the first lifetime) and SKIPPED the divergence
check (because oldContent === null bails out of that branch). The
accumulator carried 100 chars from the deleted file plus 5 chars
from the new content = 105, vs the actual 5 on disk. Subsequent
generateNotePayload's clamp against `(adds+dels) * 40` couldn't
catch this — the diff size for a 1-line addition is 40, far above
the actual content size.

Add a fresh-file-lifetime branch: when `existing` is set AND the
caller reports `oldContent === null`, reset aiContribution and
aiCreated before counting the new contribution. The new edit is
treated as a brand-new file at the same path (which is what the
caller's null oldContent means semantically).
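A toy model of the reset branch (heavily simplified; the real recordEdit also runs the divergence check and prefix/suffix counting):

```typescript
// When a tracked path comes back with oldContent === null, the caller is
// creating the file anew: reset the previous lifetime's accumulator
// instead of adding to it.
interface FileAttribution {
  aiContribution: number;
  aiCreated: boolean;
}

function recordEdit(
  map: Map<string, FileAttribution>,
  filePath: string,
  oldContent: string | null,
  newContent: string,
): void {
  let entry = map.get(filePath);
  if (entry && oldContent === null) {
    entry.aiContribution = 0; // fresh lifetime: drop the deleted file's count
    entry.aiCreated = true;
  }
  if (!entry) {
    entry = { aiContribution: 0, aiCreated: oldContent === null };
    map.set(filePath, entry);
  }
  entry.aiContribution += newContent.length; // simplified counting
}
```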

Test added covering the exact `AI create → delete → AI re-create`
flow. Also verified `should treat new files as ai-created` and
`should accumulate contributions across multiple edits` still pass.

* fix(attribution): treat git -C . as in-cwd, gate preHead on attributable

Round 5 of multi-pass audit. Two related correctness/efficiency
fixes around the cwd-shift parser and the preHead capture.

1. `git -C .` (and `-C ./`, `-C.`) is a no-op cwd shift but the
   "any -C → cwd-shifted" rule was treating it the same as
   `-C /tmp/other`, suppressing attribution for what's effectively
   `git commit` with an explicit current-dir marker. Add an
   `isNoopCwdTarget` helper used in both the spaced (`-C .`) and
   attached (`-C.`) branches of `parseGitInvocation`. `--git-dir`
   / `--work-tree` are left unconditional — those aren't cwd in the
   same sense.

2. preHead was being captured for ANY hasCommit, including the
   non-attributable cases (`cd /elsewhere && git commit`,
   `git -C /other commit`). The only consumer of preHead is the
   `attachCommitAttribution` call inside the `attributableInCwd`
   branch — there is intentionally NO cleanup branch for the
   non-attributable case (see the existing comment around the
   `else if (commitCtx.hasCommit)` non-branch). The execFileSync
   for `getGitHeadSync` is dead work in that path: ~10–50 ms
   blocking the event loop before the user's real command spawns.
   Gate the capture on `attributableInCwd` to match the consumer.

Tests added for the three -C dot-form variants. Full suite green:
146 in shell.test.ts, 56 in commitAttribution.test.ts.

* fix(core): preserve attribution across renamed files

* fix(attribution): preserve env-vars in tokens, exclude empty -C targets

Round 7 of multi-pass audit. Two related fixes around how
`shell-quote` handles env-var references and how the cwd-shift
detector reads them.

1. `shell-quote.parse` collapses `$NAME` references it cannot
   resolve to the empty string. The downstream cwd-shift checks
   (`cdTargetMayChangeRepo`'s `target.includes('$')` repo-shift
   detector, and the new `isNoopCwdTarget` no-op detector) were
   designed to catch env-var targets but received `''` instead of
   `$NAME` from `tokeniseSegment` and silently failed. Concretely,
   `cd $HOME && git commit` and `git -C $HOME commit` would both
   pass through as in-cwd attributable, stamping our trailer onto
   commits that land in whatever repo `$HOME`/`$REPO_ROOT`
   resolves to at runtime.

   Pass an env getter `(key) => '$' + key` to `shell-quote.parse`
   inside `tokeniseSegment` so unresolved references stay literal
   in tokens (`['cd', '$HOME']` instead of `['cd', '']`).
   `target.includes('$')` now fires correctly, and the no-op
   detector sees `$HOME` (non-`.`) and rejects it. KEY=value
   leading-env detection is unaffected (shell-quote doesn't
   interpolate inside KEY=value tokens).

2. Even with env preservation, an `''` target can still slip
   through (literal `-C ""`, escaped quotes, edge cases in
   shell-quote). Round 5's `isNoopCwdTarget` accepted `''` as a
   no-op alongside `'.'` / `'./'`, which would re-introduce the
   attribution-on-wrong-repo problem if any path produced an
   empty token. Tighten to `'.'` and `'./'` only — the only
   missed cases are literal `-C ""` (malformed, won't actually
   commit) and the rare `-C $PWD` (now also caught conservatively,
   since `$PWD` becomes literal `$PWD` and isn't `.` or `./`).
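Both fixes are tiny; sketched together here (the getter is passed as shell-quote's env argument, shown in a comment rather than reproducing the library call):

```typescript
// Fix 1: keep unresolved $NAME references literal when tokenising.
// shell-quote's parse(cmd, env) accepts a function; returning '$' + key
// round-trips the reference into the token stream:
//   parse('cd $HOME && git commit', preserveEnvRefs)
//   // -> ['cd', '$HOME', { op: '&&' }, 'git', 'commit']
const preserveEnvRefs = (key: string): string => '$' + key;

// Fix 2: the tightened no-op cwd-target check. Only '.' and './' qualify;
// an empty token (literal -C "", shell-quote edge cases) is NOT a no-op,
// so it can never make a repo-shifting -C look safe.
function isNoopCwdTarget(target: string): boolean {
  return target === '.' || target === './';
}
```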

Tests added for `cd $HOME` / `cd $REPO_ROOT && git commit` and
`git -C $HOME commit` / `git -C "" commit`. Full suite green
(150 in shell.test.ts, 58 in commitAttribution.test.ts).

* fix(attribution): SHA-pin diff/rev-list phase, document aiChars heuristic

Addresses tanzhenxin's review (4240760004) — two residuals after
the prior pinning round.

1. Diff phase still races against HEAD.

   The note write itself was already pinned to the captured `postHead`
   (`git notes add -f <postHead>`), but the *content* of the note —
   `getCommittedFileInfo`'s probe + diff calls and the multi-commit
   guard's `rev-list --count` — were still going through symbolic
   `HEAD` / `HEAD~1` / `HEAD@{1}`. Several awaited subprocesses run
   between the postHead capture and these reads, so a husky / lefthook
   auto-amender, signed-commits hook, chained `git tag -m`, or
   parallel git process moving HEAD in that window would leave the
   note attached to commit A but describing commit B's contents.
   Same TOCTOU class as the prior critical, half-closed.

   Thread `postHead` (and `preHead` for amend) through
   `getCommittedFileInfo`. Probes become `rev-parse --verify
   ${postHead}~1` and `log -1 --pretty=%P ${postHead}`; diffs become
   `${postHead}~1..${postHead}` (parent case),
   `${preHead}..${postHead}` (amend — preHead is the pre-amend SHA
   captured before the user's command and is exactly what HEAD@{1}
   resolved to at parse time, with the added benefit that it can't be
   GC'd between capture and use), and `diff-tree --root <postHead>`
   (root commit). The amend branch keeps the existing reflog-vs-
   no-reflog warning, just driven off `preHead` instead of HEAD@{1}.

   Same pin applied to `countCommitsAfter` (now `${preHead}..${postHead}`)
   and `countCommitsFromRoot` (now `${postHead}`).

   Why parent case uses `${postHead}~1` and NOT `${preHead}`: in
   `git reset HEAD~3 && git commit` chains the captured preHead
   points well above postHead's parent, and `${preHead}..${postHead}`
   would describe the reset-away commits as deletions, drastically
   over-attributing. The actual parent of the just-landed commit is
   what we want, and `${postHead}~1` is the SHA-pinned form of that.

2. `aiChars` reads as a literal char count but isn't.

   The field is emitted as a plain integer named `aiChars`; the PR
   description's example shows values like 3200 / 1500 / 4700 that
   anyone parsing the note will read as literal character counts.
   Internally it's `(addedLines + deletedLines) × 40` for text and a
   flat 1024 for binary, with the per-file AI accumulator clamped
   against that ceiling. So 1000 one-character lines and 1000
   thousand-character lines both report aiChars=40000, and a 5 MB
   image change and a 1-byte binary tweak both report 1024. Anyone
   aggregating raw aiChars for compliance reporting gets
   systematically wrong numbers.

   Add a comprehensive doc block on `FileAttributionDetail` (and
   `CommitAttributionNote`) calling out the heuristic explicitly,
   noting that `percent` / `summary.aiPercent` are the correct
   fields for aggregation since both numerator and denominator use
   the same proxy. Also expand the `APPROX_CHARS_PER_LINE` /
   `BINARY_DIFF_SIZE_FALLBACK` const docs to point at the same
   caveat. (Not renaming the fields — that'd break any downstream
   consumer already parsing the existing schema; the doc is the
   minimum-disruption call here.)
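The pinned ref shapes from fix 1 can be summarised in one helper (illustrative name; the real code threads the SHAs through getCommittedFileInfo):

```typescript
// Every ref is built from the captured postHead/preHead SHAs, never
// symbolic HEAD, so a hook moving HEAD mid-flight can't change which
// commit the note describes.
function diffRange(
  postHead: string,
  preHead: string | null,
  opts: { amend?: boolean; root?: boolean } = {},
): string[] {
  if (opts.root) return ['diff-tree', '--root', postHead];
  if (opts.amend && preHead) return ['diff', `${preHead}..${postHead}`];
  // Parent case pins to postHead~1, NOT preHead: after
  // `git reset HEAD~3 && git commit`, preHead sits above postHead's
  // parent and would over-attribute the reset-away commits as deletions.
  return ['diff', `${postHead}~1..${postHead}`];
}
```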

208 attribution tests pass; type-check clean.

* fix(attribution): use posix join in applyCommittedRenames for Windows compat

Windows CI failure on the two new rename tests (visible at PR #3115's
`Test (windows-latest, *)` jobs):

  AssertionError: expected undefined to be defined
  ❯ src/services/commitAttribution.test.ts:572:66 (basic move)
  AssertionError: expected 11 to be 22 (merge into existing)

Root cause: `path.join(canonicalRepoRoot, ...renamedRel.split('/'))`
calls `path.win32.join` on Windows, which forces backslash separators
regardless of input form. The test's `fs.realpathSync` mock returns
forward-slash paths (matching the macOS `/var` ↔ `/private/var`
fixture style), so `recordEdit` stores keys like
`/private/var/repo/src/old.ts`. The rename's joined target then came
out as `\\private\\var\\repo\\src\\new.ts`, the mock left it
unchanged (no `/var/` prefix to translate), and the subsequent
`fileAttributions.get(renamedAbs)` / `getFileAttribution(...)` lookups
missed the just-set entry — the rename silently dropped attribution.

The fix: build the joined path with `path.posix.join` against a
forward-slash-normalised `posixRepoRoot`, then let `realpathOrSelf`
canonicalise to the platform's storage form. This way:

  - On real Windows production: posix-joined `D:/repo/src/new.ts` is
    accepted by `fs.realpathSync` (Win32 API takes mixed slashes) and
    returned in backslash form, matching what `recordEdit` stored.
  - On real Linux/macOS production: forward-slash throughout, no-op.
  - In the symlink-aware test (any platform): forward-slash matches
    the mock-fixture storage form.

`matchCommittedFiles` already does the inverse normalisation
(`.split(path.sep).join('/')` for the relative-form check), so the
in/out paths line up either way.
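The join itself is a two-liner; a sketch following the commit message's naming:

```typescript
import * as path from 'node:path';

// Normalise the repo root to forward slashes and join with path.posix so
// win32's join can't force backslashes into the attribution lookup key.
// realpathOrSelf then canonicalises to the platform's storage form.
function joinRenamedPath(repoRoot: string, renamedRel: string): string {
  const posixRepoRoot = repoRoot.split(path.sep).join('/');
  return path.posix.join(posixRepoRoot, ...renamedRel.split('/'));
}
```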

Skipped adding a path.sep-mocked Linux-side regression test because the
ESM module namespace doesn't allow `vi.spyOn` on path's exports.
The Windows CI job is the regression catcher; a focused-rerun
should now go green.

* docs(attribution): refresh stale HEAD~1/HEAD@{1} references in comments

The SHA-pinning round (8c3312027) replaced symbolic `HEAD~1..HEAD` /
`HEAD@{1}..HEAD` with `${postHead}~1..${postHead}` and
`${preHead}..${postHead}` in `getCommittedFileInfo` and the rev-list
helpers, but three docstrings / inline comments still described the
old shapes:

- `isAmendCommit` JSDoc said the amend switch goes from `HEAD~1..HEAD`
  to `HEAD@{1}..HEAD`. Updated to reference `${postHead}~1..${postHead}`
  and `${preHead}..${postHead}`, with the why (amended commit's parent
  is the original's parent so the standard parent diff lumps both
  commits' changes).
- `attachCommitAttribution`'s amend branch comment had the same drift;
  updated to mention `${preHead}..${postHead}` directly.
- `getCommittedFileInfo` JSDoc said it diffs "HEAD against its parent
  (HEAD~1)" and listed "--amend with no reflog" as an analysis-failure
  case. Updated to mention postHead-pinning and the preHead-driven
  amend bail (the reflog-GC dependency was dropped in the SHA-pin
  round).

The remaining `HEAD~1..HEAD` references at countCommitsAfter:1959 and
getCommittedFileInfo:2523 are intentional — they describe the old
buggy shape as contrast for why we pin now.

No code change; tests + tsc still clean.

* fix(attribution): catch attached-value forms of env/sudo cwd-shift flags

Round 13 audit found a real bug: `sudo --chdir=/tmp git commit`,
`env -C/tmp git commit`, `env --chdir=/tmp git commit`, and
`sudo -D/tmp git commit` were all silently slipping through the
cwd-shift detector and getting our `Co-authored-by` trailer stamped
onto commits that landed in a different repo.

Root cause: `shell-quote` tokenises both the long attached form
(`--chdir=/tmp`) and the short attached form (`-C/tmp`) as a single
argv entry. The previous SHIFT_CWD detector did set-membership only
against the bare flag (`{'-C', '--chdir'}` for env;
`{'-D', '--chdir'}` for sudo), so the attached-form tokens never
matched and `tokeniseSegment` returned a normally-attributable
`['git', 'commit', ...]` segment.

Fix: introduce `isShiftCwdFlag(flag, set)` that catches:
  - bare set-membership (existing behavior),
  - long attached: `--name=...` when `--name` is in the set,
  - short attached: `-Xanything` when `-X` is in the set and the
    token is longer than the flag itself.

The flag does NOT need to consume an extra value token in the
attached-form case (the value is already embedded), so the existing
TAKES_VALUE bookkeeping is unaffected — we just bail with `null`
from `tokeniseSegment` before reaching the value-skip step.
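A sketch of the three-way check (the name matches the commit message; the sets stand in for the env/sudo flag tables described above):

```typescript
// Catches bare flags (-C, --chdir), long attached (--chdir=/tmp), and
// short attached (-C/tmp) forms in a single shell-quote token.
function isShiftCwdFlag(token: string, set: ReadonlySet<string>): boolean {
  if (set.has(token)) return true; // bare form
  const eq = token.indexOf('=');
  if (token.startsWith('--') && eq > 0 && set.has(token.slice(0, eq))) {
    return true; // long attached: --chdir=/tmp
  }
  for (const flag of set) {
    if (!flag.startsWith('--') && token.length > flag.length && token.startsWith(flag)) {
      return true; // short attached: -C/tmp, -D/tmp
    }
  }
  return false;
}
```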

Tests added: `env --chdir=`, `env -C/...` (attached), `sudo --chdir=`,
`sudo -D/...` (attached) — each is asserted NOT to add a co-author
trailer. 154 shell tests pass; type-check + lint clean.

* test(attribution): cover attached-form git -C/--git-dir/--work-tree

Adds three regression cases to the existing "git -C <path>" suppression
test: the short attached form `-C/path` (single shell-quote token)
and the long attached forms `--git-dir=/path` / `--work-tree=/path`.
parseGitInvocation already had the prefix checks at lines 416/425, but
no test exercised them. Paired with the b89b65533 sudo/env attached-form
fix, this round closes the family of "shell-quote single-token
flag with embedded value" cases that the bare set-membership checks
would otherwise miss.

157 shell tests pass; type-check clean.

* docs(attribution): document why backtick body doesn't bail like $(

The addCoAuthorToGitCommit body capture has a known truncation case
when an inner unescaped `"` appears inside the captured body — handled
for `$(...)` command substitution with an explicit bailout, but not
for backtick command substitution. The trade-off was unspoken; spell
it out so a future reviewer doesn't read the asymmetry as an
oversight.

Bare-backtick bodies (`\`func()\`` markdown-style) are common in
commit messages, have no inner `"`, and the regex captures them
correctly. Pathological backtick-with-inner-quote bodies (`\`cmd
"with" quotes\``) are a near-zero-traffic case where bash itself
already interprets the backticks as command substitution, so the
user has likely already broken their own command before our rewrite
runs. Bailing on any backtick would lose attribution for the common
case to defend against the rare one.

Also drops a stray blank line in commitAttribution.test.ts left over
from an earlier regression-test attempt.

* fix(attribution): scope trailer rewrite to before unquoted shell comment

Round 13 follow-on. Both `addCoAuthorToGitCommit` and
`addAttributionToPR` ran their `-m` / `--body` regex against the full
segment string, including any trailing shell comment. For a command
like `git commit -m "real" # -m "fake"` (a human-authored script
might leave a commented-out flag in place), `lastMatchOf` would pick
the comment's `-m "fake"`, splice the `Co-authored-by:` trailer in
there, and bash would silently discard the entire segment as a
comment — leaving the actual commit unattributed. Same shape for
`gh pr create --body "real" # --body "fake"`.

Fix: introduce `findUnquotedCommentStart(s)` — a bash-aware position
scanner that tracks single/double-quote state and treats `#` as a
comment marker only when it begins a word (start of input or
preceded by whitespace), not when it appears inside a quoted region
or mid-token like `foo#bar`. Both rewriters slice the segment to
`[0, commentStart)` before running their regex, so the trailer can
only land in the live (pre-comment) part.
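The scanner is a straightforward quote-state walk; a sketch consistent with the description above (the escape handling is an assumption — bash treats backslashes differently inside single quotes than double quotes):

```typescript
// Returns the index of the first '#' that starts a comment: outside any
// quoted region AND at the start of a word (start of input or preceded
// by whitespace). Returns -1 if no unquoted comment exists.
function findUnquotedCommentStart(s: string): number {
  let inSingle = false;
  let inDouble = false;
  for (let i = 0; i < s.length; i++) {
    const ch = s[i];
    if (inSingle) {
      if (ch === "'") inSingle = false; // no escapes inside single quotes
      continue;
    }
    if (inDouble) {
      if (ch === '\\') { i++; continue; } // skip escaped char
      if (ch === '"') inDouble = false;
      continue;
    }
    if (ch === '\\') { i++; continue; }
    if (ch === "'") { inSingle = true; continue; }
    if (ch === '"') { inDouble = true; continue; }
    if (ch === '#' && (i === 0 || /\s/.test(s[i - 1]))) return i;
  }
  return -1;
}
```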

Tests added:
  - `git commit -m "real" # -m "fake"` — trailer lands in `"real"`
    body BEFORE the `#`, comment's `-m "fake"` is left untouched.
  - `git commit -m "fix #123 add feature"` — `#` inside the quoted
    body is correctly NOT treated as a comment; the `#123` stays
    inside the body and the trailer is appended.

159 shell tests pass; type-check clean.

* fix(attribution): warn on gh pr create flows that can't be rewritten + cover legacy gitCoAuthor migration end-to-end

Two residuals from this morning's review pass.

1. ANm7O — `addAttributionToPR` silently skipped for `--body-file`,
   `--fill`, and bare `gh pr create` (editor) flows.

   The rewriter only knows how to splice into an inline `--body`/`-b`
   argv entry. For a `gh pr create` that uses `--body-file path`,
   `--fill` (uses commit messages), or no body flag at all (editor
   prompt), there's no inline body to splice into and the function
   returned the unmodified command. Users with `gitCoAuthor.pr`
   enabled would see PRs created without the attribution line and
   have no signal as to why.

   Add a debugLogger.warn at the no-match path naming the unsupported
   flows and pointing the user at the inline form. Don't try to
   handle `--body-file` automatically — that would mean mutating the
   user's file on disk, which is well outside what an unprompted
   command rewriter should do; `--fill` and editor flows have no body
   in argv at all and can't be rewritten without re-architecting.

   Tests added for `--body-file <path>`, `--fill`, and bare
   `gh pr create` — each is asserted to leave the command unchanged
   (no `Generated with Qwen Code` line spliced in).

2. ANm7L — settings-migration integration suite didn't cover the
   exact V3 legacy shape this PR introduces.

   `v3-to-v4.test.ts` already pins the migration body, but the end-
   to-end CLI load → migrate → write path could regress without the
   integration suite noticing. The existing v3LegacyDisableSettings
   fixture has no `general.gitCoAuthor` field, so the V3→V4 step
   technically fires but doesn't exercise the new boolean-expansion
   logic.

   Add a `v3GitCoAuthorBooleanSettings` fixture and a paired test
   case that writes `general: { gitCoAuthor: false }` at $version 3,
   runs the same `mcp list` CLI invocation, and asserts the saved
   file has $version 4 plus `general.gitCoAuthor` exactly
   `{ commit: false, pr: false }` — with sibling general.* keys and
   unrelated top-level sections preserved.

162 shell tests pass; type-check + lint clean.
Author: Shaojin Wen, 2026-05-08 09:55:58 +08:00 (committed by GitHub)
Parent: 0491252b27 · Commit: cfbcea1e88
GPG key ID: B5690EEEBB952194 (no known key found for this signature in database)
32 changed files with 6983 additions and 132 deletions


@@ -83,16 +83,17 @@ Settings are organized into categories. Most settings should be placed within th
#### general
| Setting | Type | Description | Default |
| ------------------------------------------ | ------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------- |
| `general.preferredEditor` | string | The preferred editor to open files in. | `undefined` |
| `general.vimMode` | boolean | Enable Vim keybindings. | `false` |
| `general.enableAutoUpdate` | boolean | Enable automatic update checks and installations on startup. | `true` |
| `general.showSessionRecap` | boolean | Auto-show a one-line "where you left off" recap when returning to the terminal after being away. Off by default. Use `/recap` to trigger manually regardless of this setting. | `false` |
| `general.sessionRecapAwayThresholdMinutes` | number | Minutes the terminal must be blurred before an auto-recap fires on focus-in. Only used when `showSessionRecap` is enabled. | `5` |
| `general.gitCoAuthor` | boolean | Automatically add a Co-authored-by trailer to git commit messages when commits are made through Qwen Code. | `true` |
| `general.checkpointing.enabled` | boolean | Enable session checkpointing for recovery. | `false` |
| `general.defaultFileEncoding` | string | Default encoding for new files. Use `"utf-8"` (default) for UTF-8 without BOM, or `"utf-8-bom"` for UTF-8 with BOM. Only change this if your project specifically requires BOM. | `"utf-8"` |
| Setting | Type | Description | Default |
| ------------------------------------------ | ------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | ----------- |
| `general.preferredEditor` | string | The preferred editor to open files in. | `undefined` |
| `general.vimMode` | boolean | Enable Vim keybindings. | `false` |
| `general.enableAutoUpdate` | boolean | Enable automatic update checks and installations on startup. | `true` |
| `general.showSessionRecap` | boolean | Auto-show a one-line "where you left off" recap when returning to the terminal after being away. Off by default. Use `/recap` to trigger manually regardless of this setting. | `false` |
| `general.sessionRecapAwayThresholdMinutes` | number | Minutes the terminal must be blurred before an auto-recap fires on focus-in. Only used when `showSessionRecap` is enabled. | `5` |
| `general.gitCoAuthor.commit` | boolean | Add a Co-authored-by trailer to git commit messages AND attach a per-file AI-attribution git note (`refs/notes/ai-attribution`) for commits made through Qwen Code. Disabling skips both. | `true` |
| `general.gitCoAuthor.pr` | boolean | Append a Qwen Code attribution line to pull request descriptions when running `gh pr create`. | `true` |
| `general.checkpointing.enabled` | boolean | Enable session checkpointing for recovery. | `false` |
| `general.defaultFileEncoding` | string | Default encoding for new files. Use `"utf-8"` (default) for UTF-8 without BOM, or `"utf-8-bom"` for UTF-8 with BOM. Only change this if your project specifically requires BOM. | `"utf-8"` |
#### output


@@ -24,16 +24,21 @@ const {
v2PreexistingEnableSettings,
v3LegacyDisableSettings,
v999FutureVersionSettings,
v3GitCoAuthorBooleanSettings,
} = workspacesSettings;
/**
* Integration tests for settings migration chain (V1 -> V2 -> V3)
* Integration tests for settings migration chain (V1 -> V2 -> V3 -> V4)
*
* These tests verify that:
* 1. V1 settings are automatically migrated to V3 on CLI startup
* 2. V2 settings are automatically migrated to V3 on CLI startup
* 3. V3 settings remain unchanged
* 1. V1 settings are automatically migrated to V4 on CLI startup
* 2. V2 settings are automatically migrated to V4 on CLI startup
* 3. V3 settings are automatically migrated to V4 on CLI startup
* 4. Migration is idempotent (running multiple times produces same result)
*
* The numeric assertions use the literal `4` to match
* `SETTINGS_VERSION`; bump that constant and the literal together
* when adding a future migration.
*/
describe('settings-migration', () => {
let rig: TestRig;
@@ -77,7 +82,7 @@ describe('settings-migration', () => {
};
describe('V1 settings migration', () => {
it('should migrate V1 settings to V3 on CLI startup', async () => {
it('should migrate V1 settings forward through the chain on CLI startup', async () => {
rig.setup('v1-to-v3-migration');
// Write V1 settings directly (overwrites the one created by setup)
@@ -94,8 +99,8 @@ describe('settings-migration', () => {
// Read migrated settings
const migratedSettings = readSettingsFile(rig);
// Verify migration to V3
expect(migratedSettings['$version']).toBe(3);
// Verify migration to V4 (current SETTINGS_VERSION)
expect(migratedSettings['$version']).toBe(4);
expect(migratedSettings['ui']).toEqual({
theme: 'dark',
hideTips: false,
@@ -137,7 +142,7 @@ describe('settings-migration', () => {
const migratedSettings = readSettingsFile(rig);
// Expected output based on stable test output
expect(migratedSettings['$version']).toBe(3);
expect(migratedSettings['$version']).toBe(4);
expect(migratedSettings['tools']).toEqual({ autoAccept: false });
expect(migratedSettings['context']).toEqual({ includeDirectories: [] });
expect(migratedSettings['model']).toEqual({ name: ['gemini', 'claude'] });
@@ -161,8 +166,8 @@
// Read migrated settings
const migratedSettings = readSettingsFile(rig);
// Should be migrated to V3
expect(migratedSettings['$version']).toBe(3);
// Should be migrated to V4
expect(migratedSettings['$version']).toBe(4);
// Legacy string values for ui/general should be preserved as-is (user data)
expect(migratedSettings['ui']).toBe('legacy-ui-string');
expect(migratedSettings['general']).toBe('legacy-general-string');
@@ -189,7 +194,7 @@
const migratedSettings = readSettingsFile(rig);
// Expected output based on stable test output
expect(migratedSettings['$version']).toBe(3);
expect(migratedSettings['$version']).toBe(4);
expect(migratedSettings['model']).toEqual({ name: 'qwen-plus' });
expect(migratedSettings['ui']).toEqual({
hideWindowTitle: true,
@@ -209,7 +214,7 @@
});
describe('V2 settings migration', () => {
it('should migrate V2 settings to V3 on CLI startup', async () => {
it('should migrate V2 settings forward through the chain on CLI startup', async () => {
rig.setup('v2-to-v3-migration');
// Write V2 settings directly (overwrites the one created by setup)
@@ -225,8 +230,8 @@
// Read migrated settings
const migratedSettings = readSettingsFile(rig);
// Verify migration to V3
expect(migratedSettings['$version']).toBe(3);
// Verify migration to V4 (current SETTINGS_VERSION)
expect(migratedSettings['$version']).toBe(4);
// Verify disable* -> enable* conversion with inversion
expect(
@@ -302,8 +307,8 @@
// Read migrated settings
const migratedSettings = readSettingsFile(rig);
// Should be updated to V3 version
expect(migratedSettings['$version']).toBe(3);
// Should be updated to V4 version
expect(migratedSettings['$version']).toBe(4);
// Other settings should remain unchanged
expect(migratedSettings['ui']).toEqual({ theme: 'dark' });
expect(migratedSettings['model']).toEqual({ name: 'gemini' });
@@ -330,12 +335,12 @@
const migratedSettings = readSettingsFile(rig);
// Version metadata should still be normalized to current version
expect(migratedSettings['$version']).toBe(3);
expect(migratedSettings['$version']).toBe(4);
// Existing user content should be preserved
expect(migratedSettings['customOnlyKey']).toBe('value');
});
it('should coerce valid string booleans and remove invalid deprecated keys while bumping V2 to V3', async () => {
it('should coerce valid string booleans and remove invalid deprecated keys while bumping V2 forward through the chain', async () => {
rig.setup('v2-non-boolean-disable-values-migration');
// Cover both coercible string booleans and invalid non-boolean values:
@@ -372,7 +377,7 @@
const migratedSettings = readSettingsFile(rig);
// Coercible strings are migrated; invalid disable* values are removed.
expect(migratedSettings['$version']).toBe(3);
expect(migratedSettings['$version']).toBe(4);
expect(migratedSettings['general']).toEqual({
enableAutoUpdate: false,
});
@@ -437,7 +442,7 @@
const migratedSettings = readSettingsFile(rig);
// Expected output based on stable test output
expect(migratedSettings['$version']).toBe(3);
expect(migratedSettings['$version']).toBe(4);
// Migration converts disable* to enable* by inverting the value
// disableAutoUpdate: false -> enableAutoUpdate: true (inverted)
// But disableUpdateNag: true may affect the consolidation
@@ -501,11 +506,10 @@
// Read settings
const finalSettings = readSettingsFile(rig);
// Should remain V3
expect(finalSettings['$version']).toBe(3);
// Note: V3 settings with legacy disable* keys are left as-is
// Migration only runs when version < current version
// Since this is already V3, no migration logic is applied
// V3 → V4 migration bumps the version; V3→V4 only touches
// general.gitCoAuthor, so unrelated legacy disable* keys remain as-is
// (V2→V3 ran on original V3 load, not re-applied here).
expect(finalSettings['$version']).toBe(4);
expect(
(finalSettings['general'] as Record<string, unknown>)?.[
'disableAutoUpdate'
@@ -536,6 +540,44 @@
note: 'should remain unchanged in v3',
});
});
// V3 used to allow `general.gitCoAuthor: <boolean>`. The V3→V4
// migration must expand that boolean into the new
// `{ commit, pr }` object shape so the user's stored opt-out
// doesn't get silently overwritten by the schema defaults
// (which default both sub-toggles to `true`) on the next save.
// The unit test in `v3-to-v4.test.ts` already pins the
// migration body, but without an end-to-end fixture the real
// CLI load → migrate → write path could regress without
// this suite noticing.
it('should expand legacy boolean general.gitCoAuthor: false through V3 → V4', async () => {
rig.setup('v3-gitcoauthor-boolean');
overwriteSettingsFile(rig, v3GitCoAuthorBooleanSettings);
try {
await rig.runCommand(['mcp', 'list']);
} catch {
// Expected to potentially fail
}
const finalSettings = readSettingsFile(rig);
expect(finalSettings['$version']).toBe(4);
expect(
(finalSettings['general'] as Record<string, unknown>)?.['gitCoAuthor'],
).toEqual({ commit: false, pr: false });
// Sibling general.* keys must survive the migration unchanged.
expect(
(finalSettings['general'] as Record<string, unknown>)?.[
'disableAutoUpdate'
],
).toBe(true);
// And so must unrelated top-level sections.
expect(finalSettings['custom']).toEqual({
note: 'preserve me through v3->v4',
});
});
});
describe('Future version settings handling', () => {

View file

@@ -184,5 +184,15 @@
"experimentalFlag": {
"enabled": true
}
},
"v3GitCoAuthorBooleanSettings": {
"$version": 3,
"general": {
"gitCoAuthor": false,
"disableAutoUpdate": true
},
"custom": {
"note": "preserve me through v3->v4"
}
}
}

View file

@@ -15,7 +15,7 @@ import { SETTINGS_VERSION } from '../settings.js';
describe('Migration Framework Integration', () => {
describe('runMigrations', () => {
it('should migrate V1 settings to V3', () => {
it('should migrate V1 settings all the way to the current version', () => {
const v1Settings = {
theme: 'dark',
model: 'gemini',
@@ -25,8 +25,8 @@ describe('Migration Framework Integration', () => {
const result = runMigrations(v1Settings, 'user');
expect(result.finalVersion).toBe(3);
expect(result.executedMigrations).toHaveLength(2);
expect(result.finalVersion).toBe(SETTINGS_VERSION);
expect(result.executedMigrations).toHaveLength(SETTINGS_VERSION - 1);
expect(result.executedMigrations[0]).toEqual({
fromVersion: 1,
toVersion: 2,
@@ -38,7 +38,7 @@ describe('Migration Framework Integration', () => {
// Check V2 structure was created
const settings = result.settings as Record<string, unknown>;
expect(settings['$version']).toBe(3);
expect(settings['$version']).toBe(SETTINGS_VERSION);
expect(settings['ui']).toEqual({
theme: 'dark',
accessibility: { enableLoadingPhrases: true },
@@ -51,7 +51,7 @@ describe('Migration Framework Integration', () => {
).toBe(false);
});
it('should migrate V2 settings to V3', () => {
it('should migrate V2 settings forward through the chain', () => {
const v2Settings = {
$version: 2,
ui: { theme: 'light' },
@@ -60,15 +60,15 @@ describe('Migration Framework Integration', () => {
const result = runMigrations(v2Settings, 'user');
expect(result.finalVersion).toBe(3);
expect(result.executedMigrations).toHaveLength(1);
expect(result.finalVersion).toBe(SETTINGS_VERSION);
expect(result.executedMigrations).toHaveLength(SETTINGS_VERSION - 2);
expect(result.executedMigrations[0]).toEqual({
fromVersion: 2,
toVersion: 3,
});
const settings = result.settings as Record<string, unknown>;
expect(settings['$version']).toBe(3);
expect(settings['$version']).toBe(SETTINGS_VERSION);
expect(
(settings['general'] as Record<string, unknown>)['enableAutoUpdate'],
).toBe(true);
@@ -77,18 +77,18 @@ describe('Migration Framework Integration', () => {
).toBeUndefined();
});
it('should not modify V3 settings', () => {
const v3Settings = {
$version: 3,
it('should not modify settings already at the current version', () => {
const current = {
$version: SETTINGS_VERSION,
ui: { theme: 'dark' },
general: { enableAutoUpdate: true },
};
const result = runMigrations(v3Settings, 'user');
const result = runMigrations(current, 'user');
expect(result.finalVersion).toBe(3);
expect(result.finalVersion).toBe(SETTINGS_VERSION);
expect(result.executedMigrations).toHaveLength(0);
expect(result.settings).toEqual(v3Settings);
expect(result.settings).toEqual(current);
});
it('should be idempotent', () => {
@@ -100,7 +100,7 @@ describe('Migration Framework Integration', () => {
const result1 = runMigrations(v1Settings, 'user');
const result2 = runMigrations(result1.settings, 'user');
expect(result1.executedMigrations).toHaveLength(2);
expect(result1.executedMigrations).toHaveLength(SETTINGS_VERSION - 1);
expect(result2.executedMigrations).toHaveLength(0);
expect(result1.finalVersion).toBe(result2.finalVersion);
});
@@ -135,13 +135,13 @@ describe('Migration Framework Integration', () => {
expect(needsMigration(cleanV2Settings)).toBe(true);
});
it('should return false for V3 settings', () => {
const v3Settings = {
$version: 3,
it('should return false for settings already at the current version', () => {
const current = {
$version: SETTINGS_VERSION,
general: { enableAutoUpdate: true },
};
expect(needsMigration(v3Settings)).toBe(false);
expect(needsMigration(current)).toBe(false);
});
it('should return false for legacy numeric version when no migration can execute', () => {
@@ -156,13 +156,12 @@ describe('Migration Framework Integration', () => {
describe('ALL_MIGRATIONS', () => {
it('should contain all migrations in order', () => {
expect(ALL_MIGRATIONS).toHaveLength(2);
expect(ALL_MIGRATIONS).toHaveLength(SETTINGS_VERSION - 1);
expect(ALL_MIGRATIONS[0].fromVersion).toBe(1);
expect(ALL_MIGRATIONS[0].toVersion).toBe(2);
expect(ALL_MIGRATIONS[1].fromVersion).toBe(2);
expect(ALL_MIGRATIONS[1].toVersion).toBe(3);
for (let i = 0; i < ALL_MIGRATIONS.length; i++) {
expect(ALL_MIGRATIONS[i].fromVersion).toBe(i + 1);
expect(ALL_MIGRATIONS[i].toVersion).toBe(i + 2);
}
});
});
@@ -178,10 +177,10 @@ describe('Migration Framework Integration', () => {
const result = scheduler.migrate(v1Settings);
expect(result.executedMigrations).toHaveLength(2);
expect(result.executedMigrations).toHaveLength(SETTINGS_VERSION - 1);
const settings = result.settings as Record<string, unknown>;
expect(settings['$version']).toBe(3);
expect(settings['$version']).toBe(SETTINGS_VERSION);
expect((settings['ui'] as Record<string, unknown>)['theme']).toBe('dark');
expect(
(settings['general'] as Record<string, unknown>)['enableAutoUpdate'],
@@ -212,16 +211,16 @@ describe('Migration Framework Integration', () => {
});
it('needsMigration should return false when runMigrations would execute no migrations', () => {
const v3Settings = {
$version: 3,
const current = {
$version: SETTINGS_VERSION,
general: { enableAutoUpdate: true },
};
// needsMigration should report that no migration is needed
expect(needsMigration(v3Settings)).toBe(false);
expect(needsMigration(current)).toBe(false);
// runMigrations should execute no migrations
const result = runMigrations(v3Settings, 'user');
const result = runMigrations(current, 'user');
expect(result.executedMigrations).toHaveLength(0);
});
@@ -234,10 +233,10 @@ describe('Migration Framework Integration', () => {
// needsMigration should report that migration is needed
expect(needsMigration(cleanV2Settings)).toBe(true);
// runMigrations should execute the V2->V3 migration
// runMigrations should execute migrations forward to the current version
const result = runMigrations(cleanV2Settings, 'user');
expect(result.executedMigrations.length).toBeGreaterThan(0);
expect(result.finalVersion).toBe(3);
expect(result.finalVersion).toBe(SETTINGS_VERSION);
});
});
@@ -364,14 +363,14 @@ describe('Migration Framework Integration', () => {
it('should avoid repeated no-op migration loops', () => {
// Settings that might cause repeated migrations
const v3Settings = {
$version: 3,
const current = {
$version: SETTINGS_VERSION,
general: { enableAutoUpdate: true },
};
// First check
expect(needsMigration(v3Settings)).toBe(false);
const result1 = runMigrations(v3Settings, 'user');
expect(needsMigration(current)).toBe(false);
const result1 = runMigrations(current, 'user');
expect(result1.executedMigrations).toHaveLength(0);
// Second check should be consistent

View file

@@ -13,6 +13,7 @@ export { MigrationScheduler } from './scheduler.js';
// Export migrations
export { v1ToV2Migration, V1ToV2Migration } from './versions/v1-to-v2.js';
export { v2ToV3Migration, V2ToV3Migration } from './versions/v2-to-v3.js';
export { v3ToV4Migration, V3ToV4Migration } from './versions/v3-to-v4.js';
// Import settings version from single source of truth
import { SETTINGS_VERSION } from '../settings.js';
@@ -22,6 +23,7 @@ import { SETTINGS_VERSION } from '../settings.js';
// Order matters: migrations must be sorted by ascending version
import { v1ToV2Migration } from './versions/v1-to-v2.js';
import { v2ToV3Migration } from './versions/v2-to-v3.js';
import { v3ToV4Migration } from './versions/v3-to-v4.js';
import { MigrationScheduler } from './scheduler.js';
import type { MigrationResult } from './types.js';
@@ -35,7 +37,11 @@ import type { MigrationResult } from './types.js';
* const result = scheduler.migrate(settings);
* ```
*/
export const ALL_MIGRATIONS = [v1ToV2Migration, v2ToV3Migration] as const;
export const ALL_MIGRATIONS = [
v1ToV2Migration,
v2ToV3Migration,
v3ToV4Migration,
] as const;
/**
* Convenience function that runs all migrations on the given settings.

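The registry above encodes the invariant the integration tests pin: `ALL_MIGRATIONS[i].fromVersion === i + 1`, ending at `SETTINGS_VERSION`. As a minimal, standalone sketch of how a scheduler can walk such a chain (the real `MigrationScheduler` internals are not shown in this diff, so the names and shape below are assumptions, not the shipped code):

```typescript
// Standalone sketch: apply registered migrations in ascending order until
// the settings reach the target version. Not the real MigrationScheduler;
// `Migration` is a stand-in for the SettingsMigration interface.
interface Migration {
  fromVersion: number;
  toVersion: number;
  migrate(settings: Record<string, unknown>): Record<string, unknown>;
}

function runChain(
  settings: Record<string, unknown>,
  migrations: readonly Migration[],
  targetVersion: number,
): Record<string, unknown> {
  let current = settings;
  // Versionless files are treated as V1 here; an assumption for the sketch.
  let version =
    typeof current['$version'] === 'number'
      ? (current['$version'] as number)
      : 1;
  // Because the list is sorted ascending, a single pass suffices.
  for (const m of migrations) {
    if (m.fromVersion === version && version < targetVersion) {
      current = m.migrate(current);
      version = m.toVersion;
    }
  }
  return current;
}
```

A V2 file skips the V1 step and still lands on the current version, which is why the tests assert `executedMigrations` lengths of `SETTINGS_VERSION - 1` and `SETTINGS_VERSION - 2` respectively.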
View file

@@ -0,0 +1,242 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
import { describe, it, expect } from 'vitest';
import { V3ToV4Migration } from './v3-to-v4.js';
describe('V3ToV4Migration', () => {
const migration = new V3ToV4Migration();
describe('shouldMigrate', () => {
it('returns true for V3 settings', () => {
expect(
migration.shouldMigrate({
$version: 3,
general: { gitCoAuthor: false },
}),
).toBe(true);
});
it('returns true for V3 settings without gitCoAuthor', () => {
// Even without the relevant key, the version must still bump.
expect(migration.shouldMigrate({ $version: 3 })).toBe(true);
});
it('returns false for V4 settings', () => {
expect(
migration.shouldMigrate({
$version: 4,
general: { gitCoAuthor: { commit: true, pr: true } },
}),
).toBe(false);
});
it('returns false for non-object input', () => {
expect(migration.shouldMigrate(null)).toBe(false);
expect(migration.shouldMigrate('x')).toBe(false);
expect(migration.shouldMigrate(42)).toBe(false);
});
// `gitCoAuthor` post-dates the V1 indicator-key list, so a settings
// file that has ONLY this legacy boolean shape (no `$version`,
// no other migration-triggering keys) wouldn't fire any earlier
// migration. The v3→v4 step must catch it directly so the dialog
// doesn't silently overwrite the user's stored opt-out with the
// schema defaults on next save.
it('returns true for versionless settings with legacy boolean gitCoAuthor', () => {
expect(
migration.shouldMigrate({
general: { gitCoAuthor: false },
}),
).toBe(true);
});
it('returns false for versionless settings without gitCoAuthor', () => {
expect(migration.shouldMigrate({ general: {} })).toBe(false);
expect(migration.shouldMigrate({})).toBe(false);
});
it('returns false for versionless settings with already-object gitCoAuthor', () => {
// User who hand-edited to the v4 shape — let the loader's
// version normalization handle it without rewriting.
expect(
migration.shouldMigrate({
general: { gitCoAuthor: { commit: false, pr: true } },
}),
).toBe(false);
});
// Without the migration firing on invalid versionless values, the
// loader would stamp $version: 4 with `"off"` / `[]` / etc. left
// on disk, and runtime normalization would silently re-enable
// attribution. The migrate() body's drop-and-warn handles these
// — shouldMigrate has to fire so it gets a chance to run.
it.each([
['"off"', 'off'],
['empty array', []],
['number', 42],
['null', null],
])(
'returns true for versionless settings with invalid gitCoAuthor (%s)',
(_label, value) => {
expect(
migration.shouldMigrate({
general: { gitCoAuthor: value },
}),
).toBe(true);
},
);
});
describe('migrate', () => {
it('expands legacy boolean true into { commit: true, pr: true }', () => {
const input = { $version: 3, general: { gitCoAuthor: true } };
const { settings, warnings } = migration.migrate(input, 'user') as {
settings: Record<string, unknown>;
warnings: string[];
};
expect(
(settings['general'] as Record<string, unknown>)['gitCoAuthor'],
).toEqual({ commit: true, pr: true });
expect(settings['$version']).toBe(4);
expect(warnings).toEqual([]);
});
it('expands legacy boolean false into { commit: false, pr: false }', () => {
const input = { $version: 3, general: { gitCoAuthor: false } };
const { settings } = migration.migrate(input, 'user') as {
settings: Record<string, unknown>;
warnings: string[];
};
expect(
(settings['general'] as Record<string, unknown>)['gitCoAuthor'],
).toEqual({ commit: false, pr: false });
});
it('leaves an already-object value untouched', () => {
const input = {
$version: 3,
general: { gitCoAuthor: { commit: false, pr: true } },
};
const { settings, warnings } = migration.migrate(input, 'user') as {
settings: Record<string, unknown>;
warnings: string[];
};
expect(
(settings['general'] as Record<string, unknown>)['gitCoAuthor'],
).toEqual({ commit: false, pr: true });
expect(warnings).toEqual([]);
});
it('bumps version when gitCoAuthor is absent', () => {
const input = { $version: 3, general: {} };
const { settings, warnings } = migration.migrate(input, 'user') as {
settings: Record<string, unknown>;
warnings: string[];
};
expect(settings['$version']).toBe(4);
expect(
(settings['general'] as Record<string, unknown>)['gitCoAuthor'],
).toBeUndefined();
expect(warnings).toEqual([]);
});
// String enable-intent forms map to {commit: true, pr: true};
// disable-intent forms map to {commit: false, pr: false}; an
// unrecognised string also defaults to disabled (safer-by-default
// — same contract as the runtime `pickBool`) but emits a warning.
it.each([
['"true"', 'true', { commit: true, pr: true }, false],
['"yes"', 'yes', { commit: true, pr: true }, false],
['"on"', 'on', { commit: true, pr: true }, false],
['"1"', '1', { commit: true, pr: true }, false],
['"false"', 'false', { commit: false, pr: false }, false],
['"no"', 'no', { commit: false, pr: false }, false],
['"off"', 'off', { commit: false, pr: false }, false],
['"0"', '0', { commit: false, pr: false }, false],
['empty string', '', { commit: false, pr: false }, false],
['"OFF" (case)', 'OFF', { commit: false, pr: false }, false],
['unknown string', 'maybe', { commit: false, pr: false }, true],
])(
'maps string %s to %j (warn=%s)',
(_label, str, expected, expectWarn) => {
const input = { $version: 3, general: { gitCoAuthor: str } };
const { settings, warnings } = migration.migrate(input, 'user') as {
settings: Record<string, unknown>;
warnings: string[];
};
expect(
(settings['general'] as Record<string, unknown>)['gitCoAuthor'],
).toEqual(expected);
if (expectWarn) {
expect(warnings).toHaveLength(1);
expect(warnings[0]).toContain('gitCoAuthor');
} else {
expect(warnings).toHaveLength(0);
}
},
);
// Non-string invalid values (null/array/number) get the
// safer-by-default disabled state with a warning.
it.each([
['null', null],
['array', []],
['number', 42],
])(
'treats %s as invalid and resets to disabled with a warning',
(_label, bad) => {
const input = { $version: 3, general: { gitCoAuthor: bad } };
const { settings, warnings } = migration.migrate(input, 'user') as {
settings: Record<string, unknown>;
warnings: string[];
};
expect(
(settings['general'] as Record<string, unknown>)['gitCoAuthor'],
).toEqual({ commit: false, pr: false });
expect(warnings).toHaveLength(1);
},
);
it('leaves a partially-specified object unchanged', () => {
// Downstream normalizeGitCoAuthor fills missing sub-keys with defaults;
// the migration only reshapes; it does not paternalistically fill defaults.
const input = {
$version: 3,
general: { gitCoAuthor: { commit: false } },
};
const { settings, warnings } = migration.migrate(input, 'user') as {
settings: Record<string, unknown>;
warnings: string[];
};
expect(
(settings['general'] as Record<string, unknown>)['gitCoAuthor'],
).toEqual({ commit: false });
expect(warnings).toEqual([]);
});
it('does not mutate the input settings object', () => {
const input = { $version: 3, general: { gitCoAuthor: false } };
migration.migrate(input, 'user');
expect(input).toEqual({
$version: 3,
general: { gitCoAuthor: false },
});
});
it('throws for non-object input', () => {
expect(() => migration.migrate(null, 'user')).toThrow();
expect(() => migration.migrate('string', 'user')).toThrow();
});
});
});

View file

@@ -0,0 +1,146 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
import type { SettingsMigration } from '../types.js';
import {
getNestedProperty,
setNestedPropertySafe,
} from '../../../utils/settingsUtils.js';
const GIT_CO_AUTHOR_PATH = 'general.gitCoAuthor';
/**
* V3 -> V4 migration (gitCoAuthor boolean object expansion).
*
* Before V4, `general.gitCoAuthor` was a single boolean that governed both
* commit message attribution and PR body attribution. V4 splits those into
* two independent sub-toggles so users can disable one without losing the
* other. This migration rewrites any stored boolean into `{ commit: v,
* pr: v }` so the user's prior choice carries over to both new toggles and
* the settings dialog reads the expected object shape.
*
* Compatibility strategy:
* - Boolean values are expanded in place.
* - Object values with `commit`/`pr` keys are left untouched (forward-
* compatible: a user who edited their settings.json by hand to the new
* shape is already on V4-equivalent data).
* - Any other present value (string, number, array, null) is dropped with
* a warning so the caller sees an actionable message.
*/
export class V3ToV4Migration implements SettingsMigration {
readonly fromVersion = 3;
readonly toVersion = 4;
shouldMigrate(settings: unknown): boolean {
if (typeof settings !== 'object' || settings === null) {
return false;
}
const s = settings as Record<string, unknown>;
if (s['$version'] === 3) {
return true;
}
// Versionless settings file (no $version key): the V1/V2 migrations
// don't list `gitCoAuthor` as an indicator key (it post-dates them),
// so a settings file with ONLY this shape wouldn't trigger any
// earlier migration. Catch it here so:
// - legacy boolean (`gitCoAuthor: false`) gets expanded to
// `{commit: false, pr: false}` instead of being silently
// overwritten by the dialog's schema defaults on first save;
// - invalid shapes (`gitCoAuthor: "off"`, `gitCoAuthor: []`,
// etc.) get reset by the migrate() body's drop-and-warn path
// so runtime normalization doesn't quietly re-enable
// attribution against the user's intent.
if (s['$version'] === undefined) {
const value = getNestedProperty(s, GIT_CO_AUTHOR_PATH);
if (value === undefined) return false;
// Already in the v4 shape — leave the loader to stamp $version: 4.
if (
typeof value === 'object' &&
value !== null &&
!Array.isArray(value)
) {
return false;
}
// Anything else (boolean, string, number, array, null) needs
// rewriting via migrate().
return true;
}
return false;
}
migrate(
settings: unknown,
scope: string,
): { settings: unknown; warnings: string[] } {
if (typeof settings !== 'object' || settings === null) {
throw new Error('Settings must be an object');
}
const result = structuredClone(settings) as Record<string, unknown>;
const warnings: string[] = [];
const value = getNestedProperty(result, GIT_CO_AUTHOR_PATH);
if (typeof value === 'boolean') {
// Legacy shape — rewrite as { commit, pr } preserving the prior choice.
setNestedPropertySafe(result, GIT_CO_AUTHOR_PATH, {
commit: value,
pr: value,
});
} else if (typeof value === 'string') {
// String forms: a user who hand-edited `"gitCoAuthor": "off"` (or
// similar) to disable the feature must NOT see attribution
// silently re-enable just because we couldn't parse the literal
// shape. Map disable-intent strings to `{commit: false, pr: false}`,
// enable-intent strings to `{commit: true, pr: true}`, and
// anything else to disabled with a warning (safer-by-default
// than enabling against an ambiguous opt-out).
const lowered = value.trim().toLowerCase();
const disableIntent = ['false', 'no', 'off', '0', 'disabled', ''];
const enableIntent = ['true', 'yes', 'on', '1', 'enabled'];
if (enableIntent.includes(lowered)) {
setNestedPropertySafe(result, GIT_CO_AUTHOR_PATH, {
commit: true,
pr: true,
});
} else if (disableIntent.includes(lowered)) {
setNestedPropertySafe(result, GIT_CO_AUTHOR_PATH, {
commit: false,
pr: false,
});
} else {
setNestedPropertySafe(result, GIT_CO_AUTHOR_PATH, {
commit: false,
pr: false,
});
warnings.push(
`Reset '${GIT_CO_AUTHOR_PATH}' in ${scope} settings to {commit: false, pr: false} because the stored string '${value}' was not a recognized boolean form.`,
);
}
} else if (
value !== undefined &&
(typeof value !== 'object' || value === null || Array.isArray(value))
) {
// Invalid non-string shape (number, array, null). Drop and
// disable rather than re-enable on ambiguity — same
// safer-by-default contract as `pickBool` at runtime.
setNestedPropertySafe(result, GIT_CO_AUTHOR_PATH, {
commit: false,
pr: false,
});
warnings.push(
`Reset '${GIT_CO_AUTHOR_PATH}' in ${scope} settings to {commit: false, pr: false} because the stored value was not a boolean or object.`,
);
}
// Object values (including the new shape) pass through unchanged.
result['$version'] = 4;
return { settings: result, warnings };
}
}
export const v3ToV4Migration = new V3ToV4Migration();
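For quick reference, the value mapping implemented by `migrate()` above condenses to a small pure function. This is a restatement for illustration only; it covers just the shapes `migrate()` rewrites, while objects and `undefined` pass through unchanged in the real code:

```typescript
// Condensed restatement of the gitCoAuthor rewrite rules in migrate().
// Callers should not pass objects or undefined: those pass through
// unchanged in the real migration and are out of scope here.
type CoAuthor = { commit: boolean; pr: boolean };

function mapLegacyGitCoAuthor(value: unknown): { shape: CoAuthor; warn: boolean } {
  if (typeof value === 'boolean') {
    // Legacy boolean governs both sub-toggles.
    return { shape: { commit: value, pr: value }, warn: false };
  }
  if (typeof value === 'string') {
    const lowered = value.trim().toLowerCase();
    if (['true', 'yes', 'on', '1', 'enabled'].includes(lowered)) {
      return { shape: { commit: true, pr: true }, warn: false };
    }
    if (['false', 'no', 'off', '0', 'disabled', ''].includes(lowered)) {
      return { shape: { commit: false, pr: false }, warn: false };
    }
    // Unrecognised string: disable rather than enable, and warn.
    return { shape: { commit: false, pr: false }, warn: true };
  }
  // Any other shape (number, array, null): disable with a warning.
  return { shape: { commit: false, pr: false }, warn: true };
}
```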

View file

@@ -65,7 +65,7 @@ export const USER_SETTINGS_DIR = path.dirname(USER_SETTINGS_PATH);
export const DEFAULT_EXCLUDED_ENV_VARS = ['DEBUG', 'DEBUG_MODE'];
// Settings version to track migration state
export const SETTINGS_VERSION = 3;
export const SETTINGS_VERSION = 4;
export const SETTINGS_VERSION_KEY = '$version';
/**

View file

@@ -78,6 +78,27 @@ export interface SettingDefinition {
options?: readonly SettingEnumOption[];
/** Schema for array items when type is 'array' */
items?: SettingItemDefinition;
/**
* Primitive shapes a field accepted before it was expanded to its current
* type. The exported JSON Schema wraps the field in `anyOf` so values from
* those older shapes don't trip the IDE validator while the runtime
* migration is still pending. Has no runtime effect; it's purely a
* compatibility hint for editors.
*
* Narrowed to the subset our generator can faithfully emit as a
* one-liner `{ type: <legacyType> }` schema fragment. `'enum'` is
* not a valid JSON Schema `type` value at all (enum constraints
* use the `enum` keyword, not `type: 'enum'`), so allowing it here
* would silently produce an invalid `settings.schema.json`.
* `'object'` IS a valid JSON Schema type, but a bare
* `{ type: 'object' }` legacy entry would accept ANY object value,
* which is most likely not what the field's pre-expansion shape actually
* permitted. Future legacy shapes that need `enum` / structured-
* object compatibility should land their own branch in
* `convertSettingToJsonSchema` (with proper `enum:` / `properties:`
* companions) instead of widening this set.
*/
legacyTypes?: ReadonlyArray<'boolean' | 'string' | 'number' | 'array'>;
/**
* Escape hatch for the JSON Schema generator: when set, this object is
* emitted verbatim under the setting's properties entry instead of the
@@ -387,14 +408,47 @@ const SETTINGS_SCHEMA = {
showInDialog: true,
},
gitCoAuthor: {
type: 'boolean',
label: 'Attribution: commit',
type: 'object',
label: 'Attribution',
category: 'General',
requiresRestart: false,
default: true,
// Match `normalizeGitCoAuthor`'s runtime defaults so the IDE
// schema publishes the same "enabled by default" hint users see
// at runtime. A bare empty-object default here would silently lose
// the editor-surfaced defaults.
default: { commit: true, pr: true },
description:
'Automatically add a Co-authored-by trailer to git commit messages when commits are made through Qwen Code.',
showInDialog: true,
'Attribution added to git commits and pull requests created through Qwen Code.',
showInDialog: false,
// Pre-V4 settings stored this as a single boolean. The V3→V4
// migration rewrites those on first launch, but the IDE schema
// validator runs before that — accept the boolean shape so users
// editing settings.json in VS Code don't see a spurious warning
// until they run qwen once. Config.normalizeGitCoAuthor handles
// the boolean at runtime.
legacyTypes: ['boolean'],
properties: {
commit: {
type: 'boolean',
label: 'Attribution: commit',
category: 'General',
requiresRestart: false,
default: true,
description:
'Add a Co-authored-by trailer to git commit messages AND attach a per-file AI-attribution git note (`refs/notes/ai-attribution`) for commits made through Qwen Code. Disabling skips both.',
showInDialog: true,
},
pr: {
type: 'boolean',
label: 'Attribution: PR',
category: 'General',
requiresRestart: false,
default: true,
description:
'Append a Qwen Code attribution line to PR descriptions when running `gh pr create`.',
showInDialog: true,
},
},
},
checkpointing: {
type: 'object',

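Given `legacyTypes: ['boolean']`, the generator described above wraps the field in `anyOf`. The fragment below is a plausible shape of the emitted `settings.schema.json` entry, not the actual generator output, which this diff does not show:

```typescript
// Hypothetical exported JSON Schema entry for general.gitCoAuthor once
// legacyTypes: ['boolean'] is honored. The exact output is an assumption;
// the part that matters is the anyOf wrapper, which keeps a pre-migration
// boolean value from tripping the IDE validator.
const gitCoAuthorSchema = {
  anyOf: [
    {
      type: 'object',
      properties: {
        commit: { type: 'boolean', default: true },
        pr: { type: 'boolean', default: true },
      },
    },
    // Legacy pre-V4 shape, accepted until the V3 -> V4 migration rewrites it.
    { type: 'boolean' },
  ],
  default: { commit: true, pr: true },
};
```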
View file

@@ -295,7 +295,8 @@ const SETTINGS_DIALOG_ORDER: readonly string[] = [
'ui.enableWelcomeBack',
// Git Behavior
'general.gitCoAuthor',
'general.gitCoAuthor.commit',
'general.gitCoAuthor.pr',
// File Filtering
'context.fileFiltering.respectGitIgnore',

View file

@@ -1030,6 +1030,95 @@ describe('Server Config (config.ts)', () => {
});
});
describe('GitCoAuthor Settings', () => {
it('defaults both commit and pr to true when not specified', () => {
const config = new Config({ ...baseParams, gitCoAuthor: undefined });
const settings = config.getGitCoAuthor();
expect(settings.commit).toBe(true);
expect(settings.pr).toBe(true);
});
it('accepts an object with independent commit and pr toggles', () => {
const config = new Config({
...baseParams,
gitCoAuthor: { commit: true, pr: false },
});
const settings = config.getGitCoAuthor();
expect(settings.commit).toBe(true);
expect(settings.pr).toBe(false);
});
// Legacy shape: before commit and PR attribution were split, this
// setting was a single boolean. Treat it as governing both toggles so
// existing users' preferences carry over.
it.each([true, false])(
'coerces legacy boolean %s to { commit, pr } with the same value',
(value) => {
const config = new Config({ ...baseParams, gitCoAuthor: value });
const settings = config.getGitCoAuthor();
expect(settings.commit).toBe(value);
expect(settings.pr).toBe(value);
},
);
// settings.json is hand-editable; without intent-aware string
// parsing a hand-edited `{ commit: "false" }` would silently
// inflate to `commit: true` (the previous "default-to-true on
// mismatch" policy). Honor common string disable-intent forms
// and fall through to disabled on genuinely unrecognisable
// input — safer-by-default than turning attribution on against
// the user's clear opt-out.
it.each([
// Disable-intent strings.
['string "false"', 'false', false],
['string "FALSE"', 'FALSE', false],
['string "no"', 'no', false],
['string "off"', 'off', false],
['string "0"', '0', false],
['empty string', '', false],
// Enable-intent strings.
['string "true"', 'true', true],
['string "yes"', 'yes', true],
['string "on"', 'on', true],
['string "1"', '1', true],
// Numbers.
['number 1', 1, true],
['number 0', 0, false],
['number 42', 42, false],
// Other types fall through to disabled.
['null', null, false],
['object', {}, false],
['array', [], false],
// Unknown strings → disabled (don't quietly enable).
['unknown string', 'maybe', false],
])(
'parses %s as %s for both commit and pr',
(_label, badValue, expected) => {
const config = new Config({
...baseParams,
gitCoAuthor: {
commit: badValue as unknown as boolean,
pr: badValue as unknown as boolean,
},
});
const settings = config.getGitCoAuthor();
expect(settings.commit).toBe(expected);
expect(settings.pr).toBe(expected);
},
);
// A genuinely-absent sub-field still defaults to true (schema default).
it('defaults absent commit/pr to true', () => {
const config = new Config({
...baseParams,
gitCoAuthor: {} as { commit?: boolean; pr?: boolean },
});
const settings = config.getGitCoAuthor();
expect(settings.commit).toBe(true);
expect(settings.pr).toBe(true);
});
});
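The coercion table above can be read as one small function. The following is an illustrative sketch consistent with those test cases, not the shipped code (the real logic is `pickBool` inside `normalizeGitCoAuthor` in config.ts, which additionally logs unrecognized strings):

```typescript
// Hedged sketch: one toggle's coercion, mirroring the test table above.
function coerceToggle(v: unknown): boolean {
  if (v === undefined) return true; // genuinely absent: schema default
  if (typeof v === 'boolean') return v;
  if (typeof v === 'string') {
    const s = v.trim().toLowerCase();
    // Enable-intent strings; everything else (including 'maybe') disables.
    return s === 'true' || s === 'yes' || s === 'on' || s === '1';
  }
  if (typeof v === 'number') return v === 1; // only exactly 1 enables
  return false; // null / object / array: safer-by-default opt-out
}
```

Note the asymmetry: an absent field defaults to enabled, but a present-yet-unintelligible value defaults to disabled, so a hand-edited opt-out is never silently inverted.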
describe('Telemetry Settings', () => {
it('should return default telemetry target if not provided', () => {
const params: ConfigParameters = {


@@ -130,6 +130,9 @@ import {
import { getAutoMemoryRoot } from '../memory/paths.js';
import { readAutoMemoryIndex } from '../memory/store.js';
import { MemoryManager } from '../memory/manager.js';
import { CommitAttributionService } from '../services/commitAttribution.js';
const gitCoAuthorLogger = createDebugLogger('GIT_CO_AUTHOR');
import {
ModelsConfig,
@@ -237,11 +240,72 @@ export interface OutputSettings {
}
export interface GitCoAuthorSettings {
enabled?: boolean;
commit: boolean;
pr: boolean;
name?: string;
email?: string;
}
/**
* Shape accepted by the Config constructor for the `gitCoAuthor` param.
*
* A plain `boolean` is accepted for backward compatibility: older settings
* (shipped before commit and PR attribution were split) stored this field as
* a single boolean, and we treat that as applying to both sub-toggles so
* nobody's stored preference silently flips.
*/
export type GitCoAuthorParam = boolean | { commit?: boolean; pr?: boolean };
function normalizeGitCoAuthor(value: GitCoAuthorParam | undefined): {
commit: boolean;
pr: boolean;
} {
if (typeof value === 'boolean') {
return { commit: value, pr: value };
}
// Default to `true` (the schema default) ONLY when the sub-field
// is genuinely absent. For PRESENT-but-non-boolean values, honor
// common string forms (`"true"`/`"yes"`/`"on"`/`"1"` → true,
// `"false"`/`"no"`/`"off"`/`"0"`/`""` → false) and treat anything
// else as opt-out. settings.json is user-editable, and the previous
// "default-to-true on mismatch" policy meant a hand-edited
// `{ "commit": "false" }` silently activated attribution against
// the user's clear intent. Safer-by-default: ambiguous values
// disable rather than enable.
const pickBool = (v: unknown, fieldName: string): boolean => {
if (v === undefined) return true;
if (typeof v === 'boolean') return v;
if (typeof v === 'string') {
const lowered = v.trim().toLowerCase();
if (
lowered === 'true' ||
lowered === 'yes' ||
lowered === 'on' ||
lowered === '1'
) {
return true;
}
// Known disable-intent forms — silent (matches user intent).
const knownDisable = ['false', 'no', 'off', '0', 'disabled', ''];
if (!knownDisable.includes(lowered)) {
// Unrecognized string — disable (safer-by-default) but log
// so a user wondering "why is my setting being ignored?"
// can see the actual coercion in QWEN_DEBUG_LOG_FILE.
gitCoAuthorLogger.warn(
`Unrecognized string value for general.gitCoAuthor.${fieldName}: ${JSON.stringify(v)}; treating as false. Accepted forms: true/yes/on/1, false/no/off/0/empty.`,
);
}
return false;
}
if (typeof v === 'number') return v === 1;
return false;
};
return {
commit: pickBool(value?.commit, 'commit'),
pr: pickBool(value?.pr, 'pr'),
};
}
export type ExtensionOriginSource = 'QwenCode' | 'Claude' | 'Gemini';
export interface ExtensionInstallMetadata {
@@ -375,7 +439,7 @@ export interface ConfigParameters {
contextFileName?: string | string[];
accessibility?: AccessibilitySettings;
telemetry?: TelemetrySettings;
gitCoAuthor?: boolean;
gitCoAuthor?: GitCoAuthorParam;
usageStatisticsEnabled?: boolean;
/**
* If true, disables the per-session FileReadCache short-circuit
@@ -781,7 +845,7 @@ export class Config {
useCollector: params.telemetry?.useCollector,
};
this.gitCoAuthor = {
enabled: params.gitCoAuthor ?? true,
...normalizeGitCoAuthor(params.gitCoAuthor),
name: 'Qwen-Coder',
email: 'qwen-coder@alibabacloud.com',
};
@@ -1379,6 +1443,12 @@ export class Config {
// constructed via Object.create — those should clear their own
// cache, not the parent's.
this.getFileReadCache().clear();
// The commit-attribution singleton accumulates per-file AI edits
// and a session-scoped prompt counter — both stop being meaningful
// when the session resets. Without this, pending attributions
// from the previous session could attach to a commit in the new
// one, and the "N-shotted" PR label would span sessions.
CommitAttributionService.resetInstance();
if (this.initialized) {
logStartSession(this, new StartSessionEvent(this));
}


@@ -2728,6 +2728,64 @@ Other open files:
expect(mockMessageBus.request).toHaveBeenCalled();
});
});
describe('attribution snapshot persistence', () => {
let recordAttributionSnapshot: ReturnType<typeof vi.fn>;
beforeEach(() => {
recordAttributionSnapshot = vi.fn();
vi.mocked(mockConfig.getChatRecordingService).mockReturnValue({
recordAttributionSnapshot,
recordUserMessage: vi.fn(),
recordCronPrompt: vi.fn(),
} as unknown as ReturnType<Config['getChatRecordingService']>);
mockTurnRunFn.mockReturnValue(
(async function* () {
yield { type: 'content', value: 'ok' };
})(),
);
});
it('records a snapshot on ToolResult turns so post-tool state is captured', async () => {
const stream = client.sendMessageStream(
[{ text: 'tool-result' }],
new AbortController().signal,
'prompt-tr',
{ type: SendMessageType.ToolResult },
);
for await (const _ of stream) {
/* consume */
}
expect(recordAttributionSnapshot).toHaveBeenCalled();
});
it('records a snapshot on UserQuery turns', async () => {
const stream = client.sendMessageStream(
[{ text: 'user' }],
new AbortController().signal,
'prompt-uq',
{ type: SendMessageType.UserQuery },
);
for await (const _ of stream) {
/* consume */
}
expect(recordAttributionSnapshot).toHaveBeenCalled();
});
it('does not record a snapshot on Retry turns', async () => {
const stream = client.sendMessageStream(
[{ text: 'retry' }],
new AbortController().signal,
'prompt-retry-snap',
{ type: SendMessageType.Retry },
);
for await (const _ of stream) {
/* consume */
}
expect(recordAttributionSnapshot).not.toHaveBeenCalled();
});
});
});
describe('generateContent', () => {


@@ -46,6 +46,7 @@ import {
COMPRESSION_TOKEN_THRESHOLD,
} from '../services/chatCompressionService.js';
import { LoopDetectionService } from '../services/loopDetectionService.js';
import { CommitAttributionService } from '../services/commitAttribution.js';
// Models
import { buildAgentContentGeneratorConfig } from '../models/content-generator-config.js';
@@ -220,11 +221,44 @@ export class GeminiClient {
this.getChat().setLastPromptTokenCount(
uiTelemetryService.getLastPromptTokenCount(),
);
// Restore attribution state from the last snapshot in the session
this.restoreAttributionFromSession(resumedSessionData.conversation);
} else {
await this.startChat();
}
}
/**
* Restore attribution state from the last snapshot in a resumed session.
*/
private restoreAttributionFromSession(conversation: {
messages: Array<{ subtype?: string; systemPayload?: unknown }>;
}): void {
// Find the last attribution snapshot in the session
let lastSnapshot: unknown = null;
for (const msg of conversation.messages) {
if (
msg.subtype === 'attribution_snapshot' &&
msg.systemPayload &&
typeof msg.systemPayload === 'object' &&
'snapshot' in msg.systemPayload
) {
lastSnapshot = (msg.systemPayload as { snapshot: unknown }).snapshot;
}
}
if (lastSnapshot && typeof lastSnapshot === 'object') {
try {
CommitAttributionService.getInstance().restoreFromSnapshot(
lastSnapshot as import('../services/commitAttribution.js').AttributionSnapshot,
);
debugLogger.debug('Restored attribution state from session snapshot');
} catch {
debugLogger.warn('Failed to restore attribution snapshot');
}
}
}
private getContentGeneratorOrFail(): ContentGenerator {
if (!this.config.getContentGenerator()) {
throw new Error('Content generator not initialized');
@@ -782,6 +816,18 @@ export class GeminiClient {
);
}
// Track prompt count for commit attribution. Only the user typing a
// fresh prompt should bump the counter — `ToolResult` (tool-call
// continuation), `Retry`, `Hook`, `Cron`, and `Notification` are all
// model-driven or background-driven re-entries of the same logical
// turn. Counting them inflates the "N-shotted" label in the PR
// attribution trailer (one user message becomes "10-shotted" when it
// triggered ten tool calls).
const attributionService = CommitAttributionService.getInstance();
if (messageType === SendMessageType.UserQuery) {
attributionService.incrementPromptCount();
}
// record user/cron message for session management
if (messageType === SendMessageType.Cron) {
this.config
@@ -820,7 +866,18 @@ export class GeminiClient {
);
}
}
if (messageType !== SendMessageType.Retry) {
// Snapshot on every non-retry turn. ToolResult turns run right after
// tool execution, so their snapshot captures edits that a prior
// UserQuery turn scheduled. Without this, a resumed session only sees
// the UserQuery-time snapshot (empty) and loses tool-driven edits.
this.config
.getChatRecordingService()
?.recordAttributionSnapshot(
CommitAttributionService.getInstance().toSnapshot(),
);
this.sessionTurnCount++;
if (


@@ -0,0 +1,100 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
import { describe, it, expect } from 'vitest';
import { buildGitNotesCommand } from './attributionTrailer.js';
import type { CommitAttributionNote } from './commitAttribution.js';
const sampleNote: CommitAttributionNote = {
version: 1,
generator: 'Qwen-Coder',
files: {
'src/main.ts': { aiChars: 150, humanChars: 50, percent: 75 },
'src/utils.ts': { aiChars: 0, humanChars: 200, percent: 0 },
},
summary: {
aiPercent: 38,
aiChars: 150,
humanChars: 250,
totalFilesTouched: 2,
surfaces: ['cli'],
},
surfaceBreakdown: { cli: { aiChars: 150, percent: 38 } },
excludedGenerated: ['package-lock.json'],
excludedGeneratedCount: 1,
promptCount: 3,
};
describe('attributionTrailer', () => {
describe('buildGitNotesCommand', () => {
const TARGET_SHA = 'abc1234567890abcdef1234567890abcdef12345';
it('should build a valid git notes invocation', () => {
const cmd = buildGitNotesCommand(sampleNote, TARGET_SHA);
expect(cmd).not.toBeNull();
expect(cmd!.command).toBe('git');
expect(cmd!.args.slice(0, 6)).toEqual([
'notes',
'--ref=refs/notes/ai-attribution',
'add',
'-f',
'-m',
// index 5 is the JSON note payload, asserted below
cmd!.args[5],
]);
// Note must target the captured SHA, not the symbolic `HEAD` —
// otherwise a post-commit hook or chained command can move HEAD
// between capture and exec, and `-f` lands the note on the
// wrong commit.
expect(cmd!.args.at(-1)).toBe(TARGET_SHA);
});
it('should pass the JSON note as a single argv entry (no shell quoting)', () => {
// The `-f` flag is at args[3]; the note JSON sits at args[5] between
// `-m` and the target commit. Returning argv (rather than a
// shell-quoted command string) keeps the payload off the shell
// parser entirely so quotes, command substitution, and
// platform-specific escaping cannot break it on cmd.exe / PowerShell.
const cmd = buildGitNotesCommand(sampleNote, TARGET_SHA)!;
const noteArg = cmd.args[5]!;
const parsed = JSON.parse(noteArg);
expect(parsed.version).toBe(1);
expect(parsed.summary.aiPercent).toBe(38);
expect(parsed.files['src/main.ts'].percent).toBe(75);
});
it('should return null when note exceeds size limit', () => {
const hugeNote: CommitAttributionNote = {
...sampleNote,
files: {},
excludedGenerated: [],
excludedGeneratedCount: 0,
};
for (let i = 0; i < 2000; i++) {
hugeNote.files[
`src/very/long/path/to/some/deeply/nested/file_${i}.ts`
] = { aiChars: 999999, humanChars: 999999, percent: 50 };
}
expect(buildGitNotesCommand(hugeNote, TARGET_SHA)).toBeNull();
});
it('should leave single quotes literal in the argv payload', () => {
// The previous string-based command needed bash-style quote escaping.
// With argv, the apostrophe stays literal — the executor passes it
// through to git unmolested.
const noteWithQuotes: CommitAttributionNote = {
...sampleNote,
files: {
"it's-a-file.ts": { aiChars: 10, humanChars: 5, percent: 67 },
},
};
const cmd = buildGitNotesCommand(noteWithQuotes, TARGET_SHA);
expect(cmd).not.toBeNull();
const parsed = JSON.parse(cmd!.args[5]!);
expect(parsed.files["it's-a-file.ts"].percent).toBe(67);
});
});
});
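The argv-vs-shell-string point these tests pin down can be demonstrated directly. This is a stand-alone check, not the shipped code, using `node -p` as a stand-in for git: `execFileSync` hands each argv entry to the child verbatim, so no shell parser ever sees the payload.

```typescript
import { execFileSync } from 'node:child_process';

// A payload full of characters that break shell-quoted command strings:
// an apostrophe key plus text that looks like command substitution.
const payload = JSON.stringify({
  files: { "it's-a-file.ts": { percent: 67 } },
  sneaky: '$(echo pwned) `id` "quotes"',
});

// `node -p expr arg` exposes arg as process.argv[1] in the child; the
// entry arrives byte-for-byte because no shell sits in between.
const echoed = execFileSync(
  process.execPath,
  ['-p', 'process.argv[1]', payload],
  { encoding: 'utf-8' },
).trim();

const roundTripped = JSON.parse(echoed);
```

The same property is what keeps the git notes payload intact on cmd.exe and PowerShell, where bash-style `'\''` escaping would corrupt it.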


@@ -0,0 +1,80 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
/**
* Attribution Trailer Utility
*
* Generates git notes commands for storing per-file AI attribution metadata
* on commits. This keeps the commit message clean (only Co-Authored-By trailer)
* while storing detailed contribution data in git notes.
*/
import type { CommitAttributionNote } from './commitAttribution.js';
const GIT_NOTES_REF = 'refs/notes/ai-attribution';
/**
* Maximum byte length for the JSON note. Sized for the most
* restrictive ARG_MAX in the wild: Windows' `CreateProcess`
* lpCommandLine is capped around 32,768 UTF-16 chars including the
* git executable path, the other argv entries, and separators, so
* the note itself has to fit in that minus a safety margin (~2 KB)
* for everything else. Linux/macOS ARG_MAX is much larger; sizing
* for Windows just means we cap earlier on those platforms. The
* note is meant to be small metadata, not a payload, so the limit
* is rarely the binding constraint.
*/
const MAX_NOTE_BYTES = 30 * 1024; // 30 KB
/**
* argv-form git notes invocation, designed for `child_process.execFile`.
*
* We return argv rather than a shell-quoted command string because the JSON
* note travels as a separate argv entry: no shell quoting is needed and no
* shell metacharacters can be re-evaluated. This matters most on Windows
* where bash-style single-quote escaping (`'\''`) is invalid and would
* corrupt the note (or, worse, allow interpolation under PowerShell/cmd).
*/
export interface GitNotesCommand {
command: string;
args: string[];
}
/**
* Build the git notes add invocation to attach attribution metadata to a
* specific commit. `targetCommit` MUST be the SHA the caller captured
* after detecting the commit's HEAD movement; passing the symbolic
* `'HEAD'` opens a TOCTOU window where a post-commit hook, a chained
* `git commit && git tag -m ...`, or a parallel process can advance
* HEAD between capture and exec, and `-f` would silently overwrite the
* note on the wrong commit.
*
* Caller should pass the result to a process-spawning API
* (`child_process.execFile`) along with a `cwd` option.
*
* Returns null if the serialized note exceeds MAX_NOTE_BYTES.
*/
export function buildGitNotesCommand(
note: CommitAttributionNote,
targetCommit: string,
): GitNotesCommand | null {
const noteJson = JSON.stringify(note);
if (Buffer.byteLength(noteJson, 'utf-8') > MAX_NOTE_BYTES) {
return null;
}
return {
command: 'git',
args: [
'notes',
`--ref=${GIT_NOTES_REF}`,
'add',
'-f',
'-m',
noteJson,
targetCommit,
],
};
}
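A caller might wire this up roughly as follows. `attachNote` and its shape are illustrative, not the shipped integration; the two moves that matter are capturing the concrete SHA before building the command, and spawning via `execFile` so the JSON stays a single argv entry:

```typescript
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileP = promisify(execFile);

interface GitNotesCommand {
  command: string;
  args: string[];
}

// Hypothetical caller sketch. Capture HEAD's SHA immediately after the
// commit succeeds, then attach the note to that exact SHA; a post-commit
// hook that moves HEAD afterwards can no longer misplace the note.
async function attachNote(
  build: (sha: string) => GitNotesCommand | null,
  cwd: string,
): Promise<boolean> {
  const { stdout } = await execFileP('git', ['rev-parse', 'HEAD'], { cwd });
  const cmd = build(stdout.trim());
  if (cmd === null) return false; // oversized note: skip, never fail the commit
  await execFileP(cmd.command, cmd.args, { cwd });
  return true;
}
```

Returning `false` on an oversized note (rather than throwing) matches the buildGitNotesCommand contract: attribution metadata is best-effort and must never break the user's commit.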


@@ -594,6 +594,138 @@ describe('ChatRecordingService', () => {
});
});
describe('recordAttributionSnapshot', () => {
const baseSnapshot = {
type: 'attribution-snapshot' as const,
version: 1,
surface: 'cli',
fileStates: {},
promptCount: 0,
promptCountAtLastCommit: 0,
};
it('should write each distinct snapshot', async () => {
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
chatRecordingService.recordAttributionSnapshot({
...baseSnapshot,
promptCount: 1,
});
chatRecordingService.recordAttributionSnapshot({
...baseSnapshot,
promptCount: 2,
});
await chatRecordingService.flush();
expect(jsonl.writeLine).toHaveBeenCalledTimes(3);
});
// Sessions that touch many files emit a non-retry turn snapshot
// every prompt cycle. Without dedup, repeated identical snapshots
// (no edits, no prompt-counter change) would re-serialize the entire
// attribution state into the JSONL on every turn, inflating session
// size and slowing /resume.
it('should skip a snapshot identical to the previous write', async () => {
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
await chatRecordingService.flush();
expect(jsonl.writeLine).toHaveBeenCalledTimes(1);
});
// After rewindRecording, the previous attribution snapshot lives on
// the abandoned branch, so the dedup key has to clear — otherwise
// the post-rewind identical snapshot would be silently skipped and
// /resume on the rewound session would lose all attribution state.
it('should re-write an identical snapshot after rewindRecording', async () => {
chatRecordingService.recordUserMessage([{ text: 'turn 1' }]);
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
await chatRecordingService.flush();
const beforeRewind = vi.mocked(jsonl.writeLine).mock.calls.length;
chatRecordingService.rewindRecording(0, { truncatedCount: 0 });
// Same snapshot bytes — without the rewind reset this would dedup.
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
await chatRecordingService.flush();
// 1 rewind record + 1 fresh snapshot = 2 more writes after rewind.
expect(vi.mocked(jsonl.writeLine).mock.calls.length).toBe(
beforeRewind + 2,
);
});
// A transient write failure must NOT permanently suppress future
// identical snapshots: if the dedup key were committed before the
// write, the next identical snapshot would dedup and the session
// would have no attribution snapshot at all.
it('should retry an identical snapshot after a write failure', async () => {
vi.mocked(jsonl.writeLine).mockRejectedValueOnce(new Error('disk full'));
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
// Wait for the queued (failing) write to settle so the rollback runs.
await chatRecordingService.flush();
const afterFailure = vi.mocked(jsonl.writeLine).mock.calls.length;
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
await chatRecordingService.flush();
// Retry should fire, so we get a new write call.
expect(vi.mocked(jsonl.writeLine).mock.calls.length).toBe(
afterFailure + 1,
);
});
// appendRecord is fire-and-forget for non-snapshot callers
// (recordUserMessage / recordAssistantTurn / recordAtCommand /
// ...). When jsonl.writeLine rejects, the rejection MUST be
// swallowed inside the service — otherwise it surfaces as an
// unhandled-promise-rejection in production (and as a flaky
// failure under vitest's --reporter=default).
it('should swallow async writeLine rejection for fire-and-forget callers', async () => {
vi.mocked(jsonl.writeLine).mockRejectedValueOnce(new Error('disk full'));
// Track unhandled rejections during this test.
const unhandled: unknown[] = [];
const handler = (err: unknown) => unhandled.push(err);
process.on('unhandledRejection', handler);
try {
chatRecordingService.recordUserMessage([{ text: 'hi' }]);
await chatRecordingService.flush();
// Microtask drain to give any unhandled rejections a chance
// to surface before we assert.
await new Promise((resolve) => setImmediate(resolve));
expect(unhandled).toHaveLength(0);
} finally {
process.off('unhandledRejection', handler);
}
});
// appendRecord can throw SYNCHRONOUSLY before returning a promise
// (e.g. ensureConversationFile fails because the conversation
// file can't be created). Without rollback in the outer catch,
// the dedup key stays set on a write that never happened, so
// all future identical snapshots get suppressed.
it('should retry an identical snapshot after a synchronous failure', async () => {
// First call: force writeFileSync (used by ensureConversationFile
// to wx-create the JSONL file) to throw a non-EEXIST error.
// ensureConversationFile rethrows that, which propagates through
// appendRecord SYNCHRONOUSLY before any promise is returned.
const writeFileSpy = vi.spyOn(fs, 'writeFileSync');
writeFileSpy.mockImplementationOnce(() => {
const e = new Error(
'EACCES: permission denied',
) as NodeJS.ErrnoException;
e.code = 'EACCES';
throw e;
});
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
await chatRecordingService.flush();
// Sync failure: writeLine never reached.
expect(vi.mocked(jsonl.writeLine)).not.toHaveBeenCalled();
// Identical snapshot on retry: dedup key should have been
// rolled back so this fires a fresh write.
chatRecordingService.recordAttributionSnapshot(baseSnapshot);
await chatRecordingService.flush();
expect(vi.mocked(jsonl.writeLine)).toHaveBeenCalledTimes(1);
});
});
// Note: Session management tests (listSessions, loadSession, deleteSession, etc.)
// have been moved to sessionService.test.ts
// Session resume integration tests should test via SessionService mock


@@ -19,6 +19,7 @@ import {
import * as jsonl from '../utils/jsonl-utils.js';
import { getGitBranch } from '../utils/gitUtils.js';
import { createDebugLogger } from '../utils/debugLogger.js';
import type { AttributionSnapshot } from './commitAttribution.js';
import { tryGenerateSessionTitle } from './sessionTitle.js';
import type {
ChatCompressionInfo,
@@ -216,6 +217,7 @@ export interface ChatRecord {
| 'slash_command'
| 'ui_telemetry'
| 'at_command'
| 'attribution_snapshot'
| 'notification'
| 'cron'
| 'custom_title'
@@ -262,6 +264,7 @@ export interface ChatRecord {
| SlashCommandRecordPayload
| UiTelemetryRecordPayload
| AtCommandRecordPayload
| AttributionSnapshotPayload
| CustomTitleRecordPayload
| NotificationRecordPayload
| RewindRecordPayload
@@ -374,6 +377,14 @@ export interface UiTelemetryRecordPayload {
uiEvent: UiEvent;
}
/**
* Stored payload for attribution state snapshots.
* Enables session persistence of AI contribution tracking.
*/
export interface AttributionSnapshotPayload {
snapshot: AttributionSnapshot;
}
/**
* Stored payload for conversation rewind events.
*/
@@ -466,6 +477,16 @@ export class ChatRecordingService {
*/
private autoTitleController: AbortController | undefined;
/**
* JSON-serialized form of the most recent attribution snapshot we
* wrote, used to deduplicate identical writes on every non-retry
* turn. Without this, sessions that touch many files would write a
* full duplicate of the entire snapshot to the JSONL on every turn,
* inflating the on-disk session and making `/resume` slower to
* hydrate.
*/
private lastAttributionSnapshotJson: string | undefined;
constructor(config: Config) {
this.config = config;
this.lastRecordUuid =
@@ -612,7 +633,25 @@
* local-disk writes; failures are rare enough to accept the fire-and-forget
* simplification.
*/
private appendRecord(record: ChatRecord): void {
/**
* Fire-and-forget: queues a JSONL write on the internal writeChain
* and swallows async failures (logs them via debugLogger). All
* existing call sites (recordUserMessage, recordAssistantTurn,
* etc.) invoke this synchronously without awaiting, so the
* internal swallow keeps an unhandled-promise-rejection from
* surfacing on a single transient writeLine failure.
*
* Callers that need to react to per-record write FAILURE (e.g. the
* snapshot dedup-key rollback in `recordAttributionSnapshot`) pass
* an `onError` callback, which fires after the write rejects (and
* after the rejection has been logged + the chain re-armed). Sync
* throws still propagate so the caller's outer try/catch can roll
* back optimistic state; see the synchronous-failure test.
*/
private appendRecord(
record: ChatRecord,
onError?: (err: unknown) => void,
): void {
let conversationFile: string;
try {
conversationFile = this.ensureConversationFile();
@@ -626,6 +665,13 @@
.then(() => jsonl.writeLine(conversationFile, record))
.catch((err) => {
debugLogger.error('Error appending record (async):', err);
if (onError) {
try {
onError(err);
} catch (cbErr) {
debugLogger.error('appendRecord onError callback threw:', cbErr);
}
}
});
}
@@ -945,6 +991,12 @@
this.lastRecordUuid = this.turnParentUuids[targetTurnIndex] ?? null;
// Trim future boundaries — they no longer exist in the active branch.
this.turnParentUuids = this.turnParentUuids.slice(0, targetTurnIndex);
// The previous attribution snapshot now sits on the abandoned
// branch — clear the dedup key so the next snapshot lands on the
// active branch and `/resume` can find it. Without this, a
// post-rewind identical snapshot would be skipped and the rewound
// session would lose all attribution state on restore.
this.lastAttributionSnapshotJson = undefined;
const record: ChatRecord = {
...this.createBaseRecord('system'),
@@ -1083,4 +1135,59 @@
debugLogger.error('Error saving @-command record:', error);
}
}
/**
* Records an attribution state snapshot for session persistence.
* Called at the start of every non-retry turn so that a resumed session
* sees the most recent state including edits made during the prior turn.
*
* Deduplicates identical successive writes: if the snapshot's JSON
* form is byte-identical to the last one we wrote, skip the append.
* Without this, sessions that touch many files would write a full
* duplicate of the entire snapshot to the JSONL on every turn, even
* when nothing changed, inflating session size and slowing /resume.
*
* Set the dedup key optimistically and roll it back if the write
* fails. Synchronous identical calls (common during a tool-driven
* turn) all dedup correctly, but a transient write failure clears
* the key so the next identical snapshot retries the write rather
* than being permanently suppressed.
*/
recordAttributionSnapshot(snapshot: AttributionSnapshot): void {
let json: string | undefined;
try {
json = JSON.stringify(snapshot);
if (json === this.lastAttributionSnapshotJson) {
return;
}
const record: ChatRecord = {
...this.createBaseRecord('system'),
type: 'system',
subtype: 'attribution_snapshot',
systemPayload: { snapshot },
};
this.lastAttributionSnapshotJson = json;
this.appendRecord(record, () => {
// Async write failed — only roll back if the key still
// belongs to our snapshot (a later distinct write may have
// overwritten it).
if (this.lastAttributionSnapshotJson === json) {
this.lastAttributionSnapshotJson = undefined;
}
});
} catch (error) {
// appendRecord (and createBaseRecord/JSON.stringify) can throw
// synchronously — e.g. ensureConversationFile() fails because
// the project temp dir isn't writable. The .catch() handler
// attached to the promise never runs in that case, so we'd
// otherwise leave the dedup key set without a write ever
// having landed and permanently suppress identical retries.
// Roll back here too.
if (json !== undefined && this.lastAttributionSnapshotJson === json) {
this.lastAttributionSnapshotJson = undefined;
}
debugLogger.error('Error saving attribution snapshot:', error);
}
}
}


@@ -0,0 +1,762 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
import { describe, it, expect, beforeEach, vi, afterEach } from 'vitest';
// Stub `fs.realpathSync` so the symlink-aware tests below can simulate
// macOS-style `/var` ↔ `/private/var` mapping without needing a real
// symlink in the filesystem. Other tests don't touch realpath, so the
// pass-through default keeps them unaffected.
vi.mock('node:fs', async () => {
const actual = await vi.importActual<typeof import('node:fs')>('node:fs');
return { ...actual, realpathSync: vi.fn(actual.realpathSync) };
});
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';
import {
CommitAttributionService,
computeCharContribution,
type StagedFileInfo,
} from './commitAttribution.js';
function makeStagedInfo(
files: string[],
diffSizes?: Record<string, number>,
deleted?: string[],
renamed?: Record<string, string>,
): StagedFileInfo {
return {
files,
diffSizes: new Map(Object.entries(diffSizes ?? {})),
deletedFiles: new Set(deleted ?? []),
renamedFiles: new Map(Object.entries(renamed ?? {})),
};
}
describe('computeCharContribution', () => {
it('should return new content length for file creation', () => {
expect(computeCharContribution('', 'hello world')).toBe(11);
});
it('should return old content length for file deletion', () => {
expect(computeCharContribution('hello world', '')).toBe(11);
});
it('should handle same-length replacement via prefix/suffix', () => {
expect(computeCharContribution('Esc', 'esc')).toBe(1);
});
it('should handle insertion in the middle', () => {
expect(computeCharContribution('ab', 'aXb')).toBe(1);
});
it('should handle deletion in the middle', () => {
expect(computeCharContribution('aXb', 'ab')).toBe(1);
});
it('should handle complete replacement', () => {
expect(computeCharContribution('abc', 'xyz')).toBe(3);
});
it('should return 0 for identical content', () => {
expect(computeCharContribution('same', 'same')).toBe(0);
});
it('should handle multi-line changes', () => {
const old = 'line1\nline2\nline3';
const now = 'line1\nchanged\nline3';
expect(computeCharContribution(old, now)).toBe(7); // "changed" > "line2"
});
});
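For reference, here is a prefix/suffix implementation consistent with every assertion above. This is a sketch only; the shipped `computeCharContribution` lives in commitAttribution.ts and may differ in detail.

```typescript
// Hedged sketch of a prefix/suffix character diff matching the tests above.
function charContribution(oldContent: string, newContent: string): number {
  if (oldContent === newContent) return 0;
  // Creation / deletion: the whole other side counts.
  if (oldContent === '') return newContent.length;
  if (newContent === '') return oldContent.length;
  // Strip the longest common prefix.
  let start = 0;
  const minLen = Math.min(oldContent.length, newContent.length);
  while (start < minLen && oldContent[start] === newContent[start]) start++;
  // Strip the longest common suffix of what remains.
  let oldEnd = oldContent.length;
  let newEnd = newContent.length;
  while (
    oldEnd > start &&
    newEnd > start &&
    oldContent[oldEnd - 1] === newContent[newEnd - 1]
  ) {
    oldEnd--;
    newEnd--;
  }
  // Count the larger of the two changed middles, so replacements,
  // insertions, and deletions all register ("Esc" -> "esc" = 1, not 3).
  return Math.max(oldEnd - start, newEnd - start);
}
```

Taking the max of the changed middles is what makes a pure deletion (`'aXb'` -> `'ab'`) count as 1 rather than 0, matching the deletion test above.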
describe('CommitAttributionService', () => {
beforeEach(() => {
CommitAttributionService.resetInstance();
});
it('should return the same singleton instance', () => {
const a = CommitAttributionService.getInstance();
const b = CommitAttributionService.getInstance();
expect(a).toBe(b);
});
it('should track new file creation', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/src/file.ts', null, 'hello world');
const attr = service.getFileAttribution('/project/src/file.ts');
expect(attr!.aiCreated).toBe(true);
expect(attr!.aiContribution).toBe(11);
});
it('should NOT treat empty existing file as new file creation', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/empty.ts', '', 'new content');
const attr = service.getFileAttribution('/project/empty.ts');
expect(attr!.aiCreated).toBe(false);
expect(attr!.aiContribution).toBe(11);
});
it('should track edits with prefix/suffix algorithm', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', 'Hello World', 'Hello world');
expect(service.getFileAttribution('/project/f.ts')!.aiContribution).toBe(1);
});
it('should accumulate contributions across multiple edits', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', 'aaa', 'bbb'); // 3
service.recordEdit('/project/f.ts', 'bbb', 'bbbccc'); // 3
expect(service.getFileAttribution('/project/f.ts')!.aiContribution).toBe(6);
});
// Out-of-band mutation detection: if the input `oldContent` doesn't
// match the contentHash AI recorded after its previous edit, the
// file was changed externally between AI's two writes — drop the
// accumulator before counting the new edit so prior AI work the
// user has since overwritten doesn't get credited later.
it('should reset accumulator when oldContent diverges from AI last write', () => {
const service = CommitAttributionService.getInstance();
// First AI edit: file goes from 'abc' to 'AI block of 100 chars padded' (28 chars).
const aiBlock = 'AI block of 100 chars padded';
service.recordEdit('/project/f.ts', 'abc', aiBlock);
const after1 = service.getFileAttribution('/project/f.ts')!;
expect(after1.aiContribution).toBeGreaterThan(0);
// Now a DIFFERENT oldContent shows up — the user paste-replaced
// the file via an external editor in between. AI's recordEdit
// should reset the counter before applying the new contribution.
service.recordEdit('/project/f.ts', 'user paste replacement', 'final');
const after2 = service.getFileAttribution('/project/f.ts')!;
// aiContribution is now bounded by the divergent edit alone, NOT
// accumulated on top of after1.aiContribution.
expect(after2.aiContribution).toBeLessThan(after1.aiContribution);
});
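The reset rule this test pins down can be sketched in isolation. A hedged stand-in (the names and the size-delta contribution proxy are illustrative, not the service's internals): hash the content after every AI write, and drop the accumulator whenever the next edit's `oldContent` no longer hashes to that value.

```typescript
import { createHash } from 'node:crypto';

const sha256 = (s: string) => createHash('sha256').update(s).digest('hex');

interface Entry {
  aiContribution: number;
  contentHash: string; // hash of the file right after AI's last write
}

// Hypothetical stand-in for recordEdit's divergence handling; the size
// delta below is a crude proxy for the real prefix/suffix diff.
function recordEditSketch(
  entry: Entry | undefined,
  oldContent: string,
  newContent: string,
): Entry {
  let aiContribution = entry?.aiContribution ?? 0;
  if (entry && entry.contentHash !== sha256(oldContent)) {
    // oldContent diverged from AI's last write: the file was mutated
    // out-of-band, so drop the accumulated AI credit first.
    aiContribution = 0;
  }
  aiContribution += Math.max(Math.abs(newContent.length - oldContent.length), 1);
  return { aiContribution, contentHash: sha256(newContent) };
}
```

A matching `oldContent` hashes to the stored value, so accumulation continues; a divergent one resets before the new contribution is counted.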
// Fresh-file lifetime: when AI re-creates a file at a path that was
// previously tracked but has since been deleted (oldContent === null
// signals "no file existed on disk"), the previous tracked state is
// from a different file lifetime. Without this reset, AI's
// accumulated chars from the deleted file would carry over and
// double-count toward the new file's attribution.
it('should reset accumulator when re-creating a previously-tracked deleted file', () => {
const service = CommitAttributionService.getInstance();
// First lifetime: AI creates 'foo.ts' with 100 chars of content.
const firstContent = 'A'.repeat(100);
service.recordEdit('/project/foo.ts', null, firstContent);
const after1 = service.getFileAttribution('/project/foo.ts')!;
expect(after1.aiContribution).toBe(100);
expect(after1.aiCreated).toBe(true);
// Second lifetime: file was deleted (e.g. user `rm foo.ts`), then
// AI re-creates it with new (shorter) content. oldContent=null
// signals "didn't exist on disk before this write".
const secondContent = 'short';
service.recordEdit('/project/foo.ts', null, secondContent);
const after2 = service.getFileAttribution('/project/foo.ts')!;
// aiContribution should reflect ONLY the second write's chars, not
// 100 + 5. aiCreated stays true (this lifetime is also a creation).
expect(after2.aiContribution).toBe(5);
expect(after2.aiCreated).toBe(true);
});
it('should NOT reset accumulator when oldContent matches AI last write', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', 'abc', 'AI step one');
const after1 = service.getFileAttribution('/project/f.ts')!;
// Second AI edit picks up where the first left off — oldContent
// matches the post-first hash, so accumulation continues.
service.recordEdit('/project/f.ts', 'AI step one', 'AI step two final');
const after2 = service.getFileAttribution('/project/f.ts')!;
expect(after2.aiContribution).toBeGreaterThan(after1.aiContribution);
});
// validateAgainst runs at commit time and drops entries whose
// recorded post-write hash doesn't match the caller-supplied
// content — catches user edits that happened entirely outside the
// Edit/Write tools (no recordEdit was called, so the input-hash
// check above couldn't see the divergence).
describe('validateAgainst', () => {
let tmpDir: string;
beforeEach(() => {
tmpDir = fs.mkdtempSync(path.join(os.tmpdir(), 'attr-validate-'));
});
afterEach(() => {
fs.rmSync(tmpDir, { recursive: true, force: true });
});
it('drops entries whose content has diverged', () => {
const service = CommitAttributionService.getInstance();
const filePath = path.join(tmpDir, 'diverged.ts');
fs.writeFileSync(filePath, 'AI wrote this', 'utf-8');
service.recordEdit(filePath, null, 'AI wrote this');
expect(service.getFileAttribution(filePath)).toBeDefined();
// Caller passes a reader that returns the diverged content.
service.validateAgainst(() => 'human replaced this');
expect(service.getFileAttribution(filePath)).toBeUndefined();
});
it('keeps entries whose content matches', () => {
const service = CommitAttributionService.getInstance();
const filePath = path.join(tmpDir, 'unchanged.ts');
fs.writeFileSync(filePath, 'AI wrote this', 'utf-8');
service.recordEdit(filePath, null, 'AI wrote this');
service.validateAgainst(() => 'AI wrote this');
expect(service.getFileAttribution(filePath)).toBeDefined();
});
it('keeps entries when getContent returns null (no comparison signal)', () => {
const service = CommitAttributionService.getInstance();
const filePath = path.join(tmpDir, 'no-comparison.ts');
fs.writeFileSync(filePath, 'will be queried', 'utf-8');
service.recordEdit(filePath, null, 'will be queried');
// null = "no committed blob / unreadable / out-of-scope" — the
// entry should NOT be dropped.
service.validateAgainst(() => null);
expect(service.getFileAttribution(filePath)).toBeDefined();
});
// BOM/CRLF normalisation: writeTextFile preserves the file's BOM
// and CRLF line-ending choice independently of whether AI's
// recordEdit input string contained the BOM char or used LF. The
// on-disk bytes returned by `git show` can therefore include a
// leading U+FEFF and CRLFs that AI never wrote — the hash MUST
// canonicalise both sides so a BOM/CRLF file isn't dropped on
// every commit.
it('keeps entries when on-disk content has BOM but AI input did not', () => {
const service = CommitAttributionService.getInstance();
const filePath = path.join(tmpDir, 'bom.ts');
// Simulate the on-disk file having a BOM (writeTextFile wrote
// it because the previous file version had one).
const aiContent = 'export const foo = 42;';
const onDiskWithBom = '\uFEFF' + aiContent;
fs.writeFileSync(filePath, onDiskWithBom, 'utf-8');
service.recordEdit(filePath, null, aiContent);
// Reader returns the on-disk content (with BOM). After
// canonicalisation, both sides hash to the same value.
service.validateAgainst(() => onDiskWithBom);
expect(service.getFileAttribution(filePath)).toBeDefined();
});
it('keeps entries when on-disk uses CRLF but AI input used LF', () => {
const service = CommitAttributionService.getInstance();
const filePath = path.join(tmpDir, 'crlf.ts');
const aiContent = 'line one\nline two\n';
const onDiskCrlf = 'line one\r\nline two\r\n';
fs.writeFileSync(filePath, onDiskCrlf, 'utf-8');
service.recordEdit(filePath, null, aiContent);
service.validateAgainst(() => onDiskCrlf);
expect(service.getFileAttribution(filePath)).toBeDefined();
});
// Combined: BOM + CRLF on disk, plain LF + no BOM in AI input.
// The most common case for a Windows-edited file the model
// returned in unix form.
it('keeps entries when on-disk has BOM AND CRLF, AI input had neither', () => {
const service = CommitAttributionService.getInstance();
const filePath = path.join(tmpDir, 'bom-crlf.ts');
const aiContent = 'foo\nbar\n';
const onDisk = 'foo\r\nbar\r\n';
fs.writeFileSync(filePath, onDisk, 'utf-8');
service.recordEdit(filePath, null, aiContent);
service.validateAgainst(() => onDisk);
expect(service.getFileAttribution(filePath)).toBeDefined();
});
// Legacy snapshot from before contentHash existed: the entry has
// an empty contentHash. We can't tell stale from fresh, so leave
// it alone (don't reset).
it('skips entries with empty contentHash (legacy snapshot)', () => {
const service = CommitAttributionService.getInstance();
service.restoreFromSnapshot({
type: 'attribution-snapshot',
surface: 'cli',
fileStates: {
'/legacy.ts': {
aiContribution: 50,
aiCreated: false,
contentHash: '',
},
},
promptCount: 0,
promptCountAtLastCommit: 0,
});
// Even if the reader claims a different hash, an empty recorded
// hash means we have no baseline — keep the entry.
service.validateAgainst(() => 'totally different');
expect(service.getFileAttribution('/legacy.ts')).toBeDefined();
});
// Deleted-file lookup must remain stable: recordEdit canonicalises
// the path via realpathSync; getFileAttribution must still resolve
// the same canonical key after the leaf is unlinked. realpathOrSelf
// canonicalises the parent and rejoins the basename for missing
// leaves so macOS /var ↔ /private/var doesn't break the lookup
// post-deletion.
it('keeps deleted-file entries reachable via the original path', () => {
const service = CommitAttributionService.getInstance();
const filePath = path.join(tmpDir, 'deleted.ts');
fs.writeFileSync(filePath, 'will be deleted', 'utf-8');
service.recordEdit(filePath, null, 'will be deleted');
fs.unlinkSync(filePath);
// Lookup must still find the entry by the original path even
// though realpath of the leaf now throws.
expect(service.getFileAttribution(filePath)).toBeDefined();
});
});
it('should save session baseline on first edit', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', 'original content', 'new content');
// Baseline should have been saved from oldContent
// We can verify indirectly: after clear, baseline is gone
service.clearAttributions();
expect(service.hasAttributions()).toBe(false);
});
it('should return defensive copies', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', null, 'content');
const copy = service.getFileAttribution('/project/f.ts')!;
copy.aiContribution = 99999;
expect(
service.getFileAttribution('/project/f.ts')!.aiContribution,
).not.toBe(99999);
});
describe('prompt counting', () => {
it('should track prompt counts', () => {
const service = CommitAttributionService.getInstance();
expect(service.getPromptCount()).toBe(0);
service.incrementPromptCount();
service.incrementPromptCount();
service.incrementPromptCount();
expect(service.getPromptCount()).toBe(3);
expect(service.getPromptsSinceLastCommit()).toBe(3);
});
it('should reset prompts-since-commit counter on successful clear', () => {
const service = CommitAttributionService.getInstance();
service.incrementPromptCount();
service.incrementPromptCount();
service.clearAttributions(true);
expect(service.getPromptCount()).toBe(2);
expect(service.getPromptsSinceLastCommit()).toBe(0);
});
it('should NOT reset prompts-since-commit on failed clear', () => {
const service = CommitAttributionService.getInstance();
service.incrementPromptCount();
service.incrementPromptCount();
service.recordEdit('/project/f.ts', null, 'x');
service.clearAttributions(false);
// File data cleared, but prompt counter preserved
expect(service.hasAttributions()).toBe(false);
expect(service.getPromptCount()).toBe(2);
expect(service.getPromptsSinceLastCommit()).toBe(2);
});
});
describe('surface tracking', () => {
it('should default to cli surface', () => {
const service = CommitAttributionService.getInstance();
expect(service.getSurface()).toBe('cli');
});
});
describe('snapshot / restore', () => {
it('should serialize and restore state', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', null, 'hello');
service.incrementPromptCount();
service.incrementPromptCount();
const snapshot = service.toSnapshot();
expect(snapshot.type).toBe('attribution-snapshot');
expect(snapshot.promptCount).toBe(2);
expect(Object.keys(snapshot.fileStates)).toHaveLength(1);
// Restore into a fresh instance
CommitAttributionService.resetInstance();
const restored = CommitAttributionService.getInstance();
restored.restoreFromSnapshot(snapshot);
expect(restored.getPromptCount()).toBe(2);
expect(restored.getFileAttribution('/project/f.ts')!.aiContribution).toBe(
5,
);
});
});
describe('generateNotePayload', () => {
it('should compute real AI/human percentages', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/src/main.ts', '', 'x'.repeat(200));
const staged = makeStagedInfo(['src/main.ts', 'src/human.ts'], {
'src/main.ts': 400,
'src/human.ts': 200,
});
const note = service.generateNotePayload(
staged,
'/project',
'Qwen-Coder',
);
expect(note.files['src/main.ts']!.percent).toBe(50);
expect(note.files['src/human.ts']!.percent).toBe(0);
expect(note.summary.aiPercent).toBe(33);
expect(note.summary.surfaces).toContain('cli');
expect(note.surfaceBreakdown['cli']).toBeDefined();
});
it('should exclude generated files', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/src/main.ts', null, 'code');
const staged = makeStagedInfo(
['src/main.ts', 'package-lock.json', 'dist/bundle.js'],
{
'src/main.ts': 100,
'package-lock.json': 50000,
'dist/bundle.js': 30000,
},
);
const note = service.generateNotePayload(staged, '/project');
expect(Object.keys(note.files)).toHaveLength(1);
expect(note.excludedGenerated).toContain('package-lock.json');
expect(note.excludedGenerated).toContain('dist/bundle.js');
});
it('should include promptCount', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', null, 'code');
service.incrementPromptCount();
service.incrementPromptCount();
const staged = makeStagedInfo(['f.ts'], { 'f.ts': 100 });
const note = service.generateNotePayload(staged, '/project');
expect(note.promptCount).toBe(2);
});
it('should sanitize internal model codenames', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/project/f.ts', null, 'x');
const staged = makeStagedInfo(['f.ts'], { 'f.ts': 10 });
expect(
service.generateNotePayload(staged, '/project', 'qwen-72b').generator,
).toBe('Qwen-Coder');
expect(
service.generateNotePayload(staged, '/project', 'CustomAgent')
.generator,
).toBe('CustomAgent');
});
// Long-line edits inflate the tracked AI char count (we count actual
// characters), but diffSize comes from `git diff --stat` which
// approximates each changed line as ~40 chars. Without clamping,
// aiChars stays large while humanChars snaps to 0, leaving
// aiChars+humanChars > the committed change magnitude.
it('should clamp aiChars to diffSize so totals stay consistent', () => {
const service = CommitAttributionService.getInstance();
// Big AI edit but small reported diff (one long-line change).
service.recordEdit('/project/src/big.ts', '', 'x'.repeat(1000));
const staged = makeStagedInfo(['src/big.ts'], { 'src/big.ts': 40 });
const note = service.generateNotePayload(staged, '/project');
const detail = note.files['src/big.ts']!;
expect(detail.aiChars).toBe(40);
expect(detail.humanChars).toBe(0);
// aiChars + humanChars now equals the reported diff size.
expect(detail.aiChars + detail.humanChars).toBe(40);
expect(note.summary.aiChars).toBe(40);
});
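The clamping rule this test asserts can be written out as a small standalone sketch (function and field names are illustrative, not the service's internals): cap the tracked AI accumulator at the reported per-file diff size, so `aiChars + humanChars` never exceeds the commit's change magnitude.

```typescript
// Illustrative per-file attribution math: clamp tracked AI chars to
// the diff-size ceiling, attribute the remainder to the human.
function attributeFile(trackedAiChars: number, diffSize: number) {
  const aiChars = Math.min(trackedAiChars, diffSize);
  const humanChars = Math.max(diffSize - aiChars, 0);
  const percent = diffSize > 0 ? Math.round((aiChars / diffSize) * 100) : 0;
  return { aiChars, humanChars, percent };
}

console.log(attributeFile(1000, 40)); // { aiChars: 40, humanChars: 0, percent: 100 }
console.log(attributeFile(200, 400)); // { aiChars: 200, humanChars: 200, percent: 50 }
```

The second call mirrors the earlier `generateNotePayload` test: 200 tracked AI chars against a 400-char diff yields a 50% AI share.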
});
// The service realpath's file paths at every entry/exit point so a
// symlinked vs canonical absolute path collapses to one entry. This
// matters most on macOS (`/var` → `/private/var`), where edit.ts
// can record a path under one form while git rev-parse reports the
// other — without canonicalisation, the lookup never matches and
// AI attribution silently zeroes out.
describe('symlink-aware path canonicalisation', () => {
beforeEach(() => {
// Map any /var/... input to /private/var/... (the macOS-ism).
// Anything else passes through unchanged.
vi.mocked(fs.realpathSync).mockImplementation(((input: unknown) => {
const s = String(input);
if (s.startsWith('/var/')) return s.replace('/var/', '/private/var/');
if (s === '/var') return '/private/var';
return s;
}) as unknown as typeof fs.realpathSync);
});
afterEach(() => {
vi.mocked(fs.realpathSync).mockReset();
});
it('records and looks up under the canonical path', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/var/repo/src/main.ts', '', 'x'.repeat(50));
// Lookup with EITHER form should work — the service canonicalises
// both write and read.
expect(service.getFileAttribution('/var/repo/src/main.ts')).toBeDefined();
expect(
service.getFileAttribution('/private/var/repo/src/main.ts'),
).toBeDefined();
});
it('matches diff paths when baseDir is the symlinked form', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/var/repo/src/main.ts', '', 'x'.repeat(80));
// generateNotePayload receives the symlinked baseDir; the loop
// canonicalises it before computing path.relative against the
// (already-canonical) keys.
const staged = makeStagedInfo(['src/main.ts'], { 'src/main.ts': 80 });
const note = service.generateNotePayload(staged, '/var/repo');
expect(note.files['src/main.ts']!.aiChars).toBe(80);
expect(note.files['src/main.ts']!.percent).toBe(100);
});
it('clearAttributedFiles deletes by canonical key without realpath-ing the leaf', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/var/repo/src/deleted.ts', '', 'will be removed');
expect(
service.getFileAttribution('/var/repo/src/deleted.ts'),
).toBeDefined();
// Caller composes paths against a canonical baseDir (mirrors
// attachCommitAttribution's pattern), so the leaf doesn't need
// to exist for the delete to find the right key.
service.clearAttributedFiles(
new Set(['/private/var/repo/src/deleted.ts']),
);
expect(
service.getFileAttribution('/var/repo/src/deleted.ts'),
).toBeUndefined();
});
it('moves attribution across committed renames before payload generation', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/var/repo/src/old.ts', '', 'renamed content');
service.applyCommittedRenames(
new Map([['src/old.ts', 'src/new.ts']]),
'/private/var/repo',
);
expect(
service.getFileAttribution('/var/repo/src/old.ts'),
).toBeUndefined();
expect(service.getFileAttribution('/var/repo/src/new.ts')).toBeDefined();
const staged = makeStagedInfo(['src/new.ts'], { 'src/new.ts': 80 }, [], {
'src/old.ts': 'src/new.ts',
});
const note = service.generateNotePayload(staged, '/var/repo');
expect(note.files['src/new.ts']!.aiChars).toBe(15);
expect(note.files['src/new.ts']!.percent).toBe(19);
});
it('merges old-path attribution into an existing destination entry', () => {
const service = CommitAttributionService.getInstance();
service.recordEdit('/var/repo/src/old.ts', '', 'old ai text');
service.recordEdit('/var/repo/src/new.ts', '', 'new ai text');
service.applyCommittedRenames(
new Map([['src/old.ts', 'src/new.ts']]),
'/private/var/repo',
);
const attr = service.getFileAttribution('/var/repo/src/new.ts')!;
expect(attr.aiContribution).toBe(
'old ai text'.length + 'new ai text'.length,
);
expect(
service.getFileAttribution('/var/repo/src/old.ts'),
).toBeUndefined();
});
it('canonicalises keys on snapshot restore', () => {
const service = CommitAttributionService.getInstance();
service.restoreFromSnapshot({
type: 'attribution-snapshot',
surface: 'cli',
// Snapshot written before the canonicalisation fix could carry
// either form; restore should normalise to canonical.
fileStates: {
'/var/repo/src/legacy.ts': {
aiContribution: 99,
aiCreated: false,
contentHash: '',
},
},
promptCount: 0,
promptCountAtLastCommit: 0,
});
// Lookup under the canonical form succeeds even though the
// snapshot wrote the symlink form.
expect(
service.getFileAttribution('/private/var/repo/src/legacy.ts')!
.aiContribution,
).toBe(99);
});
// A snapshot straddling the canonicalisation fix can carry both
// the symlinked and canonical paths for the same file. After
// realpathOrSelf normalises them, the second entry to land
// would overwrite the first if we just `set()` — losing the
// first form's accumulated aiContribution. Merge instead.
it('merges duplicate entries collapsed by canonicalisation', () => {
const service = CommitAttributionService.getInstance();
service.restoreFromSnapshot({
type: 'attribution-snapshot',
surface: 'cli',
fileStates: {
'/var/repo/src/dup.ts': {
aiContribution: 30,
aiCreated: false,
contentHash: 'old',
},
'/private/var/repo/src/dup.ts': {
aiContribution: 70,
aiCreated: true,
contentHash: 'new',
},
},
promptCount: 0,
promptCountAtLastCommit: 0,
});
const restored = service.getFileAttribution(
'/private/var/repo/src/dup.ts',
)!;
expect(restored.aiContribution).toBe(100);
// aiCreated is OR'd: any form carrying true wins.
expect(restored.aiCreated).toBe(true);
});
// A corrupted snapshot with promptCountAtLastCommit > promptCount
// would surface a negative `getPromptsSinceLastCommit()` and
// propagate as a "(-3)-shotted" trailer into PR text.
it('clamps promptCountAtLastCommit to promptCount on restore', () => {
const service = CommitAttributionService.getInstance();
service.restoreFromSnapshot({
type: 'attribution-snapshot',
surface: 'cli',
fileStates: {},
promptCount: 5,
promptCountAtLastCommit: 99,
});
expect(service.getPromptsSinceLastCommit()).toBe(0);
});
// `surface` lands verbatim in the git-notes payload and is used
// as a Map key. Non-string values would coerce into
// `[object Object]` etc. Fall back to the current client surface.
it.each([
['object', { foo: 'bar' }],
['number', 42],
['null', null],
['empty string', ''],
])(
'falls back to client surface when snapshot.surface is non-string (%s)',
(_label, badValue) => {
const service = CommitAttributionService.getInstance();
service.restoreFromSnapshot({
type: 'attribution-snapshot',
surface: badValue as unknown as string,
fileStates: {},
promptCount: 0,
promptCountAtLastCommit: 0,
});
// getClientSurface() returns 'cli' in tests (no env var set).
expect(service.getSurface()).toBe('cli');
},
);
// Envelope-level corruption: a payload whose `type` discriminator
// is wrong (or whose top-level shape is non-object) must reset to
// a clean state instead of polluting fileAttributions. The
// resume-time caller passes `snapshot as AttributionSnapshot`
// from a structural cast off `unknown`, so the runtime value
// could be anything.
it.each([
['null', null],
['array', []],
['string', 'snapshot'],
['number', 42],
['wrong type discriminator', { type: 'something-else' }],
['missing type', { fileStates: {} }],
])(
'resets to fresh state when snapshot envelope is malformed (%s)',
(_label, badPayload) => {
const service = CommitAttributionService.getInstance();
// Seed some pre-existing state to confirm the reset clears it.
service.recordEdit('/project/preexisting.ts', null, 'hello');
expect(
service.getFileAttribution('/project/preexisting.ts'),
).toBeDefined();
service.restoreFromSnapshot(
badPayload as unknown as Parameters<
typeof service.restoreFromSnapshot
>[0],
);
expect(
service.getFileAttribution('/project/preexisting.ts'),
).toBeUndefined();
expect(service.getSurface()).toBe('cli');
expect(service.getPromptsSinceLastCommit()).toBe(0);
},
);
// `fileStates` must be a plain object; otherwise Object.entries
// would happily iterate an array's [index, value] pairs and seed
// fileAttributions with numeric-string keys.
it.each([
['array', []],
['string', 'oops'],
['number', 42],
['null', null],
])(
'ignores non-object fileStates (%s) without polluting attribution map',
(_label, badFileStates) => {
const service = CommitAttributionService.getInstance();
service.restoreFromSnapshot({
type: 'attribution-snapshot',
surface: 'cli',
fileStates: badFileStates as unknown as Record<
string,
{ aiContribution: number; aiCreated: boolean; contentHash: string }
>,
promptCount: 0,
promptCountAtLastCommit: 0,
});
expect(service.hasAttributions()).toBe(false);
},
);
});
});


@@ -0,0 +1,916 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
/**
* Commit Attribution Service
*
* Tracks character-level contribution ratios between AI and humans per file.
* When a git commit is made, this data is combined with git diff analysis to
* calculate real AI vs human contribution percentages, stored as git notes.
*
* Features:
* - Character-level prefix/suffix diff algorithm
* - Real AI/human contribution ratio via git diff
* - Surface tracking (cli/ide/api/sdk)
* - Prompt counting (since-last-commit window)
* - Snapshot/restore for session persistence
* - Generated file exclusion
*/
import { createHash } from 'node:crypto';
import * as fs from 'node:fs';
import * as path from 'node:path';
import { createDebugLogger } from '../utils/debugLogger.js';
import { isGeneratedFile } from './generatedFiles.js';
const debugLogger = createDebugLogger('COMMIT_ATTRIBUTION');
/**
* Strip the per-platform / per-encoding noise (leading UTF-8 BOM,
* CRLF line endings) so two byte-different but semantically-identical
* versions of the same content hash to the same value.
*
* Edit and WriteFile preserve the user's BOM/lineEnding choice when
* writing back, so the on-disk bytes can include CRLF or a leading
* U+FEFF even when AI's `recordEdit` was given LF-normalised content
* with no BOM. Without this normalisation, a `git show <sha>:<rel>`
* read of a BOM/CRLF file would always mismatch AI's recorded hash
* and drop the entry on every commit. The hash is metadata (used for
* divergence detection only); collapsing these encoding differences
* gives the right comparison semantics.
*/
function canonicaliseForHash(content: string): string {
// Strip a single leading UTF-8 BOM (U+FEFF). writeTextFile's
// BOM mode prepends BOM bytes to the on-disk file independently of
// whether AI's input included the BOM character in its string.
let normalised =
content.length > 0 && content.charCodeAt(0) === 0xfeff
? content.slice(1)
: content;
// Normalise CRLF → LF. writeTextFile writes CRLF when the file's
// detected line-ending is CRLF; AI's recordEdit input is typically
// LF-normalised (Edit's `currentContent.replace(/\r\n/g, '\n')`
// happens before recordEdit fires).
normalised = normalised.replace(/\r\n/g, '\n');
return normalised;
}
function computeContentHash(content: string): string {
return createHash('sha256')
.update(canonicaliseForHash(content))
.digest('hex');
}
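The canonicalise-then-hash pairing above can be exercised in isolation. A standalone sketch (re-implementing the same two steps with only `node:crypto`) shows why a BOM/CRLF file survives validation against AI's LF-normalised input:

```typescript
import { createHash } from 'node:crypto';

// Same canonicalisation as above: strip one leading U+FEFF, CRLF → LF.
function canonicalise(content: string): string {
  const noBom =
    content.length > 0 && content.charCodeAt(0) === 0xfeff
      ? content.slice(1)
      : content;
  return noBom.replace(/\r\n/g, '\n');
}

function hashOf(content: string): string {
  return createHash('sha256').update(canonicalise(content)).digest('hex');
}

// A BOM + CRLF file and its LF, BOM-less twin hash identically, so
// validateAgainst won't drop the entry on Windows-style files.
console.log(hashOf('\uFEFFfoo\r\nbar\r\n') === hashOf('foo\nbar\n')); // true
```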
/**
* Resolve symlinks on a path. On macOS in particular, `/var` is a
* symlink to `/private/var`, so an absolute path captured via
* `fs.realpathSync` (what edit.ts/write-file.ts records) and
* `path.relative` against `git rev-parse --show-toplevel` (which may
* report either form) won't line up unless we normalise both sides.
*
* For DELETED leaves (file no longer exists on disk), realpathSync
* throws but the parent directory is still resolvable. Canonicalise
* the parent and rejoin the missing basename so a deleted file's
* lookup still hits the canonical key recordEdit stored before the
* file was removed. Without this, a `getFileAttribution(deletedPath)`
* call after the file was deleted would fall back to the
* non-canonical input and miss the canonical entry on macOS.
*/
function realpathOrSelf(p: string): string {
try {
return fs.realpathSync(p);
} catch {
try {
const parent = path.dirname(p);
const realParent = fs.realpathSync(parent);
return path.join(realParent, path.basename(p));
} catch {
return p;
}
}
}
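The deleted-leaf fallback is easiest to see with a real symlink. A small sketch (temp-dir names are illustrative; requires a platform where unprivileged symlinks work, e.g. Linux/macOS) re-creates the function and resolves a path to a file that does not exist under a symlinked parent:

```typescript
import * as fs from 'node:fs';
import * as os from 'node:os';
import * as path from 'node:path';

// Same shape as the function above, repeated for a runnable demo.
function realpathOrSelf(p: string): string {
  try {
    return fs.realpathSync(p);
  } catch {
    try {
      // Leaf is gone: canonicalise the parent, rejoin the basename.
      return path.join(fs.realpathSync(path.dirname(p)), path.basename(p));
    } catch {
      return p;
    }
  }
}

const real = fs.mkdtempSync(path.join(os.tmpdir(), 'rp-real-'));
const link = path.join(os.tmpdir(), `rp-link-${process.pid}`);
fs.symlinkSync(real, link);
// 'gone.ts' never existed, yet the lookup still lands on the
// canonical key that recordEdit would have stored.
const resolved = realpathOrSelf(path.join(link, 'gone.ts'));
console.log(resolved === path.join(fs.realpathSync(real), 'gone.ts')); // true
fs.unlinkSync(link);
fs.rmSync(real, { recursive: true, force: true });
```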
// ---------------------------------------------------------------------------
// Types
// ---------------------------------------------------------------------------
export interface FileAttribution {
/** Total characters contributed by AI (accumulated across edits) */
aiContribution: number;
/** Whether the file was created by AI */
aiCreated: boolean;
/**
* SHA-256 of the file content immediately after AI's last write. Used
* to detect out-of-band mutation (paste-replace via external editor,
* `rm` + recreate, manual save) so AI's accumulated counter doesn't
* silently get credited to subsequent human edits. recordEdit checks
* this on every call (resets when the input `oldContent` doesn't
* match), and `validateAgainst` re-verifies before a commit note is
* generated to catch user edits that happened entirely outside the
* Edit/Write tools.
*/
contentHash: string;
}
/**
* Per-file attribution detail in the git notes payload.
*
* Field naming caveat: `aiChars` and `humanChars` look like literal
* UTF-16/UTF-8 character counts, but they are NOT. Both are
* heuristic diff-size proxies derived from `git diff --numstat`:
* for text files the value is `(addedLines + deletedLines) × 40`
* (the 40-char/line heuristic), and for binary files both sides
* are reported as a flat `1024`. The per-file AI accumulator from
* `recordEdit` is then clamped against this same line-based ceiling.
*
* Practical consequence: a commit adding 1000 one-character lines
* and one adding 1000 thousand-character lines both report
* `aiChars = 40000`; a 5 MB image change and a 1-byte binary tweak
* both report `1024`. `percent` (and `summary.aiPercent`) is
largely insulated from this, since both numerator and denominator
use the same heuristic, but consumers aggregating raw
* `aiChars`/`humanChars` for compliance reporting will get
* systematically biased numbers and should treat these fields as
* "approximate change size in proxy-chars" rather than literal
* char counts.
*/
export interface FileAttributionDetail {
/** Heuristic diff-size proxy (NOT a literal char count — see interface doc). */
aiChars: number;
/** Heuristic diff-size proxy (NOT a literal char count — see interface doc). */
humanChars: number;
/**
* AI share of the per-file diff, rounded to integer percent.
* Robust against the heuristic in `aiChars`/`humanChars` because
* both sides of the ratio use the same proxy; safe to aggregate.
*/
percent: number;
surface?: string;
}
/**
* Full attribution payload stored as git notes JSON.
*
* Same `aiChars`/`humanChars` caveat as `FileAttributionDetail`:
* those summed totals are sums of heuristic diff-size proxies, not
* literal character counts. `aiPercent` (and per-file `percent`)
* use the same proxy on both sides of the ratio, so the percentage
* is the field consumers should rely on for cross-commit
* aggregation; the raw chars values are useful for ordering
* commits within the same payload but should not be summed across
* unrelated commits as if they were byte counts.
*/
export interface CommitAttributionNote {
version: 1;
generator: string;
files: Record<string, FileAttributionDetail>;
summary: {
/** AI share of the whole commit, rounded to integer percent. */
aiPercent: number;
/** Sum of per-file `aiChars` heuristic proxies (see FileAttributionDetail). */
aiChars: number;
/** Sum of per-file `humanChars` heuristic proxies (see FileAttributionDetail). */
humanChars: number;
totalFilesTouched: number;
surfaces: string[];
};
surfaceBreakdown: Record<string, { aiChars: number; percent: number }>;
/**
* Sample of generated/vendored files that were excluded from
* attribution. Capped at `MAX_EXCLUDED_GENERATED_SAMPLE` paths so a
* commit churning thousands of `dist/` artifacts can't blow past the
* 30 KB note budget and silently drop attribution for the real
* source files in the same commit. Use `excludedGeneratedCount` for
* the true total.
*/
excludedGenerated: string[];
/** Total count of excluded files (≥ excludedGenerated.length). */
excludedGeneratedCount: number;
promptCount: number;
}
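For orientation, a hypothetical instance of the payload (all values illustrative, shaped to the interface above — two files, one fully AI-written against a 400-proxy-char diff, one untouched by AI):

```typescript
// Hypothetical CommitAttributionNote value; numbers are illustrative.
const exampleNote = {
  version: 1 as const,
  generator: 'Qwen-Coder',
  files: {
    'src/main.ts': { aiChars: 200, humanChars: 200, percent: 50, surface: 'cli' },
    'src/human.ts': { aiChars: 0, humanChars: 200, percent: 0 },
  },
  summary: {
    aiPercent: 33, // 200 AI proxy-chars of 600 total, rounded
    aiChars: 200,
    humanChars: 400,
    totalFilesTouched: 2,
    surfaces: ['cli'],
  },
  surfaceBreakdown: { cli: { aiChars: 200, percent: 100 } },
  excludedGenerated: ['package-lock.json'],
  excludedGeneratedCount: 1,
  promptCount: 2,
};

console.log(JSON.stringify(exampleNote.summary));
```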
/**
* Upper bound on the number of excluded-generated paths we serialize
* into the git note. Keeps the JSON payload bounded for commits with
* lots of generated artifacts.
*/
export const MAX_EXCLUDED_GENERATED_SAMPLE = 50;
/** Result of running git commands to get staged file info. */
export interface StagedFileInfo {
files: string[];
diffSizes: Map<string, number>;
deletedFiles: Set<string>;
/**
* Git rename map from old repo-relative path to new repo-relative path.
* Populated from `git diff --name-status --find-renames`. Used to move
* pending attribution from the pre-rename absolute key to the post-rename
* key before payload generation and cleanup.
*/
renamedFiles: Map<string, string>;
/**
* Absolute path of the repository root (`git rev-parse --show-toplevel`).
* Optional for backward compatibility with synthetic test inputs;
* production callers should set it so file paths in `files` (which are
* relative to the repo root) align with absolute paths tracked by the
* attribution service. When absent, callers may fall back to the
* configured target directory at the cost of zeroed-out attribution
* for files outside that directory.
*/
repoRoot?: string;
}
/**
* On-disk schema version for AttributionSnapshot. Bump when the shape
* changes incompatibly so restoreFromSnapshot can refuse / migrate
* stale payloads instead of silently producing NaN counters or
* mismatched key shapes.
*/
export const ATTRIBUTION_SNAPSHOT_VERSION = 1;
/** Serializable snapshot for session persistence. */
export interface AttributionSnapshot {
type: 'attribution-snapshot';
/** Schema version; absent on pre-versioning snapshots, treated as 1. */
version?: number;
surface: string;
fileStates: Record<string, FileAttribution>;
promptCount: number;
promptCountAtLastCommit: number;
}
// ---------------------------------------------------------------------------
// Model name sanitization
// ---------------------------------------------------------------------------
const INTERNAL_MODEL_PATTERNS = [
/qwen[-_]?\d+(\.\d+)?[-_]?b?/i,
/qwen[-_]?coder[-_]?\d*/i,
/qwen[-_]?max/i,
/qwen[-_]?plus/i,
/qwen[-_]?turbo/i,
];
const SANITIZED_GENERATOR_NAME = 'Qwen-Coder';
function sanitizeModelName(name: string): string {
for (const pattern of INTERNAL_MODEL_PATTERNS) {
if (pattern.test(name)) {
return SANITIZED_GENERATOR_NAME;
}
}
return name;
}
// ---------------------------------------------------------------------------
// Utilities
// ---------------------------------------------------------------------------
/**
* Defensive coercions for restoring snapshot fields. A snapshot can
* arrive with `undefined` / wrong-type fields if the on-disk JSON was
* partially written or pre-dates the current schema; without coercion
 * they would flow through `Math.min(undefined, n)` (which is `NaN`) into
* git-notes payload.
*/
function sanitiseCount(v: unknown): number {
return typeof v === 'number' && Number.isFinite(v) && v >= 0 ? v : 0;
}
function sanitiseAttribution(v: unknown): FileAttribution {
const obj = (v ?? {}) as Partial<FileAttribution>;
return {
aiContribution: sanitiseCount(obj.aiContribution),
aiCreated: typeof obj.aiCreated === 'boolean' ? obj.aiCreated : false,
contentHash: typeof obj.contentHash === 'string' ? obj.contentHash : '',
};
}
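A quick sketch of how the coercion behaves on damaged snapshot fields; the function body is copied from above so the snippet stands alone:

```typescript
// Standalone copy of sanitiseCount, mirroring the implementation above.
function sanitiseCountSketch(v: unknown): number {
  return typeof v === 'number' && Number.isFinite(v) && v >= 0 ? v : 0;
}

// Every damaged shape collapses to the safe default 0; only a finite
// non-negative number survives.
const coerced = [undefined, null, -3, NaN, '7', 42].map(sanitiseCountSketch);
// → [0, 0, 0, 0, 0, 42]
```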
/**
* Surface label embedded in the git-notes payload. Defaults to `'cli'`
* for the qwen-code CLI; embedders (IDE extensions, SDK consumers) can
* override by setting `QWEN_CODE_ENTRYPOINT` before construction so the
* note records where the contribution was authored.
*/
export function getClientSurface(): string {
return process.env['QWEN_CODE_ENTRYPOINT'] ?? 'cli';
}
// ---------------------------------------------------------------------------
// Service
// ---------------------------------------------------------------------------
export class CommitAttributionService {
private static instance: CommitAttributionService | null = null;
/** Per-file AI contribution tracking (keyed by absolute path) */
private fileAttributions: Map<string, FileAttribution> = new Map();
/** Client surface (cli, ide, api, sdk, etc.) */
private surface: string = getClientSurface();
// -- Prompt counting --
private promptCount: number = 0;
private promptCountAtLastCommit: number = 0;
private constructor() {}
static getInstance(): CommitAttributionService {
if (!CommitAttributionService.instance) {
CommitAttributionService.instance = new CommitAttributionService();
}
return CommitAttributionService.instance;
}
/** Reset singleton for testing. */
static resetInstance(): void {
CommitAttributionService.instance = null;
}
// -----------------------------------------------------------------------
// Recording
// -----------------------------------------------------------------------
/**
* Record an AI edit to a file.
* Uses prefix/suffix matching for precise character-level contribution.
*
* `filePath` is canonicalised via `fs.realpathSync` before being used
 * as a key, so symlinked paths (e.g. `/var/...` vs `/private/var/...`
* on macOS) collapse to the same entry instead of silently producing
* two parallel records.
*
* Divergence detection: if a tracked entry's recorded `contentHash`
* doesn't match the hash of the `oldContent` we received here, the
* file was changed out-of-band between AI's last write and this
* call (paste-replace via external editor, `git checkout`, manual
* save, ...). Reset `aiContribution` and `aiCreated` to 0/false
* before applying the new edit so prior AI work that the user
* since overwrote isn't credited to the next commit.
*/
recordEdit(
filePath: string,
oldContent: string | null,
newContent: string,
): void {
const key = realpathOrSelf(filePath);
const existing = this.fileAttributions.get(key);
const isNewFile = oldContent === null;
let aiContribution = existing?.aiContribution ?? 0;
let aiCreated = existing?.aiCreated ?? false;
// Fresh-file lifetime: if we have a prior tracked state for this
// path BUT the caller is reporting `oldContent === null` (the file
// didn't exist on disk at edit time), the previous tracking was
// for a since-deleted file at the same path — accumulating across
// distinct file lifetimes would credit AI for chars from the old
// file that no longer exist. Reset before counting the new
// contribution. Common path: AI creates `foo.ts` → user / shell
// `rm foo.ts` → AI re-creates `foo.ts` from scratch.
if (existing && isNewFile) {
aiContribution = 0;
aiCreated = false;
}
// Out-of-band mutation: if we have a prior tracked state AND the
// input `oldContent` doesn't match the hash we recorded after
// AI's last write, the file diverged out-of-band (paste-replace
// via external editor, `git checkout`, manual save). Reset the
// accumulator before applying the new edit.
//
// Skip when `existing.contentHash` is empty: that's a legacy
// snapshot (pre-divergence-detection schema) where we never
// recorded the post-write hash. Comparing an empty hash to the
// actual file hash would always trip the reset and silently wipe
// AI work that's still on disk.
if (existing && existing.contentHash && oldContent !== null) {
const oldHash = computeContentHash(oldContent);
if (existing.contentHash !== oldHash) {
aiContribution = 0;
aiCreated = false;
}
}
const contribution = computeCharContribution(oldContent ?? '', newContent);
aiContribution += contribution;
if (isNewFile) aiCreated = true;
this.fileAttributions.set(key, {
aiContribution,
aiCreated,
contentHash: computeContentHash(newContent),
});
}
/**
* Re-hash each tracked file's content via a caller-supplied reader
* and drop entries whose hash doesn't match what AI's last write
 * recorded. Catches the cases recordEdit's input-hash check can't
 * see: the user (or another tool) modified the file entirely
* outside the Edit/Write tools, then committed it. Without this,
* the AI's stale aiContribution would attach to the human-only
* diff at commit time and credit AI for human work.
*
* `getContent(absPath)` returns the bytes the caller wants to
* compare against, or `null` if the entry shouldn't be checked
* (deletion, unreadable, file not in the relevant scope). Returning
* `null` leaves the entry alone rather than dropping it.
*
* Production caller (`attachCommitAttribution`) passes a reader
* that fetches the COMMITTED blob (`git show HEAD:<rel>`) for files
* actually in the just-made commit, returning null for everything
* else. The "committed blob" choice (rather than the live working
 * tree) is what makes a "git add the AI's edit, make extra unstaged
 * edits, git commit" flow correctly attribute the commit to AI even
 * though the working-tree file no longer matches AI's recorded
 * hash.
*/
validateAgainst(getContent: (absPath: string) => string | null): void {
for (const [key, attr] of this.fileAttributions) {
// Skip legacy entries that have no recorded post-write hash —
// we can't tell stale from fresh, so leave them alone.
if (!attr.contentHash) continue;
const current = getContent(key);
if (current === null) continue; // not a divergence signal
if (computeContentHash(current) !== attr.contentHash) {
debugLogger.debug(
`validateAgainst: dropping stale attribution for ${key} (hash diverged)`,
);
this.fileAttributions.delete(key);
}
}
}
// -----------------------------------------------------------------------
// Prompt counting
// -----------------------------------------------------------------------
incrementPromptCount(): void {
this.promptCount++;
}
getPromptCount(): number {
return this.promptCount;
}
/** Prompts since last commit (for "N-shotted" display). */
getPromptsSinceLastCommit(): number {
return this.promptCount - this.promptCountAtLastCommit;
}
// -----------------------------------------------------------------------
// Querying
// -----------------------------------------------------------------------
getAttributions(): Map<string, FileAttribution> {
const copy = new Map<string, FileAttribution>();
for (const [k, v] of this.fileAttributions) {
copy.set(k, { ...v });
}
return copy;
}
getFileAttribution(filePath: string): FileAttribution | undefined {
// Canonicalise so callers don't have to know about the realpath
// normalization happening inside `recordEdit`.
const attr = this.fileAttributions.get(realpathOrSelf(filePath));
return attr ? { ...attr } : undefined;
}
hasAttributions(): boolean {
return this.fileAttributions.size > 0;
}
getSurface(): string {
return this.surface;
}
/**
* Clear file attribution data. Called after commit (success or failure).
* @param commitSucceeded If true, also updates the "at last commit"
* counters so getPromptsSinceLastCommit() resets to 0.
*/
clearAttributions(commitSucceeded: boolean = true): void {
if (commitSucceeded) {
this.promptCountAtLastCommit = this.promptCount;
}
this.fileAttributions.clear();
}
/**
* Clear attribution data for the specific files that just landed in
* a commit, leaving entries for files the user *didn't* include
* (partial commits, `git add A && git commit -m "..."`) intact so
* they're still credited on a later commit. Snapshots prompt
* counters since a commit did succeed.
*
* Inputs must already be canonical absolute paths. The caller
* should resolve repo-relative diff entries against a canonical
 * (realpath'd) repo root rather than realpathing each leaf at
 * cleanup time: the leaf for a just-deleted file no longer exists,
* so per-leaf `fs.realpathSync` would fail and fall back to a
* non-canonical path that misses the stored canonical key.
*/
clearAttributedFiles(committedAbsolutePaths: Set<string>): void {
this.promptCountAtLastCommit = this.promptCount;
for (const p of committedAbsolutePaths) {
this.fileAttributions.delete(p);
}
}
/**
* Snapshot the prompt counter as the new "last commit" without
* clearing per-file attribution. Used when a commit landed but we
* can't reliably determine which files were in it (multi-commit
* chain we won't write a note for, attribution toggle off, diff
* analysis failed). Wholesale-clearing in those branches would
* silently wipe pending AI edits for *unrelated* files the user
 * didn't stage, a worse failure mode than the small risk of
* stale per-file state for files that did just land.
*/
noteCommitWithoutClearing(): void {
this.promptCountAtLastCommit = this.promptCount;
}
/**
* Resolve a set of repo-relative file paths to the canonical absolute
* keys actually stored in the attribution map. Used by cleanup to
* partial-clear only the files that just landed in a commit.
*
* Matching by walking `fileAttributions` (instead of resolving each
* relative path with `path.resolve` + `fs.realpathSync`) is the only
* approach that handles all of: deleted files (where realpathSync
* throws), intermediate-symlink directories (where path.resolve only
* canonicalises the base), and renamed files (where the diff-time
 * relative path differs from the recordEdit-time absolute path;
 * still no match here, that's a rename-tracking concern handled
* separately). Each tracked key is canonical (recordEdit ran it
* through `realpathOrSelf`), so its computed relative form against
* the canonical repo root is what generateNotePayload uses too.
*/
matchCommittedFiles(
relativeFiles: Iterable<string>,
canonicalRepoRoot: string,
): Set<string> {
const wanted = new Set(relativeFiles);
const matched = new Set<string>();
for (const key of this.fileAttributions.keys()) {
const rel = path
.relative(canonicalRepoRoot, key)
.split(path.sep)
.join('/');
if (wanted.has(rel)) {
matched.add(key);
}
}
return matched;
}
/**
* Move pending attribution across git renames before matching committed files.
*
* `recordEdit` stores attribution by canonical absolute path at edit time.
* If the user later commits `git mv old.ts new.ts`, git reports the committed
* file as `new.ts` while our map is still keyed by `old.ts`. Without moving
* the key first, the note either misses the AI-authored rename entirely or
* treats the old path as a deletion depending on diff settings.
*/
applyCommittedRenames(
renamedFiles: ReadonlyMap<string, string>,
canonicalRepoRoot: string,
): void {
if (renamedFiles.size === 0) return;
// Build the new-key path using POSIX semantics so the joined
// string matches `git diff --name-only` output (which is always
// forward-slash) and isn't subject to `path.win32.join`'s
// backslash conversion. `realpathOrSelf` is what canonicalises
// back to the platform's actual storage form: on Windows it
// calls `fs.realpathSync` which accepts mixed slashes and returns
// backslash form, matching what `recordEdit` stored.
const posixRepoRoot = canonicalRepoRoot.split(path.sep).join('/');
for (const [key, attr] of [...this.fileAttributions.entries()]) {
const rel = path
.relative(canonicalRepoRoot, key)
.split(path.sep)
.join('/');
const renamedRel = renamedFiles.get(rel);
if (!renamedRel) continue;
const renamedAbs = realpathOrSelf(
path.posix.join(posixRepoRoot, renamedRel),
);
if (renamedAbs === key) continue;
const existing = this.fileAttributions.get(renamedAbs);
if (existing) {
this.fileAttributions.set(renamedAbs, {
aiContribution: existing.aiContribution + attr.aiContribution,
aiCreated: existing.aiCreated || attr.aiCreated,
// Prefer the destination hash: if AI edited after the rename,
// that entry reflects the freshest post-write content.
contentHash: existing.contentHash || attr.contentHash,
});
} else {
this.fileAttributions.set(renamedAbs, { ...attr });
}
this.fileAttributions.delete(key);
}
}
// -----------------------------------------------------------------------
// Snapshot / restore (session persistence)
// -----------------------------------------------------------------------
/** Serialize current state for session persistence. */
toSnapshot(): AttributionSnapshot {
const fileStates: Record<string, FileAttribution> = {};
for (const [k, v] of this.fileAttributions) {
fileStates[k] = { ...v };
}
return {
type: 'attribution-snapshot',
version: ATTRIBUTION_SNAPSHOT_VERSION,
surface: this.surface,
fileStates,
promptCount: this.promptCount,
promptCountAtLastCommit: this.promptCountAtLastCommit,
};
}
/** Restore state from a persisted snapshot. */
restoreFromSnapshot(snapshot: AttributionSnapshot): void {
// The resume-time caller (client.ts) passes `snapshot` as a
// structural cast from `unknown`, so its TS-typed shape is only
// a hint — the actual runtime value can be anything (corrupted
// JSONL line, hand-edited session file, schema drift). Bail to
// a clean reset on any envelope-level shape mismatch:
// - non-object / null / array
// - wrong `type` discriminator
// - non-numeric `version` (after the `version ?? 1` default)
// - non-object `fileStates`
// Per-field coercion (sanitiseAttribution etc.) handles damage
// INSIDE a structurally valid snapshot; this gate stops a
// wholesale-wrong payload from polluting fileAttributions with
// garbage keys before per-field validation can run.
const isPlainObject = (v: unknown): v is Record<string, unknown> =>
typeof v === 'object' && v !== null && !Array.isArray(v);
const looksLikeSnapshot =
isPlainObject(snapshot) &&
(snapshot as Record<string, unknown>)['type'] === 'attribution-snapshot';
if (!looksLikeSnapshot) {
this.fileAttributions.clear();
this.surface = getClientSurface();
this.promptCount = 0;
this.promptCountAtLastCommit = 0;
return;
}
// Future schema bumps land here. Treat absent `version` as 1
// (the schema in production at the time this field was added) so
// existing on-disk snapshots restore cleanly.
const snapshotVersion = snapshot.version ?? 1;
if (snapshotVersion !== ATTRIBUTION_SNAPSHOT_VERSION) {
// Don't trust a stale shape — its fields may have moved or
// changed semantics. Reset to a fresh state rather than
// splice incompatible data.
this.fileAttributions.clear();
this.surface = getClientSurface();
this.promptCount = 0;
this.promptCountAtLastCommit = 0;
return;
}
// `surface` is embedded verbatim in the git-notes payload and used
// as a Map/Record key downstream. A corrupted snapshot with a
// non-string value (e.g. `{}`, `42`, `null`) would coerce into
// strings like `[object Object]` and break the payload shape.
// Fall back to the current client surface when the stored value
// isn't a string.
this.surface =
typeof snapshot.surface === 'string' && snapshot.surface.length > 0
? snapshot.surface
: getClientSurface();
// A corrupted or partially-written snapshot can leave numeric
// counters as `undefined`; without coercion, downstream
// `Math.min(undefined, n)` produces NaN that flows into the
// git-notes payload. Coerce per-field with a typed default.
this.promptCount = sanitiseCount(snapshot.promptCount);
this.promptCountAtLastCommit = sanitiseCount(
snapshot.promptCountAtLastCommit,
);
// Enforce the invariant `atLastCommit <= total`: a corrupted /
// partially-written snapshot with the inverse would surface a
// negative `getPromptsSinceLastCommit()` and propagate as a
// "(-3)-shotted" trailer into PR descriptions.
if (this.promptCountAtLastCommit > this.promptCount) {
this.promptCountAtLastCommit = this.promptCount;
}
this.fileAttributions.clear();
// Reject a corrupted `fileStates` (e.g. an array, a string, or
// null) before iterating: `Object.entries(<array>)` would happily
// produce `[index, value]` pairs and seed fileAttributions with
// numeric-string keys.
const fileStates = isPlainObject(snapshot.fileStates)
? snapshot.fileStates
: {};
for (const [k, v] of Object.entries(fileStates)) {
// Re-canonicalise on restore so old snapshots (written before
// recordEdit started running keys through realpath) end up
// with the same shape as newly-recorded entries. If both the
// symlinked and canonical forms were stored under separate
// keys (e.g. a session straddling the canonicalisation fix),
// collapsing them onto the same canonical key MUST merge their
// attribution rather than overwrite — otherwise the second
// entry to land wins and the AI's accumulated contribution from
// the first form is silently dropped.
const canonicalKey = realpathOrSelf(k);
const incoming = sanitiseAttribution(v);
const existing = this.fileAttributions.get(canonicalKey);
if (existing) {
// Sum aiContribution and OR aiCreated. Pick the
// most-recently-recorded contentHash (incoming wins) so
// post-restore divergence checks compare against the freshest
// hash; an old form's stale hash would force unnecessary
// resets on the next recordEdit.
this.fileAttributions.set(canonicalKey, {
aiContribution: existing.aiContribution + incoming.aiContribution,
aiCreated: existing.aiCreated || incoming.aiCreated,
contentHash: incoming.contentHash || existing.contentHash,
});
} else {
this.fileAttributions.set(canonicalKey, incoming);
}
}
}
// -----------------------------------------------------------------------
// Payload generation
// -----------------------------------------------------------------------
/**
* Generate the git notes JSON payload by combining tracked AI contributions
* with staged file information from git.
*/
generateNotePayload(
stagedInfo: StagedFileInfo,
baseDir: string,
generatorName?: string,
): CommitAttributionNote {
const generator = sanitizeModelName(
generatorName ?? SANITIZED_GENERATOR_NAME,
);
const files: Record<string, FileAttributionDetail> = {};
const excludedGenerated: string[] = [];
let excludedGeneratedCount = 0;
const surfaceCounts: Record<string, number> = {};
let totalAiChars = 0;
let totalHumanChars = 0;
// Build lookup: relative path → tracked AI contribution. Keys in
// `fileAttributions` are already canonical (recordEdit runs them
// through realpath); we only need to canonicalise `baseDir`,
// which comes from `git rev-parse --show-toplevel` and may be a
// symlink (e.g. macOS `/var` → `/private/var`). Without that
// canonicalisation `path.relative` would produce a `../...` key
// that never matches the diff output. Normalize separators to
// forward slashes so git paths line up on Windows.
const canonicalBase = realpathOrSelf(baseDir);
const aiLookup = new Map<string, FileAttribution>();
for (const [absPath, attr] of this.fileAttributions) {
const rel = path
.relative(canonicalBase, absPath)
.split(path.sep)
.join('/');
aiLookup.set(rel, attr);
}
for (const relFile of stagedInfo.files) {
if (isGeneratedFile(relFile)) {
excludedGeneratedCount++;
// Cap the sample so a commit churning thousands of `dist/`
// artifacts can't blow past the 30 KB note budget.
if (excludedGenerated.length < MAX_EXCLUDED_GENERATED_SAMPLE) {
excludedGenerated.push(relFile);
}
continue;
}
const tracked = aiLookup.get(relFile);
const diffSize = stagedInfo.diffSizes.get(relFile) ?? 0;
const isDeleted = stagedInfo.deletedFiles.has(relFile);
let aiChars: number;
let humanChars: number;
if (tracked) {
// Clamp aiChars to diffSize so aiChars+humanChars stays
// consistent with the committed change magnitude derived from
// `git diff --numstat`. Without this, cases where
// tracked.aiContribution exceeds the committed change size
// can leave aiChars > diffSize: humanChars then snaps to 0
// but aiChars stays large, inflating the per-file total
// beyond what was committed.
aiChars = Math.min(tracked.aiContribution, diffSize);
humanChars = Math.max(0, diffSize - aiChars);
} else if (isDeleted) {
// Deleted files with no AI tracking are attributed entirely to
// the human. diffSize comes from `git diff --numstat` so empty
// deletions legitimately have diffSize=0 — a magic fallback
// would only inflate totals.
aiChars = 0;
humanChars = diffSize;
} else {
aiChars = 0;
humanChars = diffSize;
}
const total = aiChars + humanChars;
const percent = total > 0 ? Math.round((aiChars / total) * 100) : 0;
files[relFile] = { aiChars, humanChars, percent, surface: this.surface };
totalAiChars += aiChars;
totalHumanChars += humanChars;
surfaceCounts[this.surface] =
(surfaceCounts[this.surface] ?? 0) + aiChars;
}
const totalChars = totalAiChars + totalHumanChars;
const aiPercent =
totalChars > 0 ? Math.round((totalAiChars / totalChars) * 100) : 0;
// Surface breakdown
const surfaceBreakdown: Record<
string,
{ aiChars: number; percent: number }
> = {};
for (const [surf, chars] of Object.entries(surfaceCounts)) {
surfaceBreakdown[surf] = {
aiChars: chars,
percent: totalChars > 0 ? Math.round((chars / totalChars) * 100) : 0,
};
}
return {
version: 1,
generator,
files,
summary: {
aiPercent,
aiChars: totalAiChars,
humanChars: totalHumanChars,
totalFilesTouched: Object.keys(files).length,
surfaces: [this.surface],
},
surfaceBreakdown,
excludedGenerated,
excludedGeneratedCount,
promptCount: this.getPromptsSinceLastCommit(),
};
}
}
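The clamp-then-split arithmetic inside `generateNotePayload` can be sketched in isolation. The numbers below are illustrative, and `splitContribution` is a standalone helper written for this sketch, not part of the service:

```typescript
// Minimal sketch of the per-file split: clamp tracked AI chars to the
// committed diff size, attribute the remainder to the human, then
// derive a rounded percentage.
function splitContribution(trackedAiChars: number, diffSize: number) {
  const aiChars = Math.min(trackedAiChars, diffSize);
  const humanChars = Math.max(0, diffSize - aiChars);
  const total = aiChars + humanChars;
  const percent = total > 0 ? Math.round((aiChars / total) * 100) : 0;
  return { aiChars, humanChars, percent };
}

// AI tracked 150 chars but only 100 landed in the commit: the clamp
// keeps aiChars + humanChars consistent with the committed change.
const clamped = splitContribution(150, 100);
// → { aiChars: 100, humanChars: 0, percent: 100 }

// Mixed edit: 40 AI chars inside a 100-char diff.
const mixed = splitContribution(40, 100);
// → { aiChars: 40, humanChars: 60, percent: 40 }
```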
// ---------------------------------------------------------------------------
// Character contribution calculation (prefix/suffix algorithm)
// ---------------------------------------------------------------------------
/**
* Compute the character contribution for a file modification.
* Uses common prefix/suffix matching to find the actual changed region,
* then returns the larger of the old/new changed lengths.
*/
export function computeCharContribution(
oldContent: string,
newContent: string,
): number {
if (oldContent === '' || newContent === '') {
return oldContent === '' ? newContent.length : oldContent.length;
}
const minLen = Math.min(oldContent.length, newContent.length);
let prefixEnd = 0;
while (
prefixEnd < minLen &&
oldContent[prefixEnd] === newContent[prefixEnd]
) {
prefixEnd++;
}
let suffixLen = 0;
while (
suffixLen < minLen - prefixEnd &&
oldContent[oldContent.length - 1 - suffixLen] ===
newContent[newContent.length - 1 - suffixLen]
) {
suffixLen++;
}
const oldChangedLen = oldContent.length - prefixEnd - suffixLen;
const newChangedLen = newContent.length - prefixEnd - suffixLen;
return Math.max(oldChangedLen, newChangedLen);
}


@@ -0,0 +1,95 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
import { describe, it, expect } from 'vitest';
import { isGeneratedFile } from './generatedFiles.js';
describe('isGeneratedFile', () => {
it('should exclude lock files', () => {
expect(isGeneratedFile('package-lock.json')).toBe(true);
expect(isGeneratedFile('yarn.lock')).toBe(true);
expect(isGeneratedFile('pnpm-lock.yaml')).toBe(true);
expect(isGeneratedFile('Cargo.lock')).toBe(true);
});
it('should exclude minified files', () => {
expect(isGeneratedFile('app.min.js')).toBe(true);
expect(isGeneratedFile('styles.min.css')).toBe(true);
expect(isGeneratedFile('lib-min.js')).toBe(true);
});
it('should exclude files in dist/build directories', () => {
expect(isGeneratedFile('dist/bundle.js')).toBe(true);
expect(isGeneratedFile('build/output.js')).toBe(true);
expect(isGeneratedFile('src/.next/cache.js')).toBe(true);
});
// `.d.ts` files are commonly authored by hand (declaration files
// for projects without TS sources, ambient module declarations,
// asset shims like `*.d.ts` for `import './x.svg'`); the prior
// blanket exclusion silently dropped AI edits to those files.
// Auto-generated `.d.ts` (e.g. `tsc --declaration` output) tends
// to live under `/dist/` etc. and is still excluded by the
// directory rules.
it('should NOT exclude hand-authored TypeScript declaration files', () => {
expect(isGeneratedFile('types/index.d.ts')).toBe(false);
expect(isGeneratedFile('src/assets.d.ts')).toBe(false);
});
it('should still exclude .d.ts emitted into build directories', () => {
expect(isGeneratedFile('dist/index.d.ts')).toBe(true);
expect(isGeneratedFile('build/types.d.ts')).toBe(true);
});
it('should exclude generated code files', () => {
expect(isGeneratedFile('api.generated.ts')).toBe(true);
expect(isGeneratedFile('schema.pb.go')).toBe(true);
expect(isGeneratedFile('service.grpc.ts')).toBe(true);
});
it('should exclude vendor directories', () => {
expect(isGeneratedFile('vendor/lib/utils.js')).toBe(true);
expect(isGeneratedFile('node_modules/pkg/index.js')).toBe(true);
});
it('should NOT exclude normal source files', () => {
expect(isGeneratedFile('src/main.ts')).toBe(false);
expect(isGeneratedFile('lib/utils.js')).toBe(false);
expect(isGeneratedFile('README.md')).toBe(false);
expect(isGeneratedFile('package.json')).toBe(false);
expect(isGeneratedFile('src/components/Button.tsx')).toBe(false);
});
// Segment-boundary check: project dirs that *contain* a reserved
// word as a substring (e.g. `my-dist`, `xbuild`, `prebuild`) must
// not be caught by the directory rule.
it('should NOT exclude dirs that merely substring-match a reserved name', () => {
expect(isGeneratedFile('my-dist/file.ts')).toBe(false);
expect(isGeneratedFile('xbuild/core.ts')).toBe(false);
expect(isGeneratedFile('rebuild/notes.md')).toBe(false);
expect(isGeneratedFile('preout/x.ts')).toBe(false);
// The filename itself isn't subject to the directory rule.
expect(isGeneratedFile('src/dist.ts')).toBe(false);
});
// `.lock` is no longer a blanket exclusion — only the explicit
// EXCLUDED_FILENAMES (yarn.lock etc.) are dropped.
it('should NOT exclude unknown .lock files (only well-known ones)', () => {
expect(isGeneratedFile('config/feature.lock')).toBe(false);
// Sanity: known lockfiles still excluded.
expect(isGeneratedFile('yarn.lock')).toBe(true);
});
// Terraform: `.terraform.lock.hcl` is generated by `terraform init`
// and dominates Terraform-repo commits. Listed as a known
// generated lockfile in EXCLUDED_FILENAMES.
it('should exclude Terraform provider lockfile', () => {
expect(isGeneratedFile('.terraform.lock.hcl')).toBe(true);
expect(isGeneratedFile('infra/modules/network/.terraform.lock.hcl')).toBe(
true,
);
});
});


@@ -0,0 +1,157 @@
/**
* @license
* Copyright 2025 Google LLC
* SPDX-License-Identifier: Apache-2.0
*/
/**
* Generated / vendored file detection for attribution exclusion.
* Based on GitHub Linguist vendored patterns and common generated file patterns.
*/
import { basename, extname, posix, sep } from 'node:path';
// Exact file name matches (case-insensitive)
const EXCLUDED_FILENAMES = new Set([
'package-lock.json',
'yarn.lock',
'pnpm-lock.yaml',
'bun.lockb',
'bun.lock',
'composer.lock',
'gemfile.lock',
'cargo.lock',
'poetry.lock',
'pipfile.lock',
'shrinkwrap.json',
'npm-shrinkwrap.json',
// Terraform: provider lockfile generated by `terraform init`. In
// Terraform repos this file often dominates a commit's churn and
// would skew per-file AI ratios toward provider noise rather than
// the human-authored Terraform sources the note is meant to
// describe.
'.terraform.lock.hcl',
]);
// File extension patterns (case-insensitive). Note: `.d.ts` is NOT
// listed here — `.d.ts` files are commonly authored by hand
// (declaration files for projects without TS sources, ambient module
// declarations, asset shims like `*.d.ts` for `import './x.svg'`),
// and treating every one as generated would silently drop AI edits
// to those files. Auto-generated `.d.ts` (e.g. `tsc --declaration`
// output) tends to live under `/dist/`, `/build/`, or `/out/`,
// which are already covered by `EXCLUDED_DIRECTORY_SEGMENTS`.
const EXCLUDED_EXTENSIONS = new Set([
'.min.js',
'.min.css',
'.min.html',
'.bundle.js',
'.bundle.css',
'.generated.ts',
'.generated.js',
]);
// Directory segments that indicate generated/vendored content. Compared
// against path segments (split on `/`) rather than substrings, so a
// project dir named `my-dist` or `xbuild` doesn't get caught by a
// `/dist/` substring match and silently drop AI attribution.
const EXCLUDED_DIRECTORY_SEGMENTS = new Set([
'dist',
'build',
'out',
'output',
'node_modules',
'vendor',
'vendored',
'third_party',
'third-party',
'external',
'.next',
'.nuxt',
'.svelte-kit',
'coverage',
'__pycache__',
'.tox',
'venv',
'.venv',
]);
// Multi-segment directory patterns that need contiguous matches
// (e.g. `target/release` and `target/debug` for Rust — `target` alone
// is too noisy as it's a common app name too).
const EXCLUDED_DIRECTORY_PATH_SUFFIXES = ['target/release', 'target/debug'];
// Filename patterns using regex for more complex matching
const EXCLUDED_FILENAME_PATTERNS = [
/^.*\.min\.[a-z]+$/i,
/^.*-min\.[a-z]+$/i,
/^.*\.bundle\.[a-z]+$/i,
/^.*\.generated\.[a-z]+$/i,
/^.*\.gen\.[a-z]+$/i,
/^.*\.auto\.[a-z]+$/i,
/^.*_generated\.[a-z]+$/i,
/^.*_gen\.[a-z]+$/i,
/^.*\.pb\.(go|js|ts|py|rb)$/i,
/^.*_pb2?\.py$/i,
/^.*\.pb\.h$/i,
/^.*\.grpc\.[a-z]+$/i,
/^.*\.swagger\.[a-z]+$/i,
/^.*\.openapi\.[a-z]+$/i,
];
/**
* Check if a file should be excluded from attribution based on Linguist-style rules.
*
* @param filePath - Relative file path from repository root
* @returns true if the file should be excluded from attribution
*/
export function isGeneratedFile(filePath: string): boolean {
const normalizedPath =
posix.sep + filePath.split(sep).join(posix.sep).replace(/^\/+/, '');
const fileName = basename(filePath).toLowerCase();
const ext = extname(filePath).toLowerCase();
if (EXCLUDED_FILENAMES.has(fileName)) {
return true;
}
if (EXCLUDED_EXTENSIONS.has(ext)) {
return true;
}
// Check for compound extensions like .min.js
const parts = fileName.split('.');
if (parts.length > 2) {
const compoundExt = '.' + parts.slice(-2).join('.');
if (EXCLUDED_EXTENSIONS.has(compoundExt)) {
return true;
}
}
const normalizedPathLower = normalizedPath.toLowerCase();
// Segment-boundary check: split on `/` and test each segment against
// EXCLUDED_DIRECTORY_SEGMENTS so `/repo/my-dist/file.ts` (a literal
// dir name) doesn't get caught by the `dist` rule the way a naïve
// `.includes('/dist/')` substring match could appear to suggest.
const segments = normalizedPathLower.split('/').filter(Boolean);
// The last segment is the filename — directory rules only apply to
// intermediate path components.
for (const seg of segments.slice(0, -1)) {
if (EXCLUDED_DIRECTORY_SEGMENTS.has(seg)) {
return true;
}
}
for (const suffix of EXCLUDED_DIRECTORY_PATH_SUFFIXES) {
if (normalizedPathLower.includes(`/${suffix}/`)) {
return true;
}
}
for (const pattern of EXCLUDED_FILENAME_PATTERNS) {
if (pattern.test(fileName)) {
return true;
}
}
return false;
}
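The segment-boundary rule can be sketched minimally. This toy check is not the full `isGeneratedFile` (it skips filename and extension rules) and uses a made-up directory set, but it shows why splitting on `/` beats substring matching:

```typescript
// Toy version of the directory rule: only whole path segments match,
// so `my-dist` is not caught by the `dist` entry.
const generatedDirs = new Set(['dist', 'build', 'node_modules']);

function inGeneratedDir(relPath: string): boolean {
  const segments = relPath.toLowerCase().split('/').filter(Boolean);
  // Directory rules apply to intermediate components only; the last
  // segment is the filename.
  return segments.slice(0, -1).some((seg) => generatedDirs.has(seg));
}

const caught = inGeneratedDir('dist/bundle.js'); // → true
const spared = inGeneratedDir('my-dist/file.ts'); // → false
const fileNamedDist = inGeneratedDir('src/dist.ts'); // → false
```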


@@ -140,15 +140,10 @@ describe('createApprovalModeOverride bound-tool isolation', () => {
expect(parent.getApprovalMode()).toBe(ApprovalMode.DEFAULT);
const child = await createApprovalModeOverride(parent, ApprovalMode.YOLO);
expect(child.getApprovalMode()).toBe(ApprovalMode.YOLO);
const childEdit = await child.getToolRegistry().ensureTool(ToolNames.EDIT);
// eslint-disable-next-line @typescript-eslint/no-explicit-any
const boundConfig = (childEdit as any).config as Config;
expect(boundConfig.getApprovalMode()).toBe(ApprovalMode.YOLO);

@@ -30,6 +30,7 @@ import { ApprovalMode } from '../config/config.js';
import { createMockWorkspaceContext } from '../test-utils/mockWorkspaceContext.js';
import { FileReadCache } from '../services/fileReadCache.js';
import { StandardFileSystemService } from '../services/fileSystemService.js';
import { CommitAttributionService } from '../services/commitAttribution.js';
describe('EditTool', () => {
let tool: EditTool;
@@ -500,6 +501,60 @@ describe('EditTool', () => {
expect(display.fileName).toBe(testFile);
});
// The Edit tool feeds the commit-attribution singleton on success so
// commit notes can later report per-file AI/human ratios. Service-level
// tests for `recordEdit` already exist; these guard against the wiring
// at the tool boundary regressing (e.g. someone moving the call out of
// the success path).
describe('commit-attribution wiring', () => {
beforeEach(() => {
CommitAttributionService.resetInstance();
});
it('records AI-originated edits in the attribution service', async () => {
const initial = 'old line';
const updated = 'new line';
fs.writeFileSync(filePath, initial, 'utf8');
// Prior-read enforcement (origin/main #3774) requires the file
// to have been Read before Edit can mutate it.
seedPriorRead(filePath);
const invocation = tool.build({
file_path: filePath,
old_string: 'old',
new_string: 'new',
});
await invocation.execute(new AbortController().signal);
const attribution =
CommitAttributionService.getInstance().getFileAttribution(filePath);
expect(attribution).toBeDefined();
// The exact char count is an implementation detail of
// computeCharContribution; we only assert that the entry exists
// with a positive contribution.
expect(attribution!.aiContribution).toBeGreaterThan(0);
// Length sanity: contribution is bounded by the new content.
expect(attribution!.aiContribution).toBeLessThanOrEqual(updated.length);
});
it('skips attribution when the edit is modified_by_user', async () => {
fs.writeFileSync(filePath, 'old line', 'utf8');
seedPriorRead(filePath);
const invocation = tool.build({
file_path: filePath,
old_string: 'old',
new_string: 'new',
modified_by_user: true,
});
await invocation.execute(new AbortController().signal);
expect(
CommitAttributionService.getInstance().getFileAttribution(filePath),
).toBeUndefined();
});
});
it('should create a new file if old_string is empty and file does not exist, and return created message', async () => {
const newFileName = 'brand_new_file.txt';
const newFilePath = path.join(rootDir, newFileName);

@@ -45,6 +45,7 @@ import type {
ModifiableDeclarativeTool,
ModifyContext,
} from './modifiable-tool.js';
import { CommitAttributionService } from '../services/commitAttribution.js';
import { safeLiteralReplace } from '../utils/textUtils.js';
import {
countOccurrences,
@@ -568,6 +569,15 @@ class EditToolInvocation implements ToolInvocation<EditToolParams, ToolResult> {
});
}
// Track AI contribution for commit attribution
if (!this.params.modified_by_user) {
CommitAttributionService.getInstance().recordEdit(
this.params.file_path,
editData.currentContent,
editData.newContent,
);
}
// Mark the cache entry written, capturing the post-write stats
// so a follow-up Read sees `lastReadAt < lastWriteAt` and falls
// through to the full pipeline instead of returning the
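The prefix/suffix character diff this recordEdit call feeds (described in the PR as counting "Esc" → "esc" as 1 char, not a whole line) can be sketched as follows. This is a sketch of the idea only; the real computeCharContribution may differ in detail:

```typescript
// Strip the common prefix and common suffix, then count what remains
// of the new text as the contribution of this edit.
function charContribution(oldText: string, newText: string): number {
  const limit = Math.min(oldText.length, newText.length);
  let prefix = 0;
  while (prefix < limit && oldText[prefix] === newText[prefix]) {
    prefix++;
  }
  let suffix = 0;
  while (
    suffix < limit - prefix &&
    oldText[oldText.length - 1 - suffix] === newText[newText.length - 1 - suffix]
  ) {
    suffix++;
  }
  return Math.max(0, newText.length - prefix - suffix);
}
```

`charContribution('Esc', 'esc')` is 1 (only the case of the first character changed), where a line-based diff would have charged the whole line.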

File diff suppressed because it is too large.

File diff suppressed because it is too large.

@@ -28,6 +28,7 @@ import { GeminiClient } from '../core/client.js';
import { createMockWorkspaceContext } from '../test-utils/mockWorkspaceContext.js';
import { FileReadCache } from '../services/fileReadCache.js';
import { StandardFileSystemService } from '../services/fileSystemService.js';
import { CommitAttributionService } from '../services/commitAttribution.js';
const rootDir = path.resolve(os.tmpdir(), 'qwen-code-test-root');
@@ -826,6 +827,79 @@ describe('WriteFileTool', () => {
});
});
// Same as edit.test's wiring guard: the WriteFileTool feeds the
// commit-attribution singleton on success. The recordEdit call
// distinguishes a true file creation (`null` old content) from
// overwriting an existing empty file (`''` old content); these
// tests pin both shapes so the distinction can't drift silently.
describe('commit-attribution wiring', () => {
const abortSignal = new AbortController().signal;
beforeEach(() => {
CommitAttributionService.resetInstance();
});
it('records AI-originated writes in the attribution service', async () => {
const filePath = path.join(rootDir, 'attr_write.txt');
const invocation = tool.build({
file_path: filePath,
content: 'fresh content',
});
await invocation.execute(abortSignal);
const attribution =
CommitAttributionService.getInstance().getFileAttribution(filePath);
expect(attribution).toBeDefined();
expect(attribution!.aiContribution).toBeGreaterThan(0);
// A truly new file should be flagged so deletions later in the
// session can be reconciled.
expect(attribution!.aiCreated).toBe(true);
fs.unlinkSync(filePath);
});
it('skips attribution when modified_by_user', async () => {
const filePath = path.join(rootDir, 'attr_skip.txt');
const invocation = tool.build({
file_path: filePath,
content: 'human-edited',
modified_by_user: true,
});
await invocation.execute(abortSignal);
expect(
CommitAttributionService.getInstance().getFileAttribution(filePath),
).toBeUndefined();
fs.unlinkSync(filePath);
});
it('marks aiCreated=false when overwriting an existing empty file', async () => {
const filePath = path.join(rootDir, 'attr_existing_empty.txt');
// Create an empty file first — the distinction we're guarding
// is that overwriting an empty existing file should NOT be
// counted as a creation, even though both old contents are
// length-0.
fs.writeFileSync(filePath, '', 'utf8');
// Prior-read enforcement (origin/main #3774) requires the file
// to have been Read before WriteFile can overwrite it.
seedPriorRead(filePath);
const invocation = tool.build({
file_path: filePath,
content: 'overwrite content',
});
await invocation.execute(abortSignal);
const attribution =
CommitAttributionService.getInstance().getFileAttribution(filePath);
expect(attribution).toBeDefined();
expect(attribution!.aiCreated).toBe(false);
fs.unlinkSync(filePath);
});
});
describe('prior-read enforcement', () => {
const abortSignal = new AbortController().signal;

@@ -49,6 +49,7 @@ import {
fileExists as isFilefileExists,
} from '../utils/fileUtils.js';
import { getLanguageFromFilePath } from '../utils/language-detection.js';
import { CommitAttributionService } from '../services/commitAttribution.js';
import { createDebugLogger } from '../utils/debugLogger.js';
const debugLogger = createDebugLogger('WRITE_FILE');
@@ -426,6 +427,17 @@ class WriteFileToolInvocation extends BaseToolInvocation<
},
});
// Track AI contribution for commit attribution.
// Pass null only when the file truly did not exist before this write;
// an empty string means the file existed but was empty.
if (!modified_by_user) {
CommitAttributionService.getInstance().recordEdit(
file_path,
fileExists ? originalContent : null,
content,
);
}
// Mark the cache entry written, capturing the post-write stats
// so a follow-up Read sees `lastReadAt < lastWriteAt` and falls
// through to the full pipeline instead of returning the
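The null-versus-empty distinction this hunk relies on can be sketched with a hypothetical, simplified service (not the actual CommitAttributionService internals):

```typescript
interface FileAttribution {
  aiContribution: number;
  aiCreated: boolean;
}

const attributions = new Map<string, FileAttribution>();

// `oldContent === null` means the file did not exist before the write;
// '' means it existed but was empty, so the write is NOT a creation.
function recordEdit(
  filePath: string,
  oldContent: string | null,
  newContent: string,
): void {
  const prev = attributions.get(filePath);
  attributions.set(filePath, {
    // Simplified: the real service computes a character-level diff.
    aiContribution: (prev?.aiContribution ?? 0) + newContent.length,
    aiCreated: prev?.aiCreated ?? oldContent === null,
  });
}
```

This is why the tool passes `fileExists ? originalContent : null` rather than defaulting missing content to an empty string.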

@@ -272,7 +272,10 @@ function isTimeoutError(error) {
);
}
async function getReleaseState({ packageVersion, releaseVersion }, allVersions) {
async function getReleaseState(
{ packageVersion, releaseVersion },
allVersions,
) {
const state = {
packageVersionExistsOnPyPI: allVersions.includes(packageVersion),
gitTagExists: false,

@@ -66,9 +66,31 @@
"default": 5
},
"gitCoAuthor": {
"description": "Automatically add a Co-authored-by trailer to git commit messages when commits are made through Qwen Code.",
"type": "boolean",
"default": true
"description": "Attribution added to git commits and pull requests created through Qwen Code.",
"default": {
"commit": true,
"pr": true
},
"anyOf": [
{
"type": "boolean"
},
{
"type": "object",
"properties": {
"commit": {
"description": "Add a Co-authored-by trailer to git commit messages AND attach a per-file AI-attribution git note (`refs/notes/ai-attribution`) for commits made through Qwen Code. Disabling skips both.",
"type": "boolean",
"default": true
},
"pr": {
"description": "Append a Qwen Code attribution line to PR descriptions when running `gh pr create`.",
"type": "boolean",
"default": true
}
}
}
]
},
"checkpointing": {
"description": "Session checkpointing settings.",
@@ -2129,7 +2151,7 @@
"$version": {
"type": "number",
"description": "Settings schema version for migration tracking.",
"default": 3
"default": 4
}
},
"additionalProperties": true
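With the widened `gitCoAuthor` schema above, either shape validates. A minimal settings.json fragment using the new object form (values illustrative):

```json
{
  "gitCoAuthor": {
    "commit": true,
    "pr": false
  }
}
```

The legacy `"gitCoAuthor": false` remains valid through the boolean branch until migration rewrites it.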

@@ -25,6 +25,7 @@ import type {
SettingsSchema,
} from '../packages/cli/src/config/settingsSchema.js';
import { getSettingsSchema } from '../packages/cli/src/config/settingsSchema.js';
import { SETTINGS_VERSION } from '../packages/cli/src/config/settings.js';
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
@@ -161,7 +162,7 @@ function convertSettingToJsonSchema(
break;
}
// Add default value for simple types only
// Add default value for simple and object types
if (setting.default !== undefined && setting.default !== null) {
const defaultVal = setting.default;
if (
@@ -172,9 +173,39 @@
schema.default = defaultVal;
} else if (Array.isArray(defaultVal) && defaultVal.length > 0) {
schema.default = defaultVal;
} else if (
typeof defaultVal === 'object' &&
!Array.isArray(defaultVal) &&
Object.keys(defaultVal).length > 0
) {
// Non-empty plain object — publish so IDE editors can surface the
// default value (e.g. `{commit: true, pr: true}` for gitCoAuthor).
schema.default = defaultVal;
}
}
// If the field accepts a legacy primitive shape (e.g. a boolean that was
// later expanded into an object), wrap with `anyOf` so existing values
// in users' settings.json don't trip the IDE schema validator while
// they wait for our migration to rewrite them on the next launch.
//
// Lift `description` and `default` to the outer (anyOf) level so IDE
// editors that surface schema-driven defaults / descriptions still see
// them — burying these behind `anyOf[N]` makes most validators ignore
// the `default`, which loses the "enabled by default" hint for any
// setting using `legacyTypes`.
if (setting.legacyTypes && setting.legacyTypes.length > 0) {
const description = schema.description;
const defaultVal = schema.default;
delete schema.description;
delete schema.default;
return {
...(description ? { description } : {}),
...(defaultVal !== undefined ? { default: defaultVal } : {}),
anyOf: [...setting.legacyTypes.map((t) => ({ type: t })), schema],
};
}
return schema;
}
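The lift-and-wrap step reduces to a small transform. A sketch with a hypothetical `wrapLegacy` helper mirroring (but not identical to) the generator code above:

```typescript
interface Schema {
  description?: string;
  default?: unknown;
  [key: string]: unknown;
}

// Lift description/default to the anyOf wrapper so schema-aware editors
// still surface them; keep the detailed schema as the final branch.
function wrapLegacy(schema: Schema, legacyTypes: string[]): Schema {
  const { description, default: def, ...rest } = schema;
  return {
    ...(description !== undefined ? { description } : {}),
    ...(def !== undefined ? { default: def } : {}),
    anyOf: [...legacyTypes.map((type) => ({ type })), rest],
  };
}
```

For `gitCoAuthor` this produces a top-level `description` and `default` followed by `anyOf: [{ "type": "boolean" }, <object schema>]`, which is the shape published in the settings schema above.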
@@ -195,11 +226,12 @@ function generateJsonSchema(
);
}
// Add $version property
// Add $version property — sourced from settings.ts so a SETTINGS_VERSION
// bump propagates here instead of needing a parallel manual edit.
jsonSchema.properties!['$version'] = {
type: 'number',
description: 'Settings schema version for migration tracking.',
default: 3,
default: SETTINGS_VERSION,
};
return jsonSchema;