Microsoft secretly inserts 'Co-Authored-by Copilot' into Git commits - even with AI features turned off
What it really says
In Visual Studio Code 1.118, Microsoft silently changed the default of the git.addAICoAuthor setting from 'off' to 'all'. The change, made in a pull request opened on April 15, 2026 and merged on April 16, causes a 'Co-Authored-by: Copilot' trailer to be appended automatically to Git commits. The problem: the trailer appeared even for developers who had explicitly disabled all AI features via the chat.disableAIFeatures setting. On GitHub, Issue #313064 quickly accumulated 372 thumbs-down reactions and 30 'confused' reactions versus just 2 thumbs-up. Criticism was intense: developers found their commits falsely labeled as AI-assisted despite never having used a single Copilot suggestion. The Microsoft developer responsible acknowledged the mistake and announced that the default would revert to 'off' in VS Code 1.119; the revert landed on May 3, 2026.
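For teams that want to pin this behavior regardless of what future releases ship as the default, both settings can be set explicitly in the user or workspace settings.json. A minimal sketch; the setting names and the values 'off'/'all' are taken from the report above, other possible values are not confirmed here:

```json
{
  // Never append a Copilot co-author trailer to Git commits,
  // even if a later VS Code release changes the default again.
  "git.addAICoAuthor": "off",

  // The opt-out the affected developers had already enabled.
  "chat.disableAIFeatures": true
}
```

VS Code's settings.json is JSONC, so the comments are valid as written.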
Our assessment
The concern is justified, but this is a fixable mistake, not a fundamental threat. What Microsoft did was a breach of trust: a setting that modifies developers' work output was activated without clear communication. Particularly problematic is that the change applied even with AI disabled - this undermines trust in opt-out mechanisms generally. The practical consequences are not trivial: in organizations that must separately review AI-generated code - for licensing, compliance, or liability reasons - false co-author labels create unnecessary review overhead and can lead to flawed decisions. In open-source projects where the provenance of every line of code matters, the labeling corrupts project history. On the positive side, Microsoft responded quickly and reverted the change. But the incident illustrates an industry-wide pattern: AI providers tend to enable AI features by default and shift the burden of opting out onto users.
Relevance for Germany
Relevant for Germany because Visual Studio Code is the most widely used code editor globally and is broadly deployed in German companies and public agencies. False labeling of code as AI-generated carries particular significance here: the EU AI Act requires transparency about where AI is used. When a tool falsely labels AI involvement, it creates a compliance problem in reverse - companies may need to prove their code was not AI-generated despite the labeling. Additionally, many German companies have internal policies for handling AI-generated code, especially in regulated industries like finance, healthcare, and public administration. False co-author labels can trigger unnecessary review overhead. The incident underscores the need to consciously configure AI tools in software development and not blindly trust default settings.
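Organizations that need to show their history is free of such labels can audit it with plain git. A minimal sketch: the demo sets up a throwaway repository containing one affected commit, then runs the actual audit query (the trailer text is the one reported in the GitHub issue):

```shell
set -e

# Throwaway demo repository with one commit carrying the trailer.
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "feat: demo" -m "Co-Authored-by: Copilot"

# The audit itself: list every commit whose message contains the
# trailer. Run against a real repository, an empty result means
# the history carries no such label.
git -C "$repo" log --all --grep='Co-Authored-by: Copilot' --format='%h %s'
```

The same `git log --grep` query works unchanged against any existing repository; only the setup lines above it belong to the demo.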
Fact check
Facts are well-documented through primary sources. Pull Request #310226 on GitHub shows the change from 'off' to 'all' with a merge date of April 16, 2026. GitHub Issue #313064 documents user reactions with the stated numbers (372 thumbs-down). Heise reported on May 1 under its 'WTF' label, The Decoder on May 2. The May 3 reversion is documented by WinBuzzer and the Windows Forum. That the labeling appeared even with disabled AI (chat.disableAIFeatures = true) is confirmed by multiple users in the GitHub issue and by The Decoder.
Source
- Heise, 01.05.2026 (heise.de/en/news/WTF-Microsoft-forces-Co-Authored-by-Copilot-in-commits-11279554.html)
- The Decoder, 02.05.2026 (the-decoder.com/co-pilot-becomes-a-co-author-in-vs-code-without-being-asked/)
- GitHub Issue #313064 (github.com/microsoft/vscode/issues/313064)
- GitHub PR #310226 (github.com/microsoft/vscode/pull/310226)
- WinBuzzer, 03.05.2026 (winbuzzer.com/2026/05/03/vs-code-1-118-copilot-co-author-default-commits-xcxwbn/)