A malware family called Glassworm has returned to the Microsoft Visual Studio Marketplace and the OpenVSX Registry with a new wave of malicious extensions after the original batch was identified and removed in September 2025. The reappearance was not sophisticated. The group simply published new extensions under different names. That the approach worked at all reveals something important about the structural vulnerability these developer tool ecosystems have not yet addressed.
The September incident followed a pattern that security researchers recognized immediately. At least 24 fake extensions, designed to look like legitimate productivity tools and theme packs, were distributing Lumma Stealer across Windows machines. Developers who installed them got exactly what the extension appeared to offer, while malicious code activated quietly in the background and began its actual work. The extensions were identified, removed, and the incident was treated as contained.
Glassworm’s response to that containment was to publish new extensions. The underlying access to the marketplace that made the first wave possible remained intact, and the verification processes that should have made reentry difficult apparently did not. For businesses whose development teams rely on these extension ecosystems as a routine part of their workflow, the reappearance is not just a security headline. It is a direct operational concern that demands a specific response.
Why This Attack Category Is Particularly Dangerous
Supply chain attacks work by targeting the tools and infrastructure that developers trust rather than attempting to breach business systems directly. The logic is straightforward from an attacker’s perspective. A business with strong perimeter security, well-configured endpoint protection, and security-aware employees is difficult to compromise through conventional approaches. But if those same security-aware employees routinely install extensions from a marketplace they consider trustworthy, the extension ecosystem becomes an attack surface that bypasses most of the defenses the organization has invested in building.
The VS Code extension marketplaces fit this profile precisely. Developers use them constantly, treat them as legitimate infrastructure rather than external risk surfaces, and make installation decisions quickly because extensions are designed to be low-friction productivity tools. The entire user experience is optimized for easy adoption, which is exactly the characteristic that makes the ecosystem valuable to attackers.
Lumma Stealer, the malware Glassworm distributed through the first wave of fake extensions, is classified as an information stealer. Once active on a machine, it targets credentials, session tokens, cryptocurrency wallet data, and browser-stored information. In a development environment, the consequences extend beyond the individual machine. Compromised developer credentials can provide access to source code repositories, cloud infrastructure, deployment pipelines, and internal systems that represent the core intellectual and operational assets of the business. The extension is the entry point. What follows depends on what the infected machine can reach.
What Made the Malicious Extensions Difficult to Distinguish
The extensions Glassworm published were designed to look credible at a glance. Productivity boosters and theme packs are among the most commonly installed extension categories, familiar enough that developers download them without extensive evaluation. The malicious versions mimicked this presentation while carrying code that activated covertly on installation and ran alongside the advertised functionality.
Several characteristics distinguished them from legitimate extensions, though none of these signals is individually definitive. Low install counts, typically fewer than 500, indicated extensions that had not been adopted by the broader developer community. Recent creation dates, within the prior 60 days, combined with low adoption, suggested extensions that had not yet built a track record. Publisher names that appeared randomly generated or that closely mimicked established publishers without matching them exactly were present in multiple cases. Permission requests that exceeded what the stated functionality would require, particularly broad file system access, were another indicator.
The challenge is that legitimate new extensions from credible publishers also have low install counts when first published. The signals are probabilistic rather than definitive, which means that mechanical rule application misses some malicious extensions while flagging some legitimate ones. What the signals collectively support is a posture of elevated scrutiny toward unfamiliar extensions rather than a simple checklist that resolves the question automatically.
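The probabilistic nature of these signals can be made concrete in a small triage sketch. The thresholds (500 installs, 60 days) come from the indicators described above; the metadata field names, permission labels, and the shape of the input record are illustrative assumptions, not part of any marketplace API.

```python
from datetime import date

def risk_signals(ext: dict, today: date) -> list[str]:
    """Return the heuristic signals an extension trips.

    Field names in `ext` are assumptions for this sketch; the thresholds
    follow the indicators above. Tripping signals means "review before
    trusting", never "definitely malicious".
    """
    signals = []
    if ext.get("install_count", 0) < 500:
        signals.append("low adoption (<500 installs)")
    published = ext.get("published")  # a datetime.date, if known
    if published and (today - published).days < 60:
        signals.append("recently published (<60 days)")
    # Hypothetical permission labels standing in for "broader access
    # than the stated functionality would require".
    broad = {"filesystem", "network"}
    if broad & set(ext.get("permissions", [])):
        signals.append("permissions exceed stated purpose")
    return signals

# Example: a theme pack with few installs, a recent publication date,
# and file system access a theme should not need.
ext = {
    "id": "randompub123.cool-theme",   # hypothetical extension
    "install_count": 120,
    "published": date(2025, 10, 1),
    "permissions": ["filesystem"],
}
print(risk_signals(ext, today=date(2025, 11, 1)))
```

An extension tripping several signals at once is a candidate for human review, not automatic removal; as the article notes, a legitimate extension from a credible new publisher will trip the first two signals as well.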
The Immediate Response for Organizations With Development Teams
The persistence of Glassworm after the September removals means that organizations cannot treat the threat as resolved simply because the original extensions were taken down. The appropriate immediate response has several components that address both ongoing exposure and the possibility that malicious extensions have already been installed.
Freezing new extension installations while the threat is active gives security teams time to assess the current state without the risk that additional malicious extensions enter the environment during the evaluation period. Developers who need new extensions during this period should route those requests through a review process rather than installing them independently. This is a temporary friction that the threat environment justifies.
Auditing installed extensions across every company machine, with particular attention to anything added since September from sources outside a trusted publisher list, identifies potential existing compromises. Extensions that cannot be verified as legitimate should be removed, and the machines they were installed on should be examined for indicators of infostealer activity, including unusual outbound network connections and unexpected credential usage.
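A first pass at such an audit can be scripted. VS Code's CLI lists installed extensions as `publisher.name` identifiers via `code --list-extensions`, and those identifiers can be diffed against a trusted-publisher list. The list would normally be gathered per machine; it is inlined below for illustration, and the trusted-publisher set is a placeholder, not a recommendation.

```python
# Placeholder trusted-publisher set: build yours from the publishers
# your teams actually rely on.
TRUSTED_PUBLISHERS = {"ms-python", "ms-vscode", "esbenp"}

def unapproved(installed: list[str]) -> list[str]:
    """Return extension IDs whose publisher is not on the trusted list.

    `installed` holds `publisher.name` identifiers, the format emitted
    by `code --list-extensions`.
    """
    flagged = []
    for ext_id in installed:
        publisher = ext_id.split(".", 1)[0].lower()
        if publisher not in TRUSTED_PUBLISHERS:
            flagged.append(ext_id)
    return flagged

# In practice this list would come from running the CLI on each machine.
installed = [
    "ms-python.python",
    "esbenp.prettier-vscode",
    "randompub123.super-productivity-pack",  # hypothetical suspect
]
print(unapproved(installed))
```

Anything this flags still needs human judgment, since an unfamiliar publisher is not proof of compromise, but the script turns "audit every machine" from an open-ended task into a short review queue.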
Enabling Microsoft’s built-in extension verification, which is disabled by default for many users, adds a baseline layer of screening that filters out extensions that have not met minimum verification criteria. This control does not eliminate the risk, but it raises the cost of successful distribution for attackers who cannot meet verification requirements.
Endpoint protection that includes infostealer detection provides a backstop for cases where a malicious extension reaches a machine despite other controls. Detection that is current enough to recognize Lumma Stealer variants is the relevant capability requirement, and verifying that endpoint protection meets this standard is worth doing explicitly rather than assuming.
Building Extension Security Into Development Policy
The Glassworm situation illustrates a category of risk that many organizations have not yet incorporated into their security policy frameworks. Policies governing software installation, approved vendor lists, and external code dependencies exist in most security-conscious organizations. Extension ecosystems for development tools often fall into a gap between these policies, treated as lighter-weight than full software installations and therefore subject to less scrutiny.
Closing that gap means treating VS Code extensions, and the extension ecosystems of other development tools, as components of the software supply chain that carry the same risk profile as other third-party code. The same questions that apply to open-source package dependencies apply here. Who published this? What does it actually do? What permissions does it require and why? Has it been reviewed by others in the developer community over a sufficient period to establish a track record?
Establishing an approved extension list for the tools development teams use most commonly gives developers a clear path to installing what they need without the friction of case-by-case review for every extension. Extensions that appear on the approved list have been evaluated and are considered safe for installation. Extensions outside the list require review before installation. This structure maintains developer productivity while closing the open-access pattern that makes extension ecosystems attractive attack surfaces.
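Recent VS Code releases can enforce such a list in configuration through an `extensions.allowed` setting, typically deployed through device management so individual developers cannot override it. The setting name and value format should be verified against the documentation for the VS Code version in use; the publishers and extension IDs below are placeholders.

```jsonc
// Sketch of an allowlist policy. Verify the setting against your
// VS Code version's enterprise documentation; entries are placeholders.
{
  "extensions.allowed": {
    // Allow everything from publishers the organization has vetted.
    "ms-python": true,
    "ms-vscode": true,
    // Allow one specific extension from an otherwise unreviewed publisher.
    "esbenp.prettier-vscode": true
  }
}
```

With a policy like this in place, extensions outside the approved set cannot be installed at all, which converts the review process from a request developers are asked to honor into a control the environment enforces.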
Periodic audits of installed extensions, rather than one-time reviews conducted only in response to incidents, create ongoing visibility into what is running in the development environment. Extensions that were legitimate when installed can be updated to include malicious code, and publishers can change hands in ways that alter the trustworthiness of previously safe extensions. Continuous visibility is more protective than point-in-time assessment.
The Structural Problem the Marketplaces Have Not Resolved
Glassworm’s return after the September removals points to a gap in how the VS Code extension marketplaces handle publisher verification and reentry after malicious activity has been identified. The group was able to republish under different names without encountering controls that made reentry prohibitively difficult. Until both Microsoft and OpenVSX implement stronger publisher verification processes, the ecosystem will remain susceptible to exactly this pattern.
This is worth understanding not as a criticism to be directed at the marketplace operators and then set aside, but as a structural condition that organizations need to account for in their own policies. The marketplaces are working on the problem. They are not yet solving it as quickly as Glassworm is exploiting it. In the interim, the defensive work happens at the organizational level, through the auditing, policy, and verification controls that individual businesses can implement without waiting for marketplace-level solutions to mature.
The VS Code extension ecosystem is valuable enough that abandoning it is not a realistic response to this threat. The developer productivity it enables is genuine, and the extensions distributed through it are overwhelmingly legitimate. The appropriate response is the same one that applies across supply chain risk generally. Treat the ecosystem as a risk surface that requires active management, apply scrutiny proportional to the potential consequences of a compromise, and build the policies and controls that make the organization’s exposure to that risk surface manageable rather than open-ended.
Glassworm proved that removal is not resolution. The organizations that internalize that lesson now are the ones that will not be learning it later through a more expensive experience.