
DAEMON Tools Trojan: A Build Pipeline Breach

On May 5, 2025, Disc Soft Limited confirmed that its DAEMON Tools Lite software had been compromised at the source, after security researchers at Kaspersky detected the breach. Version 12.5.1, distributed since April 8, contained malware. Users in over 100 countries downloaded and installed what they believed was a legitimate update. The company released a clean version 12.6 the same day it confirmed the breach.

This wasn't a vulnerability in the software. This was the software itself turned into a delivery mechanism.

Timeline

April 8: Trojanized DAEMON Tools Lite version 12.5.1 begins distribution through official channels.

Late April to early May: Kaspersky researchers detect anomalous behavior in DAEMON Tools installations and begin analysis.

May 5: Disc Soft confirms the breach publicly and releases malware-free version 12.6.

Ongoing: Investigation into the initial compromise vector continues.

The 27-day window between initial distribution and public confirmation represents the critical period when users were installing compromised software from what appeared to be a trusted source.

Which Controls Failed

The attack succeeded because multiple defensive layers either didn't exist or didn't function:

Build environment isolation failed. Attackers gained access to inject malicious code into the build pipeline, indicating inadequate network segmentation between development, build, and production systems.

Code signing didn't prevent distribution. The trojanized version was signed with valid certificates, suggesting either compromised signing keys or that signing occurred after malicious code injection. Your code signing process should be the final step before distribution, not an automated step in the middle of the pipeline.

Integrity verification was absent or bypassed. There's no indication that automated hash verification or reproducible build checks were in place. If they existed, they didn't catch the modification.

Runtime monitoring took weeks to surface the issue. The breach was detected by an external security firm, not internal controls. This points to insufficient telemetry from distributed software and no behavioral analysis of released versions.

What Standards Require

NIST 800-53 Rev 5 addresses supply chain risk directly:

  • SR-3 (Supply Chain Controls and Processes) requires organizations to establish security requirements for developers and implement controls to ensure acquired systems meet security specifications. For software publishers, this means your build environment must meet the same security bar as your production environment.

  • SA-10 (Developer Configuration Management) mandates configuration management and control throughout the system development life cycle. Your build artifacts should be immutable and traceable to specific source commits.

ISO/IEC 27001:2022 Annex A Control 8.31 requires that development, test, and production environments be separated, logically or physically, to reduce the risk of unauthorized access or changes. If your build servers share network segments with general development workstations, you're not compliant.

PCI DSS v4.0.1 Requirement 6.2.1 requires that bespoke and custom software be developed securely, based on industry standards and incorporating information security throughout the software development life cycle. For organizations that distribute software to payment environments, this includes securing your own build pipeline.

SOC 2 Type II Common Criteria CC6.6 addresses logical and physical access controls for system components. Your auditor will want to see that build systems have restricted access, separate credentials, and audit logging.

Lessons and Action Items

Implement build environment isolation today. Your build servers should be on a separate network segment with strict firewall rules. Developers shouldn't have direct access to build infrastructure. All code moves through version control, and builds are triggered by merge events, not manual logins.
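The "builds are triggered by merge events, not manual logins" rule can be enforced with a small guard at the top of the release build. The sketch below assumes GitHub Actions-style environment variables (`CI`, `GITHUB_EVENT_NAME`); the variable names and allowed events are illustrative and would need adapting to your CI system.

```python
import os


def assert_ci_triggered(env: dict) -> None:
    """Refuse to run a release build unless it was triggered by a merge
    event inside CI, rather than a manual login or a local invocation.

    Assumes GitHub Actions-style environment variables; adapt the names
    and the allowed event list for your CI system.
    """
    if env.get("CI") != "true":
        raise RuntimeError("release builds must run in CI, not on a workstation")
    event = env.get("GITHUB_EVENT_NAME")
    if event not in ("push",):  # e.g. a merge pushed to the release branch
        raise RuntimeError(f"release builds must be merge-triggered, not {event!r}")


if __name__ == "__main__":
    assert_ci_triggered(dict(os.environ))  # fail fast before any build work
```

Running this as the first step of the release job means a developer who logs into a build runner and invokes the pipeline by hand gets an immediate, audited failure instead of a signed artifact.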

Separate signing keys from build processes. Code signing should happen in an air-gapped ceremony or through a hardware security module (HSM) with multi-party authorization. The build pipeline produces unsigned artifacts, and signing is a separate, audited step. If your CI/CD pipeline has direct access to signing keys, you're one compromised runner away from this same scenario.
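The build-then-sign separation can be sketched as two functions that never run on the same machine: the build stage records a digest of the unsigned artifact, and the signing step refuses to sign anything that no longer matches it. `hsm_sign` here is a hypothetical placeholder for your HSM's signing call; the point is that the pipeline hands over bytes and a digest, never a key.

```python
import hashlib


def record_artifact(artifact: bytes) -> str:
    """Build stage: emit the UNSIGNED artifact plus its SHA-256 digest.
    The digest travels to the signing step through an audited channel."""
    return hashlib.sha256(artifact).hexdigest()


def signing_step(artifact: bytes, expected_digest: str, hsm_sign) -> bytes:
    """Separate, audited signing step (an HSM ceremony with multi-party
    authorization). `hsm_sign` stands in for the HSM's signing call; the
    build pipeline never holds the key. Refuses to sign if the artifact
    was modified after the build stage recorded it."""
    if hashlib.sha256(artifact).hexdigest() != expected_digest:
        raise ValueError("artifact changed after the build stage; refusing to sign")
    return hsm_sign(artifact)
```

With this split, an attacker who compromises a CI runner can corrupt an artifact, but the mismatch surfaces at the signing ceremony instead of shipping under a valid certificate.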

Generate and publish build attestations. Use frameworks like SLSA (Supply Chain Levels for Software Artifacts) to create verifiable metadata about your build process. At minimum, publish cryptographic hashes of your releases through an independent channel. Users should be able to verify that the file they downloaded matches the hash you published before installation.
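The user-side check described above is a few lines with the standard library: stream the downloaded file and compare its SHA-256 against the hash the publisher posted through an independent channel.

```python
import hashlib


def verify_download(path: str, published_sha256: str) -> bool:
    """Compare a downloaded file against the SHA-256 the publisher
    released through an independent channel (HTTPS download page,
    signed release notes, a transparency log, etc.)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):  # stream large installers
            h.update(chunk)
    return h.hexdigest() == published_sha256.lower()
```

Install only when this returns True; a mismatch means the file is not the one the publisher attested to, whether from tampering or a corrupted download.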

Instrument your released software. You need telemetry from deployed instances that can surface anomalous behavior. This doesn't mean invasive tracking—it means your software should report version information, crash data, and behavioral patterns that deviate from expected operation. Kaspersky found this breach because they analyze software behavior at scale. You should find your own breaches first.
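A minimal, privacy-conscious health report might look like the sketch below: version, a hash of the running binary (so the publisher can spot instances executing unexpected code), and coarse anomaly counters, with no user data. The field names and the `anomaly:` event convention are illustrative, not a standard schema.

```python
import hashlib
import platform


def telemetry_record(version: str, binary: bytes, events: list) -> dict:
    """Build a minimal health report for a deployed instance: version,
    a digest of its own binary, the host OS, and only those events
    flagged as anomalous. Field names are illustrative assumptions."""
    return {
        "version": version,
        "binary_sha256": hashlib.sha256(binary).hexdigest(),
        "os": platform.system(),
        "unexpected_events": [e for e in events if e.startswith("anomaly:")],
    }
```

Aggregated server-side, `binary_sha256` values that don't match any hash you ever shipped are exactly the signal that would have surfaced a trojanized 12.5.1 from your own telemetry instead of a third party's.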

Test your incident response plan for supply chain scenarios. Disc Soft released a clean version within hours of confirmation, which is commendable. But they had a 27-day window where users were installing malware. Your incident response plan should include procedures for emergency software recalls, user notification at scale, and coordinated disclosure with detection partners. Have you ever practiced recalling a software release? Do you have communication templates ready?

Review your dependency verification process. If you're consuming third-party libraries (and you are), you're downstream from someone else's build pipeline. Pin your dependencies to specific hashes, not just version numbers. Use tools like Sigstore to verify signatures on open-source components. The DAEMON Tools breach happened to a software publisher, but the same attack vector exists for any dependency you pull into your builds.
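Pinning to hashes rather than version numbers can be sketched as a lockfile check run before any dependency enters the build. The `PINNED` table and the package name below are hypothetical; real tooling (pip's `--require-hashes`, Go's `go.sum`, cargo's `Cargo.lock`) does the same comparison.

```python
import hashlib

# Hypothetical lockfile: package name -> pinned SHA-256 of its archive.
# The hash shown is simply the digest of the sample archive used below.
PINNED = {
    "libdisc": "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def verify_dependency(name: str, archive: bytes) -> None:
    """Reject a fetched dependency whose archive doesn't match its pinned
    hash -- a version number alone can be re-pointed upstream."""
    expected = PINNED.get(name)
    if expected is None:
        raise KeyError(f"{name} is not in the lockfile; refusing to build with it")
    if hashlib.sha256(archive).hexdigest() != expected:
        raise ValueError(f"{name}: archive hash mismatch; possible tampering")
```

A trojanized upstream release then fails your build with a hash mismatch instead of silently flowing into your artifacts.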

The DAEMON Tools incident demonstrates that software publishers are high-value targets. Your build pipeline is infrastructure, and it needs infrastructure-grade security. Network segmentation, access controls, audit logging, and behavioral monitoring aren't optional for systems that produce code your users will execute.

Start with your signing keys. Where are they stored? Who can access them? How would you know if they were used without authorization? If you can't answer those questions definitively, that's your first action item.
