Signing code with secure digital certificates has become supremely important for software development as supply-chain attacks mount and governments worldwide create new tech-safety regulations. This is especially true for Internet of Things (IoT) and operational technology (OT) devices, several speakers said at the Keyfactor Tech Days conference in Miami Beach last week.

"Consumers say they're much more likely to buy an IoT device if it has a security certification label," said Steve Hanna, Distinguished Engineer at Infineon, in a brief talk about IoT security compliance.

Even though code signing is part of compliance with many new regulations, especially the European Union's upcoming Cyber Resilience Act, the practice is nowhere near as widespread as it should be.
"It's amazing how many companies I work with that don't actually sign code," said Eric Mizell, Field CTO at Keyfactor, whose presentation immediately followed Hanna's and who was speaking of all software developers, not just those of IoT and OT devices.
Why securing code matters
The Internet of Things market is growing 10% year-over-year, Keyfactor Senior Vice President of IoT Strategies and Operations Ellen Boehm said at the conference. But at the same time, she added, IoT devices are increasingly being used to attack networks and organizations.

The potential impact of these attacks extends far beyond device compromise, Boehm said. Breaches that make the news damage an organization's reputation, possibly leading to lost business, while outages of daily operations unquestionably bear a financial cost.

Meanwhile, she said, the consumers of IoT devices and the operators of OT devices have little idea of how to make them more secure, and the manufacturers often don't know how to secure their devices either.

Because of these trends, Boehm said, the digital-cryptography industry has both an obligation and an opportunity to secure IoT devices, as well as OT machines and those in a category Boehm called "IIoT," the industrial Internet of Things.

Ideally, software and firmware developers should "sign" their code with a cryptography-based digital certificate that attests the code comes from the appropriate source, has been verified from initial development to release, and has not been tampered with.

Later on, as the software or firmware is updated after the product hits the market, each update package should also be cryptographically signed to verify its authenticity and integrity.

This works well when all or most of the code is generated in-house, as with a major developer like Microsoft, Apple or Adobe. But it gets awfully complicated when developers, especially those working for or with smaller companies, pull open-source code from online repositories and from other software packages.

This presents the software-supply-chain problem: When software is built from parts coming from dozens or even hundreds of different sources, some of which are constantly revised and others of which might be copies of copies, how can you keep track of who has contributed to the final product? How can you be sure each source is trustworthy?

"You have to think about who has access to that code," Boehm told us in an interview. "Someone could swap in malware or place another little something inside that could be collecting data and sending it back to a different place."

Ninety percent of companies that develop their own software use open-source libraries, and 97% of the commercial code base does too, said Karthik Lalithraj, Keyfactor Director of Solutions Engineering, East, during a conference presentation.

The average application depends on more than 500 open-source libraries, Lalithraj added, a figure that has risen 77% in two years. Meanwhile, open-source code makes up close to 75% of the code in the average application.
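To illustrate the basic mechanics, here is a minimal sketch of signing and then verifying a firmware image with an ECDSA key pair, using Python's widely available cryptography library. The file names and the in-memory key pair are illustrative only; real products anchor the signer's public key in a certificate issued by a PKI rather than a key generated on the spot.

```python
# Minimal code-signing sketch (illustrative only): sign a firmware image
# during the build, then verify it before installation. File names and the
# in-memory key pair are hypothetical stand-ins for a real PKI-backed setup.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

# Build side: sign the release artifact with the manufacturer's private key.
private_key = ec.generate_private_key(ec.SECP256R1())
firmware = open("firmware.bin", "rb").read()
signature = private_key.sign(firmware, ec.ECDSA(hashes.SHA256()))

# Device side: verify the artifact against the embedded public key.
public_key = private_key.public_key()
try:
    public_key.verify(signature, firmware, ec.ECDSA(hashes.SHA256()))
    print("Signature valid: accept the firmware")
except InvalidSignature:
    print("Signature invalid: reject the firmware")
```

In practice the private key lives on a protected build server or hardware security module and the device ships with only the public half, so a compromised code repository alone cannot forge a valid update.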
Say yes to SBOMs
A clearer picture of what goes into a piece of software can be provided by a software bill of materials, or SBOM. As with a traditional bill of materials, this is a list of every (or almost every) component that goes into a finished product.

In a stand-alone presentation at Keyfactor Tech Days, Keyfactor Senior Product Manager of Signing Ben Dewberry compared the production of software to baking a cake, with the SBOM as the list of ingredients in the recipe.

That sounds pretty straightforward, but Dewberry asked, "What if a bad actor adds an ingredient? Or a supplier supplies a bad package?"

That's where a verifiable SBOM comes in. To the best extent possible, it verifies and attests that each component of a software package is what it claims to be and comes from where it says it does.

"Why is this important now?" said Dewberry. "Because supply-chain attacks continue to grow, and supply chains are complex and growing fast."

Alongside Dewberry was Miguel Martinez Trivino, co-founder and CTO of Chainloop, a firm that automates the verification and attestation of software components in a supply chain.

This past September, Chainloop and Keyfactor partnered to integrate the Chainloop platform with Keyfactor's EJBCA public-key infrastructure and SignServer code-signing products so that Chainloop's attestations could be digitally certified. As Martinez explained, the partnership lets software developers make general-purpose, authenticated, tamper-resistant statements about software artifacts.

Software-supply-chain management and SBOMs are especially important for IoT and OT devices, Boehm told us, because there are just too many opportunities for unauthorized or unknown devices to connect to an organization's network.

"You hear stories about rogue devices getting on the networks because somebody thought it would be great to install a camera system into their warehouses," she said. "But if that's not controlled for the corporate level, any one of those new cameras could be a way that someone can hack into the network if that camera doesn't have the proper security policies around it."
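As a rough illustration of what an SBOM records, the sketch below builds a CycloneDX-style component list as a Python dictionary. The field names follow the public CycloneDX schema, but the listed components and their digests are hypothetical.

```python
# Sketch of an SBOM in a CycloneDX-style structure (hypothetical components).
# Each entry pins a name, version and cryptographic digest so a consumer can
# check that what actually shipped matches what the recipe claims.
import hashlib
import json

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {
            "type": "library",
            "name": "example-tls-lib",          # hypothetical dependency
            "version": "3.0.13",
            "hashes": [{"alg": "SHA-256", "content": "aa11..."}],  # placeholder digest
        },
        {
            "type": "library",
            "name": "example-compression-lib",  # hypothetical dependency
            "version": "1.3.1",
            "hashes": [{"alg": "SHA-256", "content": "bb22..."}],  # placeholder digest
        },
    ],
}

def artifact_digest(path: str) -> str:
    """Recompute a shipped component's SHA-256 for comparison with its SBOM entry."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

print(json.dumps(sbom, indent=2))
```

Verification then amounts to recomputing each shipped artifact's digest and comparing it with the SBOM entry, with the SBOM itself signed so the list cannot be quietly edited after the fact.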
How a public-key infrastructure helps secure software
In her presentation, Boehm went further into how Keyfactor's products support software developers and the practice of development, security and operations, or DevSecOps.

Using consumer washing machines as an example, she explained how the machines' firmware is verified during initial development using the EJBCA public-key infrastructure (PKI), which lets organizations generate their own certificates and signatures.

"The PKI is really the thing that is establishing the trust," Boehm told us. "That's why we offer that product to our end customers."

Later, as the software continues to be modified while the machines approach a release date, the additions are verified by EJBCA and SignServer as part of normal DevSecOps.

Finally, after the washing machines are sold to consumers, over-the-air software updates continue to be distributed for the lifetimes of the products and are themselves verified using EJBCA and Keyfactor's Command certificate-management tool.

Keyfactor's Ellen Boehm shows how to secure "smart" device firmware. Credit: Paul Wagenseil/SC Media

"Pushing over-the-air firmware updates is very common, and from a Keyfactor perspective, what we want to do is help to ensure that that code has been signed by the manufacturer, and that you're verifying that signature," Boehm told us in an interview. "You're looking at origin and authenticity and all of those things. And that's where the whole PKI piece comes in."
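A simplified sketch of what that device-side check might look like appears below, assuming EC keys throughout and a signing certificate delivered alongside the update. It illustrates the general PKI idea only, not the actual EJBCA, SignServer or Command APIs, and the file names are hypothetical.

```python
# Simplified OTA update check (illustrative): the device trusts the
# manufacturer's CA certificate and uses it to vet the update signer, then
# checks the update package itself. Assumes EC keys; a real implementation
# would also check validity periods, revocation and the full chain.
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec

ca_cert = x509.load_pem_x509_certificate(open("manufacturer_ca.pem", "rb").read())
signer_cert = x509.load_pem_x509_certificate(open("update_signer.pem", "rb").read())
update = open("update.bin", "rb").read()
signature = open("update.sig", "rb").read()

# 1. Was the signer certificate issued by the trusted manufacturer CA?
ca_cert.public_key().verify(
    signer_cert.signature,
    signer_cert.tbs_certificate_bytes,
    ec.ECDSA(signer_cert.signature_hash_algorithm),
)

# 2. Does the update package carry a valid signature from that signer?
signer_cert.public_key().verify(signature, update, ec.ECDSA(hashes.SHA256()))

print("Update verified: origin and integrity check out")
```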
An onslaught of regulation
Looping back to Infineon's Steve Hanna: he detailed a number of the safety-labeling and regulatory schemes that have arisen around the world in the past five years, beginning with Singapore's Cybersecurity Labelling Scheme for IoT, or CLS(IoT), which began issuing safety certifications in 2021.

"These create a global approach to IoT security certification," Hanna said.

The CLS(IoT) is voluntary, but it's stringent enough to be cross-recognized by the German and Finnish governments and by the German independent testing service TÜV Süd.

It was followed by the UK's Product Security and Telecommunications Infrastructure Act (PSTI or PSTIA), which took effect in 2024, and the latest version of the European Union Radio Equipment Directive (RED), whose provisions take effect in 2025.

Unlike the CLS(IoT), the UK's PSTI is mandatory, applying to all internet-connected consumer products from smartphones to toys to medical devices. It requires all devices to have unique, non-guessable administrative passwords; to publish contact information so that members of the public can report security issues, which must be responded to; and to disclose how long security updates will be provided.

The European RED, also mandatory, was first adopted in 2014. It has been amended several times since, most notably with the USB-C requirement that led Apple to abandon its proprietary Lightning plug. RED's latest iteration addresses the cybersecurity of all radio-connected devices, including IoT devices.

Unfortunately, despite the compliance deadline being less than 18 months away, its specifications have not yet been finalized (or "harmonized," to use the EU's preferred term). In general, the regulations aim to "improve network resilience," "better protect consumers' privacy" and "reduce the risk of monetary fraud."

After an 18-month public consultation period, the U.S. Cyber Trust Mark was announced by the outgoing Biden administration in early January of this year. Unlike some other Biden-era cybersecurity actions, the Cyber Trust Mark has not been targeted by President Trump, which means that manufacturers of IoT devices should soon be able to submit their devices for testing and certification.

Like the 30-year-old Energy Star program, the Cyber Trust Mark is entirely voluntary, and certification that a device has met its requirements will be indicated by a sticker. The certification applies only to IoT "smart" devices like cameras, baby monitors and appliances, not to smartphones, computers or wireless routers.

As with the EU's RED, the certification requirements have not been published, but they will likely be devised by the National Institute of Standards and Technology.
The CRA is coming for you
Then there's the big bad (or good, depending on your perspective) regulation: the European Union's Cyber Resilience Act (CRA), which goes into effect in December 2027.

This is the one you need to pay attention to. While the above regulations are somewhat toothless, violations of the CRA can draw heavy fines running into millions of euros, or up to 2.5% of an organization's annual revenue.

"U.S. manufacturers, especially those that sell globally, need to understand what this act is and how they need to comply," Boehm told us. "Anything that has a digital element inside of it needs to have a consistent cybersecurity strategy framework."

As Keyfactor explains in a 10-page eBook about the CRA, the act applies to all "products with digital elements." An informative CRA slideshow states that this includes laptops, smartphones, networking equipment, operating systems, firmware and applications. Wired and wireless devices alike are covered.

Following an outcry from open-source developers, the EU exempted "non-commercial projects" from the CRA. Also not covered are cloud assets, software-as-a-service, and connected cars and medical devices, all of which are subject to other regulations.

The CRA mandates that manufacturers institute DevSecOps and secure software updates, from the design phase through production and for at least five years after the product's initial market debut.

"The CRA covers a product's entire lifespan," notes the Keyfactor eBook. "Manufacturers must start enhancing their entire development process now to ensure their products are secure by design."

Other requirements for each product sold in the EU include:
A risk assessment
A software bill of materials (SBOM) in the technical documents, to be maintained over the product lifespan
Built-in security meeting EU requirements, with no add-ons necessary
Factory-reset ability
Anti-tampering features to protect data and functions and prevent hijacking
Manufacturer reporting of incidents and known vulnerabilities
Patching of vulnerabilities as quickly as possible, free of charge to consumers
Cybersecurity advocates have been recommending such requirements for many years. As such, the CRA may improve the security of digital devices, especially IoT ones, across the board. It could also give an advantage to mid-market device makers, who until now have been undercut by cheaper white-label brands that may lose access to the European market.

However, the CRA doesn't mandate how all this must be achieved. Organizations can take different roads to reach the same goals.

The Keyfactor eBook recommends using the sorts of digital security measures it offers, such as secure bootloaders, secure firmware/software updates, code signing and certificate management. It also suggests some hardware baselines, such as secure elements on CPUs and adequate memory and storage space to handle updates.

"This is a little bit more forward thinking" than similar regulations, Boehm told us. "It's already out there, so I think people are going to design to this, and then it's what they're going to follow."
Paul Wagenseil is a custom content strategist for CyberRisk Alliance, leading creation of content developed from CRA research and aligned to the most critical topics of interest for the cybersecurity community. He previously held editor roles focused on the security market at Tom’s Guide, Laptop Magazine, TechNewsDaily.com and SecurityNewsDaily.com.