Proposal for an All-or-Nothing Secure Software Standard

Tuesday, May 10, 2011

Keith Mendoza


The 2010 Nall Report (PDF) shows that 70% of non-commercial and 60% of commercial accidents are caused by human error. That mechanical failure accounts for so few accidents is a result of the strict standards placed on certifying aircraft and aircraft components.

When an airworthiness certificate is issued for an aircraft, the manufacturer provides a parts list of the components that make up that aircraft. If the aircraft contains any component that is not on the parts list, it can be declared not airworthy (there are some circumstances in which an aircraft may be flown with missing or broken parts for the purpose of getting it to a repair shop).

For example, a Diamond DA40 has two Garmin G1000 panels (display screens); if you replace one of those panels with the same display screen used in a Boeing 787, that aircraft is not airworthy.

Is the other panel capable of displaying the same screens? I'm sure it is, but the point is that a DA40 is certified to have G1000 panels. It can still be flown with the 787 panel, but it will be considered experimental.

I propose that secure software standards should be all-or-nothing: either the software--and all of its dependencies--is compliant, or it is not. Not owning the library or the database will not be an excuse for failing to meet the standard.

The application developer must specify the exact dependency versions that will be used with the application, to ensure that no new security holes are introduced simply because a newer version of some dependency was installed.
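To make the pinning idea concrete, here is a minimal sketch of a startup check that refuses to run unless the installed dependency versions exactly match a pinned manifest. The package names and version numbers are hypothetical placeholders, not part of any actual standard.

```python
# Sketch: verify that installed dependency versions exactly match a pinned
# manifest before the application is allowed to run.
# The PINNED entries below are illustrative assumptions only.
from importlib import metadata

PINNED = {
    "requests": "2.31.0",        # hypothetical pinned version
    "cryptography": "42.0.5",    # hypothetical pinned version
}

def verify_pinned(pinned):
    """Return a list of (package, expected, found) mismatches.

    A package that is missing entirely is reported with found=None.
    An empty list means every dependency matches its pinned version.
    """
    mismatches = []
    for pkg, expected in pinned.items():
        try:
            found = metadata.version(pkg)
        except metadata.PackageNotFoundError:
            found = None
        if found != expected:
            mismatches.append((pkg, expected, found))
    return mismatches

problems = verify_pinned(PINNED)
print("compliant" if not problems else f"non-compliant: {problems}")
```

In the all-or-nothing spirit of the proposal, a single mismatch here would mark the whole application non-compliant rather than just the one dependency.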

I would go even further and require that all software running in the same environment as the application--from the OS and device drivers all the way up to things like SSH, the shell, netstat, NTPD, and so on--also be standards compliant.

If software that is not standards compliant is installed in that environment, then every application in it is no longer considered compliant.
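The aircraft "parts list" analogy above can be sketched in code: an environment is compliant only if every file in it appears in an approved manifest with a matching digest, and nothing unapproved is present. The directory layout and use of SHA-256 digests are my illustrative assumptions, not a requirement of any existing standard.

```python
# Sketch of an all-or-nothing environment check: one unknown or altered
# file marks the entire environment non-compliant, just as one unlisted
# part makes an aircraft not airworthy.
import hashlib
from pathlib import Path

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def environment_compliant(root, approved):
    """All-or-nothing: True only if the set of files under root and their
    digests exactly match the approved manifest (relative path -> digest)."""
    seen = {}
    for path in Path(root).rglob("*"):
        if path.is_file():
            seen[str(path.relative_to(root))] = sha256_of(path)
    return seen == approved
```

Note that the comparison is an exact set match in both directions: a missing approved file fails the check just as an extra unapproved one does.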

I know that existing security standards such as PCI and OWASP do require that all security patches be installed on the system, and that a bug report be filed if potential security holes in third-party software are found during the audit; however, I feel this is the biggest shortcoming of these standards.

This gives the application owner an out: they are allowed to simply wait for the third party to plug the security hole. For open-source software, I think it is unacceptable for the user community (well, the application developers, to be exact) to essentially do nothing.

If you are benefiting from that free software, do your share and contribute some fixes. Isn't having many more eyes on the code one of the biggest things open-source advocates tout?

So why not bring that third-party open-source software into compliance and provide your changes upstream for inclusion in the main release? As for commercial software, I think it would be in vendors' best interest to make their products standards compliant so that people will pay for them.

Many will argue that this bar is very high. But there are people out there who write software that has to be absolutely bullet-proof, every day. They do it because they know that someone will literally lose their life if their code is not solid.

I say it's about time that software developers as a whole wrote code knowing that lives will be ruined if they don't. Whether we like to admit it or not, when people's personal information is stolen, it ruins their lives.

Cross-posted from Home+Power

Erlend Oftedal You may have a point; however, I don't think comparing software development to any sort of construction is really all that useful.
When you set out to build a plane, you have a finished design, and the plane's main functionality does not change much during its lifetime. Yes, you may install new seats and things like that, but the majority of components stay the same.
Comparing this to software: you may start building a plane and end up with a helicopter or a car. Or you might start building a garage and end up with a skyscraper. This happens because requirements change during the construction and lifetime of the application. The application continues to evolve even after being put into production.
Software development moves at an incredible speed, even though it may not seem so. New components and frameworks are made available all the time, and developers are required to keep up to date in order to deliver.
If they had to wait for any of the components to be certified, they would lose the chance to be first to market, because some other company, under different compliance requirements, would grab the opportunity.
And if you have ever run Maven on a Java project, you will have seen the amazing number of components in play (people jokingly say Maven is a DSL for downloading the internet). Open-source components are often made by developers to scratch their own itch, and they probably have no incentive to get them approved. It would slow down innovation.
I'm not saying I disagree with your thoughts - I just don't find them to be realistic.

(Oh, and OWASP is not a standard)
Keith Mendoza Erlend,
I have to disagree that changing requirements is an excuse for writing insecure code. The reason things are the way they are is that all the stakeholders--from the developers to the users--are tolerating it.

Take the many custom software development firms out there (I'm lumping together everyone from web design firms to those that produce custom desktop applications). At the end of the day, it benefits them to let the customer change the requirements over the course of development, because the more the customer changes the requirements, the longer the project takes to complete, and in turn that means more money for the firm.

I understand that this standard may not be realistic for everyone, because currently it's not economical. As long as software providers cannot be held responsible for the results of their bad code, it will not be in their best interest to do anything. Bad publicity doesn't really hurt companies in the long term, because--quite honestly--people aren't listening. They hear it, but they don't really care.
Rod MacPherson "you may start building a plane and end up with a helicopter or a car."

I would say that if this is happening, then you have poor project management and probably ought to look for a PMP.

There has to be a point in every project when features are frozen so that you can finish the project at some point, and hopefully within budget. After that point, if something really cool and new comes along, that is a new project. Maybe that's the 2.0 version of Project A; but to actually be able to make a Project A you can stand behind, you have to put a hold on feature creep at some point and switch modes to QA.
Erlend Oftedal @Keith: I never said it was an excuse for writing insecure code. However, it might be an excuse to pick components without knowing whether or not they really are secure. This holds true for open source as well as closed source. Certifications of components will slow things down. It does come with a cost. And in a first to market scenario that could be all that matters.

@Rod: I think you did not understand my point. Even though you switch modes to QA, you would never rebuild a plane into something it was never meant to be. This happens all the time in software. In the first release (A) it's one thing; then in later versions things are added and removed, and the end result is something really different from what you started out building. This is not the case for a plane--which is why it's a bad analogy.
The views expressed in this post are the opinions of the Infosec Island member that posted this content. Infosec Island is not responsible for the content or messaging of this post.

Unauthorized reproduction of this article (in part or in whole) is prohibited without the express written permission of Infosec Island and the Infosec Island member that posted this content--this includes using our RSS feed for any purpose other than personal use.