I’ve had the following idea rattling around in my head for a while now, and I think it nicely and somewhat humorously illustrates the gravity of the situation and the importance of secure coding practices (and a secure development lifecycle).
The Story
Let’s say I’m a business user (Me) with a problem and I have a friendly IT team (Alice, Bob, and Charlie) willing to help me solve it.
The problem statement is this: “I’m getting nagged to death by my [partner|spouse|child|etc.]”, so I formulate the requirement “I want the nagging to stop” and convey that to my willing and eager IT team. The meeting goes something like this:
Alice: “Sure, we can do that. It’ll be easy. I’d give it one story point. We can even put it in the current sprint so you don’t have to wait.”
Me: “Awesome! Thanks a lot, you’re doing me a great favor. I thought I’d never be able to solve this problem. How can you come up with a solution so fast?”
Alice: “Oh, we’re going to use cyanogen. It’ll be quick.”
Me: (eyes glazed over at the technical term, regretting that I asked) “Alright, well, thanks again and I look forward to testing out your solution.”
Alice then promptly goes off and kills my partner. Bob reviews the implementation and signs off on it because, well, it does meet the requirements quite effectively, and permanently at that. He may even give Alice a pat on the back for her elegant and ingenious solution. Charlie, although he might have some reservations about Alice’s implementation, believes that it’s Not His Problem™, and if asked about it would refer you to Alice, because she wrote and executed the implementation.
A few days later it comes back to me for functional testing/user acceptance testing. Like any good functional tester, I care that the requirements have been met, and don’t really understand or care about *how* they’ve been met.
Me: “Wow! Whatever you guys did it worked wonders. Thank you so much for this. I haven’t been nagged at all in almost a week! Of course, the car is missing…but anyway, that’s a different problem, I’m sure it’ll turn up sooner or later. You guys are heroes.”
Alice: “Glad we could help, let us know if you need anything else.”
Of course, after a while a body (a vulnerability) is found and exploited by the police, landing me in jail for hiring my IT team to fix my problem. I can explain and exclaim all I want that it’s Not My Fault™ and that I didn’t know that murder was involved, but the money trail doesn’t lie, and the fact remains that I hired my team to “fix my problem”.
The Moral of the Story:
Just because it works and meets all of the business requirements does not mean that it’s correct.
First Corollary:
Security testers care first about correct and appropriate implementations that protect the stakeholders from harm and loss. The fact that the software works and meets its requirements is a bonus.
Second Corollary:
Software can easily be implemented incorrectly. Correct implementations frequently take more time, analysis, and attention to detail.
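To put the moral in code terms, here is a minimal sketch (in Python, using a hypothetical in-memory SQLite users table invented purely for illustration) of a login check that “works” and would sail through functional testing, next to the version a security tester would actually accept:

```python
import sqlite3

# Hypothetical setup purely for illustration. (Plain-text passwords are a
# separate sin, kept here only to keep the sketch short.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login_naive(name, password):
    # Meets the requirement and passes user acceptance testing for ordinary
    # inputs -- but builds the query by string concatenation.
    query = f"SELECT 1 FROM users WHERE name = '{name}' AND password = '{password}'"
    return conn.execute(query).fetchone() is not None

def login_correct(name, password):
    # Same requirement, implemented with a parameterized query instead.
    query = "SELECT 1 FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchone() is not None

print(login_naive("alice", "s3cret"))        # True  - requirement met
print(login_naive("alice", "' OR '1'='1"))   # True  - the body waiting to be found
print(login_correct("alice", "' OR '1'='1")) # False - correct as well as functional
```

The naive version is Alice’s solution: it satisfies the business requirement beautifully and leaves a body for someone to find later. The parameterized version costs a little more attention to detail, which is exactly the point of the second corollary.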