I agree, and from a purely rational standpoint you are correct. However, in my experience the main benefit of a successful pentest is a change in culture and in how staff across the entire workforce perceive security, where before it was deemed “taken care of” or unimportant.
In short, people won’t change before shit has hit the fan, and a pentest is the closest you can get to a controlled shit-hit-fan situation without it being a meaningless drill :) How and what is uncovered is beside the point, merely secondary when viewed from that perspective.
I did a website for Visa a few years ago, and it required a pentest before launch. We tried to find a loophole to justify skipping the pentest (because that would give us 3 more weeks to develop the site), but no luck. It was such a simple site with no database, but they required it to go through a pentest anyway.
The pentest came back with some recommendations, mostly to do with the use of HTTP headers. We absolutely fixed them, and made damn sure that the next time we had a site to be pentested those unforced errors were not repeated.
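The comment doesn’t say which headers were flagged, but pentest reports on simple static sites commonly recommend a handful of standard security response headers. A minimal sketch (the header set and the helper function are illustrative assumptions, not what Visa’s report actually contained):

```python
# Hypothetical example: these are the security headers pentest reports
# most often flag as missing on simple sites, not the actual Visa findings.
SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Content-Security-Policy": "default-src 'self'",
    "Referrer-Policy": "no-referrer",
}

def apply_security_headers(response_headers: dict) -> dict:
    """Return a copy of the response headers with the security set merged in."""
    merged = dict(response_headers)
    merged.update(SECURITY_HEADERS)
    return merged

print(apply_security_headers({"Content-Type": "text/html"}))
```

In practice you would set these once in the web server or framework middleware so every response gets them, which is exactly the kind of cheap, one-time fix that keeps the same finding from reappearing in the next pentest.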
So on a small scale, yes. Pentesting improved the way we developed websites. I don't know about how it affected the "culture". Visa has a really strong security culture already.
>> Pentesting improved the way we developed websites. I don't know about how it affected the "culture". Visa has a really strong security culture already.
So if the security culture is strong, the pentesters' reports are read and implemented; if the security culture is weak to completely non-existent, they'll likely be ignored?
I think you already answered your own question when emphasizing ‘public’ there.
Security isn’t fun, and at best a relief (when nothing is found). When a pentest was “successful”, as in, the tester got in, you can be sure it’s kept under wraps.
So no, I don’t think there are many, if any, public records. The fact that there is shame and status involved in not being completely airtight is a big driver of the persistent insecurity of the world at large.
Anonymized records would go a long way in achieving a shift to safety and awareness but as you can read here they are easily construed as stories of fiction.
Everyone likes talking about that growth hack that drove a 1000% revenue increase. No one wants to talk about the database hack that spilled thousands of client records out in the open.
So basically, tracking "business-as-usual" attacks (probably 99% low-tier, low-effort attacks that didn't go anywhere) or serious attacks on other companies isn't going to change the culture. But a full-blown, highly skilled attack, with a fully dedicated adversary, specifically targeted at your business and business value, with potentially devastating consequences, will do a better job of waking people up?
Yes. It seems stupid, but what you outline is a 100% match with what I’ve seen in practice. Until it happened to them, the other company was stupid and careless - not them.
At risk of repeating myself, I’m chalking this up to basic human behaviour in all fields of life and the lack of taking responsibility by 80% of all people.
Security is very lopsided in that you just need 1 person to be careless for the attacker to get in, while the defender needs to be 100% secure across all vectors.
I could discuss this all day, and you know the importance of the topic, I know it, but the fact of the matter is that most non-tech people think of security as an annoyance. The solution? No idea yet, other than finding the right chord to strike and “fix” this psychological problem. We’ve made significant strides the last few months but getting companies more security conscious has been a tougher nut to crack than I first anticipated.
Feel free to email me at stan@site.security if you want to exchange thoughts on the topic. I’d love to take a deeper dive into the matter with anyone that’s passionate about solving the security problem in any way, shape or form :)
Infosec in practice (not imaginary scenarios) is also about good hygiene by the regular plebs and investing in proper QA by hiring some people who are naturally paranoid and have enough clout to push back on lazy/bad ideas.
That plus regular fixed check ups, where more deep dives are done.
Like you said, and as the article points out, it seems to be as much a cultural day-to-day thing as it is about technical searches for vulnerabilities. Or worse, installing noisy monitoring systems with a bunch of false positives and pointless investigative rabbit holes.