I saw the article IBM Scolds TippingPoint Over Hacking Contest the other day and figured, “now what?” But I decided it would be an interesting read.
A couple of quick blurbs from the article:
IBM’s ISS division has torn into rival TippingPoint for sponsoring the hacking contest that led to the disclosure of a QuickTime vulnerability in Apple’s Safari browser. “IBM Internet Security Systems agrees with Gartner’s assessment that “public vulnerability research and ‘hacking contests’ are risky endeavors, and can run contrary to responsible disclosure practices.” It is for this reason that IBM ISS strongly adheres to its well-established responsible disclosure guidelines.”
Once I read the article, I realized that it really wasn’t IBM, but ISS (which IBM purchased recently), that was scolding TippingPoint for sponsoring this contest. I immediately thought about all the drama that went on when ISS disclosed its Apache Chunked Encoding Overflow back in 2002.
http://lwn.net/Articles/2756/
It all looks like a fairly normal response to security problems in the free software community, until you look a little more closely. It turns out that the Apache group was already aware of the problem and was working on a fix. The Computer Emergency Response Team (CERT) also was already involved. It also turns out that the ISS patch does not completely fix the problem. ISS, in its hurry to publicise the vulnerability, had not checked with either CERT or the Apache Software Foundation.
Does anyone remember all of this?
ISS took quite a bit of criticism for this disclosure and responded publicly to clean up any confusion and misunderstanding.
The very last portion of this posting is what I find really interesting:
ISS has made these decisions based on our mission to provide the best security to our customers and being a trusted security advisor.
Personally, I find it kind of funny that disclosure almost always comes back to the same argument: we did it for the greater good… we did it for the benefit of others… we did it for the right reasons…
But you on the other hand…
In the final days of March, a “week of Vista bugs” was announced. As some suspected, it turned out to be a hoax. For the full story on how it was carried out, check the breakdown from the perpetrators.
All in all, not a very impressive hoax by any means. Even in the screenshot of Google they include, you can see that none of the top ten hits came from anyone seriously buying into it.
One of the most frequently used, and most hotly debated, analogies for actions in the security/hacker industry compares port scanning to walking down a road checking doors and windows to see which are unlocked. This is fundamentally flawed because port scanning looks for open services that your computer is offering other people on the network. There is no expectation of ‘services’ offered when walking down a neighborhood street, regardless of checking doors and windows. A slightly better analogy would be walking down a street full of shops that have no power (no lights, no neon open signs) and checking doors to see which are open.
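Analogies aside, the mechanical action being debated is simple: a port scan just attempts connections and records which ones the remote host accepts. A minimal sketch in Python (the host and port list here are arbitrary examples, not anything from the debate):

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Try a TCP connection to each port. A completed handshake means the
    host is actively offering a service there; a refusal or timeout means
    it is not."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            # connect_ex returns 0 on success instead of raising an exception
            if s.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports

# Example: check a few common service ports on the local machine.
print(scan_ports("127.0.0.1", [22, 80, 443]))
```

Note that every “hit” here is a service the target machine is deliberately listening with, which is exactly why the doors-and-windows analogy falls apart.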
Earlier today, someone (likely a troll) on Full-Disclosure used an analogy I have heard before but hadn’t given much thought to. He tried to compare aspects of the vulnerability disclosure debate to other virtual events as well as the ‘real world’.
And while you might think these efforts are noble, the reality of the situation is simple – this is absolutely no different than a bunch of Russians with botnets, forcing businesses to comply with their demands if that business wishes to continue existing on the Internet.
Bad analogy #1. A vendor who writes code resulting in an exploitable flaw is at fault for doing so. A business that is taken offline by a bandwidth-saturation attack is not at fault.
When was the last time an auto manufacturer was humiliated publicly because their car windows can easily be broken and contents of the car stolen? When have chain manufacturers been chastised by the mass media for the existence of bolt cutters? What about the serious threat of hacksaws?
Bad analogy #2a. Breaking windows and cutting locks is better compared to your beloved Russians with botnets. Software can be written not to be vulnerable to well-published attacks while still being practical and functional. Glass cannot be designed to be unbreakable while remaining practical (the cost is prohibitive). Locks can be designed fairly securely if they are heavy enough and well made (and costly), but chains suffer the same problem as windows.
Bad analogy #2b. When was the last time we saw a manufacturer give credit to the people who discovered the problem? Do you see a jeep vendor giving props to the thirty-two people whose vehicles tipped over when they shouldn’t have? Do you see the vendor give a timeline and coordinate disclosure with the news outlets? No.
In general, I find it amusing that security professionals spend so much time coming up with poor analogies to describe simple actions we should all be familiar with, that are already morally ambiguous to begin with.
I previously blogged about the Month of PHP Bugs [MOPB], an effort led by Stefan Esser and the Hardened PHP Project to raise awareness about vulnerabilities in the PHP language. The month has come and passed, and of course I have to wonder about a few things.
1. The project ended up releasing 45 vulnerabilities over 31 days, many of them remotely exploitable. Anyone who was under the delusion that PHP was “pretty secure” should think again. Not only were some of the issues remote; many were methods for bypassing the native protection mechanisms PHP offers, like open_basedir, or flaws in various functions designed to filter bad input.
2. These “Month of X Bugs” always get a press blitz before it happens, but we rarely see the same news outlets cover the same thing a month later. It’s nice to see the results of the project, the number and type of vulnerabilities as well as any insights (see comments on previous blog post) the developers had.
3. Thankfully, the PHP project has already responded to many of these vulnerabilities. PHP 5.2.1 and 4.4.5 fix a lot of security issues. Oh wait, those were released two weeks before the MOPB began. Where is the next big release that fixes the unpatched issues?
All in all, a very impressive effort. Esser and the Hardened PHP Project have certainly raised the bar for the “Month of X Bugs” projects.
Last summer, when I wrote “Vicious orchestrated assault on MacBook wireless researchers”, it set off a long chain of heated debate and blog posts. I had hoped to release the information on who orchestrated the vicious assault, but threats of lawsuits and a spineless company that refused to defend itself meant I couldn’t disclose the details. Well, a lot has changed since then: researcher David Maynor is no longer working for SecureWorks, and he has finally given me permission to publish the details.
Apple is a mega corporation that nearly smashed the reputation of two individuals with bogus claims of fraud. It didn’t matter that they weren’t the ones pulling the trigger, because they were pulling all the strings. David Chartier should be ashamed of himself and his blog. Jim Dalrymple of Macworld and the colleagues who jumped on the bandwagon should be ashamed of their reporting. Frank Hayes was the only one of Dalrymple’s colleagues with the decency and honor to apologize. Most of all, shame on Apple.
Yes, the trend continues and gets more… odd.
The purpose of the exercise is not so much to expose Myspace as a hive of spam and villainy (since everyone knows that already), but to highlight the monoculture-style danger of extremely popular websites populated by users of various levels of sophistication. We could have just as easily gone after Google or Yahoo or MSN or ZDNet or whatever. Myspace is just more fun, and is becoming notoriously dickish about responding to security issues.
I’m not exactly sure how MySpace deserves a “monoculture-style” designation since it is a single social web site and the vulnerabilities (presumably) aren’t specific to one web browser or operating system.
Hell hath no fury like a PHP developer scorned…
During the last months there have been the Month of the Browser bugs and the Month of the Kernel bugs projects that tried to raise awareness for security vulnerabilities in browsers and kernels.
After thinking a bit about this I started to wonder if I should not start a Month of PHP bugs somewhen in the first half of 2007. At the PHP conference it was once again claimed that it is not PHP that is insecure but the applications written by novice programmers. While it is true that many PHP applications are written by people with no clue about security it is absolutely not true that PHP is a secure programming language.
I think it is necessary to make ALL people aware of this. The plan is therefore to choose one of the 31 day months after January and release everyday a vulnerability in PHP itself. I would like to hear comments from the PHP community about this plan. (Be warned that anonymous rants will be deleted)
To check out the bugs, visit www.php-security.org/. I will also be adding comments to this entry pointing out some of the interesting commentary and underlying message behind this effort.
Steven Christey of CVE recently commented on the fact that Microsoft, Adobe, Cisco, Sun and HP all released multi-issue advisories on the same day (Feb 13). My first reaction was to come up with an amusing graphic depicting this perfect storm. Lacking any graphic-editing skills and carrying too much cynicism, I instead wonder: are these the same vendors that continually bitch about irresponsible disclosure “hurting their customers”?
These same customers are now being subjected to patches from at least five major vendors on the same day. In some IT shops, this is devastating and difficult to manage and recover from. If a single patch has problems, it forces the entire upgrade schedule to a halt until the problem can be resolved. If these vendors cared for their customers the way they pretend to when someone releases a critical issue without vendor coordination, they would consider staggering their patches to alleviate the burden on their beloved customers.
The Vulnerability Disclosure Game: Are We More Secure?
By Marcus J. Ranum
Do you remember the original premise of the disclosure game? By publicly announcing vulnerabilities in products we will force the vendors to be more responsive in fixing them, and security will be better. Remember that one? Tell me, dear reader, after 10 years of flash-alerts, rushed patch cycles and zero-day attacks, do you think security has gotten better?
I know that Microsoft, Oracle and others have spent huge amounts of money improving the security of their software. Never mind the fact that 99.99 percent of the computer users in the world would rather they had spent that money making their software cheaper or faster, I suppose it’s a great thing to see that software security is being taken seriously. Security has gotten more expensive. But do you think security has gotten better?
It’s a tad ironic that the only way we could ever hope to answer this question is if the vendors practiced full-disclosure! The only way this question could be answered is to see a list of all the vulnerabilities that vendors like Microsoft or Oracle have found and fixed through in-house auditing. If they have found and fixed 1,000 vulnerabilities compared to the 250 publicly disclosed (arbitrary numbers), then yes, security has gotten better. Right? If software is shipping with less vulnerabilities per lines of code, then security has improved, and the “we’ll force your hand” crowd had something to do with it.
If twenty years of brutal full disclosure really did teach them the importance of security by forcing them to spend considerable money on it, then didn’t those wily “we’ll force your hand” folks in the ’90s do what they claimed, albeit a little differently than planned?
Full Disclosure of Security Vulnerabilities a ‘Damned Good Idea’
By Bruce Schneier
Full disclosure – the practice of making the details of security vulnerabilities public – is a damned good idea. Public scrutiny is the only reliable way to improve security, while secrecy only makes us less secure.
I don’t want to live in a world where companies can sell me software they know is full of holes or where the government can implement security measures without accountability. I much prefer a world where I have all the information I need to assess and protect my own security.
No reply needed.