Monthly Archives: September, 2007

arfis: Automated Remote File Inclusion Search

In a nutshell: what you see here is the output of the "arfis project", a simple Perl script that automatically downloads and extracts PHP projects from sourceforge.net and checks them for Remote File Inclusion vulnerabilities. It then posts the potential (it is only "potential" because the script is still at an early stage) vulnerability to the project's blog.

The idea behind this tool was joked about by several VDB managers over a year ago, due to the growing trend of false vulnerability reports popping up in 2006 and 2007. The style of many posts to mail lists was becoming the same, several signatures suggesting a tool or group was involved appeared, and it was speculated that many remote file inclusion (RFI) vulnerabilities were the result of a very primitive "grep and gripe" style of vulnerability 'research'. Jump to today, and we have this script doing what we suspected all along. Some will proclaim "genius!" and others may be quick to download it and taste the fame of being a "vulnerability researcher". Before you plan your victory party and brush up your resume to include vulnerability research, consider that this script is blindly searching projects for specific lines that suggest an application is vulnerable to RFI. Without manually reviewing the source code, there is no way to accurately determine whether a hit is a legitimate vulnerability or a false positive. The people using this script don't seem to fully understand that, and blindly use the tool without that consideration.
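For the curious, the sketch below gives a rough idea of what this style of "research" amounts to. It is not the arfis code; it is a hypothetical Python approximation, written purely for illustration, that flags any include/require statement referencing a request variable, with no understanding of whether that variable is validated, constrained, or even reachable by an attacker, which is exactly why the false positives pile up.

    #!/usr/bin/env python3
    # Hypothetical "grep and gripe" RFI check, for illustration only.
    # This is NOT the arfis script. It flags any include()/require()
    # whose argument mentions a PHP superglobal, without any analysis
    # of sanitization or reachability, so false positives are expected.
    import re
    import sys
    from pathlib import Path

    # Naive pattern: include/require followed by a request superglobal
    # somewhere before the end of the statement.
    RFI_PATTERN = re.compile(
        r'\b(include|include_once|require|require_once)\s*\(?[^;]*'
        r'\$(_GET|_POST|_REQUEST|_COOKIE)\b',
        re.IGNORECASE,
    )

    def scan(root: str) -> None:
        for php_file in Path(root).rglob("*.php"):
            try:
                text = php_file.read_text(errors="replace")
            except OSError:
                continue
            for lineno, line in enumerate(text.splitlines(), start=1):
                if RFI_PATTERN.search(line):
                    # A match only *suggests* RFI; manual review is
                    # still required to call it a vulnerability.
                    print(f"{php_file}:{lineno}: {line.strip()}")

    if __name__ == "__main__":
        scan(sys.argv[1] if len(sys.argv) > 1 else ".")

A one-line regex match over unpacked source trees is all it takes to generate a "disclosure" in this style, which is precisely the problem.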

Recently, 8 or so of these arfis-found vulnerabilities were reported to milw0rm for inclusion in their database. Upon examination, 6 of the 8 were not legitimate vulnerabilities, and of the 2 that were, one had been reported two years prior. This is a good indication of how trustworthy the tool is, early release or not, and of the kind of burden it places on VDBs, who do their best to vet vulnerability disclosures, even if only to a limited degree.

2007 Top Vulnerable Vendors?

http://www.eweek.com/article2/0,1895,2184206,00.asp
http://www.eweek.com/c/a/Security/Report-MS-Apple-Oracle-Are-Top-Vulnerable-Vendors/

New IBM research shows that five vendors are responsible for 12.6 percent of all disclosed vulnerabilities. Not surprising: in the first half of 2007, Microsoft was the top vendor when it came to publicly disclosed vulnerabilities. Likely surprising to some: Apple got second place. IBM Internet Security Systems' X-Force R&D team released its 2007 report on cyber attacks on Sept. 17, revealing that the top five vulnerable vendors accounted for 12.6 percent of all disclosed vulnerabilities in the first half of the year, or 411 of 3,272 vulnerabilities disclosed. Here's the order in which the top 10 vendors stacked up, by percentage of vulnerabilities publicly disclosed in the first half of the year:

1. Microsoft, 4.2 percent
2. Apple, 3 percent
3. Oracle, 2 percent
4. Cisco Systems, 1.9 percent
5. Sun Microsystems, 1.5 percent
6. IBM, 1.3 percent
7. Mozilla, 1.3 percent
8. XOOPS, 1.2 percent
9. BEA, 1.1 percent
10. Linux kernel, 0.9 percent

This article was posted to ISN the other day and struck a nerve. How many times are we going to see vulnerability statistics presented without qualification? Rather than really get into the details, I replied with a single simple example of why such statistics are misleading at best and incorrect at worst. The bulk of my reply follows. My hopes of Lisa or IBM/ISS clarifying this are already dwindling.

One other factor that Lisa Vaas apparently didn't ask about is how ISS X-Force catalogs vulnerabilities, and whether their method and standards could impact these numbers at all. Take, for example, two X-Force vulnerability database entries:

Oracle Critical Patch Update – July 2007
http://xforce.iss.net/xforce/xfdb/35490
18 CVE identifiers, 30+ distinct vulnerabilities per Oracle

Oracle Critical Patch Update – January 2007
http://xforce.iss.net/xforce/xfdb/31541
30 CVE identifiers, 50+ distinct vulnerabilities per Oracle

So when comparing numbers, you have 2 X-Force entries that equate to 48 CVE entries, which in turn equate to *more than 80* unique and distinct vulnerabilities according to Oracle. I'm not a math or stats guy, but I have a feeling this could seriously skew the statistics above, especially when you consider that Microsoft and Apple both have a more distinct breakdown and separation in the X-Force database. Anyone from IBM/ISS care to clarify? Lisa, did you have more extensive notes on this aspect that didn't make it into the article, perhaps?
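To make the potential skew concrete, here is a rough back-of-the-envelope calculation. This is my own sketch, not anything from the X-Force report: it naively holds the report's 3,272 total fixed and only varies how those two Oracle CPU advisories are counted, whereas in reality the counts for the other vendors would shift as well.

    #!/usr/bin/env python3
    # Back-of-the-envelope look at how counting granularity moves the numbers.
    # Assumption (mine, not IBM/ISS's): the report's 3,272 total is taken at
    # face value while only the two Oracle CPUs are re-counted.
    TOTAL_DISCLOSED = 3272  # first-half-2007 total cited in the X-Force report

    # Three ways of counting the same two Oracle Critical Patch Updates:
    counting_schemes = {
        "X-Force entries (2 CPU advisories)": 2,
        "CVE identifiers referenced": 48,
        "distinct issues per Oracle's own advisories": 80,  # "more than 80"
    }

    for label, count in counting_schemes.items():
        share = 100.0 * count / TOTAL_DISCLOSED
        print(f"{label:45s} {count:3d} -> {share:5.2f}% of {TOTAL_DISCLOSED}")

Depending on which count you feed into the same denominator, those two advisories alone represent anywhere from well under a tenth of a percent to roughly two and a half percent of the total, which is exactly the kind of swing that makes vendor-ranking headlines hard to trust without knowing the counting rules.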
