The Internet Explorer IQ Hoax and the State of Tech Blogging
Last Friday, the tech blogosphere was enamored with a study claiming that Internet Explorer users have a lower IQ than users of other browsers. The study, attributed to a firm called AptiQuant, found that the average IE6 user scored just over 80 on its IQ test – a score that, in terms of real-life accomplishments, is generally associated with elementary school dropouts and unskilled workers. The study was a hoax.
The Hoax
A hoax like this one obviously capitalizes on the anxiety we all feel about our own intelligence and on the prejudice that nobody in their right mind would ever use Microsoft’s Internet Explorer. It also allowed those who use fringe browsers like Opera and Camino to feel especially smug, as the average score of their cohort was supposedly around 125 (close to the level of most neurosurgeons). Safari users (who are most likely to be using Apple products) were also supposedly among the most “intelligent.”
Overall, then, this was a well thought out hoax, though there were plenty of red flags, as Wired’s Tim Carmody points out. The huge difference in scores, for example, doesn’t really make sense, and the average Opera user – while making a fine browser choice – isn’t likely to be a genius either. A quick Google search would have shown that AptiQuant never really existed before it released this report (even though it claimed to have data from 2006). The data itself also isn’t exactly trustworthy, as it relies on online IQ tests – likely delivered through spammy pop-ups – that carry little to no scientific validity.
Why?
If this was so obviously a hoax, then why did virtually everybody in the tech world run with the story?
Here are a few reasons why I think this story was able to get so much play:
Pressure to be fast, write more stories and get more pageviews: This “report” was published on a Friday, and while most people associate that day with fun, fun, fun, fun, writers still have to pump out a few stories – and news is generally slow on Fridays (that particular Friday was indeed a very slow news day). That pressure, by the way, is even stronger for writers who are paid by the story.
Stories about statistics can be written quickly and get pageviews: Indeed, the constant pressure to write more stories that get as many pageviews as possible is one of the reasons why we writers love stories about statistics: they are easy and fast to write, generally come with some pretty graphics we can use, and do well in terms of pageviews. I’ve written my fair share of them, and there is a legitimate role for stories that boil lots of data down into an interesting narrative. What often happens, though, is that writers simply believe whatever they see in these studies and run with it, without ever questioning the methodology.
Indeed, there is very little reward for writers who spend a lot of time going through the methodology section of a report only to find that their time was wasted because the report turned out to be untrustworthy. Writing a story about how IE users are dumb makes for a good headline and lots of pageviews, after all. A subtler story just wouldn’t get the kind of pageviews and rewards that “IE users are dumb as a bag of hammers” can get.
Microsoft sucks, doesn’t it?: There is also a general undercurrent of anti-Microsoft sentiment on most blogs that makes it even easier for a story like this to get through without an ounce of fact-checking (something most blogs don’t do anyway: you publish first, edit later and update the story as necessary). If the study had claimed that Safari users were significantly dumber than Chrome users, chances are we would have seen a bit more pushback and a lot less glee.
It’s worth noting that quite a few of the companies that create these studies also face a lot of pressure to get publicity and acquire new customers. Why they often risk their credibility by putting out statistics that are obviously wrong is beyond me. It’s up to the press, though, to examine this data and decide whether or not to trust it.