Planet Mozilla blog community. Over the last few days, in response to Mozilla's new name-and-shame list of slow add-ons, Palant has been investigating whether Mozilla's testing methods are actually accurate.
Rather surprisingly, it turns out that Mozilla's numbers could be significantly wrong -- and even if they're not, the factors Mozilla uses to tabulate an add-on's final score should definitely be made more transparent.
In the first set of tests, Palant shows that FlashGot's position in the top 10 is probably due to a fault in Mozilla's testing setup, and that add-ons can perform very differently depending on which operating system they're tested on. In the second analysis, Palant uncovers an irregularity with no obvious cause -- though it could be down to an I/O bottleneck on Mozilla's test machines. Basically, even though performance testing of Read It Later is disabled because of a bug, it still (somehow!) manages to record a 14% slow-down on Windows 7.
Palant concludes both analyses by scolding Mozilla for going public with the performance data before confirming that its testing methods were accurate. It definitely looks like Mozilla has been more than a little reckless, considering the importance of Firefox's add-on ecosystem.