The methodology is very simple and is explained in post #152.

To go into more depth: DVI/HDMI output a pure digital signal, and I get a byte-for-byte dump of the port. I can then digitally analyse every single frame, and the software tells me the percentage difference between one frame and the next. Where a frame is identical to the last, that is a dropped frame. A game locked at 30fps (as many console games are) is still outputting 60Hz from its HDMI port, so you would expect every other frame to be 0% different from its predecessor. In the case of Tomb Raider on 360, by and large this is the case.
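To illustrate the idea (this is not the actual tool, just a minimal sketch assuming each captured frame is a raw RGB array, e.g. a NumPy array decoded from the lossless dump):

```python
import numpy as np

def frame_difference(prev: np.ndarray, curr: np.ndarray) -> float:
    """Return the percentage of pixels that changed between two frames."""
    if prev.shape != curr.shape:
        raise ValueError("Frames must have identical dimensions")
    changed = np.any(prev != curr, axis=-1)  # True wherever any colour channel differs
    return 100.0 * changed.mean()

def count_duplicate_frames(frames, threshold: float = 0.0) -> int:
    """Count frames that are identical to their predecessor.

    With a truly lossless capture the threshold can be exactly 0.0;
    a small tolerance would only be needed if the capture chain added noise.
    """
    duplicates = 0
    for prev, curr in zip(frames, frames[1:]):
        if frame_difference(prev, curr) <= threshold:
            duplicates += 1
    return duplicates
```

For a 30fps-locked game captured at 60Hz, roughly half the frames should register as duplicates; anything beyond that points to dropped frames.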

Due to the completely digital, totally lossless workflow, the results simply can't be argued with. The Tomb Raider video (same gameplay, if not the same actual video being generated) is a clear example of how this can help ascertain what's going on with this particular CPU: it shows a rock-solid refresh rate (and therefore a similarly solid response from the controls), whereas the PS3 version refreshes very inconsistently in places. I believe this is exactly what you guys are saying with regard to the AMD chip vs Core 2.
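To make the "consistent vs inconsistent refresh" point concrete, a hypothetical follow-on to the sketch above could measure the gap (in 60Hz refreshes) between genuinely new frames; the names here are my own, not from the real software:

```python
def new_frame_intervals(frames, threshold: float = 0.0) -> list[int]:
    """Return the gap, in 60Hz refreshes, between each pair of new frames.

    For a game locked at a steady 30fps over a 60Hz output, every gap should be 2.
    Larger or uneven gaps indicate dropped frames and inconsistent frame delivery.
    """
    intervals = []
    last_new = 0
    for i in range(1, len(frames)):
        if frame_difference(frames[i - 1], frames[i]) > threshold:
            intervals.append(i - last_new)
            last_new = i
    return intervals

# A run like [2, 2, 2, 2, ...] is the rock-solid case; something like
# [2, 2, 4, 2, 3, ...] is the kind of inconsistency described above.
```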

So I'm sitting here with a Core i7 920-based system and a Q6600-based system, and I've also got E8400 and Q9300 CPUs along with oodles of DDR2 RAM. On the graphics side I've got a Radeon 4850 and 4870 along with an 8800GT and a GTX295. What I'd need would be an AMD board and the Phenom. If this chip does indeed have the gaming advantage it is alleged to have, my results would be gold dust for AMD, so if they want to put the chip to the test, I'd be willing to do it for free. And yes, I am usually paid for this sort of thing; I'd just like to see my tools being used outside the console arena.