Quote Originally Posted by grimREEFER View Post
this whole thing sounds like its gonna be another 8800gtx vs 3870x2. except this time, the ati card was released first.
pretty much, yeah...

I'm surprised how confident ATI is though...
they knew that Fermi has more than double the hardware logic, and they knew that NVIDIA always aims for a single-GPU card to match the performance of their previous dual-GPU card... and from what NVIDIA showed at the deep dive event, that's exactly where Fermi's performance is at...

Quote Originally Posted by LesGrossman View Post
I guess the drama goes on and the "deep dive revelations" only made it worse.
well yeah, NVIDIA didn't say whether the card they showed was a GTX 360 or a 380, how many cores, what clocks, etc... of course this created a wildfire of speculation...

Quote Originally Posted by h0bbes View Post
Seems like fermi will not be as good as it was supposed to be and not as bad as it was hoped to be (by some)
whoever thought that Fermi would be notably faster than a GTX 295 wasn't anticipating but dreaming... :P

Quote Originally Posted by annihilat0r View Post
This is why you gotta applaud Nvidia for at least trying to do something different.
they are creating a massive, expensive, power-hungry super-performance part... how's that trying something different?

I think ATI's strategy of bringing the same performance level down to much lower prices makes a lot more sense, especially if you consider how the PC industry has developed in recent years...

thx for all the interesting info a couple of pages back, LordEC911!
Quote Originally Posted by LordEC911 View Post
AMD/ATi gets ~60% more dies per wafer, Cypress vs GF100.
Then again, current yields put AMD/ATi ~3x higher.
40% yields for Fermi? those are not going to be fully functional chips though... I can believe 40% for 448-core parts... but 40% for 512-core? nah...
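btw, to put those numbers in perspective, here's a quick back-of-the-envelope sketch in Python... the die sizes (~334 mm² for Cypress, ~530 mm² for GF100) and the yield figures are rumored/assumed on my part, not official numbers:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude gross-die estimate for a round wafer: (wafer area / die area)
    minus an edge-loss term proportional to circumference / die edge."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Assumed (rumored) die sizes in mm^2 -- not official figures.
cypress = dies_per_wafer(334)  # ~175 gross dies
gf100 = dies_per_wafer(530)    # ~104 gross dies
print(cypress / gf100)         # ~1.7x, in the ballpark of the "~60% more dies" claim

# Placeholder yields, pure speculation -- just to show the knock-on effect.
yield_cypress, yield_gf100 = 0.60, 0.40
print(cypress * yield_cypress, gf100 * yield_gf100)  # ~105 vs ~42 good dies per wafer
```

even with generous assumptions for NVIDIA, the good-dies-per-wafer gap gets big fast, which is why the yield question matters so much for pricing...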

Quote Originally Posted by SKYMTL View Post
I think the most impressive part about the Hair demo is that it was done TWO YEARS ago.
well... I wasn't impressed by it tbh... it looks a lot like the mermaid Nalu demo NVIDIA used a long time ago, and perf wasn't orders of magnitude higher... so I doubt we will see games simulate hair like that; it will remain a tech demo, just like the one NVIDIA showed before...

I hope I'm wrong though!

Quote Originally Posted by Andrew LB View Post
A fact? Really?
Two weeks ago I received my monthly investor update from TSMC which noted mass production has begun on nVidia's high end 40-nm GPU after months of delays caused by wafer production problems.
hmmm, neliz probably meant mass production of cards then... it takes a couple of weeks to get from wafers to finished chips... but I'm sure you know that
thx for the heads-up!

Quote Originally Posted by Andrew LB View Post
nVidia's CEO also clearly stated at CES 2010 that the GF100 is now in mass production.
well, how would you prove him wrong? I'm not saying he's lying, but he very well could be... he's famous for twisting words around...

Quote Originally Posted by BenchZowner View Post
Leaving A.I. enabled in FarCry 2 benchmark does the following:

1) Allows the CPU performance ( not actual amount of cores, meaning a similarly clocked Core i7 920 with HT off will perform the same as a 920 with HT on ) to have a decent impact in the benchmark's performance results because it partially simulates normal gameplay ( as in the PC "calculates" the enemies & other "live" objects moves & artificial intelligence stuff ).

2) In short-term benchmarks it can introduce a lil' bit more discrepancy between the results ( but that gets covered by the 3 runs of the bench )

It's like the Unreal Tournament "flyby" ( rendering ) and "botmatch" ( rendering + AI ) benchmark modes.
so with AI disabled a FC2 run basically becomes a flyby, while with AI enabled the bench tool pretends somebody is playing the game and does all the AI calculations on top of the rendering?
didn't know that, thx!
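for anyone who wants to picture the difference, here's a tiny toy benchmark loop in Python... all names and workloads here are made up by me, it has nothing to do with how the actual FC2 bench tool is implemented:

```python
import time

def render_frame():
    # Stand-in for the GPU-bound rendering workload.
    sum(i * i for i in range(2000))

def update_ai():
    # Stand-in for CPU-bound AI work (pathfinding, decisions, etc.).
    # This is why CPU speed shows up in the results with AI enabled,
    # and why scores vary a bit more from run to run.
    sum(i % 7 for i in range(3000))

def run_benchmark(frames=1000, ai_enabled=True):
    """Toy benchmark: 'flyby' style measures rendering only,
    'botmatch' style adds per-frame AI simulation on top."""
    start = time.perf_counter()
    for _ in range(frames):
        render_frame()       # always measured
        if ai_enabled:
            update_ai()      # extra CPU work, like simulated gameplay
    elapsed = time.perf_counter() - start
    return frames / elapsed  # average FPS

print("flyby-style (AI off):  ", run_benchmark(ai_enabled=False))
print("botmatch-style (AI on):", run_benchmark(ai_enabled=True))
```

same rendering load in both runs; the AI toggle just decides whether the CPU-side simulation gets added on top, which is exactly the flyby vs botmatch split BenchZowner described.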