And... you know... the memory interface.
"He looks really cool with this glasses"
I'd really want to edit that pic a little but the text is in the way. :)
they said Q1 2010 :yepp:
Come on, I want some GT300 farming!!! :eleph:
It's January 2010.
Anybody here as disappointed as me with the so called "Fermi demo"?
Whatever happened to the Facebook photo of the Fermi SLI setup - fake? Not ready yet?
Why did they only have 1 board to show at CES?
With some like me calling for aggressive short-sells/puts, it certainly wouldn't have hurt to show a dozen Fermi boards... i.e. prove it's really in production.
the video doesn't show fermi running anything, it just shows a display... :D
i would have liked to see the card from all angles, yes...
is this a trick question? :D of course!
they could have done a lot more to impress the audience if you ask me... but yeah, the demo showed that their drivers are doing pretty well i guess...
nah... i wish nvidia had STFU about fermi and hadn't created any hype for it in q4 of last year... they are stretching it too much... and no real news... it's boring...
hey nvidia, sup?
im gonna launch fermi real soon!
oh wow, neat...
next month:
hey nvidia, hows it goin?
im gonna launch fermi real soon!
oh, cool!
another month later:
yo nvidia, anything new?
im gonna launch fermi real soon!
hmmm... ok...
yet another month later:
hey nvidia, still launchin that fermi huh?
im gonna launch fermi real soon!
rofl... yeah yeah... whatever... :D
huh? what'd i do? :D
they also said november... and christmas... :P
now THAT'S more like it :D
way to go for saving the day, evga :toast:
those vents on the cards' backs look very nice, should be good for some nice airflow and low noise... at idle at least :D
in the past nvidia locked out tri and quad sli until they needed it...
if nvidia already supports tri sli with fermi, that kinda sounds like they will need 3 cards to beat a 5970 crossfire setup...
makes sense, that means a fermi will be faster than a 5870, but a 5890 might catch up...
interesting! :D
Woot, they finally added DisplayPort output to their reference cards and got rid of that ridiculous S-Video output port! I'll be looking forward to hooking up DP to my monitor when I get one of these cards!
yeah... i'd really like to know who actually used those s-video ports... i used it like 10 years ago to hook up my tv to the card...
but tvs have had dvi ports for a while now, so who actually still uses the s-video port? :D
HAHAH Yeah, I used s-video about 10-11 years ago too, to hook up my tv to cards, because back then it looked respectable. Even if they don't have DVI, you can still use a DVI->HDMI converter, so no problemos. I'm just interested in playing with DisplayPort 'cause it's new and it comes standard on the GF100! I've always wondered how tight the cables are, and whether they easily fall out like eSATA cables do!
Yep, that's true, but I don't like the thinness of the bars in the vent. I hope they are strong enough to take a small blow (transport safety).
I am sure that the GF100 will not be as fast as the 5970 with mature drivers, but seeing as how even my friend's GTX 295 beats the 5870, it's no surprise that the GF100 can beat the 5870. By how much, that's the real question.
Now, since March is just too far off for me, I'll have to buy the cheapest DX11 card I can find until then, or get a 5870 :(
Hi,
A French report about NVIDIA Fermi @ CES 2010, with some information about heat dissipation. PCWorld is in Las Vegas during CES; lucky them!
Link in French
Link in English (Google translation)
ThermalTake and CoolerMaster statement: Fermi is hot, hot, hot! ThermalTake will sell a specific Fermi-certified case... because for SLI it seems you need an additional cooling module... CoolerMaster is also building a Fermi-certified case, under NDA for the moment. But... very, very hot... and 300W for... a single GPU... how will they manage a dual-GPU card? Anyway, hope this is interesting for you.
FERMI + WATERCOOLING FTW GUYS :D! A new heat, power and overclocking challenge is coming soooon :explode2: :cheer:
300W for a single GPU?! That's insane!
Fermi had better be amazingly fast for its power consumption or it might be NV's version of the 2900 XT. :(
Doesn't sound very good to me..
I'm going to develop my own video card.
/me opens notepad.
Yes, thank you for that profound statement that has illuminated us all and yet has nothing to do with what I was referencing, which was whether Fermi would be able to beat an HD5970.
If you're looking to pick a fight with an NVIDIA fanboy, you might want to look at the signature first. :rolleyes: I simply said it would be rather disappointing if ATI's dual-GPU card didn't beat out Fermi. That's all.
I hope that the GF100 can compete against the HD 5970, which would be quite a feat for NVIDIA. But there really isn't any indication that this will be the case, except for some baseless rumours so far.
I don't like the news from the French article... you need a certified case to run Fermi SLI because each card might put out 300W...