It was never said the GF104 was a dual chip.
They do have such a part, and have since November. More official info will arrive at CES in a week for you guys.
According to that post it is.
Quote:
GF100 promises to deliver at least 40% more performance than the GTX295 for less money. GF104 promises double that.
Do they differ much from the performance figures presented in October?
Quote:
They do have such a part, and have since November. More official info will arrive at CES in a week for you guys.
Ah, that was a bit of weird wording. The 100 is 40% faster than the 295; the 104 is 80% faster than the 295 (double the 40%). Which means the 104 is about 29% faster than the 100. Rahja can't confirm or deny whether the 104 is a single- or dual-chip part at all; it's still under his business NDA.
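Quick sanity check on those multipliers, in throwaway Python (the 1.40 and 1.80 factors are just the claims quoted above, nothing official):
Code:
# Relative performance implied by the quoted claims (GTX295 = 1.0 baseline)
gtx295 = 1.0
gf100 = 1.40 * gtx295  # "at least 40% more performance than the GTX295"
gf104 = 1.80 * gtx295  # "double that", i.e. 80% more
print(gf104 / gf100)   # ~1.286, so the 104 would be ~29% faster than the 100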
My guess is the 100 could be like a "GTX360" with disabled parts or lower clocks, and the 104 like a "GTX380" full bore, both single chip. We will know for sure soon enough!
I'm not sure what the October numbers were or where they came from, but it seems like they will be releasing some official information in ~2 days, and definitely at CES.
I'm sure a fully clocked and uncrippled Fermi can deliver those numbers. But can Nvidia/TSMC actually deliver such a chip in volume for the market?
Because from the looks of things so far, even A3 had to be SP-crippled/underclocked just to get it running.
Correct me if I'm wrong please, I do not want to buy another dual gpu card.
Well, maybe they are watercooling them, else it is utter nonsense (or a hair dryer engine is providing the airflow).
Also, a GF104/dual Fermi won't be able to double performance: SLI won't scale 100%, it never does, and secondly they won't be able to run it at full speed without exceeding the 300W TDP limit.
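To put rough numbers on that, here's a purely illustrative sketch (the 85% SLI scaling and 10% downclock are my own assumptions, not leaked specs):
Code:
# Why a dual-GPU card falls short of 2x a single GPU: SLI never scales
# 100%, and fitting under the 300W ceiling usually means lower clocks.
# All figures here are assumed for illustration.
single_fps = 60.0
sli_scaling = 0.85    # assumed typical SLI efficiency
tdp_downclock = 0.90  # assumed clock cut to stay under 300W
dual_fps = 2 * single_fps * sli_scaling * tdp_downclock
print(dual_fps)       # 91.8 fps, well short of the naive 120 fps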
I'd only believe those figures if they came from an Nvidia PR slide or something, because for those slides they always pick games that favor Nvidia GPUs, show bad performance figures for the HD5970, and the load temps were probably taken at 0°C ambient or something (who says it has to be 20°C ambient?). :D
In a proper 3rd-party review the numbers wouldn't be as nice, so I'm going to stick with my original speculation: the GTX380, a single-GPU but high-end part, will be on average ~10% behind the HD5970.
Ahh sorry it's 5am and my brain is fried!
This post should answer your question... kinda:
http://www.overclock.net/8085950-post411.html
He did mention "A3 silicon has just finished up, and is looking great" in this post:
http://www.overclock.net/8085891-post400.html
Thanks for the info, Kuntz; now I know where the "Fermi is smaller than GT200" claim comes from. Also good that he confirmed launch in the 2nd week of March and availability some time after that.
He's basically saying the opposite of everything I hear from people I trust. He says the die size is surprising; I'm saying it's GT200-esque. He says performance and yield are great; I'm hearing they'll have a hard time getting quantities for launch. And by quantities I mean chips that actually hit the desired speed.
Also, did someone here say the numbers were from 19x12 with 4xSSAA?
http://www.overclock.net/8085586-post389.html
Hmm... if Crysis was just on High, that's not that huge of a jump.
Quote:
Here are a few more bits of FPS data for you. These were collected at 1920x1200 with 4x MSAA (no SSAA this time) and 16xAF. All settings are on high (perhaps not extreme/highest though, it isn't clear from the included documentation)
Crysis Warhead: 58 FPS
Left 4 Dead 2: 143 FPS (judging from this result, it appears to be CPU-limited at this point)
Hawx: 127 FPS
World of Warcraft: 46 FPS (this result seems off to me, I assume this is a driver issue)
Fallout 3: 72 FPS (clearly a driver acceleration issue here again)
What are you talking about? Charlie did jump on the whole 448 SP thing, except he used Nvidia's own documents to do it instead of citing some unnamed source/rumour he would get flamed for.
All of you will see what Fermi is in TWO days, and someone is going to be seriously surprised :) :) :) Performance is BRUTAL!
Man this is going to be awesome. :D
I wouldn't even comment on what Rahja had to say... it's just way too fishy + vague.
In a few days I hope we get a bone or two.
At CES there will be a PC running Fermi in Tri-SLI...
A 5850 at 1GHz has brutal performance
(it just needs better drivers).
Same goes for a 285 or a 295.
Performance-wise, today's hardware is good enough.
What we need is good overclockability, low prices, great drivers, and so on...
Visionfinity seems ludicrous as a name, but Nvidia is fond of renaming, so I guess it fits them.
Can Fermi work out of the box, with good drivers?
Or is the chip so dependent on drivers, and on the team writing them, that it will take the whole year before they are mature enough?
Are the thermal specs too much?
If it gets really hot, a lot of RMAs are going to happen due to heat issues.
Fermi is an adaptive chip made for a new market: not for gaming first and foremost, but for Tesla and the folding market, which have different design requirements.
It will be interesting to see a working card in action, not some fake card held up by an excited CEO.
It might also be that the cards will be rare in the shops until summer.
Many questions, so few answers.
This Rahja dude is really "special":
http://www.guildwarsguru.com/forum/s...2&postcount=10
Quote:
Originally Posted by Rahja
http://www.guildwarsguru.com/forum/s...7&postcount=12
Quote:
Originally Posted by Rahja
The one Tri-SLI setup I know of will be there; no predictions... :)
TWO days to go :D
I really do hope we get some Fermi specs and benchmarks!
Oh and some new Drivers too.
John
Yeah, I'm looking forward to seeing how Fermi performs and what all the specs are. I wonder if what Xbitlabs is reporting is still correct, with the flagship model still sporting 512 cores. http://www.xbitlabs.com/news/video/d...ce_at_CES.html
What's with all the newly registered users with secret info about Fermi lately? :D
neliz from B3D? He had the scoop on a lot of RV8xx and GT300 info a long time ago, so he's credible!
I'm ready for some $200 5850s.
Stop spying on my dream, grimREEFER.
I have been living under a rock.
What exactly is "Fermi"?
I am not kidding. I really do not know.
Alright, I have found a detailed overview of Fermi/GT300.
http://www.realworldtech.com/page.cf...WT093009110932
I see the future of hybrid gpu/cpu technologies. A collision of general purpose and specialized task circuit designs will converge into a new chip to power both consoles and personal computers.. and it shall be called Megatron! er... wait..
For anyone hoping for a price drop on Cypress parts soon: the distributor price for one HD5850 is currently €224.40, excluding 20% VAT. I can't see Fermi pushing this price down anytime soon, and this should be an indication of its price after launch.
edit: Richard Huddy commented on Fermi's die size, saying it's nearly 50% bigger than Cypress and should beat it.
So again, the "surprising die size" rumours have been heard since May from types like Fudo and Theo, who simply write whatever nvPR tells them, even though it's false.
Quote:
Well if it's not faster and it's 50 per cent bigger, then they've done something wrong-
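For scale, a back-of-the-envelope take on what "nearly 50 per cent bigger" would mean (the ~334mm² Cypress and ~576mm² GT200 figures are commonly reported die sizes, not numbers from this thread):
Code:
# Die-area arithmetic from Huddy's "nearly 50% bigger than Cypress" remark.
cypress_mm2 = 334.0             # commonly reported Cypress die size
fermi_mm2 = cypress_mm2 * 1.5   # "nearly 50 per cent bigger"
print(fermi_mm2)                # ~501 mm^2, approaching GT200's ~576 mm^2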
Fermi will have 3D Vision Surround support. Up to 3 monitors 3D.
http://www.nvidia.com/object/io_1262775447639.html
neliz, did you screencap it? The link is down :(
BTW, weren't you on Rage3D? I do remember your name there... hmm, maybe I'm wrong, or maybe it was B3D...
It's good to see neliz on XS :yepp:
FOR IMMEDIATE RELEASE:
2010 INTERNATIONAL CONSUMER ELECTRONICS SHOW (South Hall 4 – Booth #35912), Las Vegas — Jan. 6, 2010 —It’s clear that 2010 is poised to be the year of 3D and NVIDIA is leading the charge to bring 3D to the computing masses. At this year’s CES show in Las Vegas, NVIDIA is showcasing a host of new products and technologies that show how PCs based on GeForce® GPUs along with the NVIDIA 3D Vision™ hardware and software ecosystem are the platforms best suited to make 3D pervasive for all PC entertainment, including Blu-ray 3D movies, games, photos, and even the Web.
On display in the booth, NVIDIA is demonstrating 3D Vision technology running on:
Desktop PCs using new 3D Vision-Ready, 1080p, 120Hz LCD panels from leading display companies, including Acer, Alienware, and others.
New 3D-Vision-ready notebooks from leading notebook manufacturers, all equipped with new state-of-the-art 120Hz 3D capable displays.
NVIDIA 3D Vision Surround, the world’s first consumer multi-display 3D solution, which allows users to span 3D content across 3 high definition monitors or projectors for a truly breathtaking and immersive gaming experience! NVIDIA 3D Vision Surround does for 3D PC gaming what IMAX® 3D does for movies.
In addition to the great 3D hardware, there is also a ton of cool, compelling content on display, all viewed in 3D when used with the 3D Vision active-shutter glasses. Of note:
New Blu-ray 3D software players, including ArcSoft's TotalMedia Theatre 3 and CyberLink's PowerDVD Ultra
Blu-ray 3D content, including 3D movie trailers for Disney’s Toy Story 3, A Christmas Carol, Alice in Wonderland, and more.
The hottest PC games running in full 1080p stereoscopic 3D, including James Cameron’s Avatar: The Game, Batman: Arkham Asylum, Dark Void, Just Cause 2, Need for Speed: Shift, and more.
Plus,
The world’s first sneak peek of YouTube 3D, running on a 3D technology demonstration version of the Adobe® Flash® Player software that is viewable with NVIDIA 3D Vision glasses in full color.
For those able to come to CES, NVIDIA will also be hosting a 3D photo booth. Come by and have your picture taken in 3D with the new Fujifilm FinePix REAL 3D W1 digital camera. Pictures from the show will be made available online for viewing.
Also, get a truly immersive, hands-on gaming experience with NVIDIA 3D Vision Surround which can be seen in NVIDIA’s CES booth in the South Hall 4 #35745, running on DepthQ HD 3D projectors and Acer GD245HQ 1080p LCD displays.
For more information about NVIDIA 3D Vision technology and the 3D Vision ecosystem, please visit: www.nvidia.com/3DVision.
Thanks LibertySyclone from hardforum
Nope :/ edit: I see you've done an excellent job tracking the press release though! ;)
Yes. I had an awesome (if I say so myself) F@H run, until I got caught at work when I ran F@H on servers we were preparing for the Dutch Ministry of Foreign Affairs :rofl:
Quote:
BTW, weren't you on Rage3D? I do remember your name there... hmm, maybe I'm wrong, or maybe it was B3D...
(they never found the 50 P4 HTs I ran it on as well ;))
Not sure what it's going to be like to have 3D across bezels. I don't think it would be nearly as easy to get used to as a standard 2D picture.
Quote:
NVIDIA 3D Vision Surround, the world’s first consumer multi-display 3D solution, which allows users to span 3D content across 3 high definition monitors or projectors for a truly breathtaking and immersive gaming experience! NVIDIA 3D Vision Surround does for 3D PC gaming what IMAX® 3D does for movies.
Which of those are the hottest? Seriously, I'd like to know, because I have no intention of playing any of them at this point.
Quote:
The hottest PC games running in full 1080p stereoscopic 3D, including James Cameron’s Avatar: The Game, Batman: Arkham Asylum, Dark Void, Just Cause 2, Need for Speed: Shift, and more.
I think Nvidia will keep B:AA in their "currently hottest games" list until 2045.
And where did you find folks naive enough to think Fermi would win a price war? nVidia pricing higher is as certain as Intel pricing higher than AMD.
The question is whether Fermi's performance/feature lead (if any) will justify the price; by the time it's launched, everybody worldwide will have had almost half a year to buy an HD5870.
PS: Slim chance, but I'm hoping nVidia comes up with better reasons to upgrade than 32xAA or 300fps in L4D. If they still have that developer whip, now's the time to use it (instead of the "DX11 isn't important" BS). After all, without the games, what's the point of a new video card?
I was saying that last year, that to me it seemed like the 5870 was a waste of money for gamers since refreshes would be out before the extra horsepower was needed. And it has nothing to do with being a fan of either NV or ATI, just that the majority of the popular titles could be played just fine on a 4890OC, GTX285, GTX295 or 4870X2. There are still no major DX11 titles (Dirt2....*snerk*), and the majority of gamers really don't need all the extra power in a card that is going to be surpassed within the next few months anyway with Fermi (supposedly) and the rumored 5890's and possible 5870X2 refresh. No real need to upgrade until we see more DX11 and graphically intensive titles.
I mean, from a benchmarking standpoint, yeah, get the fastest thing you can. But from an actual consumer perspective, what's the draw of a DX11 card that barely scrapes out 10 more FPS in Crysis Warhead? It's not really moving stuff that was unplayable on the previous cards into playable territory. :shrug: So you can crank the AA up from 4x to 8x and the AF from 8x to 16x and get the same frame rates. So what? Does the minuscule bump in settings warrant another playthrough of a two-year-old title? I guess maybe to some small segment of people it does.
I would imagine the first DX11 titles being played on first-iteration DX11 hardware are going to tax those cards pretty hard. Seems like if you really want to crank everything up with DX11 at high resolutions, you'd wait for the refreshes before going off the deep end in the $400+ GPU market... :cool: Not like we're going to see DX11-exclusive titles any time soon.
It's not an indictment of people who did choose to buy a 5870; I just don't see the appeal, particularly for those who already had one of the four cards I listed above... :shrug:
Well you should all try playing them again on an Eyefinity setup. From what I hear it breathes new life into older games too! :D
In another universe, maybe the superior product A can be cheaper than the inferior product B.
In this world, if A is better than B, then most of the time A will be more expensive (if it's not, somebody at manufacturer A is going to get fired :p:).
Especially when we're talking about well-known brands and not a new boy on the block.
I like your "title", and it reminds me of my upgrade motivation: ever since the Radeon HD 3870 I've been upgrading my GFX with each new generation because I want to replay Crysis at a higher level of detail (well, that's about 92% of the reason for jumping to the 4870, and now to the 5870)!
I have a 24" HP, and playing at native 19x12 is a must! After that come the details, and 30 fps is required (when you play Crysis with an Xbox 360 joypad you really don't need more; yeah, it's always good to have higher fps, but the dynamics of joypad gaming and the speed of turning around let you get by with 30 fps on average).
And once I'm done with Crysis (and Warhead), I replay all my favorite shooters all over again :)
Hopefully I'll find some time soon to play Crysis again on the 5870 on Very High.
And it's amazing that, years after release, there's finally a game that can come close to Crysis on Very High (or Warhead on Enthusiast). The first real competitor (IMHO) is Bad Company 2! The Frostbite 2 engine really looks gorgeous, and it is fast!
Now back on topic: CES is on, do we have any interesting info about GF100??
While you are at it, you could do yourself a favor and buy a Panny/Pio 720p/1080p PDP. Even in "just" 720p, it will slaughter any "pro" monitor you can come up with when it comes to gaming/motion. And Crysis on an enormous dynamic range display is a feast for the retinas. ;) :up:
Fixed. ;)
It's not even close when it comes to fidelity. They have totally different design goals, it's just not comparable and there's really no need to do it. Both (three :)) games are/will be great. :up:
Adjust your opinion: BC2 runs on Frostbite, not Frostbite 2 (I have facts to back that up).
The press is supposed to get some samples this week. Now, I'm not sure of the numbers; I know that for the Cypress launch not even all the websites got cards.
Quote:
Now back on topic: CES is on, do we have any interesting info about GF100??
"combination of peanut butter and chocolate, Jack Daniels and Coke, and macaroni and cheese = NVIDIA 3D Vision Technology" - for me that equals vomit.
and
"Of course, you can also use our Surround technology in 2D mode too, but why would you want to game in a flat environment?" - so it can basically become Eyefinity and you can use cheap 2D monitors with it, yay...
http://blogs.nvidia.com/ntersect/201...echnology.html
At least there is 2D 3-monitor support, which I like very much. 3D displays are too expensive for my taste, but it's a nice addition for the future though.
I did remember then :D It's been ages; I've forgotten my password for Rage3D :(
2004-2005 were my active years on Rage3D.
How will they run it on a GTX285 without running SLI or taking the Quadro approach?
edit: I haven't folded since 2004 and am still #219 over at Rage3D /37000 overall.
Not a single reviewer will get a GeForce 300 this week; NVIDIA will send samples later, in February. But they will have plenty of info about the GeForce architecture, features, and performance - but no CLOCKS! In a few days the NDA lifts (someone will leak it, for sure) and every site will run an article about it. Be patient.
The only question I still have is whether they managed to get clocks over 700MHz or are staying below it.
Rys from B3D said he would probably get one this week; now, I don't know if he's considered a "reviewer" in that sense...
Hehehe, I predicted this when ATI launched their 6-monitor stuff :D
Quote:
NVIDIA 3D Vision Surround, the world’s first consumer multi-display 3D solution, which allows users to span 3D content across 3 high definition monitors or projectors for a truly breathtaking and immersive gaming experience! NVIDIA 3D Vision Surround does for 3D PC gaming what IMAX® 3D does for movies.
I think for 3D displays having a bezel between the screens is REALLY annoying though... it's way worse than in 2D...
Nothing new really... I had hoped they would have 240Hz panels or bezel-less 3D panels :/
Damn, the link is not working for me... OK, used a proxy, now it works.
Don't worry, LR has more pics.
But they mistook the 2nd week of March for March 2nd.
http://www.legitreviews.com/article/1175/1/
http://www.legitreviews.com/images/r...gear_shift.jpg
Looks like tessellation is working in full swing, according to the images :)
At what res, etc.?
PCGH shows it above 30fps, so please, a few more details:
http://www.pcgameshardware.com/aid,6...2009/Practice/
Ohhh yeah, did anyone notice Nvidia says the results are better than the 5870 (NOT the 5970), and another site says around 30% better...
Sorry to disappoint you, but...
http://www.tech-caffe.com/gfx/fermi.png
Lol
Someone has stolen the other card!!!
Exhibit A
http://i186.photobucket.com/albums/x...tersen_675.jpg
Exhibit B
http://i186.photobucket.com/albums/x...leh/ces-03.jpg
In the bottom left you can clearly see the power cables for the other card!
Prime suspect - Tom Petersen
Was Nvidia's CES date of 1/7/2010 chosen intentionally to commemorate that infamous Charlie story about 1.7% yields?
Nice, facts are always welcome! Feel free to provide them ;)
I wasn't inquiring about NVIDIA's sampling strategy; I was asking whether we have some solid info about GT100.
Quote:
The press is supposed to get some samples this week. Now, I'm not sure of the numbers,
The same will happen with GT100, but then again, what does the Cypress sampling situation have to do with the facts about Fermi?
Quote:
I know that for the Cypress launch not even all the websites got cards.
http://forum.beyond3d.com/showpost.p...6&postcount=66
and:
Quote:
Originally Posted by repi
http://forum.beyond3d.com/showpost.p...1&postcount=71
Quote:
BC2 does support DX11, although not as extensively as Frostbite 2.
WTF has GT100 got to do with this? Oh... I see what you did there. I was talking about the very limited availability of parts; some people were already disappointed with AMD's "ten thousands" of cards.
Quote:
GT100?
...
GT100
I think over the next week or two, there will be quite a number of leaks. Good stuff.