Here they are:
http://www.fudzilla.com/content/view/15762/34/
http://www.fudzilla.com/images/stori...nfermi_wtr.jpg
http://www.fudzilla.com/images/stori.../fermi_wtr.jpg
Is it a new trend to make heatsinks that suck at working? lol
BIG FAKE PIECE OF PLASTIC:
http://i36.tinypic.com/2zz7ntf.jpg
source: bit-tech
http://i34.tinypic.com/34inz9j.jpg
http://i34.tinypic.com/2d7sx83.jpg
http://i33.tinypic.com/2enbyf9.jpg
http://i34.tinypic.com/o00x12.jpg
http://i34.tinypic.com/2wmdfec.jpg
http://i34.tinypic.com/2d7yaew.jpg
http://i37.tinypic.com/2yjwpba.jpg
http://i36.tinypic.com/wmeauw.jpg
http://i38.tinypic.com/t6bg95.jpg
http://i33.tinypic.com/2i92r0n.jpg
Seems a little short?
But then what's this :
Source -----> Anandtech
Quote: Originally Posted by Anandtech
Yes, it's definitely late. This is damage control, a showing to stop the sale of 5000-series cards. It'll be here by Christmas, and it'll be faster than a 5870.
1 8 pin connector?
Again a new thread...
That could be a blank PCB with a fancy cooler for all we know.
looks short.
Come on folks, this is a Tesla unit, as written on the heatsink, and it only has one DVI out...
No, the picture is of the high-end G300 variant; it's just that the board layout/design of the power connectors is something we have seen before.
People have confirmed that the card in the picture, with the 8-pin showing on the end, also has a 6-pin.
When other 40nm variants get cancelled, you have got problems...
And actually there is a current NDA for some Nvidia info, plus one NDA just expired.
The pic is a Tesla, not a GeForce.
Well, you think it's slower than a 5870? There is no fact, NDA-breaking proof, or benchmark to back up that assumption.
Think of it like this: 4870 1GB x 2.2 = 5870 1GB on paper, but actual performance is more like 4870 1GB x 1.7.
And... GTX 280 x 2.4 = GTX 380. Where would you place a GTX 380's performance?
Certainly not slower than a 5870? I'm guessing it will be the same gap there was between GT200 and RV770. A repeat of last generation, or slightly faster for Nvidia.
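The back-of-envelope math above can be written out explicitly. A minimal Python sketch, where the scaling ratios (2.2x, 1.7x, 2.4x) are the poster's speculation and `projected_perf` is just an illustrative helper, not anything from Nvidia or ATI:

```python
# Hypothetical scaling math from the post above.
# All ratios are the poster's guesses, not measured numbers.

def projected_perf(base_perf, spec_ratio, efficiency=1.0):
    """Scale a known card's performance by a raw spec ratio,
    damped by an efficiency factor (spec scaling is never perfect)."""
    return base_perf * spec_ratio * efficiency

# Treat the HD 4870 1GB as 1.0 and apply the poster's numbers:
hd4870 = 1.0
hd5870_specs = projected_perf(hd4870, 2.2)           # 2.2x on paper
hd5870_real  = projected_perf(hd4870, 2.2, 1.7/2.2)  # ~1.7x in practice

gtx280 = 1.0
gtx380_specs = projected_perf(gtx280, 2.4)           # 2.4x on paper
# Applying the same ~77% scaling efficiency seen on the ATI side:
gtx380_guess = projected_perf(gtx280, 2.4, 1.7/2.2)

print(round(hd5870_real, 2))   # 1.7
print(round(gtx380_guess, 2))  # 1.85
```

By that (speculative) logic a GTX 380 would land a bit ahead of a 5870, which matches the "same gap as GT200 vs RV770" guess.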
Wow that tesla looks beastly. I wonder how fast it can run Crysis? Oh wait, wrong thread. Sorry :rofl:
ouch @ the power connector on the end of the card
well done nvidia... but i dont really buy it...
The card seems to definitely be a dummy, and the demos they showed probably didn't run on actual GT300 hardware... they never showed a running GT300, just performance graphs and details about the architecture.
intel has been doing almost the same with lrb for 2 years, and ati could have done the same with 8xx for over a year as well...
hints that the card is a dummy:
- 8pin power only
- pcb is shorter than gtx285
- no pins in the power connector
- no solder spots on the card at all
- no pcb components visible anywhere on the card from any angle
- only one dvi out
- nothing on the pcb below the fan, nothing at all
- the second slot exhaust on the back is blocked by black plastic, probably part of the fan shroud... why? Otherwise we could see behind the plastic shroud and notice that there's actually nothing inside.
- the way Jensen holds the card shows that it has close to no weight
- Jensen also holds the card at a particular angle at all times, hiding the backside of the card so we can't see it... why? Because it's most likely empty...
http://img199.imageshack.us/img199/8103/fermidummy.png
such an obvious fake...
now the real question is, were the demos they showed REALLY run on a gt300 or was it simulated or run on a gt200? everything they showed could be pre-rendered... except for this:
http://www.youtube.com/watch?v=RMtQ62CnBMA
but its not that impressive and could have been done on gt200 or maybe several gt200s or even on cpus...
everything they did, every single thing i saw at least, they could have done without having any gt300 silicon at all...
oh and then check out this:
www.nvidia.com/fermi
Hmmm, when was the last time I saw Nvidia try to create hype for their next-gen part before it was released? NV30, anybody? Do you guys remember the GeForce FX hype? They even had a custom URL for the FX showing pictures of the GPU, a countdown to the launch, and kept talking about how programmable it was...
while it was true that it was really quite flexible, it completely sucked when it actually came to playing games...
i have a bad feeling history is about to repeat itself...
All they talk about on the Fermi page on nvidia.com, at the Fermi event, and in all the media material is how flexible it is, how programmable it is, how great it is for GPGPU, for PhysX calculations, for scientific calculations, etc... but not a word about gaming performance...
it seems that nvidia and intel both screwed up gaming performance of their gpus trying to outgun each other in gpgpu performance...
Why are there two large holes in the back of the PCB?? I don't remember the GTX 280 having any, and my GTX 275 doesn't have any???
Also, for something that is supposed to have more than twice the power of a GTX 285, the PCB seems very small IMO. Let's have a comparison of a new small GTX 260 and the supposed GTX 380.
http://img194.imageshack.us/img194/5690/001238704.jpg
http://www.techpowerup.com/img/09-09-30/74c.jpg
Also there are connectors for another DVI port under that....
lol, yeah, the FX thing really got me too. I bought an FX 5900, aka the leaf blower (very wrong choice); my brother bought an FX 5700, which wasn't as bad in price/performance as the FX 5900!!!
Come on, guys... another new GT300 thread? I'm starting to go crazy with this flood of different GT300 threads... wouldn't it be easier if we had a single thread about GT300 news? It's not like I'm a Gulftown...
I'm actually reading the paper about the new architecture, and it seems to have a few curious changes. Let's see how it all ends.
But on the other hand, I find it highly worrying that they are running such a hype campaign before even announcing a release date (not even an approximate one). They usually never do that: no hyping their next generation on their website (whenever that happens), no papers purposely given to the media without NDA explaining the changes they are making to the architecture and why, etcetera.
All of this makes me suspect that the release date is still far away. Otherwise they would simply release the video cards when the time came, and that would be that. They are making such a big effort to create a halo of "all of you should wait for our product" that they must really need it...
1. The card shown is a Tesla, so the lower power consumption can be understood through lower frequencies and the smaller dimensions of the GT300 GPU compared to GT200.
2. Also, people don't know the power (connectivity) requirements for Tesla: NVIDIA Tesla C1060 [2:15]
3. Don't compare Tesla to the GTX 285 consumer card, or other consumer cards, in any way.
Yep, tesla models have smaller energy draw.
well, they DID say "now you guys know what you want for christmas" which is a clear hint, but its vague enough to later say "huh? what? no, we never said it would come out for christmas" :D
For those who don't remember the GeForce FX fiasco:
June 2002 - ATI launches R300, aka Radeon 9700, which completely destroys the GeForce4 Ti series
November 2002 - www.geforcefx.com pops up but Nvidia denies it's affiliated with Nvidia; later they admit it, and some PR material and a countdown show up.
December 2002 - paper launch
Jan/Feb 2003 - actual launch of the FX 5800, which got ripped to pieces by most sites: expensive, incredibly noisy, hot, and slower than the previous-gen GeForce4 series in non-DX9 games (GF4 = DX8)
It does reek of that to me, though not as extreme; the 9700 Pro was a bigger slap to Nvidia. Still, given months of delay and crappy silicon, that's all it would take for Nvidia to find itself back in NV30 territory. I wouldn't put it past Nvidia, and I'm just picturing the panic in their HQ over finally realizing their deja vu predicament :ROTF:
Sounds like a paper-launch Xmas special to me, all this. Keep people from ATi with the promise of goodies. It's all in the hope of stopping people from getting the 5800 series.
Here's another report alongside a die shot
http://www.hardwareluxx.de/index.php...afikkarte.html
That's a huge exhaust there! :shocked:
Here: Anandtech: Nvidia's Fermi...
He sums it up nicely at the end. It's exactly what I was thinking when I read the specs.
What's up with that 5th row of PEG solder points underneath the 8-pin? Did they saw part of the PCB off? lol?
http://i36.tinypic.com/so3hfo.jpg
Given all of nvidias recent practices concerning all kinds of matters, I really hope they get slapped hard by the ATi 5k series and Larrabee, might make nvidia roll a decent product out the door then instead of endless renaming schemes and rehash after rehash.
Where? You mean the GPU shot? That's not a die shot! Die = silicon; what they show is a lid labeled U90... 0935 = end of August, so it's only 4 weeks old... pretty much still steaming and warm for a piece of silicon :D
I saw more pics of the card... it's a very weird contraption for sure...
It almost looks like a frankenstein card that somebody glued together out of pieces of other cards :p:
Oh, and Jensen apparently DID make a statement about retail availability. Several sites quote him as saying "only some few short months" :eh:
A few is at least 3, and while I'm sure Jensen wants them to be short, they won't pass any faster no matter what he does or says :D
Today's the first of October; at least 3 months means late January / early February... hmmm, didn't somebody predict GT300 will arrive in late January / early February? :wasntme: :D
Is this a functional card? I noticed something odd about the soldering points on the back of the card. Their arrangement indicates that the connector should be on the top, not the back. And there should only be 4, so I am not sure what the extra 2 at the very edge of the card are for.
Here, take a look at the back of the card. Now compare it to the back of a GTX 285. The arrangement of those soldering points should tell you what the connector's orientation on the video card should be.
Edit:
Looks like there is a 6 pin connector on top. However, I don't see any soldering points for it.
Ninja Edit:
Here is another pic from another angle showing that there is
-6-pin connection on the top-rear side of the card
-8-pin connection on the rear-top side of the card
They were just showing off the tesla cooler cover, nothing to get worked up over. Nice looking cooler though. I like the chrome.
Yeah, not sure how the power connector can be out the back with the solder points in a row like that. They should be turned 90°.
thanks for the pix, eastcoasthandle.
It just looks like they sawed the end of the pcb off:
http://i35.tinypic.com/10yqrmt.jpg
If you noticed in that pic above, there is a small outline for what I believe to be the 4-pin fan connector (it could be for something else), which is underneath the 6-pin connector. However, there are no soldering points.
This card must use loads of energy. This is the Tesla version, which has lower power usage than the GTX series, and it still uses between 225 W and 300 W; so unless dual 8-pin plus a 6-pin is an option for Nvidia, we probably won't see a dual-GF100 card.
Is it possible this card is just a mockup for show? Build the case but use an old PCB cut to fit? Nothing about these connectors makes sense. If they were optional, like an 8-pin on top and one on the side, and you choose which to use, that would be cool. But requiring room up top AND on the side is going to look really ugly in a clean case.
EDIT: there's a screw hole under the 6-pin connector; that can't be right
add to the fact that you cannot exhaust air out of the thing, and I'd say you're on the right track :rofl:
http://i34.tinypic.com/s4cmmp.jpg
oh, and how are you supposed to put your sli connectors on? :shakes:
http://i34.tinypic.com/o00x12.jpg
Err?
The page does have a die shot. This:
http://i479.photobucket.com/albums/r.../Fermi_Die.jpg
This is what I've been assuming for some time now. If we see one, it will be on 32/28nm and at reduced clocks (if not with disabled units as well). For Nvidia's sake, hopefully the 380 can compete with the 5870 X2; otherwise they are in trouble, as they don't look to have any mid/low-end GT300 cards (i.e., the real profit earners) in the design pipeline in any reasonable amount of time. I sense GT200 rebrands...
Does anybody else have a deja vu feeling with that PCB?
I can't help but think I've seen it before...
They definitely cut off the back part and made it shorter, like literally cut it off with a saw LOL... they even cut some solder pads in half, as you can see in the better high-res pics... :rolleyes:
It's not any GTX 260/275/285 PCB variation... it's not the old or new 295 PCB, it's not the 9800 GX2 or 7950 GX2 PCB... it's not anything G92-based either, from what I can tell...
The odd thing is that it has the solder spots for an 8-pin to the top and then apparently another 6-pin to the top further back, but only 2 pins are visible; the rest of the PCB has been sawed off...
Previous Tesla cards have this odd one-power-connector-up, one-to-the-side thing as well, BUT they have the 8-pin to the side and the 6-pin to the top... well, this card does too, plastic-connector-wise, but the actual PCB solder spots contradict that completely :lol:
The heatsink shroud will probably be used for the Tesla card once it's finished; they can't make it much longer because there's just no more space inside their Tesla blade servers, and I think they don't want to redesign those... but the PCB is completely different from what the final cards will have...
What really sticks out when looking at this card compared to previous generations of Nvidia cards is that the GPU seems to be centered above the PCIe interface... that's odd, because then either the NVIO chip needs to be moved (and where to?) or the memory chips need to be moved... which would mean there are only memory chips on 2 sides of the GPU... which sounds more likely.
conclusion:
Very lame launch event from Nvidia... very, very lame... as in setting a new record for paper launches... :shakes:
They should have at least shown a dummy card that looks more like the actual retail cards... or maybe they don't even know what those look like yet, because they are still that far from actually mass-producing them :rolleyes:
It's beyond irony that it's Nvidia who criticized Intel for their LRB PR campaign all the time (rightfully!), and now they beat Intel at lame PR campaigning with a paper launch based on hot air and bold words, oh, and a video of some floating green spots and a frankenstein sample card built from old sample parts they fished out of the R&D lab's trash can :clap:
we'll just have to see how the drivers are, and also this list of things that could put the 58X2 ahead,
1. GF100 is purely a Tesla GPU; unlikely, but given the lack of hardware tessellation and that this is focused on the GPGPU market, it's possible
2. GF100 uses so much power and emits so much heat that they have to dial it down a few notches for consumer use.
3. GF100 costs about the same as the 5870 X2 and performs a lot better in some games, but gets owned when DX10.1 and DX11 come into the equation.
4. In the GPGPU segment, Larrabee takes the crown.
ohhhh die shot as in logic/design die shot! :D
my bad! :bows:
Yeah, but that doesn't mean anything, does it?
You can create this easily using the design tools + Photoshop...
You not only don't need working silicon, you don't even need any actual silicon for it...
They claim they do... but considering that they didn't mention them AT ALL, and didn't even bother to invest 20 minutes to glue some VGA parts together to create another dummy card... I really don't think they will have cut-down GT300 cards very soon... I mean actually cut-down silicon, not half-broken GT300 trash...
helloworld 98, that makes sense...
I think it was a huge mistake for Nvidia to make their GPUs that GPGPU-heavy...
The smart way is doing it the Intel way: Intel doesn't release a new CPU that is more GPU-like at the cost of CPU performance and price; they create a different chip. That's what Nvidia should have done as well...
saaya, my guess is it's either 1/2 a 9800gx2, a last gen tesla board, 1/2 a gtx295, 1/2 a mars, or an internal lab sample. I've been kinda surfing around to see if i could find which board it is too :)
lol you mean like "faking" a paper launch? :ROTF:
What I found particularly ironic and funny was that, with SemiAccurate reporting all that "only seven samples came back from A1" BS, there was a #7 written in black marker on top of the week-35 A1 Fermi sample's IHS!!! Lol, the irony.
http://i36.tinypic.com/wmeauw.jpg
it is kinda funny they hold up a mockup, i truly wonder what kind of silicon is under that, if any.
and can it run crysis?
I dunno dan, but speaking of crysis... I was reading at [H] a post by trinibwoy - there was a poster talking about geforce/tesla and asking why all Nvidia did was talk about gpgpu and GPUc at GTC. Trinibwoy responds saying, "it's GPU Technology Conference, not Crysis Framerate Conference." LOL!
I believe he is correct. Here, take a look at the GTX 285. As you can see, there is a ring around every hole where a screw is used.
Umm... isn't it small (by Nvidia's standards, of course :p:)
Lol wow, how embarrassing for NV. Well, I guess they'd actually have to give a *edited censor bypass - DilTech* about their customers for it to be embarrassing.
I checked the MARS PCB; it's not the same... the GPU is in a different position...
Heheh, yeah, it DOES look familiar, doesn't it? I could have sworn I've seen those PCB vent slits before... but I can't find anything matching now...
I checked all the Tesla cards I could find; they look different...
GTX 295, checked old and new, nope...
I suspect it's an internal lab sample... possibly even of a GT300...
well for a second i even thought of making a video with some friends and staging our own :banana::banana::banana::banana::banana:infast vga launch in the same style and putting it on youtube ^^
but i dont have a cam and i think my buddies here in taiwan wouldnt want to spend time on that hehe :D
LOL i didnt notice that hahah :D
Well, didn't Anandtech confirm the 7-working-GPUs thing?
He did put it in perspective, though, and said it's not as bad as it sounds, and I agree with him... anyway, the #7 on it is definitely funny LOL
And it looks like they tried to wipe it off, but you can still see it slightly :lol:
My guess is that it's an internal lab sample of a dual-GT200 card with a Tesla mockup shroud.
So, about Crysis... I doubt it... especially since they sawed off a part of the PCB! :lol:
As if that would change anything :P
The PCB clearly has solder spots for an 8-pin connector on top, and next to it a half-cut-off 6-pin connector, both facing the top... yet the card shows a 6-pin facing the top and an 8-pin facing the side :p:
They do; you can buy Tesla cards as standalone cards to use in any PC or server, and those cards have a proper shroud and all, google it :)
One thing I gotta say about this launch event... the plastic shroud of this frankenstein card DOES look really nice! :toast:
I really like it... looks better than the ATI cards IMO; the plastic the ATI shrouds use seems really cheap
I think people here are getting too worked up over this video card and its upcoming release... let it go and stop over-analyzing a few stupid pictures!
I don't think Nvidia will make the same mistakes they made years ago with cards like the FX....this card will almost undoubtedly beat the 5870 in performance but at the same time will cost more. I thought it was hilarious when the Nvidia guy cursed when telling Anandtech that when building huge GPUs, it takes a long f***ing time!
More important than trashing Nvidia and their paper launch is how much money they will actually lose during the months that ATI has the 5870 on the market alone. If Fermi doesn't destroy the 5870 performance-wise at launch, they will lose this round yet again... and that's sad when you have the superior product.
I was talking about the tiny 1u blades similar to this: http://gizmodo.com/270650/nvidias-te...uters-fo-reals
I know you can buy them standalone with the shroud and fan.
The pic is fake for sure. No way could the light come from UNDER the power connector. Photoshopped.
Superior in which way? FLOPS/mm²? W/mm²? Manufacturing costs? Raw performance? Performance/mm²? Performance/W?
There is one thing in which the HD5k series is superior: it's early to market.
Looking at all the pics and replies, I am really starting to believe that it's fake, LOL.
Interesting read from jasoncross.org about this card.
Yes, because by the specs GF100 seems to be far from superior compared to RV870. It's huge, it's slow (GFLOPS-wise), it will cost a ton, and there are no cards out yet. So yeah, that's the "superior product" Nvidia has. :rolleyes:
The only way I can see GF100 being superior is in GPGPU applications. That's where it's aimed. Gaming performance seems to be the secondary priority.
For gaming, it's just bigger than ATI's, not superior. For Tesla, I think it's a very nice upgrade, but who's competing with them?
Again, superior in which way? If you care for perf/watt or perf/price, is it really going to be even better?
From the company's point of view it's even worse. For consumers, GF100-based cards could actually be a better deal in perf/price, but for Nvidia such a deal is not really good; it just cannot compete on price due to a (likely) more expensive product.
I think we're agreeing; I said it's just bigger and NOT superior. I think it will have very similar perf/watt and perf/$ (like within 10%) to ATI. Price is fully controllable by Nvidia, but I expect the architecture to do well enough to provide a close perf/watt ratio. And given the die size difference, I think it will have a lower perf/mm², but that is expected with its CUDA background, and shouldn't affect anything except cost.
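The perf/mm² argument can be made concrete with rough numbers. In this sketch the die sizes are the commonly reported figures (Cypress ~334 mm², GF100 ~529 mm²), while the relative-performance figure is pure speculation for illustration:

```python
# Rough perf/mm^2 comparison for the argument above.
# Die sizes are the commonly reported figures; the +40% relative
# performance for GF100 is a made-up illustrative guess.

def perf_per_mm2(relative_perf, die_mm2):
    """Performance delivered per unit of die area."""
    return relative_perf / die_mm2

cypress = perf_per_mm2(1.0, 334)  # RV870/Cypress as the baseline
gf100   = perf_per_mm2(1.4, 529)  # even if GF100 were ~40% faster...

# ...the much bigger die still drags perf per die area below Cypress:
print(round(gf100 / cypress, 2))  # 0.88
```

Which is the point being made: a lower perf/mm² doesn't hurt the buyer directly, but it raises Nvidia's manufacturing cost per unit of performance.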
It's not stupid pictures of a card, it's pictures of a stupid card :D
If Nvidia can't show a GT300 retail card, or doesn't want to, fine...
If they decide to show a mock-up instead... fine.
If they decide to claim the mock-up is the real deal... that's pretty lame, but whatever... their call...
But not even getting a proper mock-up done, and instead hack-sawing through some old trash PCB and gluing parts to it... ouch... that's the most amateur move I've seen from Nvidia in many years...
And you know, if it had been a proper mockup, who would have cared? Nobody would have complained in a couple of months, when the card actually comes out, that it looks slightly different... if you plan to lie to people about this stuff, you might want to use something a little more sophisticated than a 20-minute hot-glue mockup... :D
If they have the superior product, then they won't lose :)
How can they lose if they have the superior product?
I was a bit bored, so I put this together heheh :D
http://img28.imageshack.us/img28/2883/tesafilm.png
Quote: Originally Posted by Jensen Huang
as you can see we added some nano-technology plastic tape, which is wrapped around the card, protecting it from super-charged alpha particles from outer space! *audience applauds*
:D
this forum is starting to suck... :rolleyes:
do you know what makes forums like this suck? users like you. :mad:
The anti-Nvidia sentiment is so stupidly high here, it's nearly forcing me into becoming an Nvidia fanboy, even though it's been two years since I last used an Nvidia product, and I am probably two weeks away from getting a 5870.
The funny part is, if Jen-Hsun had said "We'll totally kick ass gaming-performance-wise" instead of the CUDA thing, you guys would still mock him.
We do not know the performance of Fermi. Kindly shut the hell up until you DO know that it is NOT performing and it IS expensive. Too much speculation, too much bullcrap talked about Nvidia based on that speculation.
Fermi's launch has been late and it's bad for Nvidia. However, this doesn't mean it's going to be slow or anything.
If the perf is right, and the only problems are yield and heat, they can cut down GT300 to compete with the 5870... they really HAVE to; I don't see how their strategy can be to only have GT300 and GT200 and nothing in between...
who? me? :confused:
"LEAVE NVIDIA ALONE! ;_; :mad:"
So yeah. No negative speculation because it could actually imply that Nvidia would have really messed up big time this time?
We know for a fact that bigger chip means bigger costs.
We know for a fact that 384-bit memory controller means more complex PCB and more costs.
We know for a fact that Nvidia's CEO used FAKE card to show the upcoming card.
We know for a fact that Nvidia's product is late to the market, so it has been having some problems already.
Expensive card AND competitive pricing means less profit for the company.
So, based on this, what positive can you say about the upcoming product? If nothing, would you propose people to be quiet for the sake of "being fair"? :)
Probably... for the GeForce FX they did claim kick-ass gaming perf, and they failed big time... I think anything Nvidia says right now will be criticized by most... they are trying to be a party pooper for ATI, so don't you think they deserve at least some of the criticism they get?
If they'd actually shown something real, it would be acceptable IMO; but yelling "hey, hey, look at me! uh! uh! over here!", then talking only about GPGPU and the PhysX most people are fed up with, and showing a patched-together mock-up card while claiming it's the next-gen card, when anybody who knows a BIT about VGAs can tell it's a fake? What do you expect? People cheering for them?
But you don't know that it's not going to be fast as to compensate for its price.
Yeah, you do know all you have listed; but come Fermi, and if its price is in line with its performance, then all of what you've said will be in vain.
There are two sides to the equation - price AND performance. So far we only know that it's going to be a big chip and thus going to cost a lot. This doesn't bode well for Nvidia but still none of the critics for Fermi (aside from the ones complaining for the late launch) know anything about the other side of the equation.
So it's funny that you guys are talking as if you know how the product's performance is going to be. Until then, all such bashing based on pure speculation is funny and stupid.
Can't say I'm sorry. A good joke is a good joke. Also, since the card was not the real deal and was still shown by the CEO, I think it deserves some bashing. Had ATi done the same thing, we'd be hearing this from some of you instead (well, saaya probably would have done the Photoshop anyway).
Edit:
Highoctane:
Sure, there are flames in Nvidia threads but equally in the ATI/AMD threads. And as I said, this fake needs to be called out on. It's really lame.
It's come to my attention that every single NV thread ends up with people posting with censor bypasses and inappropriate language.. If you see it, PM me directly and I will handle it, as my email is screwed up presently so I don't receive the notifications right now.
This is a warning, if any of you are guilty edit it now. At 5pm Central I will be sweeping thread by thread, and taking care of things.
If you're feeling froggy, I'll feed you an M-80.
I'm also convinced that the Fermi graphics card is fake. I found this pic of the PCI Express connector on the Chinese website PCpop.com, and it says: "180-11072-1102-A00" (link)
I checked some previous PCB codes:
GeForce GTX 295: 180-10656-0102-B01
GeForce GTX 285: 180-10891-0102-A01
GeForce GTX 9800 GX2: 180-10790-0102-A02
saaya, do you have more detailed info about these codes?
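The codes above all share the same four-field shape. A hedged Python sketch that splits them, where the field names (`board`, `variant`, `revision`) are guesses inferred from these four examples, not official Nvidia terminology:

```python
# Split Nvidia "180-XXXXX-YYYY-RZZ" PCB codes into fields.
# Field meanings are guesses from the examples in this thread,
# not documented by Nvidia.

def parse_pcb_code(code):
    prefix, board, variant, rev = code.split("-")
    return {"prefix": prefix, "board": board, "variant": variant, "revision": rev}

known = {
    "GeForce GTX 295":    "180-10656-0102-B01",
    "GeForce GTX 285":    "180-10891-0102-A01",
    "GeForce 9800 GX2":   "180-10790-0102-A02",
    "Fermi card (PCpop)": "180-11072-1102-A00",
}

for name, code in known.items():
    fields = parse_pcb_code(code)
    print(name, fields["board"], fields["revision"])
```

Notably, the Fermi board's "A00" would make it the very first revision spin, and its "1102" field differs from the "0102" on every retail card listed, consistent with an early engineering sample rather than a production PCB.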