Anyone see this? I don't know if it's real or not.
http://forums.overclockers.co.uk/sho...&postcount=112
very nice post, thx, and also thx for bringing this thread back on topic! :D :toast:
and yes, i agree, i wouldn't be surprised if nvidia launches a cut-down gpu card first. all they need right now is to beat a 5870 and a possible overclocked rv870 aka 5890... if they don't need all the sps for that, they probably won't enable all of them initially.
oh really? hmmm thx, i didn't know that...
i'm confused though, what do cards use the slot power for then?
memory pwm? but 75W just for memory, isn't that way too much?
:lol:
the gtx360 will probably have 448 SPs. it will probably depend on what ATi has out on the market, but 448 of 512 is roughly proportional to last gen.
i don't think nvidia will call their parts gtx380 or gtx360... it doesn't sound that good, and they claim it's a revolutionary new architecture... it'll be expensive at launch...
they'll need more pr to sell it well, so i'm pretty sure they'll come up with a new naming scheme :yepp:
Or 3D rendering APIs become more generalized and there will be no such thing as a "3D architecture" anymore, since everything will be software based with maybe a few legacy hardware bits for texture mapping / decompression.
Quote:
At that juncture, the architectures may have to diverge — especially if the professional
market grows larger than the consumer market.
I would bet money that Nvidia's launch for Fermi will include a ray-tracing demo.
physics:
http://www.youtube.com/watch?v=iyg9HgiD8X0
ray tracing:
http://www.youtube.com/watch?v=BAZQl...eature=channel
this is an older demo; now it just runs at a good fps. note how badly it destroys larrabee's demo.
Sorry, but the first link is a fake video.
It shows a lot of movie effects (Transformers, GI Joe etc., which I know were made using conventional pipelines based on CPU render power) and...
at seconds 27-29 it shows some effects done with a 3dsmax plugin named FumeFX, which renders effects (simulations actually, like smoke, fire, sparks etc.) solely on the CPU; I have it at my work place.
So, if it was actually shown by nvidia at the GTC, it's a lame fake. If some guy posted it on Youtube claiming it's an nvidia demo, it's just him trying to fool other people into thinking that was done on an nvidia gpu.
GT300 will have about a 500mm^2 die, which is similar to GT200, so assume a similar SP-group defect rate (ie @65/55nm, 24 SPs occupy a similar area as 32 GT300 SPs @ 40nm).
GT200 launch (10 SP groups of 24):
GTX280 - 240 SP
GTX260 old - 192 SP (80%)
GTX260 new - 216 SP (90%)
RV870 (20 SP groups of 16, each with 5 execution units):
5870 - 1600
5850 - 1440 (90%)
Trend seems to be 80-90%.
For GT300, SPs are grouped by 32. Possible combinations:
512 <- difficult to get enough chips for a "hard launch"
480 93.75% <- "1 less". Sounds a lot like PS3 Cell or G92, doesn't it?
448 87.5%
416 81.25% <- most likely choice for the initial GT. Later can make a 448SP version.
384 75% <- if yields are really bad. Still 60% more SPs than GTX280.
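A quick sanity check of the salvage-bin percentages above (my own arithmetic, assuming a 512-SP die with SPs fused off in groups of 32):

```python
# Salvage bins for an assumed 512-SP die, disabling SPs in groups of 32.
# Prints each bin and its fraction of the full part (matches the list above).
FULL_SP = 512
GROUP = 32

for disabled_groups in range(5):  # 0..4 groups fused off
    sp = FULL_SP - disabled_groups * GROUP
    print(f"{sp:3d} SP -> {sp / FULL_SP:.2%}")
```

The 80-90% "trend" from GT200 and RV870 would land GT300's first salvage part in the 416-448 range, which is why those bins look most plausible.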
I doubt nVidia will ship a full-out "Ultra" at launch. The trend with GF7, GF8, GF9 and GT200 was a "half-refresh". So better to initially launch a 416SP GT, and then 6 months later a "new" 448SP version.
However, perhaps the engineers have anticipated yield issues. Perhaps a whole SP group can be "salvaged" if 1 SP is affected. Perhaps each group has 1 or 2 extras (so that even if 1 in 33 SPs is bad, there will be 32 good ones remaining). Maybe they can make groups of 30 instead of 32. Also, notice that AMD didn't change the grouping of SPs: they are still sets of 16.
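The "extra SP per group" idea can be put in numbers with a toy binomial model. The per-SP defect rate here is invented purely for illustration, and real yield models are far more complex, but it shows why one spare unit per group helps so much:

```python
from math import comb

def group_yield(physical, needed, p):
    """Probability that at least `needed` of `physical` SPs are defect-free,
    assuming each SP fails independently with probability p (toy model)."""
    return sum(comb(physical, k) * (1 - p) ** k * p ** (physical - k)
               for k in range(needed, physical + 1))

p = 0.01  # invented per-SP defect rate, for illustration only
print(f"group of 32, no spare:  {group_yield(32, 32, p):.3f}")  # (1-p)^32
print(f"group of 33, one spare: {group_yield(33, 32, p):.3f}")
```

Even at a 1% per-SP defect rate, a plain group of 32 only works about 72% of the time, while a group with one built-in spare works over 95% of the time, so the redundancy idea isn't far-fetched.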
Of course, for marketing purposes, it's possible that Fermi actually has 544 or 576 SPs, and that the "512" versions will be sold as the "perfect" product. And why not? Hard drive manufacturers have been selling drives with extra sectors and firmware that transparently replaces "defective" sectors for years.
PRICE:
Putting down money on $699. Every nVidia high-end part since GF7 has launched at $600 or higher.
Reality check:
Today is 4 October and Nvidia didn't show a single GT300 in action. Not even 1 card in 1 demo, or a "My Computer settings" screen.
Quote:
Perhaps, Nvidia is somewhat too optimistic. At the GTC, Nvidia demonstrated the A1 revision of the Fermi-G300 graphics chip made in late August, whereas usually the company uses only A2 or even A3 revisions on commercial products. It usually takes months to create a new revision.
For example, ATI, graphics business unit of Advanced Micro Devices and the arch-rival of Nvidia, demonstrated its latest RV870 “Cypress” processor in action back in early June ’09, but was only able to release the chip commercially in late September ’09, about four months later.
IMHO (I'm no PCB designer), there is no "power from connectors *MUST* be used to power RAM" rule.
I think most of the power is drawn from the PCIe connectors because they are not shared with other loads, they are *usually* separate rails in the PSU (so the load is distributed more evenly), and perhaps even to avoid incompatibility with low-end mobos which only provide 75W on paper.
REASONING/PROOF
Some video cards that require an additional power connector will still boot without it (sometimes showing a driver error or beeping), and display an image (which requires both RAM and GPU to be powered).
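For reference, the budget math here is easy to sketch using the commonly cited PCIe spec limits (75W from the slot, 75W per 6-pin, 150W per 8-pin); the helper function is my own:

```python
# Commonly cited PCIe spec power limits for graphics cards (watts).
LIMITS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def board_power_budget(connectors):
    """Total spec wattage for a card fed by the slot plus the given connectors."""
    return LIMITS["slot"] + sum(LIMITS[c] for c in connectors)

print(board_power_budget([]))                  # slot only: 75
print(board_power_budget(["6-pin"]))           # 150
print(board_power_budget(["6-pin", "8-pin"]))  # 300
```

So a card with 6-pin + 8-pin connectors has a 300W spec budget, and nothing in the spec dictates which loads (RAM vs GPU) draw from which source.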
Power-wise, TSMC 40nm seems to be pretty good. The 4770 was much cooler than the 4830. We also see the 5870 managing quite well, with better performance/watt than the 4870. nVidia can always start off at low volts/clocks with some SP groups disabled, then do a half-refresh later with higher volts/clocks. Just like they did before with the 8800GT (1.1V), 8800GTS 512 (1.15V) and 9800GTX (change from 1 to 2 power connectors).
It's a different world now! Nvidia is reliant on its stockholders now.
Bro, the average person doesn't upgrade their card every generation. Many people are still using X1900XTs or G80s and are waiting to upgrade this Xmas, because of Windows 7 or because of a new build/computer.
Secondly, Xmas is a HUGE time for video card sales. More people will buy a video card in the next 3 months than in the following 10 months. ATi will reap huge profits because their products are out.
Thirdly, the 5770 etc. is due out. They will sell like hotcakes because of their lower price points (ie: below $199). Nvidia will not have any DX11 part to compete in the sub-$200 category till Jan/Feb..
When Investors find this out soon, Nvidia's stocks will plummet.
Meanwhile ATi has a full line of new DX11 cards from $99 ~ $499 during the Holiday Season. People only buy a card once every year or two... which means Nvidia will have to sit out till next Xmas.
Nvidia knows this, so will spin it a different way..
GF2, GF3, GF4, FX, 6800, 7800, 7900, GTX280
What's in common? NONE of these were launched for Christmas season. Most were in spring (Jan-March) or summer.
Only "GF1" and "GF8" (both G80 and G92), launched in the fall.
I think if nvidia got by with "failing" to launch for the Christmas holidays 7x in a row and still survived, it will be ok this time too.
The "REAL" problem, as others pointed out, is that AMD will be shipping $150-$250 mainstream parts within weeks. nVidia doesn't even have a timetable for first silicon yet.
Eventually they'll either get lucky with good silicon or desperately launch hardware and driver in rough shape.. Perhaps like FX Jan03, we'll see aggressive anisotropic filtering and shader "optimizations" to make sure nVidia comes 1st in benchmarks... *cough* 3Dmark03 *cough*
Only way to tell is to move along and wait 4-6 months for things to settle.
you are relying too heavily on die size to predict the SM count for the chip. the reason they made the GTX 260 a 216-SP part was to compete with the 4870 1GB, and the 275 to compete with the 4890. there is also no way to turn off part of an SM. nvidia does not count SFUs or DP units in their SP count, so it does have 512 SPs; they market the chip with fewer ALUs than it has, as opposed to ATi.
That's a bit of an exaggeration.
http://jonpeddie.com/press-releases/...the-graphics-/
Is this architecture completely shader based or does it have ROPs and texture units?
I've looked at previews, and it just mentions a super-powerful upgraded shader architecture, but nothing about ROPs and texture units.
I had thought the DX11 architecture couldn't be completely shader based, but I could be wrong.
Also, a somewhat relevant question: do the 5870's texture units do full non-optimized trilinear filtering in addition to having no AF angle optimizations, or is bilinear all we get?
You're making a lot of claims, but not supporting your arguments. I'm saying people won't be doing much upgrading because there isn't the demand from games. You're saying they will be doing a lot of upgrading because... they're still using cards from a few generations back? So what? Why upgrade if those cards can run nearly every game (most of which are console ports)?
http://store.steampowered.com/hwsurvey/
While not the end-all, be-all of statistical information, as a rough idea this survey shows most people are using G80- and RV770-based cards. These are more than capable graphics cards. Millions of people aren't going to upgrade just for the sake of upgrading. Some might, but not in the numbers that would cause NVIDIA the kind of harm you think it will. Yes, Christmas typically sees the highest sales, but a few percentage points more than the rest of the year isn't saying much, never mind the sliver of that which will go towards DX11 hardware. $200 is cheap, but it's still $200 most people would rather pocket than waste on an unnecessary upgrade.
I think you said it quite well. "The average person doesn't upgrade their card every generation." Indeed, and that's never been more true than in recent years. Again, I think NVIDIA will survive this generation.
Means nothing^^... this fall marks the release of one of the most anticipated OSes of our time.
Windows 7 will be marketed heavily this Holiday Season. More than with any other OS in existence, the average Joe will indirectly know about DX11. Thus, people will know about ATi's DX11 offerings.
Re-branded DX10 cards aren't going to do anything for Nvidia. Those who need more than a Radeon 5870 are very few, as the HD5770 will be plenty for 90% of gamers' needs. So when the GT300 Fermi does actually come around, who's left to buy it?
50k die hard Nvidia fans?
The average public doesn't know the difference between Sapphire & ASUS any more than it does between ATi & Nvidia... they will be looking for DX11 & Windows 7 compatibility......!
Which Nvidia will not have for the Holiday Season!
Cybercat, you obviously don't follow the stock market or how a company's profits work... NVDA. All it takes for Nvidia's stock to head south is one bad Holiday season shopping report for investors to pull out. Many are already starting to anticipate Nvidia's no-show for Xmas.
OEMs are not sticking high-end cards in every computer; at most a 5770, and more likely an IGP in the cheaper models. They like to skimp on PSUs to save $. You seem to be the only one who thinks that not having a high-end card out for Christmas will destroy their sales. OEMs are like fanboys too. Dell and ATi are lovers, just like HP and nV.