You have it backwards... most people will not buy a high-end card, simply because of the price, and for the same reason they don't want dual-GPU cards or SLI/CFX. If they don't want to put $500 into one card, they don't want to buy two at $300 each.
Especially since, for some years now, mid-range cards have become more and more capable for gaming, even at high resolutions. The masses (and I'm speaking about gamers) will buy a 5770, a 6850/6870, or a GTX 460 and be happy with that.
Most people I know who use CFX or SLI just want more performance than a single high-end GPU can offer...
Last edited by Lanek; 12-17-2010 at 12:50 PM.
CPU: - i7 4930K (EK Supremacy)
GPU: - 2x AMD HD7970 flashed GHz BIOS (EK Acetal Nickel Waterblock H2O)
Motherboard: Asus X79 Deluxe
RAM: G.Skill Ares C9 2133MHz 16GB
Main Storage: Samsung 840 EVO 500GB / 2x Crucial RealSSD C300 RAID 0
What does that have to do with what I said? I wasn't talking about why mainstream gamers buy X or Y card. I was talking about the differences between a high-end single-GPU card and a dual-GPU card, and how comparing the two isn't "fair" because they don't offer the same type of performance, since the 5970 suffers from the same issues as all dual-GPU solutions.
Last edited by RSC; 12-17-2010 at 03:05 PM. Reason: Typo.
What you are doing is invalidating AMD's strategy of pairing two efficient chips to match a giant chip from NVIDIA. Basically, you mean it is only fair to compare one chip with one chip, no matter how they are designed. Sure, the 5970 has some microstuttering, but most people don't care or don't notice it when purchasing the product. The fairest comparison is between products in the same price range that fit into one PCIe slot, and thus GTX 580 vs. 5970 is a fair comparison between similar products. Just because one is big and powerful and the other is two efficient chips doesn't change much of the user experience today; they are merely design choices.
Last edited by Dimitriman; 12-17-2010 at 01:39 PM.
Gigabyte Z77X-UD5H
G.Skill Ripjaws X 16GB - 2133MHz
Thermalright Ultra-120 eXtreme
i7 2600K @ 4.4GHz
Sapphire 7970 OC 1.2GHz
Mushkin Chronos Deluxe 128GB
SKYMTL, thanks for clearing that up.
---
"Generally speaking, CMOS power consumption is the result of charging and discharging gate capacitors. The charge required to fully charge the gate grows with the voltage; charge times frequency is current. Voltage times current is power. So, as you raise the voltage, the current consumption grows linearly, and the power consumption quadratically, at a fixed frequency. Once you reach the frequency limit of the chip without raising the voltage, further frequency increases are normally proportional to voltage. In other words, once you have to start raising the voltage, power consumption tends to rise with the cube of frequency."
+++
1st
CPU - 2600K (4.4GHz) / Mobo - Asus Evo / RAM - 8GB 1866MHz / Cooler - VX / Gfx - Radeon 6950 / PSU - Enermax Modu87+ 700W
+++
2nd
TR Ultra-120 eXtreme /// Enermax Modu82+ (625W) /// abit IP35 Pro /// Yorkfield Q9650 --> 3906MHz (1.28V) /// 640AAKS & Samsung F1 1TB & Samsung F1 640GB & F1 RAID 1TB /// 4GB of RAM --> 520MHz /// Radeon 4850 (700MHz) --> TR HR-03 GT
++++
3rd
Windsor 4200 (11x246 --> 2706MHz, 1.52V) : Zalman 9500 : M2N32-SLI Deluxe : 2GB DDR2 SuperTalent --> 451MHz : Seagate 7200.10 320GB : 7900GT (530/700) : Tagan 530W
Seeing microstuttering enough to annoy you is like being allergic to seafood; sucks to be you. Dual-GPU scaling is amazing this round for both AMD and NVIDIA: 100% scaling in Metro 2033 for AMD, and 97% scaling for NVIDIA.
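For anyone unsure what those percentages mean, here's a quick sketch of the arithmetic in Python (the frame rates are made-up placeholders, not the actual Metro 2033 numbers):

def scaling_percent(fps_single, fps_dual):
    # 100% means the second GPU adds a full single-GPU's worth of performance.
    return (fps_dual / fps_single - 1.0) * 100.0

print(scaling_percent(40.0, 80.0))  # 100.0 -> perfect scaling
print(scaling_percent(40.0, 78.8))  # 97.0 -> near-perfect scaling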
I wish all games scaled like that, but as we all know that isn't the case, especially once we start talking about games that aren't exactly the latest big thing. I still remember Morrowind with MGE chugging along at unacceptable framerates at settings that my GTX 280 ran very well. Source ports like Darkplaces and eDuke32 didn't scale at all. These are just a couple of examples.
Last edited by BababooeyHTJ; 12-17-2010 at 03:24 PM.
Gigabyte Z77X-UD5H
G.Skill Ripjaws X 16GB - 2133MHz
Thermalright Ultra-120 eXtreme
i7 2600K @ 4.4GHz
Sapphire 7970 OC 1.2GHz
Mushkin Chronos Deluxe 128GB
There's one thing: volume. People can buy HD 6900 cards and find them easily, while the GTX 500 series never had a real hard launch, and in Europe many stores don't have stock right now. It's December, Christmas season; to me that looks like a win for the HD 6900.
NVIDIA, with their current die size, can't afford the same volume as Cayman.
tajoh111
Then what can we say about the GTX 460, with its huge die size compared to Barts, selling for the same price? Barts GPUs were overpriced, and these days I'm seeing some price cuts: both the Sapphire and Gigabyte OC versions sell for the same price as their non-OC ones.
Athlon II X4 620 2.6Ghz @1.1125v | Foxconn A7DA-S (790GX) | 2x2GB OCZ Platinum DDR2 1066
| Gigabyte HD4770 | Seagate 7200.12 3x1TB | Samsung F4 HD204UI 2x2TB | LG H10N | OCZ StealthXStream 500W | Coolermaster Hyper 212+ | Compaq MV740 17"
Stock HSF: 18°C idle / 37°C load (15°C ambient)
Hyper 212+: 16°C idle / 29°C load (15°C ambient)
Why AMD Radeon rumors/leaks "are not always accurate"
Reality check
/close thread lol
Another thing I find funny: AMD or Intel would snipe any of our moms on a grocery run if it meant good quarterly results, and you are forever whining about what Feser did?
Same with them too. NV must really hate selling such a big chip at that kind of price; the deals on the GTX 460 are ridiculous nowadays. At least NV had a few solid months of sales at full price, so everything is not a complete bust.
And if the GTX 560 turns out to be GF104 with everything enabled, things might just turn around for them.
Core i7 920 @ 4.66GHz (H2O)
6GB OCZ Platinum
4870x2 + 4890 in Trifire
2*640 WD Blacks
750GB Seagate.
Any links about that? I can't Google any, but there are plenty of reports about the TSMC 32nm cancellation. Personally, I believe TSMC cancelled the node because of all the problems with 40nm and GlobalFoundries announcing work on 28nm.
The bottom line is, if GlobalFoundries had fielded a successful 28nm process against TSMC's 32nm, TSMC could even have lost the NVIDIA business. They sure could not afford that.
Here is what the AMD Vice President and General Manager of AMD's GPU Division said, according to bit-tech:
"Mr. Skynner admitted that the HD 6000 series was originally set to use TSMC's 32nm process, but that AMD had to opt back to 40nm earlier this year after that process was unceremoniously dumped by TSMC in favour of concentrating on 28nm only."
I don't think Skynner lied about it; after all, they still need TSMC.
EDIT: If TSMC cancelled 32nm because of AMD, why wouldn't TSMC say so? Or did they? If they did, that sure is big news.
Last edited by Heinz68; 12-17-2010 at 05:11 PM.
Core i7-4930K LGA 2011 Six-Core - Cooler Master Seidon 120XL - Push-Pull Liquid Cooling
ASUS Rampage IV Black Edition LGA2011 - G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 1866
Sapphire R9 290X 4GB TRI-X OC in CrossFire - ATI TV Wonder 650 PCIe
Intel X25-M 160GB G2 SSD - WD Black 2TB 7200 RPM 64MB Cache SATA 6
Corsair HX1000W PSU - Pioneer Blu-ray Burner 6X BD-R
Westinghouse LVM-37w3, 37-inch 1080p - Windows 7 64-bit Pro
Sennheiser RS 180 - Cooler Master Cosmos S Case
Like Heinz68, I am curious about this. Is this something AMD told you?
In your 6970 review you said AMD had taped out some of the new architecture products before deciding against using 32nm for all of them. So they had some products for this arch taped out before ~Nov'09? That seems like a really long time.
There are lots of people who buy multiple midrange boards and SLI/CF them to match or beat the performance of larger single-chip cards. Companies aren't offering (many) cards with multiple midrange chips because the extra cost of the board components needed for CF/SLI offsets the savings from the smaller chips.
WOW, 4 GPUs on one card, what a bright new idea. The first GPU would say hi; the problem is the last GPU would not be able to close the door.
Plus, if some people believe there are so many problems with 2 GPUs, four would not make it any better. Most of the time there is very good scaling with 2 GPUs, not so much with the third one, and even less with the fourth, if any.
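As a rough illustration of that drop-off, here's a small Python sketch; the 0.9 / 0.5 / 0.2 per-card gains are placeholder assumptions chosen to show the shape of the curve, not benchmark data:

# Each added GPU contributes a diminishing fraction of one GPU's performance.
base_fps = 60.0
gain_per_extra_gpu = [0.9, 0.5, 0.2]  # assumed gain from the 2nd, 3rd, 4th GPU

fps = base_fps
for n, gain in enumerate(gain_per_extra_gpu, start=2):
    fps += base_fps * gain
    print(f"{n} GPUs: {fps:.0f} fps")  # 114, 144, 156: each extra card helps less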
Core i7-4930K LGA 2011 Six-Core - Cooler Master Seidon 120XL - Push-Pull Liquid Cooling
ASUS Rampage IV Black Edition LGA2011 - G.SKILL Trident X Series 32GB (4 x 8GB) DDR3 1866
Sapphire R9 290X 4GB TRI-X OC in CrossFire - ATI TV Wonder 650 PCIe
Intel X25-M 160GB G2 SSD - WD Black 2TB 7200 RPM 64MB Cache SATA 6
Corsair HX1000W PSU - Pioneer Blu-ray Burner 6X BD-R
Westinghouse LVM-37w3, 37-inch 1080p - Windows 7 64-bit Pro
Sennheiser RS 180 - Cooler Master Cosmos S Case
No one outright lies in this industry but PR is all about selective truth telling...and of course a fair amount of embellishment by certain publications in order to give a certain voice to articles.
TSMC cancelled their 32nm process. Why should anyone need to know more? Even the shareholders usually get a warmed-over version. There are so many stories within stories that the real truth is hardly ever so simple.
I am not saying that AMD's dropping of their lower-end 32nm cards was the end-all for 32nm but rather one of the main contributing factors to TSMC's re-evaluation of their roadmap.
In the past, it has been ATI's cards that have very much been route-proving products for TSMC's high-performance lines. We saw this with 40nm, 55nm, etc. The manufacturing relationship between ATI (now AMD) and TSMC allowed for a mutually beneficial roll-out procedure that ended up benefiting clients like NVIDIA as well.
So yeah, there were probably other economic factors behind TSMC's shutting down 32nm fabrication before it even produced anything past test wafers. However, losing high-volume parts from a major client likely had a massive impact.
Regardless of what certain outlets state, an initial tape-out usually happens 9-12 months (or even more) before volume production. And yes, I can state that my conversations with AMD covered the points above and then some. Some I can discuss, most I can't.
Just installed an HD 6970.
Here are the 3DMark06 results.
[3DMark06 results screenshot]
Amazing how many people don't realize it is possible to compare apples to oranges. What matters is the end user's preference, not your own.
All along the watchtower the watchmen watch the eternal return.
I'm not impressed with these: £220 for the cheapest 6950 or £280 for the cheapest 6970, with those rubbish reference coolers (high temps, too much noise), versus £155 for the MSI GTX 460 Hawk Talon Attack edition with low temps and noise and great overclocking potential.
The GTX 560 looks like it will have the 6950 beat by a large margin.