@ Helix
So not reliable then...
Gents!
To get the discussion back on topic: there are some new leaks which haven't been posted here yet, AFAIK:
Seems we will get our 2 GB of VRAM after all... wonder if Antilles gets 4 GB then...
Core i7 2600K|HD 6950|8GB RipJawsX|2x 128GB Samsung SSD 830 RAID 0|Asus Sabertooth P67
Seasonic X-560|Corsair 650D|2x WD Red 3TB RAID 1|WD Green 3TB|Asus Xonar Essence STX
Core i3 2100|HD 7770|8GB RipJawsX|128GB Samsung SSD 830|ASRock Z77 Pro4-M
be quiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7K1000.C|Asus Xonar DX
Dell Latitude E6410|Core i7 620M|8GB DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
256GB Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB
Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD
LOL
AMD has delivered some 25+ million HD 5000-series chips (and that's across all the different chips); Nvidia has shipped maybe 10% of that for the 400 series. And you have the courage to say "most have already upgraded". Get your head out of your arse; even if your friends have, it does not mean everybody has.
For example, I upgraded just now, and only because I got an HD 6870 for 189€. BTW, I had an HD 3870 that I bought for some 150€-odd many years ago.
^ Wouldn't that, in theory, make the 6970 only two times faster in tessellation than the 5870? AFAIK they are claiming much higher numbers.
I swear to God I saw 2 units for the Pro and 3 units for the XT somewhere in an AMD slide, but since it was a while ago, take it with a grain of salt. Now imagine those same 3 units in each of Antilles' cores.
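A quick back-of-envelope sketch of that scaling argument. All of this is assumption: the rumored 2/3 unit counts from that slide, the 5870's single tessellator as the baseline, and perfectly linear scaling with unit count:

```python
# Back-of-envelope tessellation scaling from rumored unit counts.
# Assumes throughput scales linearly with the number of tessellation units,
# with the HD 5870's single unit as the baseline; a best-case estimate only.
units = {
    "HD 5870 (Cypress)": 1,        # single tessellator (known)
    "Cayman Pro": 2,               # rumored
    "Cayman XT (6970)": 3,         # rumored
    "Antilles (2x Cayman XT)": 6,  # dual GPU, if both chips scale perfectly
}

baseline = units["HD 5870 (Cypress)"]
for card, n in units.items():
    print(f"{card}: ~{n / baseline:.0f}x HD 5870 tessellation throughput (best case)")
```

So on paper the XT would only be ~3x a 5870; if the claimed numbers really are much higher than that, the extra would have to come from per-unit improvements rather than unit count alone.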
Gigabyte Z77X-UD5H
G.Skill Ripjaws X 16GB - 2133MHz
Thermalright Ultra-120 eXtreme
i7 2600K @ 4.4GHz
Sapphire 7970 OC 1.2GHz
Mushkin Chronos Deluxe 128GB
I just pray that tessellation is beefed up enough to be competitive with Nvidia's offerings. As of right now, it's not important; however, it will become important in the near future. Tessellation is the new "shader": a feature that will make games look that much more impressive without a lot of extra work.
I really hope AMD sees it that way. Otherwise, the GTX 580 may win, and Nvidia may gouge prices again. I seriously want a big, fat price war so that we can't make a bad decision with whichever card we choose. I hate it when this market becomes one-sided.
Here's the way I see it.
First, there was vertex and texture mapping.
Next, we got pixel shaders.
Then, we got enhanced lightmaps.
Then, we got self-shadowing characters and advanced lighting.
Then, we got HDR.
And NOW, we're on Tessellation.
So who's going to win this race?
PII 965BE @ 3.8GHz /|\ TRUE 120 w/ Scythe Gentle Typhoon 120mm fan /|\ XFX HD 5870 /|\ 4GB G.Skill 1600MHz DDR3 /|\ Gigabyte 790GPT-UD3H /|\ Two lovely 24" monitors (1920x1200) /|\ and a nice leather chair.
Antilles will be greater than GKxxx!
What do you guys make of what NordicHardware is saying here?:
GeForce GTX 580 will most likely be the most powerful single-GPU graphics card on the market in 2010. Now that we have the complete specifications for NVIDIA's GF110 GPU, it becomes clear that NVIDIA is attempting to do what it failed to do with the GTX 480 - going all in - to win the performance war against AMD.
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
Numbers we've seen so far have seemed pretty impressive... see below.
The current benchmarks we've seen show a GTX 580 at stock taking down a GTX 480 running at over 900 MHz. They apparently did more than just kick the clocks up to 772 MHz and increase the shader count...
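A rough sanity check of why that points to more than clocks and shaders, assuming the reference specs as I remember them (GTX 480: 480 shaders at 700 MHz; GTX 580: 512 shaders at 772 MHz) and naive linear scaling:

```python
# Naive scaling estimate: performance ~ shader_count * core_clock.
# Reference specs assumed from memory; treat this as a back-of-envelope check.
gtx480_stock = {"shaders": 480, "clock_mhz": 700}
gtx480_oc    = {"shaders": 480, "clock_mhz": 900}  # the overclocked 480 in the claim
gtx580_stock = {"shaders": 512, "clock_mhz": 772}

def relative_throughput(card, baseline):
    """Throughput relative to baseline under the linear-scaling assumption."""
    return (card["shaders"] * card["clock_mhz"]) / (baseline["shaders"] * baseline["clock_mhz"])

print(f"GTX 480 @ 900 MHz vs stock:     {relative_throughput(gtx480_oc, gtx480_stock):.2f}x")   # ~1.29x
print(f"GTX 580 stock vs GTX 480 stock: {relative_throughput(gtx580_stock, gtx480_stock):.2f}x") # ~1.18x
# A ~18% paper gain matching a ~29% overclocked 480 suggests per-clock
# improvements in GF110 beyond the clock bump and extra shaders.
```

Real performance doesn't scale linearly with either number, of course, but it shows why those results are hard to explain by clocks and shader count alone.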
Bingo... Someone gets it. I want to see them trade blows once again, because when that happens WE win.