That was my initial thought too: the 4870X2 has the lead. A dual-chip, single-card CrossFire setup versus a single-chip, single-card nVidia solution will likely go to the 4870X2.
If nVidia decided on a dual-chip approach, I'm not sure what would happen. Thinking about it, though... with such monstrously high power draw and such a massive die, it will be difficult for nVidia to go the dual-chip route with this incarnation.
One hundred years from now
It won't matter
What kind of car I drove
What kind of house I lived in
How much money I had in the bank
Nor what my clothes looked like...
But the world may be a little better
Because I was important
In the life of a child.
-- from "Within My Power" by Forest Witcraft
G80 had a smaller die, used less power, and generated less heat, and we didn't see a GX2 card with that core until the G92 at 65nm, so I doubt we'll see GT200 in GX2 form until 40nm at least.
I did read the link and the entire article when it came out.
What Nvidia is actually saying is: we CAN support DX10.1 features by coding drivers to "expose" them in hardware; in other words, get the same result by exploiting specifics of the hardware. But Nvidia does not have a DX10.1 part.
The entire page needs to be read for proper context. If NV could actually exploit DX10.1 features for a performance gain, does anyone really think they would force the removal of 10.1 support from Assassin's Creed?
We know that both G80 and R600 supported some of the DX10.1 feature set. Our goal, at the least, has been to determine which, if any, of those features were added to GT200. We would ideally like to know exactly which DX10.1 features GT200 does and does not support, but we'll take what we can get. After asking our question, this is the response we got from NVIDIA Technical Marketing:
"We support Multisample readback, which is about the only dx10.1 feature (some) developers are interested in. If we say what we can't do, ATI will try to have developers do it, which can only harm pc gaming and frustrate gamers."
The policy decision that has led us to run into this type of response at every turn is reprehensible. Aside from being blatantly untrue at any level, it leaves us wondering why we even have to respond to this sort of statement. Let's start with why NVIDIA's official position holds no water, and then we'll get to what it could mean.
The claim that multisample readback is the only thing (some) developers are interested in is untrue: cube map arrays come in quite handy for simplifying and accelerating multiple applications. Necessary? No. Useful? Yes. Separate per-MRT blend modes could become useful as deferred shading continues to evolve, and part of what would be great about supporting these features is that they allow developers and researchers to experiment. I get that not many devs will get up in arms about int16 blends, but some DX10.1 features are interesting and, more to the point, would be even more compelling if both AMD and NVIDIA supported them.
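To make the API-level distinction concrete, here's a minimal C++ sketch of my own (untested; the texture size and cube count are arbitrary, and this is just an illustration, not anything from NV or AMD) of what actually having a DX10.1 part means. Cube map arrays, for instance, only exist on a feature-level 10.1 device; no amount of driver-side "exposing" gives a 10.0 part this view type:

[code]
// Sketch: request a true D3D10.1 device, then create a cube map
// array, a view type that does not exist at feature level 10.0.
#include <d3d10_1.h>
#pragma comment(lib, "d3d10_1.lib")

int main() {
    ID3D10Device1* dev = nullptr;

    // Ask for full DX10.1; RV770 qualifies, GT200 only does 10.0.
    HRESULT hr = D3D10CreateDevice1(
        nullptr, D3D10_DRIVER_TYPE_HARDWARE, nullptr, 0,
        D3D10_FEATURE_LEVEL_10_1, D3D10_1_SDK_VERSION, &dev);
    if (FAILED(hr))
        return 1;  // no DX10.1 part present

    // A cube map array is a Texture2D array whose size is a
    // multiple of 6, viewed as TEXTURECUBEARRAY (10.1-only).
    D3D10_TEXTURE2D_DESC td = {};
    td.Width = td.Height = 256;
    td.MipLevels = 1;
    td.ArraySize = 6 * 4;                     // 4 cube maps
    td.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    td.SampleDesc.Count = 1;
    td.Usage = D3D10_USAGE_DEFAULT;
    td.BindFlags = D3D10_BIND_SHADER_RESOURCE;
    td.MiscFlags = D3D10_RESOURCE_MISC_TEXTURECUBE;

    ID3D10Texture2D* tex = nullptr;
    if (FAILED(dev->CreateTexture2D(&td, nullptr, &tex)))
        return 1;

    D3D10_SHADER_RESOURCE_VIEW_DESC1 sd = {};
    sd.Format = td.Format;
    sd.ViewDimension = D3D10_1_SRV_DIMENSION_TEXTURECUBEARRAY;
    sd.TextureCubeArray.MipLevels = 1;
    sd.TextureCubeArray.NumCubes = 4;

    ID3D10ShaderResourceView1* srv = nullptr;
    hr = dev->CreateShaderResourceView1(tex, &sd, &srv);
    return SUCCEEDED(hr) ? 0 : 1;
}
[/code]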
Nope, not me. I haven't believed NV's marketing in years, not since their viral marketing initiative was exposed. Assassin's Creed seals the deal on how NV is holding back progress, IMO.
From techreport:
http://techreport.com/articles.x/14934/13
Antec 900
Corsair TX750
Gigabyte EP45 UD3P
Q9550 E0 500x8 4.0 GHZ 1.360v
ECO A.L.C Cooler with Gentle Typhoon PushPull
Kingston HyperX T1 5-5-5-18 1:1
XFX Radeon 6950 @ 880/1300 (Shader unlocked)
WD Caviar Black 2 x 640GB - Short Stroked 120GB RAID0 128KB Stripe - 540GB RAID1
Nah, I'm not buying that. You think Nvidia is just going to stand by while AMD takes the lead until they get to 40nm? No way. The 7950GX2 was a 7900GT sandwich at 90nm; no die shrink was necessary to make that. Sure, cooling will be a problem, but it always is with the GX2 cards. It's not like Nvidia even has to engineer a new card: they just need to stick two cards together and put them on a single PCIe slot. Nvidia is going to take back their lead no later than spring 2009, and a GX2 card is precisely how they'll win it back.
Originally Posted by zerazax
The fact is you can only get so much out of CF/SLI before diminishing returns kick in. It's basically a hack introduced by 3dfx; we're lucky it works at all. Now, maybe the 4870X2 is going to revolutionize the world of GPUs by changing that, but I'll believe it when I see it. Even in a post-4870X2 world, Nvidia will still have the advantage in a sense, because I doubt the 4870 is going to be faster than a GTX280 at 65nm, and it's even less likely once the GTX280 is shrunk to 55nm. We can all enjoy AMD's victory this summer, but they'll have to pull quite a few rabbits out of the hat if they want to keep it.
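A rough way to see those diminishing returns (my own back-of-envelope Amdahl-style sketch, with a made-up serial fraction, not measured data): if some fraction s of frame time can't be parallelized across GPUs (driver overhead, inter-frame dependencies), scaling caps out fast:

[code]
% Amdahl-style scaling sketch for n GPUs with serial fraction s
\text{speedup}(n) = \frac{1}{s + (1 - s)/n}
% e.g. s = 0.2: speedup(2) = 1/(0.2 + 0.4) \approx 1.67x, not 2x
[/code]

So even a "perfect" second GPU buys well under 2x once any serial work is in the mix.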
Isn't ATI dropping to 45 or 40nm soon, like end of year or Q1 '09?
Sorry, I play at 25x16 (2560x1600).
OK, maybe 1% of the world's population will play at 2560x1600 with max details.
The 9800 GX2 is still a better card than the 2900 XT...
Something old ..
With a large heatpipe cooler like the TRUE or the Scythe Orochi, but engineered to attach to the video card. You have almost the same problem with GTX280 SLI, but I haven't heard of cards melting just from the stock cooling.
The Steam survey shows 88,856 widescreen users, with 19.38% of them running above 24". Most of those are probably 30" LCDs; the 27" is cheaper, but not by much.
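For scale (assuming the 19.38% applies to the 88,856 widescreen users, which is how I read the survey):

[code]
% assuming 19.38% of the 88,856 widescreen users are above 24"
0.1938 \times 88{,}856 \approx 17{,}220 \text{ users}
[/code]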
I love e-fights.
Crysis won't play smoothly for me at 1680x1050 with Very High settings, so yes, bring on the next generation of video cards.
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ Intel i7 3770k
░░░░░░▄▄▄▄▀▀▀▀▀▀▀▀▄▄▄▄▄▄░░░░░░░░░ ASUS GTX680
░░░░░░█░░░░▒▒▒▒▒▒▒▒▒▒▒▒░░▀▀▄░░░░░ ASUS Maximus V Gene
░░░░░█░░░▒▒▒▒▒▒░░░░░░░░▒▒▒░░█░░░░ Mushkin 8GB Blackline
░░░░█░░░░░░▄██▀▄▄░░░░░▄▄▄░░░█░░░░ Crucial M4 256GB
░░░▀▒▄▄▄▒░█▀▀▀▀▄▄█░░░██▄▄█░░░█░░░ Hitachi Deskstar 2TB x2
░░█▒█▒▄░▀▄▄▄▀░░░░░░░░█░░░▒▒▒▒▒█░░ FSP 750W Gold
░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ Fractal Arc Mini
Great performance from the HD 4850, and it seems like a very efficient card: the minimum frame rates are high, even equaling the GTX260's in a couple of games, and it beats the 8800 Ultra/9800GTX overall.
Can't wait for the 4870 reviews. I think I'm getting an HD 4870 1GB GDDR5; I hope they won't cost more than $350.
BTW, the Sapphire Radeon HD 4850 is in stock, selling for $189.
AMD Radeon HD 4850 512MB Preview - RV770 Discovered
http://www.pcper.com/article.php?aid=579