Dude both sides are going to release new drivers, the 295 and the 4870x2 will get better and better, why hate so much? The competition can only help us out.:yepp:
Timothy Leary must be handing out medicine again. :rofl:
Note: I just sat back and watched this with a bag of chips and a couple beers while a few programs were downloading last night. Looks like it's gonna continue to be pretty entertaining. I'll be tweaking today and watching this off and on. Carry on.
SNiiPE, quit being mad that your gfx card now sucks, and stop spamming the nvidia threads with your bull sh*t
Now this is somewhat of an overstatement, but I just don't understand why you care so much; the 295 is obviously just as good as the 4870X2, so we should all be very happy! This will bring prices down, right? :yepp: We the consumers really want an equal playing field, and no one should participate in any blind fanboyism.
Quote:
As you'll know if you read our articles regularly, quad-core processors are not yet very well exploited in current games. Although dual core is more and more widespread, very few games can lay claim to making the most of anything beyond that.
Over the course of various tests designed to observe this use of quad cores, we came to the conclusion that there was one parameter that had a great deal more impact than expected: the graphics card, or rather, its driver. In our processor test protocol we use two games: Crysis version 1.2 (via the integrated benchmark_cpu2) and World in Conflict version 1.0.0.9 (via the integrated bench). In both cases, the tests are carried out at 800x600 so as to limit the impact of the graphics card and concentrate on processor limitations, with as much CPU load as possible.
Here are the results we got with an E8400 and a Q9650, both clocked at 3 GHz, coupled with either a Radeon 4870 (Catalyst 8.11 hotfix) or a GeForce GTX 280 (GeForce 180.48), in DX9 and DX10 mode (effects are set high in DX10, which is why we get the low frame rates), in 32-bit Vista.
http://www.behardware.com/news/10005...quad-core.html
The results speak for themselves! In DirectX 10, with an NVIDIA card, you get a 21 to 24% performance increase by moving from dual to quad core. With a Radeon 4870 the gain is less significant. In DirectX 9, however, the gains are smaller: only 10 to 11% with NVIDIA, and, in Crysis only this time, a more or less identical gain with the Radeon. Multithreading is of course not managed in the same way in Vista for DirectX 9 and DirectX 10.
The results obtained show clearly that NVIDIA's DX10 drivers are better multithreaded than AMD's DX10 drivers, which are in fact not multithreaded at all, or only very minimally. As a result, an NVIDIA card is better able to get more out of a quad core, which is not to be sniffed at if you are playing a game where CPU limitations are compromising performance.
According to Intel, 25 to 40% of the CPU load in a game is linked to Direct3D and the graphics driver. The threading of this load is therefore not insignificant, and it would be more than useful for AMD to look at the question as soon as possible, to at least get on a comparable level with NVIDIA. We are now planning to update our Core i7 test, which we first carried out with a Radeon HD 4870 X2, this time using a GeForce GTX 280.
http://66.102.9.104/translate_c?hl=e...SRjaSrFmDB-FVQ
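That 25-40% Direct3D/driver figure from Intel actually lets you sanity-check the observed scaling with Amdahl's law. A quick back-of-the-envelope sketch (the fractions are Intel's estimate; the assumption that only the driver share parallelizes, and does so perfectly, is mine, purely for illustration):

```python
def amdahl_gain(parallel_fraction, cores):
    """Overall gain when only `parallel_fraction` of the per-frame CPU
    work scales perfectly across `cores` cores (Amdahl's law)."""
    speedup = 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)
    return speedup - 1.0

# Intel's 25-40% Direct3D/driver share, threaded across a quad core:
print(round(amdahl_gain(0.25, 4) * 100))  # ~23% best-case overall gain
print(round(amdahl_gain(0.40, 4) * 100))  # ~43% best-case overall gain
```

Which lines up pretty well with the 21-24% NVIDIA gains quoted above, if roughly a quarter of the frame's CPU time is driver work that actually gets spread across the cores.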
Later

Quote:
Contacted about this, AMD confirmed it, stating that the current drivers are not fully optimized for tri- and quad-core CPUs. This optimization should arrive during the first quarter of 2009 through new Catalyst drivers.
Heh.. that's maybe the biggest pile of BS I've read in a long time ^^
They basically also contradicted their own statement in the test:
http://www.behardware.com/medias/pho...IMG0024646.gif
8% for AMD in Crysis on a quad core. Core 2 Quads are actually horrible to test such things on, since depending on how the threads are assigned, you can end up with different cache distributions. However, AMD wouldn't get a 2-3% bonus if the driver wasn't "multicore" at all; they would get 0. It's not like running another game would be different. And the DX9 run is a perfect example that there is nothing to gain: 8% and 0% for AMD, yet NVIDIA scores 9 and 11%. Pretty weird, eh? No, not really...
The API execution overhead is also about half in DX10 compared to DX9. In short, DX10 needs about half the CPU load that DX9 did.
Not always. It depends entirely on the title. I know I sound like a broken record, but flight simulators put one hell of a load on the CPU. They have to. FSX is a good example. Even my Core i7 and OC'd GTX 280 are not enough to run that sim at the glory level of eye candy I'd like to see. It's actually fun trying though, because getting it to run efficiently makes every other game a breeze. Every title I currently have runs with ease except for that one. They offload a ton of stuff onto the CPU. It should've been coded a bit better for multicore CPUs. I think a lot of it has to do with trying to make the sim compatible with single and dual core CPUs. It's holding progress back. One of the things I'm looking for is a "hack pack" to get the sim to take full advantage of these things properly. Even the way it is, it puts a load on everything... the RAM, the CPU, and the GPU like no other.
I've got another title still using DX7-9 technology that offloads a ____load onto the CPU. That title (Falcon4) is totally CPU dependent. GPU upgrades do next to nothing for it; CPU and RAM upgrades are what really shine with it.
New games are gonna be utilizing multithreading more and more. They need to get with the program and code these titles to take full advantage of ALL 8 threads and utilize them properly. Offload onto the CPU. On some of these titles it's just ridiculous... a Tri-SLI setup is being taxed while the CPU is sitting there barely loaded, doing almost nothing. Times are changing though.
We already saw the thread about needing an i7 to really get the full effect. We'll be seeing a lot more of that. I hope that's sooner rather than later.
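For what it's worth, "use all 8 threads" in practice usually means a job system: chop the per-frame work into independent tasks and fan them out over a worker pool. Here's a toy sketch of the pattern (all names and numbers are made up; and note that CPython threads won't give real CPU parallelism for pure-Python math, so an actual engine would do this with a native job scheduler):

```python
from concurrent.futures import ThreadPoolExecutor

def update_entity(entity):
    # One independent per-entity simulation step: position += velocity.
    pos, vel = entity
    return pos + vel

def update_world(entities, workers=8):
    # Fan independent per-entity jobs out across the worker pool --
    # the same shape as an engine job system feeding 8 hardware threads.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(update_entity, entities))

print(update_world([(0.0, 1.0), (2.0, 3.0), (10.0, -2.0)]))
# -> [1.0, 5.0, 8.0]
```

The key point is that the jobs have no dependencies on each other, so the pool can keep every core busy; `pool.map` still hands the results back in submission order.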
is there a single GPU by NV coming out soon? cause if so, I smell FAIL all over these 2x NV GPUs.. given NV's track record, drivers are not their strong point for their GPUX2 cards - just look @ the 98GX2: at launch, driver issues; shortly after, driver issues fixed; & after that? don't even go there..
NV dual GPU card + drivers = FTL
it would appear to me that for short-term, NV dual GPU is good, but long term? :shakes: my bet is as soon as the next single GPU card(s) are out, these dual GPU cards will be forgotten, just like 98GX2 :yepp:
:2cents:
Yeah. Dual GPU cards like these:
http://img363.imageshack.us/img363/8...bleneckcd5.jpg
They look cool but, in the end, they are not very practical.
Those Cat 9.-whatever drivers are going to come out when DirectX 11 comes out. Just google DirectX 11 and read its key features: "Better optimization for multithreaded processors"
Guess what? Nvidia's going to have a multithreaded driver release too. It's all coming with the DirectX 11 cards. Cat 9.2 (if it really is the multithreaded driver) will not be out until a month or a couple of weeks before DirectX 11 is launched. Save that one and put it in your back pocket ;)
By then we'll all have our eye on the new tech.
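The DX11 feature being referred to is deferred contexts: worker threads record command lists in parallel, and the immediate context replays them in submission order on the main thread. A toy model of that pattern (this is not the actual D3D API, just the shape of it, with made-up names):

```python
import threading

class CommandList:
    """Commands recorded on a worker thread, executed later in order."""
    def __init__(self):
        self.commands = []

    def record(self, fn, *args):
        self.commands.append((fn, args))

    def execute(self, frame):
        # Replay on the "immediate context" (main thread).
        for fn, args in self.commands:
            fn(frame, *args)

def draw(frame, name):
    frame.append(name)

def build_frame(batches):
    # Record each batch on its own worker thread, in parallel...
    lists = [CommandList() for _ in batches]
    def record(cl, names):
        for n in names:
            cl.record(draw, n)
    threads = [threading.Thread(target=record, args=(cl, b))
               for cl, b in zip(lists, batches)]
    for t in threads: t.start()
    for t in threads: t.join()
    # ...then execute the lists serially, in submission order.
    frame = []
    for cl in lists:
        cl.execute(frame)
    return frame

print(build_frame([["sky", "terrain"], ["units"], ["ui"]]))
# -> ['sky', 'terrain', 'units', 'ui']
```

The recording (the CPU-heavy part) happens concurrently, while execution stays deterministic because the lists are replayed in a fixed order - which is exactly why a multithreaded driver needs this kind of structure rather than just throwing threads at the problem.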
The choice is really very simple. If you need a new VC, look at the GTX 260 216 and the HD 4870 1GB; pick the cheapest and buy it. For anyone with a 1920x1200 display or smaller, you will be all but burning your money on anything more expensive. With both of these cards capable of running at 60+ average fps in most games with good IQ, there is no real advantage to getting anything faster atm. For all of you wack jobs that do 3D modeling for a hobby, you can ignore this comment. Multi-GPU is still a stupid pet trick for all but the most elite users. I may just look into multi-GPU in 2 years. Until then, I have better things to do.
Two things I found most interesting. First, it appears that all the previews have been done with ES parts, not retail parts. Second, only e-tailers will have them for sale at release. I suspect that the launch will be much softer than the launch of the HD 4870X2. I do not remember any constraints on availability of the HD 4870X2. Yes, some e-tailers were out of stock for a day or 2 here and there, but the supply of all the HD 4000 series parts has been very good from the start. Will be interested in just how available the GTX 295 will be. Will there be joy for all, or just 100 units dropped into the channel and then you have to wait for the B3 stepping of the 55nm GT200, sometime around March?
EDIT: It does appear that the B3 stepping of the GT200b/GT206 is done. Nvidia really is working hard to get this part out. I still suspect that availability will be very tight for some time. Will know for sure in a couple of weeks.

Quote:
Tom's Hardware
According to Nvidia, the GeForce GTX 295 will launch at next year's CES, just a couple of weeks away. It'll be priced at $499, right where the Radeon HD 4870 X2 is selling online, and will be available at e-tail on launch day. When we're able to review retail hardware, rather than an early engineering sample, we'll have a better idea as to the accuracy of those claims.