If a dual Fermi passes 300W then it would easily beat a 5970. But that doesn't necessarily mean ATI will just let them have the lead without a fight; there is plenty of room in the Evergreen architecture for >300W cards too.
If it indeed passes 300W then it's going to be incredibly funny. Nodes shrink and game requirements barely rise, yet power keeps climbing because of the e-peen contest between Nvidia and ATI, and because people keep buying those cards without even running three or more monitors.
According to the engineers I spoke to, it is risky to do this due to the wide variance of rail-to-connector specifications on today's PSUs.
In addition, quite a few PSU manufacturers have begun using thinner, less capable wiring to their connectors. While these "cheaper"-made PSUs are capable of delivering 150W+ through a PCI-E connector, doing so becomes increasingly risky as the wire gauge decreases.
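To put rough numbers on why the gauge matters, here's a quick back-of-the-envelope sketch (Python). The wire count per connector and the current limits are my own assumed ballpark values, not anything from the engineers quoted above.

```python
# Back-of-the-envelope: current per 12V wire when pushing a given wattage
# through a PCI-E power connector. Wire counts and current limits below are
# assumed/typical ballpark values, not spec quotes.

def amps_per_wire(watts, rail_volts=12.0, num_12v_wires=3):
    """Split the total 12V current evenly across the connector's 12V wires."""
    return (watts / rail_volts) / num_12v_wires

# Example: 150W through an 8-pin connector (typically three 12V wires)
per_wire = amps_per_wire(150)  # ~4.2 A per wire
print(f"~{per_wire:.1f} A per 12V wire at 150W")

# Thinner wire means less headroom. Very rough continuous-current ballparks:
rough_limits = {"16 AWG": 10.0, "18 AWG": 7.0, "20 AWG": 5.0}  # assumed values
for gauge, limit in rough_limits.items():
    print(f"{gauge}: ~{per_wire / limit:.0%} of a ~{limit:.0f} A ballpark limit")
```

The point being: headroom that looks comfortable with decent 16/18 AWG wiring shrinks quickly once thinner wire is used, which is exactly the risk described above.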
More tessellation-heavy games in the near future, 2010 and beyond, will utilize that power as long as the hardware for it exists. It's the classic chicken-and-egg scenario, just like DX11: no one will make games for it if the hardware doesn't exist.
The software always has to catch up. There are a few games that will bring a 5970 or 3x GTX 285 to their knees, like ArmA 2 and the STALKER games when you crank the eye candy up, even at 1080p. We should all welcome moar power!
Oblivion has a zillion mods. I am using Qarl's Pack and it's great...
Most of the badly coded games happen to be crossovers from the console department. I am surprised at GTA4: it was developed on PCs to be played on consoles, but PCs can't run it even after a year's gap.. :(
It's about focus and the amount of time they were willing to spend on optimization. It's easier on a console because you have a single target to hit.
It's different on PC, where different settings produce different results and performance can be inconsistent if not enough time is spent in the optimization department.
To sum it up, they cared more about optimizing the console version; once that goal was met, I'm sure things were cut for deadlines.
It wasn't until the 8800 GTX was released that Oblivion could be played at HQ, so it was definitely GPU bound too :)
Even today, if you add all possible Oblivion IQ mods and set 8xAA for everything (including trees etc.), you can make a 5870 sweat if you bump up the resolution.
What's worse is how wrong the "but graphics isn't everything" crowd is. Historically, graphics IS almost everything, not so much by making games shinier but mostly by enabling new possibilities, new styles of gameplay and storytelling. Whole genres were born because of innovation in graphics: FPSs, modern RPGs, strategy games, realistic rally games, and so on.
There is no telling how much more could be invented if game devs had an incentive to utilize the sheer power we are now merely sitting on. A few years of console domination and the stagnation in innovation is unparalleled; in no other era did gaming have so many sequels. It has almost become like Hollywood, except the whole scene is even duller.
I'm not a big gamer anymore for real-life reasons, but I would love to see game devs try, for once, to be innovative with new technologies and eventually become good at using them, though I don't see it coming anytime soon. Game development is increasingly becoming the Luddite of an industry it once led.
It is sad, because gaming applications and the hardware created to support them have poured into so many other fields, from cinema to medical imagery. The stagnation of this genre in the 21st century is not only sad, it actually stops innovation we cannot even imagine from being realized (who had thought of 3D GPS 30 years ago, or body rendering, or lifelike effects in movies? In all those fields we would, comparably, still be in the stone age). Not to speak of new genres of gaming-entertainment-culture that would be shyly emerging by now. But that's not happening, and instead we're about to get the 7th Call of Duty in a short while.
The future is shaped by software, not hardware; hardware is a necessary ingredient but far from sufficient. Kudos to ATI and nVidia for continuing to create wonderful hardware, and shame on the gaming industry for not using that hardware, thus burying the future...
Noooooooooooooo. That's what they want us to do. ;)
Does anyone know what the mounting hole dimensions are? Same as the GT200 cards?
Sorry if it's already been discussed, but I don't have time to read a 65-page thread.
TC, I don't think anyone has discussed that.
Here is a guide for you.
http://www.pwnordie.com/wp-content/u...9_fullsize.jpg
The first Stalker game started development in 2001; I saw screenshots showcasing the graphics in 2002 and they didn't really look that much worse than those taken in 2007. Yes, that's right, about 9 years ago. In DirectX 8. Do you really think they remade the engine from scratch for CoP? Since it doesn't look that much better than SoC, I believe a whole ton of the code they've written dates from 2001; no wonder it's so sluggish...
Haha, brilliant! :rofl:
No Stalker game has ever looked better than "good", and CoP is no different. The CoP benchmark program made me seriously cringe: with sunshafts enabled it shafts the system, yet the graphics are hardly better than something done in the Source engine (which would perform 10x better).