If a dual Fermi passes 300W, it would easily beat a 5970. But that doesn't necessarily mean ATI will just hand them the lead without a fight; there's plenty of room in the Evergreen architecture for >300W cards too.
If it really does pass 300W, it's going to be incredibly funny. Nodes shrink and game requirements barely rise, yet power keeps climbing, whether for Nvidia's and ATI's e-peen bragging rights or for people buying those cards without even running three or more monitors.
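For reference, the 300W figure people keep quoting comes from adding up the PCIe spec's per-source limits: 75W from the slot, 75W per 6-pin connector, 150W per 8-pin. A quick sketch of that arithmetic (the function name is just for illustration):

```python
# In-spec PCIe power sources (watts), per the PCIe CEM limits:
SLOT_W = 75       # power drawn through the slot itself
SIX_PIN_W = 75    # per 6-pin auxiliary connector
EIGHT_PIN_W = 150 # per 8-pin auxiliary connector

def board_limit(six_pins: int, eight_pins: int) -> int:
    """Maximum in-spec board power for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_limit(1, 1))  # 6-pin + 8-pin (a 5970's layout): 300 W
print(board_limit(0, 2))  # dual 8-pin: 375 W, already past the 300 W ceiling
```

So a dual-Fermi card that "passes 300W" would need dual 8-pin connectors and would sit outside the spec limit the 5970 was designed to respect.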
INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"
Has anyone really been far even as decided to use even go want to do look more like?
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P.), Juan J. Guerrero
According to the engineers I spoke to, it is risky to do this due to the wide variance in rail-to-connector specifications on today's PSUs.
In addition, quite a few PSU manufacturers have begun running thinner, less capable wiring to their connectors. While these "cheaper"-made PSUs are capable of delivering 150W+ to a PCI-E connector, doing so becomes increasingly risky as the wire gauge gets thinner.
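To put rough numbers on why thinner wiring matters, here is a back-of-envelope I²R sketch. The per-meter resistances are approximate published AWG values; the three current-carrying 12V/ground pairs and the 0.6 m run length are my assumptions for illustration, not anything from a spec:

```python
# Approximate copper resistance per meter for common PSU wire gauges (ohms/m):
OHMS_PER_M = {16: 0.0132, 18: 0.0210, 20: 0.0333}

def drop_and_loss(watts: float, awg: int, wires: int = 3, length_m: float = 0.6):
    """Voltage drop and heat dissipated in the wiring of one PCI-E connector,
    assuming `wires` current-carrying 12 V/ground pairs of length `length_m`."""
    amps_per_wire = (watts / 12.0) / wires
    r = OHMS_PER_M[awg] * length_m * 2   # out on the 12 V wire, back on ground
    v_drop = amps_per_wire * r           # volts lost per pair
    heat = wires * amps_per_wire**2 * r  # total watts turned into heat
    return v_drop, heat

for awg in (16, 18, 20):
    v, p = drop_and_loss(150, awg)
    print(f"{awg} AWG at 150 W: {v:.2f} V drop, {p:.1f} W lost as heat")
```

The point of the sketch: going from 16 AWG to 20 AWG more than doubles the resistance, so the same 150W load dissipates over twice the heat in the wiring, which is exactly the risk the engineers were describing.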
More tessellation-heavy games in the near future (2010 and beyond) will put that power to use, as long as the hardware is there for it. It's a chicken-and-egg scenario, just like DX11: nobody will make games for it if the hardware doesn't exist.
The software always has to catch up. There are also a few games, like ArmA 2 and the STALKER titles, that will bring a 5970 or 3x GTX 285 to their knees when you crank the eye candy up, even at 1080p. We should all welcome moar power!
Oblivion has a zillion mods; I am using Qarl's pack and it's great...
Most of the badly coded games happen to be crossovers from the console side. I am surprised at GTA4: it was made on PCs to be played on consoles, yet PCs still can't run it well even after a year's gap...
Coming Soon
It's about focus and the amount of time they were willing to spend on optimization. It's easier on a console because you have one fixed target to hit.
It's different on PC, where the same settings can produce different results and performance can be inconsistent if not enough time is spent in the optimization department.
To sum it up: they cared more about optimizing the console version, and once that goal was met, I'm sure things were cut for deadlines.
This post above was delayed 90 times by Nvidia. Because that's their thing; that's what they do.
This announcement of the delayed post above has been brought to you by Nvidia Inc.
RIGGY
Case: Antec 1200
MB: XFX nForce 750i SLI 72D9
CPU: E8400 (1651/4x9) @ 3712.48 MHz
MEM: 4GB G.Skill DDR2-1000 (5-5-5-15)
GPU: NVIDIA GTX 260 EVGA SSC (x2 in SLI), both 652/1403
PSU: Corsair 650TX
OS: Windows 7 64-bit Ultimate
--Cooling--
5x 120mm, 1x 200mm
Zalman 9700 LED
Displays: Samsung LN32B650 / Samsung 2243BWX / Samsung P2350
It wasn't until the 8800 GTX was released that Oblivion could be played at high quality, so it was definitely GPU-bound too.
Even today, if you add every Oblivion IQ mod available and set AA to 8x for everything (including trees, etc.), you can make a 5870 sweat once you bump up the resolution.
What's worse is how wrong the "but graphics isn't everything" crowd is. Historically, graphics IS almost everything, not so much for making games shinier but for opening up new possibilities, new styles of gameplay and storytelling. Whole genres were born from innovation in graphics: FPS, modern RPGs, strategy games, realistic rally games, and so on.
There is no telling how much more could be invented if game devs had an incentive to use the sheer power we are now merely sitting on. After a few years of console domination, the stagnation in innovation is unparalleled; no other era of gaming had so many sequels. It has almost become like Hollywood; no, the whole scene is even duller.
I'm not a big gamer anymore for real-life reasons, but I would love to see game devs try, for once, to be innovative with new technologies and eventually become good users of them. I don't see it coming anytime soon, though. Game development is increasingly becoming the Luddite of an industry it once led.
It is sad, because gaming applications and the hardware built to support them used to spill over into so many other fields, from cinema to medical imaging. The stagnation of what was meant to be the 21st century's genre is not only sad; it stops innovation we cannot even imagine from being realized (who had thought of 3D GPS 30 years ago, or body rendering and lifelike effects in movies? In all those fields we would still, comparatively, be in the stone age). Not to mention the new genres of gaming-entertainment-culture that would be shyly emerging by now. But that's not happening, and instead we're about to get the seventh Call of Duty in short order.
The future is shaped by software, not hardware; hardware is a necessary ingredient but far from sufficient. Kudos to ATI and nVidia for continuing to create wonderful hardware, and shame on a gaming industry that is not using that hardware, thus burying the future...
Last edited by Stevethegreat; 01-19-2010 at 12:50 PM.
Noooooooooooooo. That's what they want us to do.
Last edited by Rock&Roll; 01-19-2010 at 01:14 PM.
System: Core I7 920 @ 4200MHz 1.45vCORE 1.35VTT 1.2vIOH // EVGA x58 Classified E760 // 6GB Dominator GT 1866 @ 1688 6-7-6-18 1T 1.65V // Intel X25 80GB // PCP&C 750W Silencer
Cooling: Heatkiller 3.0 LT CPU block // 655 Pump // GTX360 Radiator
Sound: X-FI Titanium HD --> Marantz 2265 --> JBL 4311WXA's
Display: GTX480 // Sony GDM-FW900
Does anyone know what the mounting hole dimensions are? Same as the GT200 cards?
Sorry if it's already been discussed, but I don't have time to read a 65-page thread.
tc, I don't think anyone has discussed that.
The first Stalker game started development in 2001; I saw screenshots showcasing its graphics in 2002, and they didn't really look much worse than the ones taken in 2007. Yes, that's right, about 9 years ago, in DirectX 8. Do you really think they remade the engine from scratch for CoP? Since it doesn't look that much better than SoC, I believe a whole ton of the code they've written dates from 2001; no wonder it's so sluggish...
Haha, brilliant!
Last edited by zalbard; 01-19-2010 at 02:45 PM.
No Stalker game has ever looked better than "good", and CoP is no different. The CoP benchmark program made me seriously cringe: with sun shafts on, it shafts the system, yet the graphics are hardly better than something on the Source engine (which would perform 10x better).
INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"Has anyone really been far even as decided to use even go want to do look more like?