You can overclock the 216 to 700MHz and it will perform within 5-10% of a GTX 280. :shrug:
The GTX 260 Core 216 has the better price/performance ratio, now that you can pick one up for less than $250.
Nvidia estimates shader processing rate as follows: *No. of SPs* X *shader clock (MHz)* X 3 FLOPs per clock, which gives the figure in MFLOPS.
This would make a GTX 260 @ 700 core (1512 shader clock) a 980 GFLOPS behemoth, faster than the 933 GFLOPS GTX 280. Of course the GTX 260 also has less bandwidth, so it would be bottlenecked at ultra-high resolutions with ridiculous AA. At 19x12 with modest AA, however, the GTX 260's bandwidth and its 896MB of RAM are enough, and at most resolutions it would beat the GTX 280 (except at 25x16 with more than 16xAA, I guess).
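If anyone wants to sanity-check those numbers, here's a quick back-of-the-envelope calc in Python. The function name, the default of 3 FLOPs per clock, and the divide-by-1000 conversion to GFLOPS are just my own shorthand for the estimate above, not anything official from Nvidia:

# rough theoretical shader throughput: SPs x shader clock (MHz) x FLOPs per clock
def gflops(shader_processors, shader_clock_mhz, flops_per_clock=3):
    # result comes out in MFLOPS, so divide by 1000 for GFLOPS
    return shader_processors * shader_clock_mhz * flops_per_clock / 1000.0

print(gflops(240, 1296))  # stock GTX 280            -> ~933 GFLOPS
print(gflops(216, 1512))  # GTX 260-216 @ 700/1512   -> ~980 GFLOPS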
You can still OC the GTX 280, though, and keep the extra bandwidth and amount of RAM, which makes all of the above considerations unimportant...
So is there any actual confirmation that these cards will be due out this month, or in January? I seem to keep hearing conflicting reports on this.
Official launch is 8.1.2008 .... but you can expect some reviews this year ...
He means January 8th 2009
Well, I know what my next video card is :D
I can't wait for Monday. :D
Looks like a lot of people are either going to be very happy or incredibly pissed if enough info is leaked. :devil:
Makes me very happy to own a Galaxy 1000W PSU. I know I don't need it to run this card, but I won't have power issues. :D
Well, even if it's only a small improvement in performance over the HD 4870X2, it still knocks it off the top spot, which will pressure ATi to respond with a price adjustment and/or a new competing product.
I'm not too sure about that. There's really no reason for ATi to drop their prices even lower just because Nvidia's coming out with a new top end. Sales of the top end hardly affect the low-to-mid end. Unless Nvidia comes up with some revolutionary architecture, which doesn't appear to be the case.
Besides, ATi is supposed to have their refresh of the RV770 out soon... so no worries there.
I've got a GTX 260-216 Maxcore and the 100-day Step-Up program. Will it really be worth stepping up? Depends on the price, I suppose. Anyone have an opinion for me? Cheers.
I'm in the same boat.
I don't think the gtx265 is worth it.
I think you're looking at gtx285 or gtx295.
If you dislike SLI or don't want to spend that much, you're looking at gtx285 effectively. I'm waiting to see pricing before I pick. I will not be getting a gtx265.
That's my 2 cents on the matter.
I do, but do you get mine? Paying $100 more for something that gives you 10% more is pretty pointless, especially when the faster card runs hotter and uses more power. It even costs you more in the long run. I just think value is going to outweigh minimal performance differences for the next while.
Precisely. That is the reason I own a GTX 280. I couldn't care less about price. I look for whatever is gonna give me eye-popping graphics with maxed sliders, clock like crazy, and knock me right out of my chair. I have to admit I do look at price last, but it isn't a concern at the expense of performance.
The only video cards that have ever gone into my systems have been the best available at the time I built them. I won't think twice about spending 400 dollars on a video card IF it measures up.
In the case of the card in question here, this thing is a nightmare. I don't care about what ATi is doing or about trying to compete with that X2 crap. I wanna see a powerful single-GPU solution... what nVidia is known for. This microstuttering crap has got to go. This thing is also gonna detract from the drivers, because it just doesn't belong. It's like an old antique twin-engined dragster... yeah, they were fast, but how many twin-engine dragsters do you see anymore? None, because they aren't needed. It only takes one big 5000 horsepower engine to get it done, and the tracks can't even hold all of it.
It's the same way with GPUs. A single GPU per card is the way to go, and they need to make that big "5000 horsepower GPU" instead of this hodgepodge arrangement of throwing two GPUs at the problem.
And Nvidia feels the exact same way, so they're hustling to put a dual-GPU solution on the market. :laugh: