I hope not :(
Come on now, I got this info from this post:
http://www.xtremesystems.org/forums/...08738?t=208738
It's not like I have the card or something :D
Yeah right... get with the times, people. GTA 4 is the new benchmark; you need a Core i7 on phase change just to get 65 frames :D... :rofl:
I really don't see any real problem with this card... it's two GTX 260s in SLI without needing an NVIDIA chipset... it's not like anyone is going to make you buy it.
Doesn't sound like it's going to be a big leap in performance over a GTX 280 or 260. Really, in my opinion SLI isn't worth it: too much money for a bit of a gain...
Two-PCB design? Thanks but no thanks... :p:
lolsandwich
If the price matches the 4870X2's, people will want it.
On the other hand... it will most likely be overpriced, and nVidia will treat it like they did the 9800 GX2: let it go the way of the dodo.
http://www.3fl.net.au/media/73/20071210-TF2_banner.jpg
"Give me back my SandVich™!"
I guess we all know who the next nVIDIA mascot is.
I heard that manufacturers sometimes put logic in the chip that they'll never use, just to study how it behaves in mass production.
For example, I read somewhere that the entire GeForce FX line had experimental SLI logic in it.
As far as I know, using SidePort on the HD4870X2 won't make a difference unless that card turns out to be capable of using a shared memory pool for both GPUs. The presence of SidePort in R700 could be, as I said above, an experimental feature for the shared memory pool in R800. I also heard it could eventually be used for GPGPU apps in the future, but not for gaming (the same way the GT200 has a lot of stuff that only turns on for CUDA).
regards
Quote:
55nm GTX285 Climbs to Head of Single-GPU Cards
We have already looked into the 55nm GeForce GTX260 card; now let's turn to the 55nm GTX280. It features a brand-new PCB design and higher default frequencies. It has also been given a new name: GeForce GTX285.
http://img142.imageshack.us/img142/6...egtx280rc9.jpg
Adopting P891 as its reference design, the GeForce GTX285 keeps the same 10.5" length as the GTX280. Its cooler is a dual-slot design, with dual DVI and S-Video output. The memory remains 1GB of GDDR3, and it has 240 stream processors. The clock speeds of the GeForce GTX285 remain to be confirmed, but its performance can reach 110% of the GTX280's.
We can regard the GeForce GTX285 as a GTX280 Overclocked Edition with lower power consumption. The GeForce GTX280's maximum graphics card power is 236W, and it requires one 6-pin and one 8-pin PEG power connector. The GeForce GTX285's power consumption has been reduced to 183W, so only two 6-pin PEG power connectors are needed.
With the highest clocks and the lowest power consumption, the GeForce GTX285 is undoubtedly the most powerful single-GPU card, while the new GeForce GTX260 draws the most attention in the mid- and high-end price war. The dual-GPU GeForce GTX295 is expected to regain the performance crown. All of this will happen in January 2009.
After unveiling details of NVIDIA's new product line, we will follow up with news about AMD's plan to fight back, such as the RV775XT. (Special thanks to one of our readers for providing information.)
http://en.expreview.com/2008/12/11/5...gpu-cards.html
Wonder if those power draw figures really are accurate; if so, that's a very nice decrease.
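The connector change at least agrees with the quoted numbers. A quick sanity check, just a sketch assuming the standard PCIe power delivery limits (75W from the slot, 75W per 6-pin, 150W per 8-pin) and the TDPs quoted above:
Code:
# Standard PCIe power delivery limits (watts).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

# GTX280: slot + one 6-pin + one 8-pin = 300W of headroom for its 236W TDP.
gtx280_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
# GTX285: slot + two 6-pins = 225W, enough for the claimed 183W TDP.
gtx285_budget = SLOT_W + 2 * SIX_PIN_W

print(gtx280_budget, gtx285_budget)  # 300 225
236W wouldn't fit under two 6-pins, but 183W does, so dropping the 8-pin is consistent with the claimed figures.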
Tri-SLI GTX 285 here I come. I wonder if they will run cool enough that I can run them off of a 120.2 rad.
Haha, and you guys think that six GPUs are going to work together... not gonna happen any time soon. (I hope it does, though.)
Eh?
Quote:
GeForce GTX285’s power consumption has been reduced to 183W
The GTX280 consumes 178W at peak.
I don't know... but almost every power consumption rating I've seen has had some variance across multiple reviews.
On a side note, looking at the recent AnandTech coverage of the RV770 development process... I wonder how long it'll take for NVIDIA's team to come up with the single-card monster that will destroy the RV770.
This competition between ATI and NVIDIA is just giving us better cards and cheaper prices... awesome, I love it.
http://resources.vr-zone.com/newvr/i...ceGTX295_2.jpg
http://resources.vr-zone.com/newvr/i...ceGTX295_3.jpg
http://vr-zone.com/articles/more-gef....html?doc=6274
Quote:
Here are some more shots of the GeForce GTX 295 card. Enjoy! Among the things we have confirmed today are 480 SPs, a 289W TDP, and Quad SLI support. Clock speeds, performance, and price still remain a mystery, but we will soon find out. Can an RV775 X2 card stand against this NV offering?
NO! Damn, the hole is still there...
I suppose it is necessary, but man... how ugly it looks when watercooled.
Bah, that damn hole!~
This thing is such a monstrosity that it may require a separate loop and triple rad for each card... or each "set" of cards. Heck, I don't even know what to call a card anymore when it's not a card. :ROTF: For each "card" there are actually two. Man, what a mess. They just had to go and delve into this lost cause called "multi-GPU cards". It wasn't good enough that they had the best single-GPU card out there. They had to go and compete with ATi's bad ideas.
Multi-GPU isn't a lost cause. ATI has the performance crown because of this.
Nonetheless, I don't want to open that can of worms.
The GTX 295 doesn't interest me. The 285 does. That's what dinos22 and I were referring to :D
The card in the VR-Zone pics is a pre-production sample... a full retail version exists, with working power management and a metal cover...
deleted..............
We need photos :D
Yeah, I know... please post the photos under a different name or something, LOL.
Nvidia would probably subpoena the IP from XS.
NV is watching this very second? I don't believe so.
Guess who works for Nvidia? Fugger ;)
:slobber: I'm such a sucker for this :banana::banana::banana::banana:...
9800 GX2 flashback... I don't like the dual PCB!
These sure will run hot with the default cooler if the TDP is accurate. I expect load temps around 85~90°C in a moderately cooled case at stock clocks, given how packed it is and how little room there is to breathe between the cards.
I'm getting excited. So two regular GTX 280s should perform better than one of these cards, right? Since the 295's bandwidth has been cut down to a 260's 448-bit bus.
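Quick back-of-the-envelope bandwidth math; a sketch assuming stock GDDR3 memory clocks (1107MHz on the GTX 280, 999MHz on the GTX 260, and presumably the 260's clock per GPU on the 295):
Code:
# Peak memory bandwidth in GB/s: (bus width / 8 bytes) * memory clock * 2 (DDR).
def bandwidth_gbps(bus_bits, mem_clock_mhz, data_rate=2):
    return bus_bits / 8 * mem_clock_mhz * data_rate / 1000.0

print(bandwidth_gbps(512, 1107))  # GTX 280: ~141.7 GB/s
print(bandwidth_gbps(448, 999))   # GTX 260, and per GPU on the 295: ~111.9 GB/s
So each half of a 295 would have roughly 21% less bandwidth than a GTX 280, which is exactly where two 280s in SLI should pull ahead: high resolutions with heavy AA.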
Not really. If you look at the reviews comparing a 216 and a 280, the OCed 216 wins by quite a significant margin in most games; take a look here.
What I meant was that the 216 looks better on a price/performance basis: just get one, OC it, and you've got yourself a 280... I know comparing stock clocks vs. OC clocks isn't valid, but OCing these cards can get you a nice boost. Also, the GTX 295 has 240 SPs per GPU, unlike the 216.
You know what's funny? A few months back I was asking people if they thought the 200 series would get a GX2 version any time soon. Every single one of them said no. NOW LOOK AT THEM!!! :)
Single GTX 280s are down to $325 now. Overclock a 280 to 700MHz and it will kill a 260 Core 216, OC or not.
http://www.tigerdirect.com/applicati...709&CatId=3669
Me?? Because I don't have watercooling on the graphics card, haha. I'm at 702 right now: 702/1511/1265. I thought that was a pretty nice overclock for a vanilla card. I could probably go further, but when I play a game it's usually for 4 or 5 hours straight, and going past those speeds tends to crash games after an hour or so. On benchmark runs I'll jack everything up a little more, but that's just for a short period.
Watercooling is overrated! It just depends on the luck of the draw; some 280s run better than others.
With the standard cooler I run mine at 670/1458/2600, flashed in the BIOS. If a game needs it, I push it up to 700/1512/2700, and for benching it runs at 730/1566/2700 or 740/1512/2700 depending on the temperature that day. It's summer over here in Aussie land (though Sydney feels like winter some days right now), so I can't always push as hard on air unless the temp is right.
Edit: yeah, for gaming there's no point going too high; it just gets unstable when playing for long periods. I haven't found a game where the extra clocks make the instability worthwhile anyhow.
I never said that. I just saw 675MHz in his sig while he was talking about 700MHz, so I asked about his speed ;)
Hey guys,
Long-time reader, first-time poster. Sorry to jump on the bandwagon so late in the thread, but seeing as there are some here with inside information, and many others whose experience might translate into valuable insight, I'd like to put the following up for your consideration:
The soon-to-be-released GTX265 and 285... they're 55nm versions of the 260 and 280, right? So we're mainly talking about a "simple" die shrink, not a major upgrade to the underlying design, right? A lower TDP and the lower voltages needed to reach the same clock frequencies might well mean the 55nm versions can be OCed/pushed further than current ones (they might even be released with higher clocks across the board), but ultimately the performance gains probably won't be anything to write home about, especially if nVidia doesn't stick an attractive/competitive price tag on them. Sound about right so far?
My problem is this: back when the GTXs came out, I was about to build a new system. I sprang €200 for two slim-profile, full-cover, single-slot waterblocks, intending to water-cool two GTX 280s (or 260s). Some unforeseen events killed my budget, so now I'm sitting on two brand-new blocks that I can't shift. They most certainly won't fit the new 55nm cards, from the pics I've seen so far, so my question is: when the new versions come out, should I eat my losses and buy new blocks and the new cards, or wait till the new cards come out and use the price drop on the 65nm ones to save some money?
One last thing: are factory-OCed cards worth the extra dough? Even assuming they're higher-binned parts, can't those same clocks be reached on vanilla cards, safely and stably, for extended gaming sessions?
Thanks for your input
I would buy the old cards once they fall in price. I don't see the die shrink changing things up all that much; the 65nm/55nm 9800GTX may or may not be an indication of this. Blocks are pricey; I wouldn't waste what you've got invested already.
I think you can use the blocks for the 65nm GT200s on the new 55nm cards... The hole pattern looks identical, even though the GPU is smaller. The new cards still have an IHS (Integrated Heat Spreader), so the mounting holes should be the same. Wait for that to be confirmed, and then you won't need to buy new blocks or "old" cards.
Sorry for posting in Portuguese guys :p:
^^
A Portuguese guy around here!!! (Yes, I know there are several, but still.) :p:
(back in English)
I wish I were as optimistic... But looking at the pics of 65nm 260s and the previews of the 55nm 260s, then comparing both with my blocks, I spot a number of differences. Check it out:
http://i35.tinypic.com/1zcpfno.gif
Yeah, the GPU and the (controller?) are both in the same place, and most holes do align nicely, but there's at least one that doesn't, and the memory chips, as well as a bunch of capacitors and whatnot, are all over the place compared to the 65nm layout. For full-cover waterblocks, specifically tailored to a certain design, it's too much of a change. Don't expect the 280/285 to be any different either. :(
How fast is a single GTX 260 compared to a 4870? Will this be the new flagship?
Depends on which version of the 260 you're talking about. They are very close nonetheless.
Nvidia needs to release some info now.
I think we'll see some stuff by late Monday or early Tuesday. Drivers for the 295 will be out Monday. Allow a full day for benching and testing. I'm waiting patiently.
I'm considering stepping up my GTX 260 to a GTX 295 because the 265 won't be worth the step-up. I've never done SLI, but I have a 1000-watt PSU sitting around that needs to be used, and it'll give me an excuse to rebuild my entire rig, since I can't afford an i7 now and don't need more than my Q6600.
You know, I bet EVGA and BFG won't even list the 55nm refresh as an eligible card on the step-up program. Looking at what they have listed now, you can't even step up from a vanilla GTX 280 to a watercooled GTX 280.
Who knows, though. It wouldn't surprise me if they treated everyone like that.
They will
http://www.evga.com/forums/tm.asp?m=637898 ;)
They damn well better have those 55nm cards out by the end of the month!
Nvidia estimates shader processing rate as follows: *number of SPs* × *shader clock in MHz* × 3 = MFLOPS.
This would make a GTX260 at 700 core (1512 shader clock) a 980 GFLOPS behemoth, faster than the 933 GFLOPS GTX 280. Of course, the GTX 260 also has less bandwidth and would be bottlenecked by that at ultra-high resolutions and ridiculous AA. At 19x12 and modest AA, however, the GTX 260's bandwidth and its 896MB of RAM are enough, and at most resolutions it would beat the GTX 280 (except at 25x16 with more than 16xAA, I guess).
You can still OC the GTX 280, though, and keep the extra bandwidth and RAM, which makes all of the above considerations unimportant...
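To make that arithmetic concrete, here's a minimal sketch of the estimate; it assumes the factor of 3 is FLOPs per SP per clock (the dual-issue MAD + MUL) and the 216-SP Core 216 variant, which matches the 980 GFLOPS figure above:
Code:
# Theoretical shader rate: SPs * shader clock (MHz) * 3 FLOPs/clock = MFLOPS.
def shader_gflops(sps, shader_clock_mhz, flops_per_clock=3):
    return sps * shader_clock_mhz * flops_per_clock / 1000.0

print(shader_gflops(240, 1296))  # stock GTX 280: ~933 GFLOPS
print(shader_gflops(216, 1512))  # GTX 260-216 @ 1512 shader: ~980 GFLOPS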
So is there any actual confirmation that these cards will be due out this month, or in January? I seem to keep hearing conflicting reports on this.
Official launch is 8.1.2008... but you can expect some reviews this year...
He means January 8th, 2009.
Well, I know what my next video card is :D
I can't wait for Monday. :D
Looks like a lot of people are either going to be very happy or incredibly pissed if enough info is leaked. :devil:
Makes me very happy to own a Galaxy 1000W PSU. I know I don't need it to run this card, but I won't have power issues. :D
Well, even if it's only a small improvement in performance over the HD 4870X2, it still knocks it off the top, which will pressure ATi to respond with a price adjustment and/or a new competing product.
I'm not too sure about that. There's really no reason for ATI to drop their prices even lower just because NVIDIA's coming out with a new top end. Sales of the top end hardly affect the low-mid range. Unless NVIDIA comes up with some revolutionary architecture, which doesn't appear to be the case.
Besides, ATI's supposed to have their new refresh of the RV770... so no worries there.
I've got a GTX 260-216 Maxcore and 100 days on the step-up program. Will it really be worth stepping up? Depends on price, I suppose. Anyone have an opinion for me? Cheers.
I'm in the same boat.
I don't think the GTX 265 is worth it.
I think you're looking at the GTX 285 or GTX 295.
If you dislike SLI or don't want to spend that much, you're effectively looking at the GTX 285. I'm waiting to see pricing before I pick. I will not be getting a GTX 265.
That's my 2 cents on the matter.
I do, but do you get mine? Paying $100 more for something that gives 10% more is pretty pointless, especially when the faster card runs hotter and uses more power. It even costs you more in the long run. I just think value is going to outweigh minimal performance differences for the next while.
Precisely. That's the reason I own a GTX280. I couldn't care less about price. I look for what's going to give me eye-popping graphics with maxed sliders, clock like crazy, and knock me right out of my chair. I have to admit I do look at price last, but it isn't a concern at the expense of performance.
The only video cards that have ever gone into my systems have been the best at the time I built them. I won't think twice about spending 400 dollars on a video card IF it measures up.
In the case of the card in question here, this thing is a nightmare. I don't care what ATi is doing, or about trying to compete with that X2 crap. I want to see a powerful single-GPU solution... what nVidia is known for. This microstuttering crap has got to go. This thing is also going to detract from the drivers, because it just doesn't belong. It's like an old antique twin-engined dragster... yeah, they were fast, but how many twin-engine dragsters do you see anymore? None, because they aren't needed. It only takes one big 5000-horsepower engine to get it done, and the tracks can't even hold all of it.
It's the same with GPUs. A single GPU per card is the way to go, and they need to make that big "5000-horsepower GPU" instead of this hodgepodge arrangement of throwing two GPUs at the problem.
And Nvidia feels the exact same way, so they're hustling to put a dual-GPU solution on the market. http://blitzdod.com/source/style_emo...ault/laugh.gif