Where does it say dual chip? It could be one of Nvidia's super-die GPUs, like the GT200. They don't like the idea of dual-GPU cards, and this could be a sign that they are sticking to their beliefs.
Those Hardspell numbers are pretty much fake in several respects. Some are outright impossible, others are just pure bad business if implemented.
I thought nV were well up for dual-GPU cards and that was gonna be how they present their top-end cards from now on (with the odd exception, but if ATI are happy with their X2, nV need a GX2 to match it).
These might be my next upgrade.
@OBR
Do you have any info on them? (Don't mind the guys, I still care about what you have to say.)
Those Hardspell numbers were pretty much seen as fake long ago. Besides, at those frequencies and specs, the card would exceed PCIe 2.0 limitations on power draw, and is thus impossible.
We won't be seeing those numbers until 45/40nm. This is all just wishful thinking.
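For what it's worth, here's the rough power math behind that claim, as a quick Python sketch. The connector figures are my assumption of the usual spec numbers (~75 W from the slot, ~75 W per 6-pin, ~150 W per 8-pin), which puts a 6+8 pin card at roughly a 300 W ceiling:

Code:
# Rough PCIe 2.0-era board power ceiling (assumed spec figures:
# ~75 W from the x16 slot, ~75 W per 6-pin plug, ~150 W per 8-pin plug).
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Max sustained board power for a card with the given aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A GTX 280-style card with one 6-pin and one 8-pin connector tops out around:
print(board_power_limit(six_pins=1, eight_pins=1))  # 300 W

Anything that would blow well past that number on a single card just doesn't add up.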
err...thread res! 3 weeks to go, any more news?
He fails to deliver nearly 100% of the time, you mean.
He takes rumors, adds his own spin to it, and posts it with many ;) 's to seem legit.
Well, he's proven time and time again to enjoy making things up for attention, so frankly, I think most of us have had enough of it.
Here, I'm OBR for a second.
"GT300 Q2/Q3 09', shipping 900 mhz core clocks. My source says 40 nm and GDDR6 (early release sample) are likely. Keep updated, this my info guys!"
I'd say that's a darn good impersonation, accurate for sure. See what I did there? I made some stuff up and posted it as if I had insider information. It's cute, and it would get me lots of attention, but it's also underhanded and a pathetic display of creativity and dishonesty.
So yes, let's keep the wild fabricated claims to PM's, I think that would be for the best.
And I'm not trying to be a huge dork or be mean here, I just think that people need to understand that OBR has a long history of "predicting" things and then being completely wrong. This wouldn't be bad if he would just say "sorry guys, my mistake, I didn't have good info at all," but instead he just plays along like he knows something we don't the whole time.
It just gets old and I think people need to be aware to take his info with a whole lotta salt. Rumors get out of control when people accept them as fact.
You forgot to mention "NDA" so can't disclose too much.
Actually, the clue that a GX2 is coming is really the 260+. Why add the second batch of shader units when it brings you so close to the 280? Because if you were going to bring out a 260GX2, you'd want to be sure that SLI'd 260+'s were going to be better than the 260GX2 alone. But what happens to the 280? It becomes the 280+ @ 55nm.
So in Q4 I envision:
GX2 260 (55nm, but just the normal 260, with lower power and maybe slightly improved clocks)
280+ at 55nm, faster and cooler, but won't keep up with the GX2 260
260+ at 55nm, the version of what we have now with the increased shaders
Thus 280+ in SLI > 260+ in SLI > GX2 260 > 280+ > 260+
This is speculative, but it's what I think the clues are pointing to: a new, orderly arrangement of chips, each with its own niche in the power lineup above.
Next year you can look to 40nm and DX11 chips. Why? Well, you know that Nvidia loves introducing a feature ahead of its being useful. We had DX10 cards while Vista was in beta. Similar with DX11. It'll be "on the radar" next year, and Nvidia will probably use it as a new wave of 40nm chips (which will really be tweaked versions of the 200) to entice people to give up their 260 SLI setups to go DX11.
I'd say that's quite good speculation, because that was pretty much the plan last go around, and what occurred more often than not. But I'm thinking this strategy may change some, due to the hard hit from AMD; to what, I'm not sure, but I'm sure it will still stay in the Nvidia fashion of doing things, such as... something that will sound surprising and end up not as surprising as we thought, but still pretty good.
That should be their motto. Not:
Nvidia "The Way It's Meant to be Played"
It should be: Nvidia "Something that will sound surprising and end up not as surprising as you thought, but still pretty good!"
It's a mouthful, I know, but it's a work in progress.
Quote: Originally Posted by DegustatoR
So GT206 = GT200B? Or are GT206 and GT200 two different GPUs?
Any word on a refresh of the GTX280 before I pull the trigger on the buy button? I'm seeing good deals right now, and it could be a gamble, but I want to take advantage of the rebates. I also don't wanna buy a card that will be obsolete before it's mounted in its new X58 board. I'm buying things as I see deals.
Well, to be honest, this certainly is skippable.
I'd wait for next year's graphics if I were going to upgrade, too.
:yepp:
I'm still using an 8800GTS 640MB, and I'm on 1920x1200! :eek: Although, I'm playing Oblivion right now, and that's all the horsepower I need. Plus, Oblivion will keep me busy until next semester, so best not to buy something I don't need, I suppose. I play games usually two years after their release date, which allows me to pump the frames and use my hardware more effectively. S.T.A.L.K.E.R., Prey, GRAW 2 and Rainbow Six: Vegas are up next. Can't wait to play them maxed at 1920x1200 on a 2009 GPU. :up:
Yeah, I'm most definitely going EVGA. GTX280 FTW to be exact. I wanna go with that version to ensure I get a high-speed-binned GPU that I know will OC really well. I'm putting good watercooling on my GPU and CPU, so I want to get the most out of it. This system will be built on release of the new Nehalem. At least it will be ordered then, and not a day later. I've already put a build off a year and I MUST have this system built this year...absolutely must.
I think I may just wait past the October 22nd date and see what happens. The rebate plan is good through Oct 31st so I still have a bit of time. I feel like they may not have any rebates during the Christmas sales rush and I don't wanna get caught by Christmas pricing. 40 bucks is a substantial savings.
Hey, just letting you know man, I don't think the Step-Up program covers the factory-overclocked versions, so you may lose more money if you try to upgrade.
Although my sig says the GTX 260 I have is superclocked, it's just the regular stock version.
However, it overclocks just as well, and surpassed EVGA's factory overclocked version handily.
The GTX 200 series seem to be highly overclockable, so I wouldn't recommend spending extra cash on the factory overclocked versions.