http://www.itocp.com/html/20100302/n...images/001.jpg
http://www.itocp.com/html/20100302/n...images/007.jpg
http://www.itocp.com/html/20100302/n...images/004.jpg
I don't get it. Isn't this basically 9800 GX2?
woooooow
9800GX2 in single PCB flavours :clap:
What will it cost?
Around $200 it may not be bad, but I don't think anyone will pay more than $250 for this relic.
When will we have GTS 250 x3 or X4?
Come on Zotac, you can do it :)
Man whoever worked on G92 can probably retire now. Talk about a friggin goldmine.
The card is pretty cute, actually. Looks quite short to me. :shrug: Prolly shorter than my custom PCB 8800GT even. :D Still, it doesn't even support DX10.1, so.. 200 dollars max and it MAY be a good seller.
I don't understand the point of this card. :confused:
I have one question though...
Why the hell would they release this now?
Card would be amazing for F@H if priced correctly.
Another tall PCB card. Looks pretty good. Too bad G92 is so old...
Great card for FAH and if priced right, for titles that scale well in SLI.
These GPUs belong in the junkyard, but isn't this part of a new trend?
nVidia has always been re-branding by just re-naming and re-boxing the same junk, almost untouched. But I have seen a few special re-brandings lately. They are refreshing and re-branding at the same time, mixing several of the same (or different) junkyard-worthy GPUs on a fresh PCB design.
I want one.. god that looks like fun trying to mount pots too :D
They could've given us standard mounting hole spacings :( No reason for tighter spacing given the layout
Realistically.... no need for stronger cooling than a good chiller
This card will do well, but only if the price is right.
NO way! Its completely different! Its a GTS 250, a whole new generation newer! :D
In all seriousness, I don't really see the point. With the GTX 260, 275, 280, 285, etc they've basically rendered 9800GX2 or GTS250 X2, whatever, rather pointless. And that was over a year ago when the GTX 200's came out. We're on the verge of Fermi, so who would realistically want a dual GPU card now 4 generations old? Unless this is HD 5770 price range, there's going to be very little to no market for it IMO.
G92, back for more.
6+8 pin? And maybe 2 GB (1+1)? Can anybody tell from the pics?
i bet at 1600x1200 this card rips just about everything a new ahole in the 220-240 range... drivers will be very good (they had a few years for perfecting them).
but even i, the nv vga addict, will pass at this point... but i wish it luck
I think that we need to start celebrating some birthdays for the G92 xD
There is always somebody who wants to save a few $, but it is kind of hard to justify hot and power-hungry old junk (compared to new-generation GPUs with dramatic die shrinks). Besides, how can you recommend a GPU without DX11 today?
It is not just the price that counts. Performance, power usage, heat and features (specifically DX11 for now) count a lot too.
Card looks excellent. If it has the right price and 1GB of memory unlike the 9800GX2 it might sell nicely. Remember it's a souped up G92 (not much but still...) and it will probably be around or over a GTX285 in performance. Besides that, something that justifies DX11 is oh so far away.
92b til infinity!
:)
Things that will survive a nuclear holocaust: roaches, Lemmy Kilmister and G92. :yepp:
I like how small and tiny it is... need a cute little cooler now.
I was just thinking about those old Radeon 2600 X2s and GeForce 8600 X2s a few days ago, and then this :D But what did you expect? Renamed G92-based models have been the only NVIDIA cards on the market for a few months now. Fermi could be a low-volume thing, where many vendors will get only a few of them (or none); they have to sell something, no?
I don't think this has anything to do with the number of Fermi parts or such. This could be the last push to sell some old junk before the new GPUs with further die shrinks hit the market and make those G92s look like kilometer-based technology by comparison :p:
It's just weird, though, that they would completely stop selling dual-PCB models and then make a single-PCB model. Do you guys think you could run this and a GX2 together?
Also, what do you think is the maximum number of GPUs this would support for an F@H system? 8?
You win one internetz :clap: You just made me laugh so hard I nearly forgot to breathe! Maybe I should press charges for attempted murder? :shrug: :p:
Something like that would have me drooling over it 24/7 :shocked:
We need something like this: :worship: http://www.xtremesystems.org/forums/...postcount=1505
Pure PCB-:banana::banana::banana::banana: :slobber:
Seems like a nice folding card. By the way, does anyone know if PhysX can run on multiple video cards at the same time? This might be a nice dedicated PhysX card for us ATI users, if the price is right.
I like this trend in making cards more "stout" rather than long.. good stuff!
If they wanna make this ground-breaking, they should make it a tri-SLI'd G92 card. That would work.
And when the :banana::banana::banana::banana: are they releasing this?
Looks like a good card for folding and other CUDA applications.
I wish more companies made tall cards like this, any enthusiast case has plenty of room for it, and it gives them space for beefy power delivery circuitry. Plus, cards can be a bit shorter to avoid the "card getting in the way of hard drives" problem.
G92 is one of the best GPUs ever made, even though I don't like rebranding I don't mind developing new stuff like this. If they're not expensive as hell, I'm SURE I'll buy at least one of these:D Just wondering how to cool it... 2x Maze 4, perhaps?
this will be quite a collectors item, trust me... :D
i think i will buy one :)
i really like G92... clocks very well, performs very well, very power efficient...
one of the best gpus of all times, hands down!
its only dx10, but besides that... it should be an awesome card if priced right...
very nice move from zotac imo! :toast:
but they are quite late... they really should have hurried and released this at the end of last year...
i think for 199-250$ this would be a great or good card, but my guess is that itll come at 299$ and there it will fail...
i bet next gen consoles will be using g92...
mmhhhh dont think so... if nvidia would have managed to port it to 40nm, then yes, it would be very interesting... veeery interesting... even lower power, possibly higher clocks, lower cost... and 10.1 support... but g92 is only 10.0, not even 10.1... and its 55nm...
oh god, no, where's the next gen in that?
they have to use some 5970 level graphics power... or 10 years from now, we'll still be playing games with the same graphics as today, with 500 FPS and a new card release promising 600FPS instead of 500FPS is going to get us all excited...
pphhhhhhhh
gts250 is a pure rebrand, its 100% a 9800gtx+
and 5970 perf in a console? what for?
most people who have or will buy a new console have 720p displays, even a 5850 would be overpowered for that...
sure, you would have lots of headroom to let game devs come up with weird and intense new effects, but think of the costs and power consumption... i think next gen consoles will be based on second gen dx11 hardware, some 28nm mainstream-entry level stuff... cheap, low power, good dx11 perf... that should last a couple of years...
Man, I gotta say that the G92 has some serious staying power. 4850 levels of performance aren't really bad at all, put it on a newer processing node, give it a gig of gddr5 and overclock the hell out of it. You could squeeze out one more generation Nvidia!
Why upgrade hardware in the first place if its handling what you need it to? ONE of my HD4830s handles all the games I play with max settings @ 1680*1050, but I still have two of them in CF.
Power is good :D
Interestingly/Strangely enough, I would prefer it if Zotac were to make a GT 240 x2 card. Its TDP would be less than 150W, so it would only need one 6-pin cable. The clocks could easily be adjusted to GTS 250 levels (the GT 240 allows this headroom), while it would be cheap to manufacture (due to the 40nm scale of the GPUs) and it would also have DX10.1 support. Of course it would have 25% fewer shaders (192 total instead of 256), but it would still be squarely at the GTX260/HD4870 level of performance, using less power and being cheaper to manufacture.
At $150, using a single power connector, whilst having nVidia's badge (TWIMTBP, PhysX) and DX10.1, it would be a good match against the HD5770, certainly up until nVidia (*actually*) decides to create a mid-range 40nm part.
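A quick back-of-the-envelope sketch of the power and shader math in the post above. The 69 W per-GT 240 TDP and the 75 W slot / 75 W 6-pin budgets are my assumed round figures, not numbers from the post:

```python
# Rough power and shader math for a hypothetical GT 240 x2, as argued above.
# The 69 W TDP and 75 W slot / 75 W 6-pin budgets are assumed round numbers.
GT240_TDP_W = 69            # assumed board power of a single GT 240
SLOT_W, SIX_PIN_W = 75, 75  # PCIe slot and one 6-pin connector budgets

dual_tdp = 2 * GT240_TDP_W
budget = SLOT_W + SIX_PIN_W
print(f"dual GT 240: ~{dual_tdp} W vs {budget} W budget "
      f"-> {'fits one 6-pin' if dual_tdp <= budget else 'needs 8-pin'}")

# Shader counts: 96 per GT 240 vs 128 per GTS 250
dual_240, dual_250 = 2 * 96, 2 * 128
deficit = (1 - dual_240 / dual_250) * 100
print(f"shaders: {dual_240} vs {dual_250} ({deficit:.0f}% fewer)")
```

Under those assumptions the dual card squeaks in under the 150 W single-6-pin ceiling, with a 25% shader deficit against a GTS 250 x2.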
lol spdif
INability you mean, YOU try that resolution on a 4650 :rofl:
lower tdp, yes
only one 6pin connector, yes, but who cares? :D
clocks easily adjusted to gts250 levels... im not sure man... nvidias 40nm parts dont clock that well...
dx10.1 support, yes, but how useful is that?
gtx260/4870 perf level, yes but it wont oc...
cheaper to make, im not sure... 40nm costs 30% more than 55nm and has worse yields, so i think its actually about the same, and clocks slightly worse, so...
you get dx10.1 and gddr5 and lower tdps but lose some clocks AND 30% of the gpus performance... tough trade... id go for a 250X2 over a 240X2 every day... this card will only last a year or two anyways... for that time you dont need 10.1 support. a 240X2 would support 10.1 but would probably be too slow to render games in that mode, so... heh... :D
@saaya: Indeed the low yields of the 40nm process are a problem which may or may not be fixed in the upcoming months. Still, a GT 240 x2 would be a good card for the *mid range*. I have to agree, though, that a GTS 250 x2 is quite a bit more powerful, but that also comes at the cost of a much more complex chip, which both consumes too much for the performance it gives and is not exactly cheap to manufacture...
Supposedly/Apparently G92 chips are easier to come by than GT200 ones, which also says something about the "success" that GT200s had as chips :rolleyes: ...
Galaxy GTS250 X2
http://en.expreview.com/2010/03/10/g...sion/6851.html
Hmm, so it would appear that if this gets launched around GF100/Fermi, you will get a nice tree:
GTX 480
GTX 470
GTS 250 X2
The result: G92 lives on even after GF100 has shown up.
it's like GTS 500 http://img695.imageshack.us/img695/932/awesomet.png
nvidia did that... but they cancelled it... they tried to shrink gt200 to 40nm and add 10.1 and gddr5 support, and they also tried to shrink G92... but it didnt work...
all they managed to get out was the small 40nm 10.1 chips... the biggest one is 30% smaller than G92... nobody knows what ever happened to the rest... i guess nvidia figured it would be too late for 10.1 parts, especially with the 40nm delays, so they skipped those and went for dx11 fermi right away?
no idea... charlie posted that their shrinks failed because G92 and gt200 were originally 80nm and 65nm, and shrinking them past 55nm isnt possible, you basically have to redesign them cause each major node step uses different transistors and you have to follow different rules... so instead of taking g92 and gt200 apart and putting it back together, they probably figured why not pull ahead gt300 instead for full dx11 and improved perf...
then again, they DID take G92 apart and put it back together and made some 10.1 40nm parts...
so they managed to do it... they def COULD have used more blocks and make a G92 style 10.1 40nm chip... or even gt200 sized... but they didnt... but why? who knows...
youd think bad yields... but then it doesnt make sense to INSTEAD come up with a much bigger gt300... which suffers even worse from bad yields... its a mystery to me... :shrug:
heheheh nice smiley :D
hmmm so there are several 250x2 cards?
the evga g92 gt200 hybrid was most likely cooked up by evga AND nvidia...
im starting to think the dual g92 card is actually an nvidia design as well... and every partner that wants to, can use it...
Oh man, it will be another 8600 GX2 like the one I wrote about :rofl: Total fail. If you want this kind of performance, then go for a Radeon HD 5770/HD 5830. You get the performance without fearing whether it will actually scale well or badly with SLI. In some games a dual-GPU card even performs as if it had only one core! You'd get the same or lower FPS than a 9800 GT in that case. :down:
G84 was a profoundly bad and wasteful architecture; put it in any numbers on a single card and it would still suck. I actually find the GT 240 one of the best chips nVidia has produced lately; it runs cool and it's quite powerful (almost 9800GT level of performance with almost half the consumption).
As for SLI support, I find it excellent lately; my GTX 295 behaves like a single card in any number of games. SLI is mature, better on so many levels than in the G8x days. The "RUSE Beta" I played lately -for example- gives almost 100% scaling at medium resolutions -with no hiccups- and it's still in the Beta phase.
Even if they got everything else wrong, nVidia -lately- produces excellent drivers with -almost- universal support. The games that do not support SLI are probably too old/simple to make use of the extra juice in the first place...
you mean they massage your back and caress your thighs? :S
:lol:
sorry, couldnt resist ^^
id still prefer a gts250... while sli support is much better, its still not perfect, and some games scale nicely with sli, but there are still glitches and some games stutter... so the ability to fall back to a single G92 at 750mhz+ is very welcome, at least by me... :)
Interestingly enough, its TDP would have been lower than a single GTS250's, while offering 50% more shader power.
Anyway, obviously my "recommendation" is not even that; companies do not seem too inclined in that direction (the bad 40nm yields should also play some role), thus my idea is not of any consequence anyhow.
The grievances that many of you have about SLI, though, I think are unfounded; you'd find more glitches in any given game -nowadays- for unrelated reasons than because of SLI. Also, I can think of no modern (post-2007) AAA title that has no SLI support...
mass effect 1, for example... it had SLI support but in one map of the game FPS would be around 10 when you enabled SLI (60 when you disabled it).
That map turned out to be where the boss battle took place... And even months after the game's release, it still wasn't fixed.
250x2 should be almost as fast as a 280, which the last time i looked is faster than a 4890
http://media.bestofmicro.com/3/H/226...%20No%20AA.png
The 5770 is closer to the GTX 260 in performance than to a single GTS 250.
Besides, is it not true that the 4870 performed close to the 9800 GX2, and the 5770 performs close to the 4870?
This is a 9800 GX2 with faster speeds, but also higher consumption than the 5770, not to mention it lacks DX11 support.
it's like a better version of the 9800gx2, which ran out of memory at high res, but the question is: do we really need another 9800gx2?
40.9/34.1/21.3 5770
37.3/30.8/18.8 gts250
and you say they are not close?
the 260 is faster than the 5770, so of course a 5770 is closer to a 260 than a gts250... but whats your point?
i dont understand how you are trying to conclude that a gts250x2 would be comparable to a 5770 when a single GTS250 is almost as fast as a 5770...
and a 9800gx2 performing CLOSE to a 4870... well how do you define close?
a 9800gx2 beats a gtx280... and a 4870 is slower than a gtx280, in fact its slower than a gtx275...
so to sum it up, you say a 9800gx2 is close to a 4870 when in fact there are 2 cards in between, a 280 and a 275... i think we can conclude that NO, a 9800gx2 is not close to a 4870, and a gts250X2 would be even faster than a 9800gx2...
a 9800gx2 clocks in at 600/1500 with 1000 memory vs 738/1600 and 1100 mem...
so a dual GTS250 would be 23%/6%/10% faster, all combined id guess around 15%...
which means a dual GTS250 card would be faster than a GTX285 and almost as fast as a 5850...
so NO, a gts250x2 is not "almost" faster than a 5770, its MUCH faster... ;)
and since there are GTS250 1gb cards for as low as 99$, this card has some serious price perf potential and can cost 250$, much less than a gtx285 and less than a 5850...
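The clock comparison above can be sketched in a few lines. This only computes the percentage deltas from the quoted clocks; the "combined" figure is a naive unweighted average, not a real performance model:

```python
# Percentage clock deltas between the 9800 GX2 and GTS 250, per the post above.
gx2    = {"core": 600, "shader": 1500, "mem": 1000}  # 9800 GX2 clocks (MHz)
gts250 = {"core": 738, "shader": 1600, "mem": 1100}  # GTS 250 clocks (MHz)

deltas = {k: (gts250[k] / gx2[k] - 1) * 100 for k in gx2}
combined = sum(deltas.values()) / len(deltas)  # naive unweighted average

for k, v in deltas.items():
    print(f"{k}: +{v:.0f}%")
print(f"rough combined gain: +{combined:.0f}%")
```

(The shader delta is 6.7%, which the post truncates to 6%; either way the hand-wavy combined gain lands in the ~13-15% ballpark.)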
Has anyone heard any updates on these cards?
You forgot frequencies, saaya:
http://tof.canardpc.com/preview2/858...3dd022d52c.jpg
Quote:
The dual-core GTS250 running at 675MHz/2000MHz/1696MHz (Core/Memory/Shader) got a 3DMarkVantage score of P13964, the performance is between GTX285 and GTX260.
How lovely for folding, and I just so happen to have 3 spare PCIe x16 lanes.
it seems to me that only two things will make a difference between the performance of this card and the gx2.
1. stock volts: the gx2 runs 1.15v in 3d; if the 250x2 comes with the same, it will run the same, unless the 250x2 has much better cooling, then maybe one more step in core speed, if that.
2. ram: on the base evga and bfg 250s i have, the ram is rated at 2600. if used here, it might have a small performance advantage over the slower-ram gx2s (or even 98gtx+) overclocked.
on the other hand, i'm not sure if this card can be volt modded with software like the gx2.
that is, if the card ever makes it into the wild.
oh the clocks are already decided?
lower core but higher shader clocks... sounds good!
and yeah, weve been hearing about those gts250x2 cards for a while now... still no launch date?