the "über" cards from nv since the 7xxx series allways where "near EOL" products, milking out the last drop of the enthusiasts. :D
I've been using a single 8800GTX @ 630/1900 since Nov '06 and now I'm considering an upgrade. This card still handles all games with ease, but I'm thinking about picking up a second one for SLI. But now that the 9900GTX is nearing, do you think a single 9900GTX would be able to outperform dual 8800GTXs @ Ultra speeds? Or should I just sell the 8800GTX (nvidia reference card) and wait for the 9900GTX?
tough decisions...
try to get rid of that 8800 GTX as quickly as possible (more money) and buy a GTX 280 (faster than SLI, no issues), if you've got $400 to put onto the $150-200 you'll get for the 8800
edit: lol, those prices on eBay are so ridiculously high, so you can expect even more
Just put the card on ebay, 9900GTX here I come! :D
I'm willing to bet my Seasonic S12-500W will run the E8400 and the GTX 280 just fine. I ran an RTHDRIBL + Prime95 torture test with a 3.4GHz 1.4V Q6600 and an overclocked 8800GTX for a few hours last year with ZERO issues. Wattage requirements are grossly overstated to get you to buy :banana::banana::banana::banana: you don't need. My E8400 is using 100-150W tops, and another 250W from the GTX 280 will put it right at the peak of my PSU's efficiency curve. Quality > quantity.
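For anyone wanting to sanity-check their own setup, here's a rough back-of-the-envelope in Python (all the draw figures are ballpark assumptions pulled from the post above or guessed, not measurements):
Code:
# Rough PSU headroom check -- all draws are ballpark assumptions,
# not measurements; adjust for your own hardware.
cpu_watts = 150     # E8400 under load, high-side estimate from the post above
gpu_watts = 236     # GTX 280 rated max board power (rumored)
other_watts = 60    # mobo, drives, fans -- my own rough guess

total = cpu_watts + gpu_watts + other_watts
psu_watts = 500

print(f"Estimated draw: {total} W ({100 * total / psu_watts:.0f}% of {psu_watts} W)")
# -> Estimated draw: 446 W (89% of 500 W): tight, and most of it lands on
# the 12V rail, so the quality of the unit matters more than the label.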
Though I think you are correct in your situation (solely b/c of the brand), there are other things running aside from your CPU and GPU on the 12V rail, the mobo for example. Not to mention there are items on the 3.3V and 5V rails that, in a lot of PSUs, take away from the max output of the 12V rail.
PS, Hi neighbor!!!
RV770 can only compete with the 55nm G92b, because we know GT200 will
be a monster in both graphics performance and quality, and hot as hell... smells like
barbecue to me, hope not.
not really, but it's obvious AMD's offering becomes mid-range with GT200 on the road,
with the exception of the dual RV770 of course.
old stuff, check out: http://www.nordichardware.com/news,7765.html
Aqua Computer Launches Aquagrafx G200 for GeForce GTX 280 and 260 Video Cards
With NVIDIA's next generation video cards just around the corner, German water cooling expert Aqua Computer has decided to launch a video card water block that will fit reference NVIDIA GeForce GTX 280 graphics cards. Supposedly, the water block will fit GeForce GTX 260 cards too, but this information is not confirmed. Just like every other Aqua creation, the Aquagrafx G200 is a full-cover all-copper block with G1/4" connectors and channels optimized for very low flow resistance. Now we only have to wait for the cards to arrive; until then this GPU block has nothing to mount on. No word on pricing and availability yet.
source: Aquacomputer
I guess you all already know that GT200 will be the GeForce GTX 280 and 260, and G92b will actually be the 9900 in 55nm
The 500 watt Liberty is a decent PSU... just let him try it before going broke on wattage. I bought a Galaxy 1kW PSU to run SLI on 8800GTXs and thought the wattage demand was overrated, as that rig also ran fine powered by an Enermax Liberty 620... we will have to see what the real demands are when the reviews pop up. In case of instability it might be his PSU that borks out... only time will tell
and that's again one sexy waterblock there, me droooooooooooools
Repost repost repost.
http://www.xtremesystems.org/forums/...d.php?t=188698
http://www.xtremesystems.org/forums/...d.php?t=188821
http://www.xtremesystems.org/forums/...d.php?t=188859
Also, where do you get your info that RV770 will lose, benchmarks? Insider?
I might do what someone has already mentioned and avoid the initial price gouge. No pressing need here. :yepp:
I am building a new system around late June so I'm going to get owned by the price gouge! :(
Hopefully the GTX 260 will be affordable. I think 350-400 euro ($550-630) would be a good price
*edit*
LOL Scroll down for official GTX 260 specs!!!!!!!
http://www.sunocoinc.com/NR/rdonlyre...oco_260GTX.jpg
http://www.sunocoinc.com/Site/Consum...noco260GTX.htm
*Warning* the INQUIRER *Warning*
http://www.theinquirer.net/gb/inquir...0-280-revealed
LMAO, they prefaced their prediction with "quite likely" then stated the 280 would "lose very badly"; uncertainty and strong predictions are typically mutually exclusive... Still, I'll believe it when I see it. If true, nobody would buy a $600 card that gets its arse kicked by a $350 card.
Quote:
The 280 has 240 stream processors and runs at a clock of 602MHz, a massive miss on what the firm intended. The processor clock runs at 1296MHz and the memory is at 1107MHz. The high-end part has 1G of GDDR3 at 512b width. This means that they are pretty much stuck offering 1G cards, not a great design choice here.
The 280 has 32ROPs and feeds them with a six and eight-pin PCIe connector. Remember NV mocking ATI over the eight-pin when the 2900 launched, and how they said they would never use it? The phrase 'hypocritical worms' come to mind, especially since it was on their roadmap at the time. This beast takes 236W max, so all those of you who bought mongo PSUs may have to reinvest if they ever get three or four-way SLI functional.
The cards are 10.5-inch parts, and each one will put out 933GFLOPS. Looks like they missed the magic teraflop number by a good margin. Remember we said they missed the clock frequencies by a lot? Here is where it must sting a bit more than usual, sorry NV, no cigar.
The smaller brother, aka low-yield, salvage part, the GTX260 is basically the same chips with 192 SPs and 896M GDDR3. If you are maths-impaired, let me point out that this equates to 24 ROPs.
The clocks are 576MHz GPU, 999MHz memory and 896MHz GDDR3 on a 448b memory interface. The power is fed by two six-pin connectors. Power consumption for this 10.5-inch board is 182W.
This may look good on paper, but the die is over 550mm², 576mm² according to Theo, on the usual TSMC 65nm process. If you recall, last quarter NV blamed its tanking margins on the G92 yields.
How do you fix a low yield problem? Well, in Nvidia-land, you simply add massive die area to a part so the yields go farther down. 576 / 325 = 1.77x. Hands up anyone who thinks this will help them meet the margin goals they promised? Remember, markets are closed Monday, so if you sleep in, no loss.
The 260 will be priced at $449 and go up against the ATI 770/4870 costing MUCH less. The 280 will be about 25 per cent faster and quite likely lose badly to the R700, very badly, but cost more, $600+.
Sounds like a ton of biased bull:banana::banana::banana::banana:, but wait, that's all the Inquirer publishes.
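For the curious, the die-size/yield relationship the article is gesturing at is real even if the snark isn't. A toy Poisson yield model in Python shows why a bigger die hurts (the defect density here is a made-up illustrative number, not TSMC's actual figure):
Code:
import math

def poisson_yield(die_area_mm2, defects_per_mm2):
    # Classic Poisson die-yield model: Y = exp(-A * D0)
    return math.exp(-die_area_mm2 * defects_per_mm2)

d0 = 0.002  # defects per mm^2 -- purely illustrative
print(f"~325 mm^2 (G92-sized) die: {poisson_yield(325, d0):.0%}")   # ~52%
print(f"~576 mm^2 (GT200-sized) die: {poisson_yield(576, d0):.0%}")  # ~32%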
Unfortunately that will not happen.
And with mid to high AA/AF levels, the gap will be even bigger.
Considering that a gamer who takes $350 out of his pocket wants to play with full details and AA/AF, the high end sector is green once again.
I'd really like ATi to come really close to or even surpass nVIDIA this round, for the sake of healthy competition & our pockets (when I game, I game with AA/AF, my eyes are too picky :D, so yes, I'm talking about AA/AF enabled).
HAHAHAHA. Anyone else notice this?:rofl::rofl::rofl::rofl::rofl::rofl::rofl:
D for DUMB@$$! That equates to 28 ROPs, as nvidia's ROPs are directly tied to the memory controllers: 448/64 = 7 clusters of 4 ROPs each, which means 28 ROPs.:rofl::rofl::rofl::rofl:
Quote:
The smaller brother, aka low-yield, salvage part, the GTX260 is basically the same chips with 192 SPs and 896M GDDR3. If you are maths-impaired, let me point out that this equates to 24 ROPs.
Ohh Inquirer, you never fail to amaze me and my maths skillz (being in calculus as a 15 year old never hurts either).
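The arithmetic spelled out in Python, assuming NVIDIA keeps its usual 4 ROPs per 64-bit memory partition (which held for G80/G92):
Code:
# On G80-era NVIDIA GPUs, ROPs scale with memory partitions:
# one 64-bit memory controller per partition, 4 ROPs each.
def rop_count(bus_width_bits, rops_per_partition=4, partition_bits=64):
    return (bus_width_bits // partition_bits) * rops_per_partition

print(rop_count(512))  # GTX 280: 8 partitions -> 32 ROPs
print(rop_count(448))  # GTX 260: 7 partitions -> 28 ROPs, not the 24 claimed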
Personally I think he didn't realize that the performance of 240 GT200 shaders != 240 G80 shaders. So if that were the case, with the law of diminishing returns, I could see him being partially right, but a core of 1296? My D9GKX can't hit that!:shakes:
Guys, don't take half of what that article says seriously. Judging by the folding performance of the GTX 280 (and how companies love to give out numbers no one else seems to be able to get before launch), my estimate is that the GTX 280 and 4870X2 will be very close in performance (GT200 having a slight lead), probably closer than nvidia would like because of how bad their yields will be.
http://www.theinquirer.net/gb/inquir...0-280-revealed
doesn't sound very promising, not at all
if that's true, that card isn't going to have a long life...
That's possible, if it was only a 2x shader domain (can't see it being 2.5x, as the core clock would be too low), in which case the core would be at 648MHz, which is far more reasonable. But if the GT200 shaders are only at 1296MHz, there's a chance they may not actually bring the performance boost we're expecting, regardless of how much more efficient they are
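Quick numbers on the ratios being tossed around, taking the quoted 1296MHz shader clock at face value:
Code:
shader_mhz = 1296
print(shader_mhz / 2.0)            # 648.0 MHz core if the shader:core ratio is 2.0x
print(shader_mhz / 2.5)            # 518.4 MHz core at 2.5x -- implausibly low
print(round(shader_mhz / 602, 2))  # the quoted 602 MHz core implies a ~2.15x ratio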
And don't worry, I wasn't shooting the messenger, just farting fireballs at Charlie's left nipple:D
And seriously people, please read the page before posting something; the 4xxx series thread has like 5 of the same links in some cases (just a thought)
yep, those clocks really suck
that's 24% less compared to the 9800GTX
Not to mention the 9900GTX will probably be somewhere close to a 2GHz shader clock with the G92b core, and will definitely OC better than GT200
That's right guys, keep your hopes down.
The lower your expectations become, the bigger the excitement with the first numbers ;)
Awfully low shader domain clock...
.
.
.
Yay! More OC'ing headroom!!!
:D
lol god, this article is so full of errors... i think he wrote it while he was drunk....
Quote:
The clocks are 576MHz GPU, 999MHz memory and 896MHz GDDR3 on a 448b memory interface
uh, so we have RAM with 2 clock domains... nice. :rofl:
not necessarily
imagine what those cards will suck energy-wise with voltmods + extreme OC,
probably >300W
Sometimes I wonder just how much about hardware the INQ actually knows...
x2
GT200 is a monster in every way...
Just smells like barbecue to me, that's the part I don't like.
And huge electricity bills...
AMD can't compete with this GPU with a single chip, perhaps with the 55nm G92b,
but the GeForce GTX 280 can't even be cooled by air, that's how bad it gets!
Remember the Ultra was 185W and nvidia had heat issues with G80,
so we all know 240W is just too much for air cooling!
Aqua Computer has already shown us their offering, and that tells us a lot.
They could use this sentence to promote the card:
GEFORCE GTX 280
So powerful it can't be cooled by air.
Get your nitro kit, only $99,000. Nvidia®
ahhaha LOL :D
cheers
Hehe, look at the third quote in my sig :D, it dates back to who knows when (when the first rumors of the GT200 series emerged). With such a HUGE die area, it better have some great air cooling solution, since if that's not the case, it will be the best selling product in the "arctic circle and surrounding areas" :D.
Benchmark Reviews tells a completely different story than The Inquirer:
Quote:
Jason Paul, the GeForce Product Manager and NVIDIA veteran, came right out and dropped the new product bomb: the GeForce GTX 200 graphics platform. Perhaps it was the off-interest discussion of CUDA which lowered the attention span, but Jason's brief discussion revealed that the new GPU plays Crysis "damn fast". There wasn't any time wasted, and Jason quickly introduced Tony Tamasi to present the GTX 200 compute architecture. Unfortunately, the non-disclosure agreement Benchmark Reviews honors with NVIDIA prevents me from disclosing the details for this new GPU.
So you might be wondering what Jason's holding in the image above, right? It's large, almost the size of an original Pentium processor, except that this particular item has 240x the number of cores inside of it. I would love to tell you what it is, but suffice it to say there's a good reason why Mr. Paul has a smile on his face. It put one on my face, too. Benchmark Reviews will reveal more at 6AM PST on June 17th, 2008.
http://benchmarkreviews.com/index.ph...d=178&Itemid=1
Quote:
Just wait until June 17th when the GTX 200 series of GPU's launch, and you'll start asking yourself when you last witnessed such a dramatic technology improvement. If you thought the GeForce 8 series blew the 7-series out of the water, this is going to leave you in shock. That's not my own marketing spin... Benchmark Reviews is presently testing the new GeForce video card.
LoL...
Best quote ever-
Since I know nothing of the site, I did a little investigating.
Quote:
Originally Posted by Benchmarkreviews
Out of the 17 pages of reviews, 5-10 reviews per page (most were not of GPUs), there was 1 for an AMD/ATi card...
While the G80/G92 series had multiple reviews of the same card? 10 reviews total.
Interesting...
Exactly, there's going to be a LOT of shocked faces when this thing drops. :yepp:
Seriously, I don't get why people are still underestimating this thing. NVIDIA had an entire extra year to work on this chip and tweak it to its maximum. Why do people think it's going to be slow enough for an RV670 on steroids to surpass it?
So then the performance gain from G80 > G200 will be much higher than that of 7900GTX/G70 > G80! This implies a 3-times gain, which would mean this product is a must-have! Wonder if its price/performance will be greater than that of the 9600GT.
Easy tiger.
Don't rush into things, and certainly hold your laughing for later usage ;)
If we were talking about drugs, then AMD's steroids are ecstasy... while nVIDIA uses a mix of every kind of drug around.
Plus the MUL is working "properly" all the time now (one thing you're missing when comparing the "GT200" with G80).
[mode=hands_on];)[/mode]
Well, let's just hope they make the high TDP worth it.
After all, they've had a lot of time, considering how sure some members here were about G92 being the 1TFlop beast hitting us at the end of last year...
G200 is the codename, as evident from the chip shots seen on various pages.
Remember the last photos of the GT200 cooler???
http://img253.imageshack.us/img253/5...9478612ne9.jpg
now look at this card :
http://img58.imageshack.us/img58/488...iaeditoup6.jpg
;)
.......
regards
Is this June 18th launch a paper launch, or will the cards be available in stores on the 18th?
A low availability launch always sucks big time; hope decent numbers of physical cards are ready for release.
Dang. Searched and still can't find information regarding the bolt pattern around the GPU. Any chance old 8800/9800 air and water will mount on this new card?
NVIDIA's GeForce GTX 280 will be impressive
Just wait until June 17th when the GTX 200 series of GPU's launch, and you'll start asking yourself when you last witnessed such a dramatic technology improvement. If you thought the GeForce 8 series blew the 7-series out of the water, this is going to leave you in shock. That's not my own marketing spin... Benchmark Reviews is presently testing the new GeForce video card.
Source: Benchmark Reviews
If that is true I'm going to faint. M-m-m-m-mooooonster card. :D
Bigger performance increase than from the 7 series to the 8!!!!!!
gundamit,
According to the leaked 3D model of the G200 circuit board, the centers of the mounting holes are 61mm apart, so they aren't compatible with those on G80/G92, which are 54mm apart.
Looks about right to me.
http://img156.imageshack.us/img156/6...day2008yh0.jpg
The above image lightened a bit, I don't think anything is being faked here. Just the perspective is different.
http://img55.imageshack.us/img55/143...iaeditoft9.jpg
Yeah, it looks alright in this picture; in the other one it seems bigger. Or is it a case of a worse photoshop? :)
(if ya notice the weirdness in the shadow behind the CEO, it seems something was chopped...)
i gotta say, as a long time nv owner who has never owned an ATI card, if the rumors turn out to be true that the GTX 280 costs >$600 USD... the rumored 4870's $350 USD price tag sounds very good
but then again, for 600 bux the GTX 280 might be another 8800GTX = long product life... arrrrgh, hard to choose
wonder why they didn't do an 8900GTX on 65nm with a higher core clock and more ROPs/shaders/TMUs etc., like 7800 GTX 512/7900GTX vs 7800GTX 256
Can't have your cake and eat it too....
7800GTX-512 had higher clocks on the same process and ran hotter
7900GTX had higher clocks on a smaller process and ran cooler
Neither had an increase in any type of unit over the original 7800GTX-256.
Expecting a 8900GTX on the same 65nm process as G92 with both higher clocks and higher unit counts is a bit optimistic. G92 wasn't exactly sipping power.
Still, it looks like Nvidia came in ~100MHz shy of its target shader clock... 1400MHz would have taken them to the 1 teraflop finish line.
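The math behind that, assuming NVIDIA's usual 3 FLOPs per SP per clock (MADD + MUL, the same counting they used for G80):
Code:
# Peak single-precision throughput, counting MADD + MUL = 3 FLOPs
# per stream processor per clock.
def peak_gflops(sps, shader_mhz, flops_per_clock=3):
    return sps * shader_mhz * flops_per_clock / 1000.0

print(peak_gflops(240, 1296))  # 933.12 -> the rumored ~933 GFLOPS
print(peak_gflops(240, 1400))  # 1008.0 -> 1400MHz would clear a teraflop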
fornowagain, thanks for the alternative picture. But even if the first one was shopped, based on yours I'd still say this thing is massive, even though the 4870x2 is going to be a huge chunk of copper, too.
But what I really wonder is how much energy this thing will suck while idling. ~240W peak seems like a lot, okay, but if they've managed to keep the idle power draw to a minimum I'd be okay with it.
Don't act like a jerk please.
Quote:
Originally Posted by LordEC911
Don't know what made you say that, but it's definitely not beneficial for yourself, and surely not for another member of XS.
here is coverage of NVIDIA's Editors Day 2008: http://benchmarkreviews.com/index.ph...=178&Itemid=46
taken right from the article: "Video games are now seeing an added dependence on physics processing, just as well as GPU and CPU. GPUs such as NVIDIA's upcoming GeForce graphics processor have shifted the dependence of video games onto the GPU, and added an on-board PhysX co-processor; it won't be long before the CPU really offers no level of performance for video games."
DAMN!!
I find it funny that nvidia is highlighting PicLens, an OK Firefox plugin.
I trust DilTech and BenchZowner about GTX 280. Hope you are enjoying the toys! :)
Now games just needs to catch up...
@alig
yeah, i was just thinking exactly the same thing
especially the statement "on-board" makes that apparent (PhysX is not managed by the GPU, otherwise it would have been "on-die" or something like that)
after all, who would actually use a GT200 card inside a media center PC? there are cards taking only 1/10 of the electricity that are much quieter and cheaper, but still offer good HD support (if NVIO is not integrated into the GPU, which i seriously doubt; the only reason that thing is included is to make the specs more impressive / people using that card surely have a processor that can run HD without GPU support)
It's NVIO 2.0, not PhysX.
A PhysX chip would also need additional memory chips.
Physics on a GPU is still a joke, and it won't change.
Reminds me, ain't it about time to axe the analog outputs (TV-Out)? Everyone and their mother has DisplayPort/VGA/DVI/HDMI on their TV and equipment, with VGA usually being combined with one of the others.
I'm still wondering what's going on with Alan Wake... the website hasn't been updated since...well.
Can I run 2x GTX 280 in SLI with a 750W PSU?
feel the power of 512-bit & 240 stream processors
Physics and CUDA all in the same GPU, lovely
Well, to me it looks like you've taken what I said ( note I didn't say you are a jerk, I said don't act like a jerk ) the other way round, probably a round that suits your "purpose" better.
I've got nothing against you and I'm not here to flame people.
But the way you expressed your "thoughts" sounded bad.
People can be funny/amusing to other people in two ways... the good way, and the bad way.
The usage of those icons (emoticons) helps the other party figure out which way you meant things easily and quite fast.
My apologies if you felt offended in any way.
Very true. The strengths that the GTX 280 carries will make a notable difference in games such as Crysis. I'm personally interested in what the card(s) make of Age of Conan. Aside from Crysis and EQ2, it's the first game I haven't been able to run remotely playably at 1920x1200 without greatly sacrificing IQ.
I'm hoping a single PA 120.3 can cool these things though. Imagine needing a loop per GPU :p
I'm most interested in whether this will be the first GPU to saturate PCIe 1.0 bandwidth.
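For reference, here's the ceiling being talked about. PCIe 1.x runs 2.5 GT/s per lane, and 8b/10b encoding overhead leaves about 250 MB/s usable per lane, per direction:
Code:
# PCIe 1.x: 2.5 GT/s per lane; 8b/10b encoding leaves
# 2.0 Gbit/s = 250 MB/s usable per lane, per direction.
lanes = 16
mb_per_lane = 250
print(f"PCIe 1.0 x16: {lanes * mb_per_lane / 1000:.1f} GB/s per direction")  # 4.0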
Check this pic out from my contact at nVidia...
280s ready to ship...
hahah i love reading this thread
you can really see who's full of :banana::banana::banana::banana: and who's not ;)
keep it up guys :D:ROTF:
Yeah, I was noticing the same thing...
Interesting though...
ok guys, here are the final specs, i.e. the real ones
"The GeForce GTX 280 is based on the GT200 graphics chip and is clocked at 602/1,296/1,107 MHz (chip/shader/memory). In addition, the card has 240 stream processors, a 512-bit wide memory interface and GDDR3 graphics memory.
Power is supplied by one 6-pin and one 8-pin connector.
The maximum consumption is a respectable 236 watts.
The GeForce GTX 260 also uses the GT200 GPU, with clock rates of 576/999/896 MHz (chip/shader/memory).
The little "brother" of the GTX 280 has to make do with "only" 192 stream processors and a 448-bit wide memory interface, but also uses GDDR3 graphics memory.
NVIDIA puts the maximum consumption at 182 watts.
Prices are $449 for the GTX 260 and more than $600 for the GeForce GTX 280."
translated with Google so excuse the grammar.
source: http://www.gamezoom.net/artikel/show...d,1,19210.html
Looks like just a rehash of the other info already floating around. Not sure I'd use "respectable" as an adjective for 236W :rofl:
GTX 260 @ ~400€ looks very sexy.
Too bad I'm unemployed right now :p:
sorry but couldn't resist...
http://img382.imageshack.us/img382/1...ja280gttr6.jpg
:yepp:
over $600 US for the 280... damn, going to be around $800 then