yes please
Hmm, ~150 W a card... can I run a quad core with two cards in SLI on a 750 W power supply?
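A rough back-of-the-envelope budget suggests 750 W is probably enough, assuming the ~150 W per card figure from the thread holds. The CPU and "rest of system" numbers below are my own rough assumptions for a quad core of that era, not measured values:

```python
# Rough PSU budget sketch. All figures are assumptions:
# ~150 W per card is the number floated in this thread; 130 W is a
# high-end quad-core TDP of the era; 100 W covers board/RAM/drives/fans.
cards = 2 * 150          # two cards in SLI
cpu = 130                # quad-core TDP (assumption)
rest = 100               # motherboard, RAM, drives, fans (rough guess)
total = cards + cpu + rest
print(total)             # 530
print(750 - total)       # 220 W of headroom
```

Real draw depends on PSU efficiency and 12 V rail amperage, so treat this as a sanity check, not a guarantee.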
Faster than 2 x 8800 Ultra OC in SLI?
Yes or no will do..
Not sure if this is old hat but it seems to confirm the stream processors and memory bus for G200 and HD4000 cards.
Hexus
Just curious, anyone got any info on how PCI-E 1.1's bandwidth might affect the GTX 280's 512-bit interface? Just for a single card (I have an EVGA 680i btw, not sure if it's 1.1 or 1.0).
I hope not much, 'cause I kinda want to wait until Nehalem to upgrade my board again...
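Worth noting: the 512-bit bus is the card's *local* memory interface, so PCIe link speed doesn't throttle it directly; the two bandwidths are separate. A quick comparison, using the rumored GTX 280 memory clock from around this time (~1107 MHz GDDR3, 2214 MT/s effective, which is an assumption here):

```python
# PCIe 1.1 x16: 2.5 GT/s per lane, 8b/10b encoding -> 2 Gbit/s usable per lane.
lanes = 16
usable_gbit_per_lane = 2.5 * 8 / 10
pcie_gb_s = lanes * usable_gbit_per_lane / 8   # GB/s per direction
print(pcie_gb_s)                               # 4.0

# Rumored GTX 280 local memory bandwidth: 512-bit bus, ~2.214 GT/s effective.
bus_bytes = 512 // 8
mem_gb_s = bus_bytes * 2.214
print(round(mem_gb_s, 1))                      # 141.7
```

So the on-card memory bandwidth would dwarf the PCIe link either way; the link mostly matters for host-to-card transfers, which is why a 1.1 board usually costs only a few percent in games.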
Yeah, didn't know it supports DX 10.1... didn't Fud say it wouldn't? lol, Fud got it wrong again.
How come there are no benches? I thought the date was June 3rd for people to post results
It takes time:
it's only like midnight in America,
and cards are only being shipped to retailers/reviewers today,
then they've gotta unpack, set up shop, etc., before some guy can steal a card to leak.
The NDA stands until release, so the only chance of benchies is probably the Chinese sites, as usual.
Benches may not be up for days; at least Computex should display cards.
Hey all - didn't see this posted already.
some pics of the card have surfaced!!!
http://www.vr-zone.com/articles/Deta...ures/5826.html
http://www.vr-zone.com/articles/GeFo...shot/5828.html
:shocked::up:
Dang it Dangals you beat me to it!:clap:
Oh now i see, ur in OZ too:). The americans are still asleep;)
Not really :p:
lol.... neither am i :D
Looks like, going by GPU-Z, the GTX 280 clocks are indeed what the Inq said.
I think 2 x 8800 GTX performance (two times one card, not 8800 GTX in SLI) is to be expected from the GTX 280, just as 2 x 3870 is to be expected from the 4870.
I still can't get why the chip is so often dubbed GT200 when the die itself has G200 printed on it, with no "T". Even GPU-Z displays GT200.
:confused:
over @ vr-zone
Pics
Clocks and more pictures
looks like one power hungry monster:eek:
Dunno if this should have been in the Computex section; sorry for opening another thread if so.
cheers!
As always, nvidia's shrouds are awesome.
Some guy from another forum explained that GPU-Z doesn't prove anything: it just retrieves basic data stored in its database for a particular GPU. Which is true.
So the GPU-Z authors created the database entry for the GTX 280 with those numbers (going by the rumors, or from information they received from NVIDIA), and it just displays them. As can be seen, it can't detect the card's clocks.
Sure it can. Overclock your card by 1 MHz and run GPU-Z.
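Both posts are partly right, and the split can be sketched like this. The table contents, function names, and device ID below are invented for illustration; the real tool identifies the GPU by its PCI device ID and looks specs up in a shipped database, while clocks are queried live from the driver/hardware (which is why a +1 MHz overclock shows up):

```python
# Hypothetical sketch of how a GPU-Z-style tool mixes static and live data.
# Static specs: whatever the authors typed into the database the tool ships
# with, keyed by PCI device ID (0x05E1 here is an assumed ID).
SPEC_DB = {
    0x05E1: {"name": "GeForce GTX 280", "shaders": 240, "bus_width": 512},
}

def read_specs(device_id):
    # Static lookup: rumored/vendor-supplied numbers, not measured.
    return SPEC_DB.get(device_id,
                       {"name": "Unknown", "shaders": None, "bus_width": None})

def read_clocks(query_hw):
    # Live read: polled from the hardware each run, so overclocks appear.
    return query_hw()

specs = read_specs(0x05E1)
clocks = read_clocks(lambda: {"core_mhz": 602, "mem_mhz": 1107})
print(specs["name"], specs["shaders"], clocks["core_mhz"])
```

So shader counts and bus width on an unreleased card prove nothing (database entry), but the displayed clocks really are read from the card.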