Why? ATi has had 6+8-pin since the famous R600 core ...
is this card still due for release this month?
It looks like my evga 9800gtx...
They kept their word with the first deadline, why should they change the second one? The cards are ready and people are benching already so all that's left is ramping up production to fill up stock until release I guess..
I'm really amazed about those idle power draw numbers, if they turn out to be somewhere near reality then this is awesome. I'd be folding on it anyway, but still I think it's important that a card of this calibre and price has some kind of intelligent power saving feature. :yepp:
:confused:
i never said that the r600 wasn't power hungry, it sure was, and moreover it underperformed
i simply said the GTX280 is a power hungry monster, and maybe all that power consumed will be well worth it due to monstrous performance :up:
i just hope ati gets the 4870 out in time, it would royally suck if they delayed it for another two months or so. 4850 does look ok, but isn't anything spectacular :shakes:
ahh great stuff :) im just in the process of choosing new hardware so ill defo wait for these!!! thanks guys
I like how nVidia has this boxy thing happening with their cards.
that is one dang sexy card i would have to say. nice package. now we need the numbers:D
W1zzard is the author of GPU-Z.
There is a "database entry" for each card for amount of ROPs and the die size and etc.
I am almost positive nVIDIA provided him with the information, he is up there with the cool cats.
And GPU-Z does detect clock changes and your current clocks.
512bit is what attracts me to the GT280, but the price :down:
Probably the card only draws 150W of power but also releases 80W of heat (or something like this)
The card looks very positive in every way except for the price :(
I want benches :(
^^^^
We All Do
woooo! we're getting there! when is this launch again?? :P
Is there an expected date for benches?
Nope.... ;)
Idle: 25W
3DMark06: 147W
http://forums.vr-zone.com/showthread.php?t=283808
ohh here we go again, just stop posting about ati. period. your word on ati holds so little value these days it's not even worth your time to do so. Yes the r600 was a power hungry child in a monster costume, but no one said it wasn't, and the competition between ati and nvidia wasn't as heated as it is now, especially where price/performance and heat levels are starting to mean a lot.
236 has to be a peak number. The numbers in the slide are measured. The 45w for the 9800GTX is for the entire board. So it's safe to assume that the 150w on the GTX280 is for the entire board as well.
http://www.xbitlabs.com/articles/vid...9800gtx_7.html
The dual-PCB 9800GX2 pulls only 180w. I don't know why there's so much misinformation and misunderstanding about power consumption. It's like some great conspiracy to sell expensive PSUs or something.
HussanAli,
Then you might want to head for the proper thread?
:p:
*points at thread title*
Hm....sexy beast but I'm leaning more toward the GTX 260, $600+ doesn't seem to be a win-win situation for me. :down:
Hey......my birthday is in july......
CJ says:
http://forum.beyond3d.com/showthread...16#post1169916
Quote:
Originally Posted by serenity
I'm getting a sneaky suspicion that CJ's alarm clock didnt go off.
Or he might have hit the snooze button!
You were right.
Anyway, I got some updates.
The X4100 number for the GTX 280 was correct with the drivers at the time. Seems that NV was able to push it to X4800 with the latest drivers. They might add some more points when they enable PhysX on their GPU, but that kinda goes against Futuremark rules... since a PPU can assist for a higher score and not a GPU... this could add up to 1000 points in the CPU test... so we'll have to see what Futuremark decides about including the PhysX into the score....
But here's what you all wanted to hear.... According to the same source R700 scores about X5500> in Vantage.
Oh and my buddies at VR-Zone tested a HD4850 OC 700Mhz. P64xx score in Vantage.
Wow, looks like nvidia loses for the first time in a long time
25W in idle look nice.
GPU power only when needed.
What killed R600/RV670 was AF. If that's fixed, then I bet a lot of those "real world" benches are going to sing a different tune
Who knows, we won't, not till July.
In 2k5 and 2k6 yes, but last time i checked the ATi HD38x0 cards & drivers were not so good in 3DMark Vantage, and CJ is talking about RV770/R700 3DMark Vantage numbers, not 2k5 & 2k6...
so if those Vantage numbers are real, that might be a good sign!
Let's wait a few more days to see real benches/game performance!
regards
Dang...very nice looking card.
Never, ever compare a SLI/CF frame rate with a non-SLI/CF frame rate. The two are not comparable. SLI/CF suffers heavily from unsynchronised frames from both cards and in most games, a lot of the frames SLI/CF produces don't increase your viewing experience (don't make the game smoother) but contribute to stuttering. When you're in SLI, what FRAPS shows you as FPS is irrelevant.
It's such a basic concept, and it's so easy to prove by just taking some frame time benchmarks from FRAPS, and so easy to understand what it means; yet nearly all of the major hardware reviewing sites ignore those and say "We had a frame rate increase of 30FPS when we plugged in a second 8800GT!!" Yeah, you did, but it's probably of no use. Hardware sites should inform the consumer about this. No one should buy SLI, 3870x2 or 9800GX2. I own an 8800GT SLI setup and know this all too well.
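For what it's worth, here's a minimal sketch of that kind of frame-time check in Python. The file name is made up, and it assumes the second column of a FRAPS frametimes CSV holds the cumulative time in milliseconds at which each frame was rendered; check your own export before trusting it.

```python
# Minimal sketch: read a FRAPS frametimes CSV and compare the average FPS
# against the frame-to-frame variation. File name and column layout are
# assumptions for illustration, not something taken from this thread.
import csv

def frame_deltas(path):
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    times = [float(row[1]) for row in rows[1:]]        # skip the header row
    return [b - a for a, b in zip(times, times[1:])]   # per-frame time in ms

deltas = frame_deltas("frametimes.csv")
avg = sum(deltas) / len(deltas)
jitter = sum(abs(d - avg) for d in deltas) / len(deltas)

print(f"average FPS:     {1000 / avg:.1f}")
print(f"mean frame time: {avg:.2f} ms")
print(f"mean deviation:  {jitter:.2f} ms")
# A high average FPS with a large deviation between consecutive frames is the
# micro-stutter case: the counter looks great but the frame spacing is uneven.
```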
3870x2 didn't take the performance crown from 8800 Ultra, and 9800GX2 didn't take the crown back from 3870x2. Until the new ATI and NVidia single-GPU parts are released, the performance king is still 8800 Ultra.
Nah I know this thread is about the gtx, and I try to stay neutral where I can, I'm just sick of hearing OBR's fanboy comments
But I do have to agree that, regardless of what the performance outcomes are, the gtx 280 does definitely look a lot cooler (as in more awesome, no pun intended) than the r700
nvidia is quite a bit faster in this recent review of cod4:
http://www.anandtech.com/video/showdoc.aspx?i=3275&p=6
if you're talking about 3dmark06 scores, all the WR's are done on X2s and CrossfireX X2s
Man, the quality of the reviews (imo) has really degraded, not to mention the numbers are all over the place; sometimes you really wonder whether these sites are simply pulling numbers out of their ass, spray-painting them gold and pretending they are real, or just looking at other reviews, taking pictures of a card, then changing the numbers to their liking (but keeping them close enough to look legit)
A single card drawing 250 watts is impossible. Even an 8800 Ultra drew 130-150W. I wonder what TDP the Ultra was rated at?
236 is the absolute max, as in if EVERY single transistor of the chip is running at full load (here's a hint: that's not going to happen)... Every single shader, TMU, ROP, using ALL of the memory bandwidth as well as ALL of the memory itself... Everything.
In all reality, that is pretty much never going to happen, period. Once one part bottlenecks the performance, boom, you're not at full load on the chip. :up:
referring to whoever said the $600 was bad... wasn't that how it was with the G80 when it first came out?
512-Bit
1GB GDDR3
240 Stream Processors
PhysX Ready
CUDA Technology
PureVideo HD technology
Full MS DirectX 10 Support
OpenGL 2.1, SLI, PCI-E 2.0 Support
without using PhysX:
http://img355.imageshack.us/img355/9541/sinppuno4.jpg
using PhysX:
http://img187.imageshack.us/img187/755/conppuba5.jpg
Looking at the pictures, do you think the aftermarket heatsinks available today can be used on it, or will they have to make new ones for it?
Very funny. ;)
i'm just posting everything i find on the net :p:
I think this was hashed out in fine detail in the main thread, to the conclusion that a computer 90% as fast could be built in the U.S. for $650. Past cards, except the last generation or so, have all been much cheaper, but people just don't care nowadays and want to pay as much as they humanly can for a card, no matter how out of line it is with the past. They see it as good value to spend twice as much for 10% more performance, everyone else be damned who cares about bang for the buck. They call it the "pay to play" mentality, and deny that it's an issue.
http://www.xtremesystems.org/forums/...&postcount=731
http://www.xtremesystems.org/forums/...&postcount=750
Man.. you do realize that the bandwidth on that bus is bus width * frequency, right?
512bit doesn't mean anything by itself, it's just an out of context number. If you were discussing two buses with devices running at the same frequency, one bus at 512bit and one bus at 256bit - then there might be something interesting there. Really though you could get the same bandwidth by doubling the frequency of the 256bit bus.
So this isn't a case where "MOAR BITS EQUALS MOAR BETTAR."
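Rough illustration of that point, with made-up clock figures rather than confirmed specs for any of these cards:

```python
# Memory bandwidth = (bus width in bits / 8) * effective memory clock.
# The clock numbers below are arbitrary examples, not real card specs.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(f"{bandwidth_gb_s(512, 1100):.1f} GB/s")  # 512-bit @ 1100 MHz effective
print(f"{bandwidth_gb_s(256, 2200):.1f} GB/s")  # 256-bit @ 2200 MHz effective
# Both print 70.4 GB/s: double the clock on half the width and the bandwidth
# is identical, so the bus width alone tells you nothing.
```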
yeah agree, read those "micro stuttering for SLI/CF" threads on beyond3d,
not to mention vista driver support for games, especially new games.. i remember it took a long time for Crysis to see some micro gain in both XP and Vista (for my 8800GTS 640 SLI), and the gain is so minimal, i don't think i'll ever use SLI again after seeing how little gain it provides, and how it restricts you to nv based boards... i mean sure it scores more in 3dmark.. i care more about real world gaming performance
for me, if the GTX 280 suffers no bandwidth bottleneck from PCI-E 1.1 as a single card, i'll prolly get that and wait for the next single card monster... GTX 380? R800 single chip?
P4608 -- My 3D Monster Fusion can score that high:D
It is easy, indeed. Just tell the driver to hold off frame requests from the second GPU until a certain time passes. This way, every frame will be sync'd.
But NO!!!! FPS will be lower. So neither NVidia nor ATI applies this very easy solution, just to keep those rocketing SLI/CF results in review pages.
And when FPS goes any lower, there's no incentive to buy SLI anyway.
Multi-GPU sucks. Just go with a 4870 or GT200.
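Just to make that "hold off the frame until enough time has passed" idea concrete, here's a toy sketch. It's purely illustrative: the 60 Hz target is arbitrary and this is nothing like what either vendor's driver actually does internally.

```python
# Toy frame-pacing sketch for alternate-frame rendering: refuse to present a
# frame until at least a target interval has elapsed since the previous
# present, so frames from the two GPUs come out evenly spaced, not bunched.
import time

TARGET_INTERVAL = 1.0 / 60.0      # arbitrary example target: ~16.7 ms spacing

def paced_present(last_present, present_frame):
    now = time.perf_counter()
    wait = TARGET_INTERVAL - (now - last_present)
    if wait > 0:
        time.sleep(wait)          # this delay is the FPS you "lose" to pacing
    present_frame()               # hand the finished frame to the display
    return time.perf_counter()    # timestamp used to pace the next frame
```

The FPS counter drops, but every frame actually contributes to smoothness, which is exactly the trade-off the post above is describing.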
yeah, since SLI/CF = the extra cost (over a single card) of one more graphics card, and possibly $100 more on the motherboard.. they definitely wanna make the case that it can reach 1.8x the fps of a single card
but even with the micro stuttering, Crysis Very High DX10 in Vista on 8800GTS SLI still shows little performance gain for SLI over a single card. not that a single card is any good, but for another $100 + the cost of another GFX card, i would like to see a lot more performance gain, that's all
just my 2 cents
I don't understand this vantage crap, I like just having the numbers.
yeah so like P6400 = high profile ??, and X4600 = extreme profile??
How well future-proofed is Vantage? Because of this letter system, can't it become maxed out? I like being able to run benches that I had from my first computer to see how I've progressed, like am3: first score 17k, current score 200k. Plus I think that 06 will still be more important, as many people do not have Vista yet.
Nice looking shroud, I suppose.
They really need to clean the dust out of the CPU heatsink if you look at the second link in the OP.
wow placement of the power connectors is a big turnoff. makes it almost impossible for cable management
wow, so as someone suggested in the RV770 thread, the 4850 scored P6400.. then it's pretty bad isn't it?? if it's true
It's higher than a 9800GTX.
i thought the ultra scored like 6400+, so yeah the 4850 is pretty much an 8800 ultra isn't it?? after 1+ year since the launch of the ultra
9800GTX/8800 Ultra score from high 5xxx to mid 6xxx, depending on the cpu.
I think 4850 is more of an equivalent to 8800GTX/Ultra.
Actually I based that guess on my previous guesstimates. I guess 4870 will be around two times 3870, which puts 4850 somewhere near a G80 GTX.
The new graphics demo running on the gtx 260
http://uk.youtube.com/watch?v=K9gwJwCNvT8
I got tired of the fakes and uploaded
i upped it on rapidshare
http://rapidshare.com/files/11997794...AKES-.rar.html
youtube >>> rapidshare
btw. ;)
it's a GTX 260 in the video btw,
I've not seen any leaks of it yet but the 6+6 power connectors identify it as such
that card ain't fake, and neither is the tech demo
man, first some gripe about the cheap 9800GTX, waaaa it's not new.
now you get a card twice as fast (well, soon to be seen anyways) and $650 is too much?..... too bad
you wanted a new fast card and it seems you have to pay for it.... or just wait 2 yrs and you will get one cheaper
Or a superheatconductor
Oh, I know by the power cable numbers it's possible; but do we have a confirmed TDP rating of 250w for GTX 280?
A top end G92 (overclocked 9800GTX) draws at most near 90W. 250W is some way off double that, although under 100% load it will likely be more than 150W.
Ouch 10.5 inches for the GTX260?
At least the G80 GTS's were 9"...
At most 90W for a 9800GTX? It's more than that, the TDP is 168W btw.
Even estimating from typical system power (allowing for PSU efficiencies), the difference from idle to load is around 100W. Look at SLI figures to minimise other system loads. Work them any way you like, no way is a 9800GTX only using 90W max. More like 125W at stock, higher for an OCed version. If it were near 90W there would certainly be no need for 2x 6-pin PCI-e plugs.
http://img212.imageshack.us/img212/9...weridlehi4.gif
http://img181.imageshack.us/img181/9...werloadmp2.gif
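To put rough numbers on that kind of wall-socket estimate, here's the arithmetic spelled out. All three inputs are example figures I made up, not measurements from those charts:

```python
# Card power estimate from wall measurements: take the idle-to-load delta for
# the whole system at the wall and scale it by PSU efficiency to get the extra
# DC power actually being drawn. Example inputs only, not measured values.
wall_idle_w = 160.0        # whole system idling, measured at the wall
wall_load_w = 300.0        # whole system under 3D load, measured at the wall
psu_efficiency = 0.82      # plausible efficiency for a PSU of that era

extra_dc_power_w = (wall_load_w - wall_idle_w) * psu_efficiency
print(f"extra DC power drawn under load: {extra_dc_power_w:.0f} W")
# ~115 W here, on top of whatever the card already pulls at idle -- which is
# why a flat "90 W max" for a 9800GTX doesn't add up.
```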
You're right, my bad. It looks around 140W. If this is 140W, then a GTX 280 could easily be 250W.
I can't understand how a G80 GTX is only slightly less power efficient than the G92 GTX. This is what fooled me.
I wonder why nvidia cards are always more expensive than the ATI ones, are they really that good? in benching, isn't it ATI that has the highest in 3DMark06?
^^ most ATI/NVidia cards perform on par with their prices