That GTX 480 Vantage run was done with PhysX enabled. If PhysX was disabled it would be well behind the 5870 runs (unless I'm estimating the score wrong).
^ How do you know it was with PhysX enabled?
Probably not 6, but a few I'm sure.
And time will tell what this card can do, since it's built for DX11, which just got here. If there is any optimization to be made in DX11 titles, nvidia is probably going to get the most out of it.
That said, I believe that this card, even though it is a new architecture, does not have that much of an increase over the old one. 3870 -> 4870 was great; this, I don't think, comes even close to that kind of a jump.
I don't think it would be possible for them to validate the card without in-house drivers, so I would imagine the driver team has had plenty of time up till now. However, once the cards start shipping in quantity and end users start utilizing them, there are bound to be issues that didn't show up in the labs.
The extremely high CPU Test 2 score gives it away. Compare it against http://hwbot.org/community/submissio...70_52193_marks :)
4 days left. I need to retire my 8800GTX from PhysX duty and use a new card. Come on Nvidia, give me something worthy...!
Whether PhysX is used or not doesn't affect the base GPU score, and that is the only thing people should be comparing. Comparing CPU scores means :banana::banana::banana::banana: all. Yes, it affects the total, but not as much as people think. I still insist that people only look at the GPU score when comparing (for non-benching purposes; e.g. comparing GPUs, not identical configs) and not the total, as it is misleading nonetheless.
Don't you mean 2900XT -> 4870 was the great jump? Don't forget the 3870 was just a shrink of the 2900XT: 80nm -> 55nm and DX10 -> 10.1, that's it, if I remember correctly (a G80 -> G92 type of thing).
I hope NV gets 3DMark03 to run a lot better on the 4xx cards, like ATI does with the last two series.
you mean 512?
Bleh, I was hoping for a bit better result for the GTX470. I guess I'll see for sure in reviews, but I might be ponying up for a 480 if those scores are on target.
250W TDP for 480. Hmmm.... Do you think my Antec Earthwatts 500 will handle this plus a Q9650 OC?
It could damage the system. But go with it anyway, Fight The Power :D
A system with my specs in a firingsquad review pulled around 410w from the wall:
i7 920 overclocked
6gb ddr3
radeon 5870 stock
2 hard drives
1 ssd (in mine, at least =p)
several case fans
So a GTX 480 with a 250W TDP, call it 240W real-world draw, compared to the 5870's 170-180W if I recall correctly, would put you around 470W. So, as was said above, with an Earthwatts 500W you're going to be pushing it: it will run, but it will definitely be hitting the PSU hard, and I wouldn't count on overclocking the video card at all. I think the quad you have is a little lighter on power draw, but it's much the same story.
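For what it's worth, here's a rough sketch of the arithmetic behind that estimate. Every wattage figure is a guess, not a measurement, and the 80% PSU efficiency is an assumption:
Code:
# back-of-the-envelope wall-draw estimate; every figure here is a guess
wall_with_5870 = 410   # W at the wall, from the firingsquad-style system
hd5870_draw    = 175   # W, assumed real-world draw of a 5870
gtx480_draw    = 240   # W, assumed real-world draw of a 250W-TDP 480
efficiency     = 0.80  # assumed PSU efficiency under load

# extra DC load shows up at the wall divided by efficiency
wall_with_480 = wall_with_5870 + (gtx480_draw - hd5870_draw) / efficiency
dc_load       = wall_with_480 * efficiency   # what the PSU actually delivers

print(round(wall_with_480), round(dc_load))  # ~491W wall, ~393W DC vs a 500W rating
Note the PSU's 500W rating is on the DC side, so the wall number looks scarier than the actual load on the unit.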
Even if the eventual system load is "only" 400W... I still wouldn't push a PSU to near its max load ever
Um, can you link me to that? Xbit shows the real power draw of a 5870 to be more like 130W. I'd be surprised if an overclocked i7 used even 150W. 500W should be fine for a Q9650 and a 250W video card. The 9650 uses around 60W at stock, so even if it's OC'd and consumes 100W, he still isn't going to push the limits of the PSU.
Yeah, it's not good for it and will definitely burn it out faster. Also keep in mind that output capacity drops with heat... so if the system runs hot, it's going to drag you below the rated maximum regardless. I agree that it would be smart for him to swap in a new PSU if he's going for the 480.
Ok. I'll just stick with 470 or 5870. I don't wanna live on the wild side :p:
Twice the power use for 10% more performance, and it costs more? Please explain the appeal here.
Look here, it's sorta in the middle, 64.8W:
http://www.lostcircuits.com/cpu/inte...n/powermax.gif
http://www.lostcircuits.com/mambo//i...1&limitstart=6
Yes, the 9770 used more; it probably ran more voltage. If he has a newer-revision 9650 it probably uses even less.
come on... :D
Unless you run GPGPU DC projects, or have a laptop for web browsing, email, YouTube, reading, listening to music, watching movies etc., I'd be very surprised if your VGA is loaded most of the time your PC is turned on...
A QX9650 at almost half the TDP of a QX9770? Sorry, but that doesn't make sense... 0_o
Q9650 = 95W TDP
QX9650 = 130W TDP
huh?
My Q9400 clocked at 3.0GHz should consume almost the same as a QX9650, right? I've got less cache, but surely that won't amount to 60W. Yet under FULL system load I only see 180-190W, and that's with an 80W video card. So if my CPU used 110W as you say, it'd mean the rest of my system components are drawing no power. And I trust Lostcircuits; their methodology is quite sound.
Edit: the 45nm C2Qs were notoriously low-power chips at stock, so I don't know why this would come as a surprise.
Yeah, I trust MS a lot... but I know there was quite some variation in C2Q and QX TDPs...
I didn't know 9770s were THAT much hotter than 9650s...
Interesting... so yeah, I guess 9650s are around 80W...
Digitimes:
Quote:
Nvidia initial batch of GeForce GTX 400 series to have fewer cores than expected
They could, and would, only if it made sense. Until the release of a better 480 part (die shrink etc.) I don't think there would be a place for it, and even then any full 512SP part will likely be a limited run like the 8800 Ultra (on 40nm anyway).
Realistically, people should be expecting a card that roughly matches the GTX 295 but with more VRAM and DX11.
ALL LIES I SAY..
Pretty sure that Fermi card is hangin' out with the Loch Ness Monster and the Yeti, playing video games..
The 5870's official TDP is 188W and the 480's is 250W. If you're seeing several places that show it below 188W, you're reading 5870 reviews in which power consumption is measured. So the fact that you're saying "I'm also seeing places where 480s are closer to 290" means that you are also reading GTX 480 reviews 4 days before the NDA lifts, and there you are seeing measured power consumption of 290W. You are making perfect sense.
Plus, 2 times 188 makes 376. Still not 290.
Has anyone in all this even bothered to consider that, despite the 480 clearly having a higher TDP, it's not going to use anywhere near the full 250 watts unless it's running a game that is maxed out with high-resource graphics and full use of DX11's features? In fact you may only see it producing that much heat in synthetic benchmarks like Heaven. Too much speculation here and not enough facts; let's wait and see what the results are. Most of the time, when actual power consumption and heat output are measured in real-world applications, you find the hardware takes considerably less power than the max load it's capable of.
There seems to be a lot of confusion about computer wattage. PSUs aren't 100% efficient. Current new models have 80 Plus certs and a few go higher, but those are best-case scenarios. 80% efficiency is a realistic value, and for you in 110V land maybe a little less.
This means that if you measure a 100W draw at the wall, you're using 80W; the rest is heat in the PSU. This also means that a 500W-rated PSU at maximum load will draw 625W from the wall.
So when you see statements like "the system draws 450W", which implies a Kill A Watt measurement, that's only the maximum for a 360W PSU.
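A minimal sketch of that conversion, assuming a flat 80% efficiency (real PSUs vary with load, so treat these as ballpark numbers):
Code:
# wall draw vs DC load at an assumed flat 80% efficiency
def wall_to_dc(wall_w, eff=0.80):
    return wall_w * eff   # DC watts the components actually get

def dc_to_wall(dc_w, eff=0.80):
    return dc_w / eff     # what a Kill A Watt would read

print(wall_to_dc(100))   # 80.0  -> 100W at the wall is 80W of real load
print(dc_to_wall(500))   # 625.0 -> a maxed-out 500W PSU pulls 625W from the wall
print(wall_to_dc(450))   # 360.0 -> "system draws 450W" is the max for a 360W PSU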
you gotta be an electrical engineer to figure this stuff out :)
From Fellix. The bench settings are unclear...
Quote:
OK, some first-hand info, but my source is still under NDA, so not many in-depth details.
GTX470 is ~10% faster than stock HD5870, benched in Crysis.
The noise level under load is similar to GTX285 -- noticeable but not irritating.
That doesn't explain why they renamed their retail products though ;)
And it doesn't explain why they named the last G92-based OEM part GTS150 while the retail cards were called GTS250... same specs, same clocks, same chip... but sure, it was those oh-so-evil OEMs that pushed nvidia to rename their cards to make them sound more powerful :D
thx :toast:
No big difference though, 10% more or less... I doubt it'll make a difference.
Whether you play on a system with a 5870, GTX470 or GTX480, it should be pretty much the same... unless you game at ultra-high res, where the 480 might actually make a difference compared to the 470 and 5870... can't wait for reviews to check that :)
Oh, and about the 10% more perf for double the watts...
I'm not saying I believe it, I have no idea... but I believe he meant the actual wattage of 5870s is around 130W and of 480s is 250W, so it's ALMOST double the wattage... whether 480s really are that power hungry is another question...
Since their TDP was first rumored to be close to 300W, that might have been an electrical worst-case scenario which was very unrealistic, so nvidia lowered it to a more realistic max TDP value... but I don't think the 480 comes close to 250W under gaming load... I'd say 200W max when playing games, and a 5870 is probably around 140W max when playing games.
saaya: I think you missed that the guy reported 10% better perf in Crysis for the 470, not the 480...
However, this could still be the 2560 + 4xAA case, which says little...
Must be the most power-hungry config ever...
http://img413.imageshack.us/img413/9...r0n1097648.jpg
We might see 90 pages before the 6th of April LOL
It's only like 3 days left (it's past midnight here) and there's no comprehensive leaked stuff. Yeah, the most important specs are revealed, but nothing besides that: no trustworthy benchmarks etc. I must say this has been a very impressively kept secret. GT200 was kept quite secret too until about a week beforehand, when things started leaking out, benchmarks etc, but even 3 days before the Fermi stuff is meant to be released there are no signs of real benchmarks. Congrats nvidia on this, if nothing else. :clap: :D
It is GF100 A3, leaked by a Chinese employee of Giada.
http://news.mydrivers.com/Img/20100316/11481863.jpg
http://news.mydrivers.com/Img/20100316/11484455.jpg
Your spouse/chiropractor laughed at you when you spent $400 on it.. now who's laughing!
1500W. Let that sink in.
Mobo slot = 75W, 6-pin = 75W, 8-pin = 150W.
GTX480 = mobo + 6-pin + 8-pin = 300W max
GTX470 = mobo + 6-pin + 6-pin = 225W max
For 3x SLI you need 6 PCIe connectors, and 675-900W for the video cards alone.
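Those 75/75/150W figures are the PCIe spec ceilings (slot, 6-pin, 8-pin), so the budget math works out as in this quick sketch:
Code:
# PCIe power ceilings: 75W slot, 75W per 6-pin, 150W per 8-pin
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

gtx480_max = SLOT + SIX_PIN + EIGHT_PIN   # 300W ceiling for a 250W-TDP card
gtx470_max = SLOT + SIX_PIN + SIX_PIN     # 225W ceiling

print(gtx480_max, gtx470_max)             # 300 225
print(3 * gtx470_max, 3 * gtx480_max)     # 675 900 -> triple SLI, cards alone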
There are about a dozen 1100-1200W PSUs on the market, which suddenly don't look so excessive. But most only have enough 6/8-pin PCIe connectors for 2x GTX480 or 3x GTX470.
Newegg has three PSUs with 8 PCIe connectors, and one amazing PSU has 4x 6-pin plus 4x 8-pin ;). Sadly Antec tops out at 1200W, hardly enough for an OC'd CPU AND triple or quad SLI.
PS: If they even add metal bars/rods along the top of the heatsink, it will leave the same marks as the grill on your hamburgers :)
No, I didn't miss that...
But what's the difference between a 5870, 5850, GTX470 and GTX480?
Unless you play at super-high res you wouldn't be able to tell them apart by playing games on those rigs... same res, same AA, roughly the same fps and min fps...
So one card is 5% faster here, another 10% there... that doesn't really make a difference, does it? I hope reviews will have OCed vs OCed comparisons as well, that would be interesting... especially for the 470...
FYI
Heads-up for all the Fermi lovers out there :D
Quote:
NVIDIA will be officially launching the GF100 graphics cards at PAX East 2010. We’d like to take this opportunity to get together with some of our biggest fans. Here’s your chance to meet some of your fellow Club SLI members as well as some of the folks from NVIDIA.
http://surveys.nvidia.com/index.jsp?...ced6122cfa2c4a
If it is true at anything less than 1920x1200 8xAA very high settings, I'll be impressed. Otherwise :rolleyes: Again, we've already seen the leaks with the 470 pulling ahead at 2560x1600 4-8xAA, but clearly this is a biased scenario, as half the time the game wasn't playable (or just barely) on either card despite the extra video memory (in other words, 10% faster crap is still crap :up: ). I don't have many doubts that the 470 will do better at 2560x1600 on average; however, I still think it won't be enough, as this resolution is really multi-GPU territory (unless one stays clear of current titles anyway). I don't see the point of pairing an ultra-high-end display with a non-ultra-high-end GPU setup (not to say 5870s/470s aren't high end, they just aren't ultra-high-end solutions).
:stick:
Just ask my wife what an inch or two of e-peen can do. It's all about girth.. sorry, I mean bandwidth.
Honestly, it's all about the same with these new cards. Whatever works for your res is the best choice. My 5850 maxes out almost everything but Crysis at 1080p; I can't really ask for much more for just gaming. I think it was one of the best purchases I have ever made. I would wait for a Fermi refresh honestly, this first batch looks horrid. I predict a Fermi refresh ASAP; Fermi just isn't profitable in its current state.
I have a hard time spending more than $250 on a piece of hardware I know is going to be outdated and worthless in 6 months. I am way happier spending less and upgrading more often. :up:
PC gaming is dying fast; the consoles and ports rule the market. ACCEPT IT
What has nVidia done for you in 2 years?
5 price brackets. 2 years. 5 products. 1 chip = whole lotta slackers
Code:
$200 $250 $300 $350 $500
Jun-08 GTX 260 GTX 280
Jul-08 GTX 260 GTX 280
Aug-08 GTX 260 GTX 280
Sep-08 GTX 260 GTX 280
Oct-08 GTX 260 GTX 280
Nov-08 GTX 260 GTX 280
Dec-08 GTX 260 GTX 280
Jan-09 GTX 260 GTX285 eol
Feb-09 GTX 260 GTX285 GTX 295
Mar-09 GTX 260 GTX285 GTX 295
Apr-09 GTX 260 GTX275 GTX285 GTX 295
May-09 GTX 260 GTX275 GTX285 GTX 295
Jun-09 GTX 260 GTX275 GTX285 GTX 295
Jul-09 GTX 260 GTX275 GTX285 GTX 295
Aug-09 GTX 260 GTX275 GTX285 GTX 295
Sep-09 GTX 260 GTX275 eol eol?
Oct-09 GTX 260 GTX275 eol eol?
Nov-09 GTX 260 eol eol
Dec-09 GTX 260 eol eol
Jan-10 GTX 260 eol eol
Feb-10 GTX 260 eol eol
Mar-10 eol? eol
Apr-10 nothin nothin nothin GTX470 GTX 480
May-10 nothin nothin nothin GTX470 GTX 480
CONTRAST to AMD. Unless I'm mistaken, you can still get rock-bottom-priced $100 4850s and $150 4870s. In contrast, it's been hard finding anything beyond the GTX260 from nVidia for MANY MONTHS, and certainly nothing remotely near $100.
Notice the full lineup. No gaps. Notice how far prices fall yet availability remains.
Code:
$200 $250 $300 $350 $500
Jun-08 4850 4870
Jul-08 4850 4870 4870x2
Aug-08 4850 4870 4870x2
Sep-08 4850 4870 4870x2
Oct-08 4850 4870 4870x2
Nov-08 4850 4870 4870x2
Dec-08 4850 4870 4870x2
Jan-09 4850 4870 4870x2
Feb-09 4850 4870 4870x2
Mar-09 4850 4870 4870x2
Apr-09 4870 4890
May-09 4870 4890
Jun-09 4870 4890
Jul-09 4890 yeah, we're practically giving away those 4890s
Aug-09 4890 yeah, we're practically giving away those 4890s
Sep-09 4890 5850 5870
Oct-09 5770 5850 5870 5870x2
Nov-09 5770 5850 5870 5870x2
Dec-09 5770 5850 5870 5870x2
Jan-10 5770 5850 5870 5870x2
Feb-10 5770 5830 5850 5870 5870x2
Mar-10 5770 5830 5850 5870 5870x2
EDIT: These are very rough approximations based on MSRP.
Note I'm not counting the 2 different GTX260 types for nVidia.
For AMD, I'm not counting the HD4770, HD4830, HD4870 1GB, HD4850x2 or the 5870 Eyefinity edition.
Just like the 4870/4890, 1 year from now you will be able to buy a 5850 or 5870 for less than $200... Fermi? Only in a liquidation sale.
Since when did all of the 200 series reach EOL?
Your posts are (in general) good reads Deimos. :clap: Either full of information, full of teh lolz (not in a negative way!) or both :)
They've been EOL for quite a while now, try to find a GTX 275 in stores...
The only one left is the 285, for a ridiculous $375, which is close to the 5870. How much longer do we have to wait!!?!?!?!?
Don't forget about the G92, which is still going strong in the low end but can't really compete with ATI on price anymore. I think the G92 will never go EOL... they'll just keep putting them out.
2560x1600 is not some mythical unicorn that no one runs modern games at. It's 100% a good resolution to test at, especially since it pushes the GPU and not the CPU. Hardly biased, since most people with these kinds of cards would be running it regardless. Battlefield: Bad Company 2, which I didn't have issues with on my 5870, was running pretty well (45-50fps most of the time) at 2560x1600 4xAA. A 20-25% boost on that from a 480 would put it into nicely playable territory (57-63fps). And that's a brand-new DX11 game we're talking about. Same thing with games like Warhammer Online (another current MMO), where it's close to being playable but dips a bit too low on average to really enjoy. Need for Speed: Shift, same story really... 45-50fps on the 5870 when not encountering the issues from hitting other objects...
In short, if the 470 is 10% faster at those settings, then we can probably safely say a 480 would be 25% faster, and thus extremely attractive to high-end gamers. I disagree that 2560x1600 would be a rarity for people buying one or two $500 video cards. I'd think most people dropping that much cash on cards ($500-1k or so for a setup) would already have bought the $750-1000 monitor to really show them off, since monitors generally last several years whereas a top-end card only stays high-end for 6-8 months.
I got my Dell 30" widescreen 3007WFP-HC 2560x1600 LCD for $750 shipped (refurb, pristine condition) with a 5-year warranty from them. You can find similar ones new for $1100-1200. What you describe is like buying a super-high-end projector, running it on a 40" screen, and pairing it with a measly $100 home-theater-in-a-box: no one does that. They run 75-100"+ screens and get nice bookshelf or tower speakers to make the setup actually shine. There's little to no point in buying CrossFire 5870s, a 5970, or SLI 470s/480s just to run them at 1920x1080 or 1680x1050, and I doubt most people do. It's overkill.
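Putting numbers on that "playable territory" claim, here's the scaling spelled out (the 25% boost is just the extrapolation above, not a measured result):
Code:
# hypothetical scaling of the quoted 5870 numbers; the 25% boost is a guess
fps_5870 = (45, 50)   # BFBC2 @ 2560x1600 4xAA on a 5870
boost = 1.25          # assumed GTX 480 advantage over the 5870
print(tuple(round(f * boost) for f in fps_5870))  # (56, 62) -> roughly 57-63fps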
I can only think of 9, maybe 11 G92 products... (GT330 is probably that 40nm chip)
April 2010 - the all-new, super uber cool GTS450 (Iron Man 2 edition) - c'mon nVidia, no need to hide those PR slides, we ALL knew another G92 was coming. :rolleyes:
Code:
The Life of Jeremy Bentham
Nov-07 8800 GT
Dec-07 8800 GTS 512MB
Jan-08 8800 GS
Mar-08 9800GX2
Apr-08 9800 GTX
May-08 9600 GSO
Jul-08 9800 GTX+ 9800 GT
Mar-09 GTS150 GTS250
Feb-10 GT330?
We all had a good ride.. now it's time to take old Yeller out back with the shotgun. C'mon, old boy, don't stare at me with those big round puppy eyes - ugh, I can't do it - OK, OK, this is the last and final year, I promise ;)
You're missing the overall point, which is what many people do when they are backed into a corner by a sound argument supported by facts.
Well, if Deimos could clarify his original point, that would help. Are you talking about rebrands, or just different SKUs based off of G92? There is a difference; all you have to do is look :)
9800gtx
http://images.nvidia.com/products/ge...X_3qtr_low.png
gt330
http://images.nvidia.com/products/ge...0_low_3qtr.png
G92 was a great architecture; nvidia just can't get enough of it. The consumers might have a different opinion.
G92 will never die ....
You *completely* missed my point...
I clearly said it isn't realistic to expect to run 2560x1600 fluently on a single video card; it is multi-GPU ground. I also said the majority of people who do shell out the cash on a display like this are likely going to do it justice with an adequate system (assuming it is being used for gaming). Nowhere did I say 30" LCDs are rare, merely that expecting to tame games at 2560x1600 with reasonable IQ is unrealistic. A single 4870 didn't even last a year running stuff at 1920x1200, and I don't expect the 5870 to be any different (I already find its performance lacking in some games, even without anti-aliasing). I suppose it does depend on one's definition of playability, but in my eyes most things aren't playable at 2560x1600 on the currently available single-GPU cards (and before you comment, I've put a 5870, 5970 and GTX295 through their paces on a 30" display). The fact remains that 30" LCDs are the ultra-high end. I'm willing to bet more multi-GPU users run 22-24" 1080/1200 displays, for that matter...
Well, Deimos and the red camp are sure having fun trolling this thread....
I'm not sure what some of you consider fluid and playable, but I find the majority of games I have very playable at 2560x1600 on a lowly 260 with 2-4x AA and details generally as high as they can be set, without issue.
It seems there is too much bench/graph comparing going on without enough actual hands-on time in the real world.
GTX480 system power consumption at full load is nearly 130W more than an HD5870 system; full-load temperature is 92°C.
Quote:
Quite extreme: the whole system pulled 466W with the card at full load, whereas the HD5870 test system earlier measured 33xW, so the difference between the two is nearly 130W. But what are the TDPs in the two companies' official PPTs? The HD5870 is 188W TDP and a certain card is 250W TDP, so who is lying?
As for temperature, it shot up to 98°C in under a minute at full load, then the fan sped up and the temperature gradually dropped to 92°C, after which it didn't change. (Room temperature: 20°C.)
nApoleon, posted 2010-3-23 09:14
http://www.nordichardware.com/en/com...e-gtx-480.html
GTX480 and 5970 side by side
I think the pattern is fairly obvious. All except probably the GT330 use "G92 and/or G92b", i.e. the same chip.
Is the 8800GS a rebrand? The 9800GX2? I guess you could say the GTS250 is a rebrand, but it has a different PCB, memory, heatsink, box!! etc.
=============
G200 EOL a good thing?
Imagine you bought a GTX275 or, worse, a GTX295 yesterday. A week later, your "investment" resembles those GM stocks you thought would be a good idea.
============
Those arguing not to worry if nVidia loses money on the GTX470:
So how do they recuperate several $100m of R&D expense?
- Uber super cool and brand new DX10 G92 products?
- Those high-margin, non-existent DX11 cards that haven't launched?
- Perhaps the millions of Tesla products you see every teen in school buying?
What is my point?
The GTX 280 launched with an MSRP of $650. Because of AMD's 4870, it was reduced a week later to $500. Let's pretend the board has a fixed cost of $300. That's a difference of $350 vs $200 in profit... almost half as much.
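Spelling out that margin arithmetic (remember, the $300 board cost is a made-up figure for illustration):
Code:
# hypothetical GTX 280 margin squeeze; $300 board cost is an assumption
msrp_launch, msrp_after_cut, board_cost = 650, 500, 300

profit_launch = msrp_launch - board_cost      # $350 per board
profit_cut    = msrp_after_cut - board_cost   # $200 per board
print(profit_cut / profit_launch)             # ~0.57 -> "almost half as much"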
Fast-forward two years, and nVidia still hasn't learned. Boards and chips are bigger, hotter and more expensive to make.
Everything else being equal, consumers would probably buy:
- cooler/lower-power product
- higher performance
- nVidia, if it still represents a great brand-name image (questionable?)
So you can't just say "10% higher performance, so we'll make the MSRP 10% higher"; you need to factor in the cons of power/heat/availability. Things would be rosy if a 384-bit board, 50% more DRAM chips, and a 50% bigger die with low yields only added 10% to costs. Very likely a lot, lot more.
With zero competition, AMD has sold millions of DX11 products and made mucho $$$. nVidia is at $0.00. And people who already own a DX11 card are unlikely to "upgrade".
So, just like AMD with Phenom(1), how do you sell the product?
- Lower price? -> temporary gain in market share at the expense of profits.
- Lower power? -> trouble enough as it is getting working dies.
- Higher clocks? -> power is already very high.
- Lower costs? -> would you buy a more expensive 768MB card? What if "some assembly required"?
- PhysX marketing? -> trouble is, it's nothing new, since all GF8/9/200 cards have it.
- DX11 promotion? -> sucks that you spent the last 6 months saying it's not a big deal.
- CUDA GPGPU apps? -> a nice bonus, but people don't buy $500 video cards to accelerate their Adobe PDF Reader.
- Eyefinity? -> Don't got none.
The desperate "solution" is the same as it was in fx era... its the only thing you can really change...
questionable driver optimizations at the expense of rendering quality and driver stability.
Ofcourse like G80, about a year later they're gonna have a 32/28nm 448SP 256bit 1GB GDDR5 board. They're gonna call it "480GT"
So that metal part on top IS part of the heatsink... interesting.
Gotta give kudos to the team that designed that thing; they certainly were creative in shoving as much heatsink onto the PCB as possible!
How is this different from ATI? Compared to their previous gen, the boards and chips are bigger, hotter and of course more expensive to make as well.
As far as DX11, sure, but the inventory shortages at e-tailers are a clear indicator that nvidia has managed to sell off its inventory easily, even in light of ATI's DX11 and performance advantage.
Quote:
With zero competition, AMD has sold millions of DX11 products and made mucho $$$. nVidia is at $0.00. And people who already own a DX11 card are unlikely to "upgrade".
I think you're blowing the desperation of the market out of proportion; graphics cards don't sell to the masses like iPhones or something. Just because DX11 or newer, faster hardware is released, the majority of users aren't running out and buying new hardware simply because it exists. Not everybody lives on the bleeding edge of tech.
And if you're worried that the card you bought today may be obsolete a month later when a new model comes out, well, you need a new hobby. We all know there is new hardware always around the corner, and if you don't time your purchase early in the cycle, your hardware will be superseded sooner, possibly the same day you make your purchase.