who's ashley mcbroom?
Ok I have to say it, "Can you imagine 3-way SLI with GTX 480" :rofl: that is some serious power usage. Glad I have a HX1000 but it would be iffy if 1000W would be enough for 3-way SLI which I wouldn't even think of doing.
BTW Newegg has GTX 480 now listed, not in stock yet.
yes, "his" card; he came into this thread talking about getting one next, before any reviews were even out yet
And yes, it's 25% more expensive, but even Anand concludes it's only 10-15% faster on average. And overclocking works both ways :p:
I mean, by all means vote with your wallet, it ain't my money, but I don't see the 10-15% (lower at 2560x1600) being worth 25% more expense + 100W more power load, especially since I leave my computer on 24/7 for various programs.
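For what it's worth, the 24/7 argument is easy to put numbers on. A quick back-of-the-envelope sketch (the ~100 W delta is from the reviews; the $0.12/kWh rate is my own assumption, not anything from this thread):

```python
# Cost of an extra ~100 W on a machine left running 24/7.
# The electricity rate is an assumed example; substitute your own tariff.
EXTRA_WATTS = 100
RATE_PER_KWH = 0.12          # assumed rate in $/kWh

kwh_per_year = EXTRA_WATTS * 24 * 365 / 1000    # 876 kWh
cost_per_year = kwh_per_year * RATE_PER_KWH
print(f"{kwh_per_year:.0f} kWh/year -> ${cost_per_year:.2f}/year")
```

So roughly a hundred dollars a year at that rate, which is real money on top of the 25% price premium.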
Read it now and it's a pretty decent review, I'll take my hat off to them for a balanced review.
The 480 is pretty much what I expected, although the power draw and heat is a concern.
Certainly ATI won't be losing sleep over it but it is powerful enough in DX11 games to be worth purchasing.
Sorry pal, but as long as I'm applying for nuclear engineering at university next year, I just couldn't pass this one up :slap:
Generator for electricity, cooling tower for cooling!:slapass::rofl::ROTF:
http://www.hw-world.7u.cz/images/sto...mi_running.JPG
I think anybody "purchasing" one of these isn't going to see it for a couple of weeks. All sites seem to be showing "out of stock" and evga forums says 14 days until it hits stores....
I am very glad to read this in the Hardware Canucks review of the GTX 480; I was quite concerned, as 2560x1600 is all I care about. Now I am a few minutes from pulling the trigger on GTX 480 SLI. I did figure they would solve some performance issues with drivers, but it's nice to see the issue is already acknowledged and being worked on.
"According to our conversations with NVIDIA, their current beta driver (and the one which will be launched with the card) exhibits an issue in this game (and DiRT 2) where framerates take a massive plunge at 2560 x 1600 resolution. They expect a fix to be forthcoming sometime in the weeks following launch."
HardwareCanucks review on the Battlefield Bad Company 2 page:
Good to hear... I was wondering what happened there otherwise.
Quote:
According to our conversations with NVIDIA, their current beta driver (and the one which will be launched with the card) exhibits an issue in this game (and DiRT 2) where framerates take a massive plunge at 2560 x 1600 resolution. They expect a fix to be forthcoming sometime in the weeks following launch.
Damn, I agree. Good consistent benchmarks
http://i40.tinypic.com/25qfx3b.jpg
Words can not describe how pissed I am that they were all sold out of the pre-orders only 30 mins after end of Nvidia demo and start of the sales. I guess I should have pulled the trigger the moment I could add to cart. WTF.
Hence pre-order?
If true, expect these on ebay for 2x markup!
The 480 seems to be a fairly decent performer; generally it comes in around the middle between the 5870 and 5970 in performance, not bad at all, and the price isn't too bad at $450 either. The power consumption is absurd though: it uses more power at load than a 5970... that's just not right. At what point does it become too much? 480s in SLI, and god forbid tri-SLI, is just mindblowing :shocked:!
Man, this is ridiculous. According to all the reviews I've read so far, the HD 5970 is fastest almost everywhere, and according to Tom's Hardware it's quieter, consumes less, and even has a better performance-per-dollar ratio! I just see no reason why somebody would pick a GTX 480 over an HD 5970 :shrug::shrug:
Oh, 10°C of headroom before my card thermally throttles. SURELY...SURELY this will not be an issue in August...in Florida...when my room gets to 90°F...SURELY!:clap:
Quote:
Finally, with this data in hand we went to NVIDIA to ask about the longevity of their cards at these temperatures, as seeing the GTX 480 hitting 94C sustained in a game left us worried. In response NVIDIA told us that they have done significant testing of the cards at high temperatures to validate their longevity, and their models predict a lifetime of years even at temperatures approaching 105C (the throttle point for GF100). Furthermore as they note they have shipped other cards that run roughly this hot such as the GTX 295, and those cards have held up just fine.
I find it humorous that NVIDIA hasn't made a single comment on their twitter or facebook pages about the launch, while the announcements of announcements were popping up all the time.
Being in Florida you probably have central AC or at least a room AC, right? If your room is at 90°F, definitely invest in AC before you buy a video card. I mean seriously..... My house is 77-78°F everywhere, and my office has a dedicated AC for the rare occasions in the summer when it gets too hot. I'm in FL too, Fort Lauderdale area actually, and it gets to 96°F outside, but not in here, that's for sure.
BTW, Charlie "i'm an idiot" dedknefuhrefuohreofmerijian was wrong about most of the hate he posted. The product has one flaw, and that is power draw, but all in all it's highly clocked and a real great performer. Eat that, C.
:rofl:
come and try it in Arizona :up:
:horse:Maybe you should examine the headroom a 5870 has, since it runs at 86-90°C. You spoke of 90°F; if you have a room in your house that is 90°F, you should consult someone to correct that issue.
I think we've had enough about power and heat; if you aren't aware, so far it has been :horse:. What is the point of continuing to bring it up? Fermi is a much bigger chip, and yep, it uses more power and creates more heat.
Well, it's not that much in this price segment, and you'd earn the difference back in electricity savings in maybe one year? :shrug:
Anybody still believe this story about evil CrossFire/SLI? I've seen 20+ games in just the last half hour and the HD 5970s were fastest everywhere, so I doubt there's a problem there. Or do you see one?
ADD//
Well, as was already noted, he could have been right at the time he wrote it, as we have indications NVIDIA changed the specs several times.
Although the cards are not as slow as rumoured, they suck so much in every other aspect that AMD really won't care about NVIDIA for at least the next 6 months. So no price cuts anywhere, which is really bad, and NVIDIA failed hard here!!
Hmmm looking at Evga product PDF specs of the GTX 480 Superclocked they mention 3-way and.... :eek: 4-way SLI.... Can you imagine? And I hope someone does put 4xGTX 480 Superclocked or even standard into a case and benches it. I would love to see the results all around.
http://www.evga.com/products/pdf/015-P3-1482-AR.pdf
It looks like a GTX 470 for me. :)
Crysis... I shelved the game because of the 2560x1600 requirement; after reading this I'm a bit more excited.
"More relevant still is the awesomeness of GeForce GTX 480 SLI performance. We simply were not disappointed in the performance that GTX 480 SLI delivered. And get a load of this. GeForce GTX 480 SLI allows Crysis Warhead to be playable at 2560x1600 4X AA/16X AF all Enthusiast settings. Take that to the bank, GTX 480 SLI is the real deal. NVIDIA rules the schoolyard when it comes to multi-GPU scaling. CrossFire gets left with a black eye."
Source: http://www.hardocp.com/article/2010/...0_sli_review/8
Here it looks like ATI gets its ass handed to it in a multi-GPU setup
http://www.anandtech.com/video/showdoc.aspx?i=3783&p=9
He was right about both SP number and relative performance. He predicted 480SP a long time ago, and that was right. Not too long ago a couple of his sources got their hands on 512SP cards, but only clocked at 600 and 625MHz (as stated in the article), and that's where the 5% faster claim came from. Obviously the final 480-shader parts clocked at 700MHz (with the rest of the core clocked at 700MHz as well) perform a little bit better. Charlie actually posted the facts this time, albeit with a bit too much 'dear leader'-esque Nvidia bashing. Nvidia taking a long time to decide the final specs in no way invalidates any of the facts Charlie gleaned.
As far as tessellation, he claimed Fermi was better in heaven benchmark but suffered when other work had to be done on the shaders. At face value this looks true -- Fermi performs a lot better in Heaven than actual DX11 games -- but is that due to Heaven using more tessellation, or actual DX11 games needing more shading power? We probably won't know that for a very long time, if ever.
Charlie never said 480SP. He first said 512, then said it was "castrated" to 448, then said it would be 512 again. He was never accurate on SP count, and he was never accurate on clocks. He predicted 600-625MHz all along, whereas even the 512SP part was supposed to have 650MHz clocks.
His "no tessellator, it's going to be emulated, it will suck" claim was 180 degrees wrong. Maybe tessellation performance isn't as good as Nvidia claimed, but it's definitely no worse than ATI's.
This is a really expensive version of the George Foreman Grill. :rofl:
A 480 vs 5870 compilation of results; please let me know of any errors. Adding 5850 vs 470 at the moment as more reviews come in with it (470)
http://img11.imageshack.us/img11/136...s5870final.png
From TechPowerUp review:
NVIDIA first publicized its maximum board power as 295W, retracted it and posted it as "250W" probably fearing bad PR. We disagree with their 250W figure. Investigating maximum board power, we landed at the 320W mark, which is way off NVIDIA's claims.
Nvidia is just straight out lying about power figure?
This is the reason why I try not to get involved in these type of tech discussions, too many idiots making a huge fuss over different manufacturers' hardware.
http://dorkshelf.com/wordpress/wp-co...fanboys4xc.jpg
This is almost as pathetic as the Picard vs Kirk debate.
Well, the funny thing is that not only is it hot and uses way too much power, but the picture quality is also still below the 5870's. From AnandTech:
Quote:
Last but not least, there’s anisotropic filtering quality. With the Radeon 5870 we saw AMD implement true angle-independent AF and we’ve been wondering whether we would see this from NVIDIA. The answer is no: NVIDIA’s AF quality remains unchanged from the GTX200 series. In this case that’s not necessarily a bad thing; NVIDIA already had great AF even if it was angle-dependant. More to the point, we have yet to find a game where the difference between AMD and NVIDIA’s AF modes have been noticeable; so technically AMD’s AF modes are better, but it’s not enough that it makes a practical difference
So true!
Every review pinpoints the GTX 480's power load at around the 5970's... and the 5970 is rated at 294W, the 480 at 250W. Something obviously does NOT add up.
What's even more shocking is that the 470 uses almost 80-100W more than the 5850. It's one thing to say power doesn't matter when you're talking about the flagship, but now we're talking about the more budget-conscious 5850 vs. 470 bracket, where the power consumption figures get noticed.
Still, to this date the AF on the Evergreen cards causes a bigger performance hit than any other line has, if I am not mistaken. It's not huge, but bigger than what I'd call ideal. I'm an AF whore though, so I like 16x AF as a minimum; I'll sooner omit AA before AF, but that's me. It is the best AF out there (from a purely technical viewpoint), granted, but can I honestly say I notice the difference... not really.
Nvidia did improve their transparency AA, however, so that fact shouldn't be left out. Without a truly in-depth article focusing on image quality alone (remember those nice G80 reviews that did this?), the jury is still out on this, I'd think.
As far as the whole power thing, it is quite odd. My understanding is that due to the shader design (320 VLIW units, each with 4 simple ALUs plus a complex one), it is very difficult to fully tax AMD's design (without very explicit programming, from what I've heard), so it's very unrealistic to reach their claimed TDP (reviews often have the 5870 well under its 188W TDP). Nvidia, on the other hand, have a design which appears to actually reach its claimed TDP...
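To make the slot-packing argument concrete, here's a toy utilisation model. It's entirely my own sketch; only the 320-unit/5-slot layout and the 850 MHz core clock come from the actual 5870 specs, and the "typical packing" figure is a made-up illustration:

```python
# Toy model of Cypress VLIW5 utilisation: 320 units, each issuing up to
# 5 scalar ops per clock. If a real shader only fills ~3-4 of the 5 slots
# on average, achieved throughput (and power) sits well under the peak.
UNITS = 320
SLOTS = 5            # 4 simple ALUs + 1 complex per unit
CLOCK_GHZ = 0.85     # HD 5870 core clock

def achieved_gflops(avg_slots_filled):
    # 2 FLOPs per filled slot per clock (multiply-add)
    return UNITS * avg_slots_filled * 2 * CLOCK_GHZ

peak = achieved_gflops(SLOTS)       # the 2.72 TFLOPS marketing figure
typical = achieved_gflops(3.4)      # hypothetical average packing
print(f"peak {peak:.0f} GFLOPS, typical {typical:.0f} GFLOPS "
      f"({typical / peak:.0%} of peak)")
```

Under that assumed packing rate the chip runs at roughly two-thirds of peak, which is one plausible reason the 5870 rarely touches its rated TDP in games.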
looking back
Predictions that I made before...
- it would not be available in Nov.. probably not even Dec!! (they called me a heretic back then)
- Like PS3 CELL, I was 99% sure it wouldn't be 512SP. Based on super-low yields, I guessed 480-448SP for the GTX 480 and 416SP for the GTX 470. :*(
- compared power/die to R600 and Phenom. Regardless if broken or high power it WOULD launch.
- 750MHz is way too high. Guessed 600-650MHz max, based on low GT240 clocks.
- Laughed at folks suggesting $399 or even $299. I guessed $600 - but, although MSRP is lower, surely with high demand there will be ridiculous markups.
- Like everybody else, I assumed it was 100% true that power >200W (i.e. Charlie's 280W rumour). Yet I thought that was only for special cases like Folding.
- no 6-display Eyefinity.. well, that was pretty much a given.
nVidia surprised me:
- 700Mhz, and yet can OC a good 10% more!
- Surely nobody was naive enough to believe +60% on everything. But amazing performance in some games (FC2, BC2 DX11), tessellation, folding@home and a few demos. Having feared only +10%, performance is better than expected (depending on which review you read)
- *only* 250W.. better than 280W, but something around 220W would be better still.
- it actually works in all the games. Some 2560 and TRAA driver issues to fix, but overall pretty well polished. Nothing broken (i.e. Phenom, R600)
With all due respect, angle-independent AF isn't very useful if it is undersampled. It's taking the correct samples it should, but not enough of them; Nvidia is the opposite. The net quality is about as good as Nvidia's quality setting.
http://www.pcgameshardware.com/aid,6...eviews/?page=5
And yet it pulls more power than the 5970, which is rated at 294W..., and the reviews consistently show it drawing 100W more than the 5870, which is rated only 62W lower. Something doesn't add up with 250W.
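Putting the numbers from this thread side by side (the rated board powers are the official figures; the ~100 W gap is what the load tests in the reviews showed):

```python
# Official board-power ratings vs the measured GTX 480 / HD 5870 gap.
rated = {"GTX 480": 250, "HD 5870": 188, "HD 5970": 294}
measured_gap = 100                               # W, from review load tests

rated_gap = rated["GTX 480"] - rated["HD 5870"]  # only 62 W on paper
implied_480 = rated["HD 5870"] + measured_gap    # if the 5870 sits at its TDP
print(f"rated gap: {rated_gap} W, measured gap: ~{measured_gap} W")
print(f"implied GTX 480 draw: ~{implied_480} W vs a 250 W rating")
```

Even under the generous assumption that the 5870 runs right at its 188 W rating (reviews suggest it usually stays under it), the 480 lands closer to 290 W than 250 W.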
Thanks for the work. So at 1920 x 1200, roughly 8% faster without AA, 15% with AA.
Any 2560x1600 numbers?
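On the averaging itself: when pooling per-game deltas like that, the usual trick is a geometric mean of the FPS ratios, so one outlier title doesn't skew the result. A minimal sketch with made-up ratios (illustrative only, not the actual review data):

```python
import math

# Hypothetical per-game FPS ratios (GTX 480 / HD 5870); illustrative only.
ratios = [1.05, 1.12, 1.22, 1.03, 1.10]

def geomean(xs):
    """Geometric mean: the right average for multiplicative speedups."""
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

print(f"average advantage: {geomean(ratios) - 1:.1%}")
```

An arithmetic mean of percentages would overweight the games where one card wins big, which is exactly the kind of distortion review roundups get accused of.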
The 250W is obviously a lie. Reviews prove it's 300W or even more.
Time to start another 100 page thread.
Are you saying nvidia fudged the Watts? *SHOCK!!*
Code:
GF104 is reported to consist of 256 SP - half of GF100.
Same number of TMUs - 64; and only 33% fewer ROPs - 32.
Memory bus is down to 256-bit - the same as ATI's Cypress.
Most reviews, I think, used Furmark. Maybe we need to compare to the 5970 across 5-10 games before drawing conclusions.
10% OC headroom means a slightly lower-power version is possible.
Realistically, 4-6 months for a new PCB and a new stepping/revision - or until next year for 28nm.
Till then, nVidia's gonna downplay the power and do their best to sell them.
http://www.tomshardware.com/reviews/...0,2585-15.html
They ran the 480 & 470 across Unigine....
Is a single GTX480 able to game on three monitors ?
If not, this could be the source of the TDP discrepancy. Maybe Eyefinity on 3 monitors draws significantly more power than a single-monitor setup. And since TDP is supposed to be the maximum, AMD was forced to measure it with 3 monitors, whereas Nvidia measured with 1 monitor because a single 480 isn't able to game on three monitors?
The video out logic is quite low power, and the 4x0 cards can still output to 2 displays while having 3 outputs (although they aren't always in use). Then there is still the fact that the 480 is pushing 250+ watts with 1 display, so regardless, that doesn't change those numbers, but I know where you are coming from with that. It remains that 188W is the worst case for the 5870, whereas Nvidia's 250W figure is suspect (we've yet to see 3D Vision Surround numbers on a newer intensive game such as Metro 2033, so it might get even worse, who knows). Still, this is the first time we will actually need a 1000W PSU for a 2-card system. In the past it has only been needed with tri/quad configs or in high-end bench settings (i.e. highly clocked LN2 benching / volt modding).
EVGA 480/470 HC FTW:clap:
The additional video out logic (adding another 3 outputs is more than a basic change) obviously does boost the TDP, and the VRAM has its effect too. My understanding is the PCB on the Eye6 card is different.
I wonder how long it will take for EK to have their blocks out in the retail channels... I'm sure they will sell very well as these cards are going to end up widely watercooled no doubt.
Anand
"For whatever reason AMD can’t seem to keep up with NVIDIA when it comes to the minimum framerate, even at lower resolutions. Certainly it’s obvious when the 1GB cards are video RAM limited at 2560, but if we didn’t have this data we would have never guessed the minimum framerates were this different at lower resolutions.
Finally we have a quick look at SLI/CF performance. CF seems to exacerbate the video RAM limitations of the 5000 series, resulting in the GTX 480SLI coming in even farther ahead of the 5870CF. Even at lower resolutions SLI seems to be scaling better than CF."
Yes... one flaw only :rolleyes: no reason to consider these:
- noise (loudest cards ever according to anandtech)
- Time to market (6 months after 5870)
- Availability (this is a paper launch)
- Manufacturability (castrated 512 core)
- Manufacturing cost (larger chip, more ram, beefier cooling, pwms, etc...)
... those are all features right? :rofl:
Let Nvidia worry about manufacturing cost and "manufacturability". What, aren't you going to buy it at $500 because it is costly to manufacture? Consumers don't care about manufacturing cost.
How do you know this is a paper launch? From everything I have heard, Fermi's availability might be better than the 5800s'. Don't forget that while there were some 5800s you could buy on launch day, even 2 to 3 months later you still couldn't buy any. The 5870 was only a little better than a paper launch.
http://ncix.com/search/?categoryid=0&q=gtx480
I see preorders.
Has anyone looked into ARTIFACTING? At these high temps I bet artifacts are going to be very common. Even overclocking has to be suspect. Just think: if Fermi artifacts right out of the box, that would be awesome.
What about a nice dust-clogged Fermi heatsink, what effect will that have on load temps? I think Fermi is literally going to be RMA HELL.:shakes:
Water boils at 100°C; I wouldn't be surprised if the DVI and HDMI cables melt off the back of the card.
Are you ready? Yes, but sadly you missed the mark! Shrink it and rework it, then I'll be ready!!!!:)
I really quite like the exposed grill of the 480. You can easily put fans beneath it and have them blow directly on the heatsink.
I hope we see this type of design more often.
Looking like a GPGPU nightmare for 24/7/365 power consumption:eek:
To be honest, I wouldn't touch this until the next process shrink. Man, what a waste of power:rofl:
I don't know if anyone has already posted it (I'm sure it has been)...
Quote:
Originally Posted by PNY
I'm sure I read in a review that nvidia recommends water cooling for tri-SLI...
I have to say the cards have a few cons, but it seems like a decent purchase for future games, like the 8800 was; I think the GTX 480 will last a good couple of years.
This is like the first draft of a novel. Look at Intel with Larrabee: it shouldn't have been released, and Intel knew they needed to rework it. Nvidia failed!
No more evga WTF HC for me after all the DOA 280s.
evga said the water block was too heavy and was damaging the PCB during shipping, so they shipped me a second DOA card with "safer packaging". Rather than use their foam-lined box, I received this...
http://img263.imageshack.us/img263/4640/img0438w.th.jpg
as shipped. Needless to say it was DOA as well, so after six+ weeks, $800+ invested without ever having a working card, and many, many hours on the phone, I painfully extracted a full refund from evga.
I have 2 sli rigs. I find this power consumption absurd:shakes:
Who exactly is paying you to write this stuff? Is there any particular reason you're choosing to pollute the thread with consecutive posts like this? You're not even in any sort of discussion... you must just like seeing your name flash up on your LCD?
What I'm saying is: all this wait and hype, for this!
in b4 first heat related rma
Most of the review nerds are missing the mark. Even the few with useful benchmarks don't see the conclusion in front of their faces. Well, it IS hot and loud; they did get that part right.
It will be two months before people really get it. The value of an overclocked 5850 wasn't realized right away either.
Mark my words. Things will look different when people are paying attention to minimum framerates on their overclocked 470s with giant heatsinks. People will be saying, "What 5870?"
Whining about it does not make it better.
Look, I'm not trying to annoy anyone. I'm just very surprised at the power it consumes per unit of performance returned.
Man, I've just finished reading some reviews. They are pretty depressing... they used the term "fastest" when comparing with the 5870... as if the 5970 doesn't exist.
So, is this the final verdict? Any OC results?
I guess I'm just disappointed. I expected a better card given their engineers!