GTX 480 3-Way SLI review :ROTF:
http://www.ngohq.com/news/17574-nvid...li-review.html
Well, the GTX 470 doesn't seem bad, but only when you OC it, and an OCed GTX 470 draws much more power. If you think about the ~25% OC needed to reach GTX 480 performance and the amount of PSU power that takes, it might not be a bad idea, given you have a strong enough PSU.
Alternatively, if you think your PSU can't handle an OCed GTX 470, just get a 5870; it's as simple as that. As far as temps go, I've had an open case for six years and don't plan on changing it.
Haha I am going to watch this movie in an hour and now I will only look for flickers and blurred scenes :rofl:
as for the ones here pointing at 3D as an nvidia-exclusive feature - well, I saw it at CeBIT and just found it not really great. it's one of those things the PR and management get wet about while the fanboys follow them blindly, and the rest of the world just feels like being led into the desert by marketing when they figure out how rarely it is currently supported / implemented and which technical issues are still to be solved.
uh... avp... you think people will buy new hw to play avp? idk...
you sound exactly like that nvidia pr guy on twitter :lol:
idk, what i've seen from metro so far looks nice, but i really don't get what the game does with all the hw performance... because it really doesn't look good enough to justify 20fps @ 1680x1050 in dx11 with max details :confused:
yeah, wtf right? :D
why 5870? a 5850 is 25% cheaper and whatever a 5870 can do, a 470 can too :shrug:
don't worry, you don't have to look for them, there is no way NOT to see them... please check though whether they use active shutter glasses in germany now as well, that would be interesting. the glasses here have an infrared-style transparent red plastic thing right between your eyes. well, i guess they are active shutter glasses, because they are heavier (batteries inside) and have this infrared input diode.
if your glasses are passive, please lmk whether it flickered a lot as well or not... maybe it's the glasses that were causing my headache there... literally :D
metro sucks, but Just Cause 2 wins massively. I just love that game and I would love the water animation if I had a Nvidia card. Even without it the game looks fantastic and runs (mostly) smoothly despite the incredible draw distance.
Hey, METRO 2033 is not that bad. The weapons look a little pointy and boxy, but the game itself is somewhat interesting and the graphics are not bad. Watch some in-game videos on youtube, there are some really beautiful scenes... ;)
i think i would rather have a gtx 470 than either a 5870 or 5850.
When you look at things such as minimum frame rates and tessellation performance, it makes me think that you would be better off with it for future games.
Was it worth waiting 6 months for? No. But now that it's here, it's certainly a valid option.
@iTravis.... nice link... lol'd...:D
I preordered a GTX 470 for the minimum framerates and amazing performance @ 1920x1200 w/ AA + AF, among other reasons. Firstly, I fold, and we've all seen the Folding@home performance. Secondly, Fermi looks prepared to really shine in future DX11 titles (tessellation, as you say). The best part, though, is that Fermi is brand new, so we'll see some nice performance gains from future driver releases. Those types of gains are long gone for Cypress. Unless, of course, ATI decides to intentionally butcher IQ settings for better benchmarks. :p:
hahaha hilarious! :lol:
really? idk... it seems way too cartoony for my taste... the way you can jump and fly around... its way beyond maximum strength in crysis ^^
i found it too much of a 90s style hit n run + gta like game...
yeah not bad for sure... but its not pretty enough for the hw perf it needs imo... and the gameplay doesnt seem all that fun from what ive seen... is there a demo?
i'd rather say that neither of them is going to be very good in future games that really use dx11 and hence heavy tessellation
http://www.pcgameshardware.de/aid,74.../Test/?page=13
even on a gtx480 metro 2033 is BARELY playable at 1680x1050 with max details... 26fps average and 22 minimum
and thats a 2010 game... buying a gtx470 because its going to be future proof in dx11 games is nonsense imo...
Well love it or hate it, here are a few facts :
1/ Though not perfect by any means, 3D Vision was acclaimed almost every time it got reviewed.
http://www.guru3d.com/article/nvidia...vision-review/
http://www.legitreviews.com/article/889/1/
http://www.pcper.com/article.php?aid=656
http://www.tomshardware.com/reviews/...ereo,2121.html
http://www.overclockersclub.com/revi...dia_3d_vision/
http://www.firingsquad.com/hardware/...on/default.asp
...
2/ 3D Vision has been available for more than a year. There is more and more compatible hardware (24 inch screens and all the 3D TVs coming this spring/summer), and the list of games supported is nowhere as bad as you make it sound (i would definitely have agreed if you were talking about PhysX though) :
http://www.nvidia.com/object/3D_Vision_3D_Games.html
3/ ATI has just implemented the technical possibility of 3D stereoscopic gaming into Catalyst 10.3, but real world benchmarks remain to be seen.
Perhaps i am missing something here, so if there is a possibility to play in S3D with my CrossFired HD 5770's, tips are welcome. 3D Vision is one of the two reasons why i am switching to GTX 470.
A 5770 CF setup is more powerful than a single GTX 470, and you will get higher FPS in games which support CF. As for the 3D question, Nvidia's 3D Vision is more of a certificate that the game will work with the nvidia hardware in question.
ATi's approach, on the other hand, is two-sided. You must understand that Nvidia uses an active system for 3D display, something that is both expensive and produces good results. ATi has its fingers in both pots, active as well as passive: you can expect DDD's stereoscopic approach to be close to or above Nvidia's system, while iZ3D is more passive and would offer less quality than either Nvidia 3D Vision or DDD's approach.
EDIT: Forgot to include a link
http://www.mtbs3d.com/index.php?opti...iews&Itemid=76
This is a good review of all 3 systems, but do not take it too seriously, since it is from 2009 and there are bound to be improvements in both systems after 10.3.
Still no test with the GTX 470 in SLI? Incredible! :(
So the results are so good that they want us to buy only the GTX 480? :confused:
iZ3D does support shutter glasses and 120Hz displays with the new driver.
So game support for ATI is extended to the whole iZ3D list, while you can use 120Hz LCDs with Mini-DIN and shutter glasses.
Only problem is that initial setup is quite confusing, and 3D-ready LCDs still suck by and large.
From how you post, you make it sound like ATI has something to do with these competitors, but when I look at the websites, they both just look like companies trying to sell a 3d product regardless of the videocard you have.
ATI has nothing to do with these guys, as all these 3d tests were done on gtx 285s.
That review was a demonstration of how the tech has evolved; both of these companies provide middleware for ATi and as such are very important. ATi has to support these guys if it expects anything from them. It's like saying Bullet and ATi are not connected at all.
http://3dvision-blog.com/wp-content/...initiative.jpg
I also wrote that the review was of 2009 and that enhancements would have been made on both platforms "DDD & iZ3D" on ATi...
http://www.mtbs3d.com/index.php?opti...iews&Itemid=76
The newest game tests still use a GTX 285.
http://www.ddd.com/about/about_history.html
They say nothing about an ATI partnership. This is more along the lines of AMD having no 3d solution, so they will attach themselves freely to anything out there because they are too lazy to develop a 3d technology themselves. These companies have been around a lot longer than that slide.
There is nothing on either website to indicate they are supported by or partnered with AMD in any way. The likelihood that 3d works comparably well on comparable AMD and NV hardware shows that it has had as much input and support from AMD as from NV.
This is more of a software company trying to sell software. They are not really being supported by the hardware companies themselves.
The 3D Vision initiative is a lot different: NV is actually putting money and driver support behind it for NV cards. Additionally, they are leveraging their position to get games optimized for 3d vision.
By your reasoning, NV is supporting both a closed and an open approach, since DDD and iZ3D both work on their cards alongside NV's 3D Vision.
Look at the thermi:
480
IDLE
http://www.geeks3d.com/public/jegx/2...ature_idle.jpg
LOAD
http://www.geeks3d.com/public/jegx/2...under_load.jpg
480 SLI
IDLE
http://img707.imageshack.us/img707/1...480sliidle.jpg
LOAD
http://img97.imageshack.us/img97/8459/gtx480sliload.png
HD5870
IDLE
http://www.geeks3d.com/public/jegx/2...rared_idle.jpg
LOAD
http://www.geeks3d.com/public/jegx/2...rared_load.jpg
HD5870 CF
IDLE
http://www.hardware.fr/medias/photos...IMG0027445.png
LOAD
http://img52.imageshack.us/img52/1130/img0027448.png
http://www.hardware.fr/articles/787-...x-480-470.html
Why is SLI idle so much higher?
Let's not forget a 4870 under load:
http://www.geeks3d.com/public/jegx/2...on-hd-4870.jpg
4870 CF under load:
http://www.geeks3d.com/public/jegx/2...-crossfire.jpg
And before you try and say it's irrelevant because it's from last generation, no it isn't. If you were fine with HD4000 cards then the only reason you wouldn't be fine with GF100 would be unreasonable bias (avoiding the f-word).
i want one of those heat cameras....... :p: how much do you think they cost?
In fairness, the 4870 thermals were due to ATI's choice of fan profile. They were quite tamable cards with the reference cooling, but ATI chose higher operating temps over higher noise levels (and *had* that choice; i.e., the fans / heatsink have plenty of headroom - case in point, look at the 4890: very loud, but I'm more than sure that same thermal picture would be in the 4890's favor). Nvidia, on the other hand, didn't have much of a choice (you can't ignore a high TDP). As for the cards' long-term durability, I'm sure they'll be fine under normal usage conditions, and the 4870s only help support this, which if I am not mistaken was your point (I'd be suspect of tri-SLI configs on air though).
basically TDP is really the only thing that matters: if it's going to dump more heat, the card will be hot or loud. something i hope for is that they go with a 3-slot cooler eventually, because not everyone wants to fill all 7 PCI slots, and not everyone enjoys super loud gpus but still likes to OC a little and see safe temps. seriously, add in $10 more of copper/aluminum, charge $50 more over reference, and be happy.
http://img97.imageshack.us/img97/8459/gtx480sliload.png
480 load
Geez, to me that really looks like a fire hazard. Apparently those power cables get so hot they bend like a wet noodle.
SemiAccurate ponders GF104. http://www.semiaccurate.com/2010/04/...ode-names-pop/
However, this little tidbit was most entertaining :)
Quote:
GF100 is horribly weak in pixel ops. It has only 64 ROPs (Render Output Units), and that lack of pixel power shows up at higher resolutions.
^HAHAHA^
well charlie hasn't changed his ways (not that i was expecting him to), he states the obvious about gf100's ROPs, then claims that there will be no more cards after this batch... please. and finally, the gf104 will be a huge flop because gf100 has an ROP shortage at high resolutions.... :rofl:
i have to say, my 9800GX2's would hit 107-110 C under heavy load with the factory TIM. replacing that brought my load temps down 10 C, and none of these cards are close to those temps... everyone is calling this a potential RMA liability but i just don't see it.
How is it possible to blog about GPUs for a living and not know the difference between a ROP and a TMU? And even if he were to get that right, texture cache efficiency increases with resolution, not the other way around. I wonder how he feels about Cypress' 32 ROPs :)
/boggle
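For anyone fuzzy on why the ROP count matters at high resolutions: theoretical pixel fillrate is just ROPs times core clock. A quick sketch in python; the unit counts and clocks below are the commonly quoted reference specs, so treat them as assumptions here:

```python
# Rough fillrate math: ROPs handle pixel output, TMUs handle texture fetches.
# Specs used below are the commonly quoted reference figures (assumptions).

def gpix_per_s(rops, core_mhz):
    """Pixel fillrate in Gpixels/s: one pixel per ROP per clock."""
    return rops * core_mhz / 1000.0

# GF100 (GTX 480): 48 ROPs @ 700 MHz core (not the 64 claimed in the article)
gf100 = gpix_per_s(48, 700)
# Cypress (HD 5870): 32 ROPs @ 850 MHz core
cypress = gpix_per_s(32, 850)

print(f"GTX 480 pixel fill: {gf100:.1f} Gpix/s")   # 33.6
print(f"HD 5870 pixel fill: {cypress:.1f} Gpix/s") # 27.2
```

Even with the correct 48 (not 64) ROPs, GF100's theoretical pixel fill comes out ahead of Cypress, which is why the "horribly weak in pixel ops" claim reads so oddly.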
I seriously doubt the inclusion of a backplate would change things that much. It is still metal and will heat up just like the pcb. Yes, there is more surface area for the heat to spread to, but to go as far as believing the backplate is a game changer... Assuming those load pictures were taken after an extended heavy load (an hour +), that gives the backplate plenty of time to absorb heat and thus show it in the pictures.
Anyways, just look at the side of the 480's heatsink (which still has a plastic casing similar to the 5870's). It is clear that the shroud gets very, very hot (i.e., do-not-touch hot). It does have vents, yes, but the 5870 sink also has some vents in the same area (just fewer of them, and smaller).
Again, none of this should be a problem in larger cases with decent airflow, and assuming you don't let anything touch the cards for extended periods (i.e., loose / messy cables), all should be well.
You can tell the shrouding is masking/hiding the heat by looking at the color where there are openings. It's just like exhaust manifolds on cars, which use a small piece of tin/steel barely spaced off the manifold as heat shielding. If the shrouding were off and a picture of the bare pcb was taken, it would look very different from the one we are seeing.
http://img30.imageshack.us/img30/789...nfraredloa.jpg
Regardless of how you look at it, the surrounding case temps tell the story. 480s bleed crazy amounts of heat into a case. Anyone considering SLI had better have insane airflow or plan to go the watercooling route; otherwise their overall system temps are surely going to suffer substantially. These appear to be at least as bad as the 4870X2s (mine made a good 10-15C difference on the northbridge of my X48 board and a good 5C increase on other components at full load).
As long as stability is maintained, I'm not sure I would overemphasize the added heat dump. Sure, it's going to be hotter, but whether or not that creates other issues remains to be seen.
Of course, but it is something that must be accounted for nonetheless. As far as single-gpu boards go, this is the hottest one ever, so people will have to rethink some things (cooling demands after power consumption). I'd hope most people who put these cards in their machines are well aware of this, but there will be a vergence in the force with the odd few running their systems at the brink as is, and it's an easy catalyst to send things overboard. but that's nothing new (i.e., average joe puts an 8800gtx in his hp oem pc with a 250 watt Sparkle psu... BooM - grilled pc sandwich :0)
If what we are ultimately looking at is how much heat is given off, regardless of the reasons, then why does nobody have a problem with the HD4000 cards?
And if we just want to see how hot it actually gets for longevity's sake, then yes, it does make a difference on the HD5000 cards whether or not the backplate is on.
You totally missed my point... the only reason the backplate is there is the memory; it isn't there to benefit the pcb directly (otherwise we'd probably see one on the GTX 4xx cards...). it might have a very minor side effect, but it's there for the benefit of the ram. I don't recall cards without memory on the rear side of the pcb using backplates.
I wonder if any of these pictures are emissivity calibrated?
Well this is interesting:
Quote:
Moreover, I noticed the image to get somewhat blurry on the GeForce when running the test demo and checked this out with screenshots (on another level as I could not make identical screenshots on the Chaser map). You can see the difference yourself (the screenshots were captured at 1920x1080).
From the xbit 480 review, in the Metro test.
Quote:
It is easy to see that the Radeon produces a better-looking picture. All the textures are sharp, without fuzziness or anything. To remind you, I selected the same graphics quality settings for both graphics cards: DirectX 11, Very High, 16x AF, AAA, Advanced DOF and Tessellation. Besides, I selected the High Quality texture filtering mode in the Catalyst and GeForce/ION driver (the Quality mode is selected by default).
http://www.xbitlabs.com/articles/vid...tx-480_11.html
What's even more interesting is the part that you seemed to look over:
Sorry if you wanted to leave that part out, Flip. :rolleyes:
Quote:
Perhaps this reduction of quality is the trick the GeForce GTX 480 resorts to in order to deliver a higher speed in Metro 2033? The game developer answered to our question promptly. Here is what Oles Shishkovtsov, 4A Games Chief Technical Officer said:
No, the observed difference in quality is not due to the performance of the graphics cards. Indeed, graphics cards from Nvidia and ATI render the scene differently in some antialiasing modes. We are trying to find a way to correct this issue. Again, this has nothing to do with speed.
Hopefully, people from 4A Games will find a solution.
Fail. If the 480 clearly outperforms the 5870 despite its 60 TMUs and 48 ROPs, then what will be the issue with GF104? Further, the specs of the GTS 430, 440 and 450 (all using GF104) have supposedly leaked here:
http://www.dvhardware.net/article41879.html
All but the 430 have 64 TMUs (more than the GTX480) and 32 ROPs (same as the GTX285). Major fail on Charlie's part.
Yes, this is true. My top 4870 burnt out from excessive temperatures in crossfire mode. They are horrible cards and I wouldn't recommend them to anyone, and the same goes for Fermi.
ATI sorted out these thermal issues from the 4890 / 4770 onwards, something that Nvidia forgot to do.
The backplate on the 5870/5970 isn't very good as a heatspreader,
http://www.overclock.net/ati-cooling...backplate.html
but you can't see the real thermals behind it, so the comparison is really useless in this case.
17 march
5 april
Quote:
SemiAccurate moles are saying that plans are in the works to tape out a GF104 and a GF106 in short order as well, but they haven't left Dear Leader's workers' paradise yet.
So ... Semi ... Accurate :rofl:
Quote:
SemiAccurate has confirmed that the GF104 did tape out a few months ago.
Actually, I did leave that part out on purpose, to prove a damn point. I could have just as easily made a sensationalist thread and simply asked if they are cheating, based on that discovery. You know, just to make a mountain out of a molehill..... It seems potentially far worse a cheat than that little shimmering glitch in Crysis.
BTW, as far as the Crysis crisis goes, has there been any investigation to try and find the cause, or is it all still just hearsay? :rolleyes:
if GF100 is having problems with heat, i can't imagine GF119 ....
gf119 is just a low-end fermi variant, something like the geforce g210, g220 and gt240 but with the fermi architecture, so heat should be a lot less.
GF119 is a higher number than GF100, which is the GTX 480, so sure it's a low-end fermi .....
The GeForce GTX 480 enjoys an average advantage of 30-37%, depending on the resolution,
......over the previous-generation flagship GeForce GTX 285.
ZING!
http://www.xbitlabs.com/articles/vid..._12.html#sect0
a 50% boost is nice, but clearly a far cry from the 100% nvidia claimed... they kept saying 2x the performance of a gtx285... which is not true... even if you allow them to backpedal their claims to refer to a 295... that's a cut-down gt200, and it's sli, and even THAT beats a 480 slightly...
Quote:
The GeForce GTX 480 enjoys an average advantage of 30-37%, depending on the resolution, over the previous-generation flagship GeForce GTX 285. When full-screen antialiasing is turned on, the difference is 46-48%
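the claim-vs-measurement gap is simple arithmetic. quick python sketch (the percentages are from the xbit quote; the 2x figure is nvidia's marketing claim as remembered in this thread):

```python
# Convert a "+N% advantage" into a speedup multiplier and compare it
# to the claimed 2x. Percentages are the xbit-measured averages.

def speedup(advantage_pct):
    """+37% advantage means a 1.37x multiplier, not 2x."""
    return 1 + advantage_pct / 100.0

for pct in (30, 37, 46, 48):
    print(f"+{pct}% advantage -> {speedup(pct):.2f}x (claimed: 2.00x)")
```

even the best case with AA (+48%) only gets you about three quarters of the way to the promised doubling.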
gtx285 sli is about as fast as a 5970 isnt it?
i like xbitlabs review, too bad they are always 2 weeks behind every other site, but at least they do a really good job every time... :)
about rmas... didn't gt200 cards have pretty high rma rates too? and they didn't run that hot... i think there's more to this than just heat... fan profiles, heatsink design, heatsink mounting and card assembly... packaging and pcb problems...
i think the main reason nvidia uses that backplate is not as a heatsink; it's to stabilize the pressure on the gpu package across different temperatures, so that they don't run into solder bump and gpu packaging issues again. let's see if it works...
GTS450 specs look nice... close to 5850 performance... but 200-230E for that?
in what, 4-6 months from now? if it would be out now that would be an ok-good price/perf point... but in 1-2 quarters?
we will see...
Yeah, there is no way the hd5850 will still be priced above that by the time the 450 hits the stores.. just look at the hd5850 launch price :clap:
Fermi is nice if you need power and can keep your house cool enough in summer. For me at least, even with all electronics off, it is too hot in summer..
Well there was at least one instance of more than 100% :D
http://img706.imageshack.us/img706/8945/1234oe.jpg
nvidia: one benchmark to rule them all
King Shamino breaks P31700 Pts with GTX 480 - WR Single GPU Vantage
http://tof.canardpc.com/preview2/7d4...103a8b564e.jpg
1190/1350/2380 GPU/RAM/Shaders ~1.35V
OMG look at the texture and pixel fillrate and memory bandwidth on a single GPU :shocked:
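for the curious, the theoretical numbers at those clocks work out like this. python sketch; the unit counts (60 TMUs, 48 ROPs, 384-bit bus) are stock GTX 480 specs, and the GDDR5 quad-data-rate factor is my assumption:

```python
# Theoretical throughput at the posted OC clocks (1190 core / 1350 mem).
# Unit counts (60 TMUs, 48 ROPs, 384-bit bus) are stock GTX 480 specs;
# GDDR5 transferring 4 bits per pin per clock is assumed here.

core_mhz, mem_mhz, bus_bits = 1190, 1350, 384
tmus, rops = 60, 48

tex_fill = tmus * core_mhz / 1000              # Gtexels/s
pix_fill = rops * core_mhz / 1000              # Gpixels/s
bandwidth = mem_mhz * 4 * bus_bits / 8 / 1000  # GB/s

print(f"texture fill: {tex_fill:.1f} Gtex/s")  # 71.4
print(f"pixel fill:   {pix_fill:.1f} Gpix/s")  # 57.1
print(f"bandwidth:    {bandwidth:.1f} GB/s")   # 259.2
```

roughly 70% more of everything than a stock 480 at 700 MHz core, which is why the fillrate numbers on that run look so crazy.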
1350mhz on the ram, thats pretty nice,
I would love to see a video of the run - how the card drinks LN2 during benching :D Knowing how a 5870 does, I could imagine you need more than 3L per run.
Anyway, that's more like it - the score is very nice, though I would also like to see it on a normal Gulfi for a better comparison ;)
Very bad nails
so he found a way to get mem higher heheh :D
the mem doesnt like to be too cold, so getting the gpu cold and the mem next to it not too cold must have been tricky... respect :toast:
this isnt everything though... hes still limited by the core to shader ratio, im pretty sure the shaders can go higher :yepp:
so once nvidia unlocks that ratio, i'd expect a couple hundred mhz higher shader clocks. since shaders run HOTTTT, this will hardly affect air and water overclocking though... and if you think about it, it's actually quite surprising that it took this long for fermi to break the single-card wr, isn't it? since it's that hot, i'd have thought that under ln2 it would break all 3d records right when the nda expires... but it doesn't seem to clock all that nicely on ln2 for some reason...
wow, those pcbs are huge... or is she that tiny? ^^
so how high do those pcbs clock? :D
different idea of beauty... in asia this is what chicks want... it's very popular here... i don't like it either lol...
hey, i live in taipei, 35c average temp in summer... which lasts 6 months... so i hear ya! :D
http://www.expreview.com/10021.html
GTX 470 in June...
where is the 95°C load temp?
http://j.imagehost.org/0790/800_1600_2000.jpg
that :clap: and the fact that people who live in warm countries won't be able to use Fermi-based cards unless they additionally cool their homes. once room temperature gets over 30c, the gfx card fan will have to constantly spin at 100%, and even that might not prevent crashes if the GPU is doing heavy work.
and what about all the people who like to use their GPUs for crunching?
...no crunch for Indian people:confused::eek:
So in comparison to the Fermi fan set to 100%, does anyone want to know what HD 5000 cards can manage with 100% fan speed?
Click here if you want to find out.
on my gtx 295, the fan profile shows no 100% fan UNTIL 95 degrees
they let the card heat up to preserve "lower noise"
yes, fermi is hotter than it should be; just don't buy it if you don't like it, jeez.
it's not like the card is uncoolable. my 295 will get near 95 degrees when gaming in the summer, then what? nothing, just play and stfu
i think we need AT LEAST another 120 pages of fermi bashing, keep it coming guys
only 8000 gf100 cards worldwide at launch?
only a few hundred in europe?
http://www.semiaccurate.com/2010/04/...aunch-targets/
an austrian site seems to have confirmed the rumor that nvidia is bundling gf100 cards with G92 and other older cards: if a shop/distributor wants to order 1 gf100-based card, they have to order 20 G92 or other older cards as well.
http://hi-tech.at/wordpress/2010/03/...fiosi-familie/
hmmmm, can't wait for the official gf100 launch to see what availability will look like. when was it again? april 23?
GTX480 quad sli...
http://techpowerup.com/119615/GeForc...-way_SLI_.html
damn... thats gonna be HOTTTTTT :eek:
this should be enough to play metro 2033 at 1920x1080 in stereo 3d... maybe even 2560x1600 stereo 3d? :slobber:
btw, if the 470 can clock up to 700 mhz on avg, i am gonna suggest it instead of the 58xx series for the superior software support nvidia gives. that's another truth i learned after i switched to ati :D
I think you should compare the size of the GTX 480 cooler and a CPU cooler... count heatpipes all you want (and you need some practice with that - there are 5 heatpipes :ROTF: ), but the fin array is much smaller.
I am actually seeing a much different trend with retail cards, someone had 86C at auto fan speed.... I can dig up a link if anyone wants one.
What I'd like to know is why people who have no intention of buying the card are so obsessed with its noise level....? The people who actually are going to use the cards seem to care a lot less!