http://www.xtremesystems.org/forums/...d.php?t=246534
Too bad the 470 vents are just a sticker.
Go on, complain with your 2560x1600 monitors; my 24-incher is dead and I am temporarily stuck with 1024x768 while it's being RMA'd, so now THAT is a pain in the butt! :D
I don't know about Firefox, but IE simply scales all images to the size of your browser window dynamically, so layouts are maintained as well. I could be missing something here, or the feature may not be available in others like Firefox/Chrome, which would explain why people want the forum to do it.
EVGA and BFG always seem to have OC, OCX, FTW etc. cards, so I'm guessing we'll see a few variants like almost every release. Whether they'll have the OC models out of the gate at launch, though, is anyone's guess.
I hope to see a BFG GTX 480 OC at the 725/1050 spec, but I'm not willing to pay a large premium for it; $25-$50 extra per card would be my limit. There are always going to be refreshes (GTX 280 -> 285), and while I'd love to have that 512 CC part, there's no way I'm waiting 3+ months for it.
If the info is out... can anyone tell me the power pin setup for both the 470 and 480 cards? Sorry for being lazy; I didn't want to read through 45 pages. My 850 W OCZ has 4x 6-pins.
Hoping two of these can run my 30" Dell 3008WFP @ 2560x1600; my 8800GTs are tired for sure.
470: 2x 6-pin
480: 1x 6-pin + 1x 8-pin
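To make the answer to the question above concrete, here is a small illustrative Python sketch (not anything official, just arithmetic on the connector counts quoted in this thread) checking whether a PSU's native PCIe leads cover one or two cards:

```python
# Connector requirements as posted above; adapters (molex-to-PCIe etc.)
# are deliberately ignored in this sketch.
CARD_REQS = {
    "GTX 470": {"6pin": 2, "8pin": 0},
    "GTX 480": {"6pin": 1, "8pin": 1},
}

def psu_can_power(cards, psu_6pin, psu_8pin):
    """True if the PSU has enough native 6/8-pin PCIe leads for the cards."""
    need6 = sum(CARD_REQS[c]["6pin"] for c in cards)
    need8 = sum(CARD_REQS[c]["8pin"] for c in cards)
    # A spare 8-pin lead can usually feed a 6-pin socket, but not vice versa.
    spare8 = psu_8pin - need8
    if spare8 < 0:
        return False
    return psu_6pin + spare8 >= need6

# The 850 W OCZ mentioned above has 4x 6-pin: enough for two GTX 470s,
# but a single GTX 480 would still want a native 8-pin lead.
print(psu_can_power(["GTX 470", "GTX 470"], psu_6pin=4, psu_8pin=0))  # True
print(psu_can_power(["GTX 480"], psu_6pin=4, psu_8pin=0))             # False
```

So the poster's 4x 6-pin unit handles a pair of 470s as-is, while a 480 needs an adapter.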
Do some reading: Nvidia has posted that they are working with AICs to replace out-of-warranty cards. What more could you ask for, besides not having that mistake to begin with?
BTW, I used 196.75 for a while and monitored the fan speed and temps on my GTX 280, and it did not have any issues.
PHK posted new Crysis Warhead scores:
http://tof.canardpc.com/preview2/6cc...601a5b301d.jpg
1920x1200, 4x AA
HD 5870 (Cat 10.3) - min 19.60 / max 41.41 / avg 30.22
GTX 470 (196.78) - min 18.62 / max 30.60 / avg 25.53
1920x1200, 8x AAQ
HD 5870 (Cat 10.3) - min 4.96 / max 37.87 / avg 23.44
GTX 470 (196.78) - min 15.63 / max 29.21 / avg 23.03
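To put the averages above in perspective, a quick Python calculation of the relative gap (purely illustrative arithmetic on the numbers as posted):

```python
# Relative performance from the posted Crysis Warhead averages.
def pct_lead(a, b):
    """How much faster a is than b, in percent."""
    return (a - b) / b * 100.0

# 1920x1200, 4x AA averages: HD 5870 leads clearly.
print(round(pct_lead(30.22, 25.53), 1))  # 18.4

# 1920x1200, 8x AAQ averages: nearly a tie on the average...
print(round(pct_lead(23.44, 23.03), 1))  # 1.8
# ...though the minimums (4.96 vs 15.63 fps) tell a different story.
```

So the 5870 averages roughly 18% ahead at 4x AA, but the 8x AAQ run is a wash on averages while the GTX 470 holds a much higher minimum.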
I called a Best Buy with the SKU number for the GTX 470 and was told a price of $399.99; not sure if that's retail markup or the actual MSRP. If it is the MSRP and the difference really is $400 vs. $500, that makes my decision between the two much quicker/easier: I'd go for the GTX 480. I don't know the SKU for the 480, so I can't find out their pricing on it. Just interesting, since current info says $350 for the 470 and $500 for the 480.
Yeah, right.
Even with a presumably immature 196.78 driver set, the scores are not really impressive.
Smooth gameplay in Crysis Warhead maxed out at 1920x1200 doesn't seem possible with this single-GPU generation. Maybe the next one? :D
The earlier scores, it seems, were at 2560x1600.
http://www.xtremesystems.org/forums/...19&postcount=2
Just disable advanced DOF and tessellation under the DirectX 11 settings. I didn't notice any difference in graphics quality, yet my FPS skyrocketed. AA can also be quite heavy in this game. AAA works quite a bit better than 4x MSAA for me, although it does show some jaggies every now and then. It's not that bad, though.
That pie chart reflects market share more than anything else. I have used both ATi and Nvidia drivers. I do like having custom profiles with Nvidia, and in D3D apps I never had trouble with either. I use Nvidia cards because I can run OpenGL apps that don't crash every 5 minutes from drivers.
If some kind of monster comes out of this BS I will be very surprised. Two days left and still no decent bench around.
Unless something super-reliable surfaces I don't really care about speculation with only 2.5 days to go, frankly.
To be fair, I think this is more a driver/profiling bug than manipulation. I'm not a nvidia lover by any means, but when they actually do cheat it's a little more subtle.
Sure, they'll do gray all day long, as long as you need uncompressed JPEGs and a magnifying glass to prove it, but nothing as black and white as this.
Edit: As for the Dirt 2 keys, I think it's to make sure you know every reviewer has access to the game, so if they leave it out they'd have to explain why. There aren't many DX11 games out, so it would be a crime to only bench Metro 2033. That said, I do think Nvidia might have an edge in post-processing effects.
Looks like things are panning out to the tune of Charlie being right more than wrong... interesting.
The only thing I don't agree with in this article is his comment about 87C; that isn't unheard of. With the default fan profile, a 5870 consistently hovers at 86-87C for me at 1920x1200 with 4x AA in many games (reviews also reflect similar temps; it's the thermal/fan profile AMD targeted). And if I recall, a certain 8800GTX did the same (ran in the high 80s). That said, the 98C-in-Furmark comment is scary.
It's all in the name of PR. Anyone who is planning on buying a GTX 4xx card without waiting for reviews won't have their mind changed by something like this. As for the rest of the sane world, they will wait for the more level-headed and honest reviews, which will be out in force in a matter of days. Marketers embellishing the truth? Oh really?! :shocked:
And what about the HD 4850 at 120°C in Furmark? Too hot?
LOL @ nvidia fanboy blaming driver bug
It's not a driver bug. It's the Dirt 2 demo lacking the correct profiles for the GTX 4x0 cards (the demo was released 6 months ago).
The full game does seem to have the correct profiles.
Not mentioning that the benchmark was GTX 470 @ DX9 vs. HD 5870 @ DX11 is a different topic.
Don't be so patronising, everyone has bugs.
As for cheating, the rules have changed: DX10 added rasterisation rules. If a card renders something, it has to render it a certain way. So now, if something isn't done the right way, it's normally because something is wrong, not to get a speed boost.
Since DX10 you've not seen as many attacks on image quality, except for the BS concerning the DX10.1 support in a certain game; Nvidia smashed that like the hammer of Thor. And as far as Microsoft is concerned, they don't want to see any crap from ATI, Nvidia, or anyone else.
So no, it's unlikely for Nvidia to cheat, at least not directly.
3 more days and all this Fermi nonsense hopefully ends.
So I have to explain in a technical manner why I think it's not a driver problem but a cheat Nvidia pulled to make Fermi look like a good piece of trash????
Come on... all the pseudo news plus all the leaked benchies pointed to it months ago... and now this DX9 stunt...
Again, driver problems are dismissed. Want to know why? They've had the fricking card in their hands for so long, tweaking it, that they had plenty of time to tweak the driver package too. But hey, you're right, it's all speculation until release day...
And why run a benchmark in DX9 rather than say "we are having driver problems", or do some PR spin on it? The reason is Nvidia didn't want it known. Now the question is: why?
That's why I made the assumption.
The VRMs don't pop. The card simply crashes and goes into a protection mode, because Furmark (or was it OCCT?) was causing the OCP to kick in. Still a design flaw, but it only affected benchers and people running stress tests; it wasn't causing any problems in real games. Still a serious and bad problem, though :shakes:
I haven't heard of any cards hitting 120C though :shocked:
You guys are ignoring the fact that Charlie has been wrong about clocks, shader count, power consumption, and nearly every aspect of the GTX 4xx series except for the launch dates!
Looks like he was right about the performance, though...
Welcome, BTW. Are you from B3D? I remember seeing you on some other forum.
It's much better now. As long as you don't forget you are assuming, guessing, and speculating, then it is OK.
Let me ask you a new question.
When you know your own ideas are based on speculation and assumptions, how can you be so sure of your own impressions that you dismiss others' ideas so firmly (even one as obvious as possible driver problems at an early stage), calling them fanboys and such?
I have seen that case with an HD 4850 with the Sapphire classic two-slot cooler in a badly ventilated mid-ATX case running Furmark. Really massive heat :shocked:
http://forums.techpowerup.com/showpo...0&postcount=11
Quote:
the card is designed to withstand 120°C operating temperatures - W1zzard
It will be boring compared to Fermi. No mock card to make fun of, it will not be 6+ months late, no wood screws, etc. Gonna miss the Twitter leaks :ROTF: Leaks will probably come 2 months before release, something like this:
http://www.pclaunches.com/graphic_ca...ecs_leaked.php
Let's face it, the specs had not really been finalised until recently. They did another spin, and the reference cards they sent out had different clocks on them, so how could he get it right when Nvidia weren't sure themselves?
As far as I'm concerned, it doesn't matter how many shaders it has, what its clocks are, or its TDP (not power consumption); it's how it performs that matters.
Has anyone pointed out yet that ATI having had the 5800 series on the market for ~8 months means they're more or less in refresh territory? :shrug:
If the GTX 480 can just barely, ever so slightly edge out the 5870 (while using much more power)... Things aren't looking too good if ATI can just launch a 5890...
Not that they will for sure, but AFAIK 8 - 12 months is a pretty typical time for a refresh (at least since I've been following tech).
These kind of cards should have been planned for a long time ago.
Actually, I had expected a 5870 2GB a loooong time ago (4870 512MB -> 1GB, anyone? http://www.anandtech.com/video/showdoc.aspx?i=3415), with a refresh like the 4890 for when Fermi came out, to finish the line-up and make ATi really shine for the first time since the release of the 8800GTX.
Sadly, they did not go that route (and if you are wondering why I say sadly: I am kind of stuck on the red side, thanks to Eyefinity).
Well, that card's TDP is something like 220W as far as I can remember, so the huge TDP advantage of the 5870 (compared to Fermi) vanishes there...
Yes, but that card will carry a bigger premium over a 5870 than a 5870 2GB would, because of all the extra Eyefinity6 goodies that I don't want...
http://ic.tweakimg.net/ext/i/imagelarge/1269336601.jpeg
;)
But this is waaay too offtopic.
All I know is that reviewers like SKYMTL and Benchzowner have hinted that GF100 cards will have AWESOME gaming performance compared to the competition, even in today's games, and that GF100 will stomp RV870 into the trash bin where it belongs. So I think I'll put more weight on their cues than on Charlie, the Satan himself, for this launch. We will see in the next few days who was flat-out right, and who was talking out of their rear.
I'm tired of believing Charlie only to be mocked by Nvidia fanboys, so this time I'll swear by their side, and see which side turns out to be correct after all of this is unveiled.