And what about an HD 4850 at 120°C in Furmark? Too hot?
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
LOL @ nvidia fanboy blaming driver bug
Last edited by blob; 03-23-2010 at 02:51 PM.
Don't be so patronising; everyone has bugs.
As for cheating, the rules have changed: DX10 added rasterisation rules. If a card is rendering something, it has to be done a certain way. So now, if it isn't done the right way, it's usually because something is wrong, not because someone is after a speed boost.
Since DX10 you haven't seen as many attacks on image quality, except for the BS concerning the DX 10.1 path of a certain game, and nvidia smashed that like the hammer of Thor. But as far as Microsoft are concerned, they don't want to see any crap from ATI, nvidia or anyone else.
So no, it's unlikely for nvidia to cheat, at least directly.
3 more days and all this Fermi nonsense hopefully ends.
So I have to explain in technical terms why I think it's not a driver problem but a cheat nvidia pulled to make Fermi look like a good piece of trash?
Come on... all the pseudo-news plus all the leaked benchmarks pointed to it months ago... and now this DX9 stunt...
Again: driver problems are dismissed. Want to know why? They had the fricking card in hand for so long, tweaking it, that they had to tweak the driver package too. But hey, you're right, it's all speculation until release day...
And why run a benchmark in DX9 and admit "we are having driver problems" instead of doing some PR spin on it? The reason is nvidia didn't want it to be known. Now the question is: why?
Here's why I made the assumption...
The VRMs don't pop. The card simply crashes and goes into a protection mode, because FurMark (or was it OCCT?) was causing the OCP to kick in. Still a design flaw, but it only affected benchers and people running stress tests; it wasn't causing any problems in real games. Still a serious and bad problem, though.
I haven't heard of any cards hitting 120°C, though!
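Conceptually, the overcurrent-protection (OCP) behavior described above can be sketched like this. This is a purely hypothetical illustration, assuming a simple fixed trip threshold; the function name and the 80 A limit are made up for the example and are not real driver or VBIOS code:

```python
# Illustrative sketch of OCP: instead of letting the VRMs fail,
# the card trips into a protection mode once a power-virus load
# like FurMark pushes current draw past a set limit.

OCP_LIMIT_AMPS = 80.0  # illustrative trip point, not a real spec value

def vrm_response(load_amps: float) -> str:
    """Return what the card does at a given VRM current draw."""
    if load_amps > OCP_LIMIT_AMPS:
        # OCP kicks in: the card "crashes" into protection mode
        # rather than letting the VRMs pop.
        return "protection_mode"
    return "normal_operation"

print(vrm_response(55.0))  # a game-like load stays under the limit
print(vrm_response(95.0))  # a FurMark-style power virus trips OCP
```

This matches the post's point: only synthetic stress loads reached the trip point, so ordinary games never saw the protection mode.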
You guys are ignoring the fact that Charlie has been wrong about clocks, shader count, power consumption and nearly every other aspect of the GTX 4xx series except for the launch dates!
Looks like he was right about the performance, though...
Welcome btw, are you from B3D? I remember seeing you on some other forum.
INTEL Core i7 920 // ASUS P6T Deluxe V2 // OCZ 3G1600 6GB // POWERCOLOR HD5970 // Cooler Master HAF 932 // Thermalright Ultra 120 Extreme // SAMSUNG T260 26"
Has anyone really been far even as decided to use even go want to do look more like?
It's much better now. As long as you don't forget you are assuming, guessing and speculating, then it is OK.
Let me ask you a new question.
When you know your own ideas are based on speculation and assumptions, how can you be so sure of your own impressions that you dismiss other people's ideas so firmly (even an obvious one, like possible driver problems at an early stage), calling them fanboys and such?
I have seen that case with an HD 4850 Sapphire (classic two-slot cooler) in a badly ventilated mid-ATX case running Furmark. Really massive heat.
http://forums.techpowerup.com/showpo...0&postcount=11
"the card is designed to withstand 120°C operating temperatures" - W1zzard