AMD Phenom II X6 1055T@3.5GHz@Scythe Mugen 2 <-> ASRock 970 Extreme4 <-> 8GB DDR3-1333 <-> Sapphire HD7870@1100/1300 <-> Samsung F3 <-> Win8.1 x64 <-> Acer Slim Line S243HL <-> BQT E9-CM 480W
|-------Conner-------|
RIP JimmyMoonDog
2,147,222 F@H Points - My F@H Statistics:
http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530
I don't care how fast your core is if it can't be supplied with information by the RAM fast enough. When guys were modding the 7900GTs, once you got the memory to 1900, anywhere above 700 on the core you would see no gain from increasing the core speed past that point. The reason was a bandwidth problem. So you take an 8800GTX core and chop 1/3 of the bandwidth off, and it's going to make a huge difference. That just might be what the 8800GT is: a shrunken, higher-clocked, lower-bandwidth GTX.
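The bandwidth point can be put in rough numbers. A minimal sketch, assuming the commonly cited figures (900 MHz double-pumped GDDR3 on a 384-bit bus for the 8800GTX versus a 256-bit bus for the rumored GT); the function and the exact clocks are illustrative, not confirmed specs:

```python
# Peak memory bandwidth estimate: effective clock * bus width in bytes.
# The card figures below are commonly cited specs, used only for illustration.

def bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int, pumps: int = 2) -> float:
    """Peak bandwidth in GB/s: clock (MHz) * data rate * bus width (bytes) / 1000."""
    return mem_clock_mhz * pumps * (bus_width_bits / 8) / 1000

# 8800GTX: ~900 MHz GDDR3 on a 384-bit bus
gtx = bandwidth_gb_s(900, 384)   # 86.4 GB/s
# Rumored 8800GT: similar memory clock on a 256-bit bus
gt = bandwidth_gb_s(900, 256)    # 57.6 GB/s
print(f"GTX: {gtx:.1f} GB/s, GT: {gt:.1f} GB/s, loss: {1 - gt / gtx:.0%}")
```

Cutting the bus from 384 to 256 bits alone drops peak bandwidth by a third, which is the scale of difference the post is describing.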
CPU: Intel Core i7 3930K @ 4.5GHz
Mobo: Asus Rampage IV Extreme
RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
GPU: EVGA GTX Titan (1087Boost/6700Mem)
Physx: Evga GTX 560 2GB
Sound: Creative XFI Titanium
Case: Modded 700D
PSU: Corsair 1200AX (Fully Sleeved)
Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's
http://www.theinquirer.net/gb/inquir...-heat-problems
Panicked, last minute 'Thermal Analysis' suggests
By Charlie Demerjian: Thursday, 04 October 2007, 12:23 PM
IT SOUNDS LIKE Nvidia's G92, the next-gen high-end part, is having heat problems. Several people told us that a few weeks ago, they got an urgent letter from NV to send them computers that the new G92 would go in for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?
More interestingly, the OEMs: several told the same story, said they were given about a week to comply, slap it in a box and FedEx that sucker, ASAP. Other than 'thermal analysis' and 'do it now', no explanation was given. That really made us wonder.
It sounds like a cooling problem, not a die problem. The die itself is far smaller than the ~480mm^2 of the G80. Those seen by our moles are just over 17*17mm, or 289 mm^2, on a 65nm process. If you do the math, (.65 * .65)/(.80 * .80) * 480 mm^2 gives you about what you would expect for a more or less simple shrink with a few tweaks.
This means the chip will have approximately the power density of a modern CPU, assuming they didn't up the wattage by a lot. This is quite controllable; if ATI could do it on the X2900XT, the G92 should not pose much of a problem.
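The article's back-of-the-envelope shrink math is just ideal area scaling (area goes with the square of the feature-size ratio). A quick sketch of that estimate, using only the numbers the article itself gives; real shrinks rarely scale this cleanly, since pads and analog blocks don't shrink linearly:

```python
# Ideal die shrink: area scales with the square of the feature-size ratio.
def scaled_area(old_area_mm2: float, old_nm: float, new_nm: float) -> float:
    return old_area_mm2 * (new_nm / old_nm) ** 2

# The article's own figures: ~480 mm^2 scaled by (0.65/0.80)^2
estimate = scaled_area(480, 80, 65)
print(f"{estimate:.0f} mm^2")  # ~317 mm^2, in the ballpark of the observed 289 mm^2
```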
So, where does that leave us? I am guessing, and this is only a guess, that the cooler they ordered isn't exactly cutting it on production silicon in a real case. I can't think of another reason why they would have to jump through so many hoops so late in the process.
In any case, word should be leaking soon enough, and we will then know if we have another 5800 or 8800 on our hands. One thing for sure, you won't be seeing them in laptops, especially Montevina ones.
Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card
LSI series raid controller
SSDs: Crucial C300 256GB
Standard drives: Seagate ST32000641AS & WD 1TB black
OSes: Linux and Windows x64
Should probably post that in this thread:
http://www.xtremesystems.org/forums/...d.php?t=160407
Regardless, that's the price they have to pay for single slot cooling.
They're probably shooting for ~1.8GHz SPs.
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
Sounds to me like some engineers at NVIDIA have a LAN party planned this weekend and needed some good rigs for the occasion! LOL!
"Several people told us that a few weeks ago, they got an urgent letter from NV to send them computers that the new G92 would go in for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?"
So is G92 also a high-end part now again? I get so confused about this part, as one day they say it's the new high-end, the next day they say it's entry-level, and a few days later perhaps a midrange in the form of the 8800GT, etc.
Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
Or:
Geforce 8800 GT doesn't do DirectX 10.1
http://www.fudzilla.com/index.php?op...=3479&Itemid=1
No support for Shader model 4.1
Documents seen by Fudzilla indicate that the G92/D8P, aka the Geforce 8800 GT, is not Shader model 4.1 compatible. It can mean one of two things: one, that Nvidia doesn't want to release the information, or two, simply that this chip doesn't have support for Shader model 4.1 and DirectX 10.1.
This comes as an interesting surprise as we know that the RV670 aka the Radeon HD 2950 series will support Shader model 4.1 and DirectX 10.1.
We will ask around and try to find out if this is the case, but this would be a big setback for Nvidia, at least when it comes to feature tick boxes on upcoming games.
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero
To talk about SM4.1 now is a bit of a joke, no matter if they support it or not. DX10.1's main feature is audio. And even then, DX10.1 isn't coming anytime soon either. And there is a reason it's called SM4.1 and not 5.0: not much changed.
And considering games that require SM3.0 are just hitting the market now, I don't see any "need".
Also this single slot card is a replacement for 8800GTS320.
The main features of DX10.1 are 32-bit floating-point filtering and required 4x anti-aliasing.
But again, nobody believed G80 would have unified shaders up to its launch either.
Last edited by Shintai; 10-05-2007 at 05:59 AM.
Crunching for Comrades and the Common good of the People.
It's funny where you base your knowledge from. And considering G8x already supports 2 of the 3 DX10.1 features, you might end up again as the one spreading FUD. But considering your "sources", that isn't hard to do either.
More trustworthy sites like HKEPC also say 4.1, if that can please you. Maybe Fud fumbled his Chinese translator again.
Last edited by Shintai; 10-05-2007 at 06:12 AM.
Crunching for Comrades and the Common good of the People.
Personally I have a hard time believing they're talking about the 8800GT card, as at 65nm and 110W, if the numbers are correct, it won't need a good cooler at all and will still run cooler than current 8800GTS cards. Besides, the 8800GT is a midrange part; it definitely can't be classified as a "next-gen high-end" part. My guess is it's cards in GX2 form that they're talking about: imagine 2x 8800GTX tightly packed; even 65nm won't be the cure for the temperatures, especially if you run it alongside a modern hardware setup such as an ASUS P35 board with a Kentsfield in a closed box, where ambient temps will skyrocket. It would be a good space heater for cold winters here in Finland.
Last edited by RPGWiZaRD; 10-05-2007 at 06:26 AM.
Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
First of all, I really hope that you're not assuming/saying that I'm an nVIDIA fanboy, because I'm not, and I don't really care at all about the brands, etc.; all I care about is performance and quality.
So your point of view in this thread is all about ATi vs nVIDIA and ATi fans vs nVIDIA fans?
friendly as always,
BZ
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero
If you look at the post, you see:
BenchZowner: Also... are you sure you wanna base your opinion on FUDzilla? (ROFLMAOZORZ?)
My answer: No.
What I said is that at the mere possibility of the G92 not supporting DX10.1, all the fanboys came here with answers like yours.
I said previously that I don't believe in Fud, but it's funny to see your answer of the type:
"Ah, DX10.1 doesn't matter."
Which everyone knows is a key element in marketing. And everyone wants to have its support.
Otherwise nobody would buy the HD 2600 and 8600, because the X1950Pro/XT is a lot, lot, lot better; but everyone goes for the HD 2600 and 8600 even knowing that the X1950Pro/XT is better and much faster in DX9, and that in DX10 the HD 2600/8600 suck.
It is not something that doesn't matter, like you say. It's very, very important...
Seems like the 8800GT cooler has 3 heatpipes so this whole overheating thing is probably in cases of bad case airflow.
http://www.vr-zone.com/articles/GeFo...ipes/5320.html
I'll say this nicely in hopes you'll understand it. I said DX10.1 is mostly audio related (XAudio2, in case you didn't know); SM 4.1 is just a small part of DX10.1.
In other news, the 8800GT is looking nice... shame I don't have a side window anymore.
http://www.vr-zone.com/articles/More..._PCB/5323.html
And apparently a R6xx product beating its equivalent G8x/9x product in 3DMark means impending doom...
It seems like more GeForce 8800 GT card photos have surfaced online, as seen in our forums, and apparently another website in China got the inside scoop. This time round the PCB is black in color, which looks better than the reference green PCB. The GPU heat-spreader looks like any other G80 series part, and the rumored die size of G92 is 289mm2 as opposed to ~480mm2 of the G80.
http://www.vr-zone.com/articles/RV67...s_Up/5322.html
AMD 1 - 0 NVIDIA
VR-Zone learned about the Radeon HD 2950PRO (RV670 Revival) scoring around 10.7K in 3DMark06 some time back with a good old FX-62 CPU. The INQ now reveals that RV670 will score around 11.4K, some 600+ points higher than Nvidia's G92 reference score of 10.8K on the same platform. We heard that it was benched using a fast Core 2 processor. If all these figures provided by AMD and Nvidia are accurate, it looks like RV670 has the upper hand now. Also, we heard that Nvidia wants the 8800 GT to be launched at least 2 weeks earlier than RV670 so it can sell as many cards as possible before AMD ends their party on Nov 19th.
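For what it's worth, the gap between those two reported scores is modest in relative terms. A quick check of the arithmetic, taking the rumored figures at face value:

```python
# Reported 3DMark06 scores (rumored, same platform): RV670 vs G92 reference
rv670, g92 = 11400, 10800
margin = (rv670 - g92) / g92
print(f"{rv670 - g92} points, about {margin:.1%}")  # 600 points, about 5.6%
```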
lol, the X2900XT scores way higher than the 8800GTS in 3DMark, but does it translate to superior gaming prowess? Nada >_>. Stop basing graphics card performance on 3DMark; nobody besides benchers cares.
Last edited by Krizby87; 10-05-2007 at 07:17 AM.
Core i7 8700k @ 5.1Ghz * Gigabyte Z370 Aorus Gaming 5 * 4x8GB Corsair RGB @ 3600 16-18-18-36 * GTX 1080ti @ 2050/11400 * Plextor M8Pe 512GB * Creative Sound Blaster Z * Audioengine 5+ * Corsair Obsidian 750D * Corsair RM1000 watt
The 8800GT has its points, the HD 2950 has its own.
I don't see why one has to be better than the other for you to buy either one.
PS. The heat problems thread is older than the posts above, so it's on page 6.
You can continue the discussion here.
I for one am not surprised, except by the Inq's insinuation that the GT is on a 65nm process... that's the first time I'm hearing this, but it is within Nvidia's capacity to do so.
Perkam
Last edited by perkam; 10-05-2007 at 08:06 AM.