Ask nVidia. But if that's its size, then it's most likely a 128SP or more card.
I don't care how fast your core is if it can't be supplied with data by the RAM fast enough. When guys were modding the 7900GTs, once you had 1900 mem, anywhere above 700 core you would see no gain from pushing the core speed past that point. The reason was a bandwidth problem. So if you take an 8800GTX core and chop 1/3 of the bandwidth off, it's going to make a huge difference. That just might be what the 8800GT is: a shrunk, higher-clocked, lower-bandwidth GTX.
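The bandwidth argument above is just clock times bus width. A quick back-of-the-envelope sketch (the 8800GTX figures are the commonly cited specs; the 256-bit cut-down part is hypothetical, assumed only to illustrate the "chop 1/3 off" point):

```python
def memory_bandwidth_gbs(effective_mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective clock * bus width / 8 bits per byte."""
    return effective_mem_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

# 8800GTX: 900 MHz GDDR3 (1800 MHz effective) on a 384-bit bus
gtx = memory_bandwidth_gbs(1800, 384)  # 86.4 GB/s
# Hypothetical cut-down part: same memory on a 256-bit bus (one third less)
cut = memory_bandwidth_gbs(1800, 256)  # 57.6 GB/s
print(f"{gtx:.1f} GB/s vs {cut:.1f} GB/s")
```

Dropping from 384-bit to 256-bit at the same memory clock is exactly a one-third bandwidth cut, which is why a higher core clock alone wouldn't close the gap.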
http://www.theinquirer.net/gb/inquir...-heat-problems
Quote:
Panicked, last minute 'Thermal Analysis' suggests
By Charlie Demerjian: Thursday, 04 October 2007, 12:23 PM
IT SOUNDS LIKE Nvidia's G92, the next-gen high-end part, is having heat problems. Several people told us that a few weeks ago, they got an urgent letter from NV to send them computers that the new G92 would go in for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?
More interestingly, the OEMs: several told the same story, saying they were given about a week to comply, slap it in a box and FedEx that sucker, ASAP. Other than 'thermal analysis' and 'do it now', no explanation was given. That really made us wonder.
It sounds like a cooling problem, not a die problem. The die itself is far smaller than the ~480mm^2 of the G80. Those seen by our moles are just over 17*17mm, or 289mm^2, on a 65nm process. If you do the math, (.65 * .65)/(.80 * .80) * 480mm^2 gives you about what you would expect for a more or less simple shrink with a few tweaks.
This means the chip will have approximately the power density of a modern CPU, assuming they didn't up the wattage by a lot. This is quite controllable; if ATI could do it on the X2900XT, the G92 should not pose much of a problem.
So, where does that leave us? I am guessing, and this is only a guess, that the cooler they ordered isn't exactly cutting it on production silicon in a real case. I can't think of another reason why they would have to jump through so many hoops so late in the process.
In any case, word should be leaking soon enough, and we will then know if we have another 5800 or 8800 on our hands. One thing is for sure: you won't be seeing them in laptops, especially Montevina ones.
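The shrink arithmetic in the quoted article is just area scaling with the square of the linear feature-size ratio. A sketch using the article's own numbers (including its 80nm figure for G80, kept as the article states it):

```python
def shrunk_area_mm2(old_area_mm2: float, old_node_nm: float, new_node_nm: float) -> float:
    """Ideal optical-shrink area: die area scales with the square of the node ratio."""
    return old_area_mm2 * (new_node_nm / old_node_nm) ** 2

# The article's numbers: ~480 mm^2 for G80, shrunk from 80nm to 65nm
est = shrunk_area_mm2(480, 80, 65)
print(f"estimated: {est:.0f} mm^2, observed: 17*17 = {17 * 17} mm^2")  # estimated: 317 mm^2, observed: 289 mm^2
```

The ~317mm^2 ideal-shrink estimate lands close enough to the reported 289mm^2 to support the article's "more or less simple shrink with a few tweaks" reading.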
Should probably post that in this.
http://www.xtremesystems.org/forums/...d.php?t=160407
Regardless, that's the price they have to pay for single slot cooling.
They're probably shooting for ~1.8GHz SPs.
Sounds to me like some engineers at NVIDIA have a LAN party planned this weekend and needed some good rigs for the occasion! LOL! :p:
Quote:
Several people told us that a few weeks ago, they got an urgent letter from NV to send them computers that the new G92 would go in for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?
So is G92 also a high-end part now again? I get so confused about this part: one day they say it's the new high-end, the next day they say it's entry-level, and a few days later perhaps a midrange part in the form of the 8800GT. :D
Or:
Geforce 8800 GT doesn't do DirectX 10.1
http://www.fudzilla.com/index.php?op...=3479&Itemid=1
Quote:
No support for Shader model 4.1
Documents seen by Fudzilla indicate that G92/D8P, aka the Geforce 8800 GT, is not Shader Model 4.1 compatible. It can mean one of two things: one, that Nvidia doesn't want to release the information, or two, simply that this chip doesn't have support for Shader Model 4.1 and DirectX 10.1.
This comes as an interesting surprise as we know that the RV670 aka the Radeon HD 2950 series will support Shader model 4.1 and DirectX 10.1.
We will ask around and try to find out if this is the case, but this would be a big setback for Nvidia, at least when it comes to feature tick boxes on upcoming games.
To talk about SM4.1 now is a bit of a joke, no matter if they support it or not. DX10.1's main feature is audio. And even then, DX10.1 ain't coming anytime soon either. And there is a reason it's called SM4.1 and not 5.0: not much changed.
And considering games that require SM3.0 are only just hitting the market now, I don't see any "need".
Also this single slot card is a replacement for 8800GTS320.
The main features of DX10.1 are 32-bit floating-point filtering and required 4x anti-aliasing.
But again, nobody believed G80 would have unified shaders up to its launch either.
It's funny where you get your knowledge from. And considering G8x already supports 2 of the 3 DX10.1 features, you might end up spreading FUD again. But considering your "sources", that ain't hard to do either.
More trustworthy sites like HKEPC also say 4.1, if that pleases you. Maybe Fud fumbled his Chinese translator again :rofl:
Personally I have a hard time believing they're talking about the 8800GT card: at 65nm and 110W, if the numbers are correct, it won't need a good cooler at all and will still run cooler than current 8800GTS cards. Besides, the 8800GT is midrange; it definitely can't be classified as a "next gen high-end" part. My guess is it's cards in GX2 form that they're talking about. Imagine 2x8800GTX tightly packed: even 65nm won't be the cure for the temperatures, especially if you run it alongside a modern hardware setup such as an ASUS P35 board with a Kentsfield in a closed box, where ambient temps will skyrocket. Would be a good space heater for cold winters here in Finland. :D