
Thread: G92 is 8800GT 512mb

  1. #126
    Xtreme Enthusiast
    Join Date
    May 2006
    Location
    over the rainbow
    Posts
    964
    Quote Originally Posted by Raptor22 View Post
    The 6600GT beats most of the x800 series, and the 6600 only has a 128-bit bus... it all depends on clocks, architecture and overclocking.
    The X800 XT is faster than the 6800 Ultra, so how could the 6600GT be faster than "most of the x800 series"?
    AMD Phenom II X6 1055T@3.5GHz@Scythe Mugen 2 <-> ASRock 970 Extreme4 <-> 8GB DDR3-1333 <-> Sapphire HD7870@1100/1300 <-> Samsung F3 <-> Win8.1 x64 <-> Acer Slim Line S243HL <-> BQT E9-CM 480W

  2. #127
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Northern California
    Posts
    2,144
    Quote Originally Posted by w0mbat View Post
    The X800 XT is faster than the 6800 Ultra, so how could the 6600GT be faster than "most of the x800 series"?
    X800XL, X800Vanilla....
    |-------Conner-------|



    RIP JimmyMoonDog

    2,147,222 F@H Points - My F@H Statistics:
    http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530

  3. #128
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Northern California
    Posts
    2,144
    Quote Originally Posted by CandymanCan View Post
    SLI 6600s weren't as fast as a single 6800GT, so how could a single 6600 be faster than all X800s?


    edit nvm, you're comparing it to a crippled el cheapo class x800, aren't you lol
    Well, I'm not comparing an 8800GT to a GTX either, am I?

    EDIT: Oh... and the 6600s sucked; the 6600GTs were the b!tchin' variety...
    |-------Conner-------|



    RIP JimmyMoonDog

    2,147,222 F@H Points - My F@H Statistics:
    http://fah-web.stanford.edu/cgi-bin/...e=Conman%5F530

  4. #129
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by v_rr View Post
    lol which type of surprises?
    Double Precision FP
    NVIO + PureVideo HD on chip (maybe)
    Possible unlockables or just a 128SP part with 32SP's fused off
    More cache maybe

    Alternatively, since die shrinks are rarely perfect, it could just be an imperfect shrink.

  5. #130
    Xtreme Cruncher
    Join Date
    Apr 2006
    Posts
    3,012
    Quote Originally Posted by Raptor22 View Post
    The 6600GT beats most of the x800 series, and the 6600 only has a 128-bit bus... it all depends on clocks, architecture and overclocking.
    I don't care how fast your core is if it can't be supplied with data by the RAM fast enough. When guys were modding the 7900GTs, once you got to 1900 on the memory, anywhere above a 700 core you would see no gain from increasing the core speed past that point, the reason being a bandwidth problem. So if you take an 8800GTX core and chop 1/3 of the bandwidth off, it's going to make a huge difference. That just might be what the 8800GT is: a shrunken, higher-clocked, lower-bandwidth GTX.
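
    The "chop 1/3 of the bandwidth off" figure is easy to sanity-check. A minimal sketch, assuming the 8800GTX's 900 MHz (1800 MT/s effective) GDDR3 clock carries over to the rumored 256-bit 8800GT; the GT's bus width and memory clock are speculation at this point, not confirmed specs:

    ```python
    # Peak memory bandwidth = bus width (bytes) * effective data rate.
    # The 8800GTX figures are known; the 8800GT figures are the rumored ones.

    def bandwidth_gb_s(bus_width_bits: int, effective_mt_s: float) -> float:
        """Peak bandwidth in GB/s for a given bus width and transfer rate."""
        return (bus_width_bits / 8) * effective_mt_s * 1e6 / 1e9

    gtx = bandwidth_gb_s(384, 1800)  # 8800GTX: 384-bit GDDR3 @ 900 MHz (1800 MT/s)
    gt  = bandwidth_gb_s(256, 1800)  # rumored 8800GT: 256-bit, same memory clock

    print(f"8800GTX: {gtx:.1f} GB/s")          # 86.4 GB/s
    print(f"8800GT (rumored): {gt:.1f} GB/s")  # 57.6 GB/s -- exactly 1/3 less
    ```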
    CPU: Intel Core i7 3930K @ 4.5GHz
    Mobo: Asus Rampage IV Extreme
    RAM: 32GB (8x4GB) Patriot Viper EX @ 1866mhz
    GPU: EVGA GTX Titan (1087Boost/6700Mem)
    Physx: Evga GTX 560 2GB
    Sound: Creative XFI Titanium
    Case: Modded 700D
    PSU: Corsair 1200AX (Fully Sleeved)
    Storage: 2x120GB OCZ Vertex 3's in RAID 0 + WD 600GB V-Raptor + Seagate 1TB
    Cooling: XSPC Raystorm, 2x MCP 655's, FrozenQ Warp Drive, EX360+MCR240+EX120 Rad's

  6. #131
    I am Xtreme
    Join Date
    Oct 2004
    Location
    U.S.A.
    Posts
    4,743

    G92 has heat problems

    http://www.theinquirer.net/gb/inquir...-heat-problems

    Panicked, last-minute 'Thermal Analysis' suggests

    By Charlie Demerjian: Thursday, 04 October 2007, 12:23 PM

    IT SOUNDS LIKE Nvidia's G92, the next-gen high-end part, is having heat problems. Several people told us that a few weeks ago, they got an urgent letter from NV asking them to send in the computers that the new G92 would go into, for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?

    More interestingly, the OEMs: several told the same story, said they were given about a week to comply, slap it in a box and FedEx that sucker, ASAP. Other than 'thermal analysis' and 'do it now', no explanation was given. That really made us wonder.

    It sounds like a cooling problem, not a die problem. The die itself is far smaller than the ~480mm^2 of the G80. Those seen by our moles are just over 17*17mm, or 289mm^2, on a 65nm process. If you do the math, (.65 * .65)/(.80 * .80) * 480mm^2 gives you about what you would expect for a more or less simple shrink with a few tweaks.

    This means the chip will have approximately the power density of a modern CPU, assuming they didn't up the wattage by a lot. This is quite controllable; if ATI could do it on the X2900XT, the G92 should not pose much of a problem.

    So, where does that leave us? I am guessing, and this is only a guess, that the cooler they ordered isn't exactly cutting it on production silicon in a real case. I can't think of another reason why they would have to jump through so many hoops so late in the process.

    In any case, word should be leaking soon enough, and we will then know if we have another 5800 or 8800 on our hands. One thing for sure, you won't be seeing them in laptops, especially Montevina ones.
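
    The Inq's shrink arithmetic roughly holds up. A sketch reproducing it, with the power-density claim bolted on; the ~110W board-power figure mentioned later in this thread is a rumor, used here purely as an assumption:

    ```python
    # Reproducing the article's die-shrink math: area scales with the
    # square of the linear shrink factor.
    g80_area_mm2 = 480.0  # ~480 mm^2 G80 die, per the article
    shrunk_area = (0.65 * 0.65) / (0.80 * 0.80) * g80_area_mm2
    print(f"Expected shrunk die: {shrunk_area:.0f} mm^2")
    # ~317 mm^2 -- in the ballpark of the observed ~289 mm^2

    # Power density if the rumored ~110 W figure holds (assumption):
    observed_area_mm2 = 289.0  # 17 mm x 17 mm, per the article's moles
    print(f"Power density: {110.0 / observed_area_mm2:.2f} W/mm^2")
    # ~0.38 W/mm^2 -- roughly CPU-like, as the article claims
    ```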


    Asus Z9PE-D8 WS with 64GB of registered ECC ram.|Dell 30" LCD 3008wfp:7970 video card

    LSI series raid controller
    SSDs: Crucial C300 256GB
    Standard drives: Seagate ST32000641AS & WD 1TB black
    OSes: Linux and Windows x64

  7. #132
    Xtreme Addict
    Join Date
    Apr 2004
    Posts
    1,640
    Should probably post that in this thread:

    http://www.xtremesystems.org/forums/...d.php?t=160407

    Regardless, that's the price they have to pay for single-slot cooling.

    They're probably shooting for ~1.8GHz SPs.
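
    For scale, a rough sketch of what a ~1.8GHz shader clock would mean for theoretical throughput; the 96-SP count (128 SPs with 32 fused off, as speculated earlier in the thread) and the G80-style 3 flops/clock per SP are assumptions, not confirmed specs:

    ```python
    # Theoretical shader throughput -- every figure here is speculative.
    sp_count        = 96     # assumed: 128 SPs with 32 fused off
    shader_clock    = 1.8e9  # the ~1.8 GHz target mentioned above
    flops_per_clock = 3      # MADD (2 flops) + MUL (1 flop) per SP, as on G80

    gflops = sp_count * shader_clock * flops_per_clock / 1e9
    print(f"~{gflops:.0f} GFLOPS theoretical")
    # ~518 GFLOPS -- on par with an 8800GTX (128 SPs @ 1.35 GHz)
    ```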
    DFI LANParty DK 790FX-B
    Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
    -cooling: Scythe Mugen 2 + AC MX-2
    XFX ATI Radeon HD 5870 1024MB
    8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
    Seagate 1TB 7200.11 Barracuda
    Corsair HX620W


    Support PC gaming. Don't pirate games.

  8. #133
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Vancouver, BC
    Posts
    2,061
    Several people told us that a few weeks ago, they got an urgent letter from NV to send them computers that the new G92 would go in for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?
    Sounds to me like some engineers at NVIDIA have a LAN party planned this weekend and needed some good rigs for the occasion! LOL!

  9. #134
    3D Team Captain Don_Dan's Avatar
    Join Date
    May 2007
    Location
    Munich, Germany
    Posts
    4,199
    Quote Originally Posted by virtualrain View Post
    Sounds to me like some engineers at NVIDIA have a LAN party planned this weekend and needed some good rigs for the occasion! LOL!
    That's not so unlikely... thorough testing, I would call it!

    Quote Originally Posted by chew* View Post
    You can never have enough D9's.

  10. #135
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    So is the G92 a high-end part now again? I get so confused about this part, as one day they say it's the new high-end, the next day they say it's entry-level, and a few days later perhaps midrange in the form of the 8800GT, etc.
    Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  11. #136
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by DeathReborn View Post
    Double Precision FP
    NVIO + PureVideo HD on chip (maybe)
    Possible unlockables or just a 128SP part with 32SP's fused off
    More cache maybe

    Alternatively, since die shrinks are rarely perfect, it could just be an imperfect shrink.
    Or:

    Geforce 8800 GT doesn't do DirectX 10.1

    No support for Shader model 4.1

    Documents seen by Fudzilla indicate that G92/D8P, aka the Geforce 8800 GT, is not Shader Model 4.1 compatible. It can mean one of two things: one, that Nvidia doesn't want to release the information, or two, simply that this chip doesn't have support for Shader Model 4.1 and DirectX 10.1.


    This comes as an interesting surprise as we know that the RV670 aka the Radeon HD 2950 series will support Shader model 4.1 and DirectX 10.1.


    We will ask around and try to find out if this is the case, but this would be a big setback for Nvidia, at least when it comes to feature tick boxes on upcoming games.
    http://www.fudzilla.com/index.php?op...=3479&Itemid=1

  12. #137
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by v_rr View Post
    Or:

    Geforce 8800 GT doesn't do DirectX 10.1


    http://www.fudzilla.com/index.php?op...=3479&Itemid=1
    Since DX10.1 is mostly about converting your PC into an Xbox 360 clone (audio at this stage), I personally don't see that as a big loss.

    Also, I did not mention either DX10.1/SM4.1 as possible "surprises".

  13. #138
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by DeathReborn View Post
    Since DX10.1 is mostly about converting your PC into an Xbox 360 clone (audio at this stage), I personally don't see that as a big loss.
    SM4.1 is Audio?

  14. #139
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by v_rr View Post
    SM4.1 is Audio?
    Will you see a game utilizing SM4.1, or DX10.1 (which is nothing special over DX10), before the new series of GPUs? Nope... so...

    SM4 or SM4.1 or SM50.3 who cares.

    Also... are you sure you wanna base your opinion on FUDzilla? (ROFLMAOZORZ?)
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  15. #140
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    To talk about SM4.1 now is a bit of a joke, no matter if they support it or not. DX10.1's main feature is audio. And even then, DX10.1 ain't coming anytime soon either. And there is a reason it's called SM4.1 and not 5.0: not much changed.

    And considering games that require SM3.0 are just hitting the market now, I don't see any "need".
    Also, this single-slot card is a replacement for the 8800GTS 320.

    The other main features of DX10.1 are 32-bit floating-point filtering and required 4x anti-aliasing.

    But again, nobody believed G80 would have unified shaders up to its launch either.
    Last edited by Shintai; 10-05-2007 at 05:59 AM.
    Crunching for Comrades and the Common good of the People.

  16. #141
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by BenchZowner View Post
    Will you see a game utilizing SM4.1, or DX10.1 (which is nothing special over DX10), before the new series of GPUs? Nope... so...

    SM4 or SM4.1 or SM50.3 who cares.

    Also... are you sure you wanna base your opinion on FUDzilla? (ROFLMAOZORZ?)
    No.
    Just watching the NVIDIA fanboys: in case the G92 doesn't support DX10.1/SM4.1, they say it's no big deal and DX10.1 is nothing.

    If it were ATI not supporting it, it would be a whole big deal, and ATI sucks, bla bla bla.

    Very funny

  17. #142
    Xtreme PSU Tester
    Join Date
    Jul 2006
    Posts
    1,380
    Quote Originally Posted by RPGWiZaRD View Post
    So is the G92 a high-end part now again? I get so confused about this part, as one day they say it's the new high-end, the next day they say it's entry-level, and a few days later perhaps midrange in the form of the 8800GT, etc.
    Seriously, is this the 8800GT with its single-slot cooling solution they're talking about? If it is, I can sort of understand. It doesn't seem like a very good design to me.

  18. #143
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by v_rr View Post
    No.
    Just watching the NVIDIA fanboys: in case the G92 doesn't support DX10.1/SM4.1, they say it's no big deal and DX10.1 is nothing.

    If it were ATI not supporting it, it would be a whole big deal, and ATI sucks, bla bla bla.

    Very funny
    It's funny where you get your knowledge from. And considering G8x already supports 2 of the 3 DX10.1 features, you might end up again as one spreading FUD. But considering your "sources", that ain't hard to do either.

    More trustworthy sites like HKEPC also say 4.1, if that pleases you. Maybe Fud fumbled his Chinese translator again
    Last edited by Shintai; 10-05-2007 at 06:12 AM.
    Crunching for Comrades and the Common good of the People.

  19. #144
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by jonnyGURU View Post
    Seriously, is this the 8800GT with its single-slot cooling solution they're talking about? If it is, I can sort of understand. It doesn't seem like a very good design to me.
    Personally I have a hard time believing they're talking about the 8800GT card, as at 65nm and 110W, if the numbers are correct, it won't need a good cooler at all and will still run cooler than current 8800GTS cards. Besides, the 8800GT is midrange; it definitely can't be classified as a "next-gen high-end" part. My guess is it's cards in the GX2 form they're talking about: imagine 2x 8800GTX tightly packed; even 65nm won't be the cure for the temperatures, especially if you run it alongside a modern hardware setup such as an ASUS P35 board and a Kentsfield in a closed box, where ambient temps will skyrocket. It would be a good space heater for cold winters here in Finland.
    Last edited by RPGWiZaRD; 10-05-2007 at 06:26 AM.
    Intel Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR3-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  20. #145
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by v_rr View Post
    No.
    Just watching the NVIDIA fanboys: in case the G92 doesn't support DX10.1/SM4.1, they say it's no big deal and DX10.1 is nothing.

    If it were ATI not supporting it, it would be a whole big deal, and ATI sucks, bla bla bla.

    Very funny
    First of all I really hope that you're not assuming/saying that I'm a nVIDIA fanboy, because I'm not, and I don't really care at all about the brands, etc, all I care about is performance and quality.

    So your point of view in this thread is all about ATi vs nVIDIA and ATi fans vs nVIDIA fans ?

    friendly as always,
    BZ
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  21. #146
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    Quote Originally Posted by Shintai View Post
    It's funny where you get your knowledge from. And considering G8x already supports 2 of the 3 DX10.1 features, you might end up again as one spreading FUD. But considering your "sources", that ain't hard to do either.

    More trustworthy sites like HKEPC also say 4.1, if that pleases you. Maybe Fud fumbled his Chinese translator again
    If you look at the posts you see:

    BenchZowner: Also... are you sure you wanna base your opinion on FUDzilla? (ROFLMAOZORZ?)
    My answer: No.

    What I said is that if the G92 doesn't support DX10.1, all the fanboys come here with answers like yours.

    I said previously that I don't believe Fud, but it's funny to see answers of the type:
    "Ah, DX10.1 doesn't matter"

    when everyone knows that in marketing it's a key element, and everyone wants to have its support.

    Otherwise nobody would buy the HD 2600 and 8600, because the X1950Pro/XT are a lot, lot better; yet everyone goes for the HD 2600 and 8600 even knowing that the X1950Pro/XT are better and much faster in DX9, while in DX10 the HD 2600/8600 suck.

    It is not something that doesn't matter, like you say. It's very, very important...

  22. #147
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Seems like the 8800GT cooler has three heatpipes, so this whole overheating thing probably only happens in cases with bad airflow.

    http://www.vr-zone.com/articles/GeFo...ipes/5320.html

  23. #148
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by v_rr View Post
    SM4.1 is Audio?
    I'll say this nicely in hopes you'll understand it. I said DX10.1 is mostly audio-related (XAudio2, in case you didn't know); SM4.1 is just a small part of DX10.1.

    In other news, the 8800GT is looking nice... shame I don't have a side window anymore.

    http://www.vr-zone.com/articles/More..._PCB/5323.html

    It seems like more GeForce 8800 GT card photos have surfaced online, as seen in our forums, and apparently another website in China got the inside scoop. This time round the PCB is black in color, which looks better than the reference green PCB. The GPU heat-spreader looks like any other G80 series part, and the rumored die size of G92 is 289mm^2, as opposed to the ~480mm^2 of the G80.
    And apparently an R6xx product beating its equivalent G8x/9x product in 3DMark means impending doom...

    http://www.vr-zone.com/articles/RV67...s_Up/5322.html

    AMD 1 - 0 NVIDIA


    VR-Zone learned about the Radeon HD 2950PRO (RV670 Revival) scoring around 10.7K in 3DMark06 some time back with a good old FX-62 CPU. The INQ now reveals that RV670 will score around 11.4K, some 600+ points higher than Nvidia's G92 reference score of 10.8K on the same platform. We heard it was benched using a fast Core 2 processor. If all these figures provided by AMD and Nvidia are accurate, it looks like RV670 has the upper hand now. Also, we heard that Nvidia wants the 8800 GT to launch at least 2 weeks earlier than RV670 so it can sell as many cards as possible before AMD ends their party on Nov 19th.

  24. #149
    Xtreme Member
    Join Date
    Nov 2005
    Posts
    126
    lol, the X2900XT scores way higher than the 8800GTS in 3DMark, but does that translate to superior gaming prowess? Nada >_>. Stop basing graphics card performance on 3DMark; nobody besides benchers cares.
    Last edited by Krizby87; 10-05-2007 at 07:17 AM.
    Core i7 8700k @ 5.1Ghz * Gigabyte Z370 Aorus Gaming 5 * 4x8GB Corsair RGB @ 3600 16-18-18-36 * GTX 1080ti @ 2050/11400 * Plextor M8Pe 512GB * Creative Sound Blaster Z * Audioengine 5+ * Corsair Obsidian 750D * Corsair RM1000 watt

  25. #150
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    The 8800GT has its points; the HD 2950 has its own.

    I don't see why one has to be better than the other for you to buy either one.

    PS. The heat problems thread was older than the posts above, so it's on page 6.

    You can continue the discussion here.

    I for one am not surprised, except for the Inq's insinuation that the GT is on a 65nm process... that's the first time I'm hearing this, but it is within Nvidia's capacity to do so.

    Perkam
    Last edited by perkam; 10-05-2007 at 08:06 AM.

