
Thread: 8800GT Official Reviews

  1. #376
    Registered User
    Join Date
    Oct 2007
    Location
    Atlanta, GA
    Posts
    29
It's not the 2x performance I expected, but it's a nice 30% improvement. I guess it's worth selling my X1900XTX and upgrading to an 8800GT if it costs me under $50


    Where do you sell old Video Cards for $200?
    New Rig OTW
    Case - Thermaltake Armor VA8003BWS 25CM Fan
    PSU - Ultra X3 800w Modular
    CPU - Q6600 GO With Thermalright Ultra 120 Extreme
    Motherboard - Abit IP35 Pro
    Ram - DDR2 Crucial Ballistix-6400 2gig
    Video Card - EVGA 8800GT Superclock
    Hard Drive - Samsung Spinpoint 500gb
DVD/CD Burner - Samsung 20X OEM
    Case Fans - 4x APEVIA CF12SL-UBL 120mm
    MySpace

  2. #377
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Nice, looks like it might be the first time I choose Palit.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  3. #378
    Registered User
    Join Date
    Aug 2007
    Location
    New York
    Posts
    63
I ordered my 8800GT Superclocked (couldn't wait for the SSC anymore)
from ZipZoomFly. It seems really good: $279 and free shipping. But I got 2-day shipping for only $2.99!!! Haha, can't wait until it comes... it should be here by tomorrow.

  4. #379
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308

    SPARKLE Unveils New Calibre P888 Graphic Card For High-End Gaming



    Specifications

    Calibre P888
    NVIDIA GeForce 8800 GT
    675 MHz Core clock
    1800 MHz Memory clock
    512MB GDDR3
    256-bit Memory bus
    112 SPs
    1728 MHz Shader clock
    PCI-Express 2.0
400 MHz RAMDAC
Dual DVI-I
Resolution up to 1920 x 1080i
    HDCP: Yes

    Source
    Last edited by RPGWiZaRD; 10-30-2007 at 10:54 AM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  5. #380
    Xtreme Enthusiast
    Join Date
    Mar 2005
    Location
    Imladris
    Posts
    967
    Quote Originally Posted by RPGWiZaRD View Post


    Specifications

    Calibre P888
    NVIDIA GeForce 8800 GT
    675 MHz Core clock
    1800 MHz Memory clock
    512MB GDDR3
    256-bit Memory bus
    112 SPs
    1728 MHz Shader clock
    PCI-Express 2.0
400 MHz RAMDAC
Dual DVI-I
Resolution up to 1920 x 1080i
    HDCP: Yes

    Source
I wonder what the likelihood is that either this or the Palit would work in SLI with those big coolers?
    The little air system that could.


  6. #381
    Xtreme Enthusiast
    Join Date
    Nov 2004
    Posts
    510
    Quote Originally Posted by HousERaT View Post
I wonder what the likelihood is that either this or the Palit would work in SLI with those big coolers?
    That's a very good question, lol.
    ASUS M5A99X EVO
    AMD Deneb 3.4Ghz Black Edition
    Gigabyte Radeon HD5950 1GB
    Mushkin Redline 2x4GB DDR3 2133

    Quote Originally Posted by afireinside View Post
    The ps2 was disappointing it's entire life and the ps3 is downright junk so far.
    Quote Originally Posted by ahmad View Post
    No body is going to be writing games for the Cell.. no one has the time or developers that care. Thanks to SONY, the Cell is already history.

  7. #382
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
I think SLI might work fine with the Palit. Look at the card: it's dual-slot instead of single-slot, but the cooler doesn't seem taller than the slot.

I think it's the same situation as having two G80 8800GTS cards with the stock cooler (dual-slot) in SLI.


    regards
    Last edited by mascaras; 10-30-2007 at 11:35 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  8. #383
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
Hmm, I like the free space that cooler creates on the card; it leaves plenty of room for further cooling improvements, such as attaching an 80mm fan over the right edge of the card using the unused screw holes in the corner. I did the same mod to my 6800GT while I was using the stock cooler; it actually helped by quite a few degrees (~7°C, I think) and was an easy one-minute job.

So yes, the reference NVIDIA cooler design seems to be very crappy if even a Zalman VF700 copy brings a noticeable improvement, and judging by all the temperatures measured in reviews, that seems to be the case.
    Last edited by RPGWiZaRD; 10-30-2007 at 11:37 AM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  9. #384
    Xtreme Addict
    Join Date
    Mar 2004
    Location
    Toronto, Ontario Canada
    Posts
    1,433
    I'm surprised no one has tested the effects of PCIe 2.0 especially in SLI config.

  10. #385
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by HKPolice View Post
    I'm surprised no one has tested the effects of PCIe 2.0 especially in SLI config.
Yeah, I would love to see a PCI-E 2.0 vs. 1.x comparison in both single and SLI configurations, to see whether it's less than a 1 FPS increase or maybe even 5.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  11. #386
    Registered User
    Join Date
    Nov 2006
    Posts
    77
Anyone here know what "shortly" means to an EVGA customer service rep? LOL. I called to check on the status of the SSC I ordered yesterday morning with way-overpriced overnight shipping. The rep indicated it would ship "shortly"; however, when asked whether "shortly" meant a few days, a day, or today, the rep would only say "shortly." WOW, talk about being helpful, what azzhats.

  12. #387
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Philippines
    Posts
    793
I'm using a Palit 8600GT Sonic; the cooler looks the same, and it really takes just two slots. SLI won't be a problem with this card.

But if it's going to be Palit, I'd wait for the Sonic version.


    Rig Specs
    Intel Core 2 Extreme QX9650 4.0ghz 1.37v - DFI Lanparty UT P35 TR2 - 4x1GB Team Xtreem DDR2-1066 - Palit 8800GT Sonic 512MB GDDR3 256-bit
    160GB Seagate Barracuda 7200RPM SATA II 8MB Cache - 320GB Western Digital Caviar 7200RPM SATA II 16MB Cache - Liteon 18X DVD-Writer /w LS
    640GB Western Digital SE16 7200RPM SATA II 16MB Cache - Corsair HX 620W Modular PSU - Cooler Master Stacker 832
    Auzen 7.1 X-Plosion - Zalman ZM-DS4F - Sennheiser HD212 Pro - Edifier M2600



    Custom Water Cooling
Dtek Fusion Extreme CPU Block - Swiftech MCR-220 - Swiftech MCP655-B - Swiftech MCRES-MICRO Reservoir - 7/16" ID x 5/8" OD Tubing
    Dual Thermaltake A2018s 120mm Blue LED Smart fans.


    www.mni-photography.site88.net

  13. #388
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Last edited by mascaras; 10-30-2007 at 01:45 PM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  14. #389
    Registered User
    Join Date
    Jul 2004
    Location
    arlington,texas
    Posts
    73
    those palits look pimp
    Processor: INTEL I7 930 @2.8GHz OVERCLOCKED 4.1G
    Memory: CORSAIR DOMINATOR 12GB (6 x 2GB) 240-Pin DDR3 SDRAM DDR3 1600
    Motherboard: ASUS P6X58D PREMIUM
    Sound Card:Creative PCI Express Sound Blaster X-Fi Titanium Fatal1ty Champion Series Sound Card
    Video Card(s): NVIDIA GTX480 x2 in SLI MODE
    Hard Drive(s): Western Digital VelociRaptor WDBACN3000ENC-NRSN 300GB 10000 RPM SATA 3.0Gb/s 3.5" Internal Hard Drive
    CD/DVD: ASUS BLU RAY/DVD PLAYER/BURNER
    Power Supply: OCZ 1000WATT Z SERIES
    Case: COOLER MASTER HAF 932 Advanced RC-932-KKN5-GP Black Steel ATX Full Tower Compucase Case with USB 3.0 and Black Interior
    Monitor: DELL 3007HC @2560X1600
    OS:Windows 7 64-bit

  15. #390
    Xtreme Member
    Join Date
    Oct 2006
    Location
    Amsterdam - NL
    Posts
    136
    Quote Originally Posted by cstkl1 View Post
    E6850 L720A478 @ 4ghz 1.45v
    XFX 8800gt Alpha Dog Edition ( 700/1750/2000) .. any card can do this
    Asus Maximus Extreme SE
    Team Xtreem PC2-6400 CL3@ 1000mhz 4-4-4-8 2.28vdimm

    700/1750/2000
    Left it to run from 4a.m till 8a.m with fan/aircon off
    GPU Fan 100%
    Live in Malaysia

    3dmark06
    For comparison - 8800GTS 640:


  16. #391
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
That 8800GT score looks a bit low for that CPU frequency. Not by much, but I had expected around 300-400 points more.

EDIT: NVM, just saw his services list. Ewww, ~20 services running 24/7, ftw.
    Last edited by RPGWiZaRD; 10-30-2007 at 02:28 PM.
Intel® Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  17. #392
    Xtreme Mentor
    Join Date
    Jun 2007
    Location
    Chicago
    Posts
    3,284
736/1083/1712 shader... that is very fast and quite a bit higher than most GTS cards.
I had two XFX cards, and the most I could overclock them to (watercooled) was 688/1000/1620 shader.

I think if you were to compare the two, you would need to find a GT that's just as fast on the OC (750+/1850/2100+) to really compare, since your card has quite an unusual overclock.
    Asus P6T, I7-920, 6gb ocz xmp, 4890, Raid 0-1 Terabyte, full watercooled - Triple Loop 5 radiators

  18. #393
    Xtreme X.I.P. Soulburner's Avatar
    Join Date
    Oct 2003
    Location
    Lincoln, NE
    Posts
    8,868
    Quote Originally Posted by TreeTop View Post
It's not the 2x performance I expected, but it's a nice 30% improvement. I guess it's worth selling my X1900XTX and upgrading to an 8800GT if it costs me under $50
    Up to 30%...at some resolutions with AA/AF, the GTS 640 can still be the better card. You get anywhere from 0-30% with the GT.
    System
    ASUS Z170-Pro
    Skylake i7-6700K @ 4600 Mhz
    MSI GTX 1070 Armor OC
    32 GB G.Skill Ripjaws V
    Samsung 850 EVO (2)
    EVGA SuperNOVA 650 G2
    Corsair Hydro H90
    NZXT S340

  19. #394
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803

Look how much difference the shader OC makes...
Can you OC the shaders on the old GTS? And if so, why did I sell mine!!
Guffaw, chuckle... but anyway, the new cards are 65nm... maybe a cooling advantage.

And as has been shown, some games (at particular settings) see up to a 30% performance increase,
especially that "shader @ 1770, rest stock" bar... holy moly.

The shader OC accounts for 80.4% of the performance increase, according to this graph:

14190 - 12495 = 1695 (total gain, full OC vs. stock)
13858 - 12495 = 1363 (gain from the shader OC alone)
1363 / 1695 ≈ 0.804
    http://www.xtremesystems.org/forums/...ghlight=8800gt
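
For anyone who wants to sanity-check that figure, here is the same arithmetic as a minimal Python sketch; the three scores are the ones read off the linked graph:

Code:
# 3DMark06 scores from the graph: stock, shader-OC-only, and full OC.
stock = 12495        # everything at stock clocks
shader_only = 13858  # the "shader @ 1770, rest stock" bar
full_oc = 14190      # all clocks overclocked

total_gain = full_oc - stock       # 1695 points
shader_gain = shader_only - stock  # 1363 points
share = shader_gain / total_gain   # ~0.804

print(f"shader OC accounts for {share:.1%} of the total gain")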

& I like the look of the SPARKLE Calibre with the double fan
    Last edited by adamsleath; 10-30-2007 at 04:26 PM.
    i7 3610QM 1.2-3.2GHz

  20. #395
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by adamsleath View Post
    ..
Can you OC the shaders on the old GTS? And if so, why did I sell mine!!
Yes, with RivaTuner:




    NVIDIA G80 based GPU shader clock speed adjustment using 163.67 drivers and RivaTuner 2.04

    >> http://forums.guru3d.com/showthread.php?t=238083




    NVIDIA G80 based GPU shader clock speed adjustment using 163.67 drivers and RivaTuner 2.04
======================================================================

    Overview/Background
    ---------------------

Prior to NVIDIA driver release 163.67, the shader clock speed was linked to the core clock (aka ROP domain clock) speed and could not be changed independently. The relationship between core and shader domain clock speeds (for most cards) is shown in Table A. Some cards have slightly different set frequency vs. resultant core/shader speeds, so take the table as an illustration of how the shader clock changes with respect to the core clock rather than as precise values. To overclock the shader speed, it was necessary to flash the GPU BIOS with a modified version that sets a higher default shader speed.

By way of an example, my EVGA 8800 GTS Superclocked comes from the factory with BIOS-programmed default core and shader speeds of 576 and 1350, respectively. When increasing the core speed, I found 648 to be my maximum stable speed. From Table A, you can see that with a core of 648, the maximum shader speed (owing to the driver-controlled core/shader speed linkage) is 1512. To push it higher, you increase the BIOS-set shader speed. For example, with a BIOS set to core/shader 576/1404 (from 576/1350), all linked shader speeds are bumped up by 54MHz. So now when increasing the core to 648, the maximum shader speed becomes 1512+54=1566. I eventually determined my maximum stable shader speed to be 1674 (achieved with GPU BIOS startup speeds set to 576/1512; overclocking the core to 648 now yields a shader speed of (1512-1350)+1512=1674).
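
Here is that pre-163.67 arithmetic as a minimal Python sketch, using the numbers from the example above (the 648 → 1512 linkage is the one from Table A further down):

Code:
# Pre-163.67: the driver links the shader clock to the core clock, so the
# only way to push the shader higher was to raise the BIOS default shader
# speed, which shifts every linked shader speed up by the same offset.
bios_core, bios_shader = 576, 1350   # factory defaults (EVGA Superclocked)
linked_shader_at_648 = 1512          # Table A: core 648 -> shader 1512

new_bios_shader = 1512               # flashed-in higher default
offset = new_bios_shader - bios_shader         # +162 MHz

shader_at_648 = linked_shader_at_648 + offset  # 1512 + 162 = 1674
print(shader_at_648)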

    However, as of NVIDIA driver release 163.67, the shader clock can now be modified independently of the core clock speed. Here is the announcement by Unwinder:

    "Guys, I've got very good news for G80 owners. I've just examined overclocking interfaces of newly released 163.67 drivers and I was really pleased to see that NVIDIA finally added an ability of independent shader clock adjustment. As you probably know, with the past driver families the ForceWare automatically overclocked G80 shader domain synchronicallly with ROP domain using BIOS defined Shader/ROP clock ratio. Starting from 163.67 drivers internal ForceWare overclocking interfaces no longer scale shader domain clock when ROP clock is adjusted and the driver now provides completely independent shader clock adjustment interface. It means that starting from ForceWare 163.67 all overclocking tools like RivaTuner, nTune, PowerStrip or ATITool will adjust ROP clock only.
However, new revisions of these tools supporting the new overclocking interfaces will probably allow you to adjust the shader clock too. Now I've played with the new interfaces, and the upcoming v2.04 will contain an experimental feature allowing power users to define a custom Shader/ROP ratio via the registry, so RT will clock the shader domain together with the ROP domain using the user-defined ratio.
    And v2.05 will give you completely independent slider for adjusting shader clock independently of core clock.

    Note:

By default this applies to Vista-specific overclocking interfaces only; Windows XP drivers still provide the traditional overclocking interface adjusting both shader and ROP clocks. However, XP drivers also contain optional Vista-styled overclocking interfaces and you can force RivaTuner to use them by setting the NVAPIUsageBehavior registry entry to 1."

    Two big points of note here:
*) The driver's new overclocking functionality is only used *by default* on Vista. Setting RivaTuner's NVAPIUsageBehavior registry entry to 1 will allow XP users to enjoy the new shader speed configurability.
    *) With the new driver interface, by default, the shader speed will not change AT ALL when you change the core speed. This is where the use of RivaTuner's new ShaderClockRatio registry value comes in (see below). It can be found under the power user tab, RivaTuner->Nvidia->Overclocking.


    Changing the shader clock speed
    --------------------------------

On to the mechanics of the new ShaderClockRatio setting in RivaTuner 2.04. Here's more text from Unwinder:

    "Guys, I’d like to share with you some more important G80 overclocking related specifics introduced in 163.67:

1) The driver’s clock programming routine is optimized, and it causes unwanted effects when you’re trying to change the shader domain clock only. Currently the driver uses just the ROP domain clock to see if clock generator programming has to be performed or not. For example, if your 8800GTX ROP clock is set to 612MHz and you need to change the shader domain clock only (directly or via specifying a custom shader/ROP clock ratio) without changing the current ROP clock, the driver will optimize clock frequency programming, seeing that the ROP clock is not changed, and it simply won’t change the clocks, even if the requested shader domain clock has been changed. The workaround is pretty simple: when you change the shader clock, always combine it with a ROP clock change (for example, if your 8800GTX ROP clock is set to 612MHz and you’ve changed the shader clock, simply reset the ROP clock to the default 576MHz, apply it, then return it to 612MHz again to get the new shader clock applied). I hope that this unwanted optimization will be removed in a future ForceWare; for now, please just keep it in mind while playing with shader clock programming using RT 2.04 and 163.67.
2) Currently the Vista driver puts some limitations on the ROP/shader domain clock ratio you’re allowed to set. Most likely they are related to the hardware clock generator architecture, and the hardware simply cannot work (or cannot work stably) when the domain clocks are too asynchronous. For example, on my 8800GTX the driver simply refuses to set the clocks with a shader/ROP ratio within the 1.0 – 2.0 range (the default ratio is 1350/576 = 2.34), but it accepts the clocks programmed with a ratio within the 2.3 – 2.5 range. Considering that the driver no longer changes domain clocks synchronically and all o/c tools (RT 2.03, ATITool, nTune, PowerStrip) currently change the ROP clock only, that results in a rather interesting effect: you won’t be able to adjust the ROP clock as high as before. Once it gets too far from (or too close to) the shader clock and the shader/ROP clock ratio is out of range, the driver refuses to set such a clock. Many of you have already noticed this effect, seeing that the driver simply stops increasing the ROP clock after a certain dead point with 163.67."

    and

    "In the latest build of 2.04 (2.04 test 7) I've added an ability of setting ShaderClockRatio to -1, which can be used to force RivaTuner to recalculate desired Shader/ROP ratio automatically by dividing default shader clock by default ROP clock.
    So if you set ShaderClockRatio = -1 and change ROP clock with RT, it will increase shader clock using you card's BIOS defined ratio (e.g. 1350/576=2.34 on GTX, 1188/513 = 2.32 on GTS etc). If you wish to go further, you may still override the ratio, for example increase shader clock by specifying greater ratio (e.g. ShaderClockRatio = 2.5)."


    Three important points here:
*) The driver currently imposes restrictions on how far the shader clock speed can be changed from what it otherwise would have been when linked to the core clock speed in old drivers (it is suspected that the restriction is owing to hardware limitations rather than a driver software design choice). This means you can't set an arbitrary shader speed which you know your card is capable of and necessarily expect it to work.
*) Setting ShaderClockRatio to the special value of -1 will give you very nearly the same core/shader speed linkage that you had under previous drivers (163.44 and older).
*) When you change the value of ShaderClockRatio, in order for it to take effect, you must also make a core speed change (see the sketch below). So, for example, you might reduce the core speed a little, apply, and then put it back to how it was and apply again.
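
That third point is easiest to see as a sequence. Here is a purely illustrative Python sketch of the "bounce the core clock" trick from Unwinder's workaround above; set_rop_clock and set_shader_clock are hypothetical stand-ins (RivaTuner exposes no such API, the real steps happen in its GUI):

Code:
# Hypothetical sketch: the driver skips clock programming unless the ROP
# (core) clock changes, so a shader-only change is made to stick by
# bouncing the ROP clock down to default and back.
def apply_shader_change(set_rop_clock, set_shader_clock,
                        current_rop, default_rop, new_shader):
    set_shader_clock(new_shader)  # requested, but not programmed yet
    set_rop_clock(default_rop)    # force a ROP change and apply...
    set_rop_clock(current_rop)    # ...then restore it; the shader
                                  # change is picked up along the way

# Example with print-based stand-ins:
apply_shader_change(lambda c: print("ROP ->", c),
                    lambda s: print("shader ->", s),
                    current_rop=612, default_rop=576, new_shader=1674)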



    Worked example
    ----------------

Surprise surprise, back to my EVGA 8800 GTS Superclocked! First off, if you've not already done so, I recommend setting up the RivaTuner monitor to show the core clock, shader clock and memory clock speeds so that you can immediately tell if your core/shader clock changes are having any effect. My setup is Vista with 163.67 drivers. With RivaTuner 2.03, when overclocking the core to 648, the shader would now stick at the bootup default speed of 1512 MHz (see the last paragraph of "Overview/Background" above). If I had blindly run 3DMark06 tests after installing the 163.67 driver, I would've assumed that the new drivers give worse performance, but the RivaTuner graphs show you that the shader is not running at the expected speed.

After installing RivaTuner 2.04, we are now able to set the ShaderClockRatio value to restore a higher shader clock speed. In my case, since I want a shader speed of 1674 when the core is 648, I use 1674/648 ≈ 2.58.
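
As a quick sketch of that calculation (the special value -1 is the automatic BIOS-ratio behavior described earlier; the GTX figure just reproduces Unwinder's example):

Code:
# ShaderClockRatio = desired shader clock / core (ROP) clock.
# Setting it to -1 instead tells RivaTuner to reuse the BIOS-defined ratio.
def shader_clock_ratio(target_shader_mhz, core_mhz):
    return round(target_shader_mhz / core_mhz, 2)

print(shader_clock_ratio(1674, 648))  # 2.58 -- the GTS example above
print(shader_clock_ratio(1350, 576))  # 2.34 -- GTX BIOS default ratio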


    =======================


    Table A
    --------

    Some cards have slightly different set freq vs resultant core/shader speeds so take the table as an illustration of how the shader clock changes with respect to the core clock rather than precise values.

    Code:
    Set core  | Resultant frequency
    frequency |   Core  Shader
    ---------------------------------
    509-524   |   513   1188
    525-526   |   513   1242
    527-547   |   540   1242
    548-553   |   540   1296
    554-571   |   567   1296
    572-584   |   576   1350
    585-594   |   594   1350
    595-603   |   594   1404
    604-616   |   612   1404
    617-617   |   621   1404
    618-634   |   621   1458
    635-641   |   648   1458
    642-661   |   648   1512
    662-664   |   675   1512
    665-679   |   675   1566
    680-687   |   684   1566
    688-692   |   684   1620
    693-711   |   702   1620
    712-724   |   720   1674
    725-734   |   729   1674
    735-742   |   729   1728
    743-757   |   756   1728
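
For convenience, here is Table A as a small Python lookup; this is a transcription sketch, and per the caveat above, treat the mapping as illustrative rather than exact for every card:

Code:
# Table A as data: (set-frequency low, high, resultant core, shader).
TABLE_A = [
    (509, 524, 513, 1188), (525, 526, 513, 1242), (527, 547, 540, 1242),
    (548, 553, 540, 1296), (554, 571, 567, 1296), (572, 584, 576, 1350),
    (585, 594, 594, 1350), (595, 603, 594, 1404), (604, 616, 612, 1404),
    (617, 617, 621, 1404), (618, 634, 621, 1458), (635, 641, 648, 1458),
    (642, 661, 648, 1512), (662, 664, 675, 1512), (665, 679, 675, 1566),
    (680, 687, 684, 1566), (688, 692, 684, 1620), (693, 711, 702, 1620),
    (712, 724, 720, 1674), (725, 734, 729, 1674), (735, 742, 729, 1728),
    (743, 757, 756, 1728),
]

def resultant_clocks(set_core_mhz):
    """Return the (core, shader) pair the pre-163.67 driver would program."""
    for lo, hi, core, shader in TABLE_A:
        if lo <= set_core_mhz <= hi:
            return core, shader
    raise ValueError("set frequency outside the table's range")

print(resultant_clocks(648))  # (648, 1512), as in the worked example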






    regards
    Last edited by mascaras; 10-30-2007 at 04:42 PM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  21. #396
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    i7 3610QM 1.2-3.2GHz

  22. #397
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
...but if you had to choose between a 640MB 8800GTS and the new 512MB GT... for the same price, for example... which one would you take????
    i7 3610QM 1.2-3.2GHz

  23. #398
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by adamsleath View Post
...but if you had to choose between a 640MB 8800GTS and the new 512MB GT... for the same price, for example... which one would you take????

If it's the same price, I would choose the NEW 8800GTS 640MB 320-bit 112SP



    regards
    Last edited by mascaras; 10-30-2007 at 04:50 PM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  24. #399
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
That's the new GTS, and it will be fairly pricey... but will it beat the GTX at a lower price point?

And I still think the 65nm card should be the focus.
    Last edited by adamsleath; 10-30-2007 at 04:51 PM.
    i7 3610QM 1.2-3.2GHz

  25. #400
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by adamsleath View Post
That's the new GTS, and it will be fairly pricey... but will it beat the GTX at a lower price point?

And I still think the 65nm card should be the focus.
Wait for the new 8800GTS 512MB (G92 @ 65nm, 128SP)


    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «
