
Thread: G80 De-Mystified; GeForce 8800GTX/GT

  1. #151
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by Dublin_Gunner
    I was jokin m8, while giving an example of crappy shadows!
    Being a DX9 game, obviously DX10 wont fix it!
    Sometimes it's hard to tell if you are joking.
    But I suppose if a company like Bungie updated Halo, then DICE can update BF2 too (HL1 Source also comes to mind). Get it all Vista "compatible" and showcase those dark, silky smooth shadows.. oh yeah baby.

    Likewise with 3DMark2005.. especially apparent on the deck when the captain is looking out for the beast in game test 3.

    But, of course, you'll only see these benefits if the hardware supports it... and I doubt folks would buy G80/R600 to relive the "classics".

    FROM THE ARCHIVES:
    I got so excited to see 4-5 trees at a time in the Unreal 2 preview.. I thought, wow, this is next-generation technology. Yet all those teaser clips and screenshots of HL2 and Doom 3 looked absolutely incredible (back then). At least half the people discredited them as pre-rendered or photoshopped. Then, I think it was March/April 2004, OUT OF NOWHERE, Far Cry. My hippie pot-smoking friend, eyes ablaze, turned down a reefer to barge into my room and get me to download this "demo".. pff.. another bunch of screenshots or some stupid clip of "supposed gameplay".

    My heart skipped a beat. OMG, I couldn't take my eyes off the screen. I spent hours literally crawling through the vegetation. Exploring the island. Taking long, luxurious swims. Probably took more in-game screenshots that day than in all the years before combined! For such a long time, in all sorts of games, you "imagined" you were in a jungle (e.g. GoldenEye on N64), or in a forest, or just walking on gravel or through grass... now you no longer had to imagine.. it was there. Probably the 3rd religious awakening, after seeing 3D for the first time (Quake), and of course the enchanting music and that memorable Warthog in Halo.

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  2. #152
    Xtreme News Addict
    Join Date
    May 2005
    Location
    Winnipeg, Manitoba, Canada
    Posts
    2,065
    Quote Originally Posted by ***Deimos***
    Sometimes it's hard to tell if you are joking.
    Well, stop being so serious all the time, Deimos

  3. #153
    Xtreme Addict
    Join Date
    Sep 2006
    Location
    Surat, India.
    Posts
    1,309
    How much power will this tower really eat up? They say 300W, but I seriously doubt it. If so, they would fry like hell. Can't say about the R600, since those will be very hot. Will my OCZ 700W be fine powering an 8800GTX?


    Also, I was wondering: since this card is supposed to be very long, how long will the R600 be? (ATI say theirs will be the biggest cards ever.) OMG.
    Sound: Asus Essense ST | Wharfedale Diamond 9.1 | Norge 2060 Stereo amp | Wharfedale SW150 sub (coming soon)
    Camera Gear: Canon 6D | Canon 500D | Canon 17-40L | Canon 24-105L | Canon 50mm f1.4 | Canon 85mm f1.8 | Rokinon 14mm f2.8 | Sigma 10-20EX HSM | Benro A3580F + Vanguard SBH250 | Bag full of filters and stuff

  4. #154
    Xtreme Mentor
    Join Date
    May 2005
    Location
    Westlake Village, West Hills
    Posts
    3,046
    I don't like it when they release these school-bus-sized cards. They are just bleh. I remember the 6800 Ultra, that was a huge card, then the 7900GTX came out and that was pretty big. Pretty soon we are going to need a case just for the GPU. It will be linked with a floppy cable, haha.
    PC Lab Qmicra V2 Case SFFi7 950 4.4GHz 200 x 22 1.36 volts
    Cooled by Swiftech GTZ - CPX-Pro - MCR420+MCR320+MCR220 | Completely Silent loads at 62c
    GTX 470 EVGA SuperClocked Plain stock
    12 Gigs OCZ Reaper DDR3 1600MHz 8-8-8-24
    ASUS Rampage Gene II |Four OCZ Vertex 2 in RAID-0(60Gig x 4) | WD 2000Gig Storage


    Theater ::: Panasonic G20 50" Plasma | Onkyo SC5508 Processor | Emotiva XPA-5 and XPA-2 | CSi A6 Center| 2 x Polk RTi A9 Front Towers| 2 x Klipsch RW-12d
    Lian-LI HTPC | Panasonic Blu Ray 655k| APC AV J10BLK Conditioner |

  5. #155
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Pretty soon you will buy a video card and plug the motherboard into it!!

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  6. #156
    Xtreme Member
    Join Date
    Jan 2006
    Location
    Gotham City, Sweden
    Posts
    367
    Quote Originally Posted by ANP !!!
    How much power will this tower really eat up? They say 300W, but I seriously doubt it. If so, they would fry like hell. Can't say about the R600, since those will be very hot. Will my OCZ 700W be fine powering an 8800GTX?


    Also, I was wondering: since this card is supposed to be very long, how long will the R600 be? (ATI say theirs will be the biggest cards ever.) OMG.

    Your PSU should handle one at least. Quad SLI might not work, though
    AMD Phenom II 940BE | DFI LP DK790B-M2RS | Echo audio Layla 3G | 8GB Corsair C5 PC6400 | Sapphire HD3870 with HR-03+ | Genelec 8030A | Samsung 244T | Antec P182 gun metal | Thermal right Ultra120 | UPS: APC BACK-UPS RS 1500VA | Windows Vista Ultimate 64 | DAW: Ableton/Sonar | WAN: 100/10Mbit/s | OS on: WD Velocioraptor Storage: Rocket Raid 2300 PCI-E + 4*400GB Samsung T133 @Raid5. Firewall: Tyan Tomcat 945GM | Core Duo T2600 | 2*512MB ram | Nexus PM PSM-5000 | Picu PSU.


    "People who enjoy waving flags
    don't deserve to have one".

  7. #157
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by Poodle
    Your PSU should handle one at least. Quad SLI might not work, though
    I doubt they would try quad again.. that whole project was doomed from the start. Direct3D games see little benefit over regular SLI, since Direct3D typically allows only triple buffering, which limits how many frames four GPUs can work on in parallel. OpenGL has no such limitation on buffers, but then again there aren't many OpenGL games either. Quad SLI is just way too impractical and inefficient.

    However, even G80 SLI will surely be a big burden. That's probably where they came up with the 250-300W figures. But also remember you need at least 100-150W for the rest of the system, plus headroom for transient load and stability margin. All of a sudden a 500W+ PSU requirement doesn't sound crazy anymore.

    And one other important thing.. heat. And not just the challenge of cooling G80 itself (dual-slot heatsink or aftermarket water cooling). G80 SLI will surely stress other components too. Some special applications may generate huge PCIE bandwidth load, so the chipset runs hotter than ever. Likewise, more CPU-limited scenarios mean more stress on the CPU. If the PCIE slot is relied on for power, there's significant strain there too. And of course, with nearly 500W total being used by the system, even an efficient 80% 500W power supply will be putting out roughly 100-125W of heat all by itself.
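
    As a rough back-of-the-envelope sketch of that math (the GPU, "rest of system" and 80% efficiency figures are just the assumptions from above, not measurements):

    Code:
    # Hypothetical numbers only: a rough power/heat budget for a G80 SLI box.
    gpu_pair_w = 300.0        # assumed draw for two G80s at full load
    rest_of_system_w = 150.0  # assumed CPU, chipset, drives, fans, etc.
    efficiency = 0.80         # assumed PSU efficiency at this load

    dc_load_w = gpu_pair_w + rest_of_system_w   # ~450 W delivered to the system
    wall_draw_w = dc_load_w / efficiency        # ~563 W pulled from the wall
    psu_heat_w = wall_draw_w - dc_load_w        # ~113 W wasted as heat in the PSU

    print(f"DC load {dc_load_w:.0f} W, wall draw {wall_draw_w:.0f} W, "
          f"PSU heat {psu_heat_w:.0f} W")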

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  8. #158
    Xtreme Member
    Join Date
    Apr 2005
    Posts
    108
    i wonder if there will be a GX2 version of the G80, aka a more efficient SLI solution? who knows? it would be cool tho.

    feeling better about my $230 7900gt(x) LOL

  9. #159
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Ah, so T&L is more of a weaker, less efficient form of what STMicro used in the Kyro series, though limited to the "frustum". That makes more sense now.

    Having mentioned STMicro, has anyone else been keeping tabs on the things they've been doing? I poke over there once in a blue moon and saw a few rather interesting things..

    EDIT - Looking forward to the panty pic, though there are bound to be more "entry level" consumer cards as well.. so maybe not.

    All along the watchtower the watchmen watch the eternal return.

  10. #160
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by STEvil
    Ah, so T&L is more of a weaker, less efficient form of what STMicro used in the Kyro series, though limited to the "frustum". That makes more sense now.

    Having mentioned STMicro, has anyone else been keeping tabs on the things they've been doing? I poke over there once in a blue moon and saw a few rather interesting things..

    EDIT - Looking forward to the panty pic, though there are bound to be more "entry level" consumer cards as well.. so maybe not.
    Perhaps you misunderstood me.
    There IS NO SUBSTITUTE for T&L (transform and lighting). It's an integral part of the graphics pipeline. Doesn't matter if you're using pixel shaders or vertex shaders.. those are add-ons. You're always doing T&L.

    The STMicro Kyro chips had no hardware T&L acceleration. It was done in software, i.e. the CPU did the calculations. Games which don't have a compatibility fallback.. games that require hardware pixel shaders or T&L.. will not work at all on the Kyro.
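
    To make the T&L part concrete, here is a tiny per-vertex sketch of what that stage does, whether it runs in fixed-function hardware, a vertex shader, or (as on the Kyro) on the CPU. Purely illustrative, with made-up names and a single directional light, not anyone's actual driver code:

    Code:
    # Minimal software "T&L": transform a vertex to clip space, then apply
    # one directional Lambert light. Real fixed-function T&L also handles
    # clipping, multiple lights, fog, etc.
    import numpy as np

    def transform_and_light(position, normal, mvp, light_dir, diffuse):
        clip_pos = mvp @ np.append(position, 1.0)      # the "T": transform
        n = normal / np.linalg.norm(normal)
        intensity = max(0.0, float(n @ (-light_dir)))  # the "L": lighting
        return clip_pos, diffuse * intensity

    # Example: identity transform, light shining straight down -Z
    clip, color = transform_and_light(
        position=np.array([0.0, 0.0, -1.0]),
        normal=np.array([0.0, 0.0, 1.0]),
        mvp=np.eye(4),
        light_dir=np.array([0.0, 0.0, -1.0]),
        diffuse=np.array([1.0, 0.5, 0.2]),
    )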

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  11. #161
    X.I.P
    Join Date
    Mar 2005
    Posts
    1,964
    the die is huge!!!!
    it's using 1.1ns GDDR3
    it has something like HT built-in

  12. #162
    Admin
    Join Date
    Feb 2005
    Location
    Ann Arbor, MI
    Posts
    12,338
    Is it at least small enough for a MCW60 to make contact with the whole thing?

  13. #163
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,693
    Quote Originally Posted by ***Deimos***
    Sometimes it's hard to tell if you are joking.
    But I suppose if a company like Bungie updated Halo, then DICE can update BF2 too (HL1 Source also comes to mind). Get it all Vista "compatible" and showcase those dark, silky smooth shadows.. oh yeah baby.
    You're kidding, right?
    Bungie didn't update Halo 2; they're just gonna put some code in it so it will only run on Vista, no other changes are done to it. It won't use DX10 and it won't look any fancier than on the old Xbox. So, in short, it's a very easy job for a developer.

    And DICE/EA will have BF2142 out when Vista comes, so I highly doubt they'll waste any energy on fixing or upgrading anything about BF2 by then.
    Yeah, I guess they are the kind of company that will slap a Vista-compatible sticker on a value version of BF2 just to sell to the suckers out there, but I doubt they'll fix anything once BF2142 is out.
    Time flies like an arrow. Fruit flies like a banana.
    Groucho Marx



    i know my grammar sux so stop hitting me

  14. #164
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Sometimes it's hard to tell if you are joking.
    But I suppose if a company like Bungie updated Halo, then DICE can update BF2 too (HL1 Source also comes to mind). Get it all Vista "compatible" and showcase those dark, silky smooth shadows.. oh yeah baby.

    EDIT: it's very well possible... I didn't say they WILL do it.

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  15. #165
    I am Xtreme
    Join Date
    Oct 2005
    Posts
    4,682
    I had an odd thought regarding the power connectors.

    Okay, so instead of running two separate PCI-E cables to two cards, what if it used one cable, with a "jumper" to the second card?

    Running two cards off of one PCI-E cable?
    Now Dave will see FERMI wherever I go
    Quote Originally Posted by jbartlett323 View Post
    So please return to the "Darkside of the Moon" and check your "Pulse" while you wait for the "Animals" that will be "Obscured By Clouds". And watch me wave as I say "Wish You Were Here" in "A Momentary Lapse of Reason"

  16. #166
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    243
    Once there's sufficient current on that rail, I don't see how that would become a problem.

    I still fail to see why they would need two.
    E6400 L628 @3.4 1.4V
    Ultra 120 Extreme
    2GB team Xtreem @ DDR2-850 4 4 3 10
    Abit Quad GT
    X1800XT flashed to XTPE 700/1600
    OCZ GameXstream 600W

  17. #167
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    City of Lights, The Netherlands
    Posts
    2,381
    Quote Originally Posted by Dublin_Gunner
    Once there's sufficient current on that rail, I don't see how that would become a problem.

    I still fail to see why they would need two.
    The only reason for two connectors is that the spec doesn't allow you to draw more than 75W from a single connector, but I'm sure that you could draw 150+W from a single connector without any problems.

  18. #168
    Xtreme Mentor
    Join Date
    Sep 2005
    Location
    Netherlands
    Posts
    2,693
    Quote Originally Posted by ***Deimos***
    Sometimes it's hard to tell if you are joking.
    But I suppose if a company like Bungie updated Halo, then DICE can update BF2 too (HL1 Source also comes to mind). Get it all Vista "compatible" and showcase those dark, silky smooth shadows.. oh yeah baby.

    EDIT: it's very well possible... I didn't say they WILL do it.

    Me asking if you're kidding is because you say: Bungie updating Halo.
    Where did they update the upcoming Halo 2 for Vista?
    The screenshots seen so far look exactly like what it looked like a few years ago on the Xbox. It hasn't been updated in any way.

    edit:
    If your "updating" refers to DirectX or whatever: I don't know what DX version Halo 2 is/was, but Vista will simply emulate it.
    Time flies like an arrow. Fruit flies like a banana.
    Groucho Marx



    i know my grammar sux so stop hitting me

  19. #169
    X.I.P
    Join Date
    Mar 2005
    Posts
    1,964
    Just touched a dead 8800GTX and 8800GT.

    The 8800GTX is crazy long, about 1.5 inches longer than the 7950GX2.
    I am 100% sure it is using 1.1ns GDDR3!
    Very nice card, and the fan is better than the 6800 Ultra's!!

    But it's dead!!!!! lol

  20. #170
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Northern VA
    Posts
    1,556
    GDDR3? Interesting

  21. #171
    Xtreme Addict
    Join Date
    May 2006
    Location
    Herbert's House in Family Guy
    Posts
    2,381
    I'm thinking of getting X1900 or X1950 CrossFire when G80 comes out; they better make it so I can do it for less than $600, heh
    E6600 @ 3.6
    IN9 32x MAX
    EVGA 8800Ultra
    750W

  22. #172
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by Starscream
    Me asking if you're kidding is because you say: Bungie updating Halo.
    Where did they update the upcoming Halo 2 for Vista?
    The screenshots seen so far look exactly like what it looked like a few years ago on the Xbox. It hasn't been updated in any way.

    edit:
    If your "updating" refers to DirectX or whatever: I don't know what DX version Halo 2 is/was, but Vista will simply emulate it.
    Bungie updated Halo for PC. I didn't say any more than that. I don't know why you insist that I prove something about Halo 2 and Vista.. how did you come up with that?

    Just saying that sometimes, rarely, games DO get updated. Not always. Not always for Vista. Maybe not all Halo games. Please don't misunderstand.

    Quote Originally Posted by Helmore
    The only reason for two connectors is that the spec doesn't allow you to draw more than 75W from a single connector, but I'm sure that you could draw 150+W from a single connector without any problems.
    Quite frankly, I'm not up to date on PCIE.. their website requires registration to view the whitepapers and data sheets... however, I recall about the same values: 75W from the PCIE slot, and 75W from the PCIE power connector. Other websites also state 150W from PCIE total.

    As for the 6-pin connector, I think it only supplies two 12V lines... so for 150W, that would be 12.5A total.. 6.25A per cable pair. That is a LOT. Perhaps not impossible, but it would certainly affect stability. (Note: there is a 2-Molex-to-1-PCIE Y adapter.)
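
    A quick sketch of that arithmetic (the pin count and the 150W figure are my assumptions, not from a datasheet):

    Code:
    # Rough current math for one 6-pin PCI-E power connector.
    power_w = 150.0     # hypothetical draw through the connector
    voltage_v = 12.0
    wire_pairs = 2      # assumed 12V/ground pairs actually carrying current

    total_a = power_w / voltage_v       # 12.5 A total
    per_pair_a = total_a / wire_pairs   # 6.25 A per pair

    print(f"{total_a:.2f} A total, {per_pair_a:.2f} A per 12V pair")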

    From what I recall, the X800XT PE draws up to about 3A at 12V on the power connector. An overclocked 6800 Ultra draws 4.5A and even 5A, but it has two power connectors.

    I think there are some good reasons/explanations to use two power connectors on G80:
    1. Multiple voltage regulators and buck converters, like on the 6800 Ultra: one connector per voltage regulator.
    2. Power supplies now feature multiple 12V rails.. this way you can draw power from both and not overburden any single one.
    3. More even power distribution, less signal noise, and higher stability.
    4. Scare the competition!?

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  23. #173
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    Quote Originally Posted by guess2098
    using up to 250W!!! full load for GTX (single)
    the water cooling for G80 is going to cost around $900 OMG!! (not sure, but it is what they are talking about today)

    for the GT it is about 180W !!??

    short supply for G80
    using GDDR3???

    Quote Originally Posted by ***Deimos***
    I think you're completely wrong... I'm so sure that I'm willing to bet that if I lose I'll post a photo of me in panties.

    No pink panties for you
    * It turns out that in many games the GTX uses about the same power as the X1900XTX/X1950XTX, and even in the most power-demanding situations hits 142-149W (depending on the website review).
    * G80 doesn't REQUIRE water cooling, let alone $900 worth, and uses a quite conventional heatsink.
    * The GT in many cases uses less power than the XTX, far, far from 180W.
    * Short supply, and the reason why GDDR3... only time will tell.

    FYI: I find it very interesting which of my guesstimates were right/wrong.
    80nm = NO. ~7900GT-OC clocks at 1.25-1.3V to compensate for complexity = YES. 125-150W = YES. Water cooler not required = YES. Presumed still using vector units and hence 1.35GHz ridiculous = DEAD WRONG... even though I was expecting the unimaginable, nVidia shocked even me... they really went all out!
    Last edited by ***Deimos***; 11-11-2006 at 05:42 PM.

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

  24. #174
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    243
    Quote Originally Posted by ***Deimos***

    I think there are some good reasons/explanations to use two power connectors on G80:
    1. Multiple voltage regulators and buck converters, like on the 6800 Ultra: one connector per voltage regulator.
    2. Power supplies now feature multiple 12V rails.. this way you can draw power from both and not overburden any single one.
    3. More even power distribution, less signal noise, and higher stability.
    4. Scare the competition!?
    Actually, the reason is quite simple.

    The PCI-E spec allows for 75W per x16 slot, and 75W per power connector.
    That's 150W max with one power connector, but the 8800GTX needs closer to 180W or so I believe, hence the need for the second PCI-E power connector, for a total power budget of 225W.
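
    Sketching that budget (the ~180W board power is the rumored figure from above, not a measured one):

    Code:
    # How many 6-pin connectors a card needs under the PCI-E power budget:
    # 75 W from the x16 slot plus 75 W per 6-pin connector.
    import math

    SLOT_W = 75.0
    CONNECTOR_W = 75.0

    def connectors_needed(board_power_w):
        extra = max(0.0, board_power_w - SLOT_W)
        return math.ceil(extra / CONNECTOR_W)

    board_power_w = 180.0                 # rumored 8800GTX draw
    n = connectors_needed(board_power_w)
    print(f"{board_power_w:.0f} W -> {n} connector(s), "
          f"budget {SLOT_W + n * CONNECTOR_W:.0f} W")   # -> 2 connectors, 225 W
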
    E6400 L628 @3.4 1.4V
    Ultra 120 Extreme
    2GB team Xtreem @ DDR2-850 4 4 3 10
    Abit Quad GT
    X1800XT flashed to XTPE 700/1600
    OCZ GameXstream 600W

  25. #175
    Xtreme Addict
    Join Date
    Jan 2004
    Posts
    1,313
    some sites are reporting:

    8800GT = 3-phase power delivery
    8800GTX = 4-phase power delivery

    24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
    1 GB OCZ Gold (='.'=) 240 2-2-2-5
    Giga-byte NF3 (")_(") K8NSC-939
    XFX 6800 16/6 NV5 @420/936, 1.33V

