Page 2 of 4 FirstFirst 1234 LastLast
Results 26 to 50 of 100

Thread: 8800GT Spec is out! 65NM CONFIRMED !!

  1. #26
    Registered User
    Join Date
    Sep 2007
    Posts
    66
    Quote Originally Posted by knightwolf654 View Post
now all we need is some info on those G100s or whatever they're calling them,
Yup... the high end is still the only thing I'm interested in. I wish someone would leak some info

  2. #27
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
    Quote Originally Posted by K404 View Post
Thanks BenchZowner - do you know what cards will be PCI-E 2.0 and/or when they'll be out?

On another note... still no one agrees on the specs of this card. There isn't even agreement on which card we're talking about.
    Controversy in details and stuff can be funny and at the same time irritating

Making all the new cards PCI Express 2.0 would be a good idea, and that's what it looks like it will be from now on.
With PCI Express 2.0 being fully backward compatible with PCI Express 1.0, you can't go wrong here

I'm currently having a chat with a VGA manufacturer's personnel, looking for new info on cards to be released, and their specs.
I've put my best bait on my data fish-hooks and I'm waiting for a big fish to have a go at 'em

The 8800GT looks like a very good deal for the price asked (if the given MSRP is true).

Oh wait! I felt some vibrations from my data fishing rod, hold on
    Coding 24/7... Limited forums/PMs time.

    -Justice isn't blind, Justice is ashamed.

    Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P. ), Juan J. Guerrero

  3. #28
    Xtreme Addict
    Join Date
    Jan 2005
    Posts
    1,970
    Quote Originally Posted by BenchZowner View Post
Controversy in details and stuff can be funny and at the same time irritating

Oh wait! I felt some vibrations from my data fishing rod, hold on

  4. #29
    One-Eyed Killing Machine
    Join Date
    Sep 2006
    Location
    Inside a pot
    Posts
    6,340
I don't remember what's been posted here and in the other 40 threads about the same thing (not funny), but here are some things I've heard from a contact at a VGA manufacturer:

- nVIDIA GeForce 8800GT: release date 29 Oct 2007; 512MB, 256-bit, 600MHz core clock
- ATi HD2950Pro & HD2950XT will be ready & launched in mid-November

That's the "best" this contact could give me at the moment... gotta check another contact later on, though, for some real juice.

  5. #30
    Xtreme Enthusiast
    Join Date
    Nov 2005
    Location
    Sweden, Örebro
    Posts
    818
    Quote Originally Posted by BenchZowner View Post
I don't remember what's been posted here and in the other 40 threads about the same thing (not funny), but here are some things I've heard from a contact at a VGA manufacturer:

- nVIDIA GeForce 8800GT: release date 29 Oct 2007; 512MB, 256-bit, 600MHz core clock
- ATi HD2950Pro & HD2950XT will be ready & launched in mid-November

That's the "best" this contact could give me at the moment... gotta check another contact later on, though, for some real juice.
    I've heard the same

    Also heard 96 SP again from another source.

    //Andreas

  6. #31
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Grand Forks, ND (Yah sure, you betcha)
    Posts
    1,266
    OCW has been saying 112, and a 96 part might be in the wings depending on sales/Rv670.

I am in the faction who (now) believes 96 was the original target for this part, but RV670 and the 640MB GTS changed the game. It's extremely possible this is a G80 shrink with a 256-bit bus like RV670, and we'll see a fully decked-out or GX2 version come Q1 '08.
    That is all.

    Peace and love.

  7. #32
    Xtreme Enthusiast
    Join Date
    Nov 2005
    Location
    Sweden, Örebro
    Posts
    818
    Quote Originally Posted by turtle View Post
    OCW has been saying 112, and a 96 part might be in the wings depending on sales/Rv670.

I am in the faction who (now) believes 96 was the original target for this part, but RV670 and the 640MB GTS changed the game. It's extremely possible this is a G80 shrink with a 256-bit bus like RV670, and we'll see a fully decked-out or GX2 version come Q1 '08.
    The last I heard from them they said 96 too.

    //Andreas

  8. #33
    Xtreme Enthusiast
    Join Date
    Jun 2007
    Posts
    546
Going solely by Nvidia's marketing scheme, shouldn't the GT theoretically be slower than the GTS? Bumping the GT to 112 SPs would make it faster than the GTS (assuming the GTS doesn't get a bump of its own).

  9. #34
    Xtreme Member
    Join Date
    Jun 2007
    Posts
    324
    http://www.xtremesystems.org/forums/...d.php?t=161754

here is something about the 8800GT (and GTS)

  10. #35
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Title edited for this:

    Quote Originally Posted by Dailytech
    The one thing that didn't change on G92 is the process node. NVIDIA's foundry partner, TSMC, forwarded a second memo to DailyTech confirming G92 is in mass production at the company's Fab 12 with samples available now on 65nm process node. NVIDIA's GeForce 8500 and GeForce 8600 are manufactured on TSMC's 80nm node; GeForce 8800 GT will be the company's first 65nm graphics processor.
    Blazingo

    Perkam

  11. #36
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
Ahh, I feel relieved, although I always felt it just had to be 65nm; how disappointing it would have been if not. lol, 2½ weeks remaining, eh? It feels like 2 months at this rate with all the 2950 and 8800GT talk. I wonder if the 2950 Pro and 8800GT aren't the most anticipated midrange cards around.

Now the only remaining question is the number of SPs.
    Last edited by RPGWiZaRD; 10-10-2007 at 02:53 AM.
    Intel? Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  12. #37
    Xtreme Member
    Join Date
    Apr 2007
    Location
    Miami Beach, FL
    Posts
    111
    Nothing is confirmed until there is a press release from Nvidia or a partner. Until then I say BS.
    Last edited by DavidP; 10-10-2007 at 05:23 AM.
    | EVGA X58 | i7 920 D0 @ 4Ghz | 6GB Corsair DOMINATOR 1600 | Kingston 256GB SSDNow V+ Series | Data Drives Caviar® Black™ 640 GB 32 MB Cache, 7200 RPM X2 in Raid 0 | GTX 275 SC SLI | SB X-Fi Fatal1ty | Plextor PX-716SA | NEC DVDRW | Sunbeam Rheobus | PCP&C 1200W Turbo-Cool

    Case: MM U2-UFO
    Case Fans: Yate Loon 12SM-12

    WC Stuff: Swiftech GTZ | PA120.3 | DDC-2 w/ Petra's Top | MCRES-MICRO

    Loop: Swiftech GTZ -> MCRES-MICRO -> DDC-2 w/ Petra's Top -> PA120.3 -> CPU

  13. #38
    Diablo 3! Who's Excited?
    Join Date
    May 2005
    Location
    Boulder, Colorado
    Posts
    9,412
    Well if it's 65nm maybe it'll clock like a beast. Loving these new midrange cards, this is what we should have had 8 months ago.

  14. #39
    Xtreme Enthusiast
    Join Date
    Nov 2005
    Location
    Sweden, Örebro
    Posts
    818
    Quote Originally Posted by DavidP View Post
    Nothing is confirmed until there is a press release from Nvidia or a partner. Until then I say BS.
Doesn't that kill the fun of trying to figure it out?
I mean, it's like a huge puzzle where you're always missing at least a few pieces of the entire picture, but now and again you get a new piece, and with its help you get a better understanding of the whole picture, though there is still no way to be entirely sure
And sometimes they decide to draw a new picture without finishing the first one, and you have to start all over again. The puzzling and the guessing is what makes it all fun.

    //Andreas

  15. #40
    Xtreme Member
    Join Date
    Jun 2004
    Posts
    436
Speculation is not fun; puzzles, however, are. I'd rather have the facts whenever things are announced. That way I wouldn't have that worrying feeling or any doubt in my mind; I would know exactly what I'm getting and exactly when I'm getting it, and I should expect no less. 100% facts also weed out all the crazy fanboys and the people with completely false information. I don't know about you, but I can live happily without any of that garbage.
    Home PC: Intel i7 4770K @ 4.6ghz l Asus Maximus VI Hero l Corsair Dominator Plantinum 2400mhz (4x4GB) l Asus GTX 690 l Samsung 840 Pro 256gb l 2 x WD Black 1T storage drive l WD MyBook 500gb External l Samsung SH-S203N DVD l Creative X-Fi Titanium HD l Corsair AX1200 PSU l Planar SA2311W23 3D LCD Monitor l Corsair 800D Case l Windows 7 Ultimate 64 bit l Sennheiser HD-590

    Water Cooling Setup: Swiftech 320 Radiator (3 X Gentle Typhoons 1450rpm 3 x Gentle Typhoons 1850 rpm) l Swiftech Pump w/XSPC Res Top l Heatkiller 3.0 CPU Block l Heatkiller GPU-X GTX 690 "Hole Edition" Nickel l Heatkiller Geforce GTX 690 GPU Backplate l Koolance 140mm Radiator l Danger Den 1/2ID UV Green tubing l EK EKoolant UV Green Liquid


    -Impossible is not a word

  16. #41
    Xtreme Enthusiast
    Join Date
    Nov 2005
    Location
    Sweden, Örebro
    Posts
    818
    Quote Originally Posted by WeaponX View Post
Speculation is not fun; puzzles, however, are. I'd rather have the facts whenever things are announced. That way I wouldn't have that worrying feeling or any doubt in my mind; I would know exactly what I'm getting and exactly when I'm getting it, and I should expect no less. 100% facts also weed out all the crazy fanboys and the people with completely false information. I don't know about you, but I can live happily without any of that garbage.
But the facts are revealed when things are announced
Everything up until then you either ignore or try to make the most of; you choose whether or not to read the rumor stories.
That's what's so great about freedom of mind: you can ignore stories like this if you want and never get worried or excited about things before they are either confirmed or demolished

    You do have a great point on the fanboy issue however.

    //Andreas

  17. #42
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
    Quote Originally Posted by BenchZowner View Post
    With the PCI Express 2.0 being fully compatible with PCI Express 1.0 you can't go wrong here
Is it absolutely 100% working both ways? If a PCI-E 2.0 GPU is plugged into a PCI-E 1.0 board, isn't there theoretically 1/3 of the possible power missing?

Also, on a PCI-E 2.0 mobo, which I think allows 150W to be drawn through the PCI-E slot, won't that be a challenge for a lot of PSUs, especially once we start overclocking? The 12V rails often aren't really tuned for that distribution, are they... especially on a new SLI board? Will reliability depend strongly on what's powered by each PSU rail? e.g. if the 24-pin ATX and molex are on the same rail... uh-oh?

Getting specific: if I plug a 200W stock PCI-E 2.0 GPU that has 1 additional 6-pin power plug into my PCI-E 1.0 mobo, won't it work in LP mode, if it works at all?

Sorry for asking the same thing lots of different ways, but I want it to be crystal clear for everyone... and me
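The arithmetic behind these power questions is simple enough to sketch. Here's a minimal Python tally using the commonly cited PCI-E CEM figures: 75W from the x16 slot (which, as far as the published spec goes, did not change between 1.x and 2.0; the extra 2.0-era headroom comes from the new 8-pin connector), plus 75W per 6-pin and 150W per 8-pin connector. The function name and layout are just for illustration:

```python
# Rough PCI-E power-budget arithmetic for the scenario above.
SLOT_W = 75        # x16 slot limit, PCI-E 1.x and 2.0 alike
SIX_PIN_W = 75     # per 6-pin PEG connector
EIGHT_PIN_W = 150  # per 8-pin PEG connector (2.0-era addition)

def board_budget(six_pin=0, eight_pin=0):
    """Total power a card may draw from the slot plus its aux connectors."""
    return SLOT_W + six_pin * SIX_PIN_W + eight_pin * EIGHT_PIN_W

# The hypothetical 200W card with one 6-pin plug:
print(board_budget(six_pin=1))  # prints 150
# 150W falls short of 200W, so such a card would need a second 6-pin
# (or an 8-pin) connector regardless of which slot revision it sits in.
```

By this arithmetic, the 1/3-missing worry only applies if the slot itself supplied more power under 2.0; if the slot stays at 75W, a card's external connectors cover the gap on 1.0 and 2.0 boards alike.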
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  18. #43
    Xtreme Mentor
    Join Date
    May 2007
    Posts
    2,792
Why would GPUs need more power headroom at a lower process node with lower voltages (i.e. PCIe 2.0)? It should be exactly the reverse: all mainstream and performance cards should work fine with the basic PCIe 1.0 power available, and the extra PCIe 2.0 power headroom should be kept in reserve for higher-power, rocket-priced enthusiast cards, as well as dual-GPU designs. That's what it was originally implemented and intended for.

  19. #44
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by K404 View Post
Is it absolutely 100% working both ways? If a PCI-E 2.0 GPU is plugged into a PCI-E 1.0 board, isn't there theoretically 1/3 of the possible power missing?

Also, on a PCI-E 2.0 mobo, which I think allows 150W to be drawn through the PCI-E slot, won't that be a challenge for a lot of PSUs, especially once we start overclocking? The 12V rails often aren't really tuned for that distribution, are they... especially on a new SLI board? Will reliability depend strongly on what's powered by each PSU rail? e.g. if the 24-pin ATX and molex are on the same rail... uh-oh?

Getting specific: if I plug a 200W stock PCI-E 2.0 GPU that has 1 additional 6-pin power plug into my PCI-E 1.0 mobo, won't it work in LP mode, if it works at all?

Sorry for asking the same thing lots of different ways, but I want it to be crystal clear for everyone... and me
    The PSU is not really a problem.

As for PCIe 2.0 vs 1.0: it will be quite some years before there are 2.0-only GPUs. I can't even imagine it within the next 3 years. Until then, cards will have all the 6/8-pin connectors they need to work in 1.0 boards as well.

Also, nVidia at least prefers to use non-slot power, and ATI has done so quite a bit in the past too.
    Crunching for Comrades and the Common good of the People.

  20. #45
    Registered User
    Join Date
    Nov 2006
    Posts
    37
Good question there, dude. I was wondering along those same lines...

  21. #46
    Xtreme Member
    Join Date
    Apr 2006
    Location
    San Gabriel, CA
    Posts
    451
    http://www.theinquirer.net/gb/inquir...ittle-g80-65nm

The Inq says G92 originally has 128 SPs but will have 16 disabled for the 8800GT.

Yeah, it's the Inq, so it's just more wood for the rumor fire.

Part of me says G92 is to G80 what G71 was to G70: a die shrink that replaces the current G80 cards. Logically it makes more sense, too, since G92 is cheaper to make.

A G92-based GTS would be just like the 7800GT (20 pixel shaders) next to the 7900GT (24 pixel shaders + die shrink).

    just my thoughts.
    i5 2500k @ 4.6GHz - Corsair A70 | Biostar TP67B+ | G.Skill RipJaws DDR3-1600 2x2GB | MSI HD7950 TF3 | X-Fi Titanium | WD 750GB Black | CM 690 II - Corsair TX850 | 2xDell 2407WFP A04/A03 | Win 7 Pro x64

    http://www.heatware.com/eval.php?id=48222

  22. #47
    Xtreme Mentor
    Join Date
    May 2006
    Location
    Croatia
    Posts
    2,542
A PCIE 2.0 card will draw all the power it needs through the PCIE slot when plugged into a PCIE 2.0-capable board.

When plugged into a PCIE 1.0 board, the card will operate just as cards do today, using the external power connectors on the card itself.

It's logical - just think!
There will be virtually NO 2.0-capable boards sold by the time these cards appear.
Did you really think they would be so stupid as to screw up their sales with the lack of a user base on the new 2.0 standard? NOT POSSIBLE
    Quote Originally Posted by LexDiamonds View Post
    Anti-Virus software is for n00bs.

  23. #48
    Xtremely Hot Sauce
    Join Date
    Sep 2007
    Location
    New York
    Posts
    3,586
    It seems to me that this is how it'll all be distributed...

    8800GT - G92 with 64 SP, 256-bit, PCI-E 2, and $220
    8800GTS 320 - G92 with 96 SP, 320-bit, PCI-E 2, and $280
    8800GTS 640 - G92 with 112 SP, 320-bit, PCI-E 2, and $370

    I'd really like to hear more about G98, however. If it's the 8600 Ultra, it'll be a dream finally come true.

    I also wonder about SLI performance - just how well will these newer cards scale?

    My toys:
    Asus Sabertooth X58 | Core i7-950 (D0) | CM Hyper 212+ | G.Skill Sniper LV 12GB DDR3-1600 CL9 | GeForce GTX 670-2048MB | OCZ Agility 4 512GB, WD Raptor 150GB x 3 (RAID0), WD Black 1TB x 2 (RAID0) | XFX 650W CAH9 | Lian-Li PC-9F | Win 7 Pro x86-64
    Gigabyte EX58-UD3R | Core i7-920 (D0) | Stock HSF | G.Skill Sniper LV 4GB DDR3-1600 CL9 | Radeon HD 2600 Pro 512MB | WD Caviar 80GB IDE, 4TB x 2 (RAID5) | Corsair TX750 | XClio 188AF | Win 7 Pro x86-64
    Dell Dimension 8400 | Pentium 4 530 HT (E0) | Stock HSF | 1.5GB DDR2-400 CL3 | GeForce 8800 GT 256MB | WD Caviar 160GB SATA | Stock PSU | (Broken) Stock Case | Win Vista HP x86
    Little Dot DAC_I | Little Dot MK IV | Beyerdynamic DT-880 Premium (600 Ω) | TEAC AG-H300 MkIII | Polk Audio Monitor 5 Series 2's

  24. #49
    Xtreme Mentor
    Join Date
    May 2007
    Posts
    2,792
    AFAIK, the 8800GT MSRP was $250 not $220.

  25. #50
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
And Nvidia is releasing a new 8800GTS with a 1200MHz shader domain and 112 SPs to replace the current 640MB GTS.

