
Thread: G70 Scores 7800 in 3DMark05

  1. #1
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    466

    NEWS: G70 Scores 7800 in 3DMark05

    http://www.hardspell.com/news/showco...?news_id=13528

    "Previously, we've reported the news that G70 will use the 7x00 name. At the Computx, further details of the next gen G70 GPU from nVidia is revealed.

    Core frequence 500MHz
    110nm Technology
    24 Pipeline
    512MB DDR3 at 1400MHz

    G70 will have a GTX version (Basically the Ultra version of previous generation cards), G70's early sample received a 7800 core in 3DMark05. After release, it is likely that it will score even higher.

    G70 Completely support SLI and there will be a AGP version, using HSI brige chip."
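
    Taking the quoted numbers at face value, here is a quick sanity check of what they would imply. The 256-bit bus, one pixel per pipeline per clock, and 1400MHz being the effective (DDR) rate are my assumptions, not from the article:

    ```python
    # Back-of-the-envelope check of the rumored G70 specs above.
    # Assumptions (mine, not the article's): 256-bit memory bus,
    # one pixel per pipeline per clock, 1400 MHz as the effective DDR rate.
    core_mhz = 500
    pipelines = 24
    mem_effective_mhz = 1400
    bus_bits = 256

    fill_rate_gpix = core_mhz * 1e6 * pipelines / 1e9
    bandwidth_gbs = mem_effective_mhz * 1e6 * (bus_bits / 8) / 1e9

    print(f"Theoretical fill rate: {fill_rate_gpix:.1f} Gpixel/s")
    print(f"Memory bandwidth     : {bandwidth_gbs:.1f} GB/s")
    # -> 12.0 Gpixel/s and 44.8 GB/s with these assumptions.
    ```

    For comparison, a 6800 Ultra works out to roughly 6.4 Gpixel/s and 35 GB/s, so the rumored numbers would be a solid but not outlandish step up.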
    Last edited by koei; 05-21-2005 at 08:37 AM.
    Main: i7 2700k @ 5.0Ghz, ASUS Maximus V Formula Z77
    16GB Corsair Vengeance 1600, ASUS GeForce GTX680
    2x240GB SanDisk Extreme RAID0, Seasonic Platinum 1000, Corsair H100
    Server: i7 2700k, ASROCK Z68 Extreme7 Gen3
    16GB Patriot 1600, 120GB SanDisk Extreme
    5TB JBOD, Corsair 850TX, Corsair H100
    Media: Phenom II X4 940BE, Biostar TA790GX XE
    4GB Corsair XMS2 1066, VisionTek HD4850 512MB
    60GB OCZ Agility 3, SilverStone ST400

  2. #2
    The Blue Dolphin
    Join Date
    Nov 2004
    Location
    The Netherlands
    Posts
    2,816
    Not very good for now, but that's normal because the card isn't finished yet.

    I expect the R520 to be more powerful, but we'll see.

    *EDIT: The website states it has a core clock of 430MHz, but that's the only freakin' thing I understand because it's all Chinese or something.
    Last edited by alexio; 05-20-2005 at 08:16 AM.
    Blue Dolphin Reviews & Guides

    Blue Reviews:
    Gigabyte G-Power PRO CPU cooler
    Vantec Nexstar 3.5" external HDD enclosure
    Gigabyte Poseidon 310 case


    Blue Guides:
    Fixing a GFX BIOS checksum yourself


    98% of the internet population has a Myspace. If you're part of the 2% that isn't an emo bastard, copy and paste this into your sig.

  3. #3
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    466
    The picture in their original article states 430MHz for the core, but this article says 500MHz. Not sure which one to believe, so I just translated it as-is.

  4. #4
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    466
    Here is the picture from their original article.


  5. #5
    The Blue Dolphin
    Join Date
    Nov 2004
    Location
    The Netherlands
    Posts
    2,816
    Quote Originally Posted by Troman
    Wrong... according to analysts, the R520 will end up a tad slower than the G70.
    Well, according to me the R520 is going to be faster. There's no reason you can call someone's prediction "wrong" when no solid information is available. Both cards are in beta or even alpha state, so what I'm posting is just a guess.

    But yeah, I really believe ATI's R520 is going to be faster. My guess is that the R520 at least uses less power if the cards are equal in performance.

  6. #6
    Xtreme Recruit
    Join Date
    Mar 2005
    Posts
    78
    We will just have to see how fast each card is when it comes out.

  7. #7
    Xtreme Addict
    Join Date
    Jan 2005
    Location
    Bay Area, CA
    Posts
    1,331
    Quote Originally Posted by alexio
    But yeah, I really believe ATI's R520 is going to be faster. My guess is that the R520 at least uses less power if the cards are equal in performance.
    Does your guess have something to do with your avatar?

  8. #8
    The Blue Dolphin
    Join Date
    Nov 2004
    Location
    The Netherlands
    Posts
    2,816
    No, I have a 6800NU @ GT right now and a 9800 Pro lying around. The 9800 Pro is almost dead, so I can't use it for gaming.

    I've had experience with both, and everyone keeps saying the 6800 series is the best Nvidia has ever made, while I find it a plain piece of junk compared to the 9800 Pro. That's one reason I don't have much faith in Nvidia. The other reason is that I think ATI's vmem is going to be better (read: higher clocked), and I like the unified shader concept.

  9. #9
    Xtreme Enthusiast
    Join Date
    Oct 2004
    Posts
    684
    If the R520 is anything like the R500 used in the Xbox 360, then I don't hold out much hope for the G70. From the specs coming out of Beyond3D that Dave is confirming, the R500 has 48 pipelines that can be used for either pixel or vertex shading, and it can do 2xAA with no performance hit and 4xAA with at most a 5% performance hit, up to 1080i. It looks like the R500 is more powerful than most of us thought it would be.
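
    As a side note on why near-free AA is plausible: the R500's embedded framebuffer RAM (the 10MB cache discussed further down in this thread) keeps the multiplied sample traffic on-chip. A rough sketch of the storage involved, assuming 4 bytes of color and 4 bytes of Z/stencil per sample (my figures, not from the post):

    ```python
    # Storage needed for a multisampled framebuffer at various AA levels.
    # Assumption (mine): 4 bytes color + 4 bytes Z/stencil per sample.
    BYTES_PER_SAMPLE = 4 + 4

    def framebuffer_mb(width, height, aa_samples):
        """On-chip storage for the full multisampled framebuffer, in MB."""
        return width * height * aa_samples * BYTES_PER_SAMPLE / 2**20

    for label, w, h in [("720p", 1280, 720), ("1080i field", 1920, 540)]:
        for aa in (1, 2, 4):
            print(f"{label:11s} {aa}xAA: {framebuffer_mb(w, h, aa):5.1f} MB")
    # 720p at 4xAA needs ~28 MB, so a 10 MB buffer has to render in tiles,
    # which is presumably where the small residual performance hit comes from.
    ```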

  10. #10
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Quote Originally Posted by Troman
    Nvidia RSX: 550MHz core based on the G70 --> result: faster than 2x 6800 Ultra in SLI...

    What can you make of this? Obvious, I think.
    That's marketing.
    In some situations, RSX + Cell might be more than twice as fast as a PC with two 6800 Ultras in SLI...

  11. #11
    Xtreme Addict
    Join Date
    Apr 2005
    Posts
    1,087
    I highly doubt the Cell processor will be released for the desktop platform. But who knows, IBM might just make a profit out of it. I know they're planning to release it for their server systems.
    Last edited by clayton; 03-10-2011 at 05:52 AM.

  12. #12
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    466
    They wouldn't, because a desktop CPU is much more general-purpose. A single core, or even a dual core, is relatively easy to program for. But with 7 or 8 cores, the programming becomes much, much more time-consuming. Considering all the software that would need to be developed for desktop CPUs, it just isn't cost-effective to put such a powerful CPU in PCs at the moment.
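
    To illustrate the point: even a toy task shows the extra scaffolding that parallel code needs. This is a generic sketch in Python (nothing Cell-specific; the 8-worker split just mirrors the post's 7/8-core example):

    ```python
    # Toy illustration: the serial version is one line; the parallel version
    # needs explicit partitioning, worker management, and result merging.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(chunk):
        """Work done by one core: sum of squares over its slice of the data."""
        return sum(x * x for x in chunk)

    def parallel_sum_of_squares(data, workers=8):
        # The programmer, not the hardware, has to decide how to carve up
        # the problem: one chunk per worker here.
        size = (len(data) + workers - 1) // workers
        chunks = [data[i:i + size] for i in range(0, len(data), size)]
        with ProcessPoolExecutor(max_workers=workers) as pool:
            return sum(pool.map(partial_sum, chunks))

    if __name__ == "__main__":
        data = list(range(1_000_000))
        # Matches the one-line serial version.
        assert parallel_sum_of_squares(data) == sum(x * x for x in data)
    ```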

  13. #13
    Xtreme Enthusiast
    Join Date
    Apr 2005
    Location
    Windsor, Canada
    Posts
    858
    Quote Originally Posted by koei
    Here is the picture from their original article.


    I REALLY doubt that the G70 will have a 430MHz core clock. That's only 5MHz more than the BFG GeForce 6800 Ultra OC and only 30MHz more than the standard GeForce 6800 Ultra. I think it will be the 500MHz you were talking about.
    Quote Originally Posted by jimmyz View Post
    A DFI board is like a divorce: expensive, but well worth it.
    Quote Originally Posted by virtualrain View Post
    I dunno... I think a DFI board is more like marriage... demanding, time-consuming, and a PITA, but rewarding in its own twisted way.

  14. #14
    Xtreme Member
    Join Date
    Feb 2005
    Location
    Waterloo, Ontario
    Posts
    216
    I'm with koei on this one: programming for the PS3 is going to be brutal. Also, I think PCs will be faster than the Xbox 360 or PS3 by the time they come out, because this always happens: the consoles are ahead of PCs before release, then they fall behind because the PC market moves a lot faster.
    3500 clawhammer @ 230x11 1.7V
    Foxconn geforce 6150
    x800gto @ stock
    2x512 mismatched sticks in dual channel
    Black case of godliness

  15. #15
    Banned
    Join Date
    Jan 2005
    Location
    Intel's Labs, having my thermal diode modified.
    Posts
    345
    Nvidia will scrap the G70 at the last minute and release its secret weapon, codenamed G90.

    The specifications will be as follows:

    Single-die, dual-core package, 45nm @ 900MHz per core
    4MB associative off-chip cache @ 900MHz
    4-way HyperChannel™ instruction pipeline between cache and core @ 225MHz
    1024MB of UL RAM (Über Leet RAM) @ 600MHz (2.4GHz effective), 0.2ns
    Dedicated shader processor @ 1GHz with unified architecture
    Shader Model 5 support
    NSS shader support (Nvidia Standard Shader)
    ReaLife™ engine 1.0

    Speculation is wonderful, eh?

  16. #16
    Registered User
    Join Date
    Mar 2005
    Posts
    38
    Quote Originally Posted by alexio
    I've had experience with both, and everyone keeps saying the 6800 series is the best Nvidia has ever made, while I find it a plain piece of junk compared to the 9800 Pro. That's one reason I don't have much faith in Nvidia. The other reason is that I think ATI's vmem is going to be better (read: higher clocked), and I like the unified shader concept.
    Please explain in what respect a card such as the 6800 (at GT speeds) is worse than a 9800 Pro... Ah wait, that's right: for some bizarre reason nVidia decided to make their next-gen card slower than the generations before it! So the G70 should be about as good as a GeForce 2 GTS then, yeah?

    Also, your comment about the R520 using less power is probably true, considering the smaller process it will be made on.
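
    The first-order argument for that: dynamic power scales roughly with capacitance × voltage² × frequency, and a smaller process lowers both the switched capacitance and the usable supply voltage. A sketch with purely illustrative numbers (neither card's real voltages or clocks are known):

    ```python
    # First-order CMOS dynamic power model: P ~ C * V^2 * f.
    # All numbers below are illustrative assumptions, not actual chip specs.
    def dynamic_power(cap_rel, voltage, freq_mhz):
        """Relative dynamic power from relative capacitance, volts, and MHz."""
        return cap_rel * voltage**2 * freq_mhz

    p_big_process = dynamic_power(cap_rel=1.00, voltage=1.4, freq_mhz=430)
    # A full-node shrink roughly scales switched capacitance down and allows
    # a lower supply voltage; treat 0.8x C and 1.3 V as a plausible guess.
    p_small_process = dynamic_power(cap_rel=0.80, voltage=1.3, freq_mhz=430)

    print(f"Shrunk part at the same clock: "
          f"{p_small_process / p_big_process:.0%} of the power")
    ```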

  17. #17
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    and almost 1/3rd the transistor count of the G70 (with the 10MB cache included in that count, as far as I understand it).

    All along the watchtower the watchmen watch the eternal return.

  18. #18
    Registered User
    Join Date
    Mar 2005
    Posts
    38
    Quote Originally Posted by STEvil
    and almost 1/3rd the transistor count of the G70 (with the 10mb cache included in that count as far as I understand it).
    I call BS on that. If it's got so few transistors, how exactly does it have so many pipelines? What is it using instead of transistors?!

    Also, I'm pretty sure the 10MB cache is not on the same die as the GPU core; there are some pics on HardOCP which give you an idea of where the RAM actually is.

    10MB of cache would also use a lot of transistors.

  19. #19
    Xtreme Member
    Join Date
    Nov 2004
    Location
    Japan
    Posts
    243
    Quote Originally Posted by Northwood
    Nvidia will scrap the G70 at the last minute and release its secret weapon, codenamed G90.

    The specifications will be as follows:

    Single-die, dual-core package, 45nm @ 900MHz per core
    4MB associative off-chip cache @ 900MHz
    4-way HyperChannel™ instruction pipeline between cache and core @ 225MHz
    1024MB of UL RAM (Über Leet RAM) @ 600MHz (2.4GHz effective), 0.2ns
    Dedicated shader processor @ 1GHz with unified architecture
    Shader Model 5 support
    NSS shader support (Nvidia Standard Shader)
    ReaLife™ engine 1.0

    Speculation is wonderful, eh?
    Where did you get this info?


  20. #20
    c[_]
    Join Date
    Nov 2002
    Location
    Alberta, Canada
    Posts
    18,728
    Quote Originally Posted by MetalStorm
    I call BS on that. If it's got so few transistors, how exactly does it have so many pipelines? What is it using instead of transistors?!
    Transistors are only part of the battle.

    Also, I'm pretty sure the 10MB cache is not on the same die as the GPU core; there are some pics on HardOCP which give you an idea of where the RAM actually is.
    I agree, it is not (currently) a physical part of the die, but as I understand it, it is included in the total transistor count used in the press material at E3 comparing the Xbox 360 and the PS3.

    Maybe the count was for the GPU only, but until someone tells us, we can only speculate... Either way, it's doing well with approximately one-third the transistors.

    10MB of cache would also use a lot of transistors.
    Yup. There should be numbers around somewhere on how many transistors a P4 with 2MB of L2 has; you could extrapolate from that and get close... not including the logic surrounding the cache that performs the FSAA functions, though.
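
    Running with that extrapolation idea: textbook cell sizes alone give a ballpark without needing the exact P4 figure. Six transistors per SRAM bit versus one per eDRAM bit, with tag and periphery overhead ignored (my simplification):

    ```python
    # Rough transistor-count estimate for 10 MB of cache, following the
    # post's extrapolation suggestion. Cell sizes are textbook figures;
    # tag/periphery overhead is ignored (my simplification).
    BITS_PER_MB = 1024 * 1024 * 8

    def cache_transistors(megabytes, transistors_per_bit):
        return megabytes * BITS_PER_MB * transistors_per_bit

    sram_6t  = cache_transistors(10, 6)  # classic 6T SRAM cell
    edram_1t = cache_transistors(10, 1)  # 1T1C embedded DRAM cell

    print(f"10 MB as 6T SRAM : ~{sram_6t / 1e6:.0f}M transistors")
    print(f"10 MB as 1T eDRAM: ~{edram_1t / 1e6:.0f}M transistors")
    ```

    So if the 10MB is eDRAM, it adds on the order of 100M transistors; as straight SRAM it would be closer to 500M.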


  21. #21
    Registered User
    Join Date
    Mar 2005
    Location
    Drayton Valley,AB
    Posts
    76
    The R520 won't even be close to the R500, I'm afraid. M$ owns that chip and its tech, for one thing; ATI would probably have to license it back from M$ just to use all or part of it.

    I can't remember where I read it, but they were saying that the R500 is so different from the R520 that they won't be able to use its tech, in its current form, in the R520. But it's a lot of speculation.

  22. #22
    Xtreme Member
    Join Date
    Jan 2004
    Posts
    243
    Sorry to burst your bubble, but ATI owns the R500 core. They designed it to spec for MS; that doesn't mean ATI will just hand over the architecture because it's in the Xbox 360. Did Nvidia give up the IP of the GF3 from the Xbox? Nope. MS might have some access to the basics, but there is NO way in hell ATI would ever give up their IP. MS approached ATI for the design, not the other way around.

  23. #23
    Registered User
    Join Date
    Jun 2004
    Posts
    6
    The G70 sounds like a die-shrunk 6800 (12 pixel x 5 vertex) in a dual-core setup to me. Nvidia can keep their next-gen cards; I'm going with ATI's offerings. I'm still waiting for a good all-around driver from Nvidia for my 6800GT, not the one that gets me 5800 in 3DMark05 (http://service.futuremark.com/compare?3dm05=668959), up more than 25% from a year ago, but my games still have glitches and low FPS. My point is that without good drivers, it doesn't matter how good the specs are. I think they should call it the "FX 7800".
    Last edited by Roadburn; 05-22-2005 at 02:09 AM.

  24. #24
    Registered User
    Join Date
    Mar 2005
    Posts
    38
    Quote Originally Posted by Roadburn
    The G70 sounds like a die-shrunk 6800 (12 pixel x 5 vertex) in a dual-core setup to me. Nvidia can keep their next-gen cards; I'm going with ATI's offerings. I'm still waiting for a good all-around driver from Nvidia for my 6800GT, not the one that gets me 5800 in 3DMark05 (http://service.futuremark.com/compare?3dm05=668959), up more than 25% from a year ago, but my games still have glitches and low FPS. My point is that without good drivers, it doesn't matter how good the specs are. I think they should call it the "FX 7800".
    That's BS.

    For a start, the standard 6800 (12-pipe) is the exact same core as the 6800U (16-pipe), so it can't just be a standard die shrink. I'm sure a large amount of the technology in the G70 is the same, mind you, but that's only because making a whole new GPU (or CPU, or anything else) takes a very long time.

    I can't actually comment on what nVidia has done, as I don't have the information to base it on; however, they were saying that it's a somewhat different design from the NV40, hence why they cancelled the NV50 and worked on the G70. That is about as much as we can speculate about the architecture at the moment.

    Also, FX was a failed brand name, which is why they went with GeForce 6800, not GeForce FX 6800.

  25. #25
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    31,147
    Easy there...

    And you don't know anything about the architecture, so how can you call his opinion/speculation BS? :P
