
Thread: Confirmation GT200 is a dual-core GPU?

  1. #1
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955

    Confirmation GT200 is a dual-core GPU?

    Could this mean anything? A little slip up by VR-Zone?

    AMD is coming up with their much-awaited Radeon HD 4870 X2 next-generation graphics card, while NVIDIA has responded with its GeForce GTX 260 and GTX 280 dual core GPUs.
    http://www.vr-zone.com/articles/H1_&...X2/5766-6.html

    It's in the Trinity review, last page, final paragraph.
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  2. #2
    Xtreme Addict
    Join Date
    Jan 2008
    Posts
    1,176
    what about the pictures of the GT200?

    the heatsink appears to be that of a single-GPU card, and then there's a naked PCB shot leaked

  3. #3
    Registered User
    Join Date
    Oct 2005
    Posts
    386
    Quote Originally Posted by Jowy Atreides View Post
    what about the pictures of the GT200?

    the heatsink appears to be that of a single-GPU card, and then there's a naked PCB shot leaked

    Meaning two GPU cores next to each other, like a dual-core CPU. Not like the 3870X2 with two separate GPUs.

    To infinity, and beyond! (Azza was taken)

  4. #4
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by Baron View Post
    Meaning two GPU cores next to each other, like a dual-core CPU. Not like the 3870X2 with two separate GPUs.
    Will not happen.

  5. #5
    Xtreme Enthusiast
    Join Date
    Jul 2006
    Location
    the Netherlands
    Posts
    558
    Yeah, seems a bit unlikely to me too, since GPUs already consist of multiple shader processors.
    Making a dual-core GPU die doesn't make any sense to me...
    Rig 1:
    Intel E4300 @ 3Ghz - 2gb OCZ PC2-8500 - Asus P5N-e SLI - Club3d 9600gt @ 750/1950/1100Mhz - Vista 64

    Rig 2:
    Intel celeron L420 @ 2.6Ghz - 2gb OCZ PC2-6400 - Asus P5B - XFX 8800GS 384mb - XP 32

    Laptop
    Acer Aspire 3610, Pentium M725 OC @ 2.23Ghz - 2gb PC2-3200 - crappy Intel I915 gfx

  6. #6
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    It's amazing this dual-core crap is still on the run. Who seriously makes these things up, and who still believes them? If you say often enough that the moon is a big cheese, people will believe it. At best the 280 GTX would be a GX2-style card.
    Crunching for Comrades and the Common good of the People.

  7. #7
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Shintai View Post
    It's amazing this dual-core crap is still on the run. Who seriously makes these things up, and who still believes them? If you say often enough that the moon is a big cheese, people will believe it. At best the 280 GTX would be a GX2-style card.
    which would be an utter failure... fsk dual cards, I don't need SLI/CF...

  8. #8
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by Calmatory View Post
    Will not happen.
    Why Not?
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was

  9. #9
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    Quote Originally Posted by nn_step View Post
    Why Not?
    heat per square inch

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  10. #10
    Wanna look under my kilt?
    Join Date
    Jun 2005
    Location
    Glasgow-ish U.K.
    Posts
    4,396
    "Thermal dissipation density" lol sorry, same thing.

    A "dual-core" GPU wouldn't work because the die size would be BIG and yields would suffer.

    Two dies on one substrate would work (like Intel quads), or two discrete cores (like the 3870X2), or obviously there's the dual-PCB option.

    Then the thermal envelope becomes the limiting factor... it would only work with lower-end cores, and then there's the question: are two lower-end dies going to perform better than one high-end die? The major case for that would be the yields of small vs. big cores.
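    The small-vs-big-die yield argument above can be sketched with a simple Poisson defect model. The die areas and defect density below are illustrative assumptions, not actual GT200 figures:

    ```python
    import math

    def yield_rate(die_area_mm2, defects_per_mm2):
        """Poisson yield model: probability that a die has zero defects."""
        return math.exp(-die_area_mm2 * defects_per_mm2)

    D0 = 0.002  # assumed defect density per mm^2 (made up for illustration)

    big = yield_rate(576, D0)    # one large monolithic die
    small = yield_rate(288, D0)  # one of two half-size dies

    print(f"big-die yield:   {big:.1%}")
    print(f"small-die yield: {small:.1%}")
    ```

    Even though a pair of small dies must both be good, each defective small die wastes half as much silicon as a defective big die, which is the yield case for splitting.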
    Last edited by K404; 05-20-2008 at 11:16 AM.
    Quote Originally Posted by T_M View Post
    Not sure i totally follow anything you said, but regardless of that you helped me come up with a very good idea....
    Quote Originally Posted by soundood View Post
    you sigged that?

    why?
    ______

    Sometimes, it's not your time. Sometimes, you have to make it your time. Sometimes, it can ONLY be your time.

  11. #11
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Quote Originally Posted by biohead View Post
    heat per square inch
    the same argument was made about Intel's Smithfield (two Prescott cores on one die)

  12. #12
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by K404 View Post
    "thermal dissipation density" lol sorry- same thing.

    a "dual-core" GPU wouldnt work because the die size would be BIG and yields would suffer.

    2 dies on one sub would work (like Intel quads,) or 2 discreet cores (like the 3870X2) or obviously theres the dual-PCB option

    Then the thermal envelope becomes the limiting factor..... would only work on lower-end cores, then theres the question- are 2 lower-end dies gonna perform better than one high-end. The major case for that would be yields of small Vs big cores
    Plus 2x the memory, plus a bigger total die size, plus CF/SLI scaling and other issues, and so on.

    MCM designs make sense for CPUs because they have a low-bandwidth interconnect requirement, completely different memory management, and because scaling is vastly different, especially the scaling of a single-die CPU. A 10-issue-wide Core 2 wouldn't be any faster than the 4-issue-wide one we have, but a 10-wide GPU would be 2.5x faster than a 4-wide one. So why waste TDP and wafer area on extra interconnects and so on? If an issue port on a CPU is bad, the entire CPU is bad. If it's on a GPU, you just disable that part and the rest still works. That's the same reason we have 112- and 128-shader G92s, etc. You could call them 7- and 8-core GPUs.
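    The width-scaling contrast in the post above can be written as a toy model. The ILP ceiling of 4 is an assumed number standing in for the limited instruction-level parallelism of a serial instruction stream:

    ```python
    def cpu_speedup(issue_width, ilp_limit=4.0):
        # A serial instruction stream only exposes so much instruction-level
        # parallelism; issue ports beyond that limit sit idle.
        return min(issue_width, ilp_limit)

    def gpu_speedup(width):
        # Shader workloads are embarrassingly parallel, so throughput scales
        # roughly linearly with the number of lanes.
        return float(width)

    print(cpu_speedup(10) / cpu_speedup(4))  # 1.0 -> a 10-wide CPU gains nothing
    print(gpu_speedup(10) / gpu_speedup(4))  # 2.5 -> the "2.5x" claimed above
    ```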
    Last edited by Shintai; 05-20-2008 at 11:33 AM.

  13. #13
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    It is dual-core, but not like you guys seem to think...

    GT-200 = one core, NVIO = one core. Put them together on the same card? That's right, dual core.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  14. #14
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by DilTech View Post
    It is dual-core, but not like you guys seem to think...

    GT-200 = one core, NVIO = one core. Put them together on the same card? That's right, dual core.
    lol, that was smart
    Are we there yet?

  15. #15
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by DilTech View Post
    It is dual-core, but not like you guys seem to think...

    GT-200 = one core, NVIO = one core. Put them together on the same card? That's right, dual core.
    Priceless

  16. #16
    Banned
    Join Date
    Jan 2008
    Location
    Florida
    Posts
    428
    Nvidia seems behind in the race as far as dual core is concerned, so I doubt they will have a true dual-core GPU out before ATI, but stranger things have happened...

  17. #17
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by bluehaze View Post
    Nvidia seems behind in the race as far as dual core is concerned, so I doubt they will have a true dual-core GPU out before ATI, but stranger things have happened...
    Neither ATI nor NV has a "dual core" GPU solution... what makes you think one is behind the other when both of them have nothing in hand?


    If you're referring to the 3870X2, that's no "dual core" GPU solution, just CrossFire on one board.
    Last edited by Hornet331; 05-20-2008 at 05:13 PM.

  18. #18
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Guys... DUAL CORE GPUS AREN'T LIKE DUAL CORE CPUS!

    Every shader on a GPU IS a core!

  19. #19
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by Luka_Aveiro View Post
    lol, that was smart
    ??? He's right. In addition, VR-Zone has non-native-English news writers who can easily miswrite "dual GPU" as "dual core".

    Perkam

  20. #20
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    What I want to know is whether die stacking would be a possibility for GPUs. Let's say the ROPs + video processing are the base die and the SPs are stacked on top of the ROPs, but I don't know how much power the ROPs consume, so maybe it would be a bad idea.
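    The power worry can be put in back-of-envelope terms: stacking halves the footprint the same watts must exit through. The wattage and areas below are illustrative assumptions, not real GPU figures:

    ```python
    def power_density(watts, area_mm2):
        """Watts dissipated per mm^2 of die footprint."""
        return watts / area_mm2

    planar = power_density(200, 576)   # one big flat die (assumed 200 W, 576 mm^2)
    stacked = power_density(200, 288)  # same power through half the footprint

    print(f"planar:  {planar:.2f} W/mm^2")
    print(f"stacked: {stacked:.2f} W/mm^2")  # doubles, which is the cooling problem
    ```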

  21. #21
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,030
    Quote Originally Posted by perkam View Post
    ??? He's right. In addition, VR-Zone has non-native-English news writers who can easily miswrite "dual GPU" as "dual core".

    Perkam
    I didn't say or mean he's wrong; I agreed with the statement.
