Page 1 of 4
Results 1 to 25 of 76

Thread: Intel's Sandy Bridge architecture to feature up to 2 GPUs on 1 monolithic die

  1. #1
    Xtreme Enthusiast
    Join Date
    Aug 2007
    Location
    Orange County, Southern California
    Posts
    583

Intel's Sandy Bridge architecture to feature up to 2 GPUs on 1 monolithic die

    Sandy Bridge CPU to have up to 2 GPUs



    Monolithic on-die

We were quite shocked to learn that Intel might be the first company ever to launch two GPUs on the same die. Not only does it look like Intel might be the first to monolithically integrate graphics with the CPU, achieving "Fusion" before AMD, but it might also be the first "graphics" company ever to put two GPUs on the same die.

According to Intel's plans revealed to partners and industry insiders, Sandy Bridge, the brand-new 32nm architecture, will have one or two graphics cores on a monolithic die.

Intel will also let you connect discrete graphics via a PCIe x16 slot, and even SLI / Crossfire should be possible with two x8 PCIe slots. This is more likely to happen on desktop boards than in notebooks.

At this time we are not aware that Nvidia and ATI will launch dual-GPU solutions on a single chip, but if you look at a high-end card, for example Fermi, it has 512 shader cores, and if you cut it in half you get a mainstream card. With this in mind, ATI and Nvidia practically make high-end chips out of four low-end GPUs.
    Source
    EVGA X58 SLI Classified E759 Limited Edition
    Intel Core i7 Extreme 980X Gulftown six-core
    Thermalright TRUE Copper w/ 2x Noctua NF-P12s (push-pull)
    2x EVGA GeForce GTX 590 Classified [Quad-SLI]
    6GB Mushkin XP Series DDR3 1600MHz 7-8-7-20
    SilverStone Strider ST1500 1500W
    OCZ RevoDrive 3 240GB 1.0GB/s PCI-Express SSD
    Creative X-Fi Fatal1ty Professional / Logitech G51 5.1 Surround
    SilverStone Raven RV02
    Windows 7 Ultimate x64 RTM



  2. #2
    Xtreme Addict
    Join Date
    Dec 2007
    Location
    Hungary (EU)
    Posts
    1,373
    Dual GMA 4500.
    -

  3. #3
    Xtreme Member
    Join Date
    Sep 2009
    Location
    Czech Republic, 50°4'52.22"N, 14°23'30.45"E
    Posts
    474
Oh god, you have some faster wire. Gonna delete my thread...

  4. #4
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    8,829
    FUD? Why not just use one single bigger GPU core? Or is it going to be some sort of 2 CUDA cores, just not CUDA? The die's monolithic... Doesn't make much sense, does it?
    Donate to XS forums
    Quote Originally Posted by jayhall0315 View Post
    If you are really extreme, you never let informed facts or the scientific method hold you back from your journey to the wrong answer.

  5. #5
    Xtreme Member
    Join Date
    Jun 2005
    Posts
    160
    Quote Originally Posted by zalbard View Post
    FUD? Why not just use one single bigger GPU core? Or is it going to be some sort of 2 CUDA cores, just not CUDA? The die's monolithic... Doesn't make much sense, does it?
Yeah, the reason people have not done this before is that it was better to just merge the two into one bigger GPU; these two must each serve some different purpose.

  6. #6
    all outta gum
    Join Date
    Dec 2006
    Location
    Poland
    Posts
    3,390
    Yeah, right, but why would anyone want to put two GPUs on the same die?
    www.teampclab.pl
    MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12

    Test bench: empty

  7. #7
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
this makes no sense at all... especially since there has been no multi-GPU approach from Intel until now...

  8. #8
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by Oliverda View Post
    Dual GMA 4500.
    just what i always wanted!

  9. #9
    Xtreme Addict
    Join Date
    Jan 2009
    Posts
    1,432
i think there are probably 2 reasons for this:

    1. to better compete with llano, i do not think that anyone here can argue that intel is better at graphics than ati.

    2. these two gpus have some sort of special function.
    [MOBO] Asus CrossHair Formula 5 AM3+
    [GPU] ATI 6970 x2 Crossfire 2Gb
    [RAM] G.SKILL Ripjaws X Series 16GB (4 x 4GB) 240-Pin DDR3 1600
    [CPU] AMD FX-8120 @ 4.8 ghz
    [COOLER] XSPC Rasa 750 RS360 WaterCooling
    [OS] Windows 8 x64 Enterprise
    [HDD] OCZ Vertex 3 120GB SSD
    [AUDIO] Logitech S-220 17 Watts 2.1

  10. #10
    Xtreme CCIE
    Join Date
    Dec 2004
    Location
    Atlanta, GA
    Posts
    3,982
    I'm guessing dual GMA for better dual display performance?

    Definitely did not see this coming from Intel, but good for them if they can pull it off I suppose.
    Dual CCIE (Route\Switch and Security) at your disposal. Have a Cisco-related or other network question? My PM box is always open.

    Xtreme Network:
    - Cisco 3560X-24P PoE Switch
    - Cisco ASA 5505 Firewall
    - Cisco 4402 Wireless LAN Controller
    - Cisco 3502i Access Point

  11. #11
    Xtreme Member
    Join Date
    Mar 2005
    Posts
    447
    Maybe one dedicated for HD video decoding/h.264/blu-ray etc and the other for gaming/graphics? They've stressed modularity over the past few years with nehalem and westmere.

On the flip side, maybe since LRB will no longer be making it on-die anytime soon, they realized that their alternative graphics core (GMA whatever) wouldn't be enough, so they are putting in 2. That seems more logical to me.
    Iron Lung 3.0 | Intel Core i7 6800k @ 4ghz | 32gb G.SKILL RIPJAW V DDR4-3200 @16-16-16-36 | ASUS ROG STRIX X99 GAMING + ASUS ROG GeForce GTX 1070 STRIX GAMING | Samsung 960 Pro 512GB + Samsung 840 EVO + 4TB HDD | 55" Samsung KS8000 + 30" Dell u3011 via Displayport - @ 6400x2160

  12. #12
    Registered User
    Join Date
    Dec 2007
    Posts
    22
    Quote Originally Posted by Oliverda
    Dual GMA 4500.
Yeah... GMA X5200 was supposed to be a "revolutionary" IGP, but it was just a common IGP, no "revolutionary" performance. Intel does not know how to make high-performance GPUs.

(words of someone who has had GMA 845, 865, 900, 950, 3100, X3100, and is about to have X5200)

    CPU: Core i3 530ES @4.76GHz 1.48V HT by Hyper 212 Plus
    MB: Gigabyte GA-P55A-UD3R
    RAM: 4GB DDR3-1600 @1850 8-8-8-21 1T G.Skill Trident
    VGA: HIS HD 4850 IceQ4
    HD: 500GB HDD's Samsung
    PSU: Satellite SL-8600EPS

    Sorry my bad english

  13. #13
    Xtreme Addict
    Join Date
    Aug 2005
    Location
    Germany
    Posts
    2,251
At this time we are not aware that Nvidia and ATI will launch dual-GPU solutions on a single chip, but if you look at a high-end card, for example Fermi, it has 512 shader cores, and if you cut it in half you get a mainstream card. With this in mind, ATI and Nvidia practically make high-end chips out of four low-end GPUs.
    how... what, errr, he? with what in mind? i don't get this part :F
    1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile


    2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W

  14. #14
    Xtreme Cruncher
    Join Date
    Oct 2007
    Posts
    1,630
Dual GPU cores on one die onboard the CPU? Why? By the time they are done with this thing it's gonna look like an old Pentium II Slot 1 cartridge!

  15. #15
    Banned
    Join Date
    Jun 2008
    Posts
    763
    Well imo, doubling up the crap only gives you a bigger crap...

  16. #16
    Xtreme Member
    Join Date
    Aug 2008
    Posts
    135
    Quote Originally Posted by Tenknics View Post
    Maybe one dedicated for HD video decoding/h.264/blu-ray etc and the other for gaming/graphics? They've stressed modularity over the past few years with nehalem and westmere.
Video decoding is not done in the GPU cores, but in dedicated logic/DSPs, like UVD on ATI GPUs, PureVideo on Nvidia's, or the Broadcom chip in some netbooks. It is totally different logic, even inside the same silicon, in the same way as a memory controller is different from a CPU core.

I see no reason for a "dual core gpu". I call FUD on this too.

  17. #17
    Xtreme Mentor
    Join Date
    May 2008
    Posts
    2,550
    How often is Fudzilla ever right when it comes to this kind of stuff? My local weather man is more accurate while trying to forecast a snowstorm two weeks in advance. There are two types of links that I won't click on and one is toms and the other is fudzilla.

  18. #18
    Xtreme Member
    Join Date
    Jun 2008
    Location
    New Jersey
    Posts
    208
Redundancy because they have extra allocatable space on die? I think not.

Multiple monitor support? I think not.

Half-baked attempt to run them together with Lucid technology? Maybe >_<

One dumpy GPU and one dumpier GPU for energy conservation? Might work.

Fud being full of it? Most likely.

  19. #19
    Xtreme Mentor
    Join Date
    Apr 2005
    Posts
    2,548
why is there no [FUD] at the beginning of this thread title?

BTW even Intel knows that the idea of two GPUs on one monolithic die is pure nonsense

"At this time we are not aware that Nvidia and ATI will launch dual-GPU solutions on a single chip, but if you look at a high-end card, for example Fermi, it has 512 shader cores, and if you cut it in half you get a mainstream card. With this in mind, ATI and Nvidia practically make high-end chips out of four low-end GPUs."

and chocolate manufacturers make 300g bars out of lots of small squares

eh, Fudo should know better than this... but then again readership age/knowledge should always come first
    Last edited by Nedjo; 02-08-2010 at 03:56 PM.
    Adobe is working on Flash Player support for 64-bit platforms as part of our ongoing commitment to the cross-platform compatibility of Flash Player. We expect to provide native support for 64-bit platforms in an upcoming release of Flash Player following the release of Flash Player 10.1.

  20. #20
    Xtreme Enthusiast
    Join Date
    Jun 2007
    Location
    Finland
    Posts
    831
    Quote Originally Posted by zerazax View Post
    just what i always wanted!
    in Slifire?

    ::: Desktop's - Intel *** Intel 2
    2 x Xeon E5-2687W *** Intel i7 3930k
    EVGA SR-X *** Asus Rampage IV Extreme
    96Gb (12x8Gb) G.Skill Trident X DDR3-2400MHz 10-12-12-2N *** 32Gb (8x4Gb) G.Skill Trident X DDR3-2666 10-12-12-2N
    3 x Zotac GTX 680 4Gb + EK-FC680 GTX Acetal *** 3 x EVGA GeForce GTX780 + EK Titan XXL Edition waterblocks.
    OCZ RevoDrive 3 x4 960Gb *** 4 x Samsung 840 Pro 512Gb
    Avermedia LiveGamer HD capture card
    Caselabs TX10-D
    14 x 4 TB WD RE4 in RAID10+2Spare
    4 x Corsair AX1200

    ::: Basement DataCenter :::
    [*] Fibreoptic connection from operators core network
    [*] Dell PowerConnect 2848 Ethernet Switch [*] Network Security Devices by Cisco
    [*] Dell EqualLogic PS6500E 96Tb iSCSI SAN (40 2Tb Drives + 8 Spare Drives, Raid10+Spare Configuration, 40Tb fail safe storage)
    [*] Additional SAN machines with FusionIO ioDrive Octal's (4 total Octals).
    [*] 10 x Dual Xeon X5680, 12Gb DDR3, 2x100Gb Vertex 2 Pro Raid1 [*] 4 x Quad Xeon E7-4870, 96Gb DDR3, 2x100Gb Vertex 2 Pro Raid1

[*] Monster UPS unit in case of power grid failure, backed up by a diesel-powered generator.

  21. #21
    Registered User
    Join Date
    Dec 2007
    Posts
    22
    Quote Originally Posted by rintamarotta View Post
    in Slifire?
    In QPIFire

    CPU: Core i3 530ES @4.76GHz 1.48V HT by Hyper 212 Plus
    MB: Gigabyte GA-P55A-UD3R
    RAM: 4GB DDR3-1600 @1850 8-8-8-21 1T G.Skill Trident
    VGA: HIS HD 4850 IceQ4
    HD: 500GB HDD's Samsung
    PSU: Satellite SL-8600EPS

    Sorry my bad english

  22. #22
    Xtreme Addict
    Join Date
    Aug 2007
    Location
    Toon
    Posts
    1,555
    Quote Originally Posted by Larvitar View Post
    In QPIFire
Yeah, they're so rubbish we have to plumb two of them directly into the QPI to get anywhere near 5450 performance!
    Intel i7 920 C0 @ 3.67GHz
    ASUS 6T Deluxe
    Powercolor 7970 @ 1050/1475
    12GB GSkill Ripjaws
    Antec 850W TruePower Quattro
    50" Full HD PDP
    Red Cosmos 1000

  23. #23
    Xtreme Addict
    Join Date
    Apr 2006
    Posts
    2,463
    Quote Originally Posted by zalbard View Post
    FUD? Why not just use one single bigger GPU core? Or is it going to be some sort of 2 CUDA cores, just not CUDA? The die's monolithic... Doesn't make much sense, does it?
This! I totally agree! I mean, you can barely use Intel's GPUs for anything other than office-related work; wth would you need two GPUs for?! Unless they are "Larrabee" cores which can be used for OpenCL computation as well.

    Doesn't AMD follow that path with their "APU"? Use the GPU for some parts of the calculations?

    (Sorry if what I'm writing is a bit mixed up, had a few Mojitos and a little bit of that: http://img10.abload.de/img/img_3144fe2a.jpg )
    Notice any grammar or spelling mistakes? Feel free to correct me! Thanks

  24. #24
    Xtreme Enthusiast
    Join Date
    Apr 2006
    Posts
    939
If those two GPUs are Larrabee-based, then that's a shed load of x86. They'll make Nvidia nervous, as it'll sit in the area where they are trying to sell Fermi at 2k a pop. If that's the plan, Intel don't have to beat Fermi, they just have to not use 300 watts.

  25. #25
    Xtreme Member
    Join Date
    Jul 2009
    Location
    NY
    Posts
    224
If true, it's likely a marketing gimmick from Intel. Advertising dual-core GPUs after the success of dual-core CPUs could be a good strategy. I can already see an advert explaining what a GPU is, and how doubly awesome this one is.
    My Heatware
    Originally Posted by some guy on internet
    That's your problem right there. Just forget about how things look on paper as that's irrelevant.

