Page 1 of 2
Results 1 to 25 of 47

Thread: nVidia DX10.1 cards specs and pictures

  1. #1
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326

    nVidia DX10.1 cards specs and pictures

    GeForce G 210

    - 16 processing cores
    - 64-bit memory interface with 512MB GDDR2
    - DirectX 10.1 support
    - 589MHz core clock
    - 1402MHz shader clock
    - 500MHz (1000MHz DDR) memory clock

    GeForce GT 220


    - 48 processing cores
    - 128-bit memory interface with 1GB GDDR3
    - DirectX 10.1 support
    - 615MHz core clock
    - 1335MHz shader clock
    - 790MHz (1580MHz DDR) memory clock

    Source: TechConnect Magazine
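A quick sanity check on the listed specs: peak memory bandwidth is just the bus width (in bytes) times the effective memory clock. A minimal sketch in Python, using the figures from the lists above:

```python
def mem_bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    """Theoretical peak bandwidth in GB/s: bytes per transfer x transfers per second."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# GeForce G 210: 64-bit bus, 1000 MHz effective (500 MHz DDR)
g210 = mem_bandwidth_gbs(64, 1000)    # 8.0 GB/s

# GeForce GT 220: 128-bit bus, 1580 MHz effective (790 MHz DDR)
gt220 = mem_bandwidth_gbs(128, 1580)  # ~25.3 GB/s

print(f"G 210:  {g210:.1f} GB/s")
print(f"GT 220: {gt220:.1f} GB/s")
```

By that math the GT 220 has roughly three times the memory bandwidth of the G 210, which lines up with the 48-vs-16 core gap.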



    G210 is a little meh (G98 shrink to 40nm?) but the GT220 certainly looks interesting for laptops.
    I figure the GT220 could fit in 12 and 13" laptops, and that's a lot of power for such small laptops (it's still better than the crappy 9600M GT we see in most "gaming" laptops today).
    Last edited by ToTTenTranz; 07-08-2009 at 06:24 AM.

  2. #2
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    8,829
    Guess TSMC is still having issues since there are no bigger 40nm chips being made yet...
    God, I really believe they slowed down video card progress big time.
    Regarding these cards... They are OK I guess. Just a bit sad the ancient G80 is still miles ahead of them.

  3. #3
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by ToTTenTranz View Post
    G210 is a little meh (G96 shrink to 40nm?)
    Yeah that's really curious. I'm surprised the die isn't too small for a 64-bit bus. I saw die size estimates somewhere but I can't recall where.

  4. #4
    Xtreme Addict
    Join Date
    May 2007
    Location
    Europe/Slovenia/Ljubljana
    Posts
    1,540
    Sorry, but what's the point of supporting DX10.1 if the cards are total junk anyway and won't be able to run anything that uses DX10.1?
    It's like sticking nitrous on an old FIAT 500. Sure, you have nitrous, but you're still moving like a snail when you use it.
    Intel Core i7 920 4 GHz | 18 GB DDR3 1600 MHz | ASUS Rampage II Gene | GIGABYTE HD7950 3GB WindForce 3X | WD Caviar Black 2TB | Creative Sound Blaster Z | Altec Lansing MX5021 | Corsair HX750 | Lian Li PC-V354
    Super silent cooling powered by (((Noiseblocker)))

  5. #5
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    For the same reason your 8400GS supports DX10

  6. #6
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by trinibwoy View Post
    Yeah that's really curious. I'm surprised the die isn't too small for a 64-bit bus. I saw die size estimates somewhere but I can't recall where.
    You mean it should be pad-limited?



    Quote Originally Posted by RejZoR View Post
    Sorry, but what's the point of supporting DX10.1 if the cards are total junk anyway and won't be able to run anything that uses DX10.1?
    It's like sticking nitrous on an old FIAT 500. Sure, you have nitrous, but you're still moving like a snail when you use it.
    The GT220 should be a pretty capable graphics card for around 40€ (this should be its price or it will be wasted by ATI's HD4650).
    Last edited by ToTTenTranz; 07-08-2009 at 06:29 AM.

  7. #7
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by ToTTenTranz View Post
    You mean it should be pad-limited?
    Yeah.

    And it's interesting that they're OEM only. I guess Nvidia is happy to keep pushing 9500GT's in retail. Also lol@1GB DDR3 on the GT220.

    http://www.nvidia.com/object/product...gt_220_us.html
    http://www.nvidia.com/object/product...e_g210_us.html

  8. #8
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,745
    I'm a bit sad to see the 210 with only 16 shaders when the 220 has 48. Seems like a rather large difference, and 16 seems incredibly small. 32 would have been nice.

    Else, it's always nice to see low-profile cards
    Crunching for Comrades and the Common good of the People.

  9. #9
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,834
    hmm, the reference cards have hdmi....so does it do sound like the ati cards now?

    g210 would be pretty nice for laptops....how long till we start seeing them in thin laptops?
    Last edited by grimREEFER; 07-08-2009 at 06:50 AM.
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis

  10. #10
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,745
    Quote Originally Posted by grimREEFER View Post
    hmm, the reference cards have hdmi....so does it do sound like the ati cards now?
    Sound is easily carried digitally from the audio codec in the chipset. You don't need onboard sound on a graphics card. You can even do it all in SW.
    Crunching for Comrades and the Common good of the People.

  11. #11
    Xtreme Member
    Join Date
    Dec 2007
    Location
    CR:IA
    Posts
    384
    Quote Originally Posted by Shintai View Post
    I'm a bit sad to see the 210 with only 16 shaders when the 220 has 48. Seems like a rather large difference, and 16 seems incredibly small. 32 would have been nice.

    Else, it's always nice to see low-profile cards
    ion has 16 shaders (iirc)
    PC-A04 | Z68MA-ED55 | 2500k | 2200+ XPG | 7970 | 180g 520 | 2x1t Black | X3 1000w

  12. #12
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by Shintai View Post
    Sound is easily carried digitally from the audio codec in the chipset. You don't need onboard sound on a graphics card. You can even do it all in SW.
    Sound carried over S/PDIF is easy to route into the HDMI port.
    That means 16-bit lossy 5.1 or 24-bit lossless stereo.

    But that's still miles away from what HDMI can do -> discrete 7.1 lossless 24-bit channels. No connection other than HDMI itself was ever made to carry that, so to have Blu-ray-quality audio it needs to come from the graphics card itself (or the sound can later be mixed with the video into the HDMI port through a specialized sound card).


    The ATI HD4000 family is still the only option for someone who wants to connect a media center to a high-quality surround setup.
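The gap being described is easy to put in numbers: raw LPCM bitrate is channels × bit depth × sample rate. A rough sketch in Python (the exact S/PDIF payload ceiling varies by implementation, so no hard limit is assumed here):

```python
def lpcm_bitrate_mbps(channels: int, bits: int, sample_rate_hz: int) -> float:
    """Raw LPCM payload in Mbit/s (no framing overhead counted)."""
    return channels * bits * sample_rate_hz / 1e6

stereo_24_96 = lpcm_bitrate_mbps(2, 24, 96_000)   # ~4.6 Mbit/s: stereo fits over S/PDIF
surround_71  = lpcm_bitrate_mbps(8, 24, 96_000)   # ~18.4 Mbit/s: 7.1 needs HDMI

print(f"24-bit/96 kHz stereo: {stereo_24_96:.1f} Mbit/s")
print(f"24-bit/96 kHz 7.1:    {surround_71:.1f} Mbit/s")
```

Eight discrete lossless channels simply carry several times the data of the stereo signal S/PDIF was designed around.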

  13. #13
    Xtreme X.I.P.
    Join Date
    Nov 2002
    Location
    Shipai
    Posts
    34,549
    wow... that's nvidia's latest and greatest?

    i know i know... but come on, it's hilarious that this is all they could get done, isn't it?
    nothing new that performs at least decently on 55nm or 40nm... nothing... i don't even know what those cards are supposed to be used for... media center pcs i guess...

    so is nvidia recycling the same chip for ion2, or is that another chip that has yet to be patched up errr mass-produced by tsmc?

    on a more serious note... this is a bad sign... i have a bad feeling about this...
    nvidia is really in bad shape if all they launch for over half a year is a cheaper cost-down gtx295 and some tiny crappy entry-level cards...
    who knows, after all larrabee might not even have to perform all that great, as nvidia seems to be stuck with 55nm performance parts for the near future...

  14. #14
    L-l-look at you, hacker.
    Join Date
    Jun 2007
    Location
    Perth, Western Australia
    Posts
    4,642
    I'm not liking the tiny fans on those. Would have been easy to make them fanless, which would have been a lot better for OEM and HTPC.
    Rig specs
    CPU: i7 5960X Mobo: Asus X99 Deluxe RAM: 4x4GB G.Skill DDR4-2400 CAS-15 VGA: 2x eVGA GTX680 Superclock PSU: Corsair AX1200

    Foundational Falsehoods of Creationism



  15. #15
    Xtreme Addict
    Join Date
    Oct 2008
    Location
    The Curragh.
    Posts
    1,294
    Not just the fans, they just don't seem right.

    1GB on a 128-bit interface seems odd, really.

    They're also just the same old cards that you can get now.

    I hope they have better luck getting their current mid-range onto 40nm and DX10.1. They need to catch up in the low-end HTPC segment at the moment.

  16. #16
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,745
    Quote Originally Posted by saaya View Post
    wow... that's nvidia's latest and greatest?

    i know i know... but come on, it's hilarious that this is all they could get done, isn't it?
    nothing new that performs at least decently on 55nm or 40nm... nothing... i don't even know what those cards are supposed to be used for... media center pcs i guess...

    so is nvidia recycling the same chip for ion2, or is that another chip that has yet to be patched up errr mass-produced by tsmc?

    on a more serious note... this is a bad sign... i have a bad feeling about this...
    nvidia is really in bad shape if all they launch for over half a year is a cheaper cost-down gtx295 and some tiny crappy entry-level cards...
    who knows, after all larrabee might not even have to perform all that great, as nvidia seems to be stuck with 55nm performance parts for the near future...
    Hehe. Well, you got DX10.1. You can't get it all

    Quote Originally Posted by SoulsCollective View Post
    I'm not liking the tiny fans on those. Would have been easy to make them fanless, which would have been a lot better for OEM and HTPC.
    Clocks are somewhat high, and 40nm isn't going well. So perhaps we'll see fanless parts later as well. But yes, fully agree.
    Crunching for Comrades and the Common good of the People.

  17. #17
    I am Xtreme
    Join Date
    Dec 2008
    Location
    France
    Posts
    8,829
    Quote Originally Posted by SoulsCollective View Post
    I'm not liking the tiny fans on those. Would have been easy to make them fanless, which would have been a lot better for OEM and HTPC.
    That's a very good point. Totally agree. But perhaps we will see some non-reference designs; this isn't an uncommon thing, at least in regards to heatsinks.
    Quote Originally Posted by Shintai View Post
    Well you got DX10.1
    Welcome to last year...
    Last edited by zalbard; 07-08-2009 at 08:16 AM.

  18. #18
    Xtreme Addict
    Join Date
    May 2003
    Location
    Peoples Republic of Kalifornia
    Posts
    1,572
    ^^^ I believe you'll only be able to get Dolby Digital 5.1, DTS 5.1, and multi-channel bitstream by passing the pure, untouched signal through your video card and into an HDMI cable. S/PDIF-enabled video cards that can pass the audio signal through the DVI port (then into a DVI-to-HDMI converter) are bandwidth-limited, since S/PDIF only carries a maximum of about 2 Mbit/s of audio data. If you want Dolby TrueHD or DTS-HD Master Audio 7.1 lossless audio, many changes must occur in the way audio is transferred between your optical drive/motherboard/sound card/video card in order for a 15 Mbit/s+ signal to be sent across the bus.

    "If the representatives of the people betray their constituents, there is then no resource left but in the exertion of that original right of self-defense which is paramount to all positive forms of government"
    -- Alexander Hamilton

  19. #19
    all outta gum
    Join Date
    Dec 2006
    Location
    Poland
    Posts
    3,389
    Weird memory chips on GT220, so thin
    www.teampclab.pl
    MOA 2009 Poland #2, AMD Black Ops 2010, MOA 2011 Poland #1, MOA 2011 EMEA #12

    Test bench: empty

  20. #20
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,866
    ^The pic's gotta be of a version with DDR2 instead of GDDR3.
    You were not supposed to see this.

  21. #21
    Banned
    Join Date
    Jun 2008
    Location
    Somewhere Up to my Ears in Ye Yo
    Posts
    1,124
    i have read every thread, and agree with most, but i must add,

    i just don't get the point here,

    is this some sort of nvidia stepping-stone thing i missed? if it is, god help us with any support for DX11

  22. #22
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,834
    here's an interesting question:

    if you have a Core i7 with QPI and triple-channel DDR3, wouldn't it be better for a graphics card to use system memory than GDDR2 on a 64-bit bus?
    DFI P965-S/core 2 quad q6600@3.2ghz/4gb gskill ddr2 @ 800mhz cas 4/xfx gtx 260/ silverstone op650/thermaltake xaser 3 case/razer lachesis
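On raw numbers, triple-channel DDR3 does beat a 64-bit GDDR2 bus handily; the catch is that system memory is shared with the CPU and sits on the far side of the PCIe bus. A rough back-of-the-envelope sketch in Python (DDR3-1066 is an assumed configuration, not from the thread):

```python
def bandwidth_gbs(channels: int, bus_bits: int, effective_mhz: int) -> float:
    """Peak theoretical bandwidth in GB/s across one or more memory channels."""
    return channels * bus_bits / 8 * effective_mhz * 1e6 / 1e9

# G 210-style local memory: one 64-bit channel at 1000 MHz effective
gddr2_64bit = bandwidth_gbs(1, 64, 1000)   # 8.0 GB/s

# Core i7 with three 64-bit channels of DDR3-1066 (assumed speed grade)
ddr3_triple = bandwidth_gbs(3, 64, 1066)   # ~25.6 GB/s

print(f"64-bit GDDR2:        {gddr2_64bit:.1f} GB/s")
print(f"Triple-channel DDR3: {ddr3_triple:.1f} GB/s")
```

So the peak numbers favor system memory, but the GPU would have to share that bandwidth with the CPU and pay PCIe latency on every access.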

  23. #23
    Xtreme Cruncher
    Join Date
    May 2009
    Location
    Bloomfield
    Posts
    1,968
    Mobile graphics isn't targeting Core i7 / high-end desktops.

  24. #24
    Xtreme Addict
    Join Date
    Dec 2008
    Location
    Sweden, Linköping
    Posts
    2,033
    Quote Originally Posted by grimREEFER View Post
    here's an interesting question:

    if you have a Core i7 with QPI and triple-channel DDR3, wouldn't it be better for a graphics card to use system memory than GDDR2 on a 64-bit bus?
    AMD's HD3200 and HD3300 (integrated chips) are supposed to get a theoretical 33% performance increase if your system is using DDR3 instead of DDR2.

    But for the situation you're describing, I really don't know...
    SweClockers.com

    CPU: Phenom II X4 955BE
    Clock: 4200MHz 1.4375v
    Memory: Dominator GT 2x2GB 1600MHz 6-6-6-20 1.65v
    Motherboard: ASUS Crosshair IV Formula
    GPU: HD 5770

  25. #25
    Xtreme Addict
    Join Date
    Oct 2004
    Posts
    1,353
    Even if these were mid-to-high-end cards, would it really matter? I know I for one ain't upgrading my GPU until the next gen is out.

    And if DX10.1 is a huge selling point, ATI has had that covered for quite some time now. Bring on DX11 and G300, I say!

