Page 4 of 6
Results 76 to 100 of 126

Thread: Photos of GeForce GTX 280 graphics card

  1. #76
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by Nuker_ View Post
    So you are claiming no one will buy ati gfx cards?
Well, I should've said "of those who will buy an nvidia card", stupid me.

    Anyway, you get the point.

  2. #77
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by largon View Post
    Incompetence = not being able to do such a thing when competitor has done it ages ago.
    Oh you mean when a competitor did it out of necessity to find a home for their products because they couldn't compete on the high end?

    Of course if they can't compete at the high end, they want to go cheaper, which means they have to make their product cheaper in every way possible, including risking on process.

    Do you really think NVIDIA can have the fastest GPUs on the planet, but can't go to 55nm?

It's not even a matter of NVIDIA having the process, because they just put in an order to TSMC the same way AMD does. Yields on relatively new processes (new for the fab in question) can be a problem, and it's a huge risk.

    What if I told you that for your new house you're building from the ground up, there's a new foundation technology which only 1 house has been built on before and it's only been out for a year, so the kinks are not necessarily all worked out? Want to invest your $600,000 on top of it?
    Last edited by Sr7; 05-22-2008 at 02:47 AM.

  3. #78
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    Again, that all boils down to incompetence. No matter how you look at it.
    And besides, 55nm is now just as mature as 65nm was when G92 came out.
    Quote Originally Posted by Calmatory View Post
    So when has ATI made 576mm² chips @ 55nm?

    Err... what?
    55nm is the issue here. 55nm works for ATi, and assuming the bunnies at TSMC are not ATi-fans then the root of the problem is at nV.
    Last edited by largon; 05-22-2008 at 02:51 AM.
    You were not supposed to see this.

  4. #79
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by largon View Post
    Again, that all boils down to incompetence. No matter how you look at it.
    And besides, 55nm is now just as mature as 65nm was when G92 came out.

    Err... what?
    55nm is the issue here. 55nm works for ATi, and assuming the bunnies at TSMC are not ATi-fans then the root of the problem is at nV.
    That's the point exactly, it's not a problem. It's a decision. It is intentionally, decisively, and consciously made, not "damn if only we were good enough to do 55nm"... you don't seem to get that.

  5. #80
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
Anyways, I think it's a problem.
In fact, the only way consumers could see it is as a problem.
Now, why is it not a problem for ATi - or rather, why wasn't it a problem 6 months ago?
    Last edited by largon; 05-22-2008 at 03:11 AM.
    You were not supposed to see this.

  6. #81
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by largon View Post
    Again, that all boils down to incompetence. No matter how you look at it.
    And besides, 55nm is now just as mature as 65nm was when G92 came out.

    Err... what?
    55nm is the issue here. 55nm works for ATi, and assuming the bunnies at TSMC are not ATi-fans then the root of the problem is at nV.
Yes, 55nm really works for ATI. 576mm² is too large, especially when RV770 is half its size.
RV770 = between 830 and 1300 million transistors = 256mm²
GT200 = 1000 million transistors = 576mm²

If you look at it like this, they really should have used the 55nm die shrink instead of 65nm. GT206 will be 55nm.

R600 had 30% more transistors than G80, but G80 owned R600. ATI's RV770 has to be really efficient to keep up with nVidia.
It looks to me like nVidia might have a killer card, although I'm not sure about the heat output. I hate stupid heatspreaders on GFX cards.
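The comparison above can be sanity-checked with a couple of lines of arithmetic (a rough sketch only; the transistor counts quoted in the thread are rumours, and the wafer math below ignores edge loss and yield):

```python
import math

# Die sizes as quoted in the post above (mm^2).
rv770_area_mm2 = 256
gt200_area_mm2 = 576

area_ratio = gt200_area_mm2 / rv770_area_mm2
print(f"GT200 die is {area_ratio:.2f}x the area of RV770")  # 2.25x

# Cost scales worse than linearly with die area: a bigger die means fewer
# candidates per wafer AND a higher chance any one die contains a defect.
# Crude per-wafer counts for a 300 mm wafer (ignoring edge loss):
wafer_area_mm2 = math.pi * (300 / 2) ** 2
print(f"~{wafer_area_mm2 / gt200_area_mm2:.0f} GT200 candidates per wafer")
print(f"~{wafer_area_mm2 / rv770_area_mm2:.0f} RV770 candidates per wafer")
```

Even before yield differences, the larger die simply yields far fewer candidates per wafer, which is why die area drives cost so hard.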
    Last edited by ownage; 05-22-2008 at 03:22 AM.
    >i5-3570K
    >Asrock Z77E-ITX Wifi
    >Asus GTX 670 Mini
    >Cooltek Coolcube Black
    >CM Silent Pro M700
    >Crucial M4 128Gb Msata
    >Cooler Master Seidon 120M
    Hell yes its a mini-ITX gaming rig!

  7. #82
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
I do wonder how good their margins will be if yields aren't great. I mean, that is a huge die, along with the 16 GDDR3 chips and the very densely packed PCB (compare it to the G80 GTX and you'll see what I mean).

    Actually, what I'm more amazed by is how tightly lipped ATI has been about RV770 (with no die shots of any kind yet) compared to GT200 which has had everything but the kitchen sink leaked out.

  8. #83
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by ownage View Post
Yes, 55nm really works for ATI. 576mm² is too large, especially when RV770 is half its size.
RV770 = between 830 and 1300 million transistors = 256mm²
GT200 = 1000 million transistors = 576mm²

If you look at it like this, they really should have used the 55nm die shrink instead of 65nm. GT206 will be 55nm.

R600 had 30% more transistors than G80, but G80 owned R600. ATI's RV770 has to be really efficient to keep up with nVidia.
It looks to me like nVidia might have a killer card, although I'm not sure about the heat output. I hate stupid heatspreaders on GFX cards.
    So how is it you don't know how many transistors RV770 has but you seem to know that GT200 has exactly 1 billion? Did you count them yourself?

  9. #84
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by Sr7 View Post
    So how is it you don't know how many transistors RV770 has but you seem to know that GT200 has exactly 1 billion? Did you count them yourself?
    Yes, one by one
Hmm, I forgot to edit that post like I did with my post in the other thread. I've seen some rumors saying close to 1 billion, but also close to 1300 million.
Sorry about that.
All the info on the net is based on rumours. Maybe 1 billion is not that accurate, but my post gives you an idea of how small RV770 is and how big GT200 is. If you ask me, a big difference.
    Last edited by ownage; 05-22-2008 at 04:22 AM.

  10. #85
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by largon View Post
    I should be allowed my rightful anger because nV uses an inferior process only because of their sheer incompetence. :/
55nm would have shrunk the chip to about the size of G80.
    It's a purely business decision. nVidia designed the G200 for 65nm, they would have to re-design & test it for 55nm before even attempting production. G200 on 65nm was in production a while ago now & there is a 55nm design in the works but it's coming later, once all the potential issues are addressed & fixed/bypassed.

    It is not incompetence but a safer, cheaper decision that could be the difference between making money or losing money.

  11. #86
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    Have any other graphics cards had memory chips on the back of the card? How are they going to cool those? Some kind of sandwich cooler?

  12. #87
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by DeathReborn View Post
    It is not incompetence but a safer, cheaper decision that could be the difference between making money or losing money.
It's not cheaper, but you're right that it should be safer. I think NVIDIA doesn't wanna risk anything this time around; it's quite an important launch for both companies and screw-ups aren't allowed.
    Intel? Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs

    If all people would share opinions in an objective manner, the world would be a friendlier place

  13. #88
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by gojirasan View Post
    Have any other graphics cards had memory chips on the back of the card? How are they going to cool those? Some kind of sandwich cooler?
Plenty of cards before had them on the back as well. No cooling needed.
    Crunching for Comrades and the Common good of the People.

  14. #89
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by largon View Post
    I should be allowed my rightful anger because nV uses an inferior process only because of their sheer incompetence. :/
55nm would have shrunk the chip to about the size of G80.
NVidia never starts their high-end on a new process. ATi has been a process ahead for a while because of a gamble they took with the R520 that cost them drastically in both market share and stock value, and is arguably the reason ATi ended up being bought out by AMD.

So when you think about it, having a high-end part be your first test run of a new process is incompetence. Personally, I'd rather have a larger chip than one with half the yields. Call me crazy, but I'd like to actually be able to buy this thing rather than have it be nowhere to be found and price-gouged up into the $800+ range (remember the 7800GTX 512MB?).
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  15. #90
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    Quote Originally Posted by Sr7 View Post
    Do you really think NVIDIA can have the fastest GPUs on the planet, but can't go to 55nm?

    It's not even a matter of NVIDIA having the process, because they just put in an order to TSMC the same way AMD does.
    Chip design takes into account the fabrication process. It is not as simple as saying, hey let's put in an order for 55nm chips, and that's that. A chip is designed specifically for the process used to make it.

    Nvidia no doubt would love to make their highest end stuff @55nm and below, and you can bet they would have if it was feasible. ATI does look to have a leg up on Nvidia when it comes to their working relationship with TSMC, but that has not hurt Nvidia at all because they have a better/faster design arch. Now if ATI had the performance lead AND was also doing everything on a smaller process, Nvidia would be in trouble.

    Also, Nvidia has been able to field reasonably power efficient parts even on an "inferior" process. The smallest nm process is not everything.

  16. #91
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by DilTech View Post
NVidia never starts their high-end on a new process. ATi has been a process ahead for a while because of a gamble they took with the R520 that cost them drastically in both market share and stock value, and is arguably the reason ATi ended up being bought out by AMD.

So when you think about it, having a high-end part be your first test run of a new process is incompetence. Personally, I'd rather have a larger chip than one with half the yields. Call me crazy, but I'd like to actually be able to buy this thing rather than have it be nowhere to be found and price-gouged up into the $800+ range (remember the 7800GTX 512MB?).
I agree with you; gambling isn't my cup of tea and shouldn't be NVIDIA's either. For the competitor with the bigger budget and market share it's more logical to go the safe route, while the competitor with a much smaller budget and market share has to take more risks in order to compete on a good level against the bigger opponent. If ATI doesn't take any risks they'll always be one step behind NVIDIA, and that's a certain way for the company to become smaller and smaller, making it harder and harder to compete.

Remember the R600 delays? That's one example of how taking a risk doesn't always pay off. The R600 arch underperformed a bit compared to the 8800 series, and this was the beginning of the "performance" card strategy: ATI figured it wouldn't be able to compete in the high end with this arch without at least one or a few NVIDIA cards beating them. I don't think the 4xxx series will be any different either, but after that series I'm expecting more drastic changes, and perhaps we'll see another attempt at fighting for the performance crown. I'm a little unsure whether ATI will even try that anymore or simply continue the current trend. Props to AMD/ATI for being able to compete this well against a much bigger opponent, though.
    Last edited by RPGWiZaRD; 05-22-2008 at 08:39 AM.

  17. #92
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by RPGWiZaRD View Post
It's not cheaper, but you're right that it should be safer. I think NVIDIA doesn't wanna risk anything this time around; it's quite an important launch for both companies and screw-ups aren't allowed.
    Do you know how much time & money it takes to redesign a GPU for a new process? It's a lot cheaper to go with tried & tested but take a cut in profits than risk a huge loss.

As DilTech rightly said, ATI gambled on a new architecture on a new process and it cost them. If nVidia made a similar mistake they could become another cog in the Intel/Samsung/[insert semiconductor company here] machine, like ATI became a cog in AMD's machine.

They do have 55nm on the way, but timing is everything. RV770 will be better than G92 and nVidia felt the need to stay on top; the 65nm G200 will ensure that, and then a 55nm/45nm shrink might keep them ahead of whatever AMD comes up with next.

  18. #93
    Xtreme Guru
    Join Date
    Apr 2005
    Location
    Finland
    Posts
    4,308
    Quote Originally Posted by DeathReborn View Post
    Do you know how much time & money it takes to redesign a GPU for a new process? It's a lot cheaper to go with tried & tested but take a cut in profits than risk a huge loss.
I was personally talking about whether NVIDIA will introduce them at 65nm and later release identical chips on the 55nm process, maintaining both types under the same model numbers etc. Marketing them under a different model number might give some slight marketing benefits (like the 9800 series compared to the 8800), although it wouldn't be appreciated by enthusiasts who know NVIDIA's marketing strategy. Focusing on one process would be a bit simpler and cheaper, but of course the die-shrink process itself isn't usually that expensive.
    Last edited by RPGWiZaRD; 05-22-2008 at 08:50 AM.

  19. #94
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    TSMC 55nm is a half-node/optical shrink so minimal retrofitting should be required when coming from a native 65nm design.
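A half-node optical shrink scales the chip's linear dimensions, so die area scales with the square of the ratio. Applying that to the 576mm² figure discussed in this thread gives a quick estimate (idealized; real shrinks rarely achieve the full scaling factor):

```python
# Idealized 65 nm -> 55 nm optical shrink of a 576 mm^2 die.
# Linear dimensions scale by 55/65, so area scales by (55/65)^2.
scale = 55 / 65
shrunk_area = 576 * scale ** 2
print(f"area scaling factor: {scale ** 2:.3f}")       # ~0.716
print(f"ideal 55 nm die: ~{shrunk_area:.0f} mm^2")    # ~412 mm^2
```

That lands in the neighborhood of G80's commonly cited ~480mm² die, which roughly matches the earlier claim that 55nm "would have shrunk the chip to about the size of G80", if a bit optimistic.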
    You were not supposed to see this.

  20. #95
    Xtreme Member
    Join Date
    Nov 2007
    Location
    Belgium
    Posts
    246
    new pics :




    Quote Originally Posted by Johnny Bravo View Post
    That's because in the process of benching the card all setup I learned a few things about Qimonda memory:
    • It sucks
    • It doesn't like voltage
    • It doesn't like the cold
    • It sucks
    • It dies real easily
    • err....did I mention it sucks?

  21. #96
    Xtreme Member
    Join Date
    Nov 2007
    Location
    Belgium
    Posts
    246
Funny thing, I think Tomshardware has one ^^


    http://www.tomshardware.co.uk/hdmi-d...ews-28242.html


  22. #97
    Xtreme Mentor
    Join Date
    Jan 2005
    Posts
    3,080
    I wonder what kinda heat that card is gonna put out....maybe i could save money on my home heating?
    Gigabyte EP45-DQ6 - rev 1.0, F13a bios | Intel Q9450 Yorkfield 413x8=3.3GHz | OCZ ProXStream 1000W PSU | Azuen X-Fi Prelude 64MB X-RAM| WD VelociRaptor 74HLFS-01G6U0 16MB cache 74GB - 2 drive RAID 0 64k stripe | ASUS 9800GT Ultimate 512MB RAM (128 SP!!) | G.SKILL PC2-8800 4GB kit @ 1100MHz | OCZ ATV Turbo 4GB USB flash | Scythe Ninja Copper + Scythe 120mm fan | BenQ M2400HD 24" 16:9 LCD | Plextor 716SA 0308; firmware 1.11 | Microsoft Wireless Entertainment Desktop 8000 | Netgear RangeMax DG834PN 108mbps; firmware 1.03.39 + HAWKING HWUG1 108mbps USB dongle | Digital Doc 5+ | 7 CoolerMaster 80mm blue LED fans | Aopen H700A tower case | Vista Home Premium - 32bit, SP1

  23. #98
    Xtreme Member
    Join Date
    Nov 2007
    Location
    Belgium
    Posts
    246
    Quote Originally Posted by Richard Dower View Post
    I wonder what kinda heat that card is gonna put out....maybe i could save money on my home heating?
    I'm sure greenpeace works with nvidia


  24. #99
    Xtreme Mentor
    Join Date
    Sep 2006
    Posts
    2,834
    Quote Originally Posted by Richard Dower View Post
    I wonder what kinda heat that card is gonna put out....maybe i could save money on my home heating?
I'm getting AC for my room this summer. Even with my 8800GTS 640MB and C2D @ 3.2GHz it gets really hot.

    For my part I know nothing with any certainty, but the sight of the stars makes me dream.

    ..

  25. #100
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    It's so beautiful. Go Nvidia!

