Page 47 of 167
Results 1,151 to 1,175 of 4151

Thread: ATI Radeon HD 4000 Series discussion

  1. #1151
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    GTX 280 = X4098
    GTX 260 = X3782
    if these are true then they might actually be worth the price tag.

    what is the source of the Vantage scores?
    i7 3610QM 1.2-3.2GHz

  2. #1152
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Yeah, I don't think people realize that the card is nearly 2x the score of the 8800 Ultra at those settings, where the 8800 Ultra still has the edge on the G92 cards (besides the 9800GX2). And 3DMark always seems to be well optimized for multi-GPU setups, so the 9800GX2 will score higher than its real-world game performance, where it's at the mercy of multi-GPU driver optimizations.

  3. #1153
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by adamsleath View Post
    if these are true then they might actually be worth the price tag.

    what is the source of the Vantage scores?
    Problem is, the GTX 260, being only 0.8-0.85x of a 280, should not score that high. 3200-3400 at best, I reckon.

    There are some problems with the source! Ahem...
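For what it's worth, that scaling claim can be sanity-checked in a couple of lines against the X4098 figure quoted above (a rough sketch; both the 0.8-0.85x ratio and the score are rumors from this thread, not confirmed numbers):

```python
# If a GTX 260 really lands at 0.80-0.85x of a GTX 280's Vantage score,
# what range would the rumored X4098 figure imply?
gtx280_score = 4098               # rumored GTX 280 X-preset score quoted above
scale_low, scale_high = 0.80, 0.85

low = round(gtx280_score * scale_low)
high = round(gtx280_score * scale_high)
print(f"Expected GTX 260 range: X{low}-X{high}")  # X3278-X3483
```

Which is exactly why the leaked X3782 looks too high for a cut-down part.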

  4. #1154
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by adamsleath View Post
    if these are true then they might actually be worth the price tag.

    what is the source of the Vantage scores?
    http://gathering.tweakers.net/forum/...ssage/30144631
    CJ
    >i5-3570K
    >Asrock Z77E-ITX Wifi
    >Asus GTX 670 Mini
    >Cooltek Coolcube Black
    >CM Silent Pro M700
    >Crucial M4 128Gb Msata
    >Cooler Master Seidon 120M
    Hell yes, it's a mini-ITX gaming rig!

  5. #1155
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    sprechen Sie Deutsch? (do you speak German?)
    i7 3610QM 1.2-3.2GHz

  6. #1156
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by adamsleath View Post
    sprechen Sie Deutsch? (do you speak German?)
    Dutch man.
    >i5-3570K
    >Asrock Z77E-ITX Wifi
    >Asus GTX 670 Mini
    >Cooltek Coolcube Black
    >CM Silent Pro M700
    >Crucial M4 128Gb Msata
    >Cooler Master Seidon 120M
    Hell yes, it's a mini-ITX gaming rig!

  7. #1157
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    ok.
    i7 3610QM 1.2-3.2GHz

  8. #1158
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by adamsleath View Post
    ok.
    And the ATI numbers are fake.
    http://www.forumdeluxx.de/forum/show...postcount=1378
    >i5-3570K
    >Asrock Z77E-ITX Wifi
    >Asus GTX 670 Mini
    >Cooltek Coolcube Black
    >CM Silent Pro M700
    >Crucial M4 128Gb Msata
    >Cooler Master Seidon 120M
    Hell yes, it's a mini-ITX gaming rig!

  9. #1159
    Xtreme Mentor
    Join Date
    Feb 2004
    Location
    The Netherlands
    Posts
    2,984
    Quote Originally Posted by adamsleath View Post
    ok.
    yeah man, watch it buddy.

    Ryzen 9 3900X w/ NH-U14s on MSI X570 Unify
    32 GB Patriot Viper Steel 3733 CL14 (1.51v)
    RX 5700 XT w/ 2x 120mm fan mod (2 GHz)
    Tons of NVMe & SATA SSDs
    LG 27GL850 + Asus MG279Q
    Meshify C white

  10. #1160
    Xtreme Mentor
    Join Date
    Oct 2005
    Posts
    2,788
    Quote Originally Posted by ownage View Post
    Hmm... I hope the real ones are higher by a fair amount. If ATI fails me again I am just going to get a GTX280.
    Asus Rampage II Gene | Core i7 920 | 6*2GB Mushkin 998729 | BFG GTX280 OCX | Auzentech X-Fi Forte | Corsair VX550
    —Life is too short to be bound by the moral, ethical and legal constraints imposed on us by modern day society.

  11. #1161
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    216
    Quote Originally Posted by 003 View Post
    Hmm... I hope the real ones are higher by a fair amount. If ATI fails me again I am just going to get a GTX280.
    I doubt they will be any higher. And they don't need to be, at half the price of the GTX 280.

  12. #1162
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by ownage View Post
    Why not? The HD4870 score seems more than logical. The HD4870 seems to be 80% faster than the HD3870. It was rumoured the HD4870 would be 50% faster in overall performance, but the HD4870 has far better AA performance, so 80% doesn't seem wrong to me.
    Yeah, I realize that, but the chart initially showed the 3870 = 4850 and something like x1950 (which I quoted), and that's what I was saying: way too low, should be 50%+. 80% sounds perfect to me, actually.

    Quote Originally Posted by Helmore View Post
    Oh and AliG it says the 3870 scores 1440 while the 4870 does 2595.
    Oh well, I won't make any conclusions about their performance until the cards are officially released.
    As I said, look at what I quoted; the poster fixed it.
    Quote Originally Posted by Scimitar View Post
    Remember that the shader clock on the 4870 is rumored to be 1,050 MHz (200 MHz over the core clock). When you take this into account, the 4870 has (in theory) twice the shader power of the 3870, not 50% more.
    Once again, I understand this.
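Scimitar's math checks out as a back-of-envelope sketch, assuming the rumored 1,050 MHz shader clock and 50% more units than the 3870's 320 SPs at 775 MHz (all rumored figures at this point, not confirmed specs):

```python
# Relative shader throughput in SPs x MHz (rumored 4870 specs vs. known 3870).
hd3870 = 320 * 775        # HD 3870: 320 SPs at the 775 MHz core clock
hd4870 = 480 * 1050       # rumored: 50% more units (480 SPs) at 1,050 MHz

ratio = hd4870 / hd3870
print(f"Rumored 4870 shader power: {ratio:.2f}x the 3870")  # ~2.03x, i.e. double
```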
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  13. #1163
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by 003 View Post
    Hmm... I hope the real ones are higher by a fair amount. If ATI fails me again I am just going to get a GTX280.
    Ever heard of R700?

  14. #1164
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    Quote Originally Posted by Calmatory View Post
    Ever heard of R700?
    It seems more and more people are trying to compare the 4870 to the GTX 280... It's like they've never heard of a dual-GPU card called the 3870X2. Similarly, R700 will be introduced, and I really expect it to perform on par with or slightly worse than the GTX 280.

  15. #1165
    Xtreme Addict
    Join Date
    Nov 2004
    Posts
    1,692
    Quote Originally Posted by informal View Post
    It seems more and more people are trying to compare the 4870 to the GTX 280... It's like they've never heard of a dual-GPU card called the 3870X2. Similarly, R700 will be introduced, and I really expect it to perform on par with or slightly worse than the GTX 280.
    Problem is (and I'm sure not everyone will agree) that ATI is still a product cycle behind, all due to R600, and arguably it started even earlier than that, with the 1800XT; they haven't caught up yet. In August, when R700 launches, nVidia will just release their dual-chip card. If AMD had a card that performed as well as the GTX 280, they would sell it for a much higher price. A lower price just means it can't compete performance-wise.

    Even with nVidia giving them time with the whole 9800GTX debacle, they were just laughing and perfecting G200.

    When ATI's new architecture launches, nVidia will be there with their new card, or with G200 on a smaller process, whatever is necessary to stay on top. Unless nVidia fails with a chip or process, like the FX5800, I think they'll stay on top; they've got the time and resources to do so. AMD doesn't necessarily have that, unless AMD drops a bombshell next year with a new architecture, the latter being more likely. But as we've seen before, because nVidia is a cycle ahead, they can test and use the smaller process on their midrange cards first, and then bring it to the big guns. Perfection.
    Last edited by Tim; 05-29-2008 at 03:05 AM.

    Intel Core i7-3770K
    ASUS P8Z77-I DELUXE
    EVGA GTX 970 SC
    Corsair 16GB (2x8GB) Vengeance LP 1600
    Corsair H80
    120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
    Corsair RM650
    Cooler Master Elite 120 Advanced
    OC: 5Ghz | +0.185 offset : 1.352v

  16. #1166
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by Tim View Post
    In August when R700 launches, nVidia will just release their dual chip card.
    GT200 will not be on a dual-chip card until a die shrink.

    They couldn't do it with G80, and G80 was less hot and less power-hungry than GT200 will be. What makes you think they will suddenly have a dual card?

  17. #1167
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by Tim View Post
    Problem is (and I'm sure not everyone will agree) that ATI is still a product cycle behind, all due to R600, and arguably it started even earlier than that, with the 1800XT; they haven't caught up yet. In August, when R700 launches, nVidia will just release their dual-chip card. If AMD had a card that performed as well as the GTX 280, they would sell it for a much higher price. A lower price just means it can't compete performance-wise.

    Even with nVidia giving them time with the whole 9800GTX debacle, they were just laughing and perfecting G200.

    When ATI's new architecture launches, nVidia will be there with their new card, or with G200 on a smaller process, whatever is necessary to stay on top. Unless nVidia fails with a chip or process, like the FX5800, I think they'll stay on top; they've got the time and resources to do so. AMD doesn't necessarily have that, unless AMD drops a bombshell next year with a new architecture, the latter being more likely. But as we've seen before, because nVidia is a cycle ahead, they can test and use the smaller process on their midrange cards first, and then bring it to the big guns. Perfection.
    Technically, Nvidia's 9900GTX WILL be a current-gen card, seeing as Nvidia will not have a GTX 2xx based card in that range for some time to come.

    In addition, a single 4870X2 w/ GDDR5 should cost around $599 as well and should provide a significant boost over the 3870X2. (960 shaders and 64 TMUs, anyone?)

    Perkam

  18. #1168
    Banned
    Join Date
    May 2006
    Location
    Skopje, Macedonia
    Posts
    1,716
    Quote Originally Posted by zerazax View Post
    GT200 will not be on a dual-chip card until a die shrink.

    They couldn't do it with G80, and G80 was less hot and less power-hungry than GT200 will be. What makes you think they will suddenly have a dual card?
    I doubt that nVidia will make a dual-chip GT200-based card. Even with a die shrink, a single die is still going to be a space heater.

    IMO, with a die shrink nVidia are going to raise the clocks of the GTX 280, add 16 more SPs, and make some simple improvements.

  19. #1169
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    How did CJ of DAAMIT get the new cards, I wonder?

  20. #1170
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Posts
    881
    Looks like the HD48xx series will be beaten by the GT2xx series again. But I don't mind; as long as they lower the HD3870 to about $100 when the HD48xx series releases, I'm happy.

  21. #1171
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by gOJDO View Post
    I doubt that nVidia will make a dual-chip GT200-based card. Even with a die shrink, a single die is still going to be a space heater.

    IMO, with a die shrink nVidia are going to raise the clocks of the GTX 280, add 16 more SPs, and make some simple improvements.
    The GT200 seems to have only 240 SPs, instead of the popular consensus of 256, plus 96 TMUs. No redundancy, that is, like the 8800GTX at launch.

    They can't really shrink their way to much efficiency; they have to strike a balance between the bus width (too big) and the ROP count (too many). 384-bit + GDDR5 in the future seems to be the best balance; otherwise they'd still have to use a 400+mm^2 die at 40nm!

    Why people are disregarding the chance for ATI to make the same RV770, just with double the specs, and still a small chip compared to the GT200 later on (the units are small compared to the memory bus), I have no idea. That chip is apparently called the R(V)870, or whatever you want to call it.


    I heard the GT200 info from a buddy who has looked at the core die itself, so I'll let you guys go on from there.

  22. #1172
    Xtreme Enthusiast
    Join Date
    May 2006
    Location
    over the rainbow
    Posts
    964
    Quote Originally Posted by awdrifter View Post
    Looks like the HD48xx series will be beaten by the GT2xx series again. But I don't mind; as long as they lower the HD3870 to about $100 when the HD48xx series releases, I'm happy.
    The HD4k series was never meant to compete w/ the GTX2xx series.
    AMD Phenom II X6 1055T@3.5GHz@Scythe Mugen 2 <-> ASRock 970 Extreme4 <-> 8GB DDR3-1333 <-> Sapphire HD7870@1100/1300 <-> Samsung F3 <-> Win8.1 x64 <-> Acer Slim Line S243HL <-> BQT E9-CM 480W

  23. #1173
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by w0mbat View Post
    The HD4k series was never meant to compete w/ the GTX2xx series.
    That point will never get across, m8; it's useless to try.

    Heck, Nvidia had their GTX 280 going up against the 3870 at their editor's day last week, so there's no way people will suddenly realize that a $350 card was never meant to compete with a $600 card.

    Over time, members will see the performance and price and decide for themselves what they "NEED" for their gaming needs.

    Perkam

  24. #1174
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    Quote Originally Posted by Macadamia View Post
    They can't really shrink their way to much efficiency; they have to strike a balance between the bus width (too big) and the ROP count (too many). 384-bit + GDDR5 in the future seems to be the best balance; otherwise they'd still have to use a 400+mm^2 die at 40nm!
    40nm should allow them to shrink the core to ~220mm^2, so 400+mm^2 for a dual-GPU card.

  25. #1175
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by LordEC911 View Post
    40nm should allow them to shrink the core to ~220mm^2, so 400+mm^2 for a dual-GPU card.
    It doesn't work like that. Look at G80 -> G92. It went from 90nm to 65nm (the same steps as the GT200 shrink) BUT it was still crazy big (320-330mm^2), and that was with two 64-bit memory controllers taken out.


    GT200 won't shrink well. At least the memory bus won't. Even with that out of the way (256-bit), they will still be seeing 350+mm^2 per chip (not the big deal) and 120-150W per chip (BIG deal).
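The disagreement here is just ideal versus real scaling. A small sketch, using the commonly cited die sizes (G80 ~484mm^2 at 90nm, GT200 ~576mm^2 at 65nm) and naively assuming the whole die scales with the square of the feature size:

```python
# Ideal die-area scaling: area shrinks with the square of the process node.
# In practice the memory interface and other analog parts barely shrink,
# which is why real shrinks land well above the ideal number.
def ideal_shrink(area_mm2, node_from_nm, node_to_nm):
    return area_mm2 * (node_to_nm / node_from_nm) ** 2

# G80 at 90nm was ~484 mm^2; the ideal 65nm shrink would be:
print(f"G80 -> 65nm ideal: {ideal_shrink(484, 90, 65):.0f} mm^2")   # ~252 mm^2

# GT200 at 65nm is ~576 mm^2; the naive 40nm figure matches the ~220mm^2 above:
print(f"GT200 -> 40nm ideal: {ideal_shrink(576, 65, 40):.0f} mm^2")  # ~218 mm^2
```

The gap between the ideal ~252mm^2 and G92's actual 320-330mm^2 is exactly the point being made: the ~220mm^2 estimate only holds if every part of GT200 scales perfectly, and the memory bus won't.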

