
Thread: HD 4770 Previews/Reviews

  1. #101
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    Quote Originally Posted by SKYMTL View Post
Finally got a card on Friday from an anonymous source. Then I spent most of the day sending emails to my rep at ATI asking for drivers without a response, and spent the better part of the weekend calling in favors to get the most current drivers sent to me off the grid.

    No thanks to ATI, a review will be up one way or another.

@ Rasamaha: Would you mind telling us which program you used to bypass ATI's CCC overclocking limitations?
Can you do some idle power testing in the review to show what the consumption would be -IF- PowerPlay was working? I just wanna know how low it can go.

  2. #102
    Registered User
    Join Date
    Jun 2007
    Posts
    62
It would be interesting if there were a non-reference card with just 4 RAM chips; maybe power consumption would be better, and there would be fewer of those repeating posts about it.

  3. #103
    Xtreme Member
    Join Date
    Jun 2005
    Location
    MA, USA
    Posts
    146
    Quote Originally Posted by SNiiPE_DoGG View Post
Can you do some idle power testing in the review to show what the consumption would be -IF- PowerPlay was working? I just wanna know how low it can go.
    Me too, very interested for HTPC use.
    | Cooler Master 690 II Advanced | Corsair 620HX | Core i5-2500K @ 5.0GHz | Gigabyte Z68XP-UD4 | 2x4096MB G.Skill Sniper DDR3-2133 @ 2134MHz 10-11-10-30 @ 1.55V | 160GB Intel X-25 G2 | 2x 2TB Samsung EcoGreen F4 in RAID 1 | Gigabyte HD 7970 @ 1340MHz/1775MHz | Dell 30" 3007WFP-HC | H2O - XSPC RayStorm and Swiftech MCW82 on an MCP350 + XSPC Acrylic Top, XSPC RX240 and Swiftech MCR220 radiators.

  4. #104
    Xtreme Addict
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,084
My guess is that the 4750 is the low-power-consumption alternative, since it has GDDR3.

  5. #105
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    371
    Quote Originally Posted by trinibwoy View Post
    No need to test it, it's 72GB/s ~ GTS 250.
Not quite... GDDR5 has much higher latency than GDDR3, but a 40% overclock is mighty impressive nonetheless.
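For a quick sanity check of that 72GB/s figure, here is the standard theoretical-bandwidth arithmetic in Python. The HD 4770's stock 800MHz GDDR5 command clock and the GTS 250's 256-bit bus with 1100MHz GDDR3 are assumptions taken from published specs, so treat this as a sketch rather than a measurement:

    Code:
    # Theoretical peak bandwidth = bus width in bytes * effective transfer rate.
    def bandwidth_gbs(bus_bits, clock_mhz, pumping):
        # pumping: 2 for GDDR3 (double data rate), 4 for GDDR5
        # (data moves at four times the command clock).
        return bus_bits / 8 * clock_mhz * pumping / 1000  # GB/s

    print(bandwidth_gbs(128, 800, 4))        # HD 4770 stock (assumed): 51.2 GB/s
    print(bandwidth_gbs(128, 800 * 1.4, 4))  # with a 40% memory OC: ~71.7 GB/s
    print(bandwidth_gbs(256, 1100, 2))       # GTS 250 (assumed): 70.4 GB/s

So only the overclocked HD 4770 lands in GTS 250 territory, which is where the "72GB/s ~ GTS 250" comparison comes from.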

  6. #106
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    What if the 4750 matched the 9600GT in cases where bandwidth limitations weren't that severe?


I heard that was their target previously: the 9800 for the high-end version and the 9600 for the GDDR3 one. Granted, the lake of salt is right... there.
    Quote Originally Posted by radaja View Post
    so are they launching BD soon or a comic book?

  7. #107
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Boissez View Post
    Not quite... GDDR5 has much higher latency than GDDR3 but a 40% overclock is mighty impressive nonetheless
Latency isn't particularly relevant for GPUs; they are built to hide it. In any case, you can never get a reliable measure of actual bandwidth since it varies according to workload. The best comparison metric we've got is theoretical maximums.
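To put rough numbers on that latency-hiding point, here is a Little's-law sketch. All three figures below are assumed for illustration only, not measured HD 4770 values:

    Code:
    # Little's law: requests in flight needed = bandwidth * latency / request size.
    bandwidth = 51.2e9   # bytes/s of memory bandwidth to keep saturated (assumed)
    latency = 400e-9     # ~400 ns DRAM round trip (assumed)
    request = 64         # bytes per memory transaction (assumed)

    print(round(bandwidth * latency / request))  # ~320 requests in flight

A GPU with thousands of resident threads can almost always find another request to issue, so extra DRAM latency is mostly hidden behind useful work.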

  8. #108
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    371
    Quote Originally Posted by trinibwoy View Post
Latency isn't particularly relevant for GPUs; they are built to hide it. In any case, you can never get a reliable measure of actual bandwidth since it varies according to workload. The best comparison metric we've got is theoretical maximums.
The HD4850 would be about 10-15% slower if it had GDDR5 instead of GDDR3. I'd say it's relevant.

  9. #109
    Engineering The Xtreme
    Join Date
    Feb 2007
    Location
    MA, USA
    Posts
    7,217
    Quote Originally Posted by Boissez View Post
The HD4850 would be about 10-15% slower if it had GDDR5 instead of GDDR3. I'd say it's relevant.
...what are you talking about? GDDR5 vs GDDR3 is the only difference between the 4850 and the 4870... I'd say it's a pretty big difference.

  10. #110
    Xtreme Member
    Join Date
    Jul 2007
    Posts
    371
    Quote Originally Posted by SNiiPE_DoGG View Post
...what are you talking about? GDDR5 vs GDDR3 is the only difference between the 4850 and the 4870... I'd say it's a pretty big difference.
    I don't quite get your point. I'm just arguing that GDDR5 is about 10-15% slower than GDDR3 given the same bandwidth.

  11. #111
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by Boissez View Post
The HD4850 would be about 10-15% slower if it had GDDR5 instead of GDDR3. I'd say it's relevant.
    erm... no.

Those graphs are comparing GDDR3 with GDDR5 at half the speed of the GDDR3. For example, in that case it's 993MHz GDDR3 versus 497MHz GDDR5. However, it would be pretty stupid for the HD4850 to have 500MHz GDDR5; if the HD4850 had GDDR5, it would be 750MHz minimum.


    Quote Originally Posted by Boissez View Post
    I don't quite get your point. I'm just arguing that GDDR5 is about 10-15% slower than GDDR3 given the same bandwidth.
No, it's given the same... let's call it the same "single data-rate clockspeed".

It's always been like that. DDR was already slower than SDR if the DDR was underclocked to half the speed of the SDR.
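A worked version of that comparison, using the same clocks cited above; the 2x and 4x multipliers are the standard data rates for GDDR3 and GDDR5:

    Code:
    # Same-bandwidth setup: 993MHz GDDR3 (double data rate) versus
    # 497MHz GDDR5 (quad data rate) on the same bus width.
    print(993 * 2)  # 1986 MT/s effective for the GDDR3
    print(497 * 4)  # 1988 MT/s effective for the GDDR5

    # Bandwidth comes out effectively identical, but the GDDR5 command
    # clock is half as fast, so every timing counted in cycles takes
    # twice as long in nanoseconds. At the 750MHz+ clocks GDDR5 actually
    # ships at, that cycle-time penalty shrinks accordingly.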



    Quote Originally Posted by Macadamia View Post
    What if the 4750 matched the 9600GT in cases where bandwidth limitations weren't that severe?


I heard that was their target previously: the 9800 for the high-end version and the 9600 for the GDDR3 one. Granted, the lake of salt is right... there.
Well... ATI has launched its share of highly bottlenecked cards before, like the HD4650 DDR2 (phail...).
Or even worse: the FirePro V3750, which is a 320sp RV730 with a 64-bit memory controller (yeah, the original 320sp R600 had 8x more bandwidth).
    Last edited by ToTTenTranz; 04-27-2009 at 10:00 AM.

  12. #112
    Xtreme Enthusiast
    Join Date
    Jul 2004
    Posts
    535
    Quote Originally Posted by ToTTenTranz View Post
erm... no.

Those graphs are comparing GDDR3 with GDDR5 at half the speed of the GDDR3. For example, in that case it's 993MHz GDDR3 versus 497MHz GDDR5. However, it would be pretty stupid for the HD4850 to have 500MHz GDDR5; if the HD4850 had GDDR5, it would be 750MHz minimum.



No, it's given the same... let's call it the same "single data-rate clockspeed".

It's always been like that. DDR was already slower than SDR if the DDR was underclocked to half the speed of the SDR.
That's exactly what the dude is saying. GDDR5 will be slower than GDDR3 at the same *effective* clockspeed. Ergo, while the 4770 can get in the same ballpark as the GTS 250 in terms of bandwidth, its memory subsystem will still probably be slower on the whole. Of course, take all this speculation with a grain of salt, as the GTS 250 is a lot older, the memory controller in the HD 4770 is tweaked vs. the 4800 series, and the less complex 128-bit interface might result in somewhat lower latency (whether or not that would be significant, I have no idea).
    Last edited by hurleybird; 04-27-2009 at 09:41 AM.

  13. #113
    Xtreme Member
    Join Date
    Jan 2007
    Location
    Argentina
    Posts
    412
    Quote Originally Posted by Boissez View Post
    I don't quite get your point. I'm just arguing that GDDR5 is about 10-15% slower than GDDR3 given the same bandwidth.
Not exactly. That is true given the same bandwidth but only while maintaining the timings required for 1000MHz (GDDR5-4000) operation. Running at 500MHz will tolerate tighter timings, thus increasing performance.

Anyway, at the same bandwidth GDDR5 will be slower, but it's far from 10% with adequate timings.
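That trade-off in numbers: a timing specified in clock cycles costs cycles divided by clock in wall-clock time, so halving the clock doubles the nanosecond cost unless the timings tighten. The CAS values below are hypothetical, purely for illustration:

    Code:
    # Latency in nanoseconds = cycles / clock_mhz * 1000.
    def latency_ns(cas_cycles, clock_mhz):
        return cas_cycles / clock_mhz * 1000

    print(latency_ns(20, 1000))  # 20.0 ns at 1000MHz with hypothetical CAS 20
    print(latency_ns(20, 500))   # 40.0 ns if those timings are kept at 500MHz
    print(latency_ns(12, 500))   # 24.0 ns once timings are tightened for the lower clock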

  14. #114
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by SNiiPE_DoGG View Post
Can you do some idle power testing in the review to show what the consumption would be -IF- PowerPlay was working? I just wanna know how low it can go.
    In an upcoming article, yes. That will be coupled with some additional cooling tests since right now there is no way to accurately test the heat output of the 40nm core against any other card due to the oddball offset of the heatsink mount. After some preliminary tests with some modded heatsinks, I am certain the results will shock many people who are defending the move to 40nm.

  15. #115
    Registered User
    Join Date
    Mar 2009
    Posts
    89
    Did you get 2 cards to do some crossfire testing SKYMTL?

  16. #116
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Macadamia View Post
I heard that was their target previously: the 9800 for the high-end version and the 9600 for the GDDR3 one. Granted, the lake of salt is right... there.
Great for them, but the 8800 GT / 9800 GT has been their target for the LAST TWO YEARS. Even the HD 4830 didn't win convincingly against the 9800 GT.

  17. #117
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by Wadkiller View Post
    Did you get 2 cards to do some crossfire testing SKYMTL?
As I stated above, getting even one card was a stretch this time. Without ATI's support, websites have to go either to board partners (who didn't actually get the cards until last week) or to our contacts in Asia, who may or may not get them to us in time. As it stands, I have one and will probably receive another a bit later in the week.

  18. #118
    Xtreme Addict
    Join Date
    Jun 2007
    Location
    Thessaloniki, Greece
    Posts
    1,307
    Quote Originally Posted by SKYMTL View Post
    In an upcoming article, yes. That will be coupled with some additional cooling tests since right now there is no way to accurately test the heat output of the 40nm core against any other card due to the oddball offset of the heatsink mount. After some preliminary tests with some modded heatsinks, I am certain the results will shock many people who are defending the move to 40nm.
X1650-compatible heatsinks don't fit?
    Seems we made our greatest error when we named it at the start
    for though we called it "Human Nature" - it was cancer of the heart
    CPU: AMD X3 720BE@ 3,4Ghz
    Cooler: Xigmatek S1283(Terrible mounting system for AM2/3)
    Motherboard: Gigabyte 790FXT-UD5P(F4) RAM: 2x 2GB OCZ DDR3 1600Mhz Gold 8-8-8-24
    GPU:HD5850 1GB
PSU: Seasonic M12D 750W Case: Coolermaster HAF932 (aka Dusty)

  19. #119
    Registered User
    Join Date
    Mar 2009
    Posts
    89
Sorry for asking something you already posted.

I'm really curious as to how the CrossFire will perform. If a single card performs almost on par with a 4850, in CrossFire they'll surely be close as well, right? I'm either getting two of these or two 4850s. Not sure which will be better. It's a pity the stuff is so hellishly expensive in South Africa compared to the USA/UK.

  20. #120
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by BrowncoatGR View Post
X1650-compatible heatsinks don't fit?
Those I don't have, so I don't know if they would fit. If you can find a heatsink with a 4.5cm mounting-hole offset, let me know. For the time being I have drilled some extra holes in an HR-03 GT bracket I had lying around.

  21. #121
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by SKYMTL View Post
    In an upcoming article, yes. That will be coupled with some additional cooling tests since right now there is no way to accurately test the heat output of the 40nm core against any other card due to the oddball offset of the heatsink mount. After some preliminary tests with some modded heatsinks, I am certain the results will shock many people who are defending the move to 40nm.
The only people who should be defending 40nm at this point are ATI, because they can put more cores onto each wafer and start testing and tweaking this new process.
No "new" process has ever been instantaneously beneficial for power consumption and overclocking capacity, as far as I know.




    Quote Originally Posted by SKYMTL View Post
    Great for them but the 8800 GT / 9800 GT has been their target for the LAST TWO YEARS. Even the HD 4830 didn't win convincingly against the 9800 GT.
    I think you got things backwards.
AMD doesn't target GPUs, it targets price points. What chip is in what graphics card, and how old the architecture is, is completely irrelevant to the end user. What matters is what performance+features you get for how much money, period.
And the HD4830 is quite successful at its price point. Where I live, the HD4830 is priced at the level of a 9600GT.


    Quote Originally Posted by hurleybird View Post
That's exactly what the dude is saying. GDDR5 will be slower than GDDR3 at the same *effective* clockspeed. (...)
He did say bandwidth, not clockspeed. If he "meant" clockspeed, all is well.

  22. #122
    Registered User
    Join Date
    Mar 2009
    Posts
    89
    Quote Originally Posted by BrowncoatGR View Post
X1650-compatible heatsinks don't fit?
Yes, they will. According to the Expreview review, they use the same 43mm spacing as the X1650/8600GT/8500GT.

I was kinda worried about that, because I have two DangerDen Maze 4s with both 43mm & 53mm (as on the 8800GT) holes.

  23. #123
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    1,870
    Quote Originally Posted by Boissez View Post
    I don't quite get your point. I'm just arguing that GDDR5 is about 10-15% slower than GDDR3 given the same bandwidth.
You can't really draw that conclusion from the data we have. First of all, Damien's test underclocked the GDDR5 way below its normal operating parameters. Secondly, there is a host of other variables that affect efficiency, including buffer sizing and the command protocols used by the memory controllers.

So simply underclocking GDDR5 to half its speed and saying "see, it's slower than GDDR3 at the same speed" isn't a very scientific test. In other words, there's no way we can say that the RV740's 72GB/s isn't as good as the G92's 72GB/s.

  24. #124
    Xtreme Guru
    Join Date
    Aug 2007
    Posts
    3,562
    Quote Originally Posted by ToTTenTranz View Post
The only people who should be defending 40nm at this point are ATI, because they can put more cores onto each wafer and start testing and tweaking this new process.
I agree, but they are marketing it as efficiency personified.

AMD doesn't target GPUs, it targets price points. What chip is in what graphics card, and how old the architecture is, is completely irrelevant to the end user. What matters is what performance+features you get for how much money, period.
And the HD4830 is quite successful at its price point. Where I live, the HD4830 is priced at the level of a 9600GT.
Upon its release, the HD 4830 was at the same price point as the 9800 GT, to the dollar here in Canada. It was only through the latest round of price reductions that the HD 4830 is now able to compete with the 9600 GT on price.

Pricing based on a feature list is completely understandable, but there comes a price point where people start looking less and less at features and more at gaming capacity. To me, that is right around the $100 - $120 price points and above, since below that you can get the same features (albeit less gaming potential) for much less money.
    Last edited by SKYMTL; 04-27-2009 at 10:53 AM.

  25. #125
    Xtreme Addict
    Join Date
    Dec 2007
    Posts
    1,326
    Quote Originally Posted by SKYMTL View Post
I agree, but they are marketing it as efficiency personified.
    You mean the marketing guys are trying to deceive people?!?? Now that's a first!



    Quote Originally Posted by SKYMTL View Post
Upon its release, the HD 4830 was at the same price point as the 9800 GT, to the dollar here in Canada. It was only through the latest round of price reductions that the HD 4830 is now able to compete with the 9600 GT on price.
So you agree with me when I say the HD4830 is, right now, a successful product in its price range? Of course, an RV770, even a salvaged one, should be more expensive to make than the G94b, but that's one of the main reasons for the RV740's appearance anyway.
    Last edited by ToTTenTranz; 04-27-2009 at 11:08 AM.

