How does everyone else feel about the 128-bit bus? I don't know if I can tear myself away from the 256-bit cards.
- opinions?
I second pinball with Alternate Extraball Rendering
3570K @ 4.5Ghz | Gigabyte GA-Z77-D3H | 7970 Ghz 1100/6000 | 256GB Samsung 830 SSD (Win 7) | 256GB Samsung 840 Pro SSD (OSX 10.8.3) | 16GB Vengeance 1600 | 24'' Dell U2412M | Corsair Carbide 300R
Once again it looks like ATI's PowerPlay is not working...
RiG1: Ryzen 7 1700 @4.0GHz 1.39V, Asus X370 Prime, G.Skill RipJaws 2x8GB 3200MHz CL14 Samsung B-die, TuL Vega 56 Stock, Samsung SS805 100GB SLC SDD (OS Drive) + 512GB Evo 850 SSD (2nd OS Drive) + 3TB Seagate + 1TB Seagate, BeQuiet PowerZone 1000W
RiG2: HTPC AMD A10-7850K APU, 2x8GB Kingstone HyperX 2400C12, AsRock FM2A88M Extreme4+, 128GB SSD + 640GB Samsung 7200, LG Blu-ray Recorder, Thermaltake BACH, Hiper 4M880 880W PSU
SmartPhone Samsung Galaxy S7 EDGE
XBONE paired with 55''Samsung LED 3D TV
No. To be brutally honest with you, ATI cut us out of this one for some reason. Be that as it may, I have cards winging their way over as we speak. The review just won't be up on launch date.
As for the PowerPlay issue, the vast majority of consumers and reviewers seem to have forgotten everything that was originally promised with PowerPlay. We were supposed to get great idle power consumption and temperatures through what ATI calls Clock Gating. However, it seems that neither the dynamic clock speed adjustments nor the dynamic voltage reduction has been happening on current ATI cards through the Catalyst drivers. Unless GPU-Z is wrong in the pictures above, there are very few clock speed reductions at idle, even with this new card.
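If anyone wants to check this on their own card rather than trusting a single GPU-Z screenshot, let GPU-Z write a sensor log while the system sits at the desktop and then scan it with something like this (a rough sketch; the column names are assumptions, so check the header row of your own log first):

```python
# Minimal sketch: scan a GPU-Z sensor log for idle clock behaviour.
import csv

CORE_COL = "GPU Core Clock [MHz]"    # assumed header name -- check your log
MEM_COL = "GPU Memory Clock [MHz]"   # assumed header name -- check your log

with open("GPU-Z Sensor Log.txt", newline="") as f:
    rows = list(csv.DictReader(f, skipinitialspace=True))

cores = [float(r[CORE_COL]) for r in rows]
mems = [float(r[MEM_COL]) for r in rows]

print(f"core: min {min(cores):.0f} / max {max(cores):.0f} MHz")
print(f"mem:  min {min(mems):.0f} / max {max(mems):.0f} MHz")
# If min equals max after sitting at the desktop, no downclocking happened.
```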
If you need any more proof of ATI's issues with idle power consumption, take a look at any review out there that pits the higher-end ATI cards against their Nvidia competition. Just omit the 9800 GTX+.
Well, why don't we wait until you get it, seeing as your power reviews are the best there are?
It's possible ATI said "hey, this chip is so low-power we don't need to downclock", but we will see. GDDR5 is power hungry...
From my personal tests with the HD4870 I've concluded that Clock Gating is working as advertised. The problem lies in the lack of ability to downclock GDDR5 memory on the fly.
I did a simple test back when the cards arrived. Using my Kill-A-Watt I first measured the idle power consumption of my system with the card clocked in CCC at 500/900, then I manually reduced the core clock to 160MHz, leaving memory at 900MHz. This saved me 1W off total system power consumption. On the other hand, downclocking the memory from 900MHz to 465MHz saved me 40W!
I think the problem with GDDR5 lies in its specifics, like clock retraining for different frequencies, and hopefully a newer memory controller will fix that. For now, every time I change the memory clock on my cards the screen flickers for a fraction of a second (probably retraining during that time).
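To put those two results side by side, here's the arithmetic as a quick sketch (the watt figures are my whole-system Kill-A-Watt readings from above):

```python
# Wall-power deltas from the Kill-A-Watt test above (whole-system watts).
core_drop_mhz = 500 - 160   # core underclocked in CCC, memory untouched
core_saved_w = 1
mem_drop_mhz = 900 - 465    # memory underclocked, core untouched
mem_saved_w = 40

print(f"core: {core_saved_w / core_drop_mhz * 100:.2f} W per 100 MHz dropped")
print(f"mem:  {mem_saved_w / mem_drop_mhz * 100:.2f} W per 100 MHz dropped")
# ~0.3 W/100MHz on the core vs ~9.2 W/100MHz on the memory:
# the GDDR5 clock, not the core clock, dominates idle draw here.
```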
Yes and no. The issue is that Clock Gating was supposed to work for both the memory AND the GPU core. To be honest with you though, I have tried the same test with a clamp meter and an HD 4870 1GB plugged into a PCI-E daughter board, and reducing the core clock to 200MHz with the 9.3 drivers (at the time) reduced power consumption by a good 25W at idle, if I remember correctly.
I think these variances in readings largely come down to moving from one driver to the next. It almost seems like ATI is still fiddling with the settings every now and then.
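For anyone repeating this with a clamp meter: a 25W drop on the 12V lines is only about a 2A swing on the meter, which is worth keeping in mind. A quick sketch, with hypothetical readings, assuming all the saving shows up on the 12V rails:

```python
# Clamp-meter amps on the 12V PCI-E lines -> watts.
RAIL_V = 12.0

def watts(amps: float) -> float:
    return amps * RAIL_V

idle_stock_a, idle_downclocked_a = 5.4, 3.3   # hypothetical clamp readings
saved = watts(idle_stock_a) - watts(idle_downclocked_a)
print(f"saved {saved:.1f} W ({idle_stock_a - idle_downclocked_a:.1f} A swing)")
# 25 W at 12 V is only ~2.1 A, so the meter's resolution matters a lot.
```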
GPU-Z's main window reports my HD4890 idles at 750 core / 850 mem, even though core and mem are both at 250MHz, while stock is the usual 850/975.
You were not supposed to see this.
GPU-Z's realtime clock reading isn't working correctly. Also, I think the number of ROPs is wrong.
ATI HD4xxx power management has a problem with changing memory clocks: it causes flickering. That's why all (?) production boards ship with the memory clock set to the same frequency for all power states.
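To illustrate, a board's power-state table conceptually looks something like the sketch below. The clocks and voltages are made up; the point is the pinned memory clock:

```python
# Illustrative only: the *shape* of a power-state table, with made-up
# numbers. Note the memory clock pinned identical in every state, which
# is how boards dodge the GDDR5 retraining flicker at the cost of idle power.
power_states = {
    "idle": {"core_mhz": 240, "mem_mhz": 900, "core_mv": 950},
    "uvd":  {"core_mhz": 500, "mem_mhz": 900, "core_mv": 1100},
    "3d":   {"core_mhz": 750, "mem_mhz": 900, "core_mv": 1263},
}
assert len({s["mem_mhz"] for s in power_states.values()}) == 1
```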
Do you guys think the DD Maze 4 will fit on the 4770 with the stock mounting mechanism that came with the GPU block?
Is the image flicker a problem with GDDR5 memory or with the 700-series GPUs themselves? Does the HD4850 flicker too when memory is downclocked?
I don't think it's a big deal, since the image corruption only lasts a fraction of a second at the moment the memory is downclocked. It's not like the image is glitched constantly whenever memory clocks are lower than stock; it happens just for a quarter of a second. Maybe AMD should've just ignored it and set the cards to idle at low mem clocks to cut idle consumption, flicker or not...
Would of course prefer that they didn't flicker.
:P
I would prefer the flicker when changing modes rather than high power consumption. Looking forward to the reviews with a good focus on power consumption.
Crunching for Comrades and the Common good of the People.
This looks like a nice card!
Too bad I just scrapped my budget build, but this card looks like a good match for low-cost dual cores in cheap gaming PCs.
Crossfire is amazing. 16x adaptive AA in everything at 1920x1200 is sweet. And just about every modern game does support it.
GDDR5 seems to kill the idle power numbers yet again. And the HD4850 looks quite a bit better in performance.
WOW, they got the core to 900MHz without too much hassle, which gave them almost 10000 marks in Vantage's Performance preset. Not too shabby. Just a pity that the mounting holes are spaced 43mm apart and not 53mm like on the 48xx series cards, so I won't be able to watercool them.
Good article, was a nice read.
The 9800GT is just blown away.
Anyway, Nvidia may still compete well by lowering the 250's price a bit (performance-wise only; the technological gap would be quite big in favor of the 4770).