It would be interesting if there were a non-reference card with just 4 RAM chips; maybe power consumption would be better and there would be fewer of those repeating posts about it.
| Cooler Master 690 II Advanced | Corsair 620HX | Core i5-2500K @ 5.0GHz | Gigabyte Z68XP-UD4 | 2x4096MB G.Skill Sniper DDR3-2133 @ 2134MHz 10-11-10-30 @ 1.55V | 160GB Intel X-25 G2 | 2x 2TB Samsung EcoGreen F4 in RAID 1 | Gigabyte HD 7970 @ 1340MHz/1775MHz | Dell 30" 3007WFP-HC | H2O - XSPC RayStorm and Swiftech MCW82 on an MCP350 + XSPC Acrylic Top, XSPC RX240 and Swiftech MCR220 radiators.
My guess is that the 4750 is the low-power-consumption alternative, since it has GDDR3.
The HD4850 would be about 10-15% slower if it had GDDR5 instead of GDDR3 - I'd say that's relevant.
erm... no.
Those graphs are comparing GDDR3 with GDDR5 at half the speed of GDDR3. For example, in that case it's 993MHz GDDR3 versus 497MHz GDDR5. However, it would be pretty stupid for the HD4850 to have 500MHz GDDR5. If the HD4850 had GDDR5, it would be 750MHz minimum.
No, it's given the same.. let's call it the same "single data-rate clockspeed".
It's always been like that. DDR was already slower than SDR if DDR was underclocked to half the speed of SDR.
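For anyone puzzled by that clock pairing, here is the back-of-the-envelope arithmetic behind it (a minimal sketch in Python; the 256-bit bus width and the 2-vs-4 transfers per clock are assumptions about how the test was configured, not figures quoted in the article):

```python
# Rough raw-bandwidth arithmetic: GDDR3 moves 2 data words per clock,
# GDDR5 moves 4 per command clock, so GDDR5 at half the clock lands on
# roughly the same peak bandwidth. Bus width assumed to be 256-bit.

def raw_bandwidth_gbs(clock_mhz: float, transfers_per_clock: int, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s: clock * transfers per clock * bytes per transfer."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

print(f"GDDR3 @ 993 MHz: {raw_bandwidth_gbs(993, 2, 256):.1f} GB/s")  # ~63.6 GB/s
print(f"GDDR5 @ 497 MHz: {raw_bandwidth_gbs(497, 4, 256):.1f} GB/s")  # ~63.6 GB/s, same ballpark
```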
Well.. ATI has launched their share of highly bottlenecked cards before, like the HD4650 DDR2 (phail...).
Or even worse: the FirePro V3750, which is a 320sp RV730 with a 64bit memory controller (yeah, the original 320sp R600 had 8x more bandwidth).
Last edited by ToTTenTranz; 04-27-2009 at 10:00 AM.
That's exactly what the dude is saying. GDDR5 will be slower than GDDR3 at the same *effective* clockspeed. Ergo, while the 4770 can get in the same ballpark as the GTS 250 in terms of bandwidth, its memory subsystem will still probably be slower on the whole. Of course, take all this speculation with a grain of salt: the GTS 250 is a lot older, the memory controller in the HD 4770 is tweaked vs. the 4800 series, and the less complex 128-bit interface might result in some kind of lower latency (whether or not it would be significant, I have no idea).
Last edited by hurleybird; 04-27-2009 at 09:41 AM.
Not exactly. That is true given the same bandwidth but only while maintaining the timings required for 1000MHz (GDDR5-4000) operation. Running at 500MHz will allow tighter timings, thus increasing performance.
Anyway, at the same bandwidth GDDR5 will be slower, but it's far from 10% with adequate timings.
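To illustrate the timings point (purely illustrative cycle counts and clocks, not actual GDDR5 specs):

```python
# Absolute latency is cycles / clock, so the same silicon that needs roughly
# a fixed number of nanoseconds for an access can run fewer cycles of delay
# at a lower clock - i.e. half the clock tolerates tighter cycle timings.

def latency_ns(cycles: int, clock_mhz: float) -> float:
    """Latency in nanoseconds for a given cycle count at a given clock."""
    return cycles / clock_mhz * 1e3

print(latency_ns(10, 1000))  # 10 cycles @ 1000 MHz -> 10.0 ns
print(latency_ns(10, 500))   # same 10 cycles @ 500 MHz -> 20.0 ns (looser in absolute terms)
print(latency_ns(5, 500))    #  5 cycles @ 500 MHz -> 10.0 ns (tighter timings recover the latency)
```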
In an upcoming article, yes. That will be coupled with some additional cooling tests since right now there is no way to accurately test the heat output of the 40nm core against any other card due to the oddball offset of the heatsink mount. After some preliminary tests with some modded heatsinks, I am certain the results will shock many people who are defending the move to 40nm.
Did you get 2 cards to do some CrossFire testing, SKYMTL?
As I stated above, getting even one card was a stretch this time. Without ATI's support, websites have to go either to board partners (who didn't actually get the cards until last week) or to our contacts in Asia who may or may not get it to us in time. As it stands, I have one and will probably receive another a bit later in the week.
Seems we made our greatest error when we named it at the start
for though we called it "Human Nature" - it was cancer of the heart
CPU: AMD X3 720BE @ 3.4GHz
Cooler: Xigmatek S1283 (terrible mounting system for AM2/3)
Motherboard: Gigabyte 790FXT-UD5P (F4) RAM: 2x 2GB OCZ DDR3-1600MHz Gold 8-8-8-24
GPU:HD5850 1GB
PSU: Seasonic M12D 750W Case: Coolermaster HAF932(aka Dusty)
Sorry for asking something you already posted.
I'm really curious as to how the CrossFire will perform. If a single card performs almost on par with a 4850, in CrossFire they'll surely be close as well, right? I'm either getting two of these or two 4850s. Not sure which will be better. It's a pity the stuff is so hellishly expensive in South Africa compared to the USA/UK.
The only people who should be defending 40nm at this point are ATI, because they can put more cores on each wafer and start testing and tweaking this new process.
No "new" process has ever been instantaneously beneficial for power consumption and overclocking capacity, as far as I know.
I think you got things backwards.
AMD doesn't target GPUs, it targets price-points. What chip is in what graphics card and how old the architecture is, is completely irrelevant for the end user. What matters is what performance+features you get for how much money, period.
And the HD4830 is quite successful at its price point. Where I live, the HD4830 is priced at the level of a 9600GT.
He did say bandwidth, not clockspeed. If he "meant" clockspeed, all is well.
You can't really draw that conclusion from the data we have. First of all, Damien's test underclocked the GDDR5 way below its normal operating parameters. Secondly there are a host of other variables that affect the efficiency including the sizing of buffers and command protocols used by the memory controllers.
So simply underclocking GDDR5 to half its speed and saying "see it's slower than GDDR3 at the same speed" isn't a very scientific test. Or in other words there's no way we can say that RV740's 72GB/s isn't as good as G92's 72GB/s.
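A toy example of that point, with made-up efficiency numbers purely to show the arithmetic (these are not measurements of RV740 or G92):

```python
# Two cards can quote the same raw bandwidth while delivering different
# effective bandwidth, depending on how efficiently the memory controller
# actually uses the bus. The 72 GB/s figure comes from the discussion above;
# the efficiency factors are hypothetical.

raw_gbs = 72.0

for name, efficiency in [("controller A", 0.85), ("controller B", 0.75)]:
    print(f"{name}: {raw_gbs * efficiency:.1f} GB/s effective out of {raw_gbs} GB/s raw")
```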
I agree, but they are marketing it as efficiency personified.
Upon its release the HD 4830 was at the same price point as the 9800 GT. To the dollar here in Canada. It was only through the latest round of price reductions that the HD 4830 is now able to compete price-point-wise with the 9600 GT.
Pricing based on a feature list is completely understandable but there comes a price point where people start looking less and less at features and more about gaming capacity. To me, that is right around the $100 - $120 price points and above since below that you can get the same features (albeit less gaming potential) for much less money.
Last edited by SKYMTL; 04-27-2009 at 10:53 AM.
You mean the marketing guys are trying to deceive people?!?? Now that's a first!
So you agree with me when I say the HD4830 is, right now, a successful product in its price range? Of course, an RV770, even a salvaged one, should be more expensive to make than the G94b, but that's one of the main reasons for the RV740's appearance anyway.
Last edited by ToTTenTranz; 04-27-2009 at 11:08 AM.