ATI 9700 with 5 years can do ray-tracing....
AliG - 160GB/sec? Sorry, but you've been on the funny sauce again. It's a 48-lane PCIe 2.0 bridge chip from either PLX or IDT (more than likely PLX). That's 16 lanes for each GPU and 16 lanes for the bus connection, which means at most 32 GB/sec of bandwidth between the GPUs.
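For anyone who wants to sanity-check those numbers, here's a rough sketch of the arithmetic. I'm assuming the usual PCIe 2.0 rate of roughly 500 MB/s per lane per direction after 8b/10b encoding overhead; that assumption is mine, not something from the bridge chip spec sheet.
Code:
# Assumed: PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding,
# giving ~500 MB/s per lane per direction.
MB_PER_LANE_PER_DIR = 500
LANES_PER_LINK = 16              # 48-lane bridge split x16 / x16 / x16

per_direction  = LANES_PER_LINK * MB_PER_LANE_PER_DIR / 1000.0  # 8 GB/s one way
per_gpu_link   = 2 * per_direction                               # 16 GB/s full duplex per GPU link
both_gpu_links = 2 * per_gpu_link                                # 32 GB/s counting both GPU links

print(per_direction, per_gpu_link, both_gpu_links)               # 8.0 16.0 32.0
Counting both directions on both GPU links is how you get to the ~32 GB/s ceiling; any single transfer in one direction is still capped at around 8 GB/s per link.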
Anyone hear about 4870's refusing to post? My new card bit the dust this morning during a reboot after working fine for the last 24 hours. There's another guy over in the general hardware forum who has the same problem on 2 different 4870's.
hey guys, UPS says my 4870's are out for delivery, so I will be doing my own mini review of CrossFire scaling and such and I'll post it up sometime tomorrow night :) Testbed will be the system in my sig.
That's what the ATI representative said; don't shoot the messenger. I'll dig up the link. The only thing I don't get is why ATI needs a fancy bridge chip when the 9800GX2 clearly did fine without one; all it does is add power consumption.
And you said I'm on the funny sauce again; when was the first time? :rolleyes:
http://www.hardforum.com/showthread.php?t=1319753
2x4870 real life comparison against GTX 280 from an owner. Nicely written IMO.
What I can't believe is the power consumption of the 4870, especially in CF. It's just nuts.
I'm serious. I believe the only reason they have the bridge chip is to increase the bandwidth between the two GPUs for a faster interconnect, since they've claimed there is no micro-stuttering with both the 3870 X2 and the 4870 X2. Of course, later on they admitted the micro-stuttering on the 3870 X2 was horrible because its bridge chip didn't provide enough bandwidth (8 GB/s) and the extra latency made it worse than regular CrossFire.
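Just to put that 8 GB/s figure in context, here's a hedged back-of-the-envelope comparison. I'm assuming the old 3870 X2 bridge ran at PCIe 1.1-class rates (~250 MB/s per lane per direction) and the new one at PCIe 2.0 rates (~500 MB/s), with an x16 link to each GPU counted in both directions.
Code:
# Assumed per-lane rates (per direction, after encoding overhead):
# PCIe 1.1 ~250 MB/s, PCIe 2.0 ~500 MB/s.
def link_bandwidth_gbps(lanes, mb_per_lane_per_dir):
    """Full-duplex bandwidth of one bridge-to-GPU link, in GB/s."""
    return 2 * lanes * mb_per_lane_per_dir / 1000.0

old_bridge = link_bandwidth_gbps(16, 250)   # 3870 X2 era bridge: 8.0 GB/s
new_bridge = link_bandwidth_gbps(16, 500)   # PCIe 2.0 bridge:    16.0 GB/s

print(old_bridge, new_bridge)               # 8.0 16.0
So on paper the new bridge doubles the per-GPU link bandwidth; whether that alone is what fixes the micro-stuttering is obviously a separate question.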
That was interesting info for me:
Anyone able to confirm this from personal experience?
Quote:
The GTX 200 series power saving features do NOT work on an Intel platform. That's right. I thought I was the odd one out when I saw countless other websites posting about the down-clocks and down-voltages, whereby the GPU settles at 300MHz and the memory at 100MHz, etc., when no 3D app is running. What some of them failed to state was that they only achieved this on an Nvidia platform, while ALL the reviews out there failed to warn consumers that this does NOT work on an Intel platform at all. The write-ups out there further confuse you by stating the downclocks are achieved in drivers, while only the full hybrid power features, i.e., shutting down the GPU and offloading 2D to the onboard IGP, require an Nvidia platform. They just don't know what they are talking about.
This was confirmed after a 30-minute phone conversation with an engineer at XFX, the manufacturer of my GTX 280.
What this translates to is that all the idle power consumption graphs you saw in reviews of the GTX 200 series do NOT apply to an Intel platform. Again, out of the 15-odd reviews out there, not one, I repeat, NOT A SINGLE ONE, bothered to state this clearly. What's the user footprint in the real world with an Nvidia platform? 0.1% of the PC population?
The GPU temp does drop back to the 50s while idling in Windows, but the clocks do not scale back, which leads me to believe there are no real power savings.
If you mean clocking down when not in 3D, that works just fine for me on an Intel chipset.
Guess it works.
Quote:
Edit:
There's been more than one response saying the downclock works. I stand corrected, and it must have been due solely to my copy of the XFX card. The engineer I spoke to may not be experienced enough with the GTX 280 to identify whether it's the card or the series at fault.
That guy sounds like an AMD/ATI infomercial though! It almost sounds too good to be true, but I'm rooting for the underdog here... hope it is true
I stopped reading when he was attacking reviewers' credibility and then proceeded to use his "trusty right foot" for temperature testing. :)
That, along with many other flaws/contradictions.
Would an 850W PSU be enough for 4870 1GB CF?