-
UPDATED! 5870X2 & 5850X2 on display
-
They'll probably keep it in the "buffer" for the day Nvidia releases the GTX 380.
-
Holy :banana::banana::banana::banana:!
How much longer than the 4870X2 is it, 50mm?
I foresee problems with "too short" computer cases!
-
Quote:
Originally Posted by
Kallenator
Holy :banana::banana::banana::banana:!
How much longer than the 4870X2 is it, 50mm?
I foresee problems with "too short" computer cases!
email to ATI: My 5870x2 hits my HDDs, you guys suck!
email from ATI: Get an SSD you nub, you have the money for it!
-
Cut out the HDD cage for my V5 6000 once ;), I can do it again lol
-
Quote:
Originally Posted by
Manicdan
email to ATI: My 5870x2 hits my HDDs, you guys suck!
email from ATI: Get an SSD you nub, you have the money for it!
Lmao, man. Warning: you may need to cut your PC case to fit it in.
-
Quote:
Originally Posted by
Manicdan
email to ATI: My 5870x2 hits my HDDs, you guys suck!
email from ATI: Get an SSD you nub, you have the money for it!
i think this beast would only fit into my case if there were no hdds/ssds at all :p:
but tbh, it doesn't look that much longer than a 5870...
btw, check this thread: http://www.xtremesystems.org/forums/...d.php?t=234555
http://img6.imageshack.us/img6/8716/5870family.jpg
-
Wow, that's a big card. Don't you think they should change the cooling? I would think that one fan is going to have a lot of work ahead of it.
-
Damn it, I wish ATi or Nvidia would make some power-efficient and "smaller" graphics cards.
BTW,
Is it just me, or are graphics cards the only hardware that keeps getting larger?
-
in for 2
but can it play crysis?
-
that pic makes the X2 look larger than it is; it's less than 1cm longer than the 5870
-
Quote:
Originally Posted by
Jamesrt2004
that pic makes the X2 look larger than it is; it's less than 1cm longer than the 5870
if that's the case i really wonder why they didn't make the 5870 shorter :confused:
-
Quote:
Originally Posted by
RaZz!
if that's the case i really wonder why they didn't make the 5870 shorter :confused:
i hope it's because they packed a big cooler under that shroud
-
Quote:
Originally Posted by
Bobbylite
in for 2
but can it play crysis?
Starting with this gen, this joke has lost its reason to be :P
Crysis is totally playable at 1920x1080 with a single HD5870, so for the X2 it will be child's play.
Maybe even at 2560x1600, who knows.
Also, holy crap! that's what i call a brick card!
-
Are they gonna release a 2GB and 4GB version?
-
i want a 4GB version, mmm. i don't want to know the heat dissipation on this thing. my 4870x2 is a space heater right now. i wonder if the Arctic Cooling Accelero Xtreme 4870X2 will fit this thing.
-
Holy god! *insert e-p*nis joke here!*
-
Am I seeing things, or is that still 6+8-pin? Pretty amazing they managed to fit that in considering its power :eek:
-
For such a length I think it would be good to do some redesign of the cooling. The fan won't be very effective at low speeds, and at high speeds it's loud. Either burn the card or hurt your ears, sounds like a nice deal! :p:
I'd expect around 90C load @ stock in a case with mediocre cooling and around 85C on an open bench / with optimal case airflow @ 40% fan speed. At 100% you'd probably shave off 8~9C, but who really wants to run it at that speed when not benching? But yeah, it's not like this is the first time ATI has been confident despite their cards being in the 90~100C load range.
-
looks like another warm winter in my house....
-
Quote:
Originally Posted by
purecain
looks like another warm winter in my house....
Russian winters are cold, we should get these for free! :rofl:
-
Is it just me or does that pic look entirely photoshopped? The pic posted by RaZz has the same lighting on the same spots for both the 5870 and the 5870 six.
-
Quote:
Originally Posted by
DdotRoq
Is it just me or does that pic look entirely photoshopped? The pic posted by RaZz has the same lighting on the same spots for both the 5870 and the 5870 six.
It's not photoshopped, I made it in MS Paint. The 5870 Six & 5870 vanilla have the same shroud, fan, PCB, & GPU. All that differs between the two is 6x mini-DP vs. 2x DVI/DP/HDMI, 2GB vs. 1GB of 0.4ns (5Gbps) Samsung memory, and 8+6-pin vs. 2x 6-pin power.
-
Jesus, they keep reducing the size and power consumption of all the other components only to transfer all of that to the card. But anywho: when, when, when, and will the AC 4870X2 cooler fit? Finally, I can buy something this long. All hail the 5870X2.
-
Quote:
Originally Posted by
trans am
They responded " All your base are belong to us ":shocked:
-
It looks like 15-18" long. It's kinda heavy.
-
ATI really beats Nvidia :)
-
Quote:
Originally Posted by
Jamesrt2004
that pic makes the X2 look larger than it is; it's less than 1cm longer than the 5870
Any source for the above?
The HD 5870 is 10.5" long, the same length as the HD 4870 X2 and the single-GPU Nvidia 8800 GTX.
I had a mid-tower Antec P160W and those cards fit with at least 1cm to spare.
Anyway, now I have the Cosmos S case, so the size is not going to be any problem, even if it's much longer.
-
Quote:
Originally Posted by
LC_Nab
They responded " All your case are belong to us ":shocked:
*FIXED*
-
Quote:
Originally Posted by
damtachoa
It looks like 15-18" long. It's kinda heavy.
that's what she said
sorry had to
-
prepare for it.....
it's over 9000!!
-
Quote:
Originally Posted by
purecain
looks like another warm winter in my house....
just hook it up to ur central heating m8 ...sorted, and the wife's happy too lol
-
I don't think there's a PLX chip onboard...
-
Quote:
Originally Posted by
largon
I don't think there's a PLX chip onboard...
Does that mean it's running on SidePort? I guess we'll have to wait for the drip feed of spy shots of naked 12" c***s ;-)
-
After driver issues with the 4870x2, I'll be skipping this card.
Although I hope ATI can work out most of the bugs before they release this X2.
-
Quote:
Originally Posted by
safan80
After driver issues with the 4870x2, I'll be skipping this card.
Although I hope ATI can work out most of the bugs before they release this X2.
For me there are only driver issues if I was careless when updating them. If upgrading to a 5870x2 is faster than the 4870x3, I might consider it.
-
Quote:
Originally Posted by
initialised
For me there are only driver issues if I was careless when updating them. If upgrading to a 5870x2 is faster than the 4870x3, I might consider it.
I bought the 4870x2 at launch and it took ATI 4 months to improve their drivers. They had a lot of trouble with CrossFire and speed stepping. Disabling Catalyst AI was not a fix for the X2 because it disabled the second GPU.
-
Quote:
Originally Posted by
initialised
*FIXED*
LOL hahahahha better :D xD
-
Quote:
Originally Posted by
safan80
I bought the 4870x2 at launch and it took ATI 4 months to improve their drivers. They had a lot of trouble with CrossFire and speed stepping. Disabling Catalyst AI was not a fix for the X2 because it disabled the second GPU.
A single 4870x2 operates in Xfire mode ..?
-
Quote:
Originally Posted by
Bobbylite
in for 2
but can it play crysis?
Depends on your definition of play. If 30-35fps at 1920x1200xVH with no AA works for you, even 1 HD5870 can "play Crysis." If you are talking about 60+ fps, a pair of 5870X2s should handle it even with terrible CF scaling; I think Triple SLI GTX 285s are in the mid/low 50 fps range.
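As a rough sketch of that estimate (the ~35 fps baseline and the pessimistic 50% per-extra-GPU scaling factor are assumptions for illustration, not benchmark numbers):

```python
# Back-of-envelope CrossFire scaling estimate.
# Assumptions (illustrative): ~35 fps for a single HD 5870 at
# 1920x1200/Very High, and each extra GPU adds only 50% of the
# baseline ("terrible" scaling).
BASELINE_FPS = 35.0
SCALING_PER_EXTRA_GPU = 0.5

def estimated_fps(num_gpus: int) -> float:
    """First GPU counts in full; each additional GPU adds a scaled share."""
    return BASELINE_FPS * (1 + SCALING_PER_EXTRA_GPU * (num_gpus - 1))

for gpus in (1, 2, 4):  # one 5870, one X2, a pair of X2s
    print(f"{gpus} GPU(s): ~{estimated_fps(gpus):.0f} fps")
# 4 GPUs -> ~88 fps, comfortably past 60 even with poor scaling.
```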
-
They'd better get CrossFire working with Eyefinity pretty fast. That's going to be a killer requirement for multi-monitor setups.
-
http://img30.imageshack.us/img30/7902/62915902.jpg
HD58x0X2 :cool: - notice the soldering space for 2x 8-pin PCIe power, but it's using 1x 6-pin and 1x 8-pin
-
any pics of the other side?
-
Glad they fixed their BIOS on the 58XX; about time they figured out that the memory caused the heat and dropped its speed to 300MHz at idle. I'd been doing the RivaTuner auto-clock fix for a while after getting my 4870X2: 40C idle, 67C load on the stock cooler :D. Before doing that I had a room heater, as others mentioned: 78C idle, 105C load...
I got my Accelero Xtreme cooler for it in June; temps are now 37C idle and 55C load @ default PowerPlay clocks, and SILENT. Yes, that stock cooler sucks, and I hope my AC cooler still fits the new X2! :D
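For intuition on why a memory downclock helps idle temps so much, here's a minimal sketch of the usual first-order dynamic power model, P ∝ f·V². The 900MHz stock GDDR5 clock and the flat-voltage assumption are mine, not from the post:

```python
# First-order dynamic power model: P_dynamic is proportional to f * V^2.
# Assumptions (illustrative, not board measurements): stock GDDR5 at
# 900 MHz, idle downclock to 300 MHz, memory voltage unchanged.
def dynamic_power_ratio(f_new_mhz: float, f_old_mhz: float,
                        v_new: float = 1.0, v_old: float = 1.0) -> float:
    """Relative dynamic power after a clock/voltage change."""
    return (f_new_mhz / f_old_mhz) * (v_new / v_old) ** 2

ratio = dynamic_power_ratio(300, 900)
print(f"Idle memory dynamic power: ~{ratio:.0%} of stock")  # ~33%
```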
-
Quote:
Originally Posted by
DdotRoq
Is it just me or does that pic look entirely photoshopped? The pic posted by RaZz has the same lighting on the same spots for both the 5870 and the 5870 six.
Agreed with you, it looks photoshopped:
http://i239.photobucket.com/albums/f...alsmith/x2.jpg
Look at that PCIe bridge, it looks like wheels...
-
I hope the 5870x2 doesn't have the same growing pains that 4870x2 users have mentioned. I had my fair share of issues (driver-wise) with mine from launch until about January '09. Perhaps the Catalyst team has learned from their mistakes / road bumps.
It's funny you brought up the Accelero. I was waiting for those things forever but they never came in time... maybe someone will offer a 3rd-party air solution for the 5870x2 before it's EOL... lol.
-
Yeah, I had waited since the press release and through a couple of back orders lol... but the profile fix was a lifesaver as I couldn't bear the heat in my room.
Wait and see if I upgrade, but as of now I'm quite happy since I didn't have to WC my setup $$$$ and didn't get any driver issues. Heat pipes FTW! http://www.techpowerup.com/104447/Sa...X_Spotted.html
-
Between the two cores, it looks like a PLX chip is still there :(
-
Quote:
Originally Posted by
trans am
:lol:
-
Quote:
Originally Posted by
Origin_Unknown
any pics of the other side?
updated first post
-
Quote:
Originally Posted by
jaredpace
updated first post
Someone's been watching too many conspiracy-themed movies to pull this off :rofl::ROTF:
-
Quote:
Originally Posted by
jaredpace
updated first post
so the 5850 X2 will be about the same size as the 5870?
-
Quote:
Originally Posted by
blindbox
Someone's been watching too many conspiracy-themed movies to pull this off :rofl::ROTF:
haha! I'm just going by what cowboy was saying on the forum. I thought it was BS til the pics came up.
-
is that a PLX chip on the 5850 X2? eww, so it's the same story again.....
-
aaah, how i love such leaks:D
wonderful...
-
Quote:
Originally Posted by
eric66
is that a PLX chip on the 5850 X2? eww, so it's the same story again.....
Was SidePort rumored to make it into the 58xx X2 family? I don't think so, but I'm still disappointed
-
It's soooo big!! Whose case can handle it? :O
-
Is ATI still headquartered in Canada?
-
Quote:
Originally Posted by
munim
Is ATI still headquartered in Canada?
Not sure, but the cards are made there:
http://i35.tinypic.com/8wjuw7.jpg
-
Quote:
Originally Posted by
munim
Is ATI still headquartered in Canada?
Yes, Markham, Ontario, near Toronto.
-
Quote:
Originally Posted by
The Coolest
Between the two cores, it looks like a PLX chip is still there :(
Yep, colour me disappointed as well.
-
Quote:
Originally Posted by
eleeter
Yep, colour me disappointed as well.
Pardon me for asking a noob question, but what would've been ideal instead of the PLX?
-
Quote:
Originally Posted by
kadozer
Pardon me for asking a noob question, but what would've been ideal instead of the PLX?
Lucid Hydra FTW!!! Of course I think everyone was hoping for SidePort.
-
Now a Lucid Hydra 200 chip on the card would surely make things interesting.
I am not gonna get my hopes up though.
-
Quote:
Originally Posted by
kadozer
Pardon me for asking a noob question, but what would've been ideal instead of the PLX?
Any method that is transparent to software. I don't want CrossFire on a stick; I was really hoping AMD would take the X2 to the next level, where each GPU had a shared memory pool etc., in other words an actual dual-core GPU in function if not in silicon.
-
Quote:
Originally Posted by
eleeter
Any method that is transparent to software. I don't want CrossFire on a stick; I was really hoping AMD would take the X2 to the next level, where each GPU had a shared memory pool etc., in other words an actual dual-core GPU in function if not in silicon.
+1
Which is why I have decided to go with "just" one 5870 instead of my 4870x2.
For me it's a no-brainer. Less power usage at about the same performance level.
-
Quote:
Originally Posted by
eleeter
Any method that is transparent to software. I don't want CrossFire on a stick; I was really hoping AMD would take the X2 to the next level, where each GPU had a shared memory pool etc., in other words an actual dual-core GPU in function if not in silicon.
It should have been obvious that wasn't going to happen this generation. It may happen eventually, but clearly not on the current architecture.
-
How was memory reported in CCC for HD4870X2? As 1GB or 2GB??
-
Quote:
Originally Posted by
Lightman
How was memory reported in CCC for HD4870X2? As 1GB or 2GB??
1GB, same as GPU-Z.
-
Quote:
Originally Posted by
jfromeo
1GB, same as GPU-Z.
Thanks!
-
Quote:
Originally Posted by
eleeter
Any method that is transparent to software. I don't want CrossFire on a stick; I was really hoping AMD would take the X2 to the next level, where each GPU had a shared memory pool etc., in other words an actual dual-core GPU in function if not in silicon.
Thanks for explaining. Here's to hoping for next year (the 6 series), since the 5 series is supposedly the last generation of AMD cards based on the older architecture.
-
Quote:
Originally Posted by
Xoulz
A single 4870x2 operates in Xfire mode ..?
I bought 2 4870X2s at launch.
-
is that going to be reference cooling for the 5850x2? looks pretty decent
-
I hope my full-cover block can be ported to a 5870x2; it'll make a nice spring upgrade once the GT300 lowers the prices.
-
Quote:
Originally Posted by
cky2k6
I hope my full-cover block can be ported to a 5870x2; it'll make a nice spring upgrade once the GT300 lowers the prices.
I highly doubt it... :(
-
Quote:
Originally Posted by
villa1n
is that going to be reference cooling for the 5850x2? looks pretty decent
I am not sure here, that's two very hot chips. We'll see, I guess! :D
-
Quote:
Originally Posted by
grimeleven
Glad they fixed their BIOS on the 58XX; about time they figured out that the memory caused the heat and dropped its speed to 300MHz at idle. I'd been doing the RivaTuner auto-clock fix for a while after getting my 4870X2: 40C idle, 67C load on the stock cooler :D. Before doing that I had a room heater, as others mentioned: 78C idle, 105C load...
I got my Accelero Xtreme cooler for it in June; temps are now 37C idle and 55C load @ default PowerPlay clocks, and SILENT. Yes, that stock cooler sucks, and I hope my AC cooler still fits the new X2! :D
Nothing to do with the BIOS; they had to fix the memory controllers and possibly change something on the GDDR5.
There was flickering when the memory clocks changed, which is why they didn't implement the lower memory clocks with PowerPlay.
-
Quote:
Originally Posted by
zalbard
I am not sure here, that's two very hot chips. We'll see, I guess! :D
I prefer open-air cooling into the case because my Ultra Kazes solve that problem for me ^^; with closed reference cooling I'm at its mercy
-
Quote:
Between the two cores, it looks like a PLX chip is still there
Quote:
Any method that is transparent to software. I don't want CrossFire on a stick; I was really hoping AMD would take the X2 to the next level, where each GPU had a shared memory pool etc., in other words an actual dual-core GPU in function if not in silicon.
LMAO at how everyone on this forum thinks they know better than AMD's engineers. You don't, so don't try to give them advice on how to make their GPUs work better. Sure, it's good for AMD to listen to the fans, but only on the topics that our knowledge extends to, and this is way out of our league. It's like complaining that the HD 4870 runs at 90C load; it's perfectly safe, FFS. AMD would not have let it go that high if they were not 100% sure it was safe. But once again, everyone thinks they know how to make graphics cards better than the engineers.
Quote:
Lucid Hydra FTW!!! Of course I think everyone was hoping for SidePort.
:shakes:
-
Quote:
Originally Posted by
WorshipMe
LMAO at how everyone on this forum thinks they know better than AMD's engineers. You don't, so don't try to give them advice on how to make their GPUs work better. Sure, it's good for AMD to listen to the fans, but only on the topics that our knowledge extends to, and this is way out of our league. It's like complaining that the HD 4870 runs at 90C load; it's perfectly safe, FFS. AMD would not have let it go that high if they were not 100% sure it was safe. But once again, everyone thinks they know how to make graphics cards better than the engineers.
:shakes:
90C might be safe, but it severely limits OC'ing headroom; I think that is more what people worry about.
-
Quote:
Originally Posted by
WorshipMe
LMAO at how everyone on this forum thinks they know better than AMD's engineers. You don't, so don't try to give them advice on how to make their GPUs work better. Sure, it's good for AMD to listen to the fans, but only on the topics that our knowledge extends to, and this is way out of our league. It's like complaining that the HD 4870 runs at 90C load; it's perfectly safe, FFS. AMD would not have let it go that high if they were not 100% sure it was safe. But once again, everyone thinks they know how to make graphics cards better than the engineers.
:shakes:
Totally agree with this.
You'd think the ENGINEERS that actually design these would know how to make something that performs pretty damn well. You all keep mentioning SidePort, so I'm PRETTY sure ATi themselves know about it. If they want it in their card, they'll put it in there. If they don't, then they have a reason.
Also, the 5870x2 isn't even released yet, nor do we know the official details of it. Sure, we know the specs since it's essentially two 5870s, but we don't know about the specific design changes compared with the 4870x2. Some of you need to just chill and wait for more info instead of either hyping up the card or bashing it when there isn't even any relevant info out.
Really... not too long ago some of you were praising the 5800 series and couldn't wait for the launch. Now the benchmarks and cards are out and most of you are 'disappointed' or not that impressed. Well, you have yourselves to blame for expecting too much and hyping it up. Every single next-gen release is the same on this forum :shrug:
-
Quote:
Originally Posted by
WorshipMe
LMAO at how everyone on this forum thinks they know better than AMD's engineers. You don't, so don't try to give them advice on how to make their GPUs work better. Sure, it's good for AMD to listen to the fans, but only on the topics that our knowledge extends to, and this is way out of our league. It's like complaining that the HD 4870 runs at 90C load; it's perfectly safe, FFS. AMD would not have let it go that high if they were not 100% sure it was safe. But once again, everyone thinks they know how to make graphics cards better than the engineers.
:shakes:
LMAO at the fact you think ATi is perfect; I don't even know where you got the idea that anyone thought they were superior to ATi's engineers. I wouldn't trust them 100% due to the fact they can't run FurMark on their last gen. To anyone who complains about FurMark being unrealistic: any other GPU runs it just fine, so I don't know why the 4870 shouldn't. I am sure they decided on what is best for the card when it comes to multi-GPU. I don't know the advantages of hardware vs. software multi-GPU, but at least you don't have to wait for driver updates to run games in SLI.
-
Put a backplate on that 5850 X2
-
If the Lucid Hydra is really all that, it's only a question of time before some manufacturer swaps the PLX switch for a Hydra :)
-
Quote:
Originally Posted by
doompc
If the Lucid Hydra is really all that, it's only a question of time before some manufacturer swaps the PLX switch for a Hydra :)
& put a GTX 295 & HD 5870 together, na! :rofl:
Hydra can work that way also ....................
-
Quote:
Originally Posted by
villa1n
90C might be safe, but it severely limits OC'ing headroom; I think that is more what people worry about.
This is true, but for one the 4870x2 doesn't clock much higher, and secondly it has shown little benefit at higher clocks. They have to achieve a somewhat humane noise level, so they compromise with the high load temperatures. Running the fan at full blast will bring them down a lot, but I personally find the 4870x2 fan way too loud past 50% (I've rarely seen it past 45% on the auto profile under most circumstances). I expect similar if not slightly higher temps / heat output with the new X2(s).
-
Quote:
Originally Posted by
Chickenfeed
This is true, but for one the 4870x2 doesn't clock much higher, and secondly it has shown little benefit at higher clocks. They have to achieve a somewhat humane noise level, so they compromise with the high load temperatures. Running the fan at full blast will bring them down a lot, but I personally find the 4870x2 fan way too loud past 50% (I've rarely seen it past 45% on the auto profile under most circumstances). I expect similar if not slightly higher temps / heat output with the new X2(s).
That's why I wait for non-reference cooling ^^. The Vapor-X version from Sapphire will probably be very quiet. I have Ultra Kazes on my case anyway, so when I'm power-using my comp / gaming, the noise is already there for me.. but with headphones I don't notice it at all. :P I can understand though, having heard my buddy's 4870x2's crossfired with fans at 60%... it's definitely audible.
-
Quote:
Originally Posted by
villa1n
That's why I wait for non-reference cooling ^^. The Vapor-X version from Sapphire will probably be very quiet. I have Ultra Kazes on my case anyway, so when I'm power-using my comp / gaming, the noise is already there for me.. but with headphones I don't notice it at all. :P I can understand though, having heard my buddy's 4870x2's crossfired with fans at 60%... it's definitely audible.
I just wish Arctic Cooling (or someone) would make some new heatsinks that use a full shroud and exhaust but still outperform the reference ones. My biggest issue with aftermarket is that everything but water bleeds all the heat into the case. This is why I never ordered a 4870x2 Accelero (that, and it took 10 years to release). I couldn't imagine the heat from two of those (up to 600 watts of heat...) being bled into a case... Any reduction in noise levels would be lost as you'd need some serious high-RPM exhaust to combat the ambient case heat XD
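To put a rough number on that: a quick sketch of the airflow needed to hold the case air down, using the steady-state relation P = ρ·cp·V̇·ΔT for air. The 600 W figure is from the post above; the 10C air rise is an illustrative assumption:

```python
# Airflow needed to remove a heat load at a given air temperature rise,
# from the steady-state relation P = rho * cp * V_dot * delta_T.
RHO_AIR = 1.2         # kg/m^3, near sea level
CP_AIR = 1005.0       # J/(kg*K)
M3S_TO_CFM = 2118.88  # 1 m^3/s expressed in cubic feet per minute

def required_airflow_cfm(power_w: float, delta_t_c: float) -> float:
    """CFM of through-flow to carry away power_w with a delta_t_c air rise."""
    v_dot = power_w / (RHO_AIR * CP_AIR * delta_t_c)  # m^3/s
    return v_dot * M3S_TO_CFM

# ~600 W bled into the case, keeping case air only 10 C above intake:
print(f"{required_airflow_cfm(600, 10):.0f} CFM")  # ~105 CFM of through-flow
```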
-
Quote:
Originally Posted by
Chumbucket843
LMAO at the fact you think ATi is perfect; I don't even know where you got the idea that anyone thought they were superior to ATi's engineers. I wouldn't trust them 100% due to the fact they can't run FurMark on their last gen. To anyone who complains about FurMark being unrealistic: any other GPU runs it just fine, so I don't know why the 4870 shouldn't. I am sure they decided on what is best for the card when it comes to multi-GPU. I don't know the advantages of hardware vs. software multi-GPU, but at least you don't have to wait for driver updates to run games in SLI.
They forgot to put the rev limiter on the 4800s.
I wonder how many cars would hold up for an hour without a rev limiter with people flooring it.
-
Quote:
Originally Posted by
Chumbucket843
LMAO at the fact you think ATi is perfect; I don't even know where you got the idea that anyone thought they were superior to ATi's engineers. I wouldn't trust them 100% due to the fact they can't run FurMark on their last gen. To anyone who complains about FurMark being unrealistic: any other GPU runs it just fine, so I don't know why the 4870 shouldn't. I am sure they decided on what is best for the card when it comes to multi-GPU. I don't know the advantages of hardware vs. software multi-GPU, but at least you don't have to wait for driver updates to run games in SLI.
The problem with the "unrealistic loads" explanation is that both programs, OCCT and FurMark, are specifically coded to exploit ATi/AMD's architecture.
-
Yes, OCCT and FurMark are on a vendetta to personally destroy all ATi/AMD cards on the market.
No, it's more like there was a hardware defect... which is the designer's fault (for not including the regulators), and the way these programs ran caused it to "break."
So what if it's not a "normal" test? It's software... software should not be able to physically break something unless the hardware itself already has flaws. It's the hardware's fault... not the software's.
-
Quote:
Originally Posted by
orangekiwii
Yes, OCCT and FurMark are on a vendetta to personally destroy all ATi/AMD cards on the market.
No, it's more like there was a hardware defect... which is the designer's fault (for not including the regulators), and the way these programs ran caused it to "break."
So what if it's not a "normal" test? It's software... software should not be able to physically break something unless the hardware itself already has flaws. It's the hardware's fault... not the software's.
Well, the guy that made OCCT was sponsored by Nvidia...
And isn't it interesting that a 4890 greatly outperforms a GTX285 in OCCT and FurMark?
Any hardware can have problems with certain software, which results in crashes/throttling.
These programs are nothing more than power viruses, and the necessary steps have been taken with the current series of GPUs from ATi/AMD.
-
Quote:
Originally Posted by
cky2k6
I hope my full-cover block can be ported to a 5870x2; it'll make a nice spring upgrade once the GT300 lowers the prices.
yeah man, I feel your pain
I don't buy full-cover anymore because I go through gfx cards too fast. Just get a GPU water block and all the RAM sinks you can find, use zip ties to mount over the mismatched holes till you get the mounting plates, and continue that through the generations.