Can someone please tell me what's on next after the ATI-FILES. :ROTF:
The head engineer at ATI sold his soul to Satan for design schematics. It turned out the schematics were for a GT300. The designers then decided, what a piss-poor design, we'll only get yields of less than 10%. So they sent out a signal to aliens. The aliens then met with the engineers from ATI. In exchange for a new design they must collect puppies, kittens, and kangaroos, to be used as fuel for their spaceships. Meanwhile the facility in AZ is getting retrofitted for kangaroos. This was deemed the most effective way to store the animals, as kangaroos are used to a hotter climate. The aliens, feeling like they had a chance to be competitive, went to Nvidia to bargain with them to develop their next-generation GPU. However, Jen-Hsun Huang declined, as his primary diet already consists of kittens and puppies.
for accuracy's sake, can we rename this thread to:
The ATi HD5XXX Thread + Nvidia 300 Series Speculation
:shrug:
Just what the hell are you talking about? :confused: Clear Sky RUNS JUST FINE for several months now.
They not only fixed everything, but X-Ray v1.5 (the current engine under a fully patched Clear Sky) is actually a gorgeous-looking DX10 engine that scales from DX8 hardware all the way up to my 4850 X2 2GB and runs at a decent framerate.
If you want to see a really poorly coded (i.e. proof-of-concept-level) game, look at Crysis.
Hitler gets informed about the rumored ATI Evergreen prices
http://www.youtube.com/watch?v=VLXfDO1u7-I
Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series
http://www.youtube.com/watch?v=FR45ja_fNzU
6 days to go, and no leaks?
what did ATI do to the people who signed the NDA?
Use some gadgets, or hire the FBI to tail them 24/7, cloak-and-dagger style?
it's impressive nonetheless.
This thread needs moderation.
Dunno if it has been posted already, but this worries me a bit... performance might be brilliant, but heat output might be amazingly high.
Sorry if it's a repost:
Time to turn the heating off again :)
Quote:
We've heard that ATI is working on an X2 version of Radeon HD 5870 card but the biggest obstacle is the power and how to cool such a card.
We've learned that the current TDP for the X2 is 376W and that the company is working on this issue, as apparently they will have to slow the GPUs down by quite a lot to get rid of the heat.
Even if they use downclocked Radeon 5850 cores that run at 725MHz, the power only goes down by 36W (2x170W) to 340W. The hottest card from ATI so far was the HD 4870 X2, which had a TDP of 286W. To release a Radeon HD 5870 X2 card, ATI should get down to at least 300W, especially due to thermal issues, but who knows, maybe ATI will launch a 300W+ card.
Are they serious?
Quote:
We might be looking at the dawn of graphics cards with three or even four power connectors, as two might not be enough this time.
That's from Fudzilla. Like everyone said, a 5870 X2 doesn't mean doubling the TDP. Look at the 4870 X2 for example: one 4870 is 160~170W TDP. So far Fudzilla has been the most unreliable source this time around.
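Just to put numbers on why a straight doubling is the wrong way to estimate an X2's TDP, here is a quick back-of-the-envelope sketch in Python. All the wattages come from the posts above - rumored figures, not confirmed specs - so treat the output accordingly:

Code:
# Back-of-the-envelope TDP math. All wattages are the rumored figures
# quoted in this thread, not confirmed board specs.
single_5870_tdp = 188            # W, rumored single HD5870
naive_x2 = 2 * single_5870_tdp   # simple doubling -> 376 W, the Fudzilla number

# Downclocked 5850-class cores at ~170 W each, per the quoted article:
downclocked_x2 = 2 * 170         # 340 W, only 36 W below the naive doubling

# Last generation: one 4870 was ~160 W, but the 4870 X2 was 286 W TDP,
# i.e. well under 2x, thanks to binning and shared board components.
print(naive_x2, downclocked_x2)            # 376 340
print(round(286 / 160, 2), "x, not 2.0x")  # 1.79 x, not 2.0x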
looking at the AMD perf slides I have to say... I expected more?
the 5870 beats a 295 by a long shot in several games, but where it matters, where you really need the extra fps, i.e. Far Cry 2, Crysis, CoD... it's 20% slower than a 295...
and the 5850 beats a 285 for sure, but not by that much, and again where you really need the extra perf it's about the same as the 285...
upgrading from a 285 to a 5850 doesn't make sense I think, upgrading to a 5870 maybe... but going from a 295 to a 5870 again doesn't seem to make sense, on the contrary...
I'm really surprised that after doubling every resource in the GPU, tweaking it and clocking it really high... the performance is still not notably higher than that of the 285, which is an OC'ed 280, which is 18 months old...
looking forward to proper reviews though, who knows how good those AMD benchmarks are, probably all with an AMD CPU at stock speed, so not really that useful and painting quite a different picture :D
Some people should calm down :) And just wait. On 23/24 September we will know what the HD5870 can really do. HD5870 CF performance will show us what an X2 could do. So we will have quite a good indication of what AMD can do this year.
GT300 is still far away. I won't be surprised if we see good GT300 availability somewhere in Q2 2010.
If the current leaked specs are true, we will be looking at roughly a GTX285 x 2, and maybe a bit more. But we don't know enough about the GT300 to say that for sure.
a moderator should go through this thread and delete any post which is off topic, including this one...
or the name of the thread should be changed to "DX11 tech, multimonitor speculation thread"...
this thread has lost all coherence... :shrug:
I think you should take a more detailed look at the performance comparisons (particularly the ones with numbers and resolutions and all): the HD5850 is beating the GTX285 in nearly all games (including STALKER CS and Crysis Warhead), some by more, some by less, so there is no way the HD5870 is not noticeably faster than the GTX285.
On the other hand, in the comparison between the HD5870 and GTX295, you can see that STALKER CS runs 38.4 vs 40.5 without AA and 24.2 vs 22.3 with 4xAA (so not 20% lower, and actually higher with AA), and Crysis Warhead is 51.5 vs 53.6 and 43.6 vs 48.5 respectively (~4% and ~11% performance difference; see the quick recomputation at the end of this post), so...
In general, the difference between the HD5850 and GTX285 (favorable to the Radeon) is equal to or greater than the difference between the GTX295 and HD5870 (favorable to the GeForce, if we don't count 8xMSAA results). So we could say that the GTX285 is trading blows with the HD5850 (with the ATi card favored), and the GTX295 with the HD5870 (with the NVIDIA dual card favored).
You can take a look at this post if you want to examine the leaked benchmarks more closely: http://www.xtremesystems.org/forums/...&postcount=892
Of course, you should note that maybe a change from a GTX295 to an HD5870 is not that great performance-wise (consumption, heat and price will probably be another matter), the same way that a change from an HD3870 to an HD4650 is not. If you want to see whether changing from a dual high-end chip configuration to the new generation is a gain, you should wait for the HD5870 X2, or get an idea from HD5870 CrossFire (here you can see how it scales, destroying a GTX295 SLI setup - as is to be expected, given that the NVIDIA configuration is a 2x2=4 chip scaling, which is much harder to scale).
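Since these percentages keep getting argued over, here is a trivial Python sketch that recomputes the gaps from the fps pairs quoted above. The numbers are the leaked, unverified figures from this post, so the output is only as good as they are:

Code:
# HD5870 vs GTX295 fps pairs as quoted above (leaked numbers).
def gtx295_lead(hd5870_fps, gtx295_fps):
    """Percentage by which the GTX295 leads (negative = 5870 ahead)."""
    return (gtx295_fps - hd5870_fps) / hd5870_fps * 100

pairs = {
    "STALKER CS, no AA": (38.4, 40.5),
    "STALKER CS, 4xAA":  (24.2, 22.3),
    "Warhead, test 1":   (51.5, 53.6),
    "Warhead, test 2":   (43.6, 48.5),
}
for name, (hd, gtx) in pairs.items():
    print(f"{name}: GTX295 lead {gtx295_lead(hd, gtx):+.1f}%")
# -> roughly +5.5%, -7.9%, +4.1%, +11.2%: nowhere near a 20% deficit.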
Less than one week left sirs
not really, the 5850 seems to beat a 285 but not by much... and a 5870 won't be 2x as fast as a 5850, I think we can all agree on that :D
I think you misread my post, I said:
5850 beats 285 but not by much
Quote:
upgrading from a 285 to a 5850 doesn't make sense I think
upgrading to a 5870 maybe...
but going from a 295 to a 5870 again doesn't seem to make sense, on the contrary...
so going from a 285 to a 5850 doesn't really make sense...
upgrading to a 5870 MAYBE makes sense...
upgrading from a 295 to a 5870 would actually be a step back... unless you're just into benching and want hundreds of fps... where you really need extra GPU perf to get acceptable playable fps, i.e. demanding games, the 295 seems to be about the same as or faster than the 5870.
but like I said, that's probably with an AMD CPU at stock speed... and as Anandtech recently uncovered, some games seem to get a notable perf boost on NVIDIA GPUs when paired with an AMD CPU for some reason... that might actually bite AMD in their own PR arse in this document :lol:
I still wanna know what it takes to fully take advantage of HD5870 CF. A Core i7 @ 4.0GHz at least?
Impossible question. It is a matter of TASTE, actually. In some situations the CPU can be slowing things down somewhat, in others the GPU. And in some situations the actual "bottleneck" can be the PCI-E bus.
It depends on:
software/game used
settings
the actual scene in the game
drivers
GPU/VRAM speed
CPU/RAM speed
PCI-E bus bandwidth.
So yes, talking about "CPU only" here, it is practically impossible to say accurately. There will always be someone arguing about it. Maybe a 4 GHz i7 can sustain it most of the time without causing a major bottleneck, but for sure a 7 GHz i9 with 8 cores would still do MUCH better. No one knows; a matter of taste and opinion.
And before someone comes raging that "omg put in a slow CPU and your fps drops" (sure it does), I suggest "you" imagine WHY it happens. Hint: time spent (per frame) by: the CPU (both processing and data retrieval from RAM), PCI-E bus data transfer latency, and the GPU (both processing and data retrieval from VRAM).
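To make that hint concrete, here is a toy frame-time model in Python. The millisecond numbers are purely invented for illustration, not measurements; the point is that once the stages overlap, the slowest stage - not the CPU alone - sets your fps:

Code:
# Toy model: each frame costs time on the CPU, the PCI-E transfer and the
# GPU. If the stages are pipelined, throughput is set by the slowest one;
# if nothing overlaps, the costs simply add. Numbers are invented.
cpu_ms, bus_ms, gpu_ms = 6.0, 1.5, 12.0

serial_fps    = 1000 / (cpu_ms + bus_ms + gpu_ms)   # nothing overlaps
pipelined_fps = 1000 / max(cpu_ms, bus_ms, gpu_ms)  # slowest stage rules

print(f"serial: {serial_fps:.0f} fps, pipelined: {pipelined_fps:.0f} fps")
# A faster CPU only helps the pipelined case once cpu_ms grows past
# gpu_ms - which is why "what CPU do I need for 5870 CF?" has no single
# answer: it depends on the game, the scene, the settings and the drivers.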
Could be, but it depends a lot on resolution. At 1920x1200 (or better, 2560) I doubt the CPU is going to be a big bottleneck. And if you are using a lower resolution then you certainly don't need two 5870s. In fact, you don't even need one - a 5850 or even a 4870 is enough at 1680 and below.
simple answer: yes.
you can check some numbers from people who do over-the-top overclocking and see how the scaling goes.
It's also a matter of resolution. I would assume that 1920x1200 is actually a little too low a resolution, unless we go for the 3-screen setups.
It's a great idea: ATI makes a card with a feature that adds more fps, but what to do with all that power?
Run Quake at 1000fps?
No, they spent some time talking to Samsung: guys, we wanna make this feature we call Eyefinity, so, if we do that, and you make a special setup, you cut us in for a deal?
Samsung: hmm, what is it?
ATI: It's where our cards are so powerful we needed to add a feature so people would buy more screens to use that power, and guess who they'll wanna buy from?
Samsung: hmm, who?
ATI: The guys making screens with thin bezels...
Samsung: you think so?
ATI: come and sit down and play this game here.
Samsung representative: OMG!
ATI: yes, we know. :ROTF:
http://www.fudzilla.com/content/view/15535/1/
Quote:
Nvidia confident that GT300 wins over 5870
A few people confirm
Nvidia is still not giving any hard details about the chip that we still call GT300, but one thing is getting quite certain. A few people who saw the just-leaked performance data for ATI's soon-to-launch Radeon HD 5870 have told Fudzilla that they are absolutely confident that GT300 will win over the Radeon HD 5870.
Since GT300 will have its GX2 brother, they should not fear the Radeon HD 5870 X2 much whenever that comes out, as if Nvidia wins the single-chip battle, they can win the dual one as well.
We can only confirm that GT300 is not a GT200 in 40nm with DirectX 11 support. It's a brand new chip that was designed almost entirely from the ground up. Industry sources believe that this is the biggest change since G80 launched and that you can expect that level of innovation and change.
The launch as far as we know is still slated for late November and hasn’t been confirmed.
LOOOL
Hitler, as Nvidia's CEO, gets informed about ATI's Evergreen series
-> http://www.youtube.com/watch?v=FR45ja_fNzU
:rofl::rofl::rofl::rofl::rofl:
Those "Hitler gets informed about xxx" videos never work for me... they're not funny in any way. :p:
I do agree. The HD5870 will only be 15-20% faster. But that is where the HD5870 X2 comes into play. Depending on the time nVidia needs for its GT300 GX2 (if there will be a GX2), it's also possible that AMD will have a refresh of the RV870 by then (HD5890 / HD59xx-ish).
At the start I think it will be the HD5870 and the X2 vs. the GT300 single-card range.
Please close that site
http://www.fudzilla.com/content/view/15535/1/
They need to price 5870 at $299 to sell lots of cards.
NO WAI! :p:
I believe the Radeon 6870 will be faster than GT300. And the 6870 X2 will be faster than the GT300 "GX2" (I'm wondering how he even came up with GX2, as that name hasn't been used since the 9800GX2...) as well! So, do I get my own news column at Fudzilla now? :):):)
:ROTF:
*edit* btw... I saw the benchmarks (which are also in the TechPowerUp leak collection) where they tested Wolfenstein with AA. Since AA doesn't work in Wolfenstein on 4800 series cards, I'm wondering if it'll work flawlessly with the 5800 series?
Watch the video and you'll understand
The TechPowerUp 5xxx leak collection
http://www.techpowerup.com/reviews/A...0_Leaks/1.html
Source: Tweakers.net
Quote:
New update, temperatures and fan speed.
Idle; also check the GPU and memory clocks.
Idle @ 20%
http://i25.tinypic.com/e297s.png
Idle @ 50%
http://i30.tinypic.com/2py6edh.png
Idle @ 100% (Like a hairdryer!)
http://i27.tinypic.com/23kukg7.png
Load test, stressed the GPU to 75%.
Load @ 20% (91° and rising)
http://i31.tinypic.com/waq4ib.png
Load @ 30% (default fan speed, stable at 75°)
http://i25.tinypic.com/dbqlj.png
Load @ 50%
http://i28.tinypic.com/jhtq3l.jpg
Load @ 100%
http://i30.tinypic.com/k1572c.png
Finally stressed the GPU to 95%, but the temperature was not rising.
http://i30.tinypic.com/9q8wzp.png
So we can expect real benchmarks soon? What's the point of them covering that part when the first pic shows everything? Very nice idle temps.
Mistake? I saved the pic anyway, just in case.
Anyway, they seem to be fakes.
20 % Fan, 75 % load on GPU: 91 C
30 % Fan, 75 % load on GPU: 75 C
30 % Fan, 95 % load on GPU: 74 C
Right. :rolleyes:
If anything, the pics may be telling us that the temp/noise levels will be around the same as those of the HD4k series cards. After all, they use the same fan and the temps seem to be quite close.
good test with fan speed and temps
but what's the highest setting that's still quiet/inaudible?
No 2D/3D clocks? ;)
What's probably true here is that the temp and noise situation will be the same as with the 4800 series. In that case, what we've got to find out is whether the mounting holes are still the same, so that aftermarket heatsinks that used to fit the 4800s will fit these... Can't bear that stock fan of theirs.
It's funny that he blanked out his monitor text (the stuff in square brackets) in all but the first image. I don't get that.
I don't care about their reasons for the stuff they spit out, but the information is clearly false. It does not make sense any way you look at it. UNLESS the load figures don't come from the same program, e.g. one (95%) is a random game and the other (75%) is FurMark. And in that case, the information is still false.
What I suspect is that the 95% load figure was taken about half a minute after the load started, and the 75% figure was taken 30 minutes after the load started. "But but, full load regardless ;););)".
Seems that Tweakers.net is yet another site to stay away from. :down: :down:
Good to see PowerPlay working in full swing; hopefully the auto fan speed control will actually work this time around as well.
why false? Crysis at 75% still heats up more than Far Cry 2 at 100%.
looks legit to me.
edit - hmm, not much different from Juniper
http://img10.imageshack.us/img10/8876/junip.jpg
As I said in my post, in that case the information is still false, or at least misleading - holding absolutely zero value and misleading anyone who believes it. Garbage, BS.
The 75% and 95% values hold no information if the values and the temps weren't measured with the same software stressing the GPU. AND I am sure that EVERYONE knows this. Pulling off stunts like this is just as much BS as pulling the values out of one's hat. Selective results. PR at its best. No value for the end user. Misleading information.
those fan speed and temp values do look weird indeed...
So the mem clocks throttle down now too? That should make for some nice idle power usage stats.
idle power was supposed to be 28W; this is looking to be true.
ok, calm down children
As long as these things aren't 4890s, all will be well. The 4870s were fairly respectable noise-wise with the stock fan profile / heatsink design. Given it has a similar max board power but also a full-length PCB / heatsink, it should be just as quiet if not more so. I was never one of those OCD types who'd complain about 80+ load temps, so I have no problem with these things getting hot if ATI says they can (I used my 4870 X2 with the stock profile from day one with no artifacts / thermal issues at 90-95C on the hotter of the 2 cores).
The progression of this thread (which has been hilarious for me, since it's all speculation at this point):
First reports: :fact:
Interest and ATI fan celebrations: :banana3: :banana2: :rehab: :banana:
Nvidia Cronies chime in: :soap:
Arguments commence :argue:
Goes way off topic: :eleph:
So what is it we can see? Well, despite speculation from either side, at least they have more to offer than Intel, ha! :P
I see the GDDR5 still doesn't downclock in 2D.
Well, seeing as how I am at work, I cannot see the pictures.
This morning there were a couple of shots of a 5800 series card idling at 157/300...
Here-
http://www.xtremesystems.org/FORUMS/...postcount=1156
and this was posted-
So obviously the GDDR5 is downclocking.
I also mentioned this earlier -
there is more than 1 picture; some show 300MHz for the memory
Isn't GDDR5 4 data cycles per clock, so 300MHz => 1200MHz effective?
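Yes - GDDR5 moves 4 bits per pin per clock, so the math works out like the quick Python sketch below. The 256-bit bus width is the rumored HD5870 figure, so take the bandwidth number as an estimate:

Code:
# GDDR5 is quad data rate: 4 data transfers per pin per clock cycle.
mem_clock_mhz  = 300                # idle clock seen in the screenshots
effective_mhz  = mem_clock_mhz * 4  # 1200 MHz "effective"
bus_width_bits = 256                # rumored HD5870 bus width

bandwidth_gbs = effective_mhz * 1e6 * bus_width_bits / 8 / 1e9
print(effective_mhz, "MHz effective,", bandwidth_gbs, "GB/s at idle")
# -> 1200 MHz effective, 38.4 GB/s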
I read 300MHz too
http://i30.tinypic.com/2py6edh.png
Again, I don't understand what the heck you people are comparing here... Based on pricing - the only meaningful basis for market viability comparisons - the 285 will fight the 5870, and the 295's counterpart will most likely be the X2, and in both cases NV gets spanked pretty badly.
Once they bring out GT300 the story will change, but I have serious bets it won't happen this year - IIRC each respin takes 2-3 months, it's mid-September already, and we know from more than one source that NV is having serious issues...
no info about the 5870 device ID yet?
Question for the Gurus.
Remember a game called DOOM 3 where the AF was really bad on some cards and gave you a stair-step effect, or *shimmering*?
Will these new cards fix that issue? Because that's what really bothered me the most in that game.
We just need a shot from AF Tester of a flat plane. That will give you a good clue as to whether or not shimmer will be an issue.
So when does the NDA lift?
is it not today? (Thursday)
On my HIS 4870 I get 65C at 30% fan speed (idle). (corrected) Looks like these new cards get about 40C at 20% fan speed (idle). That actually seems better than my 4870.
http://www.techpowerup.com/reviews/A...s/idleat20.jpg
Not sure who's really going to have their fan at 100% at idle. LOL
"Hey man, that's a screamin' loud card ya got there."
"Yeah, I leave it like this 24/7 so I can have a 31C card temp... cool, huh?"
"Ummm, no, it isn't"
LOL
http://www.techpowerup.com/reviews/A.../idleat100.jpg