My first overclocked '06 run ever: 11800 :)Quote:
Originally Posted by k|ngp|n
Plenty of things to improve. CPU @ 3.43 GHz.
An Intel 65nm 955XE dual core without a coldbug will do much better than an AMD FX-60 in 3DMark06, AFAIK.
Perkam
I have one of those, so I will probably try it later.Quote:
Originally Posted by perkam
Quote:
Originally Posted by Kinc
Do what I do: always use your US passport, and if you see any of these running around, don't worry, it's normal:
http://www.extradtp.net/Datas/Pdts/Ima/P24109.jpg
You are a funny gnome
congrats man, nice scores :)
OK, so I'm lost here. I thought RD580 was the X1900, so why did they delete the score?
RD580 is the mobo chipset, R580 is the X1900 GPU :p:Quote:
Originally Posted by cMw
Yeah, the X1900s have already been released, so why don't they want scores to be posted?? I'm lost.Quote:
Originally Posted by Winterwind
Because RD580 makes CrossFire perform much better.Quote:
Originally Posted by cMw
That's not a very good comparison of CF and SLI efficiency, though: one set of numbers is run at stock settings with stock cooling, and the next set with CF/SLI is overclocked to the max with extreme cooling.Quote:
Originally Posted by Willis
The point is, though, it's not efficient in the least, OC'd or stock.
Alright, I'm good now. I got mixed up with RD580 and R580 :P
I didn't know the RD580 hadn't been released yet.
But personally, I can't wait to see a DFI version of this.
Crotale and I ran some 955 this weekend. It never worked as it should, and now the CPU is almost dead. The Presler core may be sensitive to high voltage and clocks, since it has been getting weaker each time.Quote:
Originally Posted by Sampsa
http://service.futuremark.com/compare?3dm06=150598
Willis, that's how it goes when it's Intel+ATI vs. AMD+nV.
Intel is strong in the 3DMark06 CPU tests, while AMD runs the first two game tests better.
The 1st HDR test is better for ATI and the 2nd runs better on nV (for now at least..) =)
HDR can be cheated by using LOD, and that's allowed... I've heard from a guy who tested X1900 CF that it wipes out the HDR effect.
EDIT:
Macci, can you confirm this?
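For anyone wondering why the CPU tests matter so much here, a rough sketch of how the 3DMark06 total is put together from those sub-tests. The weights and constants are my assumptions based on the commonly quoted Futuremark whitepaper numbers, not anything confirmed in this thread, so treat it as illustrative:
Code:
from math import sqrt

def score_3dmark06(gt1, gt2, hdr1, hdr2, cpu1, cpu2):
    # All inputs are average fps of the individual tests.
    sm2 = 120 * (gt1 + gt2) / 2     # SM2.0 graphics score
    hdr = 100 * (hdr1 + hdr2) / 2   # HDR/SM3.0 graphics score
    cpu = 2500 * sqrt(cpu1 * cpu2)  # CPU score
    gs = sqrt(sm2 * hdr)            # combined graphics score
    # Total is a weighted harmonic mean of graphics and CPU scores,
    # which is why a strong CPU lifts the final number so much.
    return 2.5 * 1.0 / ((1.7 / gs + 0.3 / cpu) / 2)

# Hypothetical fps numbers, purely to show the shape of the formula:
print(round(score_3dmark06(30, 28, 32, 35, 1.2, 1.9)))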
Yes, completely different platform and different cards.Quote:
Originally Posted by Willis
The CPU is really degraded now :( 1.8vcore seems to be a little too much for the 65nm process.
R600's properly ready soon as well, and that chip is a KILLER... A bit suckier pipelines, but A LOT of them...!Quote:
Originally Posted by Willis
Heard some insane specs once... Properly not true, but 92 / 96, don't remember which, sounds INSANE :D
Anyhow... The first big difference in GFX will properly be GDDR4... ;)
NO...Quote:
Originally Posted by M.Beier
In the ATI roadmap, the R600 is to come probably somewhere in October, while Nvidia will launch its G80 in June-July if the rumours are true...
And of course both cards will come with GDDR4 and DX10 (PS4).
Very nice scores! Just the numbers I was expecting to get with some R580s in CF w/ a RD580 mobo. I want to see 20k fall this weekend! :woot:
65nmQuote:
Originally Posted by M.Beier
64 Shader pipelines (Vec4+Scalar)
32 TMUs
32 ROPs
128 Shader Operations per Cycle
800MHz Core
102.4 billion shader ops/sec
512 GFLOPs for the shaders
2 Billion triangles/sec
25.6 Gpixels/Gtexels/sec
256-bit 512MB 1.8GHz GDDR4 Memory
57.6 GB/sec Bandwidth (at 1.8GHz)
WGF2.0 Unified Shader
I think I also read it will be 48pp, but I'm not totally sure.
EDIT: you mean probably :p:
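FWIW, the derived numbers in that rumoured spec list do line up with the base clocks. A quick sanity check; the per-clock assumptions (Vec4+Scalar = 5 components, MADD = 2 flops) are mine, not from the list:
Code:
# Sanity check on the rumoured R600 spec list quoted above.
core_mhz = 800
shader_ops = 128               # shader ops per cycle
pipes, comps, madd = 64, 5, 2  # Vec4+Scalar units, 2 flops per MADD (assumed)
rops = tmus = 32
bus_bits, mem_ghz = 256, 1.8   # effective GDDR4 data rate

print(core_mhz * shader_ops / 1000, "billion shader ops/sec")  # 102.4
print(pipes * comps * madd * core_mhz / 1000, "GFLOPs")        # 512.0
print(rops * core_mhz / 1000, "Gpixels/sec,",
      tmus * core_mhz / 1000, "Gtexels/sec")                   # 25.6 each
print(bus_bits / 8 * mem_ghz, "GB/sec bandwidth")              # 57.6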
Why aren't they in the HALL OF FAME?
Grats! :D
GL, break the WR in 05. You can do it. :toast: