I am running a sandy 3700+ at stock, and the x1600xt cards are at stock as well.
Those are the only changes in my sig.
http://img399.imageshack.us/img399/8...03stock8by.jpg
05 score not bad.
http://img399.imageshack.us/img399/9...05stock2ym.jpg
Not bad, take 10k ! you can do it! :toast:
faster than a 7800gtx in 2k5, nice!
that 2k3 is disappointing though, same score as a single x850xt...
i dont like it hmmmm... weird scores. how is 3dmark03 so low??
will rerun
2k3:
9000 to 11500
crossfire is ~28% faster than 1 card
2k5:
5200 to 8400
crossfire is ~62% faster than 1 card
could the cpu be limiting the cards that much already?
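For anyone who wants to sanity-check the scaling, here's a quick python sketch (the scores are the rounded numbers from the post above, so the percentages are approximate):
Code:
# CrossFire scaling check using the rounded scores posted above.

def scaling_gain(single, crossfire):
    """CrossFire gain over a single card, as a percentage."""
    return (crossfire / single - 1.0) * 100.0

runs = {
    "3DMark03": (9000, 11500),
    "3DMark05": (5200, 8400),
}

for bench, (single, cf) in runs.items():
    print(f"{bench}: {single} -> {cf} (+{scaling_gain(single, cf):.0f}%)")

# 3DMark03: 9000 -> 11500 (+28%)
# 3DMark05: 5200 -> 8400 (+62%)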
Could you provide pics of the two cards in crossfire? Which brands?
Been Waiting for this...:slobber:
I hope ATI release a software voltage adjustment tool for these... ppl are making huge investments in these x1600 rigs and shouldnt be left without a promised feature.
Saaya: In 03 he's very cpu limited, hence the score imo... He'll top out at 650/800 on the two cards without more voltage to the core, which should do 700/800 quite easily as it's cooler than the x1800 core...
Perkam
Cards are sapphire, pics coming up shortly.
They are using copper heatsinks with the sapphire logo on top.
The pro cards are using aluminum heatsinks:p:
OVERCLOCK THEM!!!!!!!!!!!!!
:)
I wanna see @ what point 10k falls
EDIT*
WTF IS THIS?!?!??!?!!
http://www.newegg.com/Product/Produc...82E16814102656
pic mistake?
only 11K in 3dmark 2003?
thats waaaaaaaaaaaaaaaaaaaay low.
thats not even coming close to what 6600 GTs do in SLI, and those have been around forever and can be had for as little as $120...
for some reason, I expected way more from the X1600XT.
theres no way it could be that bad.
thats subpar, barely enough to take out regular 6600s in SLI.
as mentioned many times in various threads, those pics are wrong
Quote:
Originally Posted by vapb400
althes doesnt tweak anything, look at his screen, hes got themes and backgrounds and junk going on...
Overclock the system and cards, and 10k will fall no problem...
that 3d03 run seems messed up... hopefully he will rerun it :toast:
dats true... thats what ive been saying all along, 1600s in crossfire seem to suck from what weve seen so far... :(
Quote:
Originally Posted by Kunaak
I reran 03, still the same thing.
I may do a reinstall of everything and see if the cpu is the bottleneck.
how do those xts compare to the pros? does mem bandwidth really make that much of a difference?
is that 3d05 score with the cards clocked up or no?
3d05 score is stock afaik.
Perkam
It makes so much difference. :)
Quote:
Originally Posted by saaya
Score is cards at stock. ;)
Quote:
Originally Posted by einCe
I think his drivers are :banana::banana::banana::banana:ed up again. they were for his x1600pro's too and he scored really low until he fixed them
Quote:
Originally Posted by Kunaak
http://img443.imageshack.us/img443/2...03test30yg.jpg
you're getting beat by x1600pro's o.O Def reinstall
how much :P
Quote:
Originally Posted by althes
perkam your request
http://img516.imageshack.us/img516/9444/dscf00409jc.jpg
Will do a reinstall in a couple hours, see what happens.
The RV530's strong area is shaders. I think that is why the 3Dmark05 numbers are high. 01 and 03 test more texture processing and I think that is RV530's weak point. Its texture processing is the same as the RV515's, so I would expect lower scores in 01 and 03.
but he is getting outbenched by x1600pros in 03 :stick:
Quote:
Originally Posted by SnipingWaste
x1600xt ddr3 vs x1600pro ddr2 = huge difference. DDR3 has almost 2x the bandwidth.
Quote:
Originally Posted by LucusScott
which drivers are u using?
You can only use the 5.13betas.
5.13's are no longer betas...
Quote:
Originally Posted by althes
as perkam said,
5.11 is best for single core
5.12 is best for dual core
5.13 just has some avivo features that do not help benching
Now, reinstall and get that 3d03 up past 15k :toast:
I have to use a divider on this board and its really screwing things up.
http://img357.imageshack.us/img357/6...05test35bh.jpg
You can only get crossfire for x1600xt with 5.13 cats
Quote:
Originally Posted by Welz
The cores on these dont want to go anywhere in crossfire.
The memory on these x1600xts is very good.
I hope you can raise the voltage via software, it would really help.
This sandy I have is doing really good but could use a little more vcore.
Much better, break 10k :toast:
Quote:
Originally Posted by althes
I think I will switch ram now and try different sticks, the 3.2 is stopping me from really using the ocz.
I heard that someone on the dfi forum was able to get the dfi xfire board to run x1600xt using the 5.13cats
That should break 10k in its sleep...your core is way too low...the lack of voltage doesnt kick in until 650.
Perkam
Maybe it does on that card....
Quote:
Originally Posted by perkam
Doesn't seem like many X1300 or X1600s like to go past 640 on air w/o Vmod.
That is very true, these cards just dont like going anywhere near 650 with the core :confused:
Quote:
Originally Posted by DrJay
the 1600s are definitely starving for mem bandwidth....
in 2k1 upping the gpu clocks 100 mhz gave close to no boost at all, upping mem 50/100 ddr gave a 500 point boost.
i have a feeling that the 1600s dont care about cpu speed, they are already maxed out very fast, reminds me of my g4mx which didnt care at all about cpu speed, lol :D
posted results in my thread, only single card so far though.
2k1 2k3 and 2k5 at different clockspeeds.
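Those numbers line up with simple theoretical bandwidth math. A sketch below; the 128-bit bus is common to the pro and the xt, and the clocks are the commonly quoted stock/OC speeds, so treat them as assumptions:
Code:
# Theoretical memory bandwidth for a 128-bit card (X1600 Pro and XT both
# use a 128-bit bus). Clocks are the commonly quoted stock/OC speeds --
# assumptions, check your own card with ATITool.

BUS_WIDTH_BITS = 128

def bandwidth_gb_s(effective_mhz):
    """Bus width in bytes times effective transfer rate, in GB/s."""
    return (BUS_WIDTH_BITS / 8) * effective_mhz * 1e6 / 1e9

for name, eff_mhz in [
    ("x1600pro ddr2 (780 eff.)", 780),
    ("x1600xt gddr3 (1380 eff.)", 1380),
    ("x1600xt oced (1800 eff.)", 1800),
]:
    print(f"{name}: {bandwidth_gb_s(eff_mhz):.1f} GB/s")

# pro ~12.5 GB/s vs xt ~22.1 GB/s stock -- close to the 'almost 2x'
# mentioned earlier, and ~28.8 GB/s at the 1.8ghz OC.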
wasnt it 128mb cards ddr1 vs 256mb cards gddr2? i remember those cards, the early gddr2 was running incredibly hot and performed worse than gddr1 at the same speeds.
Quote:
Originally Posted by LucusScott
Thats exactly what happened saaya.
If these cards can be volted by software a whole new ballgame opens up.
May have to voltmod these cards
well give the pencil mod a try, fast n easy, and enough for air cooling :)
but then again, having to mod these cards to hit 10k in 2k5...
well if they do like the higher volts they might hit 750+, which would be a nice boost, but then again, keep in mind how hot these cards run... dont know how much it will limit them, whats the temps on the xts with that copper heatsink?
my pros are 63°C under load and 32°C idle when oced to 650 core
seeing as my pros oc to the same speed as your xts im kinda skeptical about a decent boost with more vgpu, but lets see :D
Will do the pencil mod.
Did you run 3Dmark05 with single card?
I am going to do all that today.
Need some pics of the mods for the cards first; using the pencil, may solder later.
Looking forward to it :slobber:
Perkam
single card scores
http://img367.imageshack.us/img367/3...03test19pw.jpg
damn good single card scores for a $160 card, when you gonna vmod it? I want 700+ XD I wonder if you can get 1GHz on the mem with vmod.
hey althes, u think u could post a pic without the heatsink on so i can see if my current wb will work?
I am waiting on some pics of the mods so I can do them, I want to be very careful
Quote:
Originally Posted by sabrewolf732
Pics coming up
Quote:
Originally Posted by msimax
your single oced 1600xt beats my stock 1600pros in crossfire, LOL :D
single 1600pro maxed out on air vs 1600xt maxed out on air in 2k5:
4300 vs 5800....
1600pro crossfire stock vs 1600xt maxed out in 2k5:
6450 vs 5800
pencil mod on both cards
http://img244.imageshack.us/img244/8...05test14qr.jpg
nice clocks. 1.8GHz ram :slobber: im assuming you can break 10k in cf?
I will attempt to break 10k in xfire tomorrow. :D
Leaving us waiting.
Quote:
Originally Posted by althes
Which card would u take, considering the price too, the 68GS or x1600XT?
Waiting for xfire :D
Now plug both cards into a NF4-SLI board and see if the CrossFire ability is still available.
If not, someone HACK the drivers 5.13... to take out the NF4 prohibition.
These cards refuse to go over 630 in xfire; single card they can get to 660, but not in xfire.
GIVE US SCORES !!!! :slap:
Quote:
Originally Posted by althes
lol page 2 gnome :nono:
Quote:
Originally Posted by Gnome
http://www.xtremesystems.org/forums/...3&postcount=36
lol, even with mods???
Quote:
Originally Posted by althes
btw, so the faster memory gives an oced 1600xt a 1500 point boost in 2k5 over an oced 1600pro... LOL :D
talk about bandwidth limited.... yay
I may try to mod with a pot.
I am going to post pics without heatsink tomorrow.
You aint kidding, its just sad that the mem can hit 1.8ghz but the core barely 630. what sort of card is this?
Quote:
Originally Posted by saaya
If these cards were able to use voltage thru software they would really be a handful.
I have a couple 47k resistors, I am tempted to solder them on and see what happens.
:D
I did the pencil mod and single card can reach 658/900, but in xfire its limited to 630/873. not nice.
658 is still really bad althes, my cards can run that without any mods if i open the window and let some fresh air in....
are you sure the mod is working?
looks like its not if you ask me...
I think the problem is theres no power plug..
edit
what is the power consumption of 256mb GDDR3 vs. 512mb/256mb GDDR2?
Depends on what speed you run them at.
edit: spelling
my bad
Quote:
Originally Posted by Welz
the difference is quite large, a 1800xl consumes 60W, a 1800xt 110W under full load
Quote:
Originally Posted by STEvil
the cards have both the memory and core clocked at different speeds, and at least vgpu is lower for the xl, if not vmem as well. the speeds dont have a big impact on heat dissipation/power consumption, its the voltages that make the big difference afaik.
id guess that the power consumption of the xl vpu is 40W and the xt vpu is 60W
that means the memory of the xl consumes 20W, which sounds about right imo, and the memory of the xt consumes around 50W, which also sounds about right considering its double the memory and clocked at higher speeds.
so 256mb gddr3 consumes around 20W, maybe 25W when clocked to high speeds. no idea about 512mb gddr2 :shrug:
maybe stevil is right... one card shouldnt be a problem for the mainboard to power, but two cards... and it makes sense, when one card gets oced it ocs higher than when both cards are oced...
and all our cards seem to max out at 660 core when we use them alone. althes, you even vmodded them and still couldnt get higher than 660... maybe it has to do with the power consumption of the card being more than the pciE slot can supply?
but i dont think so... the cards dont run that hot... at least mine dont...
and theres an easy way to find out, if we lower the memory clock we should be able to reach 670 core.
if not then its def the limit of the core id say... or a limit of atitool maybe...
edit:
a 1600xt has a power consumption of 40W max, theres no way that ocing them without vmodding gets you to 60W, thats the max the pciE slot can power i think.
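To put numbers on that estimate, a small sketch below. The 60W/110W totals are the xbitlabs load figures quoted above; the vpu numbers are guesses from the post, not measurements, and the per-rail slot budget is the pciE spec limit as far as i recall:
Code:
# Splitting measured board power into vpu + memory (the vpu figures are
# guesses from the post above, NOT measurements), plus the per-rail
# PCI-E slot budget (75W total = 12v @ 5.5A + 3.3v @ 3A, per spec IIRC).

SLOT_12V_W = 12.0 * 5.5  # 66W max on the 12v pins
SLOT_3V3_W = 3.3 * 3.0   # ~10W max on the 3.3v pins

totals    = {"x1800xl": 60, "x1800xt": 110}  # W under load (xbitlabs)
vpu_guess = {"x1800xl": 40, "x1800xt": 60}   # W, assumed

for card, total in totals.items():
    print(f"{card}: {total}W total ~ {vpu_guess[card]}W vpu "
          f"+ {total - vpu_guess[card]}W memory")

print(f"slot budget: {SLOT_12V_W:.0f}W @ 12v + {SLOT_3V3_W:.1f}W @ 3.3v")

# x1800xl: 60W ~ 40W vpu + 20W memory
# x1800xt: 110W ~ 60W vpu + 50W memory
# slot budget: 66W @ 12v + 9.9W @ 3.3v -- a 40W card has headroom on
# paper, but only if the board can really deliver it on the 12v pins.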
I thought a higher freq. = more work being done = more power consumption.
Quote:
Originally Posted by saaya
Take a gpu, for instance. An increase in clock freq., even W/O a voltage increase, can increase temps. At the very least you'd be losing more to heat production at higher speeds.....wouldn't you? :shrug: A cpu / gpu / ram chip should draw more current the faster you run it...to a point.
Oh well, someone will correct what I've screwed up here.....but I always thought voltage increases were only indirectly responsible for temps...as they allow for higher current and freqs.
EDIT: On a similar topic; contrary to popular belief, temp control on CPUs / GPUs / ICs will not keep them safe regardless of voltages. Too high a voltage can destroy the P/N junctions within a transistor.
its a popular belief that temp protections save peoples hardware from dying from too much vcore/vgpu? :confused:
that sounds weird... :D
if you increase vcore/vgpu/vdd you get a bump in temps no matter what clockspeeds the ic is working at, and even in idle afaik.
increasing the speed the ic works at also increases the power consumption and heat output, yes, but not as much as increasing vcore/vgpu/vdd
at least in my experience...
check your cpu temps at a low speed and then at stock or an oced speed under load, using stock vcore. then increase vcore and check the temps under load at the low and at the high clockspeed.
increasing vcore by 10% results in a much bigger power/heat boost than increasing the clockspeed by 10%!
the latter usually only increases the cpu temp by 1-2°C under load, if at all...
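The textbook first-order model says the same thing: dynamic power in CMOS goes roughly as P ~ C·V²·f, so voltage counts twice. A quick sketch:
Code:
# First-order CMOS dynamic power model: P ~ C * V^2 * f (leakage ignored).
# Shows why +10% vcore costs about twice what +10% clock does.

def relative_power(v_scale, f_scale):
    """Power relative to stock for given voltage and clock multipliers."""
    return v_scale ** 2 * f_scale

print(f"+10% clock: {relative_power(1.0, 1.1):.2f}x power")
print(f"+10% vcore: {relative_power(1.1, 1.0):.2f}x power")
print(f"both:       {relative_power(1.1, 1.1):.2f}x power")

# +10% clock: 1.10x
# +10% vcore: 1.21x
# both:       1.33x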
That is because if you increase Vgpu, you are also increasing the current and so, power consumption. Voltage is just a difference of potential.
Saaya, RE: the first item you quoted me on.... Many times in these and other forums, I've read that the overvolt given to a processor doesn't matter so much as long as its temp is kept down.
I may have been getting a little OT....but I thought not too much...
consumption or dissipated as heat? Some are more efficient than others so they could be consuming more amps (therefore more wattage) but still put out less heat.
Quote:
Originally Posted by saaya
Anyways, max PCI-E 16x power supply is 75w IIRC, but we both know thats a load of bull :fact:
Wow, these results are different...
Amazing how well they go in 3dmark05 compared to 3dmark03.
In 2003 I get 12147 with an AGP 6800nu, vmodded and unlocked/overclocked too, but only on an athlon XP http://service.futuremark.com/compare?2k3=4059557
In 05 however, I managed to push 5201 marks out of the 6800nu agp
http://service.futuremark.com/compare?3dm05=949255
This shows how much better the new ATI series can do shader heavy stuff, but how mediocre they are in less intensive stuff
for cpus thats true in my experience... for videocards and memory etc things are different :)
Quote:
Originally Posted by DrJay
keeping them cool alone doesnt mean they are safe no matter what voltages you pump through them :D
dont get what you mean, consumed power = dissipated heat... or where else should the consumed power go? ^^
Quote:
Originally Posted by STEvil
99% of the power consumed in ics is dissipated as heat afaik.
and the numbers are from xbitlabs who modified a mainboard to measure the power consumed through the pciE slot, which turned out to be very accurate.
so 40W is power consumption under load.
and well... 75W for the pciE slot isnt bs i think... its just not 75W but a certain amount of W on each rail... these 1600s seem to suck really hard on the 12v rail though... maybe the board cant keep the 12v rail in the pciE slots high enough?
hmmm and hence vgpu drops or gets unstable?
imma measure the volts under load and idle on the cards.
Morgoth, i wouldnt say they are powerful in intense stuff, they are just powerful in different intense stuff, aka shaders.
the question that pops up though is how good is shader power if the card cant handle the geometry and texture load?
the 1600 shader power reminds me of the geforce fx5200, dx9 wow!, but it couldnt even run simple pixel shaders efficiently...
same with the 1600s, sm3.0, wow, shader power, wow, but the cards are way too weak to run future games that heavily use pixel shaders and hdr...
even in crossfire...
so it makes more sense to buy a x850xt, which is faster, and get a REAL sm3.0 card later when you really need sm3.0 and hdr capabilities.
but im not too sure, i wish i had a day of defeat copy so i could check hdr in that game, its the only game i know with hdr. sure, far cry has a hdr mod as well, but its buggy and more of a nice thing to play with than something you want to run the entire game.
Through to ground. T-bred A vs. T-bred B (1 layer on top of the core is the difference).
Quote:
Originally Posted by saaya
dead wrong
Quote:
99% of the power consumed in ics is dissipated as heat afaik.
Got a link to this? Personally i'd just set up a system with a PCI vid card and measure idle/load draw, then add the PCI-E card and see what increased and how much under idle/load. Probably more accurate than anything they managed.
Quote:
and the numbers are from xbitlabs who modified a mainboard to measure the power consumed through the pciE slot, which turned out to be very accurate.
so 40W is power consumption under load.
Drooping voltage means extra heat generated at the connector or somewhere in the motherboard where the restriction is, which means less clean power and lower clocks. I still say 75w is BS... ~30rms at best...
Quote:
and well... 75W for the pciE slot isnt bs i think... its just not 75W but a certain amount of W on each rail... these 1600s seem to suck really hard on the 12v rail though... maybe the board cant keep the 12v rail in the pciE slots high enough?
You will need to measure at the PCI-E slot or somewhere on the PCB of the card where it comes out of the PCI-E slot. Should have my x1600xt in a couple weeks......
Quote:
hmmm and hence vgpu drops or gets unstable?
imma measure the volts under load and idle on the cards.
HL2 Lost Coast too. I've got all 3 and all i've got to say really is that HDR seems to be a riveboy gimmick so far :slapass: (as in we already get the effect when moving from a dark room to a light one as our eyes adjust to the brightness of our monitor.. hence why gaming in a dark room then walking outside into the daylight sucks :toast: )
Quote:
Morgoth, i wouldnt say they are powerful in intense stuff, they are just powerful in different intense stuff, aka shaders.
the question that pops up though is how good is shader power if the card cant handle the geometry and texture load?
the 1600 shader power reminds me of the geforce fx5200, dx9 wow!, but it couldnt even run simple pixel shaders efficiently...
same with the 1600s, sm3.0, wow, shader power, wow, but the cards are way too weak to run future games that heavily use pixel shaders and hdr...
even in crossfire...
so it makes more sense to buy a x850xt, which is faster, and get a REAL sm3.0 card later when you really need sm3.0 and hdr capabilities.
but im not too sure, i wish i had a day of defeat copy so i could check hdr in that game, its the only game i know with hdr. sure, far cry has a hdr mod as well, but its buggy and more of a nice thing to play with than something you want to run the entire game.
dont get what you mean...
Quote:
Originally Posted by STEvil
then where does the power go?
Quote:
Originally Posted by STEvil
i think those guys know what they are doing...
Quote:
Originally Posted by STEvil
and nope, because then youd measure the efficiency ratio of your psu as well, which varies from psu to psu and with ambient temp and whether you have a 50hz or 60hz outlet or 120v or 220v etc...
no big impact, but this guy knows what hes doing and wanted to get as close as possible to the real numbers :D
http://www.xbitlabs.com/articles/vid...-x1000_14.html
no idea what youre talking about... explain :D
Quote:
Originally Posted by STEvil
yepp, thats what i was going to do, plus check vgpu under load and idle on the back of the card, vmem as well i guess while im at it. and in a couple of weeks? by then the 1700 should be out :D
Quote:
Originally Posted by STEvil
xactly... and well, lost coast and the far cry stuff are more like mods/patches/demos, dod source is the only thing id call a hdr game :D
Quote:
Originally Posted by STEvil
and even thats arguable ^^
im using a fortron 350W (400W) psu, so dont be surprised by the dipping 12v rail with both cards under load ^^
this is an old psu with low watt rating, but its a quality psu, fortron btp series.
psu rails:
12v 12.10 idle 12.00 load
5v 5.22 idle 5.23 load
3.3v 3.38 idle 3.38 load
rails measured on the mainboard:
12v 12.10 idle 11.96 load
5v 5.22 idle 5.23 load
3.3v 3.38 idle 3.38 load
rails measured on the pciE slot: (2nd from the right when you lay the mainboard flat and look at the back of the vidcard. this and the following pins are 12v, then a couple of ground pins and then the 3.3v rail)
12v 12.06 idle 11.93 load
3.3v 3.36 idle 3.36 load
videocard volts (measured on the cap legs on the back of the card; the 2 caps close to the videocard fan plug are vgpu, the other cap on the edge of the card is vmem)
vgpu 1.43 idle 1.45 load
vmem 2.10 idle 2.12 load
11.93 sounds low, the board is eating .04v from the 12v rail...
i will bump the 12v rail and see if it helps to get a higher oc... but i doubt it
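A rough read on what those numbers imply for the psu-to-slot path; note the 3A load current is an assumption, not a measurement:
Code:
# Effective resistance between PSU and PCI-E slot from the 12v readings
# above. Load current is ASSUMED at ~3A (roughly 36W on the 12v rail).

v_psu_load  = 12.00  # at the psu, under load
v_slot_load = 11.93  # at the slot, under load
i_load      = 3.0    # amps, assumed, not measured

drop = v_psu_load - v_slot_load
print(f"drop {drop * 1000:.0f}mV -> ~{drop / i_load * 1000:.0f} mOhm, "
      f"~{drop * i_load:.2f}W lost between psu and slot")

# drop 70mV -> ~23 mOhm, ~0.21W lost between psu and slot -- tiny, so
# the slot wiring itself probably isnt whats holding the cards back.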
Through to ground. If there is no circuit then there is no place for the power to go at all, and no reason for it to be where it is in the first place.
Quote:
Originally Posted by saaya
Really skeptical after having read that.. it seems like they are saying the x1800xt (for example) draws all of its power through the PCI-E slot, but pulling 112.2w (as measured by them) is just not going to happen. Cant imagine what their numbers would have been with it overclocked...
Quote:
i think those guys know what they are doing...
This is why you get a mean power first, then you measure. Of course there will be a 2-5% offset due to efficiency, but we dont need 100% accuracy. If we wanted 100% accuracy the only way to go about this would be to connect power meters inline with the ATX header and any power adaptors for the video cards (also any molexes that plug into the motherboard).
Quote:
and nope, because then youd measure the efficiency ratio of your psu as well, which varies from psu to psu and with ambient temp and whether you have a 50hz or 60hz outlet or 120v or 220v etc...
I dont think his numbers are right if they are measuring the power consumption of the PCI-E slot (as mentioned above).
Quote:
no big impact, but this guy knows what hes doing and wanted to get as close as possible to the real numbers :D
http://www.xbitlabs.com/articles/vid...-x1000_14.html
when volts/amps encounter resistance, heat is generated.
Quote:
no idea what youre talking about... explain :D
I intend to retest HDR with my x1600xt vs. the minimal HDR effects my 9700pro gives.. maybe i'm missing something, but its just useless and compounds an effect we already perceive. Same for motion blur in some games (namely need for speed or Day of Defeat: Source when you are near a grenade explosion).
Quote:
yepp, thats what i was going to do, plus check vgpu under load and idle on the back of the card, vmem as well i guess while im at it. and in a couple of weeks? by then the 1700 should be out :D
xactly... and well, lost coast and the far cry stuff are more like mods/patches/demos, dod source is the only thing id call a hdr game :D
and even thats arguable ^^
.04v droop isnt much. The droop will probably happen after the connector. Also, the power regulators might be insufficient, much like the problem many 9600 series cards had when trying to OC both core and mem.
Quote:
11.93 sounds low, the board is eating .04v from the 12v rail...
i will bump the 12v rail and see if it helps to get a higher oc... but i doubt it
I will do the mods today and see where I get.
correct me if im wrong, but dont all sites and people measure the power consumption of the hardware, and not how much power flew through it?
Quote:
Originally Posted by STEvil
only the differential between what flew in and what flew out gets measured afaik. so the 40W is what was brought to the card and didnt leave it, and the only way it could leave the card would be to ionize the air around it, which would be 0.0000w id guess :D
or through heat.
where do they say the card draws all the power through the board?
Quote:
Originally Posted by STEvil
as i said, they wanted to get as close to 100% accuracy as possible, why would you go for something less accurate if its possible to get better results without a big effort?
Quote:
Originally Posted by STEvil
if you have a question about how they did it just email them, they always replied to my emails so far...
dude, right in the very first paragraph of the page of the article i linked you to:
Quote:
Originally Posted by STEvil
READ dude, READ!
Quote:
To measure how much power the graphics accelerator consumes through the external connector, we used an adapter equipped with special shunt and connectors.
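For anyone wondering what that shunt setup boils down to: you read the voltage across a known small resistance to get the current, then multiply by the rail voltage. A sketch with made-up numbers (the article doesnt give theirs):
Code:
# Power through one connector via a shunt: I = V_shunt / R_shunt, then
# P = V_rail * I. Shunt value and readings are ILLUSTRATIVE, not xbitlabs'.

R_SHUNT = 0.010  # ohms, assumed shunt value
v_rail  = 11.95  # volts on the 12v rail
v_shunt = 0.033  # volts measured across the shunt

current = v_shunt / R_SHUNT  # 3.3A
power   = v_rail * current   # ~39.4W

print(f"{current:.1f}A -> {power:.1f}W through this connector")
# 3.3A -> 39.4W, in the ballpark of the 40W load figure quoted earlier.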
O rly? :P
Quote:
Originally Posted by STEvil
still dont get what you mean with this, please explain:
Quote:
Drooping voltage means extra heat generated at the connector or somewhere in the motherboard where the restriction is, which means less clean power and lower clocks. I still say 75w is BS... ~30rms at best...
yeah, i tested the new far cry patch which was released 3 days ago... even at the lowest settings the blending is so strong that some parts of the level become so bright you can barely see anything, and the door buttons and laptop monitors etc that glow get so bright you cant see anything or read what the buttons say :stick:
Quote:
Originally Posted by STEvil
and yeah, this could be achieved with sm2.0 hardware without any problems afaik...
expect x700 performance when you get your 1600 and bench it, otherwise you will be disappointed :D
its a bit faster than a x700 at stock, quite a bit in some situations, but its not a x800 level card...
its a .13v droop which shows my psu cant keep up on the 12v rail.
Quote:
Originally Posted by STEvil
correct me if im wrong, but droop means the voltage drops because the draw is so big that the circuit becomes less efficient, hence the voltage drops.
the .04v is just the circuit resistance, which is imo pretty large.
for vdimm its usually .02, and on this board theres a molex plug 1cm above the first pciE slot... so its kinda weird the traces have such a high resistance...
im discussing this with some guys on 3dcenter.org and there are 2 guys who think the cards are rather fillrate/tmu limited, which makes sense...
damien, do your cards still scale well when ocing the memory from... lets say 1.7 to 1.8ghz?
how many points in 2k3 and 2k5?
nice usage :D
Quote:
Originally Posted by saaya
The gain isn't really that much.
So I can see they are fill rate limited.
If the crappy cores would oc worth a damn these cards might actually be worth it.
I tried the mods and boom.
Cards are being sent back.
Seems in xfire the cards may be pulling too much power from the slots, so the board cant give it to them.
Also my 12v is being sucked dry and the ocz520 can only give so much power.
I have a x800pro that I am going to test on this board and see how it benches.
hehe i saw the year of the owl flash at ytmnd and have been waiting for a chance to use it somehow ever since ^^
Quote:
Originally Posted by vapb400
the cards blew up?
somebody on 3dcenter.org said there was a 630 lock on the cards... maybe a bios lock? wouldnt be the first time ati did this...
and extra bandwidth doesnt help?
hmmmm
this all explains why these cards are so useless in cf... fillrate doesnt scale with cf and sli afaik, so the fillrate limit gets even worse with cards in cf... the only thing that gets a boost is shader power i think, and thats something these cards are good at, so... 1600s in cf doesnt make any sense... makes me wonder why ati went for so much shader power and so little tmu power... maybe tmus are expensive, ie cost a lot of transistors?
but rv530 is almost half the size of r520!
so whats taking so much die space?
the shader units?
hmmmmm maybe its the threading engine!
thats a part they cant remove from the design without changing the whole concept of the architecture, efficient use of the resources...
but still, they should have capped some shader power and invested those transistors in tmu power i guess...
too bad really...
and whats scaring me is that r580 is rumored to have only 16 tmus, the same amount of tmus r520 has!
if thats true then it will suffer from the same problems as x1600s, a fillrate bottleneck! :S
http://img468.imageshack.us/img468/1...neseowl3hn.jpg
that one is better :banana:
i had a similar one as a desktop background for some time, together with the "derka derka" one and the "o vreimant?" one ^^