nice one sabre. saaya, the cards won't go anywhere with the mods, so I think it really is a 630 BIOS lock.
This is stupid in my opinion.
can you mod the bios? the 9500p had the same problem
Quote:
Originally Posted by althes
The cards are packed and going back.:rolleyes:
x1600xt's? you rmaing them w/ a restocking fee? if so I'll buy one from ya lol :toast:
Quote:
Originally Posted by althes
Wow. I'm glad I didn't jump the gun on these.
well, they are not bad cards... they beat a 6600gt, but the oc'ing lock sux...
and they don't scale with cpu power...
they are cheap, and will hopefully get even cheaper... but I'd rather recommend people pay 2x the price and get a decent card,
which lasts 2x as long and lets you play with high settings for at least 1 year...
6800gs, 1800xl, 7800gt... way better price/performance deals if you ask me...
Quote:
Originally Posted by saaya
correct me if I'm wrong, but don't all sites and people measure the power consumption of the hardware, and not how much power flowed through it?
If you don't measure the full power being used, are you really measuring the correct "power"? :stick:
heat, or through to ground like any circuit.
Quote:
only the differential between what flowed in and what flowed out gets measured afaik. so the 40W is what was brought to the card and didn't leave it, and the only way it could leave the card would be to ionize the air around it, which would be 0.0000W I'd guess :D
or through heat.
They didn't on the page you linked, but they didn't state they were also measuring the power connector on that same page either (i.e. I didn't read the whole article since all I wanted was that one page...).
Quote:
where do they say the card draws all the power through the board?
My point is that the method they used didn't seem as accurate as it could have been.
Quote:
as I said, they wanted to get as close to 100% accuracy as possible. why would you go for something less accurate if it's possible to get better results without a big effort?
if you have a question about how they did it, just email them, they have always replied to my emails so far...
didn't have time to read the whole thing, my bad.
Quote:
dude, right in the very first paragraph of the page of the article I linked you to:
READ dude, READ!
:slap:
Quote:
O rly? :P
still don't get what you mean with
please explain
Seems better in just about everything than the x700 to me.. but I'm still using a 9700pro :rolleyes:
Quote:
expect x700-level performance when you get your 1600 and bench it, otherwise you will be disappointed :D
it's a bit faster than a x700 at stock, quite a bit in some situations, but it's not an x800-level card...
http://www.xtremesystems.org/forums/...0&postcount=94
Quote:
it's .13 droop, which shows my psu can't keep up on the 12v rail.
correct me if I'm wrong, but droop means the voltage drops because the draw is so big that the circuit becomes less efficient, hence the voltage drops.
the .4v is just the circuit resistance, which is imo pretty large.
for vdimm it's usually .2, and on this board there's a molex plug 1cm above the first pciE slot... so it's kinda weird the traces have such a high resistance...
12.00v to 11.93v is 0.07v.... (need to use load numbers)
40W / 11.93v = 3.35A (all on the 12v rail? unknown..)
3.35A * 0.07v = 0.235W being generated within the traces of the motherboard or at the connector.
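Side note: a minimal sketch of the arithmetic above, assuming (as the post itself does, flagged as unknown) that the full 40W load arrives over the 12v rail; the function name is just for illustration.
Code:
# Sketch of the trace-loss arithmetic above, assuming the full 40W
# load is drawn over the 12V rail (which the post flags as unknown).
def trace_loss_w(load_w, v_nominal, v_under_load):
    """Power dissipated in the board traces/connector for a given droop."""
    droop = v_nominal - v_under_load      # 12.00V - 11.93V = 0.07V
    current = load_w / v_under_load       # 40W / 11.93V = 3.35A
    return current * droop                # 3.35A * 0.07V = 0.235W

print(round(trace_loss_w(40.0, 12.00, 11.93), 3))  # -> 0.235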
No, I am rmaing them because they aren't working at all.
Quote:
Originally Posted by sabrewolf732
See if I can get back a better pair
you're rmaing them cause they don't oc well? that's shady :slapass:
Quote:
Originally Posted by althes
NO, they are not working at all, won't boot up, I am getting the beeps
oh, that sucks hard :stick: Where did you get them from?
Quote:
Originally Posted by althes
stevil, I don't want to argue with you, please make your point instead of just disapproving of what I said and what xbit said.
so you think they measured the power consumption badly?
how? and please explain what you mean, and don't post a half sentence :P
and you think the power consumption of a card != the heat dissipation of it?
if so, then please specify what makes you think so and how you could/can prove it.
Quote:
They didn't on the page you linked, but they didn't state they were also measuring the power connector on that same page either (i.e. I didn't read the whole article since all I wanted was that one page...).
in the first paragraph of the site I linked you to, they clearly say they also modified the power plug to measure the draw through it
Quote:
didn't have time to read the whole thing, my bad.
http://www.xbitlabs.com/articles/vid...-x1000_14.html
To measure how much power the graphics accelerator consumes through the external connector, we used an adapter equipped with special shunt and connectors
:cord:
? didn't you mention that there are other ways to measure the power consumption you would have used which are LESS accurate than what they did?
Quote:
My point is that the method they used didn't seem as accurate as it could have been.
and for somebody who didn't even read the first paragraph of page 14 of a 30-page article, you don't really have the right to criticize their methods, if you ask me. :P
plus, how would they have done it in a better way?
your alternative would have been even less accurate.
so what's your point?
Quote:
http://www.xtremesystems.org/forums/...0&postcount=94
12.00v to 11.93v is 0.07v.... (need to use load numbers)
40W / 11.93v = 3.35A (all on the 12v rail? unknown..)
3.35A * 0.07v = 0.235W being generated within the traces of the motherboard or at the connector.
why would the cards have a 630 clock limit... weird...
why 630?
and I don't think it's that they don't get enough power through the mainboard... as I said, on my board there's a molex next to the pciE slots, and I adjusted the 12v rail and there's no difference at all...
As I said, I skimmed over it rather quickly...
Quote:
Originally Posted by STEvil
Also, as I said, their methods seemed to be less accurate than what could have been done, which was because I didn't take the time to read it thoroughly enough... I thought the shunt was installed between the PCI-E slot and the video card..
Quote:
Originally Posted by STEvil
no +5v?
Quote:
Originally Posted by Xbit
There are two ways to do it, one more and one less accurate.
Quote:
? didn't you mention that there are other ways to measure the power consumption you would have used which are LESS accurate than what they did?
They didn't give us a picture of what they did; do they have the right to tell us they've done something without showing us proof? Yes, I'm just being hard on them here...
Quote:
and for somebody who didn't even read the first paragraph of page 14 of a 30-page article, you don't really have the right to criticize their methods, if you ask me. :P
The more accurate alternative would have been to connect the power meter right at the PCB of the PSU for each rail (+3.3v, +5v, +5vsb, +12v) and use a single-rail psu (no split +12v, since those could skew readings).
Quote:
plus, how would they have done it in a better way?
your alternative would have been even less accurate.
my point is you are getting negligible losses at the motherboard/connector, so your PSU and/or mobo/pci-e slot are likely not at fault. 0.07v is a far cry from 0.4v or 0.13v as well.
Quote:
so what's your point?
There really is not much more to say than that voltage/amperage encountering resistance produces heat. This is a basic rule of electricity, or any kind of energy: energy encountering resistance must dissipate some of its value to continue to its destination.
Quote:
and please explain what you mean, and don't post a half sentence :P
and you think the power consumption of a card != the heat dissipation of it?
if so, then please specify what makes you think so and how you could/can prove it.
As to proving that not all energy "consumed" goes directly to heat... first let's see you cool an opteron or X2 putting out 118W with the x1800xt cooler; then you can explain to me where that 118W goes if there is no circuit for it to travel through.
Power consumption != heat, that's just common sense. If electronics turned all of their watts into heat, we wouldn't be typing right now.
Quote:
Originally Posted by saaya
stevil
pciE cards can't use the 5v rail, the pciE slot only provides 3.3v and 12v, and the power plug is 12v only, at least on the cards I've seen.
how is connecting at the psu pcb more accurate? then you'd also measure the resistance in the psu cables... plus you couldn't measure what power goes to the videocard and what goes to the cpu and memory etc, this would only work to measure the overall system power consumption.
and they did post pics when they first used this method of measuring the power consumption of cards, when r420 launched iirc... or when the 6800 launched? yeah, I think when the 6800 launched.
a 1800xt consumes 118W, the card, not the vpu!
the 1800xt vpu is probably only around 70W I'd say, and considering the large die that produces the heat, there is no problem cooling it with the rather small and simple heatsink.
and I've seen some guys put a geforce fx leafblower on an athlonxp and they got really nice temps. those heatsinks are pretty good since they have a lot of cfm and the air flows steadily over all fins, and there's no dead spot in the center of the heatsink like with all "normal" heatsinks.
afaik power consumption = heat...
Quote:
Originally Posted by sabrewolf732
If all the power is being transformed into heat, how would anything work? What would separate pc's from space heaters?
space heater = free electrons of the electron flow (electricity) randomly hit other electrons bound to atoms, and the atoms and electrons start moving faster.
Quote:
Originally Posted by sabrewolf732
heat = the speed matter is moving at; the faster the matter moves, the hotter it gets
pretty much like a party where matter = people and electricity = booze :D
the more booze you bring to the party, the more it will make people start to dance n freak out :D
the faster people dance and the more they freak out, the better the party = the bigger the heat.
the difference between a heater and an ic is that the electrons don't just hit random atoms, but run through traces and flip switches on and off.
on the way they hit atoms and create heat though, and switching the transistors also results in heat afaik...
the heat is generated when the transistors switch: electrical power turned to mechanical, then heat, via friction. this is the power the cpu/gpu consumes. Heat is also given off through power leakage though, so the heat given off by the cpu/gpu is not all directly related to power consumption, as the leakage will vary from chip to chip.
well, from what I know, the leakage only determines how much of the power the ic draws is actually used to do some useful work, aka switching a transistor, and how much is wasted by just leaking out of the traces before it reaches a transistor to switch.
afaik the leakage of prescott is pretty high, at 40% or so.
so of the 140W+ a prescott consumes, 56W gets lost inside the traces before it can switch a transistor, and the rest, 84W, gets consumed/turned into heat by switching transistors.
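A minimal sketch of that split, taking the rough 40% leakage figure above at face value (the numbers are the post's own rough estimates, not measurements):
Code:
# Split total package power into leakage and switching power,
# using the ~140W / 40% Prescott figures quoted above.
def power_split_w(total_w, leakage_percent):
    leakage = total_w * leakage_percent / 100   # lost before doing useful work
    switching = total_w - leakage               # spent actually flipping transistors
    return leakage, switching

print(power_split_w(140.0, 40))  # -> (56.0, 84.0)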
So if all the power being consumed goes directly to heat, where does the energy needed to open and close the gates come from? You're trying to get something (computer output) for nothing (saying all power = heat)
Quote:
Originally Posted by saaya
uh, kinda, but no. those 56W get lost whether they are switching transistors or not. leakage can be different things, but generally it's simply current leaking out of the circuit. because silicon is an insulator, it will turn this leakage into heat, hopefully enough of it before the leakage is enough to interfere with the switching of the transistor where the leakage lands. this sort of leakage is what affects the r520, to the point that the leakage creates a path to ground and stops the flow of the circuit, leading to the gpu locking up @ higher voltages.
Quote:
Originally Posted by saaya
no, the electrons carry power which they use to hit the switches and change them from 0 to 1 or 1 to 0; the switches however dissipate the power as heat again.
Quote:
Originally Posted by sabrewolf732
if you hit them and set them to 1, they keep the energy until they get switched off, then the energy gets transformed to heat and they move to 0.
or it's the other way around, I don't remember.
you don't get anything for free, you had to dump loads and loads of electrons in to make them move the switches. in the end the energy gets transformed to heat, but the switching of the transistors doesn't come for free at all.
the heat is just a bonus, which you could use to heat your room or cook your food in theory :D
but the heat is more like a side product.
somebody could just as well say that by keeping cattle to produce methane gas in a bio reactor you get meat for free... :D
or by keeping cattle for meat you get methane gas for free to power your house :D
yeah, that's what I meant, the leakage is lost anyway; those electrons won't be able to switch transistors because they get lost along the way.
Quote:
Originally Posted by cadaveca
but you're right, they can still affect the ic.
however, I doubt that the lockups of an r520 or any other ic are caused by a short within the ic.
a short within the ic would kill it, I think.
it would be like a flash: a big load of current would travel through the path where electrons managed to escape the traces and go to ground, and the traces they flow through would literally melt.
LOL, I tried all 3 ati bios editors I could find but none of them will work, 2 say bad file and 1 crashes :(
so I opened the bios file I saved from my card with notepad, and guess what the first line says, lol :D
test bios??? on a retail card? :stick: lol...
Quote:
RV530 PCI_EXPRESS DDR2 A67611 RV530PRO TEST BIOS DDR2 500e/400m
and this is even more interesting:
so those cards had a power plug originally?
Quote:
YOU HAVE NOT CONNECTED THE POWER CABLE TO YOUR VIDEO CARD.PLEASE REFER TO THE 'GETTING STARTED GUIDE' FOR PROPER HARDWARE INSTALLATION.
the bios is from the 31st of October.
this explains why the cards draw so heavily on the 12v rail: originally they had a power plug, I think, and the power plug is 12v only.
but ati wanted them without a power plug, I guess...
or I'm just wrong and ES cards always have a power plug, and that's why this message is in the bios.
you'd be right, but you are not. the leakage is enough that the GPU cannot continue to run... and the issue is a "soft ground", something that is kinda built in to deal with leakage issues, but it's so bad at this point that the gpu falters...
Quote:
Originally Posted by saaya
And yes, it IS melting these R520 cores. I think the issue IS causing definite damage, as each time I run the card over 1.275, I get more and more artifacts @ stock.
hmmm, over 1.275 already gives you artifacts at stock?
I thought people were running 1.5v with decent cooling?
are you on air?
how high did you set vgpu?
I've gone as high as 1.35 under water. it didn't run very long before it locked up.
Quote:
Originally Posted by saaya
after running 1.275 for a few benches @ 675g/730m, upon reboot I noticed artifacts while running 3d... @ stock voltage. Changing drivers does not help the issue, and it is evident in all 3d apps now.
try RMAing your card back...
Quote:
Originally Posted by cadaveca
I don't RMA parts I kill. :stick:
see if they will take it and test it to see if it is a soft ground issue or not. If it were, this might indicate a manufacturing defect they could notify the die producer about, since it was supposedly fixed.
seeing how production of these cores has stopped, I hardly think it matters. they stopped production so early for a reason, after all.
well, let's hope r580 doesn't have this...
Quote:
Originally Posted by cadaveca
pretty sure it doesn't. I'm pretty sure I can place the exact issue based on how it artifacts, and I can understand why it took them so long to figure it out, but I DO know that they got it covered.
I thought the issue that caused dying vpus was already fixed in the current r520 chips? it was a 3rd party component that caused this issue, at least that's what beyond3d reported.
Maybe in that instance... then again, I find not everything out there is the truth. Everyone likes to pass the buck; not saying that's what happened, merely that it can happen.
yeah, claiming "some third party company" could just as well be an excuse from ati for not finding an issue they caused themselves...
Quote:
Originally Posted by cadaveca
why not?
Quote:
Originally Posted by cadaveca
I'm honest. seems you are not. :rolleyes:
define honest... if they send you back a new one, that means they consider the board screwy; from there on it's not our problem...
Quote:
Originally Posted by cadaveca
that's pretty shady. Read the warranty: if you modify the card, you void the warranty. Sending it back is dishonest. Good man cadaveca :toast: Not making industry prices go up :toast:
Quote:
Originally Posted by Gnome
Quote:
ATI WILL NOT BE LIABLE UNDER THIS WARRANTY IF ITS TESTING AND EXAMINATION DISCLOSE THAT THE ALLEGED DEFECT OR MALFUNCTION IN THE PRODUCT OR SOFTWARE DOES NOT EXIST OR WAS CAUSED BY CUSTOMER’S OR ANY THIRD PARTY’S MISUSE, NEGLECT, IMPROPER INSTALLATION OR TESTING, UNAUTHORIZED ATTEMPTS TO OPEN, REPAIR OR MODIFY THE PRODUCT OR SOFTWARE, OR ANY OTHER CAUSE BEYOND THE RANGE OF THE INTENDED USE, OR BY ACCIDENT, FIRE, LIGHTNING, OTHER HAZARDS, OR ACTS OF GOD. THIS WARRANTY WILL NOT APPLY TO PRODUCTS USED FOR NUCLEAR RELATED, WEAPONS RELATED, MEDICAL OR LIFE SAVING PURPOSES.
gnome, yhpm
if they send you a new card, it's most likely because they don't have the time to check, or don't have the knowledge to find out that you manipulated their product and maliciously damaged/destroyed it.
you are free to do whatever you desire, breaking the law included, but we won't allow discussions about this on the forum.
well, technically:
Quote:
Originally Posted by saaya
you're allowed to send it back; if they don't find it, it's their problem according to their wording. But, yes, still shady.
Quote:
WILL NOT BE LIABLE UNDER THIS WARRANTY IF ITS TESTING AND EXAMINATION DISCLOSE THAT THE ALLEGED DEFECT OR MALFUNCTION IN THE PRODUCT OR SOFTWARE DOES NOT EXIST OR WAS CAUSED BY CUSTOMER’S OR ANY THIRD PARTY’S MISUSE, NEGLECT, IMPROPER INSTALLATION OR TESTING, UNAUTHORIZED ATTEMPTS TO OPEN, REPAIR OR MODIFY THE PRODUCT OR SOFTWARE, OR ANY OTHER CAUSE BEYOND
more than technically, a little emphasis on:
Quote:
Originally Posted by sabrewolf732
meaning that not only are you allowed to send it back, but it's THEIR TESTING that decides whether your warranty applies or not.
Quote:
WILL NOT BE LIABLE UNDER THIS WARRANTY IF ITS TESTING AND EXAMINATION DISCLOSE THAT THE ALLEGED DEFECT OR MALFUNCTION IN THE PRODUCT OR SOFTWARE DOES NOT EXIST OR WAS CAUSED BY CUSTOMER
anyway, back to topic cause this is a bit off; I've already had my warning...
Okay, I did some '01 benchmarking with a X1600XT... it's not in crossfire, but I didn't want to start a new thread.
First one is at stock clocks of 580c/684m.
#2 is with a mild OC to 641c/837m.
I know there has been a lot of talk about the RV530 having 12 pixel shaders but only 4 'complete' pixel pipelines..... but over 29K... not bad really. I was kinda thinking this card would suck at '01, with the DX7 tests and all. With a volt mod and an OC to 700+MHz, a single X1600XT should be well over 30K.
EDIT: I suppose this is more in line with OC'd 8 pipe cards like X700pro, X800GT and 6600GT.
3100MHz is a lot of power
the 1600s don't care about cpu power; that 2k1 run shows your cpu power more than your gpu power, man... :D
Think this card could get to ~32K with a better CPU.
Quote:
Originally Posted by saaya
2k1 doesn't care about cpu? :slap: Perhaps not in 03 or 05 where the x1600 is too slow, but 01? hah
read my post again...
Quote:
Originally Posted by sabrewolf732
I said the 1600s don't care about cpu power, I never said 2k1 doesn't care about cpu power. I even said his 2k1 run is showing off his cpu speed more than his gpu speed :slap:
doh, I'm stupid :toast: :woot:
Quote:
Originally Posted by saaya
nah, you just need to pay a lil more attention :D :toast:
Quote:
Originally Posted by sabrewolf732
I don't know what makes gpus cpu-dependent, but whatever it is, the 1600s don't have it... which is really too bad, as we have such powerful cpus nowadays... cpu power just exploded when dual core cpus arrived, and ati releases a card that doesn't benefit from all that cpu power... weird...
I hope these new cores are better
You guys are stating the obvious here. Besides, what makes this any different from any other 8-pipe card that can score ~30K in '01 with proper cpu power?
Quote:
Originally Posted by saaya
EDIT: I was just trying to show that these cards are capable of decent '01 scores similar to other mid-range cards. (although they will probably need more MHz than, say, a 6600GT)
For sure. Even without more cpu MHz, it could likely reach 30-31K just with higher gpu clocks.
Quote:
Originally Posted by perry_78
Quote:
Originally Posted by sabrewolf732
How about 11K? Doesn't seem too bad for a single card, considering it's only at 648MHz.
EDIT: It's covered up in the pic, but the cpu speed was set @ 3,010MHz.
Despite what ATItool says, the default speeds are 580.5 / 684.
1600s are 4 pipes ;)
Quote:
Originally Posted by DrJay
other cards benefit from more cpu power, 6600s for example; the 1600s don't, not at all...
iirc I got a boost of 50 points in 2k3 going from 2GHz to 2.5GHz of a64 power...
and no difference at all in 2k5
Hi, is it possible to change the voltage through ATITool for the x1600, or only with the ATI overclock tool?
...other cards that are in the same mid-range category and happen to be 8 pipes... don't be so literal.
Quote:
Originally Posted by saaya
1600s do benefit from extra cpu power... that is, if you are talking about '01.
ATItool hasn't worked for me on the X1300 or 1600.
Quote:
Originally Posted by TheVaLVe
well, it doesn't matter if it's 4 pipes or 8 pipes; all the other mid-range cards I know scale very well with more cpu power, the 1600s don't.
Quote:
Originally Posted by DrJay
the fact that they are 4 pipes probably explains it. it's a fast 4-pipe card, but it's a 4-pipe card, and other 4-pipe cards are maxed out cpu-wise as well by today's cpus.
and of course it benefits from cpu power in 2k1 :P
2k1 is cpu-limited for almost all cards nowadays, hence they will all scale better with higher cpu clocks.
this doesn't mean that the card scales well with more cpu power.
it's in 2k3 and 2k5 where you can see if and how well cards scale with more cpu power
But 2K1 is what we were talking about. The '01 score I posted yesterday prompted the last several posts.
I have seen about as much increase (from cpu scaling) in '03 and '05 scores with the 1600xt as with other cards. (not that much)
No problems though, we can disagree. ;)
what cards for example?
I see much bigger bumps from more cpu speed with my other cards...
anyways, I think I found the bios lock!!! :banana:
I tried the latest rabit, but it still doesn't work; it doesn't crash when loading the bios, but it can't set anything or even read the memory timings.
so I went for a hex editor and looked at the bios I saved.
I knew that the reference clock for the card was 27mhz, that's all the extra info rabit gave me.
so I searched for 27,
found it at 5b2c, and following it found this:
27;9743234551,'/0234433258:=<98899:741
98899:741 is what? 133
and what's the speed the 1600 cards stop clocking higher at?
630mhz, or to put it differently, 500mhz default clock + 133mhz = 633mhz
27mhz reference clock ??? =< (is below) 133
could it be that this is basically the lock telling the card default clock + 133mhz = :nono: ?
whoooohooo, I'm getting an unlocked 1600pro and 1600xt bios :banana:
I'll post them here as soon as I get them :)
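For what it's worth, here is the guessed lock spelled out; the check and the names are hypothetical, reconstructed only from the hex-dump reasoning above, not from the actual BIOS code:
Code:
# Hypothetical reconstruction of the suspected clock lock: the BIOS seems
# to refuse core clocks above default + 133MHz. Names and logic are guesses.
DEFAULT_CORE_MHZ = 500   # default core clock from the test bios string
LOCK_OFFSET_MHZ = 133    # the suspected "+133" ceiling from the hex dump

def clock_allowed(requested_mhz, default_mhz=DEFAULT_CORE_MHZ):
    return requested_mhz <= default_mhz + LOCK_OFFSET_MHZ

print(clock_allowed(630))  # True  -> right at the observed ~630MHz wall
print(clock_allowed(650))  # False -> why the mods won't go anywhere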
I'm not saying the x1600xt is a slow card, it's actually a damn decent card, but 03 and 05 are too gpu-intensive for the cpu to make much difference imo
Quote:
Originally Posted by DrJay
tbh my x800pro didn't really scale at all with cpu power, and neither did my 6800 :confused:
Quote:
Originally Posted by saaya
Quote:
Originally Posted by saaya
:woot: :toast: let us know how these x1600s can clock! would be crazy if they can hit 750+ :cool:
my x1600xt arrived today.. I notice a lot of missing SMD components.
Anyone got links to pics of a Sapphire X1600XT vs. other manufacturers' X1600Pro/XTs sporting GDDR3, so we could compare PCB components (hopefully a similar design)?
Yeah, that was my point. It is a decent card. Compares nicely to X700s, 6600gt and some others in '01. Scales just as well in '01. Seems to have the same lack of cpu scaling as other cards in '03 and '05.
Quote:
Originally Posted by sabrewolf732
I'll be volt modding my card as soon as I get home from work tomorrow. 5K ohm pot between pins 5 and 7. Stock Vgpu is 1.37... resistance between FB and GND is 59 ohms. The pot will need to be turned below 2K before any significant gains are realized.
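A rough sketch of why the pot has to come down that far, assuming the usual FB-to-GND pot mod where the pot sits in parallel with the stock feedback resistance (the exact Vgpu change depends on the regulator's feedback network, which isn't given here):
Code:
# Parallel combination of the stock 59 ohm FB-GND resistance and the pot.
# A 5K pot barely moves it, which matches the post's claim that gains
# only show up once the pot is turned below ~2K.
R_FB_GND = 59.0  # ohms, measured between FB and GND (from the post)

def effective_r(pot_ohms, r_stock=R_FB_GND):
    return (r_stock * pot_ohms) / (r_stock + pot_ohms)

for pot in (5000.0, 2000.0, 500.0, 100.0):
    print(f"pot = {pot:>6.0f} ohm -> effective {effective_r(pot):.1f} ohm")
# 5000 -> 58.3, 2000 -> 57.3, 500 -> 52.8, 100 -> 37.1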
yes, it's a nice card, it beats the 6600gt, but it costs more/the same, and the 6600s are 2 years old now... so... wow! they beat 6600s... :P
they are ok cards, but I'm still surprised how huge the core is... the 6600 is tiny and performs around the same...
That is kinda strange. The core on my 1600 is smaller than the NV43 die.
Voltmodded my x1600xt. Stock Vgpu = 1.37 / modded Vgpu set @ 1.42.
With room-temp water cooling @ 702MHz core / 837MHz mem, I got the following scores:
01: 30,458
03: 11,561
05: 6,131
Still can't publish X1600-class projects.
Interesting... might have to pick up a second card.
Next stop: 750MHz.
yes, but it's 90nm vs 110nm; rv530 is bigger than nv43
Quote:
Originally Posted by DrJay
http://www.cdrinfo.com/Sections/Arti.../card/nv43.jpg
http://www.itxian.com/Files/BeyondPi...1600_thumb.jpg
cool, so your bios is not locked. what driver did you use?
what app did you use to oc?
what card is it? a sapphire? powercolor? asus?
actually, the 6600 and 6800 are a lot slower if you look at benchies
Quote:
Originally Posted by saaya
Suspect dr jay got an unlocked bios on his card.
cat 5.13s, Overclocker tool, Sapphire
Quote:
Originally Posted by saaya
Re: Saaya, come on now.... you can look right at an NV43 core and plainly see that it is larger than the RV530. You're saying that doesn't count because the RV530 is built on a 90nm process? Or are you just talking about transistor count and not actual size?
As far as the '03 score in my last post goes, that's not really any higher than a 6600GT, is it? Hmmmmm
hmmm, did you use the 5.13 ccc?
and yeah, I was talking about transistor count, not the size of the core, sorry, should have made myself more clear :)
it's mostly about performance/transistors, and rv530 really sux there... :/
especially if you consider nv43 is 2 years!!! older than rv530
6600's are going EOL.
here are my results with a pair of X1600XTs in CF
2K3
System Stock (Venice 3200+, 1024MB 2-2-2-5, MSI RD480)
http://img150.imageshack.us/img150/8601/record0bb.jpg
2K5
System (Venice 3200+ @2.5Ghz, 1024MB 2.5-3-3-7, MSI RD480)
http://img444.imageshack.us/img444/5538/dibujo7ir.jpg
2K6
System (Venice 3200+@2.5Ghz, 1024MB 2.5-3-3-7, MSI RD480)
http://service.futuremark.com/compare?3dm06=232388
WOW that is an impressive 05 score there. And you only have an RD480... That score should go up considerably with a better board. With an RD580 you will gain about 1500-2000 points in 3DMark05 and 1100 points in 3DMark06.
CPU: Intel Pentium D 820 2.8GHz 2x1MB L2 cache @ 3.5GHz / 1.5v
MB: Asus Socket 775 P5WD2
RAM: 4 x 512MB Kingston DDR2 PC2-3200 400MHz @ 250MHz / 3-3-3-8 / 1.8v
GFX: 2 x Gigabyte X1600PRO 500/400 @ 621/459
3DMark Score 7660
http://service.futuremark.com/compare?3dm05=1905498
damn, I had a Smithfield that day :(
the cards had DDR2 mem which could only do ~450mhz
the gpus are passively cooled ;)
now I will run it with the cards oc'ed.
these scores are with the cards at stock.
impressive, keep it up
thx
now I got 9.2K in 3DM05 and 3.9K in 3DM06
CPU: Intel Pentium D 920 2.8GHz 2x2MB L2 cache @ 4.2GHz
MB: Asus Socket 775 P5WD2
RAM: 4 x 512MB Kingston DDR2 PC2-3200 400MHz @ 450MHz
GFX: HIS X1600XT + Sapphire X1600XT CrossFire @ default
3D Mark 2005 Score 8526
http://img312.imageshack.us/img312/1764/05def3rs.th.jpg
HIS & Sapphire TEAMWORK :toast:
3dmark 03 is low and 06 crashed...
maybe thanks to cat 6.4
edit:
ok, got 06 running
CPU: Intel Pentium D 920 2.8GHz 2x2MB L2 cache @ 4.2GHz
MB: Asus Socket 775 P5WD2
RAM: 4 x 512MB Kingston DDR2 PC2-3200 400MHz @ 450MHz
GFX: HIS X1600XT + Sapphire X1600XT CrossFire @ default
3DMark Score 4043
http://img342.imageshack.us/img342/4715/06def1vs.th.jpg
low score?
edit2:
Cards:
HIS X1600XT + bundle
http://img324.imageshack.us/img324/4...37169ac.th.jpg
HIS X1600XT top
http://img199.imageshack.us/img199/6...37170gk.th.jpg
HIS X1600XT bottom
http://img199.imageshack.us/img199/2...37185yj.th.jpg
Sapphire X1600XT top
http://img199.imageshack.us/img199/1...37195nx.th.jpg
Sapphire X1600XT bottom:
http://img324.imageshack.us/img324/9...37201id.th.jpg
ATi rules, cards from different firms work fine together - they even have different clocks
very nice
CPU: Intel Pentium D 920 2.8GHz 2x2MB L2 cache @ 4.2GHz
MB: Asus Socket 775 P5WD2
RAM: 4 x 512MB Kingston DDR2 PC2-3200 400MHz @ 450MHz
VGA: Sapphire X1600XT 583/689 @ 621/725 CrossFire
3DMark Score 8866
http://service.futuremark.com/compare?3dm05=2000170
2nd place in the orb for two Sapphire cards :)
Well, try the Colorful Radeon X1600 XT, 256MB GDDR3, DVI, TV-out, PCIe =)
It has stock 650/700 clocks...
Regards Dennis
Now I've got two 1600xts in crossfire. :cool:
3dmark03: 15,557
3dmark05: 9,196
Both cards are from Sapphire, running on an Abit AT8 32X with an A64 @ 2250mhz.
I don't have much time atm, so the system is nearly out of the box. The GPUs don't have much room for oc, so I'll leave them at normal settings, but the CPU has room for improvement.
does someone have pics of all the possible vmods for the x1600xt?
thanks ;)
ok, I have a dilemma... a friend who is running a single card wants to buy my x1800xt 512mb, and for about 60% of what I get for it I can buy 2 x1600xts... my mobo is an abit at8 32x, opteron165@2.6ghz, and I'm getting ca. 5200 3dmarks in 06 with the xt oc'ed at 730/820... now, could the x1600xts reach these numbers (at least 4500 or so), and is there a particular game setting where the 256-bit bus and 512mb of the single card will blow the limited x1600xts away (like really bad :P)
Quote:
Originally Posted by d4d4cH
the x1600xt is not technologically capable of matching an x1800xt. It would be an 8x3 config, and the 1800 is 16x1, but the inefficiency of xfire and the 128-bit mem bus will kill it.
my x1600xt's at 600/800 and the 165 @ 2600 (cfx3200-dr mobo) are good for 10k in 05 / 5.5k in 06. Scores are on hwbot.org
Just don't use AA (or rather, not too much) with them and they fly fine. 1680x1050 16AF hasn't failed me yet.
so 4xAA kills the performance? how bad?...can u run a benchmark where u use 4xAA 16xAF @ 1680x1050, since it looks like we have the same monitor :)
not really sure, never bothered with AA over 2x since running 1680x1050 or playing an FPS negates the need for it really.
nice cards & nice score :D
not bad at all
ok... one more thing... I can only find your score of 4.5k in 2006 on that site :)
it was only 4.5, typo ;)
hmm... I got a good opportunity to trade my powercolor x1800xt@pe for 2x powercolor x1800gto... should I do it?
:)
I would.