CCC for the 5.12s is f'd up.
Just use the CCC from the 5.11s.
The 6800GS takes 1.6+V to hit 600MHz on the core with stock cooling.
Funny, the X1600XT can do 600MHz at what, 0.9V? 1.0V?
I think the 128-bit/256-bit memory bandwidth difference will be made up easily just from the massive OCing on the X1600XT. Wish I had one in; it's taking forever to get any good OCing and voltmodding results!
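For a rough sense of the bandwidth gap, here's a minimal back-of-the-envelope sketch. The 700 MHz XT memory clock is the stock figure quoted later in the thread; the 800 MHz OC and the 6800GS's 500 MHz / 256-bit figures are assumptions for illustration only.

```python
# Minimal sketch, assuming DDR memory (2 transfers per clock):
# peak bandwidth = bus width in bytes * effective memory clock.

def bandwidth_gb_s(bus_width_bits, mem_clock_mhz, ddr_multiplier=2):
    """Theoretical peak memory bandwidth in GB/s."""
    effective_mhz = mem_clock_mhz * ddr_multiplier
    return (bus_width_bits / 8) * effective_mhz / 1000.0

# X1600XT: 128-bit bus at the ~700 MHz stock clock, plus an assumed 800 MHz OC.
print(bandwidth_gb_s(128, 700))   # ~22.4 GB/s stock
print(bandwidth_gb_s(128, 800))   # ~25.6 GB/s (assumed overclock)

# 6800GS: 256-bit bus at an assumed 500 MHz stock memory clock.
print(bandwidth_gb_s(256, 500))   # ~32.0 GB/s
```

How much of the gap actually closes depends on how far the memory OC goes; the numbers above only show the shape of the calculation.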
I'm trying to remember...the 6800GS is 12x1 and the X1600XT 4x3, I believe. 12x1 is "faster" than 4x3...for now. I think that's how it works.
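To see why 12x1 looks "faster" for raw texturing, a quick sketch. The 600 MHz clocks are the figures tossed around above, and it assumes the X1600's extra "x3" units are shader ALUs rather than texture units, which is where the thread ends up later on.

```python
# Rough texture fillrate: pipelines * TMUs per pipeline * core clock (MHz).
def texel_fillrate_mtex(pipes, tmus_per_pipe, core_mhz):
    """Texture fillrate in MTexels/s."""
    return pipes * tmus_per_pipe * core_mhz

print(texel_fillrate_mtex(12, 1, 600))  # 6800GS OC'd to 600 MHz: 7200 MTex/s
print(texel_fillrate_mtex(4, 1, 600))   # X1600XT at 600 MHz: 2400 MTex/s
# The X1600's extra units don't show up here if they are shader ALUs, not TMUs.
```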
Looking forward to 3d05 results :)
Perkam
Hold tight, Monday night I should have the card on water :)
:eek: in water!!!!!! :eek:
Quote: Originally Posted by Dillusion
lol j/k
can't wait to see your benchies there :)
Those should be some nice benches
Cards tested so far are 128-bit? Has anyone got their hands on a 256-bit card yet? Any idea where the 256-bit cards are being sold?
No 256 bit cards.
Quote: Originally Posted by HousERaT
I don't think that's how it works. According to the block diagrams from ATI's material, if the X1600 is 4x3 then the X1800 is 4x4.
Quote: Originally Posted by jjcom
x1800 is 16x1 afaik.
Quote: Originally Posted by DrJay
Correct.
Quote: Originally Posted by sabrewolf732
Dillusion...what were you getting on air with these? Most of us are here for air benchies lol :p:
Perkam
That is what I was pointing out. The X1800 is 16x1, the X1600 is 12x1. I guess I need to be more clear....
Quote: Originally Posted by perkam
I'm 99% sure that the X1600 is 4x3, not 12x1.
Look at the block diagrams that compare the R520 and RV530 in the X1000 series reviews.
If it really were 4x3, then the X1600 Pro shouldn't be any faster than an X1300 Pro in many, if not most, apps.
Could you give me a link? I'm having trouble finding comparisons
I've found this
http://www.extremetech.com/article2/...1896684,00.asp
Then toward the end of this page
http://www.anandtech.com/video/showdoc.aspx?i=2552&p=3
The old 4x3 meant 4 pipelines with 3 TMUs each, but what is meant now is pipelines with 3 fragment processing units on each pipeline. The RV515 is 4 pipelines with 1 fragment processing unit per pipeline, and the RV530 is 4 pipelines with 3 fragment processing units per pipeline. Beyond3D has more info on this.
Quote: Originally Posted by DrJay
http://www.beyond3d.com/misc/chipcom...r=Order&cname=
True SnipingWaste!!
Basically the RV530 has the same fillrate as the RV515 but can process 3x more shader operations and 3x more vertex operations! If the new ATI video format recompressor does its math on the GPU shaders, then the RV530 should be around 3x faster than the RV515 :D
PS. ATI can do some physics on the GPU (for example in the Toy Shop demo); again, the RV530 will be 3x faster than the RV515 :cool:
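Putting rough numbers on the "same fillrate, 3x the shader ops" point, a minimal sketch; the 590 MHz clock and the one-op-per-unit-per-clock simplification are only there to make the 3x ratio visible, not to predict real benchmark scores.

```python
# Per-clock comparison of RV515 (X1300) vs RV530 (X1600) at an assumed common clock.

def pixel_fillrate_mpix(pixels_per_clock, core_mhz):
    """Pixel fillrate in Mpixels/s."""
    return pixels_per_clock * core_mhz

def shader_rate_mops(pipes, fragment_units_per_pipe, core_mhz):
    """Fragment shader throughput in millions of ops/s (1 op per unit per clock)."""
    return pipes * fragment_units_per_pipe * core_mhz

core = 590  # assumed clock, roughly the XT stock core quoted later in the thread

print(pixel_fillrate_mpix(4, core))   # both chips write 4 pixels/clock: 2360 Mpix/s
print(shader_rate_mops(4, 1, core))   # RV515: 2360 M shader ops/s
print(shader_rate_mops(4, 3, core))   # RV530: 7080 M shader ops/s, i.e. 3x
```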
Do we know which X1600XTs have the best memory yet? I'm sure you have the answer, Perkam! :D
rofl...:p:
Quote: Originally Posted by C Stat B
Without individual reviews, I have no way of knowing, just like anyone else. However, 1.4ns RAM is pretty much standard on all XTs. Be wary of GeCube X1600XTs, as someone at that place is going to get fired for selling X1600XTs with GDDR2.
Other than that, Powercolor will be coming out with a newer model with VIVO and with a cooler that cools the RAM as well (the current one only cools half the RAM), HIS will be coming out with its trademark Turbo Cool versions, which is like an Arctic Cooling Silencer that comes with the card :) ... MSI's XT has a copper (or what looks like copper) cooler for both RAM and GPU, and Sapphire's solution is also a good one, but the fan blows air straight onto the PCB instead of into the cooler, so I have no idea how effective that'll be.
My advice would be to wait a while for the rest of the models to show up. There may be manufacturers putting 1.2ns RAM on some, but none are planned atm. I'm personally waiting for the Powercolor X1600XT Bravo Edition...if you're looking to get a PRO, wait for the Powercolor X1600PRO, as it has the same cooler as the XT Bravo :)
Beyond that, until cards come in, reviews start to come out and people actually start buying them, there's no way to know which are best...they all seem standard fare with a 590 core and 700 memory.
**NOTE** If you're buying a VF700 or have one already, none of the above matters. Grab the cheapest X1600XT you can find, rip the stock cooler off and start clocking. If anybody's finding it hard to make a decision, they can PM me too...you might end up walking away with a GS; if that's the thing for you, it has a very strong market position atm.
Perkam
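For anyone converting those speed grades into clocks, the usual rule of thumb in a quick sketch; it only gives the nominal rating, not a promise of OC headroom.

```python
# Rule of thumb: rated DRAM clock (MHz) is roughly 1000 / access time (ns).
# Real headroom still depends on voltage, cooling and the individual chips.

def rated_mhz(ns_rating):
    """Approximate rated memory clock for a given ns speed grade."""
    return 1000.0 / ns_rating

for ns in (1.4, 1.2):
    print(f"{ns}ns RAM -> rated for roughly {rated_mhz(ns):.0f} MHz")
# 1.4ns -> ~714 MHz, 1.2ns -> ~833 MHz, versus the 700 MHz stock clock above.
```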
What type of memory do the 512MB X1600 Pro cards use? Same as the 256MB?
It's only a $20 upgrade for double the mem...
Both DDR and DDR2 versions are available...you have to be careful in picking the right one.
Quote: Originally Posted by Shadowmage
Perkam
That's really what I'm looking for, one with 1.2ns memory (I guess that might never happen, though). A shop near my house has the Sapphires in and I'm really tempted; I might pick one up with some Christmas money. Thanks for the reply.
Quote: Originally Posted by perkam
Np. I wish I knew more, but only reviews of these newer non-reference designs will tell us which one is the best.
Quote: Originally Posted by C Stat B
Perkam
That is supposedly what Nvidia started using in their GF 6 series as well, the fragment pipelines that is.
Quote: Originally Posted by SnipingWaste
That is what I'm talking about here, also. I think we are talking about the same thing with different terminology.
To jj: I don't remember which reviews showed the diagrams...but it was quite a few, so they shouldn't be scarce. Basically, they showed 4 blocks for pixel quads in the X1800, 3 for the X1600 and 1 for the X1300.
However, those diagrams may be a bit misleading...
Although the diagrams clearly show 3 quads of pipelines, whether they be fragment pipelines or not. If it really were 4 pipelines with 3 fragment processing units each...there would only be one quad. If the blocks for the 3 "quads" in the diagram were used to designate fragment units, that doesn't seem to make sense either...that way there would be 3 blocks of 4 frag units with no mention of the pixel pipelines.
Maybe ATI was being a little misleading??