Jep... there are two s478 boards with PCIe.
Asus P4GPL-X (which I'm using)
Asus P4GD1
I really wonder when those sweet X1800 XTs will arrive in Germany :D
raiden
rofl
Quote:
Originally Posted by ***Deimos***
I stand corrected.
Quote:
Originally Posted by Raiden Zero
Quote:
Originally Posted by ***Deimos***
I don't get the "elegant" argument... graphics cards are not elegant, CrossFire isn't elegant and neither is SLI... discussing how "elegant" a solution is makes strictly no sense to me... we can see some good marketing work :stick:
It has nothing to do with being elegant. I didn't even mention SLI... perhaps you are replying to someone else.
Quote:
Originally Posted by Gnome
Regarding my previous post... it's much like the P3 and Athlon. They were relatively close in terms of performance/MHz, but the P4 widened the MHz gap while narrowing performance/MHz. Of course the performance of ATI pipelines is slightly different, and so is the ratio of ROPs and VS, but for pure pixel shader processing we can make the ESTIMATION of approximately 3:2 performance per MHz for the 7800GTX, because it has 24 vs 16 pipelines. So obviously, as I already mentioned, you need less overclocking to get the same amount of PS performance.
However, notice how each would scale... The VS, ROP and PS would all rise in equal ratio on the R520. But on the G70 (discounting the +40MHz clock domain... not sure if they accounted for that in the article), they would diverge by a wider and wider gap. The ratio is always approximately the same, but who would want *only* 666MHz VS and ROP speed, compared to the R520's 1000MHz, even though the PS performance is about equal?
In the end, as always, things are more complicated than they seem, and such comparisons at equal MHz just raise more questions.
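To put rough numbers on that 3:2 estimate, here is a minimal sketch assuming pixel shader throughput scales simply as pipelines times clock (a big simplification that ignores per-pipe efficiency and driver behavior):

# Rough model: pixel shader throughput ~ pipelines * clock.
# Treat this as an estimate only; it ignores per-pipe efficiency.
def ps_throughput(pipelines, clock_mhz):
    return pipelines * clock_mhz

g70  = ps_throughput(24, 430)   # 7800GTX reference core clock
r520 = ps_throughput(16, 625)   # X1800XT reference core clock

# Core clock a 16-pipe R520 needs for PS parity with a 24-pipe G70:
parity = 430 * 24 / 16          # = 645 MHz, the 3:2 ratio in action
print(g70, r520, parity)

At reference clocks the two land within a few percent of each other on this crude metric, and the divergence point falls out of the same arithmetic: at PS parity, the G70's VS and ROPs run at only 2/3 of the R520's clock (the 666 vs 1000MHz example above).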
My XFX 7800GTX at 557/1430, all stock air cooling: 9389, not bad.. :toast: http://img203.imageshack.us/img203/2355/m055aa.jpg Link http://service.futuremark.com/compare?3dm05=1304002 2k3 19713 http://service.futuremark.com/compare?2k3=4314701
Jo, what kind of cores do you have? I have heard that the W14 (Ref 4) cores were the ones with softground problems, and those chips are on the XL boards, but you have XT boards with W14 chips. Most XT boards have W15 chips (Ref 5). I want to buy an X1800 but I don't know if I should buy an XL or an XT. I want sky-high clocks and don't want to be limited to a max of 600 by the softground problems. I have seen an XL hit 704-705 at XT stock voltages. But the question is: will all XLs do that well, or was that a very lucky board?
865pe....stay on topic, we have threads you can put your OC and scores in :slapass:
They'll do that well if you can handle the heat... these things COOK at 550/1.05V, and 625+/1.25V-ish is A LOT hotter... the new revisions lower the heat and make XT speeds more attainable.
I hope ATI gets the X1800XT into the channel in force before nVidia gets the 90nm stepping with 1.2ns GDDR3 cards into the pipeline. If they get those out, with the ability to hit 600 on the core and 1400MHz on 512MB of memory at spec, it may hurt ATI a little too much for them to be a big player this round. ATI really needs to get it done this time, by the end of November and no later, or nVidia may trump their release.
I don't think it will hit 600. ATi's whole chip is built for speed; the GTX is not. I don't think they will be able to deliver high volumes of cores that run at 600. I think more like 500, 550 max.
Quote:
Originally Posted by HeavyH20
But anyway, ATi will need the R580 to get back on top.
Nice score there. This thread really makes me want to upgrade to an X1800XT.
Definitely wait... there is only a limited amount of "old" W14 chips. Once they are used up, you will notice better OC from the X1800XL on average. Of course it will still depend on the quality of the core, the PCB, etc., but at least you'll know you won't be limited for certain. The 700MHz XL was very likely a new core (W15).
Quote:
Originally Posted by Astennu
Also an interesting point about the voltage. The X1800XT probably has a much higher voltage; how much, I'm not sure. But most reviews I see show a BIG difference in power/heat relative to the clock rates.
About the 7800GTX... consider the simple fact that despite the smaller 110nm process vs the 130nm 6800, its default clocks only go from about 400 to 430 (425/450 to 450/490 if you consider the OC editions that are selling). This indicates that architecturally the 7800 hasn't changed much and, as others stated, is simply not made for high clocks (Athlon64 vs P4 Prescott). You also have to consider that nVidia will be raising the bar from the official 430MHz reference (490!!). So, like before, it will be "conservative" to allow for an OC edition. Best guess is something in the 500-525 range, with 550 possible for OC editions... however, it also very much depends on what special optimizations they use in their manufacturing process (low-k, SOI, strained Si, sleep transistors, etc.)
awesome! :p: what kind of temps does that card put out?
Quote:
Originally Posted by sxs112
That's the reason why this topic exists. :)
Quote:
Originally Posted by longsiew
But don't be disappointed when you end up with a pure benchmarking card that will sometimes even get its ass kicked by some X850XT or 6800U parts when it comes down to real gaming performance... :p:
The X1800 has real gaming performance; I'm not sure what you mean. Games don't run? Bad drivers? I had been playing BF2 on the 7800GTX and switched to the X1800, and to be honest I am scoring much better, and I do not have the driver issues I had with other games. How can I score better? A better average frame rate versus spikes/dips in frame rate (a quick way to check that is sketched after this post).
The TOW missile with TV guidance is so much better on the Radeon.
I just hope you are not referring to wallhacks and other shader hacks on NVidia being easier to set up??
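On the average-versus-dips point above: one way to check both is to compare the average fps against the fps implied by the slowest 1% of frames. A minimal sketch, assuming a hypothetical frametimes.txt log with one frame time in milliseconds per line:

# Average fps vs the worst dips, from per-frame times in milliseconds.
# 'frametimes.txt' is a hypothetical log, one frame time (ms) per line.
with open("frametimes.txt") as f:
    times_ms = sorted(float(line) for line in f if line.strip())

avg_fps = 1000.0 * len(times_ms) / sum(times_ms)

# "1% low": the fps implied by the slowest 1% of frames (the dips).
worst = times_ms[int(len(times_ms) * 0.99):]
low_fps = 1000.0 * len(worst) / sum(worst)

print(f"average: {avg_fps:.1f} fps, 1% low: {low_fps:.1f} fps")

A card with the higher average can still feel worse if its 1% low sits much further below that average.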
If they are, nVidia gets my vote :stick:
Quote:
Originally Posted by FUGGER
I am referring to official benchmarks like this:
Quote:
Originally Posted by FUGGER
http://www.computerbase.de/artikel/h...les_of_riddick
In 1280x1024 the X1800XT gets :slapass:ed by a 6800Ultra.
In 1600x1200 with 4xAA/16xAF the X1800XT gets :banana:ed by an X850XT-PE.
Moreover, nobody can say this game is NV-optimized, since even the X850XT-PE can beat the X1800XT. This card is just a sucky underperformer: high 05 scores but no real-world performance at all.
Another example here:
http://www.computerbase.de/artikel/h...e_of_empires_3
I'd really love to get one of these X1800 XT 512MB cards :D
I wonder if the new Asus P4RD1-MX based on the Xpress 200 chipset will work with Dothan...
Quote:
Originally Posted by Raiden Zero
Currently running a Dothan at 2.80GHz on a P4C800-E Deluxe board. Would be nice if I could grab a PCI Express board and still use the Dothan. Or better yet, get a Yonah in 2006 :D
Very nice result ;-) go ahead
What some of these GTX vs X1800 people are assuming is that performance scales linearly with clock speed.
It's actually a bit geometric, at a rate of maybe 1.1x or so (see the toy model after this post).
For instance, downclock an X1800XT to GTX speeds and disable 8 pipelines of the GTX, and the GTX wins in performance.
But if you overclocked both to 650 on the GPU and the memory accordingly, I think you'd notice that the X1800 would actually perform better at the higher clocks, due to how it's designed, not to mention the enhanced memory controller.
Although I'm not sure if the GTX at 16 pipes is even more geometric than the X1800; that requires tests.
Moral: you can't really compare the GTX and X1800, even if you do use the same pipelines, to determine which hardware is better.
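If that roughly 1.1x geometric claim held, it would amount to performance scaling as clock^1.1 instead of plain clock. A toy comparison of the two models; the 1.1 exponent is only the guess from the post above, not a measured value:

# Toy scaling models: linear says perf ~ clock, the geometric
# guess says perf ~ clock**1.1 (speculation, not a measurement).
def speedup(old_mhz, new_mhz, exponent=1.0):
    return (new_mhz / old_mhz) ** exponent

print(f"linear:    {speedup(625, 650):.3f}x")       # X1800XT 625 -> 650 MHz
print(f"geometric: {speedup(625, 650, 1.1):.3f}x")

In practice memory bandwidth usually caps core-clock scaling below linear, which is one reason this would need the tests mentioned above.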
If you are basing your overall view of two cards exclusively on Chronicles of Riddick (or any single game for that matter, unless that is indeed the only one you play - EVER), then you will obviously have a biased view. However, it is sometimes good to look at similar games to get a general (but not definitive) guideline for upcoming games' performance. E.g. games on the Doom3 engine will presumably continue to run better on nVidia hardware... sad, considering ATI has supposedly been trying to fix this issue since the Quake3 days (1999).
Unoid: I hope you are not implying that doubling MHz would more than double performance (-_- so wrong...). The article that made that comparison was just a general overview of architectures... and should not be treated as conclusive. They failed to account for the 16/24 (ROP/pixel pipe) nature of the GTX vs 16/16 for the XT, the +40MHz clock domain of the GTX, the different timings used for both cards, the different amount of memory on each card... and even the effect of drivers, which may be fooled into doing stupid things because the ratio of the various components' performance is altered.
That article doesn't really provide anything new... it just reaffirms what you see from the plethora of "reviews"/benchmarks out there: they are both very fast, and in the same general league.
Maybe you haven't heard of the X1K MemoryMap fix for OpenGL and AA. The X1800XT performs faster than the 7800GTX in Doom3 with AA enabled (this will be incorporated in the 5.11 Catalysts).
Quote:
Originally Posted by ***Deimos***
I agree with the first part of your statement; it's good to compare performance for similar games, e.g. F.E.A.R. or Battlefield 2 :)
I have not heard of this admad, where have you? Will this fix all OpenGL games starting from Catalyst 5.11 and above? Sounds cool to me.
Name is ahmad :slap: :D
Quote:
Originally Posted by cantankerous
Sampsa did a review of it: http://www.soneraplaza.fi/tietokonee...310025,00.html
Hexus did a nice review that includes chronicles of riddick as well:
http://www.hexus.net/content/item.php?item=3668
The best part is: this is only the beginning :)
Also, I should note that this fix is only for the X1xxx cards. Previous-gen cards got a slight boost in OpenGL performance with the 5.10 Cats.