o.O It showed it beating the 7800GTX 512 in almost all the games by a healthy 15-20%. Maybe I'm reading different reviews? I read Hexus's and a few others.
Quote:
Originally Posted by Pinnacle
you need to read more reviews dude
http://www.hardwareluxx.de/cms/artik...id=319&seite=7
Quote:
Originally Posted by `SippY
I'm talking about 1280x1024 settings, which most of us play at. It does not beat the 512MB GTX by 20%.
To play at 1280x1024 you don't need a 7800GTX 512 or X1900 :rolleyes: you can get by fine with a single 7800GT. The people paying $500+ for a video card have monitors capable of 1600x1200.
Quote:
Originally Posted by Pinnacle
Unfortunately, you're wrong.
Quote:
Originally Posted by sabrewolf732
At 1280x1024 4xAA, 16xAF the GT will lag. :slap:
Correct.
Quote:
Originally Posted by sabrewolf732
I was really expecting a little more from the XTX, that's all I'm trying to say.
Quote:
Originally Posted by Pinnacle
My X800 Pro played FEAR at 1024x768 with 4x AA and 8x AF; you're telling me a 7800GT can't go up a resolution? And btw, you're wrong.
http://hardocp.com/image.html?image=...JfMV9sLmdpZg==
Btw, I suggest you read more reviews; all sites call the single X1900XT/XTX the fastest single card out :rolleyes:
Again, read more reviews!
Quote:
Originally Posted by Pinnacle
We're not talking about your X800 Pro; stop flip-flopping.
Quote:
Originally Posted by sabrewolf732
My X2 3800+ at 2.4GHz w/ 7800GT at 1280x1024 4xAA, 16xAF lagged in FEAR.
Face it, you're wrong. PERIOD.
You really haven't said anything useful in this thread :rolleyes:
Quote:
Originally Posted by `SippY
So I, along with mass amounts of reviews, am wrong? Uh huh :rolleyes: Also, I was giving the X800 Pro as a reference point, obviously :rolleyes:
Quote:
Originally Posted by Pinnacle
The same can be said for you, basing your opinions off of one review and only considering low resolutions when a single X1900XT can run games playably up to 1900x1400 :rolleyes: People that dish out $500-$600 for a single card usually have enough for a monitor capable of at least 1600x1200.
Quote:
Originally Posted by Pinnacle
Quote:
Originally Posted by THREAD!
How does anything you've said so far relate to the thread at all?
How do they justify the price tag in the UK?
The X1900 XTX is retailing for £469.94, which translates to US$838.79.
Once again people in the UK get screwed over :(
Edit: I just checked, and if I lived in the States I could buy one on Newegg for $549 = £307.58. A penalty of £162 for living in the 51st state.
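For what it's worth, the arithmetic checks out; a quick sketch (the exchange rate is implied by the post's own figures, not an official rate):

```python
# Back-of-envelope check of the UK price penalty quoted above.
uk_price_gbp = 469.94
uk_price_usd = 838.79
rate = uk_price_usd / uk_price_gbp   # implied rate, ~1.785 USD per GBP

newegg_usd = 549.0
newegg_gbp = newegg_usd / rate       # ~307.58 GBP

print(f"UK penalty: {uk_price_gbp - newegg_gbp:.2f} GBP")  # ~162.36
```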
Quote:
Originally Posted by GazC
All them taxes and stuff :( :slapass:
Pinnacle, pls stop modding this thread on your own. Everyone is entitled to their opinions and comparisons.
Some great prices to be found on these...
Perkam
Depending on where you live... :mad:
Quote:
Originally Posted by perkam
No kidding, and I'm not disputing that.
Quote:
Originally Posted by perkam
Check out the first post in the official X1900 thread, GazC... plus check out page 4 of the same thread... the good prices to be had on these apply around the globe this time.
We're seeing good prices and availability in Sweden, France, Germany, United Kingdom, Italy, Canada, and many more places.
Perkam
Well, this is the HQ! And it's been released, so why not discuss it here; the XTX, that is. You're the one that took it off topic comparing it to the 7800GT and your X800 Pro.
Quote:
Originally Posted by sabrewolf732
Just like your facts being useless. If you haven't noticed, most of the sites that review the 7800GT test it with FX-53s and above. And most of the time the 7800GT doesn't reach the minimum playable FPS of 30 at 1600x xxxx. If people can afford an FX they will obviously not go with a single GT.
Well, I just talked to my buddy at a local shop; he said we'd be getting some XTXs next week for a nice fee of $899 Canadian.
Not after I read a lot:
Quote:
After reading reviews etc., I have to say I'm a little disappointed with the X1900.
Anyone feel the same? This card is supposed to challenge the G71, but it loses to the 7800GTX 512MB in almost all tests (at realistic settings; I mean, who uses 1920x1400 and above?)
Quote:
I also want to note that the efficiency of Shader Model 3.0 (dynamic) branching is many times higher in the X1900 XTX than in its NVIDIA competitor, which may also have a positive effect in future games.
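To picture what that branching efficiency buys, here is a toy cost model (my own sketch, not shader code from the review): with efficient dynamic branching, a per-pixel condition skips expensive work entirely, while hardware that handles branches poorly effectively pays for both paths on every pixel.

```python
# Toy cost model of SM3.0 dynamic branching (hypothetical numbers, not real
# shader code): a per-pixel branch lets shadowed pixels skip an expensive
# lighting term; without usable branching, every pixel pays for both paths.

def shade_with_branch(pixels, shadow_mask):
    cost = 0
    for _, shadowed in zip(pixels, shadow_mask):
        cost += 1                 # cheap ambient/diffuse term, always paid
        if not shadowed:          # dynamic branch on a per-pixel value
            cost += 20            # expensive specular loop, skipped in shadow
    return cost

def shade_without_branch(pixels, shadow_mask):
    # No effective branching: both paths are evaluated and the result masked.
    return len(pixels) * (1 + 20)

pixels = range(10000)
shadow_mask = [i % 2 == 0 for i in pixels]        # assume half are shadowed
print(shade_with_branch(pixels, shadow_mask))     # 110000 units of work
print(shade_without_branch(pixels, shadow_mask))  # 210000 units of work
```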
Quote:
NVIDIA again gains an advantage here from 16-bit precision (bear in mind that intensive intermediate calculations at this precision can noticeably degrade rendering quality, and the current de facto standard, and the requirement for all future APIs, is internal calculation in FP32 format). The R580 is nearly three times as fast as the R520 in the complex computing model, and it noticeably outperforms NVIDIA, especially if we don't take FP16 into account.
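The FP16-versus-FP32 point is easy to demonstrate outside of graphics; here is a small numpy sketch of my own (not from the review) showing how FP16 intermediate accumulation degrades while FP32 stays accurate:

```python
# FP16 vs FP32 intermediate precision (illustrative, using numpy's float16).
import numpy as np

x32 = np.float32(1.0)
x16 = np.float16(1.0)

# Accumulate a small increment many times. FP16 has a 10-bit mantissa, so once
# the running sum reaches 4.0 the increment falls below half a ULP and the
# additions round away to nothing.
for _ in range(4096):
    x32 = np.float32(x32 + np.float32(0.001))
    x16 = np.float16(x16 + np.float16(0.001))

print(x32)  # ~5.096, as expected
print(x16)  # 4.0: the FP16 accumulation stalled
```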
So it's executed faster by NVIDIA, which has more texture units. Here is the dilemma we mentioned earlier: different algorithms can be implemented in different ways, and one chip favours computation while the other favours texture access. Some things can be calculated; others can be looked up in a prearranged table. Unfortunately, the architectures differ very much now. Each will have its own optimal shaders, which will pose new difficulties for programmers, especially those who thoroughly optimize the performance of their shaders.
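The "calculate it or look it up" tradeoff is easy to sketch (my own toy, not the review's code): the same specular power function can be evaluated directly, which favours a shader-heavy chip like the R580, or baked into a prearranged table and sampled, the way a texture-rich chip like the G70 prefers.

```python
# Same function, two implementations: direct math vs a precomputed table,
# standing in for ALU work vs a texture lookup. (Illustrative only.)

# "Computation priority": evaluate the function per sample.
def specular_pow_compute(cos_angle, exponent=32.0):
    return cos_angle ** exponent

# "Texture access priority": bake the function into a 1D table once, then
# sample it, the way a shader would sample a prearranged lookup texture.
TABLE_SIZE = 256
POW_TABLE = [(i / (TABLE_SIZE - 1)) ** 32.0 for i in range(TABLE_SIZE)]

def specular_pow_lookup(cos_angle):
    return POW_TABLE[int(cos_angle * (TABLE_SIZE - 1))]

print(specular_pow_compute(0.95))  # exact: ~0.1937
print(specular_pow_lookup(0.95))   # table-quantized: ~0.187
```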
Anyway, the R580 works much better with pixel shaders than the R520 does. The new chip can really be called a shader king: it noticeably outperforms the G70 in any calculations that are not limited by texture sampling.
Quote:
It's the crop of games that haven't been released yet which could show the most significant benefits. As game developers continue to incorporate more shaders into their games and begin to rely on more of Shader Model 3.0's features like dynamic branching, the performance potential of the Radeon X1900's R580 architecture will increase.
http://www.digit-life.com/articles2/...580-part3.html
Considering the capacity of the latest generation of accelerators, performance analysis should be performed only under an AA+AF load. Otherwise, it would be a competition of processors or whole systems rather than of video cards.
It's no secret that many games have already reached the limits of the CPU and the system as a whole, so we don't see the potential of top video cards even in AA 4x and AF 16x modes. That's why we introduced a new HQ (High Quality) mode, which means:
* ATI RADEON X1xxx: AA 6x, plus Adaptive AA, plus Temporal AA, plus AF 16x High Quality mode
* NVIDIA GeForce 6xxx/7xxx: AA 8x, plus TAA, plus AF 16x.
F.E.A.R.
Taking into account that this game demands huge computational resources from an accelerator first and texturing capacity only second, the new R580 architecture performs brilliantly here. Of course, the limited number of ROPs and TMUs holds it back.
Splinter Cell Chaos Theory
The advantage is again not manifold; texturing plays an important role in this case.
Half-Life 2
But I repeat that it's a clear example of an outgoing generation of games. If the game had HDR and Shader Model 3.0, the X1900 XTX would have soared sky high, as would the XT model.
Triple shader capacity is of little help here.
We can see that NVIDIA's famous strong suit (efficient texture operations) has put the X1900 XTX on the same level as the GeForce 7800 GTX, and the X1900 XT is even outperformed.
This test is limited by CPU capacity even more than Far Cry is, so we are interested only in HQ mode.
Here we can see that the game has a previous-generation engine, where texturing and the computational capacity of the shader pipelines are used on equal terms, hence almost no advantage for the X1900 XTX.
Chronicles of Riddick
Alas, OpenGL is again the Achilles' heel of ATI's products. However, the problem is not only in the API. Everybody knows that DOOM III and its engine are fine-tuned for NVIDIA cards (texture operations, stencil buffer, etc.). That's why all GeForce cards come out victorious here; there is no reason to be surprised.
Nevertheless, if we enable HQ, such a load miraculously turns all the victories of the GeForce 7800 GTX into defeats, and the X1900 XTX as well as the X1900 XT shoot forward. The framerates are not really playable, but keep in mind that at these settings resolutions higher than 1024x768 are unusable anyway.
3DMark05: MARKS
The X1900 XTX is clearly victorious, but again not by a multiple; in fact by less than 50%. Texturing still plays a very big role even in such a shader-heavy test: its compute-to-texture requirements ratio is far from 3-to-1.
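That 3-to-1 figure lines up with the R580's 48 pixel shader processors against 16 texture units. A back-of-envelope model of my own (not from the review) shows why the win stays under 3x unless the workload is compute-heavy in the same proportion:

```python
# Crude throughput model: the slower of the ALU and texture pipelines bounds
# the frame. Unit counts are R580 (48 ALU / 16 TMU) vs an R520-like 16/16.

def frame_time(alu_ops, tex_ops, alu_units, tex_units):
    return max(alu_ops / alu_units, tex_ops / tex_units)

for ratio in (1, 2, 3):  # compute-to-texture ratio of the workload
    t580 = frame_time(ratio * 100, 100, alu_units=48, tex_units=16)
    t520 = frame_time(ratio * 100, 100, alu_units=16, tex_units=16)
    print(f"compute:texture {ratio}:1 -> speedup {t520 / t580:.2f}x")
    # 1:1 -> 1.00x, 2:1 -> 2.00x, 3:1 -> 3.00x; texture-bound tests cap the gain
```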
Quote:
In all cases, ATI will have the image quality advantage with angle-independent AF and 6x MSAA.
We tend to like NVIDIA's transparency SSAA a little better than ATI's adaptive AA, but that may just come down to opinion and it still doesn't make up for the quality advantages the X1900 holds over the 7800 GTX lineup.
At 1600x1200, Quake 4 clearly prefers the more balanced combination of high clock speeds and 24 pixel shaders/24 texture units in the GeForce 7800 GTX 512MB.
Quote:
Originally Posted by Pinnacle
Want to argue? Talk to me on AIM.
Quote:
Pinnacle, pls stop modding this thread on your own. Everyone is entitled to their opinions and comparisons.
Staying away from all the stray bullets... So the X1900 series will be the new high end and upper mid range, the X1800 series will be mid range, while the X1600 should be bargain basement. Am I correct?
The X1800 is discontinued.
Quote:
Originally Posted by situman
If the X1800 is discontinued, what will take its place? It's gonna leave a big hole in the middle of the price spectrum.