I wonder if we will see better performance from the xt/xtx (512/1024) as drivers mature?
Yes. There will be ongoing driver revisions.
BTW... has anyone benched these cards in Vista? I have seen benchmarks where they outperform the Ultras in CF by 20%.
MM
Guys, for those of you who must have one of these 1GB monsters, I am currently working on having one exclusive AIB carry it on their ecommerce site. I will keep everyone posted tomorrow.
MM
If the GDDR4 is so expensive, why not use the lowest latency possible with a 64x12 or 64x14 memory-IC configuration instead of 64x16? It would cut manufacturing cost and reduce the cost to the consumer...
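For what it's worth, here's a rough sketch of that trade-off. The per-IC numbers are assumptions for illustration (32-bit-wide, 64MB GDDR4 ICs at 2.0GT/s effective), not official specs:

    # Back-of-the-envelope for the chip-count trade-off above.
    # Assumed (illustrative, not official specs): each GDDR4 IC is
    # 32 bits wide and 64 MB, running at 2.0 GT/s effective.
    CHIP_WIDTH_BITS = 32
    CHIP_CAPACITY_MB = 64
    EFFECTIVE_RATE_GTPS = 2.0

    for chips in (16, 14, 12):
        bus_bits = chips * CHIP_WIDTH_BITS
        capacity_mb = chips * CHIP_CAPACITY_MB
        bandwidth_gbs = bus_bits / 8 * EFFECTIVE_RATE_GTPS  # GB/s
        print(f"{chips} ICs: {bus_bits}-bit bus, {capacity_mb} MB, "
              f"{bandwidth_gbs:.0f} GB/s")

Under those assumptions, dropping from 16 to 12 ICs saves chips but also cuts the bus from 512-bit to 384-bit and bandwidth from 128 to 96 GB/s, so the cheaper configs aren't free.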
The Call of Juarez DX10 benchmark ate my rig for breakfast. Vista 64-bit, stock 2900XT speeds. Just popped the options tab to "high" and ran it.
Yup, 7.5 doesn't add any significant performance boost, at least with the tests I've been running.
(HL2:Ep1 Bench, Call of Juarez Bench, 3d'06, Unigine v.4)
Drivers aside, the 2900XT is a slug. Even if Hexus.com is getting 20k in 3D'06 with it, they had to have watercooling and CrossFire, and both GPUs were heavily OC'd too.
@trajik78
What CPU? Also on WC? If so, it's a good score ;) I believe they don't tweak their OS for benching ^_^
I'm getting 16.5k with a 3.15GHz quad... 20k is easy with a good CPU and cards @ stock clocks.
In CoJ I get very decent speeds, higher than posted above, with a 939 4400+ @ 2.5GHz (on an A8N SLI, so no CrossFire): 12 min, 44 max, 22 avg. That bench is system memory/HDD limited, as the GTX does not score much more.
CrossFire gives me 14 FPS min, 71 FPS max, 34 FPS avg... the small increase in minimum frames from the second card, while maximum frames jump by ~60%, speaks volumes about how much growth is available in this game, and about the immaturity of the HD2900XT drivers.
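A quick sanity check on that scaling, using just the FPS figures quoted above:

    # CrossFire scaling from the single-card vs. CrossFire numbers above.
    single = {"min": 12, "max": 44, "avg": 22}
    crossfire = {"min": 14, "max": 71, "avg": 34}

    for key in ("min", "max", "avg"):
        gain = (crossfire[key] / single[key] - 1) * 100
        print(f"{key}: {single[key]} -> {crossfire[key]} FPS (+{gain:.0f}%)")

That works out to roughly +17% minimum, +61% maximum, and +55% average, i.e. far from the ~2x a second card could deliver in the ideal case.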
Running CrossFire with official drivers, from boot, the slave card does not get 2D clocks... 3D clocks only, leaving the slave card 12-14°C hotter than the master card. In Vista, this weird behavior allows for CrossFire overclocking; however, you can only change clocks once per boot... setting different clocks again will cause a freeze, as the driver resets the slave card to 3D clocks but with only 2D voltage.
One of the major reasons these cards perform so "poorly" is CPU limits. It's very interesting to see CrossFire X1950s get beaten by a single card, purely because of the extra CPU work the driver does for CrossFire that a single card avoids.
Seeing these performance differences speaks volumes about how biased a lot of the info out there is... we recently had an nVidia driver that raises CPU scores in benches... I wonder what a similar change in an ATI driver might bring for performance...
Yet these ATI cards still outshine G80 in benches. I can feel the "wait" from some of the top benchers... a lot are holding back ATM...
Denny, will YOU be the first to publicly show a confirmed 1350MHz GPU speed? I am very impatient for this... I know it's possible... DI and 1.525 volts... :lol2:
Alright guys... for those of you who live south of the Canadian border, the DIY crowd will be able to pick up these 1GB cards on the 14th of June. They will start delivering on the 15th. The cards will come in 4 tropical flavors, according to the arrangements that were made. Diamond will have the 1GB boards available on their site: $579.00 and $599.00.
Any info about Europe?
Bloodbanger... we are working on the logistics and schedules for Europe. I estimate that Europe should have them within 10-15 business days. They should also be available in Latin America, starting with Mexico. You won't be disappointed.
MM
I see the product info on their site, but no "buy it" link.
GPU is 825 (up from 740 on the 2900XT 512).
Mem is 1100 (up from 825 on the 2900XT 512).
Guess they're going after the GTX with this card, eh? Wonder if it's gonna do as badly as the 2900XT did against the 8800GTS?
One serious online shop has them on preorder here, but 500€ is way too much.
http://www.materiel.net/ctl/Cartes_g...1_Go_OEM_.html
Edit: Specs are not the same as the Canadian one: GPU is 743MHz, RAM is 2000MHz.
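One possible explanation for the mismatch (an assumption on my part about which convention each listing uses): GDDR4 is double data rate, so one listing may quote the base clock and the other the effective rate.

    # Converting between base and effective GDDR4 clocks (DDR = 2 transfers/clock).
    def effective_mhz(base_mhz):
        return base_mhz * 2

    print(effective_mhz(1100))  # 2200 -> the 1100MHz spec quoted effectively
    print(2000 / 2)             # 1000 -> base clock behind a "2000MHz" listing

So the two cards may genuinely have different memory clocks (1100 vs ~1000 base), not just different labels.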
You guys who are looking for better performance out of the 2900XT would do better to run the newer Alphas (8.37.4.2). I've found they are much better performers across the board than the 8.38s. Oblivion is running 400% faster with these (no joke, legit: http://www.nvnews.net/vbulletin/show...&postcount=679 ) and Rainbow Six Vegas is running ~ 20% faster.
The 8.38s are crap, IMO. They blue screen left and right on me, and performance is abysmal in some games.
Which version of 8.37.4? Because in reality we've already got more than 6 betas (8.37.4096 / 8.37.4322 (not sure of the last numbers) / 8.38 RC1/2/7 and official)... So far all the games I play run fine with 7.5, but it's clear some certainly run faster with the old betas and some slower... Now I need to make a choice, and for now I can only wait a bit for the next driver...
This is from your own pics.
Your Alphas:
http://img339.imageshack.us/img339/9...6109ac3.th.jpg
and the better 8.38s
http://img339.imageshack.us/img339/9...0414bl9.th.jpg
That's just one obvious example; I could keep picking them apart if you'd like. All they are is hacked drivers that drop IQ to gain performance. Alpha and Omega have always been that way: tweaked to either gain performance or IQ. I've never been a big fan of them. This is nothing new.
What you're pointing out isn't a bump-mapping deficiency. It's the dynamic weather effect in Oblivion, with the HDR not showing through as clearly.
Here's a shot of the direct sunlight bouncing off that same stone arch on the XT w/ the Alphas:
http://img168.imageshack.us/img168/3...0336053yy6.jpg
Nice try, though. Next?
So you got closer and changed the angle and there is still not as much detail. LOL. And look at the grass to the right in the first pics. If you can't see it, oh well.
Closer? The picture is further away. Look at it again. Instead of standing on the pedestal, he's looking up at it from a greater distance.
And no, I don't see it, so you're going to have to point it out to me, because the IQ looks freaking identical between the two. The only difference is that you pulled a shot without the HDR glare effect (it was active, just dynamic and not in the current scene) and compared it to one with the glare effect.
So again.. next? So far it just looks like you're making up stuff. There's not a bit of truth to what you're claiming.