I wonder if we will see better performance from the xt/xtx (512/1024) as drivers mature?
Yes. There will be ongoing driver revisions.
BTW... has anyone benched these cards in Vista?
I have seen benchmarks where they outperform
the Ultras in CF by 20%.
MM
Guys, for those of you who must have one of these 1GB monsters,
I am currently working on getting one exclusive AIB to carry it on their
ecommerce site. I will keep everyone posted tomorrow.
MM
If the GDDR4 is so expensive, why not use the lowest latency possible and a 64x12 or 64x14 memory IC configuration instead of 64x16? It would cut manufacturing cost and reduce the price to the consumer...
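To make that trade-off concrete — a minimal sketch, assuming 512Mbit (64MB) GDDR4 ICs with 32-bit interfaces, which is how I read the "64x16" notation; the per-chip width is an assumption, not something stated here:

```python
# Capacity and bus width for N x 64MB GDDR4 ICs, assuming 32-bit chips
# (16 of them fully populate R600's 512-bit bus).
CHIP_MB, CHIP_BITS = 64, 32

for chips in (12, 14, 16):
    print(f"{chips} chips: {chips * CHIP_MB} MB on a {chips * CHIP_BITS}-bit bus")

# 12 chips: 768 MB on a 384-bit bus   <- cheaper BOM, narrower bus
# 14 chips: 896 MB on a 448-bit bus
# 16 chips: 1024 MB on a 512-bit bus  <- the shipping 1GB configuration
```

So dropping ICs does save board cost, but each chip removed also removes 32 bits of bus width, and with it bandwidth; it isn't a free saving.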
The Call of Juarez DX10 benchmark ate my rig for breakfast. Vista 64-bit, stock 2900XT speeds.
Just popped the options tab to "high" and ran it.
Yup. 7.5 doesn't add any significant performance boost, at least with the tests I've been running.
(HL2:Ep1 Bench, Call of Juarez Bench, 3D'06, Unigine v.4)
Drivers aside, the 2900XT is a slug. Even if Hexus.com is getting 20k in 3D'06 with it, they had to have watercooling and CrossFire. Both GPUs were heavily OC'd too.
@trajik78
What CPU? Also on WC? If so, it's a good score ;) I believe they don't tweak their OS for benching ^_^
I'm getting 16.5k with a 3.15GHz quad...20k is easy with a good CPU, and cards @ stock clocks.
In CoJ I get very decent speeds, higher than posted above, with a 939 4400+ @ 2.5GHz (on an A8N SLI, so no CrossFire): 12 min, 44 max, 22 avg. That bench is system memory/HDD limited, as the GTX does not score much more.
CrossFire gives me 14FPS min, 71FPS max, 34FPS avg...the addition of a second card bringing only a small increase in minimum frames while maximum frames jumped over 60% speaks volumes about how much growth is available in this game, and about the immaturity of the HD2900XT drivers.
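To put numbers on that scaling — a trivial sketch using just the figures quoted above, nothing else assumed:

```python
# CrossFire scaling from the Call of Juarez numbers above.
single    = {"min": 12, "max": 44, "avg": 22}
crossfire = {"min": 14, "max": 71, "avg": 34}

for k in single:
    gain = (crossfire[k] / single[k] - 1) * 100
    print(f"{k}: {single[k]} -> {crossfire[k]} FPS ({gain:+.0f}%)")

# min: 12 -> 14 FPS (+17%)
# max: 44 -> 71 FPS (+61%)
# avg: 22 -> 34 FPS (+55%)
```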
Running CrossFire with official drivers, from boot, the slave card does not get 2D clocks...3D clocks only, leaving the slave card 12-14°C higher in temps than the master card. In Vista, this weird behavior allows for CrossFire overclocking; however, you can only change clocks once per boot...setting different clocks again will cause a freeze, as the driver resets the slave card to 3D clocks, but only with 2D voltage.
One of the major reasons these cards perform so "poorly" is CPU limits. It's very interesting to see CrossFire X1950s get beaten by a single card, purely because the driver does less CPU work when it isn't running CrossFire.
Seeing such performance differences speaks volumes about how biased a lot of the info out there is...we recently had an nV driver that raises CPU scores in benches...I wonder what a similar change in an ATI driver might bring for performance...
Yet still these ATI cards outshine G80 in benches. I can feel the "wait" from some of the top benchers...a lot are holding back ATM...
Denny, will YOU be the first to publicly show a confirmed 1350MHz GPU speed? I am very impatient for this...I know it's possible...DI and 1.525V...:lol2:
Alright guys... for those of you who live south of the Canadian border,
the DIY crowd will be able to pick up these 1GB cards on the 14th of June.
They will start delivering on the 15th. The cards will come in 4 tropical flavors, according to the arrangements that were made. Diamond will have the
1GB boards available on their site: $579.00 and $599.00.
Any info about Europe?
Bloodbanger... we are working on the logistics and schedules for
Europe. I estimate that Europe should have them within 10-15 business
days. They should also be available in Latin America, starting with Mexico.
You won't be disappointed.
MM
I see the product info on their site, but no "buy it" link.
GPU is 825 (up from 740 on the 2900XT 512)
Mem is 1100 (up from 825 on the 2900XT 512)
Guess they're going after the GTX with this card, aye? Wonder if it's gonna do as badly as the 2900XT did against the 8800GTS?
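For what that memory bump means in bandwidth terms — a rough sketch, assuming the R600's 512-bit bus and double-data-rate memory; only the clocks come from the post above:

```python
# Theoretical memory bandwidth: bus bytes * effective (x2 DDR) rate.
BUS_BITS = 512

def bandwidth_gb_s(mem_clock_mhz):
    return (BUS_BITS / 8) * (mem_clock_mhz * 2) * 1e6 / 1e9

print(f"2900XT 512MB @  825MHz: {bandwidth_gb_s(825):.1f} GB/s")   # 105.6
print(f"2900XT 1GB   @ 1100MHz: {bandwidth_gb_s(1100):.1f} GB/s")  # 140.8
```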
One serious online shop has them on preorder here, 500€ is way too much tho'.
http://www.materiel.net/ctl/Cartes_g...1_Go_OEM_.html
Edit: The specs are not the same as the Canadian one; the GPU is 743MHz, the RAM is 2000MHz.
You guys who are looking for better performance out of the 2900XT would do better to run the newer Alphas (8.37.4.2). I've found they are much better performers across the board than the 8.38s. Oblivion is running 400% faster with these (no joke, legit: http://www.nvnews.net/vbulletin/show...&postcount=679 ) and Rainbow Six Vegas is running ~ 20% faster.
The 8.38s are crap, IMO. They blue screen left and right on me, and performance is abysmal in some games.
Which version of 8.37.4.2? Because really we've had more than 6 betas so far (8.37.4096 / 8.37.4322 (not sure of the last numbers) / 8.38 RC1/2/7 and official)... So far all the games I play run fine with 7.5, but it's clear some certainly run faster with the old betas and some slower... Now I need to make a choice, and for now I can only wait a bit for the next driver...
This is from your own pics.
Your Alphas:
http://img339.imageshack.us/img339/9...6109ac3.th.jpg
and the better 8.38s
http://img339.imageshack.us/img339/9...0414bl9.th.jpg
That's just one obvious example; I could keep picking them apart if you'd like. All they are is hacked drivers that drop IQ to gain performance. Alpha and Omega drivers have always been that way, just tweaked to gain either performance or IQ; I've never been a big fan of them. This is nothing new.
What you're pointing out isn't a bump-mapping deficiency. It's the dynamic weather effect in Oblivion, with the HDR not showing through as clearly.
Here's a shot of the direct sunlight bouncing off that same stone arch on the XT w/ the Alphas:
http://img168.imageshack.us/img168/3...0336053yy6.jpg
Nice try, though. Next?
So you got closer and changed the angle and there is still not as much detail. LOL. And look at the grass to the right in the first pics. If you can't see it, oh well.
Closer? The picture is further away. Look at it again. Instead of standing on the pedestal he's looking up at it from a further distance.
And no, I don't see it, so you're going to have to point it out to me, because the IQ looks freaking identical between the two. The only difference is that you pulled a shot without the HDR glare effect (it was active, just dynamic and not in the current scene) and compared it to one with the glare effect.
So again.. next? So far it just looks like you're making up stuff. There's not a bit of truth to what you're claiming.
Don't use JPEG, as it's lossy. Use lossless PNG or BMP.
Offtopic, Xion that red is really distracting. But it looks cool :)
Heh, thanks. I need to update it, though, since I have my card now. :P
She's clocked @ 875|975, by the way.
that's too low :rolleyes:
this is my benchmark with 1 HD2900XT stock clock and E6300 at 1680x1050 w/ 4xAA
http://img511.imageshack.us/img511/3...aresultzg2.jpg
well i dunno what to say about that. all my settings are as you see them in the benchmark and in my sig...?
what OS are you running?
___EDIT___
Diamond just changed the specs of the 2900XT 1GB. They lowered the GPU speed on the web page from the 825MHz I reported in post #115 this morning to 743MHz. WTH!!??
http://www.diamondmm.com/2900XT1GPE.php
specs are subject to change aye?! :D
Hmmm, you see they are claiming it is "The World's Fastest Graphics Card"...?????
Technically it is the world's fastest-clocked retail GPU? Am I wrong? I just think drivers need to tap all this raw POWA.
Water, closed case. I'm voltage limited until ATI Tool comes out.
http://www.xtremesystems.org/forums/...postcount=4020
I believe several people are using custom BIOSes, from what they have said.
elmore for one, but obviously from his post someone else made it, so more people must know.
http://www.xtremesystems.org/forums/...0&postcount=28
Macci was the first to use the custom BIOSes; I'm pretty sure everyone else got them from him.
Hey Guys.
First, if you want to tweak and find out how high we can push these cards: the first thing we have to do is unlock the threshold safety mechanism, and then flash your card. After that you can play with higher clock speeds all you want. I will work on a BIOS that enables this and post it later this weekend.
Next... for all you DIY guys that need these 1GB cards, please PM me your contact information and I will arrange for my team to get you one. Price
is in line with the GTX, so don't worry about sticker shock.
MM
Price for two is in line with the GTX? My 2 512MB cards cost that...local pricing has 1GB cards $100 CAD above 512MB cards.
We really need access to proper tools, and we really shouldn't have to go through an NDA for them either. I've heard some interesting tales about the BIOSes on these cards...
I've also seen a lot more communication coming from the right people in the last two weeks. It's very nice. VERY nice. ;)
Thanks for trying to meet our needs...we do pay the bills, in the end, n-e-way. :lol2:
Stupid question, but is mad maverick from AMD, or an OEM?
MiddleMan.
MM...it fits so well...:rofl:
I wonder if a bios update is in our future to get the HD 2900 that extra boost in performance (instead of driver update alone)?
Well, from what ATI have said officially, it's definitely possible.
However, when it comes to things such as memory timings, these can be adjusted by the driver. Frame buffer tweaks have long been a staple in ATI's long list of performance tweaks. Remember when 256MB cards had no performance boost, and then Catalyst 4.10 came along?
However, based on what ATI has said, there is purposely a lot of headroom in these cards. 800MHz seems entirely possible for each card, but thermal limits imposed by the GPU's process are the main thing affecting these cards ATM. This is why I'm eagerly awaiting some decent cooling solutions...
Look at the top guys...getting good clocks by changing cooling, some running only 1.35V into 80nm. ONLY 1.35V...
;)
I hope I am wrong, but it appears that more games are requiring more than 512MB of onboard RAM. Games like DiRT (ultra settings), Crysis (ultra high settings), Quake 4 (ultra high requires at least 512MB), BF2 (the more memory the better) and so on. I am not sure about Alan Wake yet (but it appears that will be a Vista-only title). It looks like more memory is needed in order to max out settings for current or soon-to-be-released games.
DiRT's OK in CrossFire, but I understand where you are coming from. Keep in mind that because AA is shader-based, less of the framebuffer gets used for AA, AFAIK...if it's more...well damn, that explains a lot...
I'm just curious why ATI is holding back the HD2900XT's true potential. We know that a BIOS update will greatly increase performance, but why hasn't ATI released it?
Um, I now have the cloud that follows my car...
But I didn't before.
Before, 1 card was OK, but a bit slow, averaging about 32FPS, so lots of dips lower. Enable CrossFire, and performance slowed to maybe 0.5FPS.
I have been installing games and such, and found that now CrossFire works, but I have the cloud that follows my car, and around 70-90FPS.
All settings are maxed in game, 1680x1050.
Now, there were many DX9 updates with the software I installed. I think this may be the reason. Vista was kinda OK with the game until DX9 was updated.
So it's something with this engine, or a DX-specific problem, as it's not just one card that has this issue.
Then again, maybe you are right...so these 1GB cards do hold a lot of interest. Maybe the hint of these cards has kept a few people from buying R600 as of late...
That's nothing new, nicepun; both companies generally do that when they launch a flagship card.
Shader-based AA should use roughly the same amount, because it still has to go through the framebuffer to share info across the shaders. It's one of the reasons I'm so curious as to how the 1GB will perform in games, but no one who has the 1GB plays games, it seems.
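To put rough numbers on that framebuffer cost — a minimal sketch at the 1680x1050 resolution mentioned earlier in the thread; the 4-byte color and depth formats are assumptions:

```python
# MSAA stores every sample per pixel, whether the resolve is done in
# fixed-function hardware or in the shaders.
def msaa_framebuffer_mb(w, h, samples, color_bytes=4, depth_bytes=4):
    return w * h * samples * (color_bytes + depth_bytes) / 2**20

for s in (1, 4, 8):
    print(f"{s}xAA @ 1680x1050: ~{msaa_framebuffer_mb(1680, 1050, s):.0f} MB")

# 1xAA @ 1680x1050: ~13 MB
# 4xAA @ 1680x1050: ~54 MB
# 8xAA @ 1680x1050: ~108 MB
```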
Hey Guys,
Somehow, the cards are out and available in ecommerce.
http://cgi.ebay.com/ATI-Exclusive-HD...QQcmdZViewItem
Hi Guys,
The 1GB cards for the DIY market are hitting ecommerce next week (Friday).
The GTX and Ultras will be in trouble at the price you will see.
MM
Typical etailers in the US & Canada.
MM
Pricing... for all you guys that want this card: ecommerce partners
will have these cards at a competitive price of $510.00-$550.00 USD.
Keep your eyes open at:
Newegg
Mwave
Zipzoomfly
Tigerdirect
NCIX
Extreme PC
Memory Express
Canada Computers
Wow, 427 euro for a 2900XT with 1GB :) nice price
Do you really think it will be that "cheap" in Europe? I think the price will be 1:1 with $.
For Europeans: I know Overclock.uk have some preorders; I don't know how many they got...
I just hope I have the money to get one next week...
So the real question at this stage of the game...buy an HD2900XT 1024MB, or wait for the Nvidia G92 (I heard Xmas time?) and possibly an ATI revision and release of an XTX... Anyone got some input on that?
The 2900 will be making a move to a 65nm process. Not exactly sure of the time frame, but I think it was slated for late this year. At that point I'm sure it'll be called something different (2950XT??).
It might just be wiser to wait on an upgrade for another couple of months; I mean, I've gone this long on my X1900XTX :up:
I see you guys have a lot of money and can actually buy a high-end graphics card every year =/
Your XTX handles DX9 quite well; you're better off waiting for real DX10 games =)
Where did you see that? I haven't seen anything about an R650/R680 or any other 65nm shrink since Fudzilla posted this:
http://www.fudzilla.com/index.php?op...1385&Itemid=34
Actually I see 2 versions of the 1GB R600: one using BC06 chips and one using BC09 (the current one). The R650 (or perhaps R750) will be 65nm; that version should be here in the last quarter of '07 (normally...). We haven't seen any more info yet, but it seems certain AMD is testing it at this moment.
I think we will see an XTX version using the BC06 chips soon... either they put them on the card as-is and it stays an XT, or they pair them with a stock-overclocked GPU (BC06 allows lower latency and higher memory speeds...). So we may see an XTX version with more core and memory speed (in ATI's place that's what I would do, and it fits with the renaming of the original 2900XTX to XT, to leave room for an XTX in the series).
Has anyone seen a test/review of the 1GB GDDR4 version yet? I found a bad one where the guys used an old 7.4 driver (the 8.36 beta, for the 2900XT 512)... the scores look totally abnormal... but at the same time I see a really good thing: the card looks like it handles the increase in resolution and AA really well... perhaps the amount of memory and the 512/1024-bit controller work better with 1024MB.
Perhaps that's the answer to the strange drop we get from no AA to 4xAA (since in most games going from 8x to 16xAA doesn't drop performance that much...).
I found this card weird:
Link to store. Quote:
The high-end next-generation graphics card, PowerColor HD 2900 XT. This 700 million-transistor monster clocks in at 740MHz with 1GB GDDR3 memory at 1650MHz.
I have looked around the internet and couldn't find any 1GB card based on GDDR3. This is either a spelling mistake or an exclusive version for this e-tailer.
Any thoughts? :confused:
Yeah, I was confused about that as well... was going to purchase it, but not if it's GDDR3.
Looks like an error to me... either it's the 512MB card (the memory on mine reads 828 for the AMD GPU clock; x2 and you are close to that 1650MHz (825x2))... or they have the new BC06 chips @ 1650MHz (0.6ns), which would mean GDDR4 (1650x2 = 3300MHz effective). But I don't think cards with BC06 have started hitting the market, so...
Looks like a typo to me...
Try sending them a mail to verify... the 2900XT 1GB GDDR4 default design is 1.1GHz (x2 = 2.2GHz).
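The x2 above is just DDR signalling on both clock edges; a trivial sketch of the two readings of that "1650MHz" listing, using only clocks already quoted in this thread:

```python
# DDR transfers twice per clock, so the advertised rate is real clock x2.
def effective_mhz(real_mhz):
    return real_mhz * 2

print(effective_mhz(825))   # 1650 -> reading (a): the 512MB card's GDDR3
print(effective_mhz(1650))  # 3300 -> reading (b): new 0.6ns BC06 GDDR4
print(effective_mhz(1100))  # 2200 -> the 1GB GDDR4 card's default design
```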
Has anyone tested the Extreme PC Exclusive SI Diamond Radeon HD2900XT, 825MHz Core | 1GB 2150MHz DDR4?
I just ordered them; seems like they might be promising.
@ChaosMinionX
sounds fun, waiting for results ; )
http://www.extreme-pc.ca/showproduct...oductid=371414
That's the link to it right here...they look like nasty cards...
The company makes the following claim...
http://www.extreme-pc.ca/uploads/XPC...1GB-DDR4.1.jpg
EDIT: Do I need an 8-pin PCI-E connector? Quote:
HD2900XT 1GB DDR - 825Mhz Core | 2150Mhz DDR4 Price $574 CND - EXTREME EDITION - Hand picked and tested - Best HD2900 Core Possible - Used in our systems as very good chassis cooling is required. In CrossFire mode it matches 8800GTX ULTRA SLI performance and in some games, surpasses it!
Since few North American competitors are claiming they have the fastest HD2900XT 1GB solutions, we decided to separate ourselves from the competition by upgrading our bios speeds to even more EXTREME level. Since you can't overclock the 2900s in CrossFire mode with any tool aside the ATI's Overdrive limited OC tool, you get the FASTEST CROSSFIRE performance in THE WORLD without any need for overclocking! Also we can now sell these separately as well - no more with system only restriction - Thanks AMD! We'll also price match any competitor's price on this product and we'll give you 1 contest entry per card - see our contest info for details.
No... really it's better, but everyone running 2x 6-pin has seen no problems. I don't know whether an 8-pin increases stability, though.
There's a fix a guy here told me about to get Overdrive working with 2x 6-pin: you just need to bridge the missing pin.
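For context, the power budget behind that — a back-of-envelope sketch using the standard PCIe connector limits; the roughly 215W stock board draw is an assumption from memory, not something stated in this thread:

```python
# Power available to the card: slot plus auxiliary connectors (PCIe limits).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

print(f"2x 6-pin:      {SLOT_W + 2 * SIX_PIN_W} W")            # 225 W
print(f"6-pin + 8-pin: {SLOT_W + SIX_PIN_W + EIGHT_PIN_W} W")  # 300 W

# At roughly 215W stock draw, 2x 6-pin covers normal operation; the extra
# 8-pin headroom is what Overdrive looks for before it allows overclocking
# (hence the pin-bridge trick above).
```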
I'm a little torn... I want to go CrossFire and I want a 1GB GDDR4 version. I'm asking myself whether I should sell my 2900XT 512MB, buy the 1GB GDDR4, and then buy a second one, but I have a problem: by then the next cards may come with BC06 DDR chips, with higher memory speed and lower latency than the current ones...
I could eat my hand for not placing the order for the 1GB GDDR4 version right now :D
Card will be here by like Monday :up:
Lucky you... ;) :P
These results are so strange... for one, the driver they used (8.38 RC2?...we used that a long time ago, the first week we had the 512MB version, along with RC7 and other 7.5 betas...). But CrossFire shows good potential here; it's where we see CrossFire having a small efficiency advantage over SLI...
Hope we see the next drivers soon... to see what both the 512MB and the 1GB GDDR4 can do...
I never understand why some reviews always use these same 5 games... it's not really where ATI is best (only the Oblivion performance surprises me, as the X1900 was largely better than the 7900 series in it...). I think it's the 10th review of the HD2900XT 512 or 1GB using only these games. I really enjoy the game, and look forward to replaying it each time I unlock a new car...
@ChaosMinionX... if you have DiRT and 7.6 works with your card, give it a try... the difference in image quality between 7.5 and 7.6 is just amazing...
http://img523.imageshack.us/img523/9838/dirt307ym4.png
Image size is reduced; this gives finer texture and a better overall feel... I don't want to "overload" the forum.
http://img266.imageshack.us/img266/495/dirt3072ny1.png
that looks really good lane :up:
Colin McRae Dirt.
Denny, or anyone with the 1GB GDDR4 card... can you tell me if you are getting a big slowdown in 3DMark06 when the dragon comes out of the water? Mine seems to slow down a lot for that bit, with both Cat 7.5 and 7.6.
Is there a separate driver needed for the 1GB cards, or anything newer than 7.6?
Hi. I actually took delivery of two of these OcUK 1GB cards yesterday.
I seem to have trouble getting any 3DMark program to run. I got 3D06 to run by modifying my shortcut and adding "nosysteminfo",
but is there a way to run the programs without having to do this?
Specs:
Asus MVP Deluxe CrossFire mobo
AMD X2 3000 CPU
4GB OCZ RAM
2x 2900XT 1GB DDR4, core 750, memory 1000
2x Seagate Barracuda SATA drives
Will check back later, cheers.
Just delete the folder named "Futuremark" from C:\windows\system32\.
Why not?
I renamed it to _Futuremark (as a backup), but deleting it is OK too... :)
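If you'd rather script the rename than do it by hand — a minimal sketch; the path is the one given above, and it needs to run with administrator rights:

```python
# Move the SystemInfo folder out of the way instead of deleting it,
# so it can be restored later. Run as administrator.
import os

src = r"C:\Windows\System32\Futuremark"
dst = r"C:\Windows\System32\_Futuremark"

if os.path.isdir(src):
    os.rename(src, dst)
    print(f"Moved {src} -> {dst}")
else:
    print("Futuremark folder not found; nothing to do.")
```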
The DLL worked, thanks. I will do some test runs now.
Bear in mind that I am only using 4x 6-pin plugs, so let's see what score I get. Will post back.
http://i88.photobucket.com/albums/k1...o/3dmark05.jpg
TBH I am not seeing a massive increase over the score I had with my X1950XTXs.
16,413 in 2K5 with CF? LOL