He probably didn't have one. :rolleyes:
I'm so damn tired of scores in 3dcrap by now, show us what the damn card can do in games instead. :cool:
The card does well, but only when overclocked to max Overdrive. I wasn't impressed until it broke 10k in 06, and even that needed 860 on the core and 900 on the memory. That says nothing about how it does in actual games either. If all cards are guaranteed to do max Overdrive, that's great, but what if your card is weak and falls short? You're losing performance right off the bat that is needed to stay high.
LMAO! Thanks for the laugh, what a terrible argument. G80 has been shipping for 6 months and you're saying "oh don't compare on Vista cuz Nvidia is behind." The test was completed with Beta drivers from ATI on a card that isn't released to the public yet vs NVIDIA's shipping drivers and product. :stick:
It's no secret ATI/AMD spends a lot more time on Vista drivers and know-how. When MS told the GPU makers to start, ATI followed relatively quickly, while nVidia was busy playing with quad SLI drivers. Hell, G80 had even launched and didn't get a Vista driver until what, 3 months ago?
So to conclude, we can't compare R600 and G80 on Vista because they both use experimental drivers :D
NEW review, looks more trustworthy than the previous ones!
http://www.techpowerup.com/reviews/A...n_HD_2900_XT/1
:thumbsup:
Quote:
ATI HD 2950XT scheduled for Q3
To save the lousy year
We already wrote about the ATI HD 2950XTX card based on R650, the 65 nanometre chip scheduled for Q3 2007 here.
There will be one more high end card based on the 65 nanometre chip that is likely to replace the soon to be launched Radeon HD 2900XT cards. The new 65 nanometre core codenamed R650 based card will be known as ATI Radeon HD 2950XT. This is the slower revision of the R650 core.
We hope for DAAMIT's sake that they will manage to launch these R650 based cards in September / end of Q3, but how can you put your trust in roadmaps from a company that delayed every single chip this year.
The Nvidia G90 should be ready around the same time, but we have very little information about the shrunk version of the G80 chip.
http://www.fudzilla.com/index.php?op...916&Item id=1
I found some info from an unknown source...
Quote:
Originally Posted by unknown source
>>> http://www.xtremesystems.org/forums/...&postcount=811
The review was taken down because they asked people not to leak anything from it
Quote:
"We signed an nda for this - if you leak we will get sued
regards
:( ok, perkam
Yet more confirmation of what we've been hearing from it-review, Kyle, DailyTech, Fuad etc., again from a normally pro-AMD source.
http://www.theinquirer.net/default.aspx?article=39526
It goes on to make some fairly harsh assessments.
Quote:
The card goes tit for tat with the GeForce 8800 GTS, winning some and losing some in a vaguely equal proportion, whilst coming in at the same price point. DAAMIT is hoping the DRM infection of HDCP and HDMI will be a selling point, but don't count on it.
I've said it before, I'll say it again.
Wait for launch day. Future drivers will make its performance much higher than it is today, and overclocking will be simply amazing on this card with the right cooling... heck, I'm sure we'll see aftermarket BIOSes for the card running at much higher core and mem clocks.
Perkam
What I wonder is if it will beat the GTX when both cards are maxed out... seems from what I'm hearing it won't, but hey, maybe that means a price war. :woot:
lol perk, I am just as big an ATI fan as you are, but it's clear this card is not going to be the new king of the hill - even Kinc and Sampsa have said so.
But what it might be is the new value king: coming in at $500 and allowing sick overclocks, this could be one of the most fun cards to come out in a long time. Especially if it is responsive to cooling, all of us will have fun making our own coolers and such - overclocking skills will once again come into play, even on air. I for one am very excited for the experience.
$400 ;)
I'll agree it won't revolutionize gaming fps, but the ORB will have rattled when members use two of these in crossfire for benchmarking purposes.
In benchmarks, the HD 2900 will be a tremendous successor to the X1950XTX; it will be leaps and bounds ahead in most benchmarks... which is all I can say.
Perkam
A 500MHz increase in overclock just with water? Surely you must be joking.
Unless you're talking about a volt-mod and some other hardcore revisions to go along with it. (which leads me into my next question--will you be able to volt-mod these cards through software again, you think? Like the 1900-series?)
How do you get this? Again, I'm sincerely curious since I'm looking seriously at getting the card to see what it can do on my water setup.
http://xtremesystems.org/forums/show...5&postcount=34
http://xtremesystems.org/forums/show...1&postcount=17
:toast:
Couldn't find Kinc's quote, but I think you get the idea.
Uhm... how naive can you get... once you put it on the internet, you can count on it getting read and spread.
Quote:
We signed an nda for this - if you leak we will get sued
Don't wave with the cookie jar unless you're willing to share... :nono:
I wouldn't say rattled... Sampsa confirmed 06 will still belong to the 8800Ultras, which means no one is going to claim them all with their setup.
Fortunately though, benchmarks aren't what matters most. Graphics cards true purpose is gaming, as I've said on numerous occasions. The card that loses in gaming is considered the loser, regardless of 3dmark scores. :fact:
It's a pretty large margin, 33%. To put it in perspective, you'd have to clock the 8800GTX to 764.75 or the 8800GTS to 665 to obtain an overclock of the same margin. As for obtaining the same performance, no one not under NDA currently knows, as we have no idea how gaming performance scales with clock speed on these cards.
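The 33% arithmetic above can be sanity-checked in a few lines of Python. This is just a sketch; the 575 MHz (8800GTX) and 500 MHz (8800GTS) stock core clocks are my assumption, inferred back from the 764.75/665 figures quoted:

```python
def clock_for_margin(stock_mhz, margin):
    """Core clock needed to overclock a stock clock by `margin` (0.33 = 33%)."""
    return round(stock_mhz * (1 + margin), 2)

# Assumed stock core clocks: 575 MHz (8800GTX), 500 MHz (8800GTS 640MB)
print(clock_for_margin(575, 0.33))  # 764.75, the GTX figure quoted above
print(clock_for_margin(500, 0.33))  # 665.0, the GTS figure
```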
Any card can be very good in 3DMark 2006 and then suck in games.
There is no game yet with the effects of 3DMark06. When games come out with the effects from the 3DMarks, the 2900 XT could turn things around...
In the heavy, shader-intensive tasks that are the future of gaming, the 2900XT could take the lead. It could also take the lead in DX10.
Yeah, who knows what DX9 numbers will be like; the real deal will be when DX10 comes.
When games come out with the effects from 3DMark?... You did know games have come out using similar or the same shaders, right? Far Cry used shaders found in GT3 and GT4 of 3DMark03, as anyone who had a 9800 Pro/XT when Cat 3.8 came out will remember from the blocky shadows indoors in Far Cry. F.E.A.R. uses a lot of the shaders found in 05's GT1, and Oblivion uses a lot of shaders found in 03, 05, and 06.
Personally I don't put stock in any of the 3dmarks. I'd rather take it for a whirl through actual games.
As for the future...we'll see, but I know I wouldn't buy a card based upon the assumption of power in the future when currently it loses. You could be right, but I'm not holding my breath on that happening at all. We'll see when the time comes, that much is for sure, but for now I really wouldn't hold your breath.
First HD 2900 XT photos in Spain.
http://www.hardspain.com/index.php?o...id=62&Itemid=2
Up to 1600x1200 you're right (for first drivers I don't think it's bad).
FEAR 1024 x768 No AA No AF
X2900XT = 147 FPS
8800GTS = 137 FPS
7900GTX = 99 FPS
FEAR 1280x1024 2x AA 8x AF
8800GTX = 125 FPS
X2900XT = 109 FPS
7900GTX = 85 FPS
FEAR 1600x1200 4X AA 16x AF
8800GTX = 79 FPS
X2900XT = 60 FPS
7900GTX = 47 FPS
BTW: I already saw the X2900XT for €350 :)
regards
Still confused over the 8-pin PCIe power connector. I have heard that an adapter will be included with the cards to plug two 6-pin PCIe connectors into one 8-pin PCIe connector. I have looked all over the UK and no one stocks these adapters themselves; anybody have any insight on this? Also, I have a Tagan 1100 PSU and that has one 8-pin power connector - is that the same, or a different type?
If you connect 2x 6-pin to the 8-pin one, what do you plug into the 6-pin one? Or are there PSUs with four 6-pin connectors?
Yes, the Tagan 1100 has 4x 6-pin PCIe connectors. So in theory, if you had two of these cards in Crossfire, you could run two 6-pin PCIe cables into an adapter to plug into the 8-pin PCIe power input on each card, and then use a 4-pin Molex plugged into a 6-pin PCIe adapter for the 6-pin connector. God, that was confusing.
:lol: :D
http://forums.hardwarezone.com/showp...9&postcount=33
Quote:
ok dokey. upon request by many.......
compare the 8800GTS 640Mb from Leadtek as comparison.......
Not necessarily. Silicon Image developed some chips which were supposedly capable of passing audio without violating the DVI 1.0 standard. I would have thought this would require a revision of the spec, but there might be a provision for transmitting additional data over the link. DVI does pass enough bandwidth to permit this, so I suppose they could have allowed for expansion in the spec draft.
:shrug:
Not sure if these have been posted.
http://www.zomp.net/shop/product_inf...ucts_id=441388
http://www.lambda-tek.com/components...prodID=1379079
£258.98 inc. VAT ASUS EAH2900XT/G/HTVDI/512MB PCIe RADEON
Good price, but expect GTS to drop way below that, scan.co.uk has the GTS for £214 inc VAT. That's a very good price.
New driver for the X2900 is OUT; they talk about +10%-30% performance.
regards
30% more performance in what? Games or just more 3DMark?
My friend is testing, and in Vista it's +10% in HL2 and 3DMark 2006.
With Windows XP it should be better :)
btw:
:thumbsup:
Quote:
Asus bundles Stalker and HL2 with R600
Launches on Monday
Asus is planning to launch a Radeon HD 2900 XT card on Monday and so far didn't want to sample. MSI also didn't sample its cards at least not that we know of.
MSI has a Borg on the cooler and you can see it here.
Asus has a Stalker sticker on its cooler. Asus is bundling Stalker with every high-end card they have, and it is a good game. Apart from Stalker, Asus also bundles Half-Life 2: Episode Two and the rest of the Black Box games.
The card is a reference one and should cost around €399 in continental Europe.
http://img178.imageshack.us/img178/690/asusr600jw8.jpg
http://www.fudzilla.com/index.php?op...d=938&Itemid=1
regards
If you haven't seen it yet, see it now:
http://www.hardspain.com/index.php?o...id=62&Itemid=2
Link
Catalyst 7.4???
http://img293.imageshack.us/img293/7898/image1gi2.jpg
only $514 for a Diamond HD 2900XT according to NCIX Canada
http://www.ncix.com/products/index.p...DATA%20PRODUCT
Maybe with an old driver, and the test was in Vista.
Test environment : Windows Vista32
http://translate.google.com/translat...&hl=en&ie=UTF8
With the 8.374 driver:
X2900XT with an E6400 (2MB cache) @ 3000MHz in Windows XP = ~12000 marks (2k6) ;)
X2900XT with an E6400 (2MB cache) @ 3000MHz in Vista = ~11000 marks (the Vista driver sucks)
regards
What??? 3DMark 2006 scores with an E6400 @ 3GHz?? (my friend has one)
btw:
Quote:
There is already 8.38 out
8.37
http://img292.imageshack.us/img292/7...d6b7ix5.th.jpg
http://img292.imageshack.us/img292/7...a9efza5.th.jpg
3dmark 2006: 12086
frames: 63
8.38
http://img292.imageshack.us/img292/8...adfalk3.th.jpg
http://img292.imageshack.us/img292/3...38f9ha6.th.jpg
3dmark2006: 12530
frames: 78
http://chiphell.com/viewthread.php?t...&extra=&page=3
E6600 @ 3600MHz (400x9) >>> 12530 marks (8.38) :)
Quote:
Driver package version: 8.38-070508a-047443E-ATI
Provider: ATI Technologies Inc.
2D driver version: 6.14.10.6698
2D driver file path: System/CurrentControlSet/Control/Video/{8593CF7B-C241-498F-A753-EEFC259F4B6F}/0000
Direct3D version: 6.14.10.0503
OpenGL version: 6.14.10.6590
Catalyst® Control Center version: 2007.0508.2144.36940
Stream (T200) driver version: 6.14.10.1080
Both ATI and nVidia drivers still lack a lot in Vista. We're going to see this on all fronts for another year or so yet.
I expect this is a driver error, because VSM has been present since the 6800. See http://http.download.nvidia.com/deve...hadow-Maps.pdf for details.
Furthermore VSM is highly optimized for G80. See http://forum.beyond3d.com/showthread.php?t=38165 for details.
It is somewhat worrying that, if there is nothing wrong with the ATi drivers, programmers need to optimize their code to make the performance acceptable (a factor of 3 is insane).
Mascaras - great feedback, thanks! So your friend scored ~12,500 in 3DMark06 with his E6600 at stock speeds? Was the 2900XT overclocked?
Nice to see the 15fps jump in Dark Messiah w/ the new drivers as well! It looks as if they are making progress.
http://img292.imageshack.us/my.php?i...b1ed6b7ix5.jpg
no, also 3600mhz, 400fsb, 6x multiplier because of EIST.
Great results!
I've used HIS in the past and haven't had any issues, which is why I say that. Sure, notebooks or OEM machines might have issues because they sometimes lock the BIOS to only load from one place, but regular retail shouldn't even consider that an issue.
Negative - with 8.37 he has EIST enabled (when the CPU is idle it runs at 6x multi; at full load it runs at 9x).
look at cpu-z >> FSB = 400 (400x9 = 3600mhz) ;)
He ran both tests:
- E6600 @ 400x9 = 3600MHz >>> 12530 marks (8.38)
- E6600 @ 400x9 = 3600MHz >>> 12086 marks (8.37)
>>> 78 FPS in game (8.38)
>>> 63 FPS in game (8.37)
regards
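For what it's worth, the relative gains in those two 8.37 → 8.38 runs can be worked out directly from the numbers posted above (a quick sketch, nothing more):

```python
def pct_gain(old, new):
    """Percentage improvement from old to new."""
    return round((new - old) / old * 100, 1)

print(pct_gain(12086, 12530))  # 3.7  -> ~3.7% in 3DMark06 score
print(pct_gain(63, 78))        # 23.8 -> ~23.8% in the in-game frames
```

So the headline "+10%-30%" claim shows up in the game test, while the overall 3DMark score moves much less.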
I had a HIS 1900xt and currently I'm using a HIS 1950pro Iceq3 on my internet rig. I've always used official ATI drivers or Omegas without a single problem. Their bundles are great, too.
Does anybody know where I could get the latest drivers?
http://www.sapphiretech.com/us/media...iew.php?pid=80
Lol, Sapphire could foresee the future - look at the date :)
My friend is testing the X2900 with an E6400.
Those 3DMark 2006 scores and Dark Messiah FPS with the E6600 are from a guy on another forum; the source is there (at the end of the post).
Source >> http://chiphell.com/viewthread.php?t...&extra=&page=3
Link doesn't work.
regards
Think this is what he was linking.
http://www.sapphiretech.com/us/media...iew.php?pid=78
But the link to the press release doesn't work on the site.
http://www.sapphiretech.com/us/media...iew.php?pid=80
This is what he meant;).
The power requirement for this card is 225W, so that means: 75W from the PCI-E slot + 75W from the 6-pin connector + 75W from another 6-pin connector = a total of 225W.
That means there is NO need for an 8-pin PCI-E connector?
If overclocking, then:
If there is a 2x 6-pin to 8-pin adapter in the package, that would be 150W combined + 75W from another 6-pin connector + 75W from the PCI-E slot = 300W. Then O/C should be a breeze.
Someone correct me if I am wrong, thanks
512MB is just too small... I am waiting for the HD2950XTX... AMD had better put up a good fight, or better yet destroy nvidia, so I can have HD2950 CF + X38.
Also, from what I heard, the 8-pin has a 100W rating... so 100W 8-pin + 75W 6-pin + 75W PCI-E = 250W total... 2x 6-pin will give you 225W... should be enough to run at stock, but for overclocking... especially if ATITool allows VGPU adjustment.
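The connector arithmetic in the last few posts can be summarized with a small sketch. The wattages are the figures the posters use (75W from the slot, 75W per 6-pin, and either 100W or 150W for the 8-pin depending on who you ask), not values looked up from the PCIe spec:

```python
def board_power_w(six_pins=0, eight_pins=0, eight_pin_w=150, slot_w=75, six_pin_w=75):
    """Total power budget for a card: slot power plus its auxiliary connectors."""
    return slot_w + six_pin_w * six_pins + eight_pin_w * eight_pins

print(board_power_w(six_pins=2))                 # 225 - two 6-pin, the stock case
print(board_power_w(six_pins=1, eight_pins=1))   # 300 - 6-pin + 150W 8-pin
print(board_power_w(six_pins=1, eight_pins=1, eight_pin_w=100))  # 250 - 100W reading
```

Either way, the two 6-pin configuration lands exactly on the card's quoted 225W requirement, which is why the 8-pin only matters for overclocking headroom.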
BTW, Denny over at the 3D section has the card... and did some wonderful testing - unbiased.
Has this already been posted?
Second R600 preview from it-review. :)
http://it-review.net/index.php?optio...1325&Itemid=91
Let us hope those numbers are not accurate.
IQ is better on ATI, at least they said something good!