So you don't think a 40+% performance improvement is crushing?
You might want to tell that to the Conroe fanboys :rolleyes:
http://sg.vr-zone.com/?i=4935
Quote:
We learned that AIB partners for both NVIDIA and ATi have already received the GeForce 8800 Ultra cards and Radeon HD 2900 XT cards and are preparing for shipments now. These cards will hit the retail shelves by May 14th, the same day AMD officially launch their R6xx series. However, only 2900 XT cards will be available while 2900 XTX, 2600 and 2400 cards will be available at a later date. We heard that there are some 6000 pieces of 2900 XT cards by the first week of launch and no supply problems are expected.
HD 2900XTs at $399 should be enough to hold off the G80 series for now...
A $700+ Ultra isn't going to make a dent in AMD's R600 sales, irrespective of how good looking it can be :slobber:
Perkam
It is crushing by any measure.
The 3DMark06 numbers for the HD 2900XT, for example, are double those of my current card, an X1950 XTX.
That's crushing.
It runs Vista and the Crysis generation 2 engine in DX10 and 10.1, which no previous-generation ATI card can do; that is crushing.
It has HDMI and a programmable GPU, and it will keep gaining performance in older and newer games, so the crushing will continue over time.
It draws more power than any previous generation and crushes by power alone ;)
It's a Canadian card that is crushing all on its own, even if that does not count.
It has high quality at default, will run anything I can throw at it, and will come 2k cheaper at launch, which is welcome since the older high-end cards cost too much.
It has customizable AA with better image quality than Nvidia's 8800 and the old X1950 series.
So not only will it crush the old tech, it will look better doing so ;)
It runs at higher precision than the X1950 XTX.
It will crush due to having more transistors.
700 million+.
It has wide memory bandwidth, a lot more than the previous gen: 40+ GB/s more.
Rejoice, all 24 and 30 inch screen owners.
It processes bilinear-filtered HDR textures "7x" faster than the previous X1 generation.
It's HOT and RED and has looks you can die for.
What else does anyone want?
Green and slimy, or RED and HOT?
Is Paris Hilton not enough in daylight?
Anyone not seeing the numbers, the performance, and the tweakability is fooling themselves; this card will last a long time, much longer than Nvidia's series.
A lot of people still use the 9700 series from ATI.
CrossFire scales much better than the previous generation, so two cards are something even I might consider down the line for once.
The card also overclocks well, and a 1 GHz core might be mainstream with watercooling... that is so sexy and hot by itself.
It crushes the previous generation thanks to some good old-fashioned creative thinking from the ATI design team.
:woot: HD 2900XT :toast:
Red and hot, with looks you can die for; what else would a man want?
Now, if only ATI would deliver the cards with one hot sexy woman doing personal service... but I consider that over the top.
Wow, I just wasted some time catching up with 10 pages of rumours @ Beyond3D.
I think... I hope... I don't know...
I gotta use the smilie again: http://www.xtremesystems.org/forums/...1&d=1177841274
Pictures of X2900s
http://forum.beyond3d.com/showpost.p...postcount=3823
For the moment my X1900 XT 512 is still doing its work well.
Hoping for an R650 with 32 TMUs.
Too bad it isn't integrated like the XTX is supposed to have.
Does this adapter come with the card?
Not in dual link, no. And it's supposed to be the full bandwidth res for the 30" Dells etc. I was expecting to see a special HDMI to DVI adapter with a fly lead into the mini Din socket.
http://img119.imageshack.us/img119/2...ipinoutct7.png
1/R600XT, default E6600, default R600XT, ALL MAX 4xAA+HDR (1280x1024)
http://www.chiphell.com/attachments/...kMK47f6XI8.jpg
http://www.chiphell.com/attachments/...YksBRHKU2s.jpg
http://forum.beyond3d.com/showpost.p...postcount=3848
I'm using ultimate x64 and overdrive is there and always was since 7.2.
^^ Look at the overlapping adapter. It's a Photoshop :)
This is what I was wowing at, compared to the R600 links above.
3/3.52G E6400 2GB ram 660/2200 8800GTX ALL MAX 4xAA +HDR (1280x960)
http://www.chiphell.com/attachments/...ow9FjyyONs.jpg
deathman, do you know, does FPS tend to drop at higher speeds? I don't know, as I've never played the game, but in the two screenshots the frame rates seem to be identical at both 104 mph and 0 mph. Any significance?
http://www.chiphell.com/attachments/...kMK47f6XI8.jpg
http://www.chiphell.com/attachments/...YksBRHKU2s.jpg
True, but one of them is looking at a similar scene. And no, a 6600GT couldn't get more FPS if it's truly using 4xAA and HDR. I'm talking in general. Can't be nitpicky about anything yet, because most of these are faked anyway, which this one probably is too.
I've never played the game myself so I have no clue. In NFS Most Wanted I didn't think it made much difference in FPS.
http://www.fudzilla.com/index.php?op...d=754&Itemid=1 +1. He`s great...
seriously TDU plays awesome on my bro's 8800GTS so if an R600 can take it up a notch then it is definitely a good thing!
6pin + 6pin + pci-e=225w
6pin + 8pin + pci-e= 250w
need 250w to overclock.
It's not the CPU, it's the spec for the plug. A 6-pin PCI-E connector is 75w, supposedly. The slot provides 75w. Add the three and you get 225w.
An 8-pin PCI-E plug is 100w (figure 25w per wire pair). This provides the extra power needed when overclocking.
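The power-budget sums in the posts above can be laid out explicitly. A minimal sketch; the per-source wattages are the posters' own figures, not confirmed specs for this card (the PCIe spec as later finalized actually rates the 8-pin connector at 150 W):

```python
# Sketch of the quoted power budget. All wattages are the posters'
# assumptions, not verified numbers.
SLOT_W = 75   # power delivered through the PCIe x16 slot itself
PIN6_W = 75   # 6-pin auxiliary connector, "supposedly"
PIN8_W = 100  # 8-pin connector, per the 25 W-per-wire-pair estimate

stock = SLOT_W + PIN6_W + PIN6_W  # 6-pin + 6-pin + slot
oc    = SLOT_W + PIN6_W + PIN8_W  # 6-pin + 8-pin + slot

print(stock, oc)  # 225 250
```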
ATi HD2900XT - AMD Athlon 64 X2 3800+@2009Mhz - 1GB DDR2
3DMark 06: 11335
http://www.generation-3d.com/UserImg...markhd2900.jpg
http://www.generation-3d.com/11335-s...-XT,ac8886.htm
I can't believe it.
The score is too high.
If true, this is obviously an overclocked Radeon.
Maybe that R600XT has the real proper drivers.
The Radeon HD 2600 XT has 25-140% more performance than the X1950 XTX
http://www.generation-3d.com/La-rade...XTX,ac8885.htm
If the SS is true, then the system with a stock QX6700 would reach 5*6963*4150/(1.7*4150+0.3*6963) = 15802 marks
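For anyone checking that arithmetic, here is a sketch reproducing the poster's projection. The constants (5, 1.7, 0.3) and the assumed 4150 CPU sub-score are taken from the post itself, not verified against Futuremark's official scoring formula:

```python
# Reproduce the poster's 3DMark06 projection. The weighting constants
# and the 4150 stock-QX6700 CPU figure are the poster's assumptions.
def projected_3dmark06(gfx: float, cpu: float) -> float:
    # Weighted harmonic-style combination of graphics and CPU sub-scores.
    return 5.0 * gfx * cpu / (1.7 * cpu + 0.3 * gfx)

score = projected_3dmark06(6963, 4150)
print(round(score))  # 15801, i.e. the ~15802 quoted, to within rounding
```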
Screens of the R600 DX10 Demo
http://www.hardspell.com/pic/2007/4/...e59df3fd1a.jpg
http://www.hardspell.com/doc/hardware/36897.html
realtime dynamic GI and transparent raytracing? now that's cool !!
if this could be used as an accelerator for render engines... wow :eek:
I hope for ATI this is true and the previous crap was wrong!
One more quite possibly fake SS, and everyone goes nuts again with hype. It's like when one hype gets shot down, a new one emerges on some rapid hope.
The sound didn't turn out well either. And if it was such a killer, it would be a $699 card!
It's beyond a tech discussion; it's a "want to believe" thread now.
shhhtt Shintai, please let us dream.... lol
Actually not.
People have commented on the DailyTech benches, and they seem to have the card already.
That they are off.
The tech is highly adaptive for DX10, and not OLD archaeological OLD DX9.
It's more adaptive than Nvidia's cards.
Using drivers to enhance image quality down the line.
So ATI can enhance image quality, while Nvidia is stuck at current patterns until G90.
There are a ton of people out there who want to play the Crysis gen 2 engine game when it is out.
If ATI has better image quality, a better card for DX10, and better future capability, well, I don't know, but to me it seems stupid to even buy an Nvidia card from the 8800 series on that tech alone.
If I was an ATI employee, I would market it like that,:fact:
AMD/ATI delivers you a stunning visual experience with Windows Vista and Crysis 2 that will get even better down the line when we upgrade your eye candy with drivers alone; yes, the card you buy will become better and better looking and you will not believe your eyes!
(here then comes a text on how they will be able to tune eye candy and image quality over time)
You can always buy a card other than one from ATI/AMD, but then you're stuck at current image quality, since ATI delivers the industry's adaptive card for the next gaming platform.
Windows Vista and DX10.1.
Oh, and by the way, we even support the next installment of DirectX, something we are proud of.
If you want to stay with a card for the next 2 to 3 years, then AMD/ATI is the way to go.
AMD/ATI, the gamers' choice for today's and tomorrow's games.:woot:
(in fact, if amd and ati want some advice I am open for work)
Personally I couldn't care less about that card! I am more interested in their success with it, due to AMD's involvement through owning ATI, and the outcome for the well-being of AMD.
Besides, I have better things to spend my cash on, like my new server board, RAM, and 2 chips.
Well, semantics.
You know what I mean ;)
It's based on CryEngine 2, sure; it's also the basis for Crysis, the game built from that engine.
Might not have been clear to a new reader, but for Xtreme people it is ;>)
It's all in how you measure it; Crysis is the second game they've done, from the same people who more or less did the same engine twice. ;)
It's just prettier:clap:
flopper, you are a dreamer ;)
And no card is DX10.1 ready, since it would require SM5.0 among other things. How can they make a card for specs that aren't finalized? Again, snap out of the dream.
@shintai..... I agree with you about all the gossip around this card, I'm sick and tired of it too, but don't be such a killjoy all the time.
2900XT>8800 Ultra?
For some reason you always forget when I bash Intel or nVidia. Makes it more valid for you?
Perhaps that's because you spend all your time bashing those and praising AMD.
I'm a bit like Ed, just overall always negative :p:
Critique is what improves things, not just sitting and clapping hands.
:para: :para:
Yeah, right. We will see when DX10 games come out, and whether the 8800 is as good at DX10 as Nvidia's FX was at DX9.
Don't forget that if ATI wanted big performance in DX9, they only needed to take the R580 architecture and put the R600's specs on it. That card would simply crush the 8800 in DX9, but in DX10 it would suck....
The specs in the R600 are there. Now it's only a matter of time for games to come out that use those specs for processing highly complex scenes. Don't forget the videos of DX10 games running and the power that is needed for that.
The discussion about DX9 games now is only for fanboys. The market wants DX10 numbers, not fanboy talk....
All I can say is that the score is true, and truly amazing.
http://www.crazypc.ro/forum/attachme...3&d=1149528112
The R520 and R580 proved themselves more future-proof than their Nvidia counterparts. The R600 is supposed to be a monster with geometry shaders, so we can suppose it will perform very well in DirectX 10 games, but supposition is not fact for sure, so let's get some DirectX 10 games released!!!
Just because those DX10 games are 5 times more complex. Or is what you see in the movies made by the power of God?
Don't forget the specs of the R600. They crush the G80's specs. It's only a matter of time until those specs are working at 100% in the R600, and then we will see who beats whom.
Nvidia's engineers are not miracle workers, and ATI's engineers are not stupid.
It's only a matter of optimizing the architecture for DX9 or DX10. ATI clearly went for DX10.
"True" as in calculated, you mean.
Well, my buddy just had me build a Conroe system for him with an 8800 GTX. Benching his system with an E6600 @ 3.0 GHz and the 8800 GTX in Aquamark3, it scored 179,860; benching my current system with his 8800 GTX I got 108,000, and generally with my ATI X1950 XTX, no OC, I get 103,000.
Do I think the new ATI card will whoop the 8800 GTX? Most definitely, considering my ATI card scores only 5,000 less than the 8800 GTX does, and that's at lower clock settings with no OC.
The tests you see online in that review have to be wrong and driver-related,
because there is no way, seeing as I have done a side-by-side comparison in my system with ATI & Nvidia's current flagships!
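As a quick restatement of the three Aquamark3 scores quoted above, it is worth seeing how much of the gap comes from the CPU swap versus the GPU swap. The scores are the poster's; the percentages below just recompute them:

```python
# Relative deltas between the poster's quoted Aquamark3 scores.
e6600_gtx = 179_860  # E6600 @ 3.0 GHz + 8800 GTX
opty_gtx  = 108_000  # poster's Opteron rig + the same 8800 GTX
opty_xtx  = 103_000  # same Opteron rig + X1950 XTX at stock

# The CPU swap moves the score far more than the GPU swap does.
cpu_swap_gain = (e6600_gtx - opty_gtx) / opty_gtx * 100  # ~66.5%
gpu_swap_gain = (opty_gtx - opty_xtx) / opty_xtx * 100   # ~4.9%

print(f"CPU swap: +{cpu_swap_gain:.1f}%  GPU swap: +{gpu_swap_gain:.1f}%")
```

The lopsided split is exactly the point raised a few posts later: Aquamark3 is largely CPU-bound, so these numbers say little about how the GPUs themselves compare.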
Again, if true (since everyone seems to flip out if I say wow or something), that's amazing; imagine it under a C2D with more horsepower, say 3.6 GHz, and surely it would score above 15k.
Well, if you want to be suspicious of anything, it's those high SM2 and SM3 scores; they're very high compared to even an 8800 GTX.
Aquamark 3 is all cpu related.
I doubt I was CPU-challenged, considering I had my Opty @ 2980 MHz for testing.
Uhmm... yes, your CPU was the slow part.
Opteron 165 @ 3GHz: 120,655
Conroe @ 3GHz: 162,561
AMD Opteron 165 @ 3GHz
2x eVGA 7900 GT SLI @ 700/800 - Forceware 84.66
2x 1GB G.Skill DDR500 3-4-4-8
DFI NF4 Ultra-D A0 Modded to SLI-D
Western Digital 250GB 7200RPM SATA HDD
OCZ GameXStream 700w PSU
Intel Core 2 Duo (Conroe) E6600 ES @ 3ghz
2x eVGA 7900 GT SLI @ 700/800 - Forceware 84.56
2x 1GB G.Skill DDR2-800 4-4-4-12
Intel D975XBX "Bad Axe" Mobo
Western Digital 250GB 7200RPM SATA HDD
OCZ GameXStream 700w PSU
No doubt, so we will just have to put my X1950 XTX in his build and see what's what.
I thought you didn't like AMD???? You paint yourself as a devout hater of AMD....LOL... when in actual fact you may just be a fair-weather friend:nono:
I don't get it?? What you just said makes no sense, as it's one and the same. I will see how the two compare in his rig, but I still think the tests they did must be wrong and driver-related.
But can I interest you in some happy pills?
Sorry, just re-read the post; I need sleep, gotta get some shut-eye.
But I will test my card in his build and get back to you boys on what I find.
It's amazing what hope will cause people to swallow.
First the March delay and the absurd "strategy" excuse about the family launch. AMD/ATI dying for cash and they delay for marketing reasons? Where is the family launch now? R630 and R610 coming out on 5/14, or are they going to delay the R600 again? LOL to anyone that bought that joke of an excuse.
Second, the NDA at CeBIT. DAAMIT's got nothing in the $300+ market to cannibalize. What possible reason for the NDA if benchies are good??
Third, the NDA in Tunis. Same as above but X10. What possible reason if the numbers are good? Don't give me a Bush-level "strategerie" like the family launch above.
Fourth, the pricing. When you are releasing a new flagship and your whole company is dying for cash, who the hell talks about pricing and undercutting the competition instead of beating the crap out of them performance wise??
Fifth. DT, Kyle and Fuad (a pure ATI fanboi) all working together in some gigantic FUD conspiracy to trash the R600 just a couple days before launch. Yeah, ok. To what purpose when the numbers will be out for all to see in a couple of days except to trash their own credibility?
Sixth, ATI's own slides not even bothering to mention any XTX, talking about "technology" instead of "performance" leadership, and comparing the XT against the GTS or even their own X1950 LOL.
All of this, and there are still people searching for any excuse to convince themselves that the R600 is going to kick the GTX's ass. WTF?
Here's the bottom line. The R600 is going to be an NV30 level disaster. Crazy late, with performance hardly better than the competition's 2nd best from 6 months ago, while sucking up an insane level of power (200 watts, with 225+ needed for overclocking?!?) that dwarfs the already power hungry GTX, with the heat and cooling issues that come with that. After all this time that's nothing but a very ugly disaster, one that won't be helping AMD one bit and will do very little to push NV into cutting prices. That is bad for everyone, but it doesn't change reality.
DT, Kyle, and Fuad may have all had the same source...a source that no longer works for AMD/ATI after the merger. Simple answer to that question.
It was said MUCH earlier... NO XTX this gen. The original XT is the 12-inch model, which actually features a smaller PCB than the current "XT", with much crappier power regulation. The XTX has become the XT.
They sold enough OEM 12-inchers to cut out the XT part and switch the XTX to XT, buying them time to pull a miracle with GDDR4 boards. The XL can then be the 12-inch model with the current "XT" heatsink slapped on (notice that the 12-inch features only three heatpipes, but the 9.5-inch card has 4, a good way to deal with poor-yield parts).
Performance... I cannot comment on. I do however know that a lot of NDAs on the old parts should have expired recently, and a lot of people have been mentioning recently that they have not signed any NDAs... which tells me that the previous ones expired, and they chose not to sign new ones.
No new NDA signed... info begins to leak. However, the info is based on the previous incarnation of the R600, and on some development cards (which had half the ringbus turned off), neither of which really portrays the real performance of the R600. Figure in that in March, all GDDR was questionable in how it performed. If you check boards produced during this period, lots of Samsung goes missing and Infineon starts to appear. Looking back on orders placed by various parties, and available parts, it's easy to see what happened, IMHO. Oh, and don't forget nV's BIG bigwig paying a visit in person to TSMC...
Seems to me that 6000 HD 2900XT boards is worldwide, based on allocation. Boards ship out to retailers tomorrow... because the back rooms of my favorite retailers do not have these parts ATM. Launch is the 14th, but not all retailers may have received their parts, or they may hold back their parts to see how others set pricing. Regardless, I want 6 cards... cash saved... I doubt I'll get one. Full retail availability in June, for ALL PARTS.
Who is to say the R600 is going to fly on DirectX 9?
What if its true strength is DirectX 10?
Look at this: the G80 uses 128 stream processors.
AMD uses 320 (probably less efficient per unit than Nvidia's). The DirectX 10 format is made for a unified structure; DirectX 9 needs a converter to translate to the unified shader system (which might become less efficient with more stream processors).
Don't say it's a failure till we see benches of both DirectX 10 and DirectX 9.
I'd be very happy to be proved wrong. I would be very happy to replace the GTS I got in Jan with a $399 card that blew away the GTX. Wishes, however, don't make reality. I gave multiple reasons why I think it's going to be exactly what I said it was, the strongest being DAAMIT's own actions: the pricing and the NDAs.
I'm not happy about it, I'm angry. Angry because NV feels so confident that they can put out the 8800ultra and the 8600s at rip-off price points because of the lack of competition. Angry because the whole of AMD/ATI is endangered because of the stupid price/manner of the ATI acquisition and the recent lack of execution from both AMD and the former ATI.
Most R600 news is BULL:banana::banana::banana::banana::banana:; Fudzilla has to shut up.
You should only believe news from guys who are under NDA but are allowed to release part of the info.
A few guys on the tweakers.net forum are under NDA, and one of them has two X2900XT cards to test on his DFI RD600.
He got this info from ATI:
Quote:
Review Guideline:
Required & Restricted to compare with GF8800GTS 320/640M, GF8800GTX, 7900GTX, X1950XTX, X1950XT
Highlight: DirectX® 10 Ready, Avivo™ HD, UVD (full hardware decoding of Blu-ray and HD DVD), Built-in HDMI and 5.1 surround audio, Free “Black Box” game bundle, killing price/performance ratio over 8800GTS/GTX.
Radeon™ HD 2900XT Product Features
•Superscalar unified shader architecture
•320 stream processing units
•512-bit 8-channel memory interface
•Comprehensive DirectX® 10 support
•Integrated CrossFire™
•High-speed 128-bit HDR (High Dynamic Range) rendering
•Up to 24x Custom Filter Anti-Aliasing
•ATI Avivo™ HD video and display technology
•Built-in HDMI and 5.1 surround audio
•Dynamic geometry acceleration
•Game physics processing capability
•Free “Black Box” game bundle from Valve Corporation*
ATI Avivo™ HD
• ATI Radeon HD 2000 Series GPUs with ATI Avivo HD technology offer advanced audio, video processing, display and connectivity capabilities for high definition entertainment solutions
• Introduces a breakthrough for the playback of Blu-ray and HD DVD discs.
• UVD (Universal Video Decoding) is a new full-spec HD video processing technology that provides full hardware decoding of Blu-ray and HD DVD
• UVD technology enables a cool, quiet media PC with low power requirements for the GPU and CPU
• UVD technology enables entry level PCs to play full-spec HD discs
• ATI Avivo HD technology includes fully integrated high definition audio support enabling playback of multi-channel (5.1) audio streams and when combined with the integrated HDCP copy protection, enables a one cable HDMI connectivity solution to high definition home theaters
ATI benchmarks (<-- already seen a while ago!)
http://file01.uploaddump.nl/~file01/...rysis-demo.JPG
http://file01.uploaddump.nl/~file01/...comparison.JPG
Isn't it kind of funny that they even show the "product comparison table" when it just lists everything the R600 has and nothing that the 8800GTX/GTS has? They could've just made a list, lol; just thought it was kind of funny that they'd make a whole chart to show that.
How is the r600 a second gen unified shader architecture? Did ATI make a card that I am not aware of?
yes, the Xenos based card for the Xbox360 was a unified shader part.
That comparison chart is stupid. Listing a 512-bit memory interface over the 8800's means nothing if the 384-bit 8800 outperforms the ATI card even with its wider interface. Higher numbers mean nothing.
They mention integrated CrossFire, yeah? Big deal; the 8800 can do integrated SLI, can ATI's card do that?
One thing I do like however is the first two sentences.
Review Guideline:
Required & Restricted to compare with GF8800GTS 320/640M, GF8800GTX, 7900GTX, X1950XTX, X1950XT
Highlight: DirectX® 10 Ready, Avivo™ HD, UVD (full hardware decoding of Blu-ray and HD DVD), Built-in HDMI and 5.1 surround audio, Free “Black Box” game bundle, killing price/performance ratio over 8800GTS/GTX.
I like how they mention GTS/GTX, not just GTS.
I wonder what the valve gaming pack will be, Left 4 Dead? Ep.2? Team Fortress 2?
I hope tomorrow truly is the day that all beans are finally spilled.
Half-Life 2: Black Box
Quote:
Originally Posted by cantankerous
http://www.xtremesystems.org/forums/...1&d=1178039391
Also known as: Black Box --- Half-Life 2: Episode Two, Team Fortress 2, & Portal
That Crysis benchmark still looks kind of weird.
That Crysis bench is on par with other rumors saying ~+4 fps on average for the HD 2900. I hope the bench is real, because if it is, that means 8800 and HD 2900 owners can play @ 1600x1200 4xFSAA 16x aniso with a decent framerate. Normally we should get more info tomorrow :p
Personally I hope the HD 2900 XT is faster than the 8800 GTX, just to make Nvidia release (yet again) that magic driver they keep for when ATI launches new cards... then everyone would have cards to play Crysis on their 1080p screens...
Life would be beautiful!
I don't know, man... I mean, the G80's shaders are running at much higher clock speeds... the bus width does help the R600 a lot, especially in bandwidth.