Er.. it's a man
No, the man was Furgie (misspelled intentionally, or was it that freak of nature of a singer?). Seriously though, seen the rumours of Ange being terminally ill? I mean, the looks go with it, according to some pics on Superficial or ONTD.
But anyway, the pleasant surprise... well, as far as I know it hasn't been explained or elaborated on, but when I see and hear statements like the one in Denny's thread, about certain GT3 scores being WAY low and his stock score being close to or on par with an 8800GTX, then I'll say: what the heck, this IS a pleasant surprise.
I mean, yeah, I've had a love-hate relationship with the R600: starting with loving, then loathing, then loving... you get the picture. ATM I'm looking forward to it, although I won't be buying one for at least 2 or 3 months (being out of the country around the time it goes on sale will save me this time, yay). But I'll be interested to see it rock some socks off (hoping it will, anyway).
No OT Pls.
Perkam
That Sapphire cooler has a MUCH bigger fan than the reference ATI heatsink. Hopefully it should cool better as well as run much quieter. When all the reviews come out a week from Monday, I hope they review some of these models with aftermarket coolers and not just the stock reference cooler.
Well, first things first, I never mentioned air :fact:
Second, if you base your speculation on what the 8600GTS can do and call it common sense, we'll just have to disagree on what common sense is; there is much more to the whole picture.
And third
Hmm, after seeing some 3DMark scores from Kinc and Denny, my hope has been renewed for this card. Hopefully there will be an X2900XL in the $300 range.
Damn, high-end watercooling... What do they mean by that? My next case is going to be a TT Armor with a built-in WC system, so that's not a good choice? I mean, high-end WC is overpriced here in Belgium... And I'm a noob when it comes to putting a rig together, so I'd have to pay like $100 extra just for assembling the WC system...
X2900 3DMark numbers?
http://www.nordichardware.com/forum/...=8428&forum=45
Damn, I want to get my claws on one of these :), too bad there won't be many cards available in the beginning...
For 357 EUR (you hear me) I'm tempted to pre-order the Sapphire HD2900XT 512MB ;o
It's hard to believe that no one has leaked any SKU or pricing information on these beyond a couple of websites with crazy high pre-order prices. Is there a reliable source for pricing info?
I dunno about that. Maybe no WR in 06. Have you seen this yet?
8800 ULTRA Single card 3dmark 2006:
http://www.overclockingpin.com/ultra...le%20ultra.jpg
http://www.kinc.se/06.JPG
@Your_Boss: Why are you posting 8800U numbers in the HD2900 thread?
8800U != HD2900
@Your_Boss
To compete with this highly overclocked uber part from nVidia, R600 would need to be clocked around 1.2GHz... Who knows if it can...
Back on topic.
Maybe this info will be of some relevance to someone :)
Attachment 58600
Source:
http://forum.beyond3d.com/showpost.p...postcount=4547
Not bad if the manufacturer states 750-800MHz within the given TDP range ;) That's like saying 800MHz is guaranteed on this card no matter what (even with just 2x6-pin power connectors, obviously capped by the graphics card BIOS).
Yeah, 2 XTs is what the price/performance should be compared against: 1 Ultra card.
I guess ATI wins then, landslide.
Half the price, with about the same performance using one card; I guess those XT cards aren't so shabby then.
I don't know, but who wants to pay double the price for essentially the same performance?
If those price levels hold at launch, I guess buying a card this summer is a good deal.
Good for me, good for ATI/AMD.
:woot:
Quote:
Radeon HD 2900XT
http://aycu17.webshots.com/image/151...5869153_rs.jpg
http://aycu07.webshots.com/image/164...4250356_rs.jpg
http://aycu28.webshots.com/image/139...9974738_rs.jpg
http://aycu31.webshots.com/image/156...8063797_rs.jpg
http://aycu16.webshots.com/image/171...7714818_rs.jpg
http://aycu16.webshots.com/image/171...3437993_rs.jpg
Radeon HD 2900XT Technical Specifications:
- GPU Clock: 757MHz
- Memory Clock: 1656MHz
- Memory Bus: 512-bit
- Memory Size: 512MB
- Memory Type: GDDR3
- Total Transistor Count: 720 million (vs. roughly 675 million on the 8800GTX)
- ROPs and TMUs: 16 and 32
- RAMDAC: 2×400MHz
- Memory Bandwidth: 105.98 GB/s
- Unified Shader Pipelines: 128
- DirectX and OpenGL: DirectX 10 / Shader Model 4.0 with OpenGL 2.0 support
- FSAA: Smoothvision HD + Adaptive AA
- HDTV, HDMI and HDCP: Yes; HDMI 1.2 compliant; yes (HDMI modes: 480p, 720p, 1080i)
- PCB: 12 layers
http://www.fx57.net/?p=637
regards
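For anyone who wants to sanity-check the quoted bandwidth figure, it follows straight from the bus width and the effective memory clock in that spec list (a minimal sketch; the formula is the standard one, the numbers are the ones quoted above):
Code:
# Memory bandwidth = effective memory clock (MT/s) * bus width (bits) / 8.
# Figures from the spec list above: 1656MHz effective GDDR3 on a 512-bit bus.
effective_clock_mhz = 1656
bus_width_bits = 512

bandwidth_mb_s = effective_clock_mhz * bus_width_bits / 8  # megabytes per second
print(f"{bandwidth_mb_s / 1000:.2f} GB/s")  # -> 105.98 GB/s, matching the list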
The more I see pics of the card, the sexier it seems lol :)
You can't compare it directly with the 8800GTX's 64 TMUs; the X2900's TMUs run at higher clocks.
The X1900 also had fewer TMUs, and in complex, shader-intensive games it had the advantage. The ratio in R600 is even higher: with the X1900 the ratio was 3:1, now with R600 the ratio is 4:1.
regards
Yup, I'm getting one.
What is the bad news? Only 16 ROPs? ... A French friend told me AMD announced some bad news about R600 at Tunis.
I believed it was only 16 TMUs... but... damn :/
So what's the new word on the street? Have they confirmed a definite launch date, seeing how they don't do soft launches... LOL. The funny part is that this card has been publicly promised since February; good thing they don't do soft launches, eh?
I would want super-high-resolution textures in the places a shader so far can't keep from looking bland, but the real reason that's been held back over the years is that developers don't want their games getting too fat.
Sure I can. Watch me (assuming that is correct):
8800GTX:
575 x 64 = 36.8 GT/s
8800GTS:
500 x 48 = 24.0 GT/s
HD2900XT:
757 x 32 = 24.2 GT/s
Looking at TMUs and overclocking, the XT and GTS look comparable. GTSs get to what, ~660MHz when overclocking? Matching that would take an R600 at damn near 1GHz (990MHz).
Whoopsy. :p:
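For anyone who wants to re-run that math, the fillrate figures above are just core clock times TMU count (a quick sketch using the clocks and TMU counts from this post):
Code:
# Theoretical texture fillrate = core clock (MHz) * TMU count, in Mtexels/s.
# Clocks and TMU counts are the ones quoted in the post above.
cards = {
    "8800GTX":  (575, 64),
    "8800GTS":  (500, 48),
    "HD2900XT": (757, 32),
}

for name, (clock_mhz, tmus) in cards.items():
    print(f"{name}: {clock_mhz * tmus / 1000:.1f} GT/s")

# Clock an R600 would need to match a GTS overclocked to ~660MHz:
print(f"R600 clock to match: {660 * 48 / 32:.0f} MHz")  # -> 990 MHz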
I know, I know... theoretically speaking. The point is, there is definitely a gap between the GTX and the XT, and I doubt efficiency is going to go the Radeon's way. Even if it does, could it close a 1/3 gap? Don't get me wrong, I hope it does.
I just keep asking myself why ATi keeps screwing themselves over when it comes to texture units.
Relax, you'll see it soon. :D
And I'm representing 99.9% of the world population, so are you going to pay £1057 for a G80 Ultra for me? :p: By conversion that's $2109.96 of rip-off. :nono:
No, I didn't think so. The BFG G80GTX cards used in one of my uncle's workplace systems, OC'd and WC'd, run at 655/1500/2150 anyway, so the G80 Ultra is nothing to compare with a $400 product that comes with bags of accessories, which in effect make it $300 at the very least.
I'm just curious about the sound device details, and about Crossfire performance compared to a reference G80GTX SLI setup. I know most details by now, I just need a fuller picture.
Yeah, RV670. :)
Since the XL exists (apparently), and the high-end part is $399 MSRP, it makes you wonder where the XL and PRO will fall price-wise. At least one of them will have to be a tasty deal.
As for texture units... Of course R300-R520 didn't max out the TMUs, because it's a 1:1 ratio... and thank goodness they moved away from that, because you're right, they would never be used in today's games. I was under the impression the ratio is somewhere around 4-5:1 in heavy arithmetic games, and much less for current and older games, or newer games that are more texture-heavy, making 32 still seem short when paired with 320 MADDs (10:1), especially compared to G80, which is 256:64 and 192:48 (4:1). Also, ATi has said in the past it would offer more texture units when the bandwidth became feasible to do so. If 2GHz+ on a 512-bit bus is not bandwidth-feasible, then I'd like to know what is.
That being said, the future is HEADING in the 10:1 direction, and maybe we'll get there in R600's lifetime using DX10, but we sure as heck are not there yet, hence it could easily become a bottleneck.
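To make those ratios concrete, here is the same arithmetic spelled out (a sketch reusing the unit counts quoted in this post; I haven't verified them against official specs):
Code:
# ALU:TMU ratios, using the unit counts quoted in the post above.
chips = {
    "R600":    (320, 32),  # 320 MADDs : 32 TMUs
    "G80 GTX": (256, 64),
    "G80 GTS": (192, 48),
}

for name, (alus, tmus) in chips.items():
    print(f"{name}: {alus // tmus}:1")  # -> R600 10:1, G80 parts 4:1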
ROFLOL, look at the date on that review. It states: January 1st, 1970 :)
Page 19th will be very interesting ;)
:p:
Quote:
UnReal Overclocking!
From the techPowerUp screenshots regarding the HIS 2900XT:
Why, in the second screenshot, does it list "Crossfire cable x 1" among the bundled cables/adaptors?
Does this mean the cable dongle thing or a bridge?
Should set a new fps record in CoH...
Perkam
I'm curious about that one as well. I'm betting that page is the volt-modded extreme cooling page.
I know it's the volt-modded, and generally extreme cooling, page, but I'm still curious what they can do frozen, even if it honestly doesn't mean anything to me if it's not 24/7.
Regardless, should be interesting to see the numbers.
100%++ Overclock is my guess
I know, I saw it. I'm still guessing that's the volt modded extreme cooling section of the review, as vr-zone generally has one for their reviews.
Did you find this out from R300King! at Rage3D? He said it there yesterday. :D http://www.rage3d.com/board/showpost...&postcount=701
I think unreal overclocking will be a software mod (probably the Overdrive people were talking about). I think it's where that extra 8-pin connector gets used, so the unreal OC may just be for those who can supply that much more power to their board.
Well, can't wait until the 14th. Hope VR-Zone opens up shop early tho. :p:
VR-Zone's last figures were way off, so I'm not counting on them this time when I know the guys who have the cards. I'll wait for what they dish out on 12/13 May.
According to [Kinc]: an unknown VGA scores around 12k in 3DM06, stock clocks on everything, pre-release, with an ES QX6700 on, "I think (?)", a Striker MB. :slobber:
Now I'll take you back a few months to when the G80GTX was released.
Back in mid-November, when I first got the XFX GeForce 8800 GTX (575/1350/1800) on nV ForceWare 96.94, with an eVGA 680i SLI MB on the P21 BIOS and the QX6700 at stock with Corsair Dominator XMS2 (PC2-6400 CL3) @ DDR2-800MHz 3-3-3-8 2T, it hit a score of 11651 in 3DM06 @ default res.
Which makes the "unknown VGA beast" look VERY promising, and that's pre-release. ;) :D :eek:
Keep your pants and knickers on, boys 'n girls .. heheehee :D :banana:
Anyone seen this? HD2900XT 512MB for $449, pre-order obviously: http://www.amazon.com/ATI-Radeon-290...8553792&sr=8-1
MSI HD 2900XT picture
http://my.ocworkbench.com/bbs/showth...421#post412421
That's just why I love ATI :D
Will this card be faster than two 8800GTS 640s in SLI?
From what I've heard, though, there will be insane headroom on these cards; it shouldn't be a bad buy at all.
As of now, it looks like the following price-wise:
Under $149 - HD 2400 Series
$149-$199 - HD 2600 PRO (June-July)
$199-$249 - HD 2600 XT (Three Versions - 256MB GDDR3, 512MB GDDR3, 512MB GDDR4) (June-July)
$249-$299 - HD 2900 PRO (Q3)
$299-$349 - HD 2900 XL (June-July)
$399-$449 - HD 2900 XT (Two Versions: 512MB GDDR3, 1GB GDDR4) (May)
$549-$599 - HD 2900 XTX (Might be more) (May-June)
Approximate...
Perkam
So there really is a 1GB version being released in May as well? I thought the 1GB GDDR4 was exclusive to the XTX line and that an XT version with 1GB of RAM would never surface. I guess another 6 days will tell.
I prefer to wait for R650; R600 is great, but so is my R580 ;)
The 1GB version can be considered nearly vaporware at this point, because we won't see it in our systems unless you go through the likes of Dell, HP, etc. Also, the 1GB version, which we thought was the XTX, has been renamed to just an XT for now. If there really isn't going to be a refresh of this product, that means when the 65nm R650 core comes out they can still use the XTX name for that core with its higher clocks.
Makes sense, guys. I heard this too, but thought perhaps things could have changed... yet again. It seems I will not be looking out for a 1GB version after all.
Totally forget about the idea of an XTX for now.
It's nowhere to be seen for us. Maybe in the future, but there's no official confirmation of that yet either.
However, the 512MB XT will release soon. And it scales very well; gaming performance/quality is supposed to be better than the synthetic benchmark numbers suggest.
Here's one to buy, pre-order for 357 Euros: http://www.icomputers.nl//articledetail.aspx?A_ID=10878
Quality. ;)
You know what I'm thinking? What might be a crucial point in deciding between the 8800GTS and the 2900XT:
PSU.
I have a Corsair 520W, good enough for 8800GTS in SLI, but I don't think it is good enough for the 2900XT in Crossfire.
Buy an 8800GTS now; if Crysis comes, I can upgrade to SLI. But if I buy the 2900XT, then I'm stuck. If performance is close between the two, I'm buying an 8800GTS.
Makes sense, doesn't it? 225W for the XT is a no-no; too high. A single card, yes, but Crossfire... phew.
We're talking about $350-450 graphics cards and you're worried about a $125 PSU upgrade?
That's for Crossfire IIRC, and it was originally posted by Fudzilla; dunno if all of it is correct. Anyway, my PSU is powerful enough for 2x8800GTS with both CPU and cards overclocked. Many people have this setup. (It's the Corsair HX520W.)
I have no hope of running Crossfire with that if a single card draws 225w. :cool:
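For what it's worth, the rough budget behind that worry looks like this (an illustrative sketch; the 225W board figure is the one quoted in this thread, while the rest-of-system estimate is my own assumption):
Code:
# Rough PSU budget for an HD2900XT Crossfire rig.
# 225W per board is the figure quoted in this thread; the rest-of-system
# number (overclocked CPU, motherboard, drives, fans) is an assumption.
gpu_board_w = 225
num_gpus = 2
rest_of_rig_w = 150  # assumed

total_w = gpu_board_w * num_gpus + rest_of_rig_w
print(f"Estimated draw: {total_w} W")        # -> 600 W
print(f"Fits a 520W PSU: {total_w <= 520}")  # -> False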
Not sure if this was already posted!
http://forum.coolaler.com/showthread.php?t=152864
:slobber: :slobber:
HD 2900 thermal numbers out, 225W
Quote:
Three kings including the XL
ATI will introduce three Radeon HD 2900 cards on the 14th of May. According to a document we've seen the most powerful one is called the Radeon HD 2900 XT 1024MB DDR4. Its TDP, thermal design power of the GPU, is an astonishing 180 Watts, while the whole board dissipates 225 Watts.
The second card is the Radeon HD 2900 XT 512 MB GDDR3 and its GPU's TDP is 160W due to slightly lower clock speed and the whole board again dissipates 225 Watts.
The third and final card is the Radeon HD 2900 XL, a cheaper version of the marchitecture with 512 MB GDDR3; its GPU will dissipate 130 Watts, while the board will dissipate 205 Watts. We believe that the power components on the card get really hot; 82 degrees C on the outside of the cooler is what we've already seen.
http://www.fudzilla.com/index.php?op...d=850&Itemid=1
:toast:
regards
Quote:
R600 minimal requirements pictured
Update: Works with a good 400-500W as well
As some of our readers actually didn't believe that their beloved R600XT might need 750W or more, here is some proof. AMD clearly states you need a 750W or better power supply for this hot beast. You also need an Athlon or Pentium 4 or better CPU, 512MB of RAM to work and 1024MB or more for optimal performance. You can read the original part here.
The card supports Windows XP, 2000 and Vista, both the 32- and 64-bit versions. We still have hope that the card might work with a good 500W, but this is what AMD recommends. Well, it's all here.
Update: We still stand behind the picture we posted, but several sources close to AMD, partners and retailers who had the card confirmed that a single Radeon HD 2900 XT works with a good quality 450-500W PSU as well. One of the sources confirmed the slide we posted, but said that it is actually for a Crossfire Radeon HD 2900XT with a quad-core system requirement. Some of our friends will test the card for us and we will let you know if it works with 500W. Stay tuned. Gibbo from Overclockers has tested his card and he said it will run with a 400-500W PSU; you can read it here.
http://aycu04.webshots.com/image/155...0649484_rs.jpg
http://www.fudzilla.com/index.php?op...d=810&Itemid=1
Yes, I can run it on that too, at stock with a stock-speed C2D. Put in a C2Q/X2 and it quickly gets more problematic. And forget any OC.
OCZ GameXStream 600W PSU: no problem at all overclocking an E6600 and X2900XT 512MB :)
BTW: with the 8.374 driver on Windows XP in 3DMark 2006, with the card at default and an E6400 @ 3GHz, the score is very good.
With Vista the driver is still not so good: around 1000 marks less, and also lower FPS (some games -60 FPS).
regards
Of course it won't need a 750W PSU. Anyone could've written that document.
Eek. That Coolaler link shows the card running on HYNIX RAM. Is this going to be a problem performance-wise? I thought the card was using tried-and-true Samsung.
It looks as if they removed the second fan connector as well.
Insane overclock to get a high score? Obviously this will improve with better drivers.
http://forums.vr-zone.com/showpost.p...1&postcount=14
Quad core? What clock for that, I wonder?
Link
http://img66.imageshack.us/img66/782/cooledr6001qc1.jpg
http://img66.imageshack.us/img66/697...edr6003bd8.jpg
The PCI-Express slot itself supplies 75 watts, so adding 2 normal 6-pin PCI-E cables at 75 watts each should add up to 225 watts total.
It does add up to 225W, but they don't run right to the edge; they leave some tolerance. The card's TDP is 225W.
http://img523.imageshack.us/img523/6...pvfr600td6.jpg
So to overclock, it needs the extra capacity of the 8-pin connector; 2x6-pin will run it at stock.
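The connector math behind that works out as follows (a sketch based on the standard PCIe power limits, which is presumably what that slide reflects):
Code:
# Power available per PCIe source, per the standard spec limits:
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe connector
EIGHT_PIN_W = 150  # 8-pin PCIe connector

stock = SLOT_W + 2 * SIX_PIN_W                # slot + 2x6-pin
overclock = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # slot + 6-pin + 8-pin
print(f"2x6-pin budget: {stock} W")      # -> 225 W, exactly the board TDP
print(f"6+8-pin budget: {overclock} W")  # -> 300 W, headroom for overclocking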
Isn't TDP a maximum value for AMD and an average value for Intel?
No, not the average; Intel's is a bit short of the maximum, maybe by 10%. Neither is the maximum wattage a chip/card will draw in our world, because we overclock/overvolt, taking chips beyond TDP.
http://www.silentpcreview.com/article169-page3.html
Quote:
This means that TDP, as defined by AMD, is measured at the maximum current the CPU can draw, at the default voltage, under the worst-case temperature conditions. This is the maximum power that the CPU can possibly dissipate. Intel, however, has a different definition.
Intel’s TDP is actually lower than the maximum power dissipation of the processor (and as you’ll see later, it can be significantly lower). This is in stark contrast to AMD’s TDP numbers, which are higher than the respective processor’s maximum power dissipation.
That's not entirely true.
http://www.xbitlabs.com/images/cpu/c...ut/power-2.png