^^That graph shows only inaccurate measurement on Xbitlab's side, nothing else (as for the higher-than-official TDP for AMD).
And BTW, this is OFFTOPIC!
Well my GTS is going on Sunday so hopefully I will have one of these on Tuesday or Wednesday if the price is right.
What isn't?
The TDP is just a design envelope indicating the upper thermal dissipation a chip/card needs. Overclock or overvolt and it will move beyond this design envelope; that's my point. For Intel the main issue is when Tc > Tc-max. Take the C2D E6600: it has a TDP of 65W, the same across the range except the 75W Extreme versions. It's just a thermal generalization for the chip package. Think about it: a 6300 should be lower than a 6600, and it's not. It's really there for engineers to design cooling solutions in the real world, so it's not that accurate a reflection of maximum wattage. The Intel value is certainly not an average, more like the typical maximum (stock). AMD? Well, theirs is supposed to be the maximum (stock). If you're saying they under-report it, I wouldn't know; it's hard to measure. Theoretically, if you ran identical systems, the power delta should be the same as the TDP difference, but then you get into internal/external memory controllers and chipset efficiencies. I wouldn't place too much stock in charts.
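To make that "power delta should be the same as the TDP difference" idea concrete, here's a minimal sketch in Python. The chip wattages and the 80% PSU efficiency are assumptions for illustration only, and as said above, memory controllers and chipset efficiencies muddy real measurements.
Code:
# Rough sketch of the "power delta should match the TDP difference" idea.
# All numbers here are illustrative assumptions, not measured values.

def wall_power_delta(tdp_a_w, tdp_b_w, psu_efficiency=0.80):
    """Expected wall-power difference between two otherwise-identical
    systems, given the chips' TDPs; an ~80%-efficient PSU inflates the
    DC-side delta when measured at the AC socket."""
    dc_delta = abs(tdp_a_w - tdp_b_w)   # difference on the DC side
    return dc_delta / psu_efficiency    # what a wall meter would show

# A hypothetical 65 W chip vs a 90 W chip behind an 80%-efficient PSU:
print(f"{wall_power_delta(65, 90):.1f} W expected at the wall")  # ~31.2 W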
Sorry for the OT
2900XT 3DMark06 number
http://forums.vr-zone.com/showpost.p...1&postcount=14
Too late, one page back #693
Check out the ghetto cooling. Love it.
Holy crappola, it's doing 1000/1000 on air :eek:
R600 can't walk on water, but it sure can fly, and with the 357€ already seen on some sites, to my surprise I am even considering CrossFire in the not-so-distant future, DAAMIT :devil:
It's better to invest in watercooling than to invest in crossfire for now...OR
If you are bent on buying two cards, wait for the XL... two of those will be more economical with similar performance.
It will be priced a little above the 320MB price range but still retain the 512MB GDDR3 of the XT.
Perkam
How does that score stack up with the 8800GTX?
what's the difference between GPU TDP and Board TDP ?
So is the R600's (GDDR3) power consumption 160 or 225??
And G80's Board TDP is 225W too, no?
Board is the total power for the entire board: GPU, RAM, and the various other chips.
The 8800GTX is 175-177 total, I can't remember which.
So the important thing is the Board TDP, not the GPU TDP?
And your point being? It's the same thing. And I'm quite sure we can assume somewhat equally efficient VRMs. So making all the numbers a little lower would change what? Not really anything, since the point was it had a LOWER TDP! :slapass:
So counting the loss in the VRM just makes it even better. :fact:
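A rough sketch of the arithmetic behind that, in Python; the component split and the VRM efficiency below are assumptions for the example, not official figures for any card:
Code:
# Illustrative breakdown of Board TDP vs GPU TDP.
# Every figure below is an assumption for the example, not a spec.

gpu_tdp_w      = 160.0  # the GPU chip alone
memory_w       = 30.0   # assumed draw of the GDDR3 chips
misc_w         = 10.0   # assumed: fan, display logic, etc.
vrm_efficiency = 0.85   # assumed fraction of input power delivered

dc_load_w   = gpu_tdp_w + memory_w + misc_w
board_tdp_w = dc_load_w / vrm_efficiency  # VRM loss heats the board too

print(f"Board TDP ~= {board_tdp_w:.0f} W for a {gpu_tdp_w:.0f} W GPU")
# -> Board TDP ~= 235 W for a 160 W GPU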
Are we still under the impression that May 14th is the date?
Shintai,
Sorry, my bad. I slightly misinterpreted what you said.
Thanks, fellas. Would a delay surprise you if they did? May 14th is a far cry from the 3rd week of February, so at this point it would not shock me one bit!
Kinc strikes again
http://www.nordichardware.com/news-pics/47.jpg
http://www.nordichardware.com/forum/...b4604cbddc65ab
Quote:
ATI overdrive secret uncovered
Schematics about the R600 power
The R600XT, or Radeon HD 2900 XT, needs two power connectors: one 2x3-pin and one 2x4-pin. If your PSU doesn't have the 2x4-pin one, then you can plug in a 2x3-pin one, but you won't get the full potential of the card.
You need the second, 2x4-pin connector for overclocking, as otherwise ATI's Overdrive overclocking utility won't work at its full potential. You can read more about it here>> http://www.fudzilla.com/index.php?op...=673&Itemid=34
The card will overclock even without the 2x4-pin connector, but the scores won't be great. We heard that there will be a lot of new PSUs introduced starting Monday. Here is a diagram of how it looks in the real world.
http://aycu22.webshots.com/image/150...7671669_rs.jpg
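For context on why the 2x4-pin connector matters: per the PCI Express spec, each power source has a ceiling (75W from the slot, 75W from a 6-pin connector, 150W from an 8-pin connector), so the budgets work out like this:
Code:
# PCI Express power ceilings per source (spec values).
PCIE_SLOT_W = 75   # through the motherboard slot
PIN_2X3_W   = 75   # 2x3 (6-pin) connector
PIN_2X4_W   = 150  # 2x4 (8-pin) connector

two_six_pin    = PCIE_SLOT_W + PIN_2X3_W + PIN_2X3_W  # without the 2x4-pin
six_plus_eight = PCIE_SLOT_W + PIN_2X3_W + PIN_2X4_W  # with the 2x4-pin

print(f"two 2x3-pin:       {two_six_pin} W")     # 225 W
print(f"2x3-pin + 2x4-pin: {six_plus_eight} W")  # 300 W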
http://www.fudzilla.com/index.php?op...d=894&Itemid=1
Quote:
Partners to do overclocked HD 2900XT
The reference sucks
At least a few partners have confirmed that they will be introducing overclocked versions of the R600XT card. These cards will come some two weeks after the original introduction, and most of them will work at significantly higher clocks.
The reference card works at 745 MHz core and 1656 MHz memory. There is still no confirmation whether any of these cards will use a water cooler, but we would not be surprised.
With the reference cooler the card works stable at 840 MHz, as we proved here >> http://www.fudzilla.com/The%20reference%20sucks
and with a better cooler, such as a few partners have in their labs, you should be able to do even more. With water we think even above 900 MHz should be possible.
We will try to get some clocks, but at this point we can confirm that overclocked HD 2900 XT cards are on their way.
http://www.fudzilla.com/index.php?op...905&Itemid=1
Quote:
AMD HD 2900XT BLACK BOX voucher pictured
If you did not know, AMD and Valve are teaming up once again to bundle a Steam voucher in their upcoming range of HD 2900XT graphics cards. Here is some proof...
http://aycu22.webshots.com/image/155...3764673_th.jpg
The voucher will cost AIB partners roughly $6 - 7 USD each and will be bundled with all XT cards, but it is unclear at this stage if the voucher will be bundled with lower-end graphics cards.
If you have Steam installed, just enter the code when the games are ready and you will be able to download and play Half-Life 2: Episode 2, Team Fortress 2 and Portal. It would be a shame if users need to wait over a year to play the games, but they can't be too far away, as BETA versions of each are already floating around the traps.
Good times!
http://www.tweaktown.com/news/7470/a...red/index.html
regards
http://www.fudzilla.com/index.php?op...d=906&Itemid=1
Quote:
R600XT overclocks with two 2x3 pin
Practical example
The Radeon HD 2900 XT will overclock to 840 MHz on air, and if you haven't seen that story, you can see it here. The 840 MHz core on the Radeon HD 2900 XT card is possible with the reference cooler.
Even if you don't have the 2x4-pin necessary for overclocking, you will be able to overclock. Two plain 2x3-pin power connectors will do the job, but the Catalyst Control Center won't recognise that the card has the Overdrive capability.
All you really need is the GPU overclocking tool that we mentioned and linked here, and you can overclock even with two 2x3-pin PCIe power connectors. So you won't be needing a new PSU.
Quote:
Owners of Listan/Be Quiet! power supplies to get free upgrade to 8-pin PCI Express connectors
If you bought a Listan/Be Quiet! power supply that was made in December 2006 or later, you are supposedly eligible for a free upgrade to PCI Express 2. This means that before the ATI Radeon HD 2900 series is released, you can get the proper power connector. The new connectors will be compatible with the older 6-pin PCI Express power interface, thanks to a simple mechanism to remove the extra pins.
In short, look at the pretty picture, and expect a form to get one of these if you own a Listan/Be Quiet! power supply made in December 2006 or later.
http://www.techpowerup.com/img/07-04...PcieII_thm.jpg
Source: The Inquirer
Quote:
8-pin PCIe Adapter Cables from Corsair
As the high-end R600 cards require an 8-pin PCI-Express power cable to be plugged into the corresponding connector (look at the pictures of the HD 2900XT and OEM XTX), especially if you intend to overclock them, power supply manufacturers have started to design adapter cables for their current iterations of PSUs that don't come with 8-pin PCIe cables natively. The guys from BeQuiet!/Listan were the first to provide such adapter cables for free; see the details here.
Now Corsair has stepped into line as well: if you live in the U.S., you can contact Corsair's Customer Service and will be charged only for shipping, an official Corsair representative over at the [H]ard|Forum mentioned.
But remember, this only applies if you purchased the power supply during the last few weeks.
http://www.techpowerup.com/img/07-04...PcieII_thm.jpg http://www.techpowerup.com/img/07-04...OEM_09_thm.jpg http://www.techpowerup.com/img/07-05...nonref_thm.jpg
Source: [H]ard|Forum
http://forums.techpowerup.com/showthread.php?t=30861
23 FPS in STALKER??? Maybe it's FAKE!!!
regards
Stalker is a tad low...but one should be ready for the possibility that the rest are about right on pre-release drivers.
Remember that many more driver releases will increase the XT's performance, and ATI cannot put out drivers that give 100% of the possible performance in one release.
Secondly, overclocking headroom on the XT (on all types of cooling) will exceed that of the GTS 640, so really it'll be an overclocker's card.
On a brighter note: http://www.xtremesystems.org/forums/...&postcount=137
NO COLDBUG ON R600 CONFIRMED !!!!! :woot:
Perkam
I am liking the idea of partners releasing highly overclocked models from the get-go. The higher clocks themselves are nothing really, as we can always flash new BIOSes or overclock ourselves, but the idea of cooling better than stock is leaving a nice thought in my head. If these things really are limited in their overclock due to heat, then the better the cooling we have, the higher we should be able to go, especially with the hot summer months now upon us. I just hope some of these better cooling options exhaust heat out of the back of the case, because then I'll no longer have all this hot air raising my case temps like my X1900XTX with Accelero X2 does.
These results are not looking good at all; in fact, I would say they are pathetic to say the least. I hope these are wrong and the 2900XT isn't just going to be a glorified 3DMark tool, as it seems those are the only programs showing good results with this card.
Some parts? To me that entire review showed them lower, and by a fair margin as well, except for Company of Heroes, which they said was probably flawed and shouldn't be taken seriously. Monday is coming, I know. We will all find out then from the many more reviews that will be popping up.
Where are you getting this? The only game I've seen benchmarked out of about 15 so far where this was the case is Company of Heroes. In everything else, it gets killed by the GTX.
Got a link?
This comes from a sincere interest, btw. I've been really unhappy with Nvidia's buggy drivers / hardware and am looking at going another route. But I haven't been impressed at all by what I've seen from the R600 so far.
I think I'll crawl back under my rock, to come back out on the 14th of May... I'm getting nervous from all that Fudzilla BS and all those "fake" reviews, only for the sake of getting more hits on a site...
And WHY the hell does everybody still think the XT competes with a GTX? ATI themselves stated it's on par with a GTS.
Why does the ATI card take such a hit at higher resolution? I thought this 512-bit interface was supposed to allow "free" 4xAA, which should mean the card would perform very well at high res, with or without AA...
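For what it's worth, the raw bandwidth really is there on paper. A quick back-of-envelope calculation, using the 1656 MHz effective memory clock quoted earlier in the thread and the commonly cited 384-bit / 1800 MHz effective figures for the 8800 GTX:
Code:
# Back-of-envelope memory bandwidth comparison.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """GB/s = (bus width in bytes) * (transfers per second)."""
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(f"HD 2900 XT (512-bit, 1656 MHz): {bandwidth_gb_s(512, 1656):.1f} GB/s")  # ~106.0
print(f"8800 GTX   (384-bit, 1800 MHz): {bandwidth_gb_s(384, 1800):.1f} GB/s")  # ~86.4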
Sapphire Radeon HD2900XT 512MB € 339,45 :eek:
http://www.azerty.nl/producten/produ...-hd2900xt.html
CoH doesn't count...
Sounds like AA/AF isn't working properly with CoH on the HD 2900XT.
Quote:
There are two things we can conclude from our first round of testing on this pretty obsolete, but still the highest-end chipset on the market for Socket AM2. First thing, there are some very strange "happenings" with Company of Heroes that forced us to suspect some driver problems. It seems possible that AA/AF settings have some problems with CoH and Radeon HD2900XT, so take these results with a tiny bit of salt. We'll add sugar to "neutralize" a bit later. We noticed similar problems (a bit worse, though) with Radeon HD 2600/2400 when we played around with them last week, but they have entirely different set of drivers that aren't ready right now. Video - very soon.
Can't wait to test this baby out on EQ2.
Day - 5 ... :banana:
AA isn't properly enabled in COH on my X1950XTX either unless I rename the .exe to "Oblivion.exe". I believe this game uses HDR, and thus somehow you have to make the drivers think it's Oblivion (chuck patch, anyone?) to make it work. It's been like this since Cat 7.1 or so.
It's going to get cheaper for sure....
Well, most ppl can get close to GTX speeds with a GTS as long as you O/C the card. At near $200, the 8800GTS 320MB is a really difficult card to beat unless you are running super high resolution.
Obviously question #1 is which drivers were used. Second thing is, they show both AM2 and Conroe hardware in the system specs, and a few motherboards. This is a huge flaw, because if they ran the ATI tests on the AMD platform, we all know it's gonna be worse than if it was run on the Intel setup.
And 3rd and most important thing is: show me stock 3dmark numbers. They don't mean much to me, but I use them as a reference when looking at results.
This review tells me nothing besides the fact that the author wasted his time and ours...
EDIT: I am referring to this: http://it-review.net/index.php?optio...=1314&Itemid=1
Oh boy.. here we go with this stuff again.
Apples to oranges. Can you not also overclock a GTX to widen the performance gap once again?
I've never understood the mindset of guys who want to compare an overclocked card with a non-overclocked card. Makes no sense.
A better comparison to make in this case would be a GTX overclocked to its max on stock voltage and an XT overclocked to its max on stock voltage. That's the setup that will be most common.
Just revisiting the review posted earlier: they've added this, which is strangely exactly how I imagined him to look. :D
http://img267.imageshack.us/img267/2...0test12cj3.jpg
LOL, ape with 6 R600 cards OMG!!
5 R600 and one X2600 :p:
That "Review" is complete and utter B.S. They used an AM2 cpu on a NForce 590 motherboard for the HD 2900XT, and a Conroe chip for the 8800s. How is that an apples to apples comparison? Lets put the 8800s on a Pentium 3 and see how they do against a HD 2900XT on a Phenom.
Driver issues are the only concern I have there...
Otherwise, as i said...ppl should be ready for those being the correct stock numbers...
Again though, the power lies in the oc for this card (with confirmed no cold bug till -90C).
BIG EDIT: I also saw that they used two systems for the test but didn't say which card they used with which system... kind of pathetic.
Perkam
Yep, driver issues for sure, but lots of nVidiots only look at the first benchmarks. And after they have seen some benchmarks, they don't look again 3 months later. I know some diehard nVidiots, and I already know they will say the X2900 sucks for the rest of their lives :mad:
But luckily I know better :toast:
The "power" is right. With these things sporting a TDP of 225 watts at stock just to deliver sub-GTS performance (?!?). What is the performance/watt here? Something like half that of G80 if there is any truth to that review?
Bah, this is stupid. I've long suspected the worst, but if true this isn't another NV30, it's actually worse than the NV30. How could they have possibly f'd up so badly? NV is going to laugh all the way to the bank with its rip-off pricing.
That guy needs a shave :p:
There are a few Dutch guys who are under NDA and have always been reliable.
Hacku on the tweakers.net forum posted this, thnx Hacku :toast:
Quote:
FarCry 1.4 SM3 HDR [1280x1024]
- 8800 GTX: 123.3 fps
- 2900 XT: 119.3 fps
- 8800 GTS: 106.2 fps
- X1950XT: 85.8 fps
FarCry 1.4 SM3 HDR [1920x1200 - 4x AA 8x AF]
- 8800 GTX: 76.7 fps
- 2900 XT: 70.2 fps
- 8800 GTS: 54 fps
- X1950XT: 50.1 fps
Prey [1280x1024]
- 8800 GTX: 184.3 fps
- 2900 XT: 131.2 fps
- 8800 GTS: 127.3 fps
- X1950XT: 117.6 fps
Prey [1920x1200 - 4x AA 8x AF]
- 8800 GTX: 89.9 fps
- 2900 XT: 70.5 fps
- 8800 GTS: 65.5 fps
- X1950XT: 55 fps
Splinter Cell: Chaos Theory SM3 HDR [1280x1024]
- 2900 XT: 142.5 fps
- 8800 GTX: 131.7 fps
- X1950XT: 94.2 fps
- 8800 GTS: 94 fps
Splinter Cell: Chaos Theory SM3 HDR [1600x1200 - 4x AA 8x AF]
- 8800 GTX: 85.6 fps
- 2900 XT: 86.5 fps
- 8800 GTS: 68.5 fps
- X1950XT: 64.7 fps
X3: The Reunion [1280x1024]
- 2900 XT: 118.5 fps
- 8800 GTX: 99.2 fps
- X1950XT: 97 fps
- 8800 GTS: 88.1 fps
X3: The Reunion [1920x1200 - 4x AA 8x AF]
- 8800 GTX: 74 fps
- X1950XT: 65.4 fps
- 8800 GTS: 58.8 fps
- 2900 XT: 56.5 fps
More recent games:
Company of Heroes [1280x1024]
- 8800 GTX: 142.5 fps
- 2900 XT: 121.4 fps
Company of Heroes [1920x1200 - 4x AA 8x AF]
- 8800 GTX: 75.8 fps
- 2900 XT: 69.5 fps
S.T.A.L.K.E.R. [1280x1024]
- 8800 GTX: 107.8 fps
- 2900 XT: 96.2 fps
S.T.A.L.K.E.R. [1920x1200 - 4x AA 8x AF]
- 8800 GTX: 87 fps
- 2900 XT: 46.9 fps
^^ Doesn't make any sense; the memory on the 2900XT should make the performance drop much less than it does... atm this is looking like pure cack!
Those scores look more believable. But the high end with AA doesn't make any sense, where it drops really low. Must be a driver issue with AA or something, still. It also depends on what AA they used. And to top it off, what clocks were they using on everything: system, GTS, GTX and the XT? Oh yeah, system specs too.
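One way to see how odd the STALKER result is: work out what fraction of its 1280x1024 frame rate each card keeps at 1920x1200 with 4xAA/8xAF, straight from the numbers posted above:
Code:
# Fraction of low-res performance retained at 1920x1200 4xAA/8xAF.
# Pairs are (1280x1024 fps, 1920x1200 4xAA/8xAF fps) from the post above.
results = {
    "FarCry":  {"8800 GTX": (123.3, 76.7), "2900 XT": (119.3, 70.2)},
    "Prey":    {"8800 GTX": (184.3, 89.9), "2900 XT": (131.2, 70.5)},
    "STALKER": {"8800 GTX": (107.8, 87.0), "2900 XT": (96.2, 46.9)},
}

for game, cards in results.items():
    kept = ", ".join(f"{card} keeps {hi / lo:.0%}" for card, (lo, hi) in cards.items())
    print(f"{game:8}: {kept}")

# STALKER stands out: the GTX keeps ~81% of its frame rate while the
# XT keeps only ~49%, consistent with an AA driver problem.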
Are the R600's unified shaders vec4 + scalar or vec5 superscalar?
What's the difference?
I predict the 2900XT bests the 8800 Ultra in Rainbow Six: Vegas which uses base code from Unreal Engine 3! Any takers?
Highest in-game quality settings, no AA.
There is no magic; people should try to get over the "fake benches, bad drivers, bad card" arguments.
The 2900XT will (relatively) suck big time, and it's going to leave everyone who's been waiting for it disappointed, because not only does it not come even close to the 8800GTX, but it loses to the 8800GTS in about half of the benches I've seen so far.
For a product that is 6 months newer, every bench that has leaked out so far has shown the R600 to be a flop, and I'm really just accepting it now.
By flop I mean not beating the 8800GTX in most benchies, which even the 2900XT should do after coming to market so late.
Oh, and then there is that power consumption..
just...
sigh...
I need to get away from my computer for a couple days until this card comes out. All the buzz is just killing me. I need to stop waiting for the next "thing" to be "leaked". Only a few days left...now to find something to take this stuff off my mind...
Vista is the gaming platform of choice for Crysis and other titles that are coming out soon.
There, the ATI XT is the card to have.
CrossFire solutions will also be ones where 8 cards might run at the same time,
where performance will scale well. Nvidia has nothing there as of now.
DX9 isn't the design focus for the 2900XT as I see it; everything indicates and points to DX10 and up, and Vista.
So I'm not surprised if games show lower numbers in the beginning; drivers will improve that quite a bit, even in DX9.
3DMark indicates the card works well, so it's all there for the drivers to get.
Hm... I'll wait until there are more reviews and some fresh drivers. The card can't be that bad.
Quote:
Thank god that nVidia aren't stupid enough to do that kind of thing.
1500W power consumption, wohoo! :rolleyes:
Are you by any chance working for ATI's PR team? Because you are very good at promoting ATI's products.
Quote:
Vista is the gaming platform of choice for Crysis and other titles that are coming out soon.
There, the ATI XT is the card to have.
Me too. Look at the AA numbers from that review:
1280 - 4x AA - CoH - 122
1280 - 4x AA - FEAR - 81
1280 - 4x AA - FarCry - 97
1280 - 4x AA - Oblivion - 47
1280 - 4x AA - Prey - 81
1280 - 4x AA - SC - 43
1280 - 4x AA - X3 - 71
1280 - 8x AA - CoH - 117
1280 - 8x AA - FEAR - 48
1280 - 8x AA - FarCry - 86
1280 - 8x AA - Oblivion - 41
1280 - 8x AA - Prey - 66
1280 - 8x AA - SC - 37
1280 - 8x AA - X3 - 61
2560 - 4x AA - CoH - 53.3
2560 - 4x AA - FEAR - 35
2560 - 4x AA - FarCry - 44
2560 - 4x AA - Oblivion - 24
2560 - 4x AA - Prey - 46
2560 - 4x AA - SC - 20
2560 - 4x AA - X3 - 36
2560 - 8x AA - CoH - 53.1
2560 - 8x AA - FEAR - 32
2560 - 8x AA - FarCry - 43
2560 - 8x AA - Oblivion - 22
2560 - 8x AA - Prey - 32
2560 - 8x AA - SC - 20
2560 - 8x AA - X3 - 29
So much weird stuff in those numbers.....
The GeForce 7 competed with the Radeon X19xx in all games when both cards were released, but in all recent games the Radeon X19xx trounces them.
Look at the ATI numbers versus the Nvidia numbers. ATI cards totally dominate Nvidia cards!!! Nvidia cards age badly compared to ATI cards. The 9800 Pro's numbers are impressive in today's games.
Uhmm...no?
http://www.techreport.com/reviews/20...x/index.x?pg=1
And even in new games, only a few show the ATI dominance, and mainly due to product refreshes and driver focus.
But you left out all the reasons why it should be faster in DX10.
What's the meaning of the TechReport review? :confused:
Moreover, saying that ATI's dominance in some games (GeForce 7 vs X19xx) is due to product refreshes and driver focus is very funny: Nvidia never refreshed their line and never released driver updates? :stick:
I can't prove at the moment that they will be faster in DX10, but you can't prove either that it's just "a desperate hope".
Quick test on the new ATI GC
http://forums.hardwarezone.com/showthread.php?t=1607936
Link
http://img132.imageshack.us/img132/4...xtlargewj0.jpg
Quote:
At the MSI PC2008 event yesterday in Taipei MSI showed off its watercooled version of the Radeon HD 2900XT and we snapped a shot for you to ogle over for now. We’re working on the coverage of the whole event and it will be up later today. But for now, enjoy this picture of this card which has a very nice looking watercooler on it.
8800GTS Stock
Quote:
Okey dokey. I can only consider myself one of those lucky guys to get my hands on some new toys (retail version) before they're launched next week, for some testing.
However, I encountered some issues running benchmarks on Win XP, thus I have to do it on Windows Vista. I would feed back to ATI on this issue & see if there are any new or official drivers to solve it.
& of course, not forgetting its challenger... I used the 8800GTS 320MB from Asus as a comparison.
a teaser for all interested.
Test Platform
Intel E6600
Asus P5B Deluxe
2 x 1GB D9GMH KVR
Seagate SATA 80GB HDD
Windows Vista Home Premium
& let's go
http://img513.imageshack.us/img513/3...defaultby2.jpg
ATI Stock
http://img155.imageshack.us/img155/7...teditedrm3.jpg
ATI Overdrive MAX OC
http://img155.imageshack.us/img155/8...xeditedbi9.jpg
Now that GTS is getting spanked.
If only that were so; then an 8600GTS would be a lot better buy than an X1950Pro. Yet when we drop 3DMark and play actual games, it turns around. And 3DMark cheat drivers ain't uncommon ;)
The last nVidia driver gave what, 200-500 more 3DMarks and nothing in real games.
And Vista is also a bad test, since nVidia's drivers are a bit behind there.
Why didn't he use an 8800GTS 640MB? It seems to me that that card is the one the 2900XT is competing against.