
Thread: Official HD 2900 Discussion Thread

  1. #1276
    Xtreme Mentor
    Join Date
    Jul 2004
    Posts
    3,247
    Quote Originally Posted by R3APER View Post
And will aftermarket coolers (core only) for the GTX work on the 2900?

    And when the hell can we expect the XTX?
    Maybe Q3

  2. #1277
    Xtreme Enthusiast
    Join Date
    Jan 2006
    Posts
    569
    Quote Originally Posted by XeRo View Post
I thought the ATi CC could do it.

    EDIT: I think people also use this to monitor temps.

    http://forums.techpowerup.com/showthread.php?t=30555
I don't have an 8-pin PCI-Express connector, so I can't get the Overdrive tab, which shows temps...
    Intel Core i7 920 @ 3.8GHz (183x21)
    Gigabyte EX58-DS4 BIOS F5
    3GB PATRIOT PC3-10666 DDR3
    Sapphire Radeon HD4870 512MB BLACK
    2x500GB SEAGATE SATA-II 7200.11
    OCZ GameXstream 750W PSU
    Antec Three Hundred Chassis

  3. #1278
    Xtreme Member
    Join Date
    Oct 2004
    Posts
    282
    Quote Originally Posted by R3APER View Post
    I know, but if the tradeoff is going with a worse mobo for the new ATi cards, I would reconsider. Which do you think is better?
You said you'd trade your GTX for a pair of 2900 XTs. You can't CrossFire on an Nvidia chipset.

  4. #1279
    Xtreme Enthusiast
    Join Date
    Jan 2006
    Posts
    569
Okay, so temperatures as requested:

Idle: 60°C
Load: 70°C
    Intel Core i7 920 @ 3.8GHz (183x21)
    Gigabyte EX58-DS4 BIOS F5
    3GB PATRIOT PC3-10666 DDR3
    Sapphire Radeon HD4870 512MB BLACK
    2x500GB SEAGATE SATA-II 7200.11
    OCZ GameXstream 750W PSU
    Antec Three Hundred Chassis

  5. #1280
    Xtreme Cruncher
    Join Date
    Jul 2006
    Posts
    1,374
That's not bad at all...

  6. #1281
    Xtreme Addict
    Join Date
    Apr 2006
    Location
    Hong Kong
    Posts
    1,905
70°C load isn't bad at all!

    I'm getting more and more anxious by the minute. In fact, I'm heading out now to see if I can get one of these bad boys... then my rig will be complete!


    "Language cuts the grooves in which our thoughts must move" | Frank Herbert, The Santaroga Barrier
    2600K | GTX 580 SLI | Asus MIV Gene-Z | 16GB @ 1600 | Silverstone Strider 1200W Gold | Crucial C300 64 | Crucial M4 64 | Intel X25-M 160 G2 | OCZ Vertex 60 | Hitachi 2TB | WD 320

  7. #1282
    Xtreme Cruncher
    Join Date
    Nov 2005
    Location
    Rhode Island
    Posts
    2,740
    @Rob GL,

Nice performance, but is that Half-Life 2: Episode Two or One?
    Fold for XS!
    You know you want to

  8. #1283
    Xtreme Addict
    Join Date
    May 2006
    Location
    Herbert's House in Family Guy
    Posts
    2,381
The 4 screws next to the core look about the same size as on the X1900... maybe the D-Tek Fuzion GPU block will do the trick...
    E6600 @ 3.6
    IN9 32x MAX
    EVGA 8800Ultra
    750W

  9. #1284
    Xtreme Enthusiast
    Join Date
    Aug 2006
    Location
    Southern California. USA.
    Posts
    632
    Quote Originally Posted by [XC] Lead Head View Post
    @Rob GL,

Nice performance, but is that Half-Life 2: Episode Two or One?
It's the "Black Box" deal, so you get...

    Half-Life 2: Episode 2
    Portal
    Team Fortress 2

EDIT: I should probably mention these games aren't out yet. What you get is a voucher to download them for free on Steam when they're released (right?).

...and it's supposed to retail for $40, so that is added value.

I ran some calculations at 1920x1200 from AnandTech's graphs and found that the 2900 XT is on average 2 FPS faster than the GTS 640; converted to a per-game percentage average, that works out to the 2900 XT being about 6% faster than the GTS 640. For ~$100 more, that seems a bit steep.
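Something like the following (made-up FPS numbers, NOT AnandTech's actual data) shows why the raw-FPS average and the percentage average are two different comparisons:

# Hypothetical per-game FPS pairs: (2900 XT, GTS 640).
results = [(52.0, 49.0), (78.0, 80.0), (31.0, 28.0), (95.0, 91.0)]

# Average of the raw FPS differences across games.
avg_fps_delta = sum(xt - gts for xt, gts in results) / len(results)
# Average of the per-game percentage differences.
avg_pct_delta = sum((xt - gts) / gts * 100 for xt, gts in results) / len(results)

print(f"avg raw delta: {avg_fps_delta:+.1f} FPS")  # +2.0 FPS with these numbers
print(f"avg % delta:   {avg_pct_delta:+.1f}%")     # about +4.7% with these numbers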
    Last edited by kuhla; 05-14-2007 at 08:46 PM.

  10. #1285
    Xtreme Addict
    Join Date
    May 2006
    Location
    Herbert's House in Family Guy
    Posts
    2,381
The GTS 640 is in the low $300s now... EVGA is $329.99 after rebate... I bought my 2nd one almost 3 months ago and it cost 10 bucks more...
    E6600 @ 3.6
    IN9 32x MAX
    EVGA 8800Ultra
    750W

  11. #1286
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Location
    Nordschleife!
    Posts
    705
I think a lot of people here are in denial, and I kind of understand that. Many of us just waited and waited for something that is, let's face it, a major disappointment. The hype over R600 was downright insane. Everybody was like "R600 will be at least 50% faster than the 8800 GTX, and with much better IQ on top of that".

A few weeks ago, when it became evident that the HD2900XT wouldn't be the G80 killer that everyone took for granted, some argued that future drivers would solve that. More in-depth reviews, like the ones at Anand and Beyond3D, clearly state that the R600 architecture is extremely software-dependent:

    Quote Originally Posted by Anandtech
    If it seems like all this reads in a very complicated way, don't worry: it is complex. While AMD has gone to great lengths to build hardware that can efficiently handle parallel data, dependencies pose a problem to realizing peak performance. The compiler might not be able to extract five operations for every VLIW instruction. In the worst case scenario, we could effectively see only one SP per block operating with only four VLIW instructions being issued. This drops our potential operations per clock rate down from 320 at peak to only 64.
    Quote Originally Posted by Anandtech
    But maximizing throughput on the AMD hardware will be much more difficult, and we won't always see peak performance from real code. On the best case level, R600 is able to do 2.5x the work of G80 per clock (320 operations on R600 and 128 on G80). Worst case for code dependency on both architectures gives the G80 a 2x advantage over R600 per clock (64 operations on R600 with 128 on G80).
    Quote Originally Posted by Anandtech
    While NVIDIA focused on maximizing parallelism in this area of graphics, AMD decided to try to extract parallelism inside the instruction stream by using a VLIW approach. AMD's average case will be different depending on the code running, though so many operations are vector based, high utilization can generally be expected.
    Quote Originally Posted by Beyond3D
    While going 5-way scalar has allowed AMD more flexibility in instruction scheduling compared to their previous hardware, that flexibility arguably makes your compiler harder to write, not easier. So as a driver writer you have more packing opportunities -- and I like to think of it almost like a game of Tetris when it comes to a GPU, but only with the thin blocks and with those being variable lengths, and you can sometimes break them up! -- those opportunities need handling in code and your corner cases get harder to find.

The end result here is a shader core with fairly monstrous peak floating point numbers, by virtue of the unit count in R600, its core clock and the register file of doom, but one where software will have a harder time driving it close to peak. That's not to say it's impossible, and indeed we've managed to write in-house shaders, short and long and with mixtures of channels, register counts and what have you, that run close to max theoretical throughput. However it's a more difficult proposition for the driver tech team to take care of over the lifetime of the architecture, we argue, than their previous architecture.
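To put rough numbers on what those quotes describe, here's a toy sketch (a naive greedy packer in Python; the real R600 compiler and its co-issue restrictions are far more involved) of why dependent code can't fill all five slots:

# Peak vs. worst-case ops/clock from the AnandTech quotes.
R600_BLOCKS = 64   # 5-wide VLIW units in R600
R600_SLOTS = 5
G80_SPS = 128      # G80: fully scalar, one op per SP per clock

print(R600_BLOCKS * R600_SLOTS)              # 320 ops/clock at peak
print(R600_BLOCKS * 1)                       # 64 ops/clock on fully serial code
print(R600_BLOCKS * R600_SLOTS / G80_SPS)    # 2.5x G80 per clock, best case
print(G80_SPS / (R600_BLOCKS * 1))           # 2x behind G80 per clock, worst case

def greedy_pack(deps):
    """Toy VLIW packer: deps[i] is the set of earlier ops that op i
    reads from; an op must land in a later bundle than its producers.
    Returns bundles of at most R600_SLOTS ops each."""
    bundles, placed = [], {}
    for op, d in enumerate(deps):
        earliest = max((placed[p] + 1 for p in d), default=0)
        for b in range(earliest, len(bundles)):
            if len(bundles[b]) < R600_SLOTS:
                bundles[b].append(op)
                placed[op] = b
                break
        else:
            bundles.append([op])
            placed[op] = len(bundles) - 1
    return bundles

# A serial chain of 5 ops wastes 4 slots per bundle (5 bundles)...
print(greedy_pack([set(), {0}, {1}, {2}, {3}]))          # [[0], [1], [2], [3], [4]]
# ...while 5 independent ops fill a single bundle completely.
print(greedy_pack([set(), set(), set(), set(), set()]))  # [[0, 1, 2, 3, 4]]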
    But what people don't realize is that ATI had a billion years to develop proper drivers. It doesn't matter what ATI says. The R600 is very late to the game, unfortunately. That "magic driver" is to be taken with a HUGE grain of salt. The IQ loss is very evident.

Like I said in another 2900 XT thread:

"the HD2900XT is such a disaster that instead of bringing Nvidia's prices down, it made 'em go up. Yesterday the cheapest 8800 GTS 320 was $250 and now it's $295 at Newegg:

    http://www.newegg.com/Product/Produc...613&name=320MB

Before anyone calls me a fanboy, let me put this out there: I have owned only ATI cards since the R300. Not because I like ATI and hate Nvidia, but just because they were better and faster. I too hoped that R600 would be a beast of a GPU but, unfortunately, it isn't. In my book it is a major flop, and I truly hope that AMD/ATI get their act together and put up some serious competition to both Nvidia and Intel.
    Last edited by Caparroz; 05-14-2007 at 10:11 PM.
    Murray Walker: "And there are flames coming from the back of Prost's McLaren as he enters the Swimming Pool."

    James Hunt: "Well, that should put them out then."

  12. #1287
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    http://www.hardware.fr/articles/671-...d-2900-xt.html
A French test with a better driver, on Vista.

All is not perfect, but seeing this review and the Digit-Life one within a day of each other is starting to get very interesting.

@Caparroz: Using the AnandTech review is good, but you should use it in its entirety. In their image quality section they report nothing wrong. Image quality in the Hardware.fr (BeHardware) review is equal to X1900 image quality, but a little inferior to 8800 image quality.
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB

  13. #1288
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Location
    Nordschleife!
    Posts
    705
    ^^I was talking about the Call of Juarez test. The 2900xt fireplace pic clearly shows blurred textures. That bench used the "magic" 8.37.4.2 driver, didn't it?
    Murray Walker: "And there are flames coming from the back of Prost's McLaren as he enters the Swimming Pool."

    James Hunt: "Well, that should put them out then."

  14. #1289
    Xtreme Addict
    Join Date
    May 2006
    Location
    Herbert's House in Family Guy
    Posts
    2,381
    Quote Originally Posted by Caparroz View Post
    Before anyone call me a fanboy let me put this: I owned only ATI cards since R300. Not beacause I like ATI and hate Nvidia but just cuz they were better and faster. I too had hope that R600 would be a beast of a GPU but, unfortunately, it isn't. In my book it is a major flop and I truly hope that AMD/ATI get their act together and put some serious competition to both Nvidia and Intel
Yes, I too hoped that R600 was gonna be fast... so that NV would drop prices and the consumer would be the winner, but no... NV didn't drop.

I see a lot of people who hyped this card trying damage control...

Here's what most of them said:
    1. HD2900XT was only meant to compete with GTS

    2. image quality is better

3. immature drivers

I will say that for $399 it's a helluva card, and from what I have seen it's a good 3DMark card... I'd say it scores about the same as a GTX, i.e. 2900 XT @ 850/2000 vs GTX @ 660/2100, but in games... the GTX wins hands down.
    Last edited by theteamaqua; 05-14-2007 at 11:31 PM.
    E6600 @ 3.6
    IN9 32x MAX
    EVGA 8800Ultra
    750W

  15. #1290
    Xtreme Addict
    Join Date
    Aug 2006
    Location
    The Netherlands, Friesland
    Posts
    2,244
    Quote Originally Posted by xlink View Post
PLEASE PLEASE PLEASE PLEASE PLEASE PLEASE PLEASE PLEASE PLEASE PLEASE (and so on, many times over)

    be true
    http://www.computerbase.de/news/trei...ibervergleich/
    There's the link
    >i5-3570K
    >Asrock Z77E-ITX Wifi
    >Asus GTX 670 Mini
    >Cooltek Coolcube Black
    >CM Silent Pro M700
    >Crucial M4 128Gb Msata
    >Cooler Master Seidon 120M
    Hell yes its a mini-ITX gaming rig!

  16. #1291
    XS News
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,010
That's some sick improvement, can anyone get that driver?
    But with the loss of quality =(
    Last edited by Ubermann; 05-14-2007 at 11:46 PM.
    Everything extra is bad!

  17. #1292
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
It's easy to enhance performance if you can sacrifice other things...
    Crunching for Comrades and the Common good of the People.

  18. #1293
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
All modern tests have an IQ section, so cheating on IQ is the easiest way to get caught. It may be a demo bug or driver issues, but for some it seems easier to call it cheating.
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB

  19. #1294
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Caparroz View Post
I think a lot of people here are in denial, and I kind of understand that. Many of us just waited and waited for something that is, let's face it, a major disappointment. The hype over R600 was downright insane. Everybody was like "R600 will be at least 50% faster than the 8800 GTX, and with much better IQ on top of that".

A few weeks ago, when it became evident that the HD2900XT wouldn't be the G80 killer that everyone took for granted, some argued that future drivers would solve that. More in-depth reviews, like the ones at Anand and Beyond3D, clearly state that the R600 architecture is extremely software-dependent:



    But what people don't realize is that ATI had a billion years to develop proper drivers. It doesn't matter what ATI says. The R600 is very late to the game, unfortunately. That "magic driver" is to be taken with a HUGE grain of salt. The IQ loss is very evident.

Like I said in another 2900 XT thread:

"the HD2900XT is such a disaster that instead of bringing Nvidia's prices down, it made 'em go up. Yesterday the cheapest 8800 GTS 320 was $250 and now it's $295 at Newegg:

    http://www.newegg.com/Product/Produc...613&name=320MB

Before anyone calls me a fanboy, let me put this out there: I have owned only ATI cards since the R300. Not because I like ATI and hate Nvidia, but just because they were better and faster. I too hoped that R600 would be a beast of a GPU but, unfortunately, it isn't. In my book it is a major flop, and I truly hope that AMD/ATI get their act together and put up some serious competition to both Nvidia and Intel.
    Quote Originally Posted by Caparroz View Post
    ^^I was talking about the Call of Juarez test. The 2900xt fireplace pic clearly shows blurred textures. That bench used the "magic" 8.37.4.2 driver, didn't it?
Would you give it a rest already? That's the most broken-record statement of the year. Do you need the approval of others to validate your own opinion of the R600 before mature drivers are released? Also, your statement is purely one from the Nvidia fan camp. Besides, what IQ test? Do you have a link?
    Last edited by Eastcoasthandle; 05-15-2007 at 03:38 AM.
    [SIGPIC][/SIGPIC]

  20. #1295
    Registered User
    Join Date
    Feb 2006
    Posts
    98
    Quote Originally Posted by Eastcoasthandle View Post
Would you give it a rest already? That's the most broken-record statement of the year. Do you need the approval of others to validate your own opinion of the R600 before mature drivers are released? Also, your statement is purely one from the Nvidia fan camp. Besides, what IQ test? Do you have a link?
    I have no clue what you are talking about, so I assume you mean the Call Of Juarez test.

    Non-mouse over:
    http://tertsi.users.daug.net/temp/R600/iq/coj_ati.jpg
    http://tertsi.users.daug.net/temp/R600/iq/coj_nv.jpg

    Mouse-over:
    http://tertsi.users.daug.net/temp/R6...ia_vs_ati.html

    Article:
    http://www.legitreviews.com/article/504/2/

    The "mature drivers" thing is getting annoying. How do you define drivers to be mature? When ATI releases an official version? Why am I even asking; no one is going to agree on a definition of mature....

  21. #1296
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by Noobie View Post
    I have no clue what you are talking about, so I assume you mean the Call Of Juarez test.

    Non-mouse over:
    http://tertsi.users.daug.net/temp/R600/iq/coj_ati.jpg
    http://tertsi.users.daug.net/temp/R600/iq/coj_nv.jpg

    Mouse-over:
    http://tertsi.users.daug.net/temp/R6...ia_vs_ati.html

    Article:
    http://www.legitreviews.com/article/504/2/

    The "mature drivers" thing is getting annoying. How do you define drivers to be mature? When ATI releases an official version? Why am I even asking; no one is going to agree on a definition of mature....
    Don't start that crap with me. Current drivers don't show the potential of the card yet, bottom line. The term "mature" is not relative. It has a very specific meaning. Therefore, know what you are talking about before you start blathering.
    As for the photos:
    -He clearly states he has pre-alpha (but doesn't mention which version). Again, drivers are NOT MATURE!
-The image test looks like the mipmap detail level was set to Quality and not High Quality. I find myself having to change this with each driver update.
    -He clearly states:
    When it comes to image quality we installed both drivers and without adjusting any of the settings in the control panel jumped right into the benchmark to see how they did.
But he never shows a pic of those settings and never explains what those settings were at the time he provided those photos. Not only do we not know what those adjustments were between R600/G80, we have no control example to bench against at other mipmap/AA/AF settings.

In all, this is a pure example of FUD through photos. There is no supporting documentation (regarding those photos) to explain how he arrived at that conclusion. People taking these photos at face value without asking questions first is the real problem here. Why shouldn't you ask questions? It's a freaking 3-page (short) review of the HD 2900 XT.
    Last edited by Eastcoasthandle; 05-15-2007 at 06:03 AM.
    [SIGPIC][/SIGPIC]

  22. #1297
    Registered User
    Join Date
    Jul 2006
    Posts
    13
I'll wait for overclocked partner cards and Catalyst 8.38 before making a judgement. I'm thinking of buying the HD2900XT; it has about the same performance as the GTS and still has headroom for driver improvement, so I guess it should be a better card to buy than the GTS. This card shows potential: it is not as good as the GTX, but it could come close with driver improvements (I hope).
I have only had Nvidia cards (GeForce 2 MX400, GeForce 5600XT and GeForce 7300LE), but now I'm thinking of switching to AMD.
I waited 5 or 6 months for this card and hoped for a lot more, but it is not that bad for the money it costs.
As for the reviews, some sound fishy to me. I don't know what to believe anymore, tbh.

  23. #1298
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Posts
    881
Any reviews of the HD2600XT? How does it compare to the X1950 Pro?

  24. #1299
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by awdrifter View Post
Any reviews of the HD2600XT? How does it compare to the X1950 Pro?
Don't expect anything beyond roughly X1950 Pro performance plus DX10. The 8600 GTS didn't show wonders either. And it will have the same 128-bit bus as the 8600 GTS, while the X1950 Pro got 256-bit.
    Crunching for Comrades and the Common good of the People.

  25. #1300
    Xtreme Addict
    Join Date
    Nov 2002
    Location
    Houston
    Posts
    1,123
    Quote Originally Posted by Eastcoasthandle View Post
    email w0mbat

It's the 8.37.4.2 driver you want. There should be an 8.39 also, but I am not sure if it's been released yet.
The 8.37.4.2 drivers are alpha and have numerous issues. They do give a glimpse of how ATI/AMD will optimize this card. The 8.38b2 drivers are based on them and offer further refinements, but CrossFire/OpenGL is not working right under Vista, AVIVO is not working correctly with Blu-ray/HD-DVD, and IQ is not up to par in several areas. Unless you need them to show off some nice 3DMark scores, I would wait on the next release, which is scheduled for 5/23. The 8.39s are in alpha testing now.
