
Thread: Official HD 2900 Discussion Thread

  1. #1301
    XS News
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,010
    The card was delayed because they wanted to release all the cards at the same time.
    They were spitting out 65nm cards and there was no problem whatsoever with the silicon or software - that's what AMD said.

    So we waited all this time and they only launched this card anyway. People kept saying the wait would be worth it because we'd get something extra that everyone wanted. WTF was that? HL2?

    This sucks so much! Or is it AMD that sucks? I'm switching sides. I'm now an official Intel and Nvidia fanboy idiot.
    Last edited by Ubermann; 05-15-2007 at 05:19 AM.
    Everything extra is bad!

  2. #1302
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by Ubermann View Post
    The card was delayed because they wanted to release all the cards at the same time.
    They were spitting out 65nm cards and there was no problem whatsoever with the silicon or software - that's what AMD said.

    So they waited all this time and only launched this card anyway...
    This sucks so much!
    You forgot the 100 million GPUs this year with uber performance, according to theinq. :P
    Crunching for Comrades and the Common good of the People.

  3. #1303
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    I received this today:

    [images]

    Tests tomorrow.

    Regards
    Last edited by mascaras; 05-15-2007 at 05:44 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  4. #1304
    Xtreme Addict
    Join Date
    Feb 2004
    Posts
    1,176
    No one in their right mind would upgrade until there is real value in having DX10, let alone a quad scenario anyway - wait until September.

  5. #1305
    Xtreme Member
    Join Date
    May 2004
    Posts
    187
    No, wait until 2008; no, fall 2009; or better yet, 2010... Everyone needs to stop playing the waiting game. THERE IS ALWAYS SOMETHING BETTER A FEW MONTHS FROM NOW IN THIS HOBBY OF OURS!!!
    e6320 @ 3.2Ghz 7x458 1.38v
    DS3 rev. 1.6
    Diamond 2900 XT
    G. Skill 2X1GB
    250GB WD
    X-530 5.1 Speakers
    NEC 16x Dual Layer DVD-RW
    Turbo-Cool 510w Deluxe
    Vista 64

  6. #1306
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by gunit View Post
    No, wait until 2008; no, fall 2009; or better yet, 2010... Everyone needs to stop playing the waiting game. THERE IS ALWAYS SOMETHING BETTER A FEW MONTHS FROM NOW IN THIS HOBBY OF OURS!!!
    There has already been something better for half a year now.

    I think we have to wait for the 2950/2650 before it's "worth" it. Just like the 8600 isn't really worth it right now either.
    Crunching for Comrades and the Common good of the People.

  7. #1307
    Registered User
    Join Date
    Feb 2006
    Posts
    98
    Quote Originally Posted by Shintai View Post
    Don't expect anything but around X1950 Pro performance + DX10. The 8600GTS didn't show wonders either. And it will have the same 128-bit bus as the 8600GTS, while the X1950 Pro got 256-bit.
    If the 2600 uses a ring bus similar to the 2900's, has 4 memory chips rather than 8, and still uses one ring stop per 2 memory chips, it would be more efficient than the 2900, making up for the large drop in bus width.

    I wouldn't hold your breath on it, though.

    There is another difference between the 8600-8800 and the 2600-2900 relationship. The 8800 has 6 shader clusters of 16 shaders each (96 shaders). The 8600 has 2 shader clusters of 16 shaders each (32 shaders). The 2900, however, has 4 shader clusters of 16 shaders each (64 shaders), while the 2600 has 24 shaders in total - apparently 2 shader clusters of 12 shaders each, or possibly even 4 clusters of 6 each. The reduction in shaders per cluster has an impact on performance; whether positive or negative I don't know. It could also be so small it's unnoticeable...
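    To keep the cluster math straight, here's a quick sketch (Python; the counts are taken from the configurations above, and the two 2600 splits are only guesses):

    Code:
    # Shader cluster arithmetic for the configurations discussed above.
    # The 2600 (RV630) split is speculative: either 2 clusters of 12 or 4 of 6.
    configs = {
        "8800 (G80)":   (6, 16),   # 96 shaders
        "8600 (G84)":   (2, 16),   # 32 shaders
        "2900 (R600)":  (4, 16),   # 64 shaders
        "2600 guess A": (2, 12),   # 24 shaders
        "2600 guess B": (4, 6),    # 24 shaders
    }
    for name, (clusters, per_cluster) in configs.items():
        print(f"{name:14s} {clusters} x {per_cluster:2d} = {clusters * per_cluster} shaders")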
    Last edited by Noobie; 05-15-2007 at 05:52 AM.

  8. #1308
    Xtreme Addict
    Join Date
    Nov 2006
    Posts
    1,402
    Quote Originally Posted by gunit View Post
    No, wait until 2008; no, fall 2009; or better yet, 2010... Everyone needs to stop playing the waiting game. THERE IS ALWAYS SOMETHING BETTER A FEW MONTHS FROM NOW IN THIS HOBBY OF OURS!!!
    Waiting is not ridiculous.

    A lot of guys here already have big cards. Why should we get a new one?

    Why buy a G80 or an R600? The graphics power is low, and DX10 is still far off.

    If, like me, you already score over 6k in 3DMark06 every day, I don't think it's that useful.

  9. #1309
    Xtreme Enthusiast
    Join Date
    Jun 2004
    Location
    Olot (Girona)
    Posts
    693
    I got my own numbers...

    QX6700 @ 3.4Ghz
    Asus P5K Deluxe
    2x1GB Gskill PC8000HZ

    Forceware 160.03 for NVIDIA card
    6.87.4.2 for ATI Card


    Asus EAH2900XT

    3DMark 2001: 55524
    3DMark 2003: 37779
    3DMark 2005: 19830
    3DMark 2006: 12041

    AA16X here means 8x MSAA with the Wide Tent CFAA filter, for 16 samples total.

    FEAR
    1024x768 AA4X, AF 16X Tot max: 92 FPS
    1024x768 AA16X, AF16X Tot max: 49 FPS

    Company of Heroes
    1680x1050 All max, no AA: 91 FPS
    1680x1050 All max, AA16X: 55 FPS

    Supreme Commander
    1680x1050 high quality, AA4X AF16X: 45 FPS
    1680x1050 high quality, AA16X AF16X: 26 FPS

    Lost Coast
    1680x1050, AA8X AF16X HDR FULL: 76 FPS
    1680x1050, AA16X AF16X HDR FULL: 32 FPS

    Episode One
    1680x1050, AA4X AF16X HDR FULL: 128 FPS
    1680x1050, AA16X AF16X HDR FULL: 39 FPS

    Oblivion Indoor
    1680x1050, AA4X AF16X HDR FULL IQ MAX: 54 FPS
    1680x1050, AA16X AF16X HDR FULL IQ MAX: 39 FPS

    Far Cry 1.4beta (Regulator)
    1680x1050, AA4X AF16X IQ MAX: 77 FPS
    1680x1050, AA16X AF16X IQ MAX: 28 FPS

    GRAW
    1680x1050, AA4X AF16X IQ MAX: 64 FPS
    1680x1050, AA16X AF16X IQ MAX: 64 FPS (I guess AA16X wasn't applied correctly)

    STALKER
    1680x1050, AA MAX (ingame) AF16X IQ MAX HDR FULL: 28 FPS

    Asus 8800 Ultra

    3DMark 2001: 58734
    3DMark 2003: 43949
    3DMark 2005: 19369
    3DMark 2006: 13884

    FEAR
    1024x768 AA4X, AF 16X Tot max: 156 FPS
    1024x768 AA16X, AF16X Tot max: 145 FPS

    Company of Heroes
    1680x1050 All max, no AA: 113 FPS
    1680x1050 All max, AA16X: 86 FPS

    Supreme Commander
    1680x1050 high quality, AA4X AF16X: 71 FPS
    1680x1050 high quality, AA16X AF16X: 62 FPS

    Lost Coast
    1680x1050, AA8X AF16X HDR FULL: 127 FPS
    1680x1050, AA16X AF16X HDR FULL: 115 FPS

    Episode One
    1680x1050, AA4X AF16X HDR FULL: 178 FPS
    1680x1050, AA16X AF16X HDR FULL: 128 FPS

    Oblivion Indoor
    1680x1050, AA4X AF16X HDR FULL IQ MAX: 113 FPS
    1680x1050, AA16X AF16X HDR FULL IQ MAX: 102 FPS

    Far Cry 1.4beta (Regulator)
    1680x1050, AA4X AF16X IQ MAX: 130 FPS
    1680x1050, AA16X AF16X IQ MAX: 112 FPS

    GRAW
    1680x1050, AA4X AF16X IQ MAX: 102 FPS
    1680x1050, AA16X AF16X IQ MAX: 95 FPS

    STALKER
    1680x1050, AA MAX (ingame) AF16X IQ MAX HDR FULL: 54 FPS

    Power Consumption

    GTS 320: 244W idle 437W full
    GTX: 226W idle 447W full
    Ultra: 259W idle 484W full
    HD2900XT: 225W idle 502W full
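    These look like wall-socket numbers for the whole system, so the load-minus-idle deltas below are only a rough indication of what each card adds under 3D load (PSU efficiency and CPU load differences are ignored):

    Code:
    # System power draw from the figures above (watts at the wall, presumably).
    systems = {
        "GTS 320":  (244, 437),
        "GTX":      (226, 447),
        "Ultra":    (259, 484),
        "HD2900XT": (225, 502),
    }
    # Load-minus-idle delta: a crude proxy for the extra draw under 3D load.
    for name, (idle, load) in systems.items():
        print(f"{name:9s} idle {idle}W  load {load}W  delta {load - idle}W")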
    Last edited by krampak; 05-15-2007 at 06:16 AM.
    || Core 2 Quad QX6850 ES @ 3.5Ghz 1.35V || Thermalright Ultra 120 || Asus P5K3 Deluxe || Gskill F3-12800CL7D-2GBHZ
    || XFX 8800GTX || Dell 2005FPW 20" || Ultra X-Pro 750W LE || 3 x WD 320GB SD + 1 x Hitachi 500GB

  10. #1310
    Xtreme Member
    Join Date
    Dec 2006
    Posts
    404
    Why did you put the 2900XT up against an Ultra? We already know it can NEVER compete with an Ultra...
    Quote Originally Posted by verndewd View Post
    Then we devise an nda breaking method that reroutes the broken nda from china and points to a pig name zhiang in a mud pit in hubei farmlands. The pig would have to have no owner.

    This can be done.
    SPECS:TT Kandalf LCS Case,Asus P5K Deluxe mobo,Intel Q600 CPU,XFX 8800Ultra,Samsung 500Gb HD,Gskill PC6400 ghz memory,Nec DVD-Writer,Creative Xfi Xtreme Music, Corsair HX620Logitech:G15 keyboard,MX518 mouse and Z-5450 Speakers

  11. #1311
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    @krampak, the new Wide Tent filter gives the same image quality as traditional 8x AA, and you got fewer FPS with the Wide Tent filter. Try running the same games with traditional 8x AA and see if the FPS go up.....

    BTW: Windows Vista or XP ???


    regards
    Last edited by mascaras; 05-15-2007 at 06:39 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  12. #1312
    Xtreme Cruncher
    Join Date
    Nov 2002
    Location
    Belgium
    Posts
    605
    Krampak, where do you get these numbers from?
    Is that from your own testing?


    Main rig 1: Corsair Carbide 400R 4x120mm Papst 4412GL - 1x120mm Noctua NF-12P -!- PC Power&Cooling Silencer MK III 750W Semi-Passive PSU -!- Gigabyte Z97X-UD5H -!- Intel i7 4790K -!- Swiftech H220 pull 2x Papst 4412 F/2GP -!- 4x4gb Crucial Ballistix Tactical 1866Mhz CAS9 1.5V (D9PFJ) -!- 1Tb Samsung 840 EVO SSD -!- AMD RX 480 to come -!- Windows 10 pro x64 -!- Samsung S27A850D 27" + Samsung 2443BW 24" -!- Sennheiser HD590 -!- Logitech G19 -!- Microsoft Sidewinder Mouse -!- Fragpedal -!- Eaton Ellipse MAX 1500 UPS .





  13. #1313
    Xtreme Addict
    Join Date
    Nov 2005
    Posts
    1,084
    AMD Cautioned Reviewers On DX10 Lost Planet Benchmark

    Tomorrow Nvidia is expected to host new DirectX 10 content on nZone.com in the form of a “Lost Planet” benchmark. Before you begin testing, there are a few points I want to convey about “Lost Planet”. “Lost Planet” is an Nvidia-sponsored title, and one that Nvidia has had a chance to look at and optimize their drivers for. The developer has not made us aware of this new benchmark, and as such the ATI Radeon driver team has not had the opportunity explore how the benchmark uses our hardware and optimize in a similar fashion. Over the next little while AMD will be looking at this, but in the meantime, please note that whatever performance you see will not be reflective of what gamers will experience in the final build of the game.
    http://www.vr-zone.com/

  14. #1314
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    304
    Quote Originally Posted by Ubermann View Post
    The card was delayed because they wanted to release all the cards at the same time.
    I thought the delay was so they could launch the entire family at once. BTW, what happened to "we'll launch 10 DX10 GPUs"?

  15. #1315
    Registered User
    Join Date
    Feb 2006
    Posts
    98
    Quote Originally Posted by krampak View Post
    I got my own numbers...

    QX6700 @ 3.4Ghz
    Asus P5K Deluxe
    2x1GB Gskill PC8000HZ

    Forceware 160.03 for NVIDIA card
    6.87.4.2 for ATI Card
    That power consumption is insane. Can you stop OC'ing and test the power usage then?

    GRAW uses the same (type of) engine as STALKER. If you're using the "Full Dynamic Lighting" option (which I can only assume you are), then any AA settings you have in your control panel have no effect. The in-game settings only provide marginal quality improvement, so you should also run both games with AA off.
    Last edited by Noobie; 05-15-2007 at 07:51 AM.

  16. #1316
    Xtreme Guru
    Join Date
    Jan 2005
    Location
    Tre, Suomi Finland
    Posts
    3,858
    There are more software bottlenecks for the R600 than just the drivers ATi writes. The shaders on R600 are so different that the current DirectX can't properly utilize them.

    I reckon we'll see noticeable gains as soon as Microsoft updates DirectX and, more importantly, optimises its HLSL (high-level shader language) compiler for the shaders found in R600.
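    To illustrate why the compiler matters so much here (a toy model only, nothing like ATi's real scheduler): R600's shader units are 5-wide, so independent scalar operations have to be packed into 5-slot bundles, and any slot the compiler can't fill is throughput left on the table.

    Code:
    # Toy VLIW packing model (illustrative only, not ATi's actual compiler).
    # R600-style units issue bundles of up to 5 independent scalar ops;
    # dependencies the compiler can't schedule around leave slots empty.
    def pack(ops, width=5):
        """Greedily pack (name, dependencies) pairs into bundles."""
        bundles, done, remaining = [], set(), list(ops)
        while remaining:
            bundle, deferred = [], []
            for name, deps in remaining:
                if len(bundle) < width and all(d in done for d in deps):
                    bundle.append(name)
                else:
                    deferred.append((name, deps))
            bundles.append(bundle)
            done.update(bundle)
            remaining = deferred
        return bundles

    chain    = [("a", []), ("b", ["a"]), ("c", ["b"]), ("d", ["c"]), ("e", ["d"])]
    parallel = [("a", []), ("b", []), ("c", []), ("d", []), ("e", [])]
    for label, ops in (("dependent chain", chain), ("independent ops", parallel)):
        bundles = pack(ops)
        used = sum(len(b) for b in bundles)
        print(f"{label}: {len(bundles)} bundle(s), {used / (len(bundles) * 5):.0%} of slots used")

    The better the compiler gets at finding independent work to fill those slots, the closer the hardware runs to its paper specs.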
    You were not supposed to see this.

  17. #1317
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Guys... ATi's R600 lacks the hardware for AA resolve... Sadly, this means AA is likely to always be slow on this part, and no driver will fix that, because it has to use shader power to do the job.

    Quote Originally Posted by beyond3d
    Then we come to the ROP hardware, designed for high performance AA with high precision surface formats, at high resolution, with an increase in the basic MSAA ability to 8x. It's here that we see the lustre start to peel away slightly in terms of IQ and performance, with no fast hardware resolve for tiles that aren't fully compressed, and a first line of custom filters that can have a propensity to blur more than not. Edge detect is honestly sweet, but the CFAA package feels like something tacked on recently to paper over the cracks, rather than something forward-looking (we'll end up at the point of fully-programmable MSAA one day in all GPUs) to pair with speedy hardware resolve and the usual base filters. AMD didn't move the game on in terms of absolute image quality when texture filtering, either. They're no longer leaders in the field of IQ any more, overtaken by NVIDIA's GeForce 8-series hardware.
    http://www.beyond3d.com/content/reviews/16/16

    Quote Originally Posted by anandtech
    First, they refuse to call a spade a spade: this part was absolutely delayed, and it works better to admit this rather than making excuses. Forcing MSAA resolve to run on the shader hardware is less than desirable and degrades both pixel throughput and shader horsepower as opposed to implementing dedicated resolve hardware in the render back ends. Not being able to follow through with high end hardware will hurt in more than just in lost margins. The thirst for wattage that the R600 displays is not what we'd like to see from an architecture that is supposed to be about efficiency. Finally, attempting to extract a high instruction level parallelism using a VLIW design when something much simpler could exploit the huge amount of thread level parallelism inherent in graphics was not the right move.
    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=31

    Due to this, it's unlikely the R600 will ever catch up with AA involved.
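    For anyone wondering what "resolving AA on the shaders" actually means: the MSAA resolve step is basically just averaging the N stored samples for each pixel. Dedicated ROP hardware does that essentially for free; doing it in the shader core costs sample reads and ALU work for every pixel on screen. A minimal sketch of the box-filter case (illustrative only):

    Code:
    # Illustrative box-filter MSAA resolve, the kind of per-pixel work R600
    # pushes onto its shader core instead of dedicated ROP hardware.
    def resolve_pixel(samples):
        """Average N MSAA colour samples (RGB tuples) into one output pixel."""
        n = len(samples)
        return tuple(sum(s[i] for s in samples) / n for i in range(3))

    # An edge pixel with 4 samples: half covered by red geometry, half background.
    print(resolve_pixel([(255, 0, 0), (255, 0, 0), (0, 0, 0), (0, 0, 0)]))

    # At 1680x1050 with 8x AA that is ~1.8 million averages per frame,
    # touching ~14 million samples - work a fixed-function resolve unit hides.
    print(1680 * 1050, "pixels,", 1680 * 1050 * 8, "sample reads per resolve")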
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  18. #1318
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by v_rr View Post
    AMD Cautioned Reviewers On DX10 Lost Planet Benchmark


    http://www.vr-zone.com/
    Funny how AMD are happy to give out an outdated version of CoJ that has bugs specific to nVidia hardware, then cry foul over Lost Planet. Pot, kettle and black, anyone?

  19. #1319
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Posts
    516
    Quote Originally Posted by XeRo View Post
    I thought the delay was so they could launch the entire family at once. BTW, what happened to "we'll launch 10 DX10 GPUs"?
    Did people seriously believe this excuse? A more transparent piece of BS is hard for me to imagine.

  20. #1320
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Location
    Nordschleife!
    Posts
    705
    Quote Originally Posted by largon View Post
    There are more software bottlenecks for the R600 than just the drivers ATi writes. The shaders on R600 are so different that the current DirectX can't properly utilize them.

    I reckon we'll see noticeable gains as soon as Microsoft updates DirectX and, more importantly, optimises its HLSL (high-level shader language) compiler for the shaders found in R600.
    Finally, someone with some sense who actually read the architecture part of the reviews. The erratic performance of the 2900XT across several different game benchmarks is to be expected. It's simply the nature of the R600 architecture.

    Quote Originally Posted by Periander6
    Did people seriously believe this excuse? A more transparent piece of BS is hard for me to imagine.
    People will believe whatever they're willing to believe, and it takes very hard work to change that. As an advertising professional, I'm thankful for that.
    Last edited by Caparroz; 05-15-2007 at 09:10 AM.
    Murray Walker: "And there are flames coming from the back of Prost's McLaren as he enters the Swimming Pool."

    James Hunt: "Well, that should put them out then."

  21. #1321
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    304
    Quote Originally Posted by Periander6 View Post
    Did people seriously believe this excuse? A more transparent piece of BS is hard for me to imagine.
    What can I say? I'm just very gullible.

  22. #1322
    Xtreme Addict
    Join Date
    Nov 2005
    Location
    Where the Cheese Heads Reside
    Posts
    2,173
    Quote Originally Posted by largon View Post
    There are more software bottlenecks for the R600 than just the drivers ATi writes. The shaders on R600 are so different that the current DirectX can't properly utilize them.

    I reckon we'll see noticeable gains as soon as Microsoft updates DirectX and, more importantly, optimises its HLSL (high-level shader language) compiler for the shaders found in R600.
    Totally forgot about the HLSL. Good catch. I know I read some things about it, but it sounded like it was already implemented; I guess they were more or less talking about when it gets implemented.
    -=The Gamer=-
    MSI Z68A-GD65 (G3) | i5 2500k @ 4.5Ghz | 1.3875V | 28C Idle / 65C Load (LinX)
    8Gig G.Skill Ripjaw PC3-12800 9-9-9-24 @ 1600Mhz w/ 1.5V | TR Ultra eXtreme 120 w/ 2 Fans
    Sapphire 7950 VaporX 1150/1500 w/ 1.2V/1.5V | 32C Idle / 64C Load | 2x 128Gig Crucial M4 SSD's
    BitFenix Shinobi Window Case | SilverStone DA750 | Dell 2405FPW 24" Screen
    -=The Server=-
    Synology DS1511+ | Dual Core 1.8Ghz CPU | 30C Idle / 38C Load
    3 Gig PC2-6400 | 3x Samsung F4 2TB Raid5 | 2x Samsung F4 2TB
    Heat

  23. #1323
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Location
    Nordschleife!
    Posts
    705
    Quote Originally Posted by deathman20 View Post
    Totally forgot about the HLSL. Good catch. I know I read some things about it, but it sounded like it was already implemented; I guess they were more or less talking about when it gets implemented.
    HLSL is nothing new. It was introduced back with DX9, IIRC. The only difference is that for DX10, all shaders must be written in HLSL.
    Murray Walker: "And there are flames coming from the back of Prost's McLaren as he enters the Swimming Pool."

    James Hunt: "Well, that should put them out then."

  24. #1324
    Xtreme Addict
    Join Date
    Sep 2006
    Location
    Stamford, UK
    Posts
    1,336
    Hmmm, so all that bandwidth is wasted since the card is poor at doing AA anyway?! No way...
    FX8350 @ 4.0Ghz | 32GB @ DDR3-1200 4-4-4-12 | Asus 990FXA @ 1400Mhz | AMD HD5870 Eyefinity | XFX750W | 6 x 128GB Sandisk Extreme RAID0 @ Aerca 1882ix with 4GB DRAM
    eXceed TJ07 worklog/build

  25. #1325
    YouTube Addict
    Join Date
    Aug 2005
    Location
    Klaatu barada nikto
    Posts
    17,574
    Hardware processing has always been and will always be the fastest; however, it is also by far the most limited. Hence we have software controlling our processors instead of the computer being hard-coded.
    Fast computers breed slow, lazy programmers
    The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
    http://www.lighterra.com/papers/modernmicroprocessors/
    Modern Ram, makes an old overclocker miss BH-5 and the fun it was
