Results 1,376 to 1,400 of 1518

Thread: Official HD 2900 Discussion Thread

  1. #1376
    Xtreme Enthusiast
    Join Date
    Jul 2006
    Location
    the Netherlands
    Posts
    558
    Quote Originally Posted by OBR View Post
    Go to hell with stupid 3DMark benchmarks... Yes, ATi has a good score, but in games it is totally crappy... and in picture quality... a loser...

    PS: Don't flame about immature drivers... it is not true; what's immature is the design of the chip, not the drivers...
    Bullsh*t.
    Nobody's flaming about immature drivers. They're the only reason it performs like utter crap in one game while being a very good performer in other games.
    You need to stop flaming that this card sucks. If it still performs this badly when NEW DRIVERS come out, sure, you can say it sucks. But as long as there are no proper drivers, stop flaming.

  2. #1377
    Registered User
    Join Date
    Feb 2006
    Posts
    98
    Let's take a Core 2 Solo 4 GHz.
    Let's take a Core 2 Duo 3 GHz.
    There are some games where the Solo will win.
    There are some games where the Duo will win.

    Let's take a Core 2 Solo 4 GHz.
    Let's take a Core 2 Quad 2 GHz.
    There are some games where the Solo will crush the Quad.
    There are some games where the Quad will crush the Solo.

  3. #1378
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by OBR View Post
    Go to hell with stupid 3DMark benchmarks... Yes, ATi has a good score, but in games it is totally crappy... and in picture quality... a loser...

    PS: Don't flame about immature drivers... it is not true; what's immature is the design of the chip, not the drivers...
    No need to get this emo over a video card that you are not interested in. For one, your post is crap. Second, you have no proof of this until better drivers are developed. The G80 went through similar growing pains, so give it a break already!
    Last edited by Eastcoasthandle; 05-16-2007 at 07:27 AM.

  4. #1379
    Love will tear us apart
    Join Date
    Mar 2006
    Location
    wrigleyville.chi.il.us
    Posts
    2,350

    Hrm

    From a post in the Bluesnews thread re. Lost Planet demo:

    "This is already the second DX10 demo where nvidia owners have at least 3 times the framerate and better image quality than the HD2900XT fellows (CoJ was the first DX10 demo). I'm still waiting for the DX10 demo where ATI excels over nvidia like predicted by many ATI fans."

    Um, ok. Can anyone confirm or deny this? No direct response to the guy's claim nor any other info is provided in that thread...
    Dark Star IV
    i5 3570K | Asrock Z77 Extreme4 | 8GB Samsung 30nm | GTX 670 FTW 4GB + XSPC Razer 680 | 128GB Samsung 830
    Apogee HD White | 140MM UT60/120MM UT45/120MM ST30 | XSPC D5 V2

  5. #1380
    Xtreme Member
    Join Date
    Sep 2006
    Posts
    304

  6. #1381
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by Eastcoasthandle View Post
    No need to get this emo over a video card that you are not interested in. For one, your post is crap. Second, you have no proof of this until better drivers are developed. The G80 went through similar growing pains, so give it a break already!
    Neither do you, ECH, but I HAVE the card, and I say that most of the comments made by people are not true. The performance in games IS good, not bad, the QUALITY is GOOD, not bad, and each and every complaint made earlier stems either from users who were testing the cards and ran into some issues, or from nV's FUD team. I could not be happier with my cards.

    Reviews on the web are not accurate...

  7. #1382
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by WrigleyVillain View Post
    From a post in the Bluesnews thread re. Lost Planet demo:

    "This is already the second DX10 demo where nvidia owners have at least 3 times the framerate and better image quality than the HD2900XT fellows (CoJ was the first DX10 demo). I'm still waiting for the DX10 demo where ATI excels over nvidia like predicted by many ATI fans."

    Um, ok. Can anyone confirm or deny this? No direct response to the guy's claim nor any other info is provided in that thread...
    It cannot be confirmed due to the generalization of the statement. For one, this is a push to downplay the HD 2900XT because they know sooner or later the drivers (released the week of 5-22-07) will mature, improving the HD overall. Although I still assume that another driver release may be needed after 7.5 before we start seeing complete maturity and better results. Second, there is no indication of which G80 outperforms the HD. I've seen benchmarks that went both ways even with immature drivers. Third, there are no photos that suggest that CoJ/LP have better image quality on the G80 than on the HD, making this statement false.
    Last edited by Eastcoasthandle; 05-16-2007 at 07:46 AM.

  8. #1383
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by cadaveca View Post
    Neither do you, ECH, but I HAVE the card, and I say that most of the comments made by people are not true. The performance in games IS good, not bad, the QUALITY is GOOD, not bad, and each and every complaint made earlier either stems from users who were testing the cards, and were met with some issues, or from nV's FUD team. I could not be more happy with my cards.

    Reviews on the web are not accurate...
    Neither do I what?

  9. #1384
    XS News
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,010
    You're saying all those reviews are bad/fake/wrong because you got the card?
    I don't think anyone is saying the card is bad or a waste of money, but compared to the G80 it's not as good as everyone expected.
    And this is their high end... when the 65nm version of this appears, that will be their next high end.

    But think of a G80 shrunk to 65nm... then ATI loses again.
    So they need a new GPU already.
    Last edited by Ubermann; 05-16-2007 at 07:38 AM.
    Everything extra is bad!

  10. #1385
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    I think some people got their hopes up too high for the drivers... it's like a constant push: if not the next driver, then the driver after that for sure!!! It's a continual push in denial. Let's face it, the R600 is a bummer that's getting replaced by the R650 in 2-3 months. Even AMD stated that in an interview. That should also be a big hint as to why there is no GT/XL/XTX version. It's simply a temporary "hotfix".
    Crunching for Comrades and the Common good of the People.

  11. #1386
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by Eastcoasthandle View Post
    Neither do I what?
    Have proof that any of your comments may be true.

    Look, in a lot of situations, X1950 CrossFire is faster than a single G80. The HD2900XT is faster than X1950 CrossFire, so some conclusions can be made...

    However, this GPU is superscalar. This means the standard way of doing things no longer applies. There are instances where a driver CANNOT HELP AT ALL, especially when it comes to DX9. DX10 can use load balancing, so I'm not so "unconfident" in how things will turn out.

    And yes, Ubermann, when reviews say "colour is washed out", etc., and I do not see the same problem, then I call shens.

    Fact of the matter is that I have been pretty much accurate about this card from the get-go, including the bit about us getting only the UFO at first, about the GPU being superscalar, about A LOT of things. This, to me, means that things are pretty "cut and dried" here, and everything else, all the complaints and worries, is FUD.

    And no, I do not care about comparisons with the G80. When DX10 comes out in full force, then I might... but if one card is not fast enough, I'll simply toss in another, thanks to the cost. I bought a G80 when it first came out too, and the horrible driver bugs had me sell them right away. Until we have some applications that can properly measure the performance difference between G80 and R600 (in all types of rendering), no real comparisons can be made, because thanks to this GPU's structure, drivers play a far more important role in DX9 than ANYONE thinks.
    Last edited by cadaveca; 05-16-2007 at 07:48 AM.

  12. #1387
    Banned
    Join Date
    May 2006
    Posts
    458
    It clearly beats the 8800GTS in price/performance and... that's it. The high-high-end belongs to NVIDIA.

  13. #1388
    XS News
    Join Date
    Aug 2004
    Location
    Sweden
    Posts
    2,010
    And that makes you wonder how delayed the R650 is going to be... omg, I give up =)
    Everything extra is bad!

  14. #1389
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by cadaveca View Post
    Have proof that any of your comments may be true.

    Look, in a lot of situations, X1950 CrossFire is faster than a single G80. The HD2900XT is faster than X1950 CrossFire, so some conclusions can be made...

    However, this GPU is superscalar. This means the standard way of doing things no longer applies. There are instances where a driver CANNOT HELP AT ALL, especially when it comes to DX9. DX10 can use load balancing, so I'm not so "unconfident" in how things will turn out.

    And yes, Ubermann, when reviews say "colour is washed out", etc., and I do not see the same problem, then I call shens.

    Fact of the matter is that I have been pretty much accurate about this card from the get-go, including the bit about us getting only the UFO at first, about the GPU being superscalar, about A LOT of things. This, to me, means that things are pretty "cut and dried" here, and everything else, all the complaints and worries, is FUD.

    And no, I do not care about comparisons with the G80. When DX10 comes out in full force, then I might... but if one card is not fast enough, I'll simply toss in another, thanks to the cost. I bought a G80 when it first came out too, and the horrible driver bugs had me sell them right away. Until we have some applications that can properly measure the performance difference between G80 and R600 (in all types of rendering), no real comparisons can be made.
    What are you talking about?
    1. The G80 DID have driver problems when it was released.
    2. Currently there are NO official Cat 7.5 drivers released for the HD yet.

    Based on what you quoted me on, this is the summary of my statement. No further proof is needed.

  15. #1390
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by DoubleZero View Post
    It clearly beats the 8800GTS in price/performance and... that's it. The high-high-end belongs to NVIDIA.
    But do you really understand how many cards get sold at this level? VERY few. I mean really, two companies merged, we got layoffs, etc., and a product line that has been delayed countless times due to bad choices and meddling from other companies. I think, really, AMD has done well.


    BTW, this 2900XT is the UFO. It is not the "top card"; that card is not released yet. The fact of the matter is that even the ring bus in these cards is half disabled, due to the number of memory ICs and their size.

    Oh, ECH, I never said I didn't agree with you; however, I don't see you with a card, so you are basing your comments on the same stuff that other guy was, thereby causing me to respond. You look at the situation... well... you know what I think.

  16. #1391
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by cadaveca View Post
    Oh, ECH, I never said I didn't agree with you; however, I don't see you with a card, so you are basing your comments on the same stuff that other guy was, thereby causing me to respond. You look at the situation... well... you know what I think.
    I don't mean any harm, but what you are saying is petty. The information on the video card is out there. It's clear that better drivers are needed. The G80 went through its own driver issues when it was released.

  17. #1392
    Xtreme Enthusiast
    Join Date
    Aug 2005
    Location
    Long Island, NY
    Posts
    980
    Quote Originally Posted by DoubleZero View Post
    It clearly beats the 8800GTS in price/performance and... that's it. The high-high-end belongs to NVIDIA.
    It doesn't beat the 8800GTS in price/performance. I see an eVGA 8800GTS 640MB for $329.99 after a $30 rebate; that is $100 cheaper than what the 2900 XTs are going for. And I wanted to buy a 2900 XT too, I really did. Still kinda want to... but I don't want a repeat of what happened when I bought my 1800XL when it first came out: a couple of months later it was pretty much obsolete. It's as if the initial release is only to get something out the door, then they patch it up and re-release a much better and improved product. This time I'm going to wait and get the 65nm version (if and when it comes out... I might be waiting for a while).
    Desktop
    [Asus Rampage III Gene] [i7 920 D0] [12GB OCZ3B2000C9LV6GK] [HIS HD 5970] [SeaSonic X750 Gold ] [Windows 7 (64bit)] [OCZ Vertex 30GB x3 Raid0] [Koolance CPU 360] [XSPC Razer 5970] [TFC 360 rad, D5 w/ Koolance RP-450X2]
    HTPC
    [Origen AE S10V] [MSI H57M-ED65] [ i5-661 w/ Scythe Big Shuriken] [Kingston HyperX LoVo 4GB ] [ SeaSonic X650 Gold ] [ OCZ Vertex 30GB SSD ] [ SAMSUNG Spinpoint 640GB 7200 RPM 2.5"][Panasonic UJ-225 Blu-ray Slot Burner] [ Ceton InfiniTV4]

  18. #1393
    I am Xtreme
    Join Date
    Mar 2005
    Location
    Edmonton, Alberta
    Posts
    4,594
    Quote Originally Posted by Eastcoasthandle View Post
    I don't mean any harm, but what you are saying is petty. The information on the video card is out there. It's clear that better drivers are needed. The G80 went through its own driver issues when it was released.
    I agree, but it's not much different than what you posted.

  19. #1394
    Registered User
    Join Date
    Feb 2006
    Posts
    98
    Quote Originally Posted by WrigleyVillain View Post
    From a post in the Bluesnews thread re. Lost Planet demo:

    "This is already the second DX10 demo where nvidia owners have at least 3 times the framerate and better image quality than the HD2900XT fellows (CoJ was the first DX10 demo). I'm still waiting for the DX10 demo where ATI excels over nvidia like predicted by many ATI fans."

    Um, ok. Can anyone confirm or deny this? No direct response to the guy's claim nor any other info is provided in that thread...
    The difference between the 2900 and the GTX in CoJ isn't that big: http://www.legitreviews.com/article/504/3/

    The difference in Lost Planet is under 2x at http://www.pcgameshardware.de/?article_id=601352
    It's a bit higher at http://www.legitreviews.com/article/505/3/

    It should be noted that Lost Planet is unplayable on ATI because of disappearing objects, so what's the point anyway?

  20. #1395
    Xtreme Member
    Join Date
    May 2004
    Posts
    187
    I don't even get the driver argument. Besides a few obvious bugs, its performance relative to its price is exactly where it should be. From looking at 20+ reviews, I have come to the conclusion that it sits very nicely between the GTS and GTX. And what do you know, its price is also between the GTS and GTX, leaning towards the GTS. In some cases it's as fast as the GTX, in some cases slower than the GTS. But that is expected, and the OVERALL impression I got is that it performs right in the middle.

    What happened was that everyone, for good reason, expected it to be a GTX killer. But even before release we knew it was going to cost around $400. I might be in the minority, but I instantly knew it was NOT going to perform at GTX levels. Why else sell it at that price? So I adjusted my expectations accordingly. The problem is that the vast majority still seem to be oblivious to what is right in front of their noses: a $400 video card, not a $550 one.

    So to sum up, its performance is fine. Driver bugs will be worked out. Everyone be happy.

    And to top it off, you get three of the year's hottest games, plus a free G5 mouse you can sell on eBay for around $25, and you can sell the games for around $25-$30 as well if you want.
    Last edited by gunit; 05-16-2007 at 09:12 AM.
    e6320 @ 3.2Ghz 7x458 1.38v
    DS3 rev. 1.6
    Diamond 2900 XT
    G. Skill 2X1GB
    250GB WD
    X-530 5.1 Speakers
    NEC 16x Dual Layer DVD-RW
    Turbo-Cool 510w Deluxe
    Vista 64

  21. #1396
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    Quote Originally Posted by OBR View Post
    and in picture quality... a loser...

    I must confess I was a little hesitant about the image quality because of what a few reviews said (that it was not so good), but then I got my X2900 today and saw this:

    [screenshots: X2900XT, Day of Defeat]

    Better image than with my 8800GTS in my other system.

    Temps playing DoD:

    15:35:15, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 54.750, MCLK(MHz)[0] = 513.00, SCLK(MHz)[0] = 506.25
    15:44:46, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 71.125, MCLK(MHz)[0] = 828.00, SCLK(MHz)[0] = 742.50

    ASIC Temperature via LM64 on DDC3 I2C [0]
    Minimum temperature: 54.375 C
    Maximum temperature: 71.750 C
    Average temperature: 58.601 C
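    (If anyone wants to pull the same min/max/average out of their own log, a quick-and-dirty sketch along these lines should work -- assuming the log lines look like the ones above; "dod_temps.log" is just an example filename, not the tool's default.)

    import re

    # Matches the ASIC temperature field in log lines like the ones above.
    pattern = re.compile(r"ASIC Temperature via LM64 on DDC3 I2C \[0\] \(C\) = ([\d.]+)")

    temps = []
    with open("dod_temps.log") as log:  # example filename, point it at your own log
        for line in log:
            match = pattern.search(line)
            if match:
                temps.append(float(match.group(1)))

    if temps:
        print(f"Minimum temperature: {min(temps):.3f} C")
        print(f"Maximum temperature: {max(temps):.3f} C")
        print(f"Average temperature: {sum(temps) / len(temps):.3f} C")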




    (BTW: another free game with X2900.)

    Regards
    Last edited by mascaras; 05-16-2007 at 09:26 AM.

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  22. #1397
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by mascaras View Post
    I must confess I was a little hesitant about the image quality because of what a few reviews said (that it was not so good), but then I got my X2900 today and saw this:

    [screenshots: X2900XT, Day of Defeat]

    Better image than with my 8800GTS in my other system.

    Temps playing DoD:

    15:35:15, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 54.750, MCLK(MHz)[0] = 513.00, SCLK(MHz)[0] = 506.25
    15:44:46, ASIC Temperature via LM64 on DDC3 I2C [0] (C) = 71.125, MCLK(MHz)[0] = 828.00, SCLK(MHz)[0] = 742.50

    ASIC Temperature via LM64 on DDC3 I2C [0]
    Minimum temperature: 54.375 C
    Maximum temperature: 71.750 C
    Average temperature: 58.601 C

    (BTW: another free game with X2900.)

    Regards
    The HD2900XT doesn't have IQ issues. You read about it, but there are no comprehensive photo details that suggest otherwise. As you can clearly see, the HD2900XT has IQ as good as or better than the 8800GTX. There is more talk than actual photo proof (using several examples) to show that IQ is a problem with the XT. The only thing that I question in your photo is that window. Something about that window doesn't look right. Can you check with your GTX to see if that window looks like that?
    Last edited by Eastcoasthandle; 05-16-2007 at 09:49 AM.

  23. #1398
    Registered User
    Join Date
    Feb 2006
    Posts
    98
    "Easy" way to resolve this is by using 3DMark 2006 at 1280x1024 @ 6xAA & 16xAF on both cards, then letting it dump 900 individual frames using the image quality part of the program, and compiling those into a VC-1 HD movie.

  24. #1399
    Registered User
    Join Date
    Jul 2006
    Posts
    13
    Side-by-side comparisons would be nice =)
    BTW, do some benchies in Lost Planet and Call of Juarez, and some other games.
    How do the temps scale with the clock on R600? I saw all those stock-cooler overclocks and I'm wondering what kind of temps I can expect with, say, 850MHz on the core.
    Anyone have an idea of what the Sapphire Toxic will cost?

  25. #1400
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Noobie View Post
    "Easy" way to resolve this is by using 3DMark 2006 at 1280x1024 @ 6xAA & 16xAF on both cards, then letting it dump 900 individual frames using the image quality part of the program, and compiling those into a VC-1 HD movie.
    6xAA isn't available on either, so it'll revert to 0xAA.

    I'd say run 8xMSAA.
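    Whichever AA mode gets used, the dumped frames can be compared directly before anyone bothers encoding a movie. Rough sketch only (Python with Pillow; the folder names and file extensions are placeholders -- point them at wherever 3DMark actually writes the frames on your machine):

    import os
    from PIL import Image, ImageChops  # Pillow

    DIR_A = "dump_2900xt"  # hypothetical folder holding the HD 2900 XT frames
    DIR_B = "dump_8800"    # hypothetical folder holding the G80 frames

    def frame_files(path):
        # Keep only image files, sorted so frame N from one card lines up with frame N from the other.
        return sorted(f for f in os.listdir(path)
                      if f.lower().endswith((".bmp", ".png")))

    for name_a, name_b in zip(frame_files(DIR_A), frame_files(DIR_B)):
        img_a = Image.open(os.path.join(DIR_A, name_a)).convert("RGB")
        img_b = Image.open(os.path.join(DIR_B, name_b)).convert("RGB")
        diff = ImageChops.difference(img_a, img_b)      # per-pixel absolute difference
        hist = diff.histogram()                         # 256 bins per channel (R, G, B)
        total = sum((i % 256) * count for i, count in enumerate(hist))
        mad = total / (img_a.width * img_a.height * 3)  # mean absolute difference; 0 = identical frame
        print(f"{name_a} vs {name_b}: mean abs diff = {mad:.2f}")

    Any frame with a non-zero difference is worth screenshotting side by side; a plain pixel diff won't say which card looks better, only where the two outputs diverge.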
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

