
Thread: If ATI Won a Round, Would Anyone Notice?

  1. #51
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Eastcoasthandle View Post
    Please read
    This is why I express the importance of WHQL-approved drivers when benchmarking like this.
    The issue was with Lost Planet; the new release of it should be out any minute.

    Also, if I'm not mistaken (I'm not running SLI or Vista, so I can't verify myself), didn't SLI only get added properly to the Vista driver as of 158.42? Could've sworn I remember reading that in the release highlights for 158.42/43. Of course, it might've been 158.24 as well.

    :::edit:::
    nvm, that was DX10 SLI support.
    Last edited by DilTech; 06-29-2007 at 02:36 PM.
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  2. #52
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by DilTech View Post
    The issue was with Lost Planet; the new release of it should be out any minute.

    Also, if I'm not mistaken (I'm not running SLI or Vista, so I can't verify myself), didn't SLI only get added properly to the Vista driver as of 158.42? Could've sworn I remember reading that in the release highlights for 158.42/43. Of course, it might've been 158.24 as well.

    :::edit:::
    nvm, that was DX10 SLI support.
    "Not added properly" (as you put it) is pretty hard to validate when plenty of people are using SLI in Vista.

  3. #53
    Xtreme Mentor
    Join Date
    Aug 2006
    Location
    HD0
    Posts
    2,646
    Quote Originally Posted by Zytek_Fan View Post
    Well, the Sapphire Radeon HD2900XT is $374.99 and the cheapest 8800GTS is $379.99, so it's quite close.

    On noise and power consumption the 2900XT loses, but I couldn't care less.

    All I care about is performance per $...and how much X38 is going to be
    I bought my 640MB GTS for $310

    so not quite...

  4. #54
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by xlink View Post
    I bought my 640MB GTS for $310

    so not quite...
    he said he couldn't care less .. money is no object ... since he's likely to pay more to run his 2900XT through power bills. also, the GTS 640 can be had for much less than the 2900XT if one just looks

    oh yeah, he said MIR doesn't count, but still the 640 GTS is 25 bucks cheaper in newegg's case ...

    so i dunno ... maybe he simply likes the 2900XT regardless of everything .. including power consumption, price and noise. oh, and in XP the GTS actually beats the 2900XT in many benches ...

    yeah, i know nv sucks at DX10/Vista, but DX10 is slow for all cards ... i don't think i'll play DX10 on my current rig, nor do i think anyone else can on theirs ... not even with 2900XT CF .... at 1680x1050 4xAA .... for games like crysis and bioshock in DX10 mode

    i'll prolly get whatever is fastest (maybe G90 or 2950XTX 1GB) when Crysis launches, but seeing how DX10 doesn't bring much over DX9 i might as well stick with DX9/XP

    maybe when a game is DX10-only, that's when we will see what this slide is all about
    [Attached image: 1146417816aqCxiPwC4n_3_2_l.jpg]
    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
    Last edited by californian7856; 06-30-2007 at 09:41 PM.

  5. #55
    Xtreme Member
    Join Date
    Jan 2005
    Posts
    411
    Hey Hey Hey now guys, regardless of drivers, the 8800GTX in Vista in SLI has to suck. this is the same way the hd2900xt was treated due to bad drivers, so I don't want to see this treated any different, ya hear.



    If you have to ask why? Because teh review said so :P lol
    System
    Intel C2D Q6600 @ 3.78Ghz 1.536V (1.55V bios)
    TC PA 120.3 , D-TEK Fusion, MCP655, Micro Res, 6 Yate Fans
    Mushkin HP2-6400 2X2GB @ 100Mhz 5-4-4-12
    Asus Maximus Formula 420x9
    4 Samsung Spinpoint 250GB SATA II in raid 0
    Crossfire HD 2900XT 512MB 900/900 1.25V
    Pioneer DVD-RW
    830 Mobo Tray, Waiting on MM Duality
    PC Power and Cooling 750W , mobo/cpu/gpus/cdrom , Powmax 380W , hd, fans, pump
    Acer AL2616W 1920x1200 100Hz (75Hz native)

  6. #56
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by DilTech View Post
    The issue was with Lost Planet; the new release of it should be out any minute.

    Also, if I'm not mistaken (I'm not running SLI or Vista, so I can't verify myself), didn't SLI only get added properly to the Vista driver as of 158.42? Could've sworn I remember reading that in the release highlights for 158.42/43. Of course, it might've been 158.24 as well.

    :::edit:::
    nvm, that was DX10 SLI support.
    yeah, just searched guru3d a bit
    http://downloads.guru3d.com/download.php?det=1629
    158.24 was the first Vista driver where they added SLI support for DX9

  7. #57
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by gdogg View Post
    Hey Hey Hey now guys, regardless of drivers, the 8800GTX in Vista in SLI has to suck. this is the same way the hd2900xt was treated due to bad drivers, so I don't want to see this treated any different, ya hear.



    If you have to ask why? Because teh review said so :P lol
    oh .. i c, so drivers/os don't count huh

    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==

    in that case the GTS 320 that costs $290 pwns the $400 2900XT ...



    anyway, fanboy talk aside, Anandtech said they're doing another review of the GTX, GTS and XT with new drivers ...
    Last edited by californian7856; 06-30-2007 at 10:11 PM.

  8. #58
    Xtreme Mentor
    Join Date
    Apr 2007
    Location
    Idaho
    Posts
    3,200
    Quote Originally Posted by californian7856 View Post
    oh .. i c, so drivers/os don't count huh

    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==

    in that case the GTS 320 that costs $290 pwns the $400 2900XT ...



    anyway, fanboy talk aside, Anandtech said they're doing another review of the GTX, GTS and XT with new drivers ...
    That was with Cat 7.5

    The 2900XT is THE card to buy
    "To exist in this vast universe for a speck of time is the great gift of life. Our tiny sliver of time is our gift of life. It is our only life. The universe will go on, indifferent to our brief existence, but while we are here we touch not just part of that vastness, but also the lives around us. Life is the gift each of us has been given. Each life is our own and no one else's. It is precious beyond all counting. It is the greatest value we have. Cherish it for what it truly is."

  9. #59
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by Zytek_Fan View Post
    That was with Cat 7.5

    The 2900XT is THE card to buy
    yeah, it only uses more power than even the GTX (which means ur paying that in electric bills)
    http://www.extremetech.com/article2/...2129295,00.asp

    and costs more, in the most extreme case $65 more ($310 vs $375)

    and it's louder

    and slower with 4xAA

    other than all that i agree ..

    because it's very fast in Call of Juarez DX10, which is an AMD-sponsored game btw

    and CF in Vista is really fast
    Last edited by californian7856; 06-30-2007 at 10:20 PM.

  10. #60
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    - http://www.xbitlabs.com/articles/vid..._14.html#sect0
    I can pick my test too and say it uses less power at idle, and we spend more time in idle mode than at full load. The extra bill is like $10 for a year (rough math in the sketch below). And on XS, complaining about power consumption is kind of a joke.
    - Louder for sure, but in summer i guess all high-end cards will become louder.
    - Slower with 4xAA, but even with that it's still at GTS performance.
    - Call of Juarez is an Nvidia-sponsored game.
    - CF in XP is equally very fast.
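
    For what it's worth, the $10-a-year figure is about what simple arithmetic gives. A minimal sketch in Python, with assumed (not measured) numbers for the extra draw, daily on-time, and electricity price:

    Code:
    # Back-of-the-envelope yearly cost of a card's extra power draw.
    # Every number here is an illustrative assumption, not a measurement.
    extra_watts = 30       # assumed extra draw vs. a competing card
    hours_per_day = 8      # assumed daily on-time
    price_per_kwh = 0.11   # assumed electricity price, in $/kWh

    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    cost_per_year = kwh_per_year * price_per_kwh
    print(f"{kwh_per_year:.0f} kWh/yr -> ${cost_per_year:.2f}/yr")  # 88 kWh/yr -> $9.64/yr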
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB

  11. #61
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by californian7856 View Post
    maybe when a game is DX10-only, that's when we will see what this slide is all about
    [Attached image: 1146417816aqCxiPwC4n_3_2_l.jpg]
    http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
    Not really.

    DX10 is either A: faster with the same content, or B: you can do more effects, including some you couldn't do with DX9, but it's slower.

    And they all went B.

    So all the efficiency bonus of moving to DX10 is eaten (and a lot more) by more of everything and new features added to the game.

    AMD and nVidia game funding made sure you wouldn't just see a small boost in content. They had to protect the current high end and the next generation. It would be pretty bad for them if someone just did a pure DX10 crossover, so that an 8800GTS in DX10 performed like an 8800GTX in DX9. On the other hand, they sacrificed any possibility of DX10 on the 8600/2600 series.
    Last edited by Shintai; 07-01-2007 at 12:55 AM.
    Crunching for Comrades and the Common good of the People.

  12. #62
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    I found this tidbit of news interesting, since the HD 2900XT 1GB is getting a makeover for its memory ICs (using Samsung).
    Source

    Samsung ...announced today that its GDDR4 (Graphics Double Data Rate, version 4) high-speed graphics memory chip is being used in both the 1GB ATI Radeon™ HD 2900 XT and the 256MB ATI Radeon™ HD 2600 XT graphics processing cards. The 1GB card has the widest bus in the industry designed for full-performance, high dynamic range (HDR) rendering in PCs...Built with 80 nanometer process technology, the 512Mb GDDR4-based graphics cards provide 140.8GB-per-second performance, 25 percent faster than 700 MHz GDDR3 graphics memory, the most common graphics device in use today...Samsung's GDDR4 memory devices are now in mass production at the 512Mb density level. Samsung developed the world's first GDDR4 device last year and submitted a paper on a 4Gbps GDDR4
    I think we're all interested to see what performance level the card reaches overclocked with Samsung ICs.

    This GDDR4 (K4U52324QE) is rated at roughly:
    0.6ns @ 1.600GHz = 3.2Gbps
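
    To see how those numbers hang together, here is a rough sketch (my arithmetic, assuming standard DDR signalling of two transfers per clock and the 512-bit bus of the 1GB XT; neither assumption comes from the quote above):

    Code:
    # Rough GDDR4 timing/bandwidth math (illustrative assumptions only).
    cycle_time_ns = 0.625          # ~0.6ns rated cycle time
    clock_ghz = 1 / cycle_time_ns  # = 1.6 GHz memory clock
    gbps_per_pin = clock_ghz * 2   # DDR signalling: 2 transfers/clock = 3.2 Gbps/pin

    bus_width_bits = 512           # assumed: 512-bit bus of the 1GB HD 2900XT
    peak_gb_s = gbps_per_pin * bus_width_bits / 8
    print(f"{gbps_per_pin:.1f} Gbps/pin -> {peak_gb_s:.1f} GB/s peak")  # 3.2 -> 204.8

    # Samsung's 140.8 GB/s figure is the same math at a lower clock:
    # 2.2 Gbps/pin (1.1 GHz) * 512 / 8 = 140.8 GB/s.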
    Last edited by Eastcoasthandle; 07-01-2007 at 08:27 AM.

  13. #63
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by AbelJemka View Post
    - http://www.xbitlabs.com/articles/vid..._14.html#sect0
    I can pick my test too and say it uses less power at idle, and we spend more time in idle mode than at full load. The extra bill is like $10 for a year. And on XS, complaining about power consumption is kind of a joke.
    not really, umm i dunno how they did that test .. i can find many that suggest it uses more power during idle

    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=30

    a card that uses more power at load can use less power at idle?? something's wrong with the way they did that test

    and no, when i don't play games/surf i turn my comp off

    louder?? umm, b/c it's louder than the GTS/GTX


    - Call of Juarez is an Nvidia-sponsored game.
    it's AMD/ATI-sponsored, do u even have the game??
    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    where's the nv title??

    as for CF performance in XP .. well u can always find a review that suggests different

  14. #64
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by Shintai View Post
    Not really.

    DX10 is either A: faster with the same content, or B: you can do more effects, including some you couldn't do with DX9, but it's slower.

    And they all went B.

    So all the efficiency bonus of moving to DX10 is eaten (and a lot more) by more of everything and new features added to the game.

    AMD and nVidia game funding made sure you wouldn't just see a small boost in content. They had to protect the current high end and the next generation. It would be pretty bad for them if someone just did a pure DX10 crossover, so that an 8800GTS in DX10 performed like an 8800GTX in DX9. On the other hand, they sacrificed any possibility of DX10 on the 8600/2600 series.
    yeah, when nv released Lost Planet, COH and AMD's COJ DX10 .. i realized that DX10 is slow as hell ...

    i doubt future drivers will improve it .. not gonna turn a 20fps average into 40fps .. that's just impossible

    http://www.extremetech.com/article2/...2147398,00.asp
    look at that! that is pure crap ...

    so that means with a $550 GTX u can't even play games at >1280x1024 with 4xAA on and still get >40fps constantly

    well, i'll enjoy Crysis in XP DX9 .. or i'll prolly get a new card that can play it in DX10 at 1680x1050 2xAA

  15. #65
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    Quote Originally Posted by californian7856 View Post
    not really, umm i dunno how they did that test .. i can find many that suggest it uses more power during idle

    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=30

    a card that uses more power at load can use less power at idle?? something's wrong with the way they did that test

    and no, when i don't play games/surf i turn my comp off

    louder?? umm, b/c it's louder than the GTS/GTX



    it's AMD/ATI-sponsored, do u even have the game??
    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    where's the nv title??

    as for CF performance in XP .. well u can always find a review that suggests different
    U can find a review, a blog or whatever that suggests anything you want.
    And i have the game and it begins with a nice Nvidia TWIMTBP logo.
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB

  16. #66
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by AbelJemka View Post
    U can find a review, a blog or whatever that suggest all you want.
    And i have the game and it begins with a nice Nvidia TWIMTBP.
    yeah and no screenshot

    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    see its AMD/ATI sponsor

    do u know how to view a picture?? move the cursor to where the link is and use left mouse button to click

    and yeah cuz u only waan hear whats best for 2900XT so u pick that specific review that suggest it uses less power idle ... and u try to spin it by saying comp idle more blablabal .. well i turn my comp off when i dont play 3dgames

    but ALL reviews suggest it uses more power than the GTX during full load

    also, to get a more accurate reading i think they should have the 8-pin PCI-E plugged in ... as that will increase its power consumption even more
    Last edited by californian7856; 07-01-2007 at 10:11 AM.

  17. #67
    Xtreme Enthusiast
    Join Date
    Dec 2006
    Location
    France
    Posts
    741
    yeah and no screenshot

    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg

    see its AMD/ATI sponsor

    do u know how to view a picture?? move the cursor to where the link is and use left mouse button to click
    Logo still on the CoJ official website.
    http://www.coj-game.com/
    Seems Nvidia wasn't happy with the CoJ benchmark, so they pulled the game off their list in the US but not in France.
    http://www.nzone.com/object/nzone_tw...gameslist.html
    http://fr.nzone.com/object/nzone_twi...eslist_fr.html
    U know how to click?
    http://www.hiboox.com/lang-fr/image....g=htefpwn5.jpg
    and yeah cuz u only waan hear whats best for 2900XT so u pick that specific review that suggest it uses less power idle ... and u try to spin it by saying comp idle more blablabal .. well i turn my comp off when i dont play 3dgames

    but ALL reviews suggest it uses more power than the GTX during full load
    Your Anandtech link also shows it uses less power at idle. And i never turn my comp off, and really don't care about power consumption anyway.
    And yes, it uses more power than the GTX, but i never said it doesn't.

    also, to get a more accurate reading i think they should have the 8-pin PCI-E plugged in ... as that will increase its power consumption even more
    Stupid statement. Just because u plug in the 8-pin PCI-E doesn't mean it will draw power it doesn't need.
    Last edited by AbelJemka; 07-01-2007 at 10:31 AM.
    AMD Phenom II X2 550@Phenom II X4 B50
    MSI 890GXM-G65
    Corsair CMX4GX3M2A1600C9 2x2GB
    Sapphire HD 6950 2GB

  18. #68
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    Quote Originally Posted by AbelJemka View Post
    Logo still on the CoJ official website.
    http://www.coj-game.com/
    Seems Nvidia wasn't happy with the CoJ benchmark, so they pulled the game off their list in the US but not in France.
    http://www.nzone.com/object/nzone_tw...gameslist.html
    http://fr.nzone.com/object/nzone_twi...eslist_fr.html
    U know how to click?
    http://www.hiboox.com/lang-fr/image....g=htefpwn5.jpg

    Your Anandtech link also shows it uses less power at idle. And i never turn my comp off, and really don't care about power consumption anyway.
    And yes, it uses more power than the GTX, but i never said it doesn't.


    Stupid statement. Just because u plug in the 8-pin PCI-E doesn't mean it will draw power it doesn't need.

    That's some good research. This leaves no doubt.

  19. #69
    Xtreme Enthusiast
    Join Date
    Feb 2007
    Posts
    584
    Just a thought: maybe the COJ DX9 version is sponsored by Nvidia with their TWIMTBP program and the DX10 version by ATI/AMD....

  20. #70
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by AbelJemka View Post
    - http://www.xbitlabs.com/articles/vid..._14.html#sect0
    I can pick my test too and say it uses less power at idle, and we spend more time in idle mode than at full load. The extra bill is like $10 for a year. And on XS, complaining about power consumption is kind of a joke.
    - Louder for sure, but in summer i guess all high-end cards will become louder.

    yeah, it's louder than anything and uses more power

    - Slower with 4xAA, but even with that it's still at GTS performance.
    wrong, the GTS pwns the 2900XT when 4xAA is on
    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=19
    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=20

    and when 4xAA is off it pwns ur 2900XT in STALKER and SupCom
    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=22
    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=24


    - CF in XP is equally very fast
    equally very fast lol learn some english
    equally very fast haha
    wrong again
    http://www.anandtech.com/video/showdoc.aspx?i=2988&p=25

    - Call of Juarez is an Nvidia-sponsored game.
    complete bs, learn to read
    http://ati.amd.com/products/Radeonhd2900/index.html
    Last edited by californian7856; 07-01-2007 at 11:03 AM.

  21. #71
    I am Xtreme
    Join Date
    Jul 2005
    Posts
    4,811
    I think this should settle it:
    "NVIDIA has a long standing relationship with Techland and their publisher Ubisoft. In fact, the original European version of Call Of Juarez that was launched in September 2006 is part of the "The Way Its Meant To Be Played" program. As a result of the support Techland and Ubisoft receives for being part of the "The Way Its Meant To Be Played" program, NVIDIA discovered that the early build of the game that was distributed to the press has an application bug that violates DirectX 10 specifications by mishandling MSAA buffers, which causes DirectX 10 compliant hardware to crash. Our DevTech team has worked with Techland to fix that and other bugs in the Call of Juarez code...
    Source

    Although the article was talking about something else, the direct quote from Nvidia is irrefutable. Yes, I know that quote involves the European version of the game. However, the currently available version didn't simply change to GITG, and Techland's involvement in COJ didn't change. But note (if you continue reading the source link) that Techland indicates the newest benchmark is distributed by (not developed by) AMD. However, that's just the benchmark, not the entire game. Also, it didn't change the relationship between Nvidia and Techland at the time the game was developed.
    Last edited by Eastcoasthandle; 07-01-2007 at 11:20 AM.

  22. #72
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    http://ati.amd.com/products/Radeonhd2900/index.html

    i think this should settle it. learn to read.

    what website is that.. oh that's right, it's AMD

  23. #73
    Xtreme Cruncher
    Join Date
    Nov 2005
    Location
    Rhode Island
    Posts
    2,740
    Californian, you have no right to tell Abel he speaks bad English; he speaks better English than you do, and he is from France. Here's some gold from one of your previous posts.

    "do u know how to view a picture?? move the cursor to where the link is, and use the left mouse button to click.

    and yeah cuz u only waan hear whats best for 2900XT, so u pick that specific review that suggests it uses less power on idle ... and u try to spin it by saying comp idle more blablabal .. well i turn my comp off when i dont play 3dgames."

    I bolded misspelled words, and italicized missing words that should be there. At least he uses proper grammar, capitalizing words and ending sentences with periods, unlike you. You just look like a fool with his quote in your sig and that mess of a sentence below it.

    Also, you should find some OTHER reviews besides AnandTech, which has shown a slight Intel/nvidia bias as of late. Also, just because CoJ is on AMD's site does not mean they sponsor it. At the time, it was one of the only functional, publicly available DX10 games.
    Last edited by [XC] Lead Head; 07-01-2007 at 11:18 AM.
    Fold for XS!
    You know you want to

  24. #74
    Xtreme Member
    Join Date
    Mar 2007
    Posts
    103
    Quote Originally Posted by [XC] Lead Head View Post
    Californian, you have no right to tell Abel he speaks bad English; he speaks better English than you do, and he is from France. Here's some gold from one of your previous posts.

    "do u know how to view a picture?? move the cursor to where the link is, and use theleft mouse button to click.

    and yeah cuz u only waan hear whats best for 2900XT, so u pick that specific review that suggests it uses less power on idle ... and u try to spin it by saying comp idle more blablabal .. well i turn my comp off when i dont play 3dgames.

    I bolded misspelled words, and italicized missing words that should be there. At least he uses proper grammar, capitalizing words and ending sentences with periods, unlike you. You just look like a fool with his quote in your sig and that mess of a sentence below it.

    Also, you should find some OTHER reviews besides AnandTech, which has shown a slight Intel/nvidia bias as of late.
    yeah its call abbreviation ..

    u = you
    cuz = cause

    why is idle misspelled?? and more?? those were right

    and "w" is not misspelled... why did u bold that??

    also "3d" means 3 dimension?? duh.. why is that one bold??

    was that so hard to understand??

    http://img122.imageshack.us/img122/7964/dsc00127ob2.jpg
    as for AMD sponsorship .. can u tell me why the AMD ATI logo is on the box of that game
    Last edited by californian7856; 07-01-2007 at 11:23 AM.

  25. #75
    Xtreme Cruncher
    Join Date
    Nov 2005
    Location
    Rhode Island
    Posts
    2,740
    Quote Originally Posted by californian7856 View Post
    yeah its call abbreviation ..

    u = you
    cuz = cause

    was that so hard to understand??
    Nope, and sorry to tell you, "cause" isn't a word either. You make fun of Abel for his English, which is a hell of a lot better than yours.
    Fold for XS!
    You know you want to
