Results 2,776 to 2,800 of 4151

Thread: ATI Radeon HD 4000 Series discussion

  1. #2776
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by zerazax View Post
Right, a lot of it is PR, but I can also see some of the logic behind it: since neither company has fabs for their GPUs, they really are at the mercy of the foundries' schedules for new processes. So if yields on monolithic GPUs aren't up to par at the best possible process you can afford, it's a risky proposition to put all your eggs in one basket, no matter how well off you are as a company.

BTW, those rumblings about Nvidia weren't from an AMD source at all; they were actually from an Nvidia source, and I know some of it was published somewhere recently but got lost in the shuffle of the GT200 launch
So you think that microstutter, poor multi-GPU scaling in some apps, input lag, dependency on driver updates for multi-GPU profiles, and AFR corruption are the future of the high end? I don't buy it.

In my opinion, anything that is a multi-GPU product has "stop-gap" written all over it.
    Last edited by Sr7; 06-17-2008 at 12:13 AM.

  2. #2777
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by AliG View Post
40 TMUs will be enough; the 8800GTX only had 32 and was fine. Besides, 4xAA doesn't kill performance anymore, jimmyz's review proved that: the drop in fps from 2xAA to 4xAA with the R600 was more than 50% in some cases, while in COJ (which is pretty demanding) the 4850 lost only 2 fps to the same step. On top of that, jimmyz was able to get 20 FPS out of his system at stock in Crysis at 1920x1200 with 4xAA/16xAF, which was never possible before for ATi. I don't think even the G92 GTS could do that (I believe the Ultra was the only one, due to its high clocks and wide bus)
I just thought I'd chime in on this one; I guess people forgot something about COJ.

Think back to the launch of the DX10 patch. Remember what NVidia's big complaint about the title was? That's right: the title forces shader-based AA. So it's not really the title to use to gauge whether they fixed the card's AA performance, because it plays to the AA method ATi switched to with the R600 architecture. We'll need to see other titles with AA before we can accurately judge whether AA performance has been fixed...
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty

  3. #2778
    Xtreme X.I.P.
    Join Date
    Apr 2005
    Posts
    4,475
    Quote Originally Posted by Sr7 View Post
    They left off every possible other spec which is *not* in their favor.

    I find it funny how many people complain about efficiency. Do you sit there and have a relative measure watts drawn while you play your game too?

    "OMG TURN YOUR GUY AND LOOK AT A WALL, QUICK, YOU'RE DRAWING TOO MUCH POWER. I DON'T CARE IF THIS IS A BOSS FIGHT LOOK AT THE F'ING WALL."
It shows the efficiency of the design.

    What happened to the days when gaming was about the experience you get?
Cards got a $600-$700 price tag, which the majority of people, even here, will not be willing to pay.

  4. #2779
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Cooper View Post
It shows the efficiency of the design.

Cards got a $600-$700 price tag, which the majority of people, even here, will not be willing to pay.
Who games for efficiency? It's something old people gawk over, like gas mileage :P Let's be honest here... who is concerned about saving power while they play games? You'd do more by turning the lights out every time you leave a room. Idle consumption is obviously more important, because your computer is probably idling 90%+ of the time, not gaming.
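A quick back-of-the-envelope calculation makes the idle-versus-load point concrete. Every number below (the wattages, the 90/10 usage split, the always-on assumption) is illustrative, not a measurement of any particular card:

```python
HOURS_PER_YEAR = 24 * 365  # 8760, assuming the PC is on around the clock

def annual_kwh(idle_w, load_w, load_fraction):
    """Yearly energy for a machine gaming `load_fraction` of the
    time and idling the rest."""
    avg_w = idle_w * (1 - load_fraction) + load_w * load_fraction
    return avg_w * HOURS_PER_YEAR / 1000

# Hypothetical system: 70 W idle, 250 W gaming, gaming 10% of the time.
base = annual_kwh(70, 250, 0.10)

# Shaving 100 W off the *gaming* draw...
leaner_load = annual_kwh(70, 150, 0.10)

# ...versus shaving just 20 W off the *idle* draw.
leaner_idle = annual_kwh(50, 250, 0.10)

print(f"baseline:       {base:.0f} kWh/yr")
print(f"-100 W at load: saves {base - leaner_load:.0f} kWh/yr")
print(f"-20 W at idle:  saves {base - leaner_idle:.0f} kWh/yr")
```

With those assumed numbers the small idle-side cut saves nearly twice as much energy as the big load-side cut (roughly 158 vs 88 kWh a year), which is exactly the poster's point about idle consumption dominating.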

    Apparently you don't remember the 8800GTX or Ultra launch prices
    Last edited by Sr7; 06-17-2008 at 12:21 AM.

  5. #2780
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by Sr7 View Post
So you think that microstutter, poor multi-GPU scaling in some apps, input lag, dependency on driver updates for multi-GPU profiles, and AFR corruption are the future of the high end? I don't buy it.

In my opinion, anything that is a multi-GPU product has "stop-gap" written all over it.
A lot of those things are driver issues, or people playing games where the card is obviously choking. Here's the main thing, though:

Multi-GPU is a very recent thing, despite being an ancient idea. Sure, the Voodoo cards tried it out, and after Nvidia bought 3dfx and adopted its own version of SLI, it took a long time to mature to where it is now.

Look back one year and see where we've gone since then. We've finally had multi-monitor support (CF), Hybrid SLI and CF for both power saving and boosting integrated GPUs, and improved scaling (the 9800GX2 shows it is still quite formidable even against the new single-GPU architecture).

As both sides get more committed to promoting and selling their multi-GPU setups (remember, Nvidia really started the experiment with the 7950GX2 and then continued it with the 9800GX2; why else would Nvidia hold so tightly to SLI being exclusive to its chipset division?), they've finally added features and driver improvements that we could only have dreamed of two years ago. Heck, even a year ago.

And anyway, I feel that you are envisioning multi-GPU too much as simple on-board SLI or CF. I think both sides are exploring different ways to make GPUs communicate. For example, it isn't out of reach to say that Nvidia may one day adopt an interconnect similar to a memory controller to improve its cards' GPGPU capabilities and really hold to its claim that CPUs are obsolete.

Obviously a lot of that is speculation, but seeing how both Nvidia and AMD are pushing towards integrating GPUs with CPUs (in other words, combining the two into one uber processor), they will certainly have to come up with new technology for these things to happen.

  6. #2781
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by zerazax View Post
A lot of those things are driver issues, or people playing games where the card is obviously choking. Here's the main thing, though:

Multi-GPU is a very recent thing, despite being an ancient idea. Sure, the Voodoo cards tried it out, and after Nvidia bought 3dfx and adopted its own version of SLI, it took a long time to mature to where it is now.

Look back one year and see where we've gone since then. We've finally had multi-monitor support (CF), Hybrid SLI and CF for both power saving and boosting integrated GPUs, and improved scaling (the 9800GX2 shows it is still quite formidable even against the new single-GPU architecture).

As both sides get more committed to promoting and selling their multi-GPU setups (remember, Nvidia really started the experiment with the 7950GX2 and then continued it with the 9800GX2; why else would Nvidia hold so tightly to SLI being exclusive to its chipset division?), they've finally added features and driver improvements that we could only have dreamed of two years ago. Heck, even a year ago.

And anyway, I feel that you are envisioning multi-GPU too much as simple on-board SLI or CF. I think both sides are exploring different ways to make GPUs communicate. For example, it isn't out of reach to say that Nvidia may one day adopt an interconnect similar to a memory controller to improve its cards' GPGPU capabilities and really hold to its claim that CPUs are obsolete.

Obviously a lot of that is speculation, but seeing how both Nvidia and AMD are pushing towards integrating GPUs with CPUs (in other words, combining the two into one uber processor), they will certainly have to come up with new technology for these things to happen.
I understand your point, but a lot of the problems with multi-GPU are just the nature of the beast. You'd need to drastically change the fundamentals to improve things... it would need to be many chips able to function as one big chip, which is pretty damned hard to do.

As long as a game has to be specially programmed to scale well across many GPUs, you can't sit here and tell me it's the future. It's a hassle and a hack, a necessary evil to be #1 (if you don't do it, the competition will), just like the Cold War's nuclear standoff. The point of future technology is to simplify things, not complicate them. No game developer wants to add "worry about multi-GPU scaling" to their to-do list.

The future is for games not to have to code things a certain way to get the scaling... it should work perfectly every time, which means AFR goes out the window.
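The microstutter complaint can be illustrated with a toy frame-pacing model (purely a sketch; real drivers queue and pace frames in more complicated ways, and all the timings below are made up). Two GPUs alternate frames, each taking 30 ms per frame, while the CPU submits a new frame every 10 ms. Average throughput doubles, but the gaps between finished frames alternate short/long instead of being even:

```python
def afr_completion_times(n_frames, gpu_ms=30.0, submit_gap_ms=10.0):
    """Frame completion times for naive 2-GPU alternate-frame rendering.
    Frame i goes to GPU i % 2; a GPU starts a frame at its submission
    time or when it finished its previous frame, whichever is later."""
    free_at = [0.0, 0.0]  # when each GPU next becomes free
    done = []
    for i in range(n_frames):
        gpu = i % 2
        start = max(i * submit_gap_ms, free_at[gpu])
        free_at[gpu] = start + gpu_ms
        done.append(free_at[gpu])
    return done

times = afr_completion_times(8)
intervals = [b - a for a, b in zip(times, times[1:])]
print(intervals)  # settles into alternating 10 ms / 20 ms gaps
```

A single GPU at 30 ms per frame would deliver a steady 33 fps; the AFR pair averages twice that, but delivers it as uneven 10/20 ms pairs, which is why an AFR "60 fps" can feel less smooth than a lower single-GPU frame rate.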

  7. #2782
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    Quote Originally Posted by Macadamia View Post
When the GT200 is MORE THAN TWICE AS LARGE as the RV770, I'm certain you're just... wrong.

If AMD wanted to slaughter nVidia single-handedly in the high end, they'd make a chip as big as the 8800GTX/2900XT, smaller than the GT200, but the performance? Truly owned.

AMD's being the smart guy here, choosing the right architecture.


I was talking about specifications, not the size of the cards. Past generations from nVidia had smaller numbers in their specifications but were balanced, while ATi had huge numbers but was unbalanced. Take the ATi 29xx compared to the nVidia 88xx, followed by the ATi 38xx vs the G92 primary models: in both cases nVidia had better performance.

How could they say after 7 months that the ATi HD 4850 is a competitor to the nVidia 8800 GT? Xmas vs. summertime... now that's what I call a cat and mouse game.

Did I mention that I own an ATI HD 3870 and my last nVidia card was a GeForce4 MX 440? So yeah, I'm not a fanboy. I used to aim for best-buy products, and the FX 5xxx was an accident back then... after that it was just bad luck. I bought an ATi X1600 Pro (with GDDR3); one month later nVidia introduced the 7600GS, cheaper and with better performance. In January this year I went for the ATi HD 3870, which was cheaper than the 8800 GT and looked pretty capable... two months later the prices dropped like crazy and have kept on dropping ever since.
    Last edited by XSAlliN; 06-17-2008 at 12:44 AM.

  8. #2783
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
I'm sure it's hard to program a game to take advantage of multiple GPUs (much like other programs having trouble just going multi-threaded on CPUs), but I'd also say it's an issue of prevalence. It just wasn't that common until recently (once taken advantage of only by those with lots of money and top-of-the-line GPUs... and let's not even get into the original CF implementation with master cards and external dongles), but with the recent plethora of affordable performance, multi-GPU is definitely tantalizing to consumers out there.

I've always said this: both ATI and Nvidia were founded on and kept afloat by their discrete GPU market, so why wouldn't they, in the long run, promote multi-GPU configurations and thus sell more GPUs?

And this is coming from someone who has kept away from multi-GPU configs and still uses an 8800 Ultra for 30" monitor gaming (it will probably be replaced soon, once I get my overhaul specs finalized)

  9. #2784
    Xtreme X.I.P.
    Join Date
    Apr 2005
    Posts
    4,475
    Quote Originally Posted by Sr7 View Post
Who games for efficiency? It's something old people gawk over, like gas mileage :P Let's be honest here... who is concerned about saving power while they play games? You'd do more by turning the lights out every time you leave a room. Idle consumption is obviously more important, because your computer is probably idling 90%+ of the time, not gaming.

Apparently you don't remember the 8800GTX or Ultra launch prices.
I remember the 7900GTX 512 launch price. And you're totally missing the point here. Of course every gamer would like to have an uber system with tri-SLI etc., but only a fraction of those people can actually afford it or are willing to spend that much money on it. Who cares about awesome performance if you can only imagine it while looking at graphs in reviews?
I play a lot of games, and for most of them even a 3870 is enough. And I do have a 24" screen at 1920x1200.

  10. #2785
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Cooper View Post
I remember the 7900GTX 512 launch price. And you're totally missing the point here. Of course every gamer would like to have an uber system with tri-SLI etc., but only a fraction of those people can actually afford it or are willing to spend that much money on it. Who cares about awesome performance if you can only imagine it while looking at graphs in reviews?
I play a lot of games, and for most of them even a 3870 is enough. And I do have a 24" screen at 1920x1200.
I don't understand your point... of course not everyone can or will spend that much. That's not the question. The question is why efficiency gets so much hype for GPUs when there's still plenty of performance left to chase (unlike CPUs).

  11. #2786
    Xtreme X.I.P.
    Join Date
    Apr 2005
    Posts
    4,475
Because an efficient design on small dies brings prices down and sales up (if competition is there, of course), which is good for the end user.

How could they say after 7 months that the ATi HD 4850 is a competitor to the nVidia 8800 GT
According to the preliminary results we've seen, the 4850 should outperform the 8800GTS 512, or *perhaps* even the GTX260... and that's a different price tag.
BTW, the 8800GT wasn't a competitor to the 3870 for 2-3 months after release, due to lack of availability and high prices.
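The small-die economics can be sketched with the standard dies-per-wafer approximation. The die areas below are the commonly reported ballpark figures for GT200 (~576 mm²) and RV770 (~256 mm²), and the defect density is an illustrative guess, so treat the output as a sketch of the mechanism, not real yield data:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross candidate dies: wafer area over die area, minus the usual
    edge-loss correction term."""
    r = wafer_diameter_mm / 2
    gross = math.pi * r**2 / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(gross - edge_loss)

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Simple Poisson yield model: bigger dies catch more defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

for name, area in [("GT200 ~576 mm^2", 576), ("RV770 ~256 mm^2", 256)]:
    n = dies_per_wafer(area)
    y = poisson_yield(area, defects_per_cm2=0.2)  # assumed defect density
    print(f"{name}: {n} dies/wafer, ~{y:.0%} yield, ~{n * y:.0f} good dies")
```

With those assumptions the smaller die gets roughly 2.5x the candidate dies per wafer and nearly twice the per-die yield, which compounds into several times as many sellable chips per wafer. That compounding is why die size shows up so directly in price.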

  12. #2787
    Xtreme Member
    Join Date
    May 2008
    Posts
    336
    Quote Originally Posted by Cooper View Post
Because an efficient design on small dies brings prices down and sales up (if competition is there, of course), which is good for the end user.

According to the preliminary results we've seen, the 4850 should outperform the 8800GTS 512, or *perhaps* even the GTX260... and that's a different price tag.
BTW, the 8800GT wasn't a competitor to the 3870 for 2-3 months after release, due to lack of availability and high prices.
As long as it outperforms a 9800 GTX, I'll be happy.

  13. #2788
    Xtreme Cruncher
    Join Date
    Jan 2005
    Location
    England
    Posts
    3,554
Any news on a release date? I was quite saddened to see the GTX 280 released before this.

    My Free-DC Stats
    You use IRC and Crunch in Xs WCG team? Join #xs.wcg @ Quakenet

  14. #2789
    Xtreme Mentor
    Join Date
    Jul 2004
    Posts
    3,247
    Quote Originally Posted by Origin_Unknown View Post
Any news on a release date? I was quite saddened to see the GTX 280 released before this.
    NDA expires on 25Jun
    HD4850 on the shelves in Hong Kong

  15. #2790
    Xtreme Cruncher
    Join Date
    Jan 2005
    Location
    England
    Posts
    3,554
    Quote Originally Posted by onethreehill View Post
    NDA expires on 25Jun
    HD4850 on the shelves in Hong Kong
Not too long to wait. I really want a pair of 4870s to replace my GTS 640.


  16. #2791
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by XSAlliN View Post


I was talking about specifications, not the size of the cards. Past generations from nVidia had smaller numbers in their specifications but were balanced, while ATi had huge numbers but was unbalanced. Take the ATi 29xx compared to the nVidia 88xx, followed by the ATi 38xx vs the G92 primary models: in both cases nVidia had better performance.

How could they say after 7 months that the ATi HD 4850 is a competitor to the nVidia 8800 GT? Xmas vs. summertime... now that's what I call a cat and mouse game.

Did I mention that I own an ATI HD 3870 and my last nVidia card was a GeForce4 MX 440? So yeah, I'm not a fanboy. I used to aim for best-buy products, and the FX 5xxx was an accident back then... after that it was just bad luck. I bought an ATi X1600 Pro (with GDDR3); one month later nVidia introduced the 7600GS, cheaper and with better performance. In January this year I went for the ATi HD 3870, which was cheaper than the 8800 GT and looked pretty capable... two months later the prices dropped like crazy and have kept on dropping ever since.
That's the tech world.

And really, only God knows how much my (dead) 2900XT has depreciated in price...

However, for ATI the 4850/4870 are much more forward-looking than the 8800/9800s. nVidia threw in 32 extra TMUs and 8 ROPs for no real reason, while being short on ALU and bandwidth.

Now with 32 TMUs it's decidedly "enough" for something that can perform nicely in games and, as a side effect, slaughter 3DMark06.

  17. #2792
    Xtreme Cruncher
    Join Date
    Jun 2006
    Posts
    6,215
    Quote Originally Posted by DilTech View Post
I just thought I'd chime in on this one; I guess people forgot something about COJ.

Think back to the launch of the DX10 patch. Remember what NVidia's big complaint about the title was? That's right: the title forces shader-based AA. So it's not really the title to use to gauge whether they fixed the card's AA performance, because it plays to the AA method ATi switched to with the R600 architecture. We'll need to see other titles with AA before we can accurately judge whether AA performance has been fixed...
AFAIK, jimmyz tested the card in Crysis and maybe another game and came to the same conclusion: AA is fixed and now causes a much, much smaller drop in fps than on the R600.

  18. #2793
    Xtreme Member
    Join Date
    May 2008
    Posts
    336
    Hey guys...

Why do you think they don't make higher-resolution monitors? I recently got a laptop with a 15-inch screen that displays 1920x1200. Gaming on it is beautiful: no jagged edges and no need for AA. I'm thinking there would be a good market for a 24-inch monitor that can do 2560x1600. Gamers and movie watchers would jump all over it.
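For what it's worth, the pixel-density arithmetic actually favours the laptop. A quick sketch (taking the 15-inch diagonal from the post, and PPI as pixels per inch along the diagonal):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

laptop = ppi(1920, 1200, 15.0)   # the 15" 1920x1200 panel from the post
desktop = ppi(2560, 1600, 24.0)  # the hypothetical 24" 2560x1600 monitor

print(f'15" 1920x1200: {laptop:.0f} PPI')
print(f'24" 2560x1600: {desktop:.0f} PPI')
```

So even a 24-inch 2560x1600 panel (~126 PPI) would be less dense than the 15-inch 1920x1200 one (~151 PPI): the laptop hides jaggies mostly because its pixels are tiny, and a bigger panel needs proportionally more pixels to match.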

  19. #2794
    Xtreme Enthusiast
    Join Date
    May 2007
    Location
    Ireland
    Posts
    940
    Quote Originally Posted by natty View Post
    Hey guys...

Why do you think they don't make higher-resolution monitors? I recently got a laptop with a 15-inch screen that displays 1920x1200. Gaming on it is beautiful: no jagged edges and no need for AA. I'm thinking there would be a good market for a 24-inch monitor that can do 2560x1600. Gamers and movie watchers would jump all over it.
Too expensive...

  20. #2795
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    Quote Originally Posted by natty View Post
    Hey guys...

Why do you think they don't make higher-resolution monitors? I recently got a laptop with a 15-inch screen that displays 1920x1200. Gaming on it is beautiful: no jagged edges and no need for AA. I'm thinking there would be a good market for a 24-inch monitor that can do 2560x1600. Gamers and movie watchers would jump all over it.

Because an LCD's quality depends on running at its native resolution. I don't see how you find a game playable, or what games you can play, on a 15-inch laptop screen.

...or is that a laptop with a CRT screen?

  21. #2796
    Xtreme Member
    Join Date
    May 2008
    Posts
    336
    Quote Originally Posted by XSAlliN View Post
Because an LCD's quality depends on running at its native resolution. I don't see how you find a game playable, or what games you can play, on a 15-inch laptop screen.

...or is that a laptop with a CRT screen?
No, it's not a CRT, it's a normal laptop LCD from Dell. Over at their website they call it a 'Hi-Def display'. I was playing Quake 3 with no problems. It looked absolutely gorgeous: none of the jagged edges that you normally get when you play without anti-aliasing.

I'm just asking why they can't make LCDs with higher native resolutions than what we currently have.

  22. #2797
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by DilTech View Post
    I just thought I'd chime in on this one, I guess people forgot something about COJ.

    Think back to the actual launch of the DX10 patch release, remember what NVidia's big complaint about the title was? That's right, the title forces shader AA. As in, it's not exactly the title to gauge whether or not they fixed the AA performance of the card, because it's actually made for ATi's method of AA they switched to with the R600 architecture. We'll need to see other titles with AA before we can accurately gauge if AA has been fixed or not...
I knew that, I was just hoping none of you guys picked up on it.

But seriously, jimmyz also tested Crysis with 4xAA 6xAF @ 1920x1200 and got 20 fps; that's 20 more than what the 3870/2900XT could get.
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  23. #2798
    Xtreme Addict
    Join Date
    Feb 2006
    Location
    Vienna, Austria
    Posts
    1,940
    Quote Originally Posted by Shintai View Post

    So for me 4x00 and 2x0 series is just something I skip.
True, my 3870 CF plays everything I want with lots of eye candy, and Crysis isn't a game I'm going to upgrade my PC for; besides, I can play it on High at 1920x1200 anyway.

Well, in a few months there may be a game worth upgrading for.
    Last edited by Cooper; 06-17-2008 at 04:13 AM. Reason: BS removed from quote
    Core i7 2600k|HD 6950|8GB RipJawsX|2x 128gb Samsung SSD 830 Raid0|Asus Sabertooth P67
    Seasonic X-560|Corsair 650D|2x WD Red 3TB Raid1|WD Green 3TB|Asus Xonar Essence STX


    Core i3 2100|HD 7770|8GB RipJawsX|128gb Samsung SSD 830|Asrock Z77 Pro4-M
    Bequiet! E9 400W|Fractal Design Arc Mini|3x Hitachi 7k1000.C|Asus Xonar DX


    Dell Latitude E6410|Core i7 620m|8gb DDR3|WXGA+ Screen|Nvidia Quadro NVS3100
    256gb Samsung PB22-J|Intel Wireless 6300|Sierra Aircard MC8781|WD Scorpio Blue 1TB


    Harman Kardon HK1200|Vienna Acoustics Brandnew|AKG K240 Monitor 600ohm|Sony CDP 228ESD

  24. #2799
    Xtreme Member
    Join Date
    Dec 2007
    Posts
    103
    800SP 40TMU pretty much confirmed!

    http://forum.beyond3d.com/showpost.p...postcount=3718



    Last edited by tenebre; 06-17-2008 at 04:26 AM.

  25. #2800
    Xtreme Addict
    Join Date
    Mar 2007
    Location
    United Kingdom
    Posts
    1,597
    Quote Originally Posted by Papu View Post
from what's been said in support, this has been drastically improved
ATi will support Havok on their GPUs, so maybe it's nVidia's lack of Havok that's the bigger issue?

back in the day... nuff said..

good thing the 4850 isn't a 2900; it's improved and 55nm
I hope that you are right, Papu, as it would really give ATi the edge. It has been a while since I had an ATi card, so I am looking forward to being back on the Red.

However, a lot of people do not realise that the 4850 is in fact the RV770PRO, which has 480SP and 24TMU. I am guessing the 4870 is the full RV770 form, with the complete 800SP and 48TMU.
Where did I get that from?

The box blurb, which was accidentally posted on Amazon.com:

    Diamond Radeon 4850PE3512
    512MB Memory, GDDR3 Memory Configuration,
    256-bit Memory Interface,
    Display Formats:x2 Dual Link DVI + HDTV-out with built-in HDMI,
    PCI Express 2.0 support,
    480 Stream Processing units,
    Direct X 10.1 / Shader Model 4.1,OpenGL 2.0,
    ATI Crossfire X Multi-GPU Support for highly scalable performance (Up to four GPU support with an AMD 790FX based motherboard),
High-speed 128-bit HDR (High Dynamic Range) rendering,
55nm process technology,
ATI Avivo HD Video and Display Technology,
Game Physics processing capability,
Up to 24X Custom Filter Anti-Aliasing,
Unified Video Decoder (UVD) for Blu-ray and HD DVD,
ATI PowerPlay energy conserving technology.
Any ideas when the NDA is going to be lifted, and when the HD4870 (1GB GDDR5) will be released?

I am still tempted by the Zotac AMP 700MHz GTX 280; however, if the HD4870 comes close (within 10%) at 66% of the price... I will certainly buy the RADEON over the GeForce.

    John

