Page 111 of 167
Results 2,751 to 2,775 of 4151

Thread: ATI Radeon HD 4000 Series discussion

  1. #2751
    Xtreme Enthusiast
    Join Date
    Feb 2005
    Posts
    970
    I get the feeling there's going to be more to all of this than just great fps. Although I don't want to step too far out on the plank!


    http://rage3d.com/board/showthread.php?t=33925670

    Quote Originally Posted by lupine
    Then? Cinema 2.0 demo Ruby, developed by otoy. The demo was fantastic. Cinema 2.0 rendering? Jawdropping. I for one have never seen anything like it.
    Last edited by flippin_waffles; 06-16-2008 at 08:07 PM.

  2. #2752
    Xtreme Member
    Join Date
    Jun 2008
    Location
    Edinburgh
    Posts
    195
    Quote Originally Posted by disruptfam View Post
    What are the chances of the 4870 beating out the GTX 280?
    Coming within 10% would cause a total furor for the 4870 at half the price, because it would land between the GTX 260 and 280, which would be enough to collect almost all the sales in this high-end segment. We can only pray. I'm really astonished at ATI's pricing policy, but I like it.
    MPOWER|i5 3570K|TRUE Spirit 140|2x4GB+2x2GB|VTX3D 280X|SanDisk Extreme 120GB|HX520W|Arc Midi|G2222HDL|G400s+QcK|Xonar DGX
    F1A75-V EVO|3870K|Venomous X|2x4GB|5830+DeepCool V400|F4EG 2TB|Solid 3 120GB|Silencer MKIII 500W|NZXT Source 210 Elite|IPS226V|Xonar DG
    Overclock yourself, you must!!!

  3. #2753
    Xtreme Enthusiast
    Join Date
    Sep 2007
    Location
    Sydney - Australia
    Posts
    515
    Saw this while browsing the [H] forums - it appears the 4870 will not be available till July 8 and is aimed at the 9800 GTX, while the R700 is taking on the new GT200 series and is due out in about 8 weeks (August). Also has some screenshots of the new Cinema 2.0 Ruby demo.

    http://www.hardforum.com/showthread.php?t=1316251

  4. #2754
    Xtreme Member
    Join Date
    Aug 2007
    Location
    San Diego, CA
    Posts
    330
    What a shame; two months is way too far off. By that time good drivers for the new Nvidia cards will be out, and prices will be lower than they are right now.
    Quote Originally Posted by Kunaak View Post
    High-end video cards are like hot girls.

    you're never gonna pick the fat girl with the nice personality over the smoking hot 10.

    E8400
    DFI X38
    ATI 4870 X2
    8 Gb GSkill
    2x 36 Gb Raptor RAID 0
    750 Gb WD
    CPU, NB Water Cooled

  5. #2755
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    216
    http://www.amd.com/us-en/assets/cont...D_Ruby_S04.swf

    Here is the movie of Cinema 2.0 demo WITH Ruby.

    "ATI icon “Ruby” stars in the first-ever Cinema 2.0 experience.

    Rendered in real time and interactive, this is a brief video from the first Cinema 2.0 demo, premiered by AMD in San Francisco on June 16, 2008. The interactive demo was rendered by a single PC equipped with two graphics cards codenamed "RV770," powered by an AMD Phenom™ X4 9850 processor and the AMD 790FX chipset. The full demo shows cinema-quality digital images rendered in real time with interactivity. Check back later this summer for a video of the full Ruby Cinema 2.0 demo.
    (0:11) "
    Last edited by SimBy; 06-16-2008 at 10:24 PM.

  6. #2756
    Banned
    Join Date
    Jan 2008
    Location
    Canada
    Posts
    707
    hmmmm I don't like the sound of this. Soft launches, high end in "8 weeks" etc. Come on AMD, launch on time and no paper launches.

  7. #2757
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    216
    If it's something other than X-Fire on a card, it's worth the wait, because it will probably be a game-changer.

  8. #2758
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    464
    Quote Originally Posted by SimBy View Post
    http://www.amd.com/us-en/assets/cont...D_Ruby_S04.swf

    Here is the movie of Cinema 2.0 demo WITH Ruby.
    Liked the video; reminds me a little of Dog in HL2.

  9. #2759
    Xtreme Member
    Join Date
    Oct 2007
    Posts
    407
    launch on time and no paper launches.
    I like paper launches as long as the date of availability is accurate. I hate all this Nvidia cloak-and-dagger NDA stuff, with everything about the card being secret until a day before store availability. It's ridiculous, and it makes it hard to plan a system in advance. Do you think I would be sitting with a bloody 8800 GTS that I bought in late March if I knew that Nvidia was releasing a totally new architecture in June?! It seems like a cheap/dirty way to scam us into buying cards that we wouldn't otherwise buy. On the other hand you have Intel, which announces a new architecture 6-12 months ahead of time. I am already planning my Nehalem upgrade for Jan-Feb 2009 because I can. With Nvidia we could never make plans like that. All we have are rumours. It's true that paper launches (like Seagate's 1TB drive announcement) with no concrete time frame of actual availability are extremely annoying, but I don't think complete ignorance is a very good option either. Maybe they should pretend to be consumers for a second and ask themselves what they would like.

  10. #2760
    Xtreme Addict
    Join Date
    Jul 2006
    Location
    Between Sky and Earth
    Posts
    2,035
    If the HD 4870 is competing with the 9800 GTX, then where is the HD 5870, the competitor for Nvidia's next generation of cards? The 8800 GT was released in December 2007, and it took them 7 months to bring out a competing product; by that logic, the next generation should be available about 7 months after Nvidia's release of the GTX 260/280...


    I'm sick of this cat-and-mouse game: Nvidia turned out to be Jerry, the small and smart mouse (specification-wise), while ATI got the role of Tom, the big and dumb cat. This conclusion is based on past and present facts, not future wannabes...

  11. #2761
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by XSAlliN View Post
    If the HD 4870 is competing with the 9800 GTX, then where is the HD 5870, the competitor for Nvidia's next generation of cards? The 8800 GT was released in December 2007, and it took them 7 months to bring out a competing product; by that logic, the next generation should be available about 7 months after Nvidia's release of the GTX 260/280...


    I'm sick of this cat-and-mouse game: Nvidia turned out to be Jerry, the small and smart mouse (specification-wise), while ATI got the role of Tom, the big and dumb cat. This conclusion is based on past and present facts, not future wannabes...
    The 4850 is the competition for the 9800 GTX, and it's priced at $200.

    From what I've been hearing, the 4870 is creeping closer to the GTX 260.

    I don't understand where you're getting the rest of your rant from. Considering how far behind they were at this point a year ago, the fact that people are saying these $199 and $299 cards are going to perform well is good news.

  12. #2762
    Xtreme X.I.P.
    Join Date
    Apr 2005
    Posts
    4,475
    Quote Originally Posted by XSAlliN View Post
    I'm sick of this cat-and-mouse game: Nvidia turned out to be Jerry, the small and smart mouse (specification-wise), while ATI got the role of Tom, the big and dumb cat. This conclusion is based on past and present facts, not future wannabes...


    Now who's the big dumb cat in that pic?

  13. #2763
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Cooper View Post


    Now who's the big dumb cat in that pic?
    They left off every other possible spec which is *not* in their favor.

    I find it funny how many people complain about efficiency. Do you sit there and measure the watts drawn while you play your game, too?

    "OMG TURN YOUR GUY AND LOOK AT A WALL, QUICK, YOU'RE DRAWING TOO MUCH POWER. I DON'T CARE IF THIS IS A BOSS FIGHT LOOK AT THE F'ING WALL."

    What happened to the days when gaming was about the experience you get?

  14. #2764
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by XSAlliN View Post
    I'm sick of this cat-and-mouse game: Nvidia turned out to be Jerry, the small and smart mouse (specification-wise), while ATI got the role of Tom, the big and dumb cat. This conclusion is based on past and present facts, not future wannabes...

    When the GT200 is MORE THAN TWICE AS LARGE as the RV770, I'm certain you're just... wrong.

    If AMD wanted to slaughter Nvidia single-handedly in the high end, they'd make a chip as big as the 8800 GTX/2900 XT: still smaller than the GT200, but the performance? Truly owned.


    AMD is being the smart guy here, choosing the right architecture.

  15. #2765
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Macadamia View Post
    When the GT200 is MORE THAN TWICE AS LARGE as the RV770, I'm certain you're just... wrong.

    If AMD wanted to slaughter Nvidia single-handedly in the high end, they'd make a chip as big as the 8800 GTX/2900 XT: still smaller than the GT200, but the performance? Truly owned.


    AMD is being the smart guy here, choosing the right architecture.
    Guy, they can't afford to make big chips anymore. The correct thing to say is that AMD is "playing the cards they have the best they can."

  16. #2766
    Xtreme Addict
    Join Date
    Apr 2008
    Location
    Lansing, MI / London / Stinkaypore
    Posts
    1,788
    Quote Originally Posted by Sr7 View Post
    Guy, they can't afford to make big chips anymore. The correct thing to say is that AMD is "playing the cards they have the best they can."
    Really?
    What's a Phenom (Barcelona) then?


    There's the problem of needing both GDDR5 and a >256-bit memory bus for enough bandwidth, though. The former hurts prices; the latter hurts ATI's chip size more, because the ring bus itself is quite huge. The "SP"s and TMUs are much cheaper, though. Cheap enough, I might add, for Nvidia to be very afraid.


    These cards can do double-precision floating point using all their existing SP units, while Nvidia had to add in new ones and got a slower DP speed cap (200 GFLOPS on RV770 vs. 70-something GFLOPS on GT200, if my memory is right).
    Last edited by Macadamia; 06-16-2008 at 11:42 PM. Reason: Forgot the Giga, loool
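    Those DP figures can be sanity-checked with back-of-envelope math. A minimal sketch, assuming the rumored clocks of the day (625 MHz for a 4850-class RV770, ~1.296 GHz for the GT200 shader domain) and the commonly reported 1/5-rate DP on RV770; none of these are confirmed specs:

```python
# Peak shader throughput: each SP does a multiply-add = 2 FLOPs per clock.
def sp_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

# RV770 (4850-class, assuming a 625 MHz clock): DP reportedly runs at 1/5 the SP rate.
rv770_dp = sp_gflops(800, 0.625) / 5   # 1000 / 5 = 200 GFLOPS

# GT200: 30 dedicated DP units, assuming the ~1.296 GHz shader clock.
gt200_dp = sp_gflops(30, 1.296)        # about 78 GFLOPS

print(rv770_dp, gt200_dp)
```

    Under those assumptions, both numbers land right where the post puts them: 200 GFLOPS vs. 70-something.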

  17. #2767
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by Macadamia View Post
    Really?
    What's a Phenom (Barcelona) then?


    There's the problem of needing both GDDR5 and a >256-bit memory bus for enough bandwidth, though. The former hurts prices; the latter hurts ATI's chip size more, because the ring bus itself is quite huge. The "SP"s and TMUs are much cheaper, though.
    What's a Phenom? Mostly cache.

    Size of chip != size of logic.

    Besides, the point is they can't afford to make both big CPUs and big graphics chips.

  18. #2768
    Xtreme Member
    Join Date
    Oct 2007
    Location
    Sydney, Australia
    Posts
    466
    Quote Originally Posted by Sr7 View Post
    What's a Phenom? Mostly cache.

    Size of chip != size of logic.

    Besides, the point is they can't afford to make both big CPUs and big graphics chips.
    Yeah, they've admitted to that, so that's nothing new. I think their strategy is a very good one. A card that rivals a 9800 GTX for $200 sounds terrific.

    They simply can't compete at the high end, so they are bringing out cards that perform very well and are cheap to make (and therefore cheap for us).
    So long as we the consumers win, I'm happy.

  19. #2769
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Like I said, read the article on EETimes about the launch of the 48xx's and the GTX200's: here

    Some quotes:
    "We didn't want to come out with one monolithic GPU and then disable parts of it for different markets," said an AMD spokesman prior to a full disclosure of the part in a briefing in San Francisco on June 16.

    The strategy makes sense for the financially troubled AMD which also has laid out conservative road maps for its computer processors. The graphics choice reduces costs and risks while maximizing returns for the company which has suffered through multiple loss-making quarters.

    The decision to use a two-chip strategy for the high end was made more than two years ago, based on an analysis of yields and scalability. It was not related to AMD's recent financial woes, said Rick Bergman, general manager of AMD's graphics division.

    "I predict our competitor will go down the same path for its next GPU once they see this," Bergman said. "They have made their last monolithic GPU."
    I have definitely heard rumblings that Nvidia is moving toward more of an ATI-esque approach to this, though to what extent I do not know. I know that most GPU manufacturers lay out the specs for generations long in advance (for example, the G80 was supposedly designed years before it ever went into actual production), so that part I do believe; but I also believe their economic situation plus the failure of the R600 pushed them down this path a lot faster/deeper than anticipated.

  20. #2770
    Xtreme Enthusiast
    Join Date
    Sep 2006
    Posts
    881
    It might have to do with TSMC not being able to get good yields on that big a die, so they have to go with smaller dies. From the size of the die, I'm pretty sure it won't have 800 SPs; the originally rumored 480 SPs would probably be the case.

  21. #2771
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by zerazax View Post
    Like I said, read the article on EETimes about the launch of the 48xx's and the GTX200's: here

    Some quotes:


    I have definitely heard rumblings that Nvidia is moving towards more of an ATI-esque approach to this, though to what extent I do not know. I know that most GPU manufacturers lay the specs for generations long in advance (for example, G80 was supposedly designed years before they ever went into actual production) so that part I do believe in, but I also believe their economic situation + failure of the R600 pushed them down this path a lot faster/deeper than anticipated.
    You heard those very rumblings from AMD. What do you expect? They want to paint themselves as industry leaders. Every company in every industry does this; only some are right, though. It's harder/more expensive to go bigger. It's easier/cheaper to retreat to lower-end chips.

  22. #2772
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    It has to have 800 SPs if the 4850 reaches 1 TFLOP, unless there's a 1 GHz shader domain we haven't seen; but that wasn't detected by the latest version of GPU-Z by w1z, so I'm believing it's 800 SPs.

    P.S. Die size isn't an indicator of how many SPs are in there, since we don't know exactly how much space each SP actually takes up... looking at the GT200 die shots, the SPs don't even use as much space as we think.
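    The 800-SP inference is simple arithmetic. A quick sketch, assuming the rumored 625 MHz 4850 core clock and a multiply-add (2 FLOPs) per SP per clock:

```python
# Peak GFLOPS = shaders * 2 FLOPs per clock (multiply-add) * clock in GHz.
def peak_gflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz

# 800 SPs on a single 625 MHz clock already hits the claimed 1 TFLOP:
assert peak_gflops(800, 0.625) == 1000.0

# With only 480 SPs, a separate ~1 GHz shader domain would be needed:
required_clock = 1000 / (480 * 2)   # about 1.04 GHz
print(required_clock)
```

    So either the chip has 800 SPs at the core clock, or the 480-SP rumor only works with a fast shader domain that GPU-Z would have shown.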

  23. #2773
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by Sr7 View Post
    You heard those very rumblings from AMD. What do you expect? They want to paint themselves as industry leaders. Every company in every industry does this; only some are right, though.
    Right, a lot of it is PR, but I can also see some of the logic behind it: since neither company has fabs for their GPUs, they really are at the mercy of the foundries' process schedules. So if yields on monolithic GPUs aren't up to par on the best process you can afford, it's a risky proposition to put all your marbles in one bag, no matter how well off you are as a company.

    BTW, those rumblings about Nvidia weren't from an AMD source at all; it was actually an Nvidia source, and I know some of it was published somewhere recently but lost in the shuffle of the GT200 launch.

  24. #2774
    Banned
    Join Date
    Feb 2008
    Posts
    696
    Quote Originally Posted by zerazax View Post
    It has to have 800 SPs if the 4850 reaches 1 TFLOP, unless there's a 1 GHz shader domain we haven't seen; but that wasn't detected by the latest version of GPU-Z by w1z, so I'm believing it's 800 SPs.

    P.S. Die size isn't an indicator of how many SPs are in there, since we don't know exactly how much space each SP actually takes up... looking at the GT200 die shots, the SPs don't even use as much space as we think.
    It's possible the shader domain is not detected because the application is not set up to properly detect it on ATI, since it's new to ATI.

    It doesn't necessarily mean it doesn't exist.

  25. #2775
    Xtreme Addict
    Join Date
    May 2007
    Posts
    2,125
    Quote Originally Posted by Sr7 View Post
    It's possible the shader domain is not detected because the application is not set up to properly detect it on ATI, since it's new to ATI.

    It doesn't necessarily mean it doesn't exist.
    Dunno, but w1z writes a lot of the programs and recently updated GPU-Z after he received his 4850 last week, and that was one part that wasn't updated. So I have my doubts there is a shader domain; w1z really is a wiz and hasn't had problems elsewhere with that.

