Page 36 of 167 FirstFirst ... 26333435363738394686136 ... LastLast
Results 876 to 900 of 4151

Thread: ATI Radeon HD 4000 Series discussion

  1. #876
    Xtreme Addict
    Join Date
    Apr 2007
    Posts
    2,128
    Quote Originally Posted by Shintai View Post
    Well, in short: after all investments etc., nVidia makes about $500 million in cash profit a year, while ATI makes about a $30 million loss. nVidia's quarterly revenue is even bigger than ATI's yearly revenue.

    nVidia spends $1 billion on R&D. That's almost ATI's entire revenue.
    Sounds quite interesting. May I ask a source?

  2. #877
    Registered User
    Join Date
    Jun 2007
    Posts
    47
    Quote Originally Posted by perkam View Post
    ?!?!?!?!

    What is with the 2X increase in Mem speeds between 4850 and 4870 0_0 !!!!

    Perkam
    Hello Perkam, I will explain it to you.
    The 4870 is one of the highest models. Because of that it has a memory clock of 3920MHz.
    The 4450 is a lower model, therefore it is a bit slower, thus having a 2000MHz memory clock.

    Not convinced yet?
    Check out these differences!
    4870
    4750
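The effective-rate figures above translate into peak memory bandwidth with simple arithmetic. A Python sketch (the 256-bit bus widths are an assumption for illustration, not from the post):

```python
def bandwidth_gb_s(effective_mhz, bus_bits=256):
    """Peak bandwidth = effective transfer rate * bus width in bytes."""
    return effective_mhz * 1e6 * (bus_bits // 8) / 1e9

hd4870 = bandwidth_gb_s(3920)  # 3920 MHz effective -> ~125.4 GB/s
lower  = bandwidth_gb_s(2000)  # 2000 MHz effective -> ~64.0 GB/s
print(round(hd4870, 1), round(lower, 1))
```

So the "2x" jump in memory clock is roughly a 2x jump in peak bandwidth, bus width being equal.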


  3. #878
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by Calmatory View Post
    Sounds quite interesting. May I ask a source?
    http://phx.corporate-ir.net/phoenix....&p=irol-IRHome

    http://www.amd.com/us-en/Corporate/I...06_643,00.html
    Crunching for Comrades and the Common good of the People.

  4. #879
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by adamsleath View Post
    but total revenues for each company in the discrete gcard market is...?


    i'll take a guess then that nvidia's revenues/turnover of units is much higher than amdati's, but i havent seen any figures of total revenues for nvidia , but how does it break down ...like there are chipsets, discrete gcards, the cpu side of amd, etc.

    http://finance.google.com/finance?q=NASDAQ:NVDA
    http://finance.google.com/finance?q=NYSE:AMD

    AMD:


    2007 total revenue: 6.013billion
    2007 net income: -3.379billion


    NVIDIA:


    2007 total revenue: 3.068billion
    2007 net income: 0.448billion

    wtf?

    do amd losses go into gold plated rolls royces for all employees or something? wtf?

    and yet the figures show that amd had twice the total revenue vs nvidia in 2007
    well if you compare the cpu+gpu business of amd to the gpu business of nv alone, no wonder amd has the bigger revenue.

  5. #880
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    yeah, still no idea how amdati's graphics/chipsets compare to nvidia for total revenues.

    nor why or where all amd's money goes... i.e. expenses, i.e. expensive, i.e. not profitable, whether cpu or graphics or both.
    i.e. they must have a very large overdraft facility or other creative accounting methods.

    because $6 billion in revenues is a lot of $$, and i would surmise that there are some fat cats getting very rich in the amd camp, despite the creative financial statements.
    Last edited by adamsleath; 05-21-2008 at 06:56 AM.
    i7 3610QM 1.2-3.2GHz

  6. #881
    Live Long And Overclock
    Join Date
    Sep 2004
    Posts
    14,058
    Quote Originally Posted by adamsleath View Post
    but total revenues for each company in the discrete gcard market is...?


    i'll take a guess then that nvidia's revenues/turnover of units is much higher than amdati's, but i havent seen any figures of total revenues for nvidia , but how does it break down ...like there are chipsets, discrete gcards, the cpu side of amd, etc.

    http://finance.google.com/finance?q=NASDAQ:NVDA
    http://finance.google.com/finance?q=NYSE:AMD

    AMD:


    2007 total revenue: 6.013billion
    2007 net income: -3.379billion


    NVIDIA:


    2007 total revenue: 3.068billion
    2007 net income: 0.448billion

    wtf?

    do amd losses go into gold plated rolls royces for all employees or something? wtf?

    and yet the figures show that amd had twice the total revenue vs nvidia in 2007
    Financial analysis takes more than quoting a few numbers m8.

    Not to mention, you should be applauding ATI for being able to compete with Nvidia on the same level, despite having revenues 1/3rd the size of Nvidia's.

    Perkam

  7. #882
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by adamsleath View Post
    yeah, still no idea how amdati's graphics/chipsets compare to nvidia for total revenues.

    nor why or where all amd's money goes... i.e. expenses, i.e. expensive, i.e. not profitable, whether cpu or graphics or both.
    i.e. they must have a very large overdraft facility or other creative accounting methods.

    because $6 billion in revenues is a lot of $$, and i would surmise that there are some fat cats getting very rich in the amd camp, despite the creative financial statements.
    took me 1 min to find:

    year end 2007:
    Graphics
    Net revenue: 903 Million$
    Operating income (loss): (100) Million$

    http://www.amd.com/us-en/assets/cont...Financials.pdf


    1st Quarter 2008:
    Graphics
    Net revenue: 203 Million$
    Operating income (loss): (11) Million$

    http://www.amd.com/us-en/assets/cont...Financials.pdf

  8. #883
    Xtreme Enthusiast
    Join Date
    Mar 2007
    Location
    Portsmouth, UK
    Posts
    963
    Quote Originally Posted by adamsleath View Post
    yeah, still no idea how graphics/chipsets amdati compare to nvidia for total revenues.

    nor why or where all amd's money goes...ie expenses ie expensive ie not profitable, whether cpu or graphics or both.
    ie they must have a very large overdraft facility or other creative accounting methods.

    because 6 billion in revenues is a lot of $$, and i would surmise that there are some fat cats getting very rich in the amd camp. despite the creative financial statements.
    Somewhere on XS someone linked to pdf files of the financials to show that ATI is still losing money & nVidia is making it by the shedload. Search is your friend.

  9. #884
    Xtreme Enthusiast
    Join Date
    Nov 2005
    Location
    Sweden, Örebro
    Posts
    818
    Quote Originally Posted by AliG View Post
    so andreas any idea why they went with 256 bit with the midrange last generation but are giving 256 to even the lower end this generation?
    I don't think we will see 256-bit with lower-end cards. They don't need the bandwidth.

    Quote Originally Posted by Shintai View Post
    Economics and GDDR5.
    GDDR5 is a bit extreme for low-end, don't you think?

    //Andreas

  10. #885
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by NH|Delph1 View Post
    GDDR5 is a bit extreme for low-end, don't you think?

    //Andreas
    I was thinking GDDR5/256bit for highend vs a 512bit/GDDR3-4.
    Last edited by Shintai; 05-21-2008 at 07:53 AM.

  11. #886
    Xtreme Enthusiast
    Join Date
    Nov 2005
    Location
    Sweden, Örebro
    Posts
    818
    Quote Originally Posted by Shintai View Post
    I was thinking GDDR5/256bit for highend vs a 512bit/GDDR3-4.
    Then I completely agree with you; it makes no sense using 512-bit and GDDR5.

    //Andreas

  12. #887
    Xtreme Addict
    Join Date
    Dec 2002
    Posts
    1,250
    3 weeks to go and no solid info?

    Great job hiding the cards.
    4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11 (one of the 26% that isnt confused on xtreme forums)

  13. #888
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    Quote Originally Posted by NH|Delph1 View Post
    I don't think we will see 256-bit with lower-end cards. They don't need the bandwidth.
    Isn't the rv730 low end? Or is that the rv710, in which case, what is the rv740?
    Quote Originally Posted by Hans de Vries View Post

    JF-AMD posting: IPC increases!!!!!!! How many times did I tell you!!!

    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    terrace215 post: IPC decreases, The more I post the more it decreases.
    .....}
    until (interrupt by Movieman)


    Regards, Hans

  14. #889
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    Quote Originally Posted by Rob Halford View Post
    Afraid that ATI will be dead and gone if they don't come up with something really good very soon. If the Nvidia 280/260 is as powerful as rumoured, ATI is in big trouble. The only thing keeping them in business now is fanboyism and the fact that ATI got a free ride with Intel - Intel chipsets rock (and have CF support) and Nvidia chipsets (680/780/790) suck donkey balls...

    ps. apart from that ATI graphics is a lot better in BF2 which is the one and only game I play...
    What a coincidence. BF2 is the game I play the most too. Games like bioshock come and go but BF2 stays.

  15. #890
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    basically anything that uses the valve engine (which I believe BF2 does since it's a steam game) runs extremely nicely with ati cards, which will also help them out.


    But the way I see it, gt 200 is designed if anything to scare intel, which I bet it will until Larrabee comes out, and even then Larrabee won't be able to compete with ati and nvidia's high end no matter how big intel's R&D fund is. The first generation is generally a test to see how well certain aspects work; in the case of g80 and core 2, both performed above and beyond expectations, while the r600 and k10, while very modular, did not perform up to expectations. I wouldn't be surprised to see a second/third generation Larrabee that can outperform the gtx 280 though.

    So in short, gtx 280 will spank the rv770, but the rv770 was never meant to compete with it anyway. What'll be key is how well the MCM design works out for the r700 and how well the card scales (hopefully they won't need that bridge chip anymore, as that definitely added cost and heat to the 3870x2)

  16. #891
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by AliG View Post
    basically anything that uses the valve engine (which I believe BF2 does since it's a steam game) runs extremely nicely with ati cards, which will also help them out.
    wrong, BF2 uses a custom game engine made by DICE.

    Distribution over Steam doesn't mean that it uses the Source engine.

  17. #892
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    Quote Originally Posted by AliG View Post
    basically anything that uses the valve engine (which I believe BF2 does since it's a steam game) runs extremely nicely with ati cards, which will also help them out.


    But the way I see it, gt 200 is designed if anything to scare intel, which I bet it will until Larrabee comes out, and even then Larrabee won't be able to compete with ati and nvidia's high end no matter how big intel's R&D fund is. The first generation is generally a test to see how well certain aspects work; in the case of g80 and core 2, both performed above and beyond expectations, while the r600 and k10, while very modular, did not perform up to expectations. I wouldn't be surprised to see a second/third generation Larrabee that can outperform the gtx 280 though.

    So in short, gtx 280 will spank the rv770, but the rv770 was never meant to compete with it anyway. What'll be key is how well the MCM design works out for the r700 and how well the card scales (hopefully they won't need that bridge chip anymore, as that definitely added cost and heat to the 3870x2)

    No, the BF2 engine is designed and programmed by the "engineers" at DICE IIRC. The engine features ugly-looking bump maps and large environments. By contrast, the Valve Source engine features nice-looking bump/normal maps but tiny, tiny little low-poly environments.

    I wonder how Larrabee will fare against nvidia's/ati's offerings at the time of release!

  18. #893
    Xtreme Guru
    Join Date
    May 2007
    Location
    Ace Deuce, Michigan
    Posts
    3,955
    At best, they'll offer a good "performance"-segment competitor, but that's just my opinion. Intel has far more capital to put towards R&D than either ati or nvidia, but I'm not sure how much good it will do them to design an x86 gpu.

  19. #894
    Xtreme Member
    Join Date
    Sep 2007
    Posts
    168
    I think Larrabee will be worse at raster rendering than the competitors' offerings, but when ray tracing finally becomes available in games it will kick ass.

  20. #895
    Xtreme Cruncher
    Join Date
    Aug 2006
    Location
    Denmark
    Posts
    7,747
    Quote Originally Posted by AliG View Post
    basically anything that uses the valve engine (which I believe BF2 does since it's a steam game) runs extremely nicely with ati cards, which will also help them out.


    But the way I see it, gt 200 is designed if anything to scare intel, which I bet it will until Larrabee comes out, and even then Larrabee won't be able to compete with ati and nvidia's high end no matter how big intel's R&D fund is. The first generation is generally a test to see how well certain aspects work; in the case of g80 and core 2, both performed above and beyond expectations, while the r600 and k10, while very modular, did not perform up to expectations. I wouldn't be surprised to see a second/third generation Larrabee that can outperform the gtx 280 though.

    So in short, gtx 280 will spank the rv770, but the rv770 was never meant to compete with it anyway. What'll be key is how well the MCM design works out for the r700 and how well the card scales (hopefully they won't need that bridge chip anymore, as that definitely added cost and heat to the 3870x2)
    Even though that's highly likely, remember Intel can use its own foundries and design the chip and process for maximum performance. Plus a process lead. Imagine Larrabee on a high-k metal 45nm process today. An inferior product on a better process can change the tide.

  21. #896
    Xtreme Addict
    Join Date
    May 2007
    Location
    'Zona
    Posts
    2,346
    I like how I give XS a crash course in ACC and now everyone is a pro...

  22. #897
    I am Xtreme
    Join Date
    Jul 2007
    Location
    Austria
    Posts
    5,485
    Quote Originally Posted by Nuker_ View Post
    No the BF2 engine is designed and programmed by the "engineers" at DICE IIRC. The engine features ugly looking bump maps and large environments. On the contrary, the Valve Source engine features nice looking bump/normal maps and moderately large environments
    fixed, play EP2 and call it again tiny tiny.

  23. #898
    Xtreme Guru adamsleath's Avatar
    Join Date
    Nov 2006
    Location
    Brisbane, Australia
    Posts
    3,803
    Quote Originally Posted by Hornet331 View Post
    took me 1 min to find:

    year end 2007:
    Graphics
    Net revenue: 903 Million$
    Operating income (loss): (100) Million$

    http://www.amd.com/us-en/assets/cont...Financials.pdf


    1st Quarter 2008:
    Graphics
    Net revenue: 203 Million$
    Operating income (loss): (11) Million$

    http://www.amd.com/us-en/assets/cont...Financials.pdf
    thx for that.

    for 2007, out of $2.262 billion gross margin AMD spent $1.847 billion on R&D

    & graphics accounts for less than 1/6th (at $0.903bn) of amd's total revenue for 2007... compared to nvidia's $3+ billion...
    so that's nv total revenue roughly 3x amd graphics revenue.

    which could translate (on a per-$ basis) into roughly 3 nvidia cards sold for every 1 ati card, more or less.
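The ratio arithmetic above checks out with the figures quoted in this thread. A quick Python sanity check (note that reading the revenue ratio as a unit-sales ratio assumes similar average selling prices, which is a big assumption):

```python
# 2007 figures quoted earlier in the thread, in $billions
amd_total_rev = 6.013   # AMD total revenue
amd_gfx_rev   = 0.903   # AMD Graphics segment net revenue
nvidia_rev    = 3.068   # NVIDIA total revenue

print(round(amd_gfx_rev / amd_total_rev, 3))  # graphics share of AMD total: ~0.15, i.e. under 1/6
print(round(nvidia_rev / amd_gfx_rev, 1))     # NV revenue vs AMD graphics revenue: ~3.4x
```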
    Last edited by adamsleath; 05-21-2008 at 02:15 PM.

  24. #899
    Xtreme Mentor
    Join Date
    Oct 2005
    Location
    Portugal
    Posts
    3,410
    AMD official press release on employing GDDR5


    AMD (NYSE:AMD - News) today announced the first commercial implementation of Graphics Double Data Rate, version 5 (GDDR5) memory in its forthcoming next generation of ATI Radeon™ graphics card products. The high-speed, high-bandwidth GDDR5 technology is expected to become the new memory standard in the industry, and that same performance and bandwidth is a key enabler of The Ultimate Visual Experience™, unlocking new GPU capabilities. AMD is working with a number of leading memory providers, including Samsung, Hynix and Qimonda, to bring GDDR5 to market.

    Today’s GPU performance is limited by the rate at which data can be moved on and off the graphics chip, which in turn is limited by the memory interface width and die size. The higher data rates supported by GDDR5 – up to 5x that of GDDR3 and 4x that of GDDR4 – enable more bandwidth over a narrower memory interface, which can translate into superior performance delivered from smaller, more cost-effective chips. AMD’s senior engineers worked closely with industry standards body JEDEC in developing the new memory technology and defining the GDDR5 spec.
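The "more bandwidth over a narrower interface" claim is straightforward arithmetic. A Python sketch with illustrative transfer rates (not official figures from the press release):

```python
def bandwidth_gb_s(transfers_mt_s, bus_bits):
    """Peak bandwidth = transfer rate * bus width in bytes."""
    return transfers_mt_s * 1e6 * (bus_bits // 8) / 1e9

gddr3_512bit = bandwidth_gb_s(2000, 512)  # e.g. 2.0 GT/s GDDR3 on a 512-bit bus
gddr5_256bit = bandwidth_gb_s(4000, 256)  # e.g. 4.0 GT/s GDDR5 on a 256-bit bus
print(gddr3_512bit, gddr5_256bit)  # same bandwidth from half the bus width
```

Halving the bus width means fewer pins and a smaller die perimeter, which is where the "smaller, more cost-effective chips" argument comes from.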

    “The days of monolithic mega-chips are gone. Being first to market with GDDR5 in our next-generation architecture, AMD is able to deliver incredible performance using more cost-effective GPUs,” said Rick Bergman, Senior Vice President and General Manager, Graphics Product Group, AMD. “AMD believes that GDDR5 is the optimal way to drive performance gains while being mindful of power consumption. We’re excited about the potential GDDR5 brings to the table for innovative game development and even more exciting game play.”

    The introduction of GDDR5-based GPU offerings marks the continued tradition of technology leadership in graphics for AMD. Most recently, AMD has been first to bring a unified shader architecture to market, the first to support Microsoft DirectX® 10.1 gaming, first to move to lower process nodes like 55nm, the first with integrated HDMI with audio, and the first with double-precision floating point calculation support.

    AMD expects that PC graphics will benefit from the increase in memory bandwidth for a variety of intensive applications. PC gamers will have the potential to play at high resolutions and image quality settings, with superb overall gaming performance. PC applications will have the potential to benefit from fast load times, with superior responsiveness and multi-tasking.

    “Qimonda has worked closely with AMD to ensure that GDDR5 is available in volume to best support AMD’s next-generation graphics products,” said Thomas Seifert, Chief Operating Officer of Qimonda AG. “Qimonda’s ability to quickly ramp production is a further milestone in our successful GDDR5 roadmap and underlines our predominant position as innovator and leader in the graphics DRAM market.”

    GDDR5 for Stream Processing

    In addition to the potential for improved gaming and PC application performance, GDDR5 also holds a number of benefits for stream processing, where GPUs are applied to address complex, massively parallel calculations. Such calculations are prevalent in high-performance computing, financial and academic segments among others. AMD expects that the increased bandwidth of GDDR5 will greatly benefit certain classes of stream computations.

    New error detection mechanisms in GDDR5 can also help increase the accuracy of calculations by identifying errors and re-issuing commands to get valid data. This capability is a level of reliability not available with other GDDR-based memory solutions today.
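The detect-and-retry idea can be sketched as follows. This is a toy Python illustration only: real GDDR5 uses a per-burst CRC-8 over the data link checked by the memory controller in hardware, not zlib's CRC-32 in software.

```python
import zlib

def read_burst(fetch, max_retries=3):
    """fetch() returns (data, crc); re-issue the command until the CRC matches."""
    for _ in range(max_retries):
        data, crc = fetch()
        if zlib.crc32(data) == crc:   # checksum over the burst matches -> data is valid
            return data
    raise IOError("unrecoverable transfer error")

# usage: a fake link that corrupts the first transfer, then succeeds
attempts = [(b"burst", 0), (b"burst", zlib.crc32(b"burst"))]
print(read_burst(lambda: attempts.pop(0)))
```

The key point is that errors are detected and the command re-issued, rather than corrected in place as ECC would do, which is why the press release calls it error *detection*.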

    http://forums.vr-zone.com/showthread.php?t=278539






    'Final' Radeon HD 4800-series specs, launch details leak out

    German website Hardware-Infos has obtained (translation here) what looks like an official document with "final" specifications, pricing information, and launch details for AMD's next-generation Radeon HD 4800-series graphics cards. This information echoes the June 18 launch date we heard last week, but it says both the 4850 and the 4870 will come out on the same day.

    The Radeon HD 4850 will apparently feature 480 stream processors, a 625MHz core speed, an 825MHz shader speed, 512MB of 1143MHz GDDR3 memory, and a 114W thermal envelope. The faster Radeon HD 4870 will also have 480 SPs, but with an 850MHz core speed, 1050MHz shader speed, 1GB of 1935MHz (3870MHz "effective") GDDR5 RAM, and a 157W TDP. Both cards will also feature 256-bit memory buses and 16 raster operators, just like existing Radeon HD 3800-series models, but with twice as many texture mapping units (32 instead of 16).
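Plugging the leaked clocks into the usual bandwidth formula shows the gap between the two cards. A Python sketch (256-bit buses per the article; treating GDDR3 as double data rate, so 1143MHz becomes 2286 MT/s effective, is this calculation's assumption):

```python
def bandwidth_gb_s(effective_mt_s, bus_bits=256):
    """Peak bandwidth = effective transfer rate * bus width in bytes."""
    return effective_mt_s * 1e6 * (bus_bits // 8) / 1e9

hd4850 = bandwidth_gb_s(1143 * 2)  # GDDR3, double data rate -> ~73.2 GB/s
hd4870 = bandwidth_gb_s(3870)      # GDDR5, "effective" figure quoted directly -> ~123.8 GB/s
print(round(hd4850, 1), round(hd4870, 1))
```

On these numbers the 4870 would have roughly 70% more memory bandwidth than the 4850 at the same bus width, which is the whole point of moving the top part to GDDR5.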

    Hardware-Infos says the Radeon HD 4850 will launch at $249 and the Radeon HD 4870 will be $349. If recently leaked performance numbers are accurate, the 4850 may be in the same playing field as Nvidia's ~$300 GeForce 9800 GTX.


    http://www.techreport.com/discussions.x/14763

    regards

    [Review] Core i7 920 & UD5 » Here!! « .....[Review] XFX GTX260 216SP Black Edition » Here!! «
    [Review] ASUS HD4870X2 TOP » Here!! «
    .....[Review] EVGA 750i SLi FTW » Here!! «
    [Review] BFG 9800GTX 512MB » Here!! « .....[Review] Geforce 9800GX2 1GB » Here!! «
    [Review] EVGA GTX280 1GB GDDR3 » Here!! « .....[Review] Powercolor HD4870 512MB GDDR5 » Here!! «

  25. #900
    Registered User
    Join Date
    May 2005
    Posts
    3,691
    Quote Originally Posted by Shintai View Post
    Even though that's highly likely, remember Intel can use its own foundries and design the chip and process for maximum performance. Plus a process lead. Imagine Larrabee on a high-k metal 45nm process today. An inferior product on a better process can change the tide.
    Not quite true... The Pentium D on 65nm didn't surpass a 90nm A64 x2. The Prescott didn't surpass the 130nm A64's either. Nor did the R600 surpass the G80. We also saw what happened with the RV670 vs G92.

    Process can help you do more in a single chip, or lower power consumption, but it doesn't make for a better chip in all cases. Besides, by the time we see Larrabee, we'll see both ATi and NVidia on smaller processes than they are at present, and intel has stated Larrabee will be 10x the performance of their best IGP, which would only align it with the G80...
    Quote Originally Posted by Leon2ky
    "dammit kyle what's with the 30 second sex lately?" "Sorry sweetie, I overclocked my nuts and they haven't been stable since"
    Quote Originally Posted by trinibwoy View Post
    I don't think his backside has internet access.
    Quote Originally Posted by n00b 0f l337 View Post
    Hey I just met you
    And this is crazy
    But I'm on bath salts
    And your face looks tasty
