Read this NVIDIA blog post (particularly the 8/24/09 entry). NV says their latest chips cost $1 billion in R&D and take 3-4 years to make.
2003 was a very different time compared to now. That was when the 9700 Pro was strong, during AMD's prime.
It doesn't take a genius to know the years before RV7xx were really bad, and the RV770 generation hasn't been that profitable, especially with AMD itself so starved for cash.
NV net income was $800 million for 2007, $450 million for 2006, and more than $200 million for 2005 (per Google, Wikipedia, Answers, and NVIDIA press releases). 2009 hasn't been peachy (2008 was still a profitable year, although not very profitable compared to earlier years).
http://seekingalpha.com/article/1549...uly-09-quarter
Since April 2007 NV has typically spent $150-219 million a quarter on research, and you know it's mostly on GPUs, AbelJemka.
Last edited by tajoh111; 09-30-2009 at 06:56 PM.
OK, I searched like you and I found real numbers!
- 2006 AMD R&D: $1.205 billion
- 2006 ATI R&D: $458 million, with $167 million spent in Q1+Q2 '06 and $291 million in Q3+Q4 '06
- So 2006 AMD+ATI: $1.663 billion
- 2006 Nvidia R&D: $554 million
- 2007 AMD+ATI R&D: $1.847 billion
- 2007 Nvidia R&D: $692 million
- 2008 AMD+ATI R&D: $1.848 billion
- 2008 Nvidia R&D: $856 million
So numbers can't lie: Nvidia has increased its R&D spending since 2006, but so has AMD+ATI.
You said they have mostly been researching GPUs since 2007, but you seem to forget that since 2007 Nvidia has been pushing Tesla and CUDA very hard, so those must eat non-negligible resources, and that Nvidia is also promoting Tegra and Ion.
Tesla and CUDA are part of GPU research and design, so they are related, since they involve making the GPU more powerful. It's obvious from those numbers that NV should be spending substantially more, if the ratios from the 2006 AMD+ATI numbers mean anything.
If we look at those numbers, AMD spent 11% more from 2006 to 2007 and didn't increase spending at all from 2007 to 2008. Compare this to NV, which spent 25% more from 2006 to 2007 and 23.7% more from 2007 to 2008.
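For what it's worth, both framings used in this thread can be recomputed from the R&D figures quoted above (a quick sanity-check sketch; the dollar figures are the ones posted in the thread, in millions):

```python
# Annual R&D spending in millions of dollars, as quoted earlier in this thread
rd = {
    "AMD+ATI": {2006: 1663, 2007: 1847, 2008: 1848},
    "Nvidia":  {2006: 554,  2007: 692,  2008: 856},
}

for company, by_year in rd.items():
    for year in (2006, 2007):
        delta = by_year[year + 1] - by_year[year]    # absolute increase, $M
        growth = delta / by_year[year] * 100         # year-over-year growth, %
        print(f"{company} {year}->{year + 1}: +${delta}M ({growth:+.1f}%)")
# AMD+ATI 2006->2007: +$184M (+11.1%)
# AMD+ATI 2007->2008: +$1M (+0.1%)
# Nvidia 2006->2007: +$138M (+24.9%)
# Nvidia 2007->2008: +$164M (+23.7%)
```

Both sides' numbers are consistent: AMD+ATI's absolute increase from 2006 to 2007 ($184M) is larger than Nvidia's ($138M), while Nvidia's percentage growth is larger in both years.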
Not to mention AMD likely spent a lot of money getting to 55nm and 40nm first, plus all the money they spent on GDDR5 and GDDR4 research. NV waited for all this to happen, so they didn't have to spend as much on research to get there.
Since AMD was running the show for the most part, I can see a lot more money being spent on the CPU side than the GPU side, especially considering how far behind they were during the Conroe years; by simple economics, getting the CPU side back to profitability was a lot more important than getting the GPU side going.
You like speculation a lot more than me!
Tesla and CUDA are part of GPU research, but they have a cost: a cost in time or in developers, and either way it costs money.
You use percentages because they suit your purpose better, but in raw numbers, AMD from 2006 to 2007 is +$184 million and Nvidia from 2006 to 2007 is +$138 million.
What did it cost to go to 55nm? You don't know. To 40nm? You don't know. GDDR4 research? The 2900XT launched six months late in 2007 but was due in 2006, so no impact. And GDDR4 is basically the same as GDDR3, so not a great deal.
For the AMD part you play a guessing game. But AMD's graphics division was the first part of the company to manage success, with RV670 and RV770. That may indicate something.
I will give you a history of our conversation, and you will see who puts forward things he doesn't even know but "is assuming" because "it doesn't take a genius to know":
I said:
You supposed: with a launch in Q1, that leaves a lot of time for AMD to make a move.
Your assumption in this post is that GT300 is G80-like, so history will repeat itself. You began mixing up AMD strategy and ATI strategy, and you began praising Nvidia ("stronger brand due to marketing"): Knowing AMD, they might just completely forfeit the high end and fight at $250 and below for the next year if this card turns out to be 50-60% faster than the 5870, like they did against Intel lately, or to an extent the 88xx generation. If the GTX 380 is able to somehow beat R800, they might just abandon it altogether, as I doubt it would sell well at all; the 3870X2 bombed and it was still beating the 8800 Ultra. This had a lot to do with NV just being a stronger brand due to marketing.
I said:
I post only facts, and my assumptions are basically in Anandtech's article about Fermi. Like DilTech or... you, I make guesses about GT300 performance, and I use history to point out that a GTX 395 model may come late. AMD launched its card before this time; what AMD is doing now is extrapolating GT300 performance and cost.
Performance? GTX 285 SLI is about 30% faster than the 5870 on average. The GTX 380 may be more like 50% to 60% faster than the 5870 on average. Maybe even less.
Cost? 40% more transistors than RV870 and a 384-bit bus instead of 256-bit. $600? More?
DilTech speaks about a GTX 395, but in Nvidia's history multi-GPU cards have launched very late (more than 6 months later, on average).
Basically AMD has 3 months to sell DX11 cards with the help of Windows 7.
AMD having 3 months to sell cards is a given fact, no?
You said:
Basically you spoke a lot just to say AMD has no money, so they have no R&D budget, so they can't design a new architecture: AMD's R&D budget is tiny compared to Intel's and NV's (especially Intel's); you can overestimate your rival's performance by 1000% and it will do nothing if you don't have the R&D budget to get something going to match that estimate.
With so many losing quarters in the past (except a couple of quarters lately), I can imagine AMD's graphics division was working on a shoestring budget, especially when AMD itself is so deep in the hole. Thankfully the research ATI put into R600 before the AMD-ATI merger paid off to some extent with R7xx, and possibly to an extent R8xx, as R6xx turned out to be a very scalable architecture. However, I can imagine research for the next big thing being lacking at AMD, and if this thing performs 50-60% faster than RV870, then AMD will need to come out with something new, not just a bigger chip with more shaders, as returns have started to diminish with more shaders.
It will take either a big chip from AMD (which seems to be against their design philosophy) or a new architecture. I think a new architecture is not coming any time soon because of budget issues.
What AMD did with R8xx is stretch the limits of the design that began with R600 (as NV did with G80 to G200); it's all you can do when your company doesn't have the money to design a new architecture.
You added, responding to LordEC911:
AMD has no money so they can do nothing, Nvidia has money, so etc. [...] AMD won't be coming out with anything spectacular anytime soon because of the shoestring budget they have been working with after so many bad quarters. [...] NV has been a much more profitable company overall and has probably been working on something pretty complex for the last 4 years, as today's news confirms.
I posted this:
Real numbers pointing out that AMD spends as much, maybe more, money on R&D than Nvidia. So your main argument that AMD has no R&D money goes in the trash: OK, I searched like you and I found real numbers!
- 2006 AMD R&D: $1.205 billion
- 2006 ATI R&D: $458 million, with $167 million spent in Q1+Q2 '06 and $291 million in Q3+Q4 '06
- So 2006 AMD+ATI: $1.663 billion
- 2006 Nvidia R&D: $554 million
- 2007 AMD+ATI R&D: $1.847 billion
- 2007 Nvidia R&D: $692 million
- 2008 AMD+ATI R&D: $1.848 billion
- 2008 Nvidia R&D: $856 million
You took a defensive stance and became "Mr. Assumption":
You show off your math skills and use them to try to prove that 25% of $700M is better than 11% of $1.65B: If we look at those numbers, AMD spent 11% more from 2006 to 2007 and didn't increase spending at all from 2007 to 2008. Compare this to NV, which spent 25% more from 2006 to 2007 and 23.7% more from 2007 to 2008.
Not to mention AMD likely spent a lot of money getting to 55nm and 40nm first, plus all the money they spent on GDDR5 and GDDR4 research. NV waited for all this to happen, so they didn't have to spend as much on research to get there.
Since AMD was running the show for the most part, I can see a lot more money being spent on the CPU side than the GPU side, especially considering how far behind they were during the Conroe years; by simple economics, getting the CPU side back to profitability was a lot more important than getting the GPU side going.
You try to explain AMD's and Nvidia's expenses with your "Assumption-O-Maker".
I don't deny I made assumptions, but I used facts to make them.
You have posted nearly zero facts since the beginning of this discussion!