
View Full Version : Intel blocks Nvidia Ion Platform



Monkeywoman
12-24-2008, 12:00 PM
"Nvidia's recent move to try to extend Intel's Atom ecosystem to its recently announced Ion platform appears to have been knocked back by Intel, according to a response primarily directed at PC manufacturers as seen by Digitimes.

Nvidia's push for the Ion platform, which combines its GeForce 9400 mGPU with the Intel Atom CPU, is heavily reliant on Intel being willing to abandon its current Atom pricing system. Under the existing scheme, Intel will only sell Atom CPUs and corresponding chipsets in a bundle, but if hardware vendors are unable to buy just the Atom CPU, the Ion platform becomes too expensive for most applications. Nvidia executives recently visited local PC makers, in an attempt to drum up support for allowing its MCP7A and MCP79 chipsets into the Atom ecosystem.

However, in an internal statement distributed to hardware makers recently, Intel reiterated that Atom CPUs for netbooks and nettops are only available bundled with its 945GSE and 945GC chipsets, the makers said.

When asked to comment, Intel indicated that it has no plans to validate the Nvidia MCP79 chipset on Atom-based nettop or netbook platforms. Intel also has no plans to form a partnership with Nvidia to support nettop or netbook platforms based on the Intel Atom CPU, the company added. "

Source: http://www.digitimes.com/news/a20081223PD216.html

Sr7
12-24-2008, 12:02 PM
Monopoly.

Monkeywoman
12-24-2008, 12:05 PM
yah, good thing AMD and VIA are releasing their solutions in 1Q 09; that should force Intel to change its tactics in this market.

OutSider
12-24-2008, 12:05 PM
Good for Intel, not so much for customers, at least for me. I was really excited about Atom + GF 9400.

Reznik Akime
12-24-2008, 12:07 PM
Hot on the heels of the announcement that Intel would allow Nvidia to make Atom chipsets? Interesting.

Sr7
12-24-2008, 12:11 PM
Hot on the heels of the announcement that Intel would allow Nvidia to make Atom chipsets? Interesting.

There was no official announcement; NVIDIA did it on their own, trying to show the value they can add to the platform: it would allow much higher-res displays, more multimedia capabilities, and Vista.

But Intel said no: only we make the chipsets for our processors.

It's funny that Intel would do this after NVIDIA gave them something, namely SLI on x58.

Anyone who owns a platform and blocks a product that is already done and working for the sake of profits, when the market desires that type of product, is acting as a monopoly. The same could be said for when NVIDIA artificially restricted SLI from Intel chipsets... though that was *far* less severe (it just blocked a feature of the chipset, whereas here Intel is blocking the sale of someone else's chipset AND graphics entirely).

Rammsteiner
12-24-2008, 12:17 PM
Intel's fail marketing fails, C/C. Seriously, Intel is starting to act the way nVidia used to act, and that ain't good. Respect for nVidia :up:

T_Flight
12-24-2008, 12:28 PM
Gimme a break. AMD/VIA is not gonna "force" Intel to do anything. They couldn't care less what AMD/VIA does.

As for Intel's marketing, it's doing very well. They rely on word of mouth versus advertising gimmicks and publicity stunts. That's not gonna change anytime soon. The only way Intel will gain a monopoly is if AMD hands them one by no longer being viable competition, and that will be of AMD's own doing.

Let's lay off the Intel and nVidia propaganda. Nobody really cares about these political games except investors, and in this case there was never anything there. nVidia made a wrong move. It happens. The sky is not gonna fall over it or anything.

Sr7
12-24-2008, 12:30 PM
Gimme a break. AMD/VIA is not gonna "force" Intel to do anything. They couldn't care less what AMD/VIA does.

As for Intel's marketing, it's doing very well. They rely on word of mouth versus advertising gimmicks and publicity stunts. That's not gonna change anytime soon. The only way Intel will gain a monopoly is if AMD hands them one by no longer being viable competition, and that will be of AMD's own doing.

Let's lay off the Intel and nVidia propaganda. Nobody really cares about these political games except investors, and in this case there was never anything there. nVidia made a wrong move. It happens. The sky is not gonna fall over it or anything.

How do you define "wrong"? If the market demands it, is it "wrong?" Or just "wrong" by Intel's standards? :rofl:

T_Flight
12-24-2008, 12:35 PM
Wrong as in they never got Intel to validate any of this stuff or put their stamp of approval on it.

In the case of X58 and SLI, Intel and the board makers DID get nVidia's stamp of approval and paid them for it. Intel is not gonna allow themselves to be put in a position of having a fiasco because this stuff was not tested and approved.

Atom is Intel's property, and they have every right to do what they wish with it.

Sr7
12-24-2008, 12:39 PM
Wrong as in they never got Intel to validate any of this stuff or put their stamp of approval on it.

In the case of X58 and SLI, Intel and the board makers DID get nVidia's stamp of approval and paid them for it. Intel is not gonna allow themselves to be put in a position of having a fiasco because this stuff was not tested and approved.

Atom is Intel's property, and they have every right to do what they wish with it.

Err, by that logic, what would stop Intel from doing this approval and qualification now, then? Maybe NVIDIA only went public to create demand because Intel had already said no in private.

There's no legit reason for blocking this (they'd obviously be happy to have it qualified if Intel would actually do it, so the fact that it's not qualified is no excuse). I think it's shady... it's the difference between buying a netbook or not for me.

Though I'm sure you were as forgiving when NVIDIA chose not to have SLI on Intel chipsets ;)

T_Flight
12-24-2008, 12:45 PM
Err, by that logic, what would stop Intel from doing this approval and qualification now, then? Maybe NVIDIA only went public to create demand because Intel had already said no in private.

There's no legit reason for blocking this (they'd obviously be happy to have it qualified if Intel would actually do it, so the fact that it's not qualified is no excuse). I think it's shady... it's the difference between buying a netbook or not for me.

Though I'm sure you were as forgiving when NVIDIA chose not to have SLI on Intel chipsets ;)


That's my point. You don't go behind a company's back and do things like that. You get them to approve it first, or you don't do it at all. It was the same thing with Intel chipsets and SLI. We didn't like it, but there wasn't anything we could do about it at the time, as nVidia held the rights to the tech.

Zucker2k
12-24-2008, 12:46 PM
How do you define "wrong"? If the market demands it, is it "wrong"? Or just "wrong" by Intel's standards? :rofl:

I "demand" a board with 6x x16 lanes and 6x NV 295s for under $500.00 /jk. :p:

On a serious note, the chips are falling where they're made; NV is in business because of Intel and AMD, then it got arrogant and wanted to cut off the hand that feeds it. :shakes: In any case, Intel can make a platform argument; AMD does the same. Sorry, NV.

Sr7
12-24-2008, 12:51 PM
I "demand" a board with 6x x16 lanes and 6x NV 295s for under $500.00 /jk. :p:

On a serious note, the chips are falling where they're made; NV is in business because of Intel and AMD, then it got arrogant and wanted to cut off the hand that feeds it. :shakes: In any case, Intel can make a platform argument; AMD does the same. Sorry, NV.

Actually all 3 are in business because of market demands. If they didn't serve a purpose they wouldn't be there.

Basically you're saying that NVIDIA can sell GPUs because Intel boots the PC?

In that case, Ford/Toyota/GM/Honda are all in business because of a Japanese company that makes starter motors.

Poor logic.

Sr7
12-24-2008, 12:53 PM
That's my point. You don't go behind a company's back and do things like that. You get them to approve it first, or you don't do it at all. It was the same thing with Intel chipsets and SLI. We didn't like it, but there wasn't anything we could do about it at the time, as nVidia held the rights to the tech.

Isn't that exactly what Lucid is doing with Hydra? lol... there are so many examples where this happens, and it's generally accepted in a free market economy. That's how innovation happens.

AMD/NVIDIA aren't suing Lucid. The company will stand or fall on its own, as the market dictates the need for its product.

Zucker2k
12-24-2008, 12:56 PM
Actually all 3 are in business because of market demands. If they didn't serve a purpose they wouldn't be there.

Basically you're saying that NVIDIA can sell GPUs because Intel boots the PC?

In that case, Ford/Toyota/GM/Honda are all in business because of a Japanese company that makes starter motors.

Poor logic.

Sorry, Intel and the PC outdate NV. ;) My argument is that some morons at NV miscalculated, and that is why NV has no Nehalem offering. A 6-year-old wouldn't make the mistake NV made. They failed to sell SLI the whole time they could have, and now they're peddling it freely but without a platform to sell. :shakes:

zanzabar
12-24-2008, 12:58 PM
I "demand" a board with 6x x16 lanes and 6x NV 295s for under $500.00 /jk. :p:

On a serious note, the chips are falling where they're made; NV is in business because of Intel and AMD, then it got arrogant and wanted to cut off the hand that feeds it. :shakes: In any case, Intel can make a platform argument; AMD does the same. Sorry, NV.

AMD and VIA don't need an NV IGP; they have better. VIA needs to produce for more than ATMs and cash registers, and AMD needs to do something.

If VIA ever followed through with their consumer lines, Intel would have to cut prices by more than half for nettops and mobile parts.

Rammsteiner
12-24-2008, 01:02 PM
Hmmm.

It's pretty much a fact that Intel doesn't have good low-power solutions apart from CPUs. nVidia does, and they actually tried to help (well, with a certain financial interest of course). This could have made Atom a great thing.

Now, however, they've simply given all the other participants, including nVidia, a chance to beat Intel with this behaviour.

Besides that, let's not forget that thanks to nVidia's SLI chip on the X58s, Intel is selling way more boards. eVGA etc. probably wouldn't even have started selling those boards without SLI support, and we both know the huge number of people who prefer eVGA, let alone SLI/nVidia.

So yes, I do think this is a stupid move from Intel. Not like I'm going to bother about it; they can do whatever they want to do, I just think it's plain stupid.

And in the end, nVidia ain't Intel's guardian; they're just trying to get their share out of it. But the CPU is (probably?) more profitable, so allowing nVidia in would have benefited them more :rolleyes:

Sr7
12-24-2008, 01:03 PM
Sorry, Intel and the PC outdate NV. ;) My argument is that some morons at NV miscalculated, and that is why NV has no Nehalem offering. A 6-year-old wouldn't make the mistake NV made. They failed to sell SLI the whole time they could have, and now they're peddling it freely but without a platform to sell. :shakes:

So now because a company outdates another, it's instantly more relevant and can dictate who can sell what products? The logic here is seriously getting worse and worse.

Do you really think NVIDIA's business depends on MCP sales, or SLI? :ROTF:

Intel integrated the memory controller and they're trying to take everything on-die. Why would NVIDIA care about making basic components (i.e. what's left of the northbridge)?

Intel sells their chipsets at cost knowing they'll make fat profits on their CPUs, and that the chipset purchase locks you into their socket (and guarantees a CPU sale). How do you compete with an at-cost salesperson? Why bother investing resources there (well in advance) when you're *guaranteed* not to see a profit?

Missing out on a no-profit market is pretty much not an issue.

Also, as the previous poster alludes to, this whole corporate dependency talk is kind of laughable, because essentially, Intel held out on licensing Nehalem's QPI to NVIDIA in an effort to get SLI support. In plain business terms, what does that tell you? It tells you that Intel has put its chipset licensing on the same level of importance as SLI.

They clearly value it pretty highly to make that kind of offer... they wouldn't offer something they know is valuable for something they don't care about, now would they?

Zucker2k
12-24-2008, 01:07 PM
Hmmm.

It's pretty much a fact that Intel doesn't have good low-power solutions apart from CPUs. nVidia does, and they actually tried to help (well, with a certain financial interest of course). This could have made Atom a great thing.

Now, however, they've simply given all the other participants, including nVidia, a chance to beat Intel with this behaviour.

Besides that, let's not forget that thanks to nVidia's SLI chip on the X58s, Intel is selling way more boards. eVGA etc. probably wouldn't even have started selling those boards without SLI support, and we both know the huge number of people who prefer eVGA, let alone SLI/nVidia.

So yes, I do think this is a stupid move from Intel. Not like I'm going to bother about it; they can do whatever they want to do, I just think it's plain stupid.

And in the end, nVidia ain't Intel's guardian; they're just trying to get their share out of it. But the CPU is (probably?) more profitable, so allowing nVidia in would have benefited them more :rolleyes:

Intel is NOT selling more boards because of SLI. Nvidia is the one peddling SLI on Intel so they can sell more cards, as they have no Nehalem offering. In short, if you stuck to SLI, you'd be running last-gen Nvidia hardware with last-gen Intel processors.

Sr7
12-24-2008, 01:10 PM
Intel is NOT selling more boards because of SLI. Nvidia is the one peddling SLI on Intel so they can sell more cards, as they have no Nehalem offering. In short, if you stuck to SLI, you'd be running last-gen Nvidia hardware with last-gen Intel processors.

Again, if Intel didn't think this would help them sell more boards, they sure as hell wouldn't have made any cross-licensing offers to get it, now would they?

Do you invest time and money and expensive lawyers and contracts into something that won't generate revenue? You think large corporations are that dumb?

Zucker2k
12-24-2008, 01:14 PM
So now because a company outdates another, it's instantly more relevant and can dictate who can sell what products? The logic here is seriously getting worse and worse.

Do you really think NVIDIA's business depends on MCP sales, or SLI? :ROTF:

Intel integrated the memory controller and they're trying to take everything on-die. Why would NVIDIA care about making basic components (i.e. what's left of the northbridge)?

Intel sells their chipsets at cost knowing they'll make fat profits on their CPUs, and that the chipset purchase locks you into their socket (and guarantees a CPU sale). How do you compete with an at-cost salesperson? Why bother investing resources there (well in advance) when you're *guaranteed* not to see a profit?

Missing out on a no-profit market is pretty much not an issue.

Also, as the previous poster alludes to, this whole corporate dependency talk is kind of laughable, because essentially, Intel held out on licensing Nehalem's QPI to NVIDIA in an effort to get SLI support. In plain business terms, what does that tell you? It tells you that Intel has put its chipset licensing on the same level of importance as SLI. They clearly value it pretty highly to make that kind of offer... they wouldn't offer something they know is valuable for something they don't care about, now would they?

Ok, since you want me to spell it out, here it is:

1. The CPU is the heart of the PC market.
2. Other companies make GPUs; Intel itself makes IGPs, and soon discrete GPUs.

How VITAL is Nvidia to the PC market? Both AMD and Intel are moving toward "platform" architectures where every component of a system is seamlessly integrated into a unit. Where does that leave Nvidia? The answer? You're seeing it: Nvidia is always going to have to depend on the goodwill of AMD and Intel (VIA to a lesser extent) to sell chipsets and IGPs. That is Nvidia's problem right now. Nvidia is not indispensable to the PC market; that is my logic.

Zucker2k
12-24-2008, 01:20 PM
Again, if Intel didn't think this would help them sell more boards, they sure as hell wouldn't have made any cross-licensing offers to get it, now would they?

Do you invest time and money and expensive lawyers and contracts into something that won't generate revenue? You think large corporations are that dumb?

Why did AMD offer XF freely on Intel? Who sold more GPUs because of Nvidia's stubbornness? Who peddled an inferior chipset because of their tight grip on SLI? In the end, the enthusiasts wanted it and they got it. And just as Intel could allow Nvidia onto its platform, it was capable of denying Nvidia that same privilege, which is exactly what is happening.

Sr7
12-24-2008, 01:21 PM
Ok, since you want me to spell it out, here it is:

1. The CPU is the heart of the PC market.
2. Other companies make GPUs; Intel itself makes IGPs, and soon discrete GPUs.

How VITAL is Nvidia to the PC market? Both AMD and Intel are moving toward "platform" architectures where every component of a system is seamlessly integrated into a unit. Where does that leave Nvidia? The answer? You're seeing it: Nvidia is always going to have to depend on the goodwill of AMD and Intel (VIA to a lesser extent) to sell chipsets and IGPs. That is Nvidia's problem right now. Nvidia is not indispensable to the PC market; that is my logic.

You're kidding, right? You really believe on-die GPUs will ever be as fast as a discrete, dedicated GPU that isn't constrained to some corner of the chip?

Fusion and also Intel's integration of graphics processors are going to be okay for office work.

Zucker2k
12-24-2008, 01:26 PM
You're kidding, right? You really believe on-die GPUs will ever be as fast as a discrete, dedicated GPU that isn't constrained to some corner of the chip?

Fusion and also Intel's integration of graphics processors are going to be okay for office work.

Don't underestimate technology, my friend. I know one thing: I'm not holding my breath for Nvidia's future. You know why? Because it's dependent on the PCI-E lane in a PC; that's a tenuous position to be in, ain't it?

Sr7
12-24-2008, 01:26 PM
Why did AMD offer XF freely on Intel? Who sold more GPUs because of Nvidia's stubbornness? Who peddled an inferior chipset because of their tight grip on SLI? In the end, the enthusiasts wanted it and they got it. And just as Intel could allow Nvidia onto its platform, it was capable of denying Nvidia that same privilege, which is exactly what is happening.

You really believe that?

What about the fact that AMD had inferior products on the graphics market, so they had to add value to them somehow, namely by allowing crossfire to run in more places their competitor hadn't gone yet? What do you mean, who sold more GPUs? NVIDIA did. They had MCPs which supported SLI, so people who wanted it could have it AND their beloved Intel processor.

It was a direct response to NVIDIA's GPUs, not Intel's CPUs.

Also, considering that 95% of people buy only one GPU, I don't think you can sit here and say "because they allowed CF on Intel, ATI sold more GPUs than NVIDIA". That's just untrue.

The bottom line here is Intel has no legitimate reason to restrict competitors from the platform when the competitors have superior products (in every way).

The only reason is protecting their profits. Sure, they can legally do it (no one's questioning that), but it's dumb, and it's going to damage their mindshare and open the door to more lawsuits.

Zucker2k
12-24-2008, 01:29 PM
You really believe that?

What about the fact that AMD had inferior products on the graphics market, so they had to add value to them somehow, namely by allowing crossfire to run in more places their competitor hadn't gone yet? What do you mean, who sold more GPUs? NVIDIA did. They had MCPs which supported SLI, so people who wanted it could have it AND their beloved Intel processor.

Also, considering that 95% of people buy only one GPU, I don't think you can sit here and say "because they allowed CF on Intel, ATI sold more GPUs than NVIDIA". That's just untrue.

I don't mean more as in cumulative sales; but besides older SLI hardware and cracked drivers, all multi-GPU solutions on Intel were ATI. Meaning, those who wanted to see better framerates in games at high settings only had one place to go: ATi.

G0ldBr1ck
12-24-2008, 01:31 PM
I gotta agree with Zucker2K. With the market moving towards "platforms" and NV having no platform, they will be heading downhill fast without support from AMD or Intel. I think CUDA was their last hope for some sort of platform.

Sr7
12-24-2008, 01:32 PM
I don't mean more as in cumulative sales; but besides older SLI hardware and cracked drivers, all multi-GPU solutions were ATI. Meaning, those who wanted to see better framerates in games at high settings only had one place to go: ATi.

You honestly believe that NVIDIA didn't notice this? That they didn't weigh the consequences and do what made business sense? Theoretically that was to their own detriment anyway... unlike this case with Ion.

Zucker2k
12-24-2008, 01:35 PM
You really believe that?

What about the fact that AMD had inferior products on the graphics market, so they had to add value to them somehow, namely by allowing crossfire to run in more places their competitor hadn't gone yet? What do you mean, who sold more GPUs? NVIDIA did. They had MCPs which supported SLI, so people who wanted it could have it AND their beloved Intel processor.

It was a direct response to NVIDIA's GPUs, not Intel's CPUs.

Also, considering that 95% of people buy only one GPU, I don't think you can sit here and say "because they allowed CF on Intel, ATI sold more GPUs than NVIDIA". That's just untrue.

The bottom line here is Intel has no legitimate reason to restrict competitors from the platform when the competitors have superior products (in every way).

The only reason is protecting their profits. Sure, they can legally do it (no one's questioning that), but it's dumb, and it's going to damage their mindshare and open the door to more lawsuits.

Touché. Where were you last year when Nvidia was peddling SLI with crappy chipsets on their platform? I'm sure Nvidia needs some share of the 5% multi-GPU consumption on the market, don't you think? Otherwise why are they peddling it now? Uh huh, ATi caught up and surpassed them and they don't get to party at Intel's table anymore; that's what's up.

Eastcoasthandle
12-24-2008, 01:35 PM
I gotta agree with Zucker2K. With the market moving towards "platforms" and NV having no platform, they will be heading downhill fast without support from AMD or Intel. I think CUDA was their last hope for some sort of platform.

Yeah, I also agree with Zucker2K. Besides, there is a right way and a wrong way of dealing with Intel in this situation.

T_Flight
12-24-2008, 01:37 PM
Protecting their profits is what they do. They are in business to make money, and yes, they will protect that. That's why they are the giant they are today: because they didn't squander that money away or make bad business moves that would cause those profits to dwindle into the red.

The fact remains that Atom is not nVidia's property, and they had no right to be advertising things they had no permission for. They went behind Intel's back thinking they'd play dirty and force Intel's hand, and they got called. Now they are gonna have to fold.

Later, if they do things the right way, they might get a better reaction from Intel, but one thing about business is very clear: you have to respect companies in positions of power. nVidia played a power game unarmed, and it didn't work. If you do things diplomatically and smartly, you will get a lot farther, but butting heads with THE most powerful chip maker on the planet is not a good way to do business.

G0ldBr1ck
12-24-2008, 01:37 PM
Well, I don't blame NV for trying. They're gonna have to get in wherever they can; it just seems they are being blocked. I see their attempt at the Atom platform as a desperate act.

Sr7
12-24-2008, 01:39 PM
Touché. Where were you last year when Nvidia was peddling SLI with crappy chipsets on their platform? I'm sure Nvidia needs some share of the 5% multi-GPU consumption on the market, don't you think? Otherwise why are they peddling it now? Uh huh, ATi caught up and surpassed them and they don't get to party at Intel's table anymore; that's what's up.

Did you even bother to read my earlier post? I said *SIMILAR TO WHAT NVIDIA DID WITH SLI BEFORE*.

Just because NVIDIA did that before and no longer does doesn't justify Intel doing this monopolistic crap now.

Heard of "two wrongs don't make a right"? Yeah... I think that applies here.

Also, ATI never surpassed NVIDIA. I'm not really sure where people got that impression. ATI got great speed, but it was still not on par, short of their two-GPU solution, which is, according to most previews, about to get surpassed. The reason ATI made such a splash was that they decided to sell the chips for so much less.

ATI's GPGPU effort lacks momentum and coherency. There is little to no value in buying all their products as a whole platform, short of having their label on all your parts. Their CPUs are obviously far behind, and that's where most of the profits come in... GPUs are pocket change compared to those.

Not sure what you use to measure "surpassing", but I think it's funny that AMD is in a much weaker position than most care to acknowledge. Everyone preys on NVIDIA due to the traditional wisdom that "platform is everything." Platform only means as much as you make it mean.

If someone doesn't like 1 of your weak products, then someone certainly won't like 3 of your weak products bundled together :)

Zucker2k
12-24-2008, 01:40 PM
Well, I don't blame NV for trying. They're gonna have to get in wherever they can; it just seems they are being blocked. I see their attempt at the Atom platform as a desperate act.

It was a miscalculation on their part, as well as arrogance from their CEO. Well, that can of whoopass is about to get opened soon; we'll be seeing who gets the stinky side of that whiplash. :ROTF:


Did you even bother to read my earlier post? I said *SIMILAR TO WHAT NVIDIA DID WITH SLI BEFORE*.

Just because NVIDIA did that before and no longer does doesn't justify Intel doing this monopolistic crap now.

Heard of "two wrongs don't make a right"? Yeah... I think that applies here.

It's too late: according to Nvidia, "teh cpu is dead." Well, Intel took that seriously; let's see how Nvidia maneuvers through a PC world without a CPU. :rofl:

Sr7
12-24-2008, 01:50 PM
It was a miscalculation on their part, as well as arrogance from their CEO. Well, that can of whoopass is about to get opened soon; we'll be seeing who gets the stinky side of that whiplash. :ROTF:

It's too late: according to Nvidia, "teh cpu is dead." Well, Intel took that seriously; let's see how Nvidia maneuvers through a PC world without a CPU. :rofl:

Weak sales of i7 chips are an indicator of their relevance. Poor economic times really highlight what's disposable and what's not, and right now everyone in the industry is suffering, which means everyone's disposable.

Also, don't you think it's odd that Intel is forcibly moving to quad, then 8 cores, discontinuing previous lesser-core products instead of just continuing to sell them cheaper and letting the market decide? And their new chips don't have a 2-core variant... hmmm.

I guess the market will decide, but I think this core race is an attempt to add value over what we all already have in our machines. Not sure how much that's going to pay off long term... does granny need 16 cores to read her email or surf the web?

As much as many of us here on these forums might like these kinds of products, this isn't what appeals to the mass market. It's also striking how much of the CPU business is based on fallacies and misinformation... the idea that "4 CPU cores are 4x faster than 1 CPU core," which many Best Buy shoppers might hold.

If you told a typical shopper that maybe 4% of applications will use all 4 cores, they might not buy that processor or new computer.

Zucker2k
12-24-2008, 01:54 PM
Weak sales of i7 chips are an indicator of their relevance. Poor economic times really highlight what's disposable and what's not, and right now everyone in the industry is suffering, which means everyone's disposable.

Also, don't you think it's odd that Intel is forcibly moving to quad, then 8 cores, discontinuing previous lesser-core products instead of just continuing to sell them cheaper and letting the market decide? And their new chips don't have a 2-core variant... hmmm.

I guess the market will decide, but I think this core race is an attempt to add value over what we all already have in our machines. Not sure how much that's going to pay off long term.

My friend, that has always been the case; manufacturers are always going to make the next product more attractive. It happens in auto sales, electronics, etc. In any case, a bad economy is worse for discrete GPU sales than for PC units as a whole. People would rather have a low-cost dual-core/quad-core solution than a gaming rig, which is more costly.

Sr7
12-24-2008, 01:58 PM
My friend, that has always been the case; manufacturers are always going to make the next product more attractive. It happens in auto sales, electronics, etc. In any case, a bad economy is worse for discrete GPU sales than for PC units as a whole. People would rather have a low-cost dual-core/quad-core solution than a gaming rig, which is more costly.

It's not the same when you have a chicken-and-egg problem like CPU cores. People can't program for them until the share of such systems in the market justifies the development time and expense. But if Intel and AMD left it alone and just charged less for a dual core, people might tend towards the cheaper dual core rather than get a quad- or 8-core system.

So they discontinue the old ones and forcibly upgrade the market, in hopes that programmers will make applications to justify the existence of all 4 cores. I don't know that you see this type of forcible upgrade to justify a product's existence in any other industry. It's just interesting ;)

Sentential
12-24-2008, 02:00 PM
It was a miscalculation on their part, as well as arrogance from their CEO. Well, that can of whoopass is about to get opened soon; we'll be seeing who gets the stinky side of that whiplash. :ROTF:

It's too late: according to Nvidia, "teh cpu is dead." Well, Intel took that seriously; let's see how Nvidia maneuvers through a PC world without a CPU. :rofl:

I wholeheartedly agree; nVidia made several *huge* mistakes over the last 5 years or so, the biggest of which has got to be the embargo on SLI for Intel chipsets.

That alone has spurred so much ill will toward nVidia at Intel that everything I see coming down the pipe from Intel seems directed at shutting nVidia out of the market and buying them up in bankruptcy court.

In addition, the ripple effect forced OEMs to use awful chipsets with questionable reliability *at best*, causing serious amounts of returns; just look at Apple and HP for examples.

ATi capitalized on that terrible mistake, and now they are reaping the benefits of being first to market with Intel multi-GPU support, now that they are returning to their prime. And I am quite sure that if Intel pushes this Hydra technology, AMD will be more than happy to chain their cards with Larrabee.

nVidia is in a really precarious position right now. They have thoroughly pissed off all of their suppliers (note the exodus of people like XFX) and the OEMs, they have no competitive products to speak of, and they are wholly dependent on Intel allowing them access to the i7 platform.

nVidia needs to turn the ship around, and fast. Namely, they need to be acquired by some larger company (IBM would be great) now, or they will face liquidation only a few years down the road as these "platforms" Intel and AMD are pushing solidify into a quasi-console-type PC setup, locking them out of the market.

I don't see CUDA going anywhere, as they are trying to tightly control who gets what; same thing with PhysX. If they made both of these far more open source and pushed them heavily to defeat Havok, they would be in a much better position than they are.

Sad thing is, I don't believe they realize how dire their situation is...


Did you even bother to read my earlier post? I said *SIMILAR TO WHAT NVIDIA DID WITH SLI BEFORE*

Just because NVIDIA did that before and no longer does doesn't justify Intel doing this monopolistic crap now.

Heard of two wrongs don't make a right? Ya.. I think that applies here.

Ah, and this is where the failure of your logic is evident. Right? Wrong? Are you serious? This is not missionary work, this is a *for-profit* business. nVidia bit the hand that fed it, both with AMD and with Intel. What happens to them is just business; there is no right or wrong.


Weak sales of i7 chips are an indicator of their relevance. Poor economic times really highlight what's disposable and what's not, and right now everyone in the industry is suffering, which means everyone's disposable.

Once again you fail to see the brilliance of the i7... Do you really think Intel gives a damn about the retail market when businesses will pay $500+ per server chip? HELL NO. The i7 was built and designed for businesses with the expectation that retail sales would be weak; that is the genius of the design. No one at Intel expects the i7 to sell all that well, because the retail market is not who this chip was intended for.

Lynnfield is what was made for the retail market, and that comes out a couple of months from now. You want cheap duals and quads? Look no further; just don't expect LGA1366 to be consumer friendly, because it was never meant to be.

Business applications actually DO use multi-core designs, so their merit is legitimate. The only reason you see multi-core chips on the consumer market is that it costs too much money for Intel to re-tool their fabs to make two separate CPU packages when the margins on retail chips are so small. It is far easier for them to give us the rejects of the business-class chips than it is to redesign a brand-new single-core CPU that clocks sky high.

Sr7
12-24-2008, 02:06 PM
I wholeheartedly agree; nVidia made several *huge* mistakes over the last 5 years or so, the biggest of which has got to be the embargo on SLI for Intel chipsets.

That alone has stirred up so much ill will toward nVidia at Intel that everything I see coming down the pipe from Intel seems directed at shutting nVidia out of the market and buying them in the bankruptcy courts.

On top of that, the ripple effect forced OEMs to use awful chipsets with questionable reliability *at best*, causing serious amounts of returns; just look at Apple and HP for examples.

ATi capitalized on that terrible mistake and is now reaping the benefits of being first to market with Intel multi-graphics now that they are returning to their prime, and I am quite sure that if Intel pushes this Hydra technology, AMD will be more than happy to chain their cards with Larrabee.

nVidia is in a really precarious position right now. They have thoroughly pissed off all of their suppliers (note the exodus of partners like XFX) and the OEMs, they have no competitive products to speak of, and they are wholly dependent on Intel allowing them access to the i7 platform.

nVidia needs to turn the ship around fast; namely, they need to be acquired by some larger company (IBM would be great) now, or they will face liquidation only a few years down the road as these "platforms" Intel and AMD are pushing solidify into a quasi-console type of PC setup, locking them out of the market.

I don't see CUDA going anywhere as long as they try to tightly control who gets what; same thing with PhysX. If they made both of these technologies far more open and pushed them heavily to defeat Havok, they would be in a much better position than they are.

The sad thing is, I don't believe they realize how dire their situation is...



Ah, and this is where the failure of your logic is evident. Right? Wrong? Are you serious? This is not missionary work, this is a business. nVidia bit the hand that fed it, both with AMD and with Intel. What happens to them is just business; there is no right or wrong.

Right.. 4,000 people don't know anything about the industry or their position, but joe schmoe on the forums does :clap:

And there is absolutely right and wrong in the eyes of a free market economy. That's why we have these funny things called "laws" governing how companies do business. Also there are these funny concerns around "anti-trust".

Also, open source CUDA/PhysX? For what? What do you hope to gain? Anyway, if you had done your homework you would've realized that people who use PhysX get the source code too...

You're basing all criticisms on these really vague demands for open source and "opening things" without any clear reason of how or why. Where are you going with this?

T_Flight
12-24-2008, 02:08 PM
OS's and software are headed toward multithreading. Single core and dual core are becoming legacy by the minute. It's like how the Internet moved into a broadband era: we're no longer using external modems; we moved up through the years to 56K and on to broadband speeds. The market will react, or it will just be left with obsolete technology. They can decide either way. Nobody is forced into buying a computer of any kind; each individual can choose whether to upgrade or not, but the technology is not gonna stand still, and it's gonna improve at faster and faster rates, as we have seen.

For those that want to keep up, multicore is gonna be required. We are now seeing the benefits of it in gaming, video editing and encoding, as well as rendering. In time, single-threaded apps will become nonexistent. People want power.

I don't know where anybody got the idea that i7 sales were low; sales are actually very good. I'd estimate that 90% of my friends are going with new systems, and most of my friends don't even get into computers all that much. They've just waited for good technology to come along and want a new system. It's not like a computer is gonna put them in the poor house or anything. They expect to buy one at regular intervals, and it's an investment for them. Many of them rely on computers. The economy is not gonna affect their upgrade path. I don't know anybody who has told me the economy has kept them from buying a new machine. They're not buying cars right now, but a computer is not gonna force them to live out of their truck or anything. hehe :)

Brother Esau
12-24-2008, 02:11 PM
Yeah I also agree with Zucker2K. Besides, there is a right way and a wrong way of dealing with Intel in this situation.

Yes the Right way is to Boycott Intel because they suck for this very reason!

Sr7
12-24-2008, 02:13 PM
OS's and software are headed toward multithreading. Single core and dual core are becoming legacy by the minute. It's like how the Internet moved into a broadband era: we're no longer using external modems; we moved up through the years to 56K and on to broadband speeds. The market will react, or it will just be left with obsolete technology. They can decide either way. Nobody is forced into buying a computer of any kind; each individual can choose whether to upgrade or not, but the technology is not gonna stand still, and it's gonna improve at faster and faster rates, as we have seen.

For those that want to keep up, multicore is gonna be required. We are now seeing the benefits of it in gaming, video editing and encoding, as well as rendering. In time, single-threaded apps will become nonexistent. People want power.

I don't know where anybody got the idea that i7 sales were low; sales are actually very good. I'd estimate that 90% of my friends are going with new systems, and most of my friends don't even get into computers all that much. They've just waited for good technology to come along and want a new system. It's not like a computer is gonna put them in the poor house or anything. They expect to buy one at regular intervals, and it's an investment for them. Many of them rely on computers. The economy is not gonna affect their upgrade path. I don't know anybody who has told me the economy has kept them from buying a new machine. They're not buying cars right now, but a computer is not gonna force them to live out of their truck or anything. hehe :)

Ah, but the failure in the methodology comes here: people expect to spend that money until it fails to give them as much as the previous investment did (see diminishing returns). With as few 4-core or 8-thread-aware applications as there are right now, I think we'll start to see that happen. Most programmers don't program for 4 cores because it's plainly unnecessary for their program's workload.

Very few applications on the typical PC could benefit a perceivable amount from this type of hardware, and of those that can, it gets exponentially harder to manage multiple application threads the more threads you have. This is where I see the hurdle..16 cores.. 32 cores... it's not just something you can infinitely extrapolate on. The logic is becoming *VERY* difficult to manage, and for most developers (be they game developers or app developers) the extra cost and time to make it work is not worth the performance pay-off.
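The diminishing-returns argument above is essentially Amdahl's law: speedup is capped by the fraction of a workload that stays serial, no matter how many cores you add. A quick illustrative sketch (the 50% parallel fraction is a made-up example for a "typical" desktop app, not a measurement):

```python
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / cores)
def amdahl_speedup(parallel_fraction, cores):
    """Theoretical speedup when only part of a workload parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# An app that is only 50% parallelizable gains almost nothing beyond a
# few cores -- the 16/32-core hurdle described above.
for cores in (2, 4, 8, 16, 32):
    print(cores, round(amdahl_speedup(0.5, cores), 2))
# 2 cores -> 1.33x, 4 -> 1.6x, 8 -> 1.78x, 16 -> 1.88x, 32 -> 1.94x
```

Even an infinite number of cores never gets that app past 2x, which is why doubling the core count each generation doesn't automatically justify itself.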

bowman
12-24-2008, 02:22 PM
Intel's fail marketing fails, C/C. Seriously, Intel is starting to act the way nVidia used to act, and that ain't good.

Heck, where do you think NVIDIA got the inspiration from? :p: Intel's always been like this, except they were much worse before. Not much of a surprise here.

zanzabar
12-24-2008, 02:23 PM
Why did AMD offer XF freely on Intel? Who sold more gpus because of Nvidia's stubbornness? Who peddled an inferior chipset because of their tight grip on SLI? In the end, the enthusiasts wanted it and they got it for the same fact that if Intel allowed Nvidia to run its cpus/etc on its platform, it was capable of denying Nvidia that same privilege, which is exactly what is happening.

AMD/ATI are in the CrossFire consortium along with Intel, S3, VIA, SiS and others; CrossFire is an open standard that can be used by any PCI-E device, and it follows ATI's history of supporting open source and open platforms. And Intel is the only company that makes good chipsets on 775, so why develop anything past the R600 (the chipset, not the GPU) when Intel wants to back it? And if chipsets aren't profitable for you, why bother?

Rammsteiner
12-24-2008, 02:29 PM
Intel is NOT selling more boards because of SLI. Nvidia is the one peddling SLI on Intel so they can sell more cards, as they have no Nehalem offering. In short, if you stuck to SLI, you'd be running last-gen Nvidia hardware with last-gen Intel processors.
You're kidding, right? Without the SLI chip, XFX and eVGA at least wouldn't have released anything, meaning a lot fewer chipsets would have been sold to them already. Of course nVidia knows that with the SLI chip they'll sell more cards, but you're turning it around. Those people who seriously want SLI would buy those cards anyway, and wouldn't have bought i7 at all but stuck with 775 or considered AM2+/AM3.


Yes the Right way is to Boycott Intel because they suck for this very reason!
Indeed. I'm no anti-Intel person at all, but this marketing stuff is taking the piss, way too much to be quite honest :rolleyes:

I've got certain things against Intel, and this way the list is only growing. For me personally, that is, of course, but this isn't adding anything positive to the situation.

Serra
12-24-2008, 02:35 PM
While I don't want to take away Intel's ability to make a decision such as this, it does conjure the word "monopoly" in my mind.

:down:

Maybe this is a platform best assembled via CPU's from eBay?

Zucker2k
12-24-2008, 02:38 PM
Ah, but the failure in the methodology comes here: people expect to spend that money until it fails to give them as much as the previous investment did (see diminishing returns). With as few 4-core or 8-thread-aware applications as there are right now, I think we'll start to see that happen. Most programmers don't program for 4 cores because it's plainly unnecessary for their program's workload.

Very few applications on the typical PC could benefit a perceivable amount from this type of hardware, and of those that can, it gets exponentially harder to manage multiple application threads the more threads you have. This is where I see the hurdle..16 cores.. 32 cores... it's not just something you can infinitely extrapolate on. The logic is becoming *VERY* difficult to manage, and for most developers (be they game developers or app developers) the extra cost and time to make it work is not worth the performance pay-off.

Oh no, not this again. Okay, why buy a Ferrari or a Corvette ZR1 that does 205 miles per hour when you're never going to drive it that fast? Obviously it's overkill, but just in case you want to endanger your life once in a while, the speed is there on tap. Remember, the whole argument about multicore is multitasking. Sure, not many apps can take advantage of four cores or 8 threads, but how about encoding a video while playing music, or even gaming, and still having your video done in record time? Priceless!
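The multitasking case really is the one that scales easily: independent jobs need no shared thread logic, so the OS can simply spread whole processes across cores. A toy sketch (the sum-of-squares workload is a hypothetical stand-in for an encode job, not a real encoder):

```python
import multiprocessing as mp

def encode_chunk(frames):
    # Stand-in for an encode job: pure CPU work, no inter-job coordination.
    return sum(i * i for i in range(frames))

if __name__ == "__main__":
    # Two independent "encodes" run on two cores with no extra programming
    # effort -- the easy kind of multicore win the post describes.
    with mp.Pool(processes=2) as pool:
        results = pool.map(encode_chunk, [100_000, 100_000])
    print(results[0] == results[1])  # True
```

No one had to make the encoder itself multithreaded; the second core pays off just because a second independent task exists.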


You're kidding right? Without the SLI chip XFX and eVGA at least wouldnt have released anything, meaning a lot less chipsets would have sold already to them. Of course nVidia knows with the SLI chip they'll sell more cards, but you're turning it around. Those people who seriously want SLI would buy those cards anyway and wouldnt have bought i7 at all and stuck to 775 or consider AM2+/AM3.


Indeed. Im no anti-Intel person at all, but this marketing stuff is taking the piss, way too much to be quite honest:rolleyes:

Ive certain things against Intel, and this way it's only growing. For me personally that is of course, but this ain't adding any positive thing to all of this.Woah! Interesting analyses. What is XFX offering that Asus, or Gigabyte, or DFI, or Biostar, or MSI, or any number of manufacturers aren't offering? XFX is in it for the money, they're not doing Intel a favor. As I've said before, if someone wanted SLI with Intel's next gen, the only way to get there is Ci7 and x58. In case you still don't get it, here it is:
Once Intel pulled that move on nvidia by denying them access to Ci7, just as they're doing now, Nvidia had no choice but to capitalize elsewhere by peddling SLI to the manufacturers. Intel gave up nothing; if SLI was important for Intel, there would have been a deal, simple as that.

Manabu
12-24-2008, 03:12 PM
Intel doesn't want Nvidia's chipset because it is too good, and it may make people perceive that they don't need a Pentium Dual-Core or an E7200 for their simple everyday tasks; a cute EeeBox-like PC could do the job. Of course, the more speed the better, but as already said, we are reaching a point of diminishing returns on a bigger investment in computers, for most needs.

Nvidia still has Tegra to counterattack in the netbook market. In this era of "good enough computing", ARM is rising in performance as they always have, and soon they will overlap with x86 as Intel and AMD move down into netbooks. The Cortex-A9 will probably be as powerful as current Atoms and consume less than 1 watt for the whole SoC, but of course it will lack x86 compatibility.

T_Flight
12-24-2008, 03:19 PM
On my end, I'm not seeing limited multicore use at all. Man, I have been waiting for a powerhouse for YEARS. Everything I do that takes the most time is heavily dependent on multithreaded performance. Everything. I couldn't even begin to tell somebody how much time of my life was wasted waiting on encodes to finish, waiting on flow calculations to complete, trying to run firing simulations that took forever and only ran at slideshow level because the performance was not there. Even now, it has barely caught up.

I'm not looking at limited multicore performance at all, and it's not overkill... it's barely enough. It is the main reason I bought my system, and the reason I waited an extra year longer than I planned on buying one. Many of my friends are seeing the same things. They do not want to wait on single-threaded performance anymore when multithreaded apps can cut their time to a fraction of what it was before. It's not limited at all.

Games are moving entirely to it. The rest are following. They have to, to keep up, and by this time next year the entire complexion of what used to be single-threaded apps is gonna change. People from the old school of single-threaded coding are finding they have to retrain for multithreaded programming, or they quickly find themselves without a job. It is literally happening by the minute.

The words "Let's go single-threaded with this app" would be quickly followed by "You don't wanna last very long here, do you?" :D

Sentential
12-24-2008, 03:23 PM
Right.. 4,000 people don't know anything about the industry or their position, but joe schmoe on the forums does :clap:

And there is absolutely right and wrong in the eyes of a free market economy. That's why we have these funny things called "laws" governing how companies do business. Also there are these funny concerns around "anti-trust".

Also, open source CUDA/PhysX? For what? What do you hope to gain? Anyway, if you had done your homework you would've realized that people who use PhysX get the source code too...

You're basing all criticisms on these really vague demands for open source and "opening things" without any clear reason of how or why. Where are you going with this?

First, laws are artificial and not part of the free market, and oftentimes, as illustrated by the collapse of the sub-prime mortgage market, they do more harm than good, all in the name of "fairness".

As for the CUDA/PhysX argument, my point is that locking these technologies into a pay-to-play environment is killing any possibility they had of widespread adoption. Why would anyone want PhysX when only 1/3 of the GPUs on the market can actually use it? Do you really think Valve is going to make a game with the intent of limiting their target market by two-thirds?

The key is proliferation; that is nVidia's issue right now. They are an isolated company that is quickly becoming obsolete. What they need is to proliferate the market with as much intellectual property as possible, so that they have as much input into the designs of gaming studios as possible.

Sure, they can use their *improved* implementation of PhysX to generate revenue, and the same can be said of their superior OpenGL and soon-to-be OpenCL support, but their attempts at locking people out of the market are shooting themselves in the foot.

If they continue this lunacy they will die off just like OpenGL and just like Glide. nVidia has to get it into their heads that A) they are not a monopoly and B) if they fell off the face of the earth tomorrow, no one would care. They are not and will never be an "Intel"; the sooner they realize that, the better, and the longer they keep thinking they are, the more damage they will do to the company as a whole.

The sad thing I don't get is that what ATi is doing to them is the same model nVidia used to bring down 3dfx with the GeForce GTS and Riva TNT2: smaller, cheaper cards with better yields versus monolithic high-powered GPUs. Funny how history repeats itself...

Iconyu
12-24-2008, 04:34 PM
Monopoly.

Intel doesn't want Atom CPUs eating into the Core 2 market: the bigger the Atom market grows, the harder it'll be to sell performance CPUs. Intel wants to stick Atoms into mobile phones and PDAs, not netbooks.

[XC] riptide
12-24-2008, 06:45 PM
Intel chips with nVidia chipsets? *shudder* Always gives me the creeps... \thinks about countless 'My 680i board is dead/can't clock/killed my RAM' threads.\

Toysoldier
12-24-2008, 06:50 PM
Atom is Intel's property, and they have every right to do what they wish with it.

+ 1

KoHaN69
12-24-2008, 06:55 PM
yah, good thing AMD and VIA are releasing their solutions in 1Q 09; that should force Intel to change its tactics in this market.

onboard hd 3200 or better in a netbook would make me an instacustomer

nn_step
12-24-2008, 07:05 PM
Fortunately AMD is more than happy to partner with nVidia again.

tiro_uspsss
12-24-2008, 08:35 PM
riptide;3524955']Intel chips with nVidia chipsets? *shudder* Always gives me the creeps... \thinks about countless 'My 680i board is dead/can't clock/killed my RAM' threads.\

+1

intels shafts nv again? *excellent smithers, excellent* :D

Donnie27
12-24-2008, 08:46 PM
+1

intels shafts nv again? *excellent smithers, excellent* :D

Yup, nVidia never screwed Intel......:rofl::ROTF:

Donnie27
12-24-2008, 08:49 PM
My friend, that has always been the case; manufacturers are always going to make the next product more attractive; it happens in auto sales, electronics, etc. In any case, a bad economy is worse for discrete gpu sales than pc units as a whole. People would rather have low cost dualcore/quadcore solution than a gaming rig, which is more costly.

Repost to you to say the same thing!

You're fighting a losing battle against folks who think Intel is always wrong and never right, never did anything good or worthwhile; lol, as was said, they suck. You can't get anywhere with folks like that, no matter how RIGHT YOU ARE.

nVidia brought this on themselves.

Chewbenator
12-24-2008, 09:05 PM
Uhm, we are talking about the Atom paired with a 9400 and below right? What is up with all the squabbling about core i7s and octo cores? I mean, it's netbooks we're talking about here, not SLIed quadcore desktops. Yeesh it's like a fanboy festival in here.

Now related to the topic, with Intel stiff arming Nvidia how is this product possible?
http://www.newegg.com/Product/Product.aspx?Item=N82E16834220385&nm_mc=OTC-Froogle&cm_mmc=OTC-Froogle-_-Netbooks-_-ASUS-_-34220385

Asus N10J-A1, Intel Atom with 9300GS.

tiro_uspsss
12-24-2008, 09:09 PM
Uhm, we are talking about the Atom paired with a 9400 and below right? What is up with all the squabbling about core i7s and octo cores? I mean, it's netbooks we're talking about here, not SLIed quadcore desktops. Yeesh it's like a fanboy festival in here.

Now related to the topic, with Intel stiff arming Nvidia how is this product possible?
http://www.newegg.com/Product/Product.aspx?Item=N82E16834220385&nm_mc=OTC-Froogle&cm_mmc=OTC-Froogle-_-Netbooks-_-ASUS-_-34220385

Asus N10J-A1, Intel Atom with 9300GS.

from what I understand, it's still possible - easily - it's just that any NV-equipped NBs are gonna be more expensive... might be wrong tho :D :shrug:

Speederlander
12-24-2008, 09:40 PM
You're kidding right? You really believe on die GPUs will ever be as fast as a discrete dedicated GPU which isn't constrained to some corner of the chip?

They will be fast enough for 99.8% of the consuming public. And that's all that matters.

IvanAndreevich
12-24-2008, 09:57 PM
They will be fast enough for 99.8% of the consuming public. And that's all that matters.

So you think 1/500th of the population are gamers? :down:

Speederlander
12-24-2008, 10:27 PM
So you think 1/500th of the population are gamers? :down:

No, it's just that 499/500 don't know the first thing about video card makes or models and don't measure their self-worth in fps. Integrated/embedded graphics are where the consumer market is going. Stand-alone GPUs will be forced into an ever smaller corner of the market as CPUs and embedded GPUs gain power. As long as the end result is great-looking games, who cares about the mechanism used to achieve it? You need a separate video card solution now, but assuming you will in the future is a big mistake.

Drwho?
12-25-2008, 12:03 AM
Guys, you've got to look at the roadmap: with the next generation of Atom, the chipset will be integrated, so it does not make sense to enable several chipsets, especially if the first goal is compatibility. If I were NV, I would focus on GPUs and 3D shaders, where competition is coming and they look to be weakening, and try to move to other businesses.
Spending research on enabling a market that will not exist in 18 months leaves me speechless. Since the beginning, Intel has said that the GPU will move into the CPU in the Atom product line in 2009/2010.

this is my personal opinion.
Francois

Spectrobozo
12-25-2008, 12:06 AM
Uhm, we are talking about the Atom paired with a 9400 and below right? What is up with all the squabbling about core i7s and octo cores? I mean, it's netbooks we're talking about here, not SLIed quadcore desktops. Yeesh it's like a fanboy festival in here.

Now related to the topic, with Intel stiff arming Nvidia how is this product possible?
http://www.newegg.com/Product/Product.aspx?Item=N82E16834220385&nm_mc=OTC-Froogle&cm_mmc=OTC-Froogle-_-Netbooks-_-ASUS-_-34220385

Asus N10J-A1, Intel Atom with 9300GS.

from what I understand, Ion is still possible, but in order to do it the manufacturer will need to buy an Atom with a 945 chipset, then take the 945 out and use whatever they want. In this netbook the 945 is still there; the 9300GS is just a PCIe x1 GPU. It is still possible, but at a higher cost, which will make the product more expensive; and then why do it, if you can make a Core 2 + 9400 for the same price?
Anyway, I think I saw an Atom on a SiS-based chipset done in the same scheme: throw out the 945 and use whatever you want.
And Intel does it for a good reason: Atom is very cheap and good enough for most people; with a 9400GT and a low cost, that would affect low-end Core 2 sales. Selling a CPU+chipset bundle is a lot better for the company, and they can make the Atom platform less interesting so as not to kill the low-end Core 2.

Too bad that VIA can't compete: the Nano costs more, is hard to find, and consumes more power.
http://www.logicsupply.com/products/vb8001_16

Rammsteiner
12-25-2008, 07:02 AM
Woah! Interesting analyses. What is XFX offering that Asus, or Gigabyte, or DFI, or Biostar, or MSI, or any number of manufacturers aren't offering? XFX is in it for the money, they're not doing Intel a favor. As I've said before, if someone wanted SLI with Intel's next gen, the only way to get there is Ci7 and x58. In case you still don't get it, here it is:
Once Intel pulled that move on nvidia by denying them access to Ci7, just as they're doing now, Nvidia had no choice but to capitalize elsewhere by peddling SLI to the manufacturers. Intel gave up nothing; if SLI was important for Intel, there would have been a deal, simple as that.
Personally I wouldn't even buy any XFX/eVGA product, but there are loads of people who do want their products. In the end it's nice to have most of your hardware from one manufacturer. Most nVidia fanboys are at the same time very loyal to XFX/eVGA. What do they offer over others? Not much; does it matter? Obviously enough to justify buying one.

And SLI is important to Intel, otherwise they wouldn't have allowed any chipset from nVidia on it. Intel simply had more money, a better future in sight and, last but not least, more power. nVidia couldn't do much more than withdraw from the fight.


Guys, you've got to look at the roadmap: with the next generation of Atom, the chipset will be integrated, so it does not make sense to enable several chipsets, especially if the first goal is compatibility. If I were NV, I would focus on GPUs and 3D shaders, where competition is coming and they look to be weakening, and try to move to other businesses.
Spending research on enabling a market that will not exist in 18 months leaves me speechless. Since the beginning, Intel has said that the GPU will move into the CPU in the Atom product line in 2009/2010.

this is my personal opinion.
Francois
Do you let the fab cleaners design architectures for your CPUs if you feel AMD is slapping back? I mean, a specialized chipset group isn't going to be very useful in supporting a much larger GPU team. However much they'd love to, they'd be useless, just like people working on AMD GPUs would be useless for AMD CPUs (unless it's something like Fusion, of course, but that's different).

metro.cl
12-25-2008, 07:43 AM
Guys, you've got to look at the roadmap: with the next generation of Atom, the chipset will be integrated, so it does not make sense to enable several chipsets, especially if the first goal is compatibility. If I were NV, I would focus on GPUs and 3D shaders, where competition is coming and they look to be weakening, and try to move to other businesses.
Spending research on enabling a market that will not exist in 18 months leaves me speechless. Since the beginning, Intel has said that the GPU will move into the CPU in the Atom product line in 2009/2010.

this is my personal opinion.
Francois

That is 100% monopoly thinking: you are implying that you want to help NVIDIA by not letting them participate in this market, because in two more years you will have what they have today?

This approach sucks. I would never buy an Atom netbook or nettop with the current GPU, because I want to see 1080p on my flat panel, and this Ion thing is perfect for that (size, noise, performance, price). If it were available I'd buy 3 right now, so please try to make Intel work on a replacement for Ion fast/now, because many want it.

MomijiTMO
12-25-2008, 07:48 AM
Well, yay or nay? Not really sure. Don't forget AMD are still very much alive so monopoly we have not.

Hornet331
12-25-2008, 08:04 AM
That is 100% monopoly thinking: you are implying that you want to help NVIDIA by not letting them participate in this market, because in two more years you will have what they have today?

This approach sucks. I would never buy an Atom netbook or nettop with the current GPU, because I want to see 1080p on my flat panel, and this Ion thing is perfect for that (size, noise, performance, price). If it were available I'd buy 3 right now, so please try to make Intel work on a replacement for Ion fast/now, because many want it.

Why do you want to watch HD material on a nettop?

If you want to watch HD material, get a DG45FC + E5200; it comes in at ~170€ and beats the crap out of any Atom. :yepp:

Or if you don't like the Intel solution, you can go for an 8200/780G and a low-power X2. It comes in at the same money and plays HD material just as well, and even lets you play games at a decent frame rate. :yepp: (though they consume more when the system is loaded)

All boards are mini-ITX, so similar in size to Atom. ;)

metro.cl
12-25-2008, 08:33 AM
Why do you want to watch HD material on a nettop?

If you want to watch HD material, get a DG45FC + E5200; it comes in at ~170€ and beats the crap out of any Atom. :yepp:

Or if you don't like the Intel solution, you can go for an 8200/780G and a low-power X2. It comes in at the same money and plays HD material just as well, and even lets you play games at a decent frame rate. :yepp: (though they consume more when the system is loaded)

All boards are mini-ITX, so similar in size to Atom. ;)

As I said, size and noise are important for me, because I'll have one in each room with a TV. The other solutions just aren't good enough, and availability down here is always really limited, so I can't get mini-ITX mobos or mini-ITX cases; I'd have to buy in the US and import, and then it gets a lot more $$$.

P.S. I don't want to watch it on a nettop, but due to its size it is just perfect.

Zucker2k
12-25-2008, 08:41 AM
Personally I wouldn't even buy any XFX/eVGA product, but there are loads of people who do want their products. In the end it's nice to have most of your hardware from one manufacturer. Most nVidia fanboys are at the same time very loyal to XFX/eVGA. What do they offer over others? Not much; does it matter? Obviously enough to justify buying one.

And SLI is important to Intel, otherwise they wouldn't have allowed any chipset from nVidia on it. Intel simply had more money, a better future in sight and, last but not least, more power. nVidia couldn't do much more than withdraw from the fight. Your reasoning is so off! Here's why:

1. Nvidia does not need Intel's consent to implement SLI on their platform. The architecture is already there (i.e. 2, 3, 4 PCI-E sockets). SLI was blocked at the driver level. Nvidia chose to implement it through hardware by ID'ing a physical chip on a board before the drivers kick in.

2. XFX has resources dedicated to making (rebadging?) and marketing motherboards; those resources are useless unless they can sell/offer something new. Nvidia has nothing new to offer, and Ci7 is new and is the future. Need I say more? Where does that leave XFX? Or maybe XFX's loyal customers don't like new hardware? :rolleyes: XFX exists to make money, and they'll go where the money is. Let me say it again: Nvidia has nothing new to offer, so companies like EVGA and XFX naturally looked elsewhere. If they stay with Nvidia, they stay with an old and outdated generation and would sink with Nvidia, if indeed that is where Nvidia is headed. Even if that's not the case, they still have to make money. :shakes: Sorry, but that's not hard to grasp. Yes, they're still offering video cards, but as far as a platform supporting Intel's next-gen CPUs goes, they'd have had nothing to offer except to jump on an already overpopulated bandwagon.

Rammsteiner
12-25-2008, 09:10 AM
Your reasoning is so off! Here's why:

1. Nvidia does not need Intel's consent to implement SLI on their platform. The architecture is already there (i.e. 2, 3, 4 PCI-E sockets). SLI was blocked at the driver level. Nvidia chose to implement it through hardware by ID'ing a physical chip on a board before the drivers kick in.
Because Intel didn't allow nVidia to release their own platform, because nVidia said no to an SLI license on Intel's X58 chipset, which is completely within their right. nVidia of course saw a quite overhyped, popular platform coming up and wanted a deal in it somehow, which after a long one-sided fight turned out like this.


2. XFX has resources dedicated to making (rebadging?) and marketing motherboards; those resources are useless unless they can sell/offer something new. Nvidia has nothing new to offer, and Ci7 is new and is the future. Need I say more? Where does that leave XFX? Or maybe XFX's loyal customers don't like new hardware? :rolleyes: XFX exists to make money, and they'll go where the money is. Let me say it again: Nvidia has nothing new to offer, so companies like EVGA and XFX naturally looked elsewhere. If they stay with Nvidia, they stay with an old and outdated generation and would sink with Nvidia, if indeed that is where Nvidia is headed. Even if that's not the case, they still have to make money. :shakes: Sorry, but that's not hard to grasp. Yes, they're still offering video cards, but as far as a platform supporting Intel's next-gen CPUs goes, they'd have had nothing to offer except to jump on an already overpopulated bandwagon.
Am I talking Chinese? :confused:

Of course XFX/eVGA barely innovate anything; it's all rebadged (almost everything). But where does this change the fact that due to this rebadging more boards would be sold? XFX/eVGA wouldn't jump on this bandwagon if they didn't gain anything from it, so apparently it's possible to rebadge and sell it as well, meaning somewhere there's more profit due to more sales.

Besides that, I seriously don't think eVGA for sure, XFX unsure, would have sold X58 boards without SLI. Selling motherboards is a nice little extra income, but not selling them wouldn't kill them at all. Also, if what you're saying is true, I'm wondering what that huge tri-SLI logo covering the front of the motherboard box is doing there. :rolleyes:

Maybe we just differ in opinion about this matter, but don't shout at me that my reasoning is wrong.

Zucker2k
12-25-2008, 10:06 AM
Because Intel didn't allow nVidia to release their own platform, because nVidia said no to an SLI license on Intel's X58 chipset, which is completely within their right. nVidia of course saw a quite overhyped, popular platform coming up and wanted a deal in it somehow, which after a long one-sided fight turned out like this.


Am I talking Chinese? :confused:

Of course XFX/eVGA barely innovate anything; it's all rebadged (almost everything). But where does this change the fact that due to this rebadging more boards would be sold? XFX/eVGA wouldn't jump on this bandwagon if they didn't gain anything from it, so apparently it's possible to rebadge and sell it as well, meaning somewhere there's more profit due to more sales.

Besides that, I seriously don't think eVGA for sure, XFX unsure, would have sold X58 boards without SLI. Selling motherboards is a nice little extra income, but not selling them wouldn't kill them at all. Also, if what you're saying is true, I'm wondering what that huge tri-SLI logo covering the front of the motherboard box is doing there. :rolleyes:

Maybe we just differ in opinion about this matter, but don't shout at me that my reasoning is wrong. By your own argument, motherboard sales don't mean much to XFX. Let's say that is true. Answer this question: how do EVGA and XFX benefit Intel? Anyone who wants an Intel board to run SLI already has choices; only a few months back, these guys were selling other platforms and Intel boards still saturated the market. :confused: This is my argument: these guys are offering nothing new or special in this instance. As YOU have pointed out, it'll help them sell more video cards IF they offer rebadged boards and other hardware under their brands. So in effect, and per your argument, since their motherboard operations are negligible, and since they reap more rewards from GPU sales, wouldn't it be true that it's EVGA, XFX, and Nvidia that stand to gain more from having SLI on the Intel platform? Jeez, this isn't science; Nvidia, with no SLI on the Intel platform and no new Ci7 platform to offer, would have lost all multi-GPU sales on that platform to AMD/ATi. I'm done.

Rammsteiner
12-25-2008, 11:02 AM
By your own argument, motherboard sales don't mean much to XFX. Let's say that is true. Answer this question: how do EVGA and XFX benefit Intel? Anyone who wants an Intel board to run SLI already has choices; Stop right there,

You still forget the point: it's not about anyone wanting Intel and SLI, it's about wanting SLI, with eventually Intel. For 775 there was choice enough; if nVidia's chipsets are so bad, if SLI is such a non-preference, then how come XFX/eVGA didn't sell Intel chipsets? Or even better, how come so many people bought those motherboards anyway? Even after the 680i fiasco? Right, because they want SLI, or even believe in the strength of nVidia chipsets and/or aren't having any troubles with them.

But without this SLI chip there was no SLI on X58s, meaning no i7+SLI abilities, meaning people would either stick to 775 or, when the time was right for an upgrade, move to AM2+/AM3, depending on how Deneb turns out. There are so many people, I actually dare to say an easy 30%+ of the enthusiast/gamer group, who want nothing less than SLI. If i7 ain't going to run it, then all they would think is 'Oh well, more money left for a 3rd/better GPU'.

As said, this isn't about Intel + SLI, it's about SLI + something to run it on. And of course XFX/eVGA are there for the money; who isn't, anyway?

Anyway, it's getting a bit OT now I guess:p:

Ghostbuster
12-25-2008, 12:00 PM
Maybe NVIDIA will go to Intel and bargain like they did with X58: enable SLI for Intel chipsets in exchange for the Ion platform... :rofl:

Sounds silly enough... :D

Shintai
12-25-2008, 12:19 PM
(Waits for Atom "2". No more chipset discussion).

Ghostbuster
12-25-2008, 12:42 PM
(Waits for Atom "2". No more chipset discussion).Moorestown? What about desktop and HTPC? :D

Drwho?
12-25-2008, 01:29 PM
(Waits for Atom "2". No more chipset discussion).

The future is integration; it always was and always will be.
You'll see the chipset follow the sound card, the network card, and the rest, like the RS232 card and the parallel port card.

1) In the case of Atom, not having an external bus saves a lot of power when everything is on one die.
2) Integration is good for consumers; it provides more stable hardware to work on.
3) Whoever thinks this is monopolistic behavior needs to understand that Atom is supposed to be cost-effective; it has nothing to do with monopoly, that is a naive argument. If you claim this, then AMD's move of the memory controller on-die was monopolistic too, same for Fusion, or the 80386SX memory controller on-die...

Integration is a natural process, and in 2009/2010 the chipset will be listed as a disappearing species. :up:

this is my personal opinion

Shintai
12-25-2008, 01:39 PM
It also allows smaller chassis/MB designs :p:

I think mATX will very soon be the new standard, so to say, instead of ATX.

K8 added the MCH (again), Core i5 adds PCIe, Core i3? And AMD's APU design adds IGP+PCIe.

The next step would be either southbridge or VRM integration into the CPU, for both AMD and Intel, I think. My money is on the latter.

http://shintai.ambition.cz/pics/news121323.jpg
http://shintai.ambition.cz/pics/news121323_003.jpg

It would clear alot of motherboard space.

http://images.anandtech.com/reviews/tradeshows/IDF/2005/Fall/Day3/Keynote/demo.jpg

Benefits are obvious:
http://images.anandtech.com/reviews/tradeshows/IDF/2005/Fall/Day3/Keynote/coarse.jpg
http://images.anandtech.com/reviews/tradeshows/IDF/2005/Fall/Day3/Keynote/fine.jpg

Drwho?
12-25-2008, 01:40 PM
Stop right there,

You still forget the point, it's not about anyone wanting Intel and SLI, it's about wanting SLI with eventually Intel. For 775 there was choice enough, if nVidia's chipsets are so bad, if SLI is such a non-preferance, then how come XFX/eVGA didnt sell Intel chipsets? Or even better, how come so many people bought those motherboards anyway? Even after the 680i fiasco? Right, because they want SLI or even believe in the strength of nVidia chipsets and/or ain't having any troubles with the chipsets.

But without this SLI chip, there was no SLI on X58's, meaning no i7+SLI abilities, meaning people would either stick to 775 or when time was right for an upgrade move to AM2+/AM3, depending how Deneb evaluates. There are so many people, I actually dare to say an easy 30%+, in the enthusiast/gamer group who want nothing less then SLI. If i7 ain't going to run it, then all they would think is 'Oh well, more money left for a 3rd/better GPU'.

As said, this isnt about Intel + SLI, it's about SLI + something to run it on. And of course XFX/eVGA are there for the money, who ain't anyway.

Anyway, it's getting a bit OT now I guess:p:

Dual SLI is < 0.1% of the market. :shrug:
Except here on xtremesystems, nobody cares; it is a benchmark feature.
0.001% of the market uses 2560x1600, 2% of people have better than 1280x1024... you've got to put things into context.
99% of PC users don't need SLI, because they run at too low a resolution for it to be efficient. It had to be said.

this is my very personal opinion

Rammsteiner
12-25-2008, 01:51 PM
Dual SLI is < 0.1% of the market. :shrug:
Except here on xtremesystems, nobody cares; it is a benchmark feature.
0.001% of the market uses 2560x1600, 2% of people have better than 1280x1024... you've got to put things into context.
99% of PC users don't need SLI, because they run at too low a resolution for it to be efficient. It had to be said.

this is my very personal opinion
You're right on that part; then again, how many people are truly interested in i7 as we know it now ;)

i7 mainly has a purpose for the same people here at XS, and those very same people do care about benchmarks. Also, if you've got an i7, I don't see you using only a 17" screen :p:

Sr7
12-25-2008, 01:57 PM
Dual SLI is < 0.1% of the market. :shrug:
Except here on xtremesystems, nobody cares; it is a benchmark feature.
0.001% of the market uses 2560x1600, 2% of people have better than 1280x1024... you've got to put things into context.
99% of PC users don't need SLI, because they run at too low a resolution for it to be efficient. It had to be said.

this is my very personal opinion

i7 would be too. But it's sold into new PCs after discontinuing old stock, so it's basically forced onto people who don't need it or don't know they have it.

If Crossfire or SLI worked the same way, they could claim it's more relevant and desired than it really is based purely off shipment numbers too ;)

Resolution isn't everything for GPU demand either. Things like high levels of AA really tax systems, even at 12x10.

Shintai
12-25-2008, 01:57 PM
You're right on that part, then again how many people are truelly interested in i7 as we know it now;)

i7 has mainly a purpose for the same people here at xs, and those very same people do care about benchmarks. Also if you've got an i7 I dont see it happening only using a 17" screen:p:

That depends on what you do ;)

i7 can do a lot more than graphics gaming.

Drwho?
12-25-2008, 02:36 PM
You're right on that part; then again, how many people are truly interested in i7 as we know it now ;)

i7 mainly has a purpose for the same people here at XS, and those very same people do care about benchmarks. Also, if you've got an i7, I don't see you using only a 17" screen :p:

Most of the 2009 and 2010 monitors will get to 1080p or so; NV already has too much power for that. Why do you think they are pushing CUDA? It is because the need for shaders in games is slowing down: too many shaders for the number of pixels.

Now, thinking that you have enough generic processing power is funny; even NV wants a part of the cake with CUDA.
Your PC today is very stupid; it cannot recognize your girlfriend in your set of pictures on your hard drive. Those kinds of algorithms are not very friendly with the PCI side of the CUDA story. The future of applications for consumers is visual computing, and the processor is the primary and unique piece of the puzzle that can achieve all of this efficiently: http://www.youtube.com/watch?v=7M3WrdDy_Dg
All the big actors of the market are trying to "copy" deepviewer, but they are all missing the point. The point is not the 3D part of the interface; it is about the computer helping you to find your data. I am fully committed to making the computer smart, I spend all of my days in this direction, and we are almost ready.
Don't think that 3D in itself is useful for users; what is cool is being able to use complex algorithms and match objects together: people, place, time.
A Core i7 at $280 is equipped to do this kind of visual computing.

[XC] leviathan18
12-25-2008, 07:15 PM
OK, but why stop Nvidia from releasing something that in my eyes is far superior to the Atom platform Intel is selling right now? Mind you, I have an Atom motherboard with CPU, a dead QX6700 in my dead mobo, and a C2D in my laptop, so all I get is Intel. But then again, with the idea and the size of Ion, Intel could get more Atom CPUs selling than they are doing right now.

Drwho?
12-25-2008, 11:31 PM
leviathan18;3526918']OK, but why stop Nvidia from releasing something that in my eyes is far superior to the Atom platform Intel is selling right now? Mind you, I have an Atom motherboard with CPU, a dead QX6700 in my dead mobo, and a C2D in my laptop, so all I get is Intel. But then again, with the idea and the size of Ion, Intel could get more Atom CPUs selling than they are doing right now.

Well, first, I am not involved in any of these products, so I don't know the details, but you are basically asking Toyota to start shipping a Porsche with a Toyota engine.
Each company has the freedom to decide how to sell its products.
This is something nobody can argue with.

this is my personal opinion.

kl0012
12-26-2008, 12:01 AM
Intel Claims It Does Not Force to Buy Intel Atom with Core-Logic.
Even though Nvidia Corp. was quoted as saying that Intel Corp. only sells its Atom processors for netbooks and nettops bundled with its own core-logic sets, which is why none of the device manufacturers were interested in Nvidia's Ion platform powered by the GeForce 9400M chipset, Intel Corp. denied that it blocks Nvidia from entering the market of ultra low-cost personal computers (ULCPCs).

“We do sell Atom both bundled and as stand alone,” an Intel spokesperson told X-bit labs.

According to Intel officials, Nvidia does not need to obtain a separate license to make and sell chipsets compatible with Intel Atom processors aimed mostly at ULCPCs.

http://www.xbitlabs.com/news/mobile/display/20081225140815_Intel_Claims_It_Does_Not_Force_to_Buy_Intel_Atom_with_Core_Logic.html

OutSider
12-26-2008, 12:30 AM
So there is still a chance for us to see the Ion platform on the market.

Shintai
12-26-2008, 01:49 AM
So there is still a chance for us to see the Ion platform on the market.

Yes, of course. Else nVidia wouldn't have started to develop it. Sensationalism ftl.

Zaskar
12-26-2008, 06:28 AM
You're kidding, right? You really believe on-die GPUs will ever be as fast as a discrete, dedicated GPU which isn't constrained to some corner of the chip?

Fusion and also Intel's integration of graphics processors are going to be okay for office work.

Isn't that like saying on-CPU cache will never be as good as separate memory, since it's constrained to a small section of the CPU? Same goes for the memory controller.

Things get made smaller, and when integrated with the CPU the transfer time becomes MUCH faster. It all comes down to being able to shrink it enough, and as you can see from every computer part in history, that's all just a matter of time.

The only downside is no separate upgrades.

Rammsteiner
12-26-2008, 07:41 AM
digitimes is the new fud:p:

metro.cl
12-26-2008, 07:56 AM
Well, first, I am not involved in any of these products, so I don't know the details, but you are basically asking Toyota to start shipping a Porsche with a Toyota engine.
Each company has the freedom to decide how to sell its products.
This is something nobody can argue with.

this is my personal opinion.

The example is kinda weird; Intel = engine, and the NVIDIA chipset/GPU should be something like the gearbox.

Then, as you make engines and not cars, car manufacturers can mix and match :) Sounds like a much nicer way to see life :)

Shintai
12-26-2008, 09:31 AM
You're kidding, right? You really believe on-die GPUs will ever be as fast as a discrete, dedicated GPU which isn't constrained to some corner of the chip?

Fusion and also Intel's integration of graphics processors are going to be okay for office work.

An i7 with an on-die GPU could easily be at the performance of the HD4650 and HD4670. And that's with memory bandwidth.

trinibwoy
12-26-2008, 09:39 AM
An i7 with an on-die GPU could easily be at the performance of the HD4650 and HD4670. And that's with memory bandwidth.

On 32nm maybe yes. But the point isn't that future IGP's won't be as fast as current discrete parts. Of course they will. The point is that integrated will never be as fast as discrete of the same generation. Stuff gets integrated when the performance is "good enough". We are decades (maybe centuries) away from having good enough graphics and physics acceleration performance so we'll be seeing discrete cards for a good long while. People will always strive to model the real world and that will take an infinitely greater amount of processing power than we have available today.

The on-die cache analogy is flawed in many ways. But primarily it's that cache on its own is useless. Cache is just a fast buffer between the CPU and main memory. When you can integrate 4-6GB of cache on a CPU then maybe you'll have something to talk about. But by that time we'll have Terabyte main memory chips. Oh well :)

Drwho?
12-26-2008, 10:20 AM
On 32nm maybe yes. But the point isn't that future IGP's won't be as fast as current discrete parts. Of course they will. The point is that integrated will never be as fast as discrete of the same generation. Stuff gets integrated when the performance is "good enough". We are decades (maybe centuries) away from having good enough graphics and physics acceleration performance so we'll be seeing discrete cards for a good long while. People will always strive to model the real world and that will take an infinitely greater amount of processing power than we have available today.

The on-die cache analogy is flawed in many ways. But primarily it's that cache on its own is useless. Cache is just a fast buffer between the CPU and main memory. When you can integrate 4-6GB of cache on a CPU then maybe you'll have something to talk about. But by that time we'll have Terabyte main memory chips. Oh well :)

Well, well, centuries may be Microsoft centuries... it says "2 hours to complete copy" and finishes in 20 secs :)

It is much easier to catch up from the low end to the high end than the other way around, based on Moore's law. The GPUs are at the power wall.
Tesla is just a good marketing way to give more power/cooling to the chip by adding an external box, but it is obvious that they have no clue how to take the power down; they had 3 years to try.

In the meantime, you have high-k transistors coming to the integrated area, where the fabrication process will help dramatically decrease the leakage, to the point that you can control which areas you need active or not.

As I always say, most of the battle is not in the architecture. Of course it is important, but the real deal is in the transistors, especially when you want to add to the CPU some more transistors to do GPU work.

What matters is power per transistor; this is where the natural selection will happen. All 3 companies have smart architects, and they will all get to the same efficiency per core/stream execution unit; the difference will be the process technology.

Again, this is my very personal opinion.

Speederlander
12-26-2008, 10:37 AM
On 32nm maybe yes. But the point isn't that future IGP's won't be as fast as current discrete parts. Of course they will. The point is that integrated will never be as fast as discrete of the same generation. Stuff gets integrated when the performance is "good enough". We are decades (maybe centuries) away from having good enough graphics and physics acceleration performance so we'll be seeing discrete cards for a good long while. People will always strive to model the real world and that will take an infinitely greater amount of processing power than we have available today.

The on-die cache analogy is flawed in many ways. But primarily it's that cache on its own is useless. Cache is just a fast buffer between the CPU and main memory. When you can integrate 4-6GB of cache on a CPU then maybe you'll have something to talk about. But by that time we'll have Terabyte main memory chips. Oh well :)

No, it doesn't have to be "as fast as", it just has to be fast enough. Just because they can build something faster doesn't mean it's always needed. With successive shrinks and parallel processing improvements you will see less and less of a need for dedicated graphics cards. I look forward to the day when all I have to do is add processing cores and memory.

Manicdan
12-26-2008, 11:02 AM
No, it doesn't have to be "as fast as", it just has to be fast enough. Just because they can build something faster doesn't mean it's always needed. With successive shrinks and parallel processing improvements you will see less and less of a need for dedicated graphics cards. I look forward to the day when all I have to do is add processing cores and memory.

it probably won't ever come to that

There are different systems for different needs. A netbook may very well be the first step to having a computer the size of a paperback book that can do anything but play video games. Then we will still have people who want to play games and will be happy with ATX cases and 500+ watts of energy consumption. I personally would love to have a high-end gaming machine the size of a console, but they are not built to cover that spectrum.

Things are changing though. We now have laptops stronger than the desktop most people need, and we are starting to see battery life last longer than ever before. The different goals of a computer are crossing over: gaming on a budget, a complete PC in the smallest sizes ever. But some things don't change, like Windows requiring more and more resources with every version. In the end it will be what users really desire; as long as we are happy with video cards being 12" long and costing $500, they will keep making them.

Speederlander
12-26-2008, 11:41 AM
it probably won't ever come to that
Exponential growth in the capability-to-size ratio over the last 25 years says it certainly will come to that. And again, when the death of the dedicated video card comes due to CPU capability, it will be a happy one.

Shintai
12-26-2008, 12:51 PM
In around 2012-2013 I expect discrete GPU sales to be a mere shadow of themselves. Maybe around 5% of what they are now.

Rammsteiner
12-26-2008, 01:52 PM
In around 2012-2013 I expect discrete GPU sales to be a mere shadow of themselves. Maybe around 5% of what they are now.
:rofl:

I'm sorry, but that's really a weird statement. That would mean a new CPU SKU has to be released for every combination of GPU and CPU, not even mentioning the unclarity about Intel's GPU designs and how AMD's and nVidia's GPUs are going to perform. I can already tell you that even if Intel hits the right spot with its GPU processing and integrates it into CPUs like AMD will do, if nVidia manages to get a GPU out twice as fast as any of them, no one is going to focus on this integrated GPU and will just get a discrete nVidia GPU. And if Intel and AMD left the discrete GPU market, which is very unlikely, nVidia would have the entire market to itself.

Besides that, the only thing that might be likely is that we see dual-socket motherboards, one socket for a CPU and one for a GPU. But integrated GPUs going all the way into the high-end section and satisfying the majority of the market? No way.

For the value market an integrated GPU is going to be more than enough, but for any gamer and/or high-end user it's a no-go. As said, dual-socket motherboards might be an idea, but much further than that it ain't going to get. Just imagine the price you'd have to pay: what we now consider a high-end mid-range CPU would become a 1.5K Euro CPU, just because of the GPU, which we might not even want :shakes:

Manicdan
12-26-2008, 02:56 PM
I'm not old enough to remember much, but I first started using dedicated graphics when Win98 came out, so 10 years ago. In those 10 years, has the trend in graphics cards gone up then back down? Or always up? Or always down?

As far as I can tell, the cost of a gaming PC has gone down a bit, but enthusiast PCs are still going up, mostly due to multi-GPU. Either way, I don't see the market for dedicated GPUs going away at all. If anything, you might be able to buy a PC that can play a top game in a very small shell, but it won't be 1/10th of what a real gaming PC can do.

Think right now: a quad core + $100 midrange GPU, in a case 2x the size of a netbook, with an external PSU. It could sell for about $600, play any game on medium settings at 1080p resolution, and easily be called a gaming PC. But beyond that, to pack in any more processing power, the price and heat go up exponentially, while the ATX mid-tower PC scales much more linearly since it's not limited by space.

As we see more and more die shrinks, we have the ability to see smaller PCs taking on bigger tasks, and eventually we reach a point where, for a much smaller price, a PC can cover the needs of a majority of users. That's great up to a point, but people's needs from a PC change too, and the ability to deliver a CPU for those needs generally requires a bit more than what such a small form can hold.

sundancerx
12-26-2008, 03:23 PM
:rofl:

I'm sorry, but that's really a weird statement. That would mean a new CPU SKU has to be released for every combination of GPU and CPU, not even mentioning the unclarity about Intel's GPU designs and how AMD's and nVidia's GPUs are going to perform. I can already tell you that even if Intel hits the right spot with its GPU processing and integrates it into CPUs like AMD will do, if nVidia manages to get a GPU out twice as fast as any of them, no one is going to focus on this integrated GPU and will just get a discrete nVidia GPU. And if Intel and AMD left the discrete GPU market, which is very unlikely, nVidia would have the entire market to itself.

Besides that, the only thing that might be likely is that we see dual-socket motherboards, one socket for a CPU and one for a GPU. But integrated GPUs going all the way into the high-end section and satisfying the majority of the market? No way.

For the value market an integrated GPU is going to be more than enough, but for any gamer and/or high-end user it's a no-go. As said, dual-socket motherboards might be an idea, but much further than that it ain't going to get. Just imagine the price you'd have to pay: what we now consider a high-end mid-range CPU would become a 1.5K Euro CPU, just because of the GPU, which we might not even want :shakes:

Very good point, man. If anything, if discrete GPUs disappear, it won't be because of their "unusefulness", but because they become the dominant ones and move into the slots that CPUs originally occupied. Where would the CPU go, you ask? It will be integrated into the GPU(s), lol.

Atom is your first evidence that CPUs are going that route. 2nd, you play games and you check CPU usage: 10-15%. I mean, wth, why bother buying a $1000 CPU when you only use 10% of it?
As far as desktop applications are concerned, at least the majority of them, I think CPUs have already hit a wall, while GPUs have barely scratched it. I mean, we don't even have physics on GPUs yet.

Shintai
12-26-2008, 03:28 PM
:rofl:

I'm sorry, but that's really a weird statement. That would mean a new CPU SKU has to be released for every combination of GPU and CPU, not even mentioning the unclarity about Intel's GPU designs and how AMD's and nVidia's GPUs are going to perform. I can already tell you that even if Intel hits the right spot with its GPU processing and integrates it into CPUs like AMD will do, if nVidia manages to get a GPU out twice as fast as any of them, no one is going to focus on this integrated GPU and will just get a discrete nVidia GPU. And if Intel and AMD left the discrete GPU market, which is very unlikely, nVidia would have the entire market to itself.

Besides that, the only thing that might be likely is that we see dual-socket motherboards, one socket for a CPU and one for a GPU. But integrated GPUs going all the way into the high-end section and satisfying the majority of the market? No way.

For the value market an integrated GPU is going to be more than enough, but for any gamer and/or high-end user it's a no-go. As said, dual-socket motherboards might be an idea, but much further than that it ain't going to get. Just imagine the price you'd have to pay: what we now consider a high-end mid-range CPU would become a 1.5K Euro CPU, just because of the GPU, which we might not even want :shakes:


You fail to realize the SoC approach AMD and Intel are taking. It's nothing that won't come. It's a clear fact this is the future, and it's starting in 2009 or 2010.

Also, the number of SKUs won't be that large. How are GPU SKUs differentiated again? That's right... mostly just clock speed.

Now this is actually funny, since you are so busy with GPU-limited games. It's actually very easy to see the trend for when more GPU power won't matter. The GPU has a predefined limit. It's tied to the realism of what's shown, and it's tied to the resolution. And the resolution limit is what they are going to hit very soon, unless everyone is supposed to sit 50cm from a 50" monitor.

Larrabee is a power bunny. AMD got a good run with their Radeon series. As said, you could integrate an HD4650 into i3/i5/PH2 today and an HD4670 into i7. And don't tell me about TDP.

And your 1.5K€ statement is just idiotic.

Having a discrete graphics card will be just as rare as a discrete sound card is today. Creative Labs was the king once; they were everywhere. Today they are nothing. Chipsets with audio codecs made it so.


im not old enough to remember, but i first started using dedicated graphics when win98 came out, so 10 years ago. in those 10 years, has the trend in graphics cards gone up then back down? or always up? or always down?

as far as i can tell, the cost of a gaming pc has gone down a bit, but enthusiast pcs are still going up, mostly due to multi-gpu. either way, i dont see the market for dedicated gpus going away at all. if anything, you might be able to buy a pc in a very small shell that can play a top game, but it won't do 1/10th of what a real gaming pc can.

think right now: a quad core + $100 midrange gpu, in a case 2x the size of a netbook, with an external psu. it could sell for about $600, play any game on medium settings at 1080p, and could easily be called a gaming pc. but beyond that, packing in any more processing power makes the price and heat go up exponentially, while the ATX midtower pc scales much more linearly since it's not limited by space.

as we see more and more die shrinks, smaller pcs can take on bigger tasks, and eventually we reach a point where, for a much smaller price, a pc covers the needs of the majority of users. that's great up to a point, but people's needs from a pc change too, and delivering a cpu for those needs generally requires a bit more than what such a small form factor can hold

There have been dedicated graphics since the start.

The problem is basically that, in terms of performance/watt, GPUs sometimes stand still or even go backwards.

You would scream murder if Intel or AMD released a 250W CPU. But AMD and nVidia can easily give you that as a GPU, and you more or less cheer.

Speederlander
12-26-2008, 04:23 PM
:rofl:

I'm sorry, but that's really a weird statement. That would mean a new CPU SKU would have to be released for every combination of CPU and GPU, not even mentioning the uncertainty about Intel's GPU designs and how AMD's and nVidia's GPUs are going to perform. I can already tell you that even if Intel hits the right spot with its GPU processing and integrates it into CPUs like AMD will do, if nVidia manages to get out a GPU twice as fast as any of them, no one is going to focus on the integrated GPU and will just get a discrete nVidia GPU. If Intel and AMD left the discrete GPU market, which is very unlikely, it would mean nVidia has the entire market to itself.

Besides that, the only thing that might be likely is that we see dual-socket motherboards, one for a CPU and one for a GPU. But having integrated GPUs all the way into the high-end segment satisfying the majority of the market? No way.

For the value market an integrated GPU is going to be more than enough, but for any gamer and/or high-end user it's a no-go. As said, dual-socket motherboards might be an idea, but much further than that it isn't going to get. Just imagine the price you'd have to pay: a 1.5K Euro CPU would become what we now consider a high-end mid-range CPU, just because of the GPU, which we might not even want:shakes:

No, Shintai's right.

Donnie27
12-26-2008, 05:33 PM
No, Shintai's right.

Sure is, QFT!

Chad Boga
12-27-2008, 01:20 AM
Intel Claims It Does Not Force to Buy Intel Atom with Core-Logic.
Even though Nvidia Corp. was quoted as saying that Intel Corp. only sells its Atom processors for netbooks and nettops bundled with its own core-logic sets, which is why none of device manufacturers were interested in Nvidia’s Ion platform powered by GeForce 9400M chipset, Intel Corp. denied that it blocks Nvidia from entering the market of ultra low-cost personal computers (ULCPCs).

“We do sell Atom both bundled and as stand alone,” an Intel spokesperson told X-bit labs.

According to Intel officials, Nvidia does not need to obtain a separate license to make and sell chipsets compatible with Intel Atom processors aimed mostly at ULCPCs.

http://www.xbitlabs.com/news/mobile/display/20081225140815_Intel_Claims_It_Does_Not_Force_to_Buy_Intel_Atom_with_Core_Logic.html
What a boon this initial story was for those who wanted to grasp at any chance to slag off Intel and irrationally throw out the monopoly claim.

Sr7
12-27-2008, 01:27 AM
What a boon this initial story was for those who wanted to grasp at any chance to slag off Intel and irrationally throw out the monopoly claim.

Not really.. Intel is not happy about this and obviously rejected NVIDIA's private offers to work with them on this as a platform. Intel is merely covering its tracks so it doesn't look like a monopolistic entity.

Otherwise there'd be no need for NVIDIA to go behind Intel's back to create demand for this via various channels and OEMs (a lot more work than just doing a joint press release).

Chad Boga
12-27-2008, 01:31 AM
Not really.. Intel is not happy about this and obviously rejected NVIDIA's private offers to work with them on this as a platform. Intel is merely covering its tracks so it doesn't look like a monopolistic entity.
Considering Intel is trying to establish Atom in the netbook/MID market, and considering Nvidia's failures in the field on laptops and possibly Apple gear, who can blame Intel for not wanting to be overly associated with Nvidia's quality control at the moment?

As it is, Nvidia is free to go ahead with the designs anyway, the very thing Intel was condemned for supposedly blocking.

Sr7
12-27-2008, 02:49 AM
Considering Intel is trying to establish Atom in the netbook/MID market, and considering Nvidia's failures in the field on laptops and possibly Apple gear, who can blame Intel for not wanting to be overly associated with Nvidia's quality control at the moment?

As it is, Nvidia is free to go ahead with the designs anyway, the very thing Intel was condemned for supposedly blocking.

I guess they were so concerned about these things that they went out of their way begging for SLI on X58 huh? :ROTF:

Chad Boga
12-27-2008, 02:54 AM
I guess they were so concerned about these things that they went out of their way begging for SLI on X58 huh? :ROTF:
You must have a strange definition of "begging". :ROTF:

Sr7
12-27-2008, 02:55 AM
You must have a strange definition of "begging". :ROTF:

Well, withholding a bus license in exchange for SLI.. so bartering. Happy? Obviously they weren't afraid of associating with NVIDIA for those purposes.. I don't think your logic applies here.

Chad Boga
12-27-2008, 02:59 AM
Well, withholding a bus license in exchange for SLI.. so bartering. Happy? Obviously they weren't afraid of associating with NVIDIA for those purposes.. I don't think your logic applies here.
As I said, Atom is a new product they are trying to establish in new markets, the last thing they need is for it to be tarnished.

With SLI, it is a market for people far more knowledgeable about computing issues, so a GPU failure there wouldn't hurt Intel.

Sr7
12-27-2008, 03:10 AM
As I said, Atom is a new product they are trying to establish in new markets, the last thing they need is for it to be tarnished.

With SLI, it is a market for people far more knowledgeable about computing issues, so a GPU failure there wouldn't hurt Intel.

Press is press. Headlines and mental associations make no such distinctions.

Chad Boga
12-27-2008, 03:23 AM
Press is press. Headlines and mental associations make no such distinctions.
:ROTF::ROTF::ROTF:

Atom is likely to sell over 20 million units of one sort or another a year, how big a slice of that pie could Nvidia potentially infect?

How many people who buy an SLI & CF capable motherboard actually go SLI?

It would take a real effort to think the difference in scale here would make no difference if things went awry, but I have faith that you will make that effort. :lol2:

mAJORD
12-27-2008, 04:16 AM
To imply discrete GFX cards will disappear is to imply game engine technology is going to stand still as of right now.

By 2012-2013, integrated GFX on 32/22nm at a reasonable TDP for an SoC probably still won't be at the performance level of the current RV770/G200x, let alone up to handling the engines of the time.

It is obvious they will be a lot better than now, however, and more able to handle average joe's gaming requirements. Sideport memory on current AMD integrated graphics is a good start, and the on-chip PCIe of i5 will no doubt be handy for integrating beefier IGPs.

DeathReborn
12-27-2008, 07:09 AM
As I said, Atom is a new product they are trying to establish in new markets, the last thing they need is for it to be tarnished.

*cough* 945 *cough*

Intel is doing a good enough job tarnishing what should have been a great little product and turning it into a mediocre one. Intel had the resources and capability to create a new chipset to go with Atom, but chose to offload old, power-sucking chipsets instead. Poulsbo should have been the launch chipset.

Chad Boga
12-27-2008, 07:21 AM
*cough* 945 *cough*

Intel is doing a good enough job tarnishing what should have been a great little product and turning it into a mediocre one.
Which is of course the same sort of tarnishing one gets from silicon failure in the field.

Crikey!!!!! :rolleyes:

gosh
12-27-2008, 08:05 AM
You would scream murder if Intel or AMD released a 250W CPU. But AMD and nVidia can easily give you that as a GPU, and you more or less cheer.

You don't cheer because of the power consumption. If the computer is on all day, then I think power use is much more important. The gaming computer I have has a 4870X2; I would never buy that card for a computer I work on, but the gaming computer is only on when I play a game.
If you upgrade the processor in one computer and shift the old processor to another computer (maybe an HTPC), it will be problematic if that processor consumes a lot of power.

Rammsteiner
12-27-2008, 09:05 AM
You fail to realize the SoC approach AMD and Intel are taking. It's nothing that won't come. It's a clear fact this is the future and the way forward, and it's starting in 2009 or 2010.
I never said it ain't going to come; we all know it's going to come. But not in the way you're presenting it, and not at all in such a short time.


Also, the number of SKUs won't be that large. How do GPU SKUs differ again? That's right... most are just clock speed.
And do you think those SKUs are called into life just for the lols of it? All of these sell; why else bother with both an i7 920 and a 940 if 'only clock speeds are different'? Not everyone OCs, and in some cases such a new SKU can be the edge between flawless HQ gaming and stuttering HQ gaming.

Also, not all of it is clock speeds, not at all actually. They come combined with much better cooling solutions, improved custom PCB layouts, the amount of GDDR, the type of GDDR, etc. It's not just clock speeds.


Now this is actually funny, since you are so busy with GPU-limited games. It's actually very easy to see the trend for when more GPU power won't matter. The GPU has a predefined limit: it's tied to the realism of what's shown, and it's tied to the resolution. And the resolution limit is what they're going to hit very soon, unless everyone is supposed to sit 50cm from a 50" monitor.
Well, would you have expected people to game in front of a 30" screen when the very first 17" TFTs came out? That a single GPU could actually run a game flawlessly at 24"? Of course you can expect the unexpected, but every time you start to question the usefulness of MGPU solutions, a bigger screen comes in and the standard goes up a rank. I'd sooner expect people to game on 50" screens than that we'll see almost no dedicated GPUs within 3 years. Besides that, the screen resolution might not go higher, but that's far from relevant. That would mean a GPU from 5 years ago that could run games at 1600x1200 flawlessly should run Far Cry 2 at 1600x1200 flawlessly as well.


Larrabee is a power bunny. AMD got a good run with their Radeon series. As said, you could integrate an HD4650 into i3/i5/PH2 today and an HD4670 into i7. And don't tell me about TDP.
Of course you can, but will it run my games on my 24" screen in HQ? No way.


And your 1.5K€ statement is just idiotic.

Having a discrete graphics card will be just as rare as a discrete sound card is today. Creative Labs was the king once. They were everywhere. Today they are nothing. Chipsets with audio codecs made it so.
No, chipsets are getting better; that's different. Creative :banana::banana::banana::banana:ed up some things quite badly. Then again, explain why Asus is doing some good stuff with audio cards if a sound card didn't add much anyway:confused: Also, it might be me, but by far the majority do have a sound card:confused:

Also, I'd like to add that if they're going to integrate GPUs from any range into CPUs from any range, they'll have to adjust roadmaps drastically for new CPUs to carry the newest and latest GPUs. Not even to think of new additions like DirectX versions etc.

As said before, it's going to happen, but not the way you're presenting it. Most likely a dual-socket sort of solution where the GPU can access RAM via an IMC. Only it will take a long time to see that happen unless we get sticks with GDDR5+, since even i7's bandwidth is way too low to feed the GPUs. However, for low-end solutions, yes, we'll see GPUs being integrated into CPUs. Then again, what does a low-end solution need?

[EDIT]Nevermind, why do I even bother.

Speederlander
12-27-2008, 10:16 AM
blah blah blah gzzzt *cough* blah sputter blah


Fixed both your posts to the appropriate content:rolleyes: Maybe post something... more relevant, you know, to add into the discussion, open eyes etc:rolleyes:

Fixed your post to the appropriate content.

Rammsteiner
12-27-2008, 10:52 AM
Not worth it.

Shintai
12-27-2008, 12:35 PM
Can you either stop trolling or just shut up?

Don't throw rocks when you live in a glass house :yepp:

[XC] riptide
12-27-2008, 01:46 PM
Can you either stop trolling or just shut up?

Why is it always you causing so much trouble in these threads? Seriously? You got a short vacation from News before, and I personally warned you a few times.... I mean, what does a guy have to do?

Speederlander
12-27-2008, 02:18 PM
Can you either stop trolling or just shut up?

Funny, the only post I made that could be remotely considered "trolling" was in response to your trolling. :shrug: And indeed, I was rather restrained, only offering a parody of your own action against other members. :up:

Rammsteiner
12-27-2008, 02:20 PM
Why is it always you causing so much trouble in these threads? Seriously? You got a short vacation from News before, and I personally warned you a few times.... I mean, what does a guy have to do?
You sir, got PM:shakes:

[XC] riptide
12-27-2008, 02:27 PM
You sir, got PM:shakes:

No I don't! :p:

Xello
12-27-2008, 02:37 PM
Man, someone should update the first post with the info that the thread title is incorrect after all. I actually got 3 pages into this before I got bored :rolleyes:

Rammsteiner
12-27-2008, 02:40 PM
No I don't! :p:
Should have made PM first I guess:ROTF:

Now you do though:rolleyes: