
View Full Version : Jen-Hsun Huang: GPU will continue to grow for another 15 years



JohnJohn
09-25-2008, 06:52 PM
The w00pass master's insightful interview


Q: Intel and AMD are using their platform strategies to put pressure on Nvidia's chipset business. Is there any truth to the reports that you are considering quitting the chipset market?

A: Nvidia has already thrown a large amount of resources into the chipset market, and we have managed to gain a large share of the AMD chipset platform; we have also made some progress on Intel's platform. We will continue to operate this product line.

In addition to chipsets, we will add more innovative applications to our GPU products and will promote Nvidia's GPUs to consumers as the best choice.

Q: Is Nvidia planning to adopt the 40nm process recently announced by TSMC?

A: Nvidia has always been very careful about our process transition plans. Maturity and yield rates are our priority considerations. We will adopt the latest process nodes at the most appropriate time.

Q: AMD's current strategy for GPUs is to focus resources on mainstream products in order to save costs and time, and then target the high-end market with dual-GPU designs. What is your opinion on this?

A: Basically, I don't agree with the strategy, since single-GPU designs are definitely more efficient than dual-GPU ones. If a single GPU delivers 100% performance, a dual-GPU card can only achieve 130-180%, while costs are much higher.

As for whether dual-GPU designs help save development time, I think this is a question of the research team's ability. Since Nvidia's team is much stronger than our competitor's, I believe we are able to achieve time-to-market and still maintain design excellence (this guy is the real and only w00pass master).

More honey here (http://www.digitimes.com/news/a20080905PD206.html)
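Huang's 130-180% scaling figure can be put in perspective with a quick perf-per-cost sketch; the 2x cost multiplier for a dual-GPU card below is an assumed illustrative number, not something from the interview:

```python
# Huang claims a dual-GPU card reaches only 130-180% of a single
# GPU's performance while costing "much more". Assume, purely for
# illustration, that the dual card costs 2x as much to build.
def perf_per_cost(perf, cost):
    """Performance delivered per unit of cost (higher is better)."""
    return perf / cost

single     = perf_per_cost(1.0, 1.0)  # baseline single-GPU card
dual_worst = perf_per_cost(1.3, 2.0)  # poor multi-GPU scaling
dual_best  = perf_per_cost(1.8, 2.0)  # near-ideal AFR scaling

print(f"single GPU : {single:.2f} perf/cost")
print(f"dual, worst: {dual_worst:.2f} perf/cost")
print(f"dual, best : {dual_best:.2f} perf/cost")
```

Under that assumed 2x cost, even best-case scaling leaves the dual card delivering less performance per unit cost than the single GPU, which is the core of Huang's efficiency argument.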

safan80
09-25-2008, 06:56 PM
They still don't have anything that beats the 4870x2.

Nasgul
09-25-2008, 07:02 PM
Q: There are rumors Nvidia is heading into x86 CPU design, your comments?

A: Currently, we have no plans for that. We are very busy already and have no time to cross over to our competitors' field. The most important thing to remember is that Intel is the leader in the CPU market, so it's better that Nvidia focuses on what we do best. To cut into the x86 CPU market would just be a waste of time and resources.

Well said... :clap:


Dual GPUs? With the current economy? Sure, why not? Dual-GPU $550 cards for everyone!!!

I'll stick with a GTX260 Core 216 please.

zanzabar
09-25-2008, 07:04 PM
They still don't have anything to compete with AMD on AVIVO, HTPC cards, or server/workstation GPGPU.

NV needs a new development path: they really need DX10.1, OpenGL 3, and a fully programmable frame buffer for their next new platform card in Q1, and it doesn't go beyond that. Saying that you have the best dev team does nothing if they don't have leadership.

GAR
09-25-2008, 07:47 PM
They still don't have anything that beats the 4870x2.

Actually, you are comparing two GPUs to a single GPU; somehow that doesn't seem right.

Sparky
09-25-2008, 07:50 PM
Actually, you are comparing two GPUs to a single GPU; somehow that doesn't seem right.

It is top card to top card :shrug:

astrallite
09-25-2008, 08:25 PM
Actually, you are comparing two GPUs to a single GPU; somehow that doesn't seem right.

?

Comparing an imaginary single GPU to a dual-GPU solution?

RejZoR
09-25-2008, 09:09 PM
Actually, you are comparing two GPUs to a single GPU; somehow that doesn't seem right.

Design is not important; it's the card class (or price/performance if you want).
And the HD4870X2 smacks the GTX280 badly. Two GPUs or not, that's not really important.
It makes sense: they perfected multi-GPU, and creating two lower-end chips is easier and cheaper than producing one massive super chip that has a huge die, runs hot, has bad yields, and low production numbers. All of this then feeds into the price itself...

xlink
09-25-2008, 09:12 PM
Actually, you are comparing two GPUs to a single GPU; somehow that doesn't seem right.

But it probably costs more to make a GTX280 than it does a 4870x2.

Frananta
09-25-2008, 09:29 PM
Why 15 years? Is that when the cloud computing era arrives?

railer
09-25-2008, 09:36 PM
But it probably costs more to make a GTX280 than it does a 4870x2.

You got a source for that?

Macadamia
09-25-2008, 10:06 PM
Don't worry about the single/dual argument: the 4870X2 is selling without needing to be fire-saled, something NONE of the nVidia products have managed this round (from the budget 9400 to the high-end GTX280).


The nVidia team did so pathetically this round that they lost all sense of "scaling" or "efficiency" beyond 128 SPs (use the 9600GT as a baseline; you'll see). The Radeon team has that problem too, but at least they did it in a die area that is smaller than G92, even at 55nm.

Not so tough now, are they?

zanzabar
09-25-2008, 10:18 PM
You got a source for that?

What was it, four or eight RV770s fit in one GT200? That will make a huge price difference, since each wafer should cost about the same; that's two or four dual-GPU pairs in the same die area.
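That die-count claim can be sanity-checked with a back-of-the-envelope dies-per-wafer estimate. The die sizes below are the commonly cited figures (~256 mm² for RV770, ~576 mm² for the 65nm GT200), and the calculation ignores edge loss, scribe lines, and defect yield, so treat it as a rough sketch:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Naive upper bound: wafer area divided by die area
    (ignores edge loss, scribe lines, and yield)."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

rv770 = dies_per_wafer(256)  # ATI RV770, ~256 mm^2
gt200 = dies_per_wafer(576)  # Nvidia GT200, ~576 mm^2

print(rv770, gt200, round(rv770 / gt200, 2))
```

This suggests roughly 2.25x more RV770 dies per wafer, i.e. closer to two RV770s per GT200 than four or eight; on the other hand, larger dies also suffer disproportionately worse yields, which widens the real cost gap.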

shiznit93
09-25-2008, 10:44 PM
They still don't have anything that beats the 4870x2.
To many of us, current multi-GPU solutions are not an option. I play in windowed mode a lot, use dual monitors, and will NOT use vsync without triple buffering. The second GPU would almost never be in use, and that's without throwing micro-stutter into the mix. When ATI comes up with a single-GPU card that can compete with Nvidia in performance (actual games, not 3DMark), reliability (how many overheating/failing 4800-series cards are out there already? Yes, Nvidia has mobile issues, but their desktop cards are rock solid and their partners have had lifetime warranties since before the 6800 series), driver completeness and features (game profiles, LOD clamp, high-quality texture mode with no shimmer), image quality (ATI still doesn't have angle-independent AF; run the D3D AF analyzer if you need proof), and price (the GTX 260 is cheaper than the 4870, overclocks better, runs cooler, and has almost double the memory), call me.

saveus222
09-25-2008, 11:27 PM
Actually, you are comparing two GPUs to a single GPU; somehow that doesn't seem right.

It didn't seem right to me at first either, but look at the price/performance. Plus it makes more sense: it's cheaper for them to design and manufacture, and it's working. That's what counts.

biohead
09-25-2008, 11:31 PM
And already the topic somehow revolves around GTX280 versus 4870x2. Just cut it out; same old story.

RejZoR
09-26-2008, 12:40 AM
Why 15 years? Is that when the cloud computing era arrives?

I have no clue why people buzz so much about this "cloud computing" all of a sudden (everywhere).

ownage
09-26-2008, 12:43 AM
The w00pass master's insightful interview



More honey here (http://www.digitimes.com/news/a20080905PD206.html)

LOL, Jen-Hsun Huang fanboi in da house!

Shintai
09-26-2008, 12:48 AM
In 15 years nVidia is dead and the GPU is long gone.

tiro_uspsss
09-26-2008, 03:43 AM
In 15 years nVidia is dead and the GPU is long gone.

I wish it were 15 months so we could all have a break from the endless mind-numbing re-hashes/re-naming BS :D

STaRGaZeR
09-26-2008, 03:53 AM
In 15 years nVidia is dead and the GPU is long gone.

Now some random guy will quote this for 15 years LOL.

Perp
09-26-2008, 04:16 AM
"Maturity and yield rates are our priority considerations." :rofl:

Miss Banana
09-26-2008, 04:42 AM
Since Nvidia's team is much stronger than our competitor's, I believe we are able to achieve time-to-market and still maintain design excellence.

Maintain design excellence?
Does this mean that the Nvidia CEO thinks they currently have design excellence?

He really does live in his own little world...

Also, if Nvidia's design team is that much stronger than ATI's design team, it makes me wonder how Nvidia managed to launch such crappy products (260/280) compared to ATI (4850/4870).

Pathetic.


Disclaimer: I do not mean to say the 260 or 280 are crappy as GPUs. I mean they are crappy as products, profit-wise. Looking at margins, Nvidia lost to ATI this round, and not just by a little. This was already clear when Nvidia cut prices quickly and severely right after launch. At this point, for the consumer, Nvidia has excellent options, but from a business standpoint Nvidia currently does not have any good products on the market.

LowRun
09-26-2008, 05:02 AM
To many of us, current multi-GPU solutions are not an option. I play in windowed mode a lot, use dual monitors, and will NOT use vsync without triple buffering. The second GPU would almost never be in use, and that's without throwing micro-stutter into the mix. When ATI comes up with a single-GPU card that can compete with Nvidia in performance (actual games, not 3DMark), reliability (how many overheating/failing 4800-series cards are out there already? Yes, Nvidia has mobile issues, but their desktop cards are rock solid and their partners have had lifetime warranties since before the 6800 series), driver completeness and features (game profiles, LOD clamp, high-quality texture mode with no shimmer), image quality (ATI still doesn't have angle-independent AF; run the D3D AF analyzer if you need proof), and price (the GTX 260 is cheaper than the 4870, overclocks better, runs cooler, and has almost double the memory), call me.

Talk about reliability: http://www.xtremesystems.org/forums/showthread.php?t=197460
and this is coming from an Nvidia fan.

Drivers? Let's talk about multi-screen, Vista, and the list could go on. Yeah, you got it: Nvidia's suck just as much as ATI's.

Image quality? What about this, for example: http://www.behardware.com/news/lire/19-09-2008/
and the fact that many guys I've seen with RV770 and GT200 cards tend to favor ATI's IQ.

Finally, on price, the HD4870 is at the very least 50€ cheaper than the GTX260 here.

So that makes pretty much each and every one of your points moot, to put it kindly, except for the single-GPU preference :yepp:

shiznit93
09-26-2008, 10:37 AM
Talk about reliability: http://www.xtremesystems.org/forums/showthread.php?t=197460
and this is coming from an Nvidia fan.

Drivers? Let's talk about multi-screen, Vista, and the list could go on. Yeah, you got it: Nvidia's suck just as much as ATI's.

Image quality? What about this, for example: http://www.behardware.com/news/lire/19-09-2008/
and the fact that many guys I've seen with RV770 and GT200 cards tend to favor ATI's IQ.

Finally, on price, the HD4870 is at the very least 50€ cheaper than the GTX260 here.

So that makes pretty much each and every one of your points moot, to put it kindly, except for the single-GPU preference :yepp:

Maybe my points are moot to you, but not in my reality. I have abused the hell out of every Nvidia card I have ever owned since the TNT2, safe in the knowledge that BFG and eVGA will replace it for free here in the USA, no questions asked, for life, and I have never had one die on me. I had an X800XL whose fan failure led to overheating, and it had to be replaced; it took over a month to get the replacement. Meanwhile, when a 6800 in a PC I had put together for a friend got fried by lightning (idiot didn't use a surge protector), BFG replaced it in a week (he sent it from MD on a Monday, it was in OH Wednesday, and the replacement was at the door Friday!) and even upgraded him to a 6800GT (I'm sure this was a fluke, but you can't deny the speed). I also tried an X1900 that died on my friend a couple months after I sold it to him (he didn't want Nvidia). Noise is very important to me, and you can safely replace Nvidia coolers without voiding your warranty as well; ATI's cards sound like small turbines compared to my down-volted 120mm fans (well, the 4870s don't, but they idle at 90C :eek:), and you can't touch the stock cooling unless you want to take your chances with their Asian partners.

Image quality is subjective; many people are all about AA, and maybe to them ATI looks better. I use a 24" FW900 CRT, and with 4xAA the jaggies are gone in any game, period; I don't need any fancy modes. What I do care about is texture quality, and ATI's AF is still inferior to Nvidia's (http://techreport.com/articles.x/14990/7); I can see the shimmering on my friend's 4850 while my 8800GT has none in the same games. Catalyst AI also introduces artifacts.

Just because the 4870 is cheaper where you are doesn't make my point moot, by the way; check US prices and you will see that the 260 is a much better value for me.

What's wrong with Vista? I switched when SP1 came out and haven't had any issues with Nvidia, or anything else for that matter. Early adopters usually have to put up with crap, so I waited till SP1.

OK, I'll concede that the secondary monitor does not turn black in Xfire, but since you can't play in windowed mode, it kind of defeats a good bit of the reason for having two monitors in the first place. And not being able to use triple buffering means vsync becomes garbage; I have no idea how you 60Hz LCD people can stand that :banana::banana::banana::banana:.

Slovnaft
09-26-2008, 10:54 AM
I really don't like nvidia, but I really like the way this guy talks. He's really good at it.
Either that or it's just a stellar translation.
Stellar.

Vinas
09-26-2008, 11:13 AM
Actually, you are comparing two GPUs to a single GPU; somehow that doesn't seem right.

The 4870x2 is the fastest single-slot solution you can buy RIGHT NOW! :clap: So who cares if it's two GPUs or one? The fact is, this is still a single-slot solution.

Vinas
09-26-2008, 11:28 AM
Maybe my points are moot to you, but not in my reality. I have abused the hell out of every Nvidia card I have ever owned since the TNT2, safe in the knowledge that BFG and eVGA will replace it for free here in the USA, no questions asked, for life, and I have never had one die on me.

I've killed a BFG 7900GT, and they asked questions when I RMA'd it.


I had an X800XL whose fan failure led to overheating, and it had to be replaced; it took over a month to get the replacement.

It took 10 weeks to get my BFG 7900GT back!


ATI's cards sound like small turbines compared to my down-volted 120mm fans (well, the 4870s don't, but they idle at 90C :eek:), and you can't touch the stock cooling unless you want to take your chances with their Asian partners.

False.


Catalyst AI also introduces artifacts.

Overclocking introduces artifacts, *not* CCC or the nvidia control panel.


Just because the 4870 is cheaper where you are doesn't make my point moot, by the way; check US prices and you will see that the 260 is a much better value for me.

The point was that the 4870x2 is the better value over the GTX280.


OK, I'll concede that the secondary monitor does not turn black in Xfire, but since you can't play in windowed mode, it kind of defeats a good bit of the reason for having two monitors in the first place.

I have four 20" monitors on my ASUS 3870x2 TOP and I play windowed across all four. Surely the 4870 can do the same with 2!

Not that I'm a fanboi (I have an 8800GTX in my second rig), but I wanted to clear up a few things. ATi is doing a fine job this time around.

Xope_Poquar
09-26-2008, 11:35 AM
Why 15 years? Is that when the cloud computing era arrives?

Because 15 years is the exact amount of time it takes before everyone forgets what you said.

Coincidentally, 15 years is also the common time frame used when talking about natural resource supply. "We only have enough ______ to last us 15 years" is the common phrase when talking about something we're running out of (even though we've always had enough ______ to last us 15 years from the locations we currently get it from).

shiznit93
09-26-2008, 11:43 AM
I have four 20" monitors on my ASUS 3870x2 TOP and I play windowed across all four. Surely the 4870 can do the same with 2!
Only one GPU can be used in windowed mode in DX9; your other 3870 is just idling.

but don't take it from me, http://forums.amd.com/game/messageview.cfm?catid=262&threadid=96440&enterthread=y


False.

What's false about it?


Overclocking introduces artifacts, *not* CCC or the nvidia control panel.

I said Catalyst AI, not the CCC.


The point was that the 4870x2 is the better value over the GTX280.

Only if you can live with the baggage that multi-GPU brings.

YukonTrooper
09-26-2008, 12:41 PM
Jen-Hsun Huang needs a hug.

ownage
09-26-2008, 12:43 PM
Jen-Hsun Huang needs a hug.

From Hector Ruiz :up:

Sr7
09-26-2008, 12:52 PM
i really don't like nvidia. but i really like the way this guy talks. he's really good at it.
either that or it's just a stellar translation.
stellar.

Jen-Hsun Huang just speaks English, lol.

Sr7
09-26-2008, 12:55 PM
The 4870x2 is the fastest single slot solution you can buy RIGHT NOW!:clap: So who cares if it's two GPU's or one the fact is this is still a single slot solution.

"More like who cares if it's single slot or not, it's still Crossfire"

This has more weight because CF and SLI have certain limitations and issues associated with them which some people don't want. These are real implications.

It's a bit of a hack on ATI's part to compare their microstutter-prone, scaling-lacking (in certain games) card to a single GPU that will be faster across the board than the next chip below it in the lineup.

Hence the pricing for these is not the same (they sit at different price points).

What you're doing is kind of worse than comparing the R600 to the G80 GTX. ATI said it wasn't supposed to compete with GTX and the pricing reflected that. Of course when it was engineered it was supposed to be the fastest thing of all, but it didn't turn out that way, so they started saying "hey make sure you compare this to the 8800GTS" and priced it accordingly.

shiznit93
09-26-2008, 01:07 PM
"More like who cares if it's single slot or not, it's still Crossfire"
Exactly. You can put lipstick on a pig, but it's still a pig. :D:up:

iddqd
09-26-2008, 01:10 PM
They should just let one GPU slave the resources of another GPU; it would be much easier from the software standpoint, since it would look like you've only got one GPU. GPUs are embarrassingly parallel as it is; I'm sure they could implement local parallelism much better than simply doing Crossfire (or SLI, for that matter).

T_Flight
09-26-2008, 02:00 PM
Actually, for those that don't know, nVidia does currently hold design excellence.

There are serious issues with the VRM section of ATi cards right now, and it doesn't look like they plan to fix it anytime soon. They have actually had to dumb the drivers down to run Furmark at half the frames to keep the cards from burning up. Before they did that, many were doing just that and letting the magic smoke out. Many on this forum have stated they are designed to 80% of what they should be. You might get a record with a 4870x2 for a couple weeks, but if you stress test one at full FPS rates it's gonna get nuked.

That does not happen with nVidia cards, and is not an issue.

nVidia also owns the records in Folding, and the ATi cards are not even running close.

Overall, nVidia is just the better card right now. That could change, but I've seen no evidence or news that they plan on changing it anytime soon. They are aiming at the less expensive market right now, and AMD does not have the money to invest in R&D. It's just not there. You cannot do R&D without money, and to get ahead you have to have R&D.

Some may also be shocked in a few days when they find out that nVidia's drivers are maturing, and when some of the folks here start setting new records with those new drivers.

I'm also glad to see that nVidia has no interest in dual-GPU "double cheeseburger" arrangements. I'll take a single "whopper" over a double cheeseburger, thank you. :D Why have dual-GPU heat when one GPU does it?

Right now there is no ATi single GPU that comes close to a GTX280. It is simply the fastest single-GPU solution out there.

So when they say they have design excellence, they are dead nuts correct.

Dual GPUs are a "patch". Single powerful GPUs are the "fix", and in time dual GPUs will be a fad whose time has long since vanished and become a distant memory.

Jimmer411
09-26-2008, 02:17 PM
Actually, for those that don't know, nVidia does currently hold design excellence.

There are serious issues with the VRM section of ATi cards right now, and it doesn't look like they plan to fix it anytime soon. They have actually had to dumb the drivers down to run Furmark at half the frames to keep the cards from burning up. Before they did that, many were doing just that and letting the magic smoke out. Many on this forum have stated they are designed to 80% of what they should be. You might get a record with a 4870x2 for a couple weeks, but if you stress test one at full FPS rates it's gonna get nuked.

That does not happen with nVidia cards, and is not an issue.

nVidia also owns the records in Folding, and the ATi cards are not even running close.

Overall, nVidia is just the better card right now. That could change, but I've seen no evidence or news that they plan on changing it anytime soon. They are aiming at the less expensive market right now, and AMD does not have the money to invest in R&D. It's just not there. You cannot do R&D without money, and to get ahead you have to have R&D.

Some may also be shocked in a few days when they find out that nVidia's drivers are maturing, and when some of the folks here start setting new records with those new drivers.

I'm also glad to see that nVidia has no interest in dual-GPU "double cheeseburger" arrangements. I'll take a single "whopper" over a double cheeseburger, thank you. :D Why have dual-GPU heat when one GPU does it?

Right now there is no ATi single GPU that comes close to a GTX280. It is simply the fastest single-GPU solution out there.

So when they say they have design excellence, they are dead nuts correct.

Dual GPUs are a "patch". Single powerful GPUs are the "fix", and in time dual GPUs will be a fad whose time has long since vanished and become a distant memory.



I guess we should all go for single-cylinder cars, because 8 pistons are nothing more than a patch! We need 1 big powerful piston... right?

Lay off the pipe and step away from your parents' computer before you hurt yourself, k?

grimREEFER
09-26-2008, 02:20 PM
In 15 years we'll have perfect graphics in games, then? Cool beans, but I thought it would come much sooner. Or are they just gonna keep improving them for non-graphical purposes?
And yeah, the 4870x2 is the fastest gaming card out atm, but the bar isn't set too high; nvidia just has to make a dual-GPU version of the GTX 260 to basically match it, or a dual-GPU version of the GTX 280 to surpass it. It's not like AMD is incredibly ahead in a technological sense.

cegras
09-26-2008, 02:25 PM
"More like who cares if it's single slot or not, it's still Crossfire"

This has more weight because CF and SLI have certain limitations and issues associated with them which some people don't want. These are real implications.

It's a bit of a hack on ATI's part to compare their microstutter-prone, scaling-lacking (in certain games) card to a single GPU that will be faster across the board than the next chip below it in the lineup.

Hence the pricing for these is not the same (they sit at different price points).

What you're doing is kind of worse than comparing the R600 to the G80 GTX. ATI said it wasn't supposed to compete with GTX and the pricing reflected that. Of course when it was engineered it was supposed to be the fastest thing of all, but it didn't turn out that way, so they started saying "hey make sure you compare this to the 8800GTS" and priced it accordingly.

Wow, you're certainly a poster boy for that particular issue.

T_Flight
09-26-2008, 02:26 PM
I guess we should all go for single-cylinder cars, because 8 pistons are nothing more than a patch! We need 1 big powerful piston... right?

Lay off the pipe and step away from your parents' computer before you hurt yourself, k?


I just love people who let out verbal diarrhea without having a f'n clue who they are talking to on the other end. :rolleyes: Try again, kid.

You just gotta laugh at such ineptness. Thanks for the laughs there. :ROTF:

eric66
09-26-2008, 02:37 PM
Actually, for those that don't know, nVidia does currently hold design excellence.

There are serious issues with the VRM section of ATi cards right now, and it doesn't look like they plan to fix it anytime soon. They have actually had to dumb the drivers down to run Furmark at half the frames to keep the cards from burning up. Before they did that, many were doing just that and letting the magic smoke out. Many on this forum have stated they are designed to 80% of what they should be. You might get a record with a 4870x2 for a couple weeks, but if you stress test one at full FPS rates it's gonna get nuked.

That does not happen with nVidia cards, and is not an issue.

nVidia also owns the records in Folding, and the ATi cards are not even running close.

Overall, nVidia is just the better card right now. That could change, but I've seen no evidence or news that they plan on changing it anytime soon. They are aiming at the less expensive market right now, and AMD does not have the money to invest in R&D. It's just not there. You cannot do R&D without money, and to get ahead you have to have R&D.

Some may also be shocked in a few days when they find out that nVidia's drivers are maturing, and when some of the folks here start setting new records with those new drivers.

I'm also glad to see that nVidia has no interest in dual-GPU "double cheeseburger" arrangements. I'll take a single "whopper" over a double cheeseburger, thank you. :D Why have dual-GPU heat when one GPU does it?

Right now there is no ATi single GPU that comes close to a GTX280. It is simply the fastest single-GPU solution out there.

So when they say they have design excellence, they are dead nuts correct.

Dual GPUs are a "patch". Single powerful GPUs are the "fix", and in time dual GPUs will be a fad whose time has long since vanished and become a distant memory.

Yeah? Ever heard of the GTX 280 hitting 105°C? As for a GPU that comes close to the GTX 280, go read Anand's 1 GB 4870 review. And your third statement is BS. :ROTF:

shiznit93
09-26-2008, 02:38 PM
I guess we should all go for single-cylinder cars, because 8 pistons are nothing more than a patch! We need 1 big powerful piston... right?

Lay off the pipe and step away from your parents' computer before you hurt yourself, k?
This analogy is terribly flawed. When comparing the number of cylinders in an ICE you have to compare at a fixed displacement. More cylinders are more efficient because you maximize surface area for a given displacement, but an I4 2.5L and a V6 2.5L are still both 2.5L. Multi-GPU is less efficient, and the die area/transistor count is not constant; it doubles when you add another GPU. If going from 4 cylinders to 8 required doubling engine displacement at a loss of efficiency, there would be no 8-cylinder cars. Try again.

JohnJohn
09-26-2008, 02:57 PM
nVidia also owns the records in Folding, and the ATi cards are not even running close.

Yeah, right. When the w00pass master is spending huge amounts of money on the w00pass Building (http://www.theinquirer.net/gb/inquirer/news/2008/09/16/huang-donates-million-stanford), you can get the Stanford team to develop the software to run better on the w00pass hardware.


Overall, nVidia is just the better card right now. That could change, but I've seen no evidence or news that they plan on changing it anytime soon. They are aiming at the less expensive market right now, and AMD does not have the money to invest in R&D. It's just not there. You cannot do R&D without money, and to get ahead you have to have R&D.

Yeah, I'm sure they are aiming at the less expensive market right now; it is all they can do.


I'm also glad to see that nVidia has no interest in dual-GPU "double cheeseburger" arrangements. I'll take a single "whopper" over a double cheeseburger, thank you. :D Why have dual-GPU heat when one GPU does it?

Yeah, well, one thing I can tell you is that nvidia IS going to make more dual-GPU cards; they don't have any other choice.
I think Huang said this in order to confuse the competition, like ATI did with the 480 SP, but at the end of the year we are going to see a dual-GPU card from nvidia. Sorry to disappoint you.

Earzz
09-26-2008, 03:11 PM
What does it really matter, guys? NVidia is just making their chips really big, and ATI is basically doing the same, only they split it up into two chips.
You can't really go wrong with either; they all have their own good/bad points, it's just your own preference.

Caparroz
09-26-2008, 03:29 PM
(...) If going from 4 cylinders to 8 required doubling engine displacement at a loss of efficiency, there would be no 8-cylinder cars. Try again.

I always stay miles away from trolling-prone threads like these, but...

Ever heard of specific power? European cars?

metro.cl
09-26-2008, 03:46 PM
What does it really matter, guys? NVidia is just making their chips really big, and ATI is basically doing the same, only they split it up into two chips.
You can't really go wrong with either; they all have their own good/bad points, it's just your own preference.

:clap::clap:

Decami
09-26-2008, 04:37 PM
What does it really matter, guys? NVidia is just making their chips really big, and ATI is basically doing the same, only they split it up into two chips.
You can't really go wrong with either; they all have their own good/bad points, it's just your own preference.

Exactly, it's the preferences they are arguing about. Looking at this argument from the outside, I am noticing a recurring pattern: Nvidia backers stating facts (mostly) and backing them with data, and Nvidia haters just blurting out random stuff with the intent of proving the other side wrong, seldom using data or sources, and then only weak ones. Like the link to the RMA issue with the GTX280, which from what I remember wasn't Nvidia's fault, or the link to the MSAA bug, which didn't occur on all cards, only happened when AF was forced through the drivers, and only existed in two months' worth of drivers. Is that all you could come up with in response? And the car cylinder statement, LOL, wow, did you even think about that before you typed it?

Spawne32
09-26-2008, 04:40 PM
Ohh, so this explains why they have to rebadge old hardware and sell it as new.

JumpingJack
09-26-2008, 04:50 PM
It's gonna grow into what in 15 years? A nuclear reactor?

jimmyz
09-26-2008, 05:15 PM
Jen-Hsun Huang just speaks English, lol.

Really? The article was translated from Chinese :up:. You may want to read the articles before commenting on them. :shrug:

Solus Corvus
09-26-2008, 05:43 PM
There are serious issues with the VRM section of ATi cards right now, and it doesn't look like they plan to fix it anytime soon. They have actually had to dumb the drivers down to run Furmark at half the frames to keep the cards from burning up. Before they did that, many were doing just that and letting the magic smoke out. Many on this forum have stated they are designed to 80% of what they should be. You might get a record with a 4870x2 for a couple weeks, but if you stress test one at full FPS rates it's gonna get nuked.
Way to spread FUD. I have run a renamed Furmark on two separate 4870x2s for many hours and haven't had a single problem with either one.


That does not happen with nVidia cards, and is not an issue.
Heard about the GTX280 spiking to 105°C issue? My friend got two cards in a row with that problem.

T_Flight
09-26-2008, 05:49 PM
Way to spread fud. I have run a renamed furmark on two separate 4870x2s for many hours and haven't had a single problem with either one.





Driver version? Frames? Screens?

Read the post first. You will not have a heat issue if you are running the latest drivers; you are no longer working the card with them.

If you doubt this, start reading the benchmarks right here at this forum, and check out the pics of the burned chips. The forum is littered with them. I wouldn't go around accusing people on this forum of FUD... that's a pretty low blow, and you might not last very long.

Solus Corvus
09-26-2008, 05:53 PM
I have run it with 8.52.2 through 8.54, no problems - renamed furmark.exe for the latest drivers. 30fps @1920x1200 noAA(oops, 4xAA + Edge detect, rather). I'll post a screenshot after I have run it for an hour, but that will be later because I want to use this machine right now.

And maybe you should think first before calling someone out.

Vinas
09-26-2008, 06:38 PM
Only one gpu can be used in windowed mode in dx9, your other 3870 is just idling.

but don't take it from me, http://forums.amd.com/game/messageview.cfm?catid=262&threadid=96440&enterthread=y

Don't need reviews, I own the hardware. :yepp: It works with 4 and both GPUs are working :shocked:

metro.cl
09-27-2008, 12:31 AM
Jen-Hsun Huang just speaks english lol.

I know for a fact that he knows at least English, Japanese and Chinese; I would say he knows a few more Chinese dialects, but I can't confirm that.

Sr7
09-27-2008, 01:26 AM
Really, the article was translated from Chinese :up: , you may want to read the articles before commenting on them. :shrug:

Good game, kid.. Jen-Hsun speaks very little Chinese. There is no way he was that articulate in Chinese. Someone there was translating it into Chinese.

So maybe you could do some research before commenting on people commenting.

Sr7
09-27-2008, 01:32 AM
I know for a fact that he knows at least English, Japanese and Chinese; I would say he knows a few more Chinese dialects, but I can't confirm that.

But he isn't fluent enough in anything but English to speak like that.

gOJDO
09-27-2008, 01:33 AM
Maybe Jen-Hsun Huang needs to grow up first. He was very childish with his wh00p a$$ threats. :rofl:

Sr7
09-27-2008, 01:35 AM
Maybe Jen-Hsun Huang needs to grow up first. He was very childish with his wh00p a$$ threats. :rofl:

There was never a threat.. he said we're gonna whoop some ass, in terms of making good products and advancing CUDA and graphics. He never said "we're gonna take Intel out and be bigger than them!"

ownage
09-27-2008, 01:52 AM
There was never a threat.. he said we're gonna whoop some ass, in terms of making good products and advancing CUDA and graphics. He never said "we're gonna take Intel out and be bigger than them!"

Actually he did say that!

Jimmer411
09-27-2008, 02:48 AM
I just love people who let out verbal diarrhea without having an f'n clue who they are talking to on the other end. :rolleyes: Try again, kid.

You just gotta laugh at such ineptness. Thanks for the laughs there. :ROTF:

Is this the best you can come up with to back up your side of the argument? Lay down some facts behind your claims then. FUD and INQ aren't sources, and neither is your cousin's uncle's brother who knows someone on the inside.




This analogy is terribly flawed. When comparing the number of cylinders in an ICE you have to compare at a fixed displacement. More cylinders are more efficient because you maximize surface area for a given displacement, but an I4 2.5L and a V6 2.5L are still both 2.5L. Multi-GPU is less efficient, and the die area/transistor count is not constant; it doubles when you add another GPU. If going from 4 cylinders to 8 required doubling engine displacement at a loss of efficiency, there would be no 8-cylinder cars. Try again.


It was intended to be flawed, as it was showing how flawed the quoted post was ;) Re-read my post.

Hell, he used cheeseburger vs. double cheeseburger as the basis of his argument for why single GPU > dual GPU.

jimmyz
09-27-2008, 03:16 AM
Good game, kid.. Jen-Hsun speaks very little Chinese. There is no way he was that articulate in Chinese. Someone there was translating it into Chinese.

So maybe you could do some research before commenting on people commenting.

First off, I'm 40 years old, so I'm not a "kid". Second, I simply pointed out that it was translated from Chinese (which you would have seen if you had read it).
You implied by your comment that he only speaks English and attempted to make the other poster look like a rube.
The fact is, if you had read it you would have seen it was from DigiTimes and was written in Chinese and translated to English. I never stated he spoke Chinese during the interview, did I? Of course not.

Enough of the thread derailment, back on topic now.

Calmatory
09-27-2008, 03:30 AM
"More like who cares if it's single slot or not, it's still Crossfire".
Let me.. fix: "More like who cares if it's single slot or not, it's still superior to anything the competitor can offer."

First Nvidians are liek "omg but I care about perfurmans", after half of them jump to ATI, the rest change their opinion to "umg I care about warranty and bfg moer even if the product is inferior!".

Inferior or not, still not as fast as HD4870X2. :shrug:

naokaji
09-27-2008, 04:16 AM
They have actually had to dumb the drivers down to run Furmark at half the frames to keep them from burning up. Before they did that, many were doing just that and letting the magic smoke out.

nVidia also owns the records in Folding and the ATi's are not even running close.

Dual GPU's are a "patch".

They also could have increased fan speed instead, but you know, 99% of the customers have never even heard of FurMark, so dropping the score there was a smaller sacrifice than having everyone complain about noise.

Of course they win at Folding; the software was developed and optimized for Nvidia GPUs...

Fully agree, dual GPUs are nothing more than a "patch" and the best thing is still single GPUs.

RaZz!
09-27-2008, 04:20 AM
I really like ATI's 4800 series cards, but honestly, I don't like their approach of single GPU = low end & mainstream and multi-GPU = high end...
As long as multi-GPU graphics cards have microstuttering issues and such, they're absolutely NO alternative for me. Either they solve the multi-GPU issues we have right now, or if they're not able to do so, focus on single-GPU performance.

gosh
09-27-2008, 06:00 AM
15 years is a VERY VERY long time in this industry. It's hard to predict one year ahead.

In 15 years we may have biochips :)

clonez
09-27-2008, 06:32 AM
Ah, but I don't wanna feed my PC :shrug:
And biological stuff tends not to be really fast at math the way it is needed,
so super realism, sure, but calculating 1M digits of pi? Not under 100 years.

Maybe 10-dimensional chips :D
Open up those strings, Mr. Hawking, pleeeaaase ;)

Miss Banana
09-27-2008, 06:55 AM
To seriously suggest for one second that Nvidia currently has design excellence is bordering on blatant fanboyism.
Design excellence is not about having the fastest chip or the fastest single-slot solution; it's about which product is doing best in the market. The sum of all good and bad things about a chip, compared to the good and bad things of the competition.

I hope you realize Nvidia's extreme price cuts right after the launch made it clear they didn't exactly design the best product this round.

Dami3n
09-27-2008, 09:06 AM
Design excellence? :rofl:
When your opponent has a cheap and small solution that fights face to face with your big and expensive card, your design has a problem.

Decami
09-27-2008, 09:41 AM
To seriously suggest for one second that Nvidia currently has design excellence is bordering on blatant fanboyism.
Design excellence is not about having the fastest chip or the fastest single-slot solution; it's about which product is doing best in the market. The sum of all good and bad things about a chip, compared to the good and bad things of the competition.

I hope you realize Nvidia's extreme price cuts right after the launch made it clear they didn't exactly design the best product this round.

Not true at all. The majority of the market are very clueless people with little hardware knowledge. Most of them wouldn't know a good product if it smacked them in the head. Marketing is what drives competition more than anything, not design. There are some who argue a single-chip solution is better for them and others who argue that's bullcrap (Nvidia haters); who are they to tell someone it's bullcrap if they have no idea what they are talking about? Most of the Nvidia haters in this very thread are staring at numbers flashing across their screens and going, OMG OMG Nvidia is beaten, down with Nvidia. When a company holds the crown for a long period of time like Nvidia, it creates what's in this very thread.

It's funny because the side arguing against Nvidia sounds more fanboy than the Nvidia fans, blurting out random statements with half-truths and yelling "fanboy" around every corner as if that makes them right; this doesn't make you any better than a fanboy. It's funny because I see so many people yelling "fanboy", but the same people are looking for any shred of a hint at Nvidia comments just so they can bash them, even if it means their comment doesn't make any sense. To be honest, watching both sides in this thread, I would rather be the fanboy; at least I can leave the thread with some kind of dignity and values.

JumpingJack
09-27-2008, 09:58 AM
To seriously suggest for one second that nvidia currently has design excellence is just bordering on blatant fanboyism.
Design excellence is not about about having the fastest chip or the fastest single slot solution, it's about what product is doing best in the market. The summ of all good and bad things about a chip, compared to the good and bad things of the competition.

I hope you realize nvidias extreme pricecuts right after the launch, made it clear they didn't exactly design the best product this round.

The arguments going on here both have good points, but I think this is really a key concept. nVidia or ATI (Intel or AMD, for that matter) could easily throw together an 8000 mm^2 die, pump it with 400 W of power, and put something out that blows anything and anyone away on the performance front, be it GPU or CPU, but there are counterbalancing forces at work when putting products out to market; performance is just one vector.

The price they can charge is a key one as well, and if they cannot get costs down to where it's worth marketing, it is not a good design; form factor, power, and so forth also come into play.

There are many, many facets to the decisions made in putting a product into the market under the umbrella of competing forces, for anyone trying to sell anything.

metro.cl
09-27-2008, 08:22 PM
But he isn't fluent enough in anything but English to speak like that.

Not sure where you found that info, but I saw him speak Japanese and Chinese quite fluently at Editors Day (I'm not an expert there, though).

dinos22
09-28-2008, 10:39 PM
It is top card to top card :shrug:

Would you say that when Intel releases their 300-400W TDP monster? lol heheheh