Intel can do a BS theoretical measurement too. It is very unlikely for an RV770 to be under full load with the way they measure it.
It is an odd chart, but I think Silverthorne's TDP is BS. It would have been nicer to see SP too, because the SP:DP ratios are very different across architectures.
Francois, why is it comparing die size to DP FLOPS?
That makes no sense... shouldn't it be transistors, or die size per manufacturing node, or something like that?
Interesting graph though... thanks!
And those pics aren't misleading. The NVIDIA SP is more simplified than the others, though.
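To make that SP:DP point concrete, here is a rough back-of-the-envelope sketch in Python. The inputs are the commonly cited public spec-sheet figures (800 ALUs at 750 MHz for RV770; 240 SPs at 1296 MHz plus 30 dedicated DP units for GT200), so treat the exact values as approximations rather than vendor-verified numbers:
[CODE]
# Theoretical peak FLOPS from spec-sheet numbers (approximate inputs).
def peak_gflops(alus, clock_mhz, flops_per_clock):
    """Peak GFLOPS = ALU count x clock (MHz) x FLOPs issued per clock."""
    return alus * clock_mhz * flops_per_clock / 1000.0

# RV770 (HD 4870): 800 SP ALUs at 750 MHz, 2 FLOPs/clock (MAD);
# DP runs at roughly 1/5 the SP rate on this design.
rv770_sp = peak_gflops(800, 750, 2)    # ~1200 GFLOPS
rv770_dp = rv770_sp / 5                # ~240 GFLOPS

# GT200 (GTX 280): 240 SPs at 1296 MHz, counting MAD+MUL as 3 FLOPs/clock;
# 30 dedicated DP units at 2 FLOPs/clock give a much lower DP rate.
gt200_sp = peak_gflops(240, 1296, 3)   # ~933 GFLOPS
gt200_dp = peak_gflops(30, 1296, 2)    # ~78 GFLOPS

for name, sp, dp in [("RV770", rv770_sp, rv770_dp),
                     ("GT200", gt200_sp, gt200_dp)]:
    print(f"{name}: SP {sp:.0f} GFLOPS, DP {dp:.0f} GFLOPS, "
          f"SP:DP = {sp / dp:.1f}:1")
[/CODE]
That prints roughly 5:1 for RV770 versus 12:1 for GT200, which is exactly why an SP column would have changed the picture.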
I don't follow the personalities involved like a lot of you do, but doesn't it seem that every time one company owns the other one like an old mule, the guy on top gets *ocky and forgets that the game isn't over yet? AMD did it when they were kicking Intel's ass and to a casual observer such as moi, it seems like now it's Intel's turn.
I'm sure neither of them thought that they were being over confident, but then, that was probably the biggest part of the problem.
Just my somewhat irrelevant observation.
I think that what Mr Francois is saying is that it is way above what we can even speculate. I, for one, am glad that he even posts in here, and people shouldn't bash so hard. He is gracing us with his presence....
The data is public; you can redo the math for Atom.
I checked what David K did, and it matches my expectation.
Prove it wrong if you think it is not right. It is OK to disagree, but if you say it is wrong, you have to prove it.
(This is 3rd-party data, not Intel data...)
Francois
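For anyone who actually wants to redo that math, here is a minimal sketch of the FLOPS-per-die-area calculation. The Atom inputs below (1.6 GHz, a roughly 25 mm² die, 2 DP FLOPs per clock) are illustrative placeholders rather than verified figures; plug in whatever the public data says:
[CODE]
# DP FLOPS per unit die area: clock x DP FLOPs/clock, divided by die size.
def dp_gflops_per_mm2(clock_ghz, dp_flops_per_clock, die_mm2):
    """DP GFLOPS per mm^2 of die area."""
    return clock_ghz * dp_flops_per_clock / die_mm2

# Hypothetical Atom-like example: 1.6 GHz, 2 DP FLOPs/clock, ~25 mm^2 die.
# Swap in real figures before drawing any conclusions.
print(dp_gflops_per_mm2(1.6, 2, 25))   # ~0.128 GFLOPS/mm^2
[/CODE]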
Who's bashing Dr Who?
It is not bashing; I am trying to explain how to have an engineering discussion.
The rules are simple:
1) You cannot say it is wrong if you don't have a demonstration that it is wrong.
2) A demonstration of a concept needs to be backed up by experiments or the use of experimental results.
3) It is OK to have an opinion, but it cannot be used as proof.
4) Carnaut mathematics apply to arguments, except in the case of reciprocity...
5) Admitting that sometimes a more qualified or better positioned person can override your arguments is a smart move... (when a CPU architect explains things to you, it is pretty hard to know better than he does...)
It is always funny when somebody tries to teach me the performance of an Intel product. I am lucky enough to be the visible part; there is an army of people making sure I am accurate, ranging from real hackers to PhD-style guys.
Those guys are hardcore!
Give credit to the new Intel: since Conroe, we have used our very secret advanced maths, and it looks like we do a good job of figuring out performance... Why would you think we are not among the top guys at doing this?
Thanks for understanding that we are not beginners. We have amazing projection systems, and we are adapting to GPGPU; that is going to be awesome when the whole cathedral is finished... As I keep asking, let's have a real engineering discussion; that is what is interesting.
Do you get the direction I would like the discussion to take?
Francois
Larrabee is a great concept, good for devs and great for Intel. However, I don't see it debuting with good performance.
I'm having the most trouble figuring out how LRB will be competitive price-wise. I believe that Intel will get the performance they are aiming for, but how will it be able to compete in $$$/perf with NV and ATI?
Particle's First Rule of Online Technical Discussion:
As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.
Rule 1A:
Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.
Rule 2:
When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.
Rule 2A:
When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.
Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!
Dr Who, I'm sure you realize why there are so many naysayers...
You see, the graphics war, to many of us, is far more interesting than the CPU war has ever been. The CPU war is pretty cut-and-dried for most of our needs: we either go with one company for budget or the other for performance, and if we're going to game on the system we generally don't have to worry much about which brand of CPU we buy right now, which is why Intel wants into the GPU war in the first place. With GPUs, on the other hand, things change.
One card can absolutely MURDER in every game; then a new game comes along and the tide completely turns, or a driver comes out and everything changes again. As such, when it comes to GPUs everyone is ALWAYS on a "put up or shut up" kick. The reason? Every time in the graphics market that we heard huge things about a design for an extended period before its release, it failed. Remember the R600? Remember the FX 5800? Yeah...
Now you see, we've been hearing this and that about LRB for quite a while... I seem to recall hearing it was coming soon when I bought my old 8800 GTX. Here we are, three years later, and the only thing we've seen is a limited ray-tracing demo of poor quality that ran pretty slowly and in which the camera never moved an inch. You really can't blame people for being skeptical at this point, because frankly this isn't a situation like other companies'. People are saying NVIDIA has nothing because the 5870 launched less than a week ago, while Intel has had us waiting for years...
Yes, Intel has proven themselves everywhere but one place: the GPU market, which is exactly where they're trying to go. Intel doesn't exactly have anything at all to show for themselves in that market, as their IGPs are royally underpowered and have actually been cited by major game developers as part of the reason for the collapse of the PC gaming industry. Not a good start, especially for the enthusiast market. Just notice that these guys want numbers, and they want them yesterday; most don't understand that revealing performance early is SUICIDE, as your competition then knows exactly what to prepare for. Like I pointed out earlier, they're grilling NVIDIA (and NVIDIA has had a VERY solid track record lately with its new GPU architectures) just because ATI launched early and caught them off guard. Don't think they're going to give Intel any special treatment just because of Conroe... They'll instead remind you how long it took Intel to get to Conroe.
TL;DR version: don't announce a GPU over three years in advance, delay it, and then show a demo that does nothing for anyone watching, runs slow on top of that, and never moves the camera. You are liable to get grilled by the enthusiast community for doing so.
FYI, Intel told Charlie that the LRB part in the demo was still Ax silicon, not the fixed Bx silicon that should come out soon.
And they told him that the performance of this sample was less than 10% of what they are hoping for with the final retail parts.
That would mean over 110 fps at 720p and 60 fps at 1080p... sounds great, but who knows whether what they are "hoping" for is realistically possible...
And even so, those fps numbers sound great, but that's for this custom demo... who knows what the fps will drop to once we are talking about complex geometry and textures...
Intel really shouldn't have shown the LRB demo live...
You don't demo a product that is so crippled it's only running at 8% of its predicted performance...
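For what it's worth, the arithmetic behind those targets is just a scale-up from the demo's frame rate; a quick sketch, where the demo fps values are inferred backwards from the quoted 110/60 fps targets rather than measured:
[CODE]
# If the demo sample runs at no more than 10% of the retail target,
# the implied target is at least 10x the observed demo frame rate.
def implied_target_fps(demo_fps, fraction_of_target=0.10):
    """Lower bound on the retail target fps."""
    return demo_fps / fraction_of_target

print(implied_target_fps(11))   # ~110 fps at 720p
print(implied_target_fps(6))    # ~60 fps at 1080p
[/CODE]
So the demo would have been crawling along at roughly 11 fps at 720p and 6 fps at 1080p for those targets to follow.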
I've said it before and I'll say it again: LRB has no chance of getting a foothold in the PC GPU market unless Intel can produce a card that, around the time it's released, plays games as well as ATI's or NVIDIA's do.
It doesn't matter if LRB can produce an image at 100 FPS using ray tracing, as no game maker would make a game for it.
The only way I can see LRB forcing its way into the market is for it to get into the console market, either in Intel's own console or with Intel paying for LRB to be in someone else's.
Ten years from now, I've no doubt LRB-type GPUs will be around. In fact, I suspect more players could have entered the market by then (ARM for starters). Until then, it's great technology, but it suffers from the chicken-and-egg disorder: no one will create a game until there are LRBs in the wild, yet no one will buy one until there are games it will run on.
Even 480p ray traced would look much better than 1080p on current ATI/NV hardware. The question really is: can Intel deliver a 60 FPS in-game experience at ANY decent resolution with LRB?
If it can, I'm in.
In my opinion LRB will be a total fiasco at the start. Actually, I believe this as a fact (my fact, of course). There is no way the first LRB generations will have any success.
But everyone's success is not the same all the time. Intel has big power in the market, and Intel can easily dominate the onboard market with LRB by using that power. For the performance parts, Intel has to move a lot of stones in the market to put LRB in the performance league. This will be a long adventure for Intel; my opinion is that at some point they will give up on it.
It's simple...
Think Atom chip with modularity, etc. Add Larrabee on-die with 6 Atoms, and what do you get? Motherboards may have two processors soon, the second being a co-processor such as Larrabee. AMD is headed this way also.
Dr Who just isn't allowed to tie it all together for you, as if this is some secret! With the advent of the Hydra engine (Intel), you can easily have your co-processor not integrated but on a card instead. Anyone notice the priority of the Hydra chip?
It's now the north bridge...
Add it all up over the next 2 years...
I'll take that with a mountain of salt, seeing as I can't remember the last time any numbers Charlie spewed were even close to accurate... I think his site should be renamed almostneveraccurate.com.
What I can see happening is that the cards will be marketed as co-processors and industrial cards; I've got a feeling they'll pull their mainstream focus. I'm sure two or three LRB cards would make an epic render farm, especially for firms like Pixar that still use a software-based rendering engine.