Click on his name and then click on view profile, click on user lists and the option is right there.
http://www.xtremesystems.org/forums/...?do=ignorelist
You're welcome.
Your way is even easier.
Tick tock.
http://www.legitreviews.com/images/r...re_slide11.jpg
New Microarchitecture.
New Process Technology.
New Microarchitecture.
New Process Technology.
New Microarchitecture.
New Process Technology.
Separate things.
/argument.

Quote:
In computer engineering, microarchitecture (sometimes abbreviated to µarch or uarch), also called computer organization, is the way a given instruction set architecture (ISA) is implemented on a processor. A given ISA may be implemented with different microarchitectures.[1] Implementations might vary due to different goals of a given design or due to shifts in technology.[2] Computer architecture is the combination of microarchitecture and instruction set design.
Unless you think you have far more information than intel and they're wrong(?)
Learn to respect others' ideas before preaching education to others.
This is my idea: we need a shrink to call it a new generation, otherwise we are going to get the same mess and confusion we had with the G92 in this round, from both AMD and nVidia. Just wait and watch.
Those who don't like my idea are free to express it in a constructive way. Those who are trying to get aggressive with personal attacks, I gotta say, just eat it.
People's ideas can be outright incorrect, and respecting them would be disrespecting logic and intelligence. There comes a point when an idea is proven outright wrong and is discarded.
Have one of my ideas: Gravity occurs as a result of tiny sub-atomic Ralph Macchios waxing on and waxing off nearby atoms, pushing them towards the planet's surface.
I'll now proceed to tell you that you are wrong for 4 pages then say you should respect my ideas even though I clearly don't respect yours.
To end this, I will proclaim;
IT IS RALPH MACCHIO AND HE IS ANGRY YOU FOOL!!!
Your idea is not an idea any longer, it has been proven incorrect and is now classed as a lie or untruth.
You have the right to dismiss my ideas and defend yours. But you can't insist on having a patent on the ultimate truth and be the judge at the same time.
A few are preaching to be the judge of this, so tell me: if the shrink is not the norm, where does the line go for a generation shift? Would better performance/power usage do? How about a refresh or a rebrand?
These guys are being rude about it, but they are right. Architecture and process are two separate things. Architecture is the number and type of units in the core, their arrangement and connections, and other such details. Process is the physical implementation of an architecture onto a chip. An architecture could be implemented on different processes. And a particular process could be used to produce very different architectures. But a new generation requires a new architecture, not a new process as the examples already presented to you show. Are you saying G71 to G80 wasn't a new generation?
Saying that we should call something on the same process a new generation because it will confuse people is a logical fallacy called an Appeal to Consequences. Words have meaning, even if we don't like the results of that meaning.
nvm
We don't know what kind of architectural changes we are going to get with the upcoming high-end GPUs. I'm afraid we are going to see the same mess as G92 in this round, from both AMD and nVidia (just speculation for now, but wait and watch).
I've been trying to raise the question of where the line goes for architectural changes to become a new generation, but nobody has given a clear answer. Could you cast some light on it?
"Architectural changes" is a tricky term, and when I look at the 68xx, it's smaller, cooler and cheaper, but is it a new generation? If so, why does it perform worse than the last gen with the exact same name?
We must have a clear definition of a new generation (with a new number at the first digit), and I'm open to seeing what else could be used as a clear definition, but these kids are taking it personally.
It's the nature of these GPU threads; they have a tendency to get funny, rude, aggressive, and finally descend into childish fights. But it doesn't scare me; those who are trying can just eat it.
No, there is no hard line. Architectures are so complex that sometimes a small change in one part could result in a major improvement in performance, while major changes to another part might only make minimal difference. There isn't going to be any such hard definition and often it just depends on what the company wants to call a new generation.
Personally I would call 68xx an evolution of the evergreen generation, and did so shortly before launch based on the rumors. We will have to see about Cayman. But as we know, in computers as in biology, evolution is driven by small changes in successive generations (to twist the analogy a bit) ;).
I agree with most of your comment, especially when you say the 68xx is not a new generation. Exactly as the GF104 (GTX 460) was not a new generation, even though it was smaller, cooler, and cheaper.
But we need to raise awareness, by asking this question, and find a clear definition. Otherwise these guys (both AMD and nVidia) are going to call these gray-zone evolutions/improvements new generations, and soon run out of new numbers too, LOL.
As you said, the details about the upcoming high-end GPUs are still unclear, but I'm afraid they are going to mess it up just for the purpose of marketing (as others are suggesting too). Whatever is behind it, it will cause a lot of confusion, but I want to wait until I see them before jumping to conclusions.
Sam please drop it. I am sure you thought you had an interesting idea, and it made sense to you somehow but it's just derailing threads.
Quite interesting that nvidia has lowered the price of fermi quadro cards so drastically. Either GTX580 is a big step or they're scared of cayman based professional cards.
Is it bothering you? I'm talking about the upcoming high-end GPUs, and trying to find out if they are "fake" or "real" generations. Maybe I've repeated some things; if so, sowwy, but I felt the need to raise awareness.
I would love to talk about the details of the GTX580 and Cayman, that would be much more interesting, but I don't have much reliable info yet. Just drop what you've got, you won't disturb anyone, and I will follow you.
Well I agree that companies calling a barely changed architecture a new generation is marketing driven. But just as we are able to say that "I don't think this card should be called X" we could just as easily say that we don't think certain chips should be considered new generations.
But I don't think that requires us to make up some specific definition for generation just to avoid undesirable consequences, if we could even agree on a definition at all. Instead I think we should simply be open to discussing what a company calls a generation and deciding for ourselves if it really deserves that moniker. And I don't think we should overemphasize the importance of defining something as fitting in a specific generation when the actual performance of a product is so much more complex than what can be defined by a single word.
maybe it's hard for me to separate physical and architecture differences because my perspective is more design oriented but the two are related. for example, arch A may create a nasty physical design while arch B has a good one, but arch B might have some crippling architectural bottleneck. arch B could be prescott in the P4 or arch A could be fermi in GF100.
in my opinion a next generation gpu can be defined roughly as a gpu that is capable of something previously not possible, and in a meaningful way (5% faster won't cut it). fitting barts into this is difficult, but with 20/20 hindsight it will become clear if it's a next gen gpu.
also i have something important to note about processes. fewer nanometers do not make a process better. arguing that 28nm is faster than 40nm because it has smaller structures is not true. physics gets very strange at this level and is very counterintuitive. this means that a new process does not make or break a next generation. do you like how i keep my sentences all the same length? i like being monotonous and terse.
Chumbucket, I agree. I was (over?)simplifying a lot.
It is up to the manufacturer to decide whether or not a product is of a new generation. It is up to you to agree or disagree with it; however, you aren't the one to decide what's right or wrong.
If AMD calls Cayman 7xxx and claims it's new-gen stuff, then it is. You can disagree with it, but that's all. They can decide; you just have an opinion.
We know next to nothing about them, as you said, yet you're worried about whether they're new generations? The GeForce FX 5200 was slower than the GeForce4 4200 even though it was on a new process and a different architecture. Faster is not always a necessity for a new generation. I would consider GF104 to be a new generation because of the reordering of shaders and its proximity in performance to the larger GF100.
:rofl: :rofl: :rofl: :rofl: :rofl: :rofl:
Man... I was occupied with work and worried sick about a task I have to perform... but then you come along and... I want whatever you're on!
Nvidia screwed up badly... just because your competitor is better, doesn't mean you have an automatic right to deceive people left, right and center. Jensen is too egotistical, and some Nvidia fans, just wow!!!
AMD's 6-series cards that we've seen so far (Barts) are supposed to be replacements for the 5770 and lower cards... They're from a new generation. It's like moving from Nehalem to Sandy Bridge for Intel.
What's with you guys trying to antagonize other people and start flame wars?
This new generation of fans is really making this forum a worse place.
Anyway, it doesn't give you the right, but marketing-wise it makes far more sense. If NV kept the GTX 4xx moniker, they would likely sell fewer cards because of it. This is because a lot of people put weight on naming; they would think it's still last generation's card, which is half true. This is an act of desperation for a company that has fallen behind in the development cycle when it comes to performance per watt (they still compete quite well on raw performance). This naming scheme has nothing to do with Jensen and everything to do with market conditions and a flawed product.
Nonetheless you're getting more performance for likely the same or less money, which in turn helps lessen the blow of AMD's new naming scheme and products. It's kind of a :banana::banana::banana::banana: move for AMD to shift its naming upward, just as it was a :banana::banana::banana::banana: move for NV to rebrand cards.
Since you're all about criticizing naming schemes, by consequence of what you're saying against Nvidia for trying to deceive its customers, what do you think of AMD's latest naming tactics? When you're beating your competitors at selling cards, why start shifting your naming scheme so it is more confusing for customers, when you already have the advantage? Nvidia is doing part of this out of desperation and because the market demands it. Why is AMD doing it?
AMD is just doing it so people swallow the Barts price more easily, because most people would have a hard time paying $239 and $179 when last generation's 57xx series were $159 and $129. It's dishonest because never in the history of ATI video cards has buying a card in the same series gotten you lower performance (besides when a naming scheme changed entirely). Still, even if AMD hadn't dropped the price and had kept Barts XT priced as it is, I think all of us would have appreciated the naming scheme staying 67xx, because it is more honest. Barts' codename was RV940, which for the longest time was AMD midrange. If we accept Barts XT as a next-generation midrange, its performance is in line with where it should have been, but its price is significantly higher than people thought, as a lot of people were expecting Barts XT to be under 200 dollars.
Both naming acts are dishonest, but a lot more people are hurt by AMD's naming scheming, because if you do your research and all that jazz you will likely be able to sidestep both problems with both naming schemes. However, with Barts XT priced as it is, people who want to buy a card at around $100-150 are SOL and have to settle for last generation's stuff, as roadmaps indicate the 57xx generation is not being discontinued. AMD is going to make more money, but this probably doesn't benefit the consumer.
LOL, nVidia's the angel of desperation (with all of their money in the bank, overall long-term market dominance, many kneejerk tactics using that particular dominance, and those proprietary bvllsh1ts), and AMD's the devil incarnate on a mission to screw gamers all around the globe (with their $1M profit, great-value products, and goodwill built up since I don't know how long). :ROTF:
Ignorance in its finest form. :clap:
Wow you've sure got a hate on for AMD. :rofl:
You act as if there's no good reason for Barts to cost more than Juniper, as if shifting the name upstream is simply to make it easier to digest. Well duh, no kidding: the 32nm process was cancelled. You know it was cancelled, you know it's on the same process, you know it has a higher BOM than the 5700 series, so what else were they supposed to do? Sell them at a loss?
It's amazing they even came up with these cards given the 32nm cancellation; their execution has been spot on. Even if you don't like the cards, they have driven prices of GTX 470 and GTX 460 cards down, so everybody wins. You're always harping that AMD is charging/will charge too much for their new cards because of the lack of competition. Yet they released them at prices so low that Nvidia had to respond by deeply slashing prices. Yeah, AMD sure seem to be gouging. :rofl:
Give it up, you don't have a leg to stand on, provide some evidence instead of bias. :shakes:
What the heck, you have even said Nvidia has been desperate at one point or another. Nvidia is not desperate for cash reasons but for product reasons. They don't have very much in terms of releases at the moment, and that's what they are desperate about. :shakes:
Anything they launch at this point is likely going to be simply a modified version of their existing product and for the high end, if they do release anything new, its going to be a paper launch.
That's a pretty desperate situation, I think, regardless of how much money they have in the bank. NV has been riding their professional market since the release of the GTX 280 to make money. If they didn't have that, they would be bleeding money.
What the :banana::banana::banana::banana:, how can you interpret that as me saying AMD is evil? They are simply operating as a business. AMD (ATI is a different story and has priced cards high in the past) has never had the upper hand in performance and branding, so they could never price their products really high. AMD is still providing some value in their products, but not as much value as it has in the past, and the value will drop if Nvidia fails to provide any competition, which I don't see coming from Fermi unless Nvidia wants to sell their products at cost or at a loss.
I would expect Nvidia to do the same thing (they did so in the past) and Intel is doing it now. I am not expecting AMD to act like angels and not do the same thing. Any company would do so when the competition from Nvidia is so shoddy (something a real NV fanboy would never say).
The goodwill you're thinking of is meaningless. As long as your products are better than the competition's, they are going to sell regardless of how evil you think the corporation is (look at Intel).
You're being ignorant to simply ignore that AMD is a business first and anything else second.
AMD, Intel & Nvidia should create football teams to make these forums all the more interesting...
You know...just once, it'd be nice if we could have ONE thread for BOTH companies make it more than, I don't know, the first SIX pages without turning into a huge flame war(and usually by the usual suspects too).
I'm making this VERY clear, next one to flame anyone gets to feel my wrath. People need to quit acting like children and grow up. Unless you work for either company it's not like you're getting a check for your behavior, so why keep it up?
Don't worry, I'll be treating the cayman thread much to same way, because frankly I think it's ****ing silly that it's impossible to have a discussion on the possibilities of what new hardware might be without people starting a huge flame war over it... Perhaps it's time to start requiring legal i.d. to sign up here.
let me get my popcorn... There is going to be a NERD FIGHT!!!
:argue: :fight: :lsfight: :frag: :weapon:
LMAO; they have it already. Maybe not a physical team, but they have it inside heads.
It's really sad to see how all GPU threads become like a football match. This match works like this: if you say something negative about the other "team", then a few narrow-sighted people (I call them propaganda agents) start with funny comments, and before you know it a wolf pack/team will be all over you with rude comments and personal attacks, and it always ends up as a childish fight.
A few narrow-sighted people involve other innocent people emotionally, just like a football match, where "my team" is always right and the other one is always wrong. This goes further than a football match, actually; this kind of narrow-sighted thinking always ends up in a war and a "battlefield" where you have to kill the enemy, believe me.
I think we will all benefit from a constructive discussion about the good and bad of each GPU. We will all learn from each other and make better choices.
This kind of childish fight, and the emotional involvement (which just a few known people usually start), hides the advantages and disadvantages from all of us.
We all lose in these childish fights; only the GPU makers and those on their payroll benefit from them.
This is how GPU makers get away with their "tricks": you won't know the disadvantages before you buy. There is another side too, which is a big irony: those who fight for these GPUs and get involved emotionally become the biggest victims of these "propaganda agents", because that emotional attachment makes you pay more for the particular GPU/brand you have fought for.
That's why I vote with my wallet... At the time that I'm purchasing I buy whatever happens to be best for the price that I'm paying out...
9800pro
6800GT(Won it on a bet)
8800GTX
4850 1gb (sold the 8800GTX to someone who wanted it for SLI; actually made a profit on the switch)
460 GTX(sold the 4850 1gb to a friend who wanted it for CF)
2 ATI cards, 3 NVidia cards. Whatever happens to be the best card for my money, that's what I buy. I don't care who makes it if it works as it should and beats its competition at that price point. Sadly, more people can't seem to think in this manner.
At high resolution with AA, I doubt we'll see too much of a cpu bottleneck going on.
I'm still hoping we see some games soon that can actually put the power to good use from both companies. :up:
Lately even the mid-range parts have been enough for 1920x1080 with AA without issue.
Crytek claim that Crysis 2 will run better than the previous games while looking a little better, and considering they got it to run on consoles that just might be true... but to be honest I don't trust what they say one bit. It's gonna run badly, and the new question will be, as you said, "will it run Crysis 2?"
Which makes me wonder what kind of FPS modern GPUs will pull in Crysis 2. The 8800GTX got demolished by Crysis when it came out, and it took till now to get a single GPU that can run it at 1920x1080 fully maxed with 16xQ AA. For the minor graphics improvements in Crysis 2, it had better run the same as the old one... or I will be straight-up mad :p: and then I will look for a second GTX 480...
That picture could be fake, I never really trust Chinese sources anyway.
That just looks like a 470GTX....
It could be the 580, which runs cooler, or it's fake.
Always some questionable info from these Chinese sites.
That would be quite unexpected and probably not a wise choice. Nvidia needs to tell their potential customers that they have something better on the horizon, because otherwise there's a big risk they'd jump ship. Also, since when hasn't JHH been overly confident about new products and promoted them like there was no tomorrow?
The only reason I could see for this to be a "spy" product is if the availability is going to be minimal at best. That way they could paper launch the product and try to spoil Cayman's parade but not get the huge bash for not being available in stores.
480SP-GTX480 reaching 95 degrees on stock - how will a 512SP card fare with worse cooling? At the same node?
Yeah, but those "potential customers" don't believe it anyway, after those wood screws last round. :rofl:
But seriously, keeping it secret to surprise the competitor is nothing new, and it has its advantages, which may outweigh the disadvantages you mentioned. If so, it will come right at the time of Cayman's release, to avoid the concerns you mentioned.
ahh Mrs. Potts' mini oven???
play crysis 2 and cook your roast all at the same time
I don't know why I didn't think of the 470... (heh, no heat pipes on the reference Nvidia one)
http://img42.imageshack.us/img42/968...oductshot1.png
Yes, but GTX580 is supposed to be even hotter than GTX480. This means that the cooling solution should be better.
It's the new HELLGATE thermal solution. It is a quantum thermal portal that sends the heat to Hell to help keep it from freezing over. Jen-Hsun Huang made a deal with the devil to help save Nvidia and Hell at the same time.
Seriously though, it is expected to be hot and to suck a lot of electricity; that much is certain. It should mean a better cooling solution, but we shall see.
Looks more like a GTX 475 (aka full GF104) to me...
http://fudzilla.com/graphics/item/20...-not-permanent
durrrr gtx 480 is out and gtx 580 is in

Quote:
Update: Igor Stanek, Product PR Manager EMEAI informed us that :"Any claims that our pricing update is temporary are patently false. The price adjustments we made are not temporary, and are reflective of an upcoming change in our product lineup."
I hate to break it to myself, but all these card pictures are like smoke... -> fire very soon
Well if those images are the real deal, we can count out 512b bus width. Looks more and more like a full GTX480 512, but I guess we'll find out soon enough.
Hmm... It's rather lengthy; longer than the GTX480?
Also, is the chip a little bit smaller than GF100?
Can anyone make a comparison with a GTX480 in a similar position?
EDIT - Scrap that... It's the same length and the chip is a similar size too... It really seems like just a fixed GF100... Or someone is playing with us and that is just a regular GTX470...
Aww, if that's how the real card looks... well, it looks like any other GPU :P I would have liked them to keep the heatpipes sticking out the top; I kinda liked that.
12 mem chips, 384-bit bus - 1536 MB = 480 GTX
512-bit bus - needs 16 mem chips x 128 MB = 2048 MB vmem = 580 GTX
foto FAIL
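The chip-count arithmetic above can be sanity-checked in a couple of lines. This is just an illustrative sketch: the 32-bit-per-chip and 128 MB-per-chip figures are assumed here as the typical GDDR5 configuration of the era, not confirmed for this specific board.

```python
# Sketch of the memory-chip math: each GDDR5 chip contributes a
# 32-bit slice of the bus and (assumed here) holds 128 MB.
BITS_PER_CHIP = 32
MB_PER_CHIP = 128

def board_config(num_chips):
    """Return (bus width in bits, total VRAM in MB) for a given chip count."""
    return num_chips * BITS_PER_CHIP, num_chips * MB_PER_CHIP

print(board_config(12))  # (384, 1536): 12 chips matches the GTX 480 layout
print(board_config(16))  # (512, 2048): what a 512-bit card would require
```

So a board photo showing only 12 memory pads rules out a 512-bit bus by itself.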
http://img266.imageshack.us/img266/4629/gtx580480.jpg
same length, same memory bus, same gpu package
angle fixed..
What puzzles me is how people think that nvidia can just snap their fingers and give 512 bit bus to such a large chip already plus more cuda and make it work next month.
lol, let's get real, it's already a miracle that the same chip can run 512 shaders.
what gtx580 needs is aggressive clock speeds both ram and especially gpu. memory bandwidth is fine.
If the GTX 580 still has the 384-bit memory bus, I'd rather see them increase the VRAM to 3GB. Though a 512-bit memory bus would be sweet.
While Nvidia must have had plans for a 480 successor from the day they announced it, I imagine they were hoping to release a 485 rather than a 580. Nvidia and AMD GPU plans rarely push the boat this far out; they tend to be something big followed by a tweaked SKU, like 280 to 285 or 4870 to 4890.
What AMD have planned this year is totally out of character; I don't think they've pushed changes like this over a 12-14 month period before. I agree that there is a strong chance that Nvidia's plans are more subdued than ATI's. Nvidia can't go larger, but there is a lot of room for tweaking that could add up to fill the potential power gap that many are anticipating. A 512-bit bus is a bad choice compared to improving the 'weak' memory controller the 480 currently sports.
772/1102 MHz for the 580
and the GTX 560 exists (460-384). I stand corrected lol.
HOWEVER there were these certain guys that kept yapping about the 6870 naming... can they stand up now? :rofl:
Drivers 261.00
http://developer.nvidia.com/object/c...oolkit_rc.html
Quote:
NVIDIA_DEV.0E22.01 = "NVIDIA GeForce GTX 460"
NVIDIA_DEV.0E23.01 = "NVIDIA GeForce GTS 455"
NVIDIA_DEV.0E24.01 = "NVIDIA GeForce GTX 460 "
NVIDIA_DEV.0E25.01 = "NVIDIA D12U-50"
NVIDIA_DEV.0E38.01 = "NVIDIA GF104GL"
NVIDIA_DEV.0E3E.01 = "NVIDIA GF104-ES"
NVIDIA_DEV.0E3F.01 = "NVIDIA GF104-INT"
NVIDIA_DEV.1080.01 = "NVIDIA GeForce GTX 580"
thank you cold2010
but what are these ?
NVIDIA_DEV.0E38.01 = "NVIDIA GF104GL"
NVIDIA_DEV.0E3E.01 = "NVIDIA GF104-ES"
NVIDIA_DEV.0E3F.01 = "NVIDIA GF104-INT"
No, blame AMD; they're the culprit of all the renaming madness this generation, while their cards clearly don't deserve their respective names, the source of all evil and wrongdoing; they forced nVidia, the innocent angel, to follow suit too. :ROTF:
Even woodscrew-gate was AMD's responsibility too. Why did they dare to release the DX11 generation first? Don't they know it was a special privilege for the angel in the green robe to be there first, so they wouldn't have to resort to this:
http://i50.tinypic.com/2lkshls.jpg
prompting a clear response like this:
http://t2.gstatic.com/images?q=tbn:w...gxDvO4QhCYCD4=
:rofl:
Back on topic: while 384-bit is quite clear, the chip's µarch is still very much in question. The card seems cool enough not to need the GTX 480's monstrous cooler and exhaust hole in the PCB. Will this chip be a derivative of GF104? :shrug:
It's a GF100. A4 spin. GTX480 is A3.
No, that doesn't make sense.
It's at least a B stepping, but not an A4. NV could have brought an A4 stepping in June, and they could never make it 20% faster with an A4.
No way man, that can't happen by default, there's no way AMD will get outrenamed by the angelic nVidia. :shakes:
Seriously, there's got to be more than just a simple respin like that if this chip is going to be at all competitive against Cayman. But if they do successfully pull this stunt, then I tip my hat to nVidia's engineers; it's a job well done. :up:
hmmm for some reason I really like the look of whatever was posted there, and seeing a real-life pic of one means that whatever it is, it's close and in working form. the question is what changes did they make: the card looks so similar to a GTX 480 from the back that it's obviously still a 384-bit bus (which is fine), but is it more shaders? more TMUs? a tweaked arch? we have no idea. I really hope they don't just bump the shaders to 512, tweak the core for slightly better power and heat output to fit that smaller cooler, and call it a GTX 580. that would not be enough against cayman, they need something more...
Good news: this means it's real. It remains to be seen if it can live up to the expectations the new-generation name is generating.
My tip is it will drop out of the clear sky right at Cayman's release, and then we can expect it to match/beat its performance too. Just guesses and speculation for now, though.
Put me down for this is not a rename cuz its Nvidia :D
This had better have a few more changes than clocks. If it doesn't, well, shame on them, because this should have been the GTX 485. The only way I would see this as worthy of being considered a new generation is if they cut power consumption by at least 25% and increased performance by 20+%. If they don't do that, this can't be considered a new generation. This isn't a rebrand, but it's not enough to be called the GTX 580. The worst-case scenario, I think, is if this is just an A4 revision and it's the same card with higher clocks. If that comes true, man, it will be the worst product refresh I have ever seen.
The biggest concern for this card is whether it performs similarly to Cayman, which I think is the real card I want this generation. Cayman has enough changes (from rumors) to be called next generation. It doesn't look like Nvidia will be able to catch up on performance per watt this generation. I can see them getting Cayman performance, but at higher voltages.
Rumors are suggesting this is a 512-shader card with 128 TMUs, which in my opinion is not going to give it a clear advantage over Cayman.
If that picture is true, I am happy they got rid of the hole in the card.
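The bar proposed a couple of posts up (at least a 25% power cut plus a 20%+ performance gain before a card earns a new-generation name) is easy to express as a quick check. This is a hypothetical sketch of that poster's criterion, and the numbers in the example are made up, not real benchmark or TDP figures.

```python
# Sketch of the proposed "new generation" test:
# power consumption down >= 25% AND performance up >= 20%.
def earns_new_gen_name(perf_old, power_old, perf_new, power_new):
    perf_gain = perf_new / perf_old - 1.0   # fractional performance increase
    power_cut = 1.0 - power_new / power_old  # fractional power reduction
    return perf_gain >= 0.20 and power_cut >= 0.25

# Made-up numbers for illustration only:
print(earns_new_gen_name(100, 250, 125, 180))  # True  (+25% perf, -28% power)
print(earns_new_gen_name(100, 250, 118, 245))  # False (+18% perf, power barely moved)
```

By this yardstick, a simple respin with higher clocks would almost certainly fail both halves of the test.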
nvidia is not the favorite to win here. amd is coming off many successes in a row. nvidia is probably a 2/1 underdog, if anyone was counting.
Relax folks, this is nVidia that we're talking about, they're full of win. ;)
Whatever architectural changes were applied in Cayman, those don't matter; since it was still made on the 40nm process, we have come to the conclusion that it can never be called a generational change. And regarding the renaming/rebranding game, nVidia is innocent, since it was AMD who instigated this round of renaming and rebranding. No matter that the chips had some architectural changes and were priced to compete while offering great value to consumers, they have to bear the brunt of the original renaming sin. :up:
GTX 580 is looking more like GTX485 now...
5 series looking like the 4 series, but with fully functional chips:rofl:
I must have entered the twilight zone. Last I checked nVidia was blamed for everything around here including the recession and crying babies. Where did this "can do no wrong" joke come from?
A new name doesn't always bring generational change. See 2900XT->HD3870 and 8800GTX->9800GTX. I'm looking forward to seeing this 580. It could answer the question of whether the architecture is inherently over-engineered or if GF100 was just an inefficient implementation.