Here's your proof and specs: http://www.xbitlabs.com/articles/vid...e-gtx-465.html
My rig the Kill-Jacker
CPU: AMD Phenom II 1055T 3.82GHz
Mobo: ASUS Crosshair IV Extreme
Game GPU: EVGA GTX580
Secondary GPU 2: EVGA GTX470
Memory: Mushkin DDR3 1600 Ridgeback 8GB
PSU: Silverstone SST-ST1000-P
HDD: WD 250GB Blue 7200RPM
HDD2: WD 1TB Blue 7200RPM
CPU Cooler: TRUE120 Rev. B Pull
Case: Antec 1200
FAH Tracker V2 Project Site
My point is that they could never make one in the first place, at least not in quantities large enough to sell even as a PR/halo part; hence the "unmanufacturable" claim. They don't need to release one NOW to compete. They should have released it when the 400 series launched, but they couldn't.
Gaming Box
Ryzen R7 1700X * ASUS PRIME X370-Pro * 2x8GB Corsair Vengeance LPX 3200 * XFX Radeon RX 480 8GB * Corsair HX620 * 250GB Crucial BX100 * 1TB Seagate 7200.11
EK Supremacy MX * Swiftech MCR320 * 3x Fractal Venture HP-12 * EK D5 PWM
I've heard they were working on 512, but I don't know why they ended up with 480. Maybe they ran into trouble with power usage, heat, yields, and so on; or maybe it was all those lies from that genius with his insider-info claims, or all those great Charlie stories. But maybe not; who knows? Are you claiming to know?
But the reality is that they didn't (and still don't) need 512 to beat the 5870. Maybe they discovered that at the end, and that's why they settled on 480. The GTX 480 has managed to perform with a good enough margin above the competing GPU (5870), and that's all that matters in this business; the rest is history. I hope this logic blows up the little that was left of that argument, if any.
Last edited by Sam_oslo; 05-25-2010 at 09:28 PM.
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
The Fermi design was set in stone before all those troubles (yield problems, leakage, thermal performance, TSMC, etc.) arose. Nvidia had a good idea of how it would turn out, but it's impossible to be 100% sure until you have actual silicon in your hands. And they couldn't go back to the drawing board once they realised they were in trouble; that takes years. There's a thing called time to market.
Fermi was supposed to have 512 working CUDA cores; it's not like Nvidia said, "Uhm, OK, 480 cores is just fine." All those problems combined resulted in this "catastrophic" failure (like an airplane crash), and Nvidia had to make a lot of adjustments to release the GF100 as we know it now.
Last edited by Caparroz; 05-26-2010 at 12:49 AM.
Murray Walker: "And there are flames coming from the back of Prost's McLaren as he enters the Swimming Pool."
James Hunt: "Well, that should put them out then."
Hence it was almost six months late.
Gaming Box
Ryzen R7 1700X * ASUS PRIME X370-Pro * 2x8GB Corsair Vengeance LPX 3200 * XFX Radeon RX 480 8GB * Corsair HX620 * 250GB Crucial BX100 * 1TB Seagate 7200.11
EK Supremacy MX * Swiftech MCR320 * 3x Fractal Venture HP-12 * EK D5 PWM
As I've said, this or other problems may be the case too, and yes, they took too long to pull it off. Some of those assumptions may be true, but nobody knows the whole truth behind it, especially when you look at all the propaganda with funny pictures, funny comments, the genius insider, the great Charlie, and other BS that made the GPU business look like monkey business. But anyway, that's history now; what matters is the current reality. We should focus on the PPP (Performance, Power usage, Price) of their current products to judge their efforts, not on what they couldn't or shouldn't have done in the past.
Right now, the performance and power usage/heat of the GTX 480M is what's interesting. It will show how scalable and flexible this new architecture really is.
Last edited by Sam_oslo; 05-26-2010 at 08:43 AM.
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
Too bad humans don't have the power to change the past, only the future.
The Fermi design was a bit too ambitious, and NVIDIA didn't do enough trial and error; that is, they didn't put in enough effort to keep the design realistic for the chip maker.
I'm not being a fanboy, but that's unlike what ATI did: when they designed the 4870 around GDDR5, they were realistic about what the manufacturer could actually deliver.
Maybe I'm a little off on some details, but you guys know what I mean.
My PC
Collecting dust....
Asus Rampage II Extreme | i7 920 @ 4ghz (1.4v 200 BCLK) | 6 Gig OCZ Reaper 1866mhz | MSI GTX 480 Tri-SLi | Intel X25-M | 5 WD 1TB in RAID0 | Thermaltake 1500watt ToughPower | Samsung SM305T |
EK Supreme | EK Mosfet Blocks | Koolance MB-ASR2E NB/SB Block | DRAM Block for MB-ASR2E | MCP355 | MCP355 | TFC XCHANGER Triple & Quad Radiators | Lian Li P80
A man who appreciates 2560x1600
__________________________________________________
AMD Phenom II X4 "B50" 3.8GHz/2.6GHz NB/1.46v
ATI Radeon HD4850 700MHz/2500MHz
4GB OCZ Platinum DDR3 1333MHz/5-5-5-15-20 1T/1.90v
MSI 790FX-GD70
Auzentech X-Meridian 7.1 PCI
Cooler Master CM690
Corsair HX850W
Thermalright Ultra 120
Dual Dell 2407WFP 24" LCD's
__________________________________________________
If you have something constructive to say about this GPU discussion, say it; otherwise mind your own business. It is not up to you to tell others how much to write.
When the argument dries up, the personal attacks and funny comments start. This GPU monkey-business propaganda has been kept going with exactly this kind of off-topic BS.
Last edited by Sam_oslo; 05-26-2010 at 09:44 AM.
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
@Sam_oslo: I'm gonna try to explain to you why you are wrong.
a) Full-blown desktop GF100. As of now, we only get 480 SPs, not the full 512-SP product. So think about it: does it make any sense to design a 512-SP chip and then only market a castrated version? In my book it doesn't, because if you had intended to make a 480-SP part in the first place, the chip would be smaller and thus cheaper. Instead, they fabricate 512-SP chips but have to disable part of each one just to make the product feasible. So what does that tell us? And don't forget that GF100 is horrible in terms of power consumption, noise, and heat. How is it possible that GF100 barely competes with the GTX 295 yet isn't better in any of those aspects? Something is wrong, yet not many people see it. Why? That is something I don't understand either.
Let's look at the competition: just judging from the performance numbers, we can tell that something in the ATI architecture is not working at 100% (or the architecture is too old), because the 5870 barely keeps up with the 4870X2. BUT, and this is an enormous "but", they made huge improvements in heat, noise, and power consumption; the 5870 barely draws 200W. Now ask yourself: how is it possible that NVIDIA needs a ~500mm2 chip to compete with a ~350mm2 one, and on top of that needs a ton of extra energy? In my book something is definitely not right, because in the previous generation GT200 was incredibly good in power consumption, heat, and noise compared to ATI (not in vain do I own two GT200 products: a GTX 260 and a GTX 295). So in the past you paid more, but you also got more: less consumption, less noise, huge overclocking headroom, etc. Now tell me: where did all of that go? There is nothing left of the previous generation. Yes, they still have the most powerful single GPU, but that is not enough, especially when they are asking the same money a GTX 295 cost some time ago without being notably better in any aspect...
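As a rough sketch of what that die-size gap alone means for cost (a minimal sketch: the 300 mm wafer and the defect density are assumptions; only the ~500 mm2 and ~350 mm2 figures come from this post):

```python
import math

WAFER_DIAMETER_MM = 300   # assumed standard 300 mm wafer
DEFECTS_PER_CM2 = 0.4     # assumed defect density, a rough 40 nm-era guess

def dies_per_wafer(die_area_mm2):
    """Gross dies per wafer (classic estimate with an edge-loss correction)."""
    wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2
    edge_loss = math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def poisson_yield(die_area_mm2):
    """Fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-DEFECTS_PER_CM2 * die_area_mm2 / 100)  # mm^2 -> cm^2

for name, area in [("~500 mm^2 (GF100-class)", 500), ("~350 mm^2 (Cypress-class)", 350)]:
    gross = dies_per_wafer(area)
    print(f"{name}: {gross} gross dies, ~{gross * poisson_yield(area):.0f} defect-free")
# ~15 perfect dies per wafer vs ~41: the big die pays nearly 3x per good chip.
```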
b) Different product, different chip. When you create and fabricate a chip, you have a target in order to optimize production. If you need a product with 1/5 the power of Cypress, it's nonsense to use a castrated big chip, because you are wasting huge money. Instead, you design another chip, to optimize production and make everything more efficient. It's not about the power consumption; it's about the cost. Why would you use a castrated ~500mm2 chip to power a laptop if you are going to clock it at half the desktop clocks and use 3/5 of its original SPs? That is a total waste of a chip that could be used for plenty of other things. If you look at NVIDIA's previous mobile line-up, NVIDIA only offered G92-based products (and other low-end ones), even for the parts labeled GT2xxM, because it was clear that using a heavily castrated ~500mm2 chip was total nonsense.
You could say that, if many chips are not 100% functional, it's a good idea to use the defective ones for other purposes: you are right. The problem is that yield rates improve with time (or tend to), so you end up with more high-end chips and fewer cut-down ones, which becomes a real problem because you want to sell more mainstream than enthusiast products. That is why this is only ever done as a limited line of products: they did it with the 8800 GS, and now ATI is doing it with the 5830.
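That shrinking supply of cut-down parts is easy to see in a minimal sketch; the 16-cluster count matches GF100's 16 shader multiprocessors (the GTX 480 ships with 15 enabled), while the per-cluster defect probabilities are invented to stand in for improving yields:

```python
from math import comb

N_CLUSTERS = 16  # GF100 has 16 shader clusters; the GTX 480 enables 15

def bin_shares(p_bad):
    """Binomial split of dies into fully-working vs one-cluster-harvest bins."""
    full = (1 - p_bad) ** N_CLUSTERS
    harvest = comb(N_CLUSTERS, 1) * p_bad * (1 - p_bad) ** (N_CLUSTERS - 1)
    return full, harvest

for p in (0.10, 0.05, 0.02):  # assumed per-cluster defect rates, improving over time
    full, harvest = bin_shares(p)
    print(f"p={p:.2f}: {full:.1%} fully working, {harvest:.1%} one cluster bad")
# As yields improve, the harvest bin dries up while perfect dies pile up,
# which is exactly why salvage SKUs only work as limited product lines.
```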
Now, let's look at the whole picture: if they are using GF100 for mobile parts, it means yields are not merely horrible but 100% crap, and it will probably take them quite some time to produce any other chip in good numbers. So, all in all, it's a bad, bad situation for NVIDIA, and I definitely hope it gets better, because otherwise we are all screwed. Good competition means good prices: we didn't get €120 GTX 260s a year ago because NVIDIA stopped being greedy, but because ATI was competing insanely hard.
PS: not that ATI isn't greedy now... Evergreen has done nothing but rise in price since launch, especially in Europe (though that is partly down to other factors, including the rising dollar).
Last edited by prava; 05-26-2010 at 10:21 AM.
GF100 is a brand-new architecture, so there were bound to be problems. This happens whenever you make a new architecture, simply because it's new. GT200 was still part of an old, matured architecture; that is why it did so well, and GF100 has the potential to do the same. It just needs to mature.
Yes, this is a castrated chip, but consider: if the clusters they disabled were faulty, then wouldn't this be a way to make money off something you would normally throw away?
My rig the Kill-Jacker
CPU: AMD Phenom II 1055T 3.82GHz
Mobo: ASUS Crosshair IV Extreme
Game GPU: EVGA GTX580
Secondary GPU 2: EVGA GTX470
Memory: Mushkin DDR3 1600 Ridgeback 8GB
PSU: Silverstone SST-ST1000-P
HDD: WD 250GB Blue 7200RPM
HDD2: WD 1TB Blue 7200RPM
CPU Cooler: TRUE120 Rev. B Pull
Case: Antec 1200
FAH Tracker V2 Project Site
When it comes to power usage and price, I've already said this:
If I understand you right, you are arguing that the current configuration of the GTX 480M means nVidia is having yield problems. You also bring in the "castrated" chips and manufacturing costs on top of this, trying to say that manufacturing costs and yields are the reason for high prices and other bad stuff. But you are wrong: competition (ATi's ability to counter Fermi) has the biggest effect on prices, as anybody with a couple of days of marketing classes could tell you.
I don't know whether your assumptions about yield are right or wrong, but even so, why should I, as a consumer, care about it? Why should I care about 480 out of 512 at all?
I don't care how much it costs to manufacture, what kind of problems nVidia may have had or still has, or what they could or should have made. All I, as a consumer, care about is the PPP (Performance, Power usage, Price) of the products actually on hand.
Performance is the most important and comes first. There is no doubt that the GTX 480 performs well enough above the competing GPU (5870).
The price depends more on ATi's ability to get its act together and compete; if that happens, Fermi prices will fall fast, for sure.
The high power usage is still a problem, and it may have something to do with the GPGPU extras, but it may improve over time as the BIOS, drivers, and so on mature. Hopefully it will get better, and it should, anyway.
Last edited by Sam_oslo; 05-26-2010 at 11:01 AM.
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
It is very clear that Fermi is not a very successful architecture for now. The reason is obvious: when the 5870 went on sale, Mr. Huang didn't even have a real Fermi card to show at the conference (which is where the wood-screw jokes came from). In most cases, a delay means a failure. Pretty much the same situation as the Intel Pentium 4.
Well, you should.
1- Lower manufacturing costs mean lower prices for the end user.
2- A full-blown Fermi would be a better performer.
3- A better-performing product at a lower price is good in itself, and it would also put a lot of pressure on AMD, thus...
Well, you get the idea.
Murray Walker: "And there are flames coming from the back of Prost's McLaren as he enters the Swimming Pool."
James Hunt: "Well, that should put them out then."
Going back in time and talking about problems, even if they were real, doesn't change the realities of today.
Look at the GTX 480 as it is today and compare its PPP with the competing GPU (5870). That makes it much easier to judge. Why go back in time to dismiss something that is right in front of your eyes?
1- Competition (ATi's ability to counter Fermi) will decide the price, not manufacturing costs. Just like the 980X costs $1000 because AMD is still stuck on 45nm, not because it costs more to manufacture. Many other examples prove this point.
2- Of course, who would say no to 512? But nVidia didn't (and still doesn't) need it to perform well enough above the competing GPU (5870). nVidia needed, and still needs, a dual-GPU card to compete with the 5970, not a better single GPU.
3- The GTX 480 is a better-performing product, and a superior GPU costs more, just like the 5870 costs more than the 5850. The price will go down when ATi can put up a good fight and compete with Fermi's performance.
► ASUS P8P67 Deluxe (BIOS 1305)
► 2600K @4.5GHz 1.27v , 1 hour Prime
► Silver Arrow , push/pull
► 2x2GB Crucial 1066MHz CL7 ECC @1600MHz CL9 1.51v
► GTX560 GB OC @910/2400 0.987v
► Crucial C300 v006 64GB OS-disk + F3 1TB + 400MB RAMDisk
► CM Storm Scout + Corsair HX 1000W
+
► EVGA SR-2 , A50
► 2 x Xeon X5650 @3.86GHz(203x19) 1.20v
► Megahalem + Silver Arrow , push/pull
► 3x2GB Corsair XMS3 1600 CL7 + 3x4GB G.SKILL Trident 1600 CL7 = 18GB @1624 7-8-7-20 1.65v
► XFX GTX 295 @650/1200/1402
► Crucial C300 v006 64GB OS-disk + F3 1TB + 2GB RAMDisk
► SilverStone Fortress FT01 + Corsair AX 1200W
They don't have a product with 512 SPs, but that's not a serious problem. The extra ALUs are useful for redundancy, so you can turn some off for different SKUs. ATi takes a fine-grained redundancy approach, and it ends up with the 5830, an overpriced underperformer. Fine-grained redundancy helps at the high end, but the lowest bin of the chip is terrible, and that's where the money is; so the two approaches are roughly equal tradeoffs.
Different products should not require a different chip each; that's a total waste of money. Just the masks will cost you several million dollars. It's best to have a one-size-fits-all design to target every performance segment: the non-recurring costs are very high, while the silicon itself is as cheap as dirt.
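For what it's worth, the disagreement between these two posts boils down to a break-even calculation. A minimal sketch, in which every dollar figure except the "several million dollars" for masks is invented for illustration:

```python
# Break-even sketch: dedicated small die vs harvesting the big one.
MASK_SET_COST = 5_000_000  # "several million dollars", per the post above

def break_even_units(big_die_cost, small_die_cost):
    """Units sold before a dedicated die recoups its extra mask set."""
    return MASK_SET_COST / (big_die_cost - small_die_cost)

# If harvested dies are nearly-free rejects, the dedicated die takes ages to pay off:
print(f"{break_even_units(50.0, 45.0):,.0f} units")   # 1,000,000
# If they carry most of the big die's full cost, it pays off quickly:
print(f"{break_even_units(120.0, 45.0):,.0f} units")  # ~67,000
```

So both positions can be right, depending on what a harvested die actually costs and how many units the segment sells.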
BTW, you might want to look up what the word "castrate" means; you are using it in the wrong context. Fermi definitely has balls, in the form of an R/W cache.
The exact opposite of what you described is happening. Just because NVIDIA's yields are poor doesn't mean ATi's are good; they are still under-binning. 5870s still come in two voltages because they can't keep supply high enough with current manufacturing capabilities, and the 5970 is still $700. Neither the 4870X2 nor the GTX 295 carried that premium, even at launch.
The 5830 is horrible value: it's about $240, and the 5850 is 30% faster for $60 more. It's a disappointment compared to the 4830.
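Putting numbers on that, using only this post's own figures (performance normalised to the 5830):

```python
# Perf-per-dollar from the post's figures: 5830 at ~$240, 5850 30% faster for $60 more.
price_5830, perf_5830 = 240.0, 1.00
price_5850, perf_5850 = 300.0, 1.30

print(f"5830: {perf_5830 / price_5830:.5f} perf/$")  # 0.00417
print(f"5850: {perf_5850 / price_5850:.5f} perf/$")  # 0.00433, about 4% more per dollar
# The faster card is also the better value, which is rare and damning for the 5830.
```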
The 480-SP Fermi actually costs Nvidia more than the full-blown 512-SP Fermi would have: they had to spend money developing the existing version once the original went wrong. Your analogy with AMD and Intel CPUs isn't valid either, because their "equivalent" CPUs don't compete in the same segment anymore; AMD focuses on the price/performance-conscious market.
Sorry to be brutally honest with you, but your other arguments don't really make sense.
Murray Walker: "And there are flames coming from the back of Prost's McLaren as he enters the Swimming Pool."
James Hunt: "Well, that should put them out then."