WTF, this is exactly why AMD fanboys are the absolute worst; they don't know where the line is between being a fan of a company and just being plain hateful.
You want Jen-Hsun Huang to kill himself, and then you put a smiley face after it. WTF, that's just evil.
Anytime a company or person is remotely positive about NV, they are accused of being a fanboy. If the person has any power or any pull, they are accused of being paid off by NV. Even Fugger has been accused of being paid off recently.
Man, you guys, can't you just stay out of this? AMD fans, just stop posting in NV threads if you're only going to talk crap. I have never heard anything positive come out of your mouths.
At least most NV fanboys can say something positive about AMD every once in a while. You guys are just creating a negative atmosphere and making the mods want to give up.
The only negative things I have said about this latest generation are that I don't see the 5870 as a worthy upgrade for someone who owns a 4870X2 (which is me, and I also have a 4890), and that Eyefinity is kind of a novelty, like 3D gaming, because you have to pay a lot of money (more screens) to get it going. And yet I get accused of being an NV fanboy when I own a whole lot of AMD stuff.
Dude, are you paying attention to what's going on around you? The GT300 will not be out for the Xmas holiday shopping season.
Nvidia concedes this; just read Anand's review of the Tesla architecture. ATi will have 3 months of unfettered sales. When the GT300 is released, ATi will simply drop the prices on their current line-up and release (or announce) their X2 line, etc., competing $ for $.
Do you really think Nvidia will be able to compete with Hemlock (5870X2) at $599 come this January or February, when Nvidia is due to release the GT300?
Or a 5870 at $299 versus the GT300 at $499? If Nvidia doesn't have a $199 DX11 part soon, they will lose massive market share and billions in market value. It doesn't matter how good the GT300 actually is if only 40k uber-hardcore people ever buy it. The 5870, for the foreseeable future, is good enough for almost anyone's needs.
Millions of people will be buying or upgrading their stuff for the Holiday Season. Nvidia will miss out on all those sales.
Timing is everything!
As promised, sticking to just news posts from here on in this thread.
Not sure if it's been posted guys, but...
http://www.xbitlabs.com/news/video/d...r_Q4_2009.html
It's on track for Q4 '09 according to NVidia.
That's not an accurate way to speculate about when a card will launch.
It's just kind of sad that people would go that far over an email.
Indeed, they are going to get screwed over hard in the future if they don't stay ahead. They have to get into embedded markets, because Intel and AMD will have their GPUs on-die pretty much anywhere they can put x86. It's not right now that looks iffy for Nvidia; it's the next few years. I'm not really worried for them, though. They have some nice chips, from Tegra to Tesla.
Particle's First Rule of Online Technical Discussion:
As a thread about any computer related subject has its length approach infinity, the likelihood and inevitability of a poorly constructed AMD vs. Intel fight also exponentially increases.
Rule 1A:
Likewise, the frequency of a car pseudoanalogy to explain a technical concept increases with thread length. This will make many people chuckle, as computer people are rarely knowledgeable about vehicular mechanics.
Rule 2:
When confronted with a post that is contrary to what a poster likes, believes, or most often wants to be correct, the poster will pick out only minor details that are largely irrelevant in an attempt to shut out the conflicting idea. The core of the post will be left alone since it isn't easy to contradict what the person is actually saying.
Rule 2A:
When a poster cannot properly refute a post they do not like (as described above), the poster will most likely invent fictitious counter-points and/or begin to attack the other's credibility in feeble ways that are dramatic but irrelevant. Do not underestimate this tactic, as in the online world this will sway many observers. Do not forget: Correctness is decided only by what is said last, the most loudly, or with greatest repetition.
Rule 3:
When it comes to computer news, 70% of Internet rumors are outright fabricated, 20% are inaccurate enough to simply be discarded, and about 10% are based in reality. Grains of salt--become familiar with them.
Remember: When debating online, everyone else is ALWAYS wrong if they do not agree with you!
Random Tip o' the Whatever
You just can't win. If your product offers feature A instead of B, people will moan how A is stupid and it didn't offer B. If your product offers B instead of A, they'll likewise complain and rant about how anyone's retarded cousin could figure out A is what the market wants.
Yields..
vary a lot depending on what libraries are used, logic vs. SRAM, etc.
Pure speculation, but we can infer some things:
- RV670 didn't have a "4830" - this suggests yields were very good.
- RV770: both the 4850 and 4870 had 800 SIMD at launch. There were few "4830"s, and the company instead opted for a separate 40nm die. A later revision clocked all the way to 1 GHz - this suggests yields were spectacular.
- RV870: notice that for the first time in many years, the "050" part has cut-down shaders, etc. Even with the die shrink, clockspeed stayed about the same - this suggests yields are poor.
Now let's look at nVidia:
- G80: right away we see GTS variants. Suggests many defects in the ROPs or memory controllers.
- G92: launched as the 112-shader 8800GT. Suggests yields were quite poor initially; they couldn't even launch the "full" part. But later we got the GTX and GTX+, and now only the GTS250 is on the market, with no cut-down versions. This indicates yields improved a lot.
- G200: once again, cut-down GTX260 and GTX275 versions. Notice how they "improved" the GTX260's shaders to 216... yields gradually improving, but because of the huge die, many sub-perfect parts.
GT300.
We know 3B transistors. We know 384-bit. We know 512 CUDA cores. That's it. Rumours about yields are just rumours.
But we do know a 3B-transistor chip will be bigger than RV870, and AMD's launch suggests they are having yield problems. This suggests that *if* and *when* GT300 launches, it might look like the G92 launch.
To harvest more chips with defects, the 'GTX' could launch with 480 SP and the GT with 416 SP. Later, after several months, an "Ultra" could be launched with the full 512 SP.
Don't be surprised if this happens. They did it with the 7800GTX/7800GT, the 8800GTX/8800GTS, and of course the GTX280/GTX260.
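The harvesting reasoning above follows directly from the classic Poisson defect-yield model: the fraction of fully working dies falls off exponentially with die area, so big dies almost force cut-down SKUs. A minimal sketch (the defect density and die areas below are illustrative guesses for discussion, not real TSMC figures):

```python
import math

def poisson_yield(die_area_mm2, defect_density_per_cm2):
    """Fraction of dies with zero defects under a simple Poisson model."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defect_density_per_cm2 * area_cm2)

# Hypothetical comparison: an RV870-class die (~334 mm^2) vs. a larger
# GT300-class die (~500 mm^2) at the same assumed defect density.
for area in (334, 500):
    print(f"{area} mm^2 -> {poisson_yield(area, 0.5):.1%} defect-free")
```

The exact numbers don't matter; the point is that at any fixed defect density the bigger die always yields worse, which is why selling partially disabled chips (480/416 SP bins) recovers so much revenue.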
24/7: A64 3000+ (\_/) @2.4Ghz, 1.4V
1 GB OCZ Gold (='.'=) 240 2-2-2-5
Giga-byte NF3 (")_(") K8NSC-939
XFX 6800 16/6 NV5 @420/936, 1.33V
I think you're exaggerating quite a bit here. Several months without a $200 DX11 card isn't going to cause NVIDIA irreparable damage, or even the millions in lost sales that you claim. The simple truth is people aren't upgrading at the feverish rate that they used to, back when PC gamers had exclusives like HL2, Doom 3, Far Cry and UT2k4 to look forward to. It's not just that the 5870 is "good enough", because not a lot of people will be buying that card either, as it's still in the enthusiast price range. Rather, it's more because there aren't many upcoming bleeding-edge, AAA PC exclusives to galvanize sales.
Besides, this is the first time NVIDIA's been late since 2002. I think they'll survive this generation.
DFI LANParty DK 790FX-B
Phenom II X4 955 BE (1003GPMW) @ 3.8GHz (19x200) w/1.36v
-cooling: Scythe Mugen 2 + AC MX-2
XFX ATI Radeon HD 5870 1024MB
8GB PC2-6400 G.Skill @ 800MHz (1:2) 5-5-5-15 w/1.8v
Seagate 1TB 7200.11 Barracuda
Corsair HX620W
Support PC gaming. Don't pirate games.
Totally agree about at most a handful of wafers and a couple hundred cards by Dec 31/09.
Too many people are idealistic and imagine that because they already have first silicon, it's a simple matter to produce boards. While that may be true (making boards is easy), it would not be a sellable product.
Assuming perfectly working A1 silicon, you don't know it's working perfectly until after 3-4 months of extensive testing. Remember, it's not as simple as testing a new Coke bottle design or a desk lamp. If customers will indeed be running scientific applications, all math functionality must be tested precisely. And let's not forget compatibility with hundreds of monitors, games, and engineering and graphics applications. Although some of this can be "dismissed" and fixed in a driver later on.
So forgive me if, when I see somebody post "board Nov12 - confirmed", I roll over in laughter.
[EDIT:
To add to what Cybercat said, many people **only** have 20" 1680x1050 displays. Even an 8800GT is fine for that with most games. Most people are not crazy obsessive MUST-have-8xAA types... very, very few people even have 2560x1600. Last gen (GTX275/4890) already pushes 50+ at 19x10 in virtually every game WITH AA. And it's a lot simpler and cheaper to disable a few quality settings than to wait and spend $$$ just to play a $49 game.]
Last edited by ***Deimos***; 10-03-2009 at 07:58 PM.
“The first Fermi GPUs are expected to launch by year’s end,” stressed Mr. Alibrandi.
What exactly is meant by "launch"? And then there is the word expected tossed in. Leaves a lot of wiggle room for Nvidia IMO. When Nvidia gives an actual launch/end NDA date, then we can all sit back and expect it. Until then, no one really knows. People are understandably skeptical after the mock up stunt Nvidia pulled.
I think people tend to forget that Nvidia has a share price to nurse, and to take all "expected" dates with that in mind... Delays = share-price fall, so as a company they have to spin news as much as they legally can to keep investor confidence up. They may launch some on December 31 for posterity, but I think January or February is realistically when we'll see quantity.
" Business is Binary, your either a 1 or a 0, alive or dead." - Gary Winston ^^
Asus rampage III formula,i7 980xm, H70, Silverstone Ft02, Gigabyte Windforce 580 GTX SLI, Corsair AX1200, intel x-25m 160gb, 2 x OCZ vertex 2 180gb, hp zr30w, 12gb corsair vengeance
Rig 2
i7 980x, h70, Antec Lanboy Air, Samsung md230x3, Sapphire 6970 Xfired, Antec ax1200w, x-25m 160gb, 2 x OCZ vertex 2 180gb, 12gb Corsair Vengeance, MSI Big Bang Xpower
Probably they meant a soft launch. I don't expect anything special from nvidia this year.
"Cast off your fear. Look forward. Never stand still, retreat and you will age. Hesitate and you will die. SHOUT! My name is…"
//James
Originally Posted by Chumbucket843
If, in trying to defend the defenseless, nvidia fanboys didn't make an absurd amount of absurd posts, the threads would be smaller.
I find it funny how quickly they went from zealots to emo: "leave nvidia alone", "ATI fans are bad", "I won't post more". It kind of reminds me of this:
http://www.youtube.com/watch?v=kHmvkRoEowc
Oh man, I hate when things turn into a discussion of "why is ATi right and NVIDIA not? You're ATi... You're NVIDIA... you only say that out of hatred... you only say that out of love"... it makes keeping the thread on track so difficult...
I'd like to imagine for a moment that those names belong to a pair of companies that make products for us to buy, like Philips, LG, Woxter, or Sony... not to charismatic heroes of some epic world.
But that comparison is not really a useful one, because they are not comparing the same application/code on different hardware, but different code on different hardware (note that the ATI cards are running ATi Stream code while the NVIDIA cards are running CUDA code, so the code being run is different). That's like comparing Windows loading times on two SSDs (A and B) with Windows XP on SSD A and Windows Vista on SSD B.
They are testing the HD5870, which is the card under review, and using the most powerful NVIDIA solution as the comparison. So what exactly is the point of including the RV770? As for the GT300 decimating RV870 in double precision, we will see; that's too much fortune-telling for me. On the other hand, I'm not all that interested in the applications that benefit from double precision, since domestic applications, and the kind of professional applications that may be useful to me, shouldn't rely much on double-precision floats.
What does ATI being better or worse have to do with this thread? Keep it on topic, please.
CPU: Intel 2500k (4.8ghz)
Mobo: Asus P8P67 PRO
GPU: HIS 6950 flashed to Asus 6970 (1000/1400) under water
Sound: Corsair SP2500 with X-Fi
Storage: Intel X-25M g2 160GB + 1x1TB f1
Case: Silverstone Raven RV02
PSU: Corsair HX850
Cooling: Custom loop: EK Supreme HF, EK 6970
Screens: BenQ XL2410T 120hz
Help for Heroes
Why don't we just bash nVIDIA 'til they finally release something?
Coding 24/7... Limited forums/PMs time.
-Justice isn't blind, Justice is ashamed.
Many thanks to: Sue Wu, Yiwen Lin, Steven Kuo, Crystal Chen, Vivian Lien, Joe Chan, Sascha Krohn, Joe James, Dan Snyder, Amy Deng, Jack Peterson, Hank Peng, Mafalda Cogliani, Olivia Lee, Marta Piccoli, Mike Clements, Alex Ruedinger, Oliver Baltuch, Korinna Dieck, Steffen Eisentein, Francois Piednoel, Tanja Markovic, Cyril Pelupessy (R.I.P.), Juan J. Guerrero
Anyone see this? I don't know if it's real or not.
http://forums.overclockers.co.uk/sho...&postcount=112
Bring... bring the amber lamps.
Very nice post, thx, and also thx for bringing this thread back on topic!!
And yes, I agree, I wouldn't be surprised if nvidia launches a cut-down GPU card first. All they need right now is to beat a 5870 and a possible overclocked RV870, aka 5890... if they don't need all SPs for that, they probably won't enable all SPs initially.
Oh really? Hmmm, thx, I didn't know that...
I'm confused though: what do cards use the slot power for, then?
The memory PWM? But 75W for memory, isn't that way too much?
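For context on that 75W figure: the PCIe spec caps what an x16 slot can deliver at 75 W, with 6-pin and 8-pin auxiliary connectors adding 75 W and 150 W respectively. A quick budget sketch (spec limits only, not measured draws; how a board splits the slot's 75 W between memory, PWM losses, and GPU rails is up to the board design):

```python
# PCIe power-budget arithmetic using spec connector limits.
SLOT_W = 75        # PCIe x16 slot limit
SIX_PIN_W = 75     # 6-pin PEG connector limit
EIGHT_PIN_W = 150  # 8-pin PEG connector limit

def board_budget(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a given connector loadout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget())                           # slot only: 75 W
print(board_budget(six_pins=2))                 # 6+6 pin board: 225 W
print(board_budget(six_pins=1, eight_pins=1))   # 6+8 pin board: 300 W
```

So the slot's 75 W isn't earmarked for memory alone; it is simply one of several supply paths the board's regulators can draw from.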