Well I won't be against nVidia if you buy performance, but those fanboys need to get off their high horses and reality distortion fields.
If you are saying that you prefer ATI products to Nvidia, then you have every right to, and no one has the right to argue with that. I like ATI products as well, and personally I think their tech has always been as good or better.
If you won't buy Nvidia because they are a business that is in it to grow and make money for themselves as well as their shareholders, then there can't be too many things you will buy.
Fact is, Nvidia has been a much better run company for years, and if you ask me who has the better long-term outlook, I'd bet on Nvidia. (I could definitely be wrong; only the future can answer that question.)
As for favorites, I don't have any when it comes to money.
I speculated Nvidia would take a tumble; that's why I am still holding a few put options that cost $0.20 and are now at a $2.15 bid.
I am also holding AMD calls that I paid $0.40 for and that are currently at a $0.38 bid, because I think they will beat expectations this quarter.
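For context, here is a rough sketch of what those two positions work out to. This assumes the standard 100-share contract multiplier and a single contract, and ignores commissions and bid/ask spread; the function name and contract counts are my own illustration, not anything from the post.

```python
def option_return(cost_per_share, bid_per_share, contracts=1, shares_per_contract=100):
    """Paper profit/loss on an option position, ignoring commissions and spreads."""
    paid = cost_per_share * shares_per_contract * contracts
    value = bid_per_share * shares_per_contract * contracts
    return value - paid, (value - paid) / paid * 100.0  # dollars, percent

# Nvidia puts: bought at $0.20/share, now bid at $2.15
pnl, pct = option_return(0.20, 2.15)
print(f"puts: ${pnl:.2f} per contract, {pct:.0f}%")   # puts: $195.00 per contract, 975%

# AMD calls: bought at $0.40/share, currently bid at $0.38
pnl, pct = option_return(0.40, 0.38)
print(f"calls: ${pnl:.2f} per contract, {pct:.0f}%")  # calls: $-2.00 per contract, -5%
```

In other words, the puts are up roughly tenfold on paper while the calls are down slightly, which is why holding both sides isn't contradictory.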
But all this "I won't buy because Oooo00oo0 they are only trying to make money" is nonsense. As if AMD gives a sh*t about you or anyone else.
Sounds like INQ talking points to me. Opinions, not facts. Don't care about Vista drivers, forced companies to remove extra features, lies about mobile GPU issues, etc. Claiming that they lied on these issues is an assumption; you can't PROVE it one way or the other.
It's the INQ's word vs. Nvidia's, the game developers', and the mobo makers'.
I skipped the 780i when it first came out because I knew 790i was down the pipe. Was not sudden at all.
$50 more for 10% performance (550 to 600)? That is linear price scaling! We never get that. I will concede that 600 bucks is not impressive, but by your argument it is. :p:
Good concrete reasons to dislike Nvidia would be if you owned a mobile GPU and it failed on you, or if you no longer trust future mobile GPUs for fear of a repeat incident. Also, maybe you prefer ATi because of the price-to-performance ratio they offer, or because ATi offers the fastest graphics solution on the planet. Maybe you prefer their driver support.
As for me, I lean Nvidia because I enjoy their driver support. I have had good luck with both the 7XXX and 8XXX series. Also, I don't trust ATi's drivers. I am currently playing Warhammer Online, and ATi users are having a hell of a time with that game; in my opinion the reason is both ATi driver support and a lack of engine optimization. The game has been out for a month and there is still no driver with support for Warhammer.
If ATi had better support for the games I play, like Warhammer, I might be much more apt to make a pair of 4870X2s the graphics muscle for my next rig. I also want to see better overall scaling in quad CF.
No problems with Warhammer on the Spider platforms in my sig, so I'm not sure where you got that.
You do know how much money Jen-Hsun and other executives made by taking advantage of automatic stock selling before revealing the mobile GPU fiasco, no?
Newer Geforce 7 drivers reduced performance quite dramatically. No, Catalysts don't do that.
@G0ldBr1ck
Gamespot's benchmarks reveal that the 4870X2 is top dog in WAR (at least whenever transparency AA is on), and SLI is borked. So... :D
http://www.gamespot.com/features/6197926/p-4.html
Can't argue this one.
It's the same as when Orton announced great things, the stock shot up, he sold, and then ATI announced a $400 million write-down, product delays, etc., and ATI's stock plummeted.
AIG gets over $80 billion in a bailout, and then the executives take a California retreat to the tune of $400k of taxpayers' money.
Good old Hector receiving millions in salary, bonuses, and stock options, all while AMD is losing hundreds of millions of dollars every quarter.
Again, you make these things sound like they are unique to Nvidia.
And I want to say, honestly, this isn't a personal attack. If we were talking tech, I'd be lost; I know less than 99% of the people that frequent this site.
But I spend twice as much time reading market reports, etc., as I do on tech sites, so I feel I know as much as the majority (but definitely not everyone) of the people that frequent this site when it comes to business.
You don't like Nvidia; we get it.
But quit trying to say Nvidia is the only one guilty of these things, when ATI has been accused of the same, as have many, many other companies, in the past and each and every year.
Actually, Gamespot had an article about the 4870X2 and used games that favor Nvidia hardware. The 4870X2 only shined in GRID, and it was a pretty bad review nonetheless. Gamespot = paid review scores for games, bad at reviewing anything else.
Well, that's good. I have a good friend who runs Warhammer with an E8500 and a 4850 and has stutter issues. Also, I see a lot of complaints about stutter on all ranges of ATi cards.
That Gamespot review... how in the heck can Gamespot benchmark an MMO anyway? Far too many variables involved, and they produced this benchmark by doing a 100-second walking test in a highly populated area. There is no way you could hope to get consistent results with such a short test and random people running around.
I am sorry, but those numbers look totally random. The GTX 280 is top of the list without transparency AA, but on the same section it scores 10 fps lower with SLI. There isn't even an SLI profile for the game, so there should be zero performance difference. In other words, from what I can tell, the game was not using SLI in either test Gamespot ran, yet each test yielded a totally different result.
SLI is not broken in Warhammer; it doesn't exist. As far as I know, neither does CF support. Which is another reason why that Gamespot review is highly suspect, especially because it was posted 3 weeks ago.
A quick Google search on the subject found me this quote from one guy's experience, posted on Oct 3rd.
Basically, what I think he did is the same thing that I do. I have SLIAA forced in nHancer as a means to take advantage of my second card, since there is no SLI profile yet. Works quite well. Quote:
Warhammer Online
All settings max @1920x1080 w/ hardware forced 8xAA 16xAF
First let me convey my frustration at how poorly this game is optimized for any graphics card, ATI or Nvidia. It's terrible!! First of all, they have a 100 FPS frame limiter that's set up to divide by an integer, so you get something close to 100 FPS, 80 FPS, 60 FPS, etc. This greatly limits the frames already rendered, and as there is no software setting for this feature, you have to painstakingly turn it off with hardware tweaks (not easy). Warhammer does not easily take advantage of multi-GPU setups, and the only way you can get any improvement with a CrossFire or SLI setup is to turn up the hardware-based AA and AF (don't bother with transparency AA *it makes everything in the game transparent*). Also, there's a terrible memory leak in specular lighting that causes most of the problems within the game for most people. If you are playing this game... turn off specular lighting. Oh, don't think it's simple to turn off either... you can't just un-click the box; it won't allow that. You have to click the fast-performance preset, restart the game (or use /reload), and turn everything back up after that. Anyway, with some modifications to the game engine, the game does run at about 40 fps up to the cap of 100 fps with the above specifications at max (except specular lighting). The game needs a lot of optimization, and I really hope they fix it soon.
Also, you might notice that transparency AA is broken, so if you run it, you are not rendering the full scene.
I have had no issues with the memory leak that he describes.
http://www.overclockers.ru/images/ne...1/gt206_01.jpg
http://www.overclockers.ru/images/ne...1/gt206_04.jpg
GT206 - 484 mm^2
GT200 - 576 mm^2
Source
Doh, who cracked the chip?
Anyway, why sit here and bash NV? I did not see that in the thread title.
It's like, no, we don't want cheaper, faster, cooler-running cards.
Waaaaa, only a refresh, lol. Yup, just like the 68GS and the 320/640 A3 G80s, and they ran circles around their top-of-the-line brothers.
I expect the same here with the GT200. What's so bad about that?
So if it really is GT206, aka GT200B, then it seems to have a die size like the G80 at 90nm, so it is still pretty big. A 40nm GT212 should be about 250-270 mm² if the specs stay the same.
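A back-of-the-envelope check on those die sizes. This is only the ideal optical-shrink math (area scales with the square of the feature-size ratio); real shrinks always come out larger because I/O pads and analog blocks don't scale with the logic, which is exactly what the 484 mm² figure shows.

```python
def ideal_shrink(area_mm2, old_nm, new_nm):
    """Ideal die area after an optical shrink: area scales with the
    square of the new-to-old feature-size ratio."""
    return area_mm2 * (new_nm / old_nm) ** 2

# GT200 at 65nm is 576 mm^2; a perfect 55nm shrink would give:
print(round(ideal_shrink(576, 65, 55)))  # 412 mm^2 -- GT206's reported 484 mm^2 is larger, as real shrinks are
# and a hypothetical straight 40nm shrink:
print(round(ideal_shrink(576, 65, 40)))  # 218 mm^2 -- so 250-270 mm^2 for a 40nm GT212 is in the right ballpark
```

So both the 484 mm² GT206 figure and the 250-270 mm² GT212 guess are consistent with an imperfect shrink of the same design.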
Generally “targeted” (apologies for the word “targeted,” but I couldn’t find a more suitable word; not a native English speaker, OK).
There’s a difference (amongst other things) between bashing and criticizing; the criticism NVIDIA has received has most of the time been more than well founded. To use an American expression here (despite being European): what goes around comes around.
I don’t dislike the product, but I have the sense (not to suggest that you or anyone else would not, nor to suggest that I am somehow smarter or better; I am not) to understand the difference between company “policy” (which is severely lacking) and the product, and the engineers and others spending hours putting the card together and optimizing drivers.
Now, if you cannot do that, hopefully in time you will, but fanboys never seem to have a sober mind when it comes to NVIDIA, or ATI for that matter; same mentality, different product. As I mentioned in my previous post: they are all insane, but since they are all insane, they don’t know it. It’s not sane to identify yourself with a product, or a team, or whatever. I identify myself as a human, not a product, be it red or green, nor do I take personal offence if someone attacks me for owning an ATI card or an NVIDIA card. How could I possibly do that?!? Am I the darn card or what? Of course not. You could have a psychiatrist whose patient stands up and smacks him in the face; is he to take offence, or to understand the disorder? Well? He is to understand the disorder. This was, of course, a metaphor, and if you don’t understand what I just conveyed to you, bin it and move on.
Look, I’ve owned many cards from both NVIDIA and ATI (many of us most likely have). All the cards have had their pros and cons, but stating that card A is better than card B is as daft (no offence) as asking “how deep is a pit?” and expecting an answer.
Cheers
Edit:
At the end of the day, this is a hobby and an interest we all share, right? And as such it should be fun, and we should try to help each other out whilst sharing ideas and so on. This you-vs.-me mentality really bums me out; there's enough of that crap on this planet as it is.
I can't take an article which says "NV is in deep doo doo" seriously.
I hate the Inquirer.
Oh, and they are making it all up. How professional. They write their articles like a bunch of 12-year-olds.
For all the accusations, you present very little proof. A lot of the negative comments about how their business is run apply to ATI as well; ATI has committed the same acts at one time or another. So are you saying neither AMD nor ATI has ever lied? That's a crock. Lies, lies, lies. AMD's CEO was getting paid more than Intel's CEO; look at the financial statements of both companies. How can you explain that?
There were a lot of shady things going on with the ATI purchase; there is a reason why AMD overpaid for ATI. That purchase price in turn hurt AMD quite badly, as we all know. It cost a lot of money and a lot of jobs, so a few guys could have way more money in their pockets.
Sure, the GTX 280 was overpriced for the performance, but considering the cost of the actual hardware to the company, the price made sense. The GTX 280 was an expensive chip to make (bad yields, huge chip, expensive PCB). Compare this to the 4870 (good yields, smaller chip, simpler PCB, simpler design, fewer transistors). I wouldn't doubt that the 4870 X2 costs about as much to make as the GTX 280.
Even people like Charlie are saying the GTX 280 is sold at a loss at current prices. What this means is that although the price was high in the first place, in terms of cost to the company, the original price was justified. They would only be scalping the chip as severely as you say if the production costs were similar to the 4870's and they were pocketing the same profit as on the 4870, plus $300 (or $100 in the case of the GTX 260). If you want to look at scalping, look at Intel. They have been doing it for years. Their $1500 processors make anything from NV or AMD look like a bargain.
The 790i was released about 4 months after the 780i, and was released more to compete with the X48 chipset, which had an even shorter gap after the X38. At least the 790i added more features than the X48 did; the X48 is pretty much identical to the X38, people, the chip is more or less a cherry-picked X38. How about crooked ATI reviews at Anandtech? E.g., a lot of the early Anandtech previews of AMD chipsets show glowing, promising results, something that rivals Intel and beats their chipsets. Then, when the retail product hits, all that performance and those incredible overclocks disappear.
The other one did it too, so it's OK if they're doing it now? These are people that bite on marketing gimmicks like PhysX and CUDA when there's no compelling reason at present. Let NV be on top again so they can screw you more; it's obvious you like it.
Well said! That could not have been put any better. I have an 11-year-old nephew who wouldn't write like that. Those who have kids and know about the "terrible 2s" obviously already know it's really the "terrible 4s". When I read what you wrote, that's exactly what it (the article) reminded me of... a 4-year-old kid in a store wanting that candy and pitching a tantrum after you have to say no, grab it, and put it back. Seriously. :ROTF:
Man, you nailed it! We could reduce this thread to 2 posts, the OP and yours, and it would be summed up completely.
Ah, you shouldn't take the "made-up" part seriously.
The INQ likes to screw Nvidia and wants to keep quiet about its informants.
No, I love how the INQ said "deep doo doo". It cracked me up!
You guys are taking everything so seriously; I wonder if you have even laughed in the past month or so!
^Spoken by the biggest ATI fanboy who ever lived.^
You crap up every single Nvidia thread you can find with all your ATI fanboyism.
News flash... nobody gives a .... This is an Nvidia thread.
I can't believe you haven't gotten a vacation yet.
Attachment 86835
lol @ troll spray :D
Hehe, so G92b became 9800GTX+, but when GT200 shrinks it becomes a new model, GTX 270/290? *hint hint* ;)
//Andreas
Guys... the next person who so much as drops the fan-word gets dropped like the dumb blonde in just about any horror movie. I mean, the only reason everyone gets bent out of shape here is because there are too many squares in the circle.
Anyway, a lot of it is INQ BS (funny how we all knew it was the INQ WAY before we even saw the source), but I'd be shocked if Nvidia only got another 50 MHz or so out of this thing. Look how high the 280s and 260s clock on 65nm with pretty much no effort at all. Also, I agree with Delph: plain-jane shrinks from Nvidia have been getting the "b" suffix, while shrinks with optimizations have gotten totally new codenames (G80 -> G92, which DID bring some changes, although nothing big).
Either way, I'm just going to put it like this... Haven't ANY of you learned, after the past generation of graphics cards, not to waste your time fighting over predictions? First, G80 utterly blindsided the vast majority of the people here, while you guys argued that it would only be 48 pipes (and not even unified). Then everyone thought the R600 would be the best card since sliced bread (we all saw how that turned out). Then that the GTX 280 would crush all (it did actually come in at what was expected, ~2x the G80). And then the RV770 showed us the definition of bang for buck (and the R700 reminded us that ATi WILL keep pricing high on the high end for as long as they can get away with it, just like Nvidia). It's one of the reasons I've for the most part stopped bothering to discuss them, but I still watch them, as even when you have the greatest sources possible, there's always info that changes between pre-release and release.