MSI showed the Big Bang Trinergy preproduction models at CeBIT last March:
http://images.hardware.info/news/cebit-day2-23.jpg
link
http://www.tcmagazine.com/images/new...C_board_01.jpg
link
I remember when Nvidia was the new underdog and 3Dfx was the evil one. Now Nvidia is the biggest, so it becomes the evil one and ATI is the saint.
If Nvidia ever dies and ATI becomes the new evil, then we do it all over again.
Humanity is so lame that I am beginning to hate our race.
I wish some alien race would come in and kill us all... then again, I doubt any intelligent life out there wants to get remotely close to us human scum.
Damn, I am in a good mood today... now I just need to join the South Park goth kids!
Anything that crushes the other thing will be automatically declared evil. Same goes for AMD/Intel and aliens/humans (see District 9, nice movie).
Humanity is passionate and emotional; it will retaliate if it feels something unfair is going on with respect to its morals. That's only the third-person perspective, though: people who work for Nvidia will think Nvidia is correct in whatever it does, and the same is true for every other company out there, as long as the company's actions don't go against the wishes of that individual.
Now, to become goth all you need to do is get bitten by a vampire (ask New Moon's actor).
Yep.
The whole thing just gets annoying after you have been in the industry for so long... Tandy CoCo, Pong, the beloved C64, etc.
We have seen it all, and we have seen it happen so many times that it is just annoying, to say the least.
Between that and the flavor-of-the-month kids who can only recommend things that came out last week... to quote Cartman, "you are breaking my balls, man"!... lol
I don't think anyone said Nvidia was evil or that ATI was a saint. The difference is that if, and I say if, this is true, it's just another example of a large company screwing the consumer. If ATI did this, then the thread would be about them and not Nvidia.

The reality is that ATI has released two generations of cards that have not been overpriced, unlike Nvidia. If Nvidia really wanted SLI to be adopted, they wouldn't be charging for it and would have gladly allowed it on Intel chipsets. AMD allowed its arch rival Intel to include CF support on the last few generations of chipsets because it wanted CF to be adopted. Nvidia, on the other hand, wanted to have their cake and eat it too. Sometimes it's better to compromise if you want to get ahead, and that's what ATI is doing. If Nvidia's chipsets were so great, they would have allowed SLI on X48 and P45 motherboards and openly competed with their 680i and 790i chipsets, but they wanted to keep that feature to themselves.

If they want folks to buy their GPUs, they should be more competitive on price and stop with the negative PR campaigns, like the one when ATI released the HD 5800 series, or have a competitive model on the market. Instead they showed off a non-working GT300 and stated it would be delivered this year; not going to happen. Then they denounced DX11, only to come out and support it later. There is a reason 3DFX went away: they sat on their laurels while Nvidia took their market from them, and now it looks like Nvidia is doing the same thing.
Is it really that, or is it just that people's perspective is once again changing to get rid of the new evil? Face it, it is always the same thing.
Quote:
There is a reason that 3DFX went away and that's the fact that they sat on their laurels while Nvidia took their market from them and now it looks like Nvidia is doing the same thing.
What do all evil companies have in common? Yeah, they are all at the top. Coincidence? I think not.
I agree very much. Besides, I thought one of the goals of business was to make customers happy, because a happy customer is a returning customer, which is the foundation of any great business. I think what happens is that some companies get a little carried away and paranoid in the board room. They become a little separated from reality in their climb to the top and a little callous in their actions.
Again, no one said Nvidia was evil. What has been stated is that they are sitting on their laurels and, rather than compete, they are hoarding. And I wouldn't really consider them on top, but I do not have the numbers in front of me, so maybe they are. IMHO they just trade blows with ATI in the GPU market, and since they will not be producing chipsets anymore, I am not sure what they have left.
Here are the facts: ATI has released a new-gen card that is faster than the previous generation, adds new features, and is priced competitively. Nvidia, on the other hand, is 4-5 months out from releasing their card even though they stated it would be out in December, and they showed off a mock card to everyone ("Did we say that was Fermi? Oops, our bad, that's just what it will look like"). In the last two months they have done everything they can to keep people from grabbing a new-gen ATI card; this by itself is not bad, except for how they are going about it. First they say DX11 is not needed and start renaming cards once again. Then they say they have a working model, but it turns out to be one of their Quadro cards, "but the performance should be similar". Then they say the GT300 will be out by December; now it looks like it won't be till March or April, and it will support DX11. They are basically stealing AMD's CPU playbook on "why we cannot compete, but you shouldn't buy Intel".
I couldn't care less who makes the best chip or GPU; I have no brand loyalty. But I do live in the real world, and when companies employ anti-competitive practices they should get called out on it, whether it is Nvidia, ATI, Intel, AMD, Microsoft or Apple. It doesn't matter who does it; it's wrong. Does it make them evil? Not at all, but it sure doesn't make them look consumer friendly.
And yet another story about this:
http://www.tweaktown.com/news/13433/...hip/index.html
Seems Nvidia may try to block it via drivers. Really? Really? Do they not realize they are just shooting themselves in the foot?
Nvidia grew big and now must think it's invincible. I am a long-time Nvidia user; I've bought their top model of each series since the Ti 4800 days. If Nvidia really pulls this stunt and doesn't let MSI launch the Hydra board, and/or blocks the chip in the GPU drivers, they are really stupid. What can possibly drive them to do such a thing? Do they really believe that by doing that they will force the WORLD to use SLI instead? SLI has its problems and limitations. CF has its problems and limitations, and I am sure Hydra will have its problems and limitations as well. But the consumers must be the ones to judge and choose.
If the Hydra chip is as good as or better than SLI or CF, let users have it. It might even stimulate them to buy more GPUs... why don't they think like that?
Lucid is a very small company compared to Nvidia. Instead of blocking the Lucid chip, put your engineers together and build a better one, Nvidia!
Looks like Charlie is trying to get hits again. I'm gonna have to ask around and see if this is actually true...
I just think that nvidia may be taking things a bit too far...
"Everybody is evil, you only become evil when someone is able to point out the fact that you are"
If this is true, it's game over for Nvidia, for me and most probably for everyone else. I have been a hard Nvidia user since the dawn of time; I actually went from Voodoo to Nvidia, which gives you some perspective on how long I have been using them. If they block Hydra functionality with their cards, I will have no choice but to go to a competitor, be it ATI or eventually Intel.
I have a hard time believing that someone with a single 8800GT is all of a sudden mortified that Nvidia would block access to an unproven multi-GPU solution ;)
Nvidia isn't the bad guy here. :shakes: What if this Lucid Hydra chip has potential compatibility problems with Nvidia cards? :( They don't do any hardware testing for Lucid-based configurations. Therefore they must disable such a configuration to prevent any unnecessary problems that good customers might potentially have. :) Don't you see now? Nvidia is protecting us here. The Lucid chip could potentially be a major time-consuming and costly problem for consumers, especially when we have such a SOLID AND PROVEN technology like SLI. ;) Nvidia is only making sure that we don't have to worry about these kinds of problems, and keeping the price of GPUs down by avoiding unnecessary extra hardware support. :D You should be thanking them for caring about the quality of multi-GPU gaming and retail prices! :clap:
This is just like why PhysX shouldn't be able to run alongside ATI hardware rendering. There are just too many issues that could potentially hurt and cost us in the long run. :up:
I hope and expect Nvidia to do this. The free PR spin from the Nvidia fans on every forum is going to provide a lot of entertainment.
Do you work for Nvidia? :rofl:
What you just wrote is a crock.
I've been thanking Nvidia ever since I used a 680i chipset. :rofl: Thankfully I got a full refund after 2 dead boards. :D
Most of us saw how much better SLI is on an Intel chipset, even better when it was hacked.
I'm waiting on Hydra, but I really don't care if it works on an Nvidia card, because I really doubt I'll ever buy another one.
How about this for a clue: I have stuck with a single GPU and will continue to until something better comes along than this garbage SLI and Xfire. I have attempted to use both and can't stand either one. Hydra was that wish, and I am an Nvidia chipset user, but not anymore if this is true. It's one thing to have an extreme system by giving in and paying twice the money for a percentage more performance with SLI/Xfire; it's another when it's visually intolerable to play. Maybe some can't catch the microstutter, but I do, and sometimes getting a 15% gain or less is just ridiculous to me. Not to mention that not every game supports it. But none of this is the case with Hydra... so Lucid says, anyway. (<--- gotta be careful and say that, the "there are no facts about Hydra, you don't know that" monsters might eat me ;))
Heads up: you don't know everything about a person from a little bit of info in their sig. I haven't updated that thing in ages anyway, though I still use this 8800GT... in one tower, and you know what, I'm freaking proud of it. It's an amazing piece of chip, even now.
If any of you worked for nVidia and had the final decision on this, you would most likely go ahead with blocking it in drivers too.
For the simple reason that buggy Lucid drivers could completely destroy the reputation SLI has built; the same goes for AMD, who would probably do the same.
You've got no control over whether a third party uses your technology properly (even though it's different tech) or ends up breaking it. nVidia has invested millions in SLI and isn't going to let some newbie company destroy its reputation. Anyone with a grain of business sense will agree, and would most likely come to the same conclusion if working for said company.
I agree (partially) with this guy :
What do you mean?
Wasn't there some info on the net saying Lucid were waiting on an updated driver to allow cross-vendor support? So the delay is probably that... anything other than that is a fanboy comment or assumption, unless they have insider info to share with us.
People actually talk like this on forums I frequent. XS is by far the tamest and has the fewest of the foaming-at-the-mouth type. Hardforum has the worst sheeple spinning free PR for Nvidia that I've seen... but for me at least that's why I love going there to mess with them.
Is it just me, or is it mostly pro-Nvidia people who have more animosity towards the Lucid Hydra chip? Hey, if it's vapor then it's vaporware, *shrug*; nobody else is stepping up to improve SLI/Xfire.
FUD: Both MSI's Big Bang motherboards are real
You know you wanna click it! :D
Bottom line: Q1 2010.
@Decami: So you run a single low-end GPU because SLI and Xfire are crap? Yet you believe that this magical chip, on which you have zero information, will be more compatible and scale better because they said so. Did I miss Lucid's brainwashing?
I can't wait to see the support model for this. When something's broke and it don't look good who you gonna call? Ghostbu....oh wait....:shakes:
Well, you seem to be a big fan of Hydra; do you have any tests to show us, or just all the hype they've shoveled at us the last few years?
I can't imagine the support for this would be any good if you did have issues with it.
Or they'll just make us pay more for a motherboard that has this feature, only to turn it off because it's broken.
I hope you guys do realize that this entire discussion is based on the premise that Charlie's nonsense is accurate? Nonsense that has already been debunked by MSI themselves. He must be very entertained by now.
Actually, this conversation is based on Charlie's comments and another source stating Nvidia will be blocking the use of Hydra with their GPUs at the driver level. While MSI might say it is still coming out, the point was that they would let it quiet down and be forgotten before dropping it; that way there would be no fuss about it. Until I see a board on store shelves with the Hydra chip, I won't believe it will make it to market. It's simple economics, and in this case Nvidia will be hurt by the Hydra chip coming to market, at least in their eyes. If they could see beyond SLI, they would realize that Hydra would actually help them sell more GPUs because it would broaden the upgrade path for a lot of people. No longer would people have to wait out a generation to be able to afford an upgrade; they could keep their current GPU, add the new generation, and still benefit from the increase in GPU power. Currently, a lot of folks skip upgrading each generation due to cost and the small increase in performance, usually about 15% over the previous generation. However, if you could keep your current GPU, add a new-generation GPU, and get a 50% or better performance increase, then it might be worth it.
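To put rough numbers on that upgrade argument, here is a back-of-the-envelope sketch. Everything in it is an assumption for illustration (the 15% generational gain comes from the post above; the 60% scaling efficiency for the pooled old card is invented, not a Lucid figure):

```python
# Hypothetical comparison: replace the old GPU outright vs. keeping it
# alongside a new one via a Hydra-style pooling chip.
# All numbers are illustrative assumptions, not measured figures.

def combined_throughput(old, new, scaling_efficiency):
    """Pooled throughput if the slower card contributes at some efficiency."""
    return new + old * scaling_efficiency

old_gpu = 100.0                      # relative performance of the current card
new_gpu = old_gpu * 1.15             # next gen ~15% faster, per the post

replace_gain = (new_gpu - old_gpu) / old_gpu         # plain swap-out upgrade
pooled = combined_throughput(old_gpu, new_gpu, 0.6)  # assume 60% scaling
pooled_gain = (pooled - old_gpu) / old_gpu

print(f"replace only: +{replace_gain:.0%}")   # +15%
print(f"keep + add:   +{pooled_gain:.0%}")    # +75% under these assumptions
```

Even if the pooled card only contributed at half efficiency, the combined jump would dwarf a straight generational swap, which is the whole appeal the post describes.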
Oh, whatever... we know... we know... we have no information about the chip, and Lucid is completely lying and is probably a made-up company anyway, and those recent videos are CGI developed by ILM and funded by Intel. When will this question stop? Some of us want this to actually happen. Is that a bad thing? To be honest, your argument is nothing either if you don't think the chip is going to succeed, because you don't have proof that it won't. That question works both ways. If you wanna call it bull because you don't wanna get your hopes up, then fine, but don't try to push that crap down my throat.
And why do you keep mentioning what I run? You have no clue what I run, and it would make no difference to my opinions anyway.
I could be running on-board graphics and it wouldn't change the basis of my opinions, and neither would it for you.
You know what I love: how you're questioning Hydra and saying there are no facts to back it up, building your whole argument around doubting something, yet you look down at my sig and take it for complete and utter fact, like you have been to my house. Strange contradictions, I do say so myself.
No it's not a bad thing to want it to happen. Of course not. But if people go around making claims that it "will" happen while dissing the current solutions they should be prepared to get called on it and defend their opinion. It would also be easier if you could back up your faith with some technical reasoning. The people challenging Hydra are doing so on technical grounds, not because they hate progress or don't want a better solution.
The hardware you run is an indicator of whether you're in Lucid's target market. Multi-GPU is primarily a performance enhancing option. If you're happy with a single 8800gt then I don't see how Hydra's benefits would apply.
Well if you are still hanging on to your premise that hydra is technically impossible, then it doesn't matter if Charlie's article is right or wrong because the product will never materialize anyway. Is that still the case, that you think hydra is impossible to implement?
1 - Nobody is "dissing" SLI or CF; it's just that Lucid on paper is supposed to be better.
2 - Nope, not one person discussing whether or not Nvidia is blocking Hydra on an MSI board has brought up that Hydra is technically impossible or whatnot.
3 - Reread my post. Hydra would actually allow someone with an 8800GT to upgrade to the next generation while still using the 8800GT, which is why EVERYONE WANTS IT!!!!! Hydra would take away the limitations that SLI and CF place on users, not to mention you could use an Nvidia card with an ATI card.
Seriously, this discussion was about whether or not Charlie was correct, and if so, then what? Stay on subject. :shakes:
Hydra is currently having issues combining two current high-end GPUs (same model and card, not different GPUs); do you really think it can mix and match different GPUs? :D
I told you guys I know somebody who works for LucidLogix.
He's also a member of this very extreme forum, but he doesn't seem to want to post anything himself ( I wouldn't too, if I was working for them and was bound by NDA and risk losing my job :p: ).
Now if you guys are so eager to hear that it was nVIDIA who stopped MSI then I guess there's nothing that can be done.
Even if an MSI rep. posted something I'm quite sure lots of people wouldn't believe his words...
Anyway... keep it up...
This thread really suxxx ... :down:
Has this been posted yet? http://techpowerup.com/107704/NVIDIA...cid_Hydra.html
Are they reliable enough for you, tim?
Do you believe in the easter bunny too?
MSI already called him on his BS. Move along.
Quote:
Seriously, this discussion was on whether or not Charlie was correct and if so, then what? Stay on subject. :shakes:
You're kidding right? Their source is Overclock3D which got their info from SemiAccurate. See how Charlie's BS spreads and gets people all excited?
And finally somebody who knows about news circulation :D
It's kinda hard to say something is technically impossible when you don't know what that something is. But let's step back for a minute and see what people are getting so excited about. The current data path is:
Application -> DirectX -> IHV Driver -> IHV Hardware.
Now what I would like is for one of the believers to tell us where Lucid is going to inject itself in that stream in such a way that they could manipulate what commands get sent to the hardware. Note that it IS impossible for them to insert themselves between the driver and the hardware because that communication is completely proprietary.
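To make the question concrete, here is a toy sketch of the one non-proprietary option: a shim sitting at the API boundary, between the application and the vendor drivers. Everything here is invented for illustration (the class names, the round-robin policy); it says nothing about how Lucid actually works, only what "injecting into the stream" at that layer would look like in principle:

```python
# Toy illustration of API-level interception: a shim sits between the
# "application" and two stand-in vendor "drivers", splitting draw calls.
# All names and the load-balancing policy are invented for illustration.

class FakeDriver:
    """Stand-in for a vendor driver; just records what it was asked to draw."""
    def __init__(self, name):
        self.name = name
        self.draw_calls = []

    def draw(self, mesh):
        self.draw_calls.append(mesh)

class HydraLikeShim:
    """Intercepts 'API' calls the way a wrapper layer could intercept D3D
    calls, then distributes the work across the real drivers behind it."""
    def __init__(self, drivers):
        self.drivers = drivers
        self._next = 0

    def draw(self, mesh):
        # Simplest possible policy: round-robin each draw call to a driver.
        target = self.drivers[self._next]
        self._next = (self._next + 1) % len(self.drivers)
        target.draw(mesh)

gpu_a, gpu_b = FakeDriver("vendor A"), FakeDriver("vendor B")
api = HydraLikeShim([gpu_a, gpu_b])   # the application only ever sees the shim

for mesh in ["terrain", "characters", "sky", "ui"]:
    api.draw(mesh)

print(gpu_a.draw_calls)  # ['terrain', 'sky']
print(gpu_b.draw_calls)  # ['characters', 'ui']
```

Of course, the hard part this sketch skips entirely is recombining the results into one coherent frame, which is exactly where the skepticism in this thread is aimed.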
No, it shouldn't be reliable enough for anyone. That is just a rehash of what Charlie posted. The same goes for what is posted over at OC3D (the original source).
Unfortunately for people who "ran" with the story, MSI came out with a statement that debunks what everyone was so eagerly lapping up like lovestruck puppies.
Damn, that's a shame. I really thought TechPowerUp was a good source; well, it's gone from my RSS feeds.
From what I've heard, it had nothing to do with Nvidia or anything... it's just that the Lucid drivers didn't pass MSI's QA, so they delayed the product until they're sure it's working correctly. And from what I've seen in the past, the drivers were the biggest issue most people complained about when they encountered this Hydra chip.
Yeah, it's really annoying that people still read his site considering all of the biased crap he continues to post. And then people take his stuff, because it's so sensationalist, and post it around, and then he gets more and more hits, which ultimately achieves his goal: more traffic and bashing Nvidia.
Sure, we need to be skeptical about news reports/rumors of unreleased products. Especially one like this where the product is so uncertain that assigning blame on Nvidia for delaying it is uncalled for without a higher level of evidence.
We don't know how it supposedly works, so being skeptical is a reasonable response. I'm just saying that the other side of skepticism is that we don't know how it supposedly works, so how could we possibly have enough information to conclude that it likely won't work? How hard it would be to implement a product like this depends on a lot of factors, not least where in the instruction stream they intend to insert themselves. But we don't even know that much. So I'm not going to say such a product is likely or unlikely without a little more information.
We do have information. We know how DirectX and GPUs work. The guys who write engines for a living are asking the same questions and showing the same skepticism (see the Lucid thread at B3D).
Yeah, I'm highly interested in Lucidlogix's Hydra chip... :up:
Who isn't?
Though I'm not as cynical as most, because of the technology itself. Honestly, to think SLI/Xfire is the absolute best way is ignorant and laughable.
Ironically, I think THESE people thought so too... and decided to do something about it. I would assume they know more about motherboards, PCIe and DirectX calls than any of the thread-crappers here.
It works, unless you call the press release and demos fake?
So the only thing left to discuss is its performance, not Hydra's validity; that's a foregone conclusion at this point. The doom and gloom is overdone; the chips are real and on boards, and the drivers are being tweaked.
What is it exactly that you are trying to dismiss..??
Don't be so naive, xoulz. It's just a chip, and so what if it provides perfect scaling? That's not nearly as exciting as the rest of the hardware out there, especially with the speed increases of the next generations of GPUs; it makes the Hydra engine pretty insignificant. If you read trinibwoy's post above, you can see that people who write engines are skeptical about it. That's not a good sign coming from a forum like B3D; the people there probably know more than Lucid anyway. I don't understand how you can validate a product by a demo, either. What if it only works with UE3?
I'm gonna find it hilarious if Lucid finally does come out with the chip, but it doesn't perform as well as SLI and Crossfire, or introduces noticeable input lag or other problems.
Then we'll have endured all this hype for nothing, lol.
It's good to be skeptical. But questions about how it could possibly function don't necessarily mean that it can't. trinibwoy, you, and some others seem to want to use this as a reason to doubt that it could function at all, i.e. that it is a nonexistent product.
But inclusion in a future motherboard seems to indicate that the product is in fact real, though perhaps not functioning properly. So did Lucid trick MSI into designing a board around vaporware, or do you think that they might know more about it than we do?
What we have here is a FAIL to communicate.
I could write a story about how having a hydra chip swinging from my nutsack helps taking double headed dookies, and somebody would believe it.:rofl:
[QUOTE=trinibwoy;4098376]Do you believe in the easter bunny too?[/QUOTE]
Actually... no, but my point was that if it does what it is supposed to do, then it would mean the end of SLI and CF. They have done limited demos, so it isn't vaporware. Now the real question is whether Nvidia is trying to kill it, hence the delay, or whether it was crappy drivers, which Benchzowner has said is the real reason for the delay, per an inside source he knows. We don't know either way, and since I do not know Charlie or Benchzowner, I will take both with a grain of salt.
We will just have to wait till January to find out.
Ummm... you posted an article from TechPowerUp, which sources its information from Overclock3D.net, which in turn sourced its information from Semi-Accurate, aka Charlie Demerjian... the original source of the story.
Great job at reading your articles. :rofl:
I'm sorry, but Nvidia deserves to be bashed. They have done nothing but set back the graphics community for years, in fact from the time they bought 3Dfx nearly a decade ago. Anyone remember that 3dfx brought us multi-GPU boards with the Voodoo 5? Nvidia buried the technology and milked us (as any major corp should do, right?) for a decade. I'm not a fanboy of ATI (though I have used their cards for the better part of the last 5-6 years), but I simply despise what Nvidia has done and continues to do time and time again, just as the PhysX BS still continues to play out. While their cards are excellent, they have suppressed technology at the cost of the gaming community for years, and for that I say they ought to be bashed; not banished, just bashed. :)
Yes... and his name is Charlie Demerjian. (Yes... that's him in the suit)
http://www.theinquirer.net/img/1007/...jpg?1241331964
SOURCE: http://www.theinquirer.net/inquirer/...l-ceo-confirms
Yes... I owned 2 Monster Voodoo II graphics cards. And yes... they were worth every dime I spent on them.
Suppressed technology = technology that nVidia spent millions of dollars to develop.
Quote:
Im not a fanboy of ATI (though I have used their cards for the better part of the last 5-6 years) but I just simply despise what Nvidia has and continues to do time and time again just as the physX bs still continues to play out... While their cards are excellent they have surpressed technology at the cost of the gamming community for years and for that I say they ought to be bashed, not banished, just bashed :)
OoooOo... that evil corporation that simply wants to make a solid financial return on its big investment!! How dare they!!!
If you socialists (read: communists) had your way, no corporation would spend millions of dollars to develop a technology that makes its products substantially better than its competitors'.
Without profit being the goal of innovation, the graphics industry would be at least a decade behind where they currently are now.
Any of you down for playing a little DOOM 2?
WELL DONE.
If we ever meet, i owe you a beer. That was perfect placement. :D
So, you're siding with Charlie?
I dont like things that companies do, and i speak out against them. But bashing a company for things they haven't done is taking it too far...
Someone, quick grab the zipties! I've found one of charlie's minions!
I couldn't care less if this is true or not, but I am really sorry for the people who firmly believe that Charlie is in any way to be believed. I have read so much bull**** coming from him that I would not believe I was facing my own reflection in a mirror if it was Charlie telling me I was looking at myself. His mouth is literally flooding pure sh*t. When he is occasionally on the right track, it's old news or something really embarrassing for some poor bastard.
How does asking companies to play fair equate to being a communist? Seriously, you draw from this thread that some of us are communists because we question the ethics of a company that could possibly be forcing another company to hold back technology that might benefit the consumer? Leaving aside that we do not have the whole truth here, anti-competitive practices should be frowned upon regardless of which company does it, whether it be Microsoft or Wally World. How is that being a communist? Seriously, re-read the posts here: no one said making a profit was bad. What everyone has condemned are the supposed actions of Nvidia to keep a technology off the market that could hurt them. That would be like Exxon buying all the electric-car technology and tossing it in a vault. That's not anywhere near a free market.
And no, I do not play Doom 2. :D
This happens all the time in business. Companies will keep competitors out of the picture to keep their own products and prices intact. This is basically what Nvidia did with Ageia. Ageia would have been a huge competitor in the HPC market if they had kept their technology, but once Nvidia bought them, Nvidia swallowed up the technology and the competition at the same time. Keeping Ageia from being a competitor was more important than the actual technology.
Exactly. But on the other hand if you want to cheerlead something you should be asking those same questions too and coming up with a different answer. Anything else is just blind faith.
Until it actually makes it to market and we have independent verification that it works, it is vaporware. You have seen Duke Nukem screenshots, right?
Sorry, but this is BS. If 3dfx's tech was so fantastic, they wouldn't have gone belly up and been mopped up by Nvidia in the first place. Not to mention that their tech back then was cost-inefficient from a manufacturing standpoint and is completely irrelevant in today's world. Back then, graphics cards were little more than texture units.
Nvidia has held back some things; rotated-grid AA comes to mind, and maybe even tessellation too (we'll only know how practical tessellation would've been on past architectures when we see how it does on the new cards). But in terms of growth of the industry itself? Name another company that has done more to promote the PC as a gaming platform.
I don't understand why this topic really needs any debate. Plain and simple, Nvidia milked the gaming community with its G80 design. Renaming a product 4 or 5 times will turn a community against you. I don't think this is true, simply because of the source, but I would not be surprised at all if it is. JH is on record saying that the GTS 250 is better than the 5870. That company is run by a bunch of greedy idiots.
3dfx messed up big time with a lot of things; that's why they went bankrupt and why nVidia bought them, not because nVidia is evil.
The 5500 was a failure. Why do you think it had two VSA-100 chips on there? Because it needed them, and even then it couldn't beat a GeForce 2. nVidia didn't bury the technology; they didn't need it because their products were already much faster than 3dfx's. Also, remember that it was nVidia who brought SLI (albeit in a different form) back to the enthusiast market. That's not what I call setting back the consumer.
Most of that stuff is self-promotion, NOTHING to do with promoting PC gaming. For example, TWIMTBP is all about marketing and sometimes optimizing a game for the NVIDIA platform. How is that beneficial for PC gaming?
How beneficial was it for PC gaming that NVIDIA never adopted DX10.1 (until now, on some entry-level cards)? NVIDIA was actually promoting how useless DX10.1 was before that.
How beneficial is it for PC gaming that NVIDIA is late with DX11?
How beneficial was it for PC gaming that NVIDIA bought Ageia and made the PhysX engine run only on NVIDIA cards?
How beneficial was it for PC gaming that NVIDIA forced the use of the expensive NF200 SLI chip, in most cases not needed to run SLI?
How beneficial is it for PC gaming that NVIDIA charges $5 for SLI licensing? NOT needed at all.
Why do I and millions of other gamers have to pay for such a license if I have no use for it at all? Thank you for the benefit, NVIDIA. :rolleyes:
Yes, and aside from all of the above, I can show you another company that does as much for PC gaming as NVIDIA, but maybe you can guess it yourself if you are open minded.
Is this thread nothing more than an excuse to bash Nvidia? Don't we have enough of those threads in the news section?
By Alex Myers, the author of the article on Overclock3D.net:
http://www.overclock3d.net/news.php?...g_bang_board/1
Quote:
Edit/Apology: It seems in my rush to get this article out nice and early for all of our readers, I made a beginners mistake (remember I've only been at this for a couple of weeks) and missed out a few important steps in the process.
I would like to officially apologize to SemiAccurate, my initial source for the article, and NVIDIA and MSI for my misinformed statements which cast them in a bad light. My sincerest apologies guys, I hope I haven't offended you - Alex Myers
TechPowerUp lists Overclock3D.net as the source for its article. I hope they make a similar statement and remove it.
Then it looks like this thread should be closed and we can wait till first quarter next year for this to hit the market.
No no no, I think we should continue bashing nVIDIA :D
Bad nVIDIA, bad!
They sell PC gaming hardware. How can they promote themselves without promoting PC gaming? Every company promotes itself; it's a normal part of running a business, so trying to make that seem like a negative thing is beyond stupid (to put it nicely).
In the grand scheme of things, those are technologies that PCs have and consoles don't. On one hand people piss and moan about console restrictions forcing their way onto the PC, and now you piss and moan about tech that allows the PC to stand above and beyond. No, Nvidia doesn't give a crap about ATI; they want people to play PC games, and they want them to do so on Nvidia hardware. Duh.
If you can point to initiatives from other companies to grow the market, we'd love to hear them.
I don't have much use for faith. I prefer to approach it as a scientist: we need a certain level of evidence to know something to a reasonable degree of certainty, and even then the conclusion is still tentative barring the introduction of contradictory evidence. I hope Hydra comes out and works, but I don't think cheerleading it or being overly dismissive is very productive.
If Lucid has a good business team, then they will give the chips away for half the production cost.
Just like when Silicon Valley was starting out and they sold microprocessors for a dollar a chip.