Type: Posts; User: 003; Keyword(s):
Ok thanks for pointing that out. It seems there are advantages and disadvantages to each implementation but ATI has no form of ECC/EDC on the cache. Also it will be irrelevant on the GeForce cards as...
I could have sworn reading a news article on this. I believe it was from BSN.
Ah, yup I am right. Here is the article in question. Some interesting quotes:
So yes it looks like the current...
Rather than trolling as you usually do, can you point out the specific remarks in the post that you believe are false and explain why?
Unless of course they don't want to get the people who breached NDA in trouble.
Wait, these are all LGA 1366 and not LGA 1567??!?!?! If so.... :bounces:
If not, which ones are LGA 1366, if any? Wow, Westmere quad cores with 12MB L3 :slobber:
EDIT:
Ok I'm not positive but I...
There is literally no difference between idling at 10W and 20W and that would be a seriously stupid reason to avoid a considerably faster GPU. Nvidia has always been very good with idle power though....
Once again you have no idea what the performance of the high end Fermi based GeForce will be. I actually know some information that is under NDA and it will be considerably faster than the RV870, I...
It is a very good thing that GeForce won't have ECC. It is a significant performance hit. Especially because it is 2-bit ECC.
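As an aside on where that ECC hit comes from: memory-side ECC is typically a SECDED code (single-error-correct, double-error-detect), which is presumably what the "2-bit" refers to — it can detect, but not fix, 2-bit errors. Here's a toy Hamming(8,4) SECDED sketch in Python, purely illustrative (real ECC memory protects 64-bit words with 8 check bits); the extra check bits stored per word and the encode/check logic on every access are the overhead being argued about.

```python
# Toy SECDED (single-error-correct, double-error-detect) Hamming(8,4) code.
# Illustrative only -- real ECC DRAM uses wider codewords, e.g. 72 bits
# (64 data + 8 check), not one nibble per codeword.

def encode(nibble):
    """Encode 4 data bits [d1, d2, d3, d4] into an 8-bit SECDED codeword."""
    d1, d2, d3, d4 = nibble
    p1 = d1 ^ d2 ^ d4                 # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                 # parity over codeword positions 2,3,6,7
    p4 = d2 ^ d3 ^ d4                 # parity over codeword positions 4,5,6,7
    word = [p1, p2, d1, p4, d2, d3, d4]
    overall = 0
    for b in word:                    # extra overall-parity bit upgrades
        overall ^= b                  # plain SEC Hamming to SECDED
    return word + [overall]

def decode(word):
    """Return (data, status) where status is 'ok', 'corrected', or 'double_error'."""
    w = list(word[:7])
    s1 = w[0] ^ w[2] ^ w[4] ^ w[6]    # recompute the three parity checks
    s2 = w[1] ^ w[2] ^ w[5] ^ w[6]
    s4 = w[3] ^ w[4] ^ w[5] ^ w[6]
    syndrome = s1 + 2 * s2 + 4 * s4   # 1-based position of a single-bit error
    overall = 0
    for b in word:
        overall ^= b                  # recomputed overall parity
    if syndrome == 0 and overall == 0:
        status = 'ok'
    elif overall == 1:                # odd overall parity -> one bit flipped
        if syndrome:
            w[syndrome - 1] ^= 1      # flip the bad bit back
        status = 'corrected'
    else:                             # nonzero syndrome but even parity:
        status = 'double_error'       # two flips -- detectable, not fixable
    return [w[2], w[4], w[5], w[6]], status
```

A clean word decodes as 'ok', any single flipped bit is silently repaired, and any two flipped bits come back flagged as 'double_error' — which is the detect-but-not-correct behavior an uncorrectable ECC event reports.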
So what exactly is the TDP of the high end Fermi based GeForce? And what about 3D performance? Because without that information you cannot logically make the claims you are asserting.
...
You must know something I don't, considering there is zero official information on the power and performance of either the 5890 or the Fermi based GeForce. Even I don't have that...
What sources are these? I'd love to see them and I hate Charlie! :mad:
We will definitely be seeing the GeForce over 200W TDP as it will be clocked higher than both Quadro and Tesla, although ECC will be disabled.
Well, CUDA on GT200 already makes the OpenCL implementation on any ATI card look broken by comparison, and CUDA is going to be further improved on Fermi, so it would be logical to assume it is better yet.
You said "better cards" though, which rules out any current ATI cards. Future cards, I'm not sure.
You mean the ones that don't exist?
Uh no actually I'm not kidding. You can try and argue this all you want but there was already a very big review conducted by Anandtech of a 12-core Nehalem-EX system vs a 24-core Opteron system and...
Nehalem-EX already does that ;) See the linked sources posted in the thread.
Hmm, so what is InfiniBand and how does it work exactly? It's nice; maybe now their Quadro and Tesla cards won't simply be exactly the same as GeForce but with more memory.
I wouldn't be surprised if a 6-core Westmere based Xeon with HT would be enough to beat, or come very very close to, the performance of a native 12-core Opteron, and even beat it in a number of...
So it's anti-competitive now to not include a multitude of browsers all made by your top competitors in your own operating system? :stick::shakes:
So it doesn't sound like they are pushing for a fine.
Except adding cores doesn't really do much after a point. They need to step up IPC, and step down power consumption. Intel CPUs are faster clock for clock and use less power to boot. That is what...
They most certainly can depending on what type of panel is used, not to mention the LED backlight.
Obviously it is a big conspiracy between Nvidia and U of A. :rolleyes:
Hopefully you are correct. That isn't how the article is worded, though.