May I ask what situations you feel limited at? I have no issue with 3440x1440. Or are you referring to gaming?
Since the iMac is designed entirely for content creation when it comes to...
Type: Posts; User: N19h7m4r3; Keyword(s):
Considering Apple and the applications on its platform support OpenCL, the new M295X is rather good. It's just behind the GTX 780 Ti in OpenCL, and significantly better than the GTX 780M they were using. It's...
They started with Motorola CPUs, then PowerPC, before going x86.
Before the Macintosh they were just called Apples. PC just means personal computer, and Apple were the first to mass-produce and...
They've made personal computers since the '70s.
Ah thanks, I never did much with HTML and CSS. Learn something new every day.
Also yes, it's down to Luxmark haha, it's a nasty-looking interface and program.
Also the kernel in OS X is/was the...
I've never heard that saying before. Could you explain what it means, please?
The primary GPU is always at 850MHz; it's the second one that downclocks when not used.
The configuration in general is GPU #1 for video, #2 for rendering (OpenCL).
It's up to the application to...
Luxmark will take both.
It has an issue where it only reports the second GPU in its idle state, hence the 150MHz instead of 850MHz. It clocks up to 100% once it gets used.
...
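The dual-GPU split above can be seen from the OpenCL side. A minimal sketch, assuming pyopencl is installed (the function name `list_gpus` is my own); note that `max_clock_frequency` reports the device's rated maximum clock, not the live clock, so it won't show the idle 150MHz state Luxmark displays:

```python
# Sketch: enumerate the OpenCL GPU devices an app like Luxmark would see.
# Assumes pyopencl is installed; degrades to an empty list if it isn't.
try:
    import pyopencl as cl
except ImportError:
    cl = None

def list_gpus():
    """Return (name, rated_max_clock_MHz) for each OpenCL GPU device."""
    if cl is None:
        return []
    gpus = []
    for platform in cl.get_platforms():
        try:
            devices = platform.get_devices(device_type=cl.device_type.GPU)
        except cl.LogicError:
            # Platform has no GPU devices
            continue
        for dev in devices:
            # CL_DEVICE_MAX_CLOCK_FREQUENCY: the rated maximum, not live clock
            gpus.append((dev.name, dev.max_clock_frequency))
    return gpus

if __name__ == "__main__":
    print(list_gpus())
```

The actual clock stepping (150MHz idle up to 850MHz under load) lives in the driver, not in anything OpenCL itself exposes.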
Here you go.
AMD's OS X drivers definitely need some work.
http://i.minus.com/i9y0OppkywuHN.png
Yeah, the drivers in Windows are a bit all over the place.
I ran some Unigine, and the second GPU is barely used or noticed sometimes.
The Windows OpenGL Unigine tests are also rubbish compared...
It's not even a preview, it's a joke haha.
I put Windows 8 Enterprise on and ran 3DMark for the lols.
Not made for gaming, and I use it primarily for FCPX, but it does play games if you want it to.
The...
This is very old news; there was a rather large thread on this when the reviews originally started coming out.
Gizmodo needs to be ashamed of this so-called "review", a few synthetic benchmarks and...
They had a great VR hardware team, and fired them all. It wasn't quite the Oculus, but CastAR is working, and constantly getting better.
I want a solution that's great, and multi-platform. So will...
I was a week away from actually buying into it and getting the new Dev Kit 2 because it supports OS X and Linux.
Thanks for that link, time to look into alternatives.
Outside of just games, it is the standard. Especially for professional 3D work.
Now some core companies are moving towards it. Especially Epic and Crytek, never mind Valve.
I'm looking forward to seeing how this all plays out. Personally I'd say OpenGL will be the 'standard' within the next few years.
The Source Engine 2 is being made with OpenGL, and Valve have been...
Possibly down to the old Westmere Xeons in use? Although they are 460MHz faster than the new 8-core, and have 4 extra cores.
The chap that posted it has been very anti new Mac Pro, so I would have...
Here's some 3DMark of dual D700s vs dual 7970s.
D700s
http://www.3dmark.com/3dm/2038247
7970s
http://www.3dmark.com/3dm/2064476
The 7970s are run on a 12-core 2012 Mac Pro with 3.45GHz...
Seems Crossfire is enabled by default, although games still only seem to use 1 GPU.
https://www.youtube.com/watch?v=TdLOh8MdU20
It's a 3.0GHz 8-core model with dual D700s.
I'm rather...
It's the bargain-bucket AOC i2367FH.
It's a 23" IPS display with a matte anti-glare coating.
Cheap and cheerful, really.
This place seems a bit barren, I'll add something.
http://i.minus.com/jh5fDqq9CmInD.jpg
That about sums it all up.
My Phenom II 940 BE is still running, and it's a good chip, but it pales in comparison to my Xeon W3680, which is still going very strong.
If AMD can bring...
That's all well and good, but the mid range cards usually sell the most.
According to retailers like OCUK, they're also not allowed to bundle games with the 280 cards like they do with the 7xxx...
You'd be surprised how many people do argue it. I just don't see how AMD's midrange 280s can use as much power as their flagship, and NVIDIA's top cards.
I want more than just raw...
Personally, my complaint is that they weren't able to reduce power consumption. I would have hoped that after nearly two years they'd have been able to improve the 7xxx/280 series in that regard.