Matrox cards cannot span 3D across multiple displays.
hm.. 3x TH2G + 5870?
How much shorter is the HD 5850 estimated to be compared to a 5870?
The Dell is a 30" H-IPS panel. A typical 26" IPS panel consumes 135W; a typical 20" IPS panel consumes around 80W. You are essentially comparing a cream-of-the-crop panel of the kind seen in very high-end televisions against a low-end computer display, built on a panel technology so poor nobody would ever consider selling it on the television market.
I guess it doesn't let you link to quotes in closed threads. :shakes:
Largon, here is what I said about it:
To me it sounds like a "dual-core GPU", because even though there are 1600 shaders/stream processors, they are in two groups of 800: 10 SIMDs x 2. That's all. The guy over at the HardOCP forums called it "dual shader mega clusters", CJ at Tweakers called it "dual shader", and Theo at BSN called it a "dual shader engine". Dozens of posts over at Chiphell call it a "native dual-core GPU", and occasionally someone comes into the Beyond3D thread and declares it to be "native dual core". One guy brought up a point earlier, showed a GT200 die shot and asked, "well, I guess you would call this quad core?" (because it looks like four quadrants of unequal size, or two equal halves). Sure, you could cut it in half down the middle and seem to have two identical parts, but are those two parts rendering via SFR internally, and designed to be seen as a single GPU? Probably not. There are lots of threads on "multi-core" and "interconnect" that are way more technical than what I can understand. Either these people are completely mistaken, or it is actually using SFR internally and sharing the MC. Remember, we still have a bet ;)

Quote:
...similar to dual-core, yet not actually two dies in one package a la MCM. Perhaps there is no longer a real-time compiler in the driver and it's all handled at the hardware level by the scheduler. Because the core of the chip is so modular and scalable, with certain areas of the die shared (ROPs + memory controller logic), you are able to divide the specs in half (1600/256/80/32 to 800/128/40/16) and get two parts (RV870/RV830), so RV870 appears as two RV830s, yet it is still only one die. Hence the term "dual-core", though "modular" is closer. Some people say, yes, but all GPUs are multi-core because each of the shaders is like a core by itself. Well, here we are dealing with two large arrays of 800 shaders, along with other standalone logic that communicates with either of the two identical arrays. More specifically, one RV870 die is composed of two RV830-like 160x5 clusters/shader arrays (like two current-gen revamped RV770s in one die) sharing certain features, but connected via "internal CrossFire" and working in unison, and the entire design is a continuation of the R600 architecture. It is load-balanced, efficient, and requires no CrossFire software optimization (because the communication happens at the hardware level); it works via SFR with tiles from 8x8 to 32x32, and it is bandwidth-intensive. The board uses next-gen 5 GHz GDDR5 to provide the required bandwidth. So, apparently they've slapped together two 40nm RV770s... so it's easy to see where the "dual-core" confusion comes from. Cypress is like a 40nm 4890X2 in one die! So, it is like a "native dual-core" CPU...
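Just to make the SFR-inside-one-die idea concrete, here's a minimal sketch (my own illustration, nothing from AMD or the leaks) of how a frame could be chopped into small tiles and handed alternately to two identical shader arrays, in the spirit of the 8x8 to 32x32 tiling mentioned above. All names and numbers are assumptions for illustration only.

```python
# Toy sketch of tile-based SFR across two shader engines sharing one die.
# Purely illustrative; it does not reflect actual RV870 hardware.

TILE = 32  # tile edge in pixels; the rumour talks about 8x8 up to 32x32 tiles

def split_frame(width, height):
    """Assign every TILE x TILE tile of the frame to engine 0 or engine 1."""
    assignments = {}
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            # Checkerboard assignment keeps the load roughly balanced even
            # when one side of the screen is much more expensive to render.
            assignments[(tx, ty)] = ((tx // TILE) + (ty // TILE)) % 2
    return assignments

tiles = split_frame(1920, 1200)
print([sum(1 for e in tiles.values() if e == n) for n in (0, 1)])
# -> two near-equal tile counts, one per shader array
```

The point of a checkerboard split that fine is exactly the load-balancing the quote claims: both halves of the chip stay busy no matter where the scene's complexity lands.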
Here is a quote from the technical discussion of the core at Chiphell (it's a rough translation, so see what you make of it):
Quote:
Since the thread has passed 100 posts, to summarize it:

We all know that RV870 is a native dual-core with shared memory, shared TMUs, and so on. These hardware innovations are just the beginning; beyond the new hardware, RV870 also brings deeper software updates, for example the CrossFire mode. For the specifics you can refer to savage's major technical paper, so I won't explain that again here (http://bbs.chiphell.com/viewthre ... & extra = page% 3D1)

With RV870's native dual-core hardware support, the biggest benefit is that the CF mode can evolve from today's AFR to the more reasonable SFR (if you don't understand what AFR and SFR are, go read savage's great article). SFR thoroughly gets rid of the PCI-E bandwidth bottleneck that has kept CF and SLI from using SFR; CF no longer relies on driver optimization, and 1+1 gets infinitely close to 2. This is where everyone will be deeply impressed by RV870's native dual-core architecture. At least in my case, native dual-core is the first time in years I have felt graphics card technology take a major step forward.

Going back to the Crysis results: RV870's strong Crysis performance can be considered well within reason, but it can also be considered lucky, because neither CF nor SLI has ever been very efficient in Crysis. RV870 simply exploits this SFR loophole and plays the role of the Crysis terminator. But I hope everyone looks at RV870's performance calmly; in other games where CF or SLI is already more efficient, RV870 cannot count on being as lucky as it is in Crysis. Still, on the whole, as the architecture that pioneers dual-core technology combined with so much shared on the die, RV870's ratio of die size and power consumption to performance will be the highest among comparable products; that is beyond doubt.

I have hastily rambled on; I had wanted to save this for my RV870 test article, but I was so fascinated by the RV870 architecture that I could not resist letting a bit of it out. More content will come when I release the RV870 tests later; let's share it then.

A quiet little question: going from AFR to SFR does not really seem like "evolution"; it can only be called an inevitable trend of development once performance has reached a certain point.
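For anyone who hasn't read savage's article, the AFR-vs-SFR distinction this whole post turns on is simply whether the two GPUs split the work across frames or within a frame. A rough toy model (my own, not any real driver behaviour):

```python
# Toy contrast between AFR and SFR scheduling for two GPUs; illustrative only.

def afr_gpu_for_frame(frame_index):
    """AFR: whole frames alternate between GPU 0 and GPU 1."""
    return frame_index % 2

def sfr_gpu_for_scanline(scanline, frame_height):
    """SFR: each frame is split, e.g. top half to GPU 0, bottom half to GPU 1."""
    return 0 if scanline < frame_height // 2 else 1

print([afr_gpu_for_frame(f) for f in range(4)])                      # [0, 1, 0, 1]
print([sfr_gpu_for_scanline(s, 1200) for s in (0, 599, 600, 1199)])  # [0, 0, 1, 1]
```

AFR scales easily when frames are independent but adds roughly a frame of latency; SFR keeps both GPUs working on the same frame, which is why the post argues it only becomes attractive once the link between the two halves is on-die rather than over PCI-E.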
Yet, the die is nearly a perfect square...
Snip!
Monitors cost a lot less now than they did before, and the tech is different now, so it would be much easier for PC gamers.

Before, multi-monitor was aimed more at productivity than anything else. Then TripleHead2Go came on the scene, costing as much as a graphics card itself and with heavy limitations, requiring a lot of planning with matching monitors at the right resolution, which only the hardcore would bother to go through.
Hearing about it never compares to seeing it in action. There's no one who wouldn't be happier without bezels, but the experience does make up for them.
Very little ever starts off as being essential. A PC isn't required at all for gaming.
The NV 3D comparison is apples to oranges, as it requires another piece of hardware between the PC & the screen which is of no use for anything else, requires specific monitors, and for its cost you could buy two new screens, or nearly.
As time goes on people go through monitors & replace them, but now they may hang onto the old ones & try out some multi-screen gaming, since it will cost them nothing, plus they can use them for a multi or extended desktop without needing UltraMon to fix the taskbar. Even second-hand monitors are cheap as chips, & many are thrown away in office refits.
At the end of the day it will be the marketing & ease of use which will affect its popularity.
It could be square and look like this (the SIMD sections highlighted in red are the "cores" of the "dual-core"):
http://i32.tinypic.com/fn8upc.jpg
http://i32.tinypic.com/wkhbno.jpg
Call it, "Native dual-SIMD section gpu". HAHAHAHA
I agree. For most high-end gamers a screen for 400-600 dollars is a no-brainer, so how could three screens at 200 dollars each be something extravagant? As tajoh himself said (although not literally), you don't need the most expensive screens for your peripheral vision, since it's nowhere near the same quality as your central vision.
This technology enables users to run three different screens (different kinds) and still have it work. For those like me who have been thinking about going multi-monitor, this is making me seriously consider three screens instead of two... And sure, the bezels on my 19" CRT are wide, but on most LCDs it's not that big of a deal.
Some here are focusing on the limitations while others are seeing the possibilities...
How about putting ALL 6 monitors horizontally, encircling the gamer with 360-degree vision? Holy crap, it would be awesome in zombie games to run and look behind you at the same time to see the zombies chasing you. Sure, a lot of pixels would go to waste, but it'd still be cool :D
I have seen many posts across forums from people thinking of going multi-screen purely because of ATI Eyefinity, where before they never even considered it because of needing to fork out for a TripleHead2Go & its limitations.
Forking out for the monitors & the bezels wasn't really discussed as a reason not to go multi-screen, because that's taken as a given.
Updated idea:
You put the gamer in the center and give him a rotating chair so he can turn a full 360 degrees at will. In an FPS, for example, when you change direction, the monitor with the crosshair changes too. You put a little motor on the chair (idea going a bit off the map here :D) so that when you turn 30 degrees to the right in the game, you also turn 30 degrees in real life with the chair, and the crosshair is ALWAYS directly in front of you. It's a WHOLE LOT more realistic: when you simply want to look back, you literally look at your back monitor, and when you want to TURN back, your chair turns with you. Holy crap it'd be amazing.
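Purely for fun, here's a minimal sketch of the monitor-picking and chair-motor logic such a rig would need, given the player's in-game yaw and a ring of monitors. The monitor count and every function name are made up for illustration; nothing like this exists in any driver.

```python
# Hypothetical sketch: map an in-game yaw angle to a monitor in a full ring
# and to a chair-motor rotation. All names and values are illustrative.

NUM_MONITORS = 6                              # monitors arranged in a circle
DEGREES_PER_MONITOR = 360 / NUM_MONITORS      # 60 degrees of the ring each

def monitor_for_yaw(yaw_degrees):
    """Index of the monitor that should carry the crosshair."""
    return int((yaw_degrees % 360) // DEGREES_PER_MONITOR)

def chair_rotation(current_yaw, new_yaw):
    """Shortest signed rotation, in degrees, for the chair motor to apply."""
    return (new_yaw - current_yaw + 180) % 360 - 180

print(monitor_for_yaw(0))      # 0  -> front monitor
print(monitor_for_yaw(185))    # 3  -> the monitor behind the player
print(chair_rotation(10, 40))  # 30 -> turn 30 degrees right, as in the post
```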
Wow, I think you guys are on to something. Sounds like virtual reality. You'd need some kind of Wii head-tracking unit to keep track of your crosshair on the map.
Imagine a circle of 12 displays, 2 high. 24 monitors in CrossFire! Of course your headset/controller/keyboard would have to be wireless if you plan on doing a lot of 180s, haha!
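Just to put rough numbers on that 12-wide, 2-high ring (the 1920x1200 per-panel resolution is purely an assumption for illustration):

```python
# Back-of-the-envelope numbers for a hypothetical 12-wide, 2-high monitor ring.
# The per-panel resolution is an assumed value, not anything from the thread.
columns, rows = 12, 2
panel_w, panel_h = 1920, 1200

monitors = columns * rows              # 24 displays total
degrees_per_column = 360 / columns     # 30 degrees of the circle per column
total_pixels = monitors * panel_w * panel_h

print(monitors, degrees_per_column, f"{total_pixels / 1e6:.1f} Mpixels")
# -> 24 30.0 55.3 Mpixels
```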
Back in the day Matrox cards used to be THE consumer cards for video editing and image quality... anyway, back on topic: the 2GB HD 5870 just can't come out soon enough. I'm hoping it has a decent OpenCL and DirectX Compute implementation, as I'm still wary that nVidia will have the upper hand where that is concerned.
For the record, I don't really give a fig for multi-monitor capabilities, and that's what concerns me. Just get one phat 30" 2560x1600 display and give us some performance figures in Crysis, S.T.A.L.K.E.R.: Clear Sky and World in Conflict DX10 already!
John
The greatest thing about my unique trademark (TM) design is that you can't just turn 180 degrees back and forth in milliseconds like in all FPS's - and you won't need to. Just like in real life, you won't have to change the direction you're facing in order to look back, for example. To look back, you just look back. To turn back, you and your chair turn back along with your control device. Great or what!
I think there's a tiny flaw in your design. What happens when you want to look up or down :p:
That will be solved with the Holodeck
Found this closeup at HKEPC, but they shopped an ATi logo over the die...
http://i25.tinypic.com/op8ugl.jpg
http://i30.tinypic.com/24gnd54.jpg
http://global.hkepc.com/4007
Man, I'm so excited for these new cards; AMD has been so tight-lipped about them. Grr, in past years there would have been multiple leaked benchies by now. Makes me wonder what kind of a monster the 5870X2 is going to be... an MCM chip, maybe.
But the real question is: how many out there actually have a second monitor that they haven't put to good use yet? I.e. no need to actually buy another monitor, just use the spare they already have. The Steam survey already points out that it's not uncommon for people to use more than one monitor.