€130 for a 1920x1080 screen? Two or even three of those cost what a single screen did two years ago.
And you'd be surprised how many people actually run two screens.
I do, obviously.
forget multi-monitor, what we need are VR headsets to play games with :)
That's only true for the "run of the mill" stuff; if you want good quality, you pay the same as 3-4 years ago.
If you want good homogeneous illumination, good viewing angles and a decent color space, you pay 400-600€ (or more).
And then we are getting into regions where the multi-monitor setup costs more than everything else combined.
I'd rather have one decent monitor than three that give me eye cancer when I look at them.
Cost is not the only issue; it's practicality too.
Here are some issues.
Even with the power of the 5870, a game running at 6 million pixels is going to need some serious power, especially a DirectX 11 title (see the rough pixel math after this list). Turn on stuff like AA, add some detail, and watch the fps drop like a rock.
Who wants to game with screen borders all over the place? Not me.
Driver issues. I can imagine a lot of driver issues popping up, because multi-monitor stuff can get messy.
Need. When you sit a good foot and a half away from your monitor, how important is it to see your periphery when a 30" monitor already dominates your field of view? Hell, a 24" monitor does a pretty good job of that.
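Quick back-of-the-envelope on that pixel count, just to put numbers on it (my figures, common panel resolutions assumed):
Code:
# rough pixel math behind the "serious power" concern
single = 1920 * 1080      # one 1080p panel: ~2.07 megapixels
triple = 3 * single       # 3x1 Eyefinity:   ~6.22 megapixels (the "6 million pixels")
six_30 = 6 * 2560 * 1600  # the 6x 30" demo: ~24.6 megapixels
print(triple / single)    # 3.0  -> roughly 3x the shading/fill work per frame
print(six_30 / single)    # ~11.9x a single 1080p screen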
Don't be stupid; what's gonna make the 58xx sell well is its performance and that it's DirectX 11, which will coincide well with the influx of Windows 7-ready PCs about to be sold. Not Eyefinity.
Yeah, you're right about that.
I paid over 3.1k for my 30".
My 22" was way cheaper, only 140, a real bargain.
But the 22"... uhm, yeah, I can't describe it any other way than "ugliest display ever".
Does anyone know how much bandwidth the PCIe slots have? Like x8 2.0 and x16 2.0?
Because I got an LE and I'm going to try to pick up two 2GB cards at launch. 16x/8x/4x/8x <- retarded config, I know.
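For reference, PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, which works out to 500 MB/s per lane in each direction, so (quick sketch):
Code:
# PCIe 2.0: 5 GT/s per lane, 8b/10b encoding -> 500 MB/s per lane, per direction
def pcie2_gbs(lanes):
    return lanes * 0.5  # GB/s each way

for lanes in (16, 8, 4):
    print(f"x{lanes}: {pcie2_gbs(lanes):.1f} GB/s per direction")
# x16: 8.0, x8: 4.0, x4: 2.0
Even the x4 slot in that config still has 2 GB/s each way, though it's presumably the tightest of the four.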
Actually, Eyefinity will help it sell.
When you have two cards of equal price that have the same performance, which would you choose:
the one with or without Eyefinity?
That's why we get these little extras on the cards, and it's why Nvidia pushes CUDA and PhysX.
When performance/price are equal, each company needs a USP.
More people have multi-monitor setups than overclock...!
I was surprised how many of the people I work with use multiple monitors... musicians, stock traders, etc.
Racing and flight sim peeps have always pursued this avenue. I know a lot of WoW'tards who also use it.
I have a 27" Dell; most people opt for a second monitor. The downplaying of multi-monitor support here is a tad overdone. Many people will go with a 3-monitor setup now that doing so will not impact performance in our favorite games.
I don't think that many people in a gaming environment (not work, as I mentioned earlier) have multi-monitor setups. Too much space, too much money (how alive is the flight sim market today?), and no real impact on the gaming experience.
More people use multi-monitors for gaming than overclock? Surely you jest. If what you said were true, then companies like Thermalright would have gone bankrupt a long time ago.
Not taking a performance hit? Now you're just talking out of your ass. More pixels... forget about it; you should know this already.
Most cards can do dual monitors now (especially the cards that have the power to push this resolution); it's the triple-monitor audience that Eyefinity matters to, and that market is a lot, lot smaller than the overclocking market.
The 5870 was running 80 fps playing WoW on six 30" monitors with max details. Any other game with max details is going to be unplayable, as WoW is a crazy old game.
Umm, actually it is the beginning of the Holodeck.
I'm not going to buy three cheap-ass screens, but I'm not going to waste a fortune on three great screens either. The reason for this is the human eye. Your peripheral vision doesn't have as many colour receptors, so I won't need 400-600€ screens to give me great colour depth there. What I'm going for is a really good primary screen and two lesser screens (but with decent refresh rates, GtG, etc.).
I don't think it'll really be possible to play games fast on 3 monitors, which'll be something like 6000x1200 resolution. Most modern games would be out of the question at high settings and such a massive resolution.
I don't think so. What new games are more demanding than Crysis? Not that I would want to play Crysis; seems like a really boring game with good graphics. The 5870 seems to handle it pretty well, so pair two 5870s together for less demanding games and you'll have nice performance... This is what I'm hoping for at least. (But I'm projecting 5040x1050 will be the resolution for me.)
I don't think it's a BS gimmick; it's definitely a nice feature... I only have one display and won't go multi-monitor for now... but once displays with REALLY tiny or no bezels show up, I'm going to get a set for sure! I've been looking for a wider-than-16:9 display forever but just couldn't find a proper one... I'd definitely get two displays if it were possible to merge them into one with only 2mm or so of bezel in the center...
If you actually watch the demo you might change your mind. That's one card handling 3 displays at max resolution (3x 2560x1600) and max detail, at playable fps in L4D, Dirt 2 and HAWX.
If you're talking about Crysis, that's only one game.
EDIT: Anyone else find keeping track of >4 threads a little annoying?
This thread and "Radeon HD 5850 is a 725MHz chip with 1440 shaders" should be locked.
edit:
nvm, other threads got closed.
It should be able to handle anything pre-2010 at 3x resolution; some might need CrossFire. ;)
Modified specs for the HD 5870 (core clock, memory clock & compute power)
http://i30.tinypic.com/maw86p.jpg
source: http://www.arabhardware.net/forum/sh...d.php?t=133519
Cost is NOT really an issue. I can buy two additional 19" monitors with good specs and good reviews for maybe another $200-300. That's not a lot, comparatively. Furthermore, the eye sees more horizontally than vertically, and adding two monitors to my peripheral vision is going to do a lot more than adding a few inches of vertical space for about the same price, since, as we all know (and apparently claim), prices go up exponentially with monitor size.
If you read around, you'd see that this feature is clearly for older games where the new 58xx series is seriously overpowered for the graphical horsepower they require.
The screen borders become invisible if you are adding periphery space around the central monitor. Try it; it is a well-known, researched fact that the more you stare at one thing, the more the static surroundings appear to disappear from your vision. Your eye is much more sensitive to dynamic objects than to static ones.
There should not be any driver issues. Again, if you read, you would have known that the whole idea is to treat the group of monitors as a single monitor, not as individual entities the way Windows traditionally (?) handles them. Thus, when you select a resolution in-game (they even showed screenshots of Eyefinity resolutions in Crysis), it appears as one giant resolution.
Who says you have to sit far away? If I'm adding monitors to expand my periphery I will still be sitting at the exact same distance, but the 'dead space' that was there before will now be filled with information I can pick up on a subconscious level.
Eyefinity is definitely a nice feature and if it continues to be passed on to future generations, I might consider a multimonitor setup.
Also, to those people who complain that the other monitors would suffer from poor viewing angles, all I have to say is: adjust them until they are OK (e.g. angle them). The rationale behind this, especially for FPS, is that the game probably (I'm not sure?) renders what you see based upon a circular field of view. Setting a >100 degree FOV in-game (Q3, Source games) and actually having monitors capable of displaying that without the crazy distortion you see on a single planar panel would be... impressive.
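To put a number on that, here's a sketch assuming the engine keeps the vertical FOV fixed and widens the horizontal FOV with the aspect ratio (standard planar projection behaviour):
Code:
import math

def hfov_deg(vfov_deg, aspect):
    # planar projection: tan(hfov/2) = aspect * tan(vfov/2)
    return 2 * math.degrees(math.atan(aspect * math.tan(math.radians(vfov_deg) / 2)))

# vertical FOV that gives a 90-degree horizontal FOV on a single 16:10 panel
vfov = 2 * math.degrees(math.atan(math.tan(math.radians(90) / 2) / 1.6))
print(hfov_deg(vfov, 16 / 10))  # ~90 degrees on one screen
print(hfov_deg(vfov, 48 / 10))  # ~143 degrees across a 3x1 spread, same vertical FOV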
They don't have multimonitor setup. After this card is released, that number may increase significantly.
Your second sentence makes no logical sense.
Of course it will take a performance hit. But if it remains above a certain threshold, you will most likely not be able to feel it unless it dips under 30 in chaotic situations.
Again, you should have read before you said anything; the monitors are merged into one large resolution, instead of being split and handled separately.
The 5870 was also running Left 4 Dead and flight sims, and I assume it should be able to run RTSes with ease.
I'm sure there will be some driver or game glitches... it's a new thing, and barely anybody plays games with 2 or more displays... but it should work alright for most games, seeing as it virtually merges the screens into one big display...
The menu being partly hidden by the borders between the displays is the biggest problem I can think of... and the crosshair being centered, and thus hidden or cut in half across both screens, is annoying too...
There are bezel-less displays; it's really not that difficult from a technical point of view... so I don't get why there aren't more displays without bezels...
I think the display makers assume it doesn't matter and isn't worth the 10-20% higher cost... I hope we will see some proper displays without bezels soon... those Samsung displays are a joke... there already are displays with such small and even smaller bezels, AFAIK...
Well said, Cegras; totally agree with you.
Well you probably want one of these! Just needs to be affordable!
Or this?
http://www.youtube.com/watch?v=vq52x6LmUew
Or this?
http://www.youtube.com/watch?v=ZWivZEtH6-c&NR=1
This is what you want here --> http://www.youtube.com/watch?v=K39zHXI8MPs :up:
You guys need to think BIGGER. How about three 52" 1080p TVs, or three LCD projectors? With the projectors there is no seam; you could really just run two if you wanted. You could get the screens to line up perfectly then.
Because an HD projector costs about the same as 10x 19-to-23" screens would.
And 10240x2304 (5x 23" horizontally, 2 vertically) just can't be passed up. Admittedly you'd need 2 or 3 5870s, and all of that would run you $2500, but the experience would be awesome.
If someone has the money to purchase a 5870, then they are likely gaming on a 23"-24" monitor to begin with.
From what I have seen, the price of 23" monitors has stabilized, and to get something not completely crappy, it's still 300 dollars. That's 600 dollars on monitors, or 900 if you're starting from scratch. That's a lot of money for most people, and as much as some people's entire computer budgets.
Do you know how much desk space 3 monitors take up? Most desks I have seen can barely fit a 24" monitor and a tower. Now you're talking about three 24" monitors.
I have a multi-monitor setup. I like using it for 2 work surfaces, where I can focus my attention on one screen at a time. I still can't ignore the bezels, because they are simply too big on monitors. Even in my peripheral vision I still notice them, and if it's one gigantic image distributed across multiple displays, I definitely notice the bezel and it's fricken distracting. Hence, when I do game once in a while, I do it on a single monitor.
Driver issues: no company wants them, but they all pop up anyway. To get playable frame rates in newer games, you're going to need CrossFire. I already have CrossFire issues with 3 video cards in multiple games (Crysis and Crysis Warhead) on a single monitor. I can only imagine the havoc this type of technology will involve.
AMD and good drivers just don't seem to mix for me.
If you read the other post I replied to, it would make sense.
Someone said more people use multi-monitor setups than overclock. I said that's bull, as overclocking is close to free; look at the market for it. Look at all the companies that make products strictly for overclocking. Look at all the forums and websites dedicated strictly to overclocking. Look at all the budget builds that people suggest because the processor is supposed to be a good overclocker. It took AMD until now to come up with this Eyefinity thing (and the technology behind it); if it were really that important, AMD or some other video card company would have come up with it a long, long time ago. It wasn't until last year that NV added driver support for dual monitors, so I don't think it was that big of a priority.
Who says the number of people will go up significantly? This could be a feature that fails very easily, the reason being that people don't want to take the risk of buying monitors (some free features have already faded into obscurity).
You're also trying to predict the future, which is really uncertain at this point and difficult to judge in this economy. A 3-monitor setup is more of a luxury than a mid-to-high-end video card.
Your 'opinion' is as good as anyone else's. Read: it's not.
So don't try to spread FUD based on your anecdotes and personal experience.
You use poor logic throughout. Just because an invention has not been conceived yet does not make it unworthy of attention. Perhaps with a bit of thought you would have realized how silly Eyefinity would have been if it had been introduced in the era of CRTs, or when LCDs were first appearing.
I will repeat myself: Eyefinity is useful for games for which the 58xx is clearly overpowered. That is what has been said all along. No one expects you to be able to run Eyefinity on recent games at that kind of resolution, unless some surprises are afoot.
I'm not even talking about 23, 24" monitors. 19" monitors that are good are cheap and available everywhere. Nowadays the push is for widescreen, and this is exactly what it lets you do - get widescreen without paying an exponential price for screen space.
I can see, however, that you will never be convinced, most likely because you are either a staunch critic of new technology (!) or you don't like ATI (!). My crusade is to prevent people from taking what you say seriously.
The future of electronics is fairly certain (out to about 5 years); the future of consumers is not.
I never said it was not worthy of attention. Don't twist my words.
Simply because you say it is bad logic doesn't make it so. Other people have already agreed with me. I think it's ridiculous to believe that people are buying a card like the 5870 or 5850 to game on 19" monitors.
Cost will keep people from buying multiple monitors; that's a fact, especially in this bad economy.
You're overextending your own beliefs, thinking they are reality. I mentioned nothing about CRTs, and multi-display technology has been around for a long, long time. It's never taken off because it is not practical for gaming; the technology has been there, at least for dual screens, for a while now, but it just has not taken off because
I have three AMD video cards in my computer. Hell, I even posted pictures on here to prove it. Look for my posts.
I even have one of the highest 3DMarks in Canada using AMD cards, so I've already proven I don't mind AMD. I have no current benches with NV, so I am not an NV fanboy.
http://hwbot.org/listResults.do?user...plicationId=12
I have praised new technology in the past in this very forum, so I've already proven you completely wrong.
Cry Engine 3 and Eyefinity Video
Impressive !!! :eek:
I guess people should keep their personal experiences to themselves, for fear of spreading FUD. I mean, if we're not all avid supporters of everything, what would the world come to?? People might actually be critical of poor experiences and force companies to change!
I locked the other threads and renamed this one. Hopefully you guys will find this satisfactory.
Won't happen in multiplayer games, because a high FOV usually gives you certain advantages, depending on the game.
In HL1/2, a high FOV practically negates bobbing and also makes recoil far more controllable. That's why most HL1/2 games have the FOV hard-locked to <=90°. Same goes for Q3A.
Also, a big area is not always an advantage, especially in FPS. It might look cool, but that's all -> foveal vision >> peripheral vision.
FINALLY!!! One thread, excellent.
Thanks Gautam. :up: :up:
I sure wouldn't mind seeing that Spy decloaking off to my right and trying to stab me before I can deploy an Uber though.
Thank God... er, Gautam! You heard our prayers! :p: :D
You say dual monitors have not taken off because (blank?). Yet it's pretty obvious why they wouldn't have, and it's why you need an odd number of monitors per row and column. Hint: it places the crosshair on one monitor instead of splitting it across a bezel. That's what Eyefinity allows. Hm.
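A toy illustration of the odd-vs-even point (identical 1920-wide panels assumed; the numbers are mine, the geometry is the point):
Code:
# where does the merged display's center (the crosshair) land?
def center_location(cols, width=1920):
    cx = cols * width / 2            # horizontal center of the merged surface
    if cx % width == 0:
        return "on a bezel"          # center sits exactly on a panel boundary
    return f"mid-panel, on monitor {int(cx // width) + 1} of {cols}"

print(center_location(2))  # on a bezel -> crosshair split across the gap
print(center_location(3))  # mid-panel, on monitor 2 of 3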
I have a 4870 1GB for a 19". Benchmarks tell a decent story, but most people fail to pay attention to the most important metric: minimum FPS. A 4850 or a 4770 may be 'adequate' for a 19", but once a game gets busy (and it'll get busy all the time in multiplayer), the situation from your timedemo FRAPS captures starts to change dramatically. This is why it's always important to be overpowered for any given resolution, unless you play single-player only. I went through a 9800 GTX+, a 4770 and a 9600 GT, all in the quest for ALWAYS-smooth performance in multiplayer FPS, before finally deciding to overpower my resolution.
A good example is Left 4 Dead, an incredibly dynamic game. You cannot decide a card's candidacy based on 'just good enough'; it's much safer to overpower and forget.
I'm sure you have proven me wrong completely, since I alluded to none of the points you apparently took me apart on. You are surprisingly hurt about me (apparently?) insinuating you're an NVIDIA fanboy, which I have not stated at all. I merely hinted that you either do not like new technology or are skeptical about something you haven't even had a chance to try.
Yeah, totally, I mean, let's be critical of a product we haven't even had a chance to test when all the people who have laid their hands on it have said it's awesome ([H], Anand, etc).
I am going to start a 'thank you Gautam' thread in the thank-you section. :ROTF:
Did any information get released about the new anisotropic filtering method?
Wow, look at the dates; this particular 2GB 5870 "Six" (Trillian) board has been around since July. Heh.
http://www.wsgfmedia.com/generaladmi...setup-card.jpg
http://www.wsgfmedia.com/generaladmi...HDTV-card2.jpg
http://i31.tinypic.com/svov3s.jpg
Why are they still making 1GB parts? They should leave 1GB for the 5850...
From official AMD slides from the closed threads, it's 27 Watts.
http://farm4.static.flickr.com/3454/...8dc3e525_o.jpg
http://farm4.static.flickr.com/3439/...371b8b78_o.jpg
I wish Gautam had left this http://www.xtremesystems.org/forums/showthread.php?p=4007933#post4007933 open instead of this one. Oh well, rain in the pictures again, guys.
Doing a bit of a shameless threadjack: am I the only person here relishing what Magic Sapphire will pull out of their hats with the Vapor-X and Atomic versions of the HD 5870? WILL we see 3 teraflops and 2GB with a cool and quiet cooling solution?
I hope so :)
John
Let's hope they're right. In the meantime, is it so unreasonable for people with first-hand experience to be skeptical, when history hasn't always shown the driver side of things to be all that pleasant? You make it sound like any negative opinion is irrelevant. This isn't just a question of monitor support; it's in some part a question of CrossFire support, which wasn't covered by those editors.
It's strange. Since so few will be using this feature, this couldn't possibly be a problem. Besides, the slides say that HL2 is compatible with Eyefinity, and I'm assuming this means extended FOV and not just increased resolution, since three screens would make it look absolutely ridiculous otherwise.
And yeah, foveal vision is much better than peripheral vision. But somehow I still think my peripheral vision has saved my ass more than once. The greatest experience would probably be Eyefinity (or rather extended FOV) combined with TrackIR...
At last, Samsung GDDR5. Hoping for nice memory OCs.
Have any pics been released of the 5850? Looking at the length of the 5870, I doubt it would fit in my case. Hope the 5850 is a little shorter. The case is a Lian Li A05B.
Largon, can you tell anything about the OC potential or latency from the part numbers on the memory chips? :shrug:
http://i32.tinypic.com/x5ujib.jpg
http://i25.tinypic.com/ix6wjb.jpg
Not when going above spec, since every IC will perform differently, but you can at least see what specification they are and the bin.
http://www.samsung.com/global/busine...do?fmly_id=675
Now these are really nice pictures... but I'm sure you're all dying to hear about performance :D. I happen to know the numbers. From what I've been told (confidential source):
5870 is 100% faster than 4870 (2x)
5870 will surpass GTX295
5850 will surpass GTX285
Quote me on it. You'll see. You're welcome!
Has this been posted?
http://www.techpowerup.com/103612/Ra...n_Spotted.html
Quote:
Radeon HD 5870 Eyefinity Edition Spotted
Among the three main high-end graphics SKUs AMD has in store for the 23rd known so far (namely Radeon HD 5850, HD 5870 1GB, and HD 5870 2GB) is a fourth distinct SKU called the Radeon HD 5870 Eyefinity Edition. This is a variant of the Radeon HD 5870 with specially-designed connectivity that makes setting up to six displays possible/convenient for making use of the Eyefinity multi-display technology, which lets you span a display head across several physical displays like a mosaic.
On its panel, the card has a large air-vent occupying one slot, and six mini-DisplayPort connectors occupying the other. Necessary cabling will be provided to connect to the displays. While each accelerator supports six displays in all, multiple accelerators can be installed in the same PC without any multi-GPU setup, to scale the size of the resulting display by up to 24 displays and up to 268 megapixels of effective resolution. It remains to be seen if there are similar Eyefinity Edition SKUs based on other AMD GPUs in the series, especially considering the fact that the company is also eyeing the business/productivity market segment ............
http://www.techpowerup.com/img/09-09-11/132a.jpg
http://www.techpowerup.com/img/09-09-11/132b.jpg
Seems to be 0.4ns, 5.0Gbps (K4XXXXXXXX-XC04), Samsung Specs Here, but not for sure... :shrug:
CryEngine 3
http://news.ati-forum.de/images/stor...ssslides/7.jpg
Alien versus Predator
http://news.ati-forum.de/images/stor...ssslides/6.jpg
Backside of the card, http://www.widescreengamingforum.com...ic.php?t=16780 .
http://www.wsgfmedia.com/generaladmi...setup-card.jpg
Um, the 2nd card from the top is... a bit different.
http://www.wsgfmedia.com/generaladmi...Xplane-rig.jpg
To be honest, I wish everyone would put SOURCES with their pictures.
Well IMO, odyssey96 is correct about what he is saying.
I posted this 3 weeks ago:
http://www.xtremesystems.org/forums/...&postcount=174
Quote:
Hemlock ------~P28k-P30k (4870 quadfire, gtx295 quad sli, or better)
I agree with him.
Yes, looks like the 0.4ns 1Gbit parts. Hope they OC to ~1400MHz. Rated for 5Gbps, so 1250MHz should be easy on stock volts.
04 : 0.40ns (5.0Gbps)
K4G10325FE 32Mx32 32Mx16 1.5V ± 0.045V
http://www.samsung.com/global/system...uide_q2_09.pdf
edit:
http://img2.pict.com/64/f6/bc/1605647/0/amdgpus.png
http://www.forum-3dcenter.org/vbulle...16524&page=196
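For anyone wondering how the 0.4ns bin maps to those clocks, here's my reading (a sketch assuming 0.4ns is the WCK period; the divide-by-4 between data rate and the memory clock GPUs report is standard GDDR5):
Code:
# GDDR5 bin math: 0.40 ns -> data rate and "memory clock"
wck_ghz = 1 / 0.40                  # 2.5 GHz write clock
data_rate_gbps = 2 * wck_ghz        # DDR on WCK -> 5.0 Gbps per pin
mem_clock_mhz = 1000 * wck_ghz / 2  # command clock the GPU reports -> 1250 MHz
bandwidth_gbs = data_rate_gbps * 256 / 8  # if run at full rate on a 256-bit bus -> 160 GB/s
print(data_rate_gbps, mem_clock_mhz, bandwidth_gbs)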
I'm wondering if going for the 2GB variant will be worth it, and whether these will be all that loud. Given the 188W max and a cooler of that size, you'd think it would be quieter than the 4870/4890s. Considering CF doesn't scale awesomely in everything, I expect the 5870 to overtake it in a fair few games, so that is somewhat attractive. What I'm really wondering now is how they'll do an X2 version. I'm honestly expecting it to be less than two 5870s (e.g. 285 vs 295), as I don't see them saving over 60 watts by having only one PCB.
@Odyssey96
http://www.xtremesystems.org/forums/...&postcount=341
Thank you.
http://vr-zone.com/articles/juniper-....html?doc=7626
Quote:
Juniper XT is Radeon HD 5770 and Juniper LE is HD 5750
While all eyes are on the Radeon HD 5800 "Cypress" series, there is another series quietly brewing beneath AMD which NVIDIA should be really afraid of. It is targeted at the mainstream market which is no doubt the most important segment for both camps.
Juniper XT and LE will be officially named Radeon HD 5770 and Radeon HD 5750 respectively when launched. HD 5770 card is codenamed Countach while HD 5750 is codenamed Corvette and they both come with 1GB GDDR5 memories on 128-bit memory interface. Juniper will possess all the features of its higher end counterpart like 40nm, DX11, Eyefinity technology, ATI Stream, UVD2, GDDR5 and best of all, it is going to be very affordable.
One of the reasons why AMD is not mass-producing the Radeon HD 4700 (RV740) series now is that the HD 5700 series will be replacing it soon, coming one month after the HD 5800 series. It will meet head-on against NVIDIA's D10P1 (GT215) series in October, so expect a full-fledged war then. With a performance target of 1.6x over the HD 4770 and 1.2x over the HD 4750, they are surely packed with enough power to pit against NVIDIA's lineup. Pair them up and you will get a boost of 1.8x, which is roughly the performance of a Cypress card .........
So Cypress performs at 1.2 x 1.8 = 2.16x a 4750?
'Cos rumors say that Cypress = 2x 4870, not 4750 :S
I want to see real performance numbers before anything else. All this talk of 2X performance this and 14X performance that is BS to me till I see it in independent benchmarks/reviews. Nice to see some pics though as well as pics of some of the new features.
This could be one of the most epic launches in history,
or one of the biggest flops.
I'm certainly hoping for the former.
I love how Codemasters has to hide beneath a million post-processing effects to cover for its shoddy visual work. Can't wait to see some real DX11 titles though, and a few 20-page-long 5xxx reviews :D
Well, perhaps we can appoint you Mr. Eyefinity, and you can let us all know when issues like cost, desk space, power consumption, minimum framerates, 20-30mm of combined bezel space, and driver problems are no longer things we need to worry about, and when Eyefinity will stop being a near-worthless gimmick for 99% of users.
Perhaps you should spend a bit of time rereading what you post?
I suspect there are more noble crusades one could dedicate oneself to.
As a person who has owned, and owns, both companies' cards, I applaud AMD-ATI for an incredible accomplishment. Now where are the naysayers who scoffed at AMD buying ATI! :clap:
Cost / desk space: I don't see how this would be too much of an issue. People who can afford luxuries will buy into this; people who can't will still be buying budget cards with 15" LCDs.
Power consumption: Seriously?
Minimum framerates: Again, this is meant for games that this card is super overpowered on already.
Bezel space: You choose an odd x odd grid of monitors so your centerpiece is a monitor and not a bezel. Simple enough?
Driver issues: It's already been said that the driver merges everything into one large resolution and the game treats it as such. At least, from what information is available (see the sketch below).
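To make the 'one large resolution' idea concrete (toy numbers; the actual bezel-compensation details are AMD's, not mine):
Code:
# the single large surface a 3x1 monitor group would expose to games
cols, rows = 3, 1
panel_w, panel_h = 1680, 1050  # e.g. three 22" panels
print(f"{cols * panel_w}x{rows * panel_h}")  # 5040x1050, one entry in the game's resolution menu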
I think we had this discussion, Chad, where I patiently explained to you that just because a is not b, it does not mean a is c. This is perfectly analogous to you calling me an amd fanboy just because I said that intel's shillman (that Francois guy) was posting rampantly on a consumer forum and I didn't like that at all.
There might be, but uh, I'd rather people not get swayed by his terrible rhetoric.
Ugh. The new Ruby looks ugly. The old one was way better.
Agreed, I noticed it right away and touched it a little just to see what would turn up.
Photo 1 Before
Photo 1 After
Photo 3 Before
Photo 3 After
I do hope they show a before-and-after video detailing what's different. I recall the developer responding to a YouTube video saying that the DX11 version of this game would look better.
People, this is the dawn of a technology that can become mainstream at the consumer level. Like any tech it takes time, but the door is now open. ;)
Can't wait to see more benchmarks, reviews and pictures. The 5800 series sounds very promising.