AMD's lead with DirectX 11 is 'insignificant' says Nvidia



ajaidev
12-17-2009, 11:18 PM
"Nvidia brushed off the technology lead that rival Advanced Micro Devices built by releasing the first graphics chips to support DirectX 11, saying the release only gives AMD a short-term advantage that won't have a long-term effect on the graphics market...."

http://www.computerworld.com/s/article/9142411/AMD_s_lead_with_DirectX_11_is_insignificant_Nvidia_says?source=rss_hardware


In related news: Over 800 Thousand of DirectX 11 Graphics Chips Shipped by ATI (http://www.xtremesystems.org/forums/showthread.php?t=241094) :ROTF:

villa1n
12-17-2009, 11:28 PM
"Nvidia brushed off the technology lead that rival Advanced Micro Devices built by releasing the first graphics chips to support DirectX 11, saying the release only gives AMD a short-term advantage that won't have a long-term effect on the graphics market...."

http://www.computerworld.com/s/article/9142411/AMD_s_lead_with_DirectX_11_is_insignificant_Nvidia_says?source=rss_hardware


In related news: Over 800 Thousand of DirectX 11 Graphics Chips Shipped by ATI (http://www.xtremesystems.org/forums/showthread.php?t=241094) :ROTF:

Jen-Hsun Huang: Don't be too proud of this technological terror you've constructed. The ability to render DX11 is insignificant next to the power of the GeForce.
Admiral Motti D: Don't try to frighten us with your sorcerous ways, Lord Huang. Your sad devotion to that ancient religion has not helped you conjure up DX11 cards, or given you clairvoyance enough to create a working Fermi...
[Jen-Hsun Huang makes a pinching motion and A M D starts choking]
Jen-Hsun Huang: I find your lack of faith disturbing.

RPGWiZaRD
12-17-2009, 11:30 PM
Well, if they had beaten AMD to DX11 support they'd probably be using the exact opposite tone; it's just the usual marketing speak.

However, I do hope Fermi surprises us positively. When a manufacturer doesn't talk much about an upcoming product, it means either good or bad things to come. :rolleyes:

villa1n
12-17-2009, 11:32 PM
... When a manufacturer doesn't talk much about an upcoming product, it means either good or bad things to come. :rolleyes:

LoL... are there any other options? ^^

Shadov
12-17-2009, 11:34 PM
Honestly, I HATE that shady approach from Nvidia of downplaying and blocking everything they can't offer at the moment.

DirectX 10.1 is a prime example, and now they're doing the same with DX11. :shakes:

I just hope that ATI and Nvidia end up with equal market shares so neither has the power to stifle innovation in the future.

RPGWiZaRD
12-17-2009, 11:36 PM
LoL... are there any other options? ^^

Yes, I was thinking of something in between. The FX 5 series was a disappointment and G80 was a huge success, for example; big difference between those two launches. If a manufacturer doesn't talk much about upcoming products, it usually means either:

a) we don't wanna reveal any of the awesomeness, so that once it's released it gets more positive feedback, word about it spreads faster, sales are good, and the competitor is taken by surprise; or
b) we don't wanna admit the problems with the manufacturing process that we're working on 24/7 in order to hopefully solve them before launch.

Shadov
12-17-2009, 11:51 PM
Yes, I was thinking of something in between. GT200 was a disappointment and G80 was a huge success, for example; big difference. If a manufacturer doesn't talk much about upcoming products, it usually means either: a) we don't wanna reveal any of the awesomeness, so that once it's released it gets more positive feedback, word about it spreads faster, sales are good, and the competitor is taken by surprise; or b) we don't wanna admit the problems with the manufacturing process that we're working on 24/7 in order to hopefully solve them before launch.

In my opinion it's like this (more or less):

If you're the underdog and have a great product coming, you stay quiet so as not to alarm the competition.

If you have the bigger market share, you scream about how great your product will be in order to not let the competition gain ground. Though if the product has some problems and limitations, you are careful about setting expectations in order to not lose confidence (investors and user base etc.).

That way I suppose Fermi will be that "limited in some fields" product, and that's why Nvidia is cautious about what they tell the public.

Am I at least a bit right here? ;)

safan80
12-18-2009, 12:15 AM
I really wish they would start releasing new products rather than trash talk.

Nvidia's Fermi demo relied on wood screws, they keep pushing CUDA, they won't even make a high-end 10.1 card, and now they don't have a DX11 product.

CUDA is leaving that Glide taste in my mouth.

I'm starting to hate Nvidia because they forgot what got them to where they are today: giving customers what they wanted and supporting existing standards.

Teemax
12-18-2009, 12:29 AM
Aren't we supposed to use the Fermi thread for any Fermi related news?

The mods are getting irritated with all the Fermi discussions, I fear.

Florinmocanu
12-18-2009, 12:50 AM
"This 60-day lag between these events of when our competition has DX11 and when we're coming to market will absolutely seem insignificant in the big picture," Hara said.

60-day lag? AMD has already had DX11 for about 2.5 months, 3 by the end of December.

When Fermi hits, they will have had 4-5 months of lead.

That guy is full of crap...
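(A rough sanity check on that arithmetic, as a minimal sketch: the HD 5870's 23 September 2009 launch date is real, while the early-March Fermi date is purely an assumption.)

from datetime import date

hd5870_launch = date(2009, 9, 23)  # Radeon HD 5870, first DX11 card
fermi_launch = date(2010, 3, 1)    # hypothetical Fermi availability (assumption)

lead = (fermi_launch - hd5870_launch).days
print(f"{lead} days, ~{lead / 30:.1f} months")  # 159 days, ~5.3 months

Either way it comes out well past 60 days.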

gOJDO
12-18-2009, 12:52 AM
Stop the crap and show us what you have, nVidia...

NaMcO
12-18-2009, 12:58 AM
DirectX 10.1 is a prime example, and now they're doing the same with DX11. :shakes:


Yeah, but... what's DX10.1 anyway? Did anyone see anything *WORTH* 10.1?

Didn't think so :(

The same is probably going to happen with DX11, with or without nVidia hardware, at least in the coming months, so there's no real need for DX11 cards for a LONG time. But then again, what's new here?

Shadov
12-18-2009, 01:20 AM
Yeah, but... what's DX10.1 anyway? Did anyone see anything *WORTH* 10.1?

Didn't think so :(

The same is probably going to happen with DX11, with or without nVidia hardware, at least in the coming months, so there's no real need for DX11 cards for a LONG time. But then again, what's new here?

DX 10.1 - HAWX, BattleForge, STALKER: Clear Sky anyone? ;)

And again for DX11 - DiRT 2, STALKER: Call of Pripyat, BattleForge, and soon Aliens vs. Predator?

You should never say "No" to free FPS and enhanced visuals.

http://pclab.pl/zdjecia/artykuly/miekrzy/atomic4890/wykresy/stalker-1280aa.png

http://pclab.pl/zdjecia/artykuly/miekrzy/5870premierowy/wykresy/bench/bf1920.png

flopper
12-18-2009, 01:21 AM
Fermi is a highly adaptable multipurpose GPU.
ATI will build something similar in the next generation late next year.

To build the platform of 3D and physics, the hardware must be plug and play, and so far the hardware isn't good enough.
Basically, the infrastructure to build what Nvidia says would be useful isn't here for at least another 4 or so years.
There is no support for 3D, it's buggy, lots of issues etc...

ATI gives us a DX11 card that has Eyefinity, something we can use with the current platform that allows game developers to tweak their engines with little to no effort.
If I buy an ATI card, I get something I can use now, something that adds immersion without needing anything special, as Eyefinity is a benefit I can choose to use or not.

Nvidia is producing for a market that replaces the CPU for scientific computing, cheaper and more powerful, and also provides an adaptable platform for years to come.
However, they are early, the market is still growing, and game-wise Fermi won't offer much new: sure, better physics support and better 3D, but the infrastructure isn't there yet.

ATI gets time to note what to build a year in advance, and even if a chip seldom changes a lot in such a timeframe, it's still some time to prepare for a growing market.

For me, DX11 with Bad Company 2 and BF3 using DICE's game engine is reason enough, and Eyefinity is simply useful, now; as ATI has added it, game developers will build support into their engines.
It's something that allows a better game experience today without a lot of issues.
It's plug and play.
Unless the game developer sucks, like Infinity Ward.

Starscream
12-18-2009, 02:50 AM
It's predictable and normal for Nvidia to say something like that.
Didn't ATI say the exact same thing when Nvidia came out with DX10?


It's normal for one company to say something isn't needed if they don't have it yet.

Nedjo
12-18-2009, 03:17 AM
It's predictable and normal for Nvidia to say something like that.
Didn't ATI say the exact same thing when Nvidia came out with DX10?


Actually, no! ;) AMD was aware of the importance of being first with support for a new API (a lesson they learned with DX9 and the Radeon 9700 Pro). And they worked hard with M$ on shaping up DX10 (and .1 later on), going on to minimize the impact of API-GPU sync...

It's funny how the root of the R600 problems and the GT300 problems is essentially the same: a "disharmony" between the manufacturing tech (in R600's day it was 80nm) and the complexity of the GPU itself... but obviously AMD has learned its lesson since that time.

zalbard
12-18-2009, 04:29 AM
They've said this before; I don't see how it's news to anyone.
They don't have any products, so why should we listen to them? :shrug:

Chrono Detector
12-18-2009, 06:39 AM
NVIDIA is sure talking big here, considering the fact they don't have a DX11-capable product at the moment. Either STFU or GTFO, NVIDIA; nobody wants to hear any more bull:banana::banana::banana::banana: coming out of these jackasses, seriously.

jam2k
12-18-2009, 06:54 AM
What nVidia says is insignificant.

zalbard
12-18-2009, 07:22 AM
What nVidia says is insignificant.
+100!
I say Fermi or GTFO! :D

grimREEFER
12-18-2009, 07:30 AM
Well, the idea isn't that DX11 isn't important; it's that Nvidia can release DX11 cards too, hence the DX11 lead is insignificant because both companies will have it, while GPGPU abilities will be skewed in one direction, giving a long-term lead to Nvidia.

It's hard to make this argument with the GeForce line though, as all the interesting stuff will be in the Teslas.

3NZ0
12-18-2009, 07:39 AM
What nVidia says is insignificant.

This.

Show us your stuff nv, we're all waiting.

Oh wait.

Jamesrt2004
12-18-2009, 07:40 AM
60-day lag? AMD has already had DX11 for about 2.5 months, 3 by the end of December.

When Fermi hits, they will have had 4-5 months of lead.

That guy is full of crap...

win


/threadclose

don_xvi
12-18-2009, 08:02 AM
Jen-Hsun Huang: Don't be too proud of this technological terror you've constructed. The ability to render DX11 is insignificant next to the power of the GeForce.
Admiral Motti D: Don't try to frighten us with your sorcerous ways, Lord Huang. Your sad devotion to that ancient religion has not helped you conjure up DX11 cards, or given you clairvoyance enough to create a working Fermi...
[Jen-Hsun Huang makes a pinching motion and A M D starts choking]
Jen-Hsun Huang: I find your lack of faith disturbing.

I thought this was super funny. Kudos. :up:

eric66
12-18-2009, 08:40 AM
60-day lag? AMD has already had DX11 for about 2.5 months, 3 by the end of December.

When Fermi hits, they will have had 4-5 months of lead.

That guy is full of crap...

Nvidia days are longer than normal days, don't you know. :shakes:

naokaji
12-18-2009, 08:53 AM
^They renamed time :D

For AMD, DX11 is significant; if it didn't matter, the 5xxx cards wouldn't be selling like gold-coated double chocolate chip cookies with free pr0n.

Whether it's really important is questionable, as most games are just console ports, but then maybe it will help kick developers in their lower backs and get them back to making games by means other than copy-paste.

trinibwoy
12-18-2009, 09:21 AM
^They renamed time :D

:rofl:


For AMD, DX11 is significant; if it didn't matter, the 5xxx cards wouldn't be selling like gold-coated double chocolate chip cookies with free pr0n.

They're selling for a whole lot more reasons than just DX11 support, though. You won't see many people upgrading from a 4890 to a 5770, for example. The cards are the fastest around, and that in itself is enough to drive sales.

naokaji
12-18-2009, 09:23 AM
It is true that performance is a major factor (and, in Europe where electricity is expensive, the superb idle power consumption as well), but the whole future-proofing aspect of DX11 should not be underestimated.

ownage
12-18-2009, 09:25 AM
Jen-Hsun Huang: Don't be too proud of this technological terror you've constructed. The ability to render DX11 is insignificant next to the power of the GeForce.
Admiral Motti D: Don't try to frighten us with your sorcerous ways, Lord Huang. Your sad devotion to that ancient religion has not helped you conjure up DX11 cards, or given you clairvoyance enough to create a working Fermi...
[Jen-Hsun Huang makes a pinching motion and A M D starts choking]
Jen-Hsun Huang: I find your lack of faith disturbing.

w00t! :rofl::ROTF:

Blacky
12-18-2009, 10:07 AM
What's the point of releasing DX10.1 products now that DX11 is out anyway?

D749
12-18-2009, 10:17 AM
Reminds me of the whole DX 10.1 ordeal. :rolleyes:

Manicdan
12-18-2009, 10:23 AM
Jen-Hsun Huang: Don't be too proud of this technological terror you've constructed. The ability to render DX11 is insignificant next to the power of the GeForce.
Admiral Motti D: Don't try to frighten us with your sorcerous ways, Lord Huang. Your sad devotion to that ancient religion has not helped you conjure up DX11 cards, or given you clairvoyance enough to create a working Fermi...
[Jen-Hsun Huang makes a pinching motion and A M D starts choking]
Jen-Hsun Huang: I find your lack of faith disturbing.

Does that mean LRB Skywalker saves us? (He's almost the son of CUDA?)

RejZoR
12-18-2009, 10:36 AM
Of course they are saying that when they don't have ANY DX11 hardware...

STEvil
12-18-2009, 11:38 AM
Does that mean LRB Skywalker saves us? (He's almost the son of CUDA?)

What gets to be Princess Leia (or however her name is spelled)?

Glow9
12-18-2009, 12:59 PM
Nvidia 8800 DX10 series: WOOOOO We HAVE DX10, IT'S THE FUTURE!! Cards: 8800 GTX, $600+!!
ATI: :(

ATI DX11 5800 series: WOOOOO 5870, $400!!
Nvidia: ...yeah, so, no games out for it yet. Whatever...

Hrmmmm, it's almost like we've seen this before, except last time it was different... The loser wasn't a poor sport about it, either... Ever get the feeling that Nvidia recruits their PR from Young Republicans clubs? Seriously, no tact whatsoever. Just bash the competition. I'm surprised they haven't come out and told us that using a 5870 will burn down your house.

Chickenfeed
12-18-2009, 01:16 PM
Jen-Hsun Huang: Don't be too proud of this technological terror you've constructed. The ability to render DX11 is insignificant next to the power of the GeForce.
Admiral Motti D: Don't try to frighten us with your sorcerous ways, Lord Huang. Your sad devotion to that ancient religion has not helped you conjure up DX11 cards, or given you clairvoyance enough to create a working Fermi...
[Jen-Hsun Huang makes a pinching motion and A M D starts choking]
Jen-Hsun Huang: I find your lack of faith disturbing.

The win is strong with this one...

Sadasius
12-18-2009, 01:29 PM
Nvidia days are longer than normal days, don't you know. :shakes:

Yes, exactly, because they renamed the hour to "CUDA hours", which are almost 2.5 times longer than our regular hour and consist of 150 minutes. So when they say their product is 2.5 times faster than the competitor's, you have to factor that in as well.

Sly Fox
12-18-2009, 01:29 PM
If anything, I think Nvidia's monopoly on PhysX is what's really insignificant.

Oh wow, you can play 2 or 3 crappy games with some meaningless eye candy thrown in. Awesome. :shocked:

JohnZS
12-18-2009, 01:45 PM
If anything, I think Nvidia's monopoly on PhysX is what's really insignificant.

Oh wow, you can play 2 or 3 crappy games with some meaningless eye candy thrown in. Awesome. :shocked:

Yep, both ATi and nVidia are plugging their own insignificant white elephants at the moment: nVidia with their PhysX and ATi with their Eyefinity... the real deal everyone is interested in here is HOW MANY FPS these cards push at 1920+ with FSAA and AF running intense DirectX 10/11 games.

John

freeloader
12-18-2009, 02:53 PM
Yep, both ATi and nVidia are plugging their own insignificant white elephants at the moment: nVidia with their PhysX and ATi with their Eyefinity... the real deal everyone is interested in here is HOW MANY FPS these cards push at 1920+ with FSAA and AF running intense DirectX 10/11 games.

John

I'll take Eyefinity any day over PhysX. Unless you've used it, there's no way you can say Eyefinity is ATI's white elephant. :up:

Sly Fox
12-18-2009, 03:29 PM
Yep, both ATi and nVidia are plugging their own insignificant white elephants at the moment: nVidia with their PhysX and ATi with their Eyefinity... the real deal everyone is interested in here is HOW MANY FPS these cards push at 1920+ with FSAA and AF running intense DirectX 10/11 games.

John

Couldn't agree with you more man. :up:

Chickenfeed
12-18-2009, 03:30 PM
I'll take Eyefinity any day over PhysX. Unless you've used it, there's no way you can say Eyefinity is ATI's white elephant. :up:

I agree 150%. Wait... :rofl:

Strictly PR garbage, nothing more.

Maybe a fist fight will break out over Twitter again? I can dream, right?

STEvil
12-18-2009, 06:30 PM
Eyefinity is not garbage.

It's Matrox TripleHead2Go for free, and more powerful.

If you think it's garbage, that is probably because you have not used it and don't really care about your gaming experience.

LordEC911
12-18-2009, 06:59 PM
Ever get the feeling that Nvidia recruits their PR from Young Republicans clubs? Seriously, no tact whatsoever. Just bash the competition. I'm surprised they haven't come out and told us that using a 5870 will burn down your house.
I didn't realize that AMD's defense/response to every logical argument was "That's racist." :shrug:
(Since you seem to be implying AMD gets their PR from the Dems/libs)

trinibwoy
12-18-2009, 07:10 PM
I'll take Eyefinity any day over PhysX. Unless you've used it, there's no way you can say Eyefinity is ATI's white elephant. :up:

Longer term, hardware-accelerated physics has a vastly bigger potential market than Eyefinity, though. In the future, anybody with a graphics card will be able to run physics on it. Triple-monitor setups aren't going mainstream anytime soon (or ever).

freeloader
12-18-2009, 08:04 PM
Longer term, hardware-accelerated physics has a vastly bigger potential market than Eyefinity, though. In the future, anybody with a graphics card will be able to run physics on it. Triple-monitor setups aren't going mainstream anytime soon (or ever).

In that same future, ATI will have physics and Eyefinity. I'll take that anytime over physics alone. More features for less money. :up:

Serpentarius
12-18-2009, 08:26 PM
There's nothing wrong with Eyefinity... at least it's a bonus, not a forced standard... unlike PhysX, which gets disabled when an ATi GPU is detected.
Most of us would say that's unfair, but Nvidia says it's business :\

madcho
12-18-2009, 10:13 PM
I agree with ATI; the cards have a lot of 3D power, but it can't all be used at a stupidly low res on a 24".

I actually play a game with multiple accounts, and for me it's the best thing I've seen in years.

You can play on a single large screen with one account at the end of the day, or you can play multiple accounts on the different screens without even rebooting the client: just change the res and play. The 5870 has enough power to run 4-5 accounts without lag (4-8 GB of RAM needed ^^), but using Alt-Tab is not the best solution, and neither is using 5 PCs.

And the single-display mode is very beautiful.

The MMO is EVE, the best game I've seen in years, after the HL saga.

STEvil
12-18-2009, 10:15 PM
Will nV ever create a multi-monitor alternative to Eyefinity? ATi's got a solid handle on the market if they don't.

PhysX does nothing good yet.

RejZoR
12-18-2009, 10:31 PM
PhysX is actually good; the problem is that they market it completely wrong and push it in at the final stage, where devs can only add eye candy instead of affecting core gameplay. Proof of that is Max Payne 2 and Half-Life 2, where physics affect core gameplay from the beginning; and they don't even use hardware physics. If PhysX could run on any hardware, with the only requirement being an NVIDIA PhysX logo spinning in the intro, that would be an excellent advertisement for NVIDIA, even if it could run on AMD hardware.
And you can be sure physics would have evolved much further in the same time, and would be used in a wider range of games. But since NVIDIA just doesn't get it, it is like it is.

LordEC911
12-18-2009, 10:56 PM
Will nV ever create a multi-monitor alternative to Eyefinity? ATi's got a solid handle on the market if they don't.

PhysX does nothing good yet.

I thought I read something that pointed towards a yes, but I cannot recall where.

madcho
12-18-2009, 11:43 PM
PhysX is actually good; the problem is that they market it completely wrong and push it in at the final stage, where devs can only add eye candy instead of affecting core gameplay. Proof of that is Max Payne 2 and Half-Life 2, where physics affect core gameplay from the beginning; and they don't even use hardware physics. If PhysX could run on any hardware, with the only requirement being an NVIDIA PhysX logo spinning in the intro, that would be an excellent advertisement for NVIDIA, even if it could run on AMD hardware.
And you can be sure physics would have evolved much further in the same time, and would be used in a wider range of games. But since NVIDIA just doesn't get it, it is like it is.

I agree; PhysX will die if Nvidia doesn't open it up. PhysX is just a new kind of 3D tech demo, not useful for real gameplay.

A lot of games now do some core gameplay physics in software, and they show a better face than Batman's add-ons.

Half-Life 2 is a game from 2004. Five years later, nobody has created better physics gameplay; it's just been copied. PhysX is very low in interest, and huge work for next to nothing.

PhysX won't help you choose to buy a GeForce; if you choose a GeForce for that, you are very, very stupid.

Eyefinity may be a reason to buy; PhysX can't be.

flopper
12-19-2009, 01:07 AM
Longer term, hardware-accelerated physics has a vastly bigger potential market than Eyefinity, though. In the future, anybody with a graphics card will be able to run physics on it. Triple-monitor setups aren't going mainstream anytime soon (or ever).

Seen the prices on 1920x1080? Those monitors are going down a lot in price; the cost of 3 now is like the cost of 1 three years ago.
Since most might buy 2, it's a feature that will gain momentum.

How much, we will see.

RejZoR
12-19-2009, 01:33 AM
You can get a 24-inch 1920x1080 screen from ViewSonic for like 160 EUR. Sure, it's nothing high end, but you can get 3 of them for a very acceptable price.

JohnZS
12-19-2009, 03:54 AM
Couldn't agree with you more man. :up:

Phew, at least I am not the only one here craving high frame rates in DirectX 10 and 11 games at high res with FSAA and AF.

With regards to the Eyefinity ATi fanboys who said I have not experienced such a thing and have no desire to enhance the gaming experience: I suggest you all take a deep breath and think for a moment.

I have been using 3 monitors (and even 4!) since 2007. For 2D desktop work it is a MUST have. I can compromise to 2 monitors at work, but having a single monitor IS a handicap in an IT environment. (Trust me, it is great to have your Outlook on one monitor, your code in the middle, and testing/research on the right.)

Now with regards to gaming: 3-monitor setups are only good for flight sims... in fact I will retract that statement and say that they are A MUST HAVE for flight sims, and to a lesser extent driving games. However, for all other forms of gaming (and yes, trust me, I have tried), 3-monitor setups and even dual-monitor setups are nonsense.

Until we get panoramic screens with no dividing chunks of plastic and a form of triple DisplayPort input, ATi's Eyefinity is nothing more than a fancy feature for a very niche market. (Rather like nVidia's PhysX... it will never catch on as a must-have...)

Now back on topic: DirectX 11 might be insignificant now, but come February it will be gaining a lot of momentum... and IMHO it will become a must-have feature by the summer, as more and more developers jump onto the DirectX 11 bandwagon.

Also, for those of you crying over nVidia saying they are only 60 days behind: the ATi 5800 series did not become widely available in the UK until 2 weeks ago (and even then it is only the 5850 and a handful of 5870 cards). nVidia's exaggeration is not that far off, considering January 7th is when Fermi will be unveiled.

John

SocketMan
12-19-2009, 04:55 AM
Even if Eyefinity is not used "the way it was meant to be played", having 3 monitors is useful in many other ways: you can have more workspace, or have a game running full screen while the hardware/software sensors (temperatures, CPU/GPU loads, voltages, etc.) are on the second/third screen.
For flight simulators, having the instruments on separate screens is hard to beat.



"We're almost there. ...In Q1, the world will get to see what we've done with Fermi," he said.


Let me guess, you nailed 3 more screws into it? :D
Now it's definitely "the way it was meant to be screwed".

I love NV products, but I absolutely hate their business practices. Seems like they've lost touch with reality.
I can only hope that their engineers aren't drinking the same Kool-Aid as their marketing department.

Final8ty
12-19-2009, 05:18 AM
Yep, both ATi and nVidia are plugging their own insignificant white elephants at the moment: nVidia with their PhysX and ATi with their Eyefinity... the real deal everyone is interested in here is HOW MANY FPS these cards push at 1920+ with FSAA and AF running intense DirectX 10/11 games.

John


I'll take Eyefinity any day over PhysX. Unless you've used it, there's no way you can say Eyefinity is ATI's white elephant. :up:

Indeed, & it does not need code that takes anything away from the game if you don't have an ATI card; if you don't use Eyefinity, it's simply an option like many others, so I don't understand all the negative fuss over its existence, as if it replaced something much more significant.

What a bunch of babies. For example: "I don't use 3 monitors for anything, so the option should not be there for others either; & even if people do use 3 monitors for the workspace, letting them play games on all 3 as well? Good heavens, no." :rolleyes:

Final8ty
12-19-2009, 05:59 AM
Phew, at least I am not the only one here craving high frame rates in DirectX 10 and 11 games at high res with FSAA and AF.

With regards to the Eyefinity ATi fanboys who said I have not experienced such a thing and have no desire to enhance the gaming experience: I suggest you all take a deep breath and think for a moment.

I have been using 3 monitors (and even 4!) since 2007. For 2D desktop work it is a MUST have. I can compromise to 2 monitors at work, but having a single monitor IS a handicap in an IT environment. (Trust me, it is great to have your Outlook on one monitor, your code in the middle, and testing/research on the right.)

Now with regards to gaming: 3-monitor setups are only good for flight sims... in fact I will retract that statement and say that they are A MUST HAVE for flight sims, and to a lesser extent driving games. However, for all other forms of gaming (and yes, trust me, I have tried), 3-monitor setups and even dual-monitor setups are nonsense.

Until we get panoramic screens with no dividing chunks of plastic and a form of triple DisplayPort input, ATi's Eyefinity is nothing more than a fancy feature for a very niche market. (Rather like nVidia's PhysX... it will never catch on as a must-have...)

Now back on topic: DirectX 11 might be insignificant now, but come February it will be gaining a lot of momentum... and IMHO it will become a must-have feature by the summer, as more and more developers jump onto the DirectX 11 bandwagon.

Also, for those of you crying over nVidia saying they are only 60 days behind: the ATi 5800 series did not become widely available in the UK until 2 weeks ago (and even then it is only the 5850 and a handful of 5870 cards). nVidia's exaggeration is not that far off, considering January 7th is when Fermi will be unveiled.

John

It does not matter what you use your monitors for, & your dislike of multi-screen gaming because of the bezels is not ATI's fault.
Not everyone has a big issue with it, & it's for those people.

I really hate the "if it's no good to me then it should not exist or be advertised" attitude, when its existence has no bad effect on anyone who is not interested; we would all have nothing in that case.

trinibwoy
12-19-2009, 07:57 AM
However, for all other forms of gaming (and yes, trust me, I have tried), 3-monitor setups and even dual-monitor setups are nonsense.

It's not complete nonsense. It's certainly more hype than anything else right now, but it should improve over time. Anandtech (http://www.anandtech.com/video/showdoc.aspx?i=3679&p=6) sums it up pretty well: until there's better software support for FOV adjustments and Eyefinity-aware HUDs, it really isn't that useful or practical for most games. For driving and flight simulators it's nice, though.
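(As an aside, the FOV adjustment in question is simple enough to sketch. A minimal example, assuming the usual Hor+ convention where the vertical FOV is held fixed and the horizontal FOV widens with the aspect ratio; the 60-degree figure is just an illustrative value:)

from math import atan, tan, radians, degrees

def horizontal_fov(vertical_fov_deg, aspect):
    # Hor+ scaling: keep the vertical FOV fixed and widen the
    # horizontal FOV to match the (wider) aspect ratio.
    half_v = radians(vertical_fov_deg) / 2
    return degrees(2 * atan(tan(half_v) * aspect))

print(horizontal_fov(60, 16 / 10))  # single 16:10 screen: ~85.5 degrees
print(horizontal_fov(60, 48 / 10))  # three of them side by side: ~140.3 degrees

Games that don't do this and simply stretch the single-screen image are the ones that tend to look wrong on triple-wide setups.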


I really hate the "if it's no good to me then it should not exist or be advertised" attitude, when its existence has no bad effect on anyone who is not interested; we would all have nothing in that case.

It's unfortunate that lesson hasn't been learned with respect to PhysX.

Xoulz
12-19-2009, 08:09 AM
Longer term, hardware-accelerated physics has a vastly bigger potential market than Eyefinity, though. In the future, anybody with a graphics card will be able to run physics on it. Triple-monitor setups aren't going mainstream anytime soon (or ever).

Riiiiight!

There are more games released using physics than using PhysX. One look at the new Battlefield: Bad Company 2 illustrates how much better non-PhysX games look and feel.

Batman: AA and its ancillary paper shuffling (:ROTF:), etc... really? Or DICE's physics..? Nvidia PhysX sucks!



Dell's spectacular new monitor (ST2410) goes on sale all the time @ $189, so for roughly $570 many gamers WILL be using Eyefinity, since it gives them an advantage and immersion.

People don't need PhysX; it offers nothing greater than regular physics.

trinibwoy
12-19-2009, 08:10 AM
Sigh, please read carefully before you reply to a post. I said hardware-accelerated physics, not PhysX. Geez.

JohnZS
12-19-2009, 09:59 AM
Even if Eyefinity is not used "the way it was meant to be played", having 3 monitors is useful in many other ways: you can have more workspace, or have a game running full screen while the hardware/software sensors (temperatures, CPU/GPU loads, voltages, etc.) are on the second/third screen.
For flight simulators, having the instruments on separate screens is hard to beat.


I agree with you 110% with regards to flight sims. IMHO Eyefinity (or Matrox TripleHead, or Synergy, and other technologies which have been out for years) is a MUST have for flight sim fans.

I guess the reason I am so negative towards Eyefinity is that ATi are banging on about it like they have reinvented the wheel, yet sadly they are not the first (nor will they be the last) to offer this feature.

Anyway, I agree with Anandtech: other than driving and flight sim games (and of course the desktop estate), there really is no need for this in FPS and other genres of games.

nVidia's 3D will be the next big thing... and trust me, that is NOT another "PhysX"; this really does add to the gaming experience. Sadly the monitors are not quite up to the standard (1920x1200+ IPS) that I require.

Oh yeah, and I am with trinibwoy: hardware-accelerated physics (OpenCL-based or DirectCompute-based) is going to be important to the gamer (note to nVidia: DirectCompute is actually one of the significant advantages of having DirectX 11 hardware).

What we all need to do is forget about PhysX and its closed API (which, let's be honest, hasn't made my jaw drop to the floor... not even in Batman!) and move on to the standardized APIs which are supported by all.

John
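(For illustration of why any of those compute APIs map well to game physics: the workload is one independent update per particle or object, which is exactly the shape of problem a GPU runs well. A toy sketch in NumPy, standing in for what a DirectCompute/OpenCL kernel would do; all names and numbers here are made up:)

import numpy as np

# One 60 Hz frame of a naive particle update: every particle is
# independent, so a GPU can process all of them in parallel.
n = 100_000
pos = np.zeros((n, 3), dtype=np.float32)
vel = np.random.randn(n, 3).astype(np.float32)
gravity = np.array([0.0, -9.81, 0.0], dtype=np.float32)
dt = 1.0 / 60.0

vel += gravity * dt  # integrate acceleration
pos += vel * dt      # integrate velocity

Nothing about that work is vendor-specific, which is the argument for standardized APIs over a proprietary one.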

Final8ty
12-19-2009, 10:48 AM
It's not complete nonsense. It's certainly more hype than anything else right now, but it should improve over time. Anandtech (http://www.anandtech.com/video/showdoc.aspx?i=3679&p=6) sums it up pretty well: until there's better software support for FOV adjustments and Eyefinity-aware HUDs, it really isn't that useful or practical for most games. For driving and flight simulators it's nice, though.



It's unfortunate that lesson hasn't been learned with respect to PhysX.


I agree with you 110% with regards to flight sims. IMHO Eyefinity (or Matrox TripleHead, or Synergy, and other technologies which have been out for years) is a MUST have for flight sim fans.

I guess the reason I am so negative towards Eyefinity is that ATi are banging on about it like they have reinvented the wheel, yet sadly they are not the first (nor will they be the last) to offer this feature.

Anyway, I agree with Anandtech: other than driving and flight sim games (and of course the desktop estate), there really is no need for this in FPS and other genres of games.

nVidia's 3D will be the next big thing... and trust me, that is NOT another "PhysX"; this really does add to the gaming experience. Sadly the monitors are not quite up to the standard (1920x1200+ IPS) that I require.

Oh yeah, and I am with trinibwoy: hardware-accelerated physics (OpenCL-based or DirectCompute-based) is going to be important to the gamer (note to nVidia: DirectCompute is actually one of the significant advantages of having DirectX 11 hardware).

What we all need to do is forget about PhysX and its closed API (which, let's be honest, hasn't made my jaw drop to the floor... not even in Batman!) and move on to the standardized APIs which are supported by all.

John
ATI are not saying they are the first to do multi-screen gaming, but they are the first to do standard multi-screen gaming out of the box.
A THGO alone can cost £150-£200.

The FOV is important, the HUD is secondary, & the people who use Eyefinity play a lot more than just driving & flight sims, & none of them say it's no good for FPS even in its current state.

Your comments are opinions based on your personal issues with the tech, things that are unacceptable to you, & you seem to think that everyone shares them.

Final8ty
12-19-2009, 11:00 AM
It's not complete nonsense. It's certainly more hype than anything else right now, but it should improve over time. Anandtech (http://www.anandtech.com/video/showdoc.aspx?i=3679&p=6) sums it up pretty well: until there's better software support for FOV adjustments and Eyefinity-aware HUDs, it really isn't that useful or practical for most games. For driving and flight simulators it's nice, though.



It's unfortunate that lesson hasn't been learned with respect to PhysX.

Not the same issue at all.

trinibwoy
12-19-2009, 12:06 PM
Your comments are opinions based on your personal issues with the tech, things that are unacceptable to you, & you seem to think that everyone shares them.

Not at all. There are, unsurprisingly, relatively few first-hand accounts of Eyefinity usage from the community, as most people simply do not have 3 monitors or the horsepower to drive them. So we have to look to sites like [H], Anandtech and PCPer for their presumably more objective evaluations.

Criticism and praise are both based on opinion, but there are certain things that you intuitively know make sense. I have a single 30" monitor, and my head does a fair bit of travelling to cover the screen in an FPS when I need to focus on something that's not right in the center. In a driving game that's not so important, because the road is always fixed to the middle screen. Someone can claim that looking 2 feet to their right to check their health gauge or ammo on the HUD is not an issue, but I'm not going to buy that given my experience with a single monitor.

Jowy Atreides
12-19-2009, 12:13 PM
Also, for those of you crying over nVidia saying they are only 60 days behind: the ATi 5800 series did not become widely available in the UK until 2 weeks ago (and even then it is only the 5850 and a handful of 5870 cards). nVidia's exaggeration is not that far off, considering January 7th is when Fermi will be unveiled.

John


OK, but only if you also consider GT300 released when its lineup is widely available, not on January 7th.
Only fair by your justification. No double standards!


That should be next Christmas :p

It'll still be more than 60 days :ROTF:

Final8ty
12-19-2009, 12:33 PM
Not at all. There are, unsurprisingly, relatively few first-hand accounts of Eyefinity usage from the community, as most people simply do not have 3 monitors or the horsepower to drive them. So we have to look to sites like [H], Anandtech and PCPer for their presumably more objective evaluations.

Criticism and praise are both based on opinion, but there are certain things that you intuitively know make sense. I have a single 30" monitor, and my head does a fair bit of travelling to cover the screen in an FPS when I need to focus on something that's not right in the center. In a driving game that's not so important, because the road is always fixed to the middle screen. Someone can claim that looking 2 feet to their right to check their health gauge or ammo on the HUD is not an issue, but I'm not going to buy that given my experience with a single monitor.

You're on the wrong forum, just like the person who said that not many people buy SP games on the PC, before I pointed him to a thread of people listing their games.

Affordability is irrelevant to the point about the issues with the tech.
The tech is there if you want to use it, plain & simple, & no one is forced to use it, so basically this is making a fuss over nothing.

Criticism and praise are both based on opinion, but it's something else when people use them to tell everyone that it's useless to themselves AND everyone else.
There is a lot of tech out there that is useless to 90% of the population but vital to the 10% who do use it. No one can speak for everyone, & no one should try to.

Blacky
12-19-2009, 01:03 PM
In that same future, ATI will have physics and Eyefinity. I'll take that anytime over physics alone. More features for less money. :up:

And what makes you think Nvidia cannot come out with something similar to Eyefinity? It's not like ATi is 1000 years ahead of Nvidia :rolleyes:

STEvil
12-19-2009, 01:19 PM
Blacky - there have been no rumblings of such yet, unfortunately.

I'd be doing Eyefinity right now, but there are no 5870/5890 2GB cards yet... so I'm still on my 4870X2 :(



I agree with Final8ty: you're fussing about nothing, trinibwoy. It also sounds like you're sitting too close to your monitor if you need to turn your head. I don't with my 30" 2560x1600.

FischOderAal
12-19-2009, 01:46 PM
And what makes you think Nvidia cannot come out with something similar to Eyefinity? It's not like ATi is 1000 years ahead of Nvidia :rolleyes:

The way I understand it, there are some things you need to add to the GPU to drive even more displays. As designing a chip is not "easy" and/or quick, NVIDIA must have done so several months ago if they want to do Eyefinity as well.

Amirite?

yngndrw
12-19-2009, 01:54 PM
I don't see why people like Eyefinity so much, especially the two-screen setups. I'd hate having a border down the middle of my view.

Final8ty
12-19-2009, 02:11 PM
I don't see why people like Eyefinity so much, especially the two-screen setups. I'd hate having a border down the middle of my view.

Most would not use a 2-screen setup.

Manicdan
12-19-2009, 02:11 PM
I don't see why people like Eyefinity so much, especially the two-screen setups. I'd hate having a border down the middle of my view.

Because not everyone likes 2-screen setups, maybe?

Tim
12-19-2009, 02:26 PM
Blacky - there have been no rumblings of such yet, unfortunately.

They mentioned in one of their PR interviews that it would be easy for them to do so, because their Quadro line has been capable of this for a long time.

Just the messenger, don't shoot. :)

Really can't remember which interview it was.

tajoh111
12-19-2009, 03:17 PM
Most would not use a 2-screen setup.

That's the problem: most people need only 2 screens for work, and those people are very few at that. 3 screens is where Eyefinity starts to get useful, but it's a real investment at this point.

The investment of 400 dollars for the extra 2 monitors is something, but there is another cost people are forgetting: a new desk. 3 monitors take up a huge amount of space, and from what I have seen, the only people who typically have huge desks are those who do work-oriented stuff rather than gaming.

Also from what I have seen, everything but the 5970 (and even that) will be too slow for DirectX 11 games such as DiRT 2. AvP is running even worse, apparently.

http://www.pcgameshardware.com/aid,700050/Dirt-2-benchmarks-DX-9-vs-DX-11-Update-Radeon-HD-5970-results/Practice/

49 FPS is not exactly encouraging on a single screen (at 1920x1200). Eyefinity is going to butcher the FPS to unplayable levels.

DirectX 11 and Eyefinity appear to be incompatible at this point, because the hardware is not fast enough. This is a shame. If you have to vastly reduce the image quality to get playable framerates, it seems counterproductive to me.

It's almost ironic to me that the reason PhysX is failing is simply that Nvidia is not being corrupt enough, or not pushing enough money to game companies. It makes way more sense to put physics on the CPU, because everyone has one (instead of a dedicated AMD or NV GPU); thus, it's going to take some bribery to get PhysX used by a lot of companies.

What NV needs to do, if they want PhysX to catch on, is pay more companies to use PhysX exclusively in their game engines.

Nvidia is not going to make money (at least not enough to cover the cost of purchasing Ageia) by making PhysX open source and allowing it to run on AMD hardware. If anything, that would just lose them money by wasting what they spent to purchase Ageia in the first place.

I don't condone it, but I can see why Nvidia wants to keep PhysX NV-exclusive: it gives consumers a reason to buy their card over an AMD one. Not a very strong one at this point, however, unless NV gets more games on board.

I can also understand why Intel has kept Havok open at this point. Havok is still tied to the Intel name, which helps sell their cards, but if Intel ever made it run only on Intel hardware, they would be sued like crazy because of the position they are in.

Final8ty
12-19-2009, 03:26 PM
That's the problem: most people need only 2 screens for work, and those people are very few at that. 3 screens is where Eyefinity starts to get useful, but it's a real investment at this point.

The investment of 400 dollars for the extra 2 monitors is something, but there is another cost people are forgetting: a new desk. 3 monitors take up a huge amount of space, and from what I have seen, the only people who typically have huge desks are those who do work-oriented stuff rather than gaming.
Snip.

Again, I'm talking about the gaming aspect, because driving 2 screens with a GFX card for productivity has been around long enough & has nothing to do with what we are talking about.
You're also talking about the cost as if everything should be in your cost bracket or not exist at all.

And again, you're making a fuss about nothing.

It's for the people who can afford the screens & the space; not everything is aimed at the cheapskate.

It seems to me that you just don't know as many people as I do who have bought 2 more screens, & not one of them has had to buy a new desk.

Manicdan
12-19-2009, 03:46 PM
You don't have to have 3x 1920x1200 monitors with Eyefinity. I was thinking of going with a smaller setup, maybe 3x 900x1440 (1440x900 panels in portrait), so the total would be 2700x1440, which isn't much higher in resolution than a 30" monitor, for about half the price.
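(The pixel math behind that comparison, as a quick sketch; the 30" figure assumes the usual 2560x1600 panel:)

# Total resolutions for the setups mentioned above.
setups = {
    "3x 1440x900 in portrait": (3 * 900, 1440),
    "3x 1920x1200 landscape": (3 * 1920, 1200),
    "single 30-inch 2560x1600": (2560, 1600),
}
for name, (w, h) in setups.items():
    print(f"{name}: {w}x{h} = {w * h / 1e6:.1f} megapixels")

# 3x 1440x900 in portrait: 2700x1440 = 3.9 megapixels
# 3x 1920x1200 landscape: 5760x1200 = 6.9 megapixels
# single 30-inch 2560x1600: 2560x1600 = 4.1 megapixels

So the portrait setup is actually slightly fewer pixels to drive than a single 30" screen.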

Blacky
12-19-2009, 07:42 PM
Blacky - there have been no rumblings of such yet, unfortunately.

I'd be doing Eyefinity right now, but there are no 5870/5890 2GB cards yet... so I'm still on my 4870X2 :(



I understand, but Eyefinity is not something where Nvidia needs to recruit a rocket scientist to launch an "nfinity" edition card. If a simpler architecture like R800 (compared to Fermi on paper) can do it, why can't Nvidia's Fermi? ATi took everyone by surprise when they launched R800 with this feature; it's not like people knew it was coming, if I remember right, and the same can be done by Nvidia (they also still have time to prepare their own similar solution)... Another thing that bugs me: if the resolution scales to 3760++ or some odd number like that, does the picture actually look "that" HD, or is it just a bigger picture?...

STEvil
12-19-2009, 09:26 PM
If Eyefinity were so simple (i.e. nV could add it to "Fermi" easily with no hardware changes), why don't ATi release drivers to enable it on older cards as well, Blacky? Either it's hardware-level stuff (probably fully by now), or it's completely possible on older hardware with a slave card in X-Fire/SLI handling the third display/physics/etc.

Scaling of the image will depend on the game for the most part (the aspect ratio used).

Blacky
12-19-2009, 09:40 PM
If Eyefinity were so simple (i.e. nV could add it to "Fermi" easily with no hardware changes), why don't ATi release drivers to enable it on older cards as well, Blacky? Either it's hardware-level stuff (probably fully by now), or it's completely possible on older hardware with a slave card in X-Fire/SLI handling the third display/physics/etc.

Scaling of the image will depend on the game for the most part (the aspect ratio used).

Maybe because R800 is a powerhouse right now, and they're probably the only ones capable of running those odd resolutions smoothly? Idk, who knows; they have their reasons. Though one thing that bugs me is how it can run smoothly at such resolutions; I'm starting to think that Eyefinity is just like using SoftTH or Matrox TripleHead, just for "free".

Final8ty
12-19-2009, 11:51 PM
You don't have to have 3x 1920x1200 monitors with Eyefinity. I was thinking of going with a smaller setup, maybe 3x 900x1440 (1440x900 panels in portrait), so the total would be 2700x1440, which isn't much higher in resolution than a 30" monitor, for about half the price.

But if you did want to, you can get 2x 1920x1200 @ £220 for the pair, which is a little more than the cost of a THGO itself; yet some here moan about EyeFin and not about THGO, which has the same bezel issue, less res, & costs more from the get-go.

So some people are moaning about a free THGO function in a new card that they don't own, & they would not be forced to use that function even if they owned the card.

tajoh111
12-20-2009, 12:40 AM
Snip.

Again, I'm talking about the gaming aspect, because driving 2 screens with a GFX card for productivity has been around long enough & has nothing to do with what we are talking about.
You're also talking about the cost as if everything should be in your cost bracket or not exist at all.

And again, you're making a fuss about nothing.

It's for the people who can afford the screens & the space; not everything is aimed at the cheapskate.

It seems to me that you just don't know as many people as I do who have bought 2 more screens, & not one of them has had to buy a new desk.

I think it's unreasonable to call people who don't want to spend money on 2 extra quality screens cheapskates. People wanting to spend 300 dollars total or less on monitors make up the vast majority of gamers. The thing that's going to determine Eyefinity's success is its adoption rate. Simply put, if it's targeting the most expensive bracket of gamer, it's going to get a smaller adoption rate and be less of a success (lower-cost video cards are the only lever the technology has; monitors are independent at this point). Additionally, the cost of a card that can effectively run Eyefinity is essentially $400+, so things are starting to add up. In this bad economy, multiple monitors for gaming is barely a want and more of a luxury.

Your friends are likely in the IT industry and use their desks as a work environment, hence the large desks. Most of the pictures I see of traditional, non-extreme setups have smaller desks, especially for younger gamers or college gamers.

What do you mean, my cost bracket? I consider my setup pretty high end, and I am not thinking about an Eyefinity setup because of the inherent weaknesses of the technology (too slow for newer games, and driver issues) and the superfluous nature of the technology itself (it's not like this needed much development to get three screens running; it's just that people don't need 3 monitors to game). Additionally, the speed jump this generation from 4xxx to 5xxx was not enough for me to make the jump, and this is ultimately the biggest weakness of Eyefinity for the future.

You think the reason I am not getting an Eyefinity setup is cost, or that it's not in my price bracket? My system is far above what I consider a cheapskate system and goes well into the high end. Not only that, I already have two monitors.

I am not putting Eyefinity down so much as a technology; I think it's kind of cool under very particular circumstances (flight sims, driving). I just don't think it will be successful in the long term, because of several factors working against it. And if we want a current example of something with the same factors working against it (cost, drivers, implementation, video card speed), look at NV 3D gaming. That's spiraling very fast towards failure, and monitor cost (and monitor size) is one of its issues.

One thing that needs to be dealt with soon to help the technology succeed is bezel size. Those Eyefinity-specialized monitors cannot come soon enough, as people considering an Eyefinity setup are going to base the decision on first-hand accounts, and initial impressions show those bezels are noticeable.

This is also a reason why I don't want an Eyefinity setup now. I know that if I got another monitor, I would just kick myself when they started selling monitors with zero bezel. I only want to buy this type of setup once, not do a tri-monitor upgrade every year or two.

I don't know about you, but my ideal Eyefinity setup has three monitors of the same brand and type for aesthetic reasons (I don't want a chimera on my desk), in addition to almost unnoticeable bezels.

If I am going to do Eyefinity, I am going to do it right the first time and not half-assed, and that requires faster cards on AMD's part and better monitors.

Buying an Eyefinity setup now just seems like a waste of money, because you're setting yourself up for disappointment in the upcoming few months.



You don't have to have 3x 1920x1200 monitors with Eyefinity. I was thinking of going with a smaller setup, maybe 3x 900x1440 (1440x900 panels in portrait), so the total would be 2700x1440, which isn't much higher in resolution than a 30" monitor, for about half the price.

The smaller your screens though, the more you're going to notice those bezels, because they come more into your main field of view and less into your peripheral vision. In addition, the size of your main screen is important; I don't think you should sacrifice the size of your main screen for the sake of Eyefinity.

Final8ty
12-20-2009, 01:14 AM
I think it's unreasonable to call people who don't want to spend money on 2 extra quality screens cheapskates. People wanting to spend 300 dollars total or less on monitors make up the vast majority of gamers. The thing that's going to determine Eyefinity's success is its adoption rate. Simply put, if it's targeting the most expensive bracket of gamer, it's going to get a smaller adoption rate and be less of a success (lower-cost video cards are the only lever the technology has; monitors are independent at this point).

First, the cost is irrelevant to the people who moan about the space needed & the problems with having bezels; they don't want it anyway, so what does it matter?
The price of the monitors has nothing to do with ATI.
I see a massive adoption rate at OcUK.
And what do you expect, 3-monitor gaming to come flying out of the gates? EyeF has already got more people into multi-screen than would have got into it otherwise, simply because it's there as an option from the get-go.

Your argument is flawed, as EF is not a standalone product like THGO, & everything has to start somewhere & costs a lot at first.
From what I see, its success is irrelevant; all I see are people who like the tech & people who do not, & 3-screen having a huge uptake will not change anything for people who simply do not like it.
It's not like games will be flying out that state 3 screens minimum.

If everything was based on your argument & timescale, there would be no plasma or LCD in most homes except for the very rich.

And "cheapskate" sticks, as many people spend far more than £200 in other non-essential areas.
From what I have read, the people who want EF but don't have it yet have no issue with the cost of the monitors; it's only the people who don't want EF who bring up the costs.

tajoh111
12-20-2009, 01:27 AM
First, the cost is irrelevant to the people who moan about the space needed & the problems with having bezels; they don't want it anyway, so what does it matter?
The price of the monitors has nothing to do with ATI.
I see a massive adoption rate at OcUK.
And what do you expect, 3-monitor gaming to come flying out of the gates? EyeF has already got more people into multi-screen than would have got into it otherwise, simply because it's there as an option from the get-go.

Your argument is flawed, as EF is not a standalone product like THGO, & everything has to start somewhere.
From what I see, its success is irrelevant; all I see are people who like the tech & people who do not, & 3-screen having a huge uptake will not change anything for people who simply do not like it.
It's not like games will be flying out that state 3 screens minimum.

I don't understand your first sentence. Also, is Eyefinity EF? What is THGO? I don't want to sound like a :banana::banana::banana::banana:, but whether you're writing a scientific thesis or a paper in general, it's best to use the full form first, with the abbreviation right after it. I googled THGO and got Total Hepatic Glucose Output, so I don't think you mean that.

I already said AMD has nothing to do with the monitor price, and the only thing AMD can do about the cost barrier of Eyefinity is lower the price of their cards, thus reducing the total cost of building an Eyefinity setup.

Smaller, if not non-existent, bezels are really important for the technology to take off. The first people who are going to buy these monitors are people with Eyefinity-ready video cards who are waiting for the monitors to come out.

I might consider the technology, but it is too immature at this point. I think a lot of people can agree on this. Faster cards to handle DirectX 11 games, in addition to better monitors, will work wonders for adoption. It's just not a must-have at this point.

One thing that really isn't making me want to buy a new video card or gaming setup at this point is the lack of games (I think you can agree with me on this one); there are simply not enough recently released good PC games that I haven't already played or wanted to. As a result, such a setup would be wasted, because I'm not going to replay games I've already finished just for the sake of playing them on an Eyefinity setup; that's a waste of time.

STEvil
12-20-2009, 01:33 AM
EF is Eyefinity.

TH2G is Matrox's Triple Head 2 Go... basically the forerunner of Eyefinity.

If bezels are such a problem, then people who already run 2+ monitors should be complaining loudly. They aren't.

Final8ty
12-20-2009, 01:35 AM
Smaller, if not non-existent, bezels are really important for the technology to take off. The first people who are going to buy these monitors are people with Eyefinity-ready video cards who are waiting for the monitors to come out.

Smaller bezels will help, but you carry on like multi-screen gaming was not around before; all the costs involved & the bezel issue have always been there with the TH2Go approach, so why moan now with EF? & if you can't work out what EF means from the context, that's your problem.

And I personally don't care about your personal reasons for not going EF.


I already said AMD has nothing to do with the monitor price, and the only thing AMD can do about the cost barrier of Eyefinity is lower the price of their cards, thus reducing the total cost of building an Eyefinity setup.


That makes no sense at all, because you could use that excuse for any feature on the video card. ATI would have nothing to gain by making an EF setup cheaper than they already have compared to the TH2Go, just so the money could be spent elsewhere that does not go into ATI's pocket; it's not as if people would not buy the card anyway were it not for the EF feature.

tajoh111
12-20-2009, 02:01 AM
EF is Eyefinity.

TH2G is Matrox's TripleHead2Go... basically the forerunner of Eyefinity.

If bezels are such a problem, then people who already run 2+ monitors should be complaining loudly. They aren't.

That's because they are using them in a work environment, where each part of their work is on a separate screen. If they were working with a continuous image, that would not be the case. And the people who have bought Eyefinity setups have already accepted the consequences of the bezels; they knew what they were getting into and are willing to put up with that con. I am talking about mass adoption, or more adopters at the very least. Additionally, if a good monitor comes out with next to no bezel, I can tell you this: there are going to be a lot of Eyefinity owners with that monitor on their wish list.


Smaller bezels will help, but you carry on like multi-screen gaming was not around before. All the costs involved & the bezel issue have always been there with the TH2Go approach, so why moan now with EF? And if you can't work out what EF means from the context then that's your problem.

I had kind of already figured out what EF meant, but I really had no idea what THGO was. It was not that popular a technology and I didn't even mention it; I was simply asking you to clarify what you said, and you responded with a comment bordering on a personal attack.

Heck, TH2Go was obscure for gaming because the cards were so slow. It was such a failure for gaming that there was no need to moan, because no one bragged about it like people do for Eyefinity. No one bragged like it was some savior or a huge selling point for gaming on Matrox cards.

Eyefinity as it is, is almost better suited to a business environment, and hence might be a good reason for a business to use an AMD card, so more power to AMD in this regard, as Matrox cards are pricey for what they are (but I hear their drivers and stability are top notch). But for gaming there are some obvious flaws.

I am saying cost can play a role in adoption rate, though. Look how successful Nvidia 3D Vision is (I initially thought you meant this by THGO, but the abbreviation was all wrong). It's a failure because of cost, performance issues, and drivers: the same reasons working against Eyefinity at this point.

I am talking about mass adoption in general; it seems like you don't care much about the quantity of adoption itself.

My reasons are not personal at all; I think my reasons for not buying an Eyefinity card apply to a huge demographic and the vast majority of gamers. Hence they are reasonable grounds for arguing why Eyefinity won't get mass adoption, or even a large percentage. If we don't care about the number of people who adopt an AMD card because of Eyefinity, we could argue PhysX is a success, because there are obviously people out there who bought an NV card because of PhysX. Yet you would consider PhysX dying and a failure, as would most AMD card owners, myself included at this point.

Final8ty
12-20-2009, 02:17 AM
That's because they are using them in a work environment, where each part of their work is on a separate screen. If they were working with a continuous image, that would not be the case. And the people who have bought Eyefinity setups have already accepted the consequences of the bezels; they knew what they were getting into and are willing to put up with that con. I am talking about mass adoption, or more adopters at the very least. Additionally, if a good monitor comes out with next to no bezel, I can tell you this: there are going to be a lot of Eyefinity owners with that monitor on their wish list.

I had kind of already figured out what EF meant, but I really had no idea what THGO was. It was not that popular a technology and I didn't even mention it; I was simply asking you to clarify what you said, and you responded with a comment bordering on a personal attack.

Heck, TH2Go was obscure for gaming because the cards were so slow. It was such a failure for gaming that there was no need to moan, because no one bragged about it like people do for Eyefinity. No one bragged like it was some savior or a huge selling point for gaming on Matrox cards.

Eyefinity as it is, is almost better suited to a business environment, and hence might be a good reason for a business to use an AMD card, so more power to AMD in this regard, as Matrox cards are pricey for what they are (but I hear their drivers and stability are top notch). But for gaming there are some obvious flaws.

I am saying cost can play a role in adoption rate, though. Look how successful Nvidia 3D Vision is (I initially thought you meant this by THGO, but the abbreviation was all wrong).
It's a failure because of cost, performance issues, and drivers: the same reasons working against Eyefinity at this point.

All the people I know use EF for gaming, & I know plenty with the TH2Go. So what if people want to brag about it? Its popularity is irrelevant to the satisfaction of the individuals using it (niche), & thank god for it.
People using or bragging about EF should not make any difference to you, unless you just don't like others being happy about something that you don't have, for whatever reason, & that does not affect you or the games you play.

I don't care for NV 3D, but I'm not going around telling others not to care.

You don't seem to get the point.
The adoption rate for EF is irrelevant to ATI, as ATI does not make money on the monitors or anything else needed for it. EF is a feature already on the card, just like:
- HDCP Capable
- DirectX 11 Support
- OpenGL 3.2 Support
- ATI CrossFireX Ready
- ATI Eyefinity Technology
- ATI Avivo HD
- ATI Stream Technology
- Unified Video Decoder 2 (UVD) for Blu-ray and HD Video
- Built-in HDMI with 7.1 surround sound support

Whether any of the above get used by the consumer really doesn't matter to ATI (besides CrossFireX, maybe); all that matters is that the card has been sold. ATI could not care less whether people choose to use the built-in HDMI with 7.1 surround sound hooked up to a TV & AV receiver; it is purely an option, like all the others on the list.

The more options the better: people are more likely to buy the product because one or more of the available features hits the spot for them, & if that one happens to be EF then good on them.

SocketMan
12-20-2009, 02:22 AM
Also, from what I have seen, everything but the 5970 (and even that) will be too slow for DirectX 11 games such as Dirt 2. AvP is apparently running even worse.

http://www.pcgameshardware.com/aid,700050/Dirt-2-benchmarks-DX-9-vs-DX-11-Update-Radeon-HD-5970-results/Practice/

49 FPS is not exactly encouraging on a single screen (at 1920x1200). Eyefinity is going to butcher the FPS to unplayable levels.

DirectX 11 and Eyefinity appear to be incompatible at this point because the hardware is not fast enough. This is a shame. If you have to vastly reduce the image quality to get playable framerates, it seems counterproductive to me.




That's just one level out of dozens. The Utah track (for example) gets 20-30 FPS more than the London one.
Settings like "car reflections" (ULTRA vs LOW) eat up another 20 FPS.


That 49 FPS is hardly a "final verdict" for the entire DX11 lineup; it's more of a worst-case scenario.

Also, running the (built-in) benchmark 10 times (on the same track) gets 10 different results as long as there is more than one car in the test.
Every (benchmark) race is different, just like the FPS.

They've used 8 cars - the default number. :)
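
If you want numbers you can actually compare, you pretty much have to average several runs and quote the spread. A rough sketch of the idea in Python; the FPS values below are invented for illustration, not real Dirt 2 results:

import statistics

# invented example results from repeated runs of a built-in benchmark
runs = [49.0, 52.3, 47.8, 50.1, 51.6, 48.9, 50.7, 49.5, 52.0, 48.4]

print(f"avg {statistics.mean(runs):.1f} FPS, "
      f"+/- {statistics.stdev(runs):.1f} over {len(runs)} runs")

A single 49 FPS run tells you a lot less than the mean and spread over ten runs.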

flopper
12-20-2009, 03:38 AM
Eyefinity-style widescreen gaming has been around a long time using TH2Go systems, which weren't plug and play.
People doing such gaming often built their own systems with projectors and the like.
For flight and racing sims, three screens are a real benefit to immersion.

ATI's development team was behind creating the TH2Go systems; now it's supported directly in hardware, basically plug and play,
which allows many more people who have one or two screens to try widescreen gaming. Support for widescreen is going to increase thanks to ATI's initiative.

I use a mixed screen combo, one 1920x1200 with two 1920x1080s, meaning 5760x1080 in gaming. Even though the odd screen scales down, it's in my FoV, so it adds side vision.
Bezels aren't anything I notice, and we will get better support with screens that have little to no bezels in the coming year; ATI chose to set this in motion.

Now, even if someone isn't gaming on three screens, people who work will have a chance to add more screens, even with the low-end cards.

Now, as DX11 is adding features that allow a game developer to increase frame rates, or keep them the same, or even improve them, as BattleForge has done, it seems neither DX11 nor Eyefinity is insignificant. Note that Nvidia isn't downplaying Eyefinity...

It's a free feature that will gain support, as monitors today can be found at cheap prices.
Three screens today cost as much as one did three years ago.

BTW, Dragon Age rocks using Eyefinity, and it's a new game..
:yepp:

And besides, I'd like to get BF:BC2 for Eyefinity, as it seems to be a game to get.

trinibwoy
12-20-2009, 05:13 AM
.....

Nobody was forcing their opinions on you or telling you not to spend your money. In fact it seems that you think we're not entitled to an opinion that differs from yours. I was talking about the issues with having the HUD stretched out a couple of feet horizontally in front of you, but you seem to have missed that point. There are also other practical considerations, like physical space requirements etc.


Note that Nvidia isn't downplaying Eyefinity...

How could they? Multi-display support is a key feature of their Quadro line. This isn't some new tech, guys; it's nice that AMD added some outputs to their consumer cards and brought it down-market, but it's been around forever. The real innovation here is that it's doable on a single card.

http://img689.imageshack.us/img689/3055/nv29.jpg

Final8ty
12-20-2009, 05:40 AM
Nobody was forcing their opinions on you or telling you not to spend your money. In fact it seems that you think we're not entitled to an opinion that differs from yours. I was talking about the issues with having the HUD stretched out a couple of feet horizontally in front of you, but you seem to have missed that point. There are also other practical considerations, like physical space requirements etc.

You're repeating yourself, as I have answered all your points already.

You are telling me about space, & I have told you that none of the people I know who have EF have had to worry about space or buying a new desk for three monitors when beforehand they had just one monitor.

Space may be an issue for you.

When you generalise your opinion of how others will see things purely based on your own likes & dislikes, that is forcing your opinions on others, because you're trying to speak for people besides yourself who are not in your situation or circumstances.

Everyone is entitled to their OWN opinion, yourself included.

Of course some things can be generalised, but it's too soon to judge in regards to EF.

I know about the HUD stretch & I really don't care enough about it for it to even slightly put me off EF; it's not a big issue for me. It may be for you, but not for me.

At the end of the day people either have the space & money or they don't, which is as blatantly obvious as the average car needing petrol & a road.
Why such obvious things need to be debated with you is beyond me.

Final8ty
12-20-2009, 05:52 AM
How could they? Multi-display support is a key feature of their Quadro line. This isn't some new tech, guys; it's nice that AMD added some outputs to their consumer cards and brought it down-market, but it's been around forever. The real innovation here is that it's doable on a single card.



Can you game on that? If not, then you have missed the point.
And all I care about is that it is here at the consumer level.

Blacky
12-20-2009, 06:28 AM
Which would be the best Eyefinity setup, 3x1 landscape or 3x1 portrait, with these (http://accessories.us.dell.com/sna/products/Displays/productdetail.aspx?c=us&l=en&cs=19&sku=320-8325)? They're the best price/performance monitors for Eyefinity IMO; the only problem is that since they're TN panels, portrait could be a problem due to the sucky viewing angles...

Xoulz
12-20-2009, 08:03 AM
That's because they are using them in a work environment, where each part of their work is on a separate screen. If they were working with a continuous image, that would not be the case. And the people who have bought Eyefinity setups have already accepted the consequences of the bezels; they knew what they were getting into and are willing to put up with that con. I am talking about mass adoption, or more adopters at the very least. Additionally, if a good monitor comes out with next to no bezel, I can tell you this: there are going to be a lot of Eyefinity owners with that monitor on their wish list.



I had kind of already figured out what EF meant, but I really had no idea what THGO was. It was not that popular a technology and I didn't even mention it; I was simply asking you to clarify what you said, and you responded with a comment bordering on a personal attack.

Heck, TH2Go was obscure for gaming because the cards were so slow. It was such a failure for gaming that there was no need to moan, because no one bragged about it like people do for Eyefinity. No one bragged like it was some savior or a huge selling point for gaming on Matrox cards.

Eyefinity as it is, is almost better suited to a business environment, and hence might be a good reason for a business to use an AMD card, so more power to AMD in this regard, as Matrox cards are pricey for what they are (but I hear their drivers and stability are top notch). But for gaming there are some obvious flaws.

I am saying cost can play a role in adoption rate, though. Look how successful Nvidia 3D Vision is (I initially thought you meant this by THGO, but the abbreviation was all wrong). It's a failure because of cost, performance issues, and drivers: the same reasons working against Eyefinity at this point.

I am talking about mass adoption in general; it seems like you don't care much about the quantity of adoption itself.

My reasons are not personal at all; I think my reasons for not buying an Eyefinity card apply to a huge demographic and the vast majority of gamers. Hence they are reasonable grounds for arguing why Eyefinity won't get mass adoption, or even a large percentage. If we don't care about the number of people who adopt an AMD card because of Eyefinity, we could argue PhysX is a success, because there are obviously people out there who bought an NV card because of PhysX. Yet you would consider PhysX dying and a failure, as would most AMD card owners, myself included at this point.



The cost of two additional 24" Dell monitors is about $378 (on sale)...

My bar tab last night was $270...


Point is:

Eyefinity makes use of multiple monitors, which many gamers have been asking for & using for ages. It has a clear advantage during gameplay and yields a lot of real estate. If you are a gamer, then you will almost assuredly be building an Eyefinity setup within a year or two... if not already. Bezels are a non-issue because more and more manufacturers will be making smaller-bezeled monitors. It will only get more popular as time goes on.


Whereas PhysX offers nothing to gameplay and isn't needed, at all..!! Ever! It's a second-rate, proprietary solution, aimed more at making money through marketing and back-door deals than at actually providing a unique experience.


Tajoh, your crusade is moot.

shiznit93
12-20-2009, 08:22 AM
The cost of two additional 24" Dell monitors is about $378 (on sale)...

My bar tab last night was $270...


Point is:

Eyefinity makes use of multiple monitors, which many gamers have been asking for & using for ages. It has a clear advantage during gameplay and yields a lot of real estate. If you are a gamer, then you will almost assuredly be building an Eyefinity setup within a year or two... if not already. Bezels are a non-issue because more and more manufacturers will be making smaller-bezeled monitors. It will only get more popular as time goes on.


Whereas PhysX offers nothing to gameplay and isn't needed, at all..!! Ever! It's a second-rate, proprietary solution, aimed more at making money through marketing and back-door deals than at actually providing a unique experience.


Tajoh, your crusade is moot.
Not all of us are happy with :banana::banana::banana::banana:ty Dell monitors. I just gave up my FW900 CRT a few weeks ago.

How can you presume to tell us what we will be doing in a year or two? I would rather have ONE 120Hz screen than three 60Hz screens using Eyefinity. I have a 5870 on the way and I will "almost assuredly" not be using Eyefinity.

I guess I could buy two more 2233RZ monitors for about $600-700, but I would need a much faster card than a 5870 for 100+ FPS at 5040x1050.

cegras
12-20-2009, 10:04 AM
In a three-monitor setup, or in any setup where your focus is on an unbroken area of the screen, the bezels will fade if you are not actively looking at them. This is because the eye is sensitive to change rather than to static stimuli (the effect is known as Troxler fading). Try drawing a dot in the middle of a largish circle and focusing on the dot: the circle will fade over time.

trinibwoy
12-20-2009, 10:12 AM
I know about the HUD stretch & I really don't care enough about it for it to even slightly put me off EF; it's not a big issue for me. It may be for you, but not for me.

Enjoy :)


Can you game on that? If not, then you have missed the point. And all I care about is that it is here at the consumer level.

That was a response to flopper asking why Nvidia doesn't downplay Eyefinity, nothing related to your interest in it.

kuhla
12-20-2009, 10:15 AM
Jen Hsun Huang: Don't be too proud of this technological terror you've constructed. The ability to render dx 11 is insignificant next to the power of the geForce.
Admiral Motti D: Don't try to frighten us with your sorcerous ways, Lord Huang. Your sad devotion to that ancient religion has not helped you conjure up dx11 cards, or given you clairvoyance enough to create a working fermi...
[Jen Hsun Huang makes a pinching motion and A M D starts choking]
Jen Hsun Huang: I find your lack of faith disturbing.

L O L. You win sir. You win.

Final8ty
12-20-2009, 10:20 AM
Enjoy :)



That was a response to flopper asking why Nvidia doesn't downplay Eyefinity, nothing related to your interest in it.

And you're still wrong, as his context is the same as mine, gaming, & you showed a non-gaming-capable multi-screen setup from NV.

I could drive 8 screens, non-gaming, on my 4x Sapphire 3870 XT "Toxic Edition".

yngndrw
12-20-2009, 10:24 AM
many gamers have been asking for & using for ages. It has a clear advantage during gameplay and yields a lot of real estate.
Have they? How many people do you know who actually played Supreme Commander on more than one screen? I believe that was the first game to support multiple monitors correctly, but I might have missed one.


Bezels are a non-issue because more and more manufacturers will be making smaller-bezeled monitors.
Even with three monitors, the fact is that you have two ugly lines right in your view. I don't see how that's a non-issue; no matter how small they are, they will still be there. Even if you were to butt two panels up against each other (which I don't believe you can, due to dynamic contrast ratio and backlight problems), you would still have a line from the two pieces of glass. The only way to make it seamless would be to use one screen.

trinibwoy
12-20-2009, 10:41 AM
And you're still wrong, as his context is the same as mine, gaming, & you showed a non-gaming-capable multi-screen setup from NV.

Sigh, where are you going with this? I take it you haven't even read the Eyefinity whitepaper. The first section of it talks about workstation productivity; gaming is mentioned afterwards. Nvidia can't downplay something that they offer themselves. Like I said, you're free to enjoy all of Eyefinity's benefits to your heart's content.

Final8ty
12-20-2009, 10:45 AM
Sigh, where are you going with this? I take it you haven't even read the Eyefinity whitepaper. The first section of it talks about workstation productivity; gaming is mentioned afterwards. Nvidia can't downplay something that they offer themselves.

That part of the whitepaper is irrelevant to the aspect of EF that we are interested in.
You are not talking to ATI, you are talking to us, & the gaming aspect is our main focus.
NV does not offer that aspect.

STEvil
12-20-2009, 10:47 AM
Have they? How many people do you know who actually played Supreme Commander on more than one screen? I believe that was the first game to support multiple monitors correctly, but I might have missed one.


Even with three monitors, the fact is that you have two ugly lines right in your view. I don't see how that's a non-issue; no matter how small they are, they will still be there. Even if you were to butt two panels up against each other (which I don't believe you can, due to dynamic contrast ratio and backlight problems), you would still have a line from the two pieces of glass. The only way to make it seamless would be to use one screen.

So have you tried multi-monitor gaming yet, or are you just complaining about something you've not experienced?

Bezels are not a huge problem. The other monitors are mostly for peripheral vision, except in some game types (RTS etc.), so you spend most of your time on the center screen, not looking where the bezels are, and you forget about them.

trinibwoy
12-20-2009, 10:55 AM
That part of the whitepaper is irrelevant to the aspect of EF that we are interested in.
You are not talking to ATI, you are talking to us, & the gaming aspect is our main focus.

Gotta admit, you've completely lost me here. I don't know who "we" is, but maybe you should re-read flopper's question. You seem to be saying that Nvidia should ignore the productivity benefits of Eyefinity and downplay the gaming aspect because they don't offer anything similar - that doesn't seem like a useful discussion to me. And it would be a pretty stupid marketing move, since the gaming and productivity aspects are based on the same technology. :shrug:

Final8ty
12-20-2009, 11:02 AM
Gotta admit, you've completely lost me here. I don't know who "we" is, but maybe you should re-read flopper's question. You seem to be saying that Nvidia should ignore the productivity benefits of Eyefinity and downplay the gaming aspect because they don't offer anything similar - that doesn't seem like a useful discussion to me. And it would be a pretty stupid marketing move, since the gaming and productivity aspects are based on the same technology. :shrug:


Eyefinity-style widescreen gaming has been around a long time using TH2Go systems, which weren't plug and play.
People doing such gaming often built their own systems with projectors and the like.
For flight and racing sims, three screens are a real benefit to immersion.

ATI's development team was behind creating the TH2Go systems; now it's supported directly in hardware, basically plug and play,
which allows many more people who have one or two screens to try widescreen gaming. Support for widescreen is going to increase thanks to ATI's initiative.

I use a mixed screen combo, one 1920x1200 with two 1920x1080s, meaning 5760x1080 in gaming. Even though the odd screen scales down, it's in my FoV, so it adds side vision.
Bezels aren't anything I notice, and we will get better support with screens that have little to no bezels in the coming year; ATI chose to set this in motion.

Now, even if someone isn't gaming on three screens, people who work will have a chance to add more screens, even with the low-end cards.

Now, as DX11 is adding features that allow a game developer to increase frame rates, or keep them the same, or even improve them, as BattleForge has done, it seems neither DX11 nor Eyefinity is insignificant. Note that Nvidia isn't downplaying Eyefinity...

It's a free feature that will gain support, as monitors today can be found at cheap prices.
Three screens today cost as much as one did three years ago.

BTW, Dragon Age rocks using Eyefinity, and it's a new game..


And besides, I'd like to get BF:BC2 for Eyefinity, as it seems to be a game to get.

The bold part is the productivity part of the post, which is clearly not the point of the post when 95% of it is about gaming.

But you took the part in bold as a reason why NV is not downplaying EF, when there is no evidence for that and he was clearly talking about NV not offering the gaming side.

I'm not saying NV should ignore the productivity benefits of Eyefinity, but that's a reason you're giving, not a reason NV has put forward in regards to EF.

Why isn't NV downplaying the gaming side of EF, when that has been thrust in people's faces day in, day out?

trinibwoy
12-20-2009, 11:09 AM
The spanning mode of Eyefinity is not a special gaming mode - you can run any application like that. So to downplay running games on multiple monitors you would have to downplay the underlying technology itself, which would be very stupid since it's something professionals have been depending on for years. As pointed out in that quote, the big innovation here is bringing the capability to do it on 3-6 monitors using a single card. Maybe I don't know what you mean by the "gaming side" of Eyefinity, but technology-wise it's just a spanning mode that any application can use, not just games.
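
For anyone picturing how that works: the driver presents the group as one big surface, so any program that asks for the desktop resolution sees a single large mode. A rough conceptual sketch in Python (not real driver or AMD code; the names are made up):

# conceptual sketch of display spanning - not real driver code
from dataclasses import dataclass

@dataclass
class Monitor:
    width: int
    height: int

def span_landscape(monitors):
    # a 3x1 landscape group: widths add up, height is the smallest
    # panel's height (a taller mixed-in panel scales/crops to fit)
    return (sum(m.width for m in monitors),
            min(m.height for m in monitors))

group = [Monitor(1920, 1080), Monitor(1920, 1200), Monitor(1920, 1080)]
print(span_landscape(group))  # (5760, 1080) - what any game or app sees

That's why flopper's mixed 1200/1080 combo earlier comes out as 5760x1080: the game itself has no idea there are three panels behind the one resolution.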

Final8ty
12-20-2009, 11:18 AM
The spanning mode of Eyefinity is not a special gaming mode - you can run any application like that. So to downplay running games on multiple monitors you would have to downplay the underlying technology itself, which would be very stupid since it's something professionals have been depending on for years. As pointed out in that quote, the big innovation here is bringing the capability to do it on 3-6 monitors using a single card.

Who said anything about spanning besides you?

The big innovation here is bringing the gaming capability to 3-6 monitors using a single card.
But you're dodging that part at every opportunity, even though 99% of the time when people talk about EF & use an EF setup on forums it's about the gaming side of it.

The only reason I said anything about productivity is that many people use more than one monitor at home, & the same issues of space & cost are still there no matter what you're using the monitors for.

80+ pages, & adding a 3rd monitor (or 3 smaller ones) is not a problem for most if they wanted to EF game.
http://forums.overclockers.co.uk/showthread.php?t=17405978&page=89

trinibwoy
12-20-2009, 11:22 AM
You do know what spanning is, right? In any case, it seems to me that you're not clear on how Eyefinity actually works. If you believe there's a special gaming setting or something that Nvidia could target separately, then far be it from me to convince you otherwise :) Carry on.

STaRGaZeR
12-20-2009, 11:32 AM
Bezels are not a huge problem.

They really are...

Final8ty
12-20-2009, 11:39 AM
You do know what spanning is, right? In any case, it seems to me that you're not clear on how Eyefinity actually works. If you believe there's a special gaming setting or something that Nvidia could target separately, then far be it from me to convince you otherwise :) Carry on.

I have had 3 monitors since my very first PC; I know what spanning is, thank you, with all screens seen as just one screen.
I didn't say anything about what NV could or could not do; I'm talking about what ATI is doing & what NV is not when it comes to multi-screen gaming at the consumer level.
You have a habit of claiming people said this & that when they said nothing of the sort, I mean not even close.

SocketMan
12-20-2009, 12:58 PM
They really are...

In certain cases bezels actually add to realism; look at the :banana::banana::banana::banana:pit windows - now that is thick.
Also, has anyone tried using 3 or more projectors?

Final8ty
12-20-2009, 01:18 PM
In certain cases bezels actually add to realism; look at the :banana::banana::banana::banana:pit windows - now that is thick.
Also, has anyone tried using 3 or more projectors?

Check this one out. :eek:
http://www.widescreengamingforum.com/forum/viewtopic.php?t=16794&postdays=0&postorder=asc&start=60

SocketMan
12-20-2009, 01:45 PM
Check this one out. :eek:
http://www.widescreengamingforum.com/forum/viewtopic.php?t=16794&postdays=0&postorder=asc&start=60

That looks awesome!
Many thanks :up:

Xoulz
12-20-2009, 05:52 PM
Have they? How many people do you know who actually played Supreme Commander on more than one screen? I believe that was the first game to support multiple monitors correctly, but I might have missed one.


Even with three monitors, the fact is that you have two ugly lines right in your view. I don't see how that's a non-issue; no matter how small they are, they will still be there. Even if you were to butt two panels up against each other (which I don't believe you can, due to dynamic contrast ratio and backlight problems), you would still have a line from the two pieces of glass. The only way to make it seamless would be to use one screen.



I'm sorry, doesn't the monitor you have right now... have a bezel? Do you sit there and stare @ it all day long and complain?

So where is the logic behind your argument? As noted, the additional side monitors are for your periphery; as you target something, you are only using your traditional monitor. I fail to recognize your point. Multi-monitors do provide an advantage... therefore ATI Radeons provide an advantage..



If the bezels bother you.. then wait for one-piece multi-displays.. ;)

STEvil
12-20-2009, 08:52 PM
This just in: discussion of bezels and productivity is old news. Next subject!

ajaidev
12-21-2009, 12:11 AM
Yep, BTW I remember AMD said special monitors were on their way, ones without the damn black borders...

I was thinking about getting 3 small 21"-22" monitors for when and if I get the 5950; ViewSonic has some great value-for-money products.

Final8ty
12-21-2009, 03:00 AM
That looks awesome!
Many thanks :up:

http://www.youtube.com/watch?v=csOISrFN0O0

STaRGaZeR
12-21-2009, 05:18 AM
In certain cases bezels actually add to realism; look at the :banana::banana::banana::banana:pit windows - now that is thick.
Also, has anyone tried using 3 or more projectors?

WTF! Come on, lame-ass justification :rolleyes:

MrMojoZ
12-21-2009, 05:47 AM
WTF! Come on, lame-ass justification :rolleyes:

Wow, not exactly a classy response there, fella. He was absolutely right about sims that would treat each monitor as a separate window.

safan80
12-21-2009, 07:49 PM
Who needs 3 monitors if you have one nice big one? ;) I'm not saying I don't want three monitors, but I have two 30" monitors... hmm. This would make racing games fun again.. Yeah, OK, I want EF now :lol:

yngndrw
12-21-2009, 08:16 PM
I'm sorry, doesn't the monitor you have right now... have a bezel?
Not one splitting different parts of the viewport, it doesn't.

For simulators, I can see the point made there, but for normal usage and gaming I can't.

Why bother with a special multi-screen borderless monitor instead of a larger monitor?

STEvil
12-21-2009, 09:45 PM
Not one splitting different parts of the viewport, it doesn't.

For simulators, I can see the point made there, but for normal usage and gaming I can't.

Why bother with a special multi-screen borderless monitor instead of a larger monitor?

Because a larger (wider) monitor inflates the cost astronomically. Have you seen what the borderless 5760x900 displays cost? I think they were 5760 wide, but I could be wrong..

If they were priced in line with normal monitors it would be a problem, but as it stands today I can buy three 40"+ 1920x1080 true-120Hz displays for less than one of those borderless displays.

SocketMan
12-21-2009, 10:17 PM
http://www.youtube.com/watch?v=csOISrFN0O0

I would have punched the first guy that walked between me
and ma-finity; the next 5 would've planned a better route. :D


WTF! Come on, lame-ass justification :rolleyes:

My bad, I should have used a minivan as an example.

When a person drives a minivan, a car or a bus
(or flies a jet/helo),
they don't give a flying fud about the pillars between the
side windows and the front windshield (glass).
It's the exact same idea with the monitor bezels; as a few people (who have most likely used multi-monitor setups) have pointed out already, you don't look at the bezels, you look forward (or you get into an accident).

The two side monitors (the side windows in a car/truck, jet etc..) are there for the peripheral vision/effect.
The focus is (almost) always on the centre monitor.
However, trying to explain this can be painful; it's like explaining what sex feels like - better to just try it (at least once) before complaining about it.



Not one splitting different parts of the viewport, it doesn't.

For simulators, I can see the point made there, but for normal usage and gaming I can't.

Why bother with a special multi-screen borderless monitor instead of a larger monitor?

The two side monitors are there for the peripheral effect.
The focus is (almost) always on the centre monitor.
No matter how big a single monitor is, it won't give you the
peripheral vision/effect. I haven't tried the 3D glasses; those
may be even better, I just don't know.


I am not saying EF is the way to go for all games and all gamers, but sims, driving, MMORPGs and FPS games aren't hurting from it. Again, EF is there with the card (even mid-range cards like the 5770), so use it or not, it's good to know it's available.

tajoh111
12-21-2009, 11:56 PM
I would have punched the first guy that walked between me
and ma-finity; the next 5 would've planned a better route. :D



My bad, I should have used a minivan as an example.

When a person drives a minivan, a car or a bus
(or flies a jet/helo),
they don't give a flying fud about the pillars between the
side windows and the front windshield (glass).
It's the exact same idea with the monitor bezels; as a few people (who have most likely used multi-monitor setups) have pointed out already, you don't look at the bezels, you look forward (or you get into an accident).

The two side monitors (the side windows in a car/truck, jet etc..) are there for the peripheral vision/effect.
The focus is (almost) always on the centre monitor.
However, trying to explain this can be painful; it's like explaining what sex feels like - better to just try it (at least once) before complaining about it.




The two side monitors are there for the peripheral effect.
The focus is (almost) always on the centre monitor.
No matter how big a single monitor is, it won't give you the
peripheral vision/effect. I haven't tried the 3D glasses; those
may be even better, I just don't know.


I am not saying EF is the way to go for all games and all gamers, but sims, driving, MMORPGs and FPS games aren't hurting from it. Again, EF is there with the card (even mid-range cards like the 5770), so use it or not, it's good to know it's available.

I simply don't get why people are ignoring the drop in performance from rendering the extra pixels, on top of sacrificing image quality on the main screen. It's not like, boom, extra resolution at no performance penalty, which is what a lot of Eyefinity fans seem to be implying. People are used to above 60 FPS in games, and they will have to take an image quality hit to get that in a lot of games.

At this point, for Eyefinity to work at 1920x1200 (I use this resolution because most people on this board have a 24" monitor) across three screens, we need cards at least twice as fast as they are at the moment, in addition to having 2GB of video memory for decent amounts of AA. We also need AMD to get CrossFire Eyefinity working ASAP, because most people who have an Eyefinity setup can afford the extra cards to get playable framerates.
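
Rough pixel math, in case the scale of the hit isn't obvious (a back-of-the-envelope Python sketch; the 3x figure is raw pixel count, and real games won't scale exactly linearly with it):

# back-of-the-envelope math for 3x1 Eyefinity at 1920x1200 per screen
single = 1920 * 1200           # 2,304,000 pixels
spanned = (3 * 1920) * 1200    # 5760x1200 = 6,912,000 pixels
print(spanned / single)        # 3.0x the pixels to render every frame

# if a card does 60 FPS on one screen and the game is roughly
# fill-rate bound, three screens lands you somewhere near:
print(60 / (spanned / single))  # 20.0 FPS

That threefold jump in pixels is the gap I mean when I say the cards need to get much faster before this is comfortable.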

I have to admit I like Eyefinity on projectors; it looks really nice and just emphasizes why bezels suck across a landscape image. If bezels were a non-issue, Samsung would not be coming out with monitors with small bezels designed specifically for Eyefinity. Nor would companies even attempt to design an extra-wide monitor and sell it for 6,000 dollars.

JohnZS
12-22-2009, 12:06 AM
tajoh111
Are you saying that Eyefinity does not work with CrossFire?!? If so, that is an EPIC fail on ATI's part... very disappointing indeed, woeful even. IMHO Eyefinity is really only usable with CrossFire setups (at such high resolutions and in new games). For it not to come with CrossFire support out of the box is just very sloppy... almost rushed, even.
On a more serious note, it would not surprise me if Nvidia brought out a very similar technology; Quadro cards have had this for a while now.
Back on topic: I have read that even more developers are jumping onto the DirectX 11 platform (heck, even EA are doing Battlefield on DirectX 11), so this is all SIGNIFICANT and a good thing :D
John

tajoh111
12-22-2009, 12:18 AM
tajoh111
Are you saying that Eyefinity does not work with CrossFire?!? If so, that is an EPIC fail on ATI's part... very disappointing indeed, woeful even. IMHO Eyefinity is really only usable with CrossFire setups (at such high resolutions and in new games). For it not to come with CrossFire support out of the box is just very sloppy... almost rushed, even.
On a more serious note, it would not surprise me if Nvidia brought out a very similar technology; Quadro cards have had this for a while now.
Back on topic: I have read that even more developers are jumping onto the DirectX 11 platform (heck, even EA are doing Battlefield on DirectX 11), so this is all SIGNIFICANT and a good thing :D
John

Not at the moment; the drivers don't support it.

The only thing that supports CrossFire of any sort with Eyefinity is the 5970, and that is only internal CF. Adding another 5970 will not work for Eyefinity. It's also bad that these cards are 700 dollars at the moment.

madcho
12-22-2009, 12:49 AM
The 9.12 hotfix supports it.

Final8ty
12-22-2009, 04:05 AM
Who needs 3 monitors if you have one nice big one? ;) I'm not saying I don't want three monitors, but I have two 30" monitors... hmm. This would make racing games fun again.. Yeah, OK, I want EF now :lol:

I was chatting to a friend last night who was talking about going 3x30".
I personally would be happy with 1600x900 portrait / 2560x1600 landscape / 1600x900 portrait, but if that is not possible then I may go 3x30" myself, depending on tri/quad GPU performance with them.

Manicdan
12-22-2009, 07:17 AM
I was chatting to a friend last night who was talking about going 3x30".
I personally would be happy with 1600x900 portrait / 2560x1600 landscape / 1600x900 portrait, but if that is not possible then I may go 3x30" myself, depending on tri/quad GPU performance with them.

I wanted to do that same exact setup, one big widescreen in the center and two portrait screens on the sides, but that's not possible right now. Hopefully it will be eventually.

WaterFlex
12-22-2009, 07:29 AM
Nvidia has no DX11 products. STFU then, Nvidia :)

LedHed
12-22-2009, 10:05 AM
Yes, because of all the DX11 games (not DX10 games with DX11 additions)...

NVIDIA is right in saying it doesn't matter at all right now who has sold what; DX11 isn't even out the door yet in terms of usage. By the time full DX11 games arrive (meaning built from the ground up with DX11, instead of taking a DX10 game and adding a few DX11 features like today's "DX11 games"), NVIDIA will have their card on the market, and everyone can then make up their minds about who is the better manufacturer.

I don't think NVIDIA is in a rush to launch the GT300 when the 295 holds its own against the 5870; they have more time to improve Fermi. And we all know rushing a product can be disastrous, and NVIDIA is trying to avoid this with extensive R&D. In the long run that is the smarter business move, rather than just rushing something out to say you are first. Think back to the first DX9 cards; they couldn't even handle real DX9 games.

Nedjo
12-22-2009, 10:22 AM
Yes, because of all the DX11 games (not DX10 games with DX11 additions)...

NVIDIA is right in saying it doesn't matter at all right now who has sold what; DX11 isn't even out the door yet in terms of usage. By the time full DX11 games arrive (meaning built from the ground up with DX11, instead of taking a DX10 game and adding a few DX11 features like today's "DX11 games"), NVIDIA will have their card on the market, and everyone can then make up their minds about who is the better manufacturer.
With all due respect LedHed, you obviously don't have a clue about game engine development! ALL serious game engines support different rendering paths. That's a necessity because of their multiplatform nature! For one engine to work on the XB360 it must support DX9, and on the PS3 it must support OGL; that support basically covers the PC completely, but future-proof engines are coming out with DX11 support. That's the story with the Ego engine from Codemasters, with the Frostbite 2 engine from DICE, with the Unigine engine, and it will happen with CryEngine 2!

So all the DX11 titles that are out now - BattleForge, Dirt 2, STALKER: CoP (out in Russia, in February in the USA) - and those that will come out in the near (Bad Company 2 in March) or distant future (Crysis 2) ARE FULL DX11 TITLES!



I don't think NVIDIA is in a rush to launch the GT300 when the 295 holds its own against the 5870; they have more time to improve Fermi. And we all know rushing a product can be disastrous, and NVIDIA is trying to avoid this with extensive R&D. In the long run that is the smarter business move, rather than just rushing something out to say you are first. Think back to the first DX9 cards; they couldn't even handle real DX9 games.

You can't be serious with the GTX295? Does anyone even sell that behemoth? I guess NV is in ecstasy over the economics of a mega-complex card with two mega-sized GPUs that they are selling on the back of the equally "profitable" GTX275! :rolleyes:

You can't be serious claiming that the GTX295 is hurting 5870 sales and buying time for NV?

The first DX9 cards... you mean the R300-based Radeon 9700 Pro? Of course you do! :rolleyes: Well, they were selling like hotcakes, unlike the FX5800, which really couldn't run Half-Life 2 in the DX9 rendering path! ;)

LedHed
12-22-2009, 10:26 AM
With all due respect LedHed, you obviously don't have a clue about game engine development! ALL serious game engines support different rendering paths. That's a necessity because of their multiplatform nature! For one engine to work on the XB360 it must support DX9, and on the PS3 it must support OGL; that support basically covers the PC completely, but future-proof engines are coming out with DX11 support. That's the story with the Ego engine from Codemasters, with the Frostbite 2 engine from DICE, with the Unigine engine, and it will happen with CryEngine 2!

So all the DX11 titles that are out now - BattleForge, Dirt 2, STALKER: CoP (out in Russia, in February in the USA) - and those that will come out in the near (Bad Company 2 in March) or distant future (Crysis 2) ARE FULL DX11 TITLES!



You can't be serious with the GTX295? Does anyone even sell that behemoth? I guess NV is in ecstasy over the economics of a mega-complex card with two mega-sized GPUs that they are selling on the back of the equally "profitable" GTX275! :rolleyes:

You can't be serious claiming that the GTX295 is hurting 5870 sales and buying time for NV?

The first DX9 cards... you mean the R300-based Radeon 9700 Pro? Of course you do! :rolleyes: Well, they were selling like hotcakes, unlike the FX5800, which really couldn't run Half-Life 2 in the DX9 rendering path! ;)

I stopped reading when you called Dirt 2 a full DX11 title. If it were a full DX11 title it would not run on the PS3, nor could it have come out when it did and be a DX11 build. All they did was add a few features from DX11 to a DX10/OpenGL game. Not to mention Dirt 2 looks no better than GRID.

Also, yes, the 295 is still selling; it's not that hard to do a NewEgg search (though it sells out quickly). This is the reason NVIDIA stopped production of the whole 200 series except the 295.

There are still plenty of people who only buy NVIDIA (for the drivers or whatever reason), and the 295 is still the fastest card behind only the 5970, which is extremely hard to find.

MrMojoZ
12-22-2009, 10:31 AM
I stopped reading when you called Dirt 2 a full DX11 title. If it were a full DX11 title it would not run on the PS3, nor could it have come out when it did and be a DX11 build. All they did was add a few features from DX11 to a DX10/OpenGL game. Not to mention Dirt 2 looks no better than GRID.

Wow, you don't understand rendering paths at all. :(

LedHed
12-22-2009, 10:34 AM
Do you understand what a ground-up DX11 build is? It doesn't seem like it.

I'm not even talking about console porting and everything that comes along with that; I'm talking about PC-only games that are full DX11 titles. For example, Crysis (a full DX10 build) vs. BioShock (DX9 + DX10 features).

Nedjo
12-22-2009, 10:38 AM
I stopped reading when you called Dirt 2 a full DX11 title. If it were a full DX11 title it would not run on the PS3, nor could it have come out when it did and be a DX11 build. All they did was add a few features from DX11 to a DX10/OpenGL game.
Well, that's the spirit! But you should have stopped reading when I called you clueless on this matter! ;) Instead of commenting on the rest of your "contemplating", I suggest you go back and read my post completely ;)


Also, yes, the 295 is still selling; it's not that hard to do a NewEgg search (though it sells out quickly). This is the reason NVIDIA stopped production of the whole 200 series except the 295.

I see, so besides being a game engine expert, you're now a sales specialist! :rolleyes:

No doubt those FIVE 295 models from THREE manufacturers at 500-700 USD are killing the sales of twice as many 400 USD 5870s from EIGHT manufacturers!
:rolleyes:



Do you understand what a ground-up DX11 build is? It doesn't seem like it.

I'm not even talking about console porting and everything that comes along with that; I'm talking about PC-only games that are full DX11 titles. For example, Crysis (a full DX10 build) vs. BioShock (DX9 + DX10 features).

And what makes Crysis a "full" DX10 build, and not just a second rendering path?

LedHed
12-22-2009, 10:40 AM
No one said NVIDIA was killing sales; I simply said the 295s are still for sale and they are faster than the 5870 in 80% of titles.

You can put all the words in my mouth you want; it doesn't mean anything.

Helloworld_98
12-22-2009, 10:46 AM
No one said NVIDIA was killing sales; I simply said the 295s are still for sale and they are faster than the 5870 in 80% of titles.

You can put all the words in my mouth you want; it doesn't mean anything.

I agree with what you say, but any gamer worth their salt who considers the GTX 295 will also consider dual 5850s, which are in a league of their own compared to the GTX 295 and 5870.

LedHed
12-22-2009, 10:50 AM
It's also a different price league, with most 5850s averaging $320 (so $640 for two) while the 295 is only $500 or less with rebates.

MrMojoZ
12-22-2009, 12:30 PM
Do you understand what a ground-up DX11 build is? It doesn't seem like it.

I'm not even talking about console porting and everything that comes along with that; I'm talking about PC-only games that are full DX11 titles. For example, Crysis (a full DX10 build) vs. BioShock (DX9 + DX10 features).

So explain the difference between your "ground-up" DX11 title and a game that has a DX11 render path. That should clear things up.

MrMojoZ
12-22-2009, 12:36 PM
It's also a different price league, with most 5850s averaging $320 (so $640 for two) while the 295 is only $500 or less with rebates.

So you'll compare a 5870 at $400ish with a 295 at $500, but we mustn't compare 2x 5850 against the 295 because of price differences? Are you sure your comments aren't strictly based on brand loyalty? :shrug:



edit: Never mind, I see some of your other posts now too. I'll just ignore and move on. Thanks.

purecain
12-22-2009, 01:45 PM
@LedHed - WOW, what's going on with you? Do you not read the same info as everyone else...

No one wants a 295 over a 5870... you know this deep down, right???

yngndrw
12-22-2009, 03:21 PM
With all due respect LedHed, you obviously don't have a clue about game engine development! ALL serious game engines support different rendering paths. That's a necessity because of their multiplatform nature! For one engine to work on the XB360 it must support DX9, and on the PS3 it must support OGL; that support basically covers the PC completely, but future-proof engines are coming out with DX11 support. That's the story with the Ego engine from Codemasters, with the Frostbite 2 engine from DICE, with the Unigine engine, and it will happen with CryEngine 2!
You should probably think about what you're about to say before claiming that LedHed doesn't know what he's on about.

The transition between DX9 and DX10 is pretty huge in terms of API usage. In order to have an engine in which you can write both DX9 and DX10/11 rendering paths, some compromises must be made. These limit the features and performance available to the DX10/11 renderers. In other words, having DX9 support limits a DX10/11 engine by quite a bit. You end up having to write the best part of the engine twice: once for DX9 and lower, and once for DX10 and higher.
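
To make that concrete, the usual structure is an abstract renderer interface with one backend per API, and shared engine code can only lean on what every backend can express. A rough Python sketch of that shape (purely illustrative; the class and method names are invented, not from any real engine):

# illustrative multi-path renderer structure - names are invented
from abc import ABC, abstractmethod

class Renderer(ABC):
    # the engine talks only to this interface, so shared features
    # are capped at what the weakest backend can express
    @abstractmethod
    def draw_terrain(self, patch): ...

class DX9Renderer(Renderer):
    def draw_terrain(self, patch):
        # no GPU tessellation in DX9: ship a dense prebuilt mesh
        print(f"DX9 path: drawing {patch} from a precomputed mesh")

class DX11Renderer(Renderer):
    def draw_terrain(self, patch):
        # DX11 hull/domain shaders can tessellate on the GPU
        print(f"DX11 path: tessellating {patch} on the GPU")

def render_frame(renderer: Renderer):
    renderer.draw_terrain("hillside")  # same engine code, either path

render_frame(DX9Renderer())
render_frame(DX11Renderer())

Everything above the Renderer interface is written once; everything below it is written per API, which is exactly the "write the best part of the engine twice" problem.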

Nedjo
12-22-2009, 04:52 PM
You should probably think about what you're about to say before claiming that LedHed doesn't know what he's on about.

The transition between DX9 and DX10 is pretty huge in terms of API usage. In order to have an engine in which you can write both DX9 and DX10/11 rendering paths, some compromises must be made. These limit the features and performance available to the DX10/11 renderers. In other words, having DX9 support limits a DX10/11 engine by quite a bit. You end up having to write the best part of the engine twice: once for DX9 and lower, and once for DX10 and higher.

I'm not arguing against the fact that dedicating more resources to an exclusively DX11 path would give results in the form of more performance or better visuals. What I'm arguing against is the FUD that a multi-path engine incorporating a DX11 path isn't a "real" DX11 engine/game! Simple as that.

yngndrw
12-22-2009, 06:37 PM
I'm not arguing against the fact that dedicating more resources to an exclusively DX11 path would give results in the form of more performance or better visuals. What I'm arguing against is the FUD that a multi-path engine incorporating a DX11 path isn't a "real" DX11 engine/game! Simple as that.
DX10+ has a different buffer system than DX9 and below, which means that a much larger part of the engine must be rewritten. It becomes far more than a "rendering path", and as such the capabilities are linked.
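
The constant-handling change alone illustrates it: DX9 sets individual shader constants register by register, while DX10+ groups them into constant buffers that are bound and updated as whole blocks, so the engine's data layout has to change with it. A conceptual Python model of the two styles (toy classes, not real Direct3D calls, just the shape of the difference):

# conceptual model of DX9-style vs DX10-style constant updates
# (toy classes, not real Direct3D APIs)

class DX9StyleDevice:
    def __init__(self):
        self.vs_registers = {}

    def set_vs_constant(self, register, float4):
        # DX9 style: poke one float4 register at a time
        self.vs_registers[register] = float4

class DX10StyleDevice:
    def __init__(self):
        self.cbuffers = {}

    def update_cbuffer(self, slot, block):
        # DX10+ style: replace an entire constant buffer as a block,
        # so data must be organized around fixed buffer layouts
        self.cbuffers[slot] = dict(block)

dx9 = DX9StyleDevice()
dx9.set_vs_constant(0, (1.0, 0.0, 0.0, 0.0))

dx10 = DX10StyleDevice()
dx10.update_cbuffer(0, {"world_matrix": "...", "time": 0.016})

Every spot in the engine that touched constants the DX9 way has to be reworked for the buffer model, which is why it's much more than bolting on another rendering path.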

LedHed
12-22-2009, 08:01 PM
So you'll compare a 5870 at $400ish with a 295 at $500, but we mustn't compare 2x 5850 against the 295 because of price differences? Are you sure your comments aren't strictly based on brand loyalty? :shrug:

I didn't compare the 5870 and the 295; every single review site did. Also, the 295 can be had for around $450 after rebate, which is the same price as some 5870s, so the price difference is minuscule at best.

I never said the 5870 wasn't the better overall card, but when it comes to performance the 295 wins more than it loses.

Jowy Atreides
12-22-2009, 08:15 PM
I didn't compare the 5870 and the 295; every single review site did. Also, the 295 can be had for around $450 after rebate, which is the same price as some 5870s, so the price difference is minuscule at best.

I never said the 5870 wasn't the better overall card, but when it comes to performance the 295 wins more than it loses.

So, you're comparing the most expensive release of one card with the cheapest of a second?

Can't let you do that, Dave.

Compare both relatively.

The cheapest after-rebate 295 should be compared to a cheap, rebated 5870.

And if you disagree, I'll just pull out the cheapest rebated 5970 and compare it to the most expensive GTX 295.
That's equally as fair as anything you have said.

LedHed
12-22-2009, 08:27 PM
Plenty of people compare the 5970 to the 295; feel free.

DeathReborn
12-22-2009, 11:07 PM
Technically speaking, the lead won't be significant until H2 2010 at the earliest, but by then Fermi might be out the door. There's still not enough DX11 software available; engines alone do not a game make.


No one wants a 295 over a 5870....

I can guarantee you that there are people who want a 295 and not a 5870; not everyone wants the exact same thing.

purecain
12-23-2009, 03:21 AM
I accept that, but I did say it was my opinion... as I have read all the facts and researched the technical advances the 5870 holds...

Is it not true that we like to play games with all the bells and whistles on maximum? Because with a DX10-only card that is no longer possible...

Who wants DX10 when we have DX11 and tessellation... ^^

Solus Corvus
12-23-2009, 10:49 AM
Most of us look at products from both manufacturers and decide which suits our needs best.

For the longest time I never imagined I would buy an ATI card again (besides an AIW). They had poor OpenGL performance and bad drivers. But then the 4xxx series happened. ATI drivers still aren't that great, but I've had some bad experiences with Nvidia drivers lately as well. As for SLI, it's just like CF: in some games it works and in some it doesn't.

LedHed
12-23-2009, 10:52 AM
I owned a 4850 CrossFire setup and all I got was crashing and blue screens when trying to use both cards at stock; ATI even acknowledged the problem by releasing a hotfix (but the issue continued). Only when running one card was I able to game without crashing. This just left a really bad taste in my mouth, and it will take a lot for me to return to ATI for a 4th time.

Solus Corvus
12-23-2009, 10:55 AM
Sounds like a bad experience. But you know you can get dud cards from either manufacturer, right?

I had a 4870X2 and CF worked fine in the majority of games. If it was crashing constantly I would have sent it back for RMA, not blamed it on CF.

Macadamia
12-23-2009, 11:04 AM
Sounds like a bad experience. But you know you can get dud cards from either manufacturer, right?

I had a 4870X2 and CF worked fine in the majority of games. If it was crashing constantly I would have sent it back for RMA, not blamed it on CF.

Certainly bad enough for him to RMA his problematic GTX295 a couple of times and not try to get the 4850s working.
I don't really buy such a story at all. Of course the posts justify the, erm... bias.

LedHed
12-23-2009, 11:08 AM
I couldn't RMA the 4850s, but thank you for following my posts so closely.

If you followed all my posts you would see that I will be getting the GT300 if it beats the 295.

Solus Corvus
12-23-2009, 11:20 AM
4850? Oops, I read that as 4870. Yeah, that was a bad situation. But people aren't having the same problems with 5850s.

My point is that companies can change and learn from mistakes. They can make new mistakes too. ATI used to have horrible OpenGL performance, but they have improved that significantly. I used to hold Nvidia drivers in high regard, but not after my recent experiences. Not buying a company's product because of a recent experience is one thing, but never looking at a company's products again is shortsighted, IMO, and ignores the human capacity for learning and change.

LedHed
12-23-2009, 11:23 AM
I don't have the money to risk on products with unacceptable driver quality. I have had zero problems with NVIDIA drivers and my 295, except when I had bad 295s (one had a bad VRM and the other a bad core 0).

While it's great to try new things/companies, many of us buy what we know works because we can't afford to try everything.

Solus Corvus
12-23-2009, 11:34 AM
It's your choice, of course. But you can always wait a little while and see what people are saying about the drivers/product. What if you get Fermi on launch day and, because it's a new arch, the drivers suck or it has some other issues? For me, I would have been just as annoyed about having to spend money to ship back two broken cards as I would be about crappy drivers.

And I'm not cutting AMD slack for having crappy drivers. I think they have a lot to work on, and some problems are quite frustrating. But I have had some horrible experiences with NV drivers lately, so the driver comparison doesn't work for me personally. I'm testing a friend's 5970 and I have had very few driver problems, with a few standouts. But he's going to want his second 5970 back when he transplants to a larger case, and I'm dreading going back to my 8800GTS 512, because the driver situation won't be any better and it's horribly slow too, lol.

LedHed
12-23-2009, 11:38 AM
I didn't have to pay anything to get the 295s replaced; eVGA even did a cross-shipment, meaning I still had the broken 295 when I received the new one. This is another reason why I prefer NVIDIA: eVGA is by far the best company in terms of service/warranty/applications/etc., and they are based in the US. I'm waiting for ATI to pay eVGA enough to produce both, similar to what XFX did; then I may look at ATI cards more seriously. However, eVGA is die-hard NVIDIA, so that may be a hard feat.

Solus Corvus
12-23-2009, 11:50 AM
Having to ship two cards is still a hassle you shouldn't have to deal with, IMO. I agree that ATI needs better partners, though.