so much about reality in games
Adobe is working on Flash Player support for 64-bit platforms as part of our ongoing commitment to the cross-platform compatibility of Flash Player. We expect to provide native support for 64-bit platforms in an upcoming release of Flash Player following the release of Flash Player 10.1.
lol... I have nothing to say other than that.
This post above was delayed 90 times by Nvidia. Cause that's their thing, thats what they do.
This Announcement of the delayed post above has been brought to you by Nvidia Inc.
RIGGY
Case: Antec 1200
MB: XFX nForce 750i SLI 72D9
CPU: E8400 (1651/4x9) 3712.48
MEM: 4GB G.Skill DDR2-1000 (5-5-5-15)
GPU: NVIDIA GTX 260 EVGA SSC (x2 in SLI), both 652/1403
PS: Corsair 650TX
OS: Windows 7 64-bit Ultimate
--Cooling--
5x 120mm, 1x 200mm
Zalman 9700 LED
Displays: Samsung LN32B650 / Samsung 2243BWX / Samsung P2350
It's funny 'cause it's true! :P
Actually, not correct; I'm loling because the top comparisons are incorrect.
That's actually removing HDR from a photograph, not adding it.
...and none of those photos are rendered; they are all photographs. It's not comparing anything.
This is pointless and I don't understand the point of it.
Last edited by Decami; 12-08-2009 at 06:10 PM.
Actually that seems an accurate example of how many games show HDR, but I agree it's not a good example of how HDR is added to photos. I disagree that the top ones show HDR being removed, though; that's a misnomer. You can retouch photos to remove glare and hence add contrast, but HDR is a post-effect treatment, so no stock photo has it.
I think the point of all this, though, dare we stray from the topic, is that HDR in games is rarely done properly. So often it's just too glowy and bright instead of subtle and natural. Even IW, who have done marvelous things rendering realistic graphics at high FPS in their last two titles, seem to have fubared the HDR somewhat in MW2.
MW2's controversial airport level is just annoyingly glowy and over the top, with the "god rays" blinding you through the skylights. I also find the last level, in the Zodiac, over the top with the bright orange reflections of the sun off the water. Sadly, whenever a title is made primarily for console with the PC version an afterthought, we often get such poorly done effects.
Cute avatar of the white kitty btw. It looks to be focusing so intently. I think I'll call that pic "Cat on Wire". LOL
Last edited by Frag Maniac; 12-09-2009 at 04:43 PM.
i think games just overdo it tbh.
7820X | Asrock X299 Taichi XE | Gigabyte 1080 Ti Xtreme | 32GB Memoriez | Corsair HXi1000 | 500GB 960 Evo
What I find funny is that there are only two different pictures out of the four, unless my eyes fail me.
Core i7 920 D0 B-batch (4.1) (Kinda Stable?) | DFI X58 T3eH8 (Fed up with its issues, may get a new board soon) | Patriot 1600 (9-9-9-24) (for now) | XFX HD 4890 (971/1065) (for now) |
80GB X25-m G2 | WD 640GB | PCP&C 750 | Dell 2408 LCD | NEC 1970GX LCD | Win7 Pro | CoolerMaster ATCS 840 {Modded to reverse-ATX, WC'ing internal}
CPU Loop: MCP655 > HK 3.0 LT > ST 320 (3x Scythe G's) > ST Res >Pump
GPU Loop: MCP655 > MCW-60 > PA160 (1x YL D12SH) > ST Res > BIP 220 (2x YL D12SH) >Pump
Nadeshiko: i7 990 12GB DDR3 eVGA Classified *In Testing... Jealous?*
Miyuki: W3580 6GB DDR3 P6T-Dlx
Lind: Dual Gainestown 3.07
Sammy: Dual Yonah Sossoman cheerleader. *Sammy-> Lind.*
Its my fault.. and no im not sorry about it either.[12:37] skinnee: quit helping me procrastinate block reviews, you asshat. :p
[12:38] Naekuh: i love watching u get the firing squad on XS
I think this is missing the point completely...
Video games naturally produce photorealistic images, but that is not a fun and immersive experience... the point of introducing HDR was to make games more LIFELIKE, so they look like what your EYE sees, not what the camera sees.
The point is, though, that many games do NOT apply HDR realistically, and at that point the photo-realism you speak of looks more real than with their crappy HDR/bloom applied. I feel the same about poorly done AO (ambient occlusion). It's one thing to have a subtle milky ambient light simulating a haze, but when it amplifies right in front of your face each time you head off in a certain direction, it's definitely over the top. Blur is often overdone too.
HDR when it's done well is very nice, true, but keep in mind we are talking about effects that AREN'T done well. Crysis has wonderful, subtle HDR, with nicely done sunset reflections off water and filtered sunlight through trees; IMO though, its AO and blur effects are overdone and annoying. I totally agree about Epic failing with post effects too. Ever since GoW their post-processing has looked like dark, dull crap that desaturates all the color content and makes the game dull, drab and dreary. They're yet another dev that overdoes blur.
It used to be that motion blur was for high speeds in racing games like NFS, which of course makes sense. Velocity sports have always been about reading terrain while trying to keep your eyes from tearing due to wind and/or blur effects; it's where the expression "enough to make your eyes bleed" comes from. Now we have it in lots of games in such an over-the-top way that you see heavy blur just starting off on foot.
Worse yet, you have all these players, most fairly young, who just love excessive blur; they will no doubt find out when they're older why many of us don't like it. Aside from being unrealistic, it causes eyestrain and even headaches. It's funny how most games carry epilepsy warnings in their legal text, yet say nothing about excessive blur, even in games where you can't turn it off or even lower it.
Last edited by Frag Maniac; 12-09-2009 at 05:03 PM.
Well, it's not so mindless. In addition to making games "lifelike" by giving them physical attributes the eye would experience, there is another trend gamers enjoy very much: MOVIE-LIKE effects. Gamers sometimes want to feel like they are watching a movie, and as we all know, movies are filmed with cameras and have these artifacts. So it's really some combination of both eye and camera effects, and it just depends on what the gamer thinks is fun.
Games often get things completely wrong though, because the method is so horrible.
Take depth of field. This completely fails in games. Depth of field means objects at certain depths in your field of view are in focus; it's how we judge the distance of objects along a direct path in our field of view. Blurring parts of the screen is a downright ridiculous, completely useless way to create this, and it just doesn't work if it's not done right. COD:MW1's depth-of-field blurring was absolutely pointless; that's more of a focus blur than anything, and our eyes already do that naturally to the screen when playing an FPS. Crysis had this down slightly better but still needed work: with Crysis's DOF you could actually judge objects' distance from one another along a direct path more easily, but it still had overused, useless blur in places it shouldn't be.
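For reference, the "in focus at a certain depth" behaviour described above comes from the thin-lens circle-of-confusion formula; here is a minimal sketch of it (the lens parameters are made-up illustrative values, and this is not how any of the games mentioned actually implement their DOF):

```python
def circle_of_confusion(depth, focus_dist, focal_len=0.05, aperture=0.028):
    """Blur-spot size on the sensor for an object at `depth` metres when the
    lens is focused at `focus_dist` metres (thin-lens approximation)."""
    return (aperture * abs(depth - focus_dist) / depth
            * focal_len / (focus_dist - focal_len))

# Objects at the focus distance get zero blur; blur grows as objects
# move away from the focal plane, faster toward the camera than away.
in_focus = circle_of_confusion(10.0, 10.0)   # 0.0
near_blur = circle_of_confusion(2.0, 10.0)   # larger than...
far_blur = circle_of_confusion(8.0, 10.0)    # ...this
```

A screen-space DOF pass uses a radius like this per pixel to decide how much to blur, which is why blurring "the wrong parts of the screen" reads as a bug: the radius should come from each pixel's depth, not from its screen position.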
Cameras have a lower dynamic range than the human eye, so it's going to be hard to mimic what we actually see. Anyone with SLR or DSLR experience has seen what happens when you stand outside and take a picture of someone inside a shaded area, or vice versa: situations where we can see the person just fine, the camera renders as a silhouette.
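That silhouette effect can be shown in two lines: a sensor scales scene luminance by an exposure setting and clips whatever falls outside its range (the luminance and exposure numbers below are invented for illustration):

```python
def camera_capture(luminance, exposure, max_level=1.0):
    """A sensor scales scene luminance by exposure, then clips to its range;
    detail outside that range becomes a silhouette or a blown-out white."""
    return min(max(luminance * exposure, 0.0), max_level)

indoor, outdoor = 0.05, 50.0   # hypothetical shaded vs sunlit luminances
# Exposed for the sunlit background: the shaded subject crushes to near-black.
silhouette = camera_capture(indoor, 0.02)
# Exposed for the shade: the sunlit background clips to pure white.
blowout = camera_capture(outdoor, 20.0)
```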
Half-Life 2 had a good example of HDR that's similar to how the eyes react: in the Lost Coast demo, walking in and out of the tunnel (dark to light), you see things very bright at first and then your view adjusts, similar to waking up at 2 AM and turning on the lights.
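That tunnel effect is usually implemented as exposure adaptation: the renderer lags its exposure behind the scene's average luminance, so stepping into the light washes the screen out for a moment before it settles. A rough sketch, assuming a simple exponential lag (the mid-grey key of 0.18 and the time constant are conventional illustrative choices, not Valve's actual values):

```python
import math

def adapt_exposure(exposure, scene_luminance, dt, tau=1.5):
    """Nudge the exposure toward the value mapping the scene's average
    luminance to mid-grey (0.18), closing the gap exponentially over time."""
    target = 0.18 / scene_luminance
    alpha = 1.0 - math.exp(-dt / tau)   # fraction of the gap closed in dt seconds
    return exposure + (target - exposure) * alpha

# Adapted to a dark tunnel (avg luminance 0.05), then step into daylight (5.0):
e = 0.18 / 0.05                  # 3.6, tuned for the dark
e = adapt_exposure(e, 5.0, 0.1)  # first frame outside: still far too high,
                                 # so the screen blooms out, then settles
```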
It doesn't seem like many games use HDR to the extent that they use bloom; or they use too much bloom and people just assume it's HDR.
Phenom 9950BE @ 3.24Ghz| ASUS M3A78-T | ASUS 4870 | 4gb G.SKILL DDR2-1000 |Silverstone Strider 600w ST60F| XFI Xtremegamer | Seagate 7200.10 320gb | Maxtor 200gb 7200rpm 16mb | Samsung 206BW | MCP655 | MCR320 | Apogee | MCW60 | MM U2-UFO |
A64 3800+ X2 AM2 @3.2Ghz| Biostar TF560 A2+ | 2gb Crucial Ballistix DDR2-800 | Sapphire 3870 512mb | Aircooled inside a White MM-UFO Horizon |
Current Phenom overclock
Max Phenom overclock
HDR in photos is actually the same as a game without HDR.
When you look from a dark room toward the bright outdoors, you'll probably be able to see both fairly well (unless it's extreme light against extreme darkness). What an HDR photo does is let you see both equally, the light and the dark places, but neither of them has the correct "light" in it: dark places are usually too light, and light places a bit darker than in real life. It's just a way to put detail from the light and the dark into one picture, usually done by taking light + normal + dark exposures and combining them; after a bit of Photoshop work it creates a somewhat watchable result.
HDR photos are never accurate, because sensors can't capture everything the human eye can in one shot, and combining a few shots needs post-processing, which is never accurate either. Even if it were, you couldn't display it, because the result holds more information than a conventional display can show; displays have limited dynamic range too. Until we get HDR camera sensors and HDR displays, we are left with semi-accurate HDR pics at best; usually HDR photos are used more as art than as a reproduction of the real world.
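The "light + normal + dark" merge can be sketched as a weighted average per pixel, in the spirit of the classic Debevec-Malik method but skipping the camera response-curve recovery (the pixel values and exposure times below are hypothetical):

```python
def merge_brackets(pixel_values, exposure_times):
    """Estimate one pixel's scene radiance from bracketed shots: each shot's
    estimate (value / time) is weighted to favour mid-tones, so clipped or
    near-black samples barely count."""
    def weight(v):
        return max(1e-4, 1.0 - abs(2.0 * v - 1.0))   # hat function peaking at 0.5
    num = sum(weight(v) * v / t for v, t in zip(pixel_values, exposure_times))
    den = sum(weight(v) for v in pixel_values)
    return num / den

# A pixel of true radiance 0.5 shot at three exposure times; the longest
# exposure clips to 1.0, but the merge still recovers roughly 0.5:
radiance = merge_brackets([0.125, 0.5, 1.0], [0.25, 1.0, 4.0])
```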
"HDR" in games actually doesn't have anything to do with high dynamic range at all, because in a normal game you simply see everything, so in a way you have an unrealistically high dynamic range. HDR in games actually lowers your dynamic range: before you leave a dark cave you see the cave and just glare outside; when you step out you get blinded by the light, and then everything around looks normal while the cave goes totally dark. If it's implemented right, that's somewhat near the real world, but usually the effect is overshot so hard it has nothing to do with real life, and it's way slower than in real life. I guess the term "reduced dynamic range" wouldn't sound as good, so they call it HDR, even though it's in fact the opposite.
Though both are trying to be closer to the real world: games by reducing dynamic range and photos by adding it (because camera sensors can't capture as much as the human eye in one shot). It has a lot to do with reality in both cases, but less to do with real HDR in the game case.
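The "reducing dynamic range" step described above is what renderers call tone mapping. A minimal sketch using the well-known Reinhard operator shows how a wide internal luminance range gets squeezed into what an ordinary display can show:

```python
def reinhard(luminance, exposure=1.0):
    """Map an unbounded HDR luminance into [0, 1) for an ordinary display."""
    v = luminance * exposure
    return v / (1.0 + v)

# A 100:1 contrast in the scene squeezes to roughly 10:1 on screen,
# keeping detail at both ends instead of clipping one of them:
dark, bright = reinhard(0.1), reinhard(10.0)
```

Nothing ever reaches exactly 1.0, so even very bright values keep some separation rather than blowing out, which is exactly the "see everything" look the post describes.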
Last edited by Mescalamba; 12-10-2009 at 03:00 PM.
i7 930 D0 - 4,2 GHz + Megashadow
3x4GB Crucial 1600MHz CL8
Foxconn Bloodrage rev. 1.1 - P09
MSI HAWK N760
Crucial M500 240GB SSD
SeaGate ES.2 1TB + 1TB External SeaGate
Corsair HX 850W (its GOLD man!)
ASUS STX + Sennheiser HD 555 (tape mod)
Old-new camera so some new pics will be there.. My Flickr My 500px.com My Tumblr
Projecting a three-dimensional experience onto a 2D screen is going to have a lot of limitations no matter how you do it, and there is no "right" way to do it anyway, because game designers have a lot of different goals for how they want users to experience the game. A lot of the way images are presented comes down to artistic preference or just technical choices. For example, depth of field, which you are talking about, is compensated for by the brain despite being a limitation of the lens of the eye. The question of what we are really "seeing", i.e. the brain's image or the eye's image, is complicated enough that game designers can get away with not really picking a side and following through on all the details. People started dealing with these issues when we first built cameras and video cameras; even today, photos and films can be shot in extremely "artistic" ways, but even the most scientific application of camera technology comes with limitations in displaying information. No matter what, you have to make choices about how you want to present the images, because a 2D screen can never replicate the 3D experience that the eye and the brain create.
I'm not talking photo-realism here. Some games butcher this effect, and other effects as well; it has nothing to do with recreating what I see in real life, and if that ever happens it's light years away. I pretty much understood your entire post; no need for the drawn-out explanation. I am simply saying developers add and create effects in games, and it seems they completely don't understand the effect they are adding, like they did no studying or gathering of information about what they are actually trying to create.
What I'm talking about has nothing to do with limitations or technical reasons. It's just pure mistake.
Last edited by Decami; 12-11-2009 at 07:57 PM.
I really didn't read a lot of your post, but you'll see the first sentence is incorrect if you have done a lot of photo editing. Adding HDR to a photo is exactly like adding it in games; it's the adding part games are not very good at. But it's basically the same thing.
What you have to understand is how it works in games and how it works in photos.
High dynamic range can increase darkness in light areas and lightness in dark areas, either way, and you can use this in photos.
Games only seem to use it one way. Some games with decent HDR do use it both ways, but it seems games almost always go the lightness-in-dark route, basically because it seems more impressive.
So it's not that games do it to the opposite effect, because they don't; they just only use it in one direction.
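The two directions can be sketched as a single tone curve that lifts dark tones and rolls bright tones down; the gamma-style curve and the default amounts here are purely illustrative, not what any particular game or editor actually uses:

```python
def shadow_highlight(v, shadows=0.3, highlights=0.3):
    """Apply both directions at once to a pixel value in [0, 1]: lift the
    dark tones, then roll the bright tones down with a mirrored gamma."""
    lifted = v ** (1.0 - shadows)                       # gamma < 1 opens up shadows
    return 1.0 - (1.0 - lifted) ** (1.0 - highlights)   # mirrored gamma tames highlights

# Dark pixels come up, bright pixels come down, endpoints stay pinned:
low = shadow_highlight(0.1)    # higher than 0.1
high = shadow_highlight(0.9)   # lower than 0.9
```

Using only the first line would be the one-way "lightness in dark" approach the post says games default to; the second line is the "darkness in light" half that photo tools also apply.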
Last edited by Decami; 12-11-2009 at 08:18 PM.