http://www.joystiq.com/2006/06/08/am...mparison-pics/
http://img161.imageshack.us/img161/2...on018mx4df.jpg
a landscape is easier to reproduce because you need less detail, but a close shot is more difficult and you can tell the difference pretty easily
uhhhh huh awesome, this game and the new ET give me reason to want better pc parts! :D
Either way, Still looks nice.
yea....looks pretty damn good.
Am I the only one who had to read the labels to figure which one was which ? :p:
Perkam
that looks very pretty indeed :)
Real life one looks faker than the game :p:
wow i think im going to need to upgrade sometime soon
At first I thought the images were trying to compare different game engines, they look that close.
Anyone seen/remember that image that made fun of the doom/halflife/farcry graphics by showing what a rock would look like in the game?
Sweet
those grass sprites look like poop, other than that solid engine
yes i edited my post
It's not so hard for game developers to make beautiful renders. It's hard to do it realtime with sufficient FPS.
Other than the grass it's a beautiful render, but it has nothing to do with what games will look like using this engine. At least not anytime soon.
Actually they ran that demo on x1900xt crossfire and a dual core cpu, but I forget the type.Quote:
Originally Posted by alexio
it ran fluidly, probably not the best fps but over 20 it looked like.
certainly playable!! and with a little ocing.....
I like the grass, gives off that nice homey burning flame feeling.
That or a mini christmas tree farm
:D
Ok, I didn't read that. But you have to remember that this is only the landscape. It has no other models, physics or AI I guess.Quote:
Originally Posted by brandinb
No :eek:Quote:
Originally Posted by perkam
D'amm that looks good.
Shame it won't be remotely playable on my rig.
That shed looks fantastic. Very nice lighting/shadowing.
I just hope we don't witness the dull indoor textures that plagued Far Cry. Far Cry looked great outside, but once you stepped inside... bleh.
that looks amazing, think i might need to upgrade just for this
as said before the grass does need some work
So... someone took the time to make the exact same shed and find a beach just like the game's, just for comparison?
+1 for effort.
Looking good.
May well be old news but video previews can be found here
http://uk.media.pc.ign.com/media/694/694190/vids_1.html
The video interview on the second page is the most interesting, while the tech demo on page 3 is a bit smaller download. I'm really looking forward to this game :D
that can't be :( i was counting on my own rig to run this game smoothly with nothing more than a graphics card upgrade :slap: .Quote:
Originally Posted by 3NZ0
I think we are 1 generation away from real lifelike nature.
7800gt's don't even run smoothly on the newer games of today :(Quote:
Originally Posted by RAMMAN
as i said, my next upgrade will be my graphics card... in 6-7 months, before Crysis is even released.
I could tell pretty easily. The grass sprites were rubbish but if you look at the valley the floor looks very flat and has almost no depth as if all the details were done in textures with normal/bump/parallax maps. Also notice how the engine still has the fundamental problem with current HDR: It makes everything look very bright rather than realistic.
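To show what I mean about the flat valley floor: normal/bump/parallax mapping only fakes depth by nudging the texture lookup along the view direction, the geometry underneath stays flat, so wide shots lose the effect. A rough C++ sketch of the textbook offset trick (my own toy example, nothing from CryEngine; sampleHeight is just a stand-in for a height-map fetch):
Code:
#include <cstdio>

// Minimal illustration of basic parallax (offset) mapping.
// The surface stays geometrically flat; we only shift the texture
// coordinate along the view direction based on a sampled height.
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Stand-in for a height-map texture fetch, returns height in [0,1].
float sampleHeight(Vec2 uv) {
    return 0.5f;
}

Vec2 parallaxOffset(Vec2 uv, Vec3 viewTS, float heightScale) {
    // viewTS = view direction in tangent space, assumed normalized.
    float h = sampleHeight(uv) * heightScale;
    // Shift UVs toward the viewer proportionally to the sampled height.
    return { uv.x + viewTS.x / viewTS.z * h,
             uv.y + viewTS.y / viewTS.z * h };
}

int main() {
    Vec2 uv { 0.25f, 0.75f };
    Vec3 view { 0.3f, 0.1f, 0.95f };
    Vec2 shifted = parallaxOffset(uv, view, 0.05f);
    std::printf("uv (%.3f, %.3f) -> (%.3f, %.3f)\n", uv.x, uv.y, shifted.x, shifted.y);
    return 0;
}
Since the silhouette and the actual depth never change, the illusion holds up close but a big open valley still reads as flat.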
Very good, I could have really believed the beach was a photo. The shed is slightly more obvious because of the grass, as said before, but the trees in it are very good.
http://img164.imageshack.us/img164/8986/negative1nn.jpgQuote:
Originally Posted by Cobalt
:rolleyes:
On a 4400+, 2GB ram and an XTX? Of course it will, man.Quote:
Originally Posted by 3NZ0
:stick: :stick: :stick:Quote:
Originally Posted by Cobalt
If you've found any other game in the world that's more realistic than this, then say something.....
That game only uses a portion of the DirectX 10 code. Imagine when they release the next game in full DirectX 10 :eek: ;)
WOW, that is impressive, is it prerendered? or real time?
I like this comparison because it shows how overdone and generally bad HDR looks.
I would use www.GameTrailers.com for those videos if they are not available in HD on a download site. GameTrailers has videos at higher resolutions, and usually has videos that no one else has. Make sure to watch the HD version if it has one.Quote:
Originally Posted by Richie P
Here is the link for the Crysis section:
http://www.gametrailers.com/gamepage.php?fs=1&id=2509
BTW, I have a really large HD version (1280x720) of that tech demo on page 3 of that IGN site. It doesn't have sound, though. They also released a second tech demo called "CRYTEK_GDC_VIDEO_2006" which is also a large HD video (1280x720) and it has sound too. I don't remember where I got it, but I don't think whoever had it was asked to remove it.
It was released a long time ago on www.gamershell.com , but they were asked to remove it. Luckily I got it before it was removed :p:
My only complaints about the graphics are the grass and the shape of the volumetric clouds. They fixed the repeating water, though. If you look at the first tech demo, the water didn't look as real and had a repeating texture pattern (looked like FarCry's water). The second tech demo shows more natural-looking water. I hope they do the same with the grass and the clouds. If they find a way to make the grass bend more randomly, and the clouds more stretched and scaled, then the engine will be even closer to virtual reality :)
If anyone who watches the HD Crysis videos starts to complain about the low texture resolutions and the jaggies: what did you expect from two x1900xts running an engine that benefits a lot more from Unified Shaders and SM4 standards?
They have to reduce the resolution a lot so it stays playable. Even at the very low resolution, this engine is the best in graphics so far, so don't be so picky :stick:
If your next-gen card is going to be an SM4 card, I would hold off until this game releases, which is probably between 2007 and 2008.Quote:
Originally Posted by RAMMAN
The r600 will be the only one dominating in SM4 unless nVidia releases something else soon after it. The g80 will most likely be SM3 with close-to-SM4 features, but won't get SM4 certification.
If you have the money to upgrade to that next-gen card and then upgrade half a year later to the other next-gen card designed to meet Crysis's performance demands, then go ahead with the r600 for now :p:
We don't know that yet... unless you have any proof.Quote:
Originally Posted by v_rr
There are no SM4 cards yet to show true SM4 in real-time. And as far as I know, SM4 doesn't make a huge difference in graphics. It's mostly a performance improvement and it also makes game development easier.
The Crysis engine was built using the DirectX 10 or 9.0c API, but SM4 screenshots are still unknown atm. If they ever showed SM4 screenshots, that would mean they have an r600, and by looking at the screenshots you could guess how well the r600 is going to run this game.
EDIT: That reminds me that Halo 2 for PC will probably be the ONLY SM4 game that the r600 can take advantage of at the moment. I'm guessing they have to redo a lot of the Halo 2 engine for the PC since they are forcing SM4 standards (which are strict) onto an SM2 engine. Halo 2 for PC will probably have better graphics while running smoother than in SM2 or SM3, because SM4 could be used to replace some of the effects mimicked by the Xbox SM2 engine version, and it will most likely take advantage of Unified Shaders and other stuff.
Some will probably buy Halo 2 with an r600 just to mod it to run with SM4 effects and use it for SM4 benchmarking in more complex modded environments :p:
BTW, for those who were wondering if this is going to run well in real-time E3 gameplay, look at the videos that I posted for Richie P
http://www.gametrailers.com/gamepage.php?fs=1&id=2509
Looks pretty smooth to me. I don't know what hardware they used for those presentations, though. If you google it, there's probably someone who found out what they were using to run that gameplay footage.
r600 will probably come too late for me, and while g80 will most likely release in time, power consumption is a worry, plus I'm not exactly made of money; I never spend over $350usd on a single part. x1950 xt/x will likely lack the performance I need. Besides, my single core p4 would cripple an r600/g80 lol :slap: .Quote:
Originally Posted by Turok
perhaps a cpu/mobo upgrade is in order instead, followed by the graphics card upgrade in another 6 months :confused: 6 months is 6 months. I should probably ask on this forum what to upgrade then.
I'm not trying to say it isn't the best thing out yet but in my eyes it is still far from photo-realistic.
really?Quote:
Originally Posted by Hicks
awesome:woot: :toast:
Try playing DOD:S with full HDR... it's a pity the Allies didn't have sunglasses as standard issue during World War II.Quote:
Originally Posted by Cobalt
Perkam
LOL! Excessive HDR is for losers! :DQuote:
Originally Posted by perkam
campers love players who use HDR ;)
I don't know, seems to me HDR is just another way to prove which gamers haven't actually experienced a sunny day in Florida.:rolleyes:Quote:
Originally Posted by XS Janus
is that dx9 or dx10
I agree, it is indeed close on the horizon. If you look at how far photo-realism in gaming has come in just the last 2 years, it's gone from incredibly flawed and obviously fake-looking to something like this, where the visual line between a real photo and this rendered gaming experience is becoming extremely thin (though not quite gone). In the near future that line is destined to disappear, resulting in a literal meaning of photo-realism in gaming. It's certainly exciting to watch the progression.Quote:
Originally Posted by XS Janus
Dx9Quote:
Originally Posted by Dillusion
As I said above, the screenshots are DX9 since there isn't a DX10 video card to take the screenshots atm.Quote:
Originally Posted by Dillusion
The difference between DX9 and DX10 is mostly performance and ease of development anyway.
The Crysis engine was coded to run on DX10, so expect higher quality screenshots from an SM4 card like the r600, since the performance increase will allow an increase in IQ without lagging.
Hardware: We saw the game running on an Intel Extreme Edition dual core processor, 2GB of RAM and X1900 CrossFire
Here
A sunny day is a sunny day anywhere. HDR is a fanboy gimmick.Quote:
Originally Posted by Ominous Gamer
:ROTF: :clap: :toast:Quote:
Originally Posted by STEvil
lol , you guys kill me....7800 GT is plenty for games right now!:slap: :fact:Quote:
Originally Posted by OmegaMerc
Sure it is when you sacrifice some stuff on certain high end games.Quote:
Originally Posted by Flexkill
yeah, in call of duty 2 i cant turn on AA+shadows @ 1280x1024 and still get acceptable fps.
Well, I haven't had that problem... and I'm stuck at the same res as you, 1280x1024. That's the main reason I went with the cheaper 7800GT. Why spend a crap load of money on a card that is overkill for 1280x1024 when DX10 is around the bend? Maybe it's your CPU holding you back:shrug: I went from an X850XT to a 7800GT and noticed a pretty big difference. I hate Nvidia... but I got the card so cheap I had to do it. Plus, they are cheap enough now for you to just add another card and go SLI... that should be plenty.Quote:
Originally Posted by RAMMAN
This is kind of funny coming from a Canadian.:slapass:Quote:
Originally Posted by STEvil
well, things are gonna get a whole lot worse if i decide to slap an x1900xtx or something of the like in there one day. Maybe it's got something to do with the 84.21s set to quality and the fact that i haven't overclocked anything just yet? My fps in cod2 and doom3 seem to match the tests done at hardocp. I doubt cod2 would be cpu limited when the fps is as low as 30. I can't go sli, I'm stuck with a non-sli board that only accepts .13 micron/90nm single core p4s :( so i already have the second most expensive cpu my board accepts. My next upgrade is in december, whatever it may be, and i never spend over $350usd on an individual part.Quote:
Originally Posted by Flexkill
You need a new board for sure:( As far as CPU speed... it is important these days more than ever, as fast as these GFX cards are today, but I hear what you're saying though. I play COD2 at 1280x1024, quality drivers, 2xAA 8xAF, and I'm not in the 30's often or at all that I know of. 2 gigs of mem helps a lot as well;) as I see you have now that I look... DOH!!!Quote:
Originally Posted by RAMMAN
Well, to be honest I looked at them briefly and then had to scroll up again. But upon further inspection you can see one side is lacking detail, though it still looks damn awesome.
The other thing that is quite noticeable is the shadowing. An object in real life casts a different shadow based on the position of the sun and the distance between the object casting the shadow and the object the shadow is being cast on. You can see this clearly in the shed shot, where all the shadows in the Crysis photo look really dark. In real life, some are lighter and some are darker.
That is something they will probably leave for last. Shadows and parallax mapping aren't easy as it is anyway.
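For anyone curious, the distance-dependent look described above is roughly what percentage-closer soft shadows approximate: the further the blocker is from the surface, the wider and softer the shadow edge should get. A toy C++ sketch of just that relationship (my own made-up numbers, not claiming this is how the CryEngine shadowing actually works):
Code:
#include <algorithm>
#include <cstdio>

// Toy illustration of why shadows should soften with distance:
// penumbra width grows with the gap between the occluder and the
// surface receiving the shadow (the PCSS-style approximation).
float penumbraWidth(float receiverDepth, float blockerDepth, float lightSize) {
    // Larger gap between blocker and receiver -> wider, softer penumbra.
    return lightSize * (receiverDepth - blockerDepth) / std::max(blockerDepth, 1e-4f);
}

int main() {
    const float lightSize = 0.5f; // arbitrary "size" of the sun disc for this demo
    // Shadow cast by a nearby object (e.g. the shed wall onto the ground right below it)
    std::printf("near blocker: penumbra ~ %.3f\n", penumbraWidth(10.0f, 9.8f, lightSize));
    // Shadow cast by a distant object (e.g. a tall tree onto ground far away)
    std::printf("far blocker:  penumbra ~ %.3f\n", penumbraWidth(10.0f, 4.0f, lightSize));
    return 0;
}
The near case gives a tiny penumbra (a sharp, dark-looking edge), the far case a much wider one, which is exactly the variation that's missing when every shadow in the shot looks equally hard and dark.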
here is the hardocp testing i was comparing with:Quote:
Originally Posted by Flexkill
http://enthusiast.hardocp.com/articl...50aHVzaWFzdA==
as you can see, they used an fx-55, had textures, normal mapping and dynamic lights set to normal, and still got lower fps than me. Mind you, I have shadows set to no, smoke edges set to off and number of bodies set to tiny.
That test sux.... Why is it when people compare cards they put the faster cards with more AA and crap... that's not a real comparison... everything should be the same.:( I mean, why do they not have the same settings for the 7900GT as they do for the 7800GT so we can tell the real speed difference:slapass:Quote:
Originally Posted by RAMMAN
It's not meant as a comparison...
They show the HIGHEST possible playable IQ settings.
Why? I'm even going to play into what you seem to be suggesting and point out the high reflectivity of snow.Quote:
Originally Posted by Ominous Gamer
interesting... I started lowering the res in cod2 and my average and maximum framerate went up, but the minimum framerate didn't change much. 1280x1024@2xAA is almost as demanding as 1024x768@8xAA (enabled through the driver), so res affects framerate a lot more than AA/AF. Minimum framerate didn't go up to 60 until the res was set to 800x600 LOL. So if I was to upgrade to an x1900xtx I would probably have an average fps above 60, maximum fps up to 200, but minimum fps might still drop below 40. So I think I will get a cheap x1900 at christmas and then a cpu+mobo in may (thank god my birthday is not too long after vista's release lol). BTW I noticed in cod2 my x800pro has noticeably better iq than my 7800gt at any given setting ;) but I can't crank up the res or AA with the x800pro so I don't use it.Quote:
Originally Posted by Flexkill
COD2 should win an award for worst-coded crappy engine of all time.
lol, that would explain the low minimum framerate in cod2. A while ago I set all graphics settings to minimum, stared at the sky in the 2nd tank level and got 450fps.Quote:
Originally Posted by Hicks
Quote:
Originally Posted by LOE
I already have eyes that do that for me though.
Back to the pictures, just gorgeous. It appears that the most difficult thing to recreate in the outdoors environment is foliage/grass but it's still amazing how far graphics have come. Hopefully, AI will make such a jump in quality soon.
The difference may be more obvious for LCD users since it looks sharper, but
please consider the following before criticizing the game or engine::slapass:
1) The Crysis engine is probably the first engine built from the ground up that fully supports SM4, and it's looking very good in SM3 mode. It's probably even superior to the Unreal Engine 3 (SM3) in foliage and outdoor environments.
2) Those screenshots are GPU limited by an x1900 CF solution that doesn't benefit from an SM4 architecture. It's unknown how far this engine can go without the right hardware. Probably with an r600 or a g90 we will be able to see more.
3) Be realistic about texture quality. They are not going to put high resolution textures with low compression in such a large environment. Then you would be complaining about why the game requires so much video and system memory, or why only $5000 PC setups can run it. :rolleyes:
4) If you look at the older screenshots and videos and compare them to the current ones, you will notice that the engine has improved, so it's likely that some graphical defects will be corrected during the development of Crysis.
5) The full potential of the engine is unknown. Some of the imperfections may be due to imperfections in texturing, mapping, or modeling.
The picture below is a good example pointing out some of the things I mentioned above. If you look closely you will see that some things look better than in the picture on page 1, like the grass.
This shows you that: 1. they are improving the engine; 2. the engine has greater potential if the hardware is available or more detail is put into the game; 3. the bad resolution is due to hardware limitations and not engine limitations.
http://img161.imageshack.us/img161/2...on018mx4df.jpg
http://pcmedia.ign.com/pc/image/arti...9092931879.jpg
And I don't know why a lot of people are complaining about HDR. :stick:
IMO, games look a lot more dynamic and better with HDR than without it. Even if it's not exact in timing and intensity vs real life, it still looks better than having the same boring lighting.
BTW, the shadowing is not solid black. On a CRT it may look solid, but it's not completely black like some Doom3 engine shadows.
The HDR effect also contributes to this, since in the shade the gap in contrast will be lower.
The difference in shadow and light intensity is also due to HDR. The volumetric clouds move and have different shapes, so the second picture above was probably taken while the clouds were blocking a bit of the sun, which is why it may have less HDR, making it look less intense than the first picture above it. It could also be the HDR effect adjusting itself.
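For what it's worth, "the HDR effect adjusting itself" usually means auto-exposure: the engine tracks an average scene luminance that drifts toward the current lighting and feeds it into a tone mapping curve. A very rough C++ sketch of that idea (Reinhard-style curve; the adaptation rate and luminance values are invented for illustration, this is not Crytek's actual code):
Code:
#include <cmath>
#include <cstdio>

// Toy model of HDR auto-exposure: a running "adapted" luminance drifts
// toward the current scene luminance, and a simple Reinhard-style curve
// maps HDR values into displayable [0,1). All constants are made up.
struct Exposure {
    float adaptedLum = 0.1f;                       // start adapted to a dark interior

    void update(float sceneLum, float dt, float rate = 1.5f) {
        // Exponential drift toward the current scene luminance.
        adaptedLum += (sceneLum - adaptedLum) * (1.0f - std::exp(-dt * rate));
    }

    float toneMap(float hdr, float key = 0.18f) const {
        float scaled = hdr * key / adaptedLum;     // expose relative to adaptation
        return scaled / (1.0f + scaled);           // compress highlights smoothly
    }
};

int main() {
    Exposure eye;
    const float sunlitScene = 4.0f;                // step out of the shed into the sun
    for (int frame = 0; frame < 5; ++frame) {
        eye.update(sunlitScene, 0.5f);             // half a second per step, for brevity
        std::printf("t=%.1fs adapted=%.2f  midtone->%.2f\n",
                    0.5f * (frame + 1), eye.adaptedLum, eye.toneMap(1.0f));
    }
    return 0;
}
When the adaptation or the curve is tuned too aggressively you get the washed-out, everything-glows look people are complaining about; tuned gently it just mimics your eyes adjusting.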
Yes, HDR does look better, but it still isn't anywhere near how real light interacts with objects and our eyes.
Yes, this is the best looking engine yet and it makes use of all the next-gen bells and whistles, but anybody who says they can't tell the difference between this and real life needs to spend less time playing games.
We will get photo-realism, maybe even in the next 10 years or so, but we need a radical overhaul of how we construct virtual environments before that can happen. Traditional textures don't work because there is a limit to how detailed you can make them for each generation of hardware. Tiling occurs on landscapes and multiple copies of an object look exactly the same. We will need procedural generation of textures and models (rough sketch after this post), with physics that affects every polygon and interacts with the textures directly.
Sprites will have to go. Lighting will have to be improved. Water will have to use fluid dynamics equations. It's a lot of work, and I don't wish to demean such a great looking game. It really is spectacular, but I get annoyed when claims of photo-realism are made at this stage in our hardware development.
Just look at movies such as The Matrix where there is heavy integration of CG and real footage, with fantastically detailed models and textures of huge proportions, all of it pre-rendered. Yet it is still quite easy to tell the difference.
Anyway. I do look forward to this game and I just hope my X1800XT will be able to run it on something other than lowest settings lol.
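To make the procedural texture point above a bit more concrete: the usual way to kill visible tiling is to generate or perturb detail from a noise function evaluated in world space, so no two patches of terrain repeat exactly. A minimal value-noise sketch (the hash constants and blending are my own illustration, not any particular engine's implementation):
Code:
#include <cmath>
#include <cstdio>

// Minimal 2D value noise: hash the integer lattice, then smoothly
// interpolate between the corner values. Evaluating this in world
// space gives a pattern that never repeats the way a tiled bitmap does.
static float hash2(int x, int y) {
    unsigned int h = static_cast<unsigned int>(x) * 374761393u
                   + static_cast<unsigned int>(y) * 668265263u;   // arbitrary primes
    h = (h ^ (h >> 13)) * 1274126177u;
    return static_cast<float>(h & 0xFFFFFF) / static_cast<float>(0xFFFFFF);
}

static float smoothstep01(float t) { return t * t * (3.0f - 2.0f * t); }

float valueNoise(float x, float y) {
    int xi = static_cast<int>(std::floor(x)), yi = static_cast<int>(std::floor(y));
    float u = smoothstep01(x - xi), v = smoothstep01(y - yi);
    float a = hash2(xi, yi),     b = hash2(xi + 1, yi);
    float c = hash2(xi, yi + 1), d = hash2(xi + 1, yi + 1);
    return (a + (b - a) * u) * (1 - v) + (c + (d - c) * u) * v;   // bilinear blend
}

int main() {
    // Two far-apart patches of "terrain" get different detail, unlike a tiled texture.
    std::printf("%.3f %.3f\n", valueNoise(3.7f, 12.2f), valueNoise(803.7f, 912.2f));
    return 0;
}
In practice you'd stack a few octaves of this and use it to blend between grass/dirt/rock layers rather than show the raw noise, but since it's computed from position instead of stored, detail never repeats and the memory cost stays flat.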
Funnily enough you already get that effect when going from a dark area in a game to a light area. :rolleyes:Quote:
Originally Posted by LOE
I only just noticed this post, but overbright is not a "bug"; it is part of the way HDR is used. It is meant to make the lighting more obvious so that people don't feel cheated when they turn off HDR and see very little difference. HDR should be a far more subtle effect, but that just isn't the case in any games we've seen so far.Quote:
Originally Posted by LOE