IMO, the game is poorly optimized. Seeing as it doesn't do much different from Doom 3, HL2, or Far Cry, and it runs with half the frames on even the fastest hardware, I'd say the engine is a piece of crap.
Kentsfield Q6600@ 3100Mhz
Asus P5B-E Plus
2GB GSkill DDRII PC6400GBHZ
Albatron Geforce 8800GTX
Silverstone Temjin TJ-09
Thermalright Ultra 120
Corsair 620W HX
74GB Raptor/ X-Fi Platinum
19" CRT NEC 930 SB
Windows XP Pro SP2
Originally Posted by Hicks: "...I'd say the engine is a piece of crap."

I'm sure the programmers would prefer to refer to it as a "demanding revolutionary engine setting new standards". While the only standard it may be setting is the hardware requirement for an abysmal framerate, it sure sounds a lot better!
In short, both AMD and NVIDIA discovered that their next-generation graphics cards are superior to each other's last-generation graphics cards.
Originally Posted by Hicks: "...it runs with half the frames on even the fastest hardware..."

Indeed, I don't understand why people like this game so much. Half-Life 2 looks much better at 1280x1024 4xAA/8xAF than FEAR at 1024x768 no AA/AF, but they still run at the same framerate at those different settings.
Blue Dolphin Reviews & Guides
Blue Reviews:
Gigabyte G-Power PRO CPU cooler
Vantec Nexstar 3.5" external HDD enclosure
Gigabyte Poseidon 310 case
Blue Guides:
Fixing a GFX BIOS checksum yourself
98% of the internet population has a Myspace. If you're part of the 2% that isn't an emo bastard, copy and paste this into your sig.
Originally Posted by alexio: "Half-Life 2 looks much better at 1280x1024 4xAA/8xAF than FEAR at 1024x768 no AA/AF, but they still run at the same framerate..."

I haven't played this game yet, but comparing 4xAA/8xAF with no AA/AF is just wrong.

But from Anandtech:

Originally Posted by Anandtech: "Seeing the pictures gives you a little better idea of how the soft shadow option looks in FEAR. Since it's not impressive and it gives the game a major performance hit, we don't see any reason to enable it. It might look good with AA enabled, but unfortunately, as of right now, both soft shadows and AA can't be enabled at the same time. They might allow this in some later patch, but as we've shown by our performance tests, the cost to performance would be almost too great to think about."

So they conclude that the soft shadows are no good. Then you get this performance.

And this is the conclusion they come up with? What a biased reviewer.

Also, where the heck is the X850 XT or the X800 XL?
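As an aside on why that soft-shadow option hits framerates so hard, here's a rough back-of-the-envelope in Python. It assumes a generic multi-tap shadow-filtering approach with made-up sample counts; it's only a sketch of the kind of extra per-pixel work involved, not what Monolith's engine actually does:

```python
# Illustrative only: how much extra shadow-filtering work multi-tap
# "soft" shadows imply versus a single hard shadow-map lookup.
# Assumed numbers: 1024x768 render target, 1 tap for hard shadows,
# 16 jittered taps for a soft penumbra.
width, height = 1024, 768
pixels = width * height

hard_taps = 1    # assumption: one shadow-map sample per pixel
soft_taps = 16   # assumption: sixteen samples per pixel

for fps in (50, 25):  # roughly the framerates being discussed in this thread
    hard = pixels * hard_taps * fps / 1e6
    soft = pixels * soft_taps * fps / 1e6
    print(f"{fps} fps: {hard:.0f}M hard-shadow samples/s vs "
          f"{soft:.0f}M soft-shadow samples/s ({soft_taps}x the work)")
```

An order-of-magnitude more shadow sampling per pixel is the sort of multiplier that turns a playable framerate into a slideshow, which lines up with Anandtech's "almost too great to think about" remark.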
Much more testing here:
http://hardware.gamespot.com/Story-ST-x-2661-x-x-x
They test different RAM and CPU speeds and even GTX SLI.
Everything extra is bad!
Originally Posted by ahmad: "I haven't played this game yet, but comparing 4xAA/8xAF with no AA/AF is just wrong."

What would you like most if you look at IQ only (and fps, of course)?
HL2 1024*768 No AA/AF 80fps
HL2 1280*1024 4AA/8AF 50fps
FEAR 1024*768 No AA/AF 50fps
FEAR 1280*1024 4AA/8AF 25 fps
There's nothing wrong with comparing No AA/AF to 4xAA/8xAF if one game is badly optimized and can't run a certain setting. After all, it's about what setting you can actually play at; nobody cares if one card can do 75 fps and the other 80 fps at a resolution if no other resolution can be chosen.
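For what it's worth, here is the arithmetic on the framerates listed above, as a quick Python sanity check (the numbers are the ones quoted in this thread, nothing more):

```python
# Framerates quoted earlier in the thread (same hardware, two settings).
fps = {
    "HL2":  {"1024x768 no AA/AF": 80, "1280x1024 4xAA/8xAF": 50},
    "FEAR": {"1024x768 no AA/AF": 50, "1280x1024 4xAA/8xAF": 25},
}

for game, results in fps.items():
    low_setting = results["1024x768 no AA/AF"]
    high_setting = results["1280x1024 4xAA/8xAF"]
    drop = (low_setting - high_setting) / low_setting * 100
    print(f"{game}: {low_setting} -> {high_setting} fps "
          f"({drop:.0f}% drop for the higher setting)")

# FEAR with everything off matches HL2 with 4xAA/8xAF at a higher
# resolution (50 fps), which is exactly the comparison being argued over.
```

So HL2 gives up about 38% of its framerate for the higher setting while FEAR gives up 50%, and FEAR's best case is HL2's worst case.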
Originally Posted by alexio: "There's nothing wrong with comparing No AA/AF to 4xAA/8xAF if one game is badly optimized and can't run a certain setting..."

You could turn the texture quality down to medium and you would see similar performance to HL2, I think.

Excellent review, this; Anandtech is always there with good, proper reviews... bless them.

That's not biased, dude... that's the cold hard truth:

Originally Posted by Anandtech: "This is very old news by now but we have to mention it yet again. The fact that ATI still has no competitor for the 7800 GTX yet means that lots of FEAR players will be looking to NVIDIA for their graphics solution. This puts ATI behind again, and with games like Quake 4 coming out soon, things are looking even worse for ATI than they already have been. We were happy to see that ATI is at least coming out with high end parts, but where is the 7800 GTX competition? We need to see the X1800 XT on shelves with a competitive price soon, or there won't be much that can help ATI, especially with the rumors about what's coming down the pipe from NVIDIA."
Intel Core i7-3770K
ASUS P8Z77-I DELUXE
EVGA GTX 970 SC
Corsair 16GB (2x8GB) Vengeance LP 1600
Corsair H80
120GB Samsung 840 EVO, 500GB WD Scorpio Blue, 1TB Samsung Spinpoint F3
Corsair RM650
Cooler Master Elite 120 Advanced
OC: 5Ghz | +0.185 offset : 1.352v
Originally Posted by Tim: "That's not biased, dude... that's the cold hard truth."

Don't half quote me, please. Quote the image as well if you wish to do so. ATI performs better in FEAR with no soft shadows, so how would it make sense to recommend NVIDIA?

I am not liking Anandtech's reviews as of late. The initial review of the X1800 XT was pretty selective, and they got harassed badly for it; that's why they released a second, more comprehensive one.
Originally Posted by ahmad: "ATI performs better in FEAR with no soft shadows, so how would it make sense to recommend NVIDIA?"

I think their point was that maybe the X1800 XT does perform better, but it is not viable since the card is not available while the game is, making a GTX the choice... for now. Why recommend something you can't buy?
.........current project, make the 135i faster
Originally Posted by ahmad: "Don't half quote me, please."

I didn't quote you, I quoted Anandtech, IIRC.

Look, the GTX is available and the XT is not... the GTX is much cheaper, the XT more expensive... there is a difference of 2 fps... which one would you recommend?
No brainer.
If they wanted a demanding game for benchmarks, tell them to use WZ21 with all the eye candy on... a quad dual-core Opteron setup with two X1800 XTs or 7800 GTXs and 16GB of RAM can barely get more than 55 fps.
I get 61 fps, but I dream of getting to use the Reaper mod... (the perfect AI game engine).
Fast computers breed slow, lazy programmers
The price of reliability is the pursuit of the utmost simplicity. It is a price which the very rich find most hard to pay.
http://www.lighterra.com/papers/modernmicroprocessors/
Modern Ram, makes an old overclocker miss BH-5 and the fun it was
Tim, every card there has a counterpart, except the 6800GT. At least an X800XL should've been there. And where's the paragraph on the fact that NVIDIA's next-gen mainstream and low-end parts, supporting full HDR, H.264 decode and whatnot, are MIA?
As I've said a million times (and Anandtech just joined the group that doesn't listen), there are two sides to every story; balance the scale at the very least if you have to make comments like that.
Perkam
You could understand the slow performance if the game engine were totally amazing, but it's not. Fact is, it runs like crap for what it is.
I can't run 1600x1200 max details with any AA.
I can sweep through Far Cry, HL2, and D3 at those settings, and they look just as good, if not better.
Originally Posted by perkam: "Tim, every card there has a counterpart, except the 6800GT. At least an X800XL should've been there."

That has got NOTHING to do with what I quoted... maybe your vision is clouded... or is that maybe because there's an ATI rep around here, eh?

GTX vs. the XT is what I quoted... that decision is a no-brainer...
This engine is amazing; it's just the theme/artwork that lets you down. All the textures are bland and boring. If they had made Quake4 or Max Payne with this engine, then you'd truly see it shine.
Originally Posted by Tim: "...or is that maybe because there's an ATI rep around here, eh?"

I'm sure that has nothing to do with it, man, and that's some cold stuff to say. There are OCZ and G.Skill reps here and I've seen both companies bashed at some point. It's not like the mods here cover up shiat about products (or are biased) just because there are reps here. IMO it's awesome for companies to have reps in the forums; just look at what OCZ does: they listen to their customers and try to please them.
Originally Posted by HKPolice: "If they had made Quake4 or Max Payne with this engine, then you'd truly see it shine."

You're not looking at the big picture. The goal of Q4 is to bring back the glory days of Q2 multiplayer, or rather Quake multiplayer in general. So it would have to be a less demanding engine that uses fewer resources and favors more robust netcode instead. It will most likely be lower IQ than Doom III; despite using a modified Doom III engine, I suspect the quality will be downplayed rather than upgraded.
Though, I don't really care since in the grand scheme of things, graphics mean basically nothing.
TTTT, I don't really expect much from Q4 after Q3... maybe if Raven focuses on fixing the mistakes id made with Q3, it would be worth the $50.
Sigs are obnoxious.
Originally Posted by Ubermann: "They test different RAM and CPU speeds and even GTX SLI."

I like how GameSpot did a variety of different tests on subsystems, but I am curious why they tested the extra-RAM benefit at only 1024x768. Wouldn't it be a better idea to test this at higher resolutions, or am I completely wrong in my assumption that a higher resolution will consume more system memory, not just GPU memory?
Maybe the aperture; it shouldn't make any significant difference in terms of the variables you have stored in memory.
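To put a rough number on that, here's a quick Python estimate of how much memory the resolution-dependent buffers actually take. The buffer layout is an assumption (32-bit colour, double-buffered, plus a 32-bit depth/stencil buffer), not something measured from FEAR:

```python
# Rough size of the resolution-dependent render buffers.
# Assumptions: 4 bytes per pixel, front + back + depth/stencil buffer.
def render_buffers_mb(width, height, bytes_per_pixel=4, buffer_count=3):
    return width * height * bytes_per_pixel * buffer_count / (1024 ** 2)

for width, height in ((1024, 768), (1280, 1024), (1600, 1200)):
    print(f"{width}x{height}: ~{render_buffers_mb(width, height):.0f} MB")
```

Even at 1600x1200 that works out to roughly 22 MB, and it sits in video memory (or the aperture) rather than system RAM, so testing the extra-RAM benefit at 1024x768 versus a higher resolution shouldn't change much; the bigger consumers of system memory are textures, level data, and the like.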
Originally Posted by iddqd: "It will most likely be lower IQ than Doom III..."

No, I'm talking about Quake 4 singleplayer, which looks better than Doom 3. It actually looks better than FEAR overall because of the futuristic theme, but the polygon count is horrible compared to FEAR. I guess they intentionally made FEAR look so dark and gloomy/boring to create atmosphere. I've finished both games, so I know first hand.
Originally Posted by ahmad: "Also, where the heck is the X850 XT or the X800 XL?"

Uhm, they also forgot to mention that the X1300 Pro is nowhere to be seen (at least not on Newegg or Froogle right now), so they should add another * to the graphs...

Oh, suddenly he's not so biased!
And yes, adding the X8xx's would be nice...
Remember how X1800 completely OWNED in the FEAR demo?
I wonder what Monolith's programmers did to let the 7800 series catch up...
oh man
Originally Posted by Shadowmage: "I wonder what Monolith's programmers did to let the 7800 series catch up..."

I don't think they did anything... I believe it's the 81.85 drivers...