All kinds, if they're good - mostly RPG and some RTS...
- the other model (HP LP1965) - which was based on a P-MVA panel, had a little input lag and some ghosting, but the black level and viewing angles were as close as possible to a CRT.
- the new model (LG W2220P) - which is based on an e-IPS panel, has no noticeable input lag and no ghosting, which is a bonus for FPS games - yet black levels are not its strong point (IPS glow related) and the viewing angles are pretty good, but not as good as on the other one or on a CRT. Also, color reproduction is better on this one...
....compared to TNs - well, there's no room for comparison... Games are one thing, but I also use this monitor for movies, some editing and so on... True, it's not good for 3D movies, but I'm not that desperate for 3D to watch it on a monitor with a TN panel... And when I said games are one thing, I didn't mean it's not as capable as a fast TN:
LG W2220P vs. Asus VG236H
... I was referring to a TN on which, beyond FPS games - where you stay focused on the screen like a statue - other games and watching a movie could be more enjoyable on the model from LG (by my standards at least - I don't play FPS like CS 1.6 anymore, which was optimized with 90 FPS in mind for smoother gameplay, while new shooters work great even at 30+ FPS and leave no room for complaining at 60 FPS).
Last edited by XSAlliN; 11-19-2010 at 01:20 AM.
new shooters working well at 30fps+? huh?
if you don't notice the difference between 30fps and up when playing shooters, i'm not surprised you think IPS monitors are fine for gaming :P
i know the image quality is better on IPS panels, but if you're playing games, what's the point?
and even if you watch action movies, some panels might have slight ghosting in fast scenes... i prefer less intense colors and high refresh rates...
crossfire at 5760x1080 and dropping under 90fps is highly noticeable.
at 30fps i can't even hit a target.
4670k 4.6ghz 1.22v watercooled CPU/GPU - Asus Z87-A - 290 1155mhz/1250mhz - Kingston Hyper Blu 8gb -crucial 128gb ssd - EyeFunity 5040x1050 120hz - CM atcs840 - Corsair 750w -sennheiser hd600 headphones - Asus essence stx - G400 and steelseries 6v2 -windows 8 Pro 64bit Best OS used - - 9500p 3dmark11(one of the 26% that isnt confused on xtreme forums)
Cut the BS-ing, it's getting pathetic - seriously now... When Crysis was released, few cards could run the game with everything maxed at 30+ FPS at 1680 x 1050 or higher... yet I didn't see many complaining they couldn't play the game unless it was close to 90 FPS. That game runs more than fine and real smooth at 60 FPS, because it wasn't optimized with 90 FPS in mind... I did see that difference in CS 1.6 - but CoD 4 was running real smooth at 60 FPS.
No offense, but if you can't hit a target at 30+ fps you probably have some kind of handicap - eyes, hands... you tell me. I don't see how more FPS could help you in that situation... who knows, maybe some strange and rare disease that makes you see blurred images or have very bad coordination at 30+ FPS... Can't say for sure - but your problem is a specific one and doesn't apply to the rest of us... I don't mean to sound offensive, sorry for your problem, but you shouldn't add it to this discussion since it's not relevant in any way.
Last edited by XSAlliN; 11-19-2010 at 05:35 AM.
I'd take any TN panel 120Hz LCD monitor over a 60Hz IPS monitor; if 120Hz LCD monitors hadn't been released I'd still be using a CRT today.
Intel? Core i5-4670K @ 4.3 GHz | ASRock Extreme6 Z87 | G.Skill Sniper 2x8GB @ DDR4-1866 CL9 | Gigabyte GTX 970 OC Windforce 3x | Super Flower Titanium 1000W | ViewSonic VX2268wm 120Hz LCD | Phanteks PH-TC14PE | Logitech MX-518 | Win 7 x64 Professional | Samsung 850 EVO & 840 Pro SSDs
If all people would share opinions in an objective manner, the world would be a friendlier place
it depends on the game.
did you ever play a fast paced online shooter (UT, Quake, ...) at 30fps? you won't have a chance at all.
even quake4, which is capped at 60fps, feels jerky. i couldn't stand that and quit it after only a few weeks. for me, such shooters need at least 85fps to be considered smooth.
the only reason crysis looks pretty smooth at 30fps is the motion blur. this doesn't change the fact that your mouse movement is still linked to the fps. the fewer fps you have, the more inaccurate the mouse.
e.g. the ice level in crysis where you leave the alien ship - my fps were somewhere in the 20s and i had a hard time dealing with the drones as it felt so jerky.
well, you can always say this and that is enough to "play" it, but to really have fun or be good at something you'll need more than that.
if you're happy with 30fps, well that's your thing - it's purely subjective. it's only your personal perception, nothing more, nothing less.
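The claim above that mouse movement is linked to the frame rate can be sketched numerically. This is my own toy illustration (not code from any specific engine): assuming a game samples mouse input once per rendered frame, the interval between samples grows as the frame rate drops.

```python
# Sketch (my own illustration, not from any engine's source): if input is
# sampled once per rendered frame, low FPS means coarser mouse sampling.
def input_interval_ms(fps: float) -> float:
    """Milliseconds between input samples at a given frame rate."""
    return 1000.0 / fps

for fps in (30, 60, 85, 120):
    print(f"{fps:>3} fps -> one mouse sample every {input_interval_ms(fps):.1f} ms")
```

At 30 fps the game only sees your mouse about every 33 ms, versus roughly 8 ms at 120 fps, which is one concrete way low frame rates can hurt aim regardless of the display.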
1. Asus P5Q-E / Intel Core 2 Quad Q9550 @~3612 MHz (8,5x425) / 2x2GB OCZ Platinum XTC (PC2-8000U, CL5) / EVGA GeForce GTX 570 / Crucial M4 128GB, WD Caviar Blue 640GB, WD Caviar SE16 320GB, WD Caviar SE 160GB / be quiet! Dark Power Pro P7 550W / Thermaltake Tsunami VA3000BWA / LG L227WT / Teufel Concept E Magnum 5.1 // SysProfile
2. Asus A8N-SLI / AMD Athlon 64 4000+ @~2640 MHz (12x220) / 1024 MB Corsair CMX TwinX 3200C2, 2.5-3-3-6 1T / Club3D GeForce 7800GT @463/1120 MHz / Crucial M4 64GB, Hitachi Deskstar 40GB / be quiet! Blackline P5 470W
Crysis is a single-player game (MP is just an afterthought); you can't compare it to a competitive online shooter.
When the other guy says that at 30 frames he can't hit jack, I believe that to be the case when playing against good players in a competitive shooter.
If you like pretty tech demos like Crysis then more power to you, stick to them, just don't try to educate people on a matter that you have no idea about.
---
---
"Generally speaking, CMOS power consumption is the result of charging and discharging gate capacitors. The charge required to fully charge the gate grows with the voltage; charge times frequency is current. Voltage times current is power. So, as you raise the voltage, the current consumption grows linearly, and the power consumption quadratically, at a fixed frequency. Once you reach the frequency limit of the chip without raising the voltage, further frequency increases are normally proportional to voltage. In other words, once you have to start raising the voltage, power consumption tends to rise with the cube of frequency."
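The cube-law claim in that quote can be checked with a toy model. This is my own numerical sketch, assuming dynamic power P = C·V²·f (with C folded into the units) and, as the quote states, that voltage must rise proportionally with frequency once the chip's stock frequency is exceeded.

```python
# Toy model of CMOS dynamic power: P ~ V^2 * f (capacitance folded in).
# Assumption from the quote: beyond the base frequency, voltage must be
# raised proportionally with frequency, so power grows with f cubed.
def dynamic_power(freq_ghz: float, base_freq: float = 3.0, base_volt: float = 1.0) -> float:
    volt = base_volt * max(1.0, freq_ghz / base_freq)  # V flat, then proportional to f
    return volt ** 2 * freq_ghz

p_stock = dynamic_power(3.0)  # stock frequency, stock voltage
p_oc = dynamic_power(4.5)     # +50% frequency needing +50% voltage
print(p_oc / p_stock)         # 1.5**3 = 3.375: cube-of-frequency growth
```

A 50% overclock that needs 50% more voltage costs roughly 3.4x the power, which matches the "power rises with the cube of frequency" statement.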
+++
1st
CPU - 2600K(4.4ghz)/Mobo - AsusEvo/RAM - 8GB1866mhz/Cooler - VX/Gfx - Radeon 6950/PSU - EnermaxModu87+700W
+++
2nd
TRUltra-120Xtreme /// EnermaxModu82+(625w) /// abitIP35pro/// YorkfieldQ9650-->3906mhz(1.28V) /// 640AAKS & samsung F1 1T &samsung F1640gb&F1 RAID 1T /// 4gigs of RAM-->520mhz /// radeon 4850(700mhz)-->TRHR-03 GT
++++
3rd
Windsor4200(11x246-->2706mhz-->1.52v) : Zalman9500 : M2N32-SLI Deluxe : 2GB ddr2 SuperTalent-->451mhz : seagate 7200.10 320GB :7900GT(530/700) : Tagan530w
That's also true, it does depend on the game - I didn't say UT would run nicely at 30 FPS, but it's more than playable at 60 FPS... Warsow for example, which was built on an enhanced version of the Quake II engine, runs pretty smooth at 60 FPS, yet those games can also be played better at lower resolutions (800 x 600 being one of the best). But these kinds of scenarios are valid for "professional FPS gaming", not enthusiasts with their high-end rigs and Eyefinity setups... If you put two professional gamers head to head - one on a CRT at 800 x 600 versus one with a high-end rig and an Eyefinity setup - I'm pretty sure the one with lower settings would not only own him, the other might even play like a noob... Eyefinity may look fun for a few seconds, but it's a joke for professional gaming in fast-paced FPS - that extra field of view being a burden, not a bonus, and it also makes it harder to bunny hop around the map... (I've been a witness to a similar situation, so it's not just a theory).
PS. Quake 4 was a fail when it comes to MP, not a true successor of the Quake series - so it's more a general scenario for that game, not a specific one... capped is one thing, "optimized" is another... In Q4's case, even if they removed that cap it could still run jerky even at 90 FPS... As far as I know that engine is capable of more than that, but it was capped at that value for some reason. As you put it, it depends on the game - others optimized for 60 FPS can run just fine.
Last edited by XSAlliN; 11-19-2010 at 06:48 AM.
People that are different from you must be BSing
I also can't hit things at 30fps. Hell, I can't even play properly at 60fps; it's uncomfortable. Yes, I don't play Crysis, Doom 3, Quake 4, etc. because I like to beat my shooters on the hardest difficulty, and I can't do that at 60fps because there is too much input lag.
Many years of competitive shooters like CS make you really sensitive to the input lag of many newer games.
No, just those that are BS-ing...
Well, I know a guy like you - he still has the same PC as 7 years ago and never felt the need for an upgrade - CS 1.6 still runs decently on that system and of course he has a CRT. He was pretty good in CS competitions - playing at 640 x 480 and 100 FPS, as many CS pros did back then and some still do (640 x 480 / 800 x 600 being the most used resolutions). FPS aside, playing a shooter at a resolution higher than 800 x 600 is a major drawback for him...
As for me, well, I'm not a professional gamer - never was for that matter (yet I knew some). I did play many shooters in MP and was often the top player on my team - but mostly in random games with random teams and random players... Besides SC, AoE 2, Quake 2, CS 1.5 and UT (1) I can't say I played a team game with friends or as part of a clan and so on... but that was 8 - 10 years ago, so yeah, I'm also not that young... I was mostly speaking about games in general...
...and I did mention I could play one of the new shooters, as in those with intense graphics, at 30+ FPS without a problem hitting a target, and I'm more than satisfied with how they work at 60 FPS - so it makes sense that you guys have other problems, not really monitor-related, in cases like this... As for CS - I don't see the point of comparing a 6 - 10 year old FPS to games intended for enthusiast systems, which are obviously not intended for professional gamers.
For me, the main interest in the monitor from this topic was not gaming related - more like video, as in 3D. I already have a decent monitor that's more than decent for any kind of gaming (no ghosting or input lag), and being based on a superior panel I get better viewing angles and better color reproduction - which, beyond gaming, is more than welcome for movies... A TN at 120 Hz is still a TN, so still junk - which I personally can't stand, unless it's work-related, as in "Office".
Fact is, the human brain is able to discern very minute differences. The more data presented to our senses, the better the potential to use it to our advantage. Now, how quickly one can process and thus respond to given data varies from person to person (for many reasons, be it one's sight, hearing, physical condition or general mental prowess), which in part is why there are differing views as far as what is a viable framerate.
Now it might simply be that some people are unable to process the difference, or it might be they haven't been given a fair opportunity to see it. Myself amongst others here *do* notice a tangible improvement playing a game at *much* higher framerates than the generic "30 fps is playable" stuff. It has been accepted for the most part that games are reasonably fluid at 30 frames per second, but that isn't to say there are no benefits associated with higher framerates. I feel it's not that people "can't" play at 30 frames per second, but that there is more to be had with higher performance.
A few years back I was reasonably content with 30fps performance. Now, after having spent considerable time with higher quality input/output devices, I find the same experience to be severely lacking. There's no comparison as far as I'm concerned. To me it's like low quality hamburger and prime cut steak. Sure, the hamburger is edible and will fill you up, but it's not nearly as satisfying as a nice juicy steak. By the same merit, if one is satisfied with hamburger and has never tried steak, they don't know what they are missing. Damn. I'm hungry.
PS: Another thing regarding online games, which I don't want to get into too much here, is netcode. Many games have netcode reliant on framerate, so much of the perceived difference of higher framerates (i.e. 100+fps on 60Hz displays) is due to more consistent netcode performance. The Half-Life, Quake III (which the CoD franchise is built on) and Source engines are prime examples of this. Both the Source and Half-Life engines run ideally at 100 packets per second, which requires 100 frames per second, with 125 packets per second being ideal on the Q3 engine (300 can be as well, but not as achievable). You can have a great connection and low latency, but if the packet quality is poor, your experience will suffer, and this is directly tied to framerate.
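The framerate/netcode coupling described above can be sketched with a simplified model. The `min()` relationship is my own simplification of the behavior described in the post (one update per rendered frame in those engine generations), not actual engine source; the 100 and 125 packets-per-second figures come from the post itself.

```python
# Simplified model (my assumption, not engine code): in engines where the
# client sends one network update per rendered frame, the effective update
# rate is capped by the frame rate, whatever rate the server allows.
def effective_packet_rate(fps: float, server_rate: float) -> float:
    """Updates per second the client can actually achieve."""
    return min(fps, server_rate)

# HL/Source-style 100 packets/s target, Q3-style 125 packets/s target:
print(effective_packet_rate(60, 100))   # 60: the frame rate is the cap
print(effective_packet_rate(125, 100))  # 100: now the server rate is the cap
```

Under this model, 60 fps on a 60 Hz display silently costs you nearly half the intended update rate, which is one plausible reason 100+ fps "feels" better even when the display can't show the extra frames.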
Last edited by Chickenfeed; 11-19-2010 at 07:58 PM. Reason: better wording - lost my copies of literaaccy and typ1ng for dumbies
Feedanator 7.0
CASE:R5|PSU:850G2|CPU:i7 6850K|MB:x99 Ultra|RAM:8x4 2666|GPU:980TI|SSD:BPX256/Evo500|SOUND:2i4/HS8
LCD:XB271HU|OS:Win10|INPUT:G900/K70 |HS/F:H115i
Unfortunately you do sound offensive.
Have you spent more than 5 minutes in front of a 120Hz panel? Of course most can hit targets at 30fps, but the point is: why would you want to, when at 60+ fps everything looks/feels/plays smoother? If you disagree with that, then it's simply because you have not actually used a 120Hz monitor. It's the closest feel to a CRT, and when looking at a 60Hz side by side with a 120Hz, the difference is dramatic to say the least.
Corsair 700D
Intel i7 920 @ 4.20|Asus P6T6 Revolution|G.Skill 6gb DDR3 1600|Zotac GTX480|Intel x-25-M 80GB x 2 / Raid0
H2O
|Perfecting the Obsidian series case. Build log to follow soon...|
I did give a solid example of that, like Crysis - as in, what if your video card can't do more than 32 FPS in a shooter with high-end graphics... To put it more simply - it was not about me wanting to play at 30 FPS, more like that being the only option in those circumstances, and at 32 FPS it's still playable. My next point was that at 60 FPS most recent games play real smooth... and of course "I WOULD WANT ALL GAMES TO WORK AT 60 FPS CONSTANTLY (100 x DOH!)", but things don't always work as we want them to...
And yes - I did try LCDs at 120 Hz (watched a few 3D movies on them and tried some games), but can't say I enjoyed using them, being based on a TN panel (watching a 3D movie in "STATUE POSTURE" - could never get used to that), which is also the reason I was a little interested in the subject of this topic.
And no - I don't see that as closer to a CRT monitor, since refresh rate was not its only bonus (viewing angles, color reproduction - I personally can't ignore those), and it's also worth adding that a CRT's refresh rate is not the same thing as an LCD's refresh rate...
Same here.
I've had the Acer 245 for 2 months; the colors aren't that great, but the resolution and work space are by far better than on my old 19" Flatron 915+ that I had been using for 10 years.
Every time I enter my room I wonder: wtf, who stole my screen? oh wait... I have a TFT now
2500K, 6970, Acer 245HQ
E8400@3.85 + Vdroop P5K
Refresh rate is the only bonus you get with that...
Unless you play only Quake 3 or similar shooters and like to watch movies in 3D, you get no other bonus; being TN, you get some disadvantages, like those related to viewing angles and color reproduction...
So yeah, the direction you're aiming for is subjective - if coming from a CRT... If you don't care about viewing angles and color reproduction, yet RR is very important, then yeah, even a crappy TN panel at 120 Hz could be good enough...
It's the same with audio - gamers with those kinds of needs in a monitor are usually the ones interested in footstep direction, and aim for Razer or similar products, while those that care about sound quality go for AKG, ATH, Sennheiser, Beyerdynamic or similar products.
As mentioned above - with a 120 Hz capable IPS I was expecting to get all the advantages, that's why I was a little interested in it... A little, since IPS is far from perfect - being aware of that after I witnessed both SED and OLED live - yet the first technology is "in the air", while the next is very expensive, has a shorter life and not many models available, being still new to this market. Here's a recent model from LG:
I know it's not the same type of competitive shooter as CS 1.6 or even CS:S, but as a proficient sniper in TF2, my performance dropped significantly when playing on a server with over 40-50ms latency. On my home server I was in the teens to 20s, so when you factor in how 20ms can affect my performance in sniping, it follows that 20-30 FPS could make the difference between a kill and a miss. Nobody could ever see the difference watching me play, but I could feel a difference when shots that were headshots on one server were misses on another.
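The effect of a few tens of milliseconds on sniping can be put into rough numbers. This is back-of-envelope arithmetic only; the target speed of 300 units/s is my assumed figure for illustration, not a value from the post or from TF2.

```python
# Back-of-envelope sketch (target speed is an assumed value, purely for
# illustration): how far a moving target travels during some extra delay.
def miss_distance(delay_ms: float, target_speed_ups: float = 300.0) -> float:
    """Distance (game units) a target moves during delay_ms of extra lag."""
    return target_speed_ups * delay_ms / 1000.0

print(miss_distance(20))  # 6.0 units of lead error from 20 ms extra latency
print(miss_distance(50))  # 15.0 units from 50 ms
```

Even a 20 ms difference moves a strafing target several units between where you aimed and where it actually is, which is consistent with headshots on a low-latency server turning into misses on a laggier one.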
||| Desktop: Antec 300, Intel Q6600, Asus P5Q Pro, ASUS Radeon HD 4870 1GB DK, 4x1GB OCZ DDR2 800, Xigmatek s1283, 80 GB X-25M
And what does server latency have to do with this monitor? Gaming at 30 FPS can be acceptable for SP and games like Crysis; nobody said anything about MP, where the FPS would drop even more... Yet at 60 FPS, which is the default RR for most monitors, gameplay is pretty smooth, except in older titles - as in FP shooters (like Quake, or even those based on Valve engines, optimized with 90+ FPS in mind)... Until 120 Hz LCDs got released, many competitions were held on 60 Hz capable LCD monitors (also true that some organizers still used CRTs), where low ms response was their only bonus, and they were considered suitable for hardcore gamers.
well, with the lack of 120hz tfts i had no other choice than to get a 60hz tft... it's almost impossible to find large crts that are in good shape.
and for me, 60hz tfts are craptastic for gaming for the following reasons:
- tearing is extremely annoying, especially in games with very high framerates (e.g. older games like ut, q3 etc...)
- when using vsync to prevent tearing you'll get uber mouse lag which makes lots of games unplayable. plus in newer games vsync destroys your fps (e.g. bc2 is unplayable with vsync on my 4850 since triple buffering doesn't work).
so yeah, even though i love my tft (lg l227wt), i've never been happy with the 60hz in most of the games. especially if you were playing games at 100/120hz on your crt... i still miss my crt, but it was bulky, had a relatively small screen (19") and a power consumption beyond good and evil.
every tft that matches my current l227wt + 120hz would be the perfect buy for me.
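The vsync latency penalty complained about above can be sketched with a common rule-of-thumb model. This is my own rough illustration, not a measurement: with double-buffered vsync a finished frame can wait up to one full refresh interval before being displayed, and longer if the GPU misses a refresh and the frame is held over.

```python
# Rough rule-of-thumb model (my sketch, not measured data): worst-case
# display delay under double-buffered vsync is about one refresh interval,
# plus one more interval for each refresh the GPU misses.
def vsync_worst_case_ms(refresh_hz: float, missed_intervals: int = 0) -> float:
    """Approximate worst-case added display latency in milliseconds."""
    frame_time_ms = 1000.0 / refresh_hz
    return frame_time_ms * (1 + missed_intervals)

print(vsync_worst_case_ms(60))     # ~16.7 ms at 60 Hz
print(vsync_worst_case_ms(120))    # ~8.3 ms at 120 Hz: half the penalty
print(vsync_worst_case_ms(60, 1))  # ~33.3 ms when a refresh is missed
```

Under this model a 120 Hz panel halves the vsync penalty relative to 60 Hz, which is one reason people sensitive to "uber mouse lag" with vsync gravitate toward 120 Hz displays.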
A 120Hz monitor would be very good.
With vsync enabled you'd play at a maximum of 120FPS without tearing!
I think that you are right.
When the monitor displays static pictures such as photos, S-IPS is better than a TN.
Gaming is different: explosions, very fast motion, etc. S-IPS is better again, but without much difference from TN, imho.
In movies I don't like interpolation; I want to watch true 24p. When I see movies interpolated, it's as if they had been recorded with a video camera.
My HDTV is 60Hz.
Does that mean it plays 3D with 3D glasses via PS3 or via PC?
And something else:
Can I see 3D at 24p? As I said above, I don't like interpolation in movies.
iMac 27" (mid 2011) Intel i5 2.7GHz, 12GB 1333MHz, Mac OSX Mountain Lion / iPad 3 16GB | White / iPhone 5 16GB | White
FX 8150 oc@4.6 Ghz 1.38V / ASUS Crosshair V Formula / 16GB Kingston HyperX 1600MHZ / ASUS EAH 5870 1GB / Intel 320 Series 120GB / Silverstone Tj07 - Black / Corsair TX750v2 / Windows 8 Pro x64
Apogee HD/Heatkiller GPU-X?/MPC655/360 Radiator
There is no shooter capped at 30 FPS - but Crysis was playable even at those values... and at 60 FPS it was real smooth.
You have an eye problem, which is a subjective case... I also have a similar problem, but get affected by other environments... for example Mirror's Edge - after 20 minutes of playing that game I got so sick that it took hours to recover... yet that's also a subjective case - I can't say the game is awful just because "I" can't play it...
S-IPS is the old technology - H-IPS and e-IPS are newer and can be as capable as a TN... here's an e-IPS vs. a fast TN, for example:
LG W2220P vs. Asus VG236H