fps vs my eyes
#21
40 frames per second for a game is adequate. I'd settle for 30. Right now with a burnt out gfx card I'd settle for 20. Tongue

At 60 Hz, I detect flicker. I see a noticeable change at 70 and 75 Hz, but still with flicker. At 85 Hz it's almost imperceptible. At 90 Hz I can't detect it at all.

>anarky
Screwing with your reality since 1998.
#22
When I used to have an Atari ST, the normal mode was 50 Hz. Some games would switch it to 60 Hz, and my sister always told me to change it back because she could hear it (she said a nasty whistle) and it gave her a headache, whereas I noticed nothing except the smoother refresh (which also opened up the number of lines a bit). I also remember our TV was one of very few at the time that would run the 60 Hz modes; on many others it looked like the 'vertical hold' was wrongly set.
EVEN MEN OF STEEL RUST.
#23
Funny you mention that, because I can tell from about 20 metres away, without looking, when a TV is on and muted. They emit a high-pitched whistle which I find very annoying, especially when you're trying to sleep at night and the better half is watching a show with no audio and captions on...

>anarky- EEEEEEEEEEEEEEEEEEEEEEEEEEEEEE
Screwing with your reality since 1998.
#24
Quote:I see a lot of 'w00t 7000fps!!'-type posts, but was wondering what is the best fps that the eye can actually see?

A friend says the brain only processes about 30, but that you need around 90 to really fool the brain.

Can anyone expand on this?

Although the brain can only really perceive, what, 10-20 frames per second, we need our games to run at a much higher rate, because we would still notice the difference between 10 and 60 frames per second. The reason is that when the human eye 'captures' an image, it averages what it is seeing. If, for example, the eye captures 10 images per second while the game shows 60 frames per second, each eye capture will catch 6 game frames and blend them together, producing a motion-blurred picture. The brain interprets that motion-blurred picture, making us believe there's actual motion.

If we showed a game at only 10 fps, there would be no motion blur and the game would look like it was stuttering.

If you watch a movie it seems sharp, but if you pause it during a movement you suddenly notice that the frozen frame isn't sharp at all.

Basically, a game would look stutterless at 15 FPS if it had motion blur to compensate.
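The blending described above can be sketched in a few lines. This is a hypothetical illustration, not anything from the thread: it models frames as lists of pixel intensities and averages a group of consecutive frames, the way an eye "capture" at 10 Hz would blend 6 frames of a 60 fps game into one smeared image.

```python
# Hypothetical sketch of temporal frame blending (motion blur by averaging).
# Frames are modeled as flat lists of pixel intensities (0-255).

def blend_frames(frames):
    """Average a group of frames pixel-by-pixel (a box filter over time)."""
    n = len(frames)
    return [sum(px) / n for px in zip(*frames)]

# A bright dot moving one pixel per frame across a 6-pixel strip,
# over the 6 frames that fall inside one 1/10 s eye capture:
frames = [[255 if i == t else 0 for i in range(6)] for t in range(6)]
blurred = blend_frames(frames)
print(blurred)  # every pixel averages to 255/6 = 42.5 -> a smeared streak
```

The sharp moving dot becomes a uniform faint streak, which is exactly the "not sharp at all" frozen frame the post describes: each blended output carries the motion information of all six inputs.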
#25
Quote:
yetifoot Wrote:I see a lot of 'w00t 7000fps!!'-type posts, but was wondering what is the best fps that the eye can actually see?

A friend says the brain only processes about 30, but that you need around 90 to really fool the brain.

Can anyone expand on this?

Although the brain can only really perceive, what, 10-20 frames per second, we need our games to run at a much higher rate, because we would still notice the difference between 10 and 60 frames per second. The reason is that when the human eye 'captures' an image, it averages what it is seeing. If, for example, the eye captures 10 images per second while the game shows 60 frames per second, each eye capture will catch 6 game frames and blend them together, producing a motion-blurred picture. The brain interprets that motion-blurred picture, making us believe there's actual motion.

If we showed a game at only 10 fps, there would be no motion blur and the game would look like it was stuttering.

If you watch a movie it seems sharp, but if you pause it during a movement you suddenly notice that the frozen frame isn't sharp at all.

Basically, a game would look stutterless at 15 FPS if it had motion blur to compensate.
Thanks for making that post. Hopefully people won't go apeshit on you and whine about TVs. Big Grin
#26
To sum up:

The eye can blend a game to look seamless at 15 FPS.

Television does roughly 30 to 60 FPS, depending on how you count.

30 FPS should be sufficient for ANY game, and isn't very hard.

Use 30 fps.
#27
TV probably refreshes at 60 Hz, but the picture itself runs at 24-30 FPS, not 60 FPS.

I can detect 30 frames per second anyway. That's why I said 40.

>anarky
Screwing with your reality since 1998.