  1. #31
     Tiago (Amiga PT user VIP)
     Join Date: Aug 2007 | Location: Estoril/Lisbon | Posts: 2,419
    Quote:
    "Not laggy at all mate... after all, science has proved the human eye can't see beyond 60fps anyway! Most games run around 95-110fps, no problems."
    Sorry for the ignorance, but is it or not the case that video cards can do more than 100 fps per minute? So if, in a certain part of rendering the image, the CPU or whatever goes slower and needs more power, the frame rate can go down with a safe margin. If the video card does 100 fps and a more detailed part of the level/game makes it slow down, the 100 fps is a safe margin: it could go down to 80 or 60 and still maintain a good level. Am I wrong?
    A500 - A600 - A1200

  2. #32
     Harrison (Retro Addict Administrator; Burger Time Champion, Sonic Champion)
     Join Date: Dec 2002 | Location: UK | Posts: 16,654
    Sort of.

    A monitor's refresh rate is exactly that: the number of times per second the display can physically update the image. In effect it is the maximum fps you can physically see. So a monitor with a 60Hz refresh rate updates the image every ~16.7ms (60 times per second), meaning you can see a maximum of 60fps.

    However, the GPU could produce hundreds of FPS while the monitor only shows you a new image at its refresh rate... but the higher the GPU framerate, the smoother the result will theoretically be, because, as you say, the software then has a lot of headroom above the refresh rate if the generated framerate starts to drop under increased load.

    If, for example, the GPU produced 120FPS and your display was 60Hz, then every ~16.7ms the screen would update with a new image, skipping the frames it can't keep up with, and this goes unnoticed by the user.

    If instead the GPU was producing 30FPS, the monitor would still update 60 times per second, but as the GPU is only producing a new image 30 times per second, the same image would be shown for two refreshes in a row, because the GPU hasn't yet produced anything new.
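    To put rough numbers on that, here's a minimal Python sketch (not how any real driver works, just the 120FPS/30FPS on 60Hz examples above) of which GPU frame a fixed-rate display ends up showing at each refresh:

    [CODE]
    # Which GPU frame does a fixed-rate display show at each refresh?
    # Assumes the display always scans out the most recently completed frame.

    def frames_shown(gpu_fps, refresh_hz, refreshes=8):
        # At the r-th refresh (time r/refresh_hz), the newest complete
        # GPU frame is floor(r * gpu_fps / refresh_hz).
        return [(r * gpu_fps) // refresh_hz for r in range(1, refreshes + 1)]

    print(frames_shown(120, 60))  # [2, 4, 6, 8, ...] every other frame is skipped
    print(frames_shown(30, 60))   # [0, 1, 1, 2, 2, 3, 3, 4] each frame shown twice
    [/CODE]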

    A lot of people mention image tearing and vsync. Image tearing is when the image you see on screen has a noticeable change partway down the screen. It occurs when the screen refresh and the GPU framerate are not in sync: the screen refreshes between frames coming from the GPU, so you see part of one frame and part of the next, with a visible tear where the two parts meet. Vsync tries to fix this by syncing the GPU to the refresh rate of the connected monitor, so if the monitor is 60Hz it will try to deliver 60FPS to the display, ensuring the display always has a complete image to show the user.
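    As a toy model of where that tear line lands (a sketch under the assumption that the display scans out top to bottom over one refresh): an unsynced buffer swap splits the image at whatever scanline is being drawn at that moment.

    [CODE]
    # Toy model of tearing: an unsynced buffer swap mid-scanout splits
    # the visible image at the scanline being drawn at that moment.

    REFRESH_HZ = 60
    LINES = 1080                  # vertical resolution being scanned out

    def tear_line(swap_time_s):
        refresh_period = 1.0 / REFRESH_HZ
        phase = (swap_time_s % refresh_period) / refresh_period  # 0..1 into the refresh
        return int(phase * LINES)

    # A 75FPS game on a 60Hz screen: the swaps drift through each refresh,
    # so the tear line wanders down the screen from frame to frame.
    for n in range(1, 5):
        print(tear_line(n / 75.0))   # 864, 648, 432, 216
    [/CODE]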

    The argument over the pros and cons of vsync, higher framerates, and higher monitor refresh rates has gone on for years. Tests suggest most people don't notice more than 60fps, so this is now the standard most monitors are designed to meet. We could argue this isn't quite true, because on CRTs we could notice flicker from the refresh rate, and it didn't truly go away until at least 85Hz. But that is equally about how CRTs work: a CRT does not refresh the whole screen at once, it draws the image from top to bottom on every refresh, so the flicker is that top-to-bottom redraw happening 60+ times per second. With an LCD the whole screen refreshes at once, so you only ever see complete images changing, much more like watching a film, which is much easier on the eyes than the redraw of a CRT.

    And can most people honestly say they notice the framerate of a film at the cinema? Films run at 24FPS. Cinema keeps to this rate because 24 frames per second has proved enough to trick the brain into perceiving full motion. The difference is that film is pre-recorded. Computers need higher framerates because the motion is generated on the fly, and the human eye is good at noticing when something is wrong, so the CPU and GPU need a high enough framerate to stop the eye from noticing that it isn't real motion.

    If you haven't played a classic game in years, it's never too late to start!


  3. #33
     Tiago (Amiga PT user VIP)
     Join Date: Aug 2007 | Location: Estoril/Lisbon | Posts: 2,419
    Good info mate, thanks!
    A500 - A600 - A1200

  4. #34
     Kin Hell (ELITE)
     Join Date: Jun 2011 | Location: Cornwall, UK | Posts: 1,342
    @ Harrison

    Like I said with Spunkster's comment, "Stay out of my world or you are going to end up seriously pissed off."

    I also have to ask this, Dave. - If you're happy with your setup, do you honestly think your idea of a better monitor making the cards work harder is going to improve my gaming experience? I play FPS shooters & NEED VSYNC enabled to stop the horizontal tearing. So after blowing 700 bucks on 2 x GTX670s, there is no way, as long as I have a hole in my ass, that I am going to throw a BIGGER screen res at the cards and lock it all down @ 60FPS with VSYNC on @ 60Hz. - And just to add to that last sentence, as I said earlier, 60FPS is NOT enough to play games smoothly. It's a simple fact that you either accept, or you dismiss because you can't appreciate the difference or know no better. I'm sorry to appear blunt, but you already know a spade is a spade with me & there is no animosity on my part either.

    You have also overcomplicated the topic for Tiago & I will merely suggest he re-reads my earlier post. Your answer was sort of right, but also wrong in several places, so to elaborate for Tiago:

    FPS is Frames Per Second. Minutes are not in the equation in any instance. I run BF3 @ 1920 x 1080 @ 120Hz on a 3D-ready LG 23" screen. I set BF3 to run with VSYNC & hold 120FPS everywhere with the AA & anisotropic filtering all set to MAX. Occasionally I see FRAPS, the FPS counter, showing 116, but VSYNC is still enabled, keeping horizontal tearing out of the picture. In the BF3 Operation Guillotine level, disabling VSYNC lets me see as high as 185fps in places, & I never see lower than 125 with VSYNC off, but the horizontal tearing is a bit of a piss off. Adding to this, Adaptive VSYNC is incorporated into these new 6 Series nVidia cards (probably why FRAPS reports 116fps from time to time) & I'm having to run beta drivers atm to avoid micro-stutter. Your statement is sort of right, and as Harrison sort of said, ideally we want to see at least 85Hz or above, though he shoots himself in the foot with self-contradictory remarks, saying 60FPS is enough whilst also stating 85Hz flickers on a CRT. 85Hz does not flicker for any human eye on a decent pro CRT, & with VSYNC set to on @ 85Hz, providing your graphics card is man enough to maintain 85+ fps, you won't see any dropped frames.
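    If it helps Tiago, the frame-time budgets behind those refresh rates are simple arithmetic; a quick Python sketch using only the figures from this thread:

    [CODE]
    # Time budget per frame at the refresh rates discussed in this thread.
    # With VSYNC on, the GPU must finish every frame inside this budget,
    # or that refresh repeats the previous frame (a dropped frame).

    for hz in (60, 85, 120):
        print(f"{hz}Hz -> {1000.0 / hz:.2f} ms per frame")
    # 60Hz  -> 16.67 ms
    # 85Hz  -> 11.76 ms
    # 120Hz -> 8.33 ms
    [/CODE]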

    Now this leads me to Harrison's remarks about movies. 30FPS is adequate for DVD not to stutter @ 720p & 1080p, but 24FPS BluRay absolutely sucks for me. I've re-mux'd BluRay rips from 24 FPS to 23.86 FPS & playback is smoother? - Can't get my head round why, but the difference is huge. BluRay picture quality is stunning, but the playback is worse than DVD for me. Take Star Trek XI when they are skydiving onto that drilling platform: OMFG @ the shaking. That's 24FPS BS for ya, & on the latest & greatest Panasonic Plasmas with this compensation malarkey it does help slightly over my now nearly 5-year-old Pioneer Plasma, but I still won't buy BluRay. I spent £200 on 2 x Chord HDMI cables & the difference in depth of field on my Pioneer was massive over the as-supplied £10 HDMI cables. Again though, either you can see it and appreciate it or you can't.

    @ Harrison RE Internet:

    Yeah, BT really do need to get their f'kin fingers out, but then they also need to stop stealing noise margin from customers' phone lines so the bloody service runs as intended. BT seem to think that 6dB downstream is adequate, when we all know 5dB is disconnection territory on any DSL/Fibre service, & they quite happily suggest they can turn your noise margin up to 12 from 6. Are these pillocks too f'kin stupid to realise that some of us out here know that downstream or upstream noise margin is simply a resultant factor of the length of the copper wire, the quality of the copper wire & the speed applied to it?? - These butt munching a$$holes are ripping off thousands of users in the UK by sharing noise margin because of inadequate provision by BT Wholesale, & over the last 7 years the quality of Internet access to this address has been totally f'kd by housing developments. You could be right about some backbones being screwed here in the UK. I'm pi$$ed right off @ seeing less than 100k/sec from the likes of nVidia & Microsoft on a 40+Mb fibre connection. Summat is seriously screwed & we're paying for it. Grrrrr!

    @ Thread RE CRT v LCD Panels:

    60Hz on a CRT flickers for me. 75Hz also does very slightly on cheap-brand CRTs, but nowhere near as badly as 60Hz. @ 85Hz a CRT is flicker free, but the caveat for a higher refresh rate on a CRT is less pin-sharp definition. The only reason LCD looks so sharp is because it's 60Hz. Put an LCD out of its native resolution & the display looks mushy & fluffy. CRTs only get mushy & fluffy over 85Hz on the desktop, yet for gaming, the more Hz-idge you can get, the better the FPS when VSYNC is enabled. Aside from the radiation worries (& later CRTs were much better in this regard than earlier years), there is NO banding on CRT monitors, & the reason for this is all down to colour gamut. Your average home-user LCDs do not display a full colour gamut, so you will always see banding on gradient-filled backgrounds, & that really narks me tbh. If you do see banding on a CRT, either it's a really crap CRT or the coding in the game is bollocks.
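    For what it's worth, a minimal Python sketch of why those gradient-filled backgrounds band (strictly it's the panel's per-channel bit depth, rather than the gamut as such, that makes the steps; the 1920px width is just an example):

    [CODE]
    # Quantising a smooth 0..1 ramp to n bits per channel collapses it
    # into 2^n flat bands, which is the banding you see on gradients.

    def bands(width_px, bits):
        levels = 2 ** bits
        return len({int(x / (width_px - 1) * (levels - 1)) for x in range(width_px)})

    for bits in (6, 8, 10):
        print(f"{bits}-bit panel: {bands(1920, bits)} bands across 1920px")
    # 6-bit  -> 64 bands, each ~30px wide: clearly visible steps
    # 8-bit  -> 256 bands; 10-bit -> 1024: steps shrink towards invisibility
    [/CODE]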
    Getting 0ld0r is mandatory - Growing up is just an option.

  5. #35
     Harrison (Retro Addict Administrator; Burger Time Champion, Sonic Champion)
     Join Date: Dec 2002 | Location: UK | Posts: 16,654
    Quote Originally Posted by Kin Hell:
    "I've re-mux'd BluRay rips from 24 FPS to 23.86 FPS & playback is smoother? - Can't get my head round why."
    The original films are always made in 24p format. However, they are normally transferred for the NTSC market first, which as usual decides to be difficult and not stick to standards: they transfer it at 23.976 FPS. Not far off the original, but annoyingly just that little bit slower than it should be. Then for the PAL and SECAM markets they speed it up to 25FPS.

    Therefore for a lot of DVDs and BDs we end up with video that has first been slowed down slightly compared to the original, and then sped back up to 1fps faster than it should be. This is why we sometimes see motion issues.
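    The arithmetic behind those rates, as a quick Python sketch (23.976 is really the NTSC ratio 24000/1001; the two-hour runtime is just an example):

    [CODE]
    # NTSC and PAL transfer rates relative to a 24fps film original.
    film_fps = 24.0
    ntsc_fps = 24000.0 / 1001.0   # ~23.976fps, the NTSC transfer rate
    pal_fps = 25.0                # PAL/SECAM speed-up

    runtime = 120.0               # a hypothetical two-hour film, in minutes
    print(f"NTSC: {ntsc_fps:.3f}fps -> {runtime * film_fps / ntsc_fps:.2f} min")
    print(f"PAL:  {pal_fps:.0f}fps -> {runtime * film_fps / pal_fps:.2f} min")
    # NTSC: 23.976fps -> 120.12 min (0.1% slower than the original)
    # PAL:  25fps     -> 115.20 min (~4% faster, with the pitch shifted up)
    [/CODE]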

    A lot more Blu-rays are now being released with true 24p support, which should hopefully fix the issue on TVs that natively support 24p playback.

    Regarding refresh rates: yes, the human eye can't perceive the flicker at 85Hz, but on a CRT it is still there. A CRT will flicker at any frequency, because it still has to draw each frame from top to bottom, rather than switching the whole screen at once like an LCD.

    One thing: you mention you are using a 120Hz 3D LCD. Did you know these still operate at 60Hz? They need the 120Hz so they can deliver 60Hz to each eye (splitting the 120fps into alternating left and right 60fps halves).

    Quote Originally Posted by Kin Hell:
    "I also have to ask this, Dave. - If you're happy with your setup, do you honestly think your idea of a better monitor making the cards work harder is going to improve my gaming experience?"
    As you seem to be focused mostly on FPS, if you think your setup gives you the best results then fair enough. I don't play FPS that much; I'm mostly into RPG and strategy games, and for those the extra screen real estate is very much needed. Everyone has a very different set of needs when it comes to the hardware and software we use. First and foremost I need a display capable of delivering as high a colour gamut as possible for image and video editing, and the Dell is one of the best on the market for this (it even comes factory calibrated). And for me 60Hz is fine for gaming, even FPS; I honestly don't notice any issues. You might, and therefore this monitor might not be for you, but for me it is perfect.

    If you haven't played a classic game in years, it's never too late to start!


  6. #36
     Kin Hell (ELITE)
     Join Date: Jun 2011 | Location: Cornwall, UK | Posts: 1,342
    Quote Originally Posted by Harrison:
    "One thing: you mention you are using a 120Hz 3D LCD. Did you know these still operate at 60Hz? They need the 120Hz so they can deliver 60Hz to each eye (splitting the 120fps into alternating left and right 60fps halves)."
    Correct, but only when running in 3D, which I don't do and have no interest in doing; nor do I want to bombard each eye with a nasty flickering 60Hz.

    Quote Originally Posted by Harrison:
    "As you seem to be focused mostly on FPS, if you think your setup gives you the best results then fair enough. I don't play FPS that much; I'm mostly into RPG and strategy games, and for those the extra screen real estate is very much needed. Everyone has a very different set of needs when it comes to the hardware and software we use. First and foremost I need a display capable of delivering as high a colour gamut as possible for image and video editing, and the Dell is one of the best on the market for this (it even comes factory calibrated). And for me 60Hz is fine for gaming, even FPS; I honestly don't notice any issues. You might, and therefore this monitor might not be for you, but for me it is perfect."
    If it's screen real estate you want with high FPS and VSYNC enabled, then you'd run dual or even triple 120Hz monitors, not a Dell at 60Hz at whatever its native resolution might be.
    I would partially agree with you with regard to video editing & high gamuts, but more so for picture editing or book publishing; the latter, of course, is better supported on Macs.
    If you honestly can't appreciate the difference between 60Hz & 120Hz or greater for a more satisfying gaming experience, then there is nothing I can do to convince you. I've seen people @ huge professionally organised LAN parties, with all the kit to participate in the best possible way, but only running @ 60Hz. After fixing their refresh rates, I never had one person say they couldn't see or feel any difference, & 85% of those I fixed didn't even know how to set their refresh rates correctly or realise that VSYNC makes a massive difference to how their game experience plays out. I believe there are now LCDs claiming to do 144Hz, which should throw 72Hz into each eye, helping to alleviate the eye strain of only 60Hz. I can't believe the industry got away with that one, much like the shocking Sony bastardisation of BluRay @ 24FPS.
    Getting 0ld0r is mandatory - Growing up is just an option.
