I understand it's linked to FPS and graphics, but what exactly is Hz? Is it hertz like I guessed, or something completely different?
Just a confused consumer here
-
It's a unit of frequency - how often something happens per second. Processor speeds used to be quoted in hertz but are now quoted in gigahertz to describe how many cycles of work the processor completes each second. Simply put, the faster the clock speed, the faster your processor is at computing or completing instructions. 1 hertz is one cycle a second; 1 gigahertz is one billion cycles a second.
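If it helps to see the relationship between clock speed and cycle time spelled out, here's a quick Python sketch (the function name is just illustrative):

```python
# Period (time per cycle) is the reciprocal of frequency: T = 1 / f.
def cycle_time_seconds(frequency_hz: float) -> float:
    """Return how long a single cycle takes, in seconds."""
    return 1.0 / frequency_hz

# 1 Hz -> one full second per cycle
print(cycle_time_seconds(1.0))   # 1.0

# A 3 GHz processor completes 3 billion cycles per second,
# so each cycle takes about a third of a nanosecond.
print(cycle_time_seconds(3e9))   # ~3.33e-10 seconds
```

That reciprocal relationship is the whole story: higher frequency means less time per cycle, which is why a bigger GHz number generally means a faster chip (all else being equal).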
-
This is the opening to the Wikipedia entry for Refresh Rate:

The refresh rate (most commonly the "vertical refresh rate" or "vertical scan rate" for cathode ray tubes) is the number of times per second that the display hardware updates its buffer. This is distinct from the measure of frame rate in that the refresh rate includes the repeated drawing of identical frames, while frame rate measures how often a video source can feed an entire frame of new data to a display.

For example, most movie projectors advance from one frame to the next 24 times each second, but each frame is illuminated two or three times before the next frame is projected, using a shutter in front of the lamp. As a result, the movie projector runs at 24 frames per second but has a 48 or 72 Hz refresh rate.

On cathode ray tube (CRT) displays, increasing the refresh rate decreases flickering, thereby reducing eye strain. However, if a refresh rate is specified that is beyond what is recommended for the display, damage to the display can occur.[1]

For computer programs or telemetry, the term is also applied to how frequently a datum is updated with a new external value from another source (for example, a shared public spreadsheet or hardware feed).

If you want more: https://en.wikipedia.org/wiki/Refresh_rate
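To make the projector example concrete, here's a tiny Python sketch of the arithmetic (the function name is made up for illustration):

```python
# Refresh rate vs. frame rate, using the movie projector example above.
# The projector only has 24 new frames each second, but it flashes each
# frame 2 or 3 times, so the screen refreshes more often than the
# content actually changes.

def refresh_rate_hz(frames_per_second: int, flashes_per_frame: int) -> int:
    """Each frame is illuminated flashes_per_frame times before advancing."""
    return frames_per_second * flashes_per_frame

print(refresh_rate_hz(24, 2))  # 48 Hz refresh, still only 24 fps of new frames
print(refresh_rate_hz(24, 3))  # 72 Hz refresh, still only 24 fps of new frames
```

Same idea on a monitor: a 144 Hz display redraws 144 times a second no matter what, and if the game only delivers 60 fps, many of those refreshes just repeat the previous frame.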