
Clock Rate

Clock rate typically refers to the frequency at which a chip such as a CPU (Central Processing Unit) or GPU (Graphics Processing Unit) runs, and is used as an indicator of that chip's speed [1]. When the NES was new, console clock rates were measured in megahertz (MHz), whereas modern processors reach speeds in the gigahertz (GHz) range, a thousand times higher (1 GHz = 1000 MHz).
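
As a rough illustration of that scale difference, a few lines of C++ compare the NES CPU's clock (the NTSC value discussed below) with a modern core; the 3.5 GHz figure is just an example, not a claim about any specific chip:

```cpp
#include <cstdio>

int main() {
    constexpr double nesCpuHz    = 1'789'773.0; // NTSC NES CPU, ~1.79 MHz
    constexpr double modernCpuHz = 3.5e9;       // example modern core, 3.5 GHz
    std::printf("1 GHz = %.0f MHz\n", 1e9 / 1e6);
    std::printf("A 3.5 GHz core ticks ~%.0f times faster than the NES CPU\n",
                modernCpuHz / nesCpuHz);
}
```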

NES Clock Rate Discrepancy

The clock rate of each component in the NES differs by region, mainly due to the different television standards used (NTSC-M vs PAL-B). The colour encoding method used by the NES requires the master clock frequency to be six times the colour subcarrier frequency, and the PAL subcarrier is almost 25% higher than the NTSC one, so PAL consoles have a faster master clock. PAL consoles also have more scanlines per field and fewer fields per second, and the CPU divides the master clock by 16 (rather than by 12 on NTSC) to roughly preserve the ratio between CPU and PPU speeds [2].
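
A minimal sketch of how these relationships could be expressed in an emulator; the struct and field names are illustrative, and the subcarrier frequencies and dividers (CPU ÷12 / PPU ÷4 for NTSC, CPU ÷16 / PPU ÷5 for PAL) are taken from the NESdev wiki page cited above [2]:

```cpp
#include <cstdio>

// Region-specific clock derivation: master clock = 6x colour subcarrier,
// with CPU and PPU clocks obtained by integer division of the master clock.
struct RegionClocks {
    double colourSubcarrierHz;
    int cpuDivider;
    int ppuDivider;

    double masterHz() const { return 6.0 * colourSubcarrierHz; }
    double cpuHz()    const { return masterHz() / cpuDivider; }
    double ppuHz()    const { return masterHz() / ppuDivider; }
};

int main() {
    RegionClocks ntsc{315e6 / 88.0, 12, 4}; // ~3.579545 MHz NTSC subcarrier
    RegionClocks pal {4.433618e6,   16, 5}; // ~4.433618 MHz PAL subcarrier

    std::printf("NTSC: master %.6f MHz, CPU %.6f MHz, PPU %.6f MHz\n",
                ntsc.masterHz() / 1e6, ntsc.cpuHz() / 1e6, ntsc.ppuHz() / 1e6);
    std::printf("PAL:  master %.6f MHz, CPU %.6f MHz, PPU %.6f MHz\n",
                pal.masterHz() / 1e6, pal.cpuHz() / 1e6, pal.ppuHz() / 1e6);
}
```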

The main differences between the NTSC and PAL PPUs [2]:

| | NTSC | PAL |
| --- | --- | --- |
| Master clock | 21.477272 MHz | 26.601712 MHz |
| CPU clock | 1.789773 MHz (master ÷ 12) | 1.662607 MHz (master ÷ 16) |
| PPU clock | 5.369318 MHz (master ÷ 4) | 5.320342 MHz (master ÷ 5) |
| PPU dots per CPU cycle | 3 | 3.2 |
| Scanlines per field | 262 | 312 |
| Fields per second | ~60.10 | ~50.01 |
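
One practical consequence for an emulator is that the PPU must be stepped 3 dots per CPU cycle on NTSC but 3.2 on PAL. A hypothetical catch-up loop (the names here are illustrative, not from any particular emulator) could track both components against the shared master clock:

```cpp
#include <cstdio>

int main() {
    const bool pal   = false;          // region flag (assumed configuration)
    const int cpuDiv = pal ? 16 : 12;  // master-clock dividers from the table above
    const int ppuDiv = pal ? 5 : 4;

    long long masterCycles = 0, ppuDots = 0;
    for (int cpuCycle = 0; cpuCycle < 100; ++cpuCycle) {
        masterCycles += cpuDiv;                  // one CPU cycle of master clock
        while (ppuDots * ppuDiv < masterCycles)  // step PPU until it catches up
            ++ppuDots;                           // (one PPU dot would run here)
    }
    std::printf("%lld PPU dots per 100 CPU cycles (%s)\n",
                ppuDots, pal ? "PAL, ~3.2 per cycle" : "NTSC, 3 per cycle");
}
```

Tracking everything in master-clock cycles like this avoids accumulating rounding error from the non-integer 3.2 ratio on PAL.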

References

[1] Clock rate. Wikipedia. https://en.wikipedia.org/wiki/Clock_rate
[2] Clock rate. NESdev Wiki. http://wiki.nesdev.com/w/index.php/Clock_rate