NTSC

From Wikipedia, the free encyclopedia

Television encoding systems by nation; countries using the NTSC system are shown in green.

NTSC, named after the National Television System Committee,[1] is the analog television system that was used in most of the Americas (except Brazil, Argentina, Paraguay, Uruguay and French Guiana); Burma; South Korea; Taiwan; Japan; the Philippines;[2] and some Pacific island nations and territories (see map).

The first NTSC standard was developed in 1941 and had no provision for color. In 1953 a second NTSC standard was adopted, which allowed for color television broadcasting compatible with the existing stock of black-and-white receivers. NTSC was the first widely adopted broadcast color system and remained dominant until the 2010s, when it was gradually replaced by digital standards such as ATSC.

Most countries using the NTSC standard, as well as those using other analog television standards, have switched to, or are in the process of switching to, newer digital television standards, of which at least four are in use around the world. North America, parts of Central America, and South Korea are adopting the ATSC standards, while other countries are adopting or have adopted other standards. After nearly 70 years, the majority of over-the-air NTSC transmissions in the United States ceased on June 12, 2009,[3] and by August 31, 2011[4] in Canada and most other NTSC markets.[5] The majority of NTSC transmissions ended in Japan on July 24, 2011, with the Japanese prefectures of Iwate, Miyagi, and Fukushima following in 2012. Mexico has had a complete digital simulcast since 2012 but has not yet turned off all of its analog signals.[4][6] Digital broadcasting allows higher-resolution television, but digital standard-definition television continues to use the frame rate and number of lines of resolution established by the analog NTSC standard.

History

The National Television System Committee was established in 1940 by the United States Federal Communications Commission (FCC) to resolve conflicts between companies over the introduction of a nationwide analog television system in the United States. In March 1941, the committee issued a technical standard for black-and-white television that built upon a 1936 recommendation made by the Radio Manufacturers Association (RMA). Technical advancements in the vestigial sideband technique allowed the image resolution to be increased. The NTSC selected 525 scan lines as a compromise between RCA's 441-scan-line standard (already being used by RCA's NBC TV network) and Philco's and DuMont's desire to increase the number of scan lines to between 605 and 800.[7] The standard recommended a frame rate of 30 frames (images) per second, consisting of two interlaced fields per frame at 262.5 lines per field and 60 fields per second. Other standards in the final recommendation were an aspect ratio of 4:3, and frequency modulation (FM) for the sound signal (which was quite new at the time).

In January 1950, the Committee was reconstituted to standardize color television. In December 1953, it unanimously approved what is now called the NTSC color television standard (later defined as RS-170a). The "compatible color" standard retained full backward compatibility with existing black-and-white television sets. Color information was added to the black-and-white image by introducing a color subcarrier of precisely 3.579545 MHz (nominally 3.58 MHz). The precise frequency was chosen so that horizontal line-rate modulation components of the chrominance signal would fall exactly in between the horizontal line-rate modulation components of the luminance signal, thereby enabling the chrominance signal to be filtered out of the luminance signal with minor degradation of the luminance signal. Due to limitations of frequency divider circuits at the time the color standard was promulgated, the color subcarrier frequency was constructed as a composite frequency assembled from small integers, in this case 5×7×9/(8×11) MHz.[8] The horizontal line rate was reduced to approximately 15,734 lines per second (3.579545×2/455 MHz) from 15,750 lines per second, and the frame rate was reduced to approximately 29.970 frames per second (the horizontal line rate divided by 525 lines/frame) from 30 frames per second. These changes amounted to 0.1 percent and were readily tolerated by existing television receivers.[9][10]
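These relationships can be written out explicitly. The following restatement is an illustration using only the figures given above, showing how the published line and frame rates follow from the subcarrier definition:

```latex
\begin{aligned}
f_{\mathrm{sc}} &= \frac{5 \times 7 \times 9}{8 \times 11}\ \text{MHz}
   = \frac{315}{88}\ \text{MHz} \approx 3.579545\ \text{MHz},\\
f_{\mathrm{H}} &= \frac{2\,f_{\mathrm{sc}}}{455} \approx 15{,}734.266\ \text{Hz}
   \quad (\text{down from } 15{,}750\ \text{Hz}),\\
f_{\mathrm{frame}} &= \frac{f_{\mathrm{H}}}{525} \approx 29.970\ \text{Hz},
   \qquad f_{\mathrm{field}} = \frac{f_{\mathrm{H}}}{262.5} \approx 59.94\ \text{Hz}.
\end{aligned}
```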

The FCC had briefly approved a different color television standard, starting in October 1950, which was developed by CBS.[11] However, this standard was incompatible with black-and-white broadcasts. It used a rotating color wheel, reduced the number of scan lines from 525 to 405, and increased the field rate from 60 to 144, but had an effective frame rate of only 24 frames per second. Legal action by rival RCA kept commercial use of the system off the air until June 1951, and regular broadcasts only lasted a few months before manufacture of all color television sets was banned by the Office of Defense Mobilization (ODM) in October, ostensibly due to the Korean War.[12] CBS rescinded its system in March 1953,[13] and the FCC replaced it on December 17, 1953, with the NTSC color standard, which was cooperatively developed by several companies, including RCA and Philco.[14] The first publicly announced network television broadcast of a program using the NTSC "compatible color" system was an episode of NBC's Kukla, Fran and Ollie on August 30, 1953, although it was viewable in color only at the network's headquarters.[15] The first nationwide viewing of NTSC color came on the following January 1 with the coast-to-coast broadcast of the Tournament of Roses Parade, viewable on prototype color receivers at special presentations across the country. The first color NTSC television camera was the RCA TK-40, used for experimental broadcasts in 1953; an improved version, the TK-40A, introduced in March 1954, was the first commercially available color television camera. Later that year, the improved TK-41 became the standard camera used throughout much of the 1960s.

The NTSC standard has been adopted by other countries, including most of the Americas and Japan.

With the advent of digital television, analog broadcasts are being phased out. Most U.S. NTSC broadcasters were required by the FCC to shut down their analog transmitters in 2009. Low-power stations, Class A stations and translators are required to shut down by 2015.

Technical details

Lines and refresh rate

NTSC color encoding is used with the System M television signal, which consists of 29.97 interlaced frames of video per second. Each frame is composed of two fields, each consisting of 262.5 scan lines, for a total of 525 scan lines. 483 scan lines make up the visible raster. The remainder (the vertical blanking interval) allows for vertical synchronization and retrace. This blanking interval was originally designed simply to blank the receiver's CRT to allow for the simple analog circuits and slow vertical retrace of early TV receivers. However, some of these lines may now contain other data such as closed captioning and vertical interval timecode (VITC). In the complete raster (disregarding half lines due to interlacing) the even-numbered scan lines (every other line that would be even if counted in the video signal, e.g. {2, 4, 6, ..., 524}) are drawn in the first field, and the odd-numbered (every other line that would be odd if counted in the video signal, e.g. {1, 3, 5, ..., 525}) are drawn in the second field, to yield a flicker-free image at the field refresh frequency of approximately 59.94 Hz (actually 60 Hz/1.001). For comparison, 576i systems such as PAL-B/G and SECAM use 625 lines (576 visible), and so have a higher vertical resolution, but a lower temporal resolution of 25 frames or 50 fields per second.

The NTSC field refresh frequency in the black-and-white system originally exactly matched the nominal 60 Hz frequency of alternating current power used in the United States. Matching the field refresh rate to the power source avoided intermodulation (also called beating), which produces rolling bars on the screen. When color was added to the system, the refresh frequency was shifted slightly downward to 59.94 Hz to eliminate stationary dot patterns in the difference frequency between the sound and color carriers, as explained below in "Color encoding". Synchronization of the refresh rate to the power incidentally helped kinescope cameras record early live television broadcasts, as it was very simple to synchronize a film camera to capture one frame of video on each film frame by using the alternating current frequency to set the speed of the synchronous AC motor-drive camera. By the time the frame rate changed to 29.97 Hz for color, it was nearly as easy to trigger the camera shutter from the video signal itself.

The actual figure of 525 lines was chosen as a consequence of the limitations of the vacuum-tube-based technologies of the day. In early TV systems, a master voltage-controlled oscillator was run at twice the horizontal line frequency, and this frequency was divided down by the number of lines used (in this case 525) to give the field frequency (60 Hz in this case). This frequency was then compared with the 60 Hz power-line frequency and any discrepancy corrected by adjusting the frequency of the master oscillator. For interlaced scanning, an odd number of lines per frame was required in order to make the vertical retrace distance identical for the odd and even fields, which meant the master oscillator frequency had to be divided down by an odd number. At the time, the only practical method of frequency division was the use of a chain of vacuum tube multivibrators, the overall division ratio being the mathematical product of the division ratios of the chain. Since all the factors of an odd number also have to be odd numbers, it follows that all the dividers in the chain also had to divide by odd numbers, and these had to be relatively small due to the problems of thermal drift with vacuum tube devices. The closest practical product to 500 that meets these criteria is 3 × 5 × 5 × 7 = 525. (For the same reason, 625-line PAL-B/G and SECAM use 5 × 5 × 5 × 5, the old British 405-line system used 3 × 3 × 3 × 3 × 5, the French 819-line system used 3 × 3 × 7 × 13, etc.)
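A short sketch can make the divider-chain argument concrete. The snippet below is illustrative only and assumes, as the text suggests, that each stage divides by a small odd ratio (here 3, 5, or 7); it simply enumerates the odd line counts such a chain can reach near 500:

```python
from itertools import combinations_with_replacement
from math import prod

# Illustrative sketch, not from the NTSC documents: assume each vacuum-tube
# divider stage can only divide by a small odd ratio (3, 5, or 7), then list
# the achievable line counts near 500, sorted by distance from 500.
stage_ratios = (3, 5, 7)
candidates = set()
for n_stages in range(2, 6):                       # chains of 2 to 5 dividers
    for chain in combinations_with_replacement(stage_ratios, n_stages):
        lines = prod(chain)
        if 400 <= lines <= 700:                    # plausible line counts
            candidates.add((lines, chain))

for lines, chain in sorted(candidates, key=lambda c: abs(c[0] - 500)):
    print(lines, "=", " x ".join(map(str, chain)))
# 525 = 3 x 5 x 5 x 7 comes out nearest to 500; 441 (the prewar RCA line
# count) and 625 (later used by PAL/SECAM) also appear in the list.
```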

Colorimetry

The original 1953 color NTSC specification, still part of the United States Code of Federal Regulations, defined the colorimetric values of the system as follows:[16]

Original NTSC colorimetry (1953) CIE 1931 x CIE 1931 y
primary red 0.67 0.33
primary green 0.21 0.71
primary blue 0.14 0.08
white point (CIE standard illuminant C, 6774 K) 0.310 0.316

Early color television receivers, such as the RCA CT-100, were faithful to this specification (which was based on prevailing motion picture standards), having a larger gamut than most of today's monitors. Their low-efficiency phosphors (notably in the Red) were weak and long-persistent, leaving trails after moving objects. Starting in the late 1950s, picture tube phosphors would sacrifice saturation for increased brightness; this deviation from the standard at both the receiver and broadcaster was the source of considerable color variation.[17]

SMPTE C

To ensure more uniform color reproduction, receivers started to incorporate color correction circuits that converted the received signal — encoded for the colorimetric values listed above — into signals encoded for the phosphors actually used within the monitor.[17] Since such color correction cannot be performed accurately on the nonlinear gamma-corrected signals transmitted, the adjustment can only be approximated,[18] introducing both hue and luminance errors for highly saturated colors.

Similarly at the broadcaster stage, in 1968-69 the Conrac Corp., working with RCA, defined a set of controlled phosphors for use in broadcast color picture video monitors.[17] This specification survives today as the SMPTE "C" phosphor specification:

SMPTE "C" colorimetry CIE 1931 x CIE 1931 y
primary red 0.630 0.340
primary green 0.310 0.595
primary blue 0.155 0.070
white point (CIE illuminant D65) 0.3127 0.3290

As with home receivers, it was further recommended[19] that studio monitors incorporate similar color correction circuits so that broadcasters would transmit pictures encoded for the original 1953 colorimetric values, in accordance with FCC standards.

In 1987, the Society of Motion Picture and Television Engineers (SMPTE) Committee on Television Technology, Working Group on Studio Monitor Colorimetry, adopted the SMPTE C (Conrac) phosphors for general use in Recommended Practice 145,[20] prompting many manufacturers to modify their camera designs to directly encode for SMPTE "C" colorimetry without color correction,[21] as approved in SMPTE standard 170M, "Composite Analog Video Signal — NTSC for Studio Applications" (1994). As a consequence, the ATSC digital television standard states that for 480i signals, SMPTE "C" colorimetry should be assumed unless colorimetric data is included in the transport stream.[22]

Japanese NTSC never changed primaries and whitepoint to SMPTE "C", continuing to use the 1953 NTSC primaries and whitepoint.[19] Both the PAL and SECAM systems used the original 1953 NTSC colorimetry as well until 1970;[19] unlike NTSC, however, the European Broadcasting Union (EBU) rejected color correction in receivers and studio monitors that year and instead explicitly called for all equipment to directly encode signals for the "EBU" colorimetric values,[23] further improving the color fidelity of those systems.

Color encoding

For backward compatibility with black-and-white television, NTSC uses a luminance-chrominance encoding system invented in 1938 by Georges Valensi. The three color picture signals are divided into Luminance (derived mathematically from the three separate color signals: Red, Green and Blue), which takes the place of the original monochrome signal, and Chrominance, which carries only the color information. This process is applied to each color source by its own Colorplexer, thereby allowing a compatible color source to be managed as if it were an ordinary monochrome source. This allows black-and-white receivers to display NTSC color signals by simply ignoring the chrominance signal. Some black-and-white TVs sold in the US after the introduction of color broadcasting in 1953 were designed to filter chroma out, but the early B&W sets did not do this and chrominance could be seen as a 'dot pattern' in highly colored areas of the picture.

In NTSC, chrominance is encoded using two color signals known as I (in-phase) and Q (in quadrature) in a process called QAM. The two signals each amplitude modulate 3.58 MHz carriers which are 90 degrees out of phase with each other, and the results are added together with the carriers themselves suppressed. The result can be viewed as a single sine wave with varying phase relative to a reference carrier and with varying amplitude. The varying phase represents the instantaneous color hue captured by a TV camera, and the amplitude represents the instantaneous color saturation. This 3.58 MHz subcarrier is then added to the Luminance to form the 'composite color signal' which modulates the video signal carrier just as in monochrome transmission.

For a color TV to recover hue information from the color subcarrier, it must have a zero-phase reference to replace the previously suppressed carrier. The NTSC signal includes a short sample of this reference signal, known as the colorburst, located on the 'back porch' of each horizontal synchronization pulse. The color burst consists of a minimum of eight cycles of the unmodulated (fixed phase and amplitude) color subcarrier. The TV receiver has a "local oscillator", which is synchronized with these color bursts. Combining this reference phase signal derived from the color burst with the chrominance signal's amplitude and phase allows the recovery of the 'I' and 'Q' signals, which, when combined with the Luminance information, allow the reconstruction of a color image on the screen. Because the brightness portion of the picture is kept entirely separate from the color portion, color TV has been described as being, in effect, "colored" black-and-white TV. In CRT televisions, the NTSC signal is turned into three color signals called Red, Green and Blue, each controlling the electron gun for that color. TV sets with digital circuitry use sampling techniques to process the signals, but the end result is the same. For both analog and digital sets processing an analog NTSC signal, the original three color signals (Red, Green and Blue) are transmitted using three discrete signals (Luminance, I and Q) and then recovered as three separate colors and combined as a color image.

When a transmitter broadcasts an NTSC signal, it amplitude-modulates a radio-frequency carrier with the NTSC signal just described, while it frequency-modulates a carrier 4.5 MHz higher with the audio signal. If non-linear distortion happens to the broadcast signal, the 3.579545 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen. To make the resulting pattern less noticeable, designers adjusted the original 60 Hz field rate down by a factor of 1.001 (0.1%), to approximately 59.94 fields per second. This adjustment ensures that the sums and differences of the sound carrier and the color subcarrier and their multiples (i.e., the intermodulation products of the two carriers) are not exact multiples of the frame rate; were they exact multiples, the dot pattern would remain stationary on the screen and be most noticeable.

The 59.94 rate is derived from the following calculations. Designers chose to make the chrominance subcarrier frequency an n + 0.5 multiple of the line frequency to minimize interference between the luminance signal and the chrominance signal. (Another way this is often stated is that the color subcarrier frequency is an odd multiple of half the line frequency.) They then chose to make the audio subcarrier frequency an integer multiple of the line frequency to minimize visible (intermodulation) interference between the audio signal and the chrominance signal. The original black-and-white standard, with its 15,750 Hz line frequency and 4.5 MHz audio subcarrier, does not meet these requirements, so designers had either to raise the audio subcarrier frequency or lower the line frequency. Raising the audio subcarrier frequency would prevent existing (black and white) receivers from properly tuning in the audio signal. Lowering the line frequency is comparatively innocuous, because the horizontal and vertical synchronization information in the NTSC signal allows a receiver to tolerate a substantial amount of variation in the line frequency. So the engineers chose the line frequency to be changed for the color standard. In the black-and-white standard, the ratio of audio subcarrier frequency to line frequency is 4.5 MHz / 15,750 = 285.71. In the color standard, this becomes rounded to the integer 286, which means the color standard's line rate is 4.5 MHz / 286 = approximately 15,734 lines per second. Maintaining the same number of scan lines per field (and frame), the lower line rate must yield a lower field rate. Dividing (4,500,000 / 286) lines per second by 262.5 lines per field gives approximately 59.94 fields per second.
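The arithmetic in this paragraph can be checked directly. The following sketch (illustrative, not part of any standard) reproduces the published figures starting from the 4.5 MHz sound carrier:

```python
# Numeric check of the frequencies described above; all figures follow
# directly from the 4.5 MHz sound carrier and the original line rate.
audio_subcarrier_hz = 4_500_000
bw_line_rate_hz = 15_750                      # original black-and-white line rate

ratio = audio_subcarrier_hz / bw_line_rate_hz        # 285.714..., rounded to 286
audio_cycles_per_line = round(ratio)                 # 286

color_line_rate_hz = audio_subcarrier_hz / audio_cycles_per_line   # ~15,734.266 Hz
field_rate_hz = color_line_rate_hz / 262.5                          # ~59.94 Hz
frame_rate_hz = color_line_rate_hz / 525                            # ~29.97 Hz
color_subcarrier_hz = color_line_rate_hz * 455 / 2                  # ~3,579,545.45 Hz

print(f"line rate       {color_line_rate_hz:,.3f} Hz")
print(f"field rate      {field_rate_hz:.4f} Hz")
print(f"frame rate      {frame_rate_hz:.5f} Hz")
print(f"color subcarrier {color_subcarrier_hz:,.2f} Hz")
```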

Transmission modulation scheme

Spectrum of a System M television channel with NTSC color.

An NTSC television channel as transmitted occupies a total bandwidth of 6 MHz. The actual video signal, which is amplitude-modulated, is transmitted between 500 kHz and 5.45 MHz above the lower bound of the channel. The video carrier is 1.25 MHz above the lower bound of the channel. Like most AM signals, the video carrier generates two sidebands, one above the carrier and one below. The sidebands are each 4.2 MHz wide. The entire upper sideband is transmitted, but only 1.25 MHz of the lower sideband, known as a vestigial sideband, is transmitted. The color subcarrier, as noted above, is 3.579545 MHz above the video carrier, and is quadrature-amplitude-modulated with a suppressed carrier. The audio signal is frequency-modulated, like the audio signals broadcast by FM radio stations in the 88–108 MHz band, but with a 25 kHz maximum frequency deviation, as opposed to the 75 kHz used on the FM band, making analog television audio signals sound softer than FM radio signals as received on a wideband receiver. The main audio carrier is 4.5 MHz above the video carrier, making it 250 kHz below the top of the channel. Sometimes a channel may contain an MTS signal, which offers more than one audio signal by adding one or two subcarriers on the audio signal, each synchronized to a multiple of the line frequency. This is normally the case when stereo audio and/or second audio program signals are used. The same extensions are used in ATSC, where the ATSC digital carrier is broadcast at 0.31 MHz above the lower bound of the channel.
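As an illustration of the channel layout just described, the sketch below computes the carrier positions for a System M channel from its lower edge; the channel 2 example (54–60 MHz) is used only for the printout:

```python
# Sketch of the carrier positions inside one 6 MHz System M channel, measured
# from the channel's lower edge (frequencies taken from the text above).
CHANNEL_WIDTH_MHZ = 6.0
VIDEO_CARRIER_OFFSET_MHZ = 1.25                # above the lower channel edge
COLOR_SUBCARRIER_MHZ = 3.579545                # above the video carrier
AUDIO_CARRIER_MHZ = 4.5                        # above the video carrier

def layout(channel_lower_edge_mhz: float) -> dict:
    """Return absolute carrier frequencies for a channel starting at the given edge."""
    video = channel_lower_edge_mhz + VIDEO_CARRIER_OFFSET_MHZ
    return {
        "video carrier": video,
        "color subcarrier": video + COLOR_SUBCARRIER_MHZ,
        "audio carrier": video + AUDIO_CARRIER_MHZ,     # 0.25 MHz below channel top
        "upper sideband edge": video + 4.2,
        "channel top": channel_lower_edge_mhz + CHANNEL_WIDTH_MHZ,
    }

# Example: North American VHF channel 2 occupies 54-60 MHz.
for name, freq in layout(54.0).items():
    print(f"{name:20s} {freq:.6f} MHz")
```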

"Setup" is a 54 mV(7.5 IRE) voltage offset between the "black" and "blanking" levels. It is unique to NTSC. CVBS stands for Color, Video, Blanking, and Sync.

Frame rate conversion

There is a large difference in frame rate between film, which runs at 24.0 frames per second, and the NTSC standard, which runs at approximately 29.97 (10 MHz×63/88/455/525) frames per second. In regions that use 25-fps television and video standards, this difference can be overcome by speed-up.

For 30-fps standards, a process called "3:2 pulldown" is used. One film frame is transmitted for three video fields (lasting 1½ video frames), and the next frame is transmitted for two video fields (lasting 1 video frame). Two film frames are thus transmitted in five video fields, for an average of 2½ video fields per film frame. The average frame rate is thus 60 ÷ 2.5 = 24 frames per second, so the average film speed is nominally exactly what it should be. (In reality, over the course of an hour of real time, about 215,784 video fields are displayed, representing about 86,314 frames of film, while in an hour of true 24-fps film projection exactly 86,400 frames are shown: thus, 29.97-fps NTSC transmission of 24-fps film runs at 1000/1001, or about 99.9%, of the film's normal speed.) Still-framing on playback can display a video frame with fields from two different film frames, so any difference between the frames will appear as a rapid back-and-forth flicker. There can also be noticeable jitter/"stutter" during slow camera pans (telecine judder).
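The cadence and the resulting speed ratio can be expressed compactly. The sketch below is illustrative; which film frame receives three fields rather than two is a convention, not part of the standard:

```python
from fractions import Fraction

# Illustrative sketch of the 3:2 pulldown cadence and the resulting film speed.
def pulldown_fields(film_frames):
    """Return, per film frame, how many video fields it occupies (3,2,3,2,...)."""
    return [3 if i % 2 == 0 else 2 for i in range(film_frames)]

print(pulldown_fields(4))              # [3, 2, 3, 2] -> 10 fields for 4 film frames

field_rate = Fraction(60_000, 1001)                     # 59.94... fields per second
film_frames_per_second = field_rate / Fraction(5, 2)    # 2.5 fields per film frame
speed_ratio = film_frames_per_second / 24
print(float(film_frames_per_second))    # 23.976...
print(speed_ratio, float(speed_ratio))  # 1000/1001, about 0.999
```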

To avoid 3:2 pulldown, film shot specifically for NTSC television is often taken at 30 frame/s.[citation needed]

To show 25-fps material (such as European television series and some European movies) on NTSC equipment, every fifth frame is duplicated and then the resulting stream is interlaced.

Film shot for NTSC television at 24 frames per second has traditionally been accelerated by 1/24 (to about 104.17% of normal speed) for transmission in regions that use 25-fps television standards. This increase in picture speed has traditionally been accompanied by a similar increase in the pitch and tempo of the audio. More recently, frame-blending has been used to convert 24 FPS video to 25 FPS without altering its speed.

Film shot for television in regions that use 25-fps television standards can be handled in either of two ways:

  • The film can be shot at 24 frames per second. In this case, when transmitted in its native region, the film may be accelerated to 25 fps according to the analog technique described above, or kept at 24 fps by the digital technique described above. When the same film is transmitted in regions that use a nominal 30-fps television standard, there is no noticeable change in speed, tempo, and pitch.
  • The film can be shot at 25 frames per second. In this case, when transmitted in its native region, the film is shown at its normal speed, with no alteration of the accompanying soundtrack. When the same film is shown in regions that use a 30-fps nominal television standard, every fifth frame is duplicated, and there is still no noticeable change in speed, tempo, and pitch.

Because both film speeds have been used in 25-fps regions, viewers can face confusion about the true speed of video and audio, and the pitch of voices, sound effects, and musical performances, in television films from those regions. For example, they may wonder whether the Jeremy Brett series of Sherlock Holmes television films, made in the 1980s and early 1990s, was shot at 24 fps and then transmitted at an artificially fast speed in 25-fps regions, or whether it was shot at 25 fps natively.

These discrepancies exist not only in television broadcasts over the air and through cable, but also in the home-video market, on both tape and disc, including laser disc and DVD.

In digital television and video, which are replacing their analog predecessors, single standards that can accommodate a wider range of frame rates still show the limits of analog regional standards. The ATSC standard, for example, allows frame rates of 23.976, 24, 29.97, 30, 59.94, and 60 frames per second, but not 25 and 50.

Modulation for analog satellite transmission

Because satellite power is severely limited, analog video transmission through satellites differs from terrestrial TV transmission. AM is a linear modulation method, so a given demodulated signal-to-noise ratio (SNR) requires an equally high received RF SNR. The SNR of studio quality video is over 50 dB, so AM would require prohibitively high powers and/or large antennas.

Wideband FM is used instead to trade RF bandwidth for reduced power. Increasing the channel bandwidth from 6 to 36 MHz allows an RF SNR of only 10 dB or less. The wider noise bandwidth reduces this 40 dB power saving by 10 log10(36 MHz / 6 MHz) ≈ 8 dB, for a substantial net reduction of about 32 dB.
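The decibel bookkeeping in this trade-off is straightforward; the sketch below simply restates the approximate figures quoted above (it is not a link-budget calculation):

```python
from math import log10

# Rough illustration of the bandwidth-for-power trade described above,
# using the approximate numbers quoted in the text.
video_snr_db = 50          # studio-quality baseband SNR target
rf_snr_db = 10             # RF SNR that wideband FM can work with
fm_gain_db = video_snr_db - rf_snr_db                 # ~40 dB apparent saving

noise_bw_penalty_db = 10 * log10(36 / 6)              # wider noise bandwidth, ~7.8 dB
net_saving_db = fm_gain_db - noise_bw_penalty_db      # ~32 dB net
print(f"bandwidth penalty {noise_bw_penalty_db:.1f} dB, net saving {net_saving_db:.1f} dB")
```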

Sound is on an FM subcarrier as in terrestrial transmission, but frequencies above 4.5 MHz are used to reduce aural/visual interference; 6.8, 5.8 and 6.2 MHz are commonly used. Stereo can be multiplex or discrete, and unrelated audio and data signals may be placed on additional subcarriers.

A triangular 60 Hz energy dispersal waveform is added to the composite baseband signal (video plus audio and data subcarriers) before modulation. This limits the satellite downlink power spectral density in case the video signal is lost. Otherwise the satellite might transmit all of its power on a single frequency, interfering with terrestrial microwave links in the same frequency band.

In half transponder mode, the frequency deviation of the composite baseband signal is reduced to 18 MHz to allow another signal in the other half of the 36 MHz transponder. This reduces the FM benefit somewhat, and the recovered SNRs are further reduced because the combined signal power must be "backed off" to avoid intermodulation distortion in the satellite transponder. A single FM signal is constant amplitude, so it can saturate a transponder without distortion.

Field order

[24] An NTSC "frame" consists of an "even" field followed by an "odd" field. As far as the reception of an analog signal is concerned, this is purely a matter of convention and, it makes no difference. It's rather like the broken lines running down the middle of a road, it doesn't matter whether it is a line/space pair or a space/line pair; the effect to a driver is exactly the same.

The introduction of digital television formats has changed things somewhat. Most digital TV formats, including the popular DVD format, record NTSC-originated video with the even field first in the recorded frame (the development of DVD took place in regions that traditionally use NTSC). However, this frame sequence has migrated through to the so-called PAL format (actually a technically incorrect description) of digital video, with the result that the even field is often recorded first in the frame (the European 625-line system is specified as odd field first). This is no longer a matter of convention, because a frame of digital video is a distinct entity on the recorded medium. This means that when reproducing many non-NTSC-based digital formats (including DVD) it is necessary to reverse the field order, otherwise an unacceptable shuddering "comb" effect occurs on moving objects as they are shown ahead in one field and then jump back in the next.

This has also become a hazard where non-NTSC progressive video is transcoded to interlaced and vice versa. Systems that recover progressive frames or transcode video should ensure that the "field order" is obeyed, otherwise the recovered frame will consist of a field from one frame and a field from an adjacent frame, resulting in "comb" interlacing artifacts. This can often be observed in PC-based video playing utilities if an inappropriate choice of de-interlacing algorithm is made.
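The field-order problem can be illustrated with a few lines of code. The sketch below assumes a simple row-interleaved frame layout (not any particular file format); weaving two fields back together in the wrong order is what pairs fields from different moments in time and produces the comb artifact described above:

```python
import numpy as np

# Minimal sketch (assumed frame layout, not any particular format): split an
# interlaced frame into its two fields and reassemble them in a chosen order.
def split_fields(frame):
    """Return (top_field, bottom_field): rows 0,2,4,... and rows 1,3,5,..."""
    return frame[0::2], frame[1::2]

def weave(first_field, second_field, top_field_first):
    """Interleave two fields back into a frame using the stated field order."""
    height = first_field.shape[0] + second_field.shape[0]
    frame = np.empty((height,) + first_field.shape[1:], dtype=first_field.dtype)
    if top_field_first:
        frame[0::2], frame[1::2] = first_field, second_field
    else:
        frame[1::2], frame[0::2] = first_field, second_field
    return frame

# Round trip with the correct order reproduces the original frame exactly;
# the opposite order would pair fields from different instants on moving video.
frame = np.arange(480 * 640, dtype=np.uint8).reshape(480, 640)
top, bottom = split_fields(frame)
assert np.array_equal(weave(top, bottom, top_field_first=True), frame)
```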

Variants

NTSC-M

Unlike PAL, with its many varied underlying broadcast television systems in use throughout the world, NTSC color encoding is invariably used with broadcast system M, giving NTSC-M.

NTSC-J

Only Japan's variant "NTSC-J" is slightly different: in Japan, black level and blanking level of the signal are identical (at 0 IRE), as they are in PAL, while in American NTSC, black level is slightly higher (7.5 IRE) than blanking level. Since the difference is quite small, a slight turn of the brightness knob is all that is required to correctly show the "other" variant of NTSC on any set as it is supposed to be; most watchers might not even notice the difference in the first place. The channel encoding on NTSC-J differs slightly from NTSC-M. In particular, the Japanese VHF band runs from channels 1-12 (located on frequencies directly above the 76-90 MHz Japanese FM radio band) while the North American VHF TV band uses channels 2-13 (54-72 MHz, 76-88 MHz and 174-216 MHz) with 88-108 MHz allocated to FM radio broadcasting. Japan's UHF TV channels are therefore numbered from 13 up and not 14 up, but otherwise uses the same UHF broadcasting frequencies as those in North America.

PAL-M (Brazil)

The Brazilian PAL-M system, introduced in 1972, uses the same lines/field as NTSC (525/60), and almost the same broadcast bandwidth and scan frequency (15.750 vs. 15.734 kHz). Prior to the introduction of color, Brazil broadcast in standard black-and-white NTSC. As a result, PAL-M signals are nearly identical to North American NTSC signals, except for the encoding of the color subcarrier (3.575611 MHz for PAL-M and 3.579545 MHz for NTSC). As a consequence of these close specifications, PAL-M will display in monochrome with sound on NTSC sets and vice versa.

  • PAL-M (PAL = Phase Alternating Line) specifications:
Transmission band: UHF/VHF
Frame rate: 30
Lines/fields: 525/60
Horizontal frequency: 15.750 kHz
Vertical frequency: 60 Hz
Color subcarrier frequency: 3.575611 MHz
Video bandwidth: 4.2 MHz
Sound carrier frequency: 4.5 MHz
Channel bandwidth: 6 MHz
  • NTSC (National Television System Committee) specifications:
Transmission band: UHF/VHF
Lines/fields: 525/60
Horizontal frequency: 15.734 kHz
Vertical frequency: 59.939 Hz
Color subcarrier frequency: 3.579545 MHz
Video bandwidth: 4.2 MHz
Sound carrier frequency: 4.5 MHz

PAL-N

This is used in Argentina, Paraguay and Uruguay. Like PAL-M (used in Brazil), it occupies a 6 MHz channel with 4.2 MHz video bandwidth, but it retains the 625-line/25-frame scanning used in other PAL regions.

The similarities of System M and System N can be seen in the ITU identification scheme table, which is reproduced here:

World television systems
System Lines Frame rate Channel b/w (MHz) Visual b/w (MHz) Sound offset (MHz) Vestigial sideband (MHz) Vision mod. Sound mod. Notes
M 525 29.97 6 4.2 +4.5 0.75 Neg. FM Most of the Americas and Caribbean, South Korea, Taiwan, Philippines (all NTSC-M) and Brazil (PAL-M). The higher frame rate gives smoother motion rendition.
N 625 25 6 4.2 +4.5 0.75 Neg. FM Argentina, Paraguay, Uruguay (all PAL-N). The greater number of lines gives higher vertical resolution.

As shown, aside from the number of lines and frames per second, the systems are identical. NTSC-N/PAL-N are compatible with sources such as game consoles, VHS/Betamax VCRs, and DVD players. However, they are not compatible with baseband broadcasts (which are received over an antenna), though some newer sets come with baseband NTSC 3.58 support (NTSC 3.58 being the frequency for color modulation in NTSC: 3.58 MHz).

NTSC 4.43

In what can be considered an opposite of PAL-60, NTSC 4.43 is a pseudo-color system that transmits NTSC encoding (525/29.97) with a color subcarrier of 4.43 MHz instead of 3.58 MHz. The resulting output is only viewable by TVs that support the resulting pseudo-system (usually multi-standard TVs). Using a native NTSC TV to decode the signal yields no color, while using a PAL TV to decode the system yields erratic colors (observed to be lacking red and flickering randomly). The format was used by USAF TV based in Germany during the Cold War.[citation needed] It was also found as an optional output on some laserdisc players and some game consoles sold in markets where the PAL system is used.

The NTSC 4.43 system, while not a broadcast format, appears most often as a playback function of PAL-format cassette VCRs, beginning with the Sony 3/4" U-Matic format and then following onto Betamax and VHS machines. Since Hollywood supplied most of the cassette software (movies and television series) available to viewers worldwide, and since not all cassette releases were made available in PAL formats, a means of playing NTSC-format cassettes was highly desired.

Multi-standard video monitors were already in use in Europe to accommodate broadcast sources in PAL, SECAM, and NTSC video formats. The heterodyne color-under process of U-Matic, Betamax & VHS lent itself to minor modification of VCR players to accommodate NTSC format cassettes. The color-under format of VHS uses a 629 kHz subcarrier while U-Matic & Betamax use a 688 kHz subcarrier to carry an amplitude modulated chroma signal for both NTSC and PAL formats. Since the VCR was ready to play the color portion of the NTSC recording using PAL color mode, the PAL scanner and capstan speeds had to be adjusted from PAL's 50 Hz field rate to NTSC's 59.94 Hz field rate, and faster linear tape speed.

The changes to the PAL VCR are minor thanks to the existing VCR recording formats. The output of the VCR when playing an NTSC cassette in NTSC 4.43 mode is 525 lines/29.97 frames per second with PAL compatible heterodyned color. The multi-standard receiver is already set to support the NTSC H & V frequencies; it just needs to do so while receiving PAL color.

The existence of those multi-standard receivers was probably part of the drive for region coding of DVDs. As the color signals are component on disc for all display formats, almost no changes would be required for PAL DVD players to play NTSC (525/29.97) discs as long as the display was frame-rate compatible.

OSKM

In January 1960 (seven years prior to the adoption of the modified SECAM version), the experimental TV studio in Moscow started broadcasting using the OSKM system. The abbreviation OSKM stands for "Simultaneous system with quadrature modulation" (Russian: Одновременная Система с Квадратурной Модуляцией). It used the color coding scheme that was later used in PAL (U and V instead of I and Q), and it was based on the 625/50 System D/K monochrome standard.

The color subcarrier frequency was 4.4296875 MHz and the bandwidth of the U and V signals was near 1.5 MHz. Only about 4,000 TV sets of four models (Raduga, Temp-22, Izumrud-201 and Izumrud-203) were produced, for studying the real-world quality of TV reception. These TVs were not commercially available, despite being included in the goods catalog for the trade network of the USSR.

The broadcasting with this system lasted about 3 years and was ceased well before SECAM transmissions started in the USSR. None of the current multi-standard TV receivers can support this TV system.

NTSC-movie

NTSC with a frame rate of 23.976 frame/s is described in the NTSC-movie standard.[citation needed]

Canada/U.S. video game region

Sometimes NTSC-US or NTSC-U/C is used to describe the video gaming region of North America (the U/C refers to U.S. + Canada), as regional lockout usually restricts games released within a region to that region.

Comparative quality

The SMPTE color bars, an example of a test card.

Reception problems can degrade an NTSC picture by changing the phase of the color signal (actually differential phase distortion), so the color balance of the picture will be altered unless a compensation is made in the receiver. The vacuum-tube electronics used in televisions through the 1960s led to various technical problems. Among other things, the color burst phase would often drift when channels were changed, which is why NTSC televisions were equipped with a tint control. PAL and SECAM televisions had no need of one, and although it is still found on NTSC TVs, color drifting generally ceased to be a problem once solid-state electronics were adopted in the 1970s. When compared to PAL in particular, NTSC color accuracy and consistency is sometimes considered inferior, leading to video professionals and television engineers jokingly referring to NTSC as Never The Same Color, Never Twice the Same Color, or No True Skin Colors,[25] while for the more expensive PAL system it was necessary to Pay for Additional Luxury. PAL has also been referred to as Peace At Last, Perfection At Last or Pictures Always Lovely in the color war. This mostly applied to vacuum tube-based TVs, however, and later-model solid state sets using Vertical Interval Reference signals have less of a difference in quality between NTSC and PAL. This color phase, "tint", or "hue" control allows for anyone skilled in the art to easily calibrate a monitor with SMPTE color bars, even with a set that has drifted in its color representation, allowing the proper colors to be displayed. Older PAL television sets did not come with a user accessible "hue" control (it was set at the factory), which contributed to its reputation for reproducible colors.

The use of NTSC-coded color in S-Video systems completely eliminates the phase distortions. As a consequence, the use of NTSC color encoding gives the highest-resolution picture quality (on the horizontal axis and frame rate) of the three color systems when used with this scheme. (The NTSC resolution on the vertical axis is lower than the European standards, 525 lines against 625.) However, it uses too much bandwidth for over-the-air transmission. The Commodore 64 home computer of the 1980s generated S-video, but only when used with specially designed monitors, as no TV at the time supported the separate chroma and luma on standard RCA jacks. In 1987, a standardized 4-pin DIN plug was introduced for S-video input with the introduction of S-VHS players, which were the first devices produced to use the 4-pin plug. However, S-VHS never became very popular. Video game consoles in the 1990s began offering S-video output as well.

With the advent of DVD players in the 1990s, component video also began appearing. This provides separate lines for the luminance signal and the two color-difference signals. Thus, component produces near-RGB-quality video. It also allows 480p progressive-scan video due to the greater bandwidth offered.

The mismatch between NTSC's 30 frames per second and film's 24 frames is overcome by a process that capitalizes on the field rate of the interlaced NTSC signal, thus avoiding the film playback speedup used for 576i systems at 25 frames per second (which causes the accompanying audio to increase in pitch slightly, sometimes rectified with the use of a pitch shifter) at the price of some jerkiness in the video. See Frame rate conversion above.

Vertical interval reference

The standard NTSC video image contains some lines (lines 1–21 of each field) that are not visible (this is known as the Vertical Blanking Interval, or VBI); all are beyond the edge of the viewable image, but only lines 1–9 are used for the vertical-sync and equalizing pulses. The remaining lines were deliberately blanked in the original NTSC specification to provide time for the electron beam in CRT-based screens to return to the top of the display.

VIR (or Vertical interval reference), widely adopted in the 1980s, attempts to correct some of the color problems with NTSC video by adding studio-inserted reference data for luminance and chrominance levels on line 19.[26] Suitably equipped television sets could then employ these data in order to adjust the display to a closer match of the original studio image. The actual VIR signal contains three sections, the first having 70 percent luminance and the same chrominance as the color burst signal, and the other two having 50 percent and 7.5 percent luminance respectively.[27]

A less-used successor to VIR, GCR, also added ghost (multipath interference) removal capabilities.

The remaining vertical blanking interval lines are typically used for datacasting or ancillary data such as video editing timestamps (vertical interval timecodes or SMPTE timecodes on lines 12–14[28][29]), test data on lines 17–18, a network source code on line 20 and closed captioning, XDS, and V-chip data on line 21. Early teletext applications also used vertical blanking interval lines 14–18 and 20, but teletext over NTSC was never widely adopted by viewers.[30]
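For reference, the line assignments described above can be summarized in a small lookup table; this is a convention-level summary of the text, not an exhaustive or mandatory allocation:

```python
# Typical NTSC vertical-blanking-interval line assignments within a field,
# as described above (line numbers and uses taken from the text).
VBI_LINE_USES = {
    range(1, 10): "vertical sync and equalizing pulses",
    range(12, 15): "vertical interval / SMPTE timecodes (VITC)",
    range(17, 19): "test and reference signals",
    range(19, 20): "vertical interval reference (VIR)",
    range(20, 21): "network source code",
    range(21, 22): "closed captioning, XDS, V-chip data",
}

def describe(line):
    """Return the typical use of a VBI line number within a field."""
    for lines, use in VBI_LINE_USES.items():
        if line in lines:
            return use
    return "blanking / ancillary data"

print(describe(19))   # vertical interval reference (VIR)
print(describe(21))   # closed captioning, XDS, V-chip data
```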

Many stations transmit TV Guide On Screen (TVGOS) data for an electronic program guide on VBI lines. The primary station in a market will broadcast 4 lines of data, and backup stations will broadcast 1 line. In most markets the PBS station is the primary host. TVGOS data can occupy any line from 10–25, but in practice it is limited to 11–18, 20 and line 22. Line 22 is used by only two broadcasters, DirecTV and CFPL-TV.

TiVo data is also transmitted on some commercials and program advertisements so customers can autorecord the program being advertised, and is also used in weekly half-hour paid programs on Ion Television and the Discovery Channel which highlight TiVo promotions and advertisers.

Countries and territories that are using or once used NTSC

Countries and territories that have ceased using NTSC

The following countries no longer use NTSC for terrestrial broadcasts.

Country Switched to Switchover completed
 Cambodia PAL 1990s
 Canada ATSC July 31, 2012
 Dominican Republic ATSC September 24, 2015
 Fiji PAL 1990
 Japan ISDB-T March 31, 2012
 South Korea ATSC December 31, 2012
 Paraguay PAL-N 2000s
 Samoa PAL In the 1980s as a result of influence from American Samoa.
 South Yemen PAL Yemen unification in 1990.
 Taiwan DVB-T June 30, 2012
 Thailand[35] PAL 1970s
 United States ATSC June 12, 2009 (Full Power Stations)[36]
September 1, 2015 (Low Power Stations)
 Vietnam PAL 1990s

See also

Notes

  1. ^ National Television System Committee (1951–1953), [Report and Reports of Panel No. 11, 11-A, 12-19, with Some supplementary references cited in the Reports, and the Petition for adoption of transmission standards for color television before the Federal Communications Commission, n.p., 1953], 17 v. illus., diagrs., tables. 28 cm. LC Control No.:54021386 Library of Congress Online Catalog
  2. ^ NTSC system information and the countries that use it. High-Tech Productions
  3. ^ Digital Television. FCC.gov. Retrieved on 2014-05-11.
  4. ^ a b DTV and Over-the-Air Viewers Along U.S. Borders. FCC.gov. Retrieved on 2014-05-11.
  5. ^ Canada... PAL or NTSC?. VideoHelp Forum Retrieved on 2015-01-23.
  6. ^ Televisión Digital Abierta (in Spanish) 2015 © Televisoras Grupo Pacífico. Retrieved 2015-11-06
  7. ^ What actually occurred was the RCA TG-1 synch generator system was upgraded from 441 lines per frame, 220.5 lines per field, interlaced, to 525 lines per frame 261.5 lines per field, also interlaced, with minimal additional changes, particularly not those affecting the vertical interval which, in the extant RCA system, included "serrated" equalizing pulses bracketing the vertical synch pulse, itself being "serrated". For RCA/NBC, this was a very simple change from a 26,460 Hz master oscillator to a 31,500 Hz master oscillator, and minimal additional changes to the generator's divider chain. The "equalizing" pulses and the "serration" of the vertical synch pulse were necessary because of the limitations of the extant TV receiver video/synch separation technology, thought to be necessary because the synch was transmitted "in band" with the video, although at a quite different dc level. The early TV sets did not possess a "dc restorer" circuit, hence the need for this level of complexity. In-studio monitors were provided with separate horizontal and vertical synch, not "composite" synch and certainly not "in-band" synch (possibly excepting early color TV monitors, which were often driven from the output of the station's "colorplexer").
  8. ^ The master oscillator was 14.31818 MHz, from which the 3.579545 "color burst" frequency was obtained by dividing by four; and the 31 kHz "horizontal drive" and 60 Hz "vertical drive" were also synthesized from that frequency. This facilitated a conversion to color of the then common, but monochrome, RCA TG-1 synchronizing generator by the simple expedient of adding-on an external 14.31818 MHz temperature-controlled oscillator and a few dividers, and inputting the outputs of that chassis to certain test points within the TG-1, thereby disabling the TG-1's own 31500 Hz reference oscillator.
  9. ^ "Choice of Chrominance Subcarrier Frequency in the NTSC Standards," Abrahams, I.C., Proc. IRE, Vol. 42, Issue 1, p.79-80
  10. ^ "The Frequency Interleaving Principle in the NTSC Standards," Abrahams, I.C., Proc. IRE, vol. 42, Issue 1, p. 81-83
  11. ^ A third "line sequential" system from Color Television Inc. (CTI) was also considered. The CBS and final NTSC systems were called "field sequential" and "dot sequential" systems, respectively.
  12. ^ "Color TV Shelved As a Defense Step", The New York Times, October 20, 1951, p. 1. "Action of Defense Mobilizer in Postponing Color TV Poses Many Question for the Industry", The New York Times, October 22, 1951, p. 23. "TV Research Curb on Color Avoided", The New York Times, October 26, 1951. Ed Reitan, CBS Field Sequential Color System, 1997. A variant of the CBS system was later used by NASA to broadcast pictures of astronauts from space.
  13. ^ "CBS Says Confusion Now Bars Color TV," Washington Post, March 26, 1953, p. 39.
  14. ^ "F.C.C. Rules Color TV Can Go on Air at Once", The New York Times, December 19, 1953, p. 1.
  15. ^ "NBC Launches First Publicly-Announced Color Television Show", Wall Street Journal, August 31, 1953, p. 4.
  16. ^ 47 CFR § 73.682 (20) (iv)
  17. ^ a b c DeMarsh, Leroy (1993): TV Display Phosphors/Primaries — Some History. SMPTE Journal, December 1993: 1095–1098.
  18. ^ Parker, N.W. (1966): An Analysis of the Necessary Receiver Decoder Corrections for Color Receiver Operation with Non-Standard Primaries. IEEE transactions on broadcast and television receivers, vol. BTR-12, no. 1, pp. 23—32.
  19. ^ a b c International Telecommunications Union Recommendation ITU-R 470-6 (1970–1998): Conventional Television Systems, Annex 2.
  20. ^ Society of Motion Picture and Television Engineers (1987–2004): Recommended Practice RP 145-2004. Color Monitor Colorimetry.
  21. ^ Society of Motion Picture and Television Engineers (1994, 2004): Engineering Guideline EG 27-2004. Supplemental Information for SMPTE 170M and Background on the Development of NTSC Color Standards, pp. 9
  22. ^ Advanced Television Systems Committee (2003): ATSC Direct-to-Home Satellite Broadcast Standard Doc. A/81, pp.18
  23. ^ European Broadcasting Union (1975) Tech. 3213-E.: E.B.U. Standard for Chromaticity Tolerances for Studio Monitors.
  24. ^ CCIR Report 308-2 Part 2 Chapter XII — Characteristics of Monochrome Television Systems (1970 edition).
  25. ^ Jain, Anal K., Fundamentals of Digital Image Processing, Upper Saddle River NJ: Prentice Hall, 1989, p. 82.
  26. ^ [1] Archived 2006-03-13 at the Wayback Machine
  27. ^ Waveform Mons & Vectorscopes. Danalee.ca. Retrieved on 2014-05-11.
  28. ^ SMPTE EBU timecode by Phil Rees. Philrees.co.uk. Retrieved on 2014-05-11.
  29. ^ Technical Introduction to Timecode. Poynton.com. Retrieved on 2014-05-11.
  30. ^ Tools | The History Project. Experimentaltvcenter.org. Retrieved on 2014-05-11.
  31. ^ Michael Hegarty; Anne Phelan; Lisa Kilbride (1 January 1998). Classrooms for Distance Teaching and Learning: A Blueprint. Leuven University Press. pp. 260–. ISBN 978-90-6186-867-5.
  32. ^ Canadian Radio-television and Telecommunications Commission (CRTC) Press release May 2007
  33. ^ Transicion a TDT (Transition to DT) (Spanish)
  34. ^ Philip J. Cianci (9 January 2012). High Definition Television: The Creation, Development and Implementation of HDTV Technology. McFarland. pp. 302–. ISBN 978-0-7864-8797-4.
  35. ^ Experimented by MCOT and Royal Thai Army Radio and Television, before adopted for PAL.
  36. ^ "ATSC SALUTES THE 'PASSING' OF NTSC". NTSC. Retrieved June 13, 2009. {{cite web}}: Check |archiveurl= value (help)

References