
Talk:1080i

From Wikipedia, the free encyclopedia

This is an old revision of this page, as edited by 202.165.91.8 (talk) at 15:14, 26 May 2022 (lkkkk: new section). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

This article is within the scope of WikiProject Television, a collaborative effort to develop and improve Wikipedia articles about television programs. If you would like to participate, please visit the project page where you can join the discussion. To improve this article, please refer to the style guidelines for the type of work.
This article has been rated as Start-class on Wikipedia's content assessment scale.
This article has been rated as Mid-importance on the project's importance scale.

29.97fps vs 30fps?

Would someone care to add WHY the 0.03 fps difference exists: how it came about, and what the reason is for its existence? Thanks! —Preceding unsigned comment added by 71.206.65.120 (talk) 04:03, 17 March 2009 (UTC)[reply]

It came about with the move to NTSC colour. It solved certain problems that would have existed had the frame rate remained at 30fps. 20.133.0.13 (talk) 13:31, 27 October 2009 (UTC)[reply]

I believe the reason is to avoid a beat with 60 Hz electricity. 29.97 certainly existed before NTSC color. Adamgoldberg (talk) 12:49, 15 September 2021 (UTC)[reply]

1080i v. p

I am removing the line about 1080p offering no advantage and being generally unsupported. In addition to 1080p now being more widely supported by the latest generation of LCD sets, the statement about it offering no advantage is unsupported by references and arguably untrue (refer to the article on interlacing). For displays such as DLP, plasma and LCD, interlacing must be removed, and the removal causes visible artifacts in the process.

Robbins 06:20, 28 November 2006 (UTC)[reply]

Is there any reason why anyone would prefer 1080i over 1080p except if you have a CRT based HDTV set? In uncompressed format the data rate, i.e. pixels per second, is the same and it is easier and more efficient to compress video in progressive format than in interlaced format.

As has been added to the article, many current LCD TVs do not support the 1080p standard, and it offers no advantage. Interlaced scanned formats have always offered a resolution advantage over progressive scan (plus a reduction in flicker on CRTs). This is primarily why the EMI 405i system triumphed over the Baird 240p system in 1936. Had LCD displays been around in 1936, the outcome might not have been so clear cut, though the greater portability of the EMI all-electronic cameras offered a significant advantage over the Baird film-based cameras, which required a water supply for the film-processing plant in the camera base (which meant that they were bolted to the studio floor).
The previous poster states that interlaced scanning offers a higher resolution than progressive scanning. For any given line standard this is simply not true; for example, 1080i/25 has a lower vertical resolution than 1080p/25. In fact, the apparent vertical resolution of 1080i is approximated by that of 720p. Comparing a 405-line interlaced electronic system with a 240-line progressive mechanical system is hardly fair! 82.127.93.74 17:19, 19 January 2007 (UTC)[reply]
Yes. At the same bandwidth, 1080p gives 25 frames per second at a resolution of 1080 lines, while 1080i gives 50 fields per second at a resolution of 540 lines. The latter is useful in sports, where the second field is not just the odd-numbered lines but also the action happening 1/50 second later. This gives you 50 live updates per second as opposed to 25, and is the reason why sports look much better on interlaced CRT screens than on most de-interlaced flat screens. Of course, if you compare double-bandwidth 1080p60, it wins over 1080i every time. Carewolf 13:16, 2 August 2007 (UTC)[reply]
1080i is not necessarily better for sports. Because only half the horizontal lines are updated in each pass, if something is moving across the screen you see it 'stretched', as half of it is 1/60th of a second 'ahead' of the rest. I include a link for a potential future update to this poorly written, misinformed wiki here - Anomalous result (talk) 09:09, 12 December 2007 (UTC)[reply]

The main article states: "Because of interlacing 1080i has half the vertical resolution of 1080p." This is not true. 1080i and 1080p have exactly the same spatial resolution but 1080i has less temporal resolution. RastaKins 05:06, 21 March 2007 (UTC)[reply]

As far as I'm aware, the main reason 1080i still exists is broadcast engineers' reluctance to change. There is no technical reason interlacing is needed anymore (TVs are quite capable of displaying progressive scan now), but it's still around, much as the obscure 29.97 frames/second frame rate still exists. The benefits are questionable (claiming interlacing "doubles the frame rate" is silly, given that there's basically no perceivable difference between 25 and 50fps - if there were, cinema wouldn't still be 24fps..) 81.152.116.183 (talk) 02:37, 18 July 2008 (UTC)[reply]

There is a big perceivable change between 24/25fps and 50fps. 24fps has a distinctive 'film' look while 50fps has a much smoother look that is associated with video (for obvious reasons). However, since almost all film and TV is shot at 24 or 25fps, you would gain nothing from 1080p over 1080i for these, as they'd both be showing the same frames at the same resolution. Teppic74 (talk) 13:05, 24 September 2011 (UTC)[reply]

It seems few are capable of understanding temporal resolution, which operates in the human visual cortex, and that interlaced video, if presented accurately, provides more total resolution than progressive for the same frame rate (two fields per interlaced frame). The fact that monitor and TV manufacturers wish to abandon it makes no difference, and in fact becomes a self-fulfilling prophecy: since the devices alter the signal, the picture is now slightly degraded in addition to the loss of temporal resolution. — Preceding unsigned comment added by Mydogtrouble (talkcontribs) 13:58, 13 March 2013 (UTC)[reply]

1440px wide?

Maybe I'm completely wrong, but I'm sure I've read that 1080i is (often, if not always) 1440x1080 before being stretched to 16:9 (like DVD), at least when broadcast on terrestrial television. Nova Prime 11:51, 17 January 2007 (UTC)[reply]

Most HD cameras only output 1440 pixels per line in 1080 mode, and several broadcast HD VTR formats only record 1440 samples per line in 1080 mode, in order to reduce costs. The signals are re-sampled and interpolated back to 1920 samples per line on the outputs of these devices, however. 82.127.93.74 17:19, 19 January 2007 (UTC)[reply]
ATSC supports only 1920x1080 frames (1920x540 fields) for 1080i MPEG-2 transmission (although the FCC didn't accept that recommendation, so in theory any MPEG-2 MP@HL resolution is supported for use within the US). ATSC was recently revised to support MPEG-4 Part 10 (AVC/H.264) for video, and 1440x1080 is a supported resolution for that standard, although the FCC hasn't accepted that recommendation either (which in this case means that TV stations can't broadcast a primary signal in H.264).
So, the bottom line is, a TV station can theoretically broadcast 1440x1080i60 if they're in the US, but there's a strong chance many TVs simply will not display the signal. I'd be inclined to believe TV stations would be more likely to broadcast a 1440x1080p(45-50) (using RFF) stream if they're going to ignore the ATSC recommendation and just do what's possible, as that would allow them to broadcast 1080p at a high framerate while still keeping the stream under 20Mbps.
And then again, the major reason to broadcast a high frame rate is when showing sports. If you really want to make your TV station unpopular, broadcast sports using a resolution that might not be supported by everyone's TVs... —Preceding unsigned comment added by 66.149.58.8 (talk) 11:52, 12 July 2009 (UTC)[reply]

Deprecated terminology

The terminology used within this article is deprecated. The preferred terminology as described by the ITU and SMPTE has the following format: xxxxy/zz, where xxxx is the number of active lines per picture (usually 1080 or 720 when discussing high definition), y is the scanning mode (the letter i for interlaced scanning or the letter p for progressive scanning), and zz, after a slash character, is the refresh rate of the picture. Thus standard-definition television as used in Europe would be described as 576i/25. 82.127.93.74 17:40, 19 January 2007 (UTC)[reply]
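The notation described above can be sketched as a tiny formatter. This is purely illustrative; the function name and signature are my own invention, not anything defined by the ITU or SMPTE documents:

```python
# Illustrative helper for the ITU/SMPTE "active lines + scan mode / rate"
# notation described above. The function name and API are hypothetical.
def itu_notation(active_lines: int, interlaced: bool, rate: int) -> str:
    """Format a video mode as e.g. '1080i/25' or '576i/25'."""
    scan = "i" if interlaced else "p"
    return f"{active_lines}{scan}/{rate}"

print(itu_notation(576, True, 25))    # European SD: 576i/25
print(itu_notation(1080, False, 50))  # 1080p/50
```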

"Solidus"?

The second paragraph says:

 Others, including the European Broadcasting Union (EBU), prefer to use the frame rate 
 instead of the field rate and separate it with a solidus from the resolution as in 1080i/30 
 and 1080i/25, likewise 480i/30 and 576i/25.

The word "solidus" links to the page for the mark "slash," which includes a very stuffy note that "slash" and "solidus" are not the same. Jackrepenning 22:05, 22 January 2007 (UTC)[reply]

Yeah, that's ridiculous. "Solidus" ??? As opposed to what, "liquidus"? Gimme a break. I replaced it with the word "slash".

What to reference

When writing a paper, you are not supposed to list references for facts that someone reasonably skilled in the field would already know. For example, 1080i60 uses the same bandwidth as 1080p30. Most people who know something about video displays know this is true, and a simple calculation of image size times frame rate shows it: 60*1920*540 = 30*1920*1080. So does Wikipedia have a different standard for what should have a reference? Daniel.Cardenas 04:25, 4 April 2007 (UTC)[reply]
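The arithmetic in the comment above can be checked directly; this is just a restatement of the 60*1920*540 = 30*1920*1080 calculation, with variable names of my own choosing:

```python
# Sanity check of the bandwidth claim above: 1080i60 (60 fields of 1920x540)
# moves the same number of raw pixels per second as 1080p30 (30 frames of
# 1920x1080).
fields_per_sec, field_pixels = 60, 1920 * 540    # 1080i60
frames_per_sec, frame_pixels = 30, 1920 * 1080   # 1080p30

interlaced_rate = fields_per_sec * field_pixels
progressive_rate = frames_per_sec * frame_pixels

assert interlaced_rate == progressive_rate == 62_208_000  # pixels per second
```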

Since a paper on video display is intended for a particular audience, its author can reasonably make assumptions about what the audience should know. Wikipedia, however, is aimed at a general audience, and we cannot assume that anyone reading the article is "reasonably skilled" in video display, or even that they know anything at all about video display. Many, perhaps most, people reading this article would not know how to calculate bandwidth. Fumblebruschi (talk) 21:00, 27 December 2007 (UTC)[reply]

Layman's question

I'm baffled. I thought 30 fps was the standard for US television, and that the difference between interlaced and progressive was whether the full frame was completed in one pass or in two passes. I'm no techie, so perhaps this could be explained in layman's terms. The discussion above compounds my confusion rather than ending it. Second question, partly related to the first: Here's a quote from the article: "Due to interlacing 1080i has twice the frame-rate but half the resolution of a 1080p signal using the same bandwidth. This is especially useful in sport-shows and other shows with fast-moving action." I'm not sure what "this," the first word in the second sentence, refers to. It seems to me to refer to the subject of the first sentence, 1080i, which means that 1080i is better for shows with fast-moving subjects. I'm almost sure that's wrong, because of the movement of displayed objects in the picture between the first and second scans of the same frame. But even if it's not wrong, perhaps the pronoun "This" that begins the second sentence should be changed to a noun (The 1080i standard or the 1080p standard) to clear up any confusion.

Many cameras don't use the same source frame to generate the two fields of an interlaced frame. So if you imagine 50 frames per second, each divided into odd and even lines, you get the even lines of the first frame, then the odd lines of the second frame, then the even lines of the third frame, etc. This is also what makes de-interlacing so incredibly hard, because you cannot just recombine two fields to get the original. Carewolf 13:21, 2 August 2007 (UTC)[reply]
Answer to "baffled". Interlaced (1080i) transmits 30 frames per second, ½ of each frame every 1/60th of a second. Progressive (720p) transmits 60 complete frames per second. On flat-screen HDTV sets (LCD or plasma), all broadcasts are displayed as progressive; thus the two parts of the 1080i frame must be combined (de-interlaced); some HDTV sets perform de-interlacing better than others. 1080p is not broadcast; it has twice the frame rate (60/second) of 1080i (30/sec). Many believe that 720p at 60 frames/sec is superior for fast-moving action (it's used by ESPN and ABC). -Dawn McGatney 69.139.231.9 (talk) 08:29, 30 March 2008 (UTC)[reply]
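The field structure discussed in this exchange can be sketched in a few lines. This is only an illustration of splitting a frame into fields and "weaving" them back, not any particular TV's de-interlacer; as the posters note, the weave is lossless only when both fields come from the same source frame:

```python
# Illustrative sketch: a frame is a list of scan lines; a field is every
# other line. Weaving two fields from the SAME frame reconstructs it exactly;
# fields captured 1/50 s apart cannot be recombined this way without combing.
def split_fields(frame):
    """Return (top, bottom): the even- and odd-numbered lines of a frame."""
    return frame[0::2], frame[1::2]

def weave(top, bottom):
    """Interleave two 540-line fields back into a 1080-line frame."""
    frame = []
    for t, b in zip(top, bottom):
        frame.extend([t, b])
    return frame

frame = [f"line{n}" for n in range(1080)]
top, bottom = split_fields(frame)
assert weave(top, bottom) == frame  # lossless when fields share a frame
```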

Contradiction in frame rates.

The first paragraph speaks of frame rates of 25 and 30 Hz for 1080i, yet the comparison table says 1080i is 50 or 60 Hz and 1080p is 24, 25 or 30 Hz. This appears to be a contradiction. —Preceding unsigned comment added by KX36 (talkcontribs) 06:15, 16 August 2007

This is not a contradiction; it's just how the standards the Consumer Electronics Association has adopted work. 1080i content exists only in 50 and 60 fields-per-second varieties; 1080p exists at 24, 25 and 30 frames per second. There is no analogous format for 1080i at 48 fields per second. Traditionally, 1080p24 content will be telecined to 1080i60, or sped up to 1080p25 and interlaced, making 1080i50. — Preceding unsigned comment added by 206.248.184.215 (talk) 21:38, 22 September 2012 (UTC)[reply]
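The 1080p24-to-1080i60 telecine mentioned above is classically done with 3:2 pulldown. A rough sketch, with a cadence implementation and field labels of my own choosing, is:

```python
# Sketch of 3:2 pulldown: alternately take 3 fields and 2 fields from
# successive film frames, so four 24fps frames (A, B, C, D) become ten
# fields, i.e. 24 frames/s * 10/4 = 60 fields/s. Labels like 'At' (top
# field of frame A) are my own notation.
def pulldown_32(frames):
    fields = []
    counts = [3, 2]  # 3 fields from one frame, 2 from the next, repeating
    for i, frame in enumerate(frames):
        for _ in range(counts[i % 2]):
            parity = "t" if len(fields) % 2 == 0 else "b"
            fields.append(frame + parity)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# -> ['At', 'Ab', 'At', 'Bb', 'Bt', 'Cb', 'Ct', 'Cb', 'Dt', 'Db']
```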

1080iN notation - Is N frame rate or field rate?

This is unclear in the article.

"1080p60", has no ambiguity, (60 full frames per second), but "1080i60" might be interpreted as 60 two-field-frames per second, or it might be interpreted as 60 fields per second.

Can we have a section establishing the standard interpretation? Also we may need to clarify or avoid all usages of "N frames per second". fields or full-frames?

Glueball 10:56, 10 September 2007 (UTC)[reply]

Until now I had always seen that number meaning fields per second for interlaced modes, so that PAL-B/G is expressed as 576i50. But I'll have to check. --150.241.250.3 07:31, 18 September 2007 (UTC)[reply]

The number following "p" or "i" is frames (complete pictures) per second. 1080i is short for 1080i30; 720p is short for 720p60. (And 480i (SD) is short for 480i30.) -Dawn McGatney 69.139.231.9 (talk) 08:38, 30 March 2008 (UTC)[reply]
No, it's fields. You'd be hard-pressed to find a single instance (except possibly in the EBU notation - I'd like a citation, because a quick Google shows around 8x the number of references to 1080i/60 as 1080i/30) where a 60Hz 1080 interlaced signal is referred to as "1080i30". On manufacturers' data sheets, it's 1080i60. In reviews it's 1080i60. In "Dummy's guide to HDTV" type articles, it's 1080i60. I'm a little baffled that Wikipedia is bucking the trend here, as I've come across a number of Wikipedia articles (and only WP articles) using the "frames" number, and no citations to back it up.
I'll await clarification on the EBU notation before changing that, but for now if there's no slash, the number after the "i" should be fields, otherwise Wikipedia's going to be out of step with pretty much the entire world on this! --66.149.58.8 (talk) 11:44, 11 July 2009 (UTC)[reply]
The definition of the EBU nomenclature is found in the article's first external link, "High Definition (HD) Image Formats for Television Production" (EBU - Tech 3299), under the heading "Nomenclatures and Image Sampling Systems". It says "samples horiz. x active lines/Scanning/frame rate" with an abbreviated style without the horizontal samples. Regarding 1080i/60 vs 1080i/30, a month later EBU Technical Review Editorial No. 301 says that "the convention used to describe TV formats is the number of active lines per frame + the scanning algorithm [interlace(i) or progressive (p)] / the frame rate" while using the notation 1080i/30 in the article itself. Lacking an authoritative source, I find it hard to believe that notations in the form of 1080i30 would be anything more than derivatives of these. I believe the de facto source of the EBU style notation is to be found in the ITU publications sourced in the EBU - Tech 3299 document under the heading "Informative References". 212.246.213.38 (talk) 18:51, 23 October 2009 (UTC)[reply]

540p?

Why does 540p redirect here? It's clearly not the same. 83.108.208.28 (talk) 00:22, 9 July 2009 (UTC)[reply]

Same question, almost a year later. Why does nobody fix things around here? --77.109.214.213 (talk) 11:19, 10 May 2010 (UTC)[reply]
Because 1080i is made up of two 540 interlaced fields. See Display resolution#Current standards. 74.179.40.22 (talk) 18:14, 18 May 2010 (UTC)[reply]
One field is 1920x540. 540p is 960x540 pixels. Added to article to dodge further confusion. CadetMadet (talk) 01:04, 4 July 2010 (UTC)[reply]
That information was deleted – along with various other material – about two months later (00:58, 28 August 2010 (UTC) by Mikus with the edit explanation "Less bla-bla".) I just put it back (and added a mention of 1440x1080). I suppose that involves "More bla-bla", but I think it is important information. —Mulligatawny (talk) 17:38, 27 April 2011 (UTC)[reply]

Broken citation

Please fix my broken citation (#3, as of the time of writing). I could not find a way to link the image, which is on Wikimedia Commons, inside the reference. The image is the only real reliable reference, in this case (as explained in my edit comment). Comanoodle (talk) 21:02, 20 September 2009 (UTC)[reply]

1877x1000?

There is an edit done by an anonymous editor in the fourth paragraph: "1877x1000 (the actual displayed resolution of a 1920x1080 source) resolutions.", in contrast to the original "1920x1080 resolutions". Where in the world have they got that information? To my knowledge, all HDTV resolutions are displayed as such; no cropping is ever done (which I believe the author tries to say, in contrast to scaling). If nobody comments, I'll undo that. Elmo Allen (talk) 04:39, 23 November 2009 (UTC)[reply]

Some televisions apply an artificial "overscan", where the outer 8-10% of the image wouldn't be seen, in the same way that older CRT displays tended to. However, this is quite uncommon with modern displays. —Preceding unsigned comment added by 121.98.240.135 (talk) 10:12, 27 April 2011 (UTC)[reply]

Interlaced artefacts

I think the article needs to be made clearer that interlaced video doesn't have to suffer any artefacts at all, and can produce the exact same results as 1080p broadcasts. The image used in the article is a bit misleading. For example, in the UK programmes are broadcast at 1080i for most HD channels, but the source material is 25fps, and so a 1080p television simply combines the two fields to produce an original progressive frame. You'd never under any circumstances see a combing effect, because the two fields are from the exact same original frame. As the article stands, it gives the impression that 1080i is always visually inferior to 1080p, which is nonsense. Teppic74 (talk) 12:15, 28 September 2011 (UTC)[reply]

I agree with that. I think the image should be removed, as it's misleading: it causes readers to think the combing problem is what interlaced video is supposed to look like, which is not the case. NJM2010 (talk) 12:25, 8 October 2011 (UTC)[reply]

Bandwidth compression degradation

This article needs much more content!

It only discusses "perfect" content streams -- which is not the real world. What is the full bandwidth of a perfect 1080i stream? What is the real typical bandwidth of OTA broadcasts, cable broadcasts, and Bluray sources? How much of what kinds of compression is used, and how does this degrade various kinds of content? Signal encoding redundancy/error correction/artifacts?-96.237.4.73 (talk) 19:08, 14 February 2013 (UTC)[reply]
