Talk:Frame rate

Latest revision as of 21:50, 16 July 2024

Accuracy

The flicker fusion section is very good and accurate. It conflicts, though, with an inaccurate introductory paragraph for the topic:

The top says, "Frame rate...is the measurement of how quickly an imaging device can produce unique consecutive images called frames. The term applies equally well to computer graphics, video cameras, film cameras, and motion capture systems." We need to change "equally well" and find something truthful that allows for the unusual and significant reality of interlaced video, wherein the term "frame" means something different than it does in film.

Here's the issue. It has everything to do with the "look" of video versus film, and is important information in regard to conversions between film and video. Traditional interlaced video (NTSC, PAL, and SECAM, all the major formats still in use, though HD is taking over) has "frames". However, each "frame" actually contains two unique consecutive images, each called a field. So the definition above is not true. For instance, PAL has a "frame" rate of 25 "frames" per second. BUT it has 50 unique images every second. Video is very strange: thanks to interlacing, full resolution only happens on the unmoving parts of images over the span of two fields. So full resolution sometimes resolves 25 times per second (on unmoving or overlapping parts), but movement always resolves 50 times per second. That's why interlaced video looks so hyper-real compared to film.

Film movement is abstract, and often considered pleasing for it, because it sits on the cognitive-perceptual edge between discrete images and animation, at 15 to 25 images every second; video is on the other cognitive edge, just beyond the brain's ability to distinguish simulated movement from real-world movement, at 50 to 60 images every second. So we have to find a way to address this in this topic. The following statement in the flicker fusion section hints at the basic issue: "...since a conventional television camera will scan the scene again for each field, in many circumstances it may be useful to think of the frame rate as being equal to the field rate."

Cheers! worldpoop.com 20:05, 13 November 2005 (UTC)

The recent addition

An anonymous editor just added a bunch of good information to the article, but it really could use tighter integration with what was already here rather than simply being tacked on at the bottom. Some of the new information is redundant and all of it could be better "factored" into the article. Volunteers? If not, I'll eventually get to it.

Atlant 13:01, 23 Apr 2005 (UTC)

The whole thing was pasted in from whatis.com. I'll revert it. You can still add any facts from there to the current article. --Dtcdthingy 14:46, 23 Apr 2005 (UTC)

Frame rate conversion

How difficult is it to convert between different frame rates? The article should answer that.

In the particular case of increasing frame rates by 2.5 times, we have 3:2 pulldown. And also the reverse of that. But how about going from 25 or 50 fps to 30 or 60 fps, or the other way around? It's often done, of course, but how?

-- ABostrom 20:24, September 5, 2005 (UTC)

That whole topic area is called Standards conversion, and encompasses scaling video between resolutions as well. There are lots of different ways to do it: merging adjacent frames; dropping and doubling frames; intelligent motion compensation; etc. It's a big topic. --Dtcdthingy 22:01, 5 September 2005 (UTC)
Generally it is fairly easy to make LARGE changes in frame rate, but smaller ones create problems: 24 to 60 works well, 24 to 30 can be jerky. Algr (talk) 09:30, 9 April 2016 (UTC)
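For what it's worth, here is a toy illustration (a Python sketch of my own, not from either comment) of why the small ratio is the awkward one: with naive nearest-frame conversion, 24 to 30 shows every fourth source frame twice, a visible periodic hiccup, while 24 to 60 holds every frame a near-even 2 or 3 refreshes.

from collections import Counter

def hold_counts(src_fps, dst_fps):
    # for one second of output, which source frame does each output tick show?
    picks = [int(k * src_fps / dst_fps) for k in range(dst_fps)]
    return Counter(picks)   # frame index -> number of output ticks it stays on screen

print(hold_counts(24, 30))  # every 4th frame held for 2 ticks, the rest for 1: periodic judder
print(hold_counts(24, 60))  # frames held 2 or 3 ticks each: the familiar, steadier cadence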

24.976 and 29.97 vs 24 and 30

I have done a lot of work with interlaced video, and I am pretty sure that the correct frame rates for interlaced TV are [struck out: 23.976] 24.976 for PAL and 29.97 for NTSC. The numbers 24 and 30 are used for simplicity. I may be wrong about this, please correct me if this is so. If I am not corrected, and if nobody has done it, I will change it tomorrow. HighInBC 00:27, 19 March 2006 (UTC)

Afterthought: since this technical detail is not directly related to the subject matter, perhaps the word approximately should be put in front of it instead of using awkward numbers. HighInBC 00:29, 19 March 2006 (UTC)
Well, no. Wikipedia should use the actual data at least once; then the rest of the article could simply state "approx 24".--Procrastinating@talk2me 11:34, 1 June 2006 (UTC)
(Presumably, you mean 24.976 or some such. Atlant 12:39, 1 June 2006 (UTC))
Yes, I meant 24.976. Oops. HighInBC 12:58, 1 June 2006 (UTC)
Though 60i and 50i are accurate as well (I think). HighInBC 13:00, 1 June 2006 (UTC)
Well, this is further confused by 24 vs 25. PAL MPEG-2 (PAL DVD) is 25 fps, period. But NTSC MPEG-2 is either 29.97 or 23.976 (with 3:2 pulldown), i.e. about 24. The 3:2 pulldown allows a match with cinema standards and simplifies DVD mastering (I assume). The ~30 (vs 25) is originally derived from the differing power standards in different countries (e.g. 50 Hz vs 60 Hz mains). NTSC used to be exactly 60i (vs the current approx 59.94 or, to be exact, 60/1.001, which is 59.940059...), half of which is (about) 29.97. But all this stuff is written about extensively elsewhere on Wiki, so I think this page should (a) be technically correct and (b) have some suitable links to more details. I will adjust accordingly. --Psm 19:50, 1 August 2006 (UTC)
If you divide 25 by 1.001 you get 24.9750249..., not 24.976. Where does 24.976 come from? (PS: I know this does not prove things, but...) 91.124.41.18
I believe you meant multiplying 25 by 0.999, in which case it would be 24.975. Multiplying 24 by 0.999 gives 23.976. ZtObOr 21:50, 24 July 2008 (UTC)
Found it! See Telecine#3:2_pulldown; it says:
So, HighInBC was actually right there about 23. 91.124.41.18
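For anyone checking these numbers later: the arithmetic is easy to verify with exact fractions. A quick illustrative Python sketch (mine, not from any cited source):

from fractions import Fraction

slowdown = Fraction(1000, 1001)          # the 0.1% slowdown introduced for color NTSC
for nominal in (24, 30, 60):
    exact = nominal * slowdown
    print(nominal, exact, float(exact))  # 23.976..., 29.97002997..., 59.94005994...

# and 25/1.001, which nobody actually broadcasts, is 24.975..., not 24.976
print(float(Fraction(25) * slowdown))    # 24.975024975...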

This whole section of the article needs to be revised, because it's highly inaccurate. Here's a frame rate section from the Final Cut documentation. Artem-S-Tashkinov (talk) 09:05, 18 November 2009 (UTC)

Frame rate | Media | Description
24 fps | Film; high-definition video | This is the universally accepted film frame rate. Movie theaters worldwide almost always use this frame rate. Many high-definition formats can record and play back video at this rate, though 23.98 is usually chosen instead (see below).
23.98 (23.976) fps | Film; high-definition video with NTSC compatibility; NTSC | This is 24 fps slowed down to 99.9% (1000/1001) to easily transfer film to NTSC video. Many high-definition video formats (and some SD formats) can record at this speed, and it is usually preferred over true 24 fps because of NTSC compatibility.
25 fps | PAL; high-definition video | The European video standard. Film is sometimes shot at 25 fps when destined for editing or distribution on PAL video.
29.97 fps | NTSC; high-definition video | This has been the color NTSC interlaced video standard since 1953. This number is sometimes inaccurately referred to as 30 fps.
30 fps | High-definition video; early black-and-white NTSC video | Some high-definition cameras can record at 30 fps, as opposed to 29.97 fps. Before color was added to NTSC video signals, the frame rate was truly 30 fps. However, this format is almost never used today.
50 fps | PAL; high-definition video | This refers to the interlaced field rate (double the frame rate) of PAL. Some 1080i high-definition cameras can record at this frame rate.
59.94 fps | High-definition video with NTSC compatibility | High-definition cameras can record progressive video at this frame rate, which is compatible with NTSC video. It is also the interlaced field rate of NTSC video. This number is sometimes rounded to 60 fps, but it is best to use 59.94 fps unless you really mean 60 fps.
60 fps | High-definition video | High-definition equipment can often play and record at this frame rate, but 59.94 fps is much more common because of NTSC compatibility.

Important: Many people round 29.97 fps to 30 fps, but this can lead to confusion during post-production. Today, it is still very rare to use a frame rate of 30 fps, but very common to use 29.97 fps. When in doubt, ask people to clarify whether they really mean 30 fps, or if they are simply rounding 29.97 fps for convenience. Artem-S-Tashkinov (talk) 09:05, 18 November 2009 (UTC)

Another explanation for "choppiness"

Should something like this be appended to (or replace) the discussion of "choppiness"?

Choppiness can also occur if the rendering rate is not the same as the monitor's frame rate. For example, assume the video card is redrawing a scene depicting a smoothly moving object 65 times per second, and the monitor's refresh rate is 60 Hz. Every 13th frame will be dropped, resulting in the object appearing to jerk forward 5 times per second. Limiting the redraw rate to the refresh rate helps to eliminate this source of "chop."

However, the graphics card redraw rate will fall below the monitor's refresh rate if the scene is complex enough, raising the possibility of choppy motion again. This is where double buffering and triple buffering come into play. Display output is delayed by one or more frames. Thus the graphics subsystem can draw one or more frames in advance, so if extra time is needed to draw a frame it is still ready when it should be displayed. The rendering software can then skip one or more frames and render to depict a time 2 (or more) frames later. (It's OK to leave a frame on the display for two frame periods as long as the new placement is correct; this is quite different from the 65 Hz rendering example.) In 3D games or simulations, this introduces some "lag" because user input is not applied instantly, but one or more frames from now. As long as this lag is fairly constant and relatively small, the user can adapt to it without really noticing. Shyland 09:49, 22 March 2006 (UTC)
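The 65-on-60 example above is easy to simulate; a minimal sketch (Python, purely illustrative):

from fractions import Fraction

render_hz, display_hz = 65, 60
# at each 60 Hz refresh, show the most recent render completed by that instant
shown = {int(Fraction(k, display_hz) * render_hz) for k in range(display_hz)}
dropped = sorted(set(range(render_hz)) - shown)
print(dropped)   # [12, 25, 38, 51, 64]: five drops per second, one every 13th frame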


You can add the jerk part in a few sentences. Also that triple buffering means 3 frames of lag from input. Keep it short; they're just details.

This is great, put it in as a new section! :) Procrastinating@talk2me
Triple buffering doesn't mean 3 frames of lag. It means 2 frames (1/30 sec at 60 Hz, less at higher refresh rates). One buffer holds the frame currently shown; it's the other two that hold advance frames and produce delay. Furthermore, if your mouse/button input is ALWAYS 1/30 second(!) behind, you can easily adapt to such a small delay. It's unpredictable lag (i.e. not using double/triple buffering) that screws up your timing. --Shyland 19:50, 10 October 2006 (UTC)
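Shyland's count, restated as a sketch (illustrative Python, assuming a fixed 60 Hz vsync'ed pipeline):

refresh_interval = 1 / 60
for buffers in (2, 3):
    queued_ahead = buffers - 1   # buffers holding frames rendered but not yet shown
    print(buffers, "buffers ->", queued_ahead * refresh_interval, "s of queue latency")
# double buffering -> 1/60 s, triple buffering -> 1/30 s at 60 Hz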

The cinema standard of 24 fps assumes a rotating shutter with 170 degrees open. This means that the actual exposure time is slightly less than 50% of 1/24 sec, or about 1/50 sec, with a space of 1/50 sec where no action is captured. A similar effect is obtained with 60i video where the alternate field is removed and the active field doubled. Think of this as a square wave where every negative excursion is a slice of time that is not captured. This can result in a very important visual difference from a video camera capturing at 30p, if the sensor is actually being scanned 30 times/sec. Frame blurs will be much more noticeable when the sensor is integrating a full 1/30 second of action rather than 1/60 second of action. It also follows that the common recommendation to shoot video at 1/30 sec with a higher f-stop will result in a much more blurred look when shooting fast motion than locking to 1/60 sec and accepting a lower f-stop with a shorter depth of focus. It is a catch-22: more motion blur with more forgiving focus, or no motion blur but more critical focus. Mccainre (talk) 19:41, 26 March 2013 (UTC)
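The shutter-angle arithmetic in the comment above, as a small sketch (Python, illustrative only):

def exposure_time(fps, shutter_angle_deg):
    # a rotating shutter open shutter_angle_deg out of 360 exposes that fraction of each frame period
    return (shutter_angle_deg / 360) / fps

print(exposure_time(24, 170))   # ~0.0197 s, i.e. about 1/51 s, slightly under half of 1/24 s
print(exposure_time(24, 180))   # exactly half of 1/24 s, i.e. 1/48 s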

Just wondering.

What's the frame rate of reality? What will happen when games get a higher framerate than real life? --Planetary 03:28, 12 September 2006 (UTC)

I think that the real-world frame rate is infinite or close to the speed of light; the brain and the machine cannot go faster. As for me, I can detect up to 120 FPS. My old Mac monitor has up to a 120 Hz refresh rate. Now, was it interlaced or progressive? I do not know. If it was interlaced at 120 Hz, then 120/2 = 60 FPS. I can really see the difference versus 30 FPS. Two tips to help you see the difference: look at a white background, and in a game, do quick 180-degree turns (360 if you can). Another way is to move your mouse quickly left, right, left, etc. on your computer. You should see many mouse pointers at the same time. If your computer has a low refresh rate you will not see as many. —Preceding unsigned comment added by 76.69.190.11 (talk) 06:37, 28 November 2008 (UTC)

Your brain isn't a synchronous computer; it has no central clock from which to measure its "frame rate".
Atlant 13:57, 12 September 2006 (UTC)
Thanks for the info. I wasn't expecting an answer like that. :)--Planetary 04:04, 26 September 2006 (UTC)
Now, if you're talking about the frame rate of your eye, it's usually between 35 and 55 fps. Just an estimate, but I can usually see if something's running below 20 fps. ZtObOr 21:56, 24 July 2008 (UTC)
Fun idea... Reality isn't a synchronous computer either. It has no framerate, or infinite framerate, if you like. Frames are just our crude way of approximating reality. Higher frame rates only get closer to simulating reality's utter smoothness. Above a certain rate it stops mattering because you can't see the difference. At that point the display system exceeds the capabilities of the input system (your eyes and brain), so there's no real point in any further frame-rate improvements. --Shyland 20:33, 10 October 2006 (UTC)
Yeah, it would be funny if there was a maximum frame rate of real life. Sort of a stand-up comedy time gag.--Planetary 00:04, 11 October 2006 (UTC)
Modern physics seems to think that time really is quantized, so if the physicists are correct, there is an ultimate frame rate. See [1].
Atlant 00:41, 11 October 2006 (UTC)
Interesting. Thanks for the link. Looks like this tale hasn't been told...--Planetary 00:48, 11 October 2006 (UTC)
There is no definite answer to this. It relates to the computational issue of whether the universe is a finite Turing machine or an infinite one, as well as to the philosophical issue of whether the universe is continuous or discrete (e.g. Zeno's paradox). Indeed, generations of physics students have in essence been taught that the universe is *both* when told that light is both a wave *and* a particle. The underlying dilemma is that both perspectives lead to what appear to be contradictions. But more interestingly, the point of the graphics is to fool your brain, and that in turn relates to how your optical system works as well as your cognitive functions, not how the "universe" works. Hence issues like simulated motion blur (there is no such thing as "motion blur" in "reality"), as well as relativistic presentation in current massively multiplayer games (run two clients next to each other showing the same scene, and they will show different things). Note also that the biggest "fib" of current games is not the frame rate, but the simplified physics (and lighting) models of the rendering systems, since current (and envisioned) computational power is several magnitudes short of anything approaching photorealism. --Psm 22:14, 5 January 2007 (UTC)
Thanks for the explanation. I *think* I get it now.--Planetary 23:29, 5 January 2007 (UTC)
I'm of the opinion that the above is an example of a little knowledge of physics being a dangerous thing. Light behaving as a particle and a wave has nothing to do with time being discrete or continuous. Zeno's paradox isn't a paradox at all, it's an easily solved calculus problem. Calling the universe a turing machine is also absurd, as is equating client synchronization issues in games to relativity.


The universe runs at one frame per Planck second. But it is not that simple: the duration of a Planck second at each point is not necessarily the same as at any other point, due to the effects of relativity. Kinda like a cellular automaton running across billions of parallel processors. --TiagoTiago (talk) 07:45, 2 March 2013 (UTC)


"In theory, the continuous information (also analog signal) has an infinite number of possible values..." https://en.wikipedia.org/wiki/Analog_device — Preceding unsigned comment added by Dcsee1 (talkcontribs) 12:40, 6 June 2015 (UTC)[reply]

"Extra" frames are not always dropped

The solution to this problem would be to interpolate the extra frames together in the back-buffer (field multisampling), or simulate the motion blur seen by the human eye in the rendering engine. Currently most video cards can only output a maximum frame rate equal to the refresh rate of the monitor. All extra frames are dropped.

I've heard of triple buffering, but with double buffering usually at least some of each rendered frame makes it (with vsync nothing but potential processing power can go to waste; otherwise a new frame can appear if another wasn't completely sent to the monitor yet, resulting in parts of each being shown). I think the last claim should be either adjusted or removed. --62.194.128.232 02:33, 26 September 2006 (UTC)

I agree with this, but DirectX does not have triple buffering; only OpenGL for now. Well, some people have been able to hack DirectX to get triple buffering, but it is not supported and too much trouble. —Preceding unsigned comment added by 76.69.190.11 (talk) 06:45, 28 November 2008 (UTC)


Hi. This is my first time posting on a wiki page, and hopefully not too many people will be annoyed because I have nothing to back up my statements but personal experience and much anecdotal evidence from other avid competitive FPS (first-person shooter computer games) players. I have had several almost heated debates on this subject where multiple people have told me I was completely imagining things when I tell them I can clearly see the difference between 70 fps and 140 fps in these games. 70 fps feels downright CHOPPY to me and almost unplayable compared to what I am used to. I can even tell the difference between 120 and 250+ quite easily, although it's not as immediate and jarring as 70 fps is to me. Yes, I DO have a 70 Hz refresh rate on my CRT monitor. I think without playing these types of games and using a mouse to spin your view 180 degrees almost instantaneously, it's hard for almost anybody to fathom this, and yet I could tell the difference between 70 and 140 100% of the time if anybody blind-tested me. I would like to know WHY exactly. The first 4 pages of Google didn't give me anything to support what I can see firsthand, though, so let me just state the only explanation I can think of, and hopefully somebody out there knows for sure if this is all it is (or if there's more to it, or if I'm wrong), and can help get a proper explanation for this phenomenon onto this page, because so many people think that if your monitor doesn't refresh THE ENTIRE screen faster than 70 fps, it's a physical impossibility for you to have a smoother gameplay experience going past that frame rate.

Now, as one of the posters above me stated, you should be getting more than one frame of 3D for each time your monitor draws a whole screen. Therefore at 140 fps, you are actually getting 2 rendered scenes (only a part of each one) drawn into 1 of your monitor's screen refreshes. If you are at 280 fps, you are getting 4 different 3D renderings split between 1 monitor screen refresh. And these bits of more accurate frame pieces are able to, over time, give you a more accurate representation than a 70 fps vsync'ed rendering can give you.

The 2nd half of a monitor's screen should be twice as up to date with a 140 fps video card rate versus a 70 fps video card rate. The 4th quarter of a monitor's screen should be 4 times as up to date with a 280 fps video card rate versus a 70 fps video card rate. I would imagine that effectively, the higher up on the screen you are, the less a higher frame rate matters, whereas the further down you go, the more accurate your view will be with rates above 70. Is this how it works? —Preceding unsigned comment added by Corpusc (talk · contribs) 08:33, 6 March 2009 (UTC)
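A rough way to picture the poster's question (an illustrative Python sketch of my own, assuming an exact 2:1 render-to-refresh ratio and no vsync):

refresh_hz, render_hz = 70, 140
renders_per_refresh = render_hz // refresh_hz
# without vsync, each scan-out is stitched from successive renders; a tear line
# marks where a newer render takes over partway down the screen
tear_lines = [i / renders_per_refresh for i in range(1, renders_per_refresh)]
print(tear_lines)   # [0.5]: the bottom half of every refresh is one render (1/140 s) fresher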

Is this the same topic? or..

I know that, like, birds and bees and things have a higher frame-rate of vision.. or something like that, so that they see movies as choppy.. whereas my vision seems to have a super low framerate.. one frame, the ball of the sport in question is ahead of me, the next it's behind me... no chance to react..

You don't actually see in frames. You see what appears to be a streak the color of the ball that extends from in front of you to behind you. Your reaction time has nothing to do with this. We film video at around 1/30th of a second per frame because, if you open the lens for 1/30th of a second as the ball is thrown by, a streak about the same length as what humans see would appear on the film.

Feel free to edit

I wrote about half of this article, and honestly just made it up as I went along. I'm pretty smart, so it's probably 98% right, but be bold if you disagree with something.

I was just wondering if the Madden reference at the bottom of the article fit in with the rest. —Preceding unsigned comment added by 209.159.98.1 (talk) 19:32, 6 September 2007 (UTC)

Human Eye

As I've noted in the discussion for the human eye article, there's no information there or in this article about the human eye's "frame rate", which I have read to be about 60 Hz or approximately 24 fps. This, I think, is an important fact to include somewhere in these articles (as long as at least one of the articles supported it with a reliable reference). Additionally, this article could have a link to the article on aliasing (or simply the article on the wagon-wheel effect), as this information can help a person understand how the frame rate can affect the perceived motion. —Preceding unsigned comment added by Andreas Toth (talk · contribs) 23:59, 1 October 2007 (UTC)


Well, when playing a PC game, I can definitely tell the difference between a game that is running at 24 fps and 60 fps. 212.139.18.21 (talk) 01:39, 9 January 2008 (UTC)

I have read that the average human eye can see 60 fps, and experienced PC users can see up to 120 fps. —Preceding unsigned comment added by Metallica10 (talk · contribs) 00:31, 25 January 2009 (UTC)

"The human visual system does not see in terms of frames; it works with a continuous flow of light information." OK, I'm not an expert on human vision, but I do work for some, and I've just read that the human retina does indeed work in frames. So it looks like this needs and expert. So I'm adding the expert banner Anniepoo (talk) 03:45, 2 April 2011 (UTC)[reply]

The human retina does not work in frames. Different receptors respond at different rates. For example, rod cells integrate light slowly and might not be able to resolve light that is flickering faster than about 12 Hz, while cones can detect flicker up to at least 55 Hz.[1] I have added a reference to a neuroscience textbook that discusses responses of light receptors.

AlexsandraSmart (talk) 16:13, 21 June 2011 (UTC)

  1. ^ Kandel ER, Schwartz JH, Jessell TM 2000. Principles of Neural Science, 4th ed. McGraw-Hill, New York. ISBN 0-8385-7701-6

Well, I don't know about "expert," but I'm a cognition and perception psych PhD student, and I am definitely 100% sure that the eye does not have a framerate. Every single individual rod and cone light receptor fires as often as it can as long as it is receiving light in sufficient amounts. The rate is therefore biologically determined by a whole host of factors that are not controlled or standardized at all: how nutritious your meals have been recently, whether you have enough oxygen, how much light you are seeing now and just recently saw, the focus and aperture of your eye at the moment, the coloration of the light, blood supply to that one receptor cell, blah blah blah. Thus, every cell will be out of synchronization with every other cell. So it is the equivalent of rendering every pixel of your screen separately, at a fairly random (but adapted to what is needed, to some extent) rate between about 1 millisecond (very stimulated) and 30 ms (inactive). To further complicate matters, that does NOT mean 1000 FPS and 30 FPS, because neurons often transmit information in terms of firing RATE, so it might take many firings to signal "yes, I am active" to the bipolar cells they are attached to, not just one. AND lightness/brightness information is updated more slowly than color information, another reason it is totally different from a monitor. AND the information that is collected from the retina is NOT processed in one frame and then discarded, as FPS systems are. It is averaged over a long period of time, up to several seconds in a very dark room. AND line orientation and motion and color etc. are all processed at different stages and have different reaction rates, etc. The two systems are just totally not comparable at all. — Preceding unsigned comment added by 128.255.159.81 (talk) 15:10, 9 September 2011 (UTC)

Feet and frames

PBS said, "Most feature films and many TV shows are shot on 35 mm film that has 64 holes per foot. There are 4 holes on each side of each frame." [2]

Does this mean that a foot of film corresponds to one second of filming time, i.e., 16 frames per second? When they talk about shooting 300 feet of film, did they mean five minutes (300 = 5 * 60)? --Uncle Ed (talk) 13:56, 27 November 2009 (UTC)

That sounds about right to me. I gather that sound film runs at 90 feet per minute at 24 fps, or 18 inches per second. That incidentally gives the soundtrack a speed comparable to that of most professional reel-to-reel tapes (15 ips). Lee M (talk) 19:57, 25 May 2011 (UTC)
Yes, 35 mm film runs 18 inches per second at 24 fps. Binksternet (talk) 03:58, 26 September 2011 (UTC)
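The arithmetic behind those answers, as a tiny illustrative Python sketch:

PERFS_PER_FOOT = 64
PERFS_PER_FRAME = 4
FRAMES_PER_FOOT = PERFS_PER_FOOT // PERFS_PER_FRAME   # 16 frames per foot of 35 mm film

def feet_per_minute(fps):
    return fps / FRAMES_PER_FOOT * 60

print(feet_per_minute(16))         # 60.0: at silent speed, one foot lasts one second
print(feet_per_minute(24))         # 90.0: sound speed, i.e. 18 inches per second
print(300 / feet_per_minute(16))   # 5.0: 300 feet at 16 fps runs five minutes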

Watching old movies

Is there any way to watch an old movie, which was shot at 12 frames per second, on a modern computer? That is, I'd like to see the film at normal speed, so the action doesn't look sped up. --Uncle Ed (talk) 14:05, 13 February 2010 (UTC)

Use GSpot or similar to change the fps entry in the header of the AVI file, and change it for the audio accordingly. --Arnero (talk) 10:57, 10 May 2010 (UTC)

60 fps vs 59.94 fps

Why are there no distinct points in the list? What happens if I play NTSC DVDs on my notebook? Theoretically some fields will be displayed twice? Looking at modern cameras, I see they use 59.94 fps. Do modern notebooks and desktop LCDs also use this 59.94 fps, so that there are no troubles at all (I do not think so)? Can we state this explicitly in that point of the list? --Arnero (talk) 10:21, 10 May 2010 (UTC)


REPLY:

Black and white television is 30 FPS (60i). [struck out: The amount of chroma information is minimal; when color TV was introduced, the frame rate was lowered to add the color information, which incidentally only updates every "frame".]

Wrong information!

With 30p or 60i, caution should be used! Some of these frame rate combinations are not used in the video/film industry! They do not mix and match flawlessly! Sometimes 59.94i or 29.97p is used instead. Most prosumer cameras say 24p, but the actual rate is 23.976p! Or 60i, but the actual rate is 59.94i!

NTSC 59.94i<->29.97p (TV ok)
24p<->30p (not TV rate, jumpy panning on TV)
23.976p<->59.94i (TV rate)
60p or 60i (Not TV rate, jumpy panning on TV)


There are techniques to compensate for that slight discrepancy; try viewing something with Media Player Classic and press CTRL-P. You will see the compensation for it in real time in a graph at the bottom right corner of the screen. You will also see your screen's true refresh rate at the top right.

300FPS change

I updated the 300 fps section to remove the statement that the BBC was investigating high frame rate broadcasts due to issues from viewers. The paper was done to investigate the quality of improved frame rate for sports broadcasts, and it roughly outlines the benefits and some tech hurdles. There is one line about standard 1080 video at 60 Hz causing some people to complain about nausea, which is irrelevant to 300 FPS.

"High-Definition television (by which we mean television with a vertical resolution of 720 or 1080 lines and a field or frame rate of 50/60Hz) has increased the spatial resolution without altering the frame rates used, however. Traditional television production techniques have been constrained by this change. For example, during camera pans to follow the action at sports events, HDTV trial viewers reported nausea as the static portion of the scene changed between sharp (when stationary) and smeared (when panning)"

48 fps on Spongebob and Hey Arnold!? Or is it just me?

I don't know why, but when I watch some cartoons like Spongebob (season 1), Hey Arnold (seasons 1-3), etc., I noted that some scenes are sped up twice, and it really drives me insane; it ruins the theatrical look, and it looks more like a video game rather than a film or cartoon. Also, on YouTube, Spongebob looked pretty choppy, like it was missing frames (Spongebob's arms are not moving when he builds a bubblestand), as there are many fast motions here, and when it is converted to 24 fps the sped-up scenes look really choppy, like when you watch Spongebob recorded with HyperCam at only 10 fps, or a cartoon made by Cambria Studios in the late 50s and 60s like Clutch Cargo, Space Angel, or Captain Fathom, which had choppy or no animation at all. It's too bad that YouTube can't play anything faster than 30 fps. 109.174.115.127 (talk) 08:27, 21 March 2013 (UTC)

See Soap opera effect. --2003:DA:CF11:CF77:9082:7795:7704:9A65 (talk) 21:50, 16 July 2024 (UTC)

Backup Evidence for Motion Blur Headache Claim

The article currently states under 'Video games': "Motion blur can also induce headaches when people play a game that requires concentration." The source is a single blog post about a single game, with a single commenting person explicitly agreeing. Does that suffice to make that rather strong claim about a health-related effect of a technique? --178.0.155.26 (talk) 07:41, 26 July 2013 (UTC)

The Hobbit isn't the first 48 fps film

The Hobbit is not the first film to be filmed at 48 fps. The first 48 fps film is Avatar (yup, there's a 48 fps version of Avatar floating around the web). The Hobbit is actually second. 109.174.115.63 (talk) 19:26, 19 February 2014 (UTC)

Any source on this at all? I was unable to find anything on it (there are some news articles about the sequels being filmed in 48 fps, though). What you found floating around the web is probably interpolated footage from the original 24 fps. 71.213.147.249 (talk) 21:49, 29 June 2018 (UTC)

300FPS

Regarding the line "300 FPS can be converted to both 50 and 60 FPS transmission formats without major issues", would it be appropriate to add ", due in part to the fact that 300 is evenly divisible by both 50 and 60"? I'm just assuming that's the reason, not sure if it's correct. Jchap1590 (talk) 23:30, 17 July 2014 (UTC)
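That is the usual reasoning; a two-line check (illustrative Python):

# 300 divides evenly by both common TV rates, so each output frame
# can be built from a whole number of 300 fps source frames
for target_fps in (50, 60):
    print(target_fps, 300 % target_fps == 0, 300 // target_fps)   # 50 -> 6 frames, 60 -> 5 frames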

fps or FPS?

The article started with FPS in the lede, switched to fps half-way through, and after that switched back and forth in a seemingly random manner. I have changed fps to FPS throughout, for consistency with the lede, but now I wonder whether it should have been FPS changed to fps throughout, because a quick Google search suggests fps is more common than FPS. Thoughts? Is there a standard abbreviation somewhere? Dondervogel 2 (talk) 10:30, 12 April 2015 (UTC)

Following the standard rules of the language(s) and common sense, the correct form is FPS, as it's an abbreviation, an initialism to be more precise. Written *fps, be it popular or not, is incorrect, even when most people don't seem to care. It would be f. p. s., but then again, it's easier and more popular to write FPS, just as AM or PM...

62.83.135.212 (talk) 00:35, 6 June 2016 (UTC)

I believe "REFRESH RATE" is hertz, not frame rate.

Contrary to the opening paragraph, frame rate is not the same thing as hertz on a progressive scan monitor. "REFRESH RATE" is hertz, not frame rate.

https://en.wikipedia.org/wiki/Refresh_rate

The refresh rate (most commonly the "vertical refresh rate", "vertical scan rate" for cathode ray tubes) is the number of times in a second that a display hardware updates its buffer. This is distinct from the measure of frame rate in that the refresh rate includes the repeated drawing of identical frames, while frame rate measures how often a video source can feed an entire frame of new data to a display. — Preceding unsigned comment added by Dcsee1 (talk · contribs) 12:19, 6 June 2015 (UTC)

Questionable sources

"In 2011, written on a forum was said that the retina takes about 5 to 12 milliseconds for an electrical impulse to fire and reset, 100 to 1000 rods depending on where in the retina you are, can fire every 7 milliseconds on average or around 140 fps.[11] Another website said that the human eye on average could see up to 150 fps."

I don't see how "a forum" and "a website" can count as reliable sources when people can write whatever they want on those without backing it up with evidence? — Preceding unsigned comment added by Ytrearneindre (talk · contribs) 20:30, 21 January 2016 (UTC)

What the heck is "progrsssive" (p)? The article suddenly jumps into this term and it's nowhere explained. — Preceding unsigned comment added by 78.49.9.220 (talk) 23:06, 5 February 2016 (UTC)

60 Hz to 25/48/50 FPS vs 50 Hz to 24/30/48/60 FPS... I Need help.

I know that for 48, in my opinion, 48 needs 2:1 pulldown (1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2-1-2) (frame one = one refresh, frame two = two refreshes). But I don't know what the pull-down is for the PAL and SÉCAM refresh rate with the NTSC frame rate, and the same issue for the PAL and SÉCAM frame rate with the NTSC refresh rate... 2A01:E0A:AB0:C580:D1C4:271D:6CF0:560E (talk) 13:41, 26 March 2023 (UTC)
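A generic cadence can be computed mechanically; here is a small illustrative Python sketch (my own, not from any broadcast standard) that spreads source frames across display refreshes:

from fractions import Fraction

def cadence(src_fps, dst_hz, n_frames):
    # how many refreshes each of the first n_frames source frames occupies
    r = Fraction(dst_hz) / Fraction(src_fps)
    return [int((i + 1) * r) - int(i * r) for i in range(n_frames)]

print(cadence(24, 60, 8))    # [2, 3, 2, 3, ...]: the classic 3:2-style pulldown
print(cadence(25, 60, 10))   # PAL/SECAM-rate frames on an NTSC-rate display: 2-2-3-2-3 repeating
print(cadence(24, 50, 12))   # film rate on a PAL-rate display: eleven 2s, then a 3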

Deformable convolution? Adding/dropping frames?

The article uses the term "deformable convolution" without defining it and without giving a simple explanation. Are interpolated frames similar to image-morphing transitions? Does using AI significantly improve the quality of interpolated frames? Are transformations from 29.97 fps (NTSC) to 25 or 30 fps usually done by adding a duplicate or dropping a frame roughly every half minute? Can people notice that? David Spector (talk) 15:02, 11 May 2023 (UTC)

It also refers to first-person shooter games. 64.125.177.174 (talk) 21:17, 2 July 2024 (UTC)

Standard silent (shooting) framerate

As the German equivalent of this article correctly states, the standard silent framerate was 16fps ever since the 1909 Paris congress of film manufacturers. Before that, the only film system using a higher framerate as a standard for filming and playback was Edison's Kinetoscope peephole viewer (with various shorts running at either 30 or 40fps throughout their respective runtimes), and in fact, many systems prior to 1909 used framerates below 16fps. The archived Brownlow article from 1980, used as a source here to claim that the standard framerate was higher by the 1920s, already has multiple issues that make it obvious that it's far from a reliable source:

  • Brownlow's article is self-contradictory when it comes to standard recording framerates. He gives one single anecdotal quote, made decades later by a cameraman's son, to claim that so-called 'overcranking' during filming was common at the time, yet only shortly after, Brownlow contradicts himself by stating several times and in several ways that 16fps was still the definite standard at the time, such as that it can be told from the standard Bell & Howell camera equipment of the time, and uses other quotes relating to facts such as that at least up until the early or mid-1930s, professional silent-camera operators still calculated f-stop or aperture for 16fps, because that's what they were still shooting at as the standard filming speed.
  • Brownlow erroneously points to examples of deliberate slo-mo and timelapse effects (whether in-camera or during projection, cf. cue cards) for particular shots or scenes to claim that there was "no standard framerate" during filming or that it was higher than 16fps, and to rare film experiments such as Annapolis (1928). Similarly, many newer sources basically point to the experimental Napoleon (1927) by Abel Gance to claim the standard framerate was higher than 16fps; by that logic, there not only was a "higher" standard framerate, but films of the time were also shot in Cinerama and Keller-Dorian natural lenticular color "as a standard", as those were other experimental features unique to Gance's Napoleon.
  • Brownlow obviously lacks technical understanding of the contemporary Geneva drive inside cameras, as is obvious from numerous anecdotes in the article affirming the common erroneous myth that framerate would be determined by how fast the operator was cranking the handle. Movie cameras fundamentally relied upon Geneva drives translating naturally uneven and varying cranking motions into set, consistent framerates, in order to maintain the most basic exposure standards. Before early cameras adopted Geneva drives around c. 1900, films would flicker a lot in brightness throughout, because naturally uneven cranking speeds, no matter how minuscule the unevenness, would lead to significantly different amounts of exposure for each frame. Up until c. 1900-1905, brightness flicker was also caused by early projection equipment, but then around c. 1905, 3-bladed shutters were introduced in order to minimize the duration of darkness between frames in projection. The fact that there were three shutter blades in silent-movie projectors from the start makes it evident that they were designed for 16fps playback, as 16fps * 3 blades = 48 Hz (see the arithmetic sketch after this list), which meets the minimum requirement of the human eye's flicker fusion threshold of 48 Hz when it comes to the duration of darkness in-between frames (and in relation to the standard amount of image brightness inside a dark theater, as image brightness is also a factor in the minimum Hz required for flicker-free playback). When cinema progressed to talkies, the 3-bladed shutters in silent projectors were replaced by 2-bladed shutters in sound projectors, as 24fps * 2 blades = 48 Hz. Also note that today, we are used to thinking of early movies prior to c. 1950 as flickering a lot only because of cheap contact prints surviving of a particular film, and/or because of improper off-the-wall transfers made by simply pointing a video camera at the screen, where flicker occurs because the video camera is not in sync with the projector. But in fact, movies didn't flicker anymore during projection once the shutter blade was introduced around c. 1905.
  • Brownlow ignores the difference between filming speeds on the one hand and projection-speed fads on the other, in order to claim there was "no standard filming framerate" or that it was higher than 16fps, even though he also states that during the 1920s, greedy commercial theater owners began to instruct their projectionists to play back the films faster so they could cram more showings into the same day and thus make more money from selling more tickets. This also shows in his article when he quotes moviegoers saying that they initially perceived talkies as "looking leaden-footed", which was simply due to the fact that talkies were shot at 24fps and also *PROJECTED* at that speed by technical necessity. This proves that by that time, audiences were used to silent films being projected at higher framerates than they had been shot at, all because of greedy theater owners trying to cram more showings into a day. It could well be that the one cameraman's son quote about common 'overcranking' during filming above was simply due to a misunderstanding in relation to actually 'overcranked' *PROJECTION* like this. Brownlow himself relates the visible difference in speeds of movement between silent and talkie scenes seen in The Jazz Singer (1927), as the silent scenes were shot at the silent standard of 16fps and the talkie scenes were shot at the new talkie standard of 24fps, but the entire film was projected at 24fps as the new projection standard.
  • Brownlow draws his final conclusions from a faulty 1970s electronic slow-motion technology called "Polygon", used to try to slow down tape playback after a telecine to tape had been done, in order to determine what the original filming framerate was. As this analogue video technique caused excessive tape dropouts and interlacing flicker below a playback of c. 20-22fps (both because the video tape head was not fed enough information at such a low speed and because all electronic monitors were reliant upon a higher-framerate signal, and what they got at such a slow speed didn't meet their sync requirements; Brownlow himself refers to those electronic defects as "image drag"), he determines that the standard framerate of the 1920s must have been "somewhere between 20-26fps". For goodness' sake, by that quack method, he would have gotten the same erroneous results with Bioscop footage shot by the Skladanowsky brothers, which is known, documented, and measured to have been shot at only 6-8fps!
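The shutter-blade arithmetic referenced in the list above, spelled out (a trivial illustrative Python check):

# perceived flicker frequency = frame rate x shutter blades per frame
for fps, blades in [(16, 3), (24, 2)]:
    print(f"{fps} fps x {blades} blades = {fps * blades} Hz")   # both reach the ~48 Hz fusion threshold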

Thus, I think Brownlow and his claim should either be removed from the article, or those numerous flaws in his article must be pointed out here by Wikipedia, namely in relation to the standard *SHOOTING* framerate. --2003:DA:CF11:CF77:9082:7795:7704:9A65 (talk) 18:09, 16 July 2024 (UTC)

In fact, on second thought (considering how, once again, this is another case where falsehoods are picked up by many external sources simply because they were originally put on Wikipedia and have survived there for a long time), maybe it should be made an FAQ at the very top of this talk page, akin to Talk:Elon Musk/FAQ over at Talk:Elon Musk, pointing out the flaws and errors in Brownlow's 1980 article. --2003:DA:CF11:CF77:9082:7795:7704:9A65 (talk) 19:26, 16 July 2024 (UTC)