[FFmpeg-trac] #8590(undetermined:closed): 'telecine=pattern' error for p24, soft telecined sources

FFmpeg trac at avcodec.org
Mon Apr 6 10:00:39 EEST 2020


#8590: 'telecine=pattern' error for p24, soft telecined sources
-------------------------------------+-------------------------------------
             Reporter:  markfilipak  |                    Owner:
                 Type:  defect       |                   Status:  closed
             Priority:  normal       |                Component:
                                     |  undetermined
              Version:  unspecified  |               Resolution:  invalid
             Keywords:               |               Blocked By:
             Blocking:               |  Reproduced by developer:  0
Analyzed by developer:  0            |
-------------------------------------+-------------------------------------

Comment (by pdr0):

 Replying to [comment:41 markfilipak]:
 >
 >
 > Sorry, but I don't know what you had in mind when you wrote
 "Undersampling here". To what does "here" refer? Are you referring to
 sampling film? Are you referring to resampling an image? Or are you
 referring to something else?
 >

 We were discussing undersampling as it relates to aliasing and twitter,
 and that it is the most common underlying cause: spatial undersampling.
 Discarding pixels, line skipping, or discarding groups of pixels in the
 case of pixel binning.

 > I thought we were discussing interlace and aliasing. Instead of 100
 samples, throw away 50, how about a real case? Each picture on a 720x480
 DVD has 345600 samples. Are you saying that hard telecine with interlaced
 fields, for example, that divides the pictures into half-pictures with
 172800 samples each is throwing half the samples away? What video product
 throws half the pixels away?
 >

 Yes - interlace, aliasing, twitter artifacts.

 Interlaced content throws half the spatial samples away; progressive
 29.97p content throws half the temporal samples away (both compared to
 59.94p, which isn't "legal" for DVD-Video, but that's how it's acquired
 these days). Both are old compromises made only because of historical
 bandwidth issues. That's the main reason we even have interlace.
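
 To put numbers on that, here is a small Python sketch (my own
 illustration, using the 720x480 raster mentioned above) of the raw
 sample-rate arithmetic:

```python
from fractions import Fraction

# Illustrative arithmetic (toy sketch): raw sample rates for a 720x480
# raster, comparing interlaced and 29.97p against a 59.94p "full" version.
W, H = 720, 480
FPS = Fraction(60000, 1001)        # 59.94 frames (or field pairs) per second

samples_per_frame = W * H          # 345600 spatial samples per picture

full_59p   = samples_per_frame * FPS            # 59.94p baseline
interlaced = (samples_per_frame // 2) * FPS     # 59.94 fields/s, half height each
prog_2997  = samples_per_frame * FPS / 2        # 29.97p, half the frame rate

# Interlace halves the spatial samples; 29.97p halves the temporal samples.
# Either way you keep exactly half the data of 59.94p.
assert interlaced == prog_2997 == full_59p / 2
print(float(full_59p), float(interlaced), float(prog_2997))
```

 Both halved forms carry exactly half the data of the 59.94p baseline,
 which is the bandwidth compromise being described.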

 Hard telecine is 23.976p encoded as 29.97i (or, if you like, 59.94 fields
 per second - the same thing under a different naming convention). If you
 started with 23.976, then obviously nothing is thrown away. In fact you're
 adding duplicate fields (the extra repeat fields are actually encoded).
 Soft telecine does not encode the extra repeat fields; it just uses
 repeat-field flags to signal them - I think you know this, because you
 mentioned soft telecine in the first post.
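
 For reference, the 2:3 pulldown cadence behind hard telecine can be
 sketched in a few lines of Python (my own toy model; "At"/"Ab" are just
 labels for the top/bottom fields of film frame A, and the pattern shown
 is one common cadence):

```python
# Toy 2:3 ("3:2") pulldown sketch: 4 film frames -> 10 fields -> 5
# interlaced video frames, as hard telecine encodes them.
film   = ["A", "B", "C", "D"]   # 4 frames at 23.976p
counts = [2, 3, 2, 3]           # fields emitted per film frame (2:3 cadence)

fields = []
for frame, n in zip(film, counts):
    for _ in range(n):
        parity = "t" if len(fields) % 2 == 0 else "b"  # alternate top/bottom
        fields.append(frame + parity)

# Weave consecutive field pairs into 29.97i video frames
video = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
print(video)  # -> ['AtAb', 'BtBb', 'BtCb', 'CtDb', 'DtDb']
```

 The repeated "Bt" and "Db" are the extra duplicate fields that hard
 telecine actually encodes; soft telecine instead signals them with
 repeat-field flags, as described above.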

 >
 > > > When you say that transcoding 24fps to 24fps is not subsampling,
 but merely dividing a picture into half-pictures is subsampling, are you
 being consistent?
 > > >
 > >
 > > Yes, it's consistent. Nothing is being discarded when you go 24fps to
 24fps (ignoring lossy compression for this discussion). Dividing into
 half-pictures - nothing is being discarded
 >
 > But that's not subsampling, is it?
 >

 Correct, it's not subsampling

 (And "subsampling" in terms of video is a term usually reserved for
 "chroma subsampling")
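
 The ratio there is easy to state (my own sketch): in common 4:2:0 chroma
 subsampling, each 2x2 block of luma samples shares one Cb and one Cr
 sample:

```python
# 4:2:0 chroma subsampling sample counts for a 720x480 picture (toy check)
W, H = 720, 480
luma   = W * H                     # 345600 luma samples
chroma = 2 * (W // 2) * (H // 2)   # Cb + Cr planes at half size each: 172800

total_420 = luma + chroma          # 518400, vs. 3 * luma for 4:4:4
assert total_420 * 2 == luma * 3   # 4:2:0 keeps half the samples of 4:4:4
print(luma, chroma, total_420)
```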


 > > BUT interlace content - now something is being discarded . Do you see
 the difference ?
 >
 > No. I don't. Each DVD picture has 345600 samples regardless of whether
 those 345600 samples are in one frame or two interlace fields.
 >
 OK, you're assuming DVD.

 Yes, this is true: 720x480 = 345600. But interlaced content starts at
 59.94p before it is interlaced (OK, there are some old CCD sensors that
 begin life interlaced, but anything from the last 10-15 years uses
 progressive CMOS sensors).


 > > > >... Each field has half the spatial information of a full
 progressive frame. ...
 > > >
 > > > Does that make it subsampling?
 > > >
 It does when you start with 59.94p.

 > >
 > > For interlace content - yes it does. Half the spatial information is
 missing
 >
 > No, it's not. It's just formatted into fields instead of frames. There's
 nothing missing.
 >
 And this is where the misunderstanding is -

 When you assume the starting point is a CCD interlaced sensor acquisition,
 sure, nothing is missing compared to that (again, ignoring lossy
 compression).

 But anything from the last 10-15 years started life as progressive CMOS
 59.94p (or higher) acquisition.

 Either way - the bottom line is that interlace is a spatially undersampled
 picture, a bandwidth-saving measure. You can't represent a moving diagonal
 line cleanly with interlaced content, because half the information is
 missing compared to a full 59.94p progressive version. And that's how and
 why interlace relates to aliasing and twitter.


 >
 > The only process that I know of that takes one field and makes a frame
 of it is bobbing. Nothing is being thrown away, even then, but the results
 indeed are half resolution.

 Yes, it's called "separating fields" these days. Nothing is thrown away if
 you separate fields. You can weave them back.
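
 That round trip is easy to demonstrate (my own toy sketch, with a frame
 modeled as a list of scan lines):

```python
# Separating a frame into fields and weaving them back is lossless.
frame = [[y * 10 + x for x in range(4)] for y in range(6)]  # toy 4x6 "frame"

top    = frame[0::2]   # even scan lines -> top field
bottom = frame[1::2]   # odd scan lines  -> bottom field

# Weave: re-interleave the two fields into a full frame
woven = [None] * len(frame)
woven[0::2] = top
woven[1::2] = bottom

assert woven == frame  # nothing was thrown away
```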


 >>>
 >Bobbing certainly is the only sane way to turn telecast fields into
 frames. If that's the source of the line aliasing to which you refer, then
 I agree of course (and the discussion is so basic I can't see why we're
 even engaging in it), and I'm surprised because that is such an obscure
 application that has nothing to do with progressive sources like movies
 and, again, nothing is thrown away. So when you refer to something thrown
 away... throwing away half the pixels, I have no idea to what you refer.
 Sorry.
 >>>

 The "source" of the aliasing is that a single field does not make a full
 progressive frame. When you have interlaced content, you don't have
 complete matching field pairs to make up a full progressive frame.
 Interlace is spatially undersampled compared to the 59.94p version. We
 were talking about interlace, aliasing, and twitter, remember?

 Yes, nothing is thrown away with progressive sources like movies. They
 were acquired at 23.976p (or 24.0p in many cases). (An exception might be
 some HFR-acquisition movies.)

 >
 > > But if you bob deinterlace interlaced content, the undersampling was
 there to begin with . The process of making the interlace, dropping the
 spatial samples is what caused the undersampling
 >
 > You're talking about dropping samples again. For telecasts, the non-
 existent field never existed. The 'current' field can never be part of a
 field-pair. Nothing is dropped. It never existed.
 >

 The information is there in the acquisition. It certainly is these days.
 (OK, there might be some exceptions for quick-turnaround scenarios, like
 ENG, that still use interlaced acquisition.)

 > > >
 > >
 > > It's undersampling. The resolution of DSLR sensors is very high: 30-50
 megapixels on average, with 120-150 megapixel cameras available. 1920x1080
 HD video is only ~2K, about 2 megapixels. It should be massively
 oversampled for video. But because of overheating and processing
 requirements (it takes lots of CPU or dedicated hardware to do proper
 downscaling in realtime), they (especially the first 5-6 generations of
 DSLRs) drop every nth pixel when taking video. Some newer ones are better
 now, but this is a very common source of aliasing and line twitter
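
 The effect is easy to reproduce in one dimension (my own toy sketch, not
 any specific camera's pipeline): decimating without filtering aliases a
 fine pattern, while a crude low-pass (averaging) before decimating does
 not:

```python
# Finest stripe pattern the toy "sensor" can hold: alternating 1, 0
fine_stripes = [1, 0] * 8

# Line skipping: keep every other sample, no filtering first
skipped = fine_stripes[0::2]
print(skipped)   # -> [1, 1, 1, 1, 1, 1, 1, 1]  (stripes alias to solid)

# Average neighboring pairs first (a crude low-pass), then decimate
averaged = [(a + b) / 2 for a, b in zip(fine_stripes[0::2], fine_stripes[1::2])]
print(averaged)  # -> [0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5, 0.5]
```

 The skipped version turns the finest detail into a flat field - a
 one-dimensional analogue of the line twitter and aliasing described
 above.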
 >
 > Okay, I don't consider that undersampling, but you're right, regarding
 the resolution of the CCD, it is undersampling. You see, I don't consider
 the taking camera as part of the mastering process. To me, the origin is
 what comes out of the camera, not what could come out of the camera if the
 sensor was 100% utilized.
 >

 Not CCD, almost nothing uses CCD anymore :) CMOS. Progressive.

 The still images that come out of the camera from the sensor are almost
 100% utilized; in burst mode you can get 12-16 fps. Massive images. Video
 is just still images strung together.

 But the point of the DSLR example is how it relates to aliasing and
 twittering. It's the sampling that is done downstream that causes it, much
 like the interlace example. It's line skipping and dropping pixels
 compared to full images.

 >
 > There is no simple, mutually-agreed sense/understanding/meaning of the
 word "undersample".
 >
 Sure, call it something else if you like.

 The point of this part of the discussion was that interlace - spatial
 undersampling compared to a full progressive frame - was a common
 mechanism for aliasing and twitter.

--
Ticket URL: <https://trac.ffmpeg.org/ticket/8590#comment:42>
FFmpeg <https://ffmpeg.org>
FFmpeg issue tracker


More information about the FFmpeg-trac mailing list