[FFmpeg-user] Glossary: d-NTSC & d-PAL

Mark Filipak (ffmpeg) markfilipak at bog.us
Sun Oct 4 04:08:42 EEST 2020


On 10/03/2020 08:09 PM, Jim DeLaHunt wrote:
> On 2020-10-03 08:44, Mark Filipak (ffmpeg) wrote:
-snip-
> When you say, "My goal (for a long time) is to differentiate hard telecine from pseudo NTSC (which
> I'm calling d-NTSC). … [using] MPEG-PS metadata", it sounds to me that your goal is to describe
> different content structures in the context of an MPEG-PS stream.

That is exactly what I'm doing. I've been (manually) parsing a lot of video sequences -- meaning the 
stream beginning with 'sequence_header_code' (0x00 00 01 B3) and ending with 'sequence_end_code' 
(0x00 00 01 B7); I'm not interested in the transport packets, though I've parsed them, too -- 
looking for clues to formats. I spent over a month just figuring out macroblock structures.
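
For anyone who'd rather see that hunt than read about it, here's an untested Python sketch of 
roughly what it amounts to. The function name is made up, and the file it reads is whatever demuxed 
video elementary stream you point it at -- nothing here is ffmpeg's code, just my illustration:

    import sys

    # The two MPEG-2 start codes named above.
    SEQUENCE_HEADER_CODE = b"\x00\x00\x01\xb3"
    SEQUENCE_END_CODE    = b"\x00\x00\x01\xb7"

    def list_start_codes(path):
        # Read the whole elementary stream and report where each of the two
        # start codes occurs, by byte offset.
        data = open(path, "rb").read()
        for name, code in (("sequence_header_code", SEQUENCE_HEADER_CODE),
                           ("sequence_end_code", SEQUENCE_END_CODE)):
            pos = data.find(code)
            while pos != -1:
                print("%s at byte %d" % (name, pos))
                pos = data.find(code, pos + 4)

    if __name__ == "__main__":
        list_start_codes(sys.argv[1])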

> The right document for doing this 
> work is a guide to or explanation of MPEG-PS stream contents. As part of describing a content 
> structure, it is probably quite helpful to list the metadata values which identify that structure. 
> But this document is not a glossary.

Why not? (That question is rhetorical ... I appreciate that you have a right to your own opinion.)

> It also sounds to me like you are coining the term "d-NTSC" to name one kind of content structure. 
> It is perfectly in scope to define names in such a guide or explanation.  But it sounds like you 
> aren't claiming that the term "d-NTSC" is [also] defined by some other document, such as the H.262 
> specification. Fine.

H.262 (and presumably MPEG) doesn't name things. For example, H.262 refers to d-NTSC & d-PAL (i.e. 
scan-frames) only by citing metadata: "If progressive_frame is set to 0 it indicates that the two 
fields of the frame are interlaced fields in which an interval of time of the field period [1] 
exists between (corresponding spatial samples) of the two fields." How cumbersome! I'm just 
assigning names to the 30/1.001 Hz & 25 Hz versions.
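
If it helps, the whole naming proposal reduces to a few lines. This is illustrative Python only -- 
the function is mine, not anything from the spec or from ffmpeg. It assumes you've already pulled 
progressive_frame out of the picture coding extension and frame_rate_code out of the sequence 
header:

    def scan_frame_label(progressive_frame, frame_rate_code):
        # The labels "d-NTSC" and "d-PAL" are my names, nothing official.
        if progressive_frame != 0:
            return "progressive frame -- not a scan-frame"
        if frame_rate_code == 4:   # 30000/1001 Hz, i.e. 30/1.001 Hz
            return "d-NTSC"
        if frame_rate_code == 3:   # 25 Hz
            return "d-PAL"
        return "interlaced at some other frame rate"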

> In the glossary, I would expect to see a term, e.g. "d-NTSC", and then one or more entries 
> describing meanings of that term, each with an explanation of the term and a cross-reference to 
> where the term is defined or used in an important way, e.g. to "Mark's Guide to MPEG-PS Stream 
> Content Structures", section X.Y, "pseudo-NTSC content".
> 
> Or simply put, what you are drafting in this thread is an entry in Mark's Guide to MPEG-PS, not a 
> glossary entry. In my humble opinion.

So, I take it that, to you, a glossary is a snack, whereas a proper meal must be some sort of 
treatise, and that you think a meal is required here. I disagree, but maybe you're right.

Perhaps a presentation of my motives is in order? -- I DO have an axe to grind. :-)

Treatises drive me nuts. I understand a complicated subject better by hopping among concise 
definitions. I rarely read treatises because they always seem to explain by citing use cases. With 
each use case the architecture comes into better focus, but that means relearning over and over, 
and it takes so much time. I'm a computer system architect. Kindly just give me the structure and 
the procedure and I'll put it together. I don't need use cases. (Code would probably be sufficient, 
but I don't know 'C'.)

When presented with a treatise, what I do is scan it -- I never exhaustively read it -- and build a 
glossary. Then, to really understand the topic, I scan the glossary, pulling threads together in my 
mind until I've formed an architecture. Then I test the architecture against the treatise's use 
cases. I don't think I'm alone in this. In the case of ffmpeg, everything seems to be use cases and 
it drives me postal.

-- 
What if you woke up and found yourself in a police state?
African-Americans wake up in a police state every day.

