
On Tue, Feb 12, 2008 at 12:33:22AM +0100, michael wrote:
Author: michael
Date: Tue Feb 12 00:33:21 2008
New Revision: 604
Log: 3 more issues which have come up in the past but have IIRC never been resolved.
Modified: docs/nutissues.txt
Modified: docs/nutissues.txt
==============================================================================
--- docs/nutissues.txt	(original)
+++ docs/nutissues.txt	Tue Feb 12 00:33:21 2008
@@ -110,3 +110,42 @@
 Solutions:
 A. Store such alternative playlists of scenes in info packets somehow.
 B. Design a separate layer for it.
 C. Do not support this.
+
+
+Issue header-compression
+------------------------
+Headers of codec frames often contain low entropy information or things
+we already know like the frame size.
+
+A. Store header bytes and length in the framecode table.
+B. Leave things as they are.
I think this one was resolved strongly as a leave-it-alone. I'm absolutely against any proposal that precludes implementations from performing zero-copy decoding from the stream buffer.
+Issue small-frames
+------------------
+The original intent of nut frames was that 1 container frame == 1 codec
+frame. Though this does not seem to be explicitly written in nut.txt.
It is.
+Also it is inefficient for very small frames, AMR-NB for example has 6-32
+bytes per frame.
I don't think anyone really cares that much about efficiency when storing shit codecs in NUT. Obviously any good codec will use large frame sizes or compression will not be good.
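To put a rough number on it anyway, here is a quick back-of-the-envelope sketch. The per-frame container cost used below is an assumed illustrative figure, not a measured NUT value:

```python
# Rough overhead estimate for 1 container frame == 1 codec frame.
# The per-frame container cost is an assumed figure for illustration,
# not a measured NUT number.

def overhead_pct(frame_bytes: int, per_frame_cost: int) -> float:
    """Container overhead as a percentage of the codec payload."""
    return 100.0 * per_frame_cost / frame_bytes

# AMR-NB frames are 6-32 bytes depending on mode.
for size in (6, 13, 32):
    for cost in (1, 2):
        print(f"{size:2d}-byte frame, {cost}-byte overhead: "
              f"{overhead_pct(size, cost):5.1f}%")
```

Even with these assumed costs, only the very smallest AMR-NB modes get near or past 10% overhead; at 32 bytes per frame it is already in the low single digits.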
+Solutions:
+A. Enforce 1 container frame == 1 codec frame even if it causes 10% overhead.
Yes.
+B. Allow multiple frames as long as the whole packet is less than some
+   fixed minimum in bytes (like 256byte)
Very, very bad. Demuxing would then require a codec-specific framer to split the packet back into frames.
+C. Allow multiple frames as long as the whole packet is less than some
+   fixed minimum in bytes (like 256byte) and the codec uses a constant
+   framesize in samples.
This does not help.
+D. Use header compression, that is, allow storing the first (few) bytes
+   of a codec frame together with its size in the framecode table. This
+   would allow us to store the size of a frame without any redundancy,
+   thus effectively avoiding the overhead small frames cause.
At the cost of efficient decoding... If such a horrid proposal were accepted, it would have to be restricted to frames smaller than ~256 bytes. Otherwise the buffer requirements for reconstructing the complete frame would be troublesome, and for very large frames the extra copy would make a huge performance difference.
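To make the objection concrete, here is a toy model of what proposal D would mean for a demuxer. This is an illustrative sketch, not the actual NUT framecode-table layout: with a plain frame the demuxer can hand out a zero-copy view into its stream buffer, but once the leading bytes live in the framecode table, reconstructing the full frame forces an allocation and a copy:

```python
# Toy model of frame reconstruction under proposal D (illustrative only,
# not the real NUT framecode-table layout).

def read_frame_plain(stream: bytes, pos: int, size: int):
    """Today: the whole frame is contiguous in the stream buffer,
    so the demuxer can return a zero-copy view."""
    return memoryview(stream)[pos:pos + size]

def read_frame_header_compressed(elided_header: bytes,
                                 stream: bytes, pos: int, size: int):
    """With header compression: the first bytes come from the
    (hypothetical) framecode table, the rest from the stream, so the
    full frame must be reassembled with an allocation and a copy."""
    return elided_header + stream[pos:pos + size]

stream = b"\x12\x34" + b"payload-bytes"
# Plain: zero-copy slice of the stream buffer.
frame = read_frame_plain(stream, 0, len(stream))
# Compressed: b"\x12\x34" is elided into the table; payload stays in the stream.
frame2 = read_frame_header_compressed(b"\x12\x34", stream, 2, len(stream) - 2)
assert bytes(frame) == frame2  # same frame, but the second path had to copy
```

For small frames the copy is cheap, which is why a ~256-byte limit would contain the damage; for large frames the copy dominates.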
+Issue pcm-frames
+----------------
+No word is said about how many or few PCM samples should be in a frame.
+
+Solutions:
+A. Define a maximum number of samples (like 512)
+B. Define a maximum timespan (like 0.1 sec)
+C. Define a maximum number of bytes (like 1024)
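For concreteness, a quick sketch of how the three proposed caps would interact for a couple of common PCM configurations. The helper name and the example figures (512 samples, 0.1 s, 1024 bytes, taken from the options above) are illustrative:

```python
# How the three proposed caps (samples, time, bytes) would jointly bound
# a PCM frame. Helper name and example configurations are illustrative.

def max_frame_samples(rate_hz: int, channels: int, bytes_per_sample: int,
                      cap_samples: int = 512,
                      cap_seconds: float = 0.1,
                      cap_bytes: int = 1024) -> int:
    """Largest frame (in samples) satisfying all three proposed limits."""
    by_time = int(cap_seconds * rate_hz)
    by_bytes = cap_bytes // (channels * bytes_per_sample)
    return min(cap_samples, by_time, by_bytes)

# 48 kHz stereo 16-bit: the byte cap dominates (1024 // 4 = 256 samples).
print(max_frame_samples(48000, 2, 2))   # -> 256
# 8 kHz mono 8-bit: the sample cap dominates (512 < 800 < 1024).
print(max_frame_samples(8000, 1, 1))    # -> 512
```

Note that which cap binds depends entirely on sample rate, channel count, and sample size, which is part of why the interleaving/extradata question below matters.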
PCM is already a much bigger question due to sample format and interleaving issues. Should there be a new fourcc for each combination of PCM properties, or a single fourcc with extradata? If the latter, the number of samples per frame would probably be coded in the extradata or something. I'm open to ideas.

Rich