[asterisk-video] How do you send large frames?

SteveK stevek at stevek.com
Thu May 18 05:50:44 MST 2006


On May 18, 2006, at 6:06 AM, Sergio García Murillo wrote:

>> Hi, sorry - I think I confused the discussion on #asterisk-dev.
>>
>> We actually strip the rtp header - it's the h263 RFC2190 header that
>> is stored, together with the length (with rtp mark bit) and
>> timestamp.
>>
>> It's not a standard format that I know of, which is why we need the
>> GStreamer modules to convert other formats into this 'asterisk
>> format'.
>>
>> Neil
>>
>> Olle E Johansson wrote:
>>> I am learning new things about the video implementation all the  
>>> time.
>>>
>>> Seems like we are actually saving RTP headers in the format files,
>>> which is bad. As far as I know,
>>> IAX2 does not use RTP so I wonder if playing one of the saved files
>>> to an IAX videophone would work?
>>>
>>> I am not aware of any IAX2 video phones, but not using a clean
>>> format disturbs me.
>
> Hi everyone, I've been following the thread, but I still don't
> understand the problem.
> Can anyone clear up the questions? Where is the problem? Is it in
> sending video stored in a file?
> Saving it to a file or just retransmitting it?
> By the way, I think that it would be a good idea to use some
> standard container for storing the video.
> I have been working with MP4 for a while and I think it would be
> perfect for what we need. In fact I developed an application that
> converted a dump of RTP H.263 RFC 2190 data into a perfectly valid
> MP4 file, which could be played with VLC or QuickTime and streamed
> correctly (with the same RFC 2190 payload) by a Darwin Streaming
> Server.
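
First, to make the 'asterisk format' Neil describes concrete: each
stored frame is just the RFC 2190 payload header and data, prefixed
by a timestamp and a length word carrying the RTP mark bit. A reader
is a few lines of C. (This is a sketch; the exact field widths and
byte order are assumptions on my part, not taken from format_h263.c.)

#include <stdio.h>
#include <stdint.h>
#include <arpa/inet.h>   /* ntohs(), ntohl() */

/* Assumed per-frame record:
 *   uint32_t timestamp  (network byte order)
 *   uint16_t length     (network byte order; high bit = RTP mark bit)
 *   uint8_t  payload[]  (starts with the RFC 2190 payload header)
 */
static int read_record(FILE *f, uint8_t *buf, size_t bufsize,
                       uint32_t *ts, int *mark, uint16_t *len)
{
        uint32_t raw_ts;
        uint16_t raw_len;

        if (fread(&raw_ts, 1, sizeof(raw_ts), f) != sizeof(raw_ts))
                return -1;              /* EOF or short read */
        if (fread(&raw_len, 1, sizeof(raw_len), f) != sizeof(raw_len))
                return -1;

        *ts = ntohl(raw_ts);
        raw_len = ntohs(raw_len);
        *mark = (raw_len & 0x8000) ? 1 : 0;  /* RTP marker bit */
        *len = raw_len & 0x7fff;

        if (*len > bufsize)             /* corrupt or truncated record */
                return -1;
        if (fread(buf, 1, *len, f) != *len)
                return -1;
        return 0;
}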


A couple of comments:

1) As far as I know, the only IAX2 video phone implementation is the
fork that Tipic made of iaxclient; the official iaxclient
distribution should also support video at some point later this year
(based in part on Tipic's work). So, I think that the IAX2 video
formats are somewhat flexible at this point, and no specification
documents really discuss this.

Things like marker bits for beginning-of-frame, keyframe indication,
etc., are certainly not specified.
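
To make that concrete, a spec would eventually need to pin down
per-frame signaling along these lines (purely hypothetical names and
values; nothing like this is defined anywhere today):

/* Hypothetical per-frame flags an IAX2 video spec might define.
 * Illustrative only; not from iaxclient or any draft. */
#define VFRAME_MARK      0x0001  /* last packet of an encoded frame */
#define VFRAME_KEYFRAME  0x0002  /* frame decodes without reference */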

2) File formats: MP4 (or MOV, on which it's based) is a fine
container, but one thing that has always concerned me about saving
streams is how to represent lost packets in these containers. It's
important to be able to represent that for two reasons: (a) you want
to be able to keep audio and video in sync, and (b) you may want to
accurately represent a recording with some packet loss, by playing it
back in the proper amount of time.
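
One workable approach for both (a) and (b) is to carry timing in the
container's per-sample durations, computed from the RTP timestamps,
so that a gap left by lost packets stretches the preceding sample
rather than silently shortening the file. A sketch (the struct and
names are illustrative, not from any existing code):

#include <stddef.h>
#include <stdint.h>

struct sample {
        uint32_t rtp_ts;    /* RTP timestamp recovered from the stream */
        uint32_t duration;  /* per-sample duration, in RTP clock units */
};

static void assign_durations(struct sample *s, size_t n,
                             uint32_t default_dur)
{
        size_t i;

        for (i = 0; i + 1 < n; i++) {
                /* unsigned subtraction handles 32-bit timestamp wrap */
                s[i].duration = s[i + 1].rtp_ts - s[i].rtp_ts;
        }
        if (n > 0)
                s[n - 1].duration = default_dur;  /* no successor; guess */
}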

For (b), consider the situation where you are recording a one-hour
session with random 10% packet loss. User agents in the live session
will play this fine, and users may not even notice the 10% packet
loss (in the audio portion; the video might be a mess). But if you
don't somehow represent these lost packets in the recorded file, the
whole thing will play back in 54 minutes instead of 60.
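
(To put numbers on that: assuming 20 ms audio packets, an hour is
180,000 packets; with 10% loss you record 162,000 of them, and
playing those back-to-back gives 162,000 x 20 ms = 3240 s, which is
54 minutes.)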

-SteveK



