[asterisk-dev] AST_FRAME_DIGITAL

Sergio Garcia Murillo sergio.garcia at fontventa.com
Sun Sep 9 18:50:14 CDT 2007


----- Original Message ----- 
From: "Tilghman Lesher" <tilghman at mail.jeffandtilghman.com>
To: "Asterisk Developers Mailing List" <asterisk-dev at lists.digium.com>
Sent: Sunday, September 09, 2007 9:32 PM
Subject: Re: [asterisk-dev] AST_FRAME_DIGITAL


> On Sunday 09 September 2007 06:58:08 Sergio Garcia Murillo wrote:
> > ----- Original Message -----
> > From: "Tilghman Lesher" <tilghman at mail.jeffandtilghman.com>
> > To: "Asterisk Developers Mailing List" <asterisk-dev at lists.digium.com>
> > Sent: Saturday, September 08, 2007 11:46 PM
> > Subject: Re: [asterisk-dev] AST_FRAME_DIGITAL
> >
> > > On Saturday 08 September 2007 10:28:06 Sergio Garcia Murillo wrote:
> > > > In fact the problem is when you want to transmit the data from a
> > > > channel to an application. The real case we have to deal with is
> > > > h324m (i.e. 3G video calls). The call is received by the ISDN
> > > > channel (chan_zap in Klaus's case), but the content is an h223
> > > > multiplexed stream carrying h245 negotiation data, h263 video and
> > > > AMR audio.
> > > > So the problem is: how do we implement a pseudo channel on top
> > > > of that channel?  We could have modified chan_zap to decode it and
> > > > output the demuxed frames directly to Asterisk. The problem is
> > > > that, apart from being unnecessarily difficult, I would have to
> > > > patch every ISDN channel driver, not only chan_zap.  So what I
> > > > decided to do is implement an application that receives the
> > > > incoming data and demultiplexes it into a new local pseudo channel.
> > > > The problem is that the data between the channel and the
> > > > application is neither ALAW nor ULAW, nor any other Asterisk voice
> > > > format, and any conversion or transformation will screw up the
> > > > data.
> > >
> > > Excellent.  So it is now clear that you should be using
> > > AST_FRAME_VIDEO for the data, not an opaque data type.
> >
> > Let me try to explain myself again, because it seems that you have not
> > understood the issue yet.
> > The data is NOT video; it is h223 data, which is a multiplexed stream
> > that carries audio, video and h245 control data simultaneously. Once it
> > is processed by the application, you can extract video frames and audio
> > frames which can be correctly handled by Asterisk.
> > So if you don't want to create an opaque data type, it would be great
> > if you created its proper type, AST_FRAME_DIGITAL, and its proper
> > format, AST_FORMAT_H223.
> > It would also be great if you included other digital formats, for
> > example AST_FORMAT_H320 (for ISDN conferencing).
>
> Okay, so it's clear that you want to be able to demux the data in one
> place, to have no code duplication.  I would encourage you to create a
> common API in a resource module (perhaps res_demux) and call that API
> from every channel driver that needs to demux the stream into multiples
> and send the data through the core as each individual frame type (voice,
> video, and control).  Again, with a native bridge, you could short
> circuit this step and avoid the demux, but you should follow the
> Asterisk model of separating the streams in the general case, such that
> other channels can interpret the data successfully.

I'll take a look at what can be done in a resource module; I don't know
much about them yet.
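
If I understand the suggestion correctly, the resource module would expose
something roughly like the following. This is only a sketch to make the idea
concrete; none of these names (ast_demux_new, ast_demux_feed, the callback)
exist anywhere yet:

#include "asterisk/channel.h"
#include "asterisk/frame.h"

/* Hypothetical res_demux API -- nothing below exists today. */
struct ast_demux;              /* opaque demuxer state, one per call leg */

/* Called once per demultiplexed frame; 'f' is an ordinary
 * AST_FRAME_VOICE, AST_FRAME_VIDEO or AST_FRAME_CONTROL frame. */
typedef void (*ast_demux_frame_cb)(struct ast_channel *chan,
                                   struct ast_frame *f, void *priv);

struct ast_demux *ast_demux_new(struct ast_channel *chan,
                                ast_demux_frame_cb cb, void *priv);

/* Feed raw h223 octets read from the B channel into the demuxer;
 * the callback fires for every complete audio/video/control frame. */
int ast_demux_feed(struct ast_demux *d, const unsigned char *buf, int len);

void ast_demux_destroy(struct ast_demux *d);

/* A channel driver would then only need to hand each demuxed frame
 * to the core as usual: */
static void demux_frame_ready(struct ast_channel *chan,
                              struct ast_frame *f, void *priv)
{
        ast_queue_frame(chan, f);
}

That way each ISDN channel driver would only need a couple of calls into the
resource module instead of carrying its own h223 code.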
A good point for including this new type is that you could move the
multiplexed data from one Asterisk server to another (via TDMoE, for
example). The processes involved in a video call (multiplexing,
demultiplexing, transcoding, and transrating) are quite CPU intensive, so
it is probably a good idea (also for HA and load balancing) to be able to
distribute a small number of calls to each of many different Asterisk
servers instead of having one huge server with all the hardware.
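
And just to make concrete what I was asking for earlier in the thread, the
new type and format could be declared something like this (again only a
sketch; neither constant exists in Asterisk today and the values are
placeholders):

#include "asterisk/channel.h"
#include "asterisk/frame.h"

/* Hypothetical additions to include/asterisk/frame.h */
#define AST_FRAME_DIGITAL  19          /* opaque digital bitstream  */
#define AST_FORMAT_H223    (1 << 25)   /* h223 multiplex (3G-324M)  */

/* A channel driver that does not demux locally could then pass the
 * raw multiplex through untouched, e.g. towards another box over
 * TDMoE: */
static int queue_h223(struct ast_channel *chan, unsigned char *buf, int len)
{
        struct ast_frame f = { 0, };

        f.frametype = AST_FRAME_DIGITAL;
        f.subclass  = AST_FORMAT_H223;
        f.data      = buf;      /* raw h223 octets from the B channel */
        f.datalen   = len;
        f.src       = "chan_zap";

        return ast_queue_frame(chan, &f);
}

On the receiving side the frames would arrive untouched, since the core only
runs the codec translators on voice frames, so the demux application could
simply check for frametype == AST_FRAME_DIGITAL on ast_read() and feed the
payload into the h223 demultiplexer.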

Best regards
Sergio



