[Asterisk-Dev] How to implement an audio delay?
Tony Mountifield
tony at softins.clara.co.uk
Thu Jan 20 02:10:41 MST 2005
I posted this to -users the other day, but got no responses, probably
because it is not just a "using" question but may involve some real
development too. So I figured people here might be more inclined and
able to answer.
This question is directed towards those who are familiar with the inner
workings of the Asterisk code. I'm quite at home hacking on the source
code, and have become familiar with certain parts of Asterisk's
operation. I'm looking for some advice on the most fruitful avenues to
explore in order to achieve a particular application I need: either in
the source code or in AGI (with which I'm not so familiar).
The requirement is to put a several-second delay in the audio path from
one channel to another. This would naturally be in a situation where
communication is one-way. I would envisage reading audio frames into a
ring buffer of the required length, and writing them out from the other
end of the buffer.
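As a minimal sketch of that ring-buffer idea (plain C, not Asterisk API; the names, the 8 kHz sample rate, and the fixed 5-second length are all assumptions for illustration), each incoming sample is written into a circular buffer and the sample from one buffer-length ago is read back out, giving silence until the buffer fills once:

```c
#include <stddef.h>
#include <string.h>

/* 5 seconds of delay at 8000 samples/sec (assumed rate) */
#define DELAY_SAMPLES (8000 * 5)

typedef struct {
    short buf[DELAY_SAMPLES]; /* circular buffer of signed linear samples */
    size_t pos;               /* next slot to read from, then overwrite */
} delay_line;

static void delay_init(delay_line *d) {
    /* zero-filled buffer => the first DELAY_SAMPLES outputs are silence */
    memset(d, 0, sizeof(*d));
}

/* Push one sample in; return the sample from DELAY_SAMPLES calls ago. */
static short delay_process(delay_line *d, short in) {
    short out = d->buf[d->pos]; /* oldest sample */
    d->buf[d->pos] = in;        /* overwrite it with the newest */
    d->pos = (d->pos + 1) % DELAY_SAMPLES;
    return out;
}
```

In Asterisk itself the natural unit would be a frame rather than a single sample, so a real implementation would likely queue whole frames read from one channel and write them to the other once the desired delay has elapsed, but the buffering logic is the same.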
In the first instance, the link would be between two MeetMe conferences,
where the audio from the "master" conference (in which any participant
can speak) is fed through this delay channel and into a "slave" conference
(where the participants just listen to it). The delay is required in
order to approximately synchronise the audio with other things happening
outside of Asterisk.
Any advice on good ways to approach this would be much appreciated!
Cheers
Tony
--
Tony Mountifield
Work: tony at softins.co.uk - http://www.softins.co.uk
Play: tony at mountifield.org - http://tony.mountifield.org