<html><head><meta http-equiv="Content-Type" content="text/html; charset=us-ascii"><meta name="Generator" content="Microsoft Word 15 (filtered medium)"><style><!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
margin-bottom:.0001pt;
font-size:11.0pt;
font-family:"Calibri",sans-serif;}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:#0563C1;
text-decoration:underline;}
a:visited, span.MsoHyperlinkFollowed
{mso-style-priority:99;
color:#954F72;
text-decoration:underline;}
span.EmailStyle17
{mso-style-type:personal-compose;
font-family:"Calibri",sans-serif;
color:windowtext;}
.MsoChpDefault
{mso-style-type:export-only;
font-family:"Calibri",sans-serif;}
@page WordSection1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.WordSection1
{page:WordSection1;}
--></style></head><body lang="EN-US" link="#0563C1" vlink="#954F72"><div class="WordSection1"><p class="MsoNormal">Hello asterisk-dev list,</p><p class="MsoNormal"> </p>
<p class="MsoNormal">I am working on an RTP proxy that essentially takes an MP4 video stream and turns it into a SIP endpoint.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">To start, I hacked up Ekiga to use a text file containing a GStreamer pipeline definition as its video and audio source, demuxing the video and audio and feeding them into the SIP call.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">I then modified chan_rtp.c to send both the video and audio streams, which is currently working: I can use GStreamer to receive the UDP streams and play back the audio and video.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">Now I want to get rid of Ekiga and have chan_rtp also listen for incoming audio and video UDP streams to feed into the call. I have tried adding the source ports to the channel, but the sockets never actually get opened and put into a listening state.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">Looking at the other channel drivers, I can see where sockets are opened manually, but I would rather use the RTP engine.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">Could someone point me to where a channel defines an RTP address/port using ast_rtp_engine and opens the listening socket, or at least help me identify the API calls that make that happen?</p><p class="MsoNormal"> </p>
<p class="MsoNormal">I think I am close, but I am missing something.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">I have defined the video and audio channels, and I can call into the extension and stream the call to my video wall.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">Do I need to define a separate pair of channels for receiving RTP, and what do I call once the local address is set so that the engine will actually start receiving the RTP data?</p><p class="MsoNormal"> </p>
<p class="MsoNormal">I tried setting ast_rtp_instance_set_local_address on the channels I am sending on, but that doesn't open the actual sockets.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">Any help would be greatly appreciated.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">Thank you.</p><p class="MsoNormal"> </p>
<p class="MsoNormal">Michael</p><p class="MsoNormal"> </p></div></body></html>
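P.S. In case it helps clarify the question, here is roughly the flow I am expecting to need, as a sketch only (untested, and exactly the part I am unsure about). My reading of the stock res_rtp_asterisk engine is that the local socket is created and bound inside ast_rtp_instance_new() from the ast_sockaddr you pass in, and that ast_rtp_instance_set_local_address() afterwards only records the address without (re)binding anything; the function names below are from rtp_engine.h and channel.h, but the overall flow is my guess:

```c
#include "asterisk.h"
#include "asterisk/channel.h"
#include "asterisk/netsock2.h"
#include "asterisk/rtp_engine.h"
#include "asterisk/sched.h"

static struct ast_rtp_instance *open_listening_rtp(struct ast_channel *chan,
	struct ast_sched_context *sched, const char *bind_addr, int fd_slot)
{
	struct ast_sockaddr addr;
	struct ast_rtp_instance *rtp;

	/* e.g. bind_addr = "0.0.0.0:5004"; ast_sockaddr_parse() returns
	 * nonzero on success. */
	if (!ast_sockaddr_parse(&addr, bind_addr, 0)) {
		return NULL;
	}

	/* "asterisk" selects the stock res_rtp_asterisk engine; my
	 * understanding is the UDP socket gets created and bound in here,
	 * which would be why set_local_address() later opens nothing. */
	rtp = ast_rtp_instance_new("asterisk", sched, &addr, NULL);
	if (!rtp) {
		return NULL;
	}

	/* Hand the RTP socket's fd to the channel core so it polls it and
	 * invokes the channel driver's read callback (slot 0 for audio,
	 * a separate instance/slot for video, as chan_rtp does for sending). */
	ast_channel_set_fd(chan, fd_slot, ast_rtp_instance_fd(rtp, 0));

	/* Tell the engine to start processing media on this instance. */
	ast_rtp_instance_activate(rtp);

	return rtp;
}

/* In the channel driver's read callback, frames would then come from:
 *   struct ast_frame *f = ast_rtp_instance_read(rtp, 0);
 */
```

Is that the right shape, or am I off in the weeds? (One thing I am also unsure about: whether the strictrtp option in rtp.conf would drop packets from a source the instance has not learned yet.)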