<p>Joshua Colp has uploaded this change for <strong>review</strong>.</p><p><a href="https://gerrit.asterisk.org/c/asterisk/+/12869">View Change</a></p><pre style="font-family: monospace,monospace; white-space: pre-wrap;">func_jitterbuffer: Add audio/video sync support.<br><br>This change adds support to the JITTERBUFFER dialplan function<br>for audio and video synchronization. When enabled the RTCP SR<br>report is used to produce an NTP timestamp for both the audio and<br>video streams. Using this information the video frames are queued<br>until their NTP timestamp is equal to or behind the NTP timestamp<br>of the audio. The audio jitterbuffer acts as the leader deciding<br>when to shrink/grow the jitterbuffer when adaptive is in use. For<br>both adaptive and fixed the video buffer follows the size of the<br>audio jitterbuffer.<br><br>ASTERISK-28533<br><br>Change-Id: I3fd75160426465e6d46bb2e198c07b9d314a4492<br>---<br>A doc/CHANGES-staging/func_jitterbuffer_video.txt<br>M funcs/func_jitterbuffer.c<br>M include/asterisk/abstract_jb.h<br>M main/abstract_jb.c<br>M res/res_rtp_asterisk.c<br>5 files changed, 213 insertions(+), 3 deletions(-)<br><br></pre><pre style="font-family: monospace,monospace; white-space: pre-wrap;">git pull ssh://gerrit.asterisk.org:29418/asterisk refs/changes/69/12869/1</pre><pre style="font-family: monospace,monospace; white-space: pre-wrap;"><span>diff --git a/doc/CHANGES-staging/func_jitterbuffer_video.txt b/doc/CHANGES-staging/func_jitterbuffer_video.txt</span><br><span>new file mode 100644</span><br><span>index 0000000..550650d</span><br><span>--- /dev/null</span><br><span>+++ b/doc/CHANGES-staging/func_jitterbuffer_video.txt</span><br><span>@@ -0,0 +1,6 @@</span><br><span style="color: hsl(120, 100%, 40%);">+Subject: func_jitterbuffer</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+The JITTERBUFFER dialplan function now has an option to enable video support.</span><br><span 
style="color: hsl(120, 100%, 40%);">+When enabled and used with a compatible channel driver (chan_sip, chan_pjsip)</span><br><span style="color: hsl(120, 100%, 40%);">+the video is buffered according to the size of the audio jitterbuffer and is</span><br><span style="color: hsl(120, 100%, 40%);">+synchronized to the audio.</span><br><span>diff --git a/funcs/func_jitterbuffer.c b/funcs/func_jitterbuffer.c</span><br><span>index e4c68ed..c8e495c 100644</span><br><span>--- a/funcs/func_jitterbuffer.c</span><br><span>+++ b/funcs/func_jitterbuffer.c</span><br><span>@@ -62,8 +62,9 @@</span><br><span>           </syntax></span><br><span>              <description></span><br><span>                  <para>Jitterbuffers are constructed in two different ways.</span><br><span style="color: hsl(0, 100%, 40%);">-                        The first always take three arguments: <replaceable>max_size</replaceable>,</span><br><span style="color: hsl(0, 100%, 40%);">-                 <replaceable>resync_threshold</replaceable>, and <replaceable>target_extra</replaceable>.</span><br><span style="color: hsl(120, 100%, 40%);">+                     The first always take four arguments: <replaceable>max_size</replaceable>,</span><br><span style="color: hsl(120, 100%, 40%);">+                        <replaceable>resync_threshold</replaceable>, <replaceable>target_extra</replaceable>,</span><br><span style="color: hsl(120, 100%, 40%);">+                 and <replaceable>video</replaceable>.</span><br><span>                    Alternatively, a single argument of <literal>default</literal> can be provided,</span><br><span>                  which will construct the default jitterbuffer for the given</span><br><span>                  <replaceable>jitterbuffer type</replaceable>.</para></span><br><span>@@ -76,12 +77,17 @@</span><br><span>                         <para>target_extra: This option only affects the adaptive jitterbuffer. 
It represents</span><br><span>                  the amount time in milliseconds by which the new jitter buffer will pad its size.</span><br><span>                    Defaults to 40ms.</para></span><br><span style="color: hsl(120, 100%, 40%);">+                        <para>video: This option enables video synchronization with the audio stream. It can be</span><br><span style="color: hsl(120, 100%, 40%);">+                 turned on and off. Defaults to off.</para></span><br><span>                     <example title="Fixed with defaults" language="text"></span><br><span>                      exten => 1,1,Set(JITTERBUFFER(fixed)=default)</span><br><span>                     </example></span><br><span>                     <example title="Fixed with 200ms max size" language="text"></span><br><span>                        exten => 1,1,Set(JITTERBUFFER(fixed)=200)</span><br><span>                         </example></span><br><span style="color: hsl(120, 100%, 40%);">+                      <example title="Fixed with 200ms max size and video support" language="text"></span><br><span style="color: hsl(120, 100%, 40%);">+                       exten => 1,1,Set(JITTERBUFFER(fixed)=200,,,yes)</span><br><span style="color: hsl(120, 100%, 40%);">+                    </example></span><br><span>                     <example title="Fixed with 200ms max size, resync threshold 1500" language="text"></span><br><span>                         exten => 1,1,Set(JITTERBUFFER(fixed)=200,1500)</span><br><span>                    </example></span><br><span>@@ -91,6 +97,9 @@</span><br><span>                         <example title="Adaptive with 200ms max size, 60ms target extra" language="text"></span><br><span>                  exten => 1,1,Set(JITTERBUFFER(adaptive)=200,,60)</span><br><span>                  </example></span><br><span style="color: hsl(120, 100%, 40%);">+                      <example title="Adaptive with 200ms max size and video support" language="text"></span><br><span 
style="color: hsl(120, 100%, 40%);">+                    exten => 1,1,Set(JITTERBUFFER(adaptive)=200,,,yes)</span><br><span style="color: hsl(120, 100%, 40%);">+                 </example></span><br><span>                     <example title="Set a fixed jitterbuffer with defaults; then remove it" language="text"></span><br><span>                   exten => 1,1,Set(JITTERBUFFER(fixed)=default)</span><br><span>                     exten => 1,n,Set(JITTERBUFFER(disabled)=)</span><br><span>@@ -133,6 +142,7 @@</span><br><span>                   AST_APP_ARG(max_size);</span><br><span>                       AST_APP_ARG(resync_threshold);</span><br><span>                       AST_APP_ARG(target_extra);</span><br><span style="color: hsl(120, 100%, 40%);">+                    AST_APP_ARG(video);</span><br><span>          );</span><br><span> </span><br><span>               AST_STANDARD_APP_ARGS(args, parse);</span><br><span>@@ -151,6 +161,11 @@</span><br><span>                           "jbtargetextra",</span><br><span>                           args.target_extra);</span><br><span>          }</span><br><span style="color: hsl(120, 100%, 40%);">+             if (!ast_strlen_zero(args.video)) {</span><br><span style="color: hsl(120, 100%, 40%);">+                   res |= ast_jb_read_conf(&jb_conf,</span><br><span style="color: hsl(120, 100%, 40%);">+                         "jbvideo",</span><br><span style="color: hsl(120, 100%, 40%);">+                          args.video);</span><br><span style="color: hsl(120, 100%, 40%);">+          }</span><br><span>            if (res) {</span><br><span>                   ast_log(LOG_WARNING, "Invalid jitterbuffer parameters %s\n", value);</span><br><span>               }</span><br><span>diff --git a/include/asterisk/abstract_jb.h b/include/asterisk/abstract_jb.h</span><br><span>index b300d12..91ddaab 100644</span><br><span>--- a/include/asterisk/abstract_jb.h</span><br><span>+++ b/include/asterisk/abstract_jb.h</span><br><span>@@ -44,7 
+44,8 @@</span><br><span> enum {</span><br><span>         AST_JB_ENABLED = (1 << 0),</span><br><span>     AST_JB_FORCED =  (1 << 1),</span><br><span style="color: hsl(0, 100%, 40%);">-        AST_JB_LOG =     (1 << 2)</span><br><span style="color: hsl(120, 100%, 40%);">+       AST_JB_LOG =     (1 << 2),</span><br><span style="color: hsl(120, 100%, 40%);">+      AST_JB_VIDEO =   (1 << 3)</span><br><span> };</span><br><span> </span><br><span> enum ast_jb_type {</span><br><span>@@ -89,6 +90,7 @@</span><br><span> #define AST_JB_CONF_TARGET_EXTRA "targetextra"</span><br><span> #define AST_JB_CONF_IMPL "impl"</span><br><span> #define AST_JB_CONF_LOG "log"</span><br><span style="color: hsl(120, 100%, 40%);">+#define AST_JB_CONF_VIDEO "video"</span><br><span> </span><br><span> /* Hooks for the abstract jb implementation */</span><br><span> /*! \brief Create */</span><br><span>diff --git a/main/abstract_jb.c b/main/abstract_jb.c</span><br><span>index b375739..15070a6 100644</span><br><span>--- a/main/abstract_jb.c</span><br><span>+++ b/main/abstract_jb.c</span><br><span>@@ -41,6 +41,8 @@</span><br><span> #include "asterisk/utils.h"</span><br><span> #include "asterisk/pbx.h"</span><br><span> #include "asterisk/timing.h"</span><br><span style="color: hsl(120, 100%, 40%);">+#include "asterisk/rtp_engine.h"</span><br><span style="color: hsl(120, 100%, 40%);">+#include "asterisk/format_cache.h"</span><br><span> </span><br><span> #include "asterisk/abstract_jb.h"</span><br><span> #include "fixedjitterbuf.h"</span><br><span>@@ -53,6 +55,9 @@</span><br><span>      JB_CREATED =              (1 << 2)</span><br><span> };</span><br><span> </span><br><span style="color: hsl(120, 100%, 40%);">+/*! 
The maximum size we allow the early frame buffer to get */</span><br><span style="color: hsl(120, 100%, 40%);">+#define MAXIMUM_EARLY_FRAME_COUNT 200</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span> </span><br><span> /* Implementation functions */</span><br><span> /* fixed */</span><br><span>@@ -568,6 +573,8 @@</span><br><span>               }</span><br><span>    } else if (!strcasecmp(name, AST_JB_CONF_LOG)) {</span><br><span>             ast_set2_flag(conf, ast_true(value), AST_JB_LOG);</span><br><span style="color: hsl(120, 100%, 40%);">+     } else if (!strcasecmp(name, AST_JB_CONF_VIDEO)) {</span><br><span style="color: hsl(120, 100%, 40%);">+            ast_set2_flag(conf, ast_true(value), AST_JB_VIDEO);</span><br><span>  } else {</span><br><span>             return -1;</span><br><span>   }</span><br><span>@@ -832,6 +839,11 @@</span><br><span> #define DEFAULT_RESYNC  1000</span><br><span> #define DEFAULT_TYPE AST_JB_FIXED</span><br><span> </span><br><span style="color: hsl(120, 100%, 40%);">+struct jb_stream_sync {</span><br><span style="color: hsl(120, 100%, 40%);">+    unsigned int timestamp;</span><br><span style="color: hsl(120, 100%, 40%);">+       struct timeval ntp;</span><br><span style="color: hsl(120, 100%, 40%);">+};</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span> struct jb_framedata {</span><br><span>   const struct ast_jb_impl *jb_impl;</span><br><span>   struct ast_jb_conf jb_conf;</span><br><span>@@ -841,11 +853,21 @@</span><br><span>  int timer_interval; /* ms between deliveries */</span><br><span>      int timer_fd;</span><br><span>        int first;</span><br><span style="color: hsl(120, 100%, 40%);">+    int audio_stream_id;</span><br><span style="color: hsl(120, 100%, 40%);">+  struct jb_stream_sync audio_stream_sync;</span><br><span style="color: hsl(120, 100%, 40%);">+      int video_stream_id;</span><br><span style="color: hsl(120, 100%, 40%);">+  struct jb_stream_sync 
video_stream_sync;</span><br><span style="color: hsl(120, 100%, 40%);">+      AST_LIST_HEAD_NOLOCK(, ast_frame) early_frames;</span><br><span style="color: hsl(120, 100%, 40%);">+       unsigned int early_frame_count;</span><br><span style="color: hsl(120, 100%, 40%);">+       struct timeval last_audio_ntp_timestamp;</span><br><span style="color: hsl(120, 100%, 40%);">+      int audio_flowing;</span><br><span>   void *jb_obj;</span><br><span> };</span><br><span> </span><br><span> static void jb_framedata_destroy(struct jb_framedata *framedata)</span><br><span> {</span><br><span style="color: hsl(120, 100%, 40%);">+      struct ast_frame *frame;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span>   if (framedata->timer) {</span><br><span>           ast_timer_close(framedata->timer);</span><br><span>                framedata->timer = NULL;</span><br><span>@@ -859,11 +881,15 @@</span><br><span>          framedata->jb_obj = NULL;</span><br><span>         }</span><br><span>    ao2_cleanup(framedata->last_format);</span><br><span style="color: hsl(120, 100%, 40%);">+       while ((frame = AST_LIST_REMOVE_HEAD(&framedata->early_frames, frame_list))) {</span><br><span style="color: hsl(120, 100%, 40%);">+         ast_frfree(frame);</span><br><span style="color: hsl(120, 100%, 40%);">+    }</span><br><span>    ast_free(framedata);</span><br><span> }</span><br><span> </span><br><span> void ast_jb_conf_default(struct ast_jb_conf *conf)</span><br><span> {</span><br><span style="color: hsl(120, 100%, 40%);">+      ast_clear_flag(conf, AST_FLAGS_ALL);</span><br><span>         conf->max_size = DEFAULT_SIZE;</span><br><span>    conf->resync_threshold = DEFAULT_RESYNC;</span><br><span>  ast_copy_string(conf->impl, "fixed", sizeof(conf->impl));</span><br><span>@@ -886,6 +912,51 @@</span><br><span>     jb_framedata_destroy((struct jb_framedata *) framedata);</span><br><span> }</span><br><span> </span><br><span style="color: hsl(120, 100%, 40%);">+static int 
rtp_get_rate(struct ast_format *format)</span><br><span style="color: hsl(120, 100%, 40%);">+{</span><br><span style="color: hsl(120, 100%, 40%);">+        /* For those wondering: due to a fluke in RFC publication, G.722 is advertised</span><br><span style="color: hsl(120, 100%, 40%);">+         * as having a sample rate of 8kHz, while implementations must know that its</span><br><span style="color: hsl(120, 100%, 40%);">+         * real rate is 16kHz. Seriously.</span><br><span style="color: hsl(120, 100%, 40%);">+         */</span><br><span style="color: hsl(120, 100%, 40%);">+        return (ast_format_cmp(format, ast_format_g722) == AST_FORMAT_CMP_EQUAL) ? 8000 : (int)ast_format_get_sample_rate(format);</span><br><span style="color: hsl(120, 100%, 40%);">+}</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+static struct timeval jitterbuffer_frame_get_ntp_timestamp(const struct jb_stream_sync *stream_sync, const struct ast_frame *frame)</span><br><span style="color: hsl(120, 100%, 40%);">+{</span><br><span style="color: hsl(120, 100%, 40%);">+       int timestamp_diff;</span><br><span style="color: hsl(120, 100%, 40%);">+   unsigned int rate;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+  /* It's possible for us to receive frames before we receive the information allowing</span><br><span style="color: hsl(120, 100%, 40%);">+       * us to do NTP/RTP timestamp calculations. 
Since the information isn't available we</span><br><span style="color: hsl(120, 100%, 40%);">+       * can't generate one and give an empty timestamp.</span><br><span style="color: hsl(120, 100%, 40%);">+         */</span><br><span style="color: hsl(120, 100%, 40%);">+   if (ast_tvzero(stream_sync->ntp)) {</span><br><span style="color: hsl(120, 100%, 40%);">+                return ast_tv(0, 0);</span><br><span style="color: hsl(120, 100%, 40%);">+  }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+   if (frame->frametype == AST_FRAME_VOICE) {</span><br><span style="color: hsl(120, 100%, 40%);">+         rate = rtp_get_rate(frame->subclass.format);</span><br><span style="color: hsl(120, 100%, 40%);">+       } else {</span><br><span style="color: hsl(120, 100%, 40%);">+              /* Video is special - internally we reference it as 1000 but in RTP it is actually 90000 */</span><br><span style="color: hsl(120, 100%, 40%);">+           rate = 90000;</span><br><span style="color: hsl(120, 100%, 40%);">+ }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+   /* Convert the Asterisk timestamp into an RTP timestamp, and then based on the difference we can</span><br><span style="color: hsl(120, 100%, 40%);">+       * determine how many samples are in the frame and how long has elapsed since the synchronization</span><br><span style="color: hsl(120, 100%, 40%);">+      * RTP and NTP timestamps were received giving us the NTP timestamp for this frame.</span><br><span style="color: hsl(120, 100%, 40%);">+    */</span><br><span style="color: hsl(120, 100%, 40%);">+   timestamp_diff = (frame->ts * (rtp_get_rate(frame->subclass.format) / 1000)) - stream_sync->timestamp;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+     if (timestamp_diff < 0) {</span><br><span style="color: hsl(120, 100%, 40%);">+       
   /* It's possible for us to be asked for an NTP timestamp from before our latest</span><br><span style="color: hsl(120, 100%, 40%);">+            * RTCP SR report. To handle this we subtract so we go back in time.</span><br><span style="color: hsl(120, 100%, 40%);">+           */</span><br><span style="color: hsl(120, 100%, 40%);">+           return ast_tvsub(stream_sync->ntp, ast_samp2tv(abs(timestamp_diff), rate));</span><br><span style="color: hsl(120, 100%, 40%);">+        } else {</span><br><span style="color: hsl(120, 100%, 40%);">+              return ast_tvadd(stream_sync->ntp, ast_samp2tv(timestamp_diff, rate));</span><br><span style="color: hsl(120, 100%, 40%);">+     }</span><br><span style="color: hsl(120, 100%, 40%);">+}</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span> static struct ast_frame *hook_event_cb(struct ast_channel *chan, struct ast_frame *frame, enum ast_framehook_event event, void *data)</span><br><span> {</span><br><span>         struct jb_framedata *framedata = data;</span><br><span>@@ -928,6 +999,78 @@</span><br><span>                return frame;</span><br><span>        }</span><br><span> </span><br><span style="color: hsl(120, 100%, 40%);">+ if (ast_test_flag(&framedata->jb_conf, AST_JB_VIDEO)) {</span><br><span style="color: hsl(120, 100%, 40%);">+                if (frame->frametype == AST_FRAME_VOICE) {</span><br><span style="color: hsl(120, 100%, 40%);">+                 /* Store the stream identifier for the audio stream so we can associate the incoming RTCP SR</span><br><span style="color: hsl(120, 100%, 40%);">+                   * with the correct stream sync structure.</span><br><span style="color: hsl(120, 100%, 40%);">+                     */</span><br><span style="color: hsl(120, 100%, 40%);">+                   framedata->audio_stream_id = frame->stream_num;</span><br><span style="color: hsl(120, 100%, 40%);">+         } else if (frame->frametype == AST_FRAME_RTCP && frame->subclass.integer == 
AST_RTP_RTCP_SR) {</span><br><span style="color: hsl(120, 100%, 40%);">+                  struct ast_rtp_rtcp_report *rtcp_report = frame->data.ptr;</span><br><span style="color: hsl(120, 100%, 40%);">+                 struct jb_stream_sync *stream_sync = NULL;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                  /* Determine which stream this RTCP is in regards to */</span><br><span style="color: hsl(120, 100%, 40%);">+                       if (framedata->audio_stream_id == frame->stream_num) {</span><br><span style="color: hsl(120, 100%, 40%);">+                          stream_sync = &framedata->audio_stream_sync;</span><br><span style="color: hsl(120, 100%, 40%);">+                   } else if (framedata->video_stream_id == frame->stream_num) {</span><br><span style="color: hsl(120, 100%, 40%);">+                           stream_sync = &framedata->video_stream_sync;</span><br><span style="color: hsl(120, 100%, 40%);">+                   }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                   if (stream_sync) {</span><br><span style="color: hsl(120, 100%, 40%);">+                            /* Store the RTP and NTP timestamp mapping so we can derive an NTP timestamp for each frame */</span><br><span style="color: hsl(120, 100%, 40%);">+                                stream_sync->timestamp = rtcp_report->sender_information.rtp_timestamp;</span><br><span style="color: hsl(120, 100%, 40%);">+                         stream_sync->ntp = rtcp_report->sender_information.ntp_timestamp;</span><br><span style="color: hsl(120, 100%, 40%);">+                       }</span><br><span style="color: hsl(120, 100%, 40%);">+             } else if (frame->frametype == AST_FRAME_VIDEO) {</span><br><span style="color: hsl(120, 100%, 40%);">+                  /* If a video frame is late according to the audio timestamp don't stash it away, 
just return it.</span><br><span style="color: hsl(120, 100%, 40%);">+                  * If however it is ahead then we keep it until such time as the audio catches up.</span><br><span style="color: hsl(120, 100%, 40%);">+                     */</span><br><span style="color: hsl(120, 100%, 40%);">+                   struct ast_frame *jbframe;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                  framedata->video_stream_id = frame->stream_num;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                       /* If no timing information is available we can't store this away, so just let it through now */</span><br><span style="color: hsl(120, 100%, 40%);">+                  if (!ast_test_flag(frame, AST_FRFLAG_HAS_TIMING_INFO)) {</span><br><span style="color: hsl(120, 100%, 40%);">+                              ast_log(LOG_NOTICE, "Returned %d\n", frame->seqno);</span><br><span style="color: hsl(120, 100%, 40%);">+                              return frame;</span><br><span style="color: hsl(120, 100%, 40%);">+                 }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                   /* To ensure that the video starts when the audio starts we only start allowing frames through once</span><br><span style="color: hsl(120, 100%, 40%);">+                    * audio starts flowing.</span><br><span style="color: hsl(120, 100%, 40%);">+                       */</span><br><span style="color: hsl(120, 100%, 40%);">+                   if (framedata->audio_flowing) {</span><br><span style="color: hsl(120, 100%, 40%);">+                            struct timeval video_timestamp;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                             video_timestamp = 
jitterbuffer_frame_get_ntp_timestamp(&framedata->video_stream_sync, frame);</span><br><span style="color: hsl(120, 100%, 40%);">+                          if (ast_tvdiff_ms(framedata->last_audio_ntp_timestamp, video_timestamp) >= 0) {</span><br><span style="color: hsl(120, 100%, 40%);">+                                 return frame;</span><br><span style="color: hsl(120, 100%, 40%);">+                         }</span><br><span style="color: hsl(120, 100%, 40%);">+                     }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                   /* To prevent the early frame buffer from growing uncontrolled we impose a maximum count that it can</span><br><span style="color: hsl(120, 100%, 40%);">+                   * get to. If this is reached then we drop a video frame, which should cause the receiver to ask for a</span><br><span style="color: hsl(120, 100%, 40%);">+                         * new key frame.</span><br><span style="color: hsl(120, 100%, 40%);">+                      */</span><br><span style="color: hsl(120, 100%, 40%);">+                   if (framedata->early_frame_count == MAXIMUM_EARLY_FRAME_COUNT) {</span><br><span style="color: hsl(120, 100%, 40%);">+                           jbframe = AST_LIST_REMOVE_HEAD(&framedata->early_frames, frame_list);</span><br><span style="color: hsl(120, 100%, 40%);">+                          framedata->early_frame_count--;</span><br><span style="color: hsl(120, 100%, 40%);">+                            ast_frfree(jbframe);</span><br><span style="color: hsl(120, 100%, 40%);">+                  }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                   jbframe = ast_frisolate(frame);</span><br><span style="color: hsl(120, 100%, 40%);">+                       if (!jbframe) {</span><br><span style="color: hsl(120, 100%, 40%);">+                               /* If we can't isolate the 
frame the safest thing we can do is return it, even if the A/V sync</span><br><span style="color: hsl(120, 100%, 40%);">+                             * may be off.</span><br><span style="color: hsl(120, 100%, 40%);">+                                 */</span><br><span style="color: hsl(120, 100%, 40%);">+                           return frame;</span><br><span style="color: hsl(120, 100%, 40%);">+                 }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                   AST_LIST_INSERT_TAIL(&framedata->early_frames, jbframe, frame_list);</span><br><span style="color: hsl(120, 100%, 40%);">+                   framedata->early_frame_count++;</span><br><span style="color: hsl(120, 100%, 40%);">+                    return &ast_null_frame;</span><br><span style="color: hsl(120, 100%, 40%);">+           }</span><br><span style="color: hsl(120, 100%, 40%);">+     }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span>  now_tv = ast_tvnow();</span><br><span>        now = ast_tvdiff_ms(now_tv, framedata->start_tv);</span><br><span> </span><br><span>@@ -1022,6 +1165,8 @@</span><br><span>     }</span><br><span> </span><br><span>        if (frame->frametype == AST_FRAME_CONTROL) {</span><br><span style="color: hsl(120, 100%, 40%);">+               struct ast_frame *early_frame;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span>             switch(frame->subclass.integer) {</span><br><span>                 case AST_CONTROL_HOLD:</span><br><span>               case AST_CONTROL_UNHOLD:</span><br><span>@@ -1029,12 +1174,50 @@</span><br><span>           case AST_CONTROL_SRCUPDATE:</span><br><span>          case AST_CONTROL_SRCCHANGE:</span><br><span>                  framedata->jb_impl->force_resync(framedata->jb_obj);</span><br><span style="color: hsl(120, 100%, 40%);">+                 /* Since we are resyncing go ahead and clear out the video frames too */</span><br><span 
style="color: hsl(120, 100%, 40%);">+                      while ((early_frame = AST_LIST_REMOVE_HEAD(&framedata->early_frames, frame_list))) {</span><br><span style="color: hsl(120, 100%, 40%);">+                           ast_frfree(early_frame);</span><br><span style="color: hsl(120, 100%, 40%);">+                      }</span><br><span style="color: hsl(120, 100%, 40%);">+                     framedata->audio_flowing = 0;</span><br><span style="color: hsl(120, 100%, 40%);">+                      framedata->early_frame_count = 0;</span><br><span>                         break;</span><br><span>               default:</span><br><span>                     break;</span><br><span>               }</span><br><span>    }</span><br><span> </span><br><span style="color: hsl(120, 100%, 40%);">+ /* If a voice frame is being passed through see if we need to add any additional frames to it */</span><br><span style="color: hsl(120, 100%, 40%);">+      if (ast_test_flag(&framedata->jb_conf, AST_JB_VIDEO) && frame->frametype == AST_FRAME_VOICE) {</span><br><span style="color: hsl(120, 100%, 40%);">+              AST_LIST_HEAD_NOLOCK(, ast_frame) additional_frames;</span><br><span style="color: hsl(120, 100%, 40%);">+          struct ast_frame *early_frame;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+              /* We store the last NTP timestamp for the audio given to the core so that subsequents frames which</span><br><span style="color: hsl(120, 100%, 40%);">+            * are late can be passed immediately through (this will occur for video frames which are returned here)</span><br><span style="color: hsl(120, 100%, 40%);">+               */</span><br><span style="color: hsl(120, 100%, 40%);">+           framedata->last_audio_ntp_timestamp = jitterbuffer_frame_get_ntp_timestamp(&framedata->audio_stream_sync, frame);</span><br><span style="color: hsl(120, 100%, 40%);">+           framedata->audio_flowing = 
1;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+            AST_LIST_HEAD_INIT_NOLOCK(&additional_frames);</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+          AST_LIST_TRAVERSE_SAFE_BEGIN(&framedata->early_frames, early_frame, frame_list) {</span><br><span style="color: hsl(120, 100%, 40%);">+                      struct timeval early_timestamp = jitterbuffer_frame_get_ntp_timestamp(&framedata->video_stream_sync, early_frame);</span><br><span style="color: hsl(120, 100%, 40%);">+                     int diff = ast_tvdiff_ms(framedata->last_audio_ntp_timestamp, early_timestamp);</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+                  /* If this frame is from the past we need to include it with the audio frame that is going</span><br><span style="color: hsl(120, 100%, 40%);">+                     * out.</span><br><span style="color: hsl(120, 100%, 40%);">+                        */</span><br><span style="color: hsl(120, 100%, 40%);">+                   if (diff >= 0) {</span><br><span style="color: hsl(120, 100%, 40%);">+                           AST_LIST_REMOVE_CURRENT(frame_list);</span><br><span style="color: hsl(120, 100%, 40%);">+                          framedata->early_frame_count--;</span><br><span style="color: hsl(120, 100%, 40%);">+                            AST_LIST_INSERT_TAIL(&additional_frames, early_frame, frame_list);</span><br><span style="color: hsl(120, 100%, 40%);">+                        }</span><br><span style="color: hsl(120, 100%, 40%);">+             }</span><br><span style="color: hsl(120, 100%, 40%);">+             AST_LIST_TRAVERSE_SAFE_END;</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span style="color: hsl(120, 100%, 40%);">+         /* Append any additional frames we may want to include (such as video) 
*/</span><br><span style="color: hsl(120, 100%, 40%);">+             AST_LIST_NEXT(frame, frame_list) = AST_LIST_FIRST(&additional_frames);</span><br><span style="color: hsl(120, 100%, 40%);">+    }</span><br><span style="color: hsl(120, 100%, 40%);">+</span><br><span>  return frame;</span><br><span> }</span><br><span> </span><br><span>@@ -1066,6 +1249,9 @@</span><br><span>               return -1;</span><br><span>   }</span><br><span> </span><br><span style="color: hsl(120, 100%, 40%);">+ framedata->audio_stream_id = -1;</span><br><span style="color: hsl(120, 100%, 40%);">+   framedata->video_stream_id = -1;</span><br><span style="color: hsl(120, 100%, 40%);">+   AST_LIST_HEAD_INIT_NOLOCK(&framedata->early_frames);</span><br><span>  framedata->timer_fd = ast_timer_fd(framedata->timer);</span><br><span>  framedata->timer_interval = DEFAULT_TIMER_INTERVAL;</span><br><span>       ast_timer_set_rate(framedata->timer, 1000 / framedata->timer_interval);</span><br><span>diff --git a/res/res_rtp_asterisk.c b/res/res_rtp_asterisk.c</span><br><span>index 08138cb..16f57cb 100644</span><br><span>--- a/res/res_rtp_asterisk.c</span><br><span>+++ b/res/res_rtp_asterisk.c</span><br><span>@@ -6084,6 +6084,7 @@</span><br><span>                   transport_rtp->f.delivery.tv_sec = 0;</span><br><span>                     transport_rtp->f.delivery.tv_usec = 0;</span><br><span>                    transport_rtp->f.src = "RTP";</span><br><span style="color: hsl(120, 100%, 40%);">+                    transport_rtp->f.stream_num = rtp->stream_num;</span><br><span>                         f = &transport_rtp->f;</span><br><span>                        break;</span><br><span>               case AST_RTP_RTCP_RTPFB:</span><br><span></span><br></pre><p>To view, visit <a href="https://gerrit.asterisk.org/c/asterisk/+/12869">change 12869</a>. 
To unsubscribe, or for help writing mail filters, visit <a href="https://gerrit.asterisk.org/settings">settings</a>.</p>
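The core of the synchronization in this change is the NTP interpolation done by <code>jitterbuffer_frame_get_ntp_timestamp</code>: the RTCP SR supplies one (RTP timestamp, NTP wall-clock) pair per stream, and each frame's NTP time is derived from its RTP-timestamp distance to that pair, scaled by the stream's clock rate. The following is a simplified standalone sketch of that math, not the Asterisk API; the type <code>stream_sync</code> and function <code>frame_ntp_ms</code> are hypothetical names, and NTP time is modeled as plain milliseconds rather than a <code>struct timeval</code>.

```c
#include <assert.h>

/* Hypothetical simplified model: the RTP/NTP pair taken from the most
 * recent RTCP SR for one stream. */
typedef struct {
	unsigned int rtp_timestamp; /* RTP timestamp carried in the SR */
	long long ntp_ms;           /* NTP wall-clock of the SR, in milliseconds */
} stream_sync;

/* Derive a frame's NTP time from its RTP timestamp.
 * rate is the RTP clock rate (e.g. 8000 for narrowband audio, 90000 for video).
 * A signed difference naturally handles frames timestamped before the SR:
 * the result then lands earlier than the SR's NTP time. */
long long frame_ntp_ms(const stream_sync *sync, unsigned int frame_rtp, int rate)
{
	long long diff = (long long)frame_rtp - (long long)sync->rtp_timestamp;

	return sync->ntp_ms + (diff * 1000) / rate;
}
```

With both streams mapped onto the same NTP timeline this way, comparing an audio frame's NTP time against a queued video frame's NTP time is a plain subtraction, which is exactly the comparison the hook performs with <code>ast_tvdiff_ms</code>.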
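The early-frame buffer follows a simple policy: video frames that are late relative to the last audio NTP timestamp pass through immediately, frames that are ahead are queued, the queue is capped (MAXIMUM_EARLY_FRAME_COUNT in the patch, dropping the oldest frame when full), and each outgoing audio frame releases every queued video frame the audio has caught up to. A minimal standalone sketch of that policy follows; it is an illustration only, with hypothetical names (<code>early_buf</code>, <code>early_buf_offer</code>, <code>early_buf_release</code>), a tiny cap, and frames reduced to their NTP millisecond values.

```c
#include <assert.h>
#include <string.h>

#define CAP 4 /* stand-in for MAXIMUM_EARLY_FRAME_COUNT */

/* Hypothetical capped queue of "early" video frames, each represented
 * only by its derived NTP time in milliseconds. */
typedef struct {
	long long ntp_ms[CAP];
	int count;
} early_buf;

/* Returns 1 if the frame was queued, 0 if it should pass through now. */
int early_buf_offer(early_buf *b, long long frame_ntp, long long audio_ntp)
{
	if (frame_ntp <= audio_ntp) {
		return 0; /* late or on time: never hold it back */
	}
	if (b->count == CAP) {
		/* Bound memory use by dropping the oldest queued frame; in the
		 * patch this gap prompts the receiver to request a keyframe. */
		memmove(b->ntp_ms, b->ntp_ms + 1, (CAP - 1) * sizeof(b->ntp_ms[0]));
		b->count--;
	}
	b->ntp_ms[b->count++] = frame_ntp;
	return 1;
}

/* Called per outgoing audio frame: release (count) every queued video
 * frame whose NTP time the audio has now reached or passed. */
int early_buf_release(early_buf *b, long long audio_ntp)
{
	int released = 0;

	while (b->count && b->ntp_ms[0] <= audio_ntp) {
		memmove(b->ntp_ms, b->ntp_ms + 1, (b->count - 1) * sizeof(b->ntp_ms[0]));
		b->count--;
		released++;
	}
	return released;
}
```

In the patch itself the released frames are not delivered separately: they are chained onto the outgoing audio frame's <code>frame_list</code>, so the core receives the audio frame and its caught-up video frames together.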

<div style="display:none"> Gerrit-Project: asterisk </div>
<div style="display:none"> Gerrit-Branch: master </div>
<div style="display:none"> Gerrit-Change-Id: I3fd75160426465e6d46bb2e198c07b9d314a4492 </div>
<div style="display:none"> Gerrit-Change-Number: 12869 </div>
<div style="display:none"> Gerrit-PatchSet: 1 </div>
<div style="display:none"> Gerrit-Owner: Joshua Colp <jcolp@digium.com> </div>
<div style="display:none"> Gerrit-MessageType: newchange </div>