<div dir="ltr"><br><div class="gmail_extra"><br><br><div class="gmail_quote">On Fri, Apr 11, 2014 at 3:38 PM, Matthew Jordan <span dir="ltr"><<a href="mailto:mjordan@digium.com" target="_blank">mjordan@digium.com</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr"><div><div class="h5"><br><div class="gmail_extra"></div><div class="gmail_extra"><br><div class="gmail_quote">
On Fri, Apr 4, 2014 at 10:00 AM, Ben Langfeld <span dir="ltr"><<a href="mailto:ben@langfeld.me" target="_blank">ben@langfeld.me</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex"><div dir="ltr">This is very much functionality that should be in Asterisk, not only via ARI, and which is present in pretty much every other IVR platform (see FS' <a href="http://wiki.freeswitch.org/wiki/Mod_http_cache" target="_blank">http://wiki.freeswitch.org/wiki/Mod_http_cache</a>). There are obvious caching issues to be considered, but I'd love to see this present and accessible via all interfaces.</div>
<div><div>
<div class="gmail_extra"><br><div class="gmail_quote">On 4 April 2014 10:15, Ben Merrills <span dir="ltr"><<a href="mailto:b.merrills@mersontech.co.uk" target="_blank">b.merrills@mersontech.co.uk</a>></span> wrote:<br>
<blockquote class="gmail_quote" style="margin:0px 0px 0px 0.8ex;border-left:1px solid rgb(204,204,204);padding-left:1ex">Hi All/Dan,<br>
<br>
I have brought this up myself once or twice, and having the ability to play audio from a remote source (an additional URI scheme in Play) would be of great benefit. Here are some general reasons I believe this is important.<br>
<br>
1. It allows quick integration of TTS into ARI without having to wait for the Generic Speech API to be adopted. Being able to play a remote wav would allow a TTS engine (say, a free one like <a href="http://tts-api.com" target="_blank">http://tts-api.com</a>, or a local instance that can generate a wav or mp3) to play audio directly into the call. As ARI has no Exec or other TTS integration yet, this would help bridge the gap.<br>
<br>
2. It allows remote audio to be played. Since a Stasis application has no requirement to be hosted on, or locally to, the Asterisk instance, getting audio files onto the Asterisk server is a problem: they either have to be copied there in advance or already exist there. Now, I did mention a while back that having a set of ARI features to allow upload and download of files would be very useful; again, having a remote audio scheme for Play helps to bridge this gap once more (a rough sketch of what such a call could look like follows below).<br>
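<br>
A rough sketch of what such a remote-play call could look like (Python, using the requests library). Everything here is hypothetical: the http media scheme for /play doesn't exist in ARI today, and the TTS URL, ARI base URL and credentials are placeholders.<br>
<pre>
# Hypothetical only: the remote-URI media scheme and the TTS service URL are
# illustrative; this just shows the flow being asked for in points 1 and 2.
import urllib.parse
import requests

ARI = "http://localhost:8088/ari"      # assumed ARI base URL
AUTH = ("ariuser", "arisecret")        # assumed ARI credentials

def say(channel_id, text):
    # A TTS service (remote, or a local instance) that returns a WAV for the text.
    wav_url = "http://tts.example.com/speak.wav?text=" + urllib.parse.quote(text)
    # Proposed: let the media parameter accept a remote http(s) URI.
    resp = requests.post(
        ARI + "/channels/" + channel_id + "/play",
        params={"media": wav_url},
        auth=AUTH,
    )
    resp.raise_for_status()
</pre>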
<br>
Those are my thoughts. Hope they're useful, coming from another person actively using ARI to write applications :)<br>
<div><div><br>
<br>
-- Dan Wrote:<br>
Hello All,<br>
<br>
I was talking to a few people yesterday about how I'd like to be able to play media from an external source into Asterisk using ARI - say, when adding someone to a bridge, having the prompts/hold music come from an external source.<br>
<br>
I know Paul B talked about this a month or so ago - <a href="http://lists.digium.com/pipermail/asterisk-app-dev/2014-March/000408.html" target="_blank">http://lists.digium.com/pipermail/asterisk-app-dev/2014-March/000408.html</a><br>
<br>
That ended up being a technical discussion about local channels, etc.; I'd like to get to the point where we find out whether this is something people want, so the Asterisk team can decide if it's worth fitting into their timescales - hence the new topic - sorry if anyone disagrees!<br>
<br>
For me, external source = HTTP(S)<br>
<br>
A really basic example of what I mean: the Absolute Radio MP3 stream (they have many other stream codec types - FLAC being the best they offer):<br>
<a href="http://network.absoluteradio.co.uk/core/audio/mp3/live.pls?service=vrbb" target="_blank">http://network.absoluteradio.co.uk/core/audio/mp3/live.pls?service=vrbb</a><br>
Obviously there may be licensing issues with *that* stream, but you get where I'm coming from - and there are issues when it comes to different codecs.<br>
Going a little further, it may not be a stream at all; it may just be a file over HTTP. Essentially, I wouldn't want to have to put files onto the same filesystem as Asterisk - my ARI application may live outside of Asterisk itself.<br>
What do people think about this?<br>
Dan<br>
<br>
</div></div><br></blockquote></div></div></div></div></blockquote></div><br></div></div></div><div class="gmail_extra">There are actually two different use cases in here, both of which are worth pursuing/discussing.<br>
<br></div><div class="gmail_extra">
The first is being able to specify a remote resource to play to a channel/bridge via a URI, e.g.:<br><br></div><div class="gmail_extra">POST /channels/1234/play?media=http:%2F%2Fmyawesomeserver%2Fmonkeys.wav<br><br></div>
<div class="gmail_extra">You would cURL that down to the local filesystem and play it back. As Ben mentioned, ideally you would also cache the result so that a subsequent request to that URL simply plays the media from the local file system. This use case, by itself, would be an excellent addition to ARI/Asterisk.<br>
<br></div><div class="gmail_extra">The second is a bit more complex: you have a remote system that is constantly streaming media and you want to pipe that media to an ARI resource. Since this is a constant stream of media with no well defined beginning/end, this is a bit more involved then simply pulling a file down and playing it. This could be an RTP stream, but it could also be something else - although specifying it as an RTP stream is a good starting point.<br>
<br></div><div class="gmail_extra">This use case is a bit more complex: the /play operation essentially doesn't have a well defined 'file' to stream to a resource. What's more, you can't rewind/fastforward/restart a stream. Let's say we try to view this as an operation similar to MoH, where you have the ability to start/stop the media to the channel but it is essentially just an endless stream of media. In that case, we could extend the MoH operation to specify a remote stream to play as opposed to a local stream. If it's an RTP stream, we'd have to specify how you want to receive that stream - which means under the hood, you're probably creating something similar to a channel. This might look something like:<br>
<br>POST /channels/1234/moh?mohClass=remote&format=ulaw&src=0%2E0%2E0%2E0%3A10000<br><br></div><div class="gmail_extra">That is, we expect a media stream to arrive in the ulaw format, and we're going to attempt to read it on port 10000 (binding to all addresses). There are a lot of implementation details to make something like this work - we have to have something reading from that address, turning it into frames, and then distributing them to all channels that want that stream (you wouldn't want this to be tied to the single channel that made the MoH request to that class/address - you'd want to be able to share the stream). That implies some form of bridge to distribute the media to all channels that want it. Implicit behaviour like that is one thing I'd like to avoid.<br>
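<br>For illustration, a client invoking such an extended MoH operation might look like the sketch below (Python, using the requests library). The mohClass value and the format/src parameters are hypothetical extensions - they are not part of ARI today - and the ARI base URL and credentials are placeholders.<br>
<pre>
import requests

ARI = "http://localhost:8088/ari"      # assumed ARI base URL
AUTH = ("ariuser", "arisecret")        # assumed ARI credentials

def start_remote_moh(channel_id):
    # Hypothetical: ask Asterisk to treat an inbound ulaw stream on port 10000
    # (bound to all addresses) as the music-on-hold source for this channel.
    resp = requests.post(
        ARI + "/channels/" + channel_id + "/moh",
        params={
            "mohClass": "remote",      # hypothetical class
            "format": "ulaw",          # hypothetical parameter
            "src": "0.0.0.0:10000",    # hypothetical parameter
        },
        auth=AUTH,
    )
    resp.raise_for_status()
</pre>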
</div><div class="gmail_extra"><br>Another way to view this - and the way I'd prefer to view it - is that we're really creating another path of communication from Asterisk to some remote media stream. Yes, that path of communication is potentially one-way - but it is a constant, never-ending stream of media coming from an external source, and that is really the point of a channel in Asterisk. This use case thus feels like it is better served by a dedicated channel of some sort - special purpose in the same way that a snoop channel is special purpose. The standard way of creating channels can be used to create such a media stream channel:<br>
<br></div><div class="gmail_extra"></div><div class="gmail_extra">POST /channels/1234?endpoint=RTPStream%2F0%2E0%2E0%2E0%3A10000&format=ulaw&app=myStasisApp<br><br></div><div class="gmail_extra">Once you have the remote streaming channel, you can do whatever you want with it. Make it the announcer in a holding bridge. Create a snoop channel, put both channels in a bridge together, and whisper music to a channel. Put the stream in a mixing bridge with however many channels you want. Basically, it's up to you to do what you want with it. And there's nothing that says that this has to be an RTP Stream - you could really use ARI as the signalling mechanism to set up an RTP stream with anything - but going down that road, you will eventually have to handle a full SDP in the JSON body, which is quite a chunk of work. I'd punt that down the road for now, as that requires a bit more thought.<br>
<br></div><div class="gmail_extra">The second option feels more flexible/powerful, and it fits in with the model of bridges/channels that Asterisk/ARI uses.<br><br>Thoughts?<span class="HOEnZb"><font color="#888888"><br>
</font></span></div><span class="HOEnZb"><font color="#888888"><div class="gmail_extra"><br clear="all">
<br>-- <br><div dir="ltr"><div>Matthew Jordan<br></div><div>Digium, Inc. | Engineering Manager</div><div>445 Jan Davis Drive NW - Huntsville, AL 35806 - USA</div><div>Check us out at: <a href="http://digium.com" target="_blank">http://digium.com</a> & <a href="http://asterisk.org" target="_blank">http://asterisk.org</a></div>
</div>
</div></font></span></div>
<br></blockquote></div><br></div><div class="gmail_extra"><br></div><div class="gmail_extra">It seems sensible to me - I think that covers everything that I would want as an end user for the time being.</div></div>