<html>
<head>
<title>Writing a Python Test</title>
</head>
<body>
<div class="wiki-content">
<h1>Writing a Python Test</h1>
<h1><a name="WritingaPythonTest-Overview"></a>Overview</h1>
<p>A test within the Asterisk Test Suite exercises some piece of specific functionality within Asterisk. Examples of such tests include:</p>
<ul>
        <li>Making sure a SIP peer can register over IPv6 with Asterisk</li>
        <li>Checking that a CDR is created with the expected entries when a call is made</li>
        <li>Verifying that a user is authenticated properly when checking their voicemail</li>
</ul>
<p>There is no end to what could be tested by the Test Suite; in general, however, each test should attempt to verify the functionality of one particular user story. Tests with too wide a scope are often brittle and prone to system-dependent failures; conversely, tests that are too narrow in scope are better handled by the Asterisk Unit Test Framework.</p>
<div class='panelMacro'><table class='infoMacro'><colgroup><col width='24'><col></colgroup><tr><td valign='top'><img src="/wiki/images/icons/emoticons/information.gif" width="16" height="16" align="absmiddle" alt="" border="0"></td><td>The following walkthrough produces a test similar to the <em>tests/skeleton_test</em>, which is included in the Asterisk Test Suite and provides a template for a Python test. You can use that test as a starting point for tests that you write.</td></tr></table></div>
<p>Developing a test can be broken down into the following steps:</p>
<ol>
        <li>Define the <a href="#WritingaPythonTest-TestLayoutandAsteriskConfiguration">Test Layout and Asterisk Configuration</a></li>
        <li>Describe the test in <a href="#WritingaPythonTest-DescribingthetestinTestConfig.yaml">Test-Config.yaml</a></li>
        <li>Write the <a href="#WritingaPythonTest-Writingruntest">run-test</a></li>
        <li><a href="#WritingaPythonTest-Runningthetest">Executing</a> the test</li>
</ol>
<h2><a name="WritingaPythonTest-Dependencies"></a>Dependencies</h2>
<p>Python tests use a number of dependencies to facilitate test development; these are enumerated below. Note that other dependencies are also needed for test development and execution; the ones listed here, however, specifically affect the Python modules.</p>
<ul>
        <li>twisted - an event-based networking framework for internet applications</li>
        <li>starpy - a set of wrappers that provides access to AMI and FastAGI</li>
</ul>
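<p>As a point of reference, here is a minimal sketch (not part of the Test Suite itself) of how the two libraries fit together: starpy's AMIFactory logs in to Asterisk's AMI over twisted and hands back a deferred. The username, secret, address, and port are placeholders for whatever your manager.conf defines, and the ping/logoff calls assume starpy's standard AMI protocol interface.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">from starpy import manager
from twisted.internet import reactor

def on_login(ami):
   # 'ami' is starpy's AMI protocol object; send a Ping action, log off, and
   # then stop the reactor so the script exits
   df = ami.ping()
   df.addCallback(lambda result: ami.logoff())
   df.addCallback(lambda result: reactor.stop())
   return df

# Placeholder credentials and address; use the values from your manager.conf
factory = manager.AMIFactory('user', 'mysecret')
factory.login('127.0.0.1', 5038).addCallback(on_login)
reactor.run()</pre>
</div></div>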
<div class='panelMacro'><table class='infoMacro'><colgroup><col width='24'><col></colgroup><tr><td valign='top'><img src="/wiki/images/icons/emoticons/information.gif" width="16" height="16" align="absmiddle" alt="" border="0"></td><td><b>Note</b><br />Please check the README.txt delivered with the Asterisk Test Suite for a full list of dependencies</td></tr></table></div>
<h2><a name="WritingaPythonTest-InternalLibraries"></a>Internal Libraries</h2>
<p>There are a number of libraries that have been developed to ease common tasks that many tests require. Of particular note is the TestCase class, which acts as a common base class for the Python tests. While you do not have to inherit from this class to have a working test, you are encouraged to use it as it provides a common model and a large amount of functionality for the Python tests.</p>
<div class='table-wrap'>
<table class='confluenceTable'><tbody>
<tr>
<th class='confluenceTh'> Library </th>
<th class='confluenceTh'> Purpose </th>
</tr>
<tr>
<td class='confluenceTd'> ami </td>
<td class='confluenceTd'> A small wrapper class around starpy's AMI manager </td>
</tr>
<tr>
<td class='confluenceTd'> asterisk </td>
<td class='confluenceTd'> A class that manages an instance of Asterisk, including the location of its configuration files and interacting with it over the CLI </td>
</tr>
<tr>
<td class='confluenceTd'> buildoptions </td>
<td class='confluenceTd'> Determines the build options for the repository of Asterisk that contains the Test Suite, and whether or not a given test can execute given the build options </td>
</tr>
<tr>
<td class='confluenceTd'> cdr </td>
<td class='confluenceTd'> A set of classes that verify CSV CDRs </td>
</tr>
<tr>
<td class='confluenceTd'> CDRTestCase </td>
<td class='confluenceTd'> A class that inherits from TestCase, used by tests that verify CDRs </td>
</tr>
<tr>
<td class='confluenceTd'> ChannelTestCondition </td>
<td class='confluenceTd'> A pre/post condition module. Verifies that channels are not orphaned after test execution </td>
</tr>
<tr>
<td class='confluenceTd'> confbridge </td>
<td class='confluenceTd'> Provides a class that inherits from TestCase and other helper classes for tests that verify ConfBridge </td>
</tr>
<tr>
<td class='confluenceTd'> config </td>
<td class='confluenceTd'> Parses and represents an Asterisk configuration file in memory </td>
</tr>
<tr>
<td class='confluenceTd'> FdTestCondition </td>
<td class='confluenceTd'> A pre/post condition module. Verifies that file descriptors are not leaked after test execution </td>
</tr>
<tr>
<td class='confluenceTd'> LockTestCondition </td>
<td class='confluenceTd'> A pre/post condition module. Verifies that a deadlock has not occurred, or that locks are not held, after test execution </td>
</tr>
<tr>
<td class='confluenceTd'> SipDialogTestCondition </td>
<td class='confluenceTd'> A pre/post condition module. Verifies that after test execution, all SIP dialogs are in an expected state </td>
</tr>
<tr>
<td class='confluenceTd'> sipp </td>
<td class='confluenceTd'> Wrapper classes around SIPp scenarios </td>
</tr>
<tr>
<td class='confluenceTd'> sippversion </td>
<td class='confluenceTd'> Determines the version of SIPp on the system, and how it was compiled </td>
</tr>
<tr>
<td class='confluenceTd'> syncami </td>
<td class='confluenceTd'> A module that provides a wrapper to the HTTP version of AMI. starpy (and the ami module) use TCP for communication with Asterisk </td>
</tr>
<tr>
<td class='confluenceTd'> TestCase </td>
<td class='confluenceTd'> A base class for tests within the Test Suite. Provides a common framework for starting a test, stopping a test, and interacting with various components within the Test Suite </td>
</tr>
<tr>
<td class='confluenceTd'> TestConditions </td>
<td class='confluenceTd'> Manages the pre/post condition checking. This is an option for any test, but will greatly increase test execution times, and is not enabled on most test runs </td>
</tr>
<tr>
<td class='confluenceTd'> TestConfig </td>
<td class='confluenceTd'> Sets of classes that determine the configuration of a test from its corresponding test-config.yaml file </td>
</tr>
<tr>
<td class='confluenceTd'> TestState </td>
<td class='confluenceTd'> A module that provides a controller and state machine for TestEvents from Asterisk. This is used for complex tests that require information from long running applications in Asterisk, such as app_voicemail or app_confbridge </td>
</tr>
<tr>
<td class='confluenceTd'> ThreadTestCondition </td>
<td class='confluenceTd'> A pre/post condition module. Verifies that all threads are cleaned up after test execution </td>
</tr>
<tr>
<td class='confluenceTd'> utils </td>
<td class='confluenceTd'> Various independent utilities that are too small to warrant their own module </td>
</tr>
<tr>
<td class='confluenceTd'> version </td>
<td class='confluenceTd'> Determines the installed version of Asterisk, and performs comparisons between Asterisk versions </td>
</tr>
<tr>
<td class='confluenceTd'> voicemail </td>
<td class='confluenceTd'> Provides a class that inherits from TestCase and other helper classes for tests that verify VoiceMail applications </td>
</tr>
</tbody></table>
</div>
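<p>As a quick illustration of using one of these libraries outside of a test class, the sketch below compares two Asterisk versions with the version module. The AsteriskVersion class name and its constructor are assumptions based on the library's purpose above; check <em>lib/python/asterisk/version.py</em> for the exact interface.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">import sys

# The Test Suite libraries live in lib/python relative to the Test Suite root
sys.path.append("lib/python")

# Assumed class name; see lib/python/asterisk/version.py for the real interface
from asterisk.version import AsteriskVersion

if AsteriskVersion("11.0.0") > AsteriskVersion("1.8.10"):
   print("11.0.0 is newer than 1.8.10, as expected")</pre>
</div></div>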
<h1><a name="WritingaPythonTest-TestLayoutandAsteriskConfiguration"></a>Test Layout and Asterisk Configuration</h1>
<ol>
        <li>Create a new folder for the test in the appropriate location. In general, this will be a folder in the <em>/tests</em> directory. You may want to provide a similar structure to Asterisk by grouping related tests together, e.g., application tests should have folder(s) under the <em>/tests/apps</em> directory. For now, we'll assume that we're creating a test called <em>sample</em>, located in <em>tests/sample</em>.</li>
        <li>In the <em>sample</em> folder, create the following:
        <ul>
                <li>A <em>run-test</em> file, which will contain the python script to execute. The file should have execute permissions, and should not have the ".py" extension. The Test Suite looks for files named run-test and executes them; the fact that we are choosing Python as our language is an implementation decision that the Test Suite does not care about.</li>
                <li><em>test-config.yaml</em>, which will contain the test information and its dependency properties</li>
                <li>A <em>configs</em> directory. The <em>configs</em> directory should contain subfolder(s) for each instance of Asterisk that will be instantiated by the test, named <em>ast#</em>, where # is the 1-based index of the Asterisk instance. For now, create a single folder named <em>ast1</em>.</li>
                <li>In each <em>ast#</em> subfolder, place the Asterisk configuration files needed for the test. At a minimum, this will be <em>extensions.conf</em>.
<div class='panelMacro'><table class='infoMacro'><colgroup><col width='24'><col></colgroup><tr><td valign='top'><img src="/wiki/images/icons/emoticons/information.gif" width="16" height="16" align="absmiddle" alt="" border="0"></td><td>The asterisk class automatically creates an <em>asterisk.conf</em> file, and installs it along with other basic Asterisk configuration files (see the <em>configs</em> directory). You can override their behavior by providing your own <em>.conf.inc</em> files. Any configuration files not provided in the <em>configs</em> directory are installed from the subfolders for each test.</td></tr></table></div></li>
        </ul>
        </li>
        <li>Edit your <em>extensions.conf</em> to perform some test in Asterisk. For our test, we'll simply check that we can dial into Asterisk and play back a sound file.
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: java; gutter: true">[default]
exten => s,1,NoOp()
same => n,Playback(tt-monkeys)
same => n,UserEvent(TestResult,result:pass)</pre>
</div></div> </li>
</ol>
<p>At the end of this, you should have:</p>
<ul>
        <li>A folder in <em>tests</em> named <em>sample</em></li>
        <li>An empty file in <em>tests/sample</em> named <em>run-test</em></li>
        <li>An empty file in <em>tests/sample</em> named <em>test-config.yaml</em></li>
        <li>A subfolder in <em>sample</em> named <em>configs</em></li>
        <li>A subfolder in <em>sample/configs</em> named <em>ast1</em></li>
        <li>A populated <em>extensions.conf</em> in <em>sample/configs/ast1</em></li>
</ul>
<h1><a name="WritingaPythonTest-DescribingthetestinTestConfig.yaml"></a>Describing the test in Test-Config.yaml</h1>
<p>Each test has a corresponding <a href="http://yaml.org/" class="external-link" rel="nofollow">yaml</a> file that defines information about the test, the dependencies the test has, and other optional configuration information. The fields that should be filled out, at a minimum, are:</p>
<ul>
        <li>testinfo:
        <ul>
                <li>summary: A summary of the test</li>
                <li>description: A verbose description of exactly what piece and functionality of Asterisk is under test.</li>
        </ul>
        </li>
        <li>properties:
        <ul>
                <li>minversion: The minimum version of Asterisk that this test applies to</li>
                <li>dependencies:
                <ul>
                        <li>python: Any python based dependencies. Often, this will be noted twice, once for 'twisted' and once for 'starpy'</li>
                        <li>custom: Custom dependencies, e.g., 'soundcard', 'fax', etc.</li>
                        <li>app: External applications that are needed, e.g., 'pjsua'</li>
                </ul>
                </li>
        </ul>
        </li>
</ul>
<div class='panelMacro'><table class='infoMacro'><colgroup><col width='24'><col></colgroup><tr><td valign='top'><img src="/wiki/images/icons/emoticons/information.gif" width="16" height="16" align="absmiddle" alt="" border="0"></td><td>See the Test Suite's README.txt for all of the possible fields in a test configuration file</td></tr></table></div>
<p>The <em>test-config.yaml</em> file for our <em>sample</em> test is below.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: java; gutter: true">testinfo:
summary: 'A sample test'
description: |
This test verifies that monkeys have taken over the phone system.
properties:
minversion: '1.8'
dependencies:
- python : 'twisted'
- python : 'starpy'</pre>
</div></div>
<p>While we've created our test description, we haven't yet told the Test Suite of its existence. Upon startup, <em>runtests.py</em> checks <em>tests/tests.yaml</em> for the tests that exist. That file defines the folders that contain tests, where each folder contains another <em>tests.yaml</em> file that further defines tests and folders. In order for the Test Suite to find our sample test, open the <em>tests/tests.yaml</em> file and insert our test:</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: java; gutter: true">tests:
- test: 'example'
# We're inserting our sample test here:
- test: 'sample'
- test: 'dynamic-modules'
- dir: 'manager'
# And so on...</pre>
</div></div>
<h1><a name="WritingaPythonTest-Writingruntest"></a>Writing run-test</h1>
<p>Now we start actually writing the meat of our test. Each test in the Test Suite is spawned as a separate process, and so each test needs an entry point. First, let's import a few libraries and write our main.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">#!/usr/bin/env python
# vim: sw=3 et:
import sys
import os
import logging
from twisted.internet import reactor
sys.path.append("lib/python")
from asterisk.TestCase import TestCase
logger = logging.getLogger(__name__)
def main():
"""
Main entry point for the test. This will do the following:
1. Instantiate the test object
2. Tell the test object to start the Asterisk instances
3. Run the twisted reactor. This will automatically call the test's run method when twisted is up
4. Tell the test object to stop Asterisk. The reactor.run method will return when the reactor is stopped by the test.
5. Check results. If the test passed, return 0; otherwise, return any other value (usually 1). The top level script checks the return code from the process it spawned to determine whether or not the test passed or failed.
"""
test = SampleTest()
test.start_asterisk()
reactor.run()
test.stop_asterisk()
if not test.passed:
return 1
return 0
if __name__ == "__main__":
sys.exit(main() or 0)</pre>
</div></div>
<p>There are a few things to note from this:</p>
<ul>
        <li>We're going to use the twisted reactor for our test. This is useful because we will typically use asynchronous AMI events to drive the tests.</li>
        <li>We've told the python path where the Test Suite libraries are, and imported the TestCase class. Our test case class, SampleTest, will end up deriving from it.</li>
        <li>We have a logging object we can use to send statements to the Test Suite log files.</li>
        <li>Our test case class will have to have start_asterisk and stop_asterisk methods. These are actually provided by the TestCase class, but we can override them if we feel like it; a sketch of such an override follows this list.</li>
</ul>
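<p>As an illustration of that last point, here is a hypothetical override of stop_asterisk. Our sample test does not need it; the only rule is that an override should chain up to the TestCase implementation so Asterisk is still shut down properly.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">   def stop_asterisk(self):
      """Hypothetical override placed in our test class. Do any extra teardown
      here, then let the base class stop the Asterisk instances."""
      logger.info("Sample test finished; shutting down Asterisk")
      super(SampleTest, self).stop_asterisk()</pre>
</div></div>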
<p>Moving on!</p>
<h2><a name="WritingaPythonTest-DefiningtheTestClass"></a>Defining the Test Class</h2>
<p>We'll need a test class that inherits from TestCase. For now, we'll assume that the base class provides our start_asterisk and stop_asterisk methods and that we don't need to override them (which is a safe assumption in most cases). We'll fill in some of these methods a bit more later.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">class SampleTest(TestCase):
"""
A class that executes a very simple test, using TestCase to do most of the
heavy lifting.
"""
def __init__(self):
super(SampleTest, self).__init__()
""" You should always call the base class implementation of __init__ prior
to initializing your test conditions here. Some useful variables the TestCase
class provides:
- passed - set to False initially. Set to True if your test passes.
- ast - a list of Asterisk instance
- ami - a list of StarPY manager (AMI) instances, corresponding to each Asterisk instance
- reactor_timeout - maximum time a test can execute before it is considered to fail. This
prevents tests from hanging and never finishing. You can reset this timer using a call
to TestCase.reset_timeout
"""
"""
In your initialization, you should usually set the reactor_timeout if it should
be something other than 30 (the default). You should also call create_asterisk, which
will create and initialize the Asterisk instances. You can specify as a parameter the number
of Asterisk instances to create.
"""
self.create_asterisk()
def run(self):
"""
Run is called by the reactor when the test starts. It is the entry point for test execution,
and will occur after Asterisk is started (so long as the instructions in this example are followed).
Typically, after calling the base class implementation, connections over AMI are created. You
can also interact with the started Asterisk instances here by referencing the ast list.
"""
super(SampleTest, self).run()
"""
Create a connection over AMI to the created Asterisk instance. If you need to communicate with
all of the instances of Asterisk that were created, specify the number of AMI connections to make.
When the AMI connection succeeds, ami_connect will be called.
"""
self.create_ami_factory()
def ami_connect(self, ami):
"""
This method is called by the StarPY manager class when AMI connects to Asterisk.
Keyword Arguments:
ami - The StarPY manager object that connected
"""</pre>
</div></div>
<p>At the end of this, we have the following:</p>
<ul>
        <li>A class that inherits from <em>TestCase</em>. In its constructor, it calls the base class constructor and creates an instance of Asterisk by calling the <em>TestCase.create_asterisk()</em> method. The base class provides us a few attributes and methods that are of particular use:
        <ul>
                <li><em>passed</em> - a boolean variable that we can set to True or False</li>
                <li><em>ast</em> - a list of asterisk instances that provide access to a running Asterisk application</li>
                <li><em>ami</em> - a list of AMI connections corresponding to each asterisk instance</li>
                <li><em>reactor_timeout</em> - the amount of time (in seconds) that the twisted reactor will wait before it stops itself. This is used to prevent tests from hanging.</li>
                <li><em>TestCase</em> has a method called <em>create_asterisk()</em> that, well, creates instances of Asterisk. Yay!</li>
                <li><em>TestCase</em> has another method, <em>create_ami_factory()</em>, that creates AMI connections to our previously created instances of Asterisk. We do this after the twisted reactor has started, so that Asterisk has a chance to start up.</li>
        </ul>
        </li>
        <li>An entry point for the twisted reactor called <em>run()</em>. This calls the base class's implementation of the method, then spawns an AMI connection. Note that in our <em>main</em> method, we start up the created Asterisk instances prior to starting the twisted reactor - so when <em>run()</em> is called by twisted, Asterisk should already be started and ready for an AMI connection.</li>
        <li>A method, <em>ami_connect</em>, that is called when an AMI connection succeeds. This same method is used for all AMI connections - so to tell which AMI connection we are receiving, we can check the <em>ami.id</em> property. Each AMI connection corresponds exactly to the instance of Asterisk in the <em>ast</em> list - so <em>ast[ami.id]</em> will reference the Asterisk instance associated with the <em>ami</em> object; a short sketch of this follows the list.</li>
</ul>
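<p>To make that last point concrete, here is a short sketch of pairing an AMI connection with its Asterisk instance. It is illustrative only (our real ami_connect is written in the next section), and the cli_exec call is an assumption based on the asterisk library's description above; check <em>lib/python/asterisk/asterisk.py</em> for the exact method.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">   def ami_connect(self, ami):
      # Illustrative only: ami.id identifies this AMI connection, and the same
      # index into self.ast gives the matching Asterisk instance
      asterisk_instance = self.ast[ami.id]
      logger.info("AMI connection %d maps to Asterisk instance ast%d" % (ami.id, ami.id + 1))
      # Assumed helper for running a CLI command against that instance
      asterisk_instance.cli_exec("core show channels")</pre>
</div></div>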
<h2><a name="WritingaPythonTest-MakingtheTestdosomething"></a>Making the Test do something</h2>
<p>So, we have a test that will start up, spawn an instance of Asterisk, and connect to it over AMI. That's interesting, but doesn't really test anything. Based on our <em>extensions.conf</em>, we want to call the <em>s</em> extension in <em>default</em>, hopefully have monkeys possess our channel, and then check for the <em>UserEvent</em> fired off from the dialplan to determine whether we passed. If we don't see the UserEvent, we should eventually fail. Let's start off by adding some code to <em>ami_connect</em>.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">def ami_connect(self, ami):
"""
This method is called by the StarPY manager class when AMI connects to Asterisk.
Keyword Arguments:
ami - The StarPY manager object that connected
"""
logger.info("Instructing monkeys to rise up and overthrow their masters")
df = ami.originate(
channel="Local/s@default",
application="Echo")
df.addErrback(self.handleOriginateFailure)</pre>
</div></div>
<p>What we've now instructed the test to do is, upon an AMI connection, originate a call to the <em>s</em> extension in context <em>default</em>, using a Local channel. starpy's <em>originate</em> method returns a deferred object, which lets us assign a callback handler in case of an error. We've used the TestCase class's <em>handleOriginateFailure</em> method for this, which will automagically fail our test for us if the originate fails.</p>
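<p>The deferred that starpy returns can also chain a success handler alongside the errback if you want to react when Asterisk accepts the Originate action. The sketch below is not part of the test we are building, and the originate_success handler is hypothetical.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">   def ami_connect(self, ami):
      # Sketch only: chain both a success callback and an error callback on the
      # deferred returned by starpy's originate
      df = ami.originate(
         channel="Local/s@default",
         application="Echo")
      df.addCallback(self.originate_success)
      df.addErrback(self.handleOriginateFailure)

   def originate_success(self, result):
      """Hypothetical handler fired when the Originate action is accepted"""
      logger.info("The monkeys are on their way")
      return result</pre>
</div></div>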
<p>Now we need something to handle the UserEvent when monkeys inevitably enslave our phone system. Let's register for that event in <em>ami_connect</em> and add a handler method for it.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: python; gutter: true">def ami_connect(self, ami):
"""
This method is called by the StarPY manager class when AMI connects to Asterisk.
Keyword Arguments:
ami - The StarPY manager object that connected
"""
ami.registerEvent('UserEvent', self.user_event)
logger.info("Instructing monkeys to rise up and overthrow their masters")
df = ami.originate(
channel="Local/s@default",
application="Echo")
df.addErrback(self.handleOriginateFailure)
def user_event(self, ami, event):
"""
Handler for the AMI UserEvent
Keyword Arguements:
ami - The StarPY AMI object corresponding to the received UserEvent
event - The AMI event
"""
if event['userevent'] != 'TestResult':
return
if event['result'] == "pass":
self.passed = True
logger.info("Monkeys detected; test passes!")
else:
logger.error("No monkeys found :-(")
self.stop_reactor()</pre>
</div></div>
<p>Now we've registered for the UserEvent that should be raised from the dialplan after monkeys are played back. We make the assumption in the handler that we could have other UserEvents that return failure results - in our case, we don't have failure scenarios, but many tests do. Regardless, once we receive a user event we stop the twisted reactor, which will cause our test to be stopped and the results evaluated.</p>
<p>We should now be ready to run our test.</p>
<h1><a name="WritingaPythonTest-Runningthetest"></a>Running the test</h1>
<p>From a console window, browse to the base directory of the Test Suite and type the following:</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: java; gutter: false">./runtests.py --test=tests/sample/</pre>
</div></div>
<p>You should see something similar to the following:</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: java; gutter: false">Making sure Asterisk isn't running ...
Running ['tests/sample/run-test'] ...
Resetting translation matrix
Parsing /tmp/asterisk-testsuite/sample/ast1/etc/asterisk/logger.conf
Parsing /tmp/asterisk-testsuite/sample/ast1/etc/asterisk/logger.general.conf.inc
Parsing /tmp/asterisk-testsuite/sample/ast1/etc/asterisk/logger.logfiles.conf.inc
<?xml version="1.0" encoding="utf-8"?>
<testsuite errors="0" failures="0" name="AsteriskTestSuite" tests="1" time="18.83">
<testcase name="tests/sample" time="18.83"/>
</testsuite></pre>
</div></div>
<p>We can inspect the log files created by the Test Suite for more information. The Test Suite creates two log files, full.txt and messages.txt. By default, DEBUG and higher are sent to full.txt, while INFO and higher are sent to messages.txt. The following is a snippet from messages.txt - yours should look similar.</p>
<div class="code panel" style="border-width: 1px;"><div class="codeContent panelContent">
<pre class="theme: Confluence; brush: java; gutter: false">[Feb 07 17:04:51] INFO[6991]: asterisk.TestCase:86 __init__: Executing tests/sample
[Feb 07 17:04:51] INFO[6991]: asterisk.TestCase:135 create_asterisk: Creating Asterisk instance 1
[Feb 07 17:04:52] INFO[6991]: asterisk.TestCase:208 start_asterisk: Starting Asterisk instance 1
[Feb 07 17:04:52] INFO[6991]: asterisk.TestCase:159 create_ami_factory: Creating AMIFactory 1
[Feb 07 17:04:52] INFO[6991]: AMI:158 connectionMade: Connection Made
[Feb 07 17:04:52] INFO[6991]: AMI:172 onComplete: Login Complete: {'message': 'Authentication accepted', 'response': 'Success', 'actionid': 'mjordan-laptop-22006600-1'}
[Feb 07 17:04:52] INFO[6991]: asterisk.TestCase:297 __ami_connect: AMI Connect instance 1
[Feb 07 17:04:52] INFO[6991]: __main__:67 ami_connect: Instructing monkeys to rise up and overthrow their masters
[Feb 07 17:05:09] INFO[6991]: __main__:85 user_event: Monkeys detected; test passes!
[Feb 07 17:05:09] INFO[6991]: asterisk.TestCase:256 stop_reactor: Stopping Reactor
[Feb 07 17:05:09] INFO[6991]: asterisk.TestCase:223 stop_asterisk: Stopping Asterisk instance 1</pre>
</div></div>
</div>
<div id="commentsSection" class="wiki-content pageSection">
<div style="float: right;" class="grey">
<a href="https://wiki.asterisk.org/wiki/users/removespacenotification.action?spaceKey=AST">Stop watching space</a>
<span style="padding: 0px 5px;">|</span>
<a href="https://wiki.asterisk.org/wiki/users/editmyemailsettings.action">Change email notification preferences</a>
</div>
<a href="https://wiki.asterisk.org/wiki/display/AST/Writing+a+Python+Test">View Online</a>
|
<a href="https://wiki.asterisk.org/wiki/display/AST/Writing+a+Python+Test?showComments=true&showCommentArea=true#addcomment">Add Comment</a>
</div>
</div>
</div>
</div>
</div>
</body>
</html>