[asterisk-dev] [Code Review]: Initial Bridging Test Object and sample tests

Matt Jordan reviewboard@asterisk.org
Thu Aug 2 12:55:47 CDT 2012



> On July 31, 2012, 2:09 p.m., Matt Jordan wrote:
> > /asterisk/trunk/lib/python/asterisk/BridgeTestCase.py, lines 119-124
> > <https://reviewboard.asterisk.org/r/2065/diff/1/?file=30685#file30685line119>
> >
> >     Do we want to make the channel technology type configurable?
> 
> Mark Michelson wrote:
>     It could be useful to make the channel type configurable. The only thing that's worrisome is ensuring that the rest of the channel string is applicable to other channel types. For instance, I'm pretty sure that the syntax used here would not apply to IAX2.
>     
>     I think the tech/destination/extension syntax is pretty universal though...
> 
> rmudgett wrote:
>     I don't think it is that universal.  Each tech implements what it wants after the tech portion based upon what the tech protocol requires.
> 
> Mark Michelson wrote:
>     I think the easiest way to deal with this will be to change the extension configuration item to take the entire channel string to pass to the originate. So where I just have "test_call" the new value would be either "SIP/test_call@uut" or "SIP/uut/test_call". This way, other technologies can easily be substituted. If I make this change, should I change the name of the configuration option from "extension" to something else? Perhaps "originate_channel" or something along those lines?

"originate_channel" sounds like a good idea.


- Matt


-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviewboard.asterisk.org/r/2065/#review6830
-----------------------------------------------------------


On July 27, 2012, 9:48 a.m., Mark Michelson wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviewboard.asterisk.org/r/2065/
> -----------------------------------------------------------
> 
> (Updated July 27, 2012, 9:48 a.m.)
> 
> 
> Review request for Asterisk Developers and Matt Jordan.
> 
> 
> Summary
> -------
> 
> This is the initial work towards writing a bridging test object, along with three sample tests.
> 
> The bridging test object works by starting three instances of Asterisk:
> 
> Instance 0: the UUT
> Instance 1: "Alice"
> Instance 2: "Bob"
> 
> The basic setup is that Alice is the caller, Bob is the callee, and the UUT bridges the call.
> 
> In an attempt to make test creation simple, common configuration for the Asterisk instances is stored in the configs/bridge/ directory of the test suite. This way, it is not necessary to create duplicate configuration files for every test. Configuration files that are stored there at the moment are:
> 
> Alice:
> extensions.conf
> sip.conf
> 
> Bob:
> extensions.conf
> sip.conf
> 
> UUT:
> features.conf
> sip.conf
> 
> With these in place, the only file that typically needs to be written for each test is extensions.conf for the UUT.
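> 
> As an illustration, a per-test extensions.conf for the UUT could be as small as the following (a hypothetical sketch that assumes Bob is reachable from the UUT as SIP/bob; the actual files are in the diff):
> 
>     [default]
>     exten => test_call,1,Dial(SIP/bob)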
> 
> YAML configuration:
> 
> The YAML configuration for the bridge test object is surprisingly simple. The configuration consists of a list of test calls to execute. For each call, there are three things to configure (a sketch of a complete call list follows these option descriptions):
> * Extension - This is the extension in the UUT's dialplan that Alice should dial. All extensions dialed are in the default context.
> * Hangup - This lets the test object know whether it should send an AMI hangup to Alice or Bob. This option can be omitted if the hangup should not be initiated by the test object.
> * Features - This is a list of DTMF features to execute once the call is established. Features consist of three options:
>     * Who - This lets the test object know whether Alice or Bob should initiate the feature
>     * What - This lets the test object know which feature to execute. The names of the features are the same as they would appear in features.conf
>     * Success - This lets the test object know whether the feature should succeed. "Succeeding" in this case simply means that the feature is recognized and that the proper built-in feature is executed. For evaluating success, we do not care whether the feature actually succeeded (e.g. whether automon actually recorded audio)
> 
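> For illustration, a call list combining these options might look like the following (a sketch; the authoritative layout is in sample-yaml/bridge-config.yaml.sample in the diff):
> 
>     calls:
>         -
>             extension: 'alice_disconnect'
>             features:
>                 -
>                     who: 'alice'
>                     what: 'disconnect'
>                     success: 'true'
>         -
>             extension: 'test_call'
>             hangup: 'alice'
> 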
> Test Execution:
> 
> The YAML configuration is suggestive of how the test suite operates, but there are a few extra steps that may not seem obvious. Here are the steps executed when running a test:
> 
> 1. Asterisk instances are started up. Base configuration in configs/bridge/ is installed first and test-specific configuration is applied afterward. This allows individual tests to override the base configuration if necessary.
> 2. Calls are executed one-by-one. The first step is to originate a call from Alice to the extension specified for the call.
> 3. Once the call is connected, Alice will play an audio file towards Bob.
> 4. Bob records the file and then plays it back to Alice.
> 5. Alice records the file and then uses BackgroundDetect in order to determine if there is audio present. If there is, this establishes that two-way audio is present for the call.
> 6. At this point, the test suite is notified that the call is up and whether or not audio is present.
> 7. The test suite then uses a series of AMI GetVar actions to determine if the connected line values for Alice and Bob are correct. It also checks that the BRIDGEPEER channel variable is correct.
> 8. Once all of these checks have passed, features are executed one-by-one. The first step is to use an AMI PlayDTMF action to send the proper feature code from the correct Asterisk instance (see the AMI sketch after this list).
> 9. From here, the test suite waits for a TestEvent from Asterisk to determine if the feature succeeded or failed.
> 10. Once all features are finished, the next call is executed.
> 11. Once all calls are finished, the test is over.
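> 
> As a concrete illustration of steps 7 and 8, the AMI traffic involved looks roughly like this (the channel name is hypothetical, and a multi-digit feature code requires one PlayDTMF action per digit):
> 
>     Action: Getvar
>     Channel: SIP/alice-00000001
>     Variable: BRIDGEPEER
> 
>     Action: PlayDTMF
>     Channel: SIP/alice-00000001
>     Digit: *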
> 
> Problems and potential improvements:
> 
> A problem due to the way Review Board works is that you can't see the test events I added to Asterisk without creating a separate review. I added a FEATURE_DETECTION test event that gives a "success" or "fail" result. The success result occurs after a feature callback has executed. The fail result occurs if, when checking for a feature, no feature is found that matches the DTMF sequence input.
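> 
> On the AMI side, such an event would appear roughly as follows (a sketch only; the exact fields are defined by the separate Asterisk change):
> 
>     Event: TestEvent
>     Type: StateChange
>     State: FEATURE_DETECTION
>     Result: success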
> 
> One thing you'll notice while looking at the tests' extensions.conf files is that odd features have been enabled for certain parties. To give an example, in the disconnect test, the alice_disconnect extension gives Alice permission to use the disconnect feature, which makes sense given the test. It also gives Bob the transfer permission, which may seem odd. This is because I had to work around an optimization in Asterisk: if a party has no DTMF features available to it, then the bridge is never broken when DTMF is received from that party; the DTMF is always immediately passed through. This means that my test events in features.c would never trigger. My attempts to work around this resulted in failure, so for now I have gone with the odd configuration.
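> 
> In Dial terms, the workaround amounts to something like the following (hypothetical, assuming Bob is dialed as SIP/bob; 'H' grants the caller the disconnect feature and 't' grants the callee the transfer feature):
> 
>     exten => alice_disconnect,1,Dial(SIP/bob,,Ht)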
> 
> One problem that may be unavoidable is the length of time required for tests. Since each call requires three playbacks of a file, plus whatever time it takes to execute features, tests can take a long time to run. On my machine, the atxfer test takes around 60 seconds to complete. At first I attempted to verify audio on bridges using simpler methods, but they invariably failed for one reason or another.
> 
> One potential improvement that will need to be added is the ability to register observers for either mid-call or after a call has completed. Mid-call observers will be useful for testing things such as the L option for app_dial or for testing mid-call updates to connected line. I can't think of an immediate use for post-call observers, but having the facility available can't be a bad thing, right?
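> 
> An observer registration might look something like this (an entirely hypothetical API, just to sketch the proposed facility):
> 
>     def midcall_observer(bridge_test_case, call):
>         # Inspect mid-call state, e.g. issue additional AMI actions.
>         pass
> 
>     # hypothetical registration hook on the bridge test object
>     test_object.register_midcall_observer(midcall_observer)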
> 
> Another improvement that should be made is to make use of the CDRModule and CELModule in each test so that records can be verified.
> 
> 
> Diffs
> -----
> 
>   /asterisk/trunk/configs/bridge/ast1/sip.conf PRE-CREATION 
>   /asterisk/trunk/configs/bridge/ast2/extensions.conf PRE-CREATION 
>   /asterisk/trunk/configs/bridge/ast2/sip.conf PRE-CREATION 
>   /asterisk/trunk/configs/bridge/ast3/extensions.conf PRE-CREATION 
>   /asterisk/trunk/configs/bridge/ast3/sip.conf PRE-CREATION 
>   /asterisk/trunk/lib/python/asterisk/BridgeTestCase.py PRE-CREATION 
>   /asterisk/trunk/lib/python/asterisk/TestCase.py 3371 
>   /asterisk/trunk/lib/python/asterisk/asterisk.py 3371 
>   /asterisk/trunk/sample-yaml/ami-config.yaml.sample 3371 
>   /asterisk/trunk/sample-yaml/bridge-config.yaml.sample PRE-CREATION 
>   /asterisk/trunk/tests/bridge/atxfer/configs/ast1/extensions.conf PRE-CREATION 
>   /asterisk/trunk/tests/bridge/atxfer/test-config.yaml PRE-CREATION 
>   /asterisk/trunk/tests/bridge/disconnect/configs/ast1/extensions.conf PRE-CREATION 
>   /asterisk/trunk/tests/bridge/disconnect/test-config.yaml PRE-CREATION 
>   /asterisk/trunk/tests/bridge/simple_bridge/configs/ast1/extensions.conf PRE-CREATION 
>   /asterisk/trunk/tests/bridge/simple_bridge/test-config.yaml PRE-CREATION 
>   /asterisk/trunk/tests/tests.yaml 3371 
> 
> Diff: https://reviewboard.asterisk.org/r/2065/diff
> 
> 
> Testing
> -------
> 
> The three tests included here all pass. They are intended to illustrate the variables available in the test object:
> 
> simple_bridge - A simple bridge test with no features. Illustrates that features are not required. Tests hangups from both the Alice and Bob Asterisk instances.
> disconnect - Tests the disconnect feature. Tests that a sequence of features works as expected, both for successful and unsuccessful features. Demonstrates that tests with no hangup parameters work properly.
> atxfer - Tests the atxfer feature. Similar to the disconnect test, except that a hangup is required from the test object.
> 
> 
> Thanks,
> 
> Mark
> 
>
