add compare-results.sh, call from start-testsuite.sh

Compare the current test results to the expected results, and exit in error
on discrepancies.

Add compare-results.sh: (trivially) grep the JUnit XML output to determine
which tests passed and which didn't, and compare against an
expected-results.log, another JUnit XML file from a previous run. Summarize
and determine success.

Include an "xfail" feature: tests that are expected to fail are marked as
"xfail", unexpected failures as "FAIL".

In various subdirs, copy the current Jenkins jobs' JUnit XML outputs as
expected-results.log, so that we start getting useful output in both Jenkins
runs and manual local runs.

In start-testsuite.sh, after running the tests, invoke the results
comparison.

Due to its single-line parsing, the script so far does not distinguish
between error and failure; I doubt that we actually need that distinction,
though.

Related: OS#3136
Change-Id: I87d62a8be73d73a5eeff61a842e7c27a0066079d
2018-04-05 14:56:38 +00:00
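The comparison that the commit message describes could be sketched roughly as
follows. This is a hypothetical sketch, not the actual compare-results.sh:
the file names, the temp-dir layout, and the helper function are made up for
illustration. It only assumes the single-line convention visible in the file
below: a self-closing `<testcase .../>` line is a pass, an open
`<testcase ...>` tag carries failure details.

```shell
#!/bin/sh
# Sketch of the junit pass/fail comparison with "xfail" handling.
# NOT the real compare-results.sh; names and paths are illustrative.

tmp=$(mktemp -d)

# Sample "current run" junit output: TC_b failed (open tag, no '/>').
cat > "$tmp/current.xml" <<'EOF'
<testcase classname='X' name='TC_a' time='1'/>
<testcase classname='X' name='TC_b' time='1'>
EOF

# Expected results from a previous run: TC_b was already failing there.
cat > "$tmp/expected.xml" <<'EOF'
<testcase classname='X' name='TC_a' time='1'/>
<testcase classname='X' name='TC_b' time='1'>
EOF

# Emit "<name> pass|fail" per testcase, based on single-line grepping.
results() {
	grep '<testcase' "$1" | while read -r line; do
		name=$(echo "$line" | sed "s/.*name='\([^']*\)'.*/\1/")
		case "$line" in
		*'/>') echo "$name pass" ;;
		*) echo "$name fail" ;;
		esac
	done
}

results "$tmp/current.xml" > "$tmp/current.log"
results "$tmp/expected.xml" > "$tmp/expected.log"

# Failures also listed as failures in the expected log become "xfail";
# unexpected failures are real "FAIL". A real script would exit nonzero
# on any FAIL.
rc=0
while read -r name verdict; do
	if [ "$verdict" = pass ]; then
		echo "pass  $name"
	elif grep -q "^$name fail$" "$tmp/expected.log"; then
		echo "xfail $name"
	else
		echo "FAIL  $name"
		rc=1
	fi
done < "$tmp/current.log" > "$tmp/summary.log"
cat "$tmp/summary.log"
```

With the sample files above, this prints `pass  TC_a` and `xfail TC_b`, and
`rc` stays 0, i.e. the run counts as a success despite the known failure.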
<?xml version="1.0"?>
<testsuite name='Titan' tests='62' failures='5' errors='0' skipped='0' inconc='0' time='MASKED'>
<testcase classname='MGCP_Test' name='TC_selftest' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_no_lco' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_noprefix' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_unsupp_mode' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_early_bidir_mode' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_unsupp_param' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_missing_callid' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_missing_mode' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_unsupp_packet_intv' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_illegal_double_lco' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_sdp' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_wildcarded' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_wildcarded_exhaust' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_mdcx_without_crcx' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_dlcx_without_crcx' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_mdcx_wildcarded' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_dlcx_wildcarded' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_and_dlcx_ep_callid_connid' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_and_dlcx_ep_callid' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_and_dlcx_ep' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_and_dlcx_ep_callid_inval' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_and_dlcx_ep_callid_connid_inval' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_and_dlcx_retrans' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_osmux_wildcard' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_osmux_fixed' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_osmux_fixed_twice' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_one_crcx_receive_only_osmux' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_one_crcx_loopback_osmux' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_rtp_osmux' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_rtp_osmux_bidir' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp_osmux_wildcard' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp_osmux_fixed' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_dlcx_30ep' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_rtpem_selftest' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_one_crcx_receive_only_rtp' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_one_crcx_loopback_rtp' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_rtp' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_rtp_bidir' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_diff_pt_and_rtp' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_diff_pt_and_rtp_bidir' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_unsolicited_rtp' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_one_mdcx_rtp_ho' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_ts101318_rfc5993_rtp_conversion' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_amr_oa_bwe_rtp_conversion' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_amr_oa_oa_rtp_conversion' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_amr_bwe_bwe_rtp_conversion' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_conn_timeout' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_e1_crcx_and_dlcx_ep' time='MASKED'>
<failure type='fail-verdict'>Response didn't match template
MGCP_Test.ttcn:MASKED MGCP_Test control part
MGCP_Test.ttcn:MASKED TC_e1_crcx_and_dlcx_ep testcase
</failure>
</testcase>
<testcase classname='MGCP_Test' name='TC_e1_crcx_with_overlap' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_e1_crcx_loopback' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_mdcx_ip4' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_crcx_mdcx_ip6' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp_ipv4_ipv6' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp_ipv6' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_rtp_osmux_bidir_ipv6' time='MASKED'>
<failure type='fail-verdict'>Received unexpected type from Osmux
MGCP_Test.ttcn:MASKED MGCP_Test control part
MGCP_Test.ttcn:MASKED TC_two_crcx_and_rtp_osmux_bidir_ipv6 testcase
</failure>
</testcase>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_rtp_osmux_bidir_ipv4_ipv6' time='MASKED'>
<failure type='fail-verdict'>Received unexpected type from Osmux
MGCP_Test.ttcn:MASKED MGCP_Test control part
MGCP_Test.ttcn:MASKED TC_two_crcx_and_rtp_osmux_bidir_ipv4_ipv6 testcase
</failure>
</testcase>
<testcase classname='MGCP_Test' name='TC_two_crcx_and_rtp_osmux_bidir_ipv6_ipv4' time='MASKED'/>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp_osmux_ipv6' time='MASKED'>
<failure type='fail-verdict'>Received unexpected type from Osmux
MGCP_Test.ttcn:MASKED MGCP_Test control part
MGCP_Test.ttcn:MASKED TC_two_crcx_mdcx_and_rtp_osmux_ipv6 testcase
</failure>
</testcase>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp_osmux_ipv4_ipv6' time='MASKED'>
<failure type='fail-verdict'>Received unexpected type from Osmux
MGCP_Test.ttcn:MASKED MGCP_Test control part
MGCP_Test.ttcn:MASKED TC_two_crcx_mdcx_and_rtp_osmux_ipv4_ipv6 testcase
</failure>
</testcase>
<testcase classname='MGCP_Test' name='TC_two_crcx_mdcx_and_rtp_osmux_ipv6_ipv4' time='MASKED'/>
</testsuite>