Wednesday, January 30, 2008

Customized test script for OpenSta

The article describes how the open source tool OpenSTA facilitated system integration testing of a middleware solution based on IBM WebSphere Application Server and the WebSphere Branch Transformation Toolkit (BTT). OpenSTA automation improved test execution accuracy and reproducibility while preserving the project's tight test budget. With the benefit of the lessons learned and key OpenSTA workarounds documented in this article, you may want to consider OpenSTA for your own distributed application testing strategy.

Customizing the Test Script Template
The generic design of the test script template results in only four lines that have to be modified for each specific test script:

* Line 4 contains the test script name per the project's naming convention. This name appears in the test results.
* Lines 15 and 17 point to the IFX-encoded request and model response specific to the test script.
* Line 46 calls the subroutine designed to make specific comparisons of the actual response to the model response based on the type of test being run.

Test Script Subroutines
The OpenSTA Script Control Language supports several structured programming techniques. The test script template exploits OpenSTA's support for user-defined subroutines. Because one generic test script template was used for all tests, any logic changes or enhancements to the test script capabilities could be contained in two files, IFX_SUBROUTINES.INC and IFX_VARIABLES.INC. This eliminated the need to refresh the entire set of test scripts whenever a scripting enhancement was identified; instead, only these two common files had to be redeployed to the test clients.

Of the 24 custom subroutines supporting these test scripts, the following three subroutines from the IFX_SUBROUTINES.INC file are representative of OpenSTA's various capabilities:

* Read_Rq - this subroutine makes use of the character string variable REQUEST_LINE shown in Listing 1 to read in the IFX-encoded request message at script execution time. The OpenSTA Script Control Language reference suggests using the built-in command OPEN for file handling; however, an outstanding bug in OpenSTA means that only the first line of a file can be read this way. The Read_Rq subroutine provides a custom alternative to the built-in command.
* Remove_NewUpDt - this subroutine makes extensive use of the string manipulation commands available in OpenSTA. Its purpose is to remove from both the model response and the actual response those IFX elements that contain timestamps. Since the timestamp elements of the actual response are different for every execution of the test script, they are excluded from the comparison of the actual response with the model response. For the initial system integration test cycles, the test team visually inspects the timestamp elements of the actual response to ensure that the value is in the correct format and time range.
* Generate_RqUID - this subroutine uses OpenSTA's current date, current time, and random number generation features to generate a unique identifier for each IFX-encoded request sent to the middleware. This unique identifier is subsequently associated with the actual IFX-encoded response per the project's IFX specification.
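The logic of the Remove_NewUpDt and Generate_RqUID subroutines can be sketched in Python (OpenSTA's Script Control Language differs syntactically, and the element name NewUpDt and the identifier layout below are assumptions drawn from the subroutine names, not from the project's IFX specification):

```python
import random
import re
from datetime import datetime, timezone

def remove_new_up_dt(ifx_message: str) -> str:
    """Strip the timestamp-bearing elements (assumed here to be <NewUpDt>)
    so model and actual responses can be compared byte for byte."""
    return re.sub(r"<NewUpDt>.*?</NewUpDt>", "", ifx_message, flags=re.DOTALL)

def generate_rq_uid() -> str:
    """Combine the current date, current time, and a random number into a
    per-request unique identifier, mirroring the Generate_RqUID approach."""
    now = datetime.now(timezone.utc)
    return f"{now:%Y%m%d%H%M%S}{random.randint(0, 999999):06d}"
```

For example, `remove_new_up_dt("<Status><NewUpDt>2008-01-30T12:00:00</NewUpDt><Code>0</Code></Status>")` yields `"<Status><Code>0</Code></Status>"`, making the remaining content directly comparable to the model response.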

The IFX_SUBROUTINES.INC and IFX_VARIABLES.INC files are included in the OpenSTA script package available with this article online at http://websphere.sys-con.com/ (WebSphere Journal archives Vol: 4, Iss: 8).

There are some limitations imposed by OpenSTA on the design of the subroutines and the use of variables in those subroutines. For instance, all defined variables are global, so local variables can't be used in the subroutines. Local-variable techniques could have reduced the amount of duplicated logic across the subroutines.

Another OpenSTA limitation is that any one script can define at most 128K of character string data. This proved to be a challenge since some of the IFX-encoded messages can be up to 15K in length. In combination with the variables used to store message fragments for the various parsing routines, the scripts came close to breaking this character string data maximum.

Loop control also proved to be a challenge. Specifically, OpenSTA has no "do while" or "do forever" construct for loops in which the number of iterations isn't known in advance. The solution is illustrated in Listing 1.

Listing 1 shows that a loop count maximum has been specified (MAXFILESIZE). If the last line of the IFX-encoded message is encountered before the loop count maximum, additional subroutines are called and the loop is ended. If the loop count is exceeded, a message is recorded in the test script log and is considered by the tester to be a failure in the test script. Fortunately, this message wasn't encountered during the formal system integration test cycles.
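The counted-loop workaround can be illustrated with a minimal Python sketch (the MAXFILESIZE value and the assumption that the request file ends with a `</IFX>` line are illustrative, not taken from the project's scripts):

```python
MAXFILESIZE = 500  # assumed loop-count maximum, standing in for Listing 1's value

def read_ifx_request(path):
    """Emulate 'do while' with a counted loop: iterate a fixed maximum number
    of times and break early when the last line of the message is found."""
    lines = []
    with open(path) as f:
        for _ in range(MAXFILESIZE):
            line = f.readline()
            if not line:  # end of file reached before the terminator
                break
            lines.append(line.rstrip("\n"))
            if line.strip().endswith("</IFX>"):  # assumed last line of the message
                break
        else:
            # loop count exceeded: log and treat as a test script failure
            raise RuntimeError("MAXFILESIZE exceeded; treat as a test script failure")
    return lines
```

The `for ... else` clause plays the role of the loop-count-exceeded check: it runs only when the counted loop finishes without finding the terminator, which the testers would treat as a failure in the test script.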

Executing the Tests
Once the test scripts were prepared, the formal system integration testing could begin. OpenSTA provides a test execution framework that lets testers execute one or more scripts in various combinations and a configurable number of iterations. Figure 1 shows a typical OpenSTA Test designed to run all the tests in project test category 00_4A one at a time with a delay of five seconds between each test execution as indicated in the Start column. Note that since there's only one iteration of each test script, the delay options shown at the bottom of Figure 1 don't apply.

As the tests were executed, OpenSTA tracked the test results in two files: TestRep.txt and TestLog.txt. OpenSTA provided additional test result files, but these were the two used by the test team.

The TestRep.txt file recorded all the messages issued by the REPORT command in the test script, including the REPORT commands that wrote the actual IFX-encoded response. It also recorded the pass/fail flag for the test case. The testers used the TestRep.txt file to verify the test case results, investigating the test cases flagged as failed.

The TestLog.txt file tracked the execution of the test, recording each subroutine used by the test and any script execution errors. This file was useful while testing the logic of the test scripts and as a double check during formal execution of the test scripts.

Lessons Learned
In summary, OpenSTA satisfied our requirement for a test automation tool. The following lessons learned provide additional detail.

OPENSTA MET OUR NEEDS WITH SOME WORKAROUNDS
The test team was able to stay within the project test budget and schedule by creatively combining OpenSTA features. As noted throughout this article, several workarounds were devised when OpenSTA didn't perform as expected or documented; however, the time spent finding and circumventing these issues was minimal. It should be noted that though OpenSTA runs only on a Windows platform, it can exchange HTTP requests and responses with servers deployed on any technology.

OPENSTA USERS SHOULD BE COMFORTABLE WITH SCRIPTING LANGUAGES
The OpenSTA Script Control Language lets testers devise complex test scenarios complete with test result validation. To do this, the test script authors must have programming experience in languages similar to the Script Control Language such as Unix shell scripting or IBM REXX.

MODULAR DESIGN GREATLY REDUCES THE EFFORT TO DEPLOY TEST SCRIPT ENHANCEMENTS
The test scripts were designed to exploit OpenSTA's modular architecture. By designing a generic test case template with calls to common subroutines, script enhancements were easily deployed to the test environment by simply updating two files: IFX_SUBROUTINES.INC and IFX_VARIABLES.INC. The hundreds of test case scripts developed for the project would automatically reference the script enhancements in these files.

TEST SCRIPT NAMING CONVENTION IS ESSENTIAL
The test team found it beneficial to adopt a rigorous naming standard for the test cases and the test scripts even though this wasn't a mandatory OpenSTA requirement. The naming convention helped the team organize all the test script components and facilitated incident reporting.
