The candidate confirms that the work submitted is his/her own and the appropriate credit has 
been given where reference has been made to the work of others. 
Scenario Testing of an Application Server Based Multi-channel Architecture.
 
Vassilis Rizopoulos 
 
MSc in Distributed Multimedia Systems 
1999/2000 
Scenario testing of an Application Server based Multi-channel Architecture. 
 
 II
Summary. 
 
This project concerns the investigation of content delivery over multiple channels using 
application servers. 
It examines an architecture comprising several components that had to be integrated, and 
presents a strategy for testing the architecture using a scenario application. 
The steps undertaken from the initial architecture design up to the final test configurations are 
presented, as well as the reasoning for the choices made. 
Results are presented and conclusions drawn on the performance of the architecture. 
 
 
 
Acknowledgements. 
I would like to thank Mark Berner for managing the project, assisting in every way possible 
so that we could worry only about the results of the project. 
All the Rubus employees who contributed time and knowledge to this project, but 
especially: 
Sanjay Manandhar who helped bring the goals into focus. 
Daniel Makin and the rest of the Innovation Center staff for providing the multi-channel 
techniques we used, the content and the skeleton of the scenario application. 
Andrew Redhead for the help with ATG Dynamo. 
Rupert Benbrook for setting up the test facility, helping with the physical configuration used 
in the tests and the advice on NT performance. 
Martin Kendal, without whom it would have proved impossible to properly configure Oracle. 
Les Hughes for valuable advice about setting up a development environment and, last but not 
least, Jack Beaken and Daniel Brock for the hours of working with me. 
Also, Lydia Lau for helping me with balancing the technical aspects of the project with its 
academic value. 
Contents. 
SUMMARY. .......................................................................................................................II 
ACKNOWLEDGEMENTS.............................................................................................. III 
CONTENTS...................................................................................................................... IV 
1. INTRODUCTION........................................................................................................5 
1.1 PROJECT BACKGROUND. ...............................................................................................5 
1.2 MOTIVATION. ...............................................................................................................5 
1.3 CHALLENGES. ...............................................................................................................6 
1.4 APPROACH....................................................................................................................6 
2. THE UNDERLYING ARCHITECTURE...................................................................8 
2.1 THE WAP (WIRELESS APPLICATION PROTOCOL) CHANNEL. ..........................................8 
2.1.1 The underlying design for WAP.............................................................................9 
2.1.2 Integration of WAP and WWW. ...........................................................................10 
2.2 THE ROLE OF APPLICATION SERVERS IN THE ARCHITECTURE. .......................................10 
2.2.1 The traditional model of Application Servers.......................................................12 
2.2.2 Adapting the model for multiple channels............................................................12 
2.3 XML – XSLT. ...........................................................................................................13 
3. THE RUBUS ENVIRONMENT. ..............................................................................14 
3.1 TEST FACILITY – PHYSICAL CONFIGURATIONS .............................................................14 
4. DESIGN OF THE SCENARIO APPLICATION. ....................................................16 
4.1 THE CHOICE FOR THE SCENARIO APPLICATION..............................................................16 
4.2 APPLICATION SCENARIO DESIGN. ................................................................................16 
4.3 IMPLEMENTATION ISSUES. ...........................................................................................18 
5. TESTING AND ANALYSIS OF THE RESULTS....................................................20 
5.1 TESTING STRATEGY.....................................................................................................20 
5.2 TEST PROCESS DESIGN.................................................................................................20 
5.3 SERVER CONFIGURATIONS...........................................................................................22 
5.4 THE TESTING TOOL......................................................................................................23 
5.5 THE TEST SCRIPTS. ......................................................................................................24 
5.6 THE TESTS ..................................................................................................................25 
5.6.1 The significance of XSLT optimization.................................................................25 
5.6.2 XSLT processing times. .......................................................................................27 
5.6.3 Performance Scaling. ..........................................................................................28 
5.6.4 The crash recovery test........................................................................................30 
5.6.5 Multiprocessing of Java instances. ..................................................31 
CONCLUSION. .................................................................................................................32 
MAIN EXPERIMENTS AND FUTURE WORK............................................................................32 
ADDITIONAL VALUE .........................................................................................................33 
REFERENCES. .................................................................................................................34 
APPENDICES. ..................................................................................................................36 
1. Introduction. 
 
This report presents the findings and results of the 1999/2000 MSc project of the same 
title. 
The project was carried out between June and September 2000 for the architecture lab of 
Rubus Ltd. by  
• Vassilis Rizopoulos, University of Leeds student. 
• Jack Beaken, Rubus employee. 
• Daniel Brock, Rubus employee. 
1.1 Project Background. 
The project was an initiative of Rubus Ltd1, an e-business consultancy with offices in 
Bracknell, London, Manchester and Paris. 
The project manager was Mark Berner, head of Rubus’ architecture lab. 
The initial title for this project was “Developing a robust Web/WAP multi-channel 
architecture” and its goals were to investigate the BEA WebLogic and ATG Dynamo application 
servers and various WAP gateways as modules of a “multi-channel” architecture for the 
delivery of content to mobile devices as well as web browsers. 
Initial research focused on the emerging technology of WAP2 and Internet connectivity for 
mobile devices. 
During the first stages of familiarization with the project subject and the planning of the 
project’s phases in June, it became obvious that the project did not involve the development of 
an entirely new architecture, but rather the investigation of the performance and integration 
issues arising from adding the techniques used to deliver content to WAP-enabled devices to 
the existing application server framework. 
Additionally, after consultation with Rubus’ experts on the difficulties of developing for two 
different application servers (the timescale estimates for covering both application server 
products exceeded the project’s duration), and given the prior research by Rubus’ Innovation 
Center on WAP gateways and multi-channel content delivery, the decision was taken to 
concentrate on developing tests for ATG Dynamo only. 
1.2 Motivation. 
The popularity of the Internet as a medium of communication has sparked the development of 
technologies such as WAP for mobile phones and PDAs (Personal Digital Assistants) and 
iTV (Interactive TV) that enable a multitude of devices to connect to the network and take 
advantage of WWW information services such as on-line shopping and e-mail. 
As a result, the number of devices accessing the Internet is expected to rise dramatically in 
the next few years. Computer users form a much smaller percentage of the 
population than people using devices like mobile phones. Forrester Research3 predicts that by 
the year 2005 WAP-enabled mobile phone users will reach 28.5 million in the UK. 
With the proliferation of devices and the popularity of the Internet as a provider of news, 
entertainment and business information as well as a retailing market, there is a need to 
examine the technologies used to deliver content conforming to the varying requirements and 
                                               
1 Company website: http://www.rubus.com. 
2 WAP, the wireless application protocol, and the application server technologies are described in detail in the 
body of the report. 
3 The number is quoted from an internal company report that quotes a research report by Forrester Research. 
The Forrester Research reports are available for a fee at http://www.forrester.com. 
capabilities of these devices. The re-evaluation of the ability of existing systems to 
incorporate these techniques and to scale so as to provide services to all “channels” (each 
different device type forms a channel) is necessary in order for the correct solution to be 
deployed in a multi-channel site. 
1.3 Challenges. 
The project incorporates a number of very recent technologies: the Java 2 Platform 
Enterprise Edition (J2EE) framework of Enterprise Java Beans, on which application servers 
are based; XML and XSLT, which became standards as recently as December 1999; and WAP, 
which is still under constant development (at the beginning of the project in April the formal 
specification was version 1.0, while version 1.2 has now been submitted as a final draft). 
Application servers are based on J2EE, but each product implements the specifications 
differently and also includes some proprietary mechanisms, so familiarization with the 
application server, ATG Dynamo, was the first phase. Given the time restrictions, this was 
also the reason for limiting the scope of the project to a scenario application on a single 
application server. 
The challenge is to incorporate all the new technologies (XML, XSLT, WAP) and the prior 
work done by Rubus’ Innovation Center in the application server framework and devise a test 
strategy for evaluation. 
This includes the selection of the testing tools and measures, and requires a good knowledge 
of all component technologies as well as Rubus’ requirements for this project. 
1.4 Approach. 
The project included the following phases: 
• Familiarization phase (the initial research and the first week). 
In this phase the various component technologies were studied. With the relocation to 
Rubus’ London offices, familiarization with the company’s methods and organization was 
essential. 
Meetings with Rubus experts helped determine the scope of the project based on the 
timeline available and the company’s requirements as well as the availability of Rubus 
resources in software, hardware and expert knowledge. 
 
• Scenario Design phase. (2 weeks) 
The phase included the choice and design of the test scenario. 
 
• Testing and Evaluation Strategy Definition phase. (1 week) 
During this time the decisions on the significant measures and the focus of the testing 
were established. 
The design of the actual configurations to test, the testing tools to use and the method of 
testing were also parts of this phase. 
This phase produced a testing strategy and a test plan that served as guidelines during the 
testing phase. 
The project adopted the testing procedure used by Rubus for each application, which 
involves the definition of “use cases” for the application, where the normal paths of usage 
are described (the number and order of actions a user undertakes to perform a task in the 
application). 
Each “use case” is the basis for a “test case”, a test of the application’s normal 
functionality. 
“Test scenarios” are extended tests based on test cases, which determine the application’s 
behavior under stress, such as the maximum load a server can sustain without errors or the 
ability of the software to recover from intentional or unintentional errors. 
 
• Implementation phase. (4 weeks) 
During implementation the scenario application’s code was developed according to the 
design specified. 
 
• Testing phase. (2 weeks) 
The application was deployed in Rubus’ test facility and the tests executed. 
 
• Analysis and Report writing. (10 days) 
2. The Underlying Architecture. 
 
 
Figure 1 presents an application server based multi-channel architecture. 
The basis for this architecture is the same as that for applications catering to web 
browsers, with the addition of the presentation processor. 
The new channels integrate with the network through a gateway so the details of the 
communication between device and network are opaque to the Internet and content can be 
hosted and served using the existing protocols with only the presentation requirements and 
capabilities of each channel in mind. 
For this reason a presentation processor is added to apply each channel’s presentation format 
accordingly. 
The project focused on testing two channels, WWW and WAP, the latter of which poses the 
most restrictions on the size and presentation of content. 
The presentation processor was based on a version developed by Rubus’ Innovation Center 
using XML and XSLT, so some modification was needed. 
2.1 The WAP (Wireless Application Protocol) channel. 
 
“The Wireless Application Protocol (WAP) is an open, global specification that empowers 
mobile users with wireless devices to easily access and interact with information and services 
instantly.”[WAP1] 
 
Developed as a solution for mobile computing, WAP enjoys the support of most of the largest 
telecommunications and computing companies in the world.  
Figure 1 
The WAP Forum, founded in June 1997, is an industry group dedicated to the development of 
the WAP standards and the official source of the WAP specifications. 
 
2.1.1 The underlying design for WAP. 
 
The primary objective of the WAP design is the creation of a platform that can provide mobile 
network services to a variety of devices that share the common characteristics of: 
 
• Low processing power and memory. 
• Limited display and input capabilities: 
Small screen size and resolution, as low as 2x8 characters and limited input devices 
(i.e. stylus or phone keypad). 
• Low bandwidth availability and network latency problems: 
Devices have 300bps-10Kbps connections with 5-10 sec propagation delays. 
 
WAP includes an application framework and network protocols specifically designed to 
address these problems. 
 
Figure 2 presents the WAP stack and its relation to the services provided by the TCP/IP stack 
used on the Web. 
 
The WAE (Wireless Application Environment) includes the micro-browser environment, with 
the specification of WML (Wireless Markup Language), a lightweight application of XML 
designed for mobile devices, and WMLScript, a scripting language similar to JavaScript.  
It also defines the architecture model for WAP application development. 
 
Following closely the WWW model, the WAP architecture uses URL location identifiers, 
standard content definition (MIME types) and HTTP functionality (in WSP) that allow 
content to be hosted on standard web servers.  
Figure 2 [WAP3] 
 
2.1.2 Integration of WAP and WWW. 
 
The WAE model assumes gateway functionality for the communication of a wireless network 
with the WWW. This is achieved with standard proxy technology: the client connects to the 
WAP gateway, which in turn retrieves the requested page from the remote web server and 
serves it back to the client. 
Figure 3 [WAP3] 
The WAP Gateway serves as a protocol translator from the WAP stack to the WWW 
(TCP/IP) stack. It also performs processor intensive functions such as DNS queries and 
content encoding/decoding that are not within the capabilities of mobile CPUs. 
 
This arrangement provides transparent communication between standard Web 
servers and mobile devices. By using the established method of MIME types to 
identify the channel for the content, and URLs to locate it, application developers can use 
known methods and technologies to serve WML content in the same application framework 
that is used for HTML content. 
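The channel-to-MIME-type mapping can be sketched in a few lines of Java. This is an illustrative sketch, not code from the project; the class and method names are hypothetical, though the MIME types and the WML 1.1 DTD identifier are the registered ones:

```java
public class ContentTypes {
    /** MIME type registered for WML decks vs. HTML pages. The WAP gateway
     *  keys off this header to decide whether to encode the response for
     *  over-the-air delivery to the handset. */
    public static String mimeType(boolean wapChannel) {
        return wapChannel ? "text/vnd.wap.wml" : "text/html";
    }

    /** Document prologue for each channel's markup. */
    public static String prologue(boolean wapChannel) {
        return wapChannel
            ? "<?xml version=\"1.0\"?>\n"
              + "<!DOCTYPE wml PUBLIC \"-//WAPFORUM//DTD WML 1.1//EN\" "
              + "\"http://www.wapforum.org/DTD/wml_1.1.xml\">"
            : "<!DOCTYPE HTML PUBLIC \"-//W3C//DTD HTML 4.0 Transitional//EN\">";
    }
}
```

A servlet would set the first value as the response’s Content-Type header and emit the second before the page body.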
2.2 The role of Application Servers in the architecture. 
 
Application Servers are a category of middleware developed to provide a 
multi-tier solution for deploying applications over the World Wide Web. The existing 
products were developed in support of single-channel architectures, and this project set out to 
experiment with delivering content to multiple channels. 
 
The application server consists of one or more layers (tiers) for the middle tier of the 3-tier 
architecture. It provides a way of managing the resources of a system by pooling database 
connections together, keeping track of users (session management), load balancing between 
multiple servers and providing backup and recovery in the event of failure without affecting 
the data structures and storage (Database Layer) or the way they are presented to the user 
(Presentation Layer). 
 
Session management is a central characteristic of every application server. 
The HTTP protocol is stateless; it carries no information about the previous state of the 
client that initiates the connection. This poses significant problems when a server tries to 
track a user through a site, for example to keep a record of purchases. 
The application server provides a way to manage state between requests to the same site, thus 
making it possible to have sessions that span multiple requests. 
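The idea can be sketched, independently of any particular application server, as a minimal in-memory session store. This is a toy illustration with hypothetical names; Dynamo supplies session management itself, carrying the session ID in a cookie or rewritten URL:

```java
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;

/** Minimal illustration of server-side session state over stateless HTTP. */
public class SessionStore {
    private final Map<String, Map<String, Object>> sessions = new ConcurrentHashMap<>();

    /** Creates a session and returns its ID; the server hands the ID back
     *  to the client (e.g. in a cookie) so later requests can present it. */
    public String create() {
        String id = UUID.randomUUID().toString();
        sessions.put(id, new ConcurrentHashMap<String, Object>());
        return id;
    }

    /** Retrieves the state accumulated by the client's earlier requests,
     *  or null if the session does not exist (or was invalidated). */
    public Map<String, Object> get(String id) {
        return sessions.get(id);
    }

    /** Invalidates a session, as a logout action would. */
    public void invalidate(String id) {
        sessions.remove(id);
    }
}
```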
 
A server for a popular commercial site has to be able to receive and process traffic on the order 
of a few million hits per day while providing twenty-four-hour service. Load-balancing 
features in the application server allow traffic to be distributed across a cluster of servers. 
Scaling the system to handle more traffic becomes a matter of adding another server to the 
cluster.  
Uninterruptible service is assured by backup and recovery features that ensure that a cluster 
of servers will continue functioning normally even if a number of the cluster members fail.  
 
These features are all incorporated in the application server framework and used 
automatically, along with transaction management, network connectivity and life-cycle 
management (how long an object remains active and what happens when it is invalidated).  
The framework for Java-based application servers is J2EE and Enterprise Java Beans (EJB). 
 
There are two main technologies in application servers: Windows DNA (the Distributed 
interNet Applications Architecture) by Microsoft and the Java 2 Platform Enterprise Edition 
(J2EE) by Sun [APP3]. 
This project is focused on ATG Dynamo 4.51, a Java-based application server that is partly 
J2EE compliant (see Rubus Environment chapter) and the technology described here is 
specific to Java. 
A comprehensive list of available application servers can be found at http://www.appserver-
zone.com/guide.asp. 
2.2.1 The traditional model of Application Servers. 
 
Figure 4 illustrates the tier architecture 
introduced by Java-based application servers 
from an end-to-end application perspective. 
 
The thin presentation layer is the Web browser 
that interprets HTML. 
 
The application layer handles session 
management, dynamic content creation and 
user management. 
 
The business logic layer handles all data 
persistence issues and is supported by a thin 
interface that handles the translation from 
database structures to objects (JDBC, the Java 
Database Connectivity API, is such an interface). 
 
 
 
 
 
2.2.2 Adapting the model for multiple 
channels. 
 
Application servers provide a robust and scalable 
framework for application development for the 
WWW, but until now have had to cater only to Web 
browsers using HTML as the presentation medium. 
 
The introduction of multiple devices with different 
display capabilities that do not “understand” HTML 
but each have their own presentation language 
presents a problem for existing application servers, 
similar to the problem of handling different sources 
of data such as plain text files and hierarchical or 
relational databases. 
The problem of interfacing with databases is 
handled by technologies like JDBC, ADO (ActiveX 
Data Objects, by Microsoft) and ODBC (Open 
Database Connectivity, a precursor to ADO), which create a thin interface layer between the 
business logic layer and the database layer. 
A similar solution, a presentation interface, is needed to handle the 
different formats of presentation demanded by each device (figure 5). 
Such an interface is provided by XML and XSLT. 
Figure 4 
Figure 5 
 
2.3 XML – XSLT. 
 
The XHTML v1.0 specification [XML4], which defines HTML as an XML application (a set of 
XML-compliant tags that encompass the HTML 4.0 standard), allows the bulk of content 
existing on the Web to be transformed with XSLT to conform to channel requirements. It 
also underlines the decision of the W3C to adopt XML as the language for data annotation 
over the Internet. 
The fact that WAP uses WML (another application of XML) simplifies the procedure of 
adapting content to the architecture’s channels, as long as the content is in XML format. 
 
XML, the eXtensible Markup Language, is a standard developed by the W3C (the World Wide 
Web Consortium) as 
“a subset of SGML…Its goal is to enable generic SGML to be served, received and 
processed on the Web in the way that is now possible with HTML. XML has been designed 
for ease of implementation and for interoperability with both SGML and HTML”[XML1]. 
More generally, XML is classed as a meta-data language, a way to transmit data and also 
describe them, a “mechanism for the interchange of structured information on the 
Web”[XML2].  
XML is a subset of SGML and shares with it the distinction between content and 
presentation. It is possible to define a set of tags that identify content separately from the way 
it will be displayed and to let the interpreter apply formatting. This makes XML a very 
powerful tool for structuring and communicating data, but applications had to parse 
XML documents using specially written code that could interpret the tags case by case. 
XSLT (XSL Transformations) evolved as a separate specification from XSL (the eXtensible 
Stylesheet Language). While the latter deals in great detail with rules for formatting content 
(font sizes, colors, alignment rules etc.), the former specifies a language that transforms one 
type of XML document into another [XSLT1]. 
XSLT provides a very powerful tool for the use of XML: the XSLT processor accepts a set of 
rules, themselves expressed in XML, and applies them to an XML document to produce a 
different one. Instead of a specially written translator parsing the document, extracting and 
altering elements, a standards-based processor applies filters to transform it. Changing the 
filter changes the resulting document. 
 
Incorporating XSLT as the presentation processor allows the architecture to scale to include 
new channels by simply adding an XML document containing the formatting rules for each 
new channel. 
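The filter idea can be demonstrated with the JAXP transformation API that ships with current JDKs (the project itself called Apache Xalan; the document and stylesheet below are toy examples, not the Innovation Center’s rules):

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XsltDemo {
    /** Applies an XSLT stylesheet (the channel "filter") to an XML document. */
    public static String transform(String xml, String xslt) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xslt)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<quote><symbol>RBS</symbol><price>3.50</price></quote>";
        // One channel's filter: render the quote as a simple paragraph.
        // Swapping in a different stylesheet changes the output format
        // without touching the content document.
        String filter =
            "<xsl:stylesheet version='1.0' "
          +     "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
          + "<xsl:output method='xml' omit-xml-declaration='yes'/>"
          + "<xsl:template match='/quote'>"
          + "<p><xsl:value-of select='symbol'/>: <xsl:value-of select='price'/></p>"
          + "</xsl:template>"
          + "</xsl:stylesheet>";
        System.out.println(transform(xml, filter));
    }
}
```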
 
3.  The Rubus Environment. 
 
From the Rubus perspective, the multi-channel project is a continuation of the research into 
WAP and other channels done by Rubus’ Innovation Center. 
The Innovation Center developed an application for delivering personalized content over 
multiple channels without using application servers, as a means to explore the capabilities of 
XSLT as a presentation interface and to document techniques for content delivery over 
multiple channels. 
This research dictated the use of XML and XSLT as well as the choice of the scenario 
application used in testing, while providing very useful programming guidelines. 
The scenario application uses content provided by the Innovation Center project as well as 
the database structure developed by Daniel Makin. 
 
The choice of ATG Dynamo as the application server for the architecture was also due to the 
fact that Rubus has built a significant knowledge base (both in resources and in experts) in 
the usage of the product. BEA’s WebLogic application server was not included because of the 
differences between the two products, which would have required a separate development 
and testing phase for each. 
3.1 Test facility – Physical configurations 
 
Testing was done on Rubus’ testing facility (full specifications in Appendix G). 
The ‘testing rig’ allows different configurations of servers to be tested (using multiple 
application servers, multiple web servers etc.). 
The actual network configuration (topology, IP addressing and hardware components used) is 
termed a “physical” configuration. 
Figure 6 shows the hardware and network topology used in testing the architecture.  
 
Figure 6: Physical test configuration. 
The servers all have multiple CPUs and multiple network adapters. The operating system 
installed was Windows NT 4.0 Enterprise Edition. 
The multiple CPUs on each server allowed the use of multiple instances of the server 
software, thus enabling testing for a server configuration with two and three application 
servers. 
Additionally, one computer (DC01) functions as an NT Domain Controller and provides 
services like WINS and DNS, while another is the Management Server (MGMT01), used to 
collect all the performance monitor data.  
The web servers sit on both network segments, isolating the server architecture from the 
client side. 
4. Design of the scenario application. 
4.1 The choice for the scenario application. 
The goal during the initial planning stages of the design phase was to select a scenario 
application that could closely emulate a real-world application and reuse as much of Rubus’ 
knowledge base as possible. 
 
This goal evolved into two criteria for the application’s design: 
 
• A commercial-value service provision in the context of the multi-channel architecture, 
especially the WAP-enabled mobile phone market. 
Given that the timeline for the project did not allow examination of all possible channels, 
choosing WAP enabled the investigation of the most extreme requirements for the architecture 
against the mainstream Web channel. But in order to emulate a real-world application, the 
content used by the application had to be something seen as a potential application for 
the WAP channel. 
It is recognized that WAP phones, with limited screen space and the usability issues normally 
associated with early technologies, are today only appropriate for ‘ABSNOW’ applications – 
applications where something ABSolutely must be found or done NOW. 
The Innovation Center’s pilot application provided weather reports, sports results and stock 
quotes, along with a design for acquiring the content. For the scenario application, the ‘stock 
quotes’ option was chosen, reusing the database design and content used by the Innovation 
Center. Apart from allowing the minimum functionality required within the available 
development timeline, this feature was seen as the most ‘ABSNOW’ application. 
• Sufficient functionality had to be built into the scenario to provide meaningful results 
in testing. 
This led to the decision to implement a database storing the user profiles and the content, 
with added functionality for altering the database so that the test results would be directly 
comparable to previous performance tests. The RDBMS used for implementing the 
database was Oracle 8i, due to the quality of the JDBC drivers it provides. 
 
4.2 Application Scenario Design. 
The first step in designing the application was the functional specification. 
The application’s functionality was outlined as follows: 
 
The application must recognize the client’s channel and present the appropriate login screen, 
authenticate the user and retrieve the user’s preferences (a list of stocks he/she wants quoted). 
The content is then delivered to the client in the channel’s appropriate format. 
 
Originally the functionality was to include code to determine the maximum size of a 
WML page, due to the restrictions in packet size that mobile phones impose. The code 
would then segment large content into packets of valid size for WAP devices and deliver it as 
multiple pages. This was not included in the final scenario application due to time 
restrictions. Instead, functionality was added for the Web channel that altered the database’s 
content (the user’s profile) to more closely emulate real-world behavior.  
Appendix D is the original functional specification. The document also includes the 
functional specification for a testing tool. 
The functional specification is a basis for the development of the application’s design but 
does not take into account the restrictions imposed by the software products used for the 
architecture. 
The design of the application was an iterative process: 
The initial draft, based on the functional specification, presented the application as using two 
major modules: the ‘login and authentication’ module and the ‘content serving’ module. 
Subsequent stages were refined with feedback from the implementation process, due to the 
restrictions imposed by the application server. 
 
 
 
 
 
Figure 8 shows the final application design. Servlets are designated with light blue boxes 
while the pages sent to the clients are designated with white boxes.  
Figure 7 presents the table structure of the database used by the scenario application. The 
tables PREFSTORY and CONSTOCKSTORY were initially included to emulate the handling 
of large content (news stories) but were not used by the application. 
 
The entry servlet performs client identification by matching the HTTP ‘user-agent’ header 
against the BROWSERNAME field of the CLIENTSPECS table. The table contains the type 
of channel (WEB or WAP) and the maximum size of the content that the channel can handle 
per request. 
The HTMLLogin and WMLLogin servlets perform the authentication by matching a 
username and password, submitted through a form (HTML or WML respectively), against 
the USERS table. 
The GetContent servlet uses the PROFILEID field from the USERPROFILE table to retrieve 
all stock quotes that match the user’s profile from the CONSTOCK table. 
Figure 7: The database design used. 
Stock quotes are stored in XML format in the CONSCONTENT field of the CONSTOCK 
table. 
The Logout servlet invalidates the session so that the next time the user visits the site he or 
she has to go through the login procedure again. 
4.3 Implementation issues. 
 
The software used for the various components of the architecture was: 
 
Application Server ATG Dynamo 4.5.1 
Operating System Microsoft Windows NT 4.0 Enterprise Edition 
Database Management System Oracle 8i v8.1.6 
Web Server Microsoft Internet Information Server (IIS) 4 
XSLT processor Xalan, Apache Foundation Software 
Test tool Microsoft Web Application Stress Tool 
Programming Platform Java Development Kit 1.2.2 and Java 2 Enterprise 
Edition 1.2.1 
 
Identification of the channel is done using the HTTP ‘user-agent’ header. Unfortunately 
every new WAP browser identifies itself using a different string, so the application can only 
handle the browsers available at the time of development. To make the range of recognized 
browsers easy to update, the specifications and ‘user-agent’ strings are kept in the application 
database; updating the database enables the application to recognize newer browsers. 
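The lookup described above can be sketched as follows. This is an illustrative sketch only: an in-memory map stands in for the CLIENTSPECS table (which the real application queried through the application server), and the user-agent strings and modern Java collections syntax are assumptions, not the project’s actual code.

```java
import java.util.HashMap;
import java.util.Map;

public class ChannelDetector {
    // Stand-in for the CLIENTSPECS table: BROWSERNAME prefix -> channel type.
    // The entries below are illustrative examples, not the actual table contents.
    private final Map<String, String> clientSpecs = new HashMap<>();

    public ChannelDetector() {
        clientSpecs.put("Mozilla", "WEB");            // desktop browsers
        clientSpecs.put("Nokia-WAP-Toolkit", "WAP");  // WAP toolkit/browsers
        clientSpecs.put("UP.Browser", "WAP");
    }

    // Match the HTTP 'user-agent' header against the known browser names.
    // Unrecognized agents fall through to a default, mirroring Unknown.jhtml.
    public String detectChannel(String userAgent) {
        for (Map.Entry<String, String> e : clientSpecs.entrySet()) {
            if (userAgent != null && userAgent.startsWith(e.getKey())) {
                return e.getValue();
            }
        }
        return "UNKNOWN";
    }

    public static void main(String[] args) {
        ChannelDetector d = new ChannelDetector();
        System.out.println(d.detectChannel("Mozilla/4.0 (compatible; MSIE 5.0)")); // WEB
        System.out.println(d.detectChannel("Nokia-WAP-Toolkit/1.3"));              // WAP
    }
}
```

Adding support for a new browser then amounts to inserting a row (here, a map entry) rather than changing code, which is the point of keeping the strings in the database.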
 
ATG Dynamo uses a proprietary mechanism for server-side processing of pages, which 
includes a set of non-standard HTML tags that allow the inclusion of Java code into a page 
and the execution of special servlets (called droplets) from HTML pages. 
This mechanism relies on MIME types, mapping certain extensions to Dynamo’s internal 
structures; e.g. HTML pages containing the proprietary tags that have to be parsed by the 
server use the extension .jhtml, which is mapped to the MIME type: 
 
dynamo-internal/html, jhtml 
 
 
This mechanism does not include support for handling pages that are not HTML and such 
support had to be added to the application server. 
 
The XSLT processor chosen for implementing the presentation interface was Xalan, by the 
Apache Software Foundation4. 
Xalan is not the only Java-based XSLT processor available but it is free and was the only 
product that fully supported the XSLT specification at the time of the project’s design phases. 
 
Appendix E includes a sample content page and the transformation documents used for 
generating the channel-specific pages. 
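As a hedged illustration of the presentation interface, the following sketch transforms one piece of XML stock-quote content into two channel-specific forms. It uses the standard javax.xml.transform (TrAX) API rather than the Xalan 1 API of the period, and the content and filters are invented stand-ins for the actual documents in Appendix E.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class MultiChannelTransform {
    // Apply an XSLT filter (given as a string) to XML content (given as a string).
    public static String transform(String xml, String xslt) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xslt)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Invented stock-quote content; the real schema is in Appendix E.
        String xml = "<stock><symbol>ACME</symbol><price>42.5</price></stock>";

        // One filter per channel: an HTML table row for the Web...
        String htmlFilter = "<xsl:stylesheet version='1.0' "
            + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
            + "<xsl:output method='html' omit-xml-declaration='yes'/>"
            + "<xsl:template match='/stock'><tr><td><xsl:value-of select='symbol'/>"
            + "</td><td><xsl:value-of select='price'/></td></tr></xsl:template>"
            + "</xsl:stylesheet>";

        // ...and compact text for WAP, where page size is restricted.
        String wmlFilter = "<xsl:stylesheet version='1.0' "
            + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
            + "<xsl:output method='text'/>"
            + "<xsl:template match='/stock'><xsl:value-of select='symbol'/>"
            + ": <xsl:value-of select='price'/></xsl:template>"
            + "</xsl:stylesheet>";

        System.out.println(transform(xml, htmlFilter));
        System.out.println(transform(xml, wmlFilter)); // ACME: 42.5
    }
}
```

The same source content feeds every channel; only the filter changes, which is what makes the XML/XSLT approach attractive for multi-channel delivery.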
Appendix F lists a document by Jack Beaken that details the technical issues that had to be 
resolved during implementation. 
                                               
4 http://www.apache.org or http://xml.apache.org. 
Figure 8: The scenario application design, comprising the Entry, HTMLLogin, WMLLogin, 
GetContent, ChangePref, ProcessPrefs and Logout servlets and the pages Login.jhtml/.jwml, 
Menu.jhtml, GetContent.jhtml/.jwml, Prefs.jhtml, ProcessPref.jhtml, Logout.jhtml/.jwml and 
Unknown.jhtml. 
5. Testing and analysis of the results 
5.1 Testing strategy. 
 
The primary goal of the tests devised for the architecture was to examine the behavior and 
performance of the XSLT presentation processor and the load it incurred on the application 
server. 
This defined the measures to collect during testing: 
 
• The processor usage of the application server process for an increasing number of 
requests. 
This figure is provided by the NT Performance Monitor on the computer that runs the 
application server process. 
• The number of pages per second serviced by each configuration for an increasing 
number of requests. 
This is provided by the testing tool. 
• The actual processing time of the XSLT operations for different loads on the 
application server. 
ATG Dynamo allows code to be wrapped in a startOperation() - endOperation() block 
and monitored for execution time and memory consumption. 
Because memory monitoring has a severe impact on server performance, the test settings 
provided only execution-time figures. 
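The startOperation() - endOperation() mechanism is proprietary to ATG Dynamo; purely to illustrate how the execution-time figures were obtained, a generic analogue of its timing half can be sketched in plain Java (names and behaviour here are assumptions, not ATG’s API):

```java
public class OperationTimer {
    private long start;

    // Analogue of ATG Dynamo's startOperation(): mark the start of the block.
    public void startOperation() {
        start = System.currentTimeMillis();
    }

    // Analogue of endOperation(): return elapsed wall-clock time in milliseconds.
    public long endOperation() {
        return System.currentTimeMillis() - start;
    }

    public static void main(String[] args) throws InterruptedException {
        OperationTimer t = new OperationTimer();
        t.startOperation();
        Thread.sleep(50); // stand-in for the monitored XSLT operation
        System.out.println("Operation took " + t.endOperation() + " ms");
    }
}
```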
 
Additionally, during testing the performance of the database server and the web server(s) was 
monitored using the NT Performance Monitor, to ensure that the traffic generated by the 
testing tool did not create a bottleneck in one of those servers that would affect the overall 
performance. 
5.2 Test process design. 
 
With the conclusion of the design process and the first stages of the implementation phase, 
the design of the testing process began, using the previously documented use cases for the 
scenario application. 
There are 5 use cases: 
• Logging in for the Web channel. 
• Logging in for the WAP channel. 
• Changing preferences (Web). 
• Browsing the content for WAP and Web. 
• Logging out. 
The document detailing the use cases is attached as Appendix I. 
Using the use cases as a basis, the corresponding test cases were developed in the following 
form: 
 
Test Case: User login (Web) 
Description: This test case covers the Web path. 
Related Use Case: User logs on to the system. 
Business rule(s) exercised: N/A 
Initial state of the system before the test case starts: 
• The multi-channel site is up and running. 
• The test client is connected to the internet. 
• A Web browser functions on the client. 
State of the data used by the system before the test case starts: N/A 
Test Case steps: Request the /entry.dyn URL. 
Expected final state of the system after the test case finishes: Menu.jhtml is presented; 
session data have been retrieved from the database. 
Expected state of the data used by the system after the test case finishes: N/A 
 
The test cases match the design of the application, detailing the steps required by the testing 
application. 
There are 6 test cases for the scenario application: 
• User login (WAP) 
• User login (Web) 
• User browses through content (WAP) 
• User changes preferences (Web) 
• User logout (WAP) 
• User logout (Web) 
 
Using the test cases as building blocks, test scenarios that outline the goals of the testing, like 
the following, were built: 
 
 
Test Scenario: Balanced Traffic – Simple Configuration with think time enabled. 
Purpose: Load the simple-configuration system with 45% normal Web traffic, 45% WAP 
traffic and 10% preference-changing traffic, and monitor the performance as the number of 
users increases. Identify the maximum number of users that does not generate errors in the 
server, and the load generated on the web server and application server. ‘Think time 
enabled’ scenarios simulate real user behaviour by incorporating intervals between actions. 
Description: The mixed-traffic scenarios aim to provide data on the impact of each channel 
on the load of the server. By varying the traffic percentages for each channel and using a 
simple and a (redundant) test configuration, the load percentage each channel represents can 
be extracted. 
Testing Category: Performance/Scalability test. 
Test Cases used: User login (WAP), User login (Web), User browses through content 
(WAP), User changes preferences (Web), User logout (WAP), User logout (Web). 
Initial state of the system before the test scenario starts: The same state as before the start of 
User login (WAP). 
State of the data used by the system before the scenario starts: N/A 
Test Scenario steps: N/A 
System counters to monitor and their expected value range: All defined. 
 
5.3 Server configurations. 
 
The physical configuration of the testing facility allows the tests to be run for different 
server configurations (different numbers of servers running). 
The server configurations used (except the one that uses two web servers and two application 
servers) are illustrated in Diagram 1. 
 
ATG Dynamo provides a load-balancing module for multiple-server configurations that can 
run in the same process as the application server or as a separate process. During the tests the 
load-balancer was always run as a separate process. 
Load-balancing for the web servers was implemented using the Windows Load Balancing 
Service (WLBS), a module for Windows NT 4 Enterprise Edition available for download 
from http://www.microsoft.com. 
 
 
 
5.4 The testing tool. 
 
Microsoft’s Web Application Stress tool (WAS)5 is designed to realistically simulate 
multiple browsers requesting pages from a web site. 
It was selected because it satisfies the functional requirements set out for the testing tool and 
because it integrates with the NT Performance Monitor utility. 
WAS features a graphical user interface for script creation that allows the tester to declare a 
series of HTTP requests to be sent to an IP address. 
The HTTP headers of each request can be customized, so WAS allows the simulation of 
WAP browsers, but because it uses cookies (which WAP 1.1 does not support) the 
simulation is not entirely accurate. 
This issue is not significant though, because the WAP functionality of the application was 
tested with the Nokia WAP Toolkit to ensure full compliance, and the usage of cookies does 
not affect performance. 
A further issue that proved significant during testing is that when an error occurs (e.g. in the 
login procedure), WAS provides no way to conditionally choose the next action without 
significant development against its API; it simply continues with the next defined request. 
As a consequence, multiple errors are generated and false pages/sec and response-time 
numbers are reported. The only way to avoid false measurements was to supervise the tests, 
identify errors and intervene to stop the tests immediately, so that the most accurate results 
could be obtained. 
Additionally, although WAS allows the distribution of the generated traffic among groups of 
pages, if the percentage of traffic is small (1-10%), the traffic allotted to that group is 
generated in a short time period. This means that overall the percentage might be 10% but for 
                                               
5 Available for download from http://homer.rte.microsoft.com/. 
Diagram 1: The test configurations 
that short period it is close to 100% of the traffic arriving at the server. Traffic is thus not 
evenly distributed; this affected the measurements during the mixed-traffic tests (the ‘final’ 
and ‘browse all’ scripts) and led to the creation of more specific tests. 
 
Figure 9:The Microsoft Web Application Stress Tool. 
5.5 The test scripts. 
The following table describes the test scripts that were used in the valid tests. 
 
Script name: Description of test scenario 
Final Script: 45% Web traffic logging on and viewing the chosen stocks; 45% WAP traffic 
logging on and viewing the chosen stocks; 3% Web traffic removing all pre-chosen stocks 
and replacing them with 10 other stocks; 3% Web traffic removing all pre-chosen stocks and 
replacing them with 40 other stocks; and 4% Web traffic removing all pre-chosen stocks and 
replacing them with 60 other stocks. 
Browse WEB: 100 percent Web traffic, logging on and viewing the chosen stocks then 
logging out. This browsing involves database accesses and XML and XSLT processing. 
Browse WAP: 100 percent WAP traffic, logging on and viewing the chosen stocks then 
logging out. This browsing involves database accesses and XML and XSLT processing. 
Browse All: 45 percent traffic from the Web, 45 percent from WAP and 10 percent heavy 
traffic from the Web, viewing stocks and thereby causing database accesses and XSLT 
transformations. 
Browse Big: 100 percent traffic from the Web viewing the list of all available stocks, 
causing database accesses and XSLT transformations. Used to measure the processing time 
of large pages. 
 
The initial test plan included test scenarios similar to the ‘Final Script’ scenario with 
different percentages of traffic for every channel. The scripts described above were 
developed after initial tests revealed programming errors, in order to isolate those errors in 
the application. 
Also, the uneven distribution of requests to different page groups by WAS prompted the use 
of the ‘Browse Web’, ‘Browse WAP’ and ‘Browse Big’ scripts, so that the behavior of the 
different page sets could be measured independently. 
5.6 The tests 
There were 41 tests run on the scenario application (a full list, compiled by Daniel Brock, is 
given in Appendix H). 
The first tests revealed a problem with thread-safe access to the session profile (multiple 
users could alter the contents of two global variables, resulting in errors) and helped illustrate 
the difference that optimized XSLT processing makes. 
The following subsections present the most important conclusions from the tests, grouping 
together the tests that helped achieve each result. 
 
All tests, except the crash-recovery test, had a duration of five minutes. 
All tests used a random delay of 0-5000 milliseconds to simulate ‘think time’ (the time a 
user takes to read through a page). Because of this, the maximum error-free load (the 
maximum number of concurrent connections that did not generate errors) is defined as the 
number of connections that did not produce errors in repeated tests, i.e. if a test did not 
produce errors for 30 concurrent connections after 3 runs but produced errors once in three 
runs for 31 concurrent connections, the maximum error-free load is 30. 
Concurrent connections are the number of connections initiated by WAS. Due to the random 
delay the connections are not actually concurrent after the first request, but ‘think time’ 
simulates a real-world case more closely. 
The database held 100,000 user entries with randomized profiles, to ensure that the database 
server would not often return cached queries. 
WAS does not allow the randomization of variables (such as username and password), but 
this was overcome by creating a list of the usernames in Microsoft Excel, randomizing their 
order and importing the list into WAS. 
After each test the application servers were reset and restarted to ensure that each test was 
conducted in the same conditions. 
When running the database altering scripts, the database was also returned to its original state 
for the next test. 
Unless specifically mentioned the tests were run on server configuration I. 
5.6.1 The significance of XSLT optimization. 
The tests confirmed that XSLT processing is very process intensive, i.e. it consumes a large 
percentage of the CPU cycles allotted to a process. Optimizing the operation by caching 
often-used filters proved to increase performance dramatically. 
To illustrate this, the maximum error-free load results for optimized and unoptimized code 
are presented here. 
Graph 1 presents the percentage of processor-cycle usage for the application server running 
unoptimized code (test 8) with 5 concurrent connections. The unoptimized code parses the 
XSLT filter used for transforming the content into HTML on every request. 
 
 
Graph 1: Unoptimized code, 5 concurrent connections. 
The blue line is the process cycle usage of the database server. The peaks coincide with the 
requests for the user profile and are due to the fact that the table holding the profile was not 
indexed. 
The red line is the ATG Dynamo load manager process that handles the connection with the 
Web server and the green line is the application server process. 
Even for the very small number of 5 connections the application server process reaches 100% 
utilization and is able to service just 0.8 requests per second.  
 
The optimized code parses the XSLT filters once, during the initial loading of the servlets 
into memory, and caches them. As shown in Graph 2, the optimized code (test 26) has a 
similar impact on the application server for 30 concurrent connections, but WAS reported 
6.73 requests/sec serviced, a roughly eight-fold improvement in performance. 
 
Graph 2: Optimized code - 30 concurrent connections 
The database usage (blue line) has also been minimized by indexing the tables used. 
The content size for both tests was 40KB. 
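The optimization amounts to parsing each filter once and reusing the compiled form on subsequent requests. A minimal sketch of the idea, using the modern javax.xml.transform Templates API as a stand-in for the Xalan API actually used in the project (class and method names here are hypothetical), might look like this:

```java
import java.io.StringReader;
import java.io.StringWriter;
import java.util.HashMap;
import java.util.Map;
import javax.xml.transform.Templates;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class FilterCache {
    private final TransformerFactory factory = TransformerFactory.newInstance();
    // filter name -> compiled stylesheet; Templates objects are thread-safe
    private final Map<String, Templates> cache = new HashMap<>();

    // Parse the XSLT filter only on the first request; afterwards reuse the
    // compiled Templates and create a cheap per-request Transformer.
    public synchronized String apply(String name, String xslt, String xml)
            throws Exception {
        Templates compiled = cache.get(name);
        if (compiled == null) {
            compiled = factory.newTemplates(
                    new StreamSource(new StringReader(xslt)));
            cache.put(name, compiled);
        }
        Transformer t = compiled.newTransformer();
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String xslt = "<xsl:stylesheet version='1.0' "
            + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
            + "<xsl:output method='text'/>"
            + "<xsl:template match='/q'><xsl:value-of select='.'/></xsl:template>"
            + "</xsl:stylesheet>";
        FilterCache cache = new FilterCache();
        // The second call reuses the cached compiled filter.
        System.out.println(cache.apply("html", xslt, "<q>ACME 42.5</q>"));
        System.out.println(cache.apply("html", xslt, "<q>ACME 43.0</q>"));
    }
}
```

The per-request cost drops from a full stylesheet parse to the creation of a lightweight Transformer, which is consistent with the eight-fold throughput improvement observed between tests 8 and 26.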
5.6.2 XSLT processing times. 
Using the ‘Browse WEB’ and ‘Browse WAP’ test scripts, execution times for the XSLT 
processing operation were gathered. 
Running the tests for 40 to 80 concurrent connections in increments of ten, the XSLT code 
execution times averaged from 250 milliseconds to 900 milliseconds, with maximum 
execution times of 3000 to 5000 milliseconds for both scripts. 
The execution time increases with the increase of the load on the server. The size of the 
content is the same for both channels and execution time does not vary from one channel to 
another. The sizes of the resulting pages are also almost the same (with a difference of a few 
bytes). 
These tests compare only the XSLT processing times and do not take into account the delays 
incurred by images or static content served by the web server (for this reason the scenario 
application does not include static content in HTML pages). 
The maximum error-free load for both scripts was 60, and WAS reported an average of 20 
requests/sec. 
The ‘Browse Big’ script used the changing preferences path of the scenario application to 
serve 40KB of content. 
Graph 2 gives the process usage for the maximum error-free load, which was 30. 
The average XSLT execution time for that test was 3100 milliseconds, with a maximum of 
17400 milliseconds. 
Graph 3 gives the process usage for 50 concurrent connections with the ‘Browse Web’ script 
(the green line represents the load manager process) with the execution times mentioned at 
the beginning of the paragraph. 
 
 
Graph 3: 50 concurrent connections - 5KB content 
The two graphs and the execution times highlight the dependence of XSLT processing 
performance on the size of the content to be processed. 
The iterative process of XSLT increases the processing time required and incurs a much 
heavier CPU penalty than expected. Streamlining the filters and avoiding the usage of slow 
operations is therefore recommended (the Xalan site, http://xml.apache.org/xalan, hosts a 
‘coding practices’ document that offers advice on optimizing XSLT filters). 
5.6.3 Performance Scaling. 
To examine the way the architecture scaled with the addition of more application servers to 
share the load, server configurations I, II and III were used with the ‘Browse Web’ script. 
 
Adding a second application server increased the maximum error-free load to 120 (Graph 4 - 
the purple line indicates the number of requests arriving at the web server per second.) but the 
addition of a third server revealed a software bottleneck. 
 
 
Graph 4: Server Configuration III - 120 concurrent connections. 
 
A ‘Browse All’ test with 100 concurrent connections using three application servers (server 
configuration II) produced errors, although for the same test configuration III (2 application 
servers) did not. 
Combining the performance monitor graphs with the application event log for the web server 
for successive error-producing tests revealed that the ISAPI connection module that serves as 
the bridge between IIS 4 and ATG Dynamo crashed after sustained load. 
To illustrate this better, Graph 5 shows the 100 concurrent connections test for configuration 
II, marking in pink the connections waiting to be serviced by the application server (Current 
ISAPI Extension Requests) and in purple the requests/sec arriving at the web server. 
Due to the 100% process-cycle usage for two of the application servers in the beginning of 
the test the number of requests queuing for service in the ISAPI connection module was large 
and the module stopped responding (denoted by the rapid decline of the purple line) 
producing an error in the NT application error log. 
Similar behavior was observed for every test that produced errors: 
The application server did not service requests fast enough and the queue of unserviced 
requests caused the connection module to crash. 
 
Graph 5: ISAPI module bottleneck. 
Additionally, for tests where more than 120 concurrent connections were generated, the 
ISAPI module would crash even if the application servers did not operate at 100%, which 
indicated that the ISAPI module could not handle the flood of requests (Graph 7, the crash- 
recovery test, illustrates this point). 
Unfortunately the error codes reported by the ISAPI module were not covered by the ATG 
Dynamo knowledge base at http://www.atg.com so there is no technical explanation for the 
fault.  
This presents a software problem for the architecture, which will have to be investigated 
further in the future by substituting the web server software with an alternative. 
 
A further observation on the configurations with multiple application servers is that the load 
balancer was able to distribute the load between two servers evenly from the beginning of 
the tests (Graph 4). In the case of three servers the load balancer tended to assign more 
requests to one of the servers (a different server in every configuration II test executed) and 
then slowly converge the loads (Graph 6). 
This is also a point for future investigation, using additional configurations with odd and 
even numbers of application servers, to establish whether this is a feature of the load- 
balancing algorithm. 
 
 
Using multiple web servers (configurations IV and V) was unsuccessful. The Windows Load 
Balancing Service produced connection errors even for tests with 60 concurrent connections 
and allowed only a very small fraction of the requests to reach the application servers. Due 
to the limited availability of the testing facility the reason was not researched, so it was not 
established whether this was due to configuration errors or an inherent software fault (there 
is a significant number of articles on http://msdn.microsoft.com about the problems of 
configuring and tuning WLBS). 
 
 
Graph 6: Load balancing with 3 servers 
 
5.6.4 The crash recovery test. 
 
Graph 7: Crash recovery test 
 
The “crash recovery” test was devised after the ISAPI-module-induced errors were 
discovered, to establish the impact of the errors on the application servers. 
It was conducted in the following manner: 
The ‘Browse Web’ script was used with 180 concurrent connections and the system was 
monitored until errors were detected. The script was then interrupted and immediately re- 
executed (so that the created sessions did not have time to expire), this time with 40 
concurrent connections, a number previously established as not generating any errors. 
The system was monitored for errors for the duration of the script, and then no traffic was 
generated for a period of 5 minutes (the expiration time for active sessions). 
The 40 concurrent connections test was then run again and the system monitored for errors. 
The test showed that while errors were generated in all the application servers during the 
180-connection test, the subsequent tests generated errors in only one of the servers. 
That server’s process usage is traced by the red line in Graph 7. For both of the 40 
concurrent connections tests a sharp decline in the server’s process usage can be noted 
before the test’s end, coinciding with the breakdown of the server. 
This indicates that the application server is also affected by the ISAPI module’s failure, and 
that the failure upsets the application’s normal operation, producing errors that would not be 
acceptable in a normal environment. 
5.6.5 Multiprocessing of Java instances. 
ATG Dynamo’s deployment guide (available from http://www.atg.com after a registration 
procedure, or with the product) states that in systems with multiple processors the Java 
Virtual Machine (JVM) does not take advantage of more than one processor. 
Given that at the time of publishing the JVM version was 1.1.7, the behavior of the JVM 
used during the tests (version 1.2.2) was also investigated. 
Graph 8 shows that the 1.2.2 JVM cooperates well with the operating system in sharing the 
processing requirements between the two CPUs of the system. The blue line shows the 
process usage of the application server JVM, while the red and green lines show the usage of 
the two CPUs. 
 
Graph 8: CPU usage for a single JVM 
Conclusion. 
Main experiments and future work. 
It was established that the XSLT implementation of the presentation interface is very process 
intensive and that the size of the content is the dominant factor. 
In addition, the importance of optimizing code for such process-intensive operations was 
clearly illustrated. 
For Web pages of medium size (40KB) the average response time was 3000 milliseconds, 
which is acceptable in many circumstances, but under heavy load the system is slow. The 
XML/XSLT technology is still very new but provides great flexibility in content delivery. 
Techniques for performance improvement are already being developed, but at the moment 
the dynamic creation of content using XSLT is recommended only for small content sizes. 
 
ATG Dynamo shows promise of scaling well, being able to double the number of 
connections handled between configuration I and configuration III; however, the problem 
presented by the IIS 4 connection module renders that capability irrelevant for the examined 
configuration. Unfortunately, the Windows load-balancing facility did not perform either, so 
the combination of IIS 4 and ATG Dynamo has proved unreliable. 
 
The additional observation of the symmetric multiprocessing behavior of the 1.2.2 JVM is a 
useful point in planning deployment configurations for application servers in general. 
 
The framework designed for testing and the scenario application’s design allows the 
architecture to be tested using different components.  
Alternative web servers can be used to clarify the performance and scaling characteristics of 
ATG Dynamo. The time required for adapting the configurations to use alternative web 
servers should be significantly shorter, since only the testing phase has to be repeated. 
Redeveloping the scenario application for use with BEA WebLogic would allow the 
evaluation of the performance of a second application server and a direct product 
performance comparison. 
 
Extending the scenario application to cover all available channels and re-designing the 
testing procedures will enable a more comprehensive examination of the performance 
impact. This time, factors such as images and multimedia content should be included where 
appropriate, to better evaluate each channel’s impact on the architecture as a whole. 
 
During the course of the project ATG announced a new version of its Dynamo product with 
support for WML, the WAP specification was upgraded to 1.2, and XSLT performance has 
become a focus of the development community. The architecture used, the methods for 
designing and testing a multi-channel application and the testing strategy outlined provide a 
well-documented approach to performance assessment, which can be adapted to include all 
new developments, although the specifics of each implementation will vary. 
Additional Value 
A critical conclusion that is not evident in the test results is the need for stress testing an 
application to reveal possible programming errors or oversights. The scenario application, 
developed and tested using single client connections, initially behaved as expected for a low 
number of connections. Only when the tests increased the performance requirements and the 
application had to handle simultaneous requests did the programming errors become evident. 
Such testing helps detect application errors and significantly reduces the risk of final 
application deployment. 
 
 
 
 
References. 
 
[WAP1] What is WAP and WAP Forum, The WAP Forum, available online at 
http://www.wapforum.org/what/index.htm, accessed 06/06/2000. 
 
[WAP2] The WAP White paper, WAP Forum, October 1999, available online at 
http://www.wapforum.org/what/WAP_white_pages.pdf, accessed 06/06/2000. 
 
[WAP3] Role of the Architecture Group, WAP Forum 15/02/99, presentation, available 
online at http://www1.wapforum.org/member/developers/slides/WAP-
Architecture/index.htm, accessed 06/06/2000. 
 
[WAP4] Wireless Application Environment Overview, WAP Forum, November 1999, 
available online at  http://www.wapforum.org/what/technical.htm, last access date 
25/08/2000 
 
[WAP5] Wireless Application Environment Specification, WAP Forum, November 1999, 
available online at http://www.wapforum.org/what/technical.htm, last access date 
25/08/2000. 
 
[APP1] App Server Zone, http://www.appserver-zone.com/, last access date 23/08/2000. 
 
[APP2] Introduction to WebLogic Server, http://www.weblogic.com/docs51/intro/index.html, 
last access date 23/08/2000. 
 
[APP3] Java 2 Platform Enterprise Edition, http://java.sun.com/j2ee/docs.html, last access 
date 23/08/2000. 
 
[APP4] J2EE Architecture, Online Documentation, Sun Microsystems 1999, 
http://java.sun.com/j2ee/j2sdkee/techdocs/guides/ejb/html/Overview3.html#9353, last access 
date 23/08/2000. 
 
[APP5] ATG Dynamo main website, http://www.atg.com/, last access date 23/08/2000, 
requires registration to access documents. 
 
 [XML1] Extensible Markup Language (XML) 1.0 W3C Recommendation, W3C, 
10/02/1998, http://www.w3.org/TR/1998/REC-xml-19980210, last access date 26/08/2000. 
 
[XML2] Mastering XML, A. Navarro, C. White, L. Burman, Sybex 2000 
 
[XML3] The XML Companion Second Edition, Neil Bradley, Addison-Wesley 2000. 
 
[XML4] XHTML™ 1.0: The Extensible HyperText Markup Language, A Reformulation of 
HTML 4 in XML 1.0, W3C Recommendation, 26 January 2000, 
http://www.w3.org/TR/2000/REC-xhtml1-20000126/, last access date 23/08/2000. 
 
[XSLT1] eXtensible Stylesheets Language: Transformations, http://www.w3.org/TR/xslt.  
 
[XSLT2] XSLT Programmer’s reference, Michael Kay, Wrox 2000. 
Scenario testing of an Application Server based Multi-channel Architecture. 
 
 35
 
[WAS] The Microsoft Web Application Stress Tool, http://homer.rte.microsoft.com/, last 
access date 27/08/2000. 
 
Appendices. 
 
 
1. Appendix A. 
2. Appendix B. 
3. Appendix C. 
4. Appendix D: Functional Specification Document. 
5. Appendix E: XML content sample and the XSLT filters. 
6. Appendix F: Multi-channel development procedures document. 
7. Appendix G: Test facility technical specifications. 
8. Appendix H: Full test list. 
9. Appendix I: Use Cases definition document. 
Appendix A. 
 
The most crucial factor in the successful completion of this project was planning. 
With the help of Mark Berner, the project manager, the project was initially planned in 
phases, described in broad terms with rough estimates of the time needed for each phase. 
The initial plan was then refined through an iterative process, with tasks defined for each 
phase and more accurate time estimates produced. 
Good time estimates for each phase meant that the potential for delays due to unforeseen 
difficulties was also accounted for. This proved valuable during the implementation phase, 
where the project was delayed by a week but did not exceed the projected deadlines. 
 
Consulting early on with people who had experience in the various subjects touched by the 
project saved time and trouble later. In particular, it helped to limit the scope of the project so 
that it could be achieved within the three-month time frame. 
 
Working within the company also meant that the company's methods had to be adopted. 
Although this required a period of adjustment and familiarisation, it was essential to the good 
cooperation of the team. This period, too, was incorporated into the project plan. 
 
The design of a test strategy also proved very useful. Outlining the goals of the testing 
procedures (the measurements and the objects of testing) before producing a detailed test 
plan allowed the tests to be adjusted when it became obvious that the desired goals would 
not be achieved with the initial testing plan. 
 
Appendix B. 
Appendix C. 
 
 
Appendix D: Functional Specification Document. 
 
Multi-Channel Application Server Scenario Specification 
 
Synopsis 
This document outlines the functional specification for the multi-channel application server 
scenario to be used for deploying a multi-channel architecture. 
 
Author(s) 
Vassilis Rizopoulos 
 
 
Created 
21 Jun 2000 
 
 
Version 
0.1. 
 
Document control 
Table of contents 
0 DOCUMENT CONTROL.................................................................................................42 
0.1 TABLE OF CONTENTS .......................................................................................................42 
0.2 DOCUMENT CHANGES .....................................................................................................43 
0.2.1 Current version 43 
0.2.2 Changes forecast 43 
0.2.3 Distribution 43 
0.3 GLOSSARY......................................................................................................................43 
1 INTRODUCTION .............................................................................................................43 
1.1 PURPOSE.........................................................................................................................43 
2 SCENARIO SPECIFICATION ........................................................................................43 
2.1 CHOICE JUSTIFICATION....................................................................................................43 
2.2 FUNCTIONAL DIAGRAMS .................................................................................................44 
2.2.1 Application Scenario 44 
2.2.2 Testing scenario 45 
3 ISSUES...............................................................................................................................45 
 
Document changes 
Current version 
Version Date Author Comments 
0.1. 21 Jun 2000 Vassilis Rizopoulos 
 
Work in progress 
 
Changes forecast 
This is a first draft of work in progress, subject to approval. Expect changes!  
Distribution 
Sanjay Manandhar, Mark Berner 
 
Glossary 
 
Term Definition 
WEB Used to signify a standard web browsing application like Microsoft Internet 
Explorer or Netscape Navigator. 
WAP Wireless Application Protocol – used to signify the browsing equivalent for WAP 
enabled devices. 
 
Introduction 
In order to test the design of the multi-channel application server architecture and extract 
some useful results, a scenario that represents a real world situation as closely as possible 
will be implemented. 
Purpose 
The project aims to scale the work done by the Innovation Centre onto an application server 
configuration. This includes the design of a framework for multi-channel applications on 
existing server architecture and the examination of performance and reliability issues. 
 
Scenario Specification 
Choice justification 
The choice of scenario is based on the goals of this project: 
• To transfer the Innovation Centre tools and methods for multi-channel architectures 
onto an application server framework. 
• To test the methods for delivering content to different channels and assess 
performance, scalability and robustness. 
Because of the focus on assessing the methods in use, the scenario functionality has been 
kept to the minimum required to provide realistic results. 
My recommendation is to also simplify the content database, although this will have to be 
taken into consideration during load testing of the application. The reasoning is that the 
performance requirements of database access have been extensively researched and 
benchmarked, while the main purpose of this project is to concentrate on the unknown 
factor of content restructuring and transformation. 
Functional diagrams 
Application Scenario 
 
1. Client Identification. 
The application will be able to recognise the type of client (WEB or WAP). 
2. The application will return to the client the login form that corresponds to it. 
3. The application authenticates the user against a user database and exits or proceeds 
accordingly. 
4. The personalisation parameters are determined according to the user profile. 
5. If the client is a standard web client, the content is formatted in HTML (applying XSLT) and 
served. For WAP, the size of the generated content has to be calculated according to client 
limitations, and the content is served in one or more pieces (decks). 
6. (and subsequent steps) Content is delivered to WAP clients in WML form (again using 
XSLT). 
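The deck handling in step 5 can be sketched as follows. This is a minimal illustration, not the project's code: splitting on raw character count is a simplification (a real filter would split on WML card boundaries), and the 1400-byte limit is borrowed from the Nokia entry in the ClientSpec table described later in this report.

```java
import java.util.ArrayList;
import java.util.List;

public class DeckSplitter {
    // Cut content that exceeds a client's maximum deck size into pieces ("decks").
    public static List<String> split(String content, int maxDeckSize) {
        List<String> decks = new ArrayList<>();
        for (int start = 0; start < content.length(); start += maxDeckSize) {
            decks.add(content.substring(start, Math.min(start + maxDeckSize, content.length())));
        }
        return decks;
    }

    public static void main(String[] args) {
        String content = "x".repeat(3000);           // simulated oversized WML content
        System.out.println(split(content, 1400).size()); // 3000 chars at 1400/deck -> 3 decks
    }
}
```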
Testing scenario 
 
 
The complete architecture of a multi-channel application with WAP clients includes a WAP 
gateway, but for the purposes of testing the Application Server performance a simulator 
engine can generate the traffic. 
The engine must be able to: 
• Emulate a basic WAP or Web session (step through the application) 
• Generate a large volume of mixed traffic with varying percentages of Web and WAP 
requests. 
• Produce a set of results for evaluation purposes (to be defined later). 
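The mixed-traffic requirement can be sketched as below. The TrafficMixer class, the 45% WAP fraction and the fixed seed are illustrative assumptions; in practice the Microsoft Web Application Stress tool generated the traffic.

```java
import java.util.Random;

public class TrafficMixer {
    // Decide, per simulated session, whether to step through the WAP or the
    // Web flow, according to a configurable WAP percentage.
    public static int countWap(int sessions, double wapFraction, long seed) {
        Random rnd = new Random(seed); // seeded so test runs are reproducible
        int wap = 0;
        for (int i = 0; i < sessions; i++) {
            if (rnd.nextDouble() < wapFraction) wap++;
        }
        return wap;
    }

    public static void main(String[] args) {
        // Roughly 45% of 10000 simulated sessions should take the WAP path.
        System.out.println(countWap(10000, 0.45, 42L));
    }
}
```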
 
Issues 
• Content DB size. 
As discussed in section 2.1 the size of the content database will probably play a role 
during stress testing of the application server. 
• User DB. 
The user database has to be generated. The high number of users requires a naming 
convention for userids and passwords that can be calculated programmatically by the 
simulation engine and the script that will generate the DB. 
• Client identification. 
There is an ever-increasing number of different clients on the network, especially for 
WAP; we might implement support for only a couple (but will design with extensibility 
in mind). 
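The userid/password naming convention mentioned above can be sketched as follows; the "user00001"/"pass00001" scheme is a hypothetical example, since the document does not fix a particular convention, but it shows how the simulation engine and the DB-generation script can derive identical credentials from an index alone.

```java
public class TestUserNames {
    // A deterministic convention: both the DB-population script and the
    // simulation engine compute credentials from the same integer index.
    public static String userId(int n)   { return String.format("user%05d", n); }
    public static String password(int n) { return String.format("pass%05d", n); }

    public static void main(String[] args) {
        for (int i = 1; i <= 3; i++) {
            System.out.println(userId(i) + " " + password(i));
        }
    }
}
```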
 
 
Appendix E: XML content sample and the XSLT filters. 
 
Sample Preferences Content. 
 
[The XML markup was lost in conversion; the surviving element values for the two stock 
records are:] 
144, LSE, MRN, Merant PLC 
93, LSE, FMN, Fleming Mercantile Inv Trust PLC 
Sample Browsing Content. 
 
[The XML markup was lost in conversion; the surviving element values are:] 
144, LSE, FMN, Merant PLC, 71.00, - 2.00, -2.74 
The WAP filter. 
 
[The XSLT stylesheet markup was lost in conversion; the surviving literal output is the WML 
DOCTYPE declaration and the labels below:] 
<!DOCTYPE wml PUBLIC "-//WAPFORUM//DTD WML 1.1//EN" "http://www.wapforum.org/DTD/wml_1.1.xml"> 
Logout 
Ticker : 
GBP: 
+/-: 
% : 
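Since the stylesheet markup did not survive conversion, the following is a hedged sketch of what such a filter does, written against the standard JAXP API rather than the Xalan 1 classes used in the project. The element names (stock, ticker, price) and the stylesheet body are illustrative assumptions, not the project's actual schema; only the "Ticker :"/"GBP:" labels come from the surviving output. Note that parsing a stylesheet per request proved very expensive in the project's testing, so a real deployment would parse it once and reuse it.

```java
import javax.xml.transform.*;
import javax.xml.transform.stream.*;
import java.io.*;

public class WapFilterSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical stylesheet: turns a stock record into the labelled
        // text lines the real WAP filter emitted inside its WML markup.
        String xsl = "<xsl:stylesheet version='1.0' "
                   + "xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
                   + "<xsl:output method='text'/>"
                   + "<xsl:template match='/stock'>"
                   + "Ticker : <xsl:value-of select='ticker'/>\n"
                   + "GBP: <xsl:value-of select='price'/>\n"
                   + "</xsl:template></xsl:stylesheet>";
        // Hypothetical content record, modelled on the sample values above.
        String xml = "<stock><ticker>MRN</ticker><price>71.00</price></stock>";

        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xsl)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        System.out.print(out.toString());
    }
}
```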
The Web Filter. 
 
[The XSLT stylesheet markup was lost in conversion; the surviving literal output is the page 
title and the labels below:] 
Multi-Channel 
Logout 
Ticker : 
GBP: 
+/-: 
% : 
The Preferences Filter. 
 
[The XSLT stylesheet markup was lost in conversion; the surviving literal output is:] 
Multi-Channel Preferences 
List of Preferences 
Appendix F: Multi-channel development procedures document. 
 
Multichannel Project Development Procedures 
 
Synopsis 
This document covers the development process for the Multichannel project using ATG 
Dynamo 4.5.1. 
 
Author(s) 
Jack Beaken 
 
Created 
21 Jun 2000 
 
Version 
0.1. 
 
Document control 
Table of contents 
0 DOCUMENT CONTROL ..... 52 
0.1 TABLE OF CONTENTS ..... 52 
0.2 DOCUMENT CHANGES ..... 53 
0.2.1 Current version 53 
0.2.2 Prior versions 53 
0.3 DISTRIBUTION ..... 53 
0.4 REFERENCES ..... 53 
0.5 GLOSSARY ..... 53 
1 INTRODUCTION ..... 54 
1.1 PURPOSE ..... 54 
1.2 SCOPE ..... 54 
2 ISSUES ARISING DURING DEVELOPMENT ..... 55 
2.1 CONFIGURING DYNAMO TO USE WAP ..... 55 
2.2 IDENTIFY CLIENT'S BROWSER FROM THE HTTP REQUEST'S USER-AGENT ..... 56 
2.3 USE XML/XSLT TO RETURN HTML/WML CONTENT TO BROWSER ..... 57 
2.4 MICROSOFT WEB APPLICATION STRESS TOOL AND SESSION IDS ..... 58 
 
Document changes 
Current version 
Version Date Author Comments 
0.1. 16 August 2000 Jack Beaken Initial Draft 
 
Prior versions 
Version Date Author Comments 
No prior versions 
 
Distribution 
 
References 
[1] Multichannel Use Cases 
[2] 
 
Glossary 
Term Definition 
Dynamo servlet-mapped: A Javabean that has been mapped to a specific URI in the Servlet 
Pipeline. See the Dynamo Programmer's Guide, page 217, for details. 
 
Introduction 
This document describes issues that arose during development of a simple testing 
application for the Multichannel project. The application was written using ATG Dynamo 
4.5.1; the requirements of the application were: 
• It had to be simple: the purpose of the project was to test the Application Server, not 
create a killer application. The application had a small number of pages which contained 
no images. 
• It had to support multiple channels. WEB and WAP traffic were supported. 
• It had to read and write to a database. 
The application requirements are more fully described in the white papers Multichannel Use 
Cases and The Multichannel Application. See diagram 1. 
The major issues in development were: 
• The configuration of ATG Dynamo for WML as well as HTML pages, or in real terms the 
configuration of Dynamo to process .jwml alongside the proprietary .jhtml. 
• How best to process the XSLT/XML content in terms of speed and stress. 
 
Purpose 
Describe the process for developing a simple WAP/HTTP multichannel application on ATG 
Dynamo 4.5.1 for reference in future Rubus projects. 
 
Scope 
Issues Arising During Development 
 
Configuring Dynamo to use WAP 
Technical Bulletin #29, published by ATG on 29 July 2000, describes the configuration of 
Dynamo for WAP traffic. It was published too late for the Multichannel project, but we 
independently reached many of the same conclusions. The bulletin does go further and 
describes additional MIME types, the $ escape character etc., and should be used in future 
configurations of Dynamo. 
Issues that appeared and that are covered in bulletin #29: 
 
Additional MIME types 
Dynamo uses a proprietary page compiler that can best be described as an extension of 
Java Server Pages. Any page with extension .jhtml will be processed by Dynamo and 
passed down the ServletPipeline as an HTML page. Dynamo had to be configured to 
compile .jwml files into WML. 
 
URL Rewriting 
WAP browsers do not support cookies, so session ids had to be encoded into the URLs. 
See point 2.5 of this document regarding Microsoft WAS. 
 
Form Handling 
The Multichannel project used a GenericFormHandler bean to process the HTML form. We 
did not use a GenericFormHandler bean for the WML because the project did not have 
knowledge of bulletin #29, specifically point 2.4, reproduced below: 
 
2.4 Form handling 
The standard form handling in Dynamo involves the page compiler adding hidden input tags 
to a form. Since WML does not have hidden input tags but instead uses postfield tags, a 
slightly different approach must be adopted in a .jwml page. [The bulletin's example forms 
were lost in conversion; only the literal labels "Name:", "Age:" and "Submit" survive.] 
The Multichannel project used a Dynamo mapped-servlet to process the WML form instead. 
This caused a problem, as the URL-encoded session id was lost when the request was sent 
to the mapped servlet, leaving the client's session invalid. This was due to the WMLLogin 
bean having global scope (all Dynamo servlet-mapped beans must be global). The work-
around was to use a DROPLET tag (a droplet is Dynamo's server-side tag mechanism) in 
the WML form, which called a GetSessionId bean and posted the session id to WMLLogin. 
The following is part of the WMLLogin.jwml form [the markup was lost in conversion; only 
the literal text survives]: 
Username Password Log In 
 
The GetSessionId bean contained the following code: 
 
ServletOutputStream out = response.getOutputStream(); 
out.println(""); // the WML markup written here was lost in conversion 
 
This has been simplified, no escape characters for a start! This is an awkward solution; 
bulletin #29 point 2.4 now shows this was not necessary, and a GenericFormHandler bean 
could have been used after all. Also see point 2.5 of this document concerning Microsoft 
WAS and rewriting URLs. 
 
Identify client's browser from the HTTP Request's User-Agent 
The multichannel application did not initially know the identity of the client's browser (web or 
WAP). The identification of the client had to be browser independent, which meant no HTTP 
response could be returned to the browser prior to identification. A servlet-mapped bean 
called EntryServlet handled new client requests. It compared the client's User-Agent header 
with a database of known User-Agents, stored in the ClientSpecs table (shown below). As 
the table was so small, it was loaded into memory in the EntryServlet.doStartService() 
method. 
 
CLIENTSPEC TABLE 
USERAGENT  MAXCONTENT[6]  BROWSERTYPE 
Mozilla    0              WEB 
Nokia      1400           WAP 
 
If the User-Agent is recognised, a new session is created and the ClientSpec Javabean 
representing that row is put into the session: 
 
session.putValue[7]("CLIENT_SPECS", clientSpecifications); 
 
The client is then redirected to the corresponding login page (WAP or WEB). If the User-
Agent is not recognised, no session is created and the client is redirected to a WML page 
which displays an error message. A WML rather than HTML page is used because a WAP 
browser will not display an HTML page. 
 
Use XML/XSLT to return HTML/WML content to browser 
The XSL style sheet is dependent on the type of browser (WAP or WEB). The Javabean 
that displayed the HTML/WML stock ticker content was called GetContent. 
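The EntryServlet's User-Agent lookup can be sketched as follows. Only the CLIENTSPEC rows come from this document; the substring matching and the class shape are assumptions about how the comparison worked.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ClientIdentifier {
    // In-memory mirror of the CLIENTSPEC table ("Mozilla" -> WEB, "Nokia" -> WAP),
    // loaded once, as EntryServlet.doStartService() did with the real table.
    private static final Map<String, String> CLIENT_SPECS = new LinkedHashMap<>();
    static {
        CLIENT_SPECS.put("Mozilla", "WEB");
        CLIENT_SPECS.put("Nokia", "WAP");
    }

    /** Returns WEB/WAP for a known User-Agent, or null when unrecognised. */
    public static String browserType(String userAgent) {
        if (userAgent == null) return null;
        for (Map.Entry<String, String> e : CLIENT_SPECS.entrySet()) {
            if (userAgent.contains(e.getKey())) return e.getValue();
        }
        return null; // unknown client: the real servlet redirected to a WML error page
    }

    public static void main(String[] args) {
        System.out.println(browserType("Mozilla/4.0 (compatible; MSIE 5.0)"));
        System.out.println(browserType("Nokia7110/1.0"));
    }
}
```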
It is called by both web and WAP clients, and identifies the type of client browser by reading 
the ClientSpec object in the client's session: 
 
HttpSession session = request.getSession(false); 
ClientSpec clientSpecifications = (ClientSpec)session.getValue("CLIENT_SPEC"); 
if(clientSpecifications.getBrowserType().equals("WAP")) 
    respondWML; 
else if(clientSpecifications.getBrowserType().equals("WEB")) 
    respondHTML; 
 
See point 2.2 for information on client specifications. 
 
In the initial multichannel project, the two DOM objects were instantiated for each client 
request. This was achieved by parsing the XSL using the Apache Software Foundation's 
Xalan API (see http://xml.apache.org for further details). In that API, an XSLTInputSource 
represents a DOM object: 
 
XSLTInputSource webInputSource = new XSLTInputSource("locallib/xsl/mc-web.xsl"); 
XSLTInputSource wapInputSource = new XSLTInputSource("locallib/xsl/mc-wap.xsl"); 
 
locallib/xsl/mc-web.xsl is the relative file location of the XSL source 
(c:\atg\dynamo4.5.1\locallib\xsl\mc-web.xsl). 
 
Parsing XSL is a very expensive operation, and the Dynamo servers broke relatively early 
on during testing. A solution was to place the two DOM objects (webInputSource and 
wapInputSource) into memory. The GetContent bean was given global scope rather than 
session scope, and the DOM objects were instantiated as above in the 
GetContent.doStartService() method. See the results for an analysis of the tests. 
 
[6] MaxContent represents the maximum deck size for different WAP browsers. This 
attribute could be used in future projects which use more than one WAP browser. 
[7] putValue(String name, Object obj) is deprecated in Servlet API 2.2 (replaced by 
setAttribute) but Dynamo has to use the Servlet 2.0 API. 
 
The XSLT processing took place in the GetContent.service() method for each request. 
The Apache Foundation's Xalan API was used: 
 
private void transformXML(StringBuffer xmlContent, ServletOutputStream out, boolean isWeb) 
{ 
    Reader xmlReader = new StringReader(xmlContent.toString()); 
    String localXSLFilename; // File location of stylesheet 
    try { 
        // Get the processor 
        XSLTProcessor processor = XSLTProcessorFactory.getProcessor(); 
        // Get XSLTInputSource 
        XSLTInputSource xsl = null; 
        if (isWeb) 
            xsl = webInputSource; 
        else 
            xsl = wapInputSource; 
        XSLTInputSource xmlSource = new XSLTInputSource(xmlReader); // Create DOM object 
        processor.process(xmlSource, xsl, new XSLTResultTarget(out)); // Process XSLT 
        processor.reset(); 
        processor = null; 
    } catch (SAXException e) { 
        logError("SAXException : " + e.getMessage()); 
    } catch (Exception ex) { 
        logError("error: " + ex.getMessage()); 
    } 
} 
 
Microsoft Web Application Stress Tool and Session Ids 
As mentioned in point 2.1, WAP browsers cannot use cookies; Dynamo instead uses URL 
rewriting in order to create persistent sessions. The Microsoft Web Application Stress Tool 
(aka Homer), the Multichannel project's testing tool, only supported cookies, not session id 
rewriting (the HTTP requests could not be scripted). The work-around was to allow all WAP 
traffic to support cookies. This was possible because we used WAS to emulate WAP 
browsers. It was decided that using cookies had no significant impact on our results. The 
decision to use cookies for WAS traffic meant the Javabean GetSessionId in 
WMLLogin.jwml was not needed (see point 2.1). 
 
Appendix G: Test facility technical specifications. 
Web Servers (2): Compaq 1850R, 2 Intel Pentium III 500MHz CPUs, 512Mb RAM, 
2 x 18Gb HDD, 5 NICs* 
Application Servers (2): Compaq 1850R, 2 Intel Pentium III 500MHz CPUs, 512Mb RAM, 
2 x 18Gb HDD, 3 NICs 
Domain Controllers (2): Compaq 1850R, Intel Pentium III 500MHz, 512Mb RAM, 
2 x 18Gb HDD, 3 NICs 
Management Server (1): Compaq 1850R, Intel Pentium III 500MHz, 256Mb RAM, 
2 x 18Gb HDD, 2 NICs, 35/70 DLT 
Database Server (2 node cluster): Compaq CL1850R, 4 x 18Gb Shared HDD, 
1 x Controller. Each node: 2 Intel Pentium III 550MHz CPUs, 512Mb RAM, 2 x 18Gb HDD, 
3 NICs, Array Controller. 
 
* NIC: Network Interface Controller. 
 
Appendix H: Full test list. 
 
No. | Date of test | File name | Description of the Performance monitor file | Script | Configuration 
1 | 8/9/00 | 000809-change10test-5.log | Initial tests. Not valid. | _ | _ 
2 | 8/9/00 | 000809-change10test-5_2 | Initial test. Not valid. | _ | _ 
3 | 8/9/00 | 000809-change10test-5_3 | Initial test. Not valid. | _ | _ 
4 | 8/9/00 | 000809-full-256-1-10.log | Initial test. Not valid. | _ | _ 
5 | 8/9/00 | 000809-single-50-4 | Initial test. Not valid. | _ | _ 
6 | 8/10/00 | Browsing-2 clients 40th1s.log | Initial test. Not valid. | _ | _ 
7 | 8/10/00 | Change 2 threads one socket.log | Initial test. Not valid. | _ | _ 
8 | 8/10/00 | Change10 5 threads one socket.log | Used to illustrate errors and optimization. | _ | _ 
9 | 8/10/00 | ChangeNoPost 10-1 no restart.log | Initial test. Not valid. | _ | _ 
10 | 8/10/00 | ChangeNoPost 10-2 no restart.log | Initial test. Not valid. | _ | _ 
11 | 8/10/00 | ChangeNoPost 40-1 no restart.log | Initial test. Not valid. | _ | _ 
12 | 8/10/00 | ChangeNoPost 5-1 no restart.log | Initial test. Not valid. | _ | _ 
13 | 8/10/00 | ChangeNoPost 5-1.log | Initial test. Not valid. | _ | _ 
14 | 8/10/00 | ChangeNoPost 5-2 no restart.log | Initial test. Not valid. | _ | _ 
15 | 8/11/00 | 40x1x4.log | 40 threads, 1 socket and 4 clients. | Browse WAP | I 
16 | 8/11/00 | 50x1x4.log | 50 threads, 1 socket and 4 clients. | Browse WAP | I 
17 | 8/11/00 | 60x1x4.log | 60 threads, 1 socket and 4 clients. | Browse WAP | I 
18 | 8/11/00 | 60x1x4-100 take 2 with web01 app log.log | 60 threads, 1 socket, 4 clients and 100 DRP requests. (Second run of the script.) | Browse WAP | I 
19 | 8/11/00 | 60x1x4-100.log | 60 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WAP | I 
20 | 8/11/00 | 80x1x4-100.log | 80 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WAP | I 
21 | 8/11/00 | 100x1x4.log | 100 threads, 1 socket and 4 clients. | Browse WEB | I 
22 | 8/11/00 | 200x1x4.log | 200 threads, 1 socket and 4 clients. | Browse WEB | I 
23 | 8/11/00 | 50x1x4.log | 50 threads, 1 socket and 4 clients. | Browse WEB | I 
24 | 8/14/00 | 100x1x4-100drp-3apps-1web 45_45_10.log | 100 threads, 1 socket, 4 clients and 100 DRP requests. | Browse All | II 
25 | 8/14/00 | 80x1x1-100drp-3apps-1web 45_45_10.log | 80 threads, 1 socket, 1 client and 100 DRP requests. | Browse All | II 
26 | 8/14/00 | 30x1x4-100.log | 30 threads, 1 socket, 4 clients and 100 DRP requests. | Browse Big | I 
27 | 8/14/00 | 120x1x4-100-2apps in app02-1web.log | 120 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | III 
28 | 8/14/00 | 120x1x4-100-2apps in app02-2webs.log | 120 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | IV 
29 | 8/14/00 | 120x1x4-1003apps-1 web ll.log | 120 threads, 1 socket, 4 clients and 100 DRP requests. (Second run of the script.) | Browse WEB | II 
30 | 8/14/00 | 120x1x4-100-3apps-1 web.log | 120 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | II 
31 | 8/14/00 | 120x4-100-3apps-2webs.log | 120 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | V 
32 | 8/14/00 | 180x1x4-100-2apps in app02-1 web.log | 180 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | III 
33 | 8/14/00 | 180x1x4-100-3apps-1 web.log | 180 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | II 
34 | 8/14/00 | 180x1x4-100-3apps-2webs.log | 180 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | V 
35 | 8/14/00 | 30x1x4-100.log | 30 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | I 
36 | 8/14/00 | 90x1x4-100-2apps in app02-1 we.log | 90 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | III 
37 | 8/14/00 | 90x1x4-100-3apps-1web.log | 90 threads, 1 socket, 4 clients and 100 DRP requests. | Browse WEB | II 
38 | 8/15/00 | Crash recovery test.log | 180 threads, then 40 threads, no connections until all sessions have expired, then another run of the 40 threads. | Browse WEB | II 
39 | 8/15/00 | Final Script 60 threads 2 app 1 web.log | 60 threads, 1 socket and 4 clients. | Browse WEB | III 
40 | 8/15/00 | Final Script 80 threads 2 app 1 web.log | 80 threads, 1 socket and 4 clients. | Browse WEB | III 
41 | 8/15/00 | Final Script 80 threads 3 app 1 web.log | 80 threads, 1 socket and 4 clients. | Browse WEB | II 
 
Server Configuration | Description of configuration 
I. | 1 Dynamo application server and 1 IIS web server. 
II. | 3 Dynamo application servers and 1 IIS web server. 
III. | 2 Dynamo application servers and 1 IIS web server. 
IV. | 2 Dynamo application servers and 2 IIS web servers. 
V. | 3 Dynamo application servers and 2 IIS web servers. 
 
Appendix I: Use Cases definition document. 
 
Multi-channel Scenario use cases 
 
Synopsis 
Use Cases for the Multi-channel testing scenario application. 
 
Author(s) 
Vassilis Rizopoulos 
 
Created 
21 Jun 2000 
 
Version 
0.1. 
Document control 
Table of contents 
0 DOCUMENT CONTROL ..... 65 
0.1 TABLE OF CONTENTS ..... 65 
0.2 DOCUMENT CHANGES ..... 66 
0.2.1 Current version 66 
0.2.2 Prior versions 66 
0.2.3 Changes forecast 66 
0.3 DISTRIBUTION ..... 66 
0.4 REFERENCES ..... 66 
0.5 GLOSSARY ..... 66 
1 INTRODUCTION ..... 67 
1.1 PURPOSE ..... 67 
2 USE CASES ..... 68 
2.1 USER LOGS ON TO THE SYSTEM ..... 68 
2.2 USER CALLS SERVING MODULE DIRECTLY ..... 68 
2.3 USER PERFORMS BROWSING TASKS ..... 68 
2.4 USER CHANGES PREFERENCES ..... 68 
2.5 USER LOGS OUT ..... 68 
Document changes 
Current version 
Version Date Author Comments 
0.2 06 July 2000 Vassilis Rizopoulos 
 
Prior versions 
Version Date Author Comments 
0.1 03 July 2000 Vassilis Rizopoulos 
 
Changes forecast 
 
Distribution 
Mark Berner, Sanjay Manandhar, the MC team 
 
References 
Rubus Stock Use Cases, Enricos Manassis 
 
Glossary 
Term Definition 
 
Introduction 
This document describes the user-system interaction for the testing application used in the 
multi-channel project. It follows the format set up by the "Rubus Stock use cases" 
document. The Multi-channel application is a functionally very simple scenario with the 
specific purpose of testing the methods for delivering content across multiple channels 
through application servers. Focus has been shifted to examining the capabilities of the 
deployment environment (the application server) to cope with the high processing demands 
of content transformation. 
 
Purpose 
To provide a framework for developing the application and a basis for determining the test 
cases. 
 
Use Cases 
 
User logs on to the system 
The user knows the URL of the application's home page and uses a known client. If the 
client is unknown, the system returns a default message; otherwise a login dialog is 
presented where the user is required to enter: 
• Username 
• Password 
The form does not perform any validation tasks. On submission the system validates the 
input and verifies the information. On invalid information the login page is presented again 
with an appropriate message: 
• Wrong password 
• Empty username 
• Unknown user 
On successful login the system passes control to the serving module. 
 
User calls serving module directly 
If a valid session ID is appended to the URL, the system serves the appropriate content. If 
the session has expired or no session ID is provided, the user is redirected to the login 
page. 
User performs browsing tasks 
When presented with multiple pages of the stock quotes defined by the user profile, the 
user has the choice of moving to the first, last, next or previous page. The system serves 
the appropriate content. 
 
User changes preferences 
The user is presented with a form containing the available stock tickers, with those tickers 
that match the profile choices checked. The user then has the option of revising the list of 
quotes he/she will receive when querying the system. The system updates the preference 
database and informs the user of success or failure to do so. 
 
User logs out 
At any point after logging in, the user can choose to log out. This forces the system to 
invalidate the session ID and force any subsequent hits bearing this ID through the login 
page. The user is presented with a logout page. 
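The session rules running through these use cases (log on, direct calls to the serving module, log out) can be sketched together as follows. The ServingGate class and its in-memory session set are hypothetical stand-ins for Dynamo's session tracking, not the project's code.

```java
import java.util.HashSet;
import java.util.Set;

public class ServingGate {
    private final Set<String> liveSessions = new HashSet<>();

    public void login(String sessionId)  { liveSessions.add(sessionId); }
    public void logout(String sessionId) { liveSessions.remove(sessionId); } // invalidate

    /** Routes a request carrying this session id: content if valid, else login. */
    public String route(String sessionId) {
        if (sessionId != null && liveSessions.contains(sessionId)) {
            return "content";
        }
        return "login"; // expired, missing or invalidated session: back through login
    }

    public static void main(String[] args) {
        ServingGate gate = new ServingGate();
        gate.login("abc123");
        System.out.println(gate.route("abc123")); // valid session: serve content
        gate.logout("abc123");
        System.out.println(gate.route("abc123")); // subsequent hit forced to login
    }
}
```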