Applying Process Analysis and Prototype Techniques to Developing Media Service Software
Forced to replace a legacy media services order entry system, the University engaged in a cross-functional process analysis effort, coupled with rapid application development techniques, to create a reengineered system compliant with established information architecture guidelines. This report discusses the methodologies employed to analyze, develop, and revise the system.
In support of its centralized Instructional Media Services group, the University of Pittsburgh maintained a host-centric application to schedule classroom equipment, services, and personnel.
The software ran on an aging hardware/database platform with extremely limited capabilities and was not Y2K compliant. After an extensive systems investigation failed to identify viable commercial software options, a project was initiated in early 1997 to develop a new software solution.
The new software was designed to comply with the University’s established information architecture guidelines. To ensure that the system would be responsive to customer needs, a cross-functional project team engaged in a process analysis effort, while systems analysis staff simultaneously used rapid application development techniques to generate working prototype applications. The system developed as a result of this two-pronged methodology failed to provide a satisfactory solution.
Using a modified technique, a subsequent process analysis effort identified process disconnects, created a new target envisioned process, and guided the application development modifications. The resultant system successfully addressed the needs of the institution. This paper describes the analysis, development, and revision processes employed to bring the project to fruition.
The University of Pittsburgh is a state-related public research university and a member of the Association of American Universities (AAU). Founded in 1787, Pitt offers graduate, undergraduate, professional, and continuing education programs through 19 schools and four regional campuses. The University typically serves about 28,000 FTE in the fall term, and employs nearly 4,000 full- and part-time faculty members. Pitt’s regional campuses are located between 30 and 170 miles from the main campus in Oakland, Pittsburgh’s cultural and medical center. Among the notable sites on the Oakland campus is the 42-story Cathedral of Learning with its 28 nationality classrooms, each representing the heritage of one of the region’s many ethnic groups.
The Center for Instructional Development & Distance Education (CIDDE) works directly with schools, departments, and individual faculty members to facilitate the academic goals of the University in the areas of instructional development and technology, media support, faculty development, and distance education. The Instructional Media Services (IMS) group within CIDDE is responsible for providing technology support for classroom activities and special events. IMS technicians provide operator and technical support for events in the University’s media-enhanced facilities and also deliver equipment and materials to rooms that have no built-in technology. The IMS staff of 11 full-time and 25 student employees (part-time) supports over 11,000 media-related events each year.
The problem addressed in this report is the applicability of process analysis and rapid application development methodologies to the creation of responsive information system solutions in a complex university setting. The specific client/server application that was developed addresses the management of the equipment, services, and staff associated with the delivery of instructional media services across the University.
To help manage their classroom media efforts during the 1980s and 1990s, the Instructional Media Services group used a host-centric application running on an AT&T 3B2 Unix minicomputer. This character-based application was home grown, developed using an early version of Informix, and accessible from VT220 terminals or from microcomputers running the Kermit terminal emulator. The application provided the functionality the department required, but had several problems:
- The system hardware was obsolete and out of maintenance.
- The application software was not Y2K compliant.
- The software had a variety of problems: its functionality was limited, data structures were inflexible and difficult to change, and end users had no control over the application and were not able to perform ad hoc reporting.
- The application’s character-based interface was not intuitive, required a steep learning curve, and was difficult for even the experienced operator to use.
- The system had significant performance and reliability problems.
- The original author was no longer employed at the University, and none of the existing information technology staff was familiar with the technologies employed.
By 1997, it was clear that the application had to be replaced. The initial investigation articulated the required functionality and desired features to address the limitations of the old software. An aggressive search of available software packages identified eight candidates, three of which were examined in detail, and one tried on a demo basis. When none of the candidates proved acceptable, a feasibility study was initiated to determine if the application could be written in-house.
Information Architecture Guidelines
In 1994 the University completed a project to define an enterprise-wide information systems architecture. The architecture is based on an information systems philosophy and set of related principles that articulate objectives and quality characteristics for the University Information System. The architecture is intended to guide the analysis, design, and decision-making relative to all aspects of information systems and processes. It determines the technological approach taken in defining components of the architecture and how they must operate, and provides a set of guidelines by which information system design decisions can be made.
For example, to comply with the enterprise architecture, applications should minimally:
- Capture data one time at its source.
- Facilitate flexibility and ease of adapting to changes in policy, to incremental improvements in processes, and advances in technology.
- Implement adopted standards, such as utilizing an SQL-compliant database.
- Utilize the client/server model as the basic paradigm.
- Implement a common graphical user interface (GUI).
The authors valued the enterprise guidelines and were committed to building local applications and systems that complied with them.
As recommended by the information systems architecture, business processes should be reviewed to determine the need for reengineering or for process improvement. The first step in process analysis is typically process mapping. The University of Pittsburgh uses an approach to process mapping based on the theory and research of Geary A. Rummler and Alan P. Brache, and modified by the work of Kathleen Shade and her colleagues in the Organizational Development department in the University’s Human Resources unit.
Process mapping determines how a process currently works with the goal of identifying and implementing solutions to improve the efficiency or effectiveness of that process. Process mapping focuses on the flow of work and people through different areas and departments and the value each step in the process adds to the final product or service. All of the steps in a process may be contained within one function (e.g. a computer programming process) but most processes are cross-functional, spanning the white spaces between the boxes on an organization chart.
Process mapping is accomplished in a series of meetings run by a facilitator who is expert in leading such sessions. After explaining the process mapping procedures, the facilitator leads the group in painting a picture of the process by identifying the tasks that comprise the process, identifying the responsible job functions of the process participants, and “mapping” the process as a series of steps over time. The resultant map is a two-dimensional grid showing the participants (by job function) on the vertical axis and time on the horizontal axis.
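The role-versus-time grid described above can be sketched as a simple data structure. The following is purely an illustration; the roles and tasks shown are hypothetical examples, not the actual IMS maps:

```python
# Minimal sketch: a process map as a grid of participants (roles) against time.
# The roles and task names below are hypothetical, for illustration only.
from collections import defaultdict

def build_process_map(steps):
    """steps: list of (time_index, role, task) tuples -> {role: {time_index: task}}."""
    grid = defaultdict(dict)
    for t, role, task in steps:
        grid[role][t] = task
    return grid

steps = [
    (1, "Patron", "Submit media request"),
    (2, "Order Entry Specialist", "Record order"),
    (3, "Operations Supervisor", "Assign equipment and operator"),
    (4, "Equipment Operator", "Deliver and set up equipment"),
]
process_map = build_process_map(steps)
```

Reading down a column shows who acts at each point in time; reading across a row shows one job function's contribution to the whole process, which is exactly what makes cross-functional hand-offs (the "white spaces") visible.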
Figure 1: Format for Mapping Processes
A critical success factor for process mapping is ensuring that all of the appropriate stakeholders are represented in the process mapping team. Since all but the simplest processes are cross functional, a strong commitment from senior level management may be necessary to encourage cooperation. The media services process analysis included the following representative users from all steps in the process:
- Process Owner (Manager of Media Services)
- Media Services Staff Supervisors
- Order Entry Specialist
- Equipment Operators
- Inventory Manager
- Classroom Technology Engineer
- Systems Analyst/Programmer
- Billing Manager
- University Auditor
- Process Analysis Facilitator
- Unit Administrator
Customers were conspicuously missing from the cross functional team. In this case, the customers of the service are University faculty. Their needs and input were determined through a facilitated focus group meeting. The meeting was attended by a select group of faculty representing different categories of system patrons, including heavy users of technology, novice users, and faculty with special needs such as those teaching film studies, computer science, and fine arts.
The process team delineated several major processes and completed current process maps for each. The process mapping methodology enabled the participants to move, add, or delete tasks as necessary to ensure a consistent level of documentation. The activity was useful in enabling all team participants to understand and agree on all of the steps of the process as it currently existed. One of the most difficult challenges in this phase of process mapping is to delineate the process as it currently is, rather than as the participants think it should be operating. For this reason, the current process map is often called the “is” map.
After the process maps were completed, the team began to identify every possible thing that could go wrong at each step throughout the process. These “things that go wrong,” called “disconnects,” were placed on the map and described separately. Figure 2 shows a partial excerpt from the “office deliveries” process, labeled with its process disconnects. Figure 3 shows a brief description of some of the identified disconnects.
Figure 2: Excerpt From Current Process Map
Figure 3: Excerpt From Disconnect List
- Multiple people/areas may need to be consulted for more complex jobs.
- Insufficient information from patron to determine if job can be filled.
- Staff (Operations Supervisor, Inventory Manager, Coordinator) not available for consultation.
- Patron does not understand terminology.
- Patrons not familiar with resources and/or limitations of media services.
- Patrons place orders at last minute.
- Difficult to read patrons’ writing, especially on Fax. Fax may not arrive.
- Patrons may want “personal” confirmation, so may not use the Web.
- Web page does not provide as much information as some might like (Film/Video catalog).
- System runs slowly.
- Patrons try to circumvent the system (appeal to supervisor if denied services).
- Patrons don’t want to accept alternative of setting up equipment themselves.
- Patrons provide no account number or inaccurate account number.
Identifying and describing the process disconnects was another difficult challenge. The process mapping facilitator went to great lengths to deflect the natural defensiveness of the project participants and keep them on task. The process disconnects were subsequently analyzed for their causes and for their impact on the process. The disconnects were plotted onto a grid to reflect the effort required to address the disconnect versus the impact on the process if the disconnect were eliminated (see Figure 4).
The disconnect analysis, ordinarily a tool for process improvement activities, helped define areas that the newly defined process should address.
Figure 4: Disconnect Analysis
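An effort/impact grid of this kind lends itself to a simple prioritization rule: low-effort, high-impact disconnects surface first. The sketch below uses hypothetical disconnects and scores, not the team's actual ratings:

```python
# Minimal sketch of a disconnect effort/impact analysis.
# Disconnect names and 1-10 scores below are hypothetical examples.
def prioritize(disconnects):
    """Each disconnect is (name, effort 1-10, impact 1-10).
    Classify into grid quadrants and rank quick wins first."""
    order = ["quick win", "major project", "fill-in", "question mark"]

    def quadrant(effort, impact):
        if impact >= 5 and effort < 5:
            return "quick win"       # high impact, low effort
        if impact >= 5:
            return "major project"   # high impact, high effort
        if effort < 5:
            return "fill-in"         # low impact, low effort
        return "question mark"       # low impact, high effort

    return sorted(
        ((name, quadrant(e, i)) for name, e, i in disconnects),
        key=lambda pair: order.index(pair[1]),
    )

disconnects = [
    ("Illegible fax orders", 2, 8),
    ("Last-minute orders", 8, 9),
    ("Missing account numbers", 3, 4),
]
ranked = prioritize(disconnects)
# ranked[0] is ("Illegible fax orders", "quick win")
```

The same quadrant logic, applied by hand on the wall chart, is what let the team use the disconnect analysis to decide which problems the envisioned process had to address first.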
The last step in the process analysis was to create a new process map, called the “envisioned” or “should” map, that could guide system development decisions. The challenge the participants faced in mapping the envisioned process included the resistance to change natural to all participants in a process and the difficulty in thinking “outside the box” as part of a general brainstorming session. The facilitator helped the group to consider suggestions that would ordinarily be summarily dismissed. The only negative aspect of the process mapping was that it took an inordinate amount of time due to difficulties in scheduling the facilitator.
Systems Analysis, Design and Implementation
Based on prior experience as members of the information architecture team, the authors were advocates of the rapid application development methodology. The systems analyst assigned to the project immediately began developing a prototype application and engaging end users in its assessment and refinement. Figure 5 depicts the RAD methodology, adapted from the Gartner Group. The most important feature of the methodology is the commitment to continuous user involvement throughout the application development life cycle.
Figure 5: Application Development Methodology
The prototyping technique involved first performing a review of the existing system and designing and normalizing the new database. Because of the rapidly changing nature of classroom technologies and the services associated with them, it was critical that the new system be designed with as much flexibility as possible. The system would need to be responsive to end users, and not require time-consuming development work to add new functionality and database elements.
To accomplish these goals, the application was designed to utilize “metadata,” or data about data. In the metadata approach, the content, appearance, and functionality of the application windows are driven by metadata elements and dynamically configured at run time, as opposed to being hard-coded in the application. The application therefore allows real-time additions and modifications to user forms without requiring further database work to add new classes of equipment.
The new application was named Opus, an acronym for “Order Processing System for University Media Services.” Opus is a client/server application, developed in Visual Basic, and running on a Windows NT server utilizing Microsoft SQL Server 6.5. The server is configured with mirrored disk and automated backup. Standard Windows NT and Windows 2000 PCs are used as clients, as well as a small number of Net PCs for the “will-call” counter.
The implementation planned for converting the legacy system data into the new system was to follow a parallel scheme. The parallel conversion method is staff intensive, as it requires duplicate data entry into both the old and new system, but it is the most reliable method for critical functions such as classroom support. Unfortunately, the legacy platform failed two weeks prior to the planned implementation date, and the beta version of Opus was forced into production prematurely (the “plunge” conversion method). Figure 6 depicts the development and implementation timeline.
Figure 6: Opus System Development Timeline
Systems Maintenance and Evaluation
When initial systems development was complete and the system became operational, attention was turned to identifying and correcting bugs and problems. After the system stabilized, it was clear that it did not live up to expectations. Users complained of performance problems and of awkward sequences of steps to perform some operations. The new system did not accurately implement the envisioned process map.
How had the system deviated from the vision? Perhaps engaging in application prototyping while simultaneously pursuing process engineering was a mistake. In retrospect, the stark contrast between the evolving GUI of the new prototype and the antiquated character-based legacy application had the effect of limiting the creativity of the project participants during the envisioning process. The new prototype was so much better than the legacy system that the users were willing to jump to the conclusion that it would meet all of their needs.
When it became clear that the new system had flaws, a second smaller process-mapping group was reconvened with a new facilitator. The facilitator reviewed the process analysis methodology with the participants and guided them through a mapping of the newly implemented system in a timely manner. This effort, conducted with a more mature and demanding group of users, produced an envisioned process map that guided the revision of Opus, phased over two releases.
The Resultant Media Order Entry System
Opus provides a comprehensive media services management system. Its metadata driven design allows end users to redefine its functionality at both the database and application levels in real time. Its reports are available on demand via the on-line graphical interface and in ad hoc reports via a Microsoft Access interface to defined logical user views. Opus includes the following functionality:
- Real-time scheduling of equipment, media, and staff associated with campus locations.
- Maintenance of patron, staff, building, equipment, media, and classroom records.
- Management and tracking of equipment inventory, utilizing barcode scanning.
- Integration of an associated film/video collection database.
- Creation of tailored customer confirmations.
- Graphical scheduling of staff assignments.
- Collection, processing and reporting of staff payroll data.
- Creation of bills and financial reports.
Summary and Conclusion
The project was notable for several reasons. First, it was guided by principles and standards articulated in the University’s established information architecture. These guidelines resulted in a multi-tiered client/server application designed using application metadata to create a flexible, maintainable, and customizable (in real time) application. Second, the project utilized a process analysis methodology that mapped the existing business process, identified and analyzed process disconnects, and built an ideal envisioned process. Third, the project used rapid application development techniques to design a working prototype application. Finally, when these methodologies produced less-than-satisfactory results, a revised process analysis effort led the project to a successful completion.
The project, through the use of sound architectural principles, prototyping techniques, and the successive application of process analysis methodologies, resulted in a successful implementation, fully meeting the requirements of all stakeholders in the process. The following lessons were learned from the project:
Process analysis can improve the responsiveness and functionality of application development efforts. Critical success factors include:
- Creating a cross-functional process analysis team.
- Obtaining senior management support and involvement.
- Performing the process analysis in a timely fashion.
Prototyping is an effective methodology for designing and implementing responsive and user-friendly systems. Critical success factors include:
- Development must be rapid and responsive to end user feedback.
- Communication between developers and end users must be frequent and rich.
- The timing of process analysis versus prototyping activities is crucial. Process analysis should precede prototyping.