Title: Crisis Information Management Software (CIMS) Feature Comparison Report
Series: Special Report
Author: National Institute of Justice
Published: October 2002 (electronic only)
Subject: Technology in law enforcement and information systems
40 pages
80,000 bytes

-------------------------------

Figures, charts, forms, and tables are not included in this ASCII plain-text file. To view this document in its entirety, download the Adobe Acrobat graphic file available from this Web site or order a print copy from NCJRS at 800-851-3420 (877-712-9279 for TTY users).

-------------------------------

U.S. Department of Justice
Office of Justice Programs
National Institute of Justice

Special Report

Crisis Information Management Software (CIMS) Feature Comparison Report

-------------------------------

U.S. Department of Justice
Office of Justice Programs
810 Seventh Street N.W.
Washington, DC 20531

John Ashcroft
Attorney General

Deborah J. Daniels
Assistant Attorney General

Sarah V. Hart
Director, National Institute of Justice

This and other publications and products of the U.S. Department of Justice, Office of Justice Programs and NIJ can be found on the World Wide Web at the following sites:

Office of Justice Programs
http://www.ojp.usdoj.gov

National Institute of Justice
http://www.ojp.usdoj.gov/nij

Crisis Information Management Software (CIMS) Feature Comparison Report

NCJ 197065

-------------------------------

NIJ

Sarah V. Hart
Director

Acknowledgments

This report was developed in response to numerous requests from State and local public safety agencies. The project was funded by the National Institute of Justice and the Office for Domestic Preparedness. Other partners included the National Emergency Management Association, the District of Columbia Emergency Management Agency, and the Institute for Security Technology Studies at Dartmouth College. Staff from the United States Navy's Space and Naval Warfare Systems Center in Charleston, South Carolina, provided technical assistance and served as the project officer. The bulk of the research was conducted by Camber Corporation.

Findings and conclusions of the research reported here are those of the authors and do not reflect the official position or policies of the U.S. Department of Justice.

The National Institute of Justice is a component of the Office of Justice Programs, which also includes the Bureau of Justice Assistance, the Bureau of Justice Statistics, the Office of Juvenile Justice and Delinquency Prevention, and the Office for Victims of Crime.
-------------------------------

Table of Contents

1.0 INTRODUCTION
1.1 Purpose of This Report
1.2 Incident Management System (IMS)
1.3 Feature Comparison Accommodates the Conditions of the Local and State Emergency Management Community
2.0 BACKGROUND
2.1 Crisis Information Management Software
2.1.1 CIMS Research: Four Sources
2.1.2 Site Surveys
3.0 GENERAL FINDINGS
3.1 Understanding the Feature Comparison Results From the Perspective of Your Agency's Priorities
3.2 All 10 Products Are Viable but Differ in Significant Ways
3.2.1 System Environment
3.2.2 Functions
3.2.3 Security
4.0 FEATURE COMPARISON PROCESS
4.1 Features and Descriptors
4.2 Development of Feature Comparison Procedures and Features
4.3 Feature Comparison Schedule
4.4 Feature Comparison Teams
4.5 Feature Comparison Sequence
5.0 SUMMARY
6.0 APPENDIX A--CANDIDATE CIMS PRODUCTS
7.0 APPENDIX B--FEATURE COMPARISON SUMMARY
8.0 ATTACHMENT 1--GUIDE TO USING THE FEATURE COMPARISON MATRIX
8.1 Introduction
8.2 Entering Data into the Feature Comparison Matrix
8.2.1 Steps 1-3: Feature Weighting
8.3 Results
8.3.1 All Feature Results
8.3.2 Results for Highly Weighted Features Only
8.4 Vendor Scoring Detail
9.0 ATTACHMENT 2--VENDOR APPLICATION HOSTING MODELS
10.0 ATTACHMENT 3--PRICING GUIDELINES
10.1 Overview
10.2 Checklist
11.0 ATTACHMENT 4--FEATURE COMPARISON QUESTIONNAIRE

-------------------------------

1.0 INTRODUCTION

The Crisis Information Management Software (CIMS) Test Bed Project was implemented by the U.S. Department of Justice, National Institute of Justice (NIJ)/Office of Science and Technology (OS&T), in support of its Critical Incident Technology Program (CITP). Through the CITP, NIJ advances the research and development of public safety technologies that assist State and local law enforcement and other public safety professionals in preventing and responding to critical incidents, including acts of terrorism. CIMS, the software found in emergency management operation centers, supports the management of crisis information and the corresponding response by public safety agencies.

The primary goal of the CIMS Test Bed Project is to assist Emergency Management Agencies (EMAs) in comparing and contrasting commercially available CIMS software. Source selection is not a goal of this project. Accordingly, EMAs can apply the results of this project to their procurement process for selecting the product that best meets their requirements and financial criteria, while industry vendors can apply the results to improving their product offerings.

1.1 Purpose of This Report

This report presents the detailed results for the products compared within the CIMS Test Bed. Its contents include the following:

o Background pertaining to the activities leading up to the product feature comparisons.
o General findings and the implications for the selection of an appropriate product for each agency.
o How the feature comparisons were conducted.
o CIMS candidate product information.
o A Feature Comparison Matrix that permits each agency to apply the results to its priority requirements.

1.2 Incident Management System (IMS)

CIMS is part of the integrated incident management system (IMS) concept. IMS is a notional term defined by the CIMS project to represent the ideal system that integrates multiple technologies (e.g., cell phones, personal digital assistants, radios). The development of an integrated IMS is among the top priorities articulated by the State and local incident response community.
The fundamental objective is optimizing emergency management operations through technology tools that augment and enhance the deployment of emergency response assets. An integrated IMS requires that an EMA use an integrated system of technical capabilities that operates seamlessly and without duplication. This notion combines the elements of policy and procedure within a comprehensive emergency response plan that is aided by information technology. The objective of an integrated IMS is a system that connects all elements of an agency's response profile (telecommunications, wireless, network, voice, video, and audio) and eliminates separate stovepipe communications networks.

Consequently, the IMS concept affected the development of the feature comparison questionnaire. For example, the questionnaire included questions about the degree to which the product had built-in interfaces for other technologies typically used in EMA operations (e.g., mapping, alert notification, wireless communications, radio, voice, video, audio, and remote access capabilities).

In addition to this report, this project produced a Feature Comparison Matrix--an Excel-based spreadsheet that allows users to input criteria specific to their agency and then identifies the CIMS products, among those reviewed, that meet such criteria. The matrix also allows users to sort the evaluation results in many ways. The matrix is available at http://www.ojp.usdoj.gov/nij/temp/publications/197065-Matrix.zip.

1.3 Feature Comparison Accommodates the Conditions of the Local and State Emergency Management Community

The feature comparison was designed to replicate to the greatest extent possible the conditions customarily found in State and local EMAs. For example, a modern "mid-tier" local area network (LAN) was designed as the platform for conducting the feature comparisons. A mid-tier LAN is the type of computer data network used by the majority of cities and counties throughout the country. The largest metropolitan areas, such as New York City and Los Angeles, operate more advanced or "high-tier" computer networks.

The feature comparison posed questions consistent with the basic functional needs common to all the agencies while accounting for the different approaches to how software is used. For example, the test bed planned for (1) software products that are accessible over the Internet by dispersed participants and (2) software products that are used only from the LAN. These differences required a feature comparison that covered the wide range of features and a network capable of supporting the different modes of access.

The CIMS Test Bed also provided the necessary configuration to compare a "hybrid" system installation. The hybrid installation is designed to provide access to the application at both the application service provider (ASP) site (for remote users) and the LAN site (for the other users). This type of installation requires a design that synchronizes the data between the two server sites.

While the CIMS Test Bed compared LAN, ASP, and hybrid functionality, there was no attempt to compare an additional mode of operation--standalone. Standalone operation allows the CIMS software to be run on a single computer if necessary, such as when a network connection is not available. Several vendors stated that they support this capability. An additional capability that may be offered within the standalone mode is the ability to synchronize data with a "master" server once network communications have been restored. Organizations that are interested in or require this type of functionality should check with the individual vendors to determine if and how they support the standalone mode.
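The report does not describe how any vendor implements this synchronization. Purely as an illustration, the Python sketch below shows one simple reconciliation rule--newest timestamp wins--that a standalone-capable product might apply when reconnecting to its master server. The record structure and all names are hypothetical and are not drawn from any compared product.

# Hypothetical sketch of standalone-to-master synchronization using a
# last-write-wins rule keyed on record timestamps. Vendor products may
# reconcile data very differently; nothing here reflects a compared product.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class EventRecord:
    record_id: str        # unique key shared by both data stores
    updated_at: datetime  # time of last modification
    payload: dict         # the event log entry, status report, etc.

def synchronize(standalone: dict, master: dict) -> dict:
    """Merge records captured offline into the master store.

    For each record ID, the copy with the newer updated_at wins.
    Records that exist on only one side are kept as-is.
    """
    merged = dict(master)
    for rid, local in standalone.items():
        remote = merged.get(rid)
        if remote is None or local.updated_at > remote.updated_at:
            merged[rid] = local
    return merged

# Example: an entry edited while the network was down replaces the
# master's stale copy once communications are restored.
t0 = datetime(2002, 5, 3, 12, 0, tzinfo=timezone.utc)
t1 = datetime(2002, 5, 3, 14, 0, tzinfo=timezone.utc)
master = {"E-001": EventRecord("E-001", t0, {"status": "open"})}
local = {"E-001": EventRecord("E-001", t1, {"status": "closed"})}
print(synchronize(local, master)["E-001"].payload)  # {'status': 'closed'}

A real product would also need to resolve conflicting edits and deletions, which this simple rule glosses over; that is one reason the report advises asking each vendor how its standalone mode works.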
The fact that the standalone mode of operation was not compared does not indicate that it was considered an unimportant capability; it reflects only the limitations and constraints imposed on the project to ensure that it was completed in a timely manner and produced an end product with the greatest value.

-------------------------------

2.0 BACKGROUND

The primary goal of the CIMS Test Bed Project was to provide an impartial and unbiased comparison of an established set of features associated with commercial software products currently in use by EMAs. The software products were specifically developed for augmenting EMA response to crisis incidents, in addition to enhancing the functions of emergency management planning and mitigation.

Ten CIMS application vendors participated in the voluntary product feature comparisons conducted during a 3-month period from February 4, 2002, to May 3, 2002. Several vendors had multiple products to offer; in these cases, the vendor selected the product for feature comparison. Product feature comparisons were based on the current version of each product available at that time. However, it should be noted that new releases or versions of the 10 products compared may currently be available. In addition, the vendors may be offering different, but related, products. Regardless, it is recommended that an agency considering procurement of a CIMS product also conduct a survey that identifies the current product offerings.

The CIMS feature comparison was structured to begin with a survey of those commercially available technologies designed specifically to meet the needs and requirements for managing incident information, an agency's response, and the corresponding consequences. This approach permitted a more rapid delivery of the information necessary for selecting the most appropriate software technologies for inclusion in the feature comparison and ensured their association with the needs of the emergency management community.

2.1 Crisis Information Management Software

The search for CIMS products began with defining the conditions that identify the products serving the information management needs of State and local EMAs. These conditions were defined as follows:

o The product was designed to support crisis or event information management functions of State and/or local emergency management organizations.
o The product was commercially available (no beta version software was considered).

These two conditions confined the feature comparison to products that conformed to the operational needs and requirements of the EMA community.

2.1.1 CIMS Research: Four Sources

The project team conducted a four-part market survey to identify potential product candidates for the CIMS feature comparison. The first part involved an e-mail sent to the 59 members of the National Emergency Management Association (NEMA), representing 50 State agencies and the nine Federal territories. This e-mail requested each State agency to query its respective county and city EMAs as to the product currently used for emergency management operations and to provide a consolidated report based on the responses it received.
The second part of the survey involved a Commerce Business Daily (CBD) "sources sought" announcement. The third part involved on-site interviews conducted at nine EMAs from August 2001 to October 2001. The final part identified potential candidates via searches on the World Wide Web and from contacts developed by the District of Columbia Emergency Management Agency (DC EMA) and NIJ/OS&T.

The result of the survey was the identification of 26 software products. Based on the product literature provided by the vendors and Web site reviews, 15 products appeared to meet the requisite conditions. The project team invited each of the respective companies to provide a demonstration of its software products. Each demonstration served to validate compliance with the project's two conditions and initiated the development of the feature comparison criteria. Of the 15 vendors invited to demonstrate, 13 clearly met the project's conditions. These vendors represented the target group of participants, and their products are described in appendix A. Of the 13 vendors invited to participate in the feature comparison, three declined the invitation, leaving 10 vendors to participate.

2.1.2 Site Surveys

Beyond identifying the various CIMS products in use at the locations visited by the project team, the information gathered at these sites also served to establish the criteria for the feature comparison. An important consideration was the acknowledgement that agencies within the emergency management community vary greatly in both the scale and the direction of their respective operations and therefore have different expectations of CIMS products.

Of the nine sites visited, none had experienced anything resembling the current concern with weapons of mass destruction (WMD) events. However, the events of September 11, 2001, have made WMD an ever-present concern, bringing to the forefront the need to acquire and apply technology to enhance the level of preparedness and emergency response. Casualty projections, hospital bed-count reports, and data communications among geographically dispersed responders have now become more important features required of CIMS products.

The survey resulted in many important findings, which are listed below in no particular order. The software should:

o Be affordable given the size of a jurisdiction's budget.
o Be user friendly.
o Be easy to maintain by existing EMA staff with access to the vendor's technical support services.
o Be easy to tailor to the conditions and policies of the agency.
o Allow for remote access by authorized users located outside the LAN.
o Comply with the provisions and standards of the Incident Command System (ICS). ICS is the model tool for command, control, and coordination of a response and is built around five major management activities of an incident:
-- Command.
-- Operations.
-- Planning.
-- Logistics.
-- Finance/administration.
o Comply with the provisions of the Emergency Support Functions (ESF). ESF consists of 12 main groups that manage and coordinate specific categories of assistance common to all disasters. Each ESF group is headed by a lead organization responsible for coordinating the delivery of goods and services to the disaster area and is supported by numerous other organizations. The ESF annexes are--
-- Transportation (ESF 1).
-- Communications (ESF 2).
-- Public works and engineering (ESF 3).
-- Firefighting (ESF 4).
-- Information and planning (ESF 5).
-- Mass care (ESF 6).
-- Resource support (ESF 7).
-- Health and medical services (ESF 8).
-- Urban search and rescue (ESF 9).
-- Hazardous materials (ESF 10).
-- Food (ESF 11).
-- Energy (ESF 12).
(A sketch of how the ICS and ESF structures might be encoded in software appears after this list.)
o Integrate with other systems, such as mapping, other CIMS, and telephonic alert notification systems.
o Integrate public health into emergency management.
o Operate within a variety of network configurations.
o Have a wide range of features consistent with the four phases of emergency management operations: planning, mitigation, response, and recovery.
o Have help desk support available on a 24-hour, 7-days-per-week basis, including on-call availability by cellular phone.
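For readers who want to see how compliance with these structures might surface inside a CIMS data model, the following hypothetical Python sketch encodes the ICS sections and ESF annexes listed above as enumerations and tags a task with both. The enumeration values mirror the lists above; the task record and routing comment are illustrative only and do not describe any compared product.

# Hypothetical encoding of the ICS sections and ESF annexes listed above,
# as a CIMS data model might use them to tag tasks and resource requests.
from enum import Enum

class ICSSection(Enum):
    COMMAND = "Command"
    OPERATIONS = "Operations"
    PLANNING = "Planning"
    LOGISTICS = "Logistics"
    FINANCE_ADMIN = "Finance/administration"

class ESF(Enum):
    TRANSPORTATION = 1
    COMMUNICATIONS = 2
    PUBLIC_WORKS_AND_ENGINEERING = 3
    FIREFIGHTING = 4
    INFORMATION_AND_PLANNING = 5
    MASS_CARE = 6
    RESOURCE_SUPPORT = 7
    HEALTH_AND_MEDICAL_SERVICES = 8
    URBAN_SEARCH_AND_RESCUE = 9
    HAZARDOUS_MATERIALS = 10
    FOOD = 11
    ENERGY = 12

# A task tagged this way can be routed to the lead organization for its
# ESF annex and displayed under the proper ICS section on a status board.
task = {
    "description": "Stage ambulances at the county fairgrounds",
    "ics_section": ICSSection.OPERATIONS,
    "esf": ESF.HEALTH_AND_MEDICAL_SERVICES,
}
print(f"{task['description']} -> ESF {task['esf'].value}, {task['ics_section'].value}")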
-------------------------------

3.0 GENERAL FINDINGS

3.1 Understanding the Feature Comparison Results From the Perspective of Your Agency's Priorities

The feature comparison results presented herein assessed 10 software products at the individual feature level. No summary recommendations of any kind have been made or implied to certify or recommend a particular product. This report does, however, provide the actual results of the feature comparison so that an agency can assess the products in the context of its own priorities, requirements, and conditions of operation.

The "Feature Comparison Summary" (appendix B) offers feature-level insight into the current state of the technology offered by the 10 vendors that participated in the CIMS Test Bed. Appendix B provides a summary of the feature comparison by identifying those features that were supported by each product. An "X" indicates that the feature comparison team agreed that a particular feature was supported at least at the minimum level. This comparison summary is aided by the Feature Comparison Matrix, a tool that is provided in the form of Excel spreadsheets (see attachment 1). The matrix makes it possible to assess the performance of a particular product or of the entire group of products. For example, 8 of the 10 products performed above average for access control, but conversely, 8 of the 10 products performed below average in the area of password security.

When reviewing the feature comparison results, the reader should keep in mind that the vendors are constantly striving to improve their products by adding functionality. Additionally, several vendors offered capabilities that were not a part of the feature comparison. Agencies currently considering the procurement of a CIMS product should determine which features are most important to them, including those that were not a part of the feature comparison. Such features should be considered by the agency prior to preparation of procurement specifications or the purchase of any product. The following considerations apply to almost every type of product, including CIMS products:

o There is no best product.
o There is no perfect fit.
o The best product for your agency should be based on:
-- Budgets.
-- System environment.
-- Scale of operation.
-- Sophistication of operation.
-- Discipline to implement.
-- Political considerations.

3.2 All 10 Products Are Viable but Differ in Significant Ways

Each of the 10 products was considered viable in the sense that the software loaded and operated as intended to support the operations of an EMA, particularly for the management of critical events. Beyond this general statement, the products varied in many significant ways, as demonstrated by their performance in the CIMS Test Bed. The feature comparison focused on three areas: System Environment (including security), Functions, and Product Support. These areas are discussed in more detail in the following sections.

For each descriptor-level question in the feature comparison, the feature comparison teams assigned a rating of Not Supported, Minimally, Partially, Satisfactory, Fully, or Exceeds. Thus, the ratings given at the descriptor level describe the degree to which a feature was supported. The average rating over all products and all areas of the feature comparison was Satisfactory. A review of the ratings of the 10 products revealed that no product performed far ahead of its competitors. As stated at the beginning of this section, each product was considered a viable software product that can serve the intended purpose of supporting operations within an EMA.

However, some agencies may consider developing their own software applications or using government-developed products. Any agency looking to these other options should first consider the costs of development, maintenance, training, documentation, software support, and enhancement, as well as long-term viability. Los Angeles County and the City of Seattle are two agencies that have undertaken the development of custom software and can provide insight into these factors.

3.2.1 System Environment

This part of the feature comparison determined, validated, and rated the technical performance parameters of the product when running in the intended system environment, such as operating system and hosting model. The ratings were applied respective to the vendor's published performance parameters. Within the System Environment area of the feature comparison, many differences among the products were noted. These differences and the implications for selecting the appropriate product for each particular agency are listed below:

o One of the 10 products operated primarily through an ASP. Six were intended to operate solely from a LAN server, and three could operate in either configuration. ASP-based systems use the Internet as the network to connect the end user via a Web browser to both the software application and the data maintained at a third-party data center (typically a commercial facility designed and hardened specifically for this purpose). Two of the four ASP-capable products could also operate as hybrid installations, meaning that the system could be configured to operate as a LAN-based application with the ASP site serving as a secondary or backup location.
-- The Internet has provided new choices for system configurations. For example, agencies that have severely limited access to technical help now have the option of selecting an ASP solution. All they need are computers that have access to the Internet and can operate a Web browser. This is not an endorsement of this approach, since it is recognized that there are considerable issues, such as security and availability, for ASP-based systems. We simply note the fact that these services now exist and offer options not previously available.
-- There are strong arguments for and against relying on the Internet for EMA operations. The evidence offered by the feature comparison results indicates that the industry has made strong moves to offer software via the Internet. However, Internet access is generally offered as a secondary capability for LAN-based systems.
-- While not all of the 10 vendors offered the ASP or hybrid solutions as part of their business model, it is noted that LAN-based systems are not restricted to the LAN solution.
These systems could be hosted at an ASP site as well, and some vendors of the LAN-based systems made it clear they could provide the ASP solution if required. Agencies desiring the ASP solution should solicit all vendors to determine the latest available hosting models.

o The products have been developed for reliable performance and utilize robust, standard commercial components that are scalable to support large numbers of users. Without endorsing any specific technology or product design, the feature comparison team found that the 10 products, without exception, were designed consistent with current state-of-the-art, commercially available system components that can be maintained and supported by technicians readily available in the service sector.

o The majority of the products used Web browser interfaces whether they were connecting to a local server or through the Internet to a remote server. This implies that the products are likely to have user-friendly interfaces that capitalize on industry trends in this area. Also, these interfaces will permit more seamless integration between the CIMS application and the domain of the Internet outside the LAN. This important development promises continued improvements in functionality as the industry matures and delivers tighter integration between its functional components.

3.2.2 Functions

This part of the feature comparison focused on the functional features of the product, such as the ability to generate an event log, reporting, planning, resource management, operations, execution, and organizational communications. Some of the products tended to emphasize a particular set of functional capabilities. Below are some differences noted among the products within the Functions area of the feature comparison:

o While each of the 10 products supported the basic requirements for emergency management, several products had a special emphasis. Areas of emphasis included modeling and simulation, response planning, resource planning, and accounting.
-- As an example, it is reasonable for an agency to consider purchasing a product specifically for its planning capability, while using a different product for the other functions of the EMA.
o Each of the products supported the minimum requirements: the ability to record the information generated by an event or incident and to provide reports containing this information.
o Each of the products either had developed interfaces to third-party geographic information systems (GIS) or provided GIS capabilities that were organic to the software.
o Each product was developed consistent with the ICS model for incident management. Eight of the 10 products were developed consistent with the ESF model for incident management.
-- All of the vendors appeared to understand the need to support standards of operations in their software.
o Conversely, there was no significant effort noted for permitting the exchange of information between vendor products.
-- Without intervening efforts to develop and promulgate uniform standards for sharing and exchanging information, automated exchange of information between agencies using different products will continue to be problematic. The solution to this problem lies only in part with the industry. The largest share of this responsibility belongs to the user community, which must establish standards and insist that products comply with this requirement in order to qualify for submitting a bid to a procurement request (a purely illustrative sketch of such an exchange record follows this list). Based on past experiences with standardization issues, industry will typically comply when there is a clear demand for products that are compliant with accepted standards.
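No uniform exchange standard existed at the time of the comparison, and none of the compared products is known to produce the format shown here. Purely to illustrate the kind of vendor-neutral record the user community would need to standardize, the following Python sketch serializes one hypothetical incident record; every field name and the schema tag are invented.

# Hypothetical vendor-neutral exchange record, illustrating what a uniform
# format would have to pin down before products from different vendors could
# exchange incident data automatically. All field names are invented.
import json

def export_incident(incident_id, reported_at, ics_section, esf_annex, summary):
    """Serialize one incident record to a vendor-neutral JSON document."""
    record = {
        "schema": "example-cims-exchange/0.1",  # version tag so receivers can validate
        "incident_id": incident_id,
        "reported_at": reported_at,             # ISO 8601 keeps timestamps unambiguous
        "ics_section": ics_section,
        "esf_annex": esf_annex,
        "summary": summary,
    }
    return json.dumps(record, indent=2)

print(export_incident("E-001", "2002-05-03T14:00:00Z", "Operations", 8,
                      "Hospital bed-count request for County General"))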
3.2.3 Security

This part of the feature comparison reviewed the software security parameters, such as username and password access, secure handling of e-mail, secure data storage, and alerts on suspicious activity. In general, the security capabilities of the various packages are inadequate given the current state of technology for securing information. The feature comparison reflects the fact that many components other than the database and the application software affect security. Software security must also be understood in the context of the threat. Clearly, EMAs are cyber targets in the context of terrorist-related threat scenarios. The points listed below provide further insight into the software security capabilities:

o In general, the vendors need to improve their service of providing customers with materials that document the security features of the product and describe how to use the application securely.
o Many of the products rely on e-mail messaging capabilities. E-mail is often the most easily compromised component of a network and is traditionally sent in clear text, which means anyone who can capture a network's Internet protocol (IP) traffic can read and manipulate the messages. Encryption and authentication of e-mail messages are essential in protecting both the source of the traffic and the content of the message. E-mail messages can also contain virus code file attachments that can compromise the integrity of a computer network and its data stores.
o Each product relies heavily on passwords as part of its security policy. However, password protection is a weak form of network and application protection. Given the increased emphasis on passwords as a protective measure, vendors need to provide the ability to force users to adopt strong alphanumeric passwords and to change their passwords periodically (a minimal illustration of such controls follows this list).
o All 10 products can coexist with common security components of a network and/or system. Vendors should make reference to these security tools in their documentation.
o Vendors need to improve their products' ability to log system utilization transactions. This includes the ability to record when and where each user has logged in, when and from where a login attempt has failed, and all changes a user makes to the system. These capabilities are vital to protecting the system from "insider attacks"--deliberate attempts to sabotage a system by personnel with legitimate network access. Logging is also an important capability in tracking down system intrusions.
o Vendors need to review their products' access control features. Access control provides the capability to restrict users and groups to viewing and manipulating only the data that is authorized for their use, as prescribed in the respective system security policy.
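The report recommends these controls without prescribing an implementation. The following Python sketch, illustrative only and not drawn from any compared product, shows one minimal way to enforce strong passwords and keep a login audit trail of who connected, from where, and with what outcome; the log filename is hypothetical.

# Minimal sketch of two controls the report calls for: enforcement of strong
# alphanumeric passwords and an audit trail of login activity.
import re
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="cims_audit.log", level=logging.INFO,
                    format="%(message)s")

def password_is_strong(password: str) -> bool:
    """Require length plus mixed-case letters, digits, and punctuation."""
    return (len(password) >= 10
            and re.search(r"[a-z]", password) is not None
            and re.search(r"[A-Z]", password) is not None
            and re.search(r"[0-9]", password) is not None
            and re.search(r"[^A-Za-z0-9]", password) is not None)

def record_login(user: str, source_ip: str, success: bool) -> None:
    """Append one audit entry: who, from where, when, and the outcome.

    Entries like these support detection of insider attacks and
    reconstruction of intrusions after the fact.
    """
    stamp = datetime.now(timezone.utc).isoformat()
    outcome = "LOGIN_OK" if success else "LOGIN_FAILED"
    logging.info("%s %s user=%s ip=%s", stamp, outcome, user, source_ip)

if __name__ == "__main__":
    print("strong password:", password_is_strong("Ex@mple-2002"))
    record_login("duty_officer", "10.0.0.15", success=True)
    record_login("duty_officer", "192.0.2.7", success=False)

A production system would hash stored passwords and protect the audit log itself from tampering; this sketch covers only the two behaviors named in the bullets above.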
-------------------------------

4.0 FEATURE COMPARISON PROCESS

The CIMS Test Bed Project Team conducted a hands-on feature comparison of commercial software products meeting the two conditions required for selection as noted in section 2.1. Participation by the respective product vendors was by invitation and strictly voluntary. Each feature comparison was based on the existing version of the product available at the time of its feature comparison. Recognizing that software is in a constant state of revision, it cannot be overstressed that agencies using the information in this document should contact each vendor to determine the current software products available and how they may differ from the versions available at the time of the feature comparison.

The feature comparisons took place in the CIMS Test Bed LAN located at the DC EMA in Washington, D.C. The feature comparison consisted of the following three parts:

o Part 1--System Environment: This part of the feature comparison determined, validated, and rated the technical performance parameters of the product when running in the test bed's system environment. The ratings were applied respective to the vendor's published performance parameters. This part of the feature comparison also reviewed the software security parameters.
o Part 2--Functions: This part of the feature comparison focused on the functional features of the product, such as the ability to generate an event log, reporting, and planning.
o Part 3--Product Support: This part of the feature comparison determined whether the product vendor operates a help desk and its hours of operation; it is provided for information only. No effort was made to rate the performance of the vendor's support activities, such as the help desk.

Please note that the Feature Comparison Questionnaire contains the feature comparison criteria for each part referenced above and is provided as attachment 4 to this report.

4.1 Features and Descriptors

The three feature comparison areas were further broken down into Categories, Features, and Descriptors, as illustrated in the figure below. The category level was used to group related features together, but no ratings were applied at this level. Comparison at the feature level documented whether or not the software product supported a particular feature. The ratings at the feature level are "Y" for "yes" and "N/S" for "not supported." A feature that received an "N/S" rating was reviewed with the vendor to ensure that the feature comparison team did not miss a feature that was supported by the product. A feature that received a "Y" rating was further compared at the descriptor level with questions intended to quantify the degree to which the feature was supported. At this level, the feature comparison teams were instructed to provide a rating based on a scale of zero to five, with "N/S" equal to zero and a rating of five serving as the highest possible rating. The example below is provided to illustrate this process.

The ratings applied to the questions at the descriptor level were based on the following scale, unless otherwise noted by a particular descriptor question. This scale was used to answer the following question: To what degree does the product support a particular feature?

N/S--Not Supported
1--Minimally
2--Partially
3--Satisfactory
4--Fully
5--Exceeds

(A sketch of how ratings recorded at these three levels might be structured appears at the end of section 4.2.)

4.2 Development of Feature Comparison Procedures and Features

The feature comparison focused on software products developed for the practices of the local and State agencies, specifically for the functions of crisis or event information management. The functional part of the feature comparison document was first developed as a draft by the CIMS Project team, drawing on the information acquired during the site visits and the software demonstrations. It was subsequently reviewed and modified with the participation of subject-matter experts from the EMA community, including DC EMA and NEMA. NEMA also provided the final validation of the questions.
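To make the Category/Feature/Descriptor hierarchy and the zero-to-five scale of section 4.1 concrete, the following Python sketch records ratings for one hypothetical product. The feature names, descriptor questions, and scores are invented for illustration; they are not results from the comparison.

# Hypothetical record of one product's ratings, mirroring the hierarchy
# described in section 4.1: categories group features; a feature rated "Y"
# carries descriptor-level scores on the N/S (0) to Exceeds (5) scale.
SCALE = {0: "N/S", 1: "Minimally", 2: "Partially",
         3: "Satisfactory", 4: "Fully", 5: "Exceeds"}

ratings = {
    "Functions": {                      # category level: no rating applied here
        "Event log": {
            "supported": "Y",           # feature level: "Y" or "N/S"
            "descriptors": {            # descriptor level: 0-5 per question
                "Log entries are time-stamped": 4,
                "Log can be filtered by incident": 3,
            },
        },
        "Casualty projections": {"supported": "N/S", "descriptors": {}},
    },
}

for category, features in ratings.items():
    for feature, detail in features.items():
        if detail["supported"] != "Y":
            print(f"{category} / {feature}: not supported")
            continue
        for question, score in detail["descriptors"].items():
            print(f"{category} / {feature} / {question}: {score} ({SCALE[score]})")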
4.3 Feature Comparison Schedule

The feature comparisons took place from February 4, 2002, to May 3, 2002. During this 3-month period, some of the products may have had new versions released subsequent to their feature comparison. The project did not attempt to capture that information, choosing rather to compare the current version available at the time of the individual vendor comparison. Actual comparison dates and each product version compared are noted in appendix A. The feature comparison was organized as follows:

o The actual schedule was published 3 weeks prior to the start of the feature comparisons.
o The order of vendors was selected by random drawing. Final candidate selections were made by December 30, 2001.
o Each product was allotted 5 days for the feature comparison.
o All feature comparisons were completed within the 5-day term.
o No vendor was allowed to change its scheduled week.
o There were no changes to the feature comparison process once the feature comparisons began.
o The Feature Comparison Questionnaire was sent via e-mail and Federal Express Overnight service 7 days prior to each product vendor's scheduled feature comparison.

4.4 Feature Comparison Teams

The feature comparison was conducted by feature comparison teams composed of individuals with experience in different areas of expertise, including technical, functional, and security. Parts one and three (System Environment and Product Support, respectively) of the feature comparison were conducted by the team members with technical and security backgrounds from Camber Corporation, the Dartmouth College Institute for Security Technology Studies, and the Space and Naval Warfare Systems Center (SPAWARSYSCEN) in Charleston, South Carolina. Together, this team compared the technical performance of the software within the vendor-prescribed system, including software security parameters and product support, to the criteria within the feature comparison questionnaire. This team was composed of the same members for each product feature comparison.

Part two (Functions) of the feature comparison was conducted by a 4-person team composed of emergency management subject-matter experts from EMAs at the State and local level. These team members were responsible for comparing the functional features of each vendor product to the criteria within the feature comparison questionnaire. The 4-person team for each feature comparison was drawn from a pool of 22 individuals from EMAs across the country, who were chosen based on their extensive experience in emergency management. Additionally, individuals were placed on feature comparison teams based on their availability for a particular feature comparison, with the requirement that their parent EMA did not use the CIMS product for which their team was performing the feature comparison.

4.5 Feature Comparison Sequence

The feature comparison for each CIMS product followed the sequence of events noted in the table below:

Days--Events
7 days prior to start--Feature comparison questionnaire sent via e-mail and FedEx Overnight Service.
Day 1--Installation; began feature comparison of part 1 (System Environment).
Day 2--Functional feature comparison team trained.
Days 3-4--Feature comparison of parts 2 and 3 (Functions and Product Support).
Day 5--Feature comparisons completed; final review and reports completed.
-------------------------------

5.0 SUMMARY

This project conducted a feature comparison of 10 CIMS products, with the overall finding that each product had adequate features to support the operations of an EMA in managing crisis events. The feature comparisons were conducted by subject-matter experts from both the technical field and the emergency management community. The process included an examination of the system environment in addition to functional and product support capabilities.

Although there is a natural inclination to determine a "best in class" product resulting from the feature comparison, the project was not structured or intended to reach this type of conclusion. Instead, the project was structured to permit agencies to determine on their own the top candidate products that meet their specific needs and requirements. This approach reflects the experience of the EMA community, in which there is wide variance in scale and in the manner in which emergencies are handled. Therefore, no single product will fit all situations.

The Feature Comparison Matrix was developed as a mechanism for each agency to apply a priority factor to the feature comparison results. The Feature Comparison Matrix is not the "silver bullet" for an agency's selection of a CIMS product. It is simply a tool to assist agencies during the procurement process. Once the inputs to the Feature Comparison Matrix are processed and the top products meeting the agency's priorities are determined, an agency can take the results into its procurement process. There, it should conduct a "flyoff" and cost analysis.

An initial cost analysis was conducted as part of the technical feature comparison of the CIMS products. While cost data was gathered, it is not provided as part of this report. However, this data was used to develop general guidelines (attachment 3) to assist agencies in acquiring accurate vendor quotes that take into consideration both the initial investment and life-cycle costs of the product(s). Agencies should require vendors to submit quotes that are consistent with the specified requirements. There are many factors to consider, in addition to the many different ways to purchase and support the software. Before a final decision is made on a product, the evaluating organization must make some difficult technical and financial decisions regarding the hosting model, whether it is a LAN, ASP, or hybrid solution.

As previously stated, when applying the content of this report, evaluating organizations should be aware that--

o There is no best product.
o There is no perfect fit.
o There is only a best product for your agency based on:
-- Budgets.
-- System environment.
-- Scale of operation.
-- Sophistication of operation.
-- Discipline to implement.
-- Political considerations.

-------------------------------

6.0 APPENDIX A--CANDIDATE CIMS PRODUCTS

The tables below list the 13 candidates selected to participate in the CIMS feature comparison process. Please note that product feature comparisons were based on the version of each product available at the time of feature comparison. New releases or versions may be available on the current market. The information on each product has been reviewed by the vendor and should be viewed as vendor-provided information. Each description is a compilation of vendor information, not a result of this feature comparison.

Candidate CIMS Product

Product: Blue292
Company: Blue 292, Inc.
Address: 2505 Meridian Parkway, Suite 325, Durham, NC 27713
Contact information: 919-806-2440, Attn: Phillip M. Hanna, phanna@blue292.com
Web site: www.blue292.com
Description: Blue292 is a Web-based application designed to deliver a wide range of features for planning and management of emergency or incident information. The Blue292 system activates an organization's emergency plans to drive the response to incidents. It features a friendly and logically sequenced process workflow from planning through execution of the crisis or incident. In addition to its friendly design, it also features advanced capabilities in training, alert notification with verification, GIS, flexible form creation and reporting, and wireless and internal messaging. Blue292 has strengths related to the inherent flexibility of the Internet coupled with mobile, remote, and intranet-based access. The Web-based applications can be delivered either as an ASP or onsite. For organizations with limited IT support help, the ASP approach has the advantage of requiring minimal in-house technical expertise.
Participant: Yes
Version compared/date: Blue292 2.0, ASP hosted/week of February 18, 2002

Candidate CIMS Product

Product: CRISIS [trademark]
Company: Ship Analytics International
Address: 305 21st Street, Suite 228, Galveston, TX 77550-1696
(Alternate address) 183 Providence - New London Turnpike, North Stonington, CT 06359
Contact information: 409-765-7081, Attn: Keith O. Palmer, kpalmer@shipanalytics.com
(Alternate contact) 860-535-3092, Attn: Roger Cook, rcook@shipanalytics.com
Web site: www.shipanalytics.com
Description: CRISIS [trademark] is a Web-enabled, all-hazard decision support and incident response management system. The system provides the incident management team with a user-friendly, intuitive interface to industry-standard relational databases, a geographic information system (GIS), and science-based fate and trajectory hazard models, while facilitating critical real-time data exchange between these components and all of its intranet/Internet nodes. CRISIS [trademark] was designed with maximum built-in flexibility in order to allow system configuration for a variety of incident types, contingency plans, and command system organizational structures (Emergency Support Functions-ESF, Incident Command System-ICS, etc.). CRISIS [trademark] is a client-server/Web-based application operating with an Oracle relational database engine, embedded ESRI MapObjects GIS, and embedded Microsoft Explorer browser, which operates on common PC platforms/servers and standard networks. With an open architecture and no proprietary data formats, CRISIS [trademark] is modular and can be extended as desired by end users to incorporate existing applications in a manner that can provide a seamless user interface. CRISIS [trademark] can be delivered in a thin-client ASP, LAN, WAN, and/or intranet configuration.
Participant: Yes
Version compared/date: CRISIS [trademark] 5.2, LAN hosted/week of April 29, 2002

Candidate CIMS Product

Product: EM2000
Company: Specialized Disaster Systems International, Inc.
Address: 2140 Bagnell Dam Boulevard #302B, Lake Ozark, MO 65049
Contact information: 573-365-7373, Attn: Henrietta Alleman; 301-599-7373, 301-452-5849, Attn: Edward (Bill) Lent, blent@sdsintl.net
Web site: www.sdsintl.net
Description: EM2000 is an application designed to deliver a wide range of features to plan and manage incident/event information. The EM2000 software provides controls that are customizable to suit the organization's terminology and workflow.
It is designed consistent with the ICS/ESF standards, tracking every step made in the management of an incident. The software is designed in a client-server Lotus Notes environment. EM2000 can view and manage multiple emergencies and also integrates with an alert notification capability (priced separately). It has resource review, deployment, and recall features as well as integrated capabilities to generate reports and create media statements. It has provisions to create and manage plans and SOPs, integrating these into the response activities. EM2000 provides an integrated mapping capability and also integrates with the ESRI mapping application.
Participant: Yes
Version compared/date: EM2000 3.1, LAN hosted/week of March 25, 2002

Candidate CIMS Product

Product: EOC System
Company: Clark A. Reynolds Co.
Address: 6501 Tahawash Street, Cochiti Lake, NM 87083
Contact information: 505-465-0135, Attn: Clark Reynolds
Web site: www.emergency-planning.com
Description: EOC is a series of Microsoft Word and Excel files provided in template form. The different files are intended to be printed and used in a prescribed manual process derived from FEMA's Emergency Support Functions (ESF) and the Incident Command System (ICS). The files can be purchased a la carte or together, offering a low-cost manual solution to manage emergencies. It is principally targeted at agencies at the municipality and county level, although its 13 operations plans (earthquake, fire, chemical, etc.) can be used at any level. The vendor's principal claim is that it offers a solution that does not rely on computers or other high-tech solutions, but that is based on sound principles of information management derived from military command and control procedures. It offers a complete diagram of how to set up an emergency operations center, how to collect and distribute information, and the operations response plans for 13 different scenarios. It also offers procedures and scripts for conducting exercises of the different plans. The complete package is delivered on CD-ROM, with files opened and printed with any PC that has Microsoft Word and Excel.
Participant: No

Candidate CIMS Product

Product: E-Team
Company: E-Team, Inc.
Address: 7301 Topanga Canyon Boulevard, Suite 300, Canoga Park, CA 91303
Contact information: 845-615-8599, Attn: John Hughes, Jhughes@eteam.com
Web site: www.eteam.com
Description: E-Team is a multiuser, network-based system that uses industry-standard Web browsers as client software. As tested, the version 1.6 iteration of the product is designed to deliver a wide range of features to plan and manage incident/event information. E-Team is specifically designed to provide common functionality and scale adaptability regardless of whether the product is accessed via its Web-based ASP (application service provider) version; in its LAN-based, self-hosted mode; or in a hybrid ASP/LAN mode enabling parallel storage of information on both servers. The company promotes its long experience with emergency management systems to validate its many features and its focus on a friendly user interface. The software is integrated with the ESRI ArcIMS mapping application to provide comprehensive functional mapping capabilities. E-Team is designed to be highly configurable at the user level without need of a programmer, significantly reducing cost of ownership. The software is both ICS and ESF compliant, permitting the user to readily define user roles and to track the actions of the user in the defined role.
The E-Team software has easy-to-use status views to get up-to-date status of resources and situations.
Participant: Yes
Version compared/date: E-Team 1.6, LAN hosted/week of March 18, 2002

Candidate CIMS Product

Product: Incident Master and InfoBook
Company: Essential Information Systems, Inc. (a subsidiary of Environmental Support Solutions, Inc.)
Address: 1395 Picard Drive, Suite 230, Rockville, MD 20850
Contact information: 301-556-1700, ext. 1721, Attn: Tracy La Mendola, tracy_lamendola@environ.com
Web site: www.essential-technologies.com
Description: Incident Master is a Web-based application designed to deliver a wide range of features to help plan and manage incident/event information. As Essential Information Systems' newest application, Incident Master performs the core functionality of its client-server legacy system, Essential GEM InfoBook, but with a new Web-enabled look and feel. Incident Master and InfoBook can be deployed in the same operational environment since they share a common database, which can be MS Access, MS SQL Server, or Oracle. Essential Information Systems provides a free copy of InfoBook with Incident Master, together delivering the full range of functions for crisis information management, including management of messages and tasks, assets, personnel, and standard operating procedures, as well as several mapping/GIS options. Both products are ICS and ESF compliant and run from common PC platforms/servers and standard networks. Incident Master is offered as both a self-hosted application (Internet or intranet) and in the ASP model.
Participant: Yes
Version compared/date: Incident Master 1.5 and InfoBook, LAN hosted/week of April 22, 2002

Candidate CIMS Product

Product: LEADERS (Lightweight Epidemiological Advanced Detection & Emergency Response System)
Company: Oracle Consulting
Address: 1910 Oracle Way, Reston, VA 20190
Contact information: 703-364-1374, Attn: Charles Yaghmour, charles.yaghmour@oracle.com
Web site: www.leaders-svcs.net
Description: LEADERS is a suite of software tools designed to help Federal, State, and local government agencies, as well as health care providers, prepare for and respond to major incidents. LEADERS helps emergency responders track, respond to, and manage major incidents using map-based software, message board, document management, and checklist management capabilities. LEADERS also helps the Emergency Medical Services community track emergency department diversion status, bed availability, and the availability of other critical care services. Finally, LEADERS helps the public health community with the early detection of infectious disease outbreaks, both naturally occurring and intentionally induced. LEADERS is a Web-based system intended to facilitate communication and coordination within, and between, the various government and nongovernment agencies involved in responding to major incidents. The system allows critical information to be shared quickly and securely, thus greatly enhancing the speed and effectiveness of the response. LEADERS is a hosted solution, currently available through an application service provider (ASP) offering, allowing for very rapid deployments. LEADERS runs on common PC platforms and requires a standard Web browser connected to the Internet. LEADERS was built using state-of-the-art Oracle products, including the Oracle database and Oracle 9iAS.
Participant: No

Candidate CIMS Product

Product: OpsCenter
Company: Alert Technologies Corp.
Address: 4625 First Street, Suite 110, Pleasanton, CA 94566-7387
Contact information: 877-653-7887 (toll free); 925-461-5934, ext. 15, Attn: Jim Paulson, jim@alerttech.com
Web site: www.alerttech.com
Description: OpsCenter is an Internet-based application from Alert Technologies Corp. that delivers a full range of capabilities to manage critical situations. It is billed as software to help manage not only crisis-related incidents but also special events involving any number of people, equipment, and other resources. OpsCenter is designed to be highly configurable at the user level without need of programming skills. According to Alert Technologies, this feature is critical to permit agencies to easily configure the software to meet the specific requirements and terminology used by each agency. The software is both ICS and ESF compliant, permitting the user to readily define user roles and to track the actions of the user in the defined role. It provides a mapping interface that works with other mapping software. Alert Technologies software is focused on response activities, with the ability to link to existing plans. It offers multiple status boards to show the real-time status of one or multiple incidents/events, with configurable screens as defined by the user organization. Status reports may be directly input by the personnel accessing the application in their defined roles.
Participant: Yes
Version compared/date: OpsCenter 2.3, LAN hosted/week of March 11, 2002

Candidate CIMS Product

Product: RAMSAFE [trademark]
Company: RAMSAFE, LLC
Address: 3225 Shallowford Road, Suite 300, Marietta, GA 30062
Contact information: 800-499-9879 or 865-482-1234, Attn: Michael Maston, mikem@ramsafe.com
Web site: www.ramsafe.com
Description: RAMSAFE [trademark] is a new-generation emergency management product developed under a U.S. Government-private sector partnership with substantial input from emergency managers and responders. RAMSAFE [trademark] is organized into an integrated set of five core modules that support decision making through all critical phases of emergency management and response: planning, exercises, training, operations, and recovery. RAMSAFE [trademark] provides:

o Pre-incident planning tightly integrated with operations.
o Superior capabilities to organize, locate, and display vast amounts of information.
o Dynamic response options based on current conditions.
o Unprecedented strategic, operational, and tactical situational awareness detail.
o Communications integrated with operational tasking and resource management.
o Resource management tightly integrated with operations.
o Standalone operating capability that allows use when Internet systems are down.
o A Response Option Generator (ROG) Module that enables proactive bioterrorism management.

RAMSAFE [trademark] was instrumental at the 2002 Winter Olympic Games, where the Utah Olympic Public Safety Command used the technology for theater-wide support of security planning and operations. An extensive library of security plans, maps, photographs, protocols, and other documents was logically organized in RAMSAFE [trademark] and required a simple point-and-click to be accessed. Security forces exercised and improved their response plans on in-the-field laptops, while venue commanders used 360-degree x 360-degree full-immersion digital images that covered virtually every inch of all Olympic venues for enhanced situational awareness. Over 225 professionals from 20 different Federal, State, and local agencies were trained and equipped with RAMSAFE [trademark].
Post-Olympic testimonials indicated that the product was a key to the security success of the National Security Special Event. Additionally, RAMSAFE [trademark] provided exceptional tools needed for weapons of mass destruction terrorism planning and response. It incorporates eight bioterrorism "templates" as part of the ROG Module, which the 2001 Dartmouth Medical Disaster Conference report concluded would reduce deaths and economic loss by up to 50 percent.
Participant: Yes
Version compared/date: RAMSAFE 2.0, LAN hosted, standalone/week of April 15, 2002

Candidate CIMS Product

Product: RESPONSE
Company: E.A. Renfroe & Company
Address: Two Chase Corporate Drive, Suite 250, P.O. Box 361850, Birmingham, AL 35244
Contact information: 713-334-3435, Attn: Don Costanzo, don@earenfroe.com
Web site: www.earenfroe.com
Description: RESPONSE is an application designed to deliver a wide range of features for planning and management of incident/event information. Its principal focus is the resource tracking and accountability perspective implemented within the ICS process. The software package traces its history to the oil and insurance industries, with emphasis on accounting for the resources used to manage and recover from incidents. RESPONSE is organized along the ICS model of the Command, Operations, Planning, Logistics, and Finance/Administration Sections. The software is written as a client-server application with a rich set of forms and a well-tuned OMNIS relational database engine for fast data sorts and query capability. It also features a friendly data-loading mechanism permitting the user to drag and drop data from the planning module into the incident form. Implementation typically involves consulting support services to install, configure, and assist with the management of the incident. RESPONSE is integrated with both an accounting package (called Transact) and an integrated ESRI MapObjects mapping application, together delivering the full range of capabilities for automating daily work, response planning, and incident management. It is integrated with the Web for "pushing" incident status boards, ICS forms, maps, and reports to the agency Web site.
Participant: Yes
Version compared/date: RESPONSE 8.0, LAN hosted/week of February 25, 2002

Candidate CIMS Product

Product: Response Information Management System (RIMS)
Company: CSMC
Address: 7936 Mountain Avenue, Orangevale, CA 95662
Contact information: 916-987-5203, Attn: Bob Crawford
Web site: www.csmco.com
Description: Please visit the CSMC Web site.
Participant: No

Candidate CIMS Product

Product: SoftRisk
Company: SoftRisk Technologies
Address: P.O. Box 20163, St. Simons Island, GA 31522
Contact information: 912-634-1700, Attn: Jim H. Fraser, jfraser@softrisk.com
Web site: www.softrisk.com
Description: The SoftRisk Emergency Management Software from SoftRisk Technologies is an application designed to deliver a wide range of features to plan and manage incident/event information. The two versions of the software, SoftRisk 5.1 (based on MS Access) and SoftRisk SQL (based on MS SQL Server), are Windows-based applications that adhere to the Microsoft Windows look and feel. The icon-based functions with tool tips and menu bars are an essential part of the software, taking advantage of Windows' wide familiarity. The software is ICS and ESF compliant and is highly configurable by the user without need of a programmer. The administration of the software does not require a technical systems administrator.
The software has planning capabilities and can provide training via the Web with SoftRisk's training staff in conference over the telephone. The SoftRisk products also include an image viewer that can open a variety of file types, including common imaging formats and CAD designs. Mapping features are well integrated, with ESRI MapObjects included in the software. An easy-to-use "Reports Wizard" allows the quick creation of on-the-fly reports of all instances of an incident/event. The software also provides status reports of incident/events and tasks.

Participant: Yes

Version compared/date: SoftRisk SQL 5.1, LAN hosted/week of February 4, 2002

Candidate CIMS Product

Product: WebEOC Standard, WebEOC Professional

Company: Emergency Services integrators (ESi) Inc.

Address: 699 Broad Street Suite 1011 Augusta, GA 30901

Contact information: 800-596-0911 706-823-0911 Attn: John O'Dell Jodell@esi911.com esi@esi911.com

Web site: www.esi911.com

Description: WebEOC is an application designed to deliver a wide range of features for the planning and management of real-time incident/event information. It is designed specifically for emergency operations center functions and is constructed around a control panel (the "remote control") that, depending on configuration, can launch status boards, maps, and links to other applications or sites. Easy to use, WebEOC users are often trained in under 15 minutes. MapTac, a companion software product, can interface with other standard mapping applications and provides a tactical mapping capability that offers common or agency-specific mapping views (fire, police, HazMat, etc.). WebEOC is configurable at the administrator level without need of a programmer. The software can accommodate the Incident Command System (ICS) and FEMA's ESF structure. WebEOC offers chronological and categorical status boards of one or multiple incident/events with user-configurable screens. Status reports can be directly input by individual responders. It also features a Drill Simulator that offers the capability to construct scenario-based exercises. Real-time links to 911 CAD systems are also possible through WebEOC. It is offered both as a self-hosted application and in the ASP model.

Participant: Yes

Version compared/date: WebEOC Professional 5.3, LAN hosted/week of March 4, 2002

-------------------------------

7.0 APPENDIX B--FEATURE COMPARISON SUMMARY

Below is a table containing each of the compared features. An "X" indicates that the feature comparison teams agreed that the product supports the feature. Note that this matrix is NOT an approved "standard." It is simply a compilation of features that might be available in CIMS products. Nor is this matrix all-inclusive. Individual agencies should add any features of importance to them and conduct a feature comparison for those additional features. The Feature Comparison Questionnaire (Attachment 4) includes blank sheets to use as a template for this purpose. It is also important to note that products may have important features that were not part of the feature comparison and that vendors may have added important features since the feature comparison was conducted.

In the System Environment area, some of the features apply only to some hosting models. These features are self-evident from the feature descriptions. It is up to the evaluating organization to determine which hosting models and feature comparison criteria are important to its specific situation.
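For agencies that want to work with the compiled feature data outside the published spreadsheet, the matrix of "X" marks can be captured in a few lines of code. The Python sketch below is illustrative only: the product and feature names are hypothetical placeholders, and the authoritative feature list is the Feature Comparison Questionnaire (Attachment 4).

    # Minimal sketch of the Appendix B feature-support matrix.
    # Product and feature names are hypothetical placeholders.
    # An "X" in the published matrix means the feature comparison
    # teams agreed that the product supports the feature.
    support = {
        "Product A": {"ICS compliant", "ESF compliant", "GIS interface"},
        "Product B": {"ICS compliant", "Standalone mode"},
        "Product C": {"ICS compliant", "ESF compliant", "Standalone mode"},
    }

    def products_with(must_haves, matrix):
        """Return products that support every feature in must_haves."""
        needed = set(must_haves)
        return sorted(name for name, feats in matrix.items()
                      if needed <= feats)

    # An agency screening for its own must-have features:
    print(products_with(["ICS compliant", "ESF compliant"], support))
    # -> ['Product A', 'Product C']

A screen like this is only a first cut; as noted above, agencies should add their own features and compare products against those as well.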
-------------------------------

8.0 ATTACHMENT 1--GUIDE TO USING THE FEATURE COMPARISON MATRIX

8.1 Introduction

This guide provides step-by-step instructions for using the Feature Comparison Matrix, a tool that can augment an evaluating organization's procurement process for selecting a CIMS product. The tool works by applying a priority factor to the scores that resulted from the feature comparison process. Where appropriate, this guide also provides a high-level description of the logic behind the calculations. Once the inputs to the Feature Comparison Matrix are processed and the products that best match the agency's priorities are revealed, the agency can take these results, as well as other priorities not addressed in the Feature Comparison Matrix, into its decision process for final product comparison and selection. The Feature Comparison Matrix is not the "silver bullet" for selecting a CIMS product, but a tool that can provide evaluating organizations with additional information about the different products.

8.2 Entering Data into the Feature Comparison Matrix

The Feature Comparison Matrix requires the user to enter data at various points in the process. Data are entered into fields on the screen that correspond to various questions, much like the fields on a paper form. These data input fields are easily identified by their white background and the thick black line that encloses them; nondata areas of the page have a shaded background. Each field must be selected individually before data can be entered. To select a field, use the arrow keys on the keyboard or move the mouse cursor over the field and click the left mouse button once. The field is selected when the black line enclosing it becomes white. The screen shot below displays nonselected and selected fields:

Once the field is selected, data can be entered. To complete the data entry, press the Enter key, the Return key, or any arrow key on the keyboard. To change an entry, select the cell again and enter the new value.

8.2.1 Steps 1-3: Feature Weighting

These steps are designed to determine the importance of various features to the evaluating organization. Each feature that was compared during the feature comparison process is listed. The features are segregated into three categories: System Environment, Functional, and Product Support. Clicking on the text of any feature will provide more detail on that feature. The screen shot below displays the feature weighting:

Each feature has a corresponding input field used to indicate the importance of that specific feature to the evaluating organization. This is done by entering a number from 0 to 5, with 0 being the least important and 5 being the most important. The weighting is used to compute the scores that the vendors receive during the final step of the analysis. The following definitions apply to the weighting options:

o 0--Of no importance to the user whether or not the feature is provided.
o 1--Possibly useful.
o 2--Nice to have.
o 3--Important.
o 4--Very important.
o 5--Extremely important.

In the System Environment area, some of the features that are to be assigned a weighted value apply only to some hosting models. These features are self-evident from the feature descriptions. It is up to the evaluating organization to determine which hosting models and feature comparison criteria are important to its specific situation. The following definitions are provided for the hosting models:

o ASP hosting model--the application software resides only at a location managed by the vendor.
o LAN hosting model--the application software resides only at a location managed by the customer.
o Hybrid hosting model--the application software resides at both a vendor location and a customer location. Data are shared seamlessly between the two locations so that users can log into whichever one is most convenient for them, depending on where they are located.

Navigation buttons are provided to move between the weighting steps and forward to the results.
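To make the 0-5 weighting scale concrete, the following sketch shows how weight entries might be captured and validated outside the matrix. This is a minimal illustration, not the spreadsheet's internal logic, and the feature identifiers are hypothetical (the X.X.X.a numbering style is borrowed from the scoring detail described in section 8.4).

    # Sketch of the Steps 1-3 weighting entry: each compared feature
    # receives an importance weight from 0 (no importance) to 5
    # (extremely important). Feature IDs below are hypothetical.
    WEIGHT_MEANINGS = {
        0: "Of no importance",
        1: "Possibly useful",
        2: "Nice to have",
        3: "Important",
        4: "Very important",
        5: "Extremely important",
    }

    def set_weight(weights, feature_id, value):
        """Record a weight, enforcing the matrix's 0-5 integer scale."""
        if value not in WEIGHT_MEANINGS:
            raise ValueError(f"weight for {feature_id} must be 0-5, got {value!r}")
        weights[feature_id] = value

    weights = {}
    set_weight(weights, "2.1.3.a", 5)   # an "Extremely important" function
    set_weight(weights, "1.2.1.b", 2)   # a "Nice to have" feature
    print(weights)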
8.3 Results

The Feature Comparison Matrix provides two sets of results. The first set takes into account all the features as weighted by the different teams within the evaluating organization; these results provide insight into the breadth of capability the products offer against the evaluating organization's criteria. The second set takes into account only those features weighted 4 or 5; these results allow the evaluating organization to look specifically at how the products compared on the features it considers highly desirable. Both methods are important and should be taken into account.

Two sets of results are provided because a product can earn a good "all-around feature" score without having high scores in any one area. This is possible because of the large number of feature comparison criteria: a product with a very wide breadth of capabilities that is only "satisfactory" in the quality of those capabilities would get a score similar to that of a product with a narrower focus that performs its features at a very high level. As an example, product A supports 90 percent of the features in the questionnaire and gets mostly 2s and 3s, while product B supports 60 percent of the features and gets mostly 3s, 4s, and 5s. In this case, the overall scores for both would be similar. The two methods are discussed in more detail in the following sections.

8.3.1 All Feature Results

The first set of results for all products is provided in the "All Feature Results" screen. Clicking on the name of any vendor will provide detailed scoring for that vendor's product. The screen shot below is an example of the "All Feature Results" screen:

Scores are determined by multiplying the average score that the feature comparison team gave a feature (the raw score) by the weight the evaluating organization assigned to that feature. This produces the weighted score. Weighted scores for each feature are summed by feature comparison area to give the scores shown on the results page, broken down by the System Environment, Functional, and Product Support feature areas. The System Environment features that were compared include some that do not apply to every hosting model; therefore, System Environment scoring is given separately for each hosting model. Functional and Product Support features are independent of hosting model and therefore have only one score each.

Navigation buttons are provided to return to the weighting steps as well as to go to the "Highly Weighted Features Only" results discussed in the next section.
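The scoring arithmetic just described (and the "weight 4 or 5 only" variant described in the next section) is straightforward to sketch. The Python fragment below is illustrative only: the feature identifiers, areas, raw scores, and weights are invented placeholders, not results from the feature comparison.

    # Sketch of the matrix's scoring arithmetic (illustrative data).
    # raw score = average score the feature comparison team gave a feature
    # weighted  = raw score x the evaluating organization's 0-5 weight
    # Area totals sum the weighted scores by feature comparison area.

    # (feature_id, area, raw_score) -- placeholder values, not real results
    raw_scores = [
        ("1.1.1.a", "System Environment", 3.5),
        ("2.1.1.a", "Functional",         4.0),
        ("2.1.2.a", "Functional",         2.5),
        ("3.1.1.a", "Product Support",    3.0),
    ]
    weights = {"1.1.1.a": 5, "2.1.1.a": 4, "2.1.2.a": 2, "3.1.1.a": 3}

    def area_totals(scores, weights, high_only=False):
        """Sum raw * weight by area; if high_only, count weights 4-5 only."""
        totals = {}
        for feature_id, area, raw in scores:
            w = weights.get(feature_id, 0)
            if high_only and w < 4:
                w = 0          # features weighted 0-3 contribute nothing
            totals[area] = totals.get(area, 0.0) + raw * w
        return totals

    print(area_totals(raw_scores, weights))                  # all features
    print(area_totals(raw_scores, weights, high_only=True))  # weights 4-5 only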
8.3.2 Results for Highly Weighted Features Only

This page contains the same type of information as the results page, with one modification: only features assigned a weight of 4 or 5 are taken into account. In this way, the total scores reflect only the features identified as highly desirable by the evaluating organization. Specifically, scores are determined by multiplying the average score that the feature comparison team gave a feature (the raw score) by the weight the evaluating organization assigned to that feature; however, if the weight is 0, 1, 2, or 3, the raw score is multiplied by zero, so only features weighted 4 or 5 contribute to the results. The scores for those features are then summed by feature comparison area to give the scores shown on the "Highly Weighted Features Only" results page.

8.4 Vendor Scoring Detail

This page provides the detailed scoring for a particular vendor. No data entry is required. Each vendor has a separate page that is accessed from the "Results" page by clicking on the name of the vendor. The screen shot below provides an example of vendor N's detailed scoring:

Raw scores (the average scores from the feature comparison teams), weighted scores (raw scores multiplied by the weighting), and 4- and 5-only scores (the same as weighted scores, using only features weighted 4 or 5) are given. Scores are given at the feature level (X.X.X.a), with appropriate score rollups at the feature group level (X.X.X), the category level (X.X), and the feature comparison area level (X.0). The different levels of scoring are accessed by selecting the pluses and minuses to the left of the feature list. In this way, the user can drill down to a specific feature of interest. A navigation button is provided to return to the results page.

-------------------------------

9.0 ATTACHMENT 2--VENDOR APPLICATION HOSTING MODELS

The table below identifies the application hosting models that each vendor offers for its product. Please note that each hosting model was compared with respect to the System Environment portion of the feature comparison, but not within the Functional area of the feature comparison.

o ASP hosting model--the application software resides only at a location managed by the vendor or a third-party ASP data center.
o LAN hosting model--the application software resides only at a location managed by the customer.
o Hybrid hosting model--the application software resides at both a vendor location and a customer location. Data are shared seamlessly between the two locations so that users can log into whichever one is most convenient for them, depending on where they are located.
o Standalone hosting model--the application software can be run on a single computer if necessary, such as when a network connection is not available. Several vendors have stated that they support this capability. Additional functionality that may be offered within this mode includes the ability to synchronize data with a "master" instance of the software once network communications have been reestablished. Organizations that are interested in or require this type of functionality should check with the individual vendors to determine if and how they support the standalone mode.
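Because the hosting-model table itself is not reproduced in this plain-text file, the sketch below shows one way to record and render such a table using the report's "X" convention. The vendor rows are hypothetical placeholders; consult the published table, and the vendors themselves, for actual offerings.

    # Sketch: rendering a vendor hosting-model table in plain text.
    # Vendor rows are hypothetical placeholders, not the actual table.
    MODELS = ("ASP", "LAN", "Hybrid", "Standalone")

    hosting = {
        "Vendor A": {"ASP", "LAN"},
        "Vendor B": {"LAN", "Standalone"},
        "Vendor C": {"ASP", "LAN", "Hybrid", "Standalone"},
    }

    print("Vendor      " + "".join(f"{m:<12}" for m in MODELS))
    for vendor, offered in sorted(hosting.items()):
        print(f"{vendor:<12}" + "".join(
            f"{('X' if m in offered else ''):<12}" for m in MODELS))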
-------------------------------

10.0 ATTACHMENT 3--PRICING GUIDELINES

10.1 Overview

This report does not include pricing information from the vendors that participated in the feature comparison process. Vendor pricing is always dynamic, and any pricing information gathered during the feature comparison process would quickly become outdated. Additionally, the final price that any given organization pays depends on many factors that are not common across all organizations, such as pricing negotiated through competitive bids, discounts, and special offers.

The recommendation is for the evaluating organization to solicit current pricing information from each vendor whose product has been identified as a potential candidate for the organization's procurement. Furthermore, the vendor should provide a pricing proposal that clearly outlines each item to be purchased before a purchase decision is executed. For instance, the proposal should include a detailed description of each item, the price of each item, and the period of time that a particular price covers (if applicable to the line item).

10.2 Checklist

During the process of investigating vendor pricing, it became apparent that the industry uses a wide range of pricing models, which makes it difficult to achieve "apples to apples" pricing comparisons. However, some key questions emerged that each vendor should be asked in order to obtain a reasonable comparison. The following checklist was developed from those questions and can be used to supplement other questions specific to the evaluating organization's circumstances. These questions do not imply that each evaluating organization will need each item discussed, nor are they all-inclusive. Individual vendors may offer something similar or have a satisfactory alternative to a specific item in the checklist. Note that any item not included in the base price will need to be priced separately and added to the product's total cost.

o Is onsite installation by vendor personnel included in the base price?
o Is onsite training by vendor personnel included in the base price?
o Is toll-free phone support included in the base price? If yes, what are the hours of operation?
o Are upgrades included in the base price?
-- If yes, for how long? Often the first year of upgrades is included in the initial price, but subsequent years carry an additional fee.
-- If no, what is the cost of upgrades, or what is the additional cost to get on a plan (typically called a maintenance agreement) to receive all upgrades?
o Is the database server license included in the base price?
o Is all software needed to use the mapping capability, including third-party software, included in the base price?
o Is the software sold using a seat, server, site, enterprise, or other license model? What is the particular vendor's detailed definition of the model? Different vendors may use the same word for a model but mean different things.
o Which fees are one-time costs and which are recurring (e.g., annual) costs?

Again, please note that this list is supplemental to any pricing questions the evaluating organization may have specific to its operations; the sketch below illustrates how one-time and recurring fees combine into a total cost.
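As a worked illustration of the last checklist item, the following sketch totals a hypothetical quote over an assumed 3-year ownership period. Every line item and dollar figure is an invented placeholder, not vendor pricing.

    # Sketch: comparing quotes that mix one-time and recurring fees.
    # All line items and prices below are hypothetical placeholders.
    def total_cost(one_time, annual, years):
        """Total cost of ownership over an assumed period in years."""
        return sum(one_time.values()) + years * sum(annual.values())

    quote = {
        "one_time": {"license": 20000, "onsite installation": 3000,
                     "onsite training": 2500},
        "annual":   {"maintenance agreement (upgrades)": 4000,
                     "phone support": 1200},
    }

    # Assume a 3-year ownership period for comparison purposes.
    print(total_cost(quote["one_time"], quote["annual"], years=3))
    # -> 41100  (25,500 one-time + 3 x 5,200 recurring)

Normalizing each vendor's quote to the same ownership period in this way helps avoid comparing a low up-front price against a low recurring price.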
-------------------------------

11.0 ATTACHMENT 4--FEATURE COMPARISON QUESTIONNAIRE

The Feature Comparison Questionnaire is in Excel format and can be found at the following URL: http://www.ojp.usdoj.gov/nij/temp/publications/197065-Matrix.zip.

-------------------------------

About the National Institute of Justice

NIJ is the research, development, and evaluation agency of the U.S. Department of Justice and is solely dedicated to researching crime control and justice issues. NIJ provides objective, independent, nonpartisan, evidence-based knowledge and tools to meet the challenges of crime and justice, particularly at the State and local levels. NIJ's principal authorities are derived from the Omnibus Crime Control and Safe Streets Act of 1968, as amended (42 U.S.C. sections 3721-3722).

NIJ's Mission

In partnership with others, NIJ's mission is to prevent and reduce crime, improve law enforcement and the administration of justice, and promote public safety. By applying the disciplines of the social and physical sciences, NIJ--

o Researches the nature and impact of crime and delinquency.
o Develops applied technologies, standards, and tools for criminal justice practitioners.
o Evaluates existing programs and responses to crime.
o Tests innovative concepts and program models in the field.
o Assists policymakers, program partners, and justice agencies.
o Disseminates knowledge to many audiences.

NIJ's Strategic Direction and Program Areas

NIJ is committed to five challenges as part of its strategic plan: 1) rethinking justice and the processes that create just communities; 2) understanding the nexus between social conditions and crime; 3) breaking the cycle of crime by testing research-based interventions; 4) creating the tools and technologies that meet the needs of practitioners; and 5) expanding horizons through interdisciplinary and international perspectives. In addressing these strategic challenges, the Institute is involved in the following program areas: crime control and prevention, drugs and crime, justice systems and offender behavior, violence and victimization, communications and information technologies, critical incident response, investigative and forensic sciences (including DNA), less-than-lethal technologies, officer protection, education and training technologies, testing and standards, technology assistance to law enforcement and corrections agencies, field testing of promising programs, and international crime control. NIJ communicates its findings through conferences and print and electronic media.

NIJ's Structure

The NIJ Director is appointed by the President and confirmed by the Senate. The NIJ Director establishes the Institute's objectives, guided by the priorities of the Office of Justice Programs, the U.S. Department of Justice, and the needs of the field. NIJ actively solicits the views of criminal justice and other professionals and researchers to inform its search for the knowledge and tools to guide policy and practice.

NIJ has three operating units. The Office of Research and Evaluation manages social science research and evaluation and crime mapping research. The Office of Science and Technology manages technology research and development, standards development, and technology assistance to State and local law enforcement and corrections agencies. The Office of Development and Communications manages field tests of model programs, international research, and knowledge dissemination programs.

To find out more about the National Institute of Justice, please contact:

National Criminal Justice Reference Service
P.O. Box 6000
Rockville, MD 20849-6000
800-851-3420
e-mail: askncjrs@ncjrs.org