The following report presents the results of a comparative performance evaluation of IBM App Connect Enterprise Version 188.8.131.52 and IBM Integration Bus Version 10.0.0.18 in traditional (non-virtualized) environments on Windows and Linux, as well as in an AIX 7.2 logical partition on POWER8.
Before using performance information, be sure to read the general information under Notices.
- Evaluation Methodology
- Test Environment
- Optimization and Tuning
- Test Scenarios
The information provided in the performance report pages illustrates the key processing characteristics of IBM App Connect Enterprise. It is intended for architects, systems programmers, analysts, and programmers who want to understand the performance characteristics of IBM App Connect Enterprise, and the data provided will assist you in sizing solutions. It is assumed that the reader is familiar with the concepts and operation of IBM App Connect Enterprise.
This information has been obtained by measuring the message throughput for a number of different types of message processing. The term “message” is used in a generic sense, and can mean any request or response into or out of an integration server, regardless of the transport or protocol.
The performance data presented in the reports was measured in a controlled environment and any results obtained in other environments might vary significantly. For more details on the measurement methodologies and environments used, see the “Evaluation Methodology” and “Test Environment” sections, respectively, of this document.
The performance measurements focus on the throughput capabilities of the integration server using different message formats and processing node types. The aim of the measurements is to help you understand the rate at which messages can be processed in different situations as well as to understand the relative costs of the different node types and approaches to message processing.
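To illustrate how a throughput figure of this kind is derived in principle, the sketch below times a batch of messages through a processing function and reports the rate in messages per second. It is a minimal, hypothetical harness for illustration only; the function names and message counts are assumptions and are not taken from the measurement environment described in this report.

```python
import time

def measure_throughput(process_message, messages):
    """Time a batch of messages through a processing function and
    return the observed rate in messages per second."""
    start = time.perf_counter()
    for msg in messages:
        process_message(msg)
    elapsed = time.perf_counter() - start
    return len(messages) / elapsed

# Hypothetical stand-in for an integration flow: simply echo the payload.
def echo_flow(msg):
    return msg

# Drive 10,000 identical messages through the stand-in flow.
rate = measure_throughput(echo_flow, ["<msg/>"] * 10_000)
print(f"{rate:.0f} messages per second")
```

A real measurement would of course drive messages over a transport into an integration server and sustain the load long enough for the rate to stabilize, but the arithmetic (messages processed divided by elapsed time) is the same.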
You should not attempt to make any direct comparisons between the test results in this report and what may appear to be similar tests in previous performance reports. The contents of the test messages are significantly different, as is the processing in the tests. In many cases the hardware, operating system, and prerequisite software are also different, making any direct comparison invalid.
In many of the tests the user logic is minimal and the results represent the best throughput that can be achieved for that node type. This should be borne in mind when sizing IBM App Connect Enterprise.
References to IBM products or programs do not imply that IBM intends to make these available in all countries in which IBM operates. Information contained in this report has not been submitted to any formal IBM test and is distributed “as is”. The use of this information and the implementation of any of the techniques is the responsibility of the customer. Much depends on the ability of the customer to evaluate this data and project the results to their operational environment.