Use the latest performance reports, design and adoption recommendations, and more to optimize the performance of your integration solutions.
Before using performance information, be sure to read the general information under Notices.
V10 Performance Reports
Our performance reports cover a selection of scenarios with a range of message sizes to give a guide to the expected performance of a customer's workload.
- V10 Performance Highlights
- Evaluation Method
- Use Cases
- Message Throughput Results
- PDF Formatted Reports
The sample applications and artifacts used in the IBM Integration Bus V10 Performance Reports are available to download from GitHub – IIB v10 Performance Samples.
Included are full instructions on deployment and configuration, input messages, and a set of run configurations that make it easy to invoke a performance test.
The operating practices information in this section shares common approaches to solving common problems based on real environments. It does not provide a "one size fits all" solution, and it assumes that you have a basic understanding of IBM Integration Bus. As technology evolves, new recommendations and information might be added to these documents.
For more information, see Performance Recommendations.
The performance information provided in the performance reports illustrates key processing characteristics of IBM Integration Bus. It is intended for architects, systems programmers, analysts, and programmers who want to understand those characteristics, and the data provided will assist you with sizing solutions. Note that it is assumed that the reader is familiar with the concepts and operation of IBM Integration Bus.
This information has been obtained by measuring the message throughput that is possible for a number of different types of message processing. The term “message” is used in a generic sense, and can mean any request or response into or out of an integration node, regardless of the transport or protocol.
The performance data contained in the performance reports was measured in a controlled environment and results obtained in other environments might vary significantly. For more details on the measurement environments used, see the Message Throughput Results section for each platform.
The performance measurements focus on the throughput capabilities of the broker using different message formats and processing node types. The aim is to help you understand the rate at which messages can be processed in different situations, and the relative costs of the different node types and approaches to message processing.
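As a rough illustration of how such a throughput figure is derived, the sketch below drives messages for a fixed period and reports the achieved rate. This is a hypothetical harness, not part of the IBM-supplied samples: `send_message` and the payload are placeholders for whatever transport-specific client code (for example, an MQ put or an HTTP request) your scenario uses.

```python
import time

def send_message(payload):
    """Placeholder: drive one request into the integration node.

    In a real harness this would be transport-specific client code,
    e.g. putting the payload to an input queue or posting it over HTTP.
    """
    pass

def measure_throughput(payload, duration_secs=60):
    """Send messages for a fixed period and return messages per second."""
    sent = 0
    start = time.monotonic()
    while time.monotonic() - start < duration_secs:
        send_message(payload)
        sent += 1
    elapsed = time.monotonic() - start
    return sent / elapsed

# Hypothetical payload; real tests would use the published sample messages.
rate = measure_throughput(b"<order><id>1</id></order>", duration_secs=1)
print(f"{rate:.0f} msgs/sec")
```

A real measurement would also discard an initial warm-up period and repeat the run several times, since short or cold runs understate steady-state throughput.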
You should not attempt to make direct comparisons of the test results in this report with what may appear to be similar tests in previous performance reports, because both the contents of the test messages and the processing in the tests differ significantly; such comparisons are not meaningful. In many cases the hardware, operating system, and prerequisite software are also different, making any direct comparison invalid.
Some optimizations of the test environment and procedures have been implemented, for example to minimize the effect of logging and to ensure that messages do not accumulate on output queues, which has a detrimental effect on message throughput. These are detailed under Tuning.
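One way to keep messages from accumulating on output queues during a run is to drain them concurrently with the test, as sketched below. This is an illustrative pattern only: `get_message` is a placeholder for a destructive get (for example, an MQGET with wait), and `REPLY.QUEUE` is a hypothetical queue name.

```python
import threading

def get_message(queue_name, wait_ms=1000):
    """Placeholder: destructively get one message from an output queue,
    returning None if nothing arrives within the wait interval."""
    ...

def drain(queue_name, stop_event):
    """Continuously consume replies so the output queue stays shallow;
    a deep output queue degrades message throughput."""
    while not stop_event.is_set():
        get_message(queue_name)

stop = threading.Event()
consumer = threading.Thread(target=drain, args=("REPLY.QUEUE", stop), daemon=True)
consumer.start()
# ... run the throughput measurement here ...
stop.set()
consumer.join()
```

Running the drain on a separate thread (or a separate machine) keeps the consumer's cost out of the measured path while preventing queue depth from growing during the test.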
In many of the tests the user logic is minimal, so the results represent the best throughput that can be achieved for that node type. Bear this in mind when sizing IBM Integration Bus.
References to IBM products or programs do not imply that IBM intends to make these available in all countries in which IBM operates. Information contained in this report has not been submitted to any formal IBM test and is distributed “as is”. The use of this information and the implementation of any of the techniques is the responsibility of the customer. Much depends on the ability of the customer to evaluate these data and project the results to their operational environment.