IBM® Transformation Extender

Consulting and Staffing

Remedi's services offer flexible solutions tailored to the individual IBM Transformation Extender (formerly WebSphere Transformation Extender, or WTX) needs of each company.

Thanks to broad experience across the IBM solution portfolio, our team offers the skills to help you achieve your goals, stay on schedule, and stay on budget. We can help you make the most of ITX / WTX, a unique product that works seamlessly with IBM B2B Integrator, WebSphere Message Broker (WMB), WebSphere Enterprise Service Bus (ESB), and Business Process Manager.

Additionally, the product's mapping component, the "common mapper," can be used across five different IBM products, including IBM B2B Integrator, making it an essential tool for data transformation.

Talk to Remedi about unlocking the full potential of ITX / WTX and streamlining your business processes.

IBM ITX Product Description

ITX enables organizations to integrate industry-based customer, supplier, and business partner transactions across their enterprise.

Deploy a single, universal transformation solution that works with all of IBM's key integration products. Deliver industry transformation solutions with predefined packs for Healthcare, Financial Services, Insurance and EDI. Transform large volumes of data of any format efficiently and effectively.

  • Single, universal transformation solution that works across all of IBM's key integration solutions.
  • Key transformation component of IBM's new Standards Processing Engine (ITX Advanced).
  • Extends IBM B2B Integrator, WebSphere Message Broker, WebSphere ESB and Business Process Manager with advanced transformation and validation capabilities.
  • Manage and adapt to changing industry standards with easy-to-install industry pack updates.
  • Deploy IBM Transformation Extender within virtualized and private cloud architectures with Hypervisor Edition.
  • Process large volumes of data efficiently with one-pass lookup, validation, and transformation.
  • Access data directly from source applications for improved data integrity and high throughput transformation capabilities.
  • Aggregate disparate data and information stores to provide new business information views and services.
  • Reuse the same transformation assets throughout the enterprise infrastructure to provide consistency and scalability without the need to write custom code.

IBM ITX can improve your data exchange and processing capabilities in the following ways:

  • A programming-free environment - ITX provides a unique graphical environment that lets integration designers visualize complex data types in graphical form and offers powerful data processing and manipulation capabilities. This code-free method allows users to build processing and integration flows based on business requirements, without programming-model or common-data-model constraints. Users construct integration and data processing objects through an easy-to-use, drag-and-drop interface and deploy directly from the design, instead of "coding under the design" as far too many methods require. ITX lets users integrate data of disparate types from disparate sources and process their integration objects natively in those environments - all without needing to know the programming languages of those environments.
  • Lower processing costs, greater capabilities - Some of the greatest integration and processing challenges enterprises face revolve around the need to process many incoming data objects together and produce many outputs. This "many-to-many" challenge has forced many organizations to resort to programming-based processing, or to bear extremely high compute costs from handling each of the many inputs separately and then tying the results together output by output. In many cases, companies have ended up building integration infrastructures nearly as costly as the systems they integrate. ITX can natively process many data inputs together, with a single read of the data, and deliver the combined result to many outputs in a single process. This unique ability allows companies to dramatically lower processing costs, increase throughput, and, most importantly, provide powerful data integration, enrichment, and processing capabilities across their enterprise infrastructure.
  • Your environment, your data, your business - ITX can be deployed in your environment the way you need it to be. Through the ITX Server's broad operating environment support, companies can choose the optimal environment for their processing. Further, ITX is available as an embeddable or stand-alone data transformation engine, bringing its powerful processing capabilities to messaging infrastructures, applications, enterprise service buses, application servers, and devices. ITX is designed to handle custom data and system types: from binary data to hierarchical data, from semi-structured to mixed formats - literally any data - ITX provides data transformation and processing across the enterprise environment.
  • A powerful, proven solution - ITX is the data processing and data integration engine for some of the most demanding environments. From stock exchanges to pharmaceuticals, from manufacturing to insurance and global commerce, ITX provides high-throughput, highly complex data processing capabilities in some of the most mission-critical applications in the world.
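The "many-to-many, single read" pattern described above can be sketched generically. The following Python snippet is a hypothetical illustration only, not ITX map syntax: two inputs are each read once, correlated, and two outputs are produced in a single pass, rather than re-reading the inputs once per output.

```python
# Generic sketch (NOT ITX syntax) of many-to-many, single-read processing:
# each input is read once, inputs are correlated in one pass, and multiple
# outputs are emitted from that same pass.
import csv
import io
import json

# Illustrative inline inputs; in practice these would be files or feeds.
orders_csv = "order_id,customer_id,amount\n1,C1,100\n2,C2,250\n"
customers_csv = "customer_id,name\nC1,Acme\nC2,Globex\n"

# Single read of each input source.
orders = list(csv.DictReader(io.StringIO(orders_csv)))
customers = {r["customer_id"]: r
             for r in csv.DictReader(io.StringIO(customers_csv))}

invoices, audit_log = [], []          # two outputs from one pass
for o in orders:                      # one pass over the order data
    cust = customers[o["customer_id"]]
    invoices.append({"order": o["order_id"],
                     "bill_to": cust["name"],
                     "amount": float(o["amount"])})
    audit_log.append(f"order {o['order_id']} matched {cust['name']}")

print(json.dumps(invoices, indent=2))
```

The point of the sketch is the shape of the dataflow: the alternative of producing each output with its own scan of the inputs multiplies I/O cost by the number of outputs, which is the cost ITX's single-pass engine avoids.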

Customer Pains Addressed

  • Diverse and complex data to exchange between applications, data stores and external trading partners
  • Mixed enterprise infrastructure (SOA, ESB, event-driven, stand-alone, embedded, business process orchestration)
  • Need for common transformation across all integration tools and projects
  • Projects delayed or long development cycles due to diverse data formats and semantics
  • Handling highly complex, unique industry standards-based requirements that demand the in-depth knowledge of a data architect
  • Inability to quickly code and test program changes to heritage applications
  • Rising ongoing maintenance costs driven by evolving industry standards
  • Large (up to 4 GB) multi-transaction batch documents that must be split, transformed, verified, and forwarded
  • Missing a multipurpose on-ramp for data cleansing
  • Transitioning from file-based integration (email, FTP, VAN, file) while maintaining batch file processes
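One of the pains above, splitting very large multi-transaction batch documents, comes down to a streaming split/validate/forward pattern. The sketch below is a hypothetical Python illustration of that pattern (ITX performs this natively on its own formats): the batch is streamed once, each transaction is emitted as soon as it is complete, and the full document is never held in memory. The segment IDs mirror the X12 ST/SE transaction envelope but are illustrative.

```python
# Hypothetical sketch of splitting a large multi-transaction batch in one
# streaming pass: emit each transaction as it completes, never buffer the
# whole document.
def split_transactions(lines, start_tag="ST", end_tag="SE", sep="*"):
    """Yield one transaction (a list of segments) at a time from an
    EDI-like segment stream."""
    current = None
    for line in lines:
        seg_id = line.split(sep, 1)[0]
        if seg_id == start_tag:
            current = [line]                  # new transaction begins
        elif seg_id == end_tag and current is not None:
            current.append(line)
            yield current                     # forward completed transaction
            current = None
        elif current is not None:
            current.append(line)              # segment inside a transaction

# Small illustrative batch: two 850 purchase orders in one interchange.
batch = [
    "ISA*00*...",                             # interchange header (truncated)
    "ST*850*0001", "BEG*00*NE*PO1", "SE*3*0001",
    "ST*850*0002", "BEG*00*NE*PO2", "SE*3*0002",
    "IEA*1*000000001",
]
transactions = list(split_transactions(batch))
print(len(transactions))  # two transactions split from one batch
```

Because `split_transactions` is a generator, the same code works unchanged on a multi-gigabyte file object: each transaction can be verified and forwarded as it is yielded, keeping memory use proportional to one transaction rather than the whole batch.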

Key Benefits

  • Consistent data transformation and enrichment
  • Reduced application development and maintenance costs
  • Increased knowledge reuse
  • Scalability of development
  • Standardized components
  • Increased quality of applications
  • Reuse of transformation across the enterprise
  • Decreased time to market of new applications

Unique Differentiators

  • One engine, multiple deployment options
  • High-throughput execution of complex transforms and enhancements
  • In-process data validation
  • Code-free design and deployment
  • Eclipse standard tools
  • Ontological data model for all data types
  • Library of prebuilt functions to accelerate development
  • Metadata that can represent any data format
  • Data syntax and semantic validation in-process