Watch Ivo Kulm's collection of blog entries for the PI REST Adapter, showing architectural concepts and configuration for integrating your SAP PI landscape with other REST services. Dec 2014
OData Adapter in SAP HANA Cloud Integration: Sabarish T S provides in his blog an excellent overview of the OData Adapter and the OData support in the SuccessFactors Adapter that are offered with SAP HANA Cloud Integration. Dec. 2014
Watch the SAP TechEd && d-code Replays! If you missed the integration sessions at TechEd && d-code in Las Vegas, October 20-24, now take the opportunity to watch the replays and catch the newest highlights of our SAP Process Orchestration / SAP Process Integration and HANA Cloud Integration. Nov. 2014
Featured Content in Process Integration (PI) & SOA Middleware
Process Integration - Previously Featured
Results of the Global SAP PI survey now available: The results of this year's global PI survey are now published. As every year, the International Focus Group for Integration conducted a survey, and 362 users from over 20 different user groups participated. Read Holger Himmelmann's blog for the first part of the results with the analysis of the general questions. Nov. 2014
SAP has released SP4 for the B2B Add-on and Secure Connectivity Add-on: Following SAP's announcement that B2B is now included in the SAP Process Orchestration license, Support Package 4 is now released with important enhancements. Get an overview of the major enhancements in the blog by Piyush Gakhar. Oct. 2014
RKT Workshop on SAP HANA Cloud Integration: We are organizing an RKT workshop for SAP HANA Cloud Integration (SAP HCI) in Bangalore on Nov. 20-21, 2014. Visit us and get deep insight into SAP HCI with solution details, use cases and hands-on exercises. Find more information and the registration link in the blog by Piyush Gakhar. Oct. 2014
Results of the Global Process Integration Survey 2014: The blog by Holger Himmelmann reveals the results of the 2014 global SAP PI survey. Thanks to everyone who participated and shared their feedback!
Build your SAP TechEd && d-code agenda! Have you already finalized your agenda for TechEd && d-code Las Vegas? The event is now just one week ahead. Don't miss the integration sessions at TechEd && d-code in Las Vegas, October 20-24. Take advantage of the chance to meet the integration experts in person and catch the newest highlights of our on-premise and cloud-based integration platforms. Learn in demo-rich lectures and hands-on trainings how SAP integration technologies like SAP HANA Cloud Integration and SAP Process Orchestration are being used for the integration of SAP's cloud pillars. Get insights into the current solutions and road maps, and get a preview of prepackaged content for jump-starting your integration projects. Get an overview of relevant SAP Middleware-related sessions in the blog by Smadar Ludomirski, and read the blog by Gunther Stuhec to learn about the important sessions covering the new Integration Advisor tool and see how it works in practice. Oct 2014
Meet SAP at the "I Love APIs 2014" Conference, Sep 8-10 in San Francisco: I Love APIs 2014 is the conference for business and technology leaders driving digital acceleration with big data and APIs, hosted by Apigee. This year SAP has a strong presence at the event, following the strategic partnership announcement between SAP and Apigee at the end of July. August 2014
SP1 of the SAP Process Integration, connectivity add-on 1.0 has just been released. The release consists of an OData Adapter and extensions to the existing SFSF Adapter. Find out more in this blog by Finny Babu and reach out to him with any questions. July 2014
SAP HANA Cloud Integration (HCI) is SAP's strategic integration platform for integrating SAP Cloud applications with other systems, SAP and non-SAP, on premise or in the cloud. The new SAP HANA Cloud Integration editions, Standard and Professional, allow organizations to leverage SAP HCI in arbitrary (i.e. any system to any system) integration scenarios. Join us in Walldorf on July 30th for a free info day to learn more and try HANA Cloud Integration hands-on! Follow Udo Paltzer to find out about other locations coming soon. July 2014
2014 Global SAP Process Integration Survey: The yearly PI survey is open until August 19th, and the 2014 special topics are BPM and Process Orchestration. The survey aims to collect information about the latest state of positioning, organization and use in companies using SAP PI or SAP Process Orchestration as an integration platform. You will find all details and the survey access link in Holger Himmelmann's blog. Participate and help beat the 2013 response record! July 2014
Get your free seat for the Intelligent Business Operations (IBO) Tech Academy in the UK on 25/7: With this blog Tony Read invites you to join SAP and CompriseIT for a hands-on workshop on Intelligent Business Operations covering SAP Process Orchestration and SAP Operational Process Intelligence on HANA. The workshop is free for all SAP customers and prospects. The next opportunity in the UK is on 25/7. Seats are limited, so mail Tony to book your seat today. June 2014
In this blog Meghna Shishodiya announces a new HANA Cloud Integration (HCI) webinar series starting on May 29 with an overview, architecture and security aspects. Bookmark the blog for upcoming dates and join us to learn everything you wanted to know about HCI and get your questions answered. May 2014
SAP HANA Cloud Integration (HCI) Roadmap Webinar, 3 July: How much do you know about SAP HCI and how it is leveraged for the integration of SAP Cloud solutions such as Cloud for Customers, SuccessFactors, Ariba etc.? Join us for a roadmap session to learn the latest innovations and planning for HCI and get your questions answered. June 2014
SAP Process Orchestration Roadmap webinar - recording is now available! The webinar covers the latest innovations and planning for SAP Process Orchestration. You will also learn how Process Orchestration customers can benefit from enabling intelligent business operations (IBO) on the SAP HANA platform and how to take advantage of other Middleware offerings such as SAP Gateway and SAP HANA Cloud Integration (HCI). June 2014
12/6 - global webcast on Intelligent Business Operations: infuse big data insights into your business processes in real time. This webcast will showcase real-life examples of how to work smarter by infusing big data insights into your processes, how to take corrective actions when or even before issues occur, how to anticipate what will happen using predictive analytics, and how to gain real-time visibility into your end-to-end operations. Register for the live event on 12/6 and/or to get the slides and replay. June 2014
Middleware Tech Academies at Saphila 2014, 9-10 June: If you plan to attend the conference, do not miss the Intelligent Business Operations (IBO) hands-on workshops. On both 9 and 10 June you have the opportunity to learn more about the SAP IBO technology bundle and try the software. June 2014
If you are using the SAP Process Orchestration trial offering on AWS, check out this blog by Abhinag Palvadi. You can now access an instance with the latest available version: SP11, equivalent to SP6 of SAP Process Orchestration 7.4. May 2014
2014 Global SAP Process Integration Survey: The yearly PI survey is now open, and the 2014 special topics are BPM and Process Orchestration. The survey aims to collect information about the latest state of positioning, organization and use in companies using SAP PI or SAP Process Orchestration as an integration platform. You will find all details and the survey access link in Holger Himmelmann's blog. Participate and help beat the 2013 response record! May 2014
May 27th - join us for an SAP Process Orchestration Roadmap Webinar! The webinar will be delivered by SAP Product Management and will cover the latest innovations and planning for SAP Process Orchestration. You will also learn how Process Orchestration customers can benefit from enabling intelligent business operations (IBO) on the SAP HANA platform and how to take advantage of other Middleware offerings such as SAP Gateway and SAP HANA Cloud Integration (HCI). May 2014
SP3 of B2B Add-On and Secure Connectivity Add-On released! In this blog Piyush Gakhar introduces all the new features and improvements in the SP3 release, like secure PGP key storage, multiple directory support for the SFTP adapter, SLA alerts, enhanced AS2 support, the EDI content manager and more. April 2014
In this blog Christian Loos announces that the former individual PI and BPM product roadmaps have been combined into one single Process Orchestration roadmap. Review all planned innovations and stay tuned for the upcoming roadmap webcast. April 2014
SAP Middleware newsletter – the newest addition to the SCN newsletters family! Free monthly insight into all innovations in ALM, Software Logistics, Software Defined Data Center solutions, Virtualization & Cloud Management, Architecture, Process Orchestration, Decision Service Management, Operational Process Intelligence, SAP HANA Cloud Integration, Big Process and Big Data, and powerful technology bundles like Intelligent Business Operations powered by SAP HANA. Mar 2014
Explained: the new B2B Trading Partner Management functionality. In this blog Shilpa Nair and Sarath Sasi provide a summary of and links to all recent content on the new Trading Partner Management functionality shipped with SP2 of the SAP Process Orchestration B2B Add-on. Feb 2014
IFG Survey: Central PI Monitoring with SAP Solution Manager: Monitoring is a key challenge according to the global 2013 PI survey. To gain a better understanding of the situation and requirements, the IFG for PI and SAP have launched a follow-up survey focused on central PI monitoring with SAP Solution Manager. Read all the details in Holger Himmelmann's blog and participate. Jan 2014
SAP Orchestration and Integration Solutions: TechEd 2013 Replays. Missed TechEd? In this blog Gabriela Gahse highlights the available replays from TechEd Las Vegas 2013 and invites you to check out the latest and greatest on B2B with Process Orchestration and HANA Cloud Integration (HCI), SAP Operational Process Intelligence (OPInt), as well as Business Rules and Decision Service Management (DSM) with SAP. Jan 2014
Released: SAP NetWeaver Process Orchestration B2B Add-On SP2. In this blog Piyush Gakhar highlights the main enhancements in the latest SP, like Trading Partner Management and new message support for TRADACOMS and EANCOM. Read this blog for a full overview of all new EDI and B2B features and enhancements. Dec 2013
Roadmap for SAP HANA Cloud Integration: In this presentation Udo Paltzer provides an overview of the current capabilities and use cases, as well as the roadmap for SAP HANA Cloud Integration (HCI). Do not miss the HCI library on SCN, where you can find everything about HCI in one place. Dec 2013
Results: Global SAP NetWeaver PI Survey 2013 - Part 1 of 2 and Part 2 of 2. These two blogs by Holger Himmelmann reveal the results of the 2013 global SAP NetWeaver PI survey. Thanks to everyone who participated and shared their feedback! Nov 2013
Process Orchestration, HANA Cloud Integration, OData and more. Read the summary and key takeaways of SAP Mentor Shabarish Vijayakumar from the integration arena at this year's TechEd Las Vegas. Nov 2013
In this blog Udo Paltzer shares details about the opportunity to get hands-on experience with SAP HANA Cloud Integration. Join the program and become one of the early adopters! Nov 2013
SAP HANA Cloud Integration (HCI): Getting Started. The first set of projects is underway in the HCI space, and now is a good time to take a closer look at the HCI world. Review this blog by Meghna Shishodiya and also SAP HANA Cloud Integration: An Intro by Sujit Hemachandran. Nov 2013
TechEd Amsterdam is in full swing now! Find out from Alexander Bundschuh which sessions are a must-see in the areas of SAP NetWeaver BPM | PI | Process Orchestration, SAP HANA Cloud Integration, Business Rules, B2B and SAP Operational Process Intelligence powered by SAP HANA. Nov 2013
Webcast: SP1 of B2B Add-on with SAP NetWeaver Process Orchestration - Overview and Roadmap. This session is part of the ramp-up knowledge transfer program and will be presented by Piyush Gakhar from SAP Product Management. Read this blog for details and join us on Sep 30 or Oct 1!
SAP NetWeaver Process Orchestration – the best is yet to come! In this blog Volker Stiehl explains why you should opt for Process Orchestration as your single Middleware platform from SAP. Also check in detail What is new in SP7 of SAP NetWeaver Process Orchestration 7.31 and see all the new and continuous investments that make SAP's Middleware platform best in class.
Global SAP NetWeaver PI Survey 2013: new record and 4 weeks to go! The survey will be closed on August 24th. Read more in Holger Himmelmann's latest blog and do not miss the chance to share your feedback. Thanks to everyone who already participated.
How AmerisourceBergen uses SAP's Process Orchestration technologies: In this blog Eduardo Chiocconi shares key takeaways from SAPPHIRE NOW 2013 and highlights a great customer story. Watch the full session replay to learn how AmerisourceBergen, one of the world's largest pharmaceutical services companies, is transforming its business with SAP NetWeaver Process Orchestration technologies.
SAP NetWeaver Process Orchestration Webcast Series 2013: Join us for a five-webcast series to hear the latest news about Process Management software from SAP: SAP NetWeaver Process Orchestration including B2B, SAP Operational Process Intelligence and SAP NetWeaver Decision Service Management. Please share with anyone who may be interested. We look forward to meeting you there.
Global Survey for SAP NetWeaver Process Integration 2013: The 2013 PI survey is now on, and the focus this year is, not surprisingly, B2B and EDI. More details and the survey access link are in Holger Himmelmann's blog. We look forward to your participation.
Try SAP NetWeaver Process Orchestration in the Public Cloud! SAP is now giving you a free license to try SAP NetWeaver Process Orchestration in the cloud. Read this blog to see how you can get started.
Simple use cases with the SAP NetWeaver Process Orchestration B2B Add-on: Check out these new articles by Vikas Singh Rajpurohit providing use cases and configuration options for OFTP; SFTP and PGP; AS2 and EDI Separator; Modules and the X400 adapter available with the SAP B2B Add-on.
* What is new in SP6 of SAP NetWeaver Process Orchestration 7.31: Find out about the multiple new features and enhancements for Business Process Management, Process Integration and Orchestration scenarios. See how the integration between SAP NetWeaver PI and BPM has been tightened further.
* Generate a Migration Report to estimate the migration effort from PI dual-stack to AEX: Migration from PI dual-stack to single-stack (Java-only) can be a daunting task. In this article William Li presents a Java client program (read only) that browses through all the configurations in the Integration Directory and produces a report helpful for a preliminary assessment of the migration task in your landscape.
* cbs PI MeMo App for Mobile Message Monitoring
* Released: SP1 of B2B Add-on and SFTP PGP with SAP NetWeaver Process Orchestration
* Upgrade options to the latest Process Integration or Process Orchestration
* IFG for PI: Subscription process for new PI Features in SAP Customer Connection Program
* Client Certificate based authentication while using ABAP Web Service for communication between ERP and SAP NetWeaver PI and The Myth of a Load Balancer - PI/Web Service Scenario
* Try SAP NetWeaver Process Orchestration in Public Cloud!
* Near Zero Downtime Management for SAP NetWeaver Process Integration (nZDM/PI) available on Service Marketplace
* How to Load keys and certificates in SAP PI 7.3, SAP PO 7.3 EHP1 NWA's Key Storage
* Dynamic filename in mail receiver adapter made easier
* TechEd 2012: Process Orchestration session replays!
* SAP NetWeaver Process Orchestration technology in Healthcare
* Consolidated view on release notes for Process Integration and Orchestration
* SAP NetWeaver Process Orchestration PI | B2B | BPM | BRM on SCN - October
* SAP TechEd 2012 Sessions covering Process Orchestration with focus on Process Integration
* Results: 1) 2012 Global PI Survey 2) PI Requirements Prioritization
* SAP’s B2B Integration Strategy
* New SAP NetWeaver Process Orchestration RDS for EDI available now!
* Getting Started with SAP NetWeaver Process Orchestration
* IWAY adapters bundled within Process Integration / Orchestration
2012 Global SAP NetWeaver PI Survey
Get your hands on those (precious) module JAR files!
Have you ever wondered why an adapter module behaves a certain way... and not the way you want it to?
Have you ever wondered what SAP developers' module coding looks like?
Have you ever wondered how to get the source code for a custom adapter module that was deployed by a developer who has already left the organization?
Have you ever wondered if an adapter module has any undocumented parameters, hidden features or unknown use cases?
If you have... well... wonder no more! With the following simple steps and a lil' bit of patience, in no time you'll be getting your hands on those precious adapter module JAR files... and plowing through the source code for your bedtime reading!
Step by Step Example
In this example, I want to get my hands on MessageLoggerBean, a standard SAP module that is not listed in SAP's Help Library but has been mentioned in Vadim Klimov's Message Staging and Logging Options in Advanced Adapter Engine of PI 7.3x blog. This is a really handy module for logging purposes, and I would like to know more about how it works.
So, here are the steps to go about it!
Step 1 - Get module details from the JNDI Browser
Log in to NWA and navigate to Troubleshooting -> Java -> JNDI Browser.
Search for the module and get the details from the Object Value. Copy the value of clientAppName, ignoring the .app prefix. In this case, it is com.sap.aii.af.
Step 2 - Get the corresponding EJB class for the module
Next, navigate to Configuration -> Infrastructure -> Application Modules.
Enter the value com.sap.aii.af copied above in the filter. From the remaining entries, look for the one with an additional .ejb.jar suffix appended to the value entered; in this case it is the com.sap.aii.af.ejb.jar entry. Next, find the module name in the JAR file's list of Enterprise JavaBeans and get the corresponding EJB class name. In this case, the EJB class name is com.sap.aii.af.app.modules.MessageLoggerBean.
Step 3 - Locate JAR file in PI server via Telnet
With the EJB class name in hand, the next step is to locate the actual JAR file location in the server file system. This is accomplished via Telnet, which is explained in Vadim Klimov's How to Find Java Library Resource File Possessing Information for Class Name blog.
Do note that Telnet is limited to localhost, as mentioned in this documentation, so it must be done from the PI server.
First log in to the OS of the PI server. Then execute the following command and log in with administrator credentials. For older releases, this will be the J2EE_ADMIN user.
telnet localhost 50008
Execute the LLR command below with the EJB class name from the previous step. Note that the delimiters for the class package hierarchy must be changed from dot (.) to slash (/).
llr -all -f com/sap/aii/af/app/modules/MessageLoggerBean.class
The command returns the location of the JAR file on the file server.
Navigate to that location on the file server. Retrieve the file by whatever means available (WinSCP, FTP, etc.) or ask your friendly neighbourhood Basis consultant for assistance.
Step 4 - Decompile JAR file to get access to the source code
With the JAR file in hand, use a Java decompiler like JD-GUI to decompile it and access the adapter module Java source code.
Conclusion
Voila!
In 4 easy steps, we now have access to the source code, and hopefully that will make our lives as PI consultants a bit simpler (or a bit more interesting, if you like!)
Once you learn the concepts behind this technique, it can be used to get your hands on different types of JAR files in the PI system (like those pesky missing implementation classes in the mapping JAR files). As they say, "The world is your oyster"!
Get started with SAP HANA Cloud Integration
Welcome to SAP HANA Cloud Integration!
Do you want to be one of the early adopters? Then test and learn more about HCI. We offer you a tenant with exclusive access to get first hands-on experience with our cloud-based integration solution. Don't miss this opportunity to work closely with SAP development on this new solution. Today, HCI is used primarily to integrate:
- SAP's Cloud for Customer application to on-premise SAP CRM / SAP ERP
- SuccessFactors BizX suite of applications to SAP HCM
These applications provide pre-packaged integration content for HCI, presented in an Integration Content Catalog and accessible via a web-based application. It eases the daily work of configurators, administrators and business analysts exploring ready-to-run integration content as well as introductory information and demos. The integration content covers templates with prebuilt process integration flows, data integration flows and other integration artifacts that significantly reduce implementation time, cost and risk. These templates provide the basis for easy adaptation to specific business needs.
The design time is Eclipse-based, offering an Integration Designer perspective for integration developers to configure, deploy, administer and monitor integration flows on a detailed level.
Have a look at the SAP HANA Cloud Integration Landing Page - Public Integration Content Catalog to see how easy it is for customers and partners to find and understand what it is all about.
HCI - hosted in the SAP HANA Cloud and offered as a managed service on top of SAP HANA Cloud Platform - comes with completely new architecture and deployment options that are designed for, and best suited to, cloud-cloud and cloud-on-premise integration and process orchestration. Since the integration can be consumed as a service, the solution provides a multi-tenant architecture and comprises the highest level of security features, such as content encryption, signing of messages, encrypted data storage and certificate-based authentication. It contains a core runtime for processing, transformation and routing of messages, as well as out-of-the-box connectivity support (IDoc over SOAP, SFTP, SOAP/HTTPS, SuccessFactors adapter). SAP HANA Cloud Integration will be developed towards a feature-rich cloud-based integration platform. A continuously increasing set of connectors and available enterprise integration patterns will lay the foundation for this.
New content will be posted here, so stay tuned!
- SAP HANA Cloud Integration on SAP.com
- SAP HANA Cloud Integration documentation on SAP Help Portal
- SAP HANA Cloud Integration Product Availability Matrix
- SAP HANA Cloud Integration Knowledge Transfer on SAP Service Marketplace (S-user required)
- SAP Financial Services Network documentation on SAP Help Portal
- SAP Financial Services Network Connector documentation on SAP Help Portal
- SAP Financial Services Network Knowledge Transfer on SAP Service Marketplace (S-user required)
- SAP HANA Cloud Integration - Tools
- NEW! SAP HANA Cloud Integration - Public Integration Content Catalog
- SAP Financial Services Network (SAP FSN)
- SAP HANA Cloud Developer Center
- SAP Data Services
- SAP NetWeaver Process Orchestration
- NEW! SAP HANA Cloud Integration: Standard and Professional Editions
- NEW! SAP HANA Cloud Integration for Application-Development Partners
- SAP HANA Cloud Integration Roadmap
- SAP HANA Cloud Integration – Early Customer and Partner Project - sign up to try HCI. It's easy!
- SAP HANA Cloud Integration: Getting Started
- SAP HANA Cloud Integration: An Intro
- NEW! SAP HANA Cloud Integration - a complementary offering to SAP Process Orchestration
- SAP HANA Cloud Integration for Data Services Tutorials
- Missed the TechEd 2013 conference? Check out the available replays of SAP HANA Cloud Integration sessions!
- E-Learning for Integration of SAP Cloud for Customer with SAP ERP and SAP CRM
- A recent information drop reveals many new details for HANA Cloud Integration
- A first attempt at using HANA Cloud Integration to integrate Workday and Salesforce
PI-Acknowledgment contains system errors
Summary: Although your data is sent successfully, the Ack. Status field in PI monitoring displays the error message "Acknowledgement contains system errors", as shown in the figure below. This document gives you an idea of how to disable the default acknowledgement for the desired message type from the sender SAP ECC system.
To disable the acknowledgment, we have to set the type field to three blanks in table IDXNOALE, either for our message type or globally with message type * in this table. The four cases below depict the various possibilities.
Step 1: Execute report IDX_NOALE in PI and enter the relevant Sender Port and Client of the SAP ECC sender system that is sending data to PI.
Step 2: Create an entry for the Message Type of the IDoc you send through PI.
- Click 'Request Acknowledgments' and uncheck the checkboxes below: the message will be passed without requesting an acknowledgment, and the error shown in Figure 1 won't be displayed in PI monitoring.
For the above scenario, the entry in table IDXNOALE is as below, which means acknowledgments are requested for all message types except our desired message type, because for our specific message type we have set three blanks in the type field of table IDXNOALE, as shown below.
- Click 'Request Acknowledgments' and check the checkboxes below: the message will be passed requesting an acknowledgment, and the error shown in Figure 1 will be displayed in PI monitoring.
For the above scenario, the entry in table IDXNOALE is as below, which means acknowledgments are requested for all message types, including our desired message type (the type field has the value 'XXX').
- Click 'Do Not Request Acknowledgments' and uncheck the checkboxes below: the message will be passed without requesting an acknowledgment, and the error shown in Figure 1 won't be displayed in PI monitoring.
For the above scenario, the entry in table IDXNOALE is as below, which means acknowledgments are not requested for any message type, including the defined message type.
Note that we don't even need to maintain an entry for our specific message type here, because once we select the 'Do Not Request Acknowledgments' option it inserts the generic (*) entry with the type field as three blanks in the table; that is why it is called a generic setting.
- Click 'Do Not Request Acknowledgments' and check the checkboxes below: the message will be passed requesting an acknowledgment, and the error shown in Figure 1 will be displayed in PI monitoring.
For the above scenario, the entry in table IDXNOALE is as below, which means acknowledgments are not requested for any message type except the defined message type.
Java Mapping: Unzip file and create additional attachments
Background
I had a requirement recently which dealt with incoming Zip files. Basically, there are a few files in the Zip file and each of them will be used differently in subsequent processing, i.e. fileA will be processed by interface A, fileB by interface B. In order to achieve this, it is important to preserve the filename for each file in the Zip file. Additionally, a copy of original Zip file needs to be saved in the target directory too.
This requirement could not be achieved by using the standard module PayloadZipBean because:
- The main payload is replaced with the contents of the first file in the Zip file
- The filename of the first file that replaces the main payload is no longer available
In this article, I will share the custom Java mapping that was developed to fulfill this requirement. The core logic is to:
- Unzip main payload and create additional attachments for each file in Zip file
- Retain filename of each file
Source code
Below is a snippet of the portion of the code which deals with the unzipping and attachment creation.
ZipInputStream is used to read the compressed data. The logic loops through getting each ZipEntry to retrieve the content of each file. The filename is retrieved and used to create the additional attachment for the message.
// Unzip input file
ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(content));
ZipEntry ze = null;
// Loop through all entries in the zip file
while ((ze = zis.getNextEntry()) != null) {
    byte[] zipContent = getInputStreamBytes(zis);
    // Create attachment
    String name = ze.getName();
    String ctype = "text/plain;name=\"" + name + "\"";
    if (outAtt != null) {
        Attachment att = outAtt.create(name, ctype, zipContent);
        outAtt.setAttachment(att);
    }
    zis.closeEntry();
}
zis.close();
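The snippet above calls a helper method getInputStreamBytes() that is not shown here. A minimal sketch of such a helper, assuming it simply buffers the current entry's bytes into an array (the real implementation in the repository may differ), could look like this:

// Hypothetical helper assumed by the snippet above: buffers the bytes of
// the current ZipEntry from the stream into a byte array
// imports: java.io.ByteArrayOutputStream, java.io.IOException, java.io.InputStream
private byte[] getInputStreamBytes(InputStream inStream) throws IOException {
    ByteArrayOutputStream baos = new ByteArrayOutputStream();
    byte[] buffer = new byte[4096];
    int read;
    while ((read = inStream.read(buffer)) != -1) {
        baos.write(buffer, 0, read);
    }
    return baos.toByteArray();
}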
The full source code can be found in the following public repository on GitHub:
GitHub repository for PI_JavaMapping_UnzipAttach
Additionally, the JAR file is also available for direct download from GitHub:
com.equalize.xpi.esr.mapping.unzip.jar
Testing
Below are some screenshots of a test run using the Java mapping.
Firstly, we have a Zip file in the source folder. This Zip file contains two files within it.
From the message log, we can see that the original payload (logged with MessageLoggerBean) does not contain any attachments.
After the mapping step (AM) is performed, there are now two additional attachments in the message.
The message is then delivered to an SFTP receiver channel (with Store Attachments checked). The channel processes all three files: the main payload and the 2 attachments.
Finally, we can see the 3 files in the target folder. The name of the first file is retained as-is, while the two attachments have filenames which are a concatenation of the main payload filename and the attachment name, i.e. <main_filename>_<attachment_name>.
Reference
For further reference on dealing with Zip files, the following article covers the reverse flow, where an incoming payload with multiple attachments is compressed into a single outgoing Zip payload.
Multi-mapping with Dynamic Configuration - SOAP loopback approach
They say "you can't have your cake and eat it."
They also say you cannot use dynamic configuration with multi-mapping.
But what if you could...
Introduction
In this article, I will demonstrate an approach to overcome one of the most common limitations of multi-mapping: the usage of dynamic configuration.
As each child message of a multi-mapping split shares the same dynamic configuration header during the mapping step, it is not possible to assign different values (for a particular namespace attribute) during mapping for each child message. One common scenario that requires such functionality is when a source message needs to be split into multiple files, and each target file needs to be dynamically named.
This approach is named the SOAP loopback approach and has the following benefits:
- Minimal coding required, i.e. no custom adapter module development
- Utilize standard features of PI for design and configuration
- No additional file system space required for placing temporary intermediate files
Approach
The key concepts to achieve this approach are:
- Intermediate structure with additional placeholder field to store value of dynamic configuration
- Loopback from the receiver of iFlow 1 to the sender of iFlow 2 via the SOAP adapter
Below are the flow diagrams of the two iFlow scenarios.
(Flow diagrams: iFlow 1 and iFlow 2)
Following are the steps for the design and configuration of this approach.
Design
Data Type
Here is an example of the data type. The source contains a segment with multiple occurrences, while the target has only a single occurrence (for each order). An additional intermediate structure is defined that is exactly the same as the original target structure, but with an additional placeholder field for dynamic configuration.
(Screenshots: data type definitions for the Source, Target and Intermediate Target structures)
Service Interface
The sender and receiver interfaces are defined. Additionally, an abstract interface based on the intermediate structure is defined. This abstract interface will be used as the receiver of the first iFlow and also the sender of the second iFlow. The associated Message Types are also created.
Service Interface | Category | Type | Associated Data Type |
---|---|---|---|
SI_O_A_Order_Source | Outbound | Async | DT_Order_Source |
SI_I_A_Order_Target | Inbound | Async | DT_Order_Target |
SI_A_A_Order_Target_Intermediate | Abstract | Async | DT_Order_Target_Intermediate |
Message Mapping
Message mappings are created for the two iFlow scenarios. The important thing to note here is that the first iFlow mapping is a 1-n multi-mapping, while the second iFlow mapping is a normal 1-1 mapping.
iFlow | Source Message | Source Occurrence | Target Message | Target Occurrence |
---|---|---|---|---|
First | MT_Order_Source | 1 | MT_Order_Target_Intermediate | Unbounded |
Second | MT_Order_Target_Intermediate | 1 | MT_Order_Target | 1 |
For the first mapping, the source is mapped to the intermediate structure. Additionally, the filename is dynamically constructed based on source fields, and mapped to the placeholder field in the intermediate structure.
For the second mapping, all the fields are mapped 1-1 from the intermediate structure to the final target structure. Additionally, the value in the placeholder field of the intermediate structure is used in a UDF mapping to set the dynamic configuration value, as sketched below. This can be mapped to a variable or to any arbitrary field with occurrence = 1.
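As an illustration, here is a minimal UDF sketch for this step. It uses the standard PI mapping API (com.sap.aii.mapping.api) and the file adapter's FileName attribute; the method name and the choice of attribute are examples, not taken from the original article:

// UDF sketch: writes the filename built in the first mapping (passed in
// from the placeholder field) into the FileName Dynamic Configuration
// attribute, so the receiver SFTP/file adapter can use it via ASMA
// imports: com.sap.aii.mapping.api.*
public String setFileName(String fileName, Container container) throws StreamTransformationException {
    DynamicConfiguration conf = (DynamicConfiguration) container
            .getTransformationParameters()
            .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    DynamicConfigurationKey key = DynamicConfigurationKey
            .create("http://sap.com/xi/XI/System/File", "FileName");
    conf.put(key, fileName);
    return fileName;
}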
Operation Mapping
The operation mappings are similar to the message mapping for the two iFlow scenarios.
iFlow | Source Interface | Source Occurrence | Target Interface | Target Occurrence |
---|---|---|---|---|
First | SI_O_A_Order_Source | 1 | SI_A_A_Order_Target_Intermediate | Unbounded |
Second | SI_A_A_Order_Target_Intermediate | 1 | SI_I_A_Order_Target | 1 |
Configuration
Systems and Communication Channel
For the configuration steps, we will need to configure 2 iFlows, so there will be 2 sender systems and 2 receiver systems. For simplicity's sake, we will reuse the same sender and receiver systems, so in this example there will only be BC1 and BC2. Alternatively, the receiver in the first iFlow and the sender of the second iFlow can be any arbitrary system or virtual component.
iFlow | Channel | Associated System | Direction | Adapter Type |
---|---|---|---|---|
First | CC_SOAP_S_XI_ProxySender | BC1 | Sender | SOAP (XI3.0) |
First | CC_SOAP_R_LoopbackReceiver | BC2 | Receiver | SOAP |
Second | CC_SOAP_S_LoopbackSender | BC1 | Sender | SOAP |
Second | CC_SFTP_R_DemoOrder | BC2 | Receiver | SFTP |
iFlow
We will configure the iFlow for the second scenario first, as the SOAP endpoint URL generated from the second scenario will be needed in the first scenario.
iFlow 2
The second iFlow will be used for processing the individual child messages to the final target. The main things to take note of during configuration here are:
- Sender SOAP channel used for loopback is set to QOS Exactly Once to ensure async processing
- ASMA needs to be set for the receiver SFTP channel
Sender System | Sender Interface | Sender Channel | Operation Mapping | Receiver System | Receiver Interface | Receiver Channel |
---|---|---|---|---|---|---|
BC1 | SI_A_A_Order_Target_Intermediate | CC_SOAP_S_LoopbackSender | OM for iFlow 2 from design step | BC2 | SI_I_A_Order_Target | CC_SFTP_R_DemoOrder |
Once iFlow 2 is completed, we need to get the SOAP endpoint for the iFlow. Follow the steps in Generating WSDL in IFLOW in SAP PO 7.4 to switch the perspective and copy the endpoint.
iFlow 1
Now we are ready to configure the first iFlow that performs the multi-mapping to the intermediate structure. The main thing to note here is the configuration of the receiver SOAP channel for loopback.
Sender System | Sender Interface | Sender Channel | Operation Mapping | Receiver System | Receiver Interface | Receiver Channel |
---|---|---|---|---|---|---|
BC1 | SI_O_A_Order_Source | CC_SOAP_S_XI_ProxySender | OM for iFlow 1 from design step | BC2 | SI_A_A_Order_Target_Intermediate | CC_SOAP_R_LoopbackReceiver |
Using the value of the endpoint from iFlow 2, enter this into the SOAP receiver channel's target URL; the hostname can be changed to localhost. Enter a messaging service user with sufficient credentials for the SOAP channel; typically this will be PIAPPL<SID>.
Testing Results
Here are the results from testing of the design.
A test message with two orders in the source XML is sent via the testing functionality.
iFlow 1
In the payload after mapping of iFlow 1, we can see two child messages with the dynamically constructed filenames in the placeholder field.
The log also shows the child messages created from the split.
The log for the child message shows that the message is sent to the endpoint URL of the second iFlow.
iFlow 2
Next we can view the payload and logs of the child messages in the second iFlow. From the logs we can see that the child message has been transformed to the final target structure, and the filename is stored in the dynamic configuration attribute, which is finally used during processing by the SFTP receiver channel.
First child message
Second child message
Finally, we see in the target folder, the two files created for two child messages with dynamically constructed filenames.
Conclusion
Voila! There we go, "having our cake and eating it"!
With this no-hassle approach, another common hurdle in the PI world can now be easily overcome.
PI-Send data to dynamic web service receiver
Summary: In this scenario we'll send data to a dynamic SOAP receiver, i.e. the data is sent to the SOAP URL that we receive from the sender payload. With this, we can send data to different receiver systems with the same receiver communication channel. As this becomes an asynchronous communication (instead of the synchronous communication of the SOAP receiver), it avoids timeout errors.
Create a function library and pass the URL and username from the sender payload.
Create a DynamicConfigurationKey to hold the dynamically created URL and username: store the URL in TServerLocation and the username in TAuthKey (see the sketch after this list).
In the message mapping, pass the URL and username from the sender data to the function library.
Enter any name in the Target URL field, as it is a mandatory field for the receiver communication channel.
Set the variable header to TServerLocation, and maintain the username in the Key field and the related passwords under Authentication Keys.
Note: There are multiple Key and Password fields here; you can maintain the username and password for every system that can receive this data.
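For reference, here is a minimal sketch of such a function library method. The method name is illustrative; it assumes the standard PI mapping API and the adapter-generic attributes TServerLocation and TAuthKey in the http://sap.com/xi/XI/System namespace:

// Sketch: store the receiver URL in TServerLocation and the authentication
// key (username) in TAuthKey so the SOAP receiver channel, configured with
// variable transport binding, resolves the target dynamically at runtime
// imports: com.sap.aii.mapping.api.*
public String setDynamicReceiver(String url, String user, Container container) throws StreamTransformationException {
    DynamicConfiguration conf = (DynamicConfiguration) container
            .getTransformationParameters()
            .get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    conf.put(DynamicConfigurationKey.create("http://sap.com/xi/XI/System", "TServerLocation"), url);
    conf.put(DynamicConfigurationKey.create("http://sap.com/xi/XI/System", "TAuthKey"), user);
    return url;
}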
Benefits
- Different receiver systems can use the same channel.
- Less time to develop the scenario.
- Avoids timeouts in synchronous scenarios by making them asynchronous.
Dependencies
- Since the Target URL is mandatory, we must enter at least 'http://'; otherwise it throws an error in the payload, as shown in the figure below, and does not allow the data to be sent even though the dynamic URL is correct.
- In the Authentication Keys section, the Key (username) and Password that can access the system must be defined as shown below.
- The username and URL fields in the sender payload are mandatory for receiving data.
Group, Sort and handle Duplicate XML records – using UDF
The User Defined Function (UDF) logic below can be used to group, sort and handle duplicate XML records.
Note:
- The number of queue entries in each input field to the UDF should be equal (except for key fields). Use 'mapWithDefault' for all optional input fields.
- The input fields' context should be removed. Use 'removeContexts' OR right-click on the field and change 'Context'.
- Create the UDF with Execution Type 'All Values of Queue'. Input all related fields into one 'multiple input and multiple output' UDF. When a match is found, write all related fields to the output.
- Please try out the examples below in a sandbox, and then solve your actual requirement.
Grouping XML records
Here are the input XML and the expected output XML:
Mapping:
UDF:
The UDF searches for 'Name_People' in 'Name_Record'; if it finds a match, the corresponding 'Where' and 'Phone' are written to the target fields with a context change. A context change is added to the output 'Record_out' for each 'Name_People'.
Sorting XML records
Here are the input XML and the expected output XML:
Mapping:
The mapping is the same as above; add the standard function 'sort' to the key field. The UDF logic is the same as above. The output XML will be sorted by Name_People.
Handling Duplicate XML records
Here are the input XML and the expected output XML; duplicate records are ignored.
Mapping:
UDF:
HashSet.add() returns true if a unique value is added, and false if a duplicate is added.
UDF code for reference
// Grouping XML records.
public void udf_getCorrespondingRecord(String[] Name_People, String[] Name_Record, String[] Where, String[] Phone,
        ResultList Record_out, ResultList Where_out, ResultList Phone_out, Container container)
        throws StreamTransformationException {
    for (int i = 0; i < Name_People.length; i++) {
        for (int j = 0; j < Name_Record.length; j++) {
            if (Name_People[i].equals(Name_Record[j])) {
                Record_out.addValue(" ");
                Where_out.addValue(Where[j]);
                Phone_out.addValue(Phone[j]);
                Where_out.addValue(ResultList.CC);
                Phone_out.addValue(ResultList.CC);
            }
        }
        Record_out.addValue(ResultList.CC);
    }
}
// If a match has to be found using a combination of two fields, use logic like this in the above UDF:
// if ((Name_People[i] + Where[i]).equals(Name_People[j] + Where[j]))

// Handle duplicate XML records.
public void udf_getUnique(String[] Id, String[] Name, String[] field1,
        ResultList Record_out, ResultList Id_out, ResultList Name_out, ResultList field1_out, Container container)
        throws StreamTransformationException {
    Set<String> unique = new HashSet<String>();
    for (int i = 0; i < Id.length; i++) {
        if (unique.add(Id[i])) { // True if unique.
            Record_out.addValue(" ");
            Id_out.addValue(Id[i]);
            Name_out.addValue(Name[i]);
            field1_out.addValue(field1[i]);
            Id_out.addValue(ResultList.CC);
            Name_out.addValue(ResultList.CC);
            field1_out.addValue(ResultList.CC);
        }
    }
}
Special character handling in PI/XI, simplest ever
We often come across this very common issue where special characters received from ECC pass through PI successfully, but processing fails at the receiving system because the receiver does not accept special characters.
Especially when dealing with bank scenarios, the protocols at the bank's end are very strict and payments are rejected very frequently.
The major problem is that these characters often go undetected, as they are not visible to the naked eye in the payload.
If you open the payload in Notepad, you will be able to spot them.
Normally, when we try to deal with special characters, we focus on the characters that are not acceptable, but with that technique we usually miss one character or another and the problem persists.
So to deal with special character problems, I have used a method which is foolproof and can be used easily, in a very simple way, in any version of PI/XI.
In this method we concentrate on the list of allowed characters rather than the disallowed ones.
It is simple, reusable Java code which can be used to write a UDF and applied to any input field that needs special character handling.
In this code we match the input field character by character against the allowed list; if a character of the input field matches one from the allowed list, we add it to a string buffer and return the buffer's contents at the end, as sketched below.
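Since the original code box is not reproduced here, below is a minimal UDF sketch of the described whitelist approach; the allowed-character list is an example and should be adjusted to whatever the receiver's protocol permits:

// Whitelist approach: keep only characters found in the allowed list and
// silently drop everything else (the allowed list below is just an example)
public String filterAllowedChars(String input, Container container) throws StreamTransformationException {
    String allowed = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789 .,-/";
    StringBuffer sb = new StringBuffer();
    for (int i = 0; i < input.length(); i++) {
        char c = input.charAt(i);
        if (allowed.indexOf(c) >= 0) {
            sb.append(c);
        }
    }
    return sb.toString();
}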
This is the best and simplest practice ever to deal with special characters in PI.
PIP (Partner Interface Process) Codes: Consolidated List
The business supply chain domains for which PIPs are specified are divided into many clusters. Each cluster is further sub-divided into segments, and each segment consists of several PIPs.
PIPs include the specification of partner business roles such as buyer, seller and so on; the activities conducted between the roles; and the type, content and sequence of documents exchanged by the roles while performing these activities. Also specified are the timing, security and authentication of these interactions. The structure and content of the documents exchanged is based on the business dictionary and technical dictionary. The PIP is a compilation of this message type information into document type definitions (DTDs) and message guidelines.
Trading partners exchange documents that match the DTDs using network protocols specified by RNIF.
The codes below may be useful for various purposes:
4A1 -- Notify of the Strategy Forecast
4A3 -- Notify of Release Forecast Information
4A5 -- Notify of Forecast Reply
3A4 -- Purchase Order Request
3A4R -- Purchase Order Response
3A8 -- Purchase Request
3A8R -- Purchase Order Change Confirmation
3B2 -- ASN (Advance Shipment Notification)
3B3 -- Distributed Shipment Receipt
4B2 -- Notification of Shipment Receipt
4C1 -- Distributed Inventory Receipt
3C3 -- Notify of Invoice
3C6 -- Notify of Remittance Advice
3C7 -- Notify of Self-Billing Invoice
5C1 -- Distributed Product List
7B1 -- Distributed Work in Progress
If the code starts with:
3* -- It mainly deals with purchase-order-related interfaces at the ECC level.
4* -- It mainly deals with forecast information in SCM.
5* -- Product-related information.
7* -- Distribution information in SCM.
Example:
3 | A | 8 | R |
---|---|---|---|
Cluster | Quote | Order Entry | Request Purchase Order |
This completes the consolidated list of PIP codes.
DynamicAttributeChangeBean - The no-mapping solution to changing Dynamic Configuration ... dynamically!
Introduction
As there is no dedicated Managed File Transfer (MFT) capability in PI (yet!), a passthrough scenario is a common approach to achieving a 1-1 file transfer interface. A passthrough scenario is one of the simplest configurations in PI, as there are no ESR objects required.
However, every now and then we come across requirements that slightly complicate the passthrough scenario, e.g. changing the filename extension, adding a prefix to the original filename, or adding a timestamp in a custom format. These cannot be achieved by the standard file-based adapters. To get around this, some form of mapping is required, and therefore we return to the necessity of having ESR objects. Here are some common approaches so far:
- Create a passthrough Java mapping that dynamically modifies the Dynamic Configuration attributes
- Configure FCC at sender, perform 1-1 mapping and configure FCC at receiver
- Execute additional OS-level commands/scripts to perform renaming/moving of file
DynamicAttributeChangeBean (DACB) is an adapter module solution that aims to fill this gap, in particular addressing these passthrough scenarios. DACB is another "develop/deploy once, use multiple times" adapter module (the likes of ExcelTransformBean) that is highly configurable and reusable.
Source Code
The full Java source code can be found in the following public repository on GitHub.
GitHub repository for DynamicAttributeChangeBean
The EAR deployment file is also available for download from the following GitHub repository release.
This EAR can be used for direct deployment in order to use the adapter module bean. Note that this EAR was created based on EJB 3.0 in NWDS 7.31 SP13 Patch 0. For earlier versions of PI, the EJB and EAR files will have to be manually built and compiled using the source code provided.
Module Parameter Reference
Below is a list of the parameters for configuration of the module. Certain parameters automatically inherit their default values if they are not configured.
Parameter Name | Allowed values | Default value | Remarks |
---|---|---|---|
mode | add, change, delete, regex, none | | Required field. Determines the mode of processing |
namespace | | http://sap.com/xi/XI/System/File | Namespace of the input Dynamic Configuration attribute |
attribute | | FileName | Input Dynamic Configuration attribute |
outNamespace | | If blank, inherits from namespace | Optional output namespace if it differs from the input Dynamic Configuration namespace |
outAttribute | | If blank, inherits from attribute | Optional output attribute if it differs from the input Dynamic Configuration attribute |
prefix | | | Available when mode = 'add' or 'delete'. Value at the front of the attribute to be added or deleted |
suffix | | | Available when mode = 'add' or 'delete'. Value at the end of the attribute to be added or deleted |
oldValue | | | Required field when mode = 'change'. Existing value in the attribute to be replaced |
newValue | | | Required field when mode = 'change'. New value as replacement in the attribute |
replaceAll | Y, N | N | Available when mode = 'change'. Replaces all occurrences of oldValue in the attribute |
regex | | | Required field when mode = 'regex'. The regular expression for matching patterns in the attribute |
replacement | | | Required field when mode = 'regex'. The replacement value for the matched patterns. Matched capturing groups from the regular expression above can be referenced in the replacement value |
addTimestamp | Y, N | N | Adds a timestamp at the end of the attribute before the extension (the last .xxx) |
timestampFormat | | yyyyMMdd-HHmmss-SSS | The format of the timestamp, following the allowed patterns of Java's SimpleDateFormat |
Example Scenarios
Here are some example use case scenarios for DACB based on different configuration options. For the sake of simplicity, the results shown are Console output using the Standalone testing of Adapter Module in NWDS approach.
Scenario 1
Add prefix and suffix to the FileName attribute.
Module parameters
Parameter Name | Parameter Value |
---|---|
mode | add |
namespace | http://sap.com/xi/XI/System/File |
attribute | FileName |
prefix | MY_ |
suffix | .zip |
Result
The FileName attribute is appended with a prefix and suffix.
Scenario 2
Change the extension of the FileName attribute.
Output attribute in a different namespace.
Module parameters
Parameter Name | Parameter Value |
---|---|
mode | change |
namespace | http://sap.com/xi/XI/System/File |
attribute | FileName |
outNamespace | http://sap.com/xi/XI/SFTP/SFTP |
oldValue | .txt |
newValue | .xml |
Result
The extension is changed from txt to xml and stored in a different namespace. Note that the original input attribute is not modified.
Scenario 3
Delete the last extension.
Use default namespace and attribute.
Add a custom timestamp.
Module parameters
Parameter Name | Parameter Value |
---|---|
mode | delete |
suffix | .pgp |
addTimestamp | Y |
timestampFormat | '_'yyyyMMdd |
Result
The default namespace and attribute is used. The PGP extension is removed from the filename, and a timestamp is added.
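To illustrate how the quoted literal in the timestampFormat pattern behaves (this is plain Java SimpleDateFormat semantics, not anything module-specific; the class name is just for the demo):

import java.text.SimpleDateFormat;
import java.util.Date;

public class TimestampDemo {
    public static void main(String[] args) {
        // In '_'yyyyMMdd the quoted underscore is emitted as a literal,
        // so the module appends something like _20141215 to the filename
        String stamp = new SimpleDateFormat("'_'yyyyMMdd").format(new Date());
        System.out.println(stamp); // e.g. _20141215
    }
}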
Scenario 4
Dynamic change mode using regular expression.
Match the filename pattern while also capturing the two numeric groups: the first (\d+) after Order and the second (\d+) after Batch.
Rename the output using references to the matched capturing groups ($1 and $2).
Module parameters
Parameter Name | Parameter Value |
---|---|
mode | regex |
regex | Order_(\d+)_Batch_(\d+).xml |
replacement | PO_$1_Group_$2.txt |
Result
Values 1234 and 10 are captured from the input attribute, and used in the renaming of the output attribute.
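The same matching and replacement can be reproduced with plain Java regex, which is what the regex/replacement parameter pair boils down to (a sketch for illustration, not the module's actual code):

public class RegexRenameDemo {
    public static void main(String[] args) {
        // The capturing groups (\d+) grab 1234 and 10; $1 and $2 reference
        // them in the replacement value
        String out = "Order_1234_Batch_10.xml"
                .replaceAll("Order_(\\d+)_Batch_(\\d+).xml", "PO_$1_Group_$2.txt");
        System.out.println(out); // PO_1234_Group_10.txt
    }
}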
Scenario 5
And finally (and this is the most exciting example use case!) an example based on an actual requirement from this thread. For this, actual configuration and testing screenshots will be shown.
Requirement: Determine the target folder based on the dynamic values contained in the file name.
Module configuration
Monitoring Logs and Testing Result
An input file with value of Partner1_002_Invoice.xml set in the Dynamic Configuration FileName attribute.
After the module processing by DACB, an additional attribute for Directory is added, with the value dynamically determined from the input filename. The target directory = /sapinterface/xxx/Partner1/002/
The audit log shows the attribute changes and the file being routed to the dynamically determined target directory.
And we find the file in the directory as expected!
Conclusion
With the usage of DACB, most passthrough scenarios can remain "ESR object"-less (although there is actually reason to have at least the Service Interface defined in the ESR).
Using SAP HANA Cloud Integration with SAP HCI OData Provisioning
SAP HCI OData Provisioning is part of the SAP HANA Cloud Platform toolset, which among other functionality provides secure exposure of REST-based interfaces from an on-premise SAP Business Suite to the SAP HANA Cloud. In this article, let us look at an approach that uses SAP HANA Cloud Integration to leverage SAP HCI OData Provisioning to fetch data from an on-premise SAP Business Suite.
For this approach we will use the following:
1) The SAP Gateway trial service provided by SAP; for details please refer to the link
http://scn.sap.com/community/developer-center/netweaver-gateway.
2) An SAP HANA and SAP HANA Cloud Integration trial account
3) SAP Cloud Connector:
https://help.hana.ondemand.com/help/frameset.htm?57ae3d62f63440f7952e57bfcef948d3.html.
SAP Gateway:
We will use the SAP Gateway trial as a substitute for the on-premise system and connect to it from a locally installed SAP Cloud Connector. Check that the SAP Gateway can be accessed from your local machine.
SAP Cloud Connector Configuration
- Start go.bat from the SAP Cloud Connector directory (in the case of Windows OS).
- Log in to the SAP Cloud Connector; the default is https://localhost:8443.
Set the Access Control so that the virtual host and port (sapgateway:1234) point to the IW_BEP service in the SAP Gateway. This hides the actual hostname of the on-premise system and securely sets up the communication from the SAP HANA Cloud.
SAP HCI OData Provisioning Configuration (formerly Gateway as a Service)
- Log in to the SAP HCI OData Provisioning tool using the following URL or by going to the Services tab in your SAP HANA Cloud account. Please make sure your user is assigned the two roles 'GW_User' and 'GW_Admin' from account 'gwaas' and application 'gwaas'.
https://gwaas-<user>trial.hanatrial.ondemand.com/Admin
2. The HCI OData Provisioning configuration has 4 sections:
a) Browse to the 'Destinations' section and create a destination pointing to the on-premise system. The hostname and port are the virtual hostname and port provided in the SAP Cloud Connector above, and the credentials are the details used to log in to the SAP Gateway.
b) Browse to the 'Services' section and register a new service. Choose the 'Select from the list' option and the destination name created in the last step.
c) Fill in the details in the Register Manually section; you can use the service name GWDEMO and the other details as below:
d) Save and browse to the 'Explore' section to view the metadata accessed from the SAP Gateway.
e) In the 'Services' section, use the 'Open Service Document' link to get the URL to be used by SAP HANA Cloud Integration to consume this service.
The link should open the URL https://gwaas-<user>trial.hanatrial.ondemand.com/odata/IWBEP/GWDEMO;v=1
SAP HANA Cloud Integration Configuration
- In the Eclipse tool, let us develop a simple integration iFlow to fetch the data from the URL above using the OData adapter in SAP HCI.
- The iFlow details are as follows:
3. The OData communication channel should be configured as below:
a) Choose the adapter type as below:
b) In the Connection section, provide the connectivity details:
c) The credentials to access the service should be deployed as 'User Credential' artifacts, and the same name is referenced in the communication channel above.
d) Click on 'Model Operation' to build the query for fetching the data from the service. Fill in the URL, username and password to proceed.
e) Click Next to get the entity list; the filter box above makes it easier to search through entities.
f) Select the particular entity to create the query, and then choose the particular fields. Optionally, click Next and create filter conditions for the query.
On completion of the creation of the query, the query appears in the Operation Details section.
g) Deploy the integration project (put the Start Timer at 'Run Once' in the iFlow) and check for errors. In the SFTP server, check for the file created; it should be available as confirmation of the test.
Result of the scenario: the file created contains the data queried from the SAP Gateway.
PI Monitoring, Alert and Incident Creation using SolMan
Hi All,
In this blog we will discuss how Solution Manager can be used to monitor SAP PI system availability and message status, and to ping, stop, and start channels directly from SolMan without logging in to the SAP PI system. We can also create alerts, notifications, and incidents directly from SolMan without any third-party tool such as HP Service Manager or Remedy. This blog will be helpful for those working in a production environment, where even a minor error in manual monitoring can lead to a high escalation.
Objective:-
In SAP PI, different software components interact to send messages from source to target systems: they transform data from one format to another, make routing decisions based on the runtime context, and choose the right protocol to transfer the message. It is very important to monitor all of these components and take proactive measures to resolve issues before they become visible to users. This reduces the number of production issues and helps address them more quickly and efficiently.
Benefits:-
- Integration: system monitoring and root cause analysis, alerting infrastructure, notification, and incident management all at one level.
- Self-check: the self-check feature provides the availability status of monitored components.
- Production resources freed: central collection of monitoring data relieves the production system from the additional burden of monitoring.
Prerequisite roles:
Please assign these two roles to your user ID: 1) solman_setup 2) solman_workcenter
Step 1: Transaction code solman_workcenter
PI Monitoring has six monitoring components. I will explain each of them in this document.
1) Overview Monitor:
The Overview Monitor gives an aggregated view of the information collected from component monitoring, message monitoring, channel monitoring, system availability, and alerts. You can view the status in two forms: graphical and tabular.
Graphical view:
Tabular view:
You can also check the system status, job status, system availability, and performance details in the Overview component.
2) Component Monitoring: The Component Monitor shows the availability status and the self-check status of the PI components in a tabular view; the status information is refreshed at regular intervals.
It also gives you the status of a particular selected PI component, for example: Is web service security available? Are the messaging system jobs running without errors? Are the messaging system recovery jobs running without errors?
3) Channel Monitoring: The Channel Monitor lists all the PI communication channels from the central and non-central Adapter Engines of the selected PI domain. It provides the channel activation status, the channel processing status after the last processed message, the adapter type, the component, and the receiver or sender party. It also shows the root cause and the last message processed.
We can also stop, start, and ping communication channels directly from SolMan.
4) Message Monitoring: The message monitoring tool provides the status of messages (Error, Scheduled, Canceled, Successful) from the various PI components such as the central Integration Engine, central Adapter Engine, decentral Adapter Engine, etc. It helps analyze issues more easily and quickly.
It consists of three views:
a) Error Monitor: provides a graphical view of all the error messages in the PI landscape.
b) Backlog Monitor (scheduled messages): similar to the Error Monitor, but shows the messages in the Scheduled category.
c) Message Flow Monitor: shows all message statuses (Error, Scheduled, Canceled, etc.).
A graphical view is also available.
5) Message Search: You can search for messages using their payload attributes. This functionality is based on the 'User-Defined Search' (UDS), which has to be configured on the local runtime components.
6) Alerts: Handles alerts, availability status, and self-tests for PI components and PI communication channels. Using the Create Notification button, you can create your own alert.
7) Notification and Incident Management: An incident ticket can be created in the incident management application of SAP Solution Manager to address an issue, and you can set its priority (e.g., High or Medium).
I hope this blog helps you automate the monitoring of your SAP PI system. Please share your views and experience.
Happy Learning
Thanks,
Abhinav Verma
SAP Standard Interface (SAP ESR Content/ Package) Enhancement in PI
Motivation:
In a first-class SAP Consulting (SAPC) project, the integration part should be treated with the same high priority as the business functionality. A bad example is a project that went live with ERP 6.0 EHP7 and other recent SAP products but lacked good integration management: after go-live, end users and IT staff kept complaining about performance deficiencies, interface errors, and even business process issues, despite the latest products being in use. After investigation, the major reason turned out to be a chaotic integration design and wrong PI usage. It is like a brilliant machine with a rusty cable; how can we expect such a combination to work without obstruction? So please do not just blame PI for being difficult to use.
As SAP keeps pushing its SOA platform forward (with Integration as a Service now coming for cloud integration), it is very important to take care of SOA governance. From a PI/PO perspective, that means how interfaces are designed and governed. A responsible integration architect or integration lead in a project should keep control of the interface usage types (IDoc, Proxy, RFC, WS, etc.), interface naming conventions, QoS, interface versions, and also interface enhancements.
In this document, we will therefore focus on how to enhance an SAP standard interface based on SAP ESR content/packages, because this situation occurs in almost every integration project, especially in SAP RDS (Rapid Deployment Solution) and PoC (Proof of Concept) projects that use SAP predefined ESR packages with customized requirements. For example, in MFG (manufacturing) solutions like SAP ME, MII, and SAP TM/EWM/SNC, it is very common that customers need to add customized fields to an SAP standard interface.
For IDoc and RFC interfaces, it is easy to create a ZIDoc or ZRFC as the enhancement in the backend system and then import it into the PI ESR for the subsequent PI implementation, because these follow the inside-out interface development strategy. The following document demonstrates the other situation, the outside-in interface development strategy. From the PI point of view this is the so-called ABAP proxy: we first enhance the interface in the ESR, then generate the proxy in the backend system. From the whole SAP ES (enterprise service) development point of view, related ES enhancements may not even go through PI at all, for example pure web service interfaces or POL (Process Object Layer) interfaces that use the ESR as their interface repository.
Demonstration Description / Requirement:
Add one customized field (ZLEON_ID) to the standard AribaNetWork SAP interface (interface name: MIAbs_Async_LastUpdate_File, interface namespace: http://ariba.com/xi/ASN/ERPInvoice).
Step by Step:
1. Find the interface to be enhanced.
In this demo, it is the standard AribaNetWork SAP interface above (interface name: MIAbs_Async_LastUpdate_File, interface namespace: http://ariba.com/xi/ASN/ERPInvoice, corresponding data type: DT_LastUpdate_File). We will add the ZLEON_ID field under DT_LastUpdate_File.
Field origin | Field in DT_LastUpdate_File | Type
---|---|---
SAP standard | SystemID | String
SAP standard | Date | String
SAP standard | Time | String
Enhancement/customized | ZLEON_ID | String
2. Create a customized SWCV and maintain the SAP standard SWCV as its prerequisite software component in the SLD.
In this demo: create the TUCC_INTERFACE SWCV and set AribaNetWork 12 as its prerequisite SWCV.
Note: It is not recommended to change the SAP standard SWCV in the ESR directly, even though you could by setting it to Modifiable. Doing so breaks SAP SWCV lifecycle management and brings risk to future development. Instead, create your own SWCV and mark the SAP standard one as a prerequisite under the Dependencies tab.
3. In the ESR, also maintain the SAP standard SWCV as the underlying SWCV.
In this demo: set ARIBA_SUPPLIER_CONN_ADAPTER_12S2 as TUCC_INTERFACE's underlying SWCV.
4. Create a data type enhancement in your own SWCV.
In this demo: create the data type enhancement Ariba_SAP_Interface_Enhancement2 under TUCC_INTERFACE. The customized field in the DT enhancement is named ZLEON_ID.
5. Generate and activate the corresponding proxy in the backend system via SPROXY.
Note: During proxy generation an error may occur, saying that the source namespace http://ariba.com/xi/ASN/ERPInvoice and the corresponding data type DT_LastUpdate_File do not exist in the system. In that case, generate the standard DT first, then generate the enhancement DT.
Once generation has completed, you will see ZXLEONZLEON_ID (ZXLEON is the prefix) in the internal view.
6. Check that the enhancement appears in the standard SWCV.
After successful generation, you will see ZXLEONZLEON_ID (ZXLEON is the prefix) in the standard SWCV ARIBA_SUPPLIER_CONN_ADAPTER_12S2. The standard interface MIAbs_Async_LastUpdate_File is now enhanced with one customized field named ZLEON_ID.
7. Continue with ABAP development as normal.
Once the enhancement is complete, you can implement the proxy method via customized ABAP code or a BAdI enhancement spot, as in normal proxy development.
Summary:
We do see some bad enhancement practices in real projects.
1. Directly changing the SAP package for the current requirement:
If the same SAP interface is later used for another business requirement, it will confuse new developers as to which fields are customized and which are standard. Worse, such behavior (breaking the SAP SWCV lifecycle) endangers future system upgrades and migrations.
2. Creating new custom interfaces in a customized SWCV instead of enhancing the standard ones:
This is entirely customer-managed and outside SAP SOA governance. It is difficult to reuse existing interfaces later when new developers don't know the details of the old customized SWCV, and more and more SWCVs get created by the customer. Instead, you can search all the SAP ES interfaces via ES-Workplace | SAP and maintain enhancements under the SAP standard ones.
XSLT approach to enable FCC for deep structures (Part 2 - Flat File to Deep XML)
Introduction
This is the second part of the two-part series on achieving file content conversion for deep structure with XSLT. The first part covered the approach to flatten a deep structure XML prior to receiver FCC conversion.
This second part will focus on deepening a flat XML after FCC conversion at a sender channel.
Source Code and Explanation
The full source code, example input and output files are available in the GitHub repository listed in the first part of this series.
Prerequisite: To deepen an XML, the segments that will be structured in a parent-child relationship need to have a field with the same value on both the parent and the child.
The logic comprises three different types of XSLT template match sections.
1. Selecting child segments at root element level
At root element level, the segments that should be immediate child segments are explicitly selected.
There are two different versions of the code, as this depends on the FCC configuration of the sender channel. For NFS/FTP channels configured with ignoreRecordsetName = true, the XML generated by FCC will not have the Recordset segment as a child of the root element. Due to a slight behavioral difference, the MessageTransformBean module does not support this parameter and will always produce a Recordset segment.
In the example below, the segments Header, Delivery and Footer are "designated" as direct child elements of the root element (or Recordset element).
With Recordset:
<!-- At the root element level, manually select the child segments -->
<xsl:template match="*[local-name() = 'MT_Deep']">
  <xsl:copy>
    <xsl:apply-templates select="@* | node()"/>
  </xsl:copy>
</xsl:template>
<xsl:template match="Recordset">
  <xsl:copy>
    <xsl:apply-templates select="Header"/>
    <xsl:apply-templates select="Delivery"/>
    <xsl:apply-templates select="Footer"/>
  </xsl:copy>
</xsl:template>

Without Recordset:
<!-- At the root element level, manually select the child segments -->
<xsl:template match="*[local-name() = 'MT_Deep']">
  <xsl:copy>
    <xsl:apply-templates select="Header"/>
    <xsl:apply-templates select="Delivery"/>
    <xsl:apply-templates select="Footer"/>
  </xsl:copy>
</xsl:template>
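To make the difference concrete, here is a sketch of the two flat-XML shapes produced by FCC that the templates above expect (fields omitted; Order and the deeper segments also sit at this flat level and are picked up later by their parents' templates):

With Recordset (MessageTransformBean):
<MT_Deep>
  <Recordset>
    <Header>...</Header>
    <Delivery>...</Delivery>
    <Order>...</Order>
    <Footer>...</Footer>
  </Recordset>
</MT_Deep>

Without Recordset (NFS/FTP with ignoreRecordsetName = true):
<MT_Deep>
  <Header>...</Header>
  <Delivery>...</Delivery>
  <Order>...</Order>
  <Footer>...</Footer>
</MT_Deep>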
2. Deepening segments with parent-child relationship
For subsequent segments with a parent-child relationship, first the key field of the parent segment is saved in a variable, then the immediate child fields are copied, and finally the child segments whose key field matches the parent's are selected.
Similar match-template logic is repeated as required for all parent segments that will contain child segments.
<xsl:template match="Delivery">
  <!-- (1) - Save parent key value to be used to select corresponding child segments -->
  <xsl:variable name="deliveryno" select="DeliveryNo"/>
  <xsl:copy>
    <!-- (2) - Select the child elements that are fields -->
    <xsl:apply-templates select="*[not(*)]"/>
    <!-- (3) - Select the child segments which have matching value as parent key value -->
    <xsl:apply-templates select="../Order[DeliveryNo=$deliveryno]"/>
  </xsl:copy>
</xsl:template>
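As an illustration of what this template does, consider a flat Delivery and Order pair sharing the key field DeliveryNo (values invented for the example):

Input (flat, after FCC):
<Delivery>
  <DeliveryNo>DEL1</DeliveryNo>
</Delivery>
<Order>
  <DeliveryNo>DEL1</DeliveryNo>
  <OrderNo>ORD1</OrderNo>
</Order>

Output (deepened):
<Delivery>
  <DeliveryNo>DEL1</DeliveryNo>
  <Order>
    <DeliveryNo>DEL1</DeliveryNo>
    <OrderNo>ORD1</OrderNo>
  </Order>
</Delivery>

The standalone Order is not emitted again at the top level, because the root-level template only selects Header, Delivery and Footer explicitly.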
3. Identity transformation for all other attributes and nodes
Finally, for all other attributes and nodes, identity (1-1) transformation is applied.
<!-- Match all other attributes (@*) and other node types (elements, comments) -->
<!-- Copy the current element, and select child attributes and nodes -->
<xsl:template match="@* | node()">
  <xsl:copy>
    <xsl:apply-templates select="@* | node()"/>
  </xsl:copy>
</xsl:template>
Example Scenario
We will reuse the example structure from the first part. The goal is to create a deep XML payload from a flat file with a deep/nested structure.
The input file is a shipment file containing all the details for the shipment. Each shipment file can have multiple deliveries, each delivery in turn can have multiple orders, and each order multiple items. Each segment occurs under its parent segment, indicating a deep/nested structure.
Segment Name | Parent Segment | Type Indicator | Occurrence |
---|---|---|---|
Header | - | H | 1 |
Delivery | - | D | 1 - unbounded |
Order | Delivery | O | 1 - unbounded |
OrderText | Order | T | 0 - 1 |
Item | Order | I | 1 - unbounded |
Footer | - | F | 1 |
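As an illustration only (the field layout here is invented), a flat input file following this structure could look as below, with the type indicator in the first column and each child row carrying its parent's key so that the XSLT can re-nest it:

H,SHIP001,20141201
D,DEL001
O,ORD001,DEL001
T,ORD001,Urgent order
I,ITM001,ORD001,10
I,ITM002,ORD001,5
D,DEL002
O,ORD002,DEL002
I,ITM003,ORD002,2
F,2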
Design
This article will only focus on the sections that are required to enable this XSLT approach, therefore common steps for design and configuration will not be covered.
Source structure definition
The data type for the source structure is defined as a deep XML schema; no flat DT definition, as would otherwise be required for FCC, needs to be created. Below are the sample deep XML schema representations of the input format, depending on whether the Recordset is ignored during FCC or not.
For the remainder of the example, the design is based on the structure with Recordset, as it uses the MessageTransformBean module in a sender SFTP channel.
(Screenshots: sample deep XML schema definitions, one without the Recordset element and one with it.)
Message Mapping
Create a normal graphical message mapping to map the deep source structure above to the intended target structure.
Import XSLT code
Zip the XSL file and import it to ESR as an Imported Archive.
Operation Mapping
Typically the XSLT mapping is used as the first step of the Operation Mapping to deepen the XML generated by FCC. The message mapping then maps the deep source to the intended target.
Step | Mapping Type | Name |
---|---|---|
1 | XSL | Flat2Deep_WithRecordset |
2 | Message Mapping | <Message Mapping object created above> |
Below is a sample execution of the first step of the OM, deepening the XML that was created by the FCC at the sender. The result shows that the XSLT has deepened the XML to the intended structure.
Note: As seen in the test tab, the XML output of the sender FCC is not a valid XML document when validated against the original data type definition (the segments indicated by red icons). This does not matter, as the main purpose of this XML is to be further deepened by the XSLT.
Configuration
The normal configuration steps are performed. At the sender channel, specify the FCC parameters. The following screenshot shows the FCC parameters using MessageTransformBean on an SFTP channel.
Note: The recordset structure order needs to be set to variable (xml.recordsetStructureOrder = var) because, with the input flat file having a deep structure, the segments do not arrive in ascending order.
Testing Results
After execution of the interface, we can view the logs to see the content of the original file prior to sender FCC.
After the XSLT transformation (and using a 1-1 graphical message mapping) the target payload in the receiver proxy is displayed. As we can see below, the XML generated has a deeply nested structure.
Conclusion
Similar to the first part, we have achieved a deep XML structure after sender FCC with relatively simple XSLT logic. However, note that this logic cannot be made generically reusable, because the exact parent-child relationships need to be specified in the stylesheet.
XSLT approach to enable FCC for deep structures (Part 1 - Deep XML to Flat File)
Introduction
There have been multiple threads cropping up in the forum lately regarding FCC conversion for deep/nested structures. Unfortunately, the File Content Conversion (FCC) functionality (whether directly in the File/FTP adapter or through the MessageTransformBean module) has not changed significantly for nearly 10 years, since the days of XI 3.0. We are still stuck with a standard FCC functionality that can only handle flat structures: all record types/segments must be at the same single level. However, in reality not all flat file structures conform to this "expected" pattern. We would have expected SAP to enhance the capabilities of the standard FCC functionality to cater for these commonly used deep structures, but that does not seem to be the case.
Therefore, all these years, there have been a myriad of custom solutions/workarounds to handle this. Some custom approaches are listed below:
- Graphical mapping - multi-step graphical conversion using intermediate structure. Example: XI/PI: Convert Flat File to Deeply Nested XML Structures Using Only Graphical Mapping
- Java mapping - full blown Java logic. Examples: Simplest way to Convert Flat File to Deeply Nested XML Structures Using Java Mapping and File Conversion using 'Nodeception'
- Adapter module - handle the FCC conversion fully in a custom adapter module
- Seeburger BIC mapping - custom BIC mapping to handle flat file conversion (I think SAP's B2B Add-on has a similar functionality for custom flat file conversion, but I have not had my hands on that Add-on yet to confirm)
In general, my preferred choice for custom development goes in the following order: graphical mapping, Java mapping, XSLT, adapter module. However, there are certain areas where XSLT excels, achieving robust transformations with minimal coding; among them are sorting XML and performing partial identity (1-1) transformations. The combination of the copy and apply-templates instructions allows identity transformation without the need to specify field names individually.
In this two-part series, I will share the XSLT-based approach that was used in one of my previous projects to handle creation and parsing of deeply structured flat files. I have since tweaked the XSLT code so that it can be used in any scenario with minimal changes. Here are the scenarios covered in each part of this series.
- Part 1 - flatten a deep structure XML prior to receiver FCC conversion
- Part 2 - deepen a flat XML after sender FCC conversion
Source Code and Explanation
The full source code, example input and output files are available in the following GitHub repository
The logic comprises three XSLT template match sections.
1. Flatten segments with subsegments
This template uses *[*] to match any node that has subsegments (subnodes with child fields). Once such a segment is matched, the immediate child fields are copied over 1-1, while the subsegments are brought up to the same level (flattened). The logic then continues recursively.
<!-- Match any segment with child segments/fields -->
<xsl:template match="*[*]">
  <!-- (1) - Copy the current segment, then select attributes and child elements that are fields -->
  <xsl:copy>
    <xsl:apply-templates select="@* | *[not(*)]"/>
  </xsl:copy>
  <!-- (2) - Further select child elements that are segments -->
  <xsl:apply-templates select="*[*]"/>
</xsl:template>
The following sample input and output show that the subsegments are brought to the same level in the hierarchy as the parent segment.

Input:
<Level1>
  <Field1>A</Field1>
  <Field2>B</Field2>
  <Level2>
    <Field3>C</Field3>
    <Field4>D</Field4>
    <Field5>E</Field5>
  </Level2>
</Level1>

Output:
<Level1>
  <Field1>A</Field1>
  <Field2>B</Field2>
</Level1>
<Level2>
  <Field3>C</Field3>
  <Field4>D</Field4>
  <Field5>E</Field5>
</Level2>
2. Identity transformation for root element
This template matches the root element of the XML document and copies it 1-1 to the output.
Note: To reuse this XSLT code, the root element name needs to be changed to match the root element of the source input XML.
<!-- Match root element, copy details and select child attributes and nodes -->
<xsl:template match="*[local-name()='MT_Deep']">
  <xsl:copy>
    <xsl:apply-templates select="@* | node()"/>
  </xsl:copy>
</xsl:template>
3. Identity transformation for all other attributes and nodes
This template is similar to the one above and does a 1-1 transformation for all other attributes/nodes.
<!-- Match all other attributes (@*) and other node types (elements, comments) -->
<!-- Copy the current element, and select child attributes and nodes -->
<xsl:template match="@* | node()">
  <xsl:copy>
    <xsl:apply-templates select="@* | node()"/>
  </xsl:copy>
</xsl:template>
Note: In theory, templates 2 and 3 can be combined into one by specifying an additional /* match for the root element. However, such logic does not produce the intended result when executed on PI's mapping runtime, although it works in the Online XSLT Test Tool and the XML Tools plugin of Notepad++. It also works if usage of the SAP XML Toolkit is specified in the Operation Mapping; however, this goes against SAP's recommendation as documented in the following SAP Library link - XSLT Mapping - SAP Library.
I am unsure of the reason for the behavioral difference when it is executed on the JDK's XSLT parser. Otherwise, the XSLT code would have been generic logic that could be reused in any transformation without modification.
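Since PI's mapping runtime relies on the JDK's XSLT parser, such behavioral differences can be reproduced outside PI by running the stylesheet through the standard JAXP API. A minimal test harness (the file names are placeholders):

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XsltLocalTest {
    public static void main(String[] args) throws Exception {
        // Uses the JDK's default XSLT processor, closer to PI's mapping runtime than an online tester.
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer transformer = factory.newTransformer(new StreamSource("Deep2Flat.xsl"));
        transformer.transform(new StreamSource("deep_input.xml"), new StreamResult("flat_output.xml"));
    }
}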
Example Scenario
Here is a sample scenario where the requirement is to create a flat file with a deep/nested structure.
The output file is a shipment file containing all the details for the shipment. Each shipment file can have multiple deliveries, each delivery in turn can have multiple orders, and each order multiple items. Each segment needs to occur under its parent segment, indicating a deep/nested structure.
Segment Name | Parent Segment | Type Indicator | Occurrence |
---|---|---|---|
Header | - | H | 1 |
Delivery | - | D | 1 - unbounded |
Order | Delivery | O | 1 - unbounded |
OrderText | Order | T | 0 - 1 |
Item | Order | I | 1 - unbounded |
Footer | - | F | 1 |
Design
This article will only focus on the sections that are required to enable this XSLT approach, therefore common steps for design and configuration will not be covered.
Target structure definition
The data type for the target structure is defined as a deep XML schema; no flat DT definition, as would otherwise be required for FCC, needs to be created. Below is the sample deep XML schema representation of the output format.
Message Mapping
Create a normal graphical message mapping to map the source structure to the deep target structure above.
Import XSLT code
Zip the XSL file and import it to ESR as an Imported Archive.
Operation Mapping
Typically the XSLT mapping is used as the second step of the Operation Mapping. The first step is the message mapping from source to target; the XSLT then flattens the final output.
Step | Mapping Type | Name |
---|---|---|
1 | Message Mapping | <Message Mapping object created above> |
2 | XSL | Deep2Flat |
Below is a sample execution of the second step of the OM, flattening the deep structure in preparation for FCC. The result shows that the XSLT has flattened all the segments to the same level.
Note: The output of the XSLT mapping step is not a valid XML document when validated against the original data type definition, but this does not matter, as its main purpose is to serve as input to the FCC at the receiver channel.
Configuration
The normal configuration steps are performed. At the receiver channel, specify the FCC parameters. The following screenshot shows the FCC parameters using MessageTransformBean on an SFTP channel.
Testing Results
After executing the interface, we can see from the after-FCC log version that the payload has been converted into a flat file format with a deep structure.
Similarly, when the CSV file is viewed in Excel, we can see that all the subsegments are created in the correct order underneath their corresponding parent segments.
Conclusion
With relatively simple XSLT logic, we can achieve creation of a flat file with a deep structure while still using the standard FCC functionality. The benefits of this XSLT approach are:
- Easily reusable for other interfaces - just change the name of the root element
- The XSLT logic works irrespective of the number of levels in the deep structure
- No intermediate structure needs to be defined for the flat XML representation of the file - this is required when using the graphical mapping approach
- No changes are required if the fields of a segment change, i.e. new fields, deleted fields, or modified field names
In the continuation of this series, part 2 covers handling of deep structure FCC at the sender side.
Context in SAP PI
If you are new to SAP PI, there is a good chance that you will get confused the moment you hear the word "context". It is one of the most confusing topics for almost everyone new to SAP PI. In this document, we have made an attempt to simplify it as much as possible, concentrating only on this topic and giving exhaustive examples, so that readers are left with no doubts.
By default, Friend1 is in the "Friends" context.
Changing the context to PersonalDetails (the next parent level, i.e., the 1st level above the default level):
Friend1 nodes under different "Friends" contexts are grouped under one "PersonalDetails" context.
Changing the context of "Friend1" to the next parent level (the 2nd level above the default level):
Friend1 nodes under different "PersonalDetails" contexts are grouped under one "F_Address_MT" context.
UseOneAsMany in SAP PI
UseOneAsMany Analysis
1st argument: the value to be passed to the target.
2nd argument: how many times the value must be repeated in the target.
3rd argument: where the context changes must be placed in the target.
Rule 1: The first queue and the second queue should have the same number of context changes.
Else:
First Queue: Second Queue: Result:
Rule 2: The second queue and the third queue should have the same number of values (total count).
Else:
Second Queue: Third Queue: Result:
Rule 3: The first queue should not have repeated values.
Else:
First Queue: Result:
Pick 1:
Everything seems right. Right?
.
.
.
Wrong!
(Wrong Error)
Reason: State is under F_Address_MT and Name is under PersonalDetails.
Intuition: a field at a lower level cannot be mapped to one at a higher level.
Pick 2:
Expecting something like this?
No? Then you’re right. Look at this queue:
This is what you are gonna get:
Why?
The context changes on the target are at the same level as the first queue, which is at PersonalDetails, whereas the third queue is at the Friends level (a lower level). Hence the context changes propagated to the target are converted to the level of the first queue.
Handling Code page, Character encoding in SAP PI / PO
As a middleware, SAP PI/PO integrates SAP and non-SAP systems, which use different formats (text (XML, CSV, ...) or binary) to represent data. Sometimes they even encode text differently or use different code pages. This document helps to understand and handle those situations.
A code page is a table assigning a number to each character. For example, 'A' is 65, 'a' is 97, 'b' is 98, and so on.
(Screenshots: code page tables for ASCII, ISO 8859-1, CP-1252, and Unicode; an HTML version is attached to the original post as .txt files that can be renamed to .html.)
'A' is 65. 65 in binary is 100 0001 (64×1 + 32×0 + 16×0 + 8×0 + 4×0 + 2×0 + 1×1). Representing the code page number in 0s and 1s is encoding.
100 0001 is 65; looking up 65 in the code page gives 'A'. Looking up the code page number is decoding.
Some encodings are fixed length, for example ASCII, ISO 8859-1, cp1252, and UTF-32. ISO 8859-1 and cp1252 always use 1 byte to represent the code page number. ASCII also uses 1 byte (actually only 7 bits; the first bit is ignored). UTF-32 always uses 4 bytes.
Some encodings are variable length, for example UTF-8 and UTF-16. UTF-8 starts with 1 byte; if the code page number is too big to be represented in 1 byte, it can use 2, 3, or 4 bytes. UTF-16 starts with 2 bytes and uses 4 bytes if needed (i.e., 2 or 4 bytes).
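These size differences are easy to verify. A small sketch in Java printing the binary form of a character and the byte counts of the Euro sign in fixed- and variable-length encodings:

public class EncodingLengths {
    public static void main(String[] args) throws Exception {
        System.out.println(Integer.toBinaryString('A'));          // 1000001 - the code page number 65 in bits
        System.out.println("A".getBytes("ISO-8859-1").length);    // 1 - fixed length, always one byte
        System.out.println("A".getBytes("UTF-8").length);         // 1 - the ASCII range fits in one byte
        System.out.println("\u20AC".getBytes("UTF-8").length);    // 3 - the Euro sign needs three bytes in UTF-8
        System.out.println("\u20AC".getBytes("UTF-16BE").length); // 2 - one 2-byte UTF-16 code unit
        System.out.println("\u20AC".getBytes("UTF-32").length);   // 4 - fixed length, always four bytes
    }
}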
UTF-8: UTF-8 is the preferred encoding on the internet; HTML, XML, JSON, etc. are encoded in UTF-8 by default.
To understand UTF-8, BOM, and endianness, see: Characters, Symbols and the Unicode Miracle - Computerphile - YouTube, and Characters in a computer - Unicode Tutorial UTF-8 - YouTube.
Byte Order Mark (BOM): A heads-up notice to the target system about the encoding. Some Microsoft Windows applications require a BOM to decode UTF text properly. This is how the BOM works: if we are sending UTF-8 encoded text, we prefix the text stream with the binary form of EF BB BF (hex). The target system reads these bytes and understands: "This text stream starts with EF BB BF, so it must be UTF-8 and I should use UTF-8 decoding logic." It will not display EF BB BF. If we are sending UTF-16 big-endian, we prefix the text stream with FE FF (hex); the target system reads these bytes and understands it is UTF-16 BE.
If the target program does not understand the BOM heads-up notice, i.e., it sees EF BB BF (hex) at the start of the text stream but is not programmed to interpret it, it may read those bytes as cp1252 characters. If you see an error, or a display starting with , þÿ, or ÿþ, the target program is not decoding the data properly.
To test whether the source, PI/PO, and target systems are using the proper encoding, you can ask the source system to send the Euro sign € in one of the data elements. If the target system does not decode € properly, then there is an issue with the code page/encoding.
Why is the Euro sign € displayed as €?
€ -> U+20AC -> 0010 0000 1010 1100 (code point bits) -> 11100010 10000010 10101100 (UTF-8 encoding) -> E2 82 AC -> read byte-by-byte as cp1252 -> €
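The same round trip can be reproduced in a few lines of Java, which is a handy way to confirm a suspected encoding mismatch. A minimal sketch:

public class MojibakeDemo {
    public static void main(String[] args) throws Exception {
        // Encode the Euro sign as UTF-8: the three bytes E2 82 AC.
        byte[] utf8Bytes = "\u20AC".getBytes("UTF-8");
        // Incorrectly decode those bytes as cp1252 - each byte becomes its own character.
        String garbled = new String(utf8Bytes, "Cp1252");
        System.out.println(garbled); // prints €
    }
}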
Please go through How to Work with Character Encodings in Process Integration.
Here are some points to note from the above document.
When reading XML, SAP recommends setting the "File Type" to 'Binary', as the XML prolog carries the encoding details (<?xml version="1.0" encoding="utf-8"?>). See SAP Note 821267.
You can use the following adapter modules to change the encoding:
MessageTransformBean: Transform.ContentType = text/xml;charset="cp1252"
TextCodepageConversionBean: Conversion.charset = "utf-8"
XMLAnonymizerBean: anonymizer.encoding = "utf-8"
FYI: cp1252 is a superset of ASCII and ISO 8859-1. UTF-8 covers all cp1252 characters, but the number of bytes used may differ.
Let's handle the issues mentioned in sections 5 and 6 of How to Work with Character Encodings in Process Integration.
1) Java mapping to change the code page/encoding. See Supported Encodings.
package com.map;
import com.sap.aii.mapping.api.*;
import java.io.*;
public class ChangeEncoding_JavaMapping extends AbstractTransformation {
    @Override
    public void transform(TransformationInput transformationInput, TransformationOutput transformationOutput) throws StreamTransformationException {
        try {
            InputStream inputStream = transformationInput.getInputPayload().getInputStream();
            OutputStream outputStream = transformationOutput.getOutputPayload().getOutputStream();
            // Read input as cp1252 and write output as UTF-8.
            byte[] b = new byte[inputStream.available()];
            inputStream.read(b);
            String inS = new String(b, "Cp1252");
            outputStream.write(inS.getBytes("UTF-8"));
        } catch (Exception ex) {
            getTrace().addDebugMessage(ex.getMessage());
            throw new StreamTransformationException(ex.toString());
        }
    }
}
Result: -
2) Java mapping to handle Quoted-Printable input.
package com.map;
import com.sap.aii.mapping.api.*;
import java.io.*;
public class QuotedPrintable_JavaMapping extends AbstractTransformation {
    @Override
    public void transform(TransformationInput transformationInput, TransformationOutput transformationOutput) throws StreamTransformationException {
        try {
            InputStream inputStream = transformationInput.getInputPayload().getInputStream();
            OutputStream outputStream = transformationOutput.getOutputPayload().getOutputStream();
            // Convert quoted-printable input to plain output.
            // MimeUtility comes from the JavaMail library; add it when compiling.
            inputStream = javax.mail.internet.MimeUtility.decode(inputStream, "quoted-printable");
            // Copy input content to output content.
            byte[] b = new byte[inputStream.available()];
            inputStream.read(b);
            outputStream.write(b);
        } catch (Exception ex) {
            getTrace().addDebugMessage(ex.getMessage());
            throw new StreamTransformationException(ex.toString());
        }
    }
}
Result: -
3) Java mapping to handle Base64 input.
package com.map;
import com.sap.aii.mapping.api.*;
import java.io.*;
public class Base64_JavaMapping extends AbstractTransformation {
    @Override
    public void transform(TransformationInput transformationInput, TransformationOutput transformationOutput) throws StreamTransformationException {
        try {
            InputStream inputStream = transformationInput.getInputPayload().getInputStream();
            OutputStream outputStream = transformationOutput.getOutputPayload().getOutputStream();
            // Decode Base64 input content to output content.
            // sun.misc.BASE64Decoder is a JDK-internal class; on Java 8 prefer java.util.Base64, e.g.:
            // byte[] b = java.util.Base64.getMimeDecoder().decode(allInputBytes);
            byte[] b = new sun.misc.BASE64Decoder().decodeBuffer(inputStream);
            outputStream.write(b);
        } catch (Exception ex) {
            getTrace().addDebugMessage(ex.getMessage());
            throw new StreamTransformationException(ex.toString());
        }
    }
}
Result: -
4) Java mapping to add BOM.
package com.map;
import com.sap.aii.mapping.api.*;
import java.io.*;
public class BOM_JavaMapping extends AbstractTransformation {
    @Override
    public void transform(TransformationInput transformationInput, TransformationOutput transformationOutput) throws StreamTransformationException {
        try {
            InputStream inputStream = transformationInput.getInputPayload().getInputStream();
            OutputStream outputStream = transformationOutput.getOutputPayload().getOutputStream();
            // Copy input content to output content.
            byte[] b = new byte[inputStream.available()];
            inputStream.read(b);
            // Prefix BOM. For UTF-8 use 0xEF,0xBB,0xBF. For UTF-16BE use 0xFE,0xFF. For UTF-16LE use 0xFF,0xFE.
            outputStream.write(0xEF);
            outputStream.write(0xBB);
            outputStream.write(0xBF);
            outputStream.write(b);
        } catch (Exception ex) {
            getTrace().addDebugMessage(ex.getMessage());
            throw new StreamTransformationException(ex.toString());
        }
    }
}
Result: - BOM characters will not be displayed.
5) Java mapping to handle XML escape sequences.
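The code for this mapping is not shown in the original post. As a rough sketch following the same pattern as the mappings above, such a mapping could unescape the five predefined XML entities; the class name and entity handling below are my own assumption of the intent, not the original code (note that &amp; must be replaced last to avoid double decoding):

package com.map;
import com.sap.aii.mapping.api.*;
import java.io.*;
public class XMLEscape_JavaMapping extends AbstractTransformation {
    @Override
    public void transform(TransformationInput transformationInput, TransformationOutput transformationOutput) throws StreamTransformationException {
        try {
            InputStream inputStream = transformationInput.getInputPayload().getInputStream();
            OutputStream outputStream = transformationOutput.getOutputPayload().getOutputStream();
            byte[] b = new byte[inputStream.available()];
            inputStream.read(b);
            String inS = new String(b, "UTF-8");
            // Unescape the predefined XML entities; &amp; goes last so that e.g. &amp;lt; ends up as &lt;, not <.
            String outS = inS.replace("&lt;", "<").replace("&gt;", ">")
                             .replace("&quot;", "\"").replace("&apos;", "'")
                             .replace("&amp;", "&");
            outputStream.write(outS.getBytes("UTF-8"));
        } catch (Exception ex) {
            getTrace().addDebugMessage(ex.getMessage());
            throw new StreamTransformationException(ex.toString());
        }
    }
}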
FYI...How to create Java mapping.