Channel: SCN : Document List - Process Integration (PI) & SOA Middleware

File acknowledgement in NW BPM scenario


Objective


NW BPM does not offer the acknowledgement handling option that we had in ccBPM. The goal of this document is to show a workaround for this limitation.

There is already a good solution for this topic (http://scn.sap.com/community/pi-and-soa-middleware/blog/2012/10/01/file-record-confirmation-in-a-nw-po-scenario), but it writes to a temporary folder and uses an extra file sender channel to implement the acknowledgement functionality.

The solution described in this document uses a single file receiver channel.

 

The idea behind

 

After a successful write, the receiver file adapter can send back an empty message. In case of failure, the message goes into error status (after the configured number of retries) and no response is sent back. In our case we have 3 retries at 5-minute intervals, which means we either get an acknowledgement from the file adapter within 15 minutes or we can be sure that the file write has failed.

This is modeled in an NW BPM process that sends the asynchronous message to the file adapter and has a timeout of 16 minutes.

What we need to make this work is:

- make the file adapter send back an acknowledgement message - this is done by a ResponseOneWay bean on the module tab

- build an XML message from the empty response - this is done by a Transformation bean (Plain2XML)

- transfer correlation data from the inbound message to the new asynchronous acknowledgement message created by the ResponseOneWay bean - this is done by a DynamicConfiguration bean using read/write commands.

 

Message flow

 

message_flow.png

 

Implementation

 

ESR

 

Interfaces and message types

 

We need inbound and outbound interfaces for the main message, the acknowledgement, and the error/success message, with the following types:

msg_types.png

The two fields filename and path of MT_MESSAGE will be used in the file adapter and will also form the content of the file.

The obj_id of MT_Acknowledgement is used for message correlation in the BPM and in our example will contain the file name.

The description field of MT_STATUS will be written to an error/success file in order to model some feedback logic showing the outcome of the main message transfer (it could also have been e-mail sending, alerting, etc., but for demo purposes file transfer seemed easiest). In our case it again holds the file name.

 

The following asynchronous interfaces are used in the scenario:

interfaces.png

The inbound interfaces MIIA_MESSAGE and MIIA_Acknowledgement also belong to the BPM business component and thus have to be XI 3.0 compatible.

 

Mapping objects

 

Before calling the receiver file adapter, we insert the file name from the payload into the ASMA (adapter-specific message attributes). After calling the receiver file adapter, when sending back the acknowledgement message, we put the file name from the dynamic configuration into the acknowledgement payload.

These two operations are performed in the following mappings:

mapping.png

mm_setfile.png

mm_ack.png

 

Directory

 

The business components

 

The following business components and channels are defined for the scenario:

bc.png

The BPM communicates via a sender and a receiver SOAP channel in the same way as in any other NW BPM scenario, so these won't be detailed here.

The SOAP sender channel of the sender system and the error/success file receivers of the receiver system are also the usual ones; no explanation is needed.

It is, however, important to explain the file receiver channel of the main message, which is the heart of the solution. The fault SOAP sender channel of the receiver system is also an ordinary SOAP channel, but it is worth mentioning that it is used to send the acknowledgement coming from the receiver file adapter of the main message.

 

The file receiver channel

 

As mentioned earlier this file receiver channel will send back an asynchronous acknowledgement message. To achieve this we need some module configuration.

file_rcv.png

The modules:

modules.png

GetFileName: we read the file name from the dynamic configuration and store it in a variable to reuse it when sending the acknowledgement.

modules_getfile.png

Plain2XML: we need to convert the empty response of the file channel into an XML message.

modules_xml.png

SetFileName: we take the file name from the above variable and put it in the new acknowledgement message's dynamic configuration.

modules_setfile.png

RequestResponse and ResponseOneway: we send back a new asynchronous acknowledgement message.

modules_rob.png
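Putting the modules together, the processing sequence on the module tab can be sketched as below. The AF_Modules names are the standard SAP ones, but the ordering shown here is only one plausible arrangement and the parameter values (read/write commands, variable names) are illustrative - the screenshots above are authoritative for the actual configuration.

```
Number  Module Name                          Type      Module Key
1       AF_Modules/DynamicConfigurationBean  Local EB  GetFileName
2       AF_Modules/RequestResponseBean       Local EB  RequestResponse
3       CallSapAdapter                       Local EB  file            (writes the file)
4       AF_Modules/MessageTransformBean      Local EB  Plain2XML
5       AF_Modules/DynamicConfigurationBean  Local EB  SetFileName
6       AF_Modules/ResponseOnewayBean        Local EB  ResponseOneway
```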

The integrated configurations

 

First we send MESSAGE from the sender system to the BPM:

ico_11.png

ico_12.png

ico13.png

ico14.png

Then the BPM sends MESSAGE to the receiver system:

ico_21.png

ico_22.png

ico_23.png

ico_24.png

The acknowledgement is sent back from the receiver system to the BPM:

It is important to set up this ICO with a virtual receiver, as the file adapter sets the receiver in the message header.

ico_31.png

ico_32.png

ico_33.png

ico_34.png

Finally the BPM sends an error or success status to the receiver system (they look the same so only one is detailed here):

ico_41.png

ico_42.png

ico_43.png

ico_44.png

 

NWDS

 

nwds.png

 

Start event: we map the incoming message (via MIIA_MESSAGE) to the local variable MESSAGE.

Parallel split: we start the file sending branch and the timeout branch.

Send_MESSAGE: in the input mapping we map the local MESSAGE to MT_MESSAGE and send it out via MIOA_MESSAGE.

Acknowledgement: in this intermediate message event we wait for MIIA_Acknowledgement with a correlation on the file name:

                              string-equal(MT_Acknowledgement/obj_id,MESSAGE/filename)

Send_SUCCESS: we map the file name into the MT_STATUS description field and send the message via MIOA_SUCCESS.

Wait_Ack: internal timer of 16 minutes (in case of error the adapter tries to resend 3 times at 5-minute intervals). After 16 minutes the message is already in error status, so we can send out the error status.

Send_ERR: we map the file name into the MT_STATUS description field and send the message via MIOA_ERR.

End, Termination 0: termination events


Configuring Java IDoc Adapter (IDoc_AAE) in Process Integration


This guide helps you understand the configurations required for IDoc scenarios with the IDoc_AAE adapter type in PI 7.3/7.31 involving the Advanced Adapter Engine. The new IDoc adapter IDoc_AAE is part of the Advanced Adapter Engine.


IDOCs with red error flag in OUTBOUND STATUS


There is a rare case in SAP PI where a message appears successful in monitoring but has actually failed, with its OUTBOUND STATUS in error.

 

As an SAP PI consultant, I happened to face this issue. Our ABAP team asked us about a few IDocs that had not reached their system on one particular date. When we checked those IDocs in PI for the mentioned date, we found that they looked successful but actually were not. When we scrolled to the right of the message entries, we noticed red flags, which means the messages were not processed to the target system.

 

We usually check messages by the main STATUS in the very first column of the SXMB_MONI window, but it may not tell the real story, so the OUTBOUND STATUS also plays a role.

 

Now I will discuss the issue with screenshot references.

First, we find the message entries with red error flags in the OUTBOUND STATUS column.

 

000.png

As we know, the RFC calls of IDocs happen through the tRFC queue, i.e. transaction SM58. So we need to look for erroneous entries in SM58 for the same failure date. In our case we found that the calls to the same ABAP system were in error, so we need to test the connection, and if the connection is OK, execute all the entries.

 

111.png

 

These RFC calls (LUWs) need to be executed from SM58; if there are many of them, use the Execute LUWs function.

To execute all failed LUWs, follow the steps below:

 

Step 1:

Go to Edit and choose Execute LUWs.


222.png

Step 2:

Enter the target system details in Destination and select the date matching the failed entries. Select the checkboxes as per the error entries.

333.png

 

Step 3:

Once done, the failed entries in SM58 are executed and the OUTBOUND STATUS turns successful.

444.jpg

Cache Issues with IR and ID


We recently noticed an issue in PI: when we created or updated design objects in the IR and ID, at runtime they either kept working on the older version or, if newly created, were not reflected at all.


The case was that we created a channel in the ID, but when we checked it in 'Communication Channel Monitoring' it was not visible in the list of all channels, and the mappings we were updating were not executing as per the current version.

 

It clearly seems to be a cache issue, which we can fix in the IR and ID themselves. Below are the steps that will help resolve this kind of cache issue.

 

Step 1:

 

Go to the Environment tab and look for Cache Notifications. You will see recent entries with relevant details. If the entries are green then everything is OK; if not, that is our problem.


00.png


Step 2:


When you select one or more entries, the icons on the left side become enabled and you can execute them as required. The highlighted one is for Cache Update.

11.png

 

Step 3:


Below is an example where you can see an entry in red status, which means it is in error and you need to update or delete it.


22.png


Step 4:


Select the erroneous entry and click the 'Repeat Cache Notification' icon, which will change the status from red to green. After that you are done.


44.png

Cache issues should get resolved by following these steps.

Manual transports in PI using the file system... It's simple


If your CTS or CMS is not working and you have an urgent pending transport, you can choose the manual transport option. Our ID and IR objects can easily be transported using this method. We may depend on the Basis team for this option, but if we have access to the server directories it becomes simple.

 

So here are the steps that we need to follow to complete this task.

 

Step 1:

Go to the namespace whose objects you want to export.

XIExport1.PNG

Step 2:

Select the mode 'Transport Using File System'.

 

XIExport2.PNG

Step 3:

Click Continue

XIExport3.PNG

Step 4:

Select ‘Object Set’ as per your requirement.

XIexport4.PNG

Step 5:

Take the example of Individual Objects and click the middle selection icon to choose the required object.

XIexport5.PNG

Step 6:

All kinds of repository objects are shown for manual selection; select the required one.

XIexport6.PNG

Step 7:

As an example, we have selected one mapping.

XIexport7.PNG

Step 8:

Once selected, it will be shown like this; just click 'Finish'.

XIexport8.PNG

Step 9:

After clicking Finish you will get the details of the file. Note down the export path and filename from the window.

XIexport9.PNG

Step 10:

Go to the export path location on your PI server from which you want to transport the objects, e.g. for Dev: E:\usr\sap\<Dev PI server name>\SYS\global\xi\repository_server\export

XIexport10.PNG


Step 11:

Locate your file and place it in the target PI server's import directory, e.g. E:\usr\sap\<Quality Server name>\SYS\global\xi\repository_server\import

XIimport.PNG

XIimport1.PNG

Step 12:

Now go to your QA PI server and choose 'Tools' -> 'Import design objects'.

XIimport2.PNG

Step 13:

The file will be visible if you have placed it correctly; just click OK to import it.

XIimport3.PNG

Once imported, the placed file moves to the importedFiles directory.

Xiimported.PNG

Step By Step Guide for Configuring User-Defined Search In PI


Introduction:

 

In any PI system, thousands of messages flow across an interface, so it becomes very difficult to find the particular message we need. To make the message search easier, a User-Defined Search can be configured. A User-Defined Search means searching for a required message in an interface based on the content of the payload or a dynamic header. This document explains step by step how to configure a User-Defined Search in the Integration Engine (ABAP stack) and the Adapter Engine (Java stack).

 

Basic Concepts and Overview:

 

1. If you consider the payload shown below, we can define a user-defined search for any part of its content.

UDS_1.png

2. Let us take the Name field here as an example.

3. Once we define Name as the search criterion in the UDS, the messages that satisfy the value we give will be filtered and displayed.

 

For configuring User-defined Search the following three things need to be configured.

           

  • Filters - contain the details of the interface, i.e. the sender and receiver components, the interfaces and their namespaces.
  • Extractors - contain the extraction details for the filters, i.e. the criteria by which we want to filter the messages. This can be content of the payload or a dynamic header, so here we define the required XPath expression or dynamic header.
  • Namespace Prefixes - here we define the namespace prefix and the corresponding value for the namespace prefix.
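To make the extractor idea concrete, here is a small, self-contained Java sketch (the class name, payload, field names and the namespace urn:demo:hr are invented for illustration) showing how the two ingredients you configure in the UDS - an XPath expression plus a namespace prefix - pull a value out of a payload:

```java
import java.io.StringReader;
import java.util.Iterator;
import javax.xml.XMLConstants;
import javax.xml.namespace.NamespaceContext;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.xml.sax.InputSource;

public class UdsXpathDemo {

    // Evaluates an XPath expression against a payload, with the prefix "ns1"
    // bound to an invented namespace -- mirroring the Namespace Prefixes
    // part of the UDS configuration.
    static String extract(String payload, String expression) throws Exception {
        XPath xpath = XPathFactory.newInstance().newXPath();
        xpath.setNamespaceContext(new NamespaceContext() {
            public String getNamespaceURI(String prefix) {
                return "ns1".equals(prefix) ? "urn:demo:hr"
                                            : XMLConstants.NULL_NS_URI;
            }
            public String getPrefix(String uri) { return null; }
            public Iterator<String> getPrefixes(String uri) { return null; }
        });
        return xpath.evaluate(expression,
                              new InputSource(new StringReader(payload)));
    }

    public static void main(String[] args) throws Exception {
        String payload = "<ns1:Employee xmlns:ns1=\"urn:demo:hr\">"
                       + "<ns1:Name>John</ns1:Name>"
                       + "<ns1:EmpId>1</ns1:EmpId>"
                       + "</ns1:Employee>";
        // The expression below plays the role of the extractor's XPath.
        System.out.println(extract(payload, "/ns1:Employee/ns1:Name"));
    }
}
```

The filter decides which interface's messages are inspected; the extractor is essentially the XPath evaluation above applied to each message.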

 

Indexing:

 

The Adapter Engine User-Defined Search has an Indexing option. Indexing means that we can also search for messages that have already been sent to PI and processed. Currently this option is available only in the Adapter Engine; we don't have it for the Integration Engine.

 

System Requirements:


A PI 7.3 system and access to the transactions SXMS_LMS_CONF, SXMB_MONI, and SXMB_IFR are required.

 

User-Defined search in Integration Engine (Abap Stack):

 

With PI 7.3 we can open the ABAP stack in two ways:

  • Using the normal SAP Logon.
  • Using a browser.

The step-by-step procedure to open the ABAP stack using a browser is explained in this guide. In the ABAP stack, use the transaction "SXMS_LMS_CONF" to define filters, extractors and namespace prefixes.

 

 

Opening Abap stack using web browser:

 

Opening the ABAP stack using a web browser is a new feature available with PI 7.3. Now let us see how to open the ABAP stack using a web browser, and then how to configure a User-Defined Search using the transaction "SXMS_LMS_CONF".

 

1. Open the PI start page in a browser.

2. Then select "Configuration and Monitoring Home", which is encircled in black in the screenshot below.

UDS_1.png

 

3. Then select the Monitoring tab, under that the Integration Engine tab, and select the "Message Monitor (Database)" option, which is encircled in black in the screenshot below.

UDS_1.png

 

4. Once we click that, we are redirected to the ABAP stack as shown below.

UDS_1.png

5. Then go to transaction "SXMS_LMS_CONF" to define the User-Defined Search.

 

 

Filters:


1. Once we open the transaction "SXMS_LMS_CONF", we find the options to add, delete and edit filters as shown below.


UDS_1.png

2. Define filters by giving sender and receiver parties, components, Interface name and namespace.

 

UDS_1.png

 

Extractors:


1. The next step is configuring Extractors for the filters.

2. Select the filter for which an extractor needs to be defined.

3. At the bottom of the screen you will find the option to create an extractor, which is encircled in black in the screenshot below.

4. Once we select the New option, a new tab opens, as shown below, where we will define our extractors.

UDS_1.png

 

5. Define extractors for the filters by giving an XPath expression or a dynamic header.

6. You can invoke these extractors during message processing or by an external job; select according to the requirement.

UDS_1.png

7. One Filter can have many Extractors.

 

 

Namespace Prefixes:


1. In the same screen, at the bottom right, you will find an option to add a Namespace Prefix, which is encircled in black in the screenshot.

UDS_1.png

2. The screenshot below shows the namespace prefix definition.

UDS_1.png

3. Define the namespace prefix and the corresponding namespace value.

 

 

Searching for the message using User-Defined Search Criteria:

 

1. After configuring the filters, extractors and namespace prefixes, go to transaction "SXI_MONITOR".

2. Select the "User-Defined Selection Criteria" tab.

 

UDS_1.png

 

3. There you can find two columns, Name and Value.

4. Under Name, select the extractor you defined, and under Value give the value you want to search the message by.

5. Values are given as shown below.

UDS_1.png

6. You can search for all values or for any of the values.

7. It works like "AND" and "OR".

8. Now the messages will be filtered based on the user-defined search criteria, and you can see the user-defined attributes by selecting the Attributes tab or pressing F7.

UDS_1.png

9. This is how the UDS works in the Integration Engine.

 

 

User-Defined Search in Adapter Engine (Java Stack):


1. Hopefully you now have an idea of what a User-Defined Search is. Now we will see the various configurations for UDS in the Adapter Engine message search.

2. Open the Java stack of PI and go to Configuration and Monitoring Home.

3. There you will find "User-Defined Search Configuration". Select that option.

UDS_1.png

4. Once you select the "User-Defined Search" option, you are directed to the page shown below.

UDS_1.png

 

5. Here you find the Filters, Extractors and Prefixes tabs.

6. You can create a new filter using the "New" button, highlighted by the black circle.

7. You can edit an existing filter using the "Edit" button, highlighted by the red circle.

8. You can delete an existing filter using the "Delete" button, highlighted by the green circle.

9. You can also activate or deactivate a filter.

 

Filters:


1. Now we will see how to create filters for an interface.

2. Select the "New" button to create a filter. You will be redirected to the page shown below.

UDS_1.png

3. A new filter is created by giving the name of the filter, the sender and receiver parties, components, interface name and namespace.

4. Here too you can activate or deactivate the filter with the help of the Status option.

 

Extractors:


1. Now it's time to create extractors for the filters.

2. Select the "Search Criteria" option, encircled in black in the screenshot below, to define extractors.

UDS_1.png

 

3. Select the filter for which you want to create an extractor, then select the New button and define the extractor.

UDS_1.png

4. For an extractor you need to give the name and the type, i.e. whether it is an XPath expression or a dynamic header.

5. Then give the required XPath or dynamic header based on which you want to search for the message.

UDS_1.png

 

Namespace Prefixes:


1. After creating filters and extractors, the namespace prefixes should be defined.

2. Select the Prefixes tab.

UDS_1.png

UDS_1.png

    

 

3. Select the Prefix tab and give the namespace prefix and the corresponding value.


UDS_1.png



Searching for the message using User-Defined Search Criteria:


1. Now it's time to check the UDS we have configured.

2. Select the Monitoring option, and under it the "Adapter Engine" tab.

3. Then select the "Message Monitor" option, encircled in black in the figure below.

UDS_1.png

1. You will be redirected to the message monitoring page. Select the Database view and also select the "Advanced" option, which is encircled in red in the figure below.

UDS_1.png

 

 

2. Once you select the Advanced option, you can see the "User-Defined Search" criteria.

UDS_1.png

3. Select the "Add Predefined" tab.

4. Now you can see the filters and extractors we defined earlier.

5. Select the required filter and extractor for your interface.

UDS_1.png

6. Then give the value you want to filter by.

7. For example, if EmpId is the filter criterion and you give "1" as the value, the messages that contain "1" in that field of the payload will be displayed.

8. The screenshot below shows the messages filtered based on the value or content.

UDS_1.png

9. In the User-Defined Attributes tab you can see the values we defined.

 

Indexing:


1. As discussed earlier, the Indexing option is available for the Adapter Engine User-Defined Search.

2. Select the "Indexing Options" button, encircled in black in the screenshot below.

UDS_1.png

 

3. By indexing we mean that we can search for messages that have already been processed.

UDS_1.png

 

 

4. We can give the required dates and select the "Start Indexing" button to start indexing. The status can be checked with the "Indexing Status" button.

EJB in SAP PI


Objective

 

While working with SAP PI you must have come across objects such as adapter modules and Java proxies. These PI components belong to the Java stack and are based on EJB.

 

In the Java world, people know what EJB is, but those of us in SAP PI are often not very familiar with it, even though it is used extensively here. So let us discuss EJB for a while. Before we start with EJB, it is also necessary to discuss the web server and the application server.

 

Web Server vs. Application Server

 

When we were in college, many of us built our own website or saw classmates building one. Most of us downloaded one of the freely available web servers and built our web application on it. Later, working as IT professionals and building web-based applications for industries such as banking, manufacturing or finance, we no longer use a plain web server but an application server. Moreover, no application server is freely available in the market; you have to buy it, paying thousands and even lakhs of rupees.

 

Why do we need an Application Server?

 

It is like building your own boat and trying to cross the river: if you drown, it is your bad luck. But if you want to ferry passengers, nobody will sit in your boat unless they feel secure, no matter how attractive and comfortable it looks. Similarly, a business will look for certain primary requirements from your application, such as security, and only once satisfied will it look at the application with interest.

 

What are those primary requirements? There are many; only some are mentioned here: load balancing, transparent fail-over, transactions, clustering, object life cycle, resource pooling, security, caching, etc. Obviously, you cannot be a master of all of them. As IT professionals, it is best we concentrate on solving the complicated business logic (finance, sales and distribution, human resources, etc.) and leave these primary requirements to a different category of people who specialize in them. Many companies are pioneers in providing solutions for these requirements. They are generally known as vendor companies, and their solutions are known as middleware services. You can use their solutions in your application so that you are free to concentrate on the core business logic and coding.

 

That is how an application server is born: the primary requirements are bought from the vendor companies and plugged into a web server. All this needs a lot of effort and money, which is why an application server is never freely available and you always have to buy it.

fig-1.JPG

Fig-1: Web Server vs Application Server

 

A web application has a client-server architecture and may be 2-tier or 3-tier. A 3-tier architecture consists of the following:

  1. Presentation layer - The presentation layer contains the components that implement and display the user interface and manage user interaction.
  2. Business layer - The business layer contains components used to implement the business logic and to define the business entities.
  3. Data layer - The data layer contains components used to meet the database access requirements of your application.

In 2-tier architecture, the two layers (Presentation and Business layers) are combined into one.

 

Enterprise Java Beans

 

As we move on to building more robust applications for our customers, we use an application server, which enables us to plug in the required middleware services. To make this possible, we develop objects using the EJB framework.

  • Enterprise JavaBeans (EJB) is a server-side component architecture that simplifies building enterprise-class distributed component applications in Java.
  • An enterprise bean is a deployable bean and has to be deployed in a container, i.e. an application server. Each enterprise bean consists of one Java class, two interfaces and two descriptor files. Regardless of an enterprise bean's composition, the clients of the bean deal with a single exposed component interface.

In the Java world there is a very familiar term: JavaBean. A JavaBean is a simple Java class with get and set methods. It consists of a single Java class, does not need a runtime environment (container), and is not part of the EJB framework.
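A JavaBean can be sketched in a few lines; the class and field names here are invented purely for illustration:

```java
// A plain JavaBean: one class, a no-argument constructor, get/set accessors.
// No container, no deployment descriptor -- it is not an enterprise bean.
public class EmployeeBean {
    private String name;
    private int id;

    public EmployeeBean() { }                       // no-argument constructor
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getId() { return id; }
    public void setId(int id) { this.id = id; }

    public static void main(String[] args) {
        EmployeeBean bean = new EmployeeBean();
        bean.setName("John");
        bean.setId(42);
        System.out.println(bean.getName() + ":" + bean.getId()); // John:42
    }
}
```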

 

Enterprise beans are used to build distributed component applications in Java. What does distributed mean? A distributed application is one whose components do not reside in one application server but are deployed across several application servers, i.e. in different address spaces (clustering). Yet they work in unison, and to the user they appear to be a single application.

 

Where do we use the EJB framework?

 

fig-2.JPG

Fig-2: EJB in 3-tier architecture

 

The EJB framework implements the business layer and the data layer. The objects we build using the EJB framework are known as Enterprise JavaBeans. The application server plus the Enterprise JavaBeans together enable us to build robust Java applications for our customers and provide them the necessary middleware services. There are three types of enterprise beans: Session, Entity and Message-Driven. Functionally, they are classified by whether they represent

  1. business logic or
  2. business data

 

Type of Enterprise Bean

fig-3.JPG

Fig-3: Type of Enterprise Bean

 

The Session Bean is further classified into two - Stateful and Stateless. A Stateful session bean is designed to service business processes that span multiple method requests or transactions. A stateless session bean is for business processes that span a single method call.

 

The entity bean is further classified into two types: bean-managed persistence and container-managed persistence. With bean-managed persistence, you map the class variables to the database fields and develop the corresponding database access logic within the bean. With container-managed persistence, you only map the fields in the deployment descriptor, and it is the container that generates the database access logic using a data access object.

 

Functionally, the message-driven bean is similar to a session bean, the only difference being that you call a message-driven bean by sending messages to it; you cannot do a lookup.

 

In SAP PI we mostly use session beans to develop objects such as adapter modules and Java proxies. Henceforth, our discussion will be limited to session beans; entity beans and message-driven beans will not be covered here.

 

Detailed discussion of an Enterprise JavaBean

 

When we write a simple Java class, the following are its components:

  1. Constructor
  2. Methods to support business operation
  3. Methods for internal operation i.e. data access, reading properties file, modularization of code etc; they are never called by the user.

 

Similarly following are the components of an enterprise bean

  1. Life-cycle methods, i.e. constructor-like create and remove methods
  2. Methods to support business operation
  3. Methods for internal operation and container methods. Container methods are never declared by us; we inherit them from a parent interface and override them, e.g. ejbActivate, ejbPassivate. They communicate with the container.

 

An enterprise bean consists of two interfaces and a class. One interface bears the signatures of the life-cycle methods and the other those of the business methods. The former is known as the Home interface and the latter as the Remote interface. They make the enterprise bean remote-enabled, i.e. the user can call the deployed enterprise bean from another address space.

fig-4.JPG

Fig-4: Remote enabled enterprise bean

 

Remote vs. Local Interface

 

All enterprise beans are made remote-enabled to support the distributed component architecture. But this comes at a cost: calling a remote-enabled bean adds a lot of overhead, consumes resources and makes the system slow.

 

Whenever we build an application for a customer, the enterprise beans created can be grouped into functional modules. A business process may involve multiple enterprise beans. It turns out that, barring a few, most enterprise beans in one module do not need to call an enterprise bean in another module. So if we deploy the enterprise beans module-wise, not all of them need to be remote-enabled. A new set of interfaces exists for them: the Local interface and the LocalHome interface. The former replaces the Remote interface and the latter the Home interface.

fig-5.JPG

Fig-5: Remote and Local Enterprise Bean

 

When remote enterprise beans call each other, they pass parameters by value, while local enterprise beans pass parameters by reference. Local beans consume far fewer resources and are very fast, but they only work when both the caller and the called bean are in the same address space.
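The difference in parameter passing can be illustrated in plain Java. The class and method names below are invented; the "remote" call is simulated by serializing a copy of the argument, which is essentially what happens when RMI marshals parameters across address spaces:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class CallSemanticsDemo {
    static class Order implements Serializable {
        String status = "NEW";
    }

    // A local bean receives the caller's own object reference ...
    static void localConfirm(Order o) { o.status = "CONFIRMED"; }

    // ... while a remote bean receives a serialized copy, simulated here.
    static void remoteConfirm(Order o) throws Exception {
        Order copy = deepCopy(o);
        copy.status = "CONFIRMED";   // changes only the copy, not the caller's object
    }

    static Order deepCopy(Order o) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        ObjectOutputStream oos = new ObjectOutputStream(bos);
        oos.writeObject(o);
        oos.flush();
        ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()));
        return (Order) in.readObject();
    }

    public static void main(String[] args) throws Exception {
        Order a = new Order();
        localConfirm(a);
        System.out.println("local:  " + a.status);   // local:  CONFIRMED

        Order b = new Order();
        remoteConfirm(b);
        System.out.println("remote: " + b.status);   // remote: NEW
    }
}
```

The serialization round trip is also why remote calls are comparatively slow: every parameter is copied byte by byte.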

 

Creating an Enterprise Bean

 

We start by creating the two interfaces and the class. We declare the life-cycle methods and the business methods in the two interfaces and implement them in the class. For a remote-enabled bean we use the Remote and Home interfaces; if remote access is not required, we go for the Local and LocalHome interfaces.

 

Next, we create two descriptor files: the deployment descriptor and a vendor-specific file. We plug in the necessary middleware services for our bean through these two files. Two files are needed because not all middleware services can be configured in the same fashion across containers; some are vendor-specific and need a vendor-specific way of configuring, which the EJB specification does not touch. While the deployment descriptor is portable across containers, the vendor-specific files are not.

 

Deployment Descriptor                            Vendor-Specific Files
It is an XML file                                It may be XML, TXT, etc.
Portable across containers                       Not portable across containers
Life-cycle requirements, transaction,            Load balancing, clustering,
persistence, security                            monitoring, etc.

Once all the files are ready, we bundle them into a JAR file and deploy it in the container. An ejb-jar file is a compressed file that contains everything needed for deployment.

 

Calling an EJB

 

We do not use the 'new' operator to create an instance of an enterprise bean. A pool of instances is created by the application server when it starts up; they are then ready to serve user requests. Whenever a request comes from a user, the container maps it to one of the available instances. The same instance is kept for subsequent requests from the same user if the bean is stateful, and it is released back to the pool once the user leaves. For a stateless bean, the instance is released after every request-response call. When the number of users rises above a threshold, the application server may decide to grow the pool or direct the call to another application server.
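The stateless variant of this pooling can be sketched in plain Java; the pool here is a toy stand-in for the container's instance management, and all names are invented for illustration:

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class BeanPoolDemo {
    static class GreeterBean {                 // stands in for a stateless bean
        String greet(String user) { return "Hello " + user; }
    }

    // The "container" pre-creates instances at startup ...
    static final Deque<GreeterBean> pool = new ArrayDeque<>();
    static {
        for (int i = 0; i < 3; i++) pool.push(new GreeterBean());
    }

    // ... and maps each incoming request to any available instance.
    static String serveRequest(String user) {
        GreeterBean bean = pool.pop();         // borrow an instance
        try {
            return bean.greet(user);
        } finally {
            pool.push(bean);                   // stateless: released after every call
        }
    }

    public static void main(String[] args) {
        System.out.println(serveRequest("Alice")); // Hello Alice
        System.out.println(serveRequest("Bob"));   // Hello Bob
        System.out.println("pool size: " + pool.size()); // pool size: 3
    }
}
```

For a stateful bean the borrowed instance would not be returned in the `finally` block; it would stay bound to one client until that client logs out.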

 

While calling our enterprise bean, how will users know which methods we have declared and how to call them, i.e. with which arguments? We have to provide them with the two interfaces.

 

If we want to make our bean remote-enabled, we have to generate the stub and the skeleton. These are two Java objects that take care of the communication protocol across the network, e.g. establishing the socket connection. They act as proxies between the server and the user and are generated from the two interfaces, Remote and Home.

 

Following is the list of objects to be deployed at the user and the server ends.

 

User End:
  • User class
  • Interfaces
  • Stub

Server End:
  • Bean Class
  • Interfaces
  • Deployment Descriptor
  • Vendor Specific File
  • Skeleton

            Note – Stub and skeleton are only applicable to a remote-enabled bean, not a local bean.

 

An example on Enterprise Bean (Type - Session)

 

Suppose you are a user and you are doing online transaction on your savings account. These are the operations you are executing.

  1. login to your bank account using your user-id, password combination
  2. check your account balance
  3. transfer amount from your savings account to your credit card account
  4. logout


The above functionality is implemented in an Enterprise Java Bean. Methods 1 and 4 are life-cycle methods; they are called only once during the life-cycle of the bean. Methods 2 and 3 are business methods and may be called any number of times.

fig-6.JPG

Fig-6: User-Account Enterprise Bean

 

As the user session has to be maintained, we will go for the stateful session bean. When the user logs in, the container assigns one instance to the user and it remains committed for the entire session. The user then checks the balance and transfers an amount to the credit card account. When the user logs out, the instance is released back to the pool, ready to cater to the needs of any other user.
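The business logic of the four operations can be sketched in plain Java. Only the logic is shown - the container concerns (pooling, passivation) are left out, and the credentials and amounts are invented for the example:

```java
// Sketch of the four operations of the User-Account bean. login/logout mark the
// life-cycle boundaries of the stateful session; the other two are business methods.
public class UserAccount {
    private boolean loggedIn;
    private double savings = 1000.0;
    private double creditCard = 0.0;

    public boolean login(String user, String password) {
        loggedIn = "user".equals(user) && "secret".equals(password);
        return loggedIn;
    }

    public double checkBalance() {
        if (!loggedIn) throw new IllegalStateException("not logged in");
        return savings;
    }

    public void transferToCreditCard(double amount) {
        if (!loggedIn) throw new IllegalStateException("not logged in");
        savings -= amount;
        creditCard += amount;
    }

    public double creditCardBalance() { return creditCard; }

    public void logout() {
        loggedIn = false; // the instance would now go back to the pool
    }
}
```

Because the state (savings, loggedIn) lives between calls, only a stateful session bean fits; a stateless bean would lose it after every request.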

 

Adapter Module

 

The Integration Engine communicates with non-SAP systems using adapters, i.e. File, JDBC, SOAP, JMS etc., which belong to the Java stack. The adapters convert messages from the native protocol to the HTTP-XML protocol and vice versa. The Adapter Framework is the basis of the Adapter Engine and contains two default module chains: one for the sender/inbound direction and the other for the receiver/outbound direction.

 

Modules are Java applications developed as EJBs. We add them at the relevant location in the default pipeline, i.e. the module chain, and they give us the ability to process messages between the messaging system and the default adapter, i.e. the one which connects to the external system. There are standard modules provided by SAP, and we can also develop our own custom modules using the SAP-provided API. More than one module can be added to the module chain, but the default adapter is always the last in the chain for the inbound flow and the first in the chain for the outbound flow.

fig-7.JPG

Fig-7: Adding modules to the default module chain

 

A module is not for persisting data, so it is developed as a stateless session bean. If you look at the skeleton code, the bean class implements both the SessionBean interface and the Module interface.

 

public class Module_Name implements SessionBean, Module {

    public ModuleData process(ModuleContext moduleContext, ModuleData inputModuleData) throws ModuleException {

        // module logic here: read or modify the payload carried in inputModuleData

        return inputModuleData;
    }
}

 

The Module interface is specific to SAP and communicates with the pipeline to exchange messages. The modules are developed as an EAR file and deployed in the container. The orders in which the modules are called and any configuration parameter needed are defined in the adapter i.e. the module tab of the communication channel.
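The ordering rule - custom modules first, default adapter module last for the inbound flow - can be mimicked with a tiny mock chain. This is plain Java, not the SAP ModuleContext/ModuleData API; it only illustrates that each module transforms the payload and hands it to the next in configured order:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Mock of a module chain: each "module" transforms the payload and passes it on,
// in the order in which the modules were configured on the channel's module tab.
public class ModuleChain {
    private final List<UnaryOperator<String>> modules = new ArrayList<>();

    public ModuleChain add(UnaryOperator<String> module) {
        modules.add(module);
        return this;
    }

    public String process(String payload) {
        for (UnaryOperator<String> m : modules) payload = m.apply(payload);
        return payload;
    }
}
```

In the real Adapter Framework the payload travels inside ModuleData and the last entry of the inbound chain is the adapter itself; here a trimming "module" followed by a wrapping "module" stands in for that sequence.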

 

Java Proxy

 

Proxies are features provided by SAP to connect the Integration Engine to an external application, i.e. Java or ABAP. A proxy is generated from the corresponding service interface.

  1. Outbound interface generates client proxy and
  2. Inbound interface generates server proxy

Proxies support both synchronous and asynchronous communication. For synchronous communication, the corresponding interface must also be synchronous.


Proxies provide us an easy way to integrate complex interfaces with external systems. The proxy takes care of the whole communication protocol, and we as developers only integrate it at the application end.

fig-8.JPG

Fig-8: Proxy Communication

 

We generate an ABAP Proxy when we integrate PI with SAP System. In the SAP system, the generated code will provide us an EXECUTE method.

  1. EXECUTE_SYNC – for synchronous interface OR
  2. EXECUTE_ASYNC – for asynchronous interface

  We write ABAP code inside the method for calling BAPI, tables, validation etc.

 

We generate a Java proxy when we integrate PI with a Java system. The generated code provides us with an enterprise bean which we integrate with our application. The enterprise bean will have both types of interfaces, for remote and local access; we use the one that best fits our application, depending on where our application is deployed.

 

For operating with Java proxy, a Java Proxy Runtime is needed. The Java proxy runtime comes with a message processing and queuing system and provides security mechanisms. Adapters are not needed. We need to define a Business System within the Technical System – Web As Java. This business system serves as the default sender system for client proxies and as receiver system for server proxies as well. We must not assign more than one business system to the same technical system Web AS Java.

 

There is a difference in configuration between the client and the server java proxies that we need to do in the PI.

 

Client Proxy – Here the proxy is the initiator of the process, so the proxy needs to know where to send the data. When we create a client proxy, it gets created with the details of the Integration Engine, i.e. IP address, port, sender business system, interface name and namespace. We do not need to pass these details to the proxy. Only if the particulars of the Integration Engine change do we have to generate the proxy again.

 

When you send messages with client Java proxies, the details are already in the message. The sender service is automatically set to the business system that you maintained in the SLD. The interface name and namespace are derived from the message interfaces from which the Java proxies are generated. A sender agreement and sender channel are only necessary for Java proxies if you have special requirements for message security.

 

Server Proxy - Here PI is the initiator of the process, and PI never knows where you have deployed your proxy. So the following configuration is needed to tell PI the whereabouts of your proxy.

  1. Create a receiver Channel – Type XI Adapter with the details of the Java WAS where the bean is deployed.
  2. Register the server Java proxy to the proxy server. The mapping between the interface name and the class name of the server Java proxy is done by the proxy server. Type the following in the browser

        http://<Host>:<Port>/ProxyServer/register?ns=<Namespace>&interface=<MessageInterface>&bean=<JNDI_Name>&method=<MethodName>

The rest of the configuration, i.e. the pipeline steps, stays as it is.
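The registration URL in step 2 is just a fixed path plus query parameters, so assembling it can be sketched as follows (host, port and the example values in the test are placeholders):

```java
// Builds the ProxyServer registration URL from its parts. The parameter names
// (ns, interface, bean, method) follow the pattern quoted above.
public class ProxyServerUrl {
    public static String registration(String host, int port, String namespace,
                                      String messageInterface, String jndiName, String method) {
        return String.format(
            "http://%s:%d/ProxyServer/register?ns=%s&interface=%s&bean=%s&method=%s",
            host, port, namespace, messageInterface, jndiName, method);
    }
}
```

Note that real namespaces usually contain characters (colons, slashes) that would need URL-encoding before being typed into the browser; this sketch leaves the values as-is.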


And my heartiest gratitude to Ed Roman for the knowledge I have gained from his book.

How to use system events to communicate XI and R3.


 

 

 

SUMMARY

 

This document shows how to create and trigger events.

 

This scenario collects XML files and transforms them into an inbound IDoc (ZTUR_ATR_IN) for SAP. The XML files are stored in an FTP folder, and the interface is triggered when an XML file arrives in the FTP directory. The files then go to a BPM, where they are collected and transformed into an IDoc. This IDoc is uploaded to the SAP IS-U system.

 

The real problem is that the IDoc needs to be processed manually with transaction BD20, so you need to implement a way to trigger this transaction automatically.

 

To do this, you need a last step in the XI BPM that calls a BAPI. This BAPI raises the event in SAP R3, and the event in turn triggers a job that runs a variant of the BD20 program (RBDAPP01).

 

 

By:  José Antonio Roldán Luna

 

 

 

CONTENTS

 

1. SAP R3
   a. Create the variant of BD20.
   b. Create the system event.
   c. Create the Job.

2. XI Integration Repository (Design)
   a. Import the RFC to call the BAPI that sends the event.
   b. Create a message to reference the BAPI that sends the event.
   c. Add values to the BAPI message that sends the event.
   d. Update the BPM, adding the BAPI call that sends the event.

3. XI Integration Directory (Configuration)
   a. Create objects to call the RFC.

4. Test

 

 

 

REFERENCES

 

  • “Working SAP Events” SAP guide.

 

http://help.sap.com/saphelp_nw2004s/helpdata/en/44/c079599d3756a2e10000000a1553f6/frameset.htm

 

  • Another example to use events

http://searchsap.techtarget.com/tip/0,289483,sid21_gci1253243,00.html

 

  • Sap Library BPM step types.

http://help.sap.com/saphelp_nwpi71/helpdata/en/62/dcef46dae42142911c8f14ca7a7c39/frameset.htm

 

 

1. SAP R3

 

a. Create the variant of BD20.

 

To create a variant for transaction BD20, use transaction SE38. In the Program field, write the program name RBDAPP01 and click Variants.


ScreenHunter_139 Feb. 06 13.14.jpg

 

 

In the next screen, write the name of the variant (e.g. TRANSIDOC_ATR).


ScreenHunter_139 Feb. 06 13.16.jpg

 

 

Now, click on Continue.


ScreenHunter_139 Feb. 06 13.22.jpg

 

 

In the next screen, you need to configure the variant. In the picture below you can see the values for this case.

In Message type, write a concrete IDoc type, in this case ZMTUR_ATR_IN. After writing the values, click Variant Attributes.

 

 

ScreenHunter_140 Feb. 06 13.22.jpg

 

In the next screen you can select a list of values; you can take the values from the picture below.


ScreenHunter_140 Feb. 06 13.23.jpg

 

 

Then click Save.

 

b. Create the system event.

 

To create the system event, execute transaction SM64 and click Create.

The System checkbox needs to be selected.


ScreenHunter_141 Feb. 06 13.23.jpg

 

 

c. Create the Job.

 

To create the job, use transaction SM36.

Write the job name and click Start Condition.

 

ScreenHunter_141 Feb. 06 13.24.jpg

 

In the Start Condition menu, click After event, select the event created before, and save.


ScreenHunter_142 Feb. 06 13.24.jpg

 

 

To continue with the configuration of the Job click on Step

 

In the next step, you need to link the Variant of the program that has been created before.


ScreenHunter_142 Feb. 06 13.25.jpg

 

Click Save.

 

2. XI Integration Repository (Design)

 

a. Import the RFC to call the BAPI.

 

To import an RFC to call the BAPI, right-click on RFC and choose Import of SAP Objects.


ScreenHunter_142 Feb. 06 13.26.jpg

 

 

Write the SAP server details, log on to the system, then click Continue.

 

 

In the next screen, select the RFC RSSM_EVENT_RAISE and click on Finish.

 

By double-clicking on the object, you can see its structure.


 

ScreenHunter_142 Feb. 06 13.27.jpg

 

If you have problems importing the RFC, make sure you have selected the correct SAP system and check the Software Component Version.

 

ScreenHunter_143 Feb. 06 13.27.jpg

 

 

b. Create a message to reference the BAPI.

 

To use the RFC in the BPM, you need to create a Message Interface with the Abstract and Asynchronous attributes.

Right-click on Message Interfaces and click New. An auxiliary window is displayed where you need to write the Message Interface name. After that, choose the Abstract and Asynchronous attributes, and in the Message Name field select the RFC RSSM_EVENT_RAISE.

 

ScreenHunter_143 Feb. 06 13.28.jpg

 

Now, save and activate changes.

 

c. Add values to the BAPI message.

 

The next step is to assign values to the BAPI, so you need to create a Message Mapping and set the name of the event.

Right-click on Message Mapping and choose New. An auxiliary window is displayed where you need to write the Message Mapping name.

Then select the source and target structures. In this case, the source structure is ZTUR_FACTURA_ATR and the target structure is the BAPI.

 

ScreenHunter_144 Feb. 06 13.28.jpg

 

Now double-click on the I_EVENTID field and link it to a Constant box containing the name of the event.


ScreenHunter_145 Feb. 06 13.28.jpg

 

 

To use the Message Mapping, you need an Interface Mapping object. Here, select the source and target message interfaces (MI) and the Message Mapping that performs the transformation.


ScreenHunter_145 Feb. 06 13.29.jpg

 

Now, save and activate changes.

 

d. Update the BPM.

 

ScreenHunter_146 Feb. 06 13.29.jpg

 

In the BPM you have to add the steps to receive, transform and send the BAPI.

To receive the BAPI message, you need to add the Message Interface (MI) in the Container menu, as shown in the picture below.

 

 

In the BPM you need to insert a Wait step to enhance the process.


ScreenHunter_146 Feb. 06 13.30.jpg

 

Next, add a Transformation step. You use the Transformation step to assign the previously created Interface Mapping and its messages.


ScreenHunter_147 Feb. 06 13.30.jpg

 

 

The last step is a Send step that sends the BAPI to the SAP system.

Here you enter the message to send. The corresponding configuration of the process must be created in the XI Integration Directory (Configuration).

 

ScreenHunter_147 Feb. 06 13.31.jpg

 

Then, save and activate changes.

 

 

3. XI Integration Directory (Configuration)

 

a. Create the object to call the RFC.

 

To configure the process, you need to create the following objects:

 

  • Receiver Determination

 

You need a Receiver Determination to describe the process flow. In this case you use the Receiver Determination so that the BAPI message sent from the BPM can be routed to the SAP system.


ScreenHunter_148 Feb. 06 13.31.jpg

 

 

  • Interface Determination

 

If the message is going to be transformed, you need to create an Interface Determination. In this case it is not necessary to set the Interface Mapping here, because it has already been set in the BPM.


ScreenHunter_149 Feb. 06 13.31.jpg


 

  • Communication Channel

 

You need a communication channel to set up the communication between XI and R3. In this case you use a receiver RFC adapter with the R3 connection parameters.


ScreenHunter_150 Feb. 06 13.32.jpg

 

 

  • Receiver Agreement

 

You need a Receiver Agreement to indicate which communication channel is used to send your Message.

 

ScreenHunter_151 Feb. 06 13.32.jpg

 

 

4. Test

 

To verify that the job executed successfully, you can list the executed jobs with transaction SM37 in SAP IS-U.


ScreenHunter_151 Feb. 06 13.33.jpg

 

 

There, find the job that you are testing.


ScreenHunter_152 Feb. 06 13.33.jpg


Fill your boots xD !!


IDOC AAE non-SAP communication


This blog shows how to establish IDOC AAE communication to and from non-SAP systems using the Java single stack. There are a few notes around and some pitfalls when starting to integrate with non-SAP systems using IDOCs.

 

First of all, these are the relevant notes about IDOC AAE non-SAP communication:

1752276 - Idoc support for non-sap systems

1729203 - Support for communication with external RFC server

1717833 - RFC destinations to support external rfc servers

1877907 - Support of extern-to-extern RFC communication with JCo 3.0

 

To be able to use the IDOC AAE for non-SAP communication, you have to be aware of these prerequisites:

 

What is Supported (enabled with this Note)
-Communication with non-SAP back-end systems connected via the RFC SDK (SAP note 825494)
-Communication with non-SAP back-end systems connected via JCo standalone 3.0.10 (SAP note 1077727)
-Communication with non-SAP back-end systems connected via JCo standalone 2.1 (SAP note 549268 - as per the note the maintenance and support period for SAP JCo 2.1 ended on 03/31/2013)

 

 

What is Not supported (no change)
-Communication with non-SAP back-end systems connected via SAP NetWeaver RFC Library (SAP note 1025361)
-Communication with non-SAP back-end systems connected via SAP .NET Connector 3.0 (SAP note 856863)

 

Now, let’s go into detail. First of all, you have to set the system VM parameter “jco.allow_non_abap_partner” to “1”, otherwise the non-SAP communication is not allowed. This is also important for JCO server applications, because otherwise you are not able to handle requests coming from a java stack. Therefore also set “-Djco.allow_non_abap_partner=1” as VM argument, when starting the server.

 

Scenario 1: IDOC AAE adapter is sending an IDOC to a non-SAP system

 

We need to create 3 objects to complete this scenario.

  1. NWA RFC Destination
  2. JCA Connection Factory
  3. IDOC AAE receiver adapter

 

My personal recommendation is to start with the IDOC AAE receiver adapter within NWDS or IB. Create a new receiver adapter using the RFC Client Parameters “From NWA”.

 

IDOC AAE receiver channel.PNG

 

There seems to be a naming scheme, but it is not documented anywhere. I think it is used as shown in the screenshot: first the static part “xi/idoc/CF_”, then the party name, followed by the business system or business component, and last the channel name.

You can choose a name here, but I would recommend starting with a dummy value and pinging the channel in NWA afterwards to get the right JCA connection factory name suggested; otherwise the channel ping will always show an error for it.

Channel Ping.PNG

Now create the needed JCA Connection Factory within the NWA. Filter for the outboundRA JCA Resource and switch to the “Related JCA Connection Factories” tab. Select the outboundRA_CF entry, and press the button “Copy and add…”.

NWA AS.PNG

Now use the JCA Connection Factory name, which you got from the channel ping, for the JNDI name, and switch to the “Configuration Properties” tab, and add the property autoCommit with the value false. Also set the DestinationName, which you want to use for the RFC destination, which we will create next.

JCA connection factory.PNG

Now we have to create the RFC destination within the NWA (you could also do this as the first step; the order is not important):

1. Create a new RFC destination, choose Load Balancing “No”, fill in the gateway host and service used for the communication, and press Next.
2. Choose a destination for the repository connection and press Next. This destination is used to read the IDoc metadata.
3. Add the parameter “jco.client.tpname” to the Generic Options table and fill in the Program-ID used for the communication with the external system. The Program-ID should be registered at the gateway you configured before.
4. Press Finish.
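For reference, the resulting destination boils down to something like the following parameters. All values here are placeholders; only the parameter names come from the JCo destination configuration:

```properties
# Gateway used to reach the registered server program (placeholder values)
jco.client.gwhost=mygateway.example.com
jco.client.gwserv=sapgw00
# Generic option added manually: Program-ID registered at that gateway
jco.client.tpname=MY_EXTERNAL_PROGRAM
```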

 

Now you can ping the destination to check that it is working. If you forgot to set the “jco.allow_non_abap_partner” parameter mentioned before, this will show up as an error. The ping also checks whether the Program-ID is registered at the gateway you provided. Just give it a shot…

 

There is also a variant to configure this without using a destination, providing all the details in the JCA Connection Factory. I do not recommend this, as afaik it is not possible to perform a ping on the factory.

 

Scenario 2: IDOC AAE adapter is receiving an IDOC from a non-SAP system

 

The way back is much easier to configure. Just use the Program-ID set in the inboundRA Resource Adapter to communicate with the Java stack. You can then configure an IDOC AAE sender adapter, which uses the default RFC Server Parameters.

 

Just make sure to set the Logical System name for the Business Component according to the sender port (SNDPOR) in the IDoc control record. This is used to identify the right Business Component at runtime.

 

Hopefully this helps some of you out there!

PI 7.3: CUSTOM ADAPTER MODULE FOR SENDING IDOC FLAT FILE INTO AN XML TAG & EXTRACTING IDOC FLAT FILE FROM AN XML TAG


INTRODUCTION: This document covers the details of a custom adapter module developed to place an IDoc flat file inside an XML tag to be sent to a 3rd party system in outbound scenarios, and to extract the IDoc flat file from an XML tag in the XML file sent by the 3rd party system in inbound scenarios. The requirement was that IDocs would be triggered from ECC, and the IDoc XML would not undergo any mapping transformation but would be converted into an IDoc flat file using the standard IDOCXmlToFlatConvertor module. However, the 3rd party system did not accept a flat file format and required the flat file to be sent inside a single XML tag. Apart from the tag containing the IDoc flat file, there were a few more tags populated from the control record of the IDoc, such as the IDoc type, SNDLAD, RCVLAD etc. In inbound scenarios, the 3rd party system would send the IDoc flat file embedded in the same XML tag. In PI, the standard module IDOCFlatToXmlConvertor would be used to convert the IDoc flat file into XML, but for this the IDoc flat file first had to be extracted from the XML tag.


Outbound Scenario:




Inbound Scenario:



 

PROBLEM: An IDoc XML could be converted into a flat file using the standard module, but since the 3rd party system was unable to read IDoc flat files, the entire flat file, which may consist of one or more IDocs, had to be sent inside an XML tag. Likewise, the 3rd party would send an IDoc flat file embedded in an XML tag, which had to be extracted in order to convert it into an IDoc XML.

 

SOLUTION: Since the above requirement was not being met with the standard adapter modules, the solution was to develop a custom adapter module which would be used in conjunction with the standard adapter modules IDOCXmlToFlatConvertor & IDOCFlatToXmlConvertor.


CONFIGURATION & TESTING:


Outbound Scenario:


Sender Communication Channel:

 

 

Receiver Communication Channel:



 

Inbound Scenario:


Sender Communication Channel:

 

 

When the value of the parameter ConversionType (Add or Remove) is Add, as in the outbound scenario, the IDoc flat file obtained from the standard module IDOCXmlToFlatConvertor is embedded into the XML tag as shown below:



 

 

When the value of this parameter is Remove, as in the inbound scenario, the IDoc flat file within the XML tag is extracted and sent to the next module in the module chain, i.e. the IDOCFlatToXmlConvertor module, which then converts this flat file into an IDoc XML. Thus an IDoc was created from this IDoc XML as shown below. The IDoc is in status 51 due to a data issue.



 

Issue with Audit Log Entries in Message Monitoring:


We faced an issue where the sequence of entries in the audit log differed from the sequence maintained in the module code. The XML constructed for the outbound scenario would contain the IDoc control record fields extracted from the IDoc flat file, a single field containing the IDoc flat file, and some other fields whose values would be provided as parameters in the communication channel (using module text). For each step, such as the start and completion of the control record extraction, reading the parameters, extracting the IDoc flat file from the module chain etc., an entry had to be added to the audit log. A randomly occurring issue was discovered in which the entry for completion of a step would appear before the entry for its start. On further investigation, we noticed that when the timestamp of two or more audit log entries is the same, the entries are sorted alphabetically, because of which the completion entry of a step could appear before its start entry.



 

As shown above, when the timestamp of the audit log entries is the same, i.e. 23.02.2014 15:07:53.004, the entry for completion of a step came before the one for its start, and the entry showing that the custom module was called came last. Ideally these entries should follow the processing sequence, i.e. calling the custom module, followed by the start and completion of each sub-step, and then the successful execution of the custom module. The sequence of entries maintained in the module code matched the processing steps, as shown in the code snippet below:




Resolution: The resolution was to phrase the entries so that their alphabetical order matches the processing order, for example:

 

1. Custom Adapter Module IdocFLatFileWrapper called.

2. Custom Adapter Module Processing Step1: Starting extracting Idoc flat file.

3. Custom Adapter Module Processing Step2: Completed extracting Idoc flat file.

& so on.
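The sorting effect and the fix can be reproduced in a few lines of plain Java. The monitor's behaviour is simulated here by an alphabetical sort standing in for the tie-break on equal timestamps:

```java
import java.util.Arrays;

// Simulates how entries with identical timestamps end up ordered alphabetically,
// and how numbering the messages ("...Step1...", "...Step2...") restores the order.
public class AuditOrder {
    public static String[] sortedAlphabetically(String[] entries) {
        String[] copy = entries.clone();
        Arrays.sort(copy); // stand-in for the monitor's tie-break on equal timestamps
        return copy;
    }
}
```

With the original wording, "Completed..." sorts before "Starting..." simply because 'C' comes before 'S'; prefixing "Step1:", "Step2:" makes alphabetical order coincide with processing order.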

 


 

CODE USED FOR MODULE:


The code used for writing the module can be found in the attachments section.

 

References:

  1. http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0b39e65-981e-2b10-1c9c-fc3f8e6747fa?overridelayout=true.
  2. http://help.sap.com/saphelp_nw73/helpdata/en/B5/BD93642DD3410F90EBEA702399FAC4/frameset.htm
  3. http://help.sap.com/saphelp_nw73/helpdata/en/74/A45BC07E2043FB9B63295229178903/frameset.htm
  4. http://scn.sap.com/docs/DOC-52018

Upgrade Guide available for Seeburger EDI-Adapters


Hello,

 

Since the introduction of the new PI versions with the different installation options (e.g. PI 7.31, PO 7.4, Single Stack / Dual Stack), Seeburger has also issued new versions corresponding to the available PI/PO Versions.

(usually I am writing some quick information in my blog about the Seeburger EDI-Adapters SEEBURGER EDI-Adapter - News, Updates and Feedback )

For information about the specific releases, you can also refer to the Compatibility Matrix ( the latest Matrix should always be available via SAP Note 890721 )

 

Since there have been several questions regarding an update of the Seeburger EDI/B2B adapters for the different versions (PI/PO as well as the Seeburger adapter versions), Seeburger is now providing a new, extended version of the “Master Installation Guide”, with a detailed description of “Updating or Upgrading existing SEEBURGER Solutions and Components” in chapter 4.

Although it is intended to be provided with SAP Note 1167474 I am also providing a version for download here.

 

MasterInstallationGuide_2014-02-20.pdf:

https://mft.seeburger.de:443/portal-seefx/~public/6eaf18f4-b1ad-49bd-8e62-a5d1df4140ae?download

 

As always, any comment or feedback would be greatly appreciated to make sure that this guide can be developed further and provide valuable information to everyone.

Global Usage Code of preamble in CIDX ChemXMLs


CIDX Message Structure:

 

A CIDX message contains the following:

 

  1. Preamble: Handles the information global to the document.
  2. Service Header: Contains the information about transaction routing and processing for the given transaction.
  3. Service Content: Contains the actual action message, for example an Invoice.
  4. Attachments: Attachments are optional.

 

A sample preamble looks like the one below.

   1.png

DateTimeStamp: The CIDX adapter service generates the date and time each time a message is processed.

GlobalAdministeringAuthorityCode: As this is a ChemXML, the constant “CIDX” is defaulted for this XML element by the adapter engine.

GlobalUsageCode: This element of the preamble indicates whether the ChemXML being processed is intended for production systems or for test/dev systems. Usually the key words “Test” or “Production” are populated here.
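Putting the three elements together, a preamble looks roughly like this sketch (the values are illustrative only; the real document is shown in the screenshot above):

```xml
<!-- Illustrative preamble sketch, not taken from a live message -->
<Preamble>
  <DateTimeStamp>20140206T131400.000Z</DateTimeStamp>
  <GlobalAdministeringAuthorityCode>CIDX</GlobalAdministeringAuthorityCode>
  <GlobalUsageCode>Test</GlobalUsageCode>
</Preamble>
```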

 

 

GlobalUsageCode in Quality/Test systems:

 

PI/XI defaults this element to the value “Production”. But when a project is in the testing phase, this value should be changed. Below is the process to pass the value “Test” to the GlobalUsageCode field in the preamble header.

If we don't change the value of GlobalUsageCode, the target system will treat test messages as production messages. This is the only parameter in the preamble which differentiates between test and production.

 

 

  • Open home page of Netweaver Administrator

          http://hostname:port/nwa

 

 

      2.png

 

 

 

  • Go to “Configuration” and then “Infrastructure” tab.

 

          3.png

  

 

  • Click on “Java System Properties”

            4.png

     

  

  • Go to “Services” tab.

              5.png

 

 

  • Filter with “XPI Adapter”

               6.png

   

 

 

  • Click on “XPI Adapter: Ispeak”

             7.png

  

You can see that the default value of “EXECUTION_MODE” is “Production”. The property EXECUTION_MODE refers to the execution mode of the Ispeak adapter. The value set here is used by RNIF and CIDX to fill the GlobalUsageCode and indicates whether the message sent out is a test or a production message.

            

             8.png

   

 

 

Now enter the custom calculated value “Test” for quality/test systems.

           9.png

 

After this change, once you run the interface you will see a preamble like the one below in the quality system.

 

         10.png

Handling Parameterized and Simple XSLT Mapping Together with a Java Class in PI 7.4


Dear Scn users,

 

This document shows how to execute an XSLT mapping together with a Java class in PI 7.4.

 

Summary:

 

The approach handles parameterized values for a particular field and normal XSLT mapping for the other fields - a combination that applies to many business cases. To achieve this, a Java class is written that reads the parameters, and this Java class is called from the XSLT mapping. Hence a combination of Java and XSLT mapping is used.

 

In my scenario, I have 4 fields in the input structure

 

1.FirstName.

 

2.LastName.

 

3.Date of joining

 

4. Serial.

 

And the fields in the output structure are as below.

 

1.Name - concatenation of FirstName + LastName.

 

2. Date of joining - for this field parameterized mapping is used, i.e. the date is obtained as a parameter from the Integration Directory.

 

Step1:

 

Create Data Types.

 

One data Type for Input.

 

406929_620_283_cache.jpg

One for Output

 

406929_620_283_cache.jpg

 

Step 2:

 

Create Message Types and Service Interfaces. Use the data types just created to create two message types and two service interfaces for the input and output structures.

 

Step 3.

 

Create Java project in NWDS.

 

1. Open NWDS -> New Java Project.

 

2. Then create a package and a class used for the parameterized mapping of the Date field, and export it as a .jar file.

 

3. Below is the Java code I used to extract the parameters.

 

package com.po;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Map;
import com.sap.aii.mapping.api.AbstractTransformation;
import com.sap.aii.mapping.api.DynamicConfiguration;
import com.sap.aii.mapping.api.DynamicConfigurationKey;
import com.sap.aii.mapping.api.InputParameters;
import com.sap.aii.mapping.api.StreamTransformationException;
import com.sap.aii.mapping.api.TransformationInput;
import com.sap.aii.mapping.api.TransformationOutput;
public class JavaCode extends  AbstractTransformation{

   public final static String namespace = "http://pi";

   public final static String Date = "Date";

   public  void transform(TransformationInput in, TransformationOutput output) throws StreamTransformationException
   {
    DynamicConfiguration dc = in.getDynamicConfiguration();

    InputParameters parameters = in.getInputParameters();

    // add the separate values
    getTrace().addInfo("Date " + parameters.getString("Date"));
    DynamicConfigurationKey userKey = DynamicConfigurationKey.create(namespace, Date);
    dc.put(userKey, parameters.getString("Date"));

    //copy the original content
    try{
     InputStream instream = in.getInputPayload().getInputStream();
     OutputStream outstream = output.getOutputPayload().getOutputStream();
     byte[] buf = new byte[1024];
     int numRead=0;
     while((numRead=instream.read(buf)) != -1){
      outstream.write(buf, 0, numRead);

     }
     outstream.flush();
    }catch(IOException ioe){
     throw new StreamTransformationException(ioe.getMessage(),ioe);
    }
   }

   /**
    * Method called from the XSL to return the value of the attribute
    * Remeber to have <xsl:param name="inputparam"/> in the XSL and call with
    * xslsave:getValue($inputparam, 'USERNAME')
    * @param map the inputparams
    * @param key the key to lookup
    * @return
    */
   public static String getValue(Map map, String key){
    try{
     DynamicConfiguration dc = (DynamicConfiguration)map.get("DynamicConfiguration");
     DynamicConfigurationKey keyConfig = DynamicConfigurationKey.create(namespace, key);
     return dc.get(keyConfig);
    }catch(Exception e){
     return "Error fetching key "+ key + " because: " +e.getMessage();
    }

 


   }

}

 

Step 4:

 

Create an .xsl file to transform the input structure into the output structure as required. The Date field is filled via a call to the Java class. Below is the XSLT code.


<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0"
    xmlns:ns0="http://nttdata.com/Raghu"
    xmlns:javamap="java:com.po.JavaCode"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:param name="inputparam"/>
  <xsl:template match="/">
    <XSLTTestLegacy>
      <Name>
        <xsl:value-of select="concat(concat(ns0:XSLTTestECC/FirstName,' '), ns0:XSLTTestECC/LastName)"/>
      </Name>
      <Date>
        <xsl:value-of select="javamap:getValue($inputparam, 'Date')"/>
      </Date>
      <Serial>
        <xsl:value-of select="ns0:XSLTTestECC/Serial"/>
      </Serial>
    </XSLTTestLegacy>
  </xsl:template>
</xsl:stylesheet>

 

Call the Java class in the XSLT for parameterized mapping.
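Outside PI, the parameter-passing mechanism can be tried locally with plain JAXP. The stylesheet below is a simplified stand-in sketch (the real stylesheet calls the SAP extension function, which only works at PI runtime); the class name and the hard-coded date value are illustrative assumptions, not part of the original scenario.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class XsltParamDemo {

    // Simplified stylesheet: writes the Date parameter straight into the output.
    private static final String XSLT =
        "<xsl:stylesheet version=\"1.0\" xmlns:xsl=\"http://www.w3.org/1999/XSL/Transform\">"
      + "<xsl:param name=\"Date\"/>"
      + "<xsl:template match=\"/\">"
      + "<XSLTTestLegacy><Date><xsl:value-of select=\"$Date\"/></Date></XSLTTestLegacy>"
      + "</xsl:template>"
      + "</xsl:stylesheet>";

    public static String transform(String date) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSLT)));
        t.setOutputProperty("omit-xml-declaration", "yes");
        t.setParameter("Date", date);  // what PI binds from the operation mapping parameter
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader("<XSLTTestECC/>")), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(transform("2014-03-24"));
    }
}
```

This mirrors what PI does at runtime: the operation mapping parameter is handed to the stylesheet as an `xsl:param` value.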

 

 

Step 5:

 

After completing the code, import the Java class (.jar file) and the XSLT code (.rar file) as imported archives in the ESR.

 

Step 6:

 

Create the operation mapping using the service interfaces created earlier. In the mapping program, select the Java class first and the XSLT second, as shown below.

 

406929_620_283_cache.jpg

 

Don't forget to select the "Use SAP XML Toolkit" option. Now define a parameter and bind it to the Java class parameter Date.

 

Step 7:

 

After completing all the steps, you can test the mapping and see that the Date parameter is passed successfully into the XSLT and displayed in the output structure.

 

406929_620_283_cache.jpg

 

The ID objects are created in the usual way; in the Interfaces tab of the integrated configuration, declare the value of the Date parameter.

 

References:

 

http://wiki.scn.sap.com/wiki/display/Snippets/Java+code+for+Parameterized+XSLT+mappings+in+PI7.1?original_fqdn=wiki.sdn.sap.com

 

http://www.piarchitecture.com/2011/07/using-configure-xsl-with-the-build-in-parameters-in-the-sap-pi-mappings/

Data Type enhancements and SLD binding of software components


Data type enhancements are generally used to add extra
fields (enhancing the structure) in existing or standard data types.

 

 

The cases where we may need to enhance the DATATYPES using
Datatype Enhancements are:

 

 

  1. Standard data types, which we cannot edit or modify.
  2. Existing data types whose structure we must not change, because the data type might be used in other interfaces and changing the structure could cause inconsistencies in existing scenarios.

 

 

 

 

 

 

 

  1. Note: Sometimes a client wants to use standard components created by SAP itself, like E-Sourcing, or to reuse standard software components such as SAP BASIS for their own purposes.
  2. Suppose the standard objects do not cover the entire requirement and we need to enhance the existing structure by adding a few more fields. The first problem we face is that the standard objects are not editable or modifiable, so you need to create a custom SWCV for your data type enhancements and then create a dependency between the standard SWCV and the custom SWCV.

 

 

 

 

Let’s discuss it technically:

 

 

DATA TYPE
ENHANCEMENT WITHIN THE SAME SOFTWARE COMPONENT:-

 

 

There are ACTUALLY 2 cases for this:

 

 

CASE 1: Within the same namespace:-

 

 

  • Sometimes we get a requirement from a client to modify a data type for a particular scenario, but the same data type is used in multiple scenarios. In this case we go for a data type enhancement:-

 

 

51.png

 

 

 

Suppose we have to modify the data type DT_ASHU as shown
below:-

 

 

52.png

 

 

 

The requirement is to add an additional field PO NUM under
the address field as shown in the above screenshot, so here we will go for data
type enhancement:-

 

 

Click on data type enhancement under the Interface Objects
tab and create a data type enhancement as shown below:-

 

 

54.PNG

 

 

After the data type enhancement is created, add the extra field PONUM to it. Save and activate, as shown below:-

55.PNG

 

 

After the change gets activated that extra field added
(PONUM) will be visible in the message type as shown below:-

 

 

 

57.png

 

 

 

 

In this case you will not face issues in mapping, as each object (data type, data type enhancement, and mapping) falls under the same namespace.

 

 

CASE 2: In a different namespace:-

 

 

In this case we will be enhancing the data type in different
namespace but within the same software component. The steps are as follows:-

 

 

We will enhance the data type DT_ASHU, present in the namespace http://Shubham, with a data type enhancement DTE_ASHU created in http://POCofPI.

 

 

Now create a DATA TYPE ENHANCEMENT in http://POCofPI

 

61.PNG

 

62.PNG

 

Note: In the screenshot below, check the highlighted text; this is where we need to adjust the namespaces. The first one contains the default value that is set automatically when you create the data type enhancement.

 

63.PNG

 

 

 

 

 

If you follow the default way, you will definitely face problems in mapping.

 

 

The enhanced field may look like part of the structure in the tree view, but the XML source view reveals the real issue: different namespaces.

 

 

62.PNG

 

In the tree view it looks fine, but as the screenshot below shows, the XML has a problem.

 

 

 

 

 

 

 

 

 

 

65.png

 

The trick to handle this is simple: go to the data type enhancement object you created, remove the XML namespace, and activate it.

 

66.PNG

 

 

 

 

 

As the screenshot below shows, the problem is resolved and the structure now looks fine.
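The namespace mismatch described above can be made visible with a few lines of standalone Java DOM code. This is only a local illustration with assumed element names (MT_ASHU, PONUM) and the namespaces from this example, not PI runtime behavior: the enhanced field first carries the enhancement's own namespace, and after the fix it falls under the message's namespace.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class NamespaceCheck {

    // Returns the namespace URI of the first element with the given tag name.
    static String namespaceOf(String xml, String tag) throws Exception {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        f.setNamespaceAware(true);  // must be enabled, or namespace URIs are null
        Document d = f.newDocumentBuilder()
                      .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        Element e = (Element) d.getElementsByTagName(tag).item(0);
        return e.getNamespaceURI();
    }

    public static void main(String[] args) throws Exception {
        // Before the fix: the enhancement field is qualified with the enhancement's namespace.
        String broken = "<MT_ASHU xmlns=\"http://Shubham\">"
                      + "<PONUM xmlns=\"http://POCofPI\">1</PONUM></MT_ASHU>";
        // After removing the XML namespace: the field inherits the message's namespace.
        String fixed  = "<MT_ASHU xmlns=\"http://Shubham\">"
                      + "<PONUM>1</PONUM></MT_ASHU>";
        System.out.println(namespaceOf(broken, "PONUM"));
        System.out.println(namespaceOf(fixed, "PONUM"));
    }
}
```

A receiver validating against the message type's schema would reject the first variant, which is exactly the problem seen in the XML source view above.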

 

 

 

67.png

 

 

 

Now let's talk about SLD binding. We need SLD binding in order to create a dependency between the SAP standard SWCV and the custom SWCV. Below are the details.

 

 

 

 

 

SLD Binding:-

 

 

Sometimes there is a requirement from a client to modify a standard data type provided by SAP.

 

 

Normally this is not possible, as the edit button is disabled and we cannot modify it.

 

 

Creating
Dependencies Between SWCV and EnSWCV in SLD

 

 

69.png68.png

 

 

After this click on Software components as shown in the
above screenshot:-

 

 

70.PNG

 

 


 

 

Now suppose we want to modify a data type in the standard software component "SAP Basis". In this case we enhance it via another software component, named XCITEC here. Please find the details below:-

 

 

 

71.png

 

 

After selecting SAP_BASIS , click on “Define Prerequisite
Software Components” tab as shown in the above screenshot. After clicking we
see a screen as shown below:-

 

 

72.PNG

 

 

 

This creates a dependency between XCITEC and SAP_BASIS, so SAP_BASIS (the software component whose data type we have to enhance) is now visible under XCITEC.

 

 

 

Now, to see the imported objects, log on to the Integration Repository as shown below:-

 

 

 

80.png

 

After logging in to the Integration Repository, go to Tools and click Import Software Component Versions, as shown in the screenshot below:-

 

81.png

 

 

 

After clicking on Import Software component versions, we get
a screen as below:-

 

 

85.png

 

 

 

 

Next, select the software component XCITEC in the import list shown above. Once the import is successful, we get the following message:-

 

 

86.png

 

 

 

 

After the import is successful, the objects to be enhanced (in this case SAP_BASIS 7.0) are visible under the software component XCITEC, as shown in the screenshot below:-

 

 

87.PNG

 

 

Thus the objects get imported.

PDF to XML conversion Using Java Mapping in NWDS:-


In this document I discuss the basic procedure for converting PDF to XML using Java mapping. I have tried to keep it as simple as I can, so that every PI developer can understand it. The only prerequisite is basic Java knowledge.

 

 

Now let's go directly to the steps for creating the Java mapping.

 

 

First of all, we have to create a project in SAP NetWeaver Developer Studio: click File and go to New Project, as shown below:-

 

 

Create a JAVA project here.

501.png

 

 

 

 

Once the project is created, we create a package as shown below:-

 

502.png

 

503.png


505.png

 

 

Clicking on Java Class creates a Java class, as shown below:-

 

506.png

 

 

 

Libraries
to be used for the conversion of PDF to XML:-

 


Libraries used for
mapping purposes:-


508.PNG

Now we have to import the library files in NWDS. Right-click the project (in this case Shubham) and open the "Properties" dialog, as shown below:-

 

 

509.png


 

Then click the Libraries tab at the top of the screen, as shown below:-


510.png

 

 

 

On the Libraries tab, click "Add External JARs" and then click OK, as shown below:-

 

 

 

511.png


This adds all the required JAR files to the project.

 

Now add the code in the Class file:


/*
 * Created on Mar 24, 2014
 *
 * To change the template for this generated file go to
 * Window>Preferences>Java>Code Generation>Code and Comments
 */
package Shubham1;

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.HashMap;
import java.util.Map;

import com.lowagie.text.pdf.PdfReader;
import com.lowagie.text.pdf.parser.PdfTextExtractor;
import com.sap.aii.mapping.api.AbstractTrace;
import com.sap.aii.mapping.api.StreamTransformation;

/**
 * @author shubham.e.agarwal
 */
public class Shubham2 implements StreamTransformation {

    private Map map = null;
    private AbstractTrace trace = null;

    public void setParameter(Map arg0) {
        map = arg0; // store reference to the mapping parameters
        if (map == null) {
            this.map = new HashMap();
        }
    }

    /*
    public static void main(String[] args) { // FOR EXTERNAL STANDALONE TESTING
        try {
            FileInputStream fin = new FileInputStream("C:\\test.pdf"); // INPUT FILE (PAYLOAD)
            FileOutputStream fout = new FileOutputStream(
                "C:/Users/Shubham.e.agarwal/My Documents/pdfXML.xml"); // OUTPUT FILE (PAYLOAD)
            Shubham2 mapping = new Shubham2();
            mapping.execute(fin, fout);
        } catch (Exception e1) {
            e1.printStackTrace();
        }
    }
    */

    public void execute(InputStream inputstream, OutputStream outputstream) {
        try {
            String msgType = "MT_shubham";       // a dummy message type, change as required
            String nameSpace = "http://Shubham"; // a dummy namespace, change as required
            String str;
            str = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
                + "<ns0:" + msgType + " " + "xmlns:ns0=\"" + nameSpace + "\">";
            str = str + "\n<Record>";
            PdfReader reader = new PdfReader(inputstream);
            PdfTextExtractor pdf = new PdfTextExtractor(reader);
            str = str + pdf.getTextFromPage(1);
            str = str + "\n</Record>" + "\n</ns0:" + msgType + ">";
            byte by[] = str.getBytes();
            outputstream.write(by);
            reader.close();
            outputstream.close();
            System.out.println(str);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
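The XML-wrapping logic of the mapping can be sketched on its own, without the iText dependency: the extracted PDF text is simply enclosed in an envelope built from the message type and namespace. The message type, namespace, and sample text below are illustrative assumptions.

```java
public class EnvelopeSketch {

    // Builds the same XML envelope as the mapping above, with the PDF text
    // extraction replaced by a plain string (iText is not needed for the sketch).
    static String wrap(String msgType, String nameSpace, String extractedText) {
        String str = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
                   + "<ns0:" + msgType + " xmlns:ns0=\"" + nameSpace + "\">";
        str = str + "\n<Record>" + extractedText;
        str = str + "\n</Record>" + "\n</ns0:" + msgType + ">";
        return str;
    }

    public static void main(String[] args) {
        System.out.println(wrap("MT_shubham", "http://Shubham", "Invoice text from page 1"));
    }
}
```

Note that the closing tag is built from the same msgType variable as the opening tag, so changing the message type name in one place keeps the document well-formed.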


Now we will design an integration scenario in which a PDF is converted into an XML document.

 

 

INTEGRATION
SCENARIO:-

 

 

First of all we will create a data type as below:-

 

 

512.png

 

 

513.png

 

 

We will be using this data type for both sender and receiver.

 

 

MESSAGE TYPE:-

 

 

514.png

 

 

Same as above, we will be using this message type for both
sender and receiver.

 

 

 

INTERFACES:-

 

 

We will create the inbound and outbound interfaces using the
same message types

 

 

Outbound
Interface:-


515.png


Inbound
Interface:-

 

 

520.png

 

 

 

Now we will create an imported archive for the Java mapping, but before that we need to export the JAR file as below:-

 

 

531.png

 

 

 

After exporting the JAR files, import them into the imported archive, then save and activate, as shown below:-


532.png


Also import the external JAR file whose libraries we use in the Java mapping code.

 

 

In my case I'm doing the PDF conversion, so I used itext.jar (IR_PDFUtility is the imported archive name), which I need to import into the same namespace to avoid the "JAVA class not instantiate" error. The screenshot is below:-


533.png


Now we will create the interface mapping that uses the Java mapping, as below:-

534.png

ID OBJECTS:-

 

 

Now we will use the service BS_Shubham as both sender and receiver.

 

 

SENDER AGREEMENT:-


535.png


Now we will create a communication channel, which we will then configure in the sender agreement.

 

 

 

 

Now we will configure this communication channel in the sender agreement and activate it. Please find the screenshot below:-

 

537.png


 

RECEIVER DETERMINATION:-

 

 

 

 

538.png

 

INTERFACE DETERMINATION:-

 

 

545.png


RECEIVER AGREEMENT:-


546.png


Now we will create a receiver channel and configure it in the receiver agreement, as shown below:-

 

547.png

 

548.png

 

 

Now, after completing the IR and ID parts, we will test the scenario.

 

 

Results:-

 

 

549.png

 

 

So finally the
results indicate that we were able to convert a PDF file into XML as shown
above.

 

 

Furthermore, we can split the XML using a user-defined function as required.

 

 

 

References:-

 

 

http://wiki.scn.sap.com/wiki/display/NWTech/XLS+to+XML+conversion+using+JAVA+Mapping+in+SAP+XI+7.0


How to install SAP NW 7.4 PO on localhost


It is not good practice to run tests and proofs of concept on a customer system, and our company's test system has a different version than my customer's system, so I decided to install my own NW 7.4 PO system on my laptop. Here you can find out how to do that.

 

Note that this document covers only one of many installation options of SAP NetWeaver: Process Orchestration with MaxDB on Windows x64. Other options are more or less similar.

 

First you have to download all installation packages. Go to http://service.sap.com/swdc, log in with your SAP ID, and navigate to Installation and Upgrades->A-Z Index->N->SAP Netweaver->SAP Netweaver 7.4

 

ScreenHunter_16 Apr. 02 16.08.jpg

 

You have to install the Software Provisioning Manager (if you don't have it). This is a tool provided by SAP that serves as the software installation platform for NetWeaver. In the document How to install Software Provisioning Manager you can find a guide on how to install it.


Once the Provisioning Manager is installed, click Installation and Upgrade->Microsoft Windows->MaxDB ScreenHunter_19 Apr. 02 16.38.jpg

and scroll down to the installation file links. Here is the list of files you will need during installation:


Filename      Description
51047454_1    NW 7.4 SR1 Java based SW Comp.s 1 of 7
51047454_2    NW 7.4 SR1 Java based SW Comp.s 2 of 7
51047454_3    NW 7.4 SR1 Java based SW Comp.s 3 of 7
51047454_4    NW 7.4 SR1 Java based SW Comp.s 4 of 7
51047454_5    NW 7.4 SR1 Java based SW Comp.s 5 of 7
51047454_6    NW 7.4 SR1 Java based SW Comp.s 6 of 7
51047454_7    NW 7.4 SR1 Java based SW Comp.s 7 of 7
51047454_8    NW 7.4 SR1 Installation Export
51048107_8    SAP Kernel 7.41 Windows Server on x64 64bit
51046952_1    RDBMS MaxDB 7.9 RDBMS - SP8 Build 12 1 of 3
51046952_2    RDBMS MaxDB 7.9 RDBMS - SP8 Build 12 2 of 3
51046952_3    RDBMS MaxDB 7.9 RDBMS - SP8 Build 12 3 of 3

 

Downloading these files will take a while, because in total they are over 20 GB in size.

 

When you have all the files on your local machine, unpack them using the SAPCAR utility (the same way you unpacked the Provisioning Manager).

 

Now everything is prepared for the installation itself. Go to the directory where you unpacked the Provisioning Manager and run sapinst.exe

ScreenHunter_15 Mar. 31 18.49.jpg


 

Now navigate through installation options tree

     SAP NetWeaver 7.4 Support Release 1->MaxDB->SAP Systems->Optional Standalone Units->Process Orchestration->Standard System->Process Integration and Orchestration Package

ScreenHunter_16 Mar. 31 18.49.jpg

By confirming that option, you start the installation process.

The next step is quite long and involves setting all installation parameters. The installer gives you two choices:

  • Typical parameters
  • Custom parameters

 

I chose custom parameters (even though typical is selected in the screenshot).

 

ScreenHunter_17 Mar. 31 18.50.jpg

You have to set the directory JAVA_J2EE_OSINDEP_UT; you can find it in the directory where you unpacked SAR 51047454

ScreenHunter_18 Mar. 31 18.52.jpg

Set ID of your SAP system

ScreenHunter_19 Mar. 31 18.53.jpg

I unchecked the domain option because in my case it caused problems, but you can leave it checked.

ScreenHunter_20 Mar. 31 18.53.jpg

Set the directory with kernel and MaxDB

ScreenHunter_21 Mar. 31 18.54.jpg

ScreenHunter_23 Mar. 31 18.55.jpg

Set the master password for all users created during installation. This password has to fulfill the password policy of your OS, so it is better to use a strong password with at least one uppercase letter and at least one digit.

ScreenHunter_24 Mar. 31 18.56.jpg

Set domain model

ScreenHunter_25 Mar. 31 18.57.jpg

Set DB ID

ScreenHunter_26 Mar. 31 18.57.jpg

In this step, the installer runs pre-install checks. As you can see, I failed two of them, but I simply ignored this and continued with the installation.

ScreenHunter_26 Mar. 31 18.58.jpg

ScreenHunter_27 Mar. 31 18.58.jpg

Choose disk onto which you want to install your SAP system

ScreenHunter_27 Mar. 31 18.59.jpg

Set the passwords for the database users. Your master password is predefined, so if you don't change anything here, the master password will be used.

ScreenHunter_28 Mar. 31 18.59.jpg

Don't change the parameters here.

ScreenHunter_28 Mar. 31 19.00.jpg

Set path for new DB volumes

ScreenHunter_29 Mar. 31 19.00.jpg

ScreenHunter_29 Mar. 31 19.01.jpg

Set the passwords for the DB schema and secure store. Again, the master password is predefined.

ScreenHunter_30 Mar. 31 19.01.jpg

ScreenHunter_30 Mar. 31 19.02.jpg

Here you can define the SAP system ID and how many Java server nodes will be created.

ScreenHunter_31 Mar. 31 19.02.jpg

Predefined message server port

ScreenHunter_32 Mar. 31 19.02.jpg

Password for ICM webadmin. Master password is predefined.

ScreenHunter_33 Mar. 31 19.02.jpg

Passwords for the Java UME users. The master password is predefined.

ScreenHunter_33 Mar. 31 19.03.jpg

Here you have the option to install other versions of some packages, but for our case, don't change anything.

ScreenHunter_34 Mar. 31 19.03.jpg

You don't need the diagnostic agent.

ScreenHunter_35 Mar. 31 19.03.jpg

But you probably want to use SAP NetWeaver Developer Studio.

ScreenHunter_36 Mar. 31 19.04.jpg

Check your parameters before executing the installation

ScreenHunter_36 Mar. 31 19.05.jpg

After this step, the installation begins, and you have a few hours to read up on the post-install steps.

 

I hope this helped you to set up your own Process Orchestration system on localhost.

How to install Software Provisioning Manager

  1. Log into http://service.sap.com/swdc
  2. Go to Installation and Upgrades->A-Z Index->N->SAP Netweaver->SAP Netweaver 7.4

     

    ScreenHunter_16 Apr. 02 16.08.jpg

  3. Click on SOFTWARE PROVISIONING MGR 1.0 and choose your OS
  4. Scroll down where you can see software packagesScreenHunter_17 Apr. 02 16.18.jpg
  5. Files that begin with 70 (70SWPM10SP05 0-200009707.sar here) can install only SAP NW 7.0. The file without 70 at the beginning installs all versions above 7.0, that is, 7.1, 7.3, and 7.4. Click and download the latest patch level of SWPM to your disk (SWPM10SP05 0-200009707.SAR in my case). It is usually around 300 MB in size.
  6. Download SAPCAR.EXE, the tool used to extract SAR archives. Go to Support Packages and Patches->A-Z Index->S->SAPCAR, choose the latest version, and then click your OS. Scroll down and download the file there.ScreenHunter_18 Apr. 02 16.27.jpg
  7. Copy the downloaded SAPCAR.exe file into the directory with the previously downloaded SAR file, then run SAPCAR.EXE -xvf <filename>.SAR in a command prompt. This command will unpack the SAR archive.

 

 

You can find the sapinst.exe (if you are on Windows) and sapinstgui.exe files in the directory where the SAR file was unpacked.


SAPINST.exe serves as the installer for localhost, and SAPINSTGUI.exe serves as a remote GUI for remotely run installations.

For a remote installation, run sapinst.exe -nogui on the remote system and sapinstgui.exe on the local system, entering the IP address of the remote system as the host.


SAP NetWeaver Process Integration Overview


Reasons for Using SAP NetWeaver Process Integration

 

Integration processes in a heterogeneous system landscape usually span several systems, and the external interfaces for connecting systems were traditionally implemented individually, resulting in a large number of point-to-point connections and a complicated network of relationships. Software systems often differ in their data structures and the protocols they support, so mapping programs are required to map the fields of the source documents to the fields of the target documents.

 

If processes change or new systems are added, you must change the individual interfaces in the applications accordingly, which can require a significant amount of time and effort.

 

The challenges regarding integration in companies can be summarized in the following points:

 

Challenges of Today's Integration Solutions

 

  • Individual point-to-point integration that uses “any” technology
  • Patchwork of integration solutions
  • No centralized knowledge about interfaces and no way of building this knowledge
  • A "homegrown" infrastructure that cannot be adapted, or only with great difficulty, and is very costly to maintain
  • High costs for upgrading components

 

 

SAP NetWeaver Process Integration (SAP PI) provides a platform that enables different interfaces to interact using a common technology. SAP PI offers a central point of integration, with central information on involved processes, systems, and interfaces.

 

 

The Integration Platform of SAP NetWeaver Process Integration

 

As an integral part of SAP NetWeaver, SAP PI is based on an open architecture and uses open standards (in particular from the XML and Java environments).

 

The services provided by SAP PI are indispensable in a heterogeneous and complex system landscape:

 

  • Central repository for interface design.
  • Configuration options for controlling the message flow.
  • Options for transforming message content between the sender and receiver.
  • A runtime infrastructure for exchanging messages.
  • Options for modeling and executing processes.

 

pi1.JPG

Structure of a PI SOAP message

 

A message in SAP PI has the following structure. The properties of the message itself are contained in the message header, for example the sender, which is used later to determine the receiver. The actual business data is contained in the "payload". You can also attach any number of attachments (for example pictures, text documents, and so on) to the message. The message header and the payload are in XML format.

 

The focus of SAP PI is a SOAP message-based communication (based on an XML message format and the HTTP protocol). Application-specific content is transferred from the sender to the receiver as XML messages using the Integration Engine (IE) at runtime.

 

Processing of the messages on the Integration Server is stateless, that is, a message arrives, the receiver or receivers are determined, and the message is forwarded immediately.
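The header/payload structure described above can be sketched with standard DOM code. This is only an illustration of the general SOAP envelope shape; the actual PI SOAP header uses SAP-specific elements, and the Sender field and payload content below are assumptions for the sketch.

```java
import java.io.StringWriter;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

public class PiMessageSketch {
    static final String SOAP = "http://schemas.xmlsoap.org/soap/envelope/";

    public static String build(String sender, String payload) throws Exception {
        Document d = DocumentBuilderFactory.newInstance().newDocumentBuilder().newDocument();
        Element env = d.createElementNS(SOAP, "SOAP:Envelope");
        d.appendChild(env);
        Element header = d.createElementNS(SOAP, "SOAP:Header");  // message properties
        env.appendChild(header);
        Element sdr = d.createElement("Sender");                  // illustrative header field
        sdr.setTextContent(sender);
        header.appendChild(sdr);
        Element body = d.createElementNS(SOAP, "SOAP:Body");      // carries the payload
        env.appendChild(body);
        Element pl = d.createElement("Payload");
        pl.setTextContent(payload);  // plain text here for simplicity
        body.appendChild(pl);

        StringWriter out = new StringWriter();
        TransformerFactory.newInstance().newTransformer()
            .transform(new DOMSource(d), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(build("BS_Company", "FlightBooking data"));
    }
}
```

The routing information (the sender in this sketch) lives in the header, while the business data travels untouched in the body, which is what allows stateless forwarding.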

 

pi2.JPG

Connecting Different Systems to SAP NetWeaver Process Integration

 

The message processing in SAP PI is based on PI SOAP messages. Most of the systems to be connected use different formats and protocols, which is why a variety of PI adapters is offered by SAP and partners. An adapter converts between the (PI-internal) PI SOAP format and the (externally used) format and protocol. SAP provides a number of adapters for this purpose (IDoc, File, RFC, JDBC, Mail, SOAP, RosettaNet, ...).

 

An IDoc is an intermediate document that transports information. It is an SAP standard format for electronic data exchange between systems.

 

The sender system provides data in a document format, for example, IDoc, and makes it available to the adapter by means of a protocol. The adapter transforms the document to the PI SOAP format and forwards it to the Integration Server by using HTTP(S). In the configuration you specify which adapter the receiver is to use to receive the message. The Integration Server sends the message to the relevant adapter, which in turn converts it to the protocol of the receiver and finally sends it to the receiver.
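The "forward to the Integration Server by HTTP" step can be sketched with plain Java: a local HTTP endpoint stands in for the Integration Server, and the client plays the adapter posting the converted XML. The endpoint path, message type, and namespace are illustrative assumptions, not the real PI pipeline URL.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.InetSocketAddress;
import java.net.URL;

public class HttpForwardSketch {

    public static int run() throws Exception {
        // Stand-in for the Integration Server's HTTP endpoint.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/sap/xi", exchange -> {
            exchange.getRequestBody().readAllBytes();  // consume the message
            exchange.sendResponseHeaders(200, -1);     // 200 OK, no response body
            exchange.close();
        });
        server.start();

        // The "adapter" posts the converted XML message over HTTP.
        String msg = "<ns0:MT_Booking xmlns:ns0=\"http://Shubham\"><Record/></ns0:MT_Booking>";
        URL url = new URL("http://localhost:" + server.getAddress().getPort() + "/sap/xi");
        HttpURLConnection con = (HttpURLConnection) url.openConnection();
        con.setRequestMethod("POST");
        con.setDoOutput(true);
        con.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        try (OutputStream os = con.getOutputStream()) {
            os.write(msg.getBytes("UTF-8"));
        }
        int status = con.getResponseCode();
        server.stop(0);
        return status;
    }

    public static void main(String[] args) throws Exception {
        System.out.println("HTTP status: " + run());
    }
}
```

In the real landscape, the receiving side is the Integration Engine pipeline rather than a plain servlet, but the transport mechanics are the same: an XML document posted over HTTP(S).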

 

In the Integration Directory, you assign specific systems as senders and receivers to the interfaces that you defined in the Enterprise Services Repository.

 

Apart from adapters, SAP systems support proxy connectivity with PI without an adapter: the SAP system can send and receive PI SOAP messages directly, so no adapter is necessary in this case. For this approach, interface descriptions are created in the Enterprise Services Repository and then, using these descriptions, proxies are generated in the application systems.

 

From a technical perspective, proxy objects are classes and methods in a programming language (ABAP or Java) that can generate and process messages for the format defined in the Enterprise Services Repository (ESR). You enter a description of all required interfaces in your company in the Enterprise Services Repository at design time. The descriptions are not platform-dependent. The description may also include mapping definitions if the data to be transferred between the interfaces is not all in the same format.

 

Based on a WSDL description, proxy generation generates proxy objects in an application system to use to send or receive messages.

 

pi3.JPG

The Principle and Advantages of Shared Collaboration Knowledge

 

The structure of a message is determined by the data structures of an interface. The central concept of SAP PI is that you develop all required interfaces at design time, independently of the platform, and store them in the Enterprise Services Repository (via import or manual creation). SAP PI thus applies the principle of shared collaboration knowledge.

 

You no longer have to search out information about a distributed process from all the involved systems (point-to-point) because you can now call this information centrally. This reduces the costs of development and maintenance of distributed applications. Another advantage to the customer as a result of using SAP NetWeaver Process Integration is that SAP ships predefined interfaces.


SAP applications (CRM, SRM, SCM, xRPM) can thus contribute their integration knowledge to the Enterprise Services Repository. If the application is enhanced, the content in the Enterprise Services Repository is also enhanced. This enables you to integrate SAP solutions “out-of-the-box” and simplifies solution upgrades.

 

SAP PI provides a range of adapters that enable you to connect systems to SAP NetWeaver Process Integration. Customers can also use the Adapter Framework to create their own adapters.

 

The Adapter Engine is a runtime component for adapters that integrate applications and systems into SAP NetWeaver Process Integration. You can deploy the Adapter Engine centrally, as part of the Integration Server (standard), or decentrally on any SAP AS Java or SAP NetWeaver AS Java.

 

If a customer uses SAP PI and wants to use it to communicate with a “smaller” customer, SAP provides the Advanced Adapter Engine Extended (AEX). This enables a connection using SAP PI. The smaller customer does not need to have the entire SAP PI solution installed, neither must it be an SAP customer.

 

In SAP NetWeaver Process Integration, you can use ccBPM to model and execute processes. This is described in more detail in the lesson “Business Process Management in SAP”.

 

Advantages of SAP PI and Shared Collaboration Knowledge are as follows:


  • The gradual transition to SAP PI safeguards existing investments.
  • Using shared collaboration knowledge reduces the costs of maintaining and developing interfaces.
  • Shipment of a range of Adapters for connecting systems, as standard.
  • Option of implementing customer-specific adapters for integration using the Adapter Framework.
  • Option for customers to use the Advanced Adapter Engine Extended to create messages in SAP PI format without having to install the entire SAP PI solution.
  • Option of modeling, executing, and monitoring cross-system processes (ccBPM).
  • Additional advantages when using SAP systems.
  • Integration of SAP solutions “out of the box”.
  • Easier upgrade of SAP solutions.

 

Test Scenario Process Integration 'Making a Travel Request and Booking Flights'

 

The course scenario for process integration demonstrates the structure of SAP NetWeaver Process Integration and Business Process Management with Business Workflow. The scenario is as follows:

 

An employee of a company creates a travel request using a Business Server Page application (BSP application named ZSAPNW) in an Internet browser.

 

 

The request triggers a business workflow in the company's SAP system, which forwards the request to the employee's supervisor for approval. The approval process also uses a BSP application. In this case, the BSP page is started directly from the workflow process and a WebFlow service is used.

 

Once the request is approved, the employee can book the relevant flights. The requester receives a Business Workflow work item, which calls a BSP application of the Travel_Agency_Summer travel agency, which is closely linked to the company. The employee books an outbound and a return flight in the SingleFlightBooking scenario in the BSP application ZSAPNW_XIAGENT. The SingleFlightBooking service books a single flight in each case, that is, it must be called twice. The flights are booked asynchronously with the airline AA (American Airlines) or LH (Lufthansa) using SAP PI.

 

To keep things simple, the various systems (company, travel agency, SAP PI, airline) are different clients of the same system in the training system.

 

In the figure for the integration process, you can see that SAP NetWeaver Process Integration enables processes to be controlled across systems, whereby the SAP PI server assumes control of the process, thus removing point-to-point connections. SAP PI controls communication between processes, whereas Business Workflow controls communication between users within processes.

 

pi4.JPG

Integration Directory: Sender and Receiver Systems

 

During the configuration phase, you assign the components that you defined in the Enterprise Services Repository to systems or processes as senders and receivers of messages. SAP PI calls this information the logical routing; the corresponding objects are stored in the Integration Directory.

 

There you specify from which system (called a communication component) and with which outbound interface messages are sent via the Integration Server, to which system with which inbound interface they are delivered, and whether a mapping program must be executed.

 

Unlike the figure showing design time, the figure for configuration time not only shows the inbound and outbound interfaces, but also the systems that send a message to, or receive a message from the Integration Server.

 

You use the Integration Builder to make these settings in the Integration Directory for each individual customer according to their specific system landscape.

 

To access the Integration Directory, call transaction SXMB_IFR. The application-specific content is transferred from the sender to the receiver by means of messages in a freely definable XML schema. The structure of a message is determined by the data structures of the interface used (IDoc, file, database, and so on).

 

The Integration Engine on the Integration Server evaluates the configuration in the Integration Directory when an inbound message is received at runtime. It uses the configuration data to determine the receiver or receivers of the message, maps the inbound message to the interface structure of the receiver, and then forwards it for further processing. The Integration Server is the central communication and distribution machine for XML messages.

 

At design time, an integration scenario merely describes how communication will take place and which messages will be used. It does not describe which systems are involved.

 

Both the Enterprise Services Repository and the Integration Directory are written in Java and must be managed with the appropriate Java administration tools.

 

From a technical point of view, SAP NetWeaver Process Integration is based on an SAP NetWeaver Application Server (ABAP and Java). However, you must install this server in addition to SAP NetWeaver Process Integration. A detailed SAP installation guide is available for this purpose.

 

pi5.JPG

File Content Conversion in ESR using Java Mapping


Concept:

 

Here I will explore an alternative way of doing file content conversion, without using module configuration (XML2Plain) or the standard file content conversion available on the receiver file adapter.

BLOG_MessageMappingConcept.jpg

The diagram above depicts how content conversion is normally done in the standard way.

Recently, I came across a requirement where I needed to convert a complex nested XML like the one below into a flat file.

 

<Record>
    <Header>
    </Header>
    <Detail>
        <Detail1>
        </Detail1>
        <Detail2>
            <SubDetail1>
            </SubDetail1>
        </Detail2>
        <Detail3>
            <SubDetail2>
                <DeepSubDetail3>
                </DeepSubDetail3>
            </SubDetail2>
        </Detail3>
    </Detail>
    <Trailer>
    </Trailer>
</Record>

 

Additionally, as per the requirement, while generating the flat file each field had to be padded with a different number of spaces, and many fields and lines were optional or conditional, so I wanted full control over each field of each line in the flat file.

 

I didn’t want to develop a custom adapter module, so I came up with the following solution.

BLOG_JavaMappingConcept.jpg

Here the input XML is taken as an input stream in the Java mapping. Each line of the flat file is built as required, padding fields with spaces, and the resulting output stream is written by the file adapter directly as a flat file.

No module configuration or file content conversion is required at the adapter level.
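The space-padding itself can be isolated into a small helper. This is a minimal sketch, not part of the original mapping; the field widths and values here are hypothetical examples:

```java
// Minimal sketch of fixed-width field padding for flat-file lines.
// Widths and values are hypothetical, not taken from the scenario above.
public class PadDemo {

    // Left-justify a value in a field of the given width, truncating if too long.
    static String pad(String value, int width) {
        if (value.length() > width) {
            return value.substring(0, width);
        }
        return String.format("%-" + width + "s", value);
    }

    public static void main(String[] args) {
        // Compose one flat-file line from padded fields.
        String line = pad("HDR", 4) + pad("INPUT.TXT", 12) + pad("AM", 3) + "\n";
        System.out.print(line); // prints "HDR INPUT.TXT   AM "
    }
}
```

Using a helper like this keeps the line-building code free of repeated, hand-counted space literals.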

 

Real Time Scenario and Code snippets:

Let’s take a real sample input xml from which we will be going to generate a flat file using above concept.

 

Input XML:

<?xml version="1.0" encoding="UTF-8"?>
<ns0:MT_Input xmlns:ns0="urn:example">
    <Record>
        <Header>
            <RecordType>HDR</RecordType>
            <FileName>INPUT.TXT</FileName>
            <Data1>110011.24022014</Data1>
            <Data2>AM</Data2>
        </Header>
        <Invoice>
            <RecordType>INV</RecordType>
            <Data>1010021-PAY</Data>
            <Source>AMERP </Source>
            <Filler>0000000000</Filler>
            <MSG_TO_VND>
                <RecordType>MSG_TO_VND</RecordType>
                <Data>2100-ORDER</Data>
                <DescriptionType>NEW ORDER BY AM</DescriptionType>
            </MSG_TO_VND>
            <MSG_INV_DESC>
                <RecordType>MSG_TO_DESC</RecordType>
                <Data>PAY INV-1010021</Data>
                <DescriptionType>PAY INV BY AM</DescriptionType>
            </MSG_INV_DESC>
            <DIT>
                <RecordType>DIT</RecordType>
                <Data>1100.ORDER.NEW</Data>
                <DescriptionType>NEW ORDER 1100 24022014</DescriptionType>
                <MSG_TRN_DESC>
                    <RecordType>MSG_TRN_DESC</RecordType>
                    <Data>PAY ORDER 11001010021</Data>
                    <DescriptionType>NEW PAY ORDER BY AM</DescriptionType>
                </MSG_TRN_DESC>
            </DIT>
        </Invoice>
        <Trailer>
            <RecordType>TRLR</RecordType>
            <TotalCount>10.00000</TotalCount>
            <Filler>1111111111</Filler>
        </Trailer>
    </Record>
</ns0:MT_Input>

 

Our expected output flat file will be,

 

Output Flat file:

HDR INPUT.TXT 110011.24022014   AM

INV 1010021-PAY AMERP 0000000000

MSG_TO_VND 2100-ORDER NEW ORDER BY AM

MSG_TO_DESC PAY INV-1010021 PAY INV BY AM

DIT 1100.ORDER.NEW NEW ORDER 1100 24022014

MSG_TRN_DESC PAY ORDER 11001010021 NEW PAY ORDER BY AM

TRLR 10.00000 1111111111

 

SAP PI 7.3 and above: (Using Message Mapping)

 

For PI 7.3 and above, you can write java mapping directly in ESR in message mapping as explained in Sunil's blog post.

http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/03/13/write-java-mapping-directly-in-esr

I would recommend this approach because you can test the Java mapping right there, in the test tab of the message mapping.

 

I have written Java code to parse this sample input XML and convert it into the expected flat file. You can edit this code as per your requirements in NetWeaver Developer Studio or any Java IDE by importing the appropriate mapping libraries.

 

Sample code:

public void transform(TransformationInput in, TransformationOutput out) throws StreamTransformationException {
    try {
        // Instantiating the output stream to write the target message
        OutputStream os = out.getOutputPayload().getOutputStream();
        // Building a new document to parse the input stream, i.e. our source message
        DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
        DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
        Document doc = dBuilder.parse(in.getInputPayload().getInputStream());

        // Retrieving the header line of the flat file and writing it to the output stream
        HeaderParser hparser = new HeaderParser();
        String strHeader = hparser.parseHeaderString(doc);
        os.write(strHeader.getBytes());

        // Retrieving the invoice lines of the flat file
        InvoiceParser iparser = new InvoiceParser();
        String[] strInvoice = iparser.parseInvoiceString(doc);
        for (int i = 0; i < strInvoice.length; i++) {
            // Writing each invoice line to the output stream
            os.write(strInvoice[i].getBytes());
        }

        // Retrieving the trailer line of the flat file and writing it to the output stream
        TrailerParser tparser = new TrailerParser();
        String strTrailer = tparser.parseTrailerString(doc);
        os.write(strTrailer.getBytes());
        os.flush();
        os.close();
    } catch (Exception e) {
        throw new StreamTransformationException(e.getMessage());
    }
}

 

// Header parser class
public class HeaderParser {
    public String parseHeaderString(Document doc) {
        NodeList nList = doc.getElementsByTagName("Header");
        Node nNode = nList.item(0);
        Element element = (Element) nNode;
        // Forming the header line
        String strInput = element.getElementsByTagName("RecordType").item(0).getTextContent() + " "
            + element.getElementsByTagName("FileName").item(0).getTextContent() + " "
            + element.getElementsByTagName("Data1").item(0).getTextContent() + "  "
            + element.getElementsByTagName("Data2").item(0).getTextContent() + " "
            + "\n";
        return strInput;
    }
}

 

// Invoice parser class

public class InvoiceParser {
    public String[] parseInvoiceString(Document doc) {
        NodeList nlistInv = doc.getElementsByTagName("Invoice");
        String[] strInput = new String[nlistInv.getLength()];
        try {
            // Looping over the Invoice tags
            for (int i = 0; i < nlistInv.getLength(); i++) {
                // Converting the Invoice node into an element
                Element element = (Element) nlistInv.item(i);
                // Nodes under the Invoice tag
                NodeList n2 = element.getElementsByTagName("MSG_TO_VND");
                NodeList n4 = element.getElementsByTagName("MSG_INV_DESC");
                NodeList n6 = element.getElementsByTagName("DIT");
                // Fetching all text node values under the Invoice tag
                String recordType = element.getElementsByTagName("RecordType").item(0).getTextContent() + " ";
                String data = element.getElementsByTagName("Data").item(0).getTextContent();
                String source = element.getElementsByTagName("Source").item(0).getTextContent();
                String filler = element.getElementsByTagName("Filler").item(0).getTextContent();
                // Forming the first line of strInput[i]; maintain the order of values
                strInput[i] = recordType + data + source + filler + "\n";
                // MSG_TO_VND is assumed optional within the Invoice tag, so check its existence
                if (n2.getLength() > 0) {
                    // Converting the MSG_TO_VND node into an element
                    Element e2 = (Element) n2.item(0);
                    // Forming the next line of strInput[i]
                    strInput[i] = strInput[i]
                        + e2.getElementsByTagName("RecordType").item(0).getTextContent() + " "
                        + e2.getElementsByTagName("Data").item(0).getTextContent() + " "
                        + e2.getElementsByTagName("DescriptionType").item(0).getTextContent() + " "
                        + "\n";
                }
                // MSG_INV_DESC within the Invoice tag is assumed mandatory
                Element e4 = (Element) n4.item(0);
                strInput[i] = strInput[i]
                    + e4.getElementsByTagName("RecordType").item(0).getTextContent() + " "
                    + e4.getElementsByTagName("Data").item(0).getTextContent() + " "
                    + e4.getElementsByTagName("DescriptionType").item(0).getTextContent() + " "
                    + "\n";
                // Looping over the DIT tags within the Invoice
                for (int m = 0; m < n6.getLength(); m++) {
                    // Converting the DIT node into an element
                    Element e5 = (Element) n6.item(m);
                    NodeList ditn = e5.getElementsByTagName("MSG_TRN_DESC");
                    // Fetching all text node values under the DIT tag
                    String ditRecordType = e5.getElementsByTagName("RecordType").item(0).getTextContent();
                    String ditdata = e5.getElementsByTagName("Data").item(0).getTextContent();
                    String descriptiontype = e5.getElementsByTagName("DescriptionType").item(0).getTextContent();
                    // Forming the next line of strInput[i]; maintain the order of values
                    strInput[i] = strInput[i] + ditRecordType + " " + ditdata + " " + descriptiontype + " " + "\n";
                    // MSG_TRN_DESC is assumed optional within the DIT tag, so check its existence
                    if (ditn.getLength() > 0) {
                        // Converting the MSG_TRN_DESC node into an element
                        Element dite = (Element) ditn.item(0);
                        strInput[i] = strInput[i]
                            + dite.getElementsByTagName("RecordType").item(0).getTextContent() + " "
                            + dite.getElementsByTagName("Data").item(0).getTextContent() + " "
                            + dite.getElementsByTagName("DescriptionType").item(0).getTextContent() + " "
                            + "\n";
                    }
                }
            }
        } catch (Exception e) {
            System.out.println(e);
        }
        return strInput;
    }
}

 

// Trailer parser class

public class TrailerParser {
    public String parseTrailerString(Document doc) {
        NodeList nList = doc.getElementsByTagName("Trailer");
        Node nNode = nList.item(0);
        Element element = (Element) nNode;
        // Forming the trailer line
        String strInput = element.getElementsByTagName("RecordType").item(0).getTextContent() + " "
            + element.getElementsByTagName("TotalCount").item(0).getTextContent() + " "
            + element.getElementsByTagName("Filler").item(0).getTextContent() + "  ";
        return strInput;
    }
}
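The existence check used above for optional tags (testing `getLength() > 0` before calling `item(0)`) can be isolated into a self-contained sketch. The tag names here are hypothetical, not taken from the scenario:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Minimal sketch of the optional-tag guard: a child tag's value is read
// only when the tag exists. "Optional" is a hypothetical tag name.
public class OptionalTagDemo {

    static String describe(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
            NodeList opt = doc.getElementsByTagName("Optional");
            // getElementsByTagName never returns null, but item(0) is null
            // when the tag is absent, so guard with getLength() first.
            if (opt.getLength() > 0) {
                return ((Element) opt.item(0)).getTextContent();
            }
            return "";
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(describe("<Root><Optional>X</Optional></Root>")); // prints "X"
        System.out.println(describe("<Root/>")); // prints an empty line
    }
}
```

Without this guard, `item(0).getTextContent()` throws a NullPointerException as soon as the optional tag is missing from an input message.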

 

You can copy the above Java code directly and paste it into the Functions tab of the message mapping, under the "Attributes and Methods" area, as explained in Sunil's blog. Don't forget to import the appropriate libraries.

 

BLOG_Mapping.jpg

Now your message mapping will behave like a Java mapping. When the message mapping is executed, the Java code under the Functions tab runs first, followed by the graphical mapping, if one is present.

 

You can test this Java mapping by supplying the sample input XML on the source side of the test tab. The expected output flat file appears at the target.

BLOG_MappingTest.jpg

 

All SAP PI Versions: (Using Imported Archive)

 

If you are working on a PI version below 7.3, you can do this using an imported archive: import your Java mapping code into the ESR and then use it in an operation mapping.

 

Open any Java IDE (NWDS, Eclipse, NetBeans, etc.).

Create a Java project.

Create the FileParser.java, HeaderParser.java, InvoiceParser.java, and TrailerParser.java classes; none of them needs a main method.

BLOG_NWDS.jpg

Paste the following code into the respective Java classes.

 

FileParser:

import java.io.OutputStream;

import javax.xml.parsers.DocumentBuilder;

import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;

import com.sap.aii.mapping.api.AbstractTransformation;

import com.sap.aii.mapping.api.StreamTransformationException;

import com.sap.aii.mapping.api.TransformationInput;

import com.sap.aii.mapping.api.TransformationOutput;

 

public class FileParser extends AbstractTransformation {

    public void transform(TransformationInput in, TransformationOutput out) throws StreamTransformationException {
        try {
            OutputStream os = out.getOutputPayload().getOutputStream();
            DocumentBuilderFactory dbFactory = DocumentBuilderFactory.newInstance();
            DocumentBuilder dBuilder = dbFactory.newDocumentBuilder();
            Document doc = dBuilder.parse(in.getInputPayload().getInputStream());
            // Header
            HeaderParser hparser = new HeaderParser();
            String strHeader = hparser.parseHeaderString(doc);
            os.write(strHeader.getBytes());
            // Invoice
            InvoiceParser iparser = new InvoiceParser();
            String[] strInvoice = iparser.parseInvoiceString(doc);
            for (int i = 0; i < strInvoice.length; i++) {
                os.write(strInvoice[i].getBytes());
            }
            // Trailer
            TrailerParser tparser = new TrailerParser();
            String strTrailer = tparser.parseTrailerString(doc);
            os.write(strTrailer.getBytes());
            os.flush();
            os.close();
        } catch (Exception e) {
            throw new StreamTransformationException(e.getMessage());
        }
    }
}

 

HeaderParser:

import org.w3c.dom.Document;

import org.w3c.dom.Element;

import org.w3c.dom.Node;

import org.w3c.dom.NodeList;

 

public class HeaderParser {
    public String parseHeaderString(Document doc) {
        NodeList nList = doc.getElementsByTagName("Header");
        Node nNode = nList.item(0);
        Element element = (Element) nNode;
        String strInput = element.getElementsByTagName("RecordType").item(0).getTextContent() + " "
            + element.getElementsByTagName("FileName").item(0).getTextContent() + " "
            + element.getElementsByTagName("Data1").item(0).getTextContent() + "  "
            + element.getElementsByTagName("Data2").item(0).getTextContent() + " "
            + "\n";
        return strInput;
    }
}

 

 

InvoiceParser:

import org.w3c.dom.Document;

import org.w3c.dom.Element;

import org.w3c.dom.Node;

import org.w3c.dom.NodeList;

 

public class InvoiceParser {

    public String[] parseInvoiceString(Document doc) {
        NodeList nlistInv = doc.getElementsByTagName("Invoice");
        String[] strInput = new String[nlistInv.getLength()];
        try {
            // Looping over the Invoice tags
            for (int i = 0; i < nlistInv.getLength(); i++) {
                // Converting the Invoice node into an element
                Element element = (Element) nlistInv.item(i);
                // Nodes under the Invoice tag
                NodeList n2 = element.getElementsByTagName("MSG_TO_VND");
                NodeList n4 = element.getElementsByTagName("MSG_INV_DESC");
                NodeList n6 = element.getElementsByTagName("DIT");
                // Fetching all text node values under the Invoice tag
                String recordType = element.getElementsByTagName("RecordType").item(0).getTextContent() + " ";
                String data = element.getElementsByTagName("Data").item(0).getTextContent();
                String source = element.getElementsByTagName("Source").item(0).getTextContent();
                String filler = element.getElementsByTagName("Filler").item(0).getTextContent();
                // Forming the first line of strInput[i]; maintain the order of values
                strInput[i] = recordType + data + source + filler + "\n";
                // MSG_TO_VND is assumed optional within the Invoice tag, so check its existence
                if (n2.getLength() > 0) {
                    // Converting the MSG_TO_VND node into an element
                    Element e2 = (Element) n2.item(0);
                    // Forming the next line of strInput[i]
                    strInput[i] = strInput[i]
                        + e2.getElementsByTagName("RecordType").item(0).getTextContent() + " "
                        + e2.getElementsByTagName("Data").item(0).getTextContent() + " "
                        + e2.getElementsByTagName("DescriptionType").item(0).getTextContent() + " "
                        + "\n";
                }
                // MSG_INV_DESC within the Invoice tag is assumed mandatory
                Element e4 = (Element) n4.item(0);
                strInput[i] = strInput[i]
                    + e4.getElementsByTagName("RecordType").item(0).getTextContent() + " "
                    + e4.getElementsByTagName("Data").item(0).getTextContent() + " "
                    + e4.getElementsByTagName("DescriptionType").item(0).getTextContent() + " "
                    + "\n";
                // Looping over the DIT tags within the Invoice
                for (int m = 0; m < n6.getLength(); m++) {
                    // Converting the DIT node into an element
                    Element e5 = (Element) n6.item(m);
                    NodeList ditn = e5.getElementsByTagName("MSG_TRN_DESC");
                    // Fetching all text node values under the DIT tag
                    String ditRecordType = e5.getElementsByTagName("RecordType").item(0).getTextContent();
                    String ditdata = e5.getElementsByTagName("Data").item(0).getTextContent();
                    String descriptiontype = e5.getElementsByTagName("DescriptionType").item(0).getTextContent();
                    // Forming the next line of strInput[i]; maintain the order of values
                    strInput[i] = strInput[i] + ditRecordType + " " + ditdata + " " + descriptiontype + " " + "\n";
                    // MSG_TRN_DESC is assumed optional within the DIT tag, so check its existence
                    if (ditn.getLength() > 0) {
                        // Converting the MSG_TRN_DESC node into an element
                        Element dite = (Element) ditn.item(0);
                        strInput[i] = strInput[i]
                            + dite.getElementsByTagName("RecordType").item(0).getTextContent() + " "
                            + dite.getElementsByTagName("Data").item(0).getTextContent() + " "
                            + dite.getElementsByTagName("DescriptionType").item(0).getTextContent() + " "
                            + "\n";
                    }
                }
            }
        } catch (Exception e) {
            System.out.println(e);
        }
        return strInput;
    }
}

 

TrailerParser:

import org.w3c.dom.Document;

import org.w3c.dom.Element;

import org.w3c.dom.Node;

import org.w3c.dom.NodeList;

 

public class TrailerParser {
    public String parseTrailerString(Document doc) {
        NodeList nList = doc.getElementsByTagName("Trailer");
        Node nNode = nList.item(0);
        Element element = (Element) nNode;
        String strInput = element.getElementsByTagName("RecordType").item(0).getTextContent() + " "
            + element.getElementsByTagName("TotalCount").item(0).getTextContent() + " "
            + element.getElementsByTagName("Filler").item(0).getTextContent() + "  ";
        return strInput;
    }
}

 

You can edit the above code as per your requirements, and add additional classes if required.

Export all Java files in your project as a JAR.

Create an imported archive in the ESR.

Browse to the generated JAR file in the imported archive.

In the operation mapping, choose Java mapping and select the "FileParser" class (the class that implements the transform method).

BLOG_OperationMapping.jpg

With this variant you can only test end-to-end.

The output stream from the Java mapping is written directly by the receiver adapter.

The generated flat file will look like the one below.

BLOG_FlatFile.jpg

 

I hope this helps you achieve complex file content conversion using Java mapping in SAP PI.

 

Happy coding,

Ambuj Mishra

Outside-In Web service Providers Development, Deployment and Testing in SAP PI


Requirement:

 

Developing and deploying a provider web service on the SAP PI Java AS. This web service must be able to take an input XML from the Integration Server, perform some action, and return an output XML to the Integration Server.

MainDiag.jpg

 

This Web service will be implemented using EJBs (Enterprise JavaBeans) and published at an HTTP URL on the SAP PI Java AS.

SAP PI will connect to this web service via SOAP Adapter.

 

 

Providing Web Services:

You can provide Web services using any of the two approaches:

·         Inside-out

You start from an implementation which is already available and expose it as a Web service. You create a Web service from a session Enterprise Java Bean or from a pure Java class.

This approach is also called bottom-up.

·         Outside-in

You start from a WSDL document of a service. In this case, the framework generates the skeleton of the implementation bean and you have to provide your own implementation of the business methods. You can create outside-in Web services in session Enterprise Java Beans.

This approach is also called top-down.

 

Here we will explore the outside-in approach. That means we will use the WSDL document of our inbound service interface to generate the web service skeleton.

 

 

Development Steps:

 

 

1. Open NWDS (NetWeaver Developer Studio).

      Create an EJB project.

1.jpg

    Add the project to an EAR.

2.jpg

Open the ESR.

Select the inbound service interface for which you need to generate the web service.

Go to the WSDL tab. Export the WSDL and save it to your machine.

 

Open NWDS.

Select the EJB project which you created in the steps above->Right-click->New->Others->Web Services->Web Service->OK

Browse to your WSDL in the service definition.

 

OR

 

2. Connect your NWDS to the PI 7.3 server.

       Go to Window->Preferences->SAP AS JAVA->Add->Provide host name and instance number->OK.

3.jpg

3. Open the Enterprise Service Browser in NWDS.

       Go to Window->Show View->Others->Web Services->Enterprise Service Browser->OK

4.jpg

4. Connect to the ESR.

        Click the Connect to ESR button->Provide PI server credentials.

5.jpg

5. Generate the Java Bean Skeleton for the inbound service interface.

        Navigate to your inbound service interface->Right-click->Generate Java Bean Skeleton.

6.jpg

 

6. Select Web Service Type “Top down java web service” and Level of service generation “Develop”->Click Next.

7.jpg

 

7. Select Update WSDL.

8.jpg

8. Select Resolve Collisions Automatically->Click Next for more customization or click Finish.

9.jpg

9. You can customize the service, port type, port, and binding names here->Click Next for more customization or click Finish.

10.jpg

10. You can customize the package here->Click Finish.

11.jpg

11. Your EJB project structure will now look something like the one below.

12.jpg

12. Open the implementation bean class.

               Define the TransportBindingRT annotation and write the processing logic of your web service.

13.jpg

    Note:

    The TransportBindingRT annotation defines the HTTP URL of your web service. The same URL will be used as the Target URL in the receiver SOAP communication channel in PI.

    For example:

   @TransportBindingRT(AltHost="pidserver.net",AltPort=51100,AltPath="RetriggerBean")

   Your Web service’s HTTP URL will be http://pidserver.net:51100/RetriggerBean

   Your Web service’s WSDL URL will be http://pidserver.net:51100/RetriggerBean?wsdl
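For orientation, the implementation bean from the screenshot above typically has the following shape. This is an illustrative skeleton only: the session bean class, the endpoint interface, and the request/response types are generated from your WSDL, so the names used here (`RetriggerBean`, `RetriggerIn`, `RetriggerRequest`, `RetriggerResponse`) are made up, and the import path of `TransportBindingRT` may vary by NetWeaver release. The fragment requires the SAP NetWeaver annotation libraries and is not compilable standalone.

```
import javax.ejb.Stateless;
import javax.jws.WebService;
// SAP-proprietary runtime-configuration annotation (package may differ by release)
import com.sap.engine.services.webservices.espbase.configuration.ann.rt.TransportBindingRT;

@Stateless
@WebService(endpointInterface = "com.example.RetriggerIn") // generated SEI name (illustrative)
@TransportBindingRT(AltHost = "pidserver.net", AltPort = 51100, AltPath = "RetriggerBean")
public class RetriggerBean implements RetriggerIn {

    // Business method generated from the WSDL operation; fill in your logic.
    public RetriggerResponse retrigger(RetriggerRequest request) {
        RetriggerResponse response = new RetriggerResponse();
        // ... perform the action and populate the response ...
        return response;
    }
}
```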

 

 

Deployment Steps:

 

1.      Select your EJB project->Right-click->Run On Server->select the server to deploy to->Click Finish.

 

14.jpg

 

 

 

Testing Steps:

 

   Go to the start page of the SAP NetWeaver Application Server.

15.jpg

    Go to the Web Services Navigator.

    Provide your WSDL URL.

 

    http://pidserver.net:51100/RetriggerBean?wsdl

 

16.jpg

  Follow the steps and provide appropriate input to test your web service.
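If you want to test outside the Web Services Navigator, the request it sends is an ordinary SOAP 1.1 envelope. Below is a minimal sketch of building and sanity-checking such an envelope; the operation name `Retrigger` and the field `MessageId` are hypothetical placeholders (your actual body element comes from your service interface), and sending it to the endpoint is left out.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;

public class SoapEnvelopeSketch {

    // Builds a minimal SOAP 1.1 request envelope for a hypothetical operation.
    public static String buildEnvelope(String messageId) {
        return "<soapenv:Envelope xmlns:soapenv=\"http://schemas.xmlsoap.org/soap/envelope/\">"
             + "<soapenv:Body>"
             + "<Retrigger><MessageId>" + messageId + "</MessageId></Retrigger>"
             + "</soapenv:Body>"
             + "</soapenv:Envelope>";
    }

    // Parses the envelope to verify it is well-formed XML with a SOAP Envelope root.
    public static boolean isWellFormed(String xml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
            return doc.getDocumentElement().getTagName().endsWith("Envelope");
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String envelope = buildEnvelope("4711");
        System.out.println(isWellFormed(envelope)); // prints true for a valid envelope
    }
}
```

A check like this is handy when hand-crafting requests with an HTTP client instead of the generated test page.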

 

 

 

Thanks & regards,

Ambuj Mishra
