Channel: SCN : Blog List - Process Integration (PI) & SOA Middleware

Starting with Hana Cloud Integration? Create a simple integration flow (iFlow)!


In my previous blog I wrote about the ‘problems’ I encountered when starting with HCI. I hope you were able to bypass those problems, or otherwise found a solution for them!
In this blog I would like to show how to create a simple integration flow (iFlow).

In this example I will show you how to get information out of a SuccessFactors instance, how to send it to a public web service, and how to process the results. I will retrieve the current weather for a city whose ZIP code we collect from SuccessFactors, and send the result via email!

 

The following prerequisites are needed, and will not be further discussed in this post:

  • an HCI tenant
  • a keystore with a SuccessFactors certificate
  • an artifact with SuccessFactors credentials
  • a keystore with an email certificate
  • an artifact with email credentials

 

Configuring the sender

01-Overview.png

This is what we are going to create.

 

 

First, we create a new integration project in Eclipse and configure the sender. I named the sender SuccessFactors and chose ‘Basic Authentication’ in the properties. The sender channel will be configured like this:

 


02-AdapterType.png

03-SenderChannel.png
These are the connection settings for the SuccessFactors channel.
Credential Name * is the name of the credentials you deployed earlier to your tenant.

 

Now we are going to set the operation details. Click on Model Operations and fill in your SuccessFactors credentials. Because we want the ZIP code of a person, we select the entity ‘User’ and choose the field zipCode.

 

04-ModelOperations.png

05-ModelOperations2.png

 

 

In the demo environment we are using there are a lot of users, with a lot of zip codes. We want to use only one, so we are going to set a filter. The ID of the user whose zip code I want is USR-4, so we create a filter like this:

 

06-ModelOperations3.png

 

 

That is almost it for configuring the sender channel. We only need to fill in the Scheduler tab. For testing purposes I always choose ‘Run Once’.

 

Create the WSDL file

 

So now we have set up the sender from which we will get our input. The next step is to create a WSDL file from the web service. There are several ways to do that; here is what I did.

Open the WSDL page of the service you want to use (in this example: http://wsf.cdyne.com/WeatherWS/Weather.asmx?wsdl ) and save the page. It will be saved as Weather.asmx.xml. Select this file and drag it into Eclipse. Make sure you drop the file in the “src.main.resources.wsdl” folder. Now rename the file to Weather.wsdl and double-click it to see the result.
We are going to use this file in the mapping and in the SOAP call later on.

 

Configure the mapping

 

Now it is time to add some mapping to our iFlow. Create a new message mapping and add it to your flow. When we configured the sender channel, the model operation was saved as a file (probably something like UserEntityquerySync0.xsd), which is also in your resources.wsdl folder. Choose this file as the source element.

As for the output element, you can select the wsdl file we created earlier and select the global element of your choosing. In this example we go with GetCityWeatherByZip.

 

07- MessageMapping.png

After setting the source and target go to the Definition tab and map the zipCode we got from SuccessFactors to the ZIP the web service requires.

08-MessageMapping.png

 

 

 

 

Calling the web service

 

After we have managed to get the zip code from SuccessFactors and map it to the input the web service demands, we can call the web service. For that you need to add a service call to your flow.

 

09-RequestReply.png

 

The Request-Reply box is added to your flow, and from here you can add a message flow to a Receiver.

 

10-RequestReply.png

 

 

To configure the channel, double click the message flow and choose SOAP as Adapter Type.

11-SoapChannel.png

 

On the ‘Adapter Specific’ tab you can enter the details of the SOAP service you want to call. The address is the same address from earlier in this blog, where we downloaded the WSDL. The URL to WSDL can be found by clicking ‘Browse’ and selecting the WSDL in the resources.wsdl folder.

 

12-SoapChannel.png

13-SoapChannel.png

 

 

In this example we go with GetCityWeatherByZip.

 

 

Create the converter

 

To show you how to use a converter we will use an XML to CSV converter in our iFlow.
Select the converter from your palette and drag it to your flow. Right-click the box and choose ‘Switch to XML to CSV converter’.

 

14-XMLconverter.png

15-XMLconverter.png

 

Now we need to point the converter to the incoming flow and tell it how to process the data. Go to the properties tab of the converter and define the parameters.

 

16-XMLconverter.png

 

 

Send the outcome by mail

 

If everything is set up correctly, which we will see in a moment, the outcome will be the weather information for a city. We want to send this information via email. For that we need to create a new receiver and choose the Mail adapter.

 

17-MailAdapter.png

 

Under ‘General’ you can configure your mail settings. In the address field you can fill in your SMTP server, and the Credential Name is the name of the artifact you deployed earlier.

If you want the data shown in the message body, you can enter “${in.body}” in the mail body.

 

18-MailAdapter.png

 

If you want an attachment with the data you can set it up by adding the parameters for an attachment.

 

19-MailAdapter.png

 

 

 

Run it

 

And now your iFlow should look something like this:

 

01-Overview.png

 

Deploy your Integration Content and watch the Message Monitoring of your tenant to see if it deployed correctly.

 

20-RunTest.png

 

 

If instead of COMPLETED you see FAILED, there is something wrong (you might have figured that out already). This is a great moment to use the Trace feature (if you don’t know how to activate tracing, check my previous blog here).

 

21-RunTest.png

 

Your iFlow will be decorated with envelopes. These envelopes can be selected, and their content can be seen in the ‘Properties’ tab. This is a great way to see what content is moved from one ‘box’ to another and where it goes wrong.

 

22-RunTest.png

 

 

 

 

That’s all! You have now created a simple integration flow using several adapters, mappings and converters. I hope this blog answered more questions than it raised. If not, please do not hesitate to contact me or leave a comment below.

 

Bob van Rooij

 

 

Blog 1: Starting with Hana Cloud Integration? Keep this in mind!

Blog 2: Starting with Hana Cloud Integration? Create a simple integration flow (iFlow)!


Explore/Troubleshoot Component Based Message Alerting in PI


These are my findings and troubleshooting notes from configuring CBMA on a dual-stack SAP PI 7.31 server.

 

A little bit about alerts in PI:

Alert configuration is used to configure alerts in your PI system: the system informs you of errors during message processing. You can receive the alert by e-mail, fax or SMS.

There are two ways to configure alerts in PI.

1.       Classic using ALRTCATDEF

This is the classical approach to configuring e-mail alerts in SAP PI. Using this method you configure an alert category and container elements, which define what should be passed in the email when an alert is triggered.

2.       CBMA

Component-Based Message Alerting (CBMA) is the new way of sending alerts on SAP PI (either single or dual stack) without the use of any additional components. CBMA is made up of three parts:

1.jpg

a) Central configuration for creating alerts - alerts can be created in Integration Directory or in NWA

b) Alert Engine - which evaluates the rules and creates alerts.

c) Alert receivers (consumers) - components which can receive alerts from the Alert Engine

This article is about Component-Based Message Alerting. It will demonstrate how to create a typical alert configuration with an e-mail consumer. When the alert gets generated, an e-mail with alert data will be distributed to the alert receivers.

 

Types of errors where you can apply alerts:

Usually alerting applies to every type of scenario, but to be specific:

There were various Proxy-to-SOAP synchronous scenarios communicating via PI in one of my projects. These scenarios generate two kinds of errors in PI.

  1. Application error
  2. System error

In the case of an application error from the receiver SOAP application, we can use a fault message type and fault mapping in PI to send the error back to the proxy and handle it as an exception.

In the case of system errors from the receiver SOAP application, there is no standard way to send the error back to the proxy.

As part of my requirement to handle these system errors, we needed to trigger an alert email to the developer team every time we encountered a system error in PI from the receiver application.

 

How to configure:

At this point we are all aware that PI now comes with two types of installation (single and dual stack). A single stack has only a Java instance of the PI application server (AS), while a dual stack has both Java AS and ABAP AS instances. There are plenty of documents and blog posts exploring this.

 

I always prefer CBMA on AS Java, as it's simple, straightforward and lightweight. And maybe in two or three years you won't find any PI ABAP stack to work with.

 

CBMA on AS Java can be configured by following this link:

Michal's PI tips: Component-Based Message Alerting

 

CBMA on AS ABAP can be configured by following this link:

Component Based Message Alerting On As Abap

 

After following the above steps on AS ABAP, my email was not being triggered. As I found out, a few steps were missing, which I am covering here.

 

List of all steps:

Integration Engine Configuration

Alert Rule Creation

Manage Alert Rule

Email Distribution List

Email Generation Variant

Alert Collector Job

Email Sender Job

Monitoring Configured Alert

Testing Alerts

 

Email sender job:

          This job is responsible for pushing all configured and collected alert emails to the consumers’ email inboxes.

          Go to t-code SCOT.

          Create a Send Job. Choose Schedule job for INT for emails.

              1.jpg

           1.jpg

               Press continue.

            1.jpg

              The following ABAP program corresponds to this job.

          1.jpg

             Configure the start condition for this job as per your requirement.

          1.jpg

Monitoring Configured Alert:

           Not a mandatory step, but it could be useful for monitoring.

           We can monitor the job which we have configured in SM37.

       1.jpg

Test Result:

           Whenever any message fails for the interfaces mentioned in the alert rule, an e-mail is triggered to the registered distribution list. Below is a sample alert email.

            Sample E-mail:

           1.jpg

Hopefully this blog has given you a basic idea of alerts and all the necessary configuration steps to follow.

 

Reference:

Component-Based Message Alerting - Administering Process Integration (PI) - SAP Library

Alert Configuration - Process Integration Monitoring - SAP Library

TechEd Presentation on Best Practices in SAP Integration

SAP Best Practices

 

I am proud to announce that I am starting a series of posts on the topic of best practices in the area of SAP.

 

I submitted an abstract for the SAP TechEd Conference Las Vegas (October 19-23, 2015), and I have recently learned that it had been approved.

 

My first thought was: “Super! Great!” Then the next was: “Oh, crap, I need to prepare that presentation!” That is going to be a lot of work. On the other hand, there is a lot of great material that I can share, and it will make it easier to create the questions that customers need to go through when starting new projects.

 

The title is:

 

SAP Best Practices Integration with SAP PO and Other Solutions

 

My abstract is the text below:

 

“Everybody asks for a set of best practices on all integration projects. But does this set of proper procedures really exist? There will be a walk-through of the different guidelines that are in existence today. The presentation will cover how I have seen successful integration setups. The following topics will be covered: adapter development, custom development to support functions, naming conventions, integration patterns, and multi-platform strategies. We will also discuss the organizational changes and setups that have the biggest impact on the process of establishing and enforcing best practices.”

 

I guess I have time until the end of August to complete the presentation.

 

I hope that you will help by sharing ideas related to best practices in the area of SAP integration. Mine is just one view of it: I have been in different organizations, and I have seen that there’s always room for improvement.

 

I think the following main points must be covered:

  • Reasons to establish and enforce best practices
  • The concept of best practices is dead
  • Naming standards
  • Development guidelines
    • What should they contain
    • On-prem vs Cloud integrations
    • Documentation
  • B2B
    • What to do
    • Custom vs special mapping
  • Process and User Interaction
    • User access
    • What can be done in BPMN (Business Process Model and Notation)
  • Custom development
    • When is it okay to use extra development
    • NWDI (SAP NetWeaver Development Infrastructure)
    • Adapters
    • BPMN

 

My hypothesis is that there is not one single set of best practices. The main reason behind this is that each customer takes different steps into consideration in order to make the best use of their time and resources. I believe customers need to know all the different aspects of the issue in order to reach a conclusion; knowing what worked for others can be of great assistance. I'll be creating a list of different things that should be discussed when starting a new project.

 

I do need your help, since I’m a one man army trying to collect all this information. Over the next few months I’ll be sharing my opinions on what I see as best practices. Please let me know whether you agree or disagree, so I can improve my presentation. Do you think there are other important topics (related to the concept of best practices) that need to be discussed? If so, feel free to share your ideas! Together we can create valuable content, and help other developers while we're at it.

 

This post is cross-posted to Best Practices in SAP Integration

Automatic update of SXI_CACHE in SAP back-end from Single Stack PO 7.4


Purpose:

 

My client was using a PO 7.4 single-stack system and had a requirement that whenever the alert rule is changed in the PO system's Integration Directory, the same change should automatically be reflected in transaction SXI_CACHE of the SAP ECC system. There should be no need to do a delta or complete cache refresh manually in SXI_CACHE.

 

The following configurations were done to make this happen.

 

Steps in SAP back-end system (eg. SAP ECC):

 

1) In SM59, create an RFC destination "INTEGRATION_DIRECTORY_HMI" of type "H".

  • Under "Technical Settings" tab
    • Target host: Enter the fully qualified host name of PO
    • Service No: Enter the service number of PO
    • Path Prefix: /dir/CacheRefresh

 

1.PNG

 

  • Under "Logon & Security"
    • Logon with User: Select Basic Authentication radio button
    • User: Enter the technical user created in the PO system having at least the roles "SAP_SLD_CONFIGURATOR" & "SAP_XI_IS_SERV_USER".

               For example: User PIISTDO where TDO was the PO system.

    • Password: Maintain the password.

 

2.PNG

 

  • Under "Special Options" tab
    • Specify Timeout: 30000.
    • Compression: Inactive
    • Compressed Response: NO
    • Accept Cookies: Yes (All)

3.PNG

 

2) Create a system user "PICACHEUSER". It should have the following roles:

    • SAP_BC_WEBSERVICE_PI_CFG_SRV
    • SAP_SLD_CONFIGURATOR
    • SAP_XI_CACHE_SERV_USER.


3) In transaction SM59, create an RFC destination SAPXICACHE<sy-client> of type 3. For example: SAPXICACHE110 for client 110.

  • Under "Technical Settings" tab
    • Target host: Enter the fully qualified host name of ECC system
    • Service No: Enter the service number of ECC system
    • Save to database as: IP address

 

4.PNG

 

  • Under "Logon & Security" tab
    • Client: Enter the client number of ECC system.
    • User:PICACHEUSER created in step 2.
    • Password: Maintain the password.

 

5.PNG

 

  • Under "Unicode" tab

6.PNG

 

  • Under "Special Options" tab

7.PNG

 

4) Create a system user "PIDIRUSER". It should have the following roles

    • SAP_BC_WEBSERVICE_PI_CFG_SRV
    • SAP_SLD_CONFIGURATOR
    • SAP_XI_ID_SERV_USER
    • SAP_XI_ID_SERV_USER_MAIN.



Step in PO:


1) Go to PO -> Integration Directory and open the ECC business system.


2) In the business system, go to "Logon Data" tab and select "support distribution of configuration" and enter the Logon Data for user "PIDIRUSER".


8.PNG



Please note: If the PI system is dual stack (ABAP + Java) and is supposed to receive cache updates, then steps 1-4 mentioned for SAP ECC need to be done for the PI system as well, i.e. creation of the RFC destination "INTEGRATION_DIRECTORY_HMI", the system user "PICACHEUSER", the RFC destination SAPXICACHE<sy-client> and the system user "PIDIRUSER".


References:


http://help.sap.com/saphelp_nw73ehp1/helpdata/en/48/a9b9957e28674be10000000a421937/content.htm

http://help.sap.com/saphelp_nw73ehp1/helpdata/en/a6/54a40db5db4c6591670186cd72ceff/content.htm

http://help.sap.com/saphelp_nw73ehp1/helpdata/en/48/cfd1d49bf23e49e10000000a421937/content.htm?frameset=/en/48/ced2c918d3424be10000000a421937/frameset.htm&current_toc=/en/5d/ab866dd6164363a9c29986fcce6716/plain.htm&node_id=139

http://help.sap.com/saphelp_nwmobile71/helpdata/en/8f/770f41218ff023e10000000a155106/content.htm




External Control of PI Communication Channel from ECC System


Requirement:
In a complex requirement, the ECC system was creating multiple files, and PI should start picking up those files only once all of them have been placed.

 

Initial Approach:

Initially we thought of going with the trigger-file mechanism, where we define an additional file in the File List parameter with the same name as the source file (only the file extension needs to be different).

 

Limitation:

It works only with a single file and a static file name.

 

On further digging we realised that we can leverage the External Control option provided by SAP in such situations for starting/stopping the channel.

 

Approach:

Step 1: We need to set "External Control On"

 

Image1.PNG

After switching on External Control for the communication channel, we can control this communication channel externally.

 

Step 2: After this we have to write an executable report in the ECC system that can start and stop the communication channel.

 

 

*&---------------------------------------------------------------------*
*& Report  Z_COMM_CHANNEL_START_STOP
*&
*&---------------------------------------------------------------------*

REPORT  z_comm_channel_start_stop.

DATA: v_url  TYPE string,
      client TYPE REF TO if_http_client.
DATA: response_code TYPE sysubrc,
      response_text TYPE string.
DATA: fields_tab    TYPE tihttpnvp,
      status_code   TYPE string,
      status_reason TYPE string,
      number        TYPE i.
DATA: w_result   TYPE string,
      result_tab TYPE TABLE OF string,
      result_wa  LIKE LINE OF result_tab.

" Set v_url as per your requirement (whether you want to start the channel, stop it, or just get its status)
" Get communication channel status
*v_url = '/AdapterFramework/ChannelAdminServlet?party=*&service=*&channel=Test_CommChannel_ExtCtrl&action=status'.

" Start communication channel
v_url = 'http://host:port/AdapterFramework/ChannelAdminServlet?party=*&service=*&channel=CC_File_Merch_Recv&action=start'.

" Stop communication channel
"v_url = '/AdapterFramework/ChannelAdminServlet?party=*&service=*&channel=Test_CommChannel_ExtCtrl&action=stop'.

CALL METHOD cl_http_client=>create
  EXPORTING
    host               = 'host'         "PI system name
    service            = 'port'         "PI port number
*   proxy_host         =
*   proxy_service      =
*   scheme             = schemetype_http
*   ssl_id             =
*   sap_username       =
*   sap_client         =
  IMPORTING
    client             = client
  EXCEPTIONS
    argument_not_found = 1
    plugin_not_active  = 2
    internal_error     = 3
    OTHERS             = 4.

*IF sy-subrc <> 0.
*  MESSAGE e000 WITH sy-subrc.
*ENDIF.

* Set header fields
CALL METHOD client->request->set_header_field
  EXPORTING
    name  = '~request_method'
    value = 'POST'.

CALL METHOD client->request->set_header_field
  EXPORTING
    name  = 'Content-Type'
    value = 'application/xml'.  "; charset=UTF-8

* Set request protocol
CALL METHOD client->request->set_header_field
  EXPORTING
    name  = '~server_protocol'
    value = 'HTTP/1.0'.

* Update URL
CALL METHOD client->request->set_header_field
  EXPORTING
    name  = '~request_uri'
    value = v_url.

* Disable logon popup
client->propertytype_logon_popup = client->co_disabled.
CALL METHOD client->authenticate
  EXPORTING
    username = ''        "PI system user name
    password = ''.       "PI system password

cl_http_utility=>set_request_uri( request = client->request
                                  uri     = v_url ).

* Send HTTP request to server
CALL METHOD client->send
  EXCEPTIONS
    http_communication_failure = 1
    http_invalid_state         = 2
    http_processing_failed     = 3
    OTHERS                     = 4.
IF sy-subrc <> 0.
  CALL METHOD client->get_last_error
    IMPORTING
      code    = response_code
      message = response_text.

  MESSAGE i000(sr) WITH response_text.

  EXIT.
ENDIF.

* Get HTTP response from server
CALL METHOD client->receive
  EXCEPTIONS
    http_communication_failure = 1
    http_invalid_state         = 2
    http_processing_failed     = 3
    OTHERS                     = 4.
IF sy-subrc <> 0.
  CALL METHOD client->get_last_error
    IMPORTING
      code    = response_code
      message = response_text.

  status_code   = client->response->get_header_field( '~status_code' ).
  status_reason = client->response->get_header_field( '~status_reason' ).
  CONCATENATE response_text '(' status_code status_reason ')'
    INTO status_reason SEPARATED BY space.

  MESSAGE i000(sr) WITH status_reason.

  EXIT.
ENDIF.

* Get header field contents such as status code, reason, etc.
"CALL METHOD client->response->get_header_fields
"  CHANGING
"    fields = fields_tab.
CLEAR: w_result.
w_result = client->response->get_cdata( ).
REFRESH result_tab.
SPLIT w_result AT cl_abap_char_utilities=>newline INTO TABLE result_tab.

LOOP AT result_tab INTO result_wa.
  WRITE / result_wa.
ENDLOOP.

*&---------------------------------------------------------------------*

 

 

Step 3: This report needs to be called at the end of the report from which the multiple files have been created.

 

Through this approach we can easily control an SAP PI channel and start/stop it as required from the ECC end.
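As a side note, the ChannelAdminServlet used by the report above is a plain HTTP endpoint, so the same start/stop/status calls can also be issued from any other HTTP client if the channel ever needs to be controlled from outside ECC. Below is a minimal, hedged Java sketch; the host, port, credentials and channel name are placeholders to be replaced with your own values.

----------------------------------------------------------------------------------------------------------------------------------
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class ChannelControlClient {

    public static void main(String[] args) throws Exception {
        // Placeholders: PI host/port, credentials and channel name must be replaced
        String url = "http://<pi-host>:<port>/AdapterFramework/ChannelAdminServlet"
                   + "?party=*&service=*&channel=CC_File_Merch_Recv&action=start";

        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        conn.setRequestMethod("POST");
        String auth = Base64.getEncoder()
                .encodeToString("<user>:<password>".getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        // The servlet answers with a short status text for the channel
        System.out.println("HTTP status: " + conn.getResponseCode());
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
----------------------------------------------------------------------------------------------------------------------------------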

 

Regards,

Pradeep

Java Mapping : Base64 Zipped CSV to XML Conversion


Introduction

 

In a recent client project I had a requirement where the incoming file was a CSV file, zipped and Base64 encoded. The source system was Ariba and the target was ECC.

My aim in writing this blog is to provide you with a reusable Java mapping for similar requirements.

 

Description

 

Ariba sends the file to PI using a web service. The encoded file is within a tag named "Header".

The following actions need to be performed by PI before sending the file to ECC for further processing:

 

1. Decode base64 file

2. Unzip the content

3. Convert the unzipped CSV file into XML

 

In order to achieve the above I have created a Java mapping.

 

Incoming Payload


The encoded file is within the "HeaderExport" tag.

 

Input paylaod.png

 

CSV file after unzipping

 

UnzippedCSV.png

 

 

Source Code : Please see attachment.
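Since the attached source code is not reproduced here, below is a minimal, hedged sketch of how such a mapping can be structured. It assumes Java 8+ (for java.util.Base64), a single CSV file inside the ZIP archive, a comma-separated file with a header row, and hypothetical output tag names (Records/Record); the class name and tag handling would need to be adapted to the actual payload and target structure, and XML escaping of field values is omitted for brevity.

----------------------------------------------------------------------------------------------------------------------------------
import java.io.*;
import java.util.Base64;
import java.util.zip.ZipInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import com.sap.aii.mapping.api.*;

public class Base64ZipCsvToXml extends AbstractTransformation {

    @Override
    public void transform(TransformationInput in, TransformationOutput out)
            throws StreamTransformationException {
        try {
            // 1. Read the incoming payload and pick the Base64 string out of the HeaderExport tag
            //    (adjust the tag name / namespace handling to your actual payload)
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(in.getInputPayload().getInputStream());
            String encoded = doc.getElementsByTagName("HeaderExport").item(0).getTextContent().trim();

            // 2. Decode Base64 (MIME decoder tolerates line breaks) and unzip the first entry
            byte[] zipped = Base64.getMimeDecoder().decode(encoded);
            ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(zipped));
            zis.getNextEntry();
            ByteArrayOutputStream csvBytes = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int len;
            while ((len = zis.read(buf)) > 0) {
                csvBytes.write(buf, 0, len);
            }
            zis.close();

            // 3. Convert the CSV content into a simple XML document
            //    (first line = column names, remaining lines = data rows)
            BufferedReader reader = new BufferedReader(new StringReader(csvBytes.toString("UTF-8")));
            String[] columns = reader.readLine().split(",");
            StringBuilder xml = new StringBuilder("<?xml version=\"1.0\" encoding=\"UTF-8\"?><Records>");
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.trim().isEmpty()) continue;
                String[] values = line.split(",", -1);
                xml.append("<Record>");
                for (int i = 0; i < columns.length && i < values.length; i++) {
                    String tag = columns[i].trim();
                    xml.append("<").append(tag).append(">")
                       .append(values[i].trim())
                       .append("</").append(tag).append(">");
                }
                xml.append("</Record>");
            }
            xml.append("</Records>");

            out.getOutputPayload().getOutputStream().write(xml.toString().getBytes("UTF-8"));
        } catch (Exception e) {
            throw new StreamTransformationException("Base64/ZIP/CSV conversion failed: " + e.getMessage(), e);
        }
    }
}
----------------------------------------------------------------------------------------------------------------------------------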

 

Output of the Mapping

 

Output.png

 

outputPayload.png

Best Practices: Naming Conventions for SAP Integration


This is part 2 of my series on best practices. You might also want to read the reason for creating this series on SAP PI/PO best practice. In this post, I will share further topics I wish to cover in the series. Furthermore, the contents of a best practices document will be discussed in depth.

 

InvoiceCreation_IA-300x300.png

An essential element of the developer’s handbook is the naming convention, which covers the naming of all objects used in an integration. Naming is important because it cannot be changed without a lot of work. There is no easy way to rename objects because of the many places where the names have been used.

 

The downside of choosing object names – and then having to change them – is that you first realize how they work only after you have made your way through a considerable chunk of your projects, and after so much invested time you cannot change the names without a lot of waste.

 

IFG (The International Focus Group for Integration) has the only updated naming convention. New information has been added to keep up with new data types. Ask your local user group for the document.

SAP also has its view on naming conventions. The document was created in 2009, but it does contain the most used objects that are still the main focus of SAP users: http://scn.sap.com/docs/DOC-16239

 

Let these documents work as a way to get leads for your own custom document. It does not have to be as lengthy – it can be a 3 pages long summary with some of the names that you will be using.

 

One big issue arises if you want to have prefixes on all objects, like OM_ , before operation mapping. It may make it easier for new developers to understand what has been developed. However, it is just extra information that I do not consider useful, so I would skip it if possible.

 

I recommend using the business object that you are working with. For example, Invoice could be that business object. Then you add different prefixes and postfixes, like InvoiceCreation_IA or whatever you prefer.

 

The other important discussion is about where to place the objects. The general SOA (service-oriented architecture) approach was to create two interface components, e.g. one for the SAP ERP (Enterprise Resource Planning) system, and one for the third-party system. They would have information about the interfaces; sometimes they could contain some logic (but not most of the time). Then there could be mappings in a PI (Process Integration) or Application component.

 

The other way is to have one software component per software that you want to integrate, and then place mappings and an interface in this. This may make some things much easier to transport and develop on. There is not much reuse in this approach, but in general it is difficult to get to the “Generic” components that can be reused all over again.

 

There cannot be one document that covers all cases. Sometimes developers have to make intelligent decisions to get away from the naming convention, enabling a method of differentiation. You will not be able to handle all cases in the beginning, or you will probably spend too much time on them.

 

I think the biggest mistake that companies make is forgetting about the naming convention, and hoping they can make it right along the way. As it turns out, it can be quite expensive to change something at a later point.

 

In other situations, they just take one of the standard templates and use it as their own. If they do not have local ideas, it becomes too difficult to make decisions, and developers have to figure things out on their own.

 

My best advice is to have a workshop with some consultants, preferably with different backgrounds, when you have an idea of what you want to accomplish, while also possessing enough knowledge on what types of integration will be the primary area of expertise your project requires.

What’s your opinion on naming conventions? Would you make some changes halfway through a project?

This post originally appeared on http://picourse.com/best-practices-naming-conventions

REST Sender Adapter Poll (REST API--->PI--->JDBC)


Dear All,

 

I struggled a little with the REST adapter when fetching data from a REST web service using the SAP standard REST adapter, but I achieved this requirement with help from the SAP documentation, Alexander's blog and an SCN thread:

 

 

http://help.sap.com/saphelp_nw74/helpdata/en/d4/ee3eca7baf436b996e2473c04809b4/content.htm

PI REST Adapter - Polling a REST API

Restful webservice-source

Prerequisites for this REST sender poll:

 

PI 7.31 SP 16 (or) PO 7.4 SP11

The SAP XI AF Java component should be at the latest patch level 1 (available as of today), and please make sure all dependent components are updated when you update SAP XI AF.

 

 

As you know, from PI 7.31 SP 14 (or PO 7.4 SP 09) SAP has introduced standard REST features, and they keep on adding to them. The latest version is available in PI 7.31 SP 16 (or PO 7.4 SP 11) and contains the REST sender adapter poll.

 

This feature exactly matches my requirement (REST API --> PI --> JDBC), where PI uses the GET operation to fetch data from the REST API.

 

My REST API returns JSON, so PI must convert it into XML for mapping, routing, etc. So I used the REST sender poll adapter which is available in PI 7.31 SP 16 (or PO 7.4 SP 11).

My sample JSON Format


{"projects":[{"PKZ":"AO000","projectState":"yes","projectName":"xyz"},

{"PKZ":"BE000","projectState":"no","projectName":"abc"}]}

 

 

My JSON structure is very big, but I have shortened it to just a few fields with values. As you can see in my JSON, "projects" is the root tag and there are two data sets.


According to the JSON format, I have created the data type and message type in the ESR (the data type structure is very important; you need to create it correctly according to your JSON structure).


RestAPIDatatypeon ESR.jgp.jpg


As shown in the screenshot, 'projects' should be 0..unbounded, because we have multiple data sets in the JSON.


Important note: I heard from some blogs that converting JSON with multiple data sets to XML is not possible, but in PI 7.31 SP 16 (or PO 7.4 SP 11) it is possible to convert multiple data sets, so we do not need any custom adapter modules.



Note that I have only added a screenshot of the data type created in the PI ESR. I do not cover the JDBC data type structure, message types, message interfaces, mapping, etc. You can create these as usual, as in a normal integration scenario.



Integration Directory settings

 

This is my sample REST Web service URL:  https://tools-dev:9007/ceres/data/basic?lastChangedTime=2015-06-01T01:00:00Z

But I have to generate a dynamic value in the URL when PI calls the REST web service using the GET operation, so we need to set up our connection accordingly.




 

Sender Adapter poll:


In the REST sender adapter we can choose REST or REST Polling as the message protocol; I selected REST Polling according to my requirement.


RESTPollingAdapter.jpg

Now to the 'General' tab. I used only basic authentication. Note: if your web service requires certificate-based login, use the client certificate authentication check box. In my case I just imported the REST web service certificate into PI NWA, but I didn't use the client certificate authentication section. Without the certificate import I got an error.

 

Quality of Service: Exactly Once (because I used asynchronous processing)

RestsenderGeneralTab.jpg

 

RestGeneralTabadditional.jpg

 

 

 

In the HTTP Request tab, specify the HTTP/HTTPS URL for calling the REST API web service, and choose the REST operation and polling interval.

 

In the URL I used "{incrementalToken}" to generate a dynamic value using incremental requests.

 

(Setting up incremental requests: for REST APIs that allow incremental requests, the adapter can store the timestamp of the latest call or a custom value (specified by an XPath expression or a JSON element). The value is stored between calls and can be referenced in the REST URL as a placeholder with the name incrementalToken.)

 

HTTP operation: GET

Polling interval: according to your frequency of data

RESTHTTPRequest.jpg

 

In the 'Data Format' tab, choose the data format that PI receives from the REST service.

Check the box "Convert JSON to XML".

Add Wrapper Element section: enter the message type created in the ESR and the namespace of the message type.

 

RestDataFormat.jpg

 

 

In the same 'Data Format' tab there are a few more options to use, but in my case I used only "Incremental Requests" to generate the dynamic value in the REST URL when PI calls the REST web service to fetch data (GET).


In the Incremental Requests section I chose the incremental type "Timestamp of Last Call".

The timestamp format should be ISO 8601.

An initial value is required for the first request.


RESTIncrementalRequest.jpg

 

 

I hope this helps you use the REST sender poll adapter.

In this blog I only covered how to fetch data using the sender REST poll adapter; I do not cover the target side.

 

Thanks & Best Regards,
Sateesh


Test Tools for Posting IDoc XML to SAP over HTTP


There are quite a few detailed posts on emulating the process of posting IDoc XML over HTTP, and also on posting IDocs using the SOAP protocol. However, there are times when we want to test these scenarios ourselves using some kind of tool or application. This document details some of the various tools which we can use to post IDoc XML to SAP.

 

IDoc xml over HTTP

You can follow the blog here from Grzegorz Glowacki for a detailed step-by-step procedure to set up the configuration.

In a nutshell, these are the steps to post IDoc XML to SAP over HTTP:

  1. Maintain the distribution model in BD64 & generate inbound partner profiles. Use WE20 to set up partner profiles manually.
  2. Check that HTTP is enabled (which is true in almost every case). Tcode SMICM → Goto → Services (Shift+F1).
  3. Activate the idoc_xml service in SICF. Path /sap/bc/idoc_xml/. The virtual host is usually the default one.
  4. Ping http://[server]:[port]/sap/public/ping?sap-client=[client] to get the response “Server reached successfully”.

 

Tools to post Idoc XML :

 

  • Mozilla Firefox with the RESTClient add-on

The UI is simple and straightforward. One can easily add additional custom headers & authentication for the message.

Screen Shot 2015-07-03 at 15.47.07.png

Once you install the add-on, click on the RESTClient add-on to open the screen below. Select the method “POST” and enter the URL as http://<server>:<port>/sap/bc/idoc_xml?sap-client=<client>

 

Screen Shot 2015-07-03 at 16.31.18.png



Click on Headers and add the header text/xml, then click on Authentication to add basic authentication as shown below.

 

Screen Shot 2015-07-03 at 16.30.18.png

 

Once you have entered the above details, press “SEND” to post the XML to SAP. You should get the response “IDoc-XML-inbound ok”.

 

Screen Shot 2015-07-03 at 16.30.57.png

 

 

IDoc is successfully posted as shown below.

 

Screen Shot 2015-07-03 at 16.44.31.png

 

WE05 / WE09 in SAP:

Screen Shot 2015-07-03 at 16.47.15.png

 

 

  • Google Chrome with the Postman add-on by www.getpostman.com, with a better UI and more features like the possibility to run scripts before a request is called, save test cases, etc. I personally like this because of the ease of the UI and the possibility to see the history. This tool works similarly to the RESTClient add-on for Mozilla.

 

Screen Shot 2015-07-03 at 16.07.44.png

 

Enter the URL/endpoint and select basic authorization. Select the format text/xml in the headers and press SEND to post the IDoc to SAP. Status code 200 represents a successful posting to SAP.

Screen Shot 2015-07-03 at 16.07.01.png

  • SOAPUI

SoapUI is a free and open-source cross-platform tool for testing SOAP and REST services. It has an easy-to-use graphical interface, enterprise-class features, and multiple options when it comes to creating test scenarios. More about SoapUI here. Download the latest version of SoapUI from here.

 

The first step is to create a new REST project as shown below. Click on New REST Project and input the endpoint/URL where the IDoc needs to be posted:

 

http://<host>:<port>/sap/bc/idoc_xml?sap-client=<client>

      

Screen Shot 2015-07-06 at 10.37.50.png

 

Enter the URL and press enter.

 

 

Screen Shot 2015-07-06 at 13.25.16.png

Choose the authentication as Basic and select the Authenticate pre-emptively option.

 

Screen Shot 2015-07-03 at 17.05.23.png

 

Now choose the method “POST” and select the media type text/xml as shown below. Copy and paste the IDoc XML, without any whitespace or paragraph breaks, into the request/input window as shown below and press Submit.

 

Screen Shot 2015-07-06 at 13.31.03.png

When the IDoc is successfully received by the recipient SAP system, the following success message is shown in SoapUI.

 

Screen Shot 2015-07-03 at 17.04.48.png

 

Important points:

 

  • Use textfixer to remove the whitespace between the tags and the paragraph breaks in the XML. The formatted XML should look like the one below:

<?xml version="1.0" encoding="UTF-8"?><ADRMAS03><IDOC BEGIN="1"><EDI_DC40 SEGMENT="1"><TABNAM>EDI_DC40</TABNAM><DOCREL>702</DOCREL><DIRECT>2</DIRECT><OUTMOD>3</OUTMOD><IDOCTYP>ADRMAS03</IDOCTYP><MESTYP>ADRMAS</MESTYP><STD></STD><SNDPOR>WSO2</SNDPOR><SNDPRT>LS</SNDPRT><SNDPRN>WSO2</SNDPRN><RCVPOR>NL</RCVPOR><RCVPRT>LS</RCVPRT><RCVPRN>ERDCLNT400</RCVPRN></EDI_DC40><E1ADRMAS SEGMENT="1"><OBJ_TYPE>KNA1</OBJ_TYPE><OBJ_ID>0030000005</OBJ_ID><CONTEXT>0001</CONTEXT><E1BPAD1VL SEGMENT="1"><TITLE>0001</TITLE><NAME>Jhon</NAME><NAME_2>Nestle</NAME_2> <CITY>NL</CITY><POSTL_COD1>sdsd</POSTL_COD1><POSTL_COD2>NL</POSTL_COD2><PO_BOX>123456789</PO_BOX><PO_BOX_CIT>AM</PO_BOX_CIT><STREET>sdsd</STREET><HOUSE_NO2>4545</HOUSE_NO2><COUNTRY>NL</COUNTRY><COUNTRYISO>NL</COUNTRYISO></E1BPAD1VL><E1BPAD1VL1><ADDR_GROUP>BP</ADDR_GROUP></E1BPAD1VL1><E1BPADFAX SEGMENT="1"><COUNTRY>NL</COUNTRY><COUNTRYISO>NL</COUNTRYISO><STD_RECIP>X</STD_RECIP><HOME_FLAG>X</HOME_FLAG><CONSNUMBER>001</CONSNUMBER></E1BPADFAX><E1BPADSMTP SEGMENT="1"><E_MAIL></E_MAIL><STD_RECIP>X</STD_RECIP><HOME_FLAG>X</HOME_FLAG><CONSNUMBER>001</CONSNUMBER></E1BPADSMTP><E1BPADURI SEGMENT="1"><URI_TYPE>HPG</URI_TYPE><URI>http://www.google.com</URI><HOME_FLAG>X</HOME_FLAG><CONSNUMBER>001</CONSNUMBER></E1BPADURI></E1ADRMAS></IDOC></ADRMAS03>

  • Always use the addition encoding="UTF-8" in the xml message.
  • Sometimes you get the error “IDoc '0000000000000000' has already been received, therefore reception is refused”. To overcome this error message, you need to put an external break-point in the include LEDINF10 after the select statement on SRRELROLES. Now set the return code to “1” after the select on SRRELROLES to post the IDoc without any error. The cause of the error (I think) could be an inconsistency in the buffer refresh: every time an IDoc is posted from one of the above tools, SAP tries to select an IDoc with number 0000000000000000 from SRRELROLES, which seems to always exist, and hence the error.

Screen Shot 2015-07-03 at 16.29.55.png

Fix as described above:

Screen Shot 2015-07-03 at 16.43.23.png

 

  • You could use function module IDOC_XML_TRANSFORM to transform an existing IDoc into XML format.
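If you would rather script the POST than use one of the GUI tools above, the same call can be made from a small Java program. Below is a minimal, hedged sketch; the server, port, client, user, password and the IDoc XML file name are placeholders to be replaced with your own values. As with the tools, a 200 response containing “IDoc-XML-inbound ok” indicates a successful posting.

----------------------------------------------------------------------------------------------------------------------------------
import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class IdocHttpPoster {

    public static void main(String[] args) throws Exception {
        // Placeholder connection data - replace with your own system details
        String endpoint = "http://<server>:<port>/sap/bc/idoc_xml?sap-client=<client>";
        String user = "<user>";
        String password = "<password>";

        // IDoc XML without whitespace between the tags, e.g. prepared with textfixer
        byte[] idocXml = Files.readAllBytes(Paths.get("ADRMAS03.xml"));

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml");
        String auth = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        // Send the IDoc XML as the request body
        try (OutputStream os = conn.getOutputStream()) {
            os.write(idocXml);
        }

        // Print the HTTP status and response text
        System.out.println("HTTP status: " + conn.getResponseCode());
        try (BufferedReader br = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = br.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
----------------------------------------------------------------------------------------------------------------------------------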

SAP TechEd (#SAPtd) Lecture of the Week: The latest and greatest on Cloud Integration


It's again that time of the year when I start planning out my sessions at the upcoming Tech-Eds in Las Vegas and Madrid. Time sure flies and I guess you would agree with me on that point. I was looking at the recording of the INT200 session that I delivered in Las Vegas last year. The session was an update on SAP's Cloud Integration Strategy. If you are interested in listening to the recording you can find it here.

 

I talked about how SAP HANA Cloud Integration is being used in the context of the different cloud LOB solutions from SAP. I also talked about some of the customers and partners who are using HCI, namely Bentley, Applexus and also Itelligence. I showed a live demo of HCI in action and how easy it is to consume the pre-packaged content shipped by SAP and run an end-to-end integration scenario live.

 

Stay tuned for additional updates on HCI at this TechEd as well. But until then, if you are itching to learn more, I would suggest the following links:

 

1) All latest and greatest information on HCI can be found at : http://scn.sap.com/docs/DOC-40396

 

 

2) If you want to browse through the public content catalog page for HCI: https://cloudintegration.hana.ondemand.com

 

 

3) If you want access to a 30 day trial of HCI: http://scn.sap.com/community/hana-in-memory/blog/2013/10/22/sap-hana-cloud-integration--test-and-learn-more-about-sap-s-cloud-based-integration-solution

 

 

4) Documentation on HCI: http://help.sap.com/cloudintegration

 

 

If you still have additional questions reach out to me: sindhu.gangadharan@sap.com

Queue issues in SAP PI and ECC systems: overview and steps to resolve them


Blog Overview

 

The objective of this blog post is to:

  • Provide a brief idea about ‘queue processing’ in SAP, the SAP tables containing queue status details, and the various jobs to be scheduled to clear these entries as well as re-index, re-organize and update the statistics of these tables, in order to have smooth processing of the messages in the queues without causing any delays to critical business processing.
  • This blog will help all those who face issues with queue processing in SAP and shows a way to overcome them. I have searched through many SCN blogs but haven't come across any consolidated blog, hence I decided to write one.


Executive Summary

 

This paper is intended for professionals interested in understanding and learning how queue processing takes place in both the R/3 and PI systems, and which jobs should be scheduled for smooth processing of these queues so that no queue is stuck in the system for a long time. Stuck queues delay the delivery of critical messages to the various third-party systems, for which the message throughput must be very high so that the business keeps running smoothly.

 

Content

 

Introduction

 

Queue Processing

 

There are two types of LUWs, namely qRFC and tRFC. qRFCs can be monitored through transaction codes SMQ1/SMQ2 and tRFCs through SM58.

 

Outbound Message Processing

 

If the outbound message flow goes through qRFC, it can be monitored through SMQ1 of the R/3 system; if it is a tRFC, through SM58 of the R/3 system. After this step, the message reaches the PI system and can be monitored through transaction SMQ2 in the PI system. Once the message has been processed from SMQ2, it follows the pipeline steps and is processed by the PI system to the third-party system.

 

Inbound Message Processing

 

If the inbound message flow goes through qRFC, it can be monitored through SMQ1 of the PI system; if it is a tRFC, through SM58 of the PI system. After this step, the message reaches the R/3 system and can be monitored through transaction SMQ2 in the R/3 system. Once the message has been processed from SMQ2, the IDoc is posted to SAP.


SAP Tables consisting of Queue Status

           

            ARFCRSTATE and ARFCSSTATE Tables

           

ARFCRSTATE contains the statuses of asynchronous RFC (aRFC) calls on the receiver side, while ARFCSSTATE contains the statuses of aRFC calls on the sender side. Different statuses are updated based on the queue processing, namely –

 

  • EXECUTED

The related LUW is completely executed in the target system. The system waits for an internal tRFC/qRFC confirmation from the sending system before this entry is deleted.

 

  • HOLD

The corresponding application has processed this LUW in parts and wants this LUW to not be repeated in the case of subsequent network or communication errors

 

  • WCONFIRM

During a LUW execution the application has prompted the tRFC/qRFC Manager to set the status HOLD. If the LUW execution has already been completed, but this application has not yet signaled the logical LUW end, and if the tRFC/qRFC-internal confirmation from the sending system has been received, then this LUW receives the status WCONFIRM. If the respective application informs the tRFC/qRFC Manager about the logical LUW end, then this entry is deleted.

           

Both of the above-mentioned tables contain the table entries for qRFCs.

 

tRFCQOUT and tRFCQIN Tables

 

TRFCQOUT contains the tRFC queue description (including queue data related to BW) and stores the data about the outbound queue.

TRFCQIN contains the tRFC queue description (including queue data related to BW) and stores the data about the inbound queue.

           

Jobs to be scheduled in both R/3 and PI System

 

a.      Image showing entries processing very slowly in the SM58 tRFC queue of R/3 System


Queues_Stuck.png

 

b.   Queue stuck in SMQ2 of PI System


 

2.png

 

In order to avoid the above-mentioned problems, we need to schedule the jobs mentioned below to have smooth processing of the queues –


  1. Batch job scheduling for the report RSTRFCEU – This report deletes the confirmed entries from the table ARFCRSTATE by selecting those entries where the value of the field ARFCSTATE is either ‘CONFIRMD’ or ‘WCONFIRM’. This table has a huge effect on the messages coming into the R/3 system; hence this batch job needs to be scheduled in the R/3 system, and it needs to run every half an hour. The time frame is just half an hour because this table fills up very fast, which has a huge impact on queue processing.

 

ARFCRSTATE Table Entries before scheduling the batch job


3.png

 

 

ARFCRSTATE Table Entries after scheduling the batch job


4.png

 

 

 

  2. Batch job scheduling for the report RSTRFCQD – This report deletes the entries present in the tables ARFCSSTATE, ARFCSDATA and TRFCQOUT using the values in the field QTID.


ARFCSSTATE Table Entries before scheduling the batch job


5.png

 

ARFCSDATA Table Entries before scheduling the batch job

6.png

tRFCQOUT Table Entries before scheduling the batch job

 


7.png


ARFCSSTATE Table Entries after scheduling the batch job



8.png


ARFCSDATA Table Entries after scheduling the batch job

9.png

tRFCQOUT Table Entries after scheduling the batch job


10.png

 

  3. Re-organizing and updating statistics for the table IDXRCVPOR in the PI system – The IDXRCVPOR table stores the relation between the IDoc number and the PI message ID, which is displayed in transaction IDX5 of the PI system.

 

 

 

Conclusion

 

Queue processing in SAP systems can be made to run without delays by re-indexing and updating the statistics of tables like ARFCRSTATE, ARFCSSTATE, TRFCQOUT and ARFCSDATA in the R/3 system and the IDXRCVPOR table in the PI system, and by scheduling batch jobs for reports like RSTRFCEU and RSTRFCES to delete the entries from the tables ARFCRSTATE, ARFCSSTATE, TRFCQOUT and ARFCSDATA.

Stored Procedure Execution and Database Insert, Update with Lookup Option


Requirement

Sometimes a stored procedure call is necessary during data processing or data translation. Sometimes it is also required to execute a DML statement in the database, and it is important to know the status of the execution for the subsequent step. So the requirement is to insert or update database table rows, or to execute a stored procedure, from within the mapping.

 

Solution

Database insert, update and execute operations are possible with the System Accessor Java APIs of SAP PI. You need to use a Java mapping or write a UDF.

 

Below is the code snippet for the same:


----------------------------------------------------------------------------------------------------------------------------------

// Imports needed for this snippet (add them to the Java mapping, or to the UDF's import list):
// com.sap.aii.mapping.lookup.Channel, LookupService, SystemAccessor, XmlPayload
// java.io.ByteArrayInputStream, java.io.InputStream

// String Variable to hold the Request XML

String strDBSysRequest = "";

 

 

// Channel Object with JDBC Receiver Channel

Channel channelDBSys = LookupService.getChannel("BC_XYZ", "JDBC_R_XYZ_LOOKUP");

 

 

// System Accessor Object with JDBC Receiver Channel Object

SystemAccessor accessorDBSys = LookupService.getSystemAccessor(channelDBSys);

 

 

// Convert the Request XML String to Input Stream

InputStream isDBLookup = new ByteArrayInputStream(strDBSysRequest.getBytes());

 

 

// Create XML Payload object with request input stream

XmlPayload payload = LookupService.getXmlPayload(isDBLookup);

 

 

// Call System Accessor to process the Request data and return the response as XML Payload

XmlPayload result = (XmlPayload) accessorDBSys.call(payload);

 

 

// Convert the XML Payload into Response XML String

String strDBSysResponse = convertStreamToString(result.getContent());

----------------------------------------------------------------------------------------------------------------------------------

 

The String strDBSysRequest needs to be prepared as below in the case of stored procedure execution:

 

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>

<ns2:SPValidate xmlns:ns2="http://abcd.com/a">

    <Statement>

        <SP_VALIDATE action="EXECUTE">

<table>SP_VALIDATE</table>

            <i_id isInput="1" type="string">JY4UbMhBkoP0501W</i_id>

            <i_card isInput="1" type="string">123123123123</i_card>

            <i_timestamp isInput="1" type="TIMESTAMP">14-07-2015 13:13:52.031</i_timestamp>

            <o_res_code isOutput="1" type="integer" />

            <o_res_msg isOutput="1" type="string" />

        </SP_VALIDATE>

    </Statement>

</ns2:SPValidate>

 

The sample response XML captured in strDBSysResponse is as below:

 

<?xml version="1.0" encoding="utf-8"?>

<ns2:SPValidate_response xmlns:ns2="http://abcd.com/a">

    <Statement_response>

        <o_res_code>0</o_res_code>

        <o_res_msg>Valid Data</o_res_msg>

    </Statement_response>

</ns2:SPValidate_response>

 

In the case of a database update or insert, the corresponding XML structure needs to be prepared and passed as the request XML payload. Refer to the link below for more information on the SQL XML request structure:

http://help.sap.com/saphelp_nw73ehp1/helpdata/en/2e/96fd3f2d14e869e10000000a155106/content.htm
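For illustration only (not part of the original post), below is a minimal, hedged sketch of how such an insert request string could be assembled in Java. The table ORDER_STATUS and its columns are hypothetical; the element names must follow the JDBC message format described in the link above (a table element plus an access block per row) and match your receiver-side message type.

----------------------------------------------------------------------------------------------------------------------------------
// Hedged sketch: JDBC receiver message for an INSERT, with hypothetical table/column names
String strDBSysRequest =
      "<?xml version=\"1.0\" encoding=\"UTF-8\"?>"
    + "<ns2:OrderInsert xmlns:ns2=\"http://abcd.com/a\">"
    + "  <Statement>"
    + "    <ORDER_STATUS action=\"INSERT\">"
    + "      <table>ORDER_STATUS</table>"
    + "      <access>"
    + "        <ORDER_ID>4711</ORDER_ID>"
    + "        <STATUS>NEW</STATUS>"
    + "      </access>"
    + "    </ORDER_STATUS>"
    + "  </Statement>"
    + "</ns2:OrderInsert>";
----------------------------------------------------------------------------------------------------------------------------------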


The method convertStreamToString is used in the above snippet; its code is given below for quick reference.

 

----------------------------------------------------------------------------------------------------------------------------------


public String convertStreamToString(InputStream in) {
      StringBuffer sb = new StringBuffer();
      try {
            InputStreamReader isr = new InputStreamReader(in);
            Reader reader = new BufferedReader(isr);
            int ch;
            // Read characters through the buffered reader (decodes bytes to characters)
            while ((ch = reader.read()) > -1)
                  sb.append((char) ch);
            reader.close();
      } catch (Exception exception) {
      }
      return sb.toString();
}


----------------------------------------------------------------------------------------------------------------------------------


Summary


So here the database interaction uses the System Accessor and a JDBC receiver channel. It is also possible to execute a select statement with the System Accessor, so there is no need to create a separate Database Accessor if a select also has to be executed in addition to insert/update/execute.

Advantages

  • Avoids the use of multiple integration flows
  • Avoids writing Java code to create a java.sql.Connection and then doing the insert/update/execute

 

Disadvantages

 

  • There is no control over the database commit
  • It is not possible to roll back the database updates

External Search of Messages Processed by Advanced Adapter Engine of PI


The Advanced Adapter Engine of SAP PI/PO systems ships with APIs that external applications can use to collect information on message processing statistics. These APIs are exposed via the HTTP protocol in the form of servlets and SOAP services, and are well described in various materials on SCN (for example, see Andreas Job’s blog http://scn.sap.com/community/pi-and-soa-middleware/blog/2015/01/22/reading-messages-from-pi-system, Michal Krawczyk’s blogs http://scn.sap.com/community/pi-and-soa-middleware/blog/2010/04/13/pixi-how-to-get-a-pi-message-from-java-stack-aae-in-pi-711-from-abap, http://scn.sap.com/community/pi-and-soa-middleware/blog/2012/06/27/michals-pi-tips-how-to-get-rwb-message-overview-data-to-an-external-system, http://scn.sap.com/community/pi-and-soa-middleware/blog/2013/03/09/michals-popi-tips-audit-logs-from-a-native-ws--new-feature, and my blog http://scn.sap.com/community/pi-and-soa-middleware/blog/2014/02/25/external-collection-of-message-processing-statistics-from-advanced-adapter-engine-of-pi) and in SAP Notes.

 

Besides the use cases already highlighted in the referenced resources, the requirement is often not only to collect statistics about processed messages (like a processed messages overview and performance statistics), but also to be able to search for specific already-processed messages by data contained in their payload. The Advanced Adapter Engine is bundled with corresponding functionality named User-Defined Message Search (UDMS), embedded into the Message Monitor of PIMON, which provides capabilities for searching for processed messages based on the values of dynamic header elements or payload elements. For details on the configuration of this functionality, see http://help.sap.com/saphelp_nw74/helpdata/en/48/b2e0186b156ff4e10000000a42189b/content.htm, and for its usage – http://help.sap.com/saphelp_nw74/helpdata/en/48/b2dfe56b156ff4e10000000a42189b/content.htm. After UDMS is configured locally in the Advanced Adapter Engine, it can be consumed centrally from the Technical Monitoring work center of Solution Manager.


But these are not the only tools which can be utilized to fulfill the requirement of message search by payload for messages processed by the Advanced Adapter Engine. The standard SOAP service AdapterMessageMonitoringVi, which is shipped with PI/PO, can help to achieve this goal – being exposed as a SOAP service, this functionality can be consumed by any external application / SOAP client. For example, this may be helpful if some external application system needs to search for processed messages in PI/PO knowing the application data that has been sent out from that system, or if a central monitoring system other than Solution Manager is used in the company.


The service itself was described in the blogs of Andreas and Michal mentioned above, but I will concentrate on a few of its operations which were not well described before and which are of particular interest in light of the topic of this blog. They are the following three operations:


getUserDefinedSearch.png


First two operations (getUserDefinedSearchFilters and getUserDefinedSearchExtractors) are used to retrieve information on UDMS configuration – such as configured filters and extractors.


For the sake of demonstration, the following UDMS filter has been maintained:


UDMS configuration.png


The operation getUserDefinedSearchFilters doesn’t require any input in the request – when executed, it will provide a list of configured UDMS filters:


getUserDefinedSearchFilters - response.png

 

Knowing the interface and namespace which are subject to UDMS (for example, based on a preceding call of getUserDefinedSearchFilters), we can search for the respective extractors (search criteria) configured for it using the operation getUserDefinedSearchExtractors:

 

getUserDefinedSearchExtractors - response.png

 

The third operation – getUserDefinedSearchMessages – can be used to search for messages utilizing the already available UDMS configuration.


The service request element node attributes should contain the UDMS-specific search criteria – namely:

  • BusinessAttribute – a list of search criteria and their values: name – the UDMS search criterion / extractor name, value – the searched value. These fields’ values are case sensitive, and mask symbols are permitted when specifying the searched value;
  • operator – the search mode. Valid values are AND and OR (not case sensitive). It is mandatory to provide a value for operator even if only one search criterion is specified. If the operator value is not provided or is different from AND / OR, the service operation call returns an error saying the operator is unknown.

 

The content of these fields can be mapped to their corresponding representations in the user interface of the Message Monitor, in the section User-Defined Search Criteria of its advanced message search:

 

UDMS search - Message Monitor.png


(please note that operator corresponds to the search mode selector in the user interface: AND = Message contains all values, OR = Message contains one of the values).


Below are screenshots of the SOAP request and response resulting from querying the service operation AdapterMessageMonitoringVi.getUserDefinedSearchMessages() with the search criteria depicted above:


request-response.png

 

The service request which is analogous to the selection above, will be:

 

getUserDefinedSearchMessages - request payload.png


The service response returns the number of found messages matching the provided search criteria (element node number) as well as detailed information regarding each found message (element nodes AdapterFrameworkData):


getUserDefinedSearchMessages - response payload.png


Besides the UDMS-specific search criteria, other filters can be specified in the request in order to make the message search more precise and specific. Those filtering criteria are located in the service request section filter and correspond to the sections Message Header Data, Technical Attributes and Identifiers of the advanced message search of the Message Monitor.
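
To illustrate how this operation could be consumed by an external application, below is a minimal Java sketch that posts a hand-built SOAP request with basic authentication. The endpoint path, the namespace, the element names and the sample search criterion (OrderNumber / 4500000123) are assumptions for illustration only – verify them against the WSDL of AdapterMessageMonitoringVi generated by your own system.

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.Base64;
import java.util.Scanner;

public class UdmsSearchClient {

    public static void main(String[] args) throws Exception {
        // hypothetical endpoint of the AdapterMessageMonitoringVi service - check your system's WSDL
        String endpoint = "http://pihost:50000/AdapterMessageMonitoring/basic?style=document";
        String user = "monitoring_user";   // placeholder credentials
        String password = "secret";

        // request built according to the operation description above;
        // namespace and element names are assumptions and must be verified against the generated WSDL
        String envelope =
              "<SOAP-ENV:Envelope xmlns:SOAP-ENV=\"http://schemas.xmlsoap.org/soap/envelope/\""
            + " xmlns:urn=\"urn:AdapterMessageMonitoringVi\">"
            + "<SOAP-ENV:Body><urn:getUserDefinedSearchMessages>"
            + "<urn:filter/>"
            + "<urn:attributes><urn:BusinessAttribute>"
            + "<urn:name>OrderNumber</urn:name><urn:value>4500000123</urn:value>"
            + "</urn:BusinessAttribute></urn:attributes>"
            + "<urn:operator>AND</urn:operator>"
            + "</urn:getUserDefinedSearchMessages></SOAP-ENV:Body></SOAP-ENV:Envelope>";

        HttpURLConnection conn = (HttpURLConnection) new URL(endpoint).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8");
        conn.setRequestProperty("SOAPAction", "\"\"");
        String auth = Base64.getEncoder().encodeToString((user + ":" + password).getBytes("UTF-8"));
        conn.setRequestProperty("Authorization", "Basic " + auth);

        try (OutputStream os = conn.getOutputStream()) {
            os.write(envelope.getBytes("UTF-8"));
        }

        // dump the raw SOAP response; a real client would parse the number and AdapterFrameworkData nodes
        try (Scanner sc = new Scanner(conn.getInputStream(), "UTF-8").useDelimiter("\\A")) {
            System.out.println(sc.hasNext() ? sc.next() : "");
        }
    }
}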

EMail with System Accessor Lookup Service



The System Accessor is becoming very useful in Java mappings. It has opened up new horizons and provides a great level of flexibility to SAP PI users. My previous blog showed database interaction with the help of the System Accessor; here is another usage: sending email. Sending email from a Java mapping or UDF avoids multi-mapping and so eventually reduces the complexity of the interface.

 

Below is the code snippet which can be used in Java Mapping or UDF to send an email:


----------------------------------------------------------------------------------------------------------------------------------

// Required imports (mapping lookup API): com.sap.aii.mapping.lookup.Channel,
// com.sap.aii.mapping.lookup.LookupService, com.sap.aii.mapping.lookup.SystemAccessor,
// com.sap.aii.mapping.lookup.XmlPayload, java.io.InputStream, java.io.ByteArrayInputStream

// String variable to hold the request XML
String strEMailSysRequest = "";

// Channel object referring to the mail receiver channel
Channel channelEMailSys = LookupService.getChannel("BC_EMAIL", "EMAIL_R_COMMON");

// System Accessor object created from the mail receiver channel object
SystemAccessor accessorEMailSys = LookupService.getSystemAccessor(channelEMailSys);

// Convert the request XML string to an input stream
InputStream isEMailLookup = new ByteArrayInputStream(strEMailSysRequest.getBytes());

// Create an XML payload object from the request input stream
XmlPayload payload = LookupService.getXmlPayload(isEMailLookup);

// Call the System Accessor to process the request data and return the response as an XML payload
XmlPayload result = (XmlPayload) accessorEMailSys.call(payload);

// Convert the XML payload into the response XML string
String strEMailSysResponse = convertStreamToString(result.getContent());

----------------------------------------------------------------------------------------------------------------------------------
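
The helper convertStreamToString used in the last line is not part of the lookup API – it simply reads the payload stream into a string. A minimal sketch of such a helper (an assumption, since the original does not show it) could look like this:

private String convertStreamToString(java.io.InputStream in) throws java.io.IOException {
    // read the stream fully into a byte buffer and convert it to a UTF-8 string
    java.io.ByteArrayOutputStream out = new java.io.ByteArrayOutputStream();
    byte[] buffer = new byte[1024];
    int read;
    while ((read = in.read(buffer)) != -1) {
        out.write(buffer, 0, read);
    }
    return out.toString("UTF-8");
}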

 

The string strEMailSysRequest needs to be prepared as below; it is similar to the message that would otherwise be created using the mail package:

 

<ns2:Mail xmlns:ns2="http://sap.com/xi/XI/Mail/30">

    <Subject>EMail Lookup Test</Subject>

<From>xyz@abc.com</From>

<To>pqr@abc.com</To>

<Content>Hurreyyyyyyyyy, EMail Lookup Test Successful!!! </Content>

</ns2:Mail>
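
For completeness, the same request could be assembled directly in the mapping code as a Java string (the addresses and texts are just the placeholders from the sample above):

strEMailSysRequest = "<ns2:Mail xmlns:ns2=\"http://sap.com/xi/XI/Mail/30\">"
                   + "<Subject>EMail Lookup Test</Subject>"
                   + "<From>xyz@abc.com</From>"
                   + "<To>pqr@abc.com</To>"
                   + "<Content>Hurreyyyyyyyyy, EMail Lookup Test Successful!!!</Content>"
                   + "</ns2:Mail>";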

 

The response XML captured in strEMailSysResponse was as below:

 

<xim:MailReceipt xmlns:xim='http://sap.com/xi/XI/Mail/30'>

<Server>smtp://111.222.333.444:587</Server>

<Greeting>EMAIL.SERVER.NET Microsoft ESMTP MAIL Service ready at Sat, 18 Jul 2015 18:17:54 +0530</Greeting>

<Format>XIPAYLOAD</Format>

<UseMailPackage>true</UseMailPackage>

<Encoding>binary</Encoding>

    <Subject>EMail Lookup Test</Subject>

<From>xyz@abc.com</From>

<To>pqr@abc.com</To>

<Date>2015-07-18T12:49:36Z</Date>

<MailID>2.6.0</MailID>

</xim:MailReceipt>

 

Appreciate your feedback and comments on the System Accessor usages.

Updating Value Mapping via SAP Data Services (Part 2)


Recap

In the first part, linked below, we discovered how value mappings work with SAP PI and how to access the web services that allow you to create change lists and value mappings.

 

Updating Value Mapping via SAP Data Services (Part 1)

 

Connecting to SAP PI

In order to connect to SAP PI and update value mappings, you need a user with the role SAP_XI_API_DEVELOP_J2EE. In this tutorial we will use the user id inbound_bods. In our landscape, we often add a suffix corresponding to the specific PI environment; for example, in QA we have specific environments for projects, support, and performance. In Data Services we tie these together via system configurations, which will be explained later in the tutorial.

 

Creating the Data Stores

Within the Data Services application, you can connect to other systems via Data Stores. A Data Store can represent a specific database, like Oracle or SQL Server; an application, like SuccessFactors; or a web service. For this tutorial we will create web service data stores to connect to SAP PI.

 

The screenshot below shows the two Data Stores used in this tutorial.

ds_repository_view.png

The Data Stores contain functions which correspond to the operations supported by the WSDL. You can create the Data Store by supplying the WSDL path from Part 1 of the tutorial. An example is below.

 

ds repository edit screen.png

For the connection, use the User name you defined earlier. Once the connections are established, you can import the functions as you would with any other web service within Data Services.

 

Reading Existing Value Mappings

At the core of this solution is the ability to track changes against the existing value mappings defined within SAP PI and to create, update, and delete value mappings. In order to do this comparison properly, it is necessary to read the existing value mappings from SAP PI.

 

The solution is broken into two separate steps. The first step is to query for value mappings, which returns a list of value mapping IDs. The second step is to use the value mapping IDs to read the details.

 

Setting Up the Tables

Within Data Services, there are two main types of tables. Template tables can be created automatically by Data Services at run-time; they are write-once and read-many, and can be used as a source but not for a look-up. Regular tables must already exist in the database; they are write-many and read-many, and can be used for look-ups. In our environment, we try to get the best of both. This means that we can use Data Services to create the table, while still writing to it many times as well as using it for look-ups. I first saw this approach in SAP's Best Practices for Data Migration and have adapted it for my organization. In order to achieve this, we have created two data stores that point to the same database server and database. One is used for creating tables as template tables, namely DS_INT_TABLES_INIT, and another to hold the actual tables, namely DS_INT_TABLES.

 

In order for this to work, we create a job to create a template table in DS_INT_TABLES_INIT. Then we run the job, which creates the object in the database. Finally we import the table into DS_INT_TABLES. Now, if the structure changes, we change the initialization job and re-import the table.

 

To create these tables, I have created a job, JOB_CONVERSION_SAP_PI_CREATE_TABLES. I have included the xml of the job details. You can import the job into your environment and change the details of the data store to fit your environment.

 

After executing the job, you will have the following tables:

 

STG_PI_SAP_PI_VALUE_MAPPING_RESULTS

stg_sap_pi_value_mapping_results.png

STG_PI_VALUE_MAPPING_UPDATE_PROPOSAL

stg_pi_value_mapping_update.png

STG_PI_CHANGE_LIST_ERRORS

stg_pi_change_list_errors.png

 

Description of the Tables

 

STG_SAP_PI_VALUE_MAPPING_RESULTS

 

This table holds the data read from SAP PI. It has the columns from the read operation plus two additional fields, GENERATED_KEY and SYSTEMCONFIGURATION. The generated key is just an internal key to ensure the records are unique. Within SAP PI, uniqueness is determined by Value Mapping Id, Scheme Id, and Scheme Agency Id. The other additional field is the system configuration. In order to understand its purpose, you need to understand the meaning of a system configuration within Data Services. A single Data Services instance can connect to the same data store with different configurations. A configuration contains specific connection details for an environment. At my company, we have many QA systems. Each system has one data store, but a separate configuration depending on the environment, e.g. QA1, QA2, QA3, and QA4. The individual data store configurations are linked together via a System Configuration. For example, we have a system configuration called QA1 that links our production support SAP system to our production support SAP BW system and to our production support loyalty system. By storing the system configuration, we get further separation of data for each of our different SAP PI systems. In principle, it works like a mandant (client) field within SAP.

 

STG_PI_VALUE_MAPPING_UPDATE_PROPOSAL

 

This table holds the proposed values to be imported into SAP PI. In the scenario for this tutorial, we will place the customer details from our loyalty system in this table. Its structure matches, almost exactly, the structure in the above table.

 

STG_PI_CHANGE_LIST_ERRORS

 

This table is used to track the results of creating or activating a change list.

 

The Read Job

For this exercise, we will create a job called JOB_SAP_PI_READ_VALUEMAPPING that will call the web services to retrieve and store the current value mappings.

 

Job Variables

The Job Variables are all optional. They represent the filter criteria for the web service. If no values are passed, then all value mappings are read from SAP PI.

 

$G_DESCRIPTION Type VarChar(1024)

$G_USER_RESPONSIBLE Type VarChar(1024)

$G_DATE_LAST_CHANGED Type DateTime

$G_USER_LAST_CHANGED Type VarChar(1024)

$G_GROUP_NAME Type VarChar(1024)

$G_SCHEMEID Type VarChar(1024)

$G_SCHEME_AGENCY_ID Type VarChar(1024)

$G_MAPPING_VALUE Type VarChar(1024)

 

Initialization Script

In the initialization script, we just print the variables. When troubleshooting issues or just monitoring jobs, it is helpful to see the variables used when calling the job. The included XML for the read job includes a function I developed to standardize printing variables. It includes the calling context of the print statement too. This means that if you print from within a job, it will print the job name. If you print from within a workflow, it will print the workflow name. If you print from within a data flow, it will print the owning workflow or job. The function also returns the value you pass in. This can be helpful when troubleshooting an issue, because you can use the function within a mapping: it will print the value and map it to the target.

 

DF_SAP_PI_READ_VALUE_MAPPING

The purpose of this data flow is to call the query web service and read the system-generated value mapping IDs. These value mapping IDs are GUIDs generated automatically by SAP PI when creating a new value mapping. In Part 3 we will create and update value mappings. In order to update an existing value mapping, the mapping ID is required; when creating a value mapping, the mapping ID is left blank and gets generated by SAP PI. Note: For language-specific entries like description, I have hard-coded English. If you require a different language, change the data flow or use a variable.


The data flow stores the resulting mapping IDs in a template table called STG_SAP_PI_VALUE_MAPPING. The same result could have been accomplished by feeding the output of this call directly into the read web service call.


DF_SAP_PI_READ_VALUE_MAPPING_RESULTS

This data flow uses the value mapping ids and, via the read web service, reads the value mapping details. After many un-nesting calls, it writes to the STG_SAP_PI_VALUE_MAPPING_RESULTS table.


Next Time

In the next part we will update the value mappings from Data Services.


(The xml file that contains the read job is split in 2. To use it, merge the two files first.)


Deep dive in Hana Cloud Integration


Deep dive in Hana Cloud Integration.

 

Don’t be scared by the title of the blog, don’t go away: it’s not you, it’s me!
It was me who joined the SAP Hana Cloud Integration Deep Dive training in Walldorf, and I want to share some of the great things I learned there. Of course, just like in my previous blogs (1, 2), I will keep it quite simple. If you want the full, deep-diving deal, just check out Piyush Gakhar’s profile and keep an eye on his training sessions.

This time we will cover encryption and data stores. When all goes well, your two iFlows will look like this:


1. overzicht.png
iFlow 1: Encryption and writing to a data store.

 

2. Overzicht.png

iFlow 2: Decryption and getting from data store.

Preparing for encryption

 

Before we start with building our iFlow we need to add an RSA key pair to our keystore. If you do not know how to create (or adjust) your keystore please read my first blog (direct link to the document ‘How to create a keystore.pdf’).

 

For creating the RSA key pair I also used KeyStore Explorer (just like we used for creating the keystore). When you open your keystore.jks in KeyStore Explorer (KSE), you can right-click on an empty space and choose ‘Generate Key Pair’.

 

3. KSE create.png

 

In the next screen you can choose the Key Size. For this example we leave everything as it is.

 

4. KSE create.png

 

 

 

 

On the next pop-up screen we change nothing and go directly to the pretty address book next to Name.

 

5. KSE create.png

 

 

 

There we fill out all the information that is needed.

 

6. KSE create.png

 

Then we get prompted for an alias for our key pair:
Please name your RSA key pair “id_rsa”. This exact name is needed for correct handling by HCI.


7. KSE create.png

 


Now save your keystore and deploy it to your tenant.

 

Creating the first iFlow

 

Now that everything is set up for the encryption part, we are going to build our first iFlow. In this iFlow I chose SuccessFactors as the sender. Of course, you can use any kind of data from any kind of adapter for this exercise. I will not discuss setting up the sender (and the receiver); if you do not know how to set up a sender or receiver, please read my previous blogs.

 

 

 

 

I begin with placing a Multicast on the integration project. You can find this under Message Routing in your palette. Because I want to show you the different outcomes with and without encryption, I multicast my incoming message (from SuccessFactors) two ways.

 

 

8. iFlow1.png

 

 

On one of the branches I add a Content Encryptor (under security elements). The other branch stays empty.

 

9. iFlow1.png

 

The properties of the PKCS7Encryptor are displayed below. You can change them to your liking, but for this exercise I will only assign my public key alias.
Under Encryption choose Add and enter “id_rsa” as the Public Key Alias.

10. iFlow1.png

11. iFlow1.png

 




Now we have added encryption to our SFSF data in one branch, and nothing in the other branch. Because I want to use the encrypted file later, but also show you the different outcomes, I will add another multicast right after the PKCS7Encryptor block.

 

12. iFlow1.png


Now we have a branch we will use to mail the encrypted file, and a branch which we can store in our data store.
At the end of the data store branch, put a Data Store Operations block, which can be found under Message Persistence.

 

 

 

13. iFlow1.png

 

 

 

 

 

In the properties of the Write block you can choose a name for your data store. You can also change the visibility. Please change the visibility from Integration Flow to Global, because we are going to need it in iFlow 2. The option ‘Encrypt Stored Message’ can be checked, but this is not the encryption we configured earlier. So if you choose to uncheck it, your stored message will still be encrypted with your own encryption. In this exercise I will leave it on.

 

14. iFlow1.png

 

 

 

I connected both branches to an end event and connected those to the receiver. For both I chose the Mail adapter. For the branch without encryption I set up the subject with something like SFSFfileNOTENCRYPTED, and the subject on the other branch will then be SFSFfileENCRYPTED.

 

15. iFlow1.png

 

 

 

So now your complete iFlow1 should look something like this:

1. overzicht.png

 

 

When we save and deploy the iFlow you should get three different things. One is an email with the encrypted file, the second one is a mail with an unencrypted file and the third is an entry in a data store. You can check the data store in your tenant under Data Store Viewer.

 

16. iFlow1.png

17. iFlow1.png

 

 

 

The emails should look something like this, but will differ based on your chosen input.

 

19. iFlow1.png

  Not encrypted



18. iFlow1.png

  Encrypted

 

 

Creating the second iFlow

 

In the second flow we are not going to use a Sender shape, so you can delete it. Instead we start this flow with a Timer Start (under Events). After the timer start we place a Data Store Operations block and select ‘Switch to Get Operation’.

 

 

20. iFlow2.png21. iFlow2.png

 

 

 

 

When you select the Get Operation you need to enter the Data Store Name and the Entry ID. Both can be found in the Data Store Viewer on your tenant.

 

 

 

21b. iFlow2.png

 

 

 

 

After you point the Get operation to the right data store and the right entry in that data store, we create a multicast. This is to prove that we really did decrypt the data, and that where no decryption took place the file is still encrypted.

 

 

22. iFlow2.png

 

On the ‘to be decrypted’ branch we add, surprisingly, a content decryptor. This can be found under Security Elements.

Because we did not change the settings while we were encrypting, we do not need to change anything now.

 

 

Just like in iFlow 1 we connect the two branches to an endpoint and connect those via a mail adapter to the receiver. When you save and deploy iFlow 2, the result should be similar to what we saw in the first iFlow, that is: a decrypted message (from the branch with the content decryptor) and a still encrypted message (from the branch without it).

23. iFlow2.png


24. iFlow2.png

 

I hope you learned something from this blog; if not, please let me know. If you want more detailed information, please feel free to contact me, and don’t forget to check out Piyush Gakhar’s profile to see when the next HCI Deep Dive training is near you!


Blog 1: Starting with Hana Cloud Integration? Keep this in mind!
Blog 2: Starting with Hana Cloud Integration? Create a simple integration flow (iFlow)!

Blog 3: Deep dive in Hana Cloud Integration.

Writing Log Entries of a Java Application to an External Log Management System


Intro

In heterogeneous environments, it is commonly required to analyse huge amounts of logs generated by various systems – and it is convenient to manage these logs centrally in order to avoid the overhead caused by accessing the local log viewing tools of each affected system. Generally speaking, there are several approaches to populating centralized log management systems with logs produced by backend systems:

  • Poll logs: the backend system generates and persists logs locally and centralized log management system collects (polls) and processes generated logs periodically or ad hoc (real time on user demand);
  • Push logs: the backend system generates logs and sends (pushes) them to the centralized log management system.

 

In this blog, I would like to focus on the second approach and describe one of its possible implementations suitable for SAP AS Java systems (for example, SAP Process Orchestration or Enterprise Portal) using standard APIs shipped with AS Java – namely, the functionality of the SAP Logging API. To make the example concrete, let us consider a scenario where some application of an SAP PO system (in the real world, it can be a mapping, an adapter or adapter module, or some other application deployed on AS Java) generates logs, and our intention is to propagate these logs to a JMS broker (for example, to a specific JMS queue hosted on that broker), which is then used by the centralized log management system to parse and process the log records later on. One may think of other communication techniques different from JMS – using the approach discussed in this blog, the solution can be adapted to particular needs and communication techniques; JMS has been chosen for demonstration purposes as a commonly used technique for building distributed solutions.

 

Some 3rd party logging frameworks implement the approach of decoupling the log producer (the application which utilizes the logger and creates a log record) from the log consumer (the application which processes logs) and are capable of propagating generated log records to destinations other than the local console or file. For example, one commonly used logging API – Apache Log4j – introduces the concept of appenders, which are components delivering the generated log record to a specific destination. The destination may be a console, file, JMS queue/topic, database, mail recipient, syslog, an arbitrary output stream, etc. It is possible to deploy such a 3rd party logging library to an AS Java system and utilize its functionality, but as stated above, the goal of this blog is to describe a solution where SAP standard functionality is employed, so usage of 3rd party logging frameworks is out of scope of this blog.

 

 

Overview of log destinations in SAP Logging API

The architecture and main components of the SAP Logging API are well described in SAP Help: SAP Logging API - Using Central Development Services - SAP Library. The aspect which is important for us in the scope of this blog is the way the logging framework sends log records out. The component responsible for managing this process is the Log Controller. For each log location, it is possible to assign one or several logs, where a Log is a representation of the destination to which the assigned Log Controller will distribute generated log records for the specific location. In the SAP Logging API, there are several classes that implement logs and that may be of interest to us:

  • ConsoleLog – used to write log records to System.err;
  • FileLog – used to write log records to the specified file;
  • StreamLog – used to write log records to an arbitrary output stream.

 

Log destinations can be configured in various ways – for example, declaratively, using the log configuration tools of AS Java, or programmatically, from the application code.

 

There is a brief description of these log destinations in SAP Help: Log (Destination) -  Using Central Development Services - SAP Library.

 

ConsoleLog is the simplest of them and the least applicable when thinking of a centralized log management system.

 

FileLog can be of use when we need to output log records not to the default log files of AS Java, but to some specific file or a set of rotating files (potentially, to a location which is scanned by collectors of the centralized log management system). This may be helpful, for example, if we need to persist log records generated by some application in a specific dedicated file and not in the common, shared application log files. FileLog has been described in several materials published on SCN.

You may also find relevant information regarding usage and configuration of FileLog in SAP Help: Output File - Using Central Development Services - SAP Library.

 

In this blog, my focus will be on the log destination StreamLog, which will be helpful in fulfilling the requirement formulated at the beginning.

 

 

Demo

For the sake of a simplified demonstration, the logging configuration will be implemented in the source code of the application.

 

In the demo scenario, Apache ActiveMQ is used as a JMS broker. The JMS queue named Logs has been registered there and is intended to be used as a destination for generated logs so that log records are accumulated and persisted in that queue:

 

JMS queue - registered.png

 

The entire utilized set of operations with the logging system can be logically split into three lifecycle phases:

  1. Initialization of the log destination and corresponding output stream, followed by initialization of a logger which writes to it;
  2. Generation of log records and writing them to the log destination;
  3. Termination and closure of used resources.

 

As part of the initialization, it is first necessary to establish a connection to the log destination and open an output stream to it:

 

ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory(jmsBrokerUrl);
Connection jmsConnection = connectionFactory.createConnection();
jmsConnection.start();
Session jmsSession = jmsConnection.createSession(false, Session.AUTO_ACKNOWLEDGE);
Destination jmsQueue = jmsSession.createQueue(jmsQueueName);
OutputStream os = ((ActiveMQConnection) jmsConnection).createOutputStream(jmsQueue);

 

Here, jmsBrokerUrl is a String holding JMS broker URL (tcp://<host>:<port> for ActiveMQ – for example, tcp://activemqhost:61616) and jmsQueueName is a String holding JMS queue name (in this example, Logs).

 

The next step is to initialize the logger:

 

Location logger = Location.getLocation(logLocationName);
logger.setEffectiveSeverity(Severity.ALL);
Formatter formatTrace = new TraceFormatter();
Log logJms = new StreamLog(os, formatTrace);
logger.addLog(logJms);

 

Here, logLocationName is a String holding log location name (can be arbitrary meaningful name which would, for example, identify the log location in the application hierarchy).

Note that in this example, we used a simple trace formatter – based on requirements, it is possible to utilize a variety of other formatters in order to apply the required layout to the generated log record. For the sake of demonstration, the severity was explicitly set to all – depending on logging needs, this can also be adjusted accordingly.
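
For example, the severity threshold and the formatter could be adjusted as follows (a sketch only – ListFormatter is assumed to be available in the same com.sap.tc.logging package as TraceFormatter):

// restrict the logger to INFO and higher instead of ALL
logger.setEffectiveSeverity(Severity.INFO);

// use a list-style layout instead of the trace layout
Log logJmsList = new StreamLog(os, new ListFormatter());
logger.addLog(logJmsList);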

The important part of this block is the creation of the Log object (which represents the JMS queue to which log records will be written) and adding this Log to the initialized logger. In this way, the logger gets instructed about the destination, or several destinations (if several Log objects are created and added to the logger), to which generated and filtered log records should be written.

 

After the two initialization blocks are executed successfully, we can create log records – in the simplest way, by calling the method <severity>T() of the Location object which corresponds to the desired log record severity:

 

logger.infoT(logRecordText);

 

As a result, the created log record will be sent to the queue hosted on ActiveMQ server and can be observed there:

 

JMS queue - log record.png

 

JMS message.png

 

Attention should be paid to the termination logic in case the application doesn’t need this log destination anymore – this is important in order to ensure there is no resource leak (unclosed streams, sessions, connections, etc.). To be more precise, it is important to take care of closing the used output stream, the JMS session and the JMS connection:

 

os.close();
jmsSession.close();
jmsConnection.close();

 

Respective exception handling, proper output stream closure and JMS resource release should be implemented accordingly, for example as sketched below.
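
A minimal sketch of such a termination routine, assuming the os, jmsSession and jmsConnection variables from the initialization block above and the corresponding java.io / javax.jms imports, could look like this:

private void releaseLogDestination(OutputStream os, Session jmsSession, Connection jmsConnection) {
    // close each resource independently, so that one failure does not prevent closing the others
    try { if (os != null) os.close(); } catch (IOException e) { e.printStackTrace(); }
    try { if (jmsSession != null) jmsSession.close(); } catch (JMSException e) { e.printStackTrace(); }
    try { if (jmsConnection != null) jmsConnection.close(); } catch (JMSException e) { e.printStackTrace(); }
}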

 

After the log record has been written to the JMS queue and persisted there, the centralized log management system may process it further – for example, aggregate it with other log records based on some rules, retrieve the required information and visualize it in a user-friendly way, generate alerts, etc. That part of log management is out of scope of this blog – our current goal was to get log records of the application running on AS Java delivered to the central destination and storage.

 

 

Outro

This described solution has several drawbacks which should be taken into account:

  • Performance. A log record is created and written to a log destination synchronously. This means that logging is a blocking operation for the application which triggered log record creation: the application has to wait until the log record creation is completed before it can continue executing its application logic. As a result, the more time is spent on logging logic, the more performance overhead logging will bring to the application and the bigger the negative impact on the overall processing time of the application. Writing log entries to a remote log destination (such as a remote JMS destination, remote database, etc.) is a more “expensive” operation than writing them to a local file system, which is why it should be implemented carefully. A compromise can be found in locating the log destination (for example, the JMS broker instance) as close as possible to SAP AS Java.
  • Lifecycle of operations with the output stream. The output stream must first be created and attached to the specific data destination, and writing must be finalized by closing the stream and the respective connections. These operations should normally be executed at the initialization and termination phases, correspondingly, and not for every written log record, in order to avoid additional overhead related to output stream management.
  • Possible necessity of 3rd party library deployment. In order to utilize log destinations which are components other than SAP AS Java, it may be necessary to deploy 3rd party libraries which provide the APIs the SAP Logging API needs in order to write to those destinations. In turn, this brings maintenance overhead for such a solution and the need to ensure compatibility of the deployed libraries during upgrades.

 

Summarizing all of the above, StreamLog is a powerful and flexible feature of the SAP Logging API for building centralized log management systems and facilitating log processing and analysis routines, but it should be evaluated and used thoughtfully.

Multiple Sync Request Calls from PI through ccBPM


Dear All,

 

You might have seen blogs with this kind of approach on SDN, but maybe not exactly related to the same requirement.

I had a slightly complex requirement and achieved it through ccBPM (Database (JDBC) <--> PI <--> SAP RFC).

 

Steps of the requirement:

  1. PI uses a select query to get data from the database.
  2. The select query returns multiple data sets to PI (for example, 10 data sets).
  3. Based on the select query data from the database, PI needs to call the RFC 10 times and get the response back each of those 10 times.
  4. PI sends the RFC responses back to the database 10 times.

 

Recommendation: consider the data volume coming from the database. With a high volume of data sets it takes a long time to finish the whole process, so this approach is only suitable for low data volumes.

 

Note: You may use the same kind of approach to call a web service instead of an RFC.

 

This blog only shows the message mapping, the operation mapping and the ccBPM steps (between the database and the RFC); this information will help you achieve the same.

 

Message Mapping:

 

MessageMapping1.jpg

 

In the ‘Signatures’ tab of the message mapping, the occurrence of your RFC target message type must be changed to 0..unbounded. This setting makes it possible to call the RFC multiple times through ccBPM.

After the change, your actual mapping looks like this.

 

MessageMapping2.jpg

 

Operational Mapping:


In the operation mapping, the occurrence of the target service interface should also be 0..unbounded.

 

Operational mapping.jpg

 

 

 

ccBPM steps:

 

Five important steps are needed to implement this logic:

  1. Transformation step
  2. Container Operation Step
  3. Block Step
  4. Send step
  5. Multiline container element – a multiline container element is a table comprising elements of the same type.

 

 

BPM1.jpg

 

Transformation Step:


Map the JDBC select query response to the RFC request.

In the container list, I specified the RFC request and enabled the ‘Multiline’ option.

 

BPM2.jpg

 

 

 

Container Operation Step:


In the container operation, specify the target and use the APPEND operation to add the expression to the target.

As you can see in the screenshot, the Expression and the Target use the same abstract interface with the multiline option enabled, and the operation mode in the properties is Append.

 

BPM3.jpg

 

 

 

 

Block Step:

 

The Block step is used to loop over the RFC calls; its mode is ‘For Each’.

In the Properties tab of the Block step, "you specify the multiline container element in the Multiline Element attribute. In the Current Line field, specify a container element that takes the value of the multiline container element for which the block will run".

Please also check that the Multiline Element and the Current Line field use the same kind of abstract interface.

 

 

BPM4.jpg

 

 

Send Step:

Send the request to the RFC and get the response back from the RFC.

In the Properties tab of the Send step, the request message should be the current line element that you have already specified in the Block step.

 

 

BPM5.jpg

 

 

 

In these steps I have only shown you how to get the response back from the RFC; all the remaining steps continue inside the block (like sending the RFC response back to the database, etc.).

 

I hope this blog helps you achieve multiple requests from PI through ccBPM.

SOAP faults and Webservice faults from 3rd Party webservices


There are several blogs and discussions about capturing and handling SOAP faults. This blog builds on those blogs and discussions to provide a solution for how SOAP faults, in addition to web service faults, can be captured in SAP PO/PI when invoking a 3rd party web service.

 

Requirement: A 3rd party web service returns errors as web service faults and SOAP faults. Web service faults can be handled easily by adding a fault message to the service interface and adding a fault mapping. However, SOAP fault messages need to be explicitly captured in PO/PI.

 

Solution:

 

     1. In the SOAP receiver communication channel, check ‘Do Not Use SOAP Envelope’. Doing this ensures that the entire SOAP message including Envelope, Body and Fault can be captured in PO/PI.

          Capture.PNG

     2. Create a simple XSLT mapping for adding the SOAP envelope (since this gets stripped off by the setting in step 1) and use it after the request mapping so that the request message has the necessary SOAP envelope when the web service is invoked (a Java mapping alternative is sketched after this list).

          Capture1.PNG

     3. Create a custom XSD which contains both the fault message and the response message. A sample message in an external definition from the imported XSD is shown below:

          Capture3.PNG

     4. Use the above external message as the response message in the service interfaces and the response mapping.

     5. If there are namespaces in the response message, e.g. “WebServiceFault” has a namespace attached to it, these can be removed by using the XMLAnonymizerBean in the receiver SOAP communication channel.

          Capture4.PNG

     6. Also, in the receiver SOAP communication channel set the parameter XMBWS.NoSOAPIgnoreStatusCode = true so that the receiver SOAP adapter ignores the HTTP status code when “Do Not Use SOAP Envelope” is used.

          Capture5.PNG

     7. When the web service scenario is tested with the above settings, all the fault messages can be captured in PO/PI and you can decide on the further course of action.

          Note: Since the response and fault will be returned to the response mapping, you can further split this here or customise the mapping based on specific requirements.
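
As an alternative to the XSLT mapping in step 2, the SOAP envelope can also be added with a simple Java mapping. The sketch below is illustrative only (the class name and the envelope handling are assumptions, not the approach used in this blog): it wraps the request payload in a SOAP 1.1 envelope, stripping a leading XML declaration if present.

import java.io.*;
import com.sap.aii.mapping.api.*;

public class AddSoapEnvelope extends AbstractTransformation {

    public void transform(TransformationInput in, TransformationOutput out)
            throws StreamTransformationException {
        try {
            // read the request payload produced by the request mapping
            InputStream is = in.getInputPayload().getInputStream();
            ByteArrayOutputStream buf = new ByteArrayOutputStream();
            byte[] b = new byte[8192];
            int n;
            while ((n = is.read(b)) != -1) {
                buf.write(b, 0, n);
            }
            String payload = buf.toString("UTF-8");

            // drop a leading XML declaration, since the payload becomes part of a larger document
            if (payload.startsWith("<?xml")) {
                payload = payload.substring(payload.indexOf("?>") + 2);
            }

            // wrap the payload in a SOAP 1.1 envelope and write it to the output payload
            String result = "<SOAP-ENV:Envelope xmlns:SOAP-ENV=\"http://schemas.xmlsoap.org/soap/envelope/\">"
                          + "<SOAP-ENV:Body>" + payload + "</SOAP-ENV:Body></SOAP-ENV:Envelope>";
            out.getOutputPayload().getOutputStream().write(result.getBytes("UTF-8"));
        } catch (IOException e) {
            throw new StreamTransformationException(e.getMessage(), e);
        }
    }
}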

 

          Sample error message returned by the web service with the above solution:

          Capture6.PNG

          Sample response message returned by the web service with the above solution:

          Capture7.PNG

References:

856597 - FAQ: XI 3.0 / PI 7.0/7.1/7.3 SOAP Adapter

http://scn.sap.com/people/jin.shin/blog/2007/05/21/handling-web-service-soap-fault-responses-in-sap-netweaver-xi

http://scn.sap.com/people/alessandro.guarneri/blog/2011/01/10/soap-fault-in-sap-pi-hijack-it

Merging Irregular XML Files


I spent a lot of time searching whether this was actually possible, did a lot of research, tried many different methods, and finally succeeded.


Before I start let me give you some background;

 

 

Business Scenario


The Sender system is IBM WebSphere MQ and the receiver side is an SFTP.


 

So it becomes JMS to SFTP.

 

Untitled-1.png


Business Requirement 


The sender side splits each XML document into 3 MB parts, but it is split irregularly.


For example;


This is the structure before the split, and what the receiver system expects from PI.

Untitled-2.png


The XML document above is turned into parts like the ones below when the client puts the files on MQ.

Untitled-3.png

Untitled-4.png

Untitled-5.png


 

 

 

Another requirement is that the message parts are uploaded to MQ unordered. So the last message part can be in the first row, the first message part can be in the middle rows, and so on.

The total size of the complete/unsplit XML document can exceed 50 MB.

Also, the quantity of message parts is unknown and, most importantly, there is nothing that can be used to determine the sequence of the message parts.



 

Solution Tries


At first it occurred to me that I could send the message parts to ccBPM and merge them with a transformation step, so I tried this scenario.

Untitled-7.png


Because we don’t have any field that could be used for correlation, I made a dummy correlation, for example on ‘sender component’.

Untitled-8.png


When I ran this scenario, I noticed that the incoming messages were failing at the transformation step. I assumed the reason for this exception is that when messages go into the mapping, they are parsed as if they were XML.


By the way sender and receiver adapter validations are off.

Untitled-9.png

 


To work around this, I tried to add XML elements like <Envelope> and </Envelope> at the beginning and the end of the message to convert my irregular XML message into a well-formed, parsable XML message.


First I tried this method using XSLT, by embedding the incoming message into a well-formed XML structure as CDATA. This was the first step of the operation mapping, and the second step was my actual mapping, in which I merge the incoming messages, as below.

Untitled-10.png


This attempt failed, because XSLT expects the incoming data to be in XML format.


Then I used a Java mapping instead of XSLT with the same logic: embedding the incoming data into an XML structure as CDATA. But this time I placed the Java mapping in front of the ccBPM instead of in the transformation step, as shown below. This attempt failed too.

Untitled-11.png


I realized that it might be impossible to handle irregular XML after the message has left the sender channel.


Then I decided to write a simple custom module on the JMS adapter to convert the incoming data into an XML message by adding XML tags at the beginning and the end of the message.


After starting to write the EJB, an idea came to me: why not just make a runnable JAR and call it from the channel?



 

 

 

 

 

 

 

 

 

Solution


I made some changes on configuration and made it JMS to File as shown below.

Untitled-12.png


 

PI takes the message parts from MQ and puts them in a folder on the PI host itself, without any conversion.


While writing the messages to the folder, there is an operating system command on the receiver file adapter which runs after message processing.

Untitled-13.png


 

With this command, PI executes a script file whenever a message file is created in the target folder.


The script file executes a runnable JAR file. Before the execution it checks whether the JAR is currently running or ready to run at that moment. This check is necessary to prevent triggering the JAR file on every message part; it must run once, when the first message part is placed in the folder.


When the JAR application is executed, it fetches all message part files from PI’s output folder and loops once for each file.


First it looks for the beginning of the root element; if the message part contains it, that part is assigned to a string. Then it looks for the end of the root element likewise, and that part is assigned to another string. The remaining parts are assigned to a third string by concatenating them in order, one after another.
After each assignment, the currently processed file is moved to the ‘processed’ folder.


At the end of the loop – which means all message part files have been processed and moved out of PI’s output folder – the strings assigned above are concatenated, and this generates the complete XML. These strings represent the three main parts of the complete XML: the begin part, the middle parts and the end part.


Then the new, complete XML file is created, with a timestamp in the filename to prevent duplicates.


The checking logic in the bat file is: if the JAR file name is ‘ready’, rename it to ‘running’ and execute it. Then it continuously checks the folder into which PI puts the incoming message parts, until there are no files left. This means the JAR application has fetched all message parts and done its job. Then the bat file renames the JAR file back to ‘ready’ and the script ends.


I also ran some performance tests, such as merging 20 files equalling 60 MB in total. During the tests I observed that working with big data causes a Java heap size error. To resolve it, I added the -Xmx parameter to the bat file, which sets the maximum Java heap size.


As a result, the script in front of the JAR application ensures that the JAR runs only once until all message parts have been processed.






Script file;


@echo off
if exist "XMLMerger_ready.jar" (
rename XMLMerger_ready.jar XMLMerger_running.jar
java -Xmx512M -jar XMLMerger_running.jar
)
:loop
if not exist "T:\in\incoming\%" (
rename XMLMerger_running.jar XMLMerger_ready.jar
exit
) else (
goto :loop
)



Jar source code;

 

package com.xml;
/**
* @author ridvanpolat
*
*/
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.File;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;
import java.sql.Timestamp;
public class Merge {

    public static void main(String[] args) throws IOException {

        // wait 5 seconds to let PI to put all message parts to the incoming folder
        try {
            Thread.sleep(5000);
        } catch (InterruptedException ex) {
            Thread.currentThread().interrupt();
        }

        // declarations
        java.util.Date date = new java.util.Date();
        String beginPart = "";
        String midParts = "";
        String endPart = "";

        // we will put generated xml to this path
        String path = "T:\\in\\";

        // we will fetch message parts from this path which PI puts
        File file = new File(path + "\\incoming\\");

        // check whether the path is exist or not
        if (file != null && file.exists()) {

            // get all files to an array, not only the filename, gets with the path
            File[] listOfFiles = file.listFiles();

            // if there is file in the folder then..
            if (listOfFiles != null) {

                // we will use this string to collect all parts at the end
                String completeXML = "";

                // run the codes below number of file times
                for (int i = 0; i < listOfFiles.length; i++) {

                    // this control is necessary because there might be some other non-file items in the folder, we have to process only files.
                    if (listOfFiles[i].isFile()) {

                        BufferedReader br = null;
                        try {
                            // for the current file. we are reading first line only, because the part files are composed of single line
                            String sCurrentLine;
                            br = new BufferedReader(new FileReader(listOfFiles[i]));

                            // if file is not empty
                            while ((sCurrentLine = br.readLine()) != null) {

                                // get begin part
                                if (sCurrentLine.contains("<Envelope")) {
                                    beginPart = sCurrentLine;

                                    // check if the message is complete xml
                                    if (sCurrentLine.contains("</Envelope")) {
                                        completeXML = sCurrentLine;
                                    }

                                // get end part
                                } else if (sCurrentLine.contains("</Envelope")) {
                                    endPart = sCurrentLine;

                                // get middle parts and concat
                                } else {
                                    midParts = midParts.concat(sCurrentLine);
                                }
                            }

                        } catch (IOException e) {
                            e.printStackTrace();
                        } finally {
                            try {
                                if (br != null) {
                                    br.close();
                                }
                            } catch (IOException ex) {
                                ex.printStackTrace();
                            }
                        }

                        try {
                            // move processed files from incoming folder
                            if (listOfFiles[i].renameTo(new File(path + "\\processed\\" + listOfFiles[i].getName()))) {
                                System.out.println("File is moved successful!");
                            } else {
                                System.out.println("File is failed to move!");
                            }
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                } // endloop

                // concat 3 main part in order and in a row
                completeXML = beginPart.concat(midParts.concat(endPart));

                // for error handling from script file
                System.out.println(completeXML);

                BufferedWriter output = null;
                try {
                    // get time stamp
                    String getTimestamp = new Timestamp(date.getTime()).toString();

                    // change unsupported characters for filename
                    getTimestamp = getTimestamp.replace(' ', '_').replace(':', '-').replace('.', '-');

                    // create new file with xml extention and timestamp to prevent duplicate situation
                    File newFile = new File(path + "\\XMLdata" + getTimestamp + ".xml");
                    output = new BufferedWriter(new FileWriter(newFile));

                    // push our concated data to new file
                    output.write(completeXML);

                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    if (output != null) {
                        output.close();
                    }
                }
            }
        }
    }
}




I hope this has been useful.

 

Ridvan Polat
