Channel: SCN : Blog List - Process Integration (PI) & SOA Middleware

Are your assumptions as an SAP PI developer giving you the wrong result?


I just got a small insight today that I’ll share with you; it is in the video below.

 

I’m working as a consultant doing integration development on SAP PI, so that is the context I’m thinking from. My guess is that this is not only related to integration development; you will see the same issue in other situations too.

 

The main point is that we often take what our customers ask for as the real requirement, and simply follow it. It is of course often the case that they know what they need help with.

Sometimes, though, we may be able to find a different solution that better caters to what they need to do. That way we can help them solve the case in a much smarter fashion. I often become blind to the cases where we could do something different, and just do what is asked of me.

But if we want to serve our customers or internal stakeholders well, we must be able to ask some important questions and show them different ways to solve the issue.

The place where you really need to question the assumptions is when the customer wants you to develop something that does not match an existing pattern in your enterprise, or something that affects the way you would normally expect the system to work.

 

 

I recorded this video where I share some more on the topic.

 

 

Do you have any cases where you questioned your assumptions and built a better solution based on it?


SAP BC vs. SAP PI


One of my recent tasks was to list the advantages of PI over BC; a customer wanted to know whether it is productive to phase out SAP BC and install PI instead. Though there is quite some information on the internet about SAP Business Connector and SAP Process Integration, it is scattered and incomplete. Having developed interfaces in both SAP BC and SAP PI, here are some of my thoughts on the downsides of BC compared to PI, along with the advantages of installing PI.


Please note that all these points are based on my own perspective and understanding.


According to SAP Note "1094412 - Release and Support Strategy of SAP Business Connector 4.8", SAP Business Connector 4.6 and 4.7 have already gone out of SAP support because both the supported Java runtime and most of the operating systems have gone out of maintenance. End of maintenance for SAP BC 4.8 will be December 31st, 2020. SAP customers and SAP partners who have purchased an SAP license are entitled to receive SAP BC.


More information on what SAP BC is can be found here.

 

SAP BC Downsides:

  • End of maintenance.
  • No new releases of BC to support growing customer needs such as BPM and workflows; it cannot readily integrate with evolving technologies such as OData web services.
  • BC has been observed to be unstable when handling large volumes of data.
  • Monitoring and logging in BC are limited.
  • Synchronous messages using BAPIs cannot be readily monitored.
  • Guaranteed delivery can only be emulated with a lot of development effort.
  • Persistence of messages with the relevant message status within the pipeline configuration is missing; a critical message might be lost if the server is not available.
  • Enterprise Services delivered by SAP in new releases are not available in BC.
  • No pre-configured scenarios or ready-to-use solutions.
  • Local queuing of messages is not possible, and alert mechanisms are limited.
  • Services are based on point-to-point scenarios and do not use a collaborative approach, i.e. design time and runtime cannot be accessed and called centrally.
  • mySAP solutions such as SRM and CRM have very limited connectivity with BC.
  • An integration tool acquired by SAP, not developed in-house.

 

SAP PI

PI belongs to the SAP NetWeaver technology stack and is based on a hub-and-spoke model. For customers who are already using other modules in the SAP NetWeaver stack, PI is often available for little additional cost. However, there are additional costs for licenses, CPU usage, and data traffic. The following are some of the benefits/key points for considering SAP PI as middleware.


  • Supports synchronous and asynchronous communication based on almost all protocols and transport mechanisms available in the market, e.g. HTTP/S, FTP/SFTP, IDoc, RFC/BAPI, Java and ABAP proxies, OData, JDBC/ODBC, File, XML, SMTP, EDI, and industry-specific protocols (RosettaNet, EDIFACT, AS2, CIDX, ...).
  • Predefined content and services which are readily usable.
  • B2B and A2A support.
  • Service and operational reporting along with automated alert operations.
  • Integrates seamlessly with all SAP solutions including the newest onDemand solutions.
  • Transformations using graphical, Java, XSLT, and ABAP mappings are possible, along with cross-component integration processes.
  • Local queues for individual scenarios.
  • Guaranteed delivery: no message is ever lost. Messages that fail due to delivery errors can also be configured to be re-submitted to the target system.
  • Persistence of messages at various stages.
  • Central monitoring as well as error monitoring for message flows out of SAP ECC through SAP PI.
  • Message validation, cleansing (archiving and deletion), and a common format.
  • Centralized and standardized flow management, leading to better security and reliability.
  • Real-time messages received from the source system can be converted to batch processing for delivery to the target system.
  • User management, security, and transport protocols, i.e. SNC, SSL/TLS.
  • Automated business processes with scalability, archiving, administration, audit, and orchestration using Business Process Management (BPM, similar to SAP workflow) and BRM.
  • Supports complex system landscapes.
  • The Enterprise Services Repository enables ready use of all SAP Enterprise Services.


SAP PI Downsides Over SAP BC:

  • More infrastructure and installation effort required, while BC can be installed and brought live in days.
  • Regular maintenance required; upgrades are associated with additional costs.
  • Higher TCO associated with SAP PI.


Conclusion:

  • For simple and few scenarios with basically point-to-point communication, SAP BC is cost-effective compared to PI.
  • For complex landscapes, B2B communication, high-volume and critical transactions, and SOA-based integration that is independent of vendor, technology, and product with "almost any" non-SAP system, SAP PI is the right and complete ESB.



Now that you have an inclination towards installing SAP PI, the next challenge is which version to choose. The details below shed some light on choosing the appropriate version.


SAP PI 7.4 is the latest version available on the market, with the following installation options:

  • Dual-stack installation: Based on both AS ABAP and AS Java.
  • Advanced Adapter Engine Extended (AAEX): Based on AS Java only.

 

Considering that this is a fresh SAP PI installation, a dual-stack installation is not necessary. The ABAP stack is only necessary for existing PI installations where many dependent ABAP proxies and ABAP mappings are involved, along with complex ccBPM scenarios that need to be re-designed when migrating to Java-only.

 

Advantages of SAP PI 7.4 AS Java only (single stack):

  • Hardware requirement reduced by 50% compared to a dual-stack installation.
  • Up to 60% less energy consumption and easier maintenance.
  • Higher-volume scenarios can be handled effectively, and processing time is optimal.
  • SAP's strategy and development effort is targeted at the Java-only installation options; new features, functionalities, and further enhancements will mainly go into the Java-only options.
  • Faster installation and restart.
  • Eclipse-based design time for service provisioning.
  • Increased productivity and richer connectivity on the AAE.
  • Improved upgrade and fault tolerance.
  • Loose coupling and content-based routing.
  • AAE with local ES Repository and Integration Directory.
  • Reliable connectivity between the messaging and process layers via the proven Java proxy runtime.
  • Model-driven development environment based on the BPMN standard.
  • Leverages SAP NetWeaver BRM for business rules.
  • Reduced TCO.


My first foray into Web Services


I am currently making my first foray into developing Web Services and have developed one Web Service that creates a Quotation in SAP. I used the BAPI_QUOTATION_CREATEFROMDATA2 to create the Web Service and SOAPUI software to test it.

 

It was a fairly simple process but I suspect it will be a bit more challenging once I actually start trying to integrate with a Cordys ESB and a non-SAP CRM system.

 

I will try to capture the issues I face and blog about how I overcame them at a later stage.

Generating WSDL in IFLOW in SAP PO 7.4


In this blog I am going to show how to generate a WSDL from an iFlow in SAP PO 7.4, similar to the way we generate one on top of a sender agreement in the Integration Builder for SOAP-based integration. An iFlow (integration flow) is created for an integration scenario in the NWDS tool instead of logging into the classical Integration Builder.

 

This blog will not cover the creation of the iFlow itself, and assumes the target audience has basic knowledge of SAP PO and NWDS.

 

Prerequisites:

 

Below are the systems involved in the integration

SAP PO 7.4

SAP NetWeaver Developer Studio

SAP Enhancement Package 1 for SAP NetWeaver Developer Studio 7.3 SP07 PAT0000

SAP ECC6 EH7

 

 

Step 1: Below is the iFlow created in NWDS for a SOAP-to-RFC synchronous interface. Make sure that the iFlow is activated and deployed.

IFLOW.jpg

 

Step 2: Once you are done with the end-to-end configuration in the iFlow, it is time to generate the WSDL so it can be consumed with the SOAP UI test tool.

Choose the SAP Process Integration Runtime perspective from the menu.

 

pers.jpg

 

 

Step 3: Once you are in the runtime perspective, you will see the screen below; right-click on our iFlow and choose Show Runtime Properties.

 

right click.jpg

 

 

Step 4: Copy the WSDL URL via right-click, open it in a browser, and use the WSDL in SOAP UI for testing purposes. You can also directly access the endpoint URL in SOAP UI.

 

WSDL.jpg

 

I hope this blog helps people who were thinking (like me) of generating the WSDL without logging into the Integration Builder. I think SAP is doing its best to offer a single tool to design, configure, and monitor interfaces, bypassing the ESR and ID.

How to Save Payload from Message Display Tool in Mozilla Firefox



Many times the target payload is not available in the ABAP stack (SXMB_MONI), but we need that payload for an issue or a requirement. Here I show how to collect that payload using the Mozilla Firefox browser.

Step 1: Go to the particular message whose payload needs to be saved.

1.JPG

 

Step 2: Right-click on the web page as shown in the screenshot.

2.PNG

Step 3: Save as “Web Page, XML only” on your system. That’s it.

3.PNG

Mapping messages for different namespace


Hello colleagues...

 

Often an incoming message does not match the XSD schema, but the problem is not in the structure; the problem is an empty namespace. For example:

Pic.JPG

A quick solution to this problem is to add a very simple XSLT program to the operation mapping:

 

Pi2.JPG
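The screenshots above are not reproduced here, but the idea behind such an XSLT can be sketched in plain Java SE: an identity-style transform that rebuilds every element under a fixed target namespace while keeping attributes and text unchanged. The namespace urn:example:target and the message type MT_Order below are made-up placeholders; substitute the namespace your XSD actually declares.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class AddNamespace {

    // Identity-style XSLT: rebuilds every element under a fixed target
    // namespace, copying attributes and text content unchanged.
    // "urn:example:target" is a placeholder namespace.
    static final String XSLT =
          "<xsl:stylesheet version=\"1.0\" "
        + "xmlns:xsl=\"http://www.w3.org/1999/XSL/Transform\">"
        + "<xsl:template match=\"*\">"
        + "<xsl:element name=\"{local-name()}\" namespace=\"urn:example:target\">"
        + "<xsl:copy-of select=\"@*\"/>"
        + "<xsl:apply-templates/>"
        + "</xsl:element>"
        + "</xsl:template>"
        + "</xsl:stylesheet>";

    public static String transform(String xml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSLT)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // An incoming message that arrived without any namespace
        System.out.println(transform("<MT_Order><Id>1</Id></MT_Order>"));
    }
}
```

In PI itself the same stylesheet would simply be imported as an XSLT mapping and added as the first step of the operation mapping.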

ABAP proxy to SFTP with PGP Encryption using SAP PI 7.4 dual stack


Step by step to develop ABAP proxy to SFTP with PGP encryption:


Prerequisites for communicating with SFTP from PI:

 

  1. SFTP PGP add-on installation on PI.
  2. The BASIS team generates a new public key certificate.
  3. After generating the certificate, it needs to be sent to the SFTP server admin to generate the server fingerprints.
  4. The server fingerprints should then be maintained in the PI server.
  5. ESR and ID configuration.
  6. Proxy connectivity setup between ECC and PI.

 


Steps to install the PGP add-on in PI:


1. Open the NetWeaver system information using the following URL:

http://<host>:<port>/nwa/sysinfo

 

 

2. Open Tab “Components Info”


 

 

3. If you can find the component name “PIB2B_SFTP”, the PI server has been successfully deployed with the SAP PI SFTP PGP add-on.

   Check that the SFTP application is started from NWA -> Operations -> Start & Stop -> Java Applications.


 

Generating Certificates:

 

1. Generate private and public certificates in the PI system and extract the public certificate.

Generate and extract the public X.509 certificate from SAP PI NetWeaver Administrator -> Configuration -> Certificates and Keys using the button “Export Entry”.

 

2. Convert the public PI X.509 certificate into an SSH-compatible public key.

 

Since the PI NWA key storage doesn't support SSH keys for private-key-based authentication, the OpenSSL utility is required to convert SSL keys to SSH keys and vice versa. OpenSSL can be installed separately on the SFTP server.


To import the public key into an SSH-compatible SFTP server, first convert the PI X.509 certificate into an SSH-based public key.


PuTTY can be used as client software to connect to the SFTP server; it provides a command prompt to execute the key conversion commands provided by OpenSSL. The conversion takes place in two steps.


Convert the X.509 certificate into an OpenSSL-based key. We can use the following command in an SSH-based client like PuTTY:

openssl x509 -in {X.509 Certificate}.cert -noout -pubkey > {Open SSL based certificate}.pkey


 

File has been generated now.

 

 

Convert the OpenSSL-based key into an SSH-based key. We can use the following command in an SSH-based client like PuTTY:

ssh-keygen -i -m PKCS8 -f {Open SSL based certificate}.pkey > {SSH based certificate}.pkey.pkey

 

 

 

File has been generated now.


Keys are generated in SFTP directory.




Import PI public certificate in SFTP Server:


The converted SAP PI public key must be registered with the SSH server, typically by copying it into the server's authorized_keys file.

 

Keys are imported in the user folder <user>/.ssh, as shown in the above screenshot, and this user is used when making the connection to the SFTP server.

Go to the root folder of the user account and check for the folder “.ssh”.

 

Create one if the folder does not exist. Check for the file “authorized_keys”; create one if it is not available.

The key must be entered on exactly one line.



If the server needs to authorize multiple public keys for a particular user, paste each public key's content on a new line as in the following screenshot.
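When key distribution is scripted rather than pasted by hand, the one-line requirement above can be enforced programmatically. This is only a hedged sketch in plain Java, not part of the SFTP add-on; the path ssh-demo/authorized_keys and the key text are made-up demo values.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardOpenOption;

public class AuthorizedKeys {

    // Append one public-key entry to an authorized_keys file, creating the
    // file and its parent folder if missing. Any embedded line breaks are
    // stripped first, so the key always ends up on exactly one line.
    public static void addKey(Path authorizedKeys, String publicKey) throws IOException {
        Files.createDirectories(authorizedKeys.getParent());
        String oneLine = publicKey.replaceAll("\\r?\\n", "") + System.lineSeparator();
        Files.write(authorizedKeys, oneLine.getBytes(),
                StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    public static void main(String[] args) throws IOException {
        // "ssh-demo" and the key text are hypothetical demo values
        Path file = Paths.get("ssh-demo", "authorized_keys");
        addKey(file, "ssh-rsa AAAAB3Nza...converted-pi-key user@pi");
        System.out.println(Files.readAllLines(file));
    }
}
```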

 

Generated server fingerprints:

 

 

 

To establish proxy connectivity between ECC and PI:

 

  1. Create an RFC destination in ECC to connect to the PI system.
  2. Configure the business system as a local Integration Engine (ECC system).
  3. Create RFC destination (TCP/IP) LCRSAPRFC for the SLD connection, to establish the connection between the business system and the SLD (ECC system).
  4. Create RFC destination (TCP/IP) SAPSLDAPI for the SLD connection in ECC.
  5. Maintain the SAP J2EE connection parameters for LCRSAPRFC and SAPSLDAPI in the SAP J2EE Engine.

 

ESR Part:

 

Message Mapping:

 

 

ID Part:

 

    For a proxy, there is no need to create a sender communication channel or a sender agreement.


Receiver communication channel:


Module configuration to convert XML to plain text with PGP encryption:


 

 

Testing:

 

The file has been placed in the SFTP folder with PGP encryption.


 

PGP Encryption:

 

TechEd && d-code 2014 - Las Vegas Replays on SAP HANA Cloud Integration and SAP PI


If you haven’t had the chance to attend TechEd && d-code in Las Vegas or Berlin, the following replays give you the possibility to gain valuable insight into our integration solutions SAP Process Integration and SAP HANA Cloud Integration. SAP PI is and will remain important for integrating on-premise applications with on-premise or cloud applications. And HCI is SAP’s strategic cloud integration platform, as you’ve certainly seen in Björn Görke’s keynote. With these replays you will understand our strategy and roadmap, and hopefully enjoy the presented demos. Feel free to share!

 

Integration and Orchestration – Overview and Outlook

(Lecture INT100, 60 min, Speaker: Gunther Rothermel)   Replay

 

This session focuses on SAP Process Orchestration, SAP NetWeaver Gateway, SAP HANA Cloud Integration, and SAP Operational Process Intelligence powered by SAP HANA and discusses how these products come together to provide a holistic solution to address your integration and orchestration needs.

 

 

Cloud Integration – An Update on Our Strategy

(Lecture INT200, 60 min, Speaker: Sindhu Gangadharan)   Replay

 

Learn how SAP integration technologies, like SAP HANA Cloud Integration and SAP Process Orchestration are being used for the integration of SAP's cloud pillars. See the approaches and technologies via end-to-end demos. Prepackaged content for jump-starting your integration projects will be previewed.

 

 

SAP Process Orchestration as B2B Gateway – Business Partners Integration

(Lecture INT202,  60 min, Speaker: Gunther Stuhec)   Replay

 

The SAP Process Orchestration business-to-business add-on helps organizations integrate with their business partners. Trading Partner Management, EDI dashboards, pre-shipped message guidelines, technical protocols, and enhanced security make it a complete B2B gateway solution. This session provides insights into current solutions and road maps.

 

 

OData in SAP Process Orchestration

(Lecture INT103, 60min, Speakers: Mustafa Saglam, Christian Loos)   Replay

 

This session provides an overview of Concept Gateway and newly added OData provisioning capabilities in SAP Process Orchestration. The session also covers the positioning of various integration solutions offered by SAP such as SAP Process Orchestration, SAP Gateway, and Integration Gateway.

 

Access SAP TechEd && d-code Online to watch all available on-demand replays of executive keynotes, lectures, demo jams, interviews, and other highlights!


HCI- The Sky Is Yet Too Cloudy


Hi Everyone,

 

It was not long ago that I first heard about cloud computing and the story of its birth from grid computing, and tried making a cloud server using 4 laptops and 2 desktops with Linux, plus a desktop cloud EyeOS. Since then I came into the world of integration and made my path through SAP PI and then SAP PO, and now the CLOUDS fascinate me again, even in the integration world.

 

There are so many blogs about SAP HANA Cloud Integration from our experts, but every time my friends and colleagues read them, they ask: What is cloud? What is SAP HANA? What is in-memory computing? The questions start growing like a wildfire, which made me realise "The Sky is Really Too Cloudy".

 

So here is an attempt to gather the scattered information and provide things at one place.

 

Have you understood “The Cloud”?


“The cloud” is “the Internet”. Cloud computing is the sharing of computing resources over the internet rather than having local servers. The computing resources can be:

  • Infrastructure
  • Platform
  • Software

In simple terms, the cloud provides you everything as a service, be it storage for your files, software to use, apps to work with, etc. It provides the following resources on demand:

  • Infrastructure as a Service (IaaS) – servers, virtual machines, VLANs, and load balancers are provided from huge sets of data centers on demand.
  • Platform as a Service (PaaS) – operating system, programming-language execution environment, database, and web servers; for example, Microsoft Azure to develop .NET projects and Google App Engine for programming.
  • Software as a Service (SaaS) – software is installed in the cloud and licensed on a temporary basis, as per demand and use.

 

SAP HANA – The reason to use HCI?


If you are an integration consultant, you may look at SAP HANA as an ECC (R/3) system. P.S.: This is not really the same, just a logical view.

  • SAP HANA is an “in-memory” database.
    • In-memory? It means all the data is stored in random access memory rather than on slow hard drives, so both data processing and data loading are very fast.
  • SAP HANA is a row- as well as column-based database, made up of a combination of hardware and software, designed to process massive amounts of real-time data using in-memory computing.

Note: Do browse a little about in-memory computing.

  • Those who are aware of SAP R/3 basics know the 3 layers: database, application, and presentation. The complex calculations happen in the application layer.

In SAP HANA the complex calculations are moved to the database layer.

  • Multi-engine query processing provides relational, graphical, and text-based data loading and processing.
  • SAP partnered with HP, Dell, Hitachi, IBM, Cisco, Fujitsu, and NEC for the hardware, and provides SAP HANA Studio as the data modelling tool.

Having said this much about SAP HANA, you now have a picture of it. But is SAP HANA the father of HANA Cloud Integration?

No, it is not. HCI is the ideal solution when the core business solution is in the CLOUD. There are many more reasons and situations on which we can base the decision whether HCI is suitable as the middleware integration solution.

 

Hana Cloud Integration- Let us get Friendly!


Keeping in mind the focus SAP is giving to SAP PO as the strategic middleware integration tool, I wonder: isn’t HANA Cloud Integration a competitor to SAP PO as a product?

This is a good time to explain why HCI is called so:

  • HCI is hosted on SAP HANA, and offered as a managed service on top of SAP HANA.

HCI comes with a design time where all objects can be designed and configured, based on the Eclipse tool. It has a runtime environment deployed on the SAP HANA cloud. And HCI comes with a web UI for different categories of users to access it.

Features of HCI:

  a) Architecture and deployment options best suited for cloud-to-cloud and cloud-to-on-premise integration.
  b) Multi-tenant architecture.
  c) Highest level of security, with content encryption, signing of messages, encrypted data storage, and certificate-based authentication.
  d) The HCI runtime supports connectivity via a large number of adapters, such as SFTP, IDoc, SOAP, HTTP, SuccessFactors, and OData.
  e) Predefined, prepackaged integration content is available at present for integration with SAP ERP/SAP CRM.
  f) It provides process integration as well as data integration capabilities.
  g) Comes with a web UI for ease of access by different categories of users.
  h) Monitoring: central monitoring via Solution Manager is available, and monitoring can also be done in the Eclipse IDE itself along with the web UI monitoring screen.

Hana Cloud Integration versus SAP PO

HANA Cloud Integration is a natural growth path from the SAP PI server, but it is not an enhancement of the existing SAP PI/PO server. HCI is a completely new product and not a successor of SAP PO; thus migration from SAP PO to HCI is not possible.

 

 

SAP PO vs. HANA Cloud Integration:

  • SAP PO: on-premise installation of the product is needed in the landscape. HCI: no installation needed on local hardware; we can get it on demand, on a subscription basis, and use it for integration.
  • SAP PO: any-to-any integration possible (A2A, B2B, C2C, etc.). HCI: the Application Edition is available only for the standard integration content provided by SAP; the Platform Edition is the strategic launch with customized integration.
  • SAP PO: needs license cost and an annual fee. HCI: needs a subscription fee.
  • SAP PO: built-in virtualization capabilities, and integration content only for a few SAP applications. HCI: no built-in virtualization capability, but integration content available for many.

 

SAP PO or SAP Hana Cloud Integration – Be Choosy!


Going by the name, please do not be confused about SAP PO's capabilities for integrating with SAP Cloud customers. But then, which product is ideal for which solution?

  • If we are not already using SAP PO in our landscape, then subscribe to HCI as and when required for SAP Cloud customers.
  • If you are already using SAP PO in the landscape, use SAP PO for the integration requirements.

 

Unclear Conclusions: Maybe the Sky Is Too Cloudy!

We are still exploring the capabilities of SAP PO, and we still expect a lot from SAP’s strategy towards SAP PO. With HCI already launched for customers, choosing a solution is again a debate. Of course there can’t be one correct answer, but as necessity is the mother of invention, so are requirements of solutions.

We must not consider SAP PO and HCI as competitors and compare them, but look at them as two different products with their own strengths in certain areas.

 

 

 

   

Useful Links

 

  • After so much great content on SAP PO, Mariana Mihaylova has written a brief and useful guide, “Getting Started with HCI”.

http://scn.sap.com/docs/DOC-40396

  • Thanks to Fons van Nuland for a nice comprehensive comparison of HCI and SAP PI

http://scn.sap.com/community/pi-and-soa-middleware/blog/2014/01/02/hana-cloud-integration-versus-sap-pi

  • Ginger Gatling’s content is really great. I have provided one link here, but there are many more blogs/documents authored by Ginger about SAP Cloud for Customer and HCI that are worth going through in this context.

http://scn.sap.com/community/cloud-for-customer/blog/2014/03/18/sap-cloud-for-customer-integration-use-hci-or-nw-pi

  • An interesting blog by Shabarish Vijayakumar; the beauty of the blog lies in the diagram there.

http://scn.sap.com/community/process-orchestration/blog/2014/10/30/sap-and-integration-the-dark-clouds-may-have-just-passed

  • Last but not least: my favorite one on SCN for HANA Cloud Integration is by Abinash Nanda.

http://scn.sap.com/community/cloud-for-customer/blog/2014/05/24/understanding-hana-cloud-integration-hci

 

 

Regards,

 

Vikas Kumar Singh

SPROXY: Generation Error type is not set (xsd type or type reference)

$
0
0

Hello,

 

Generating proxies from external XSD descriptions is not always easy and requires some manual corrections to the XSD.

Usually I replace decimal/integer and date types with xs:string to avoid unnecessary syntax errors.

 

I tried to generate a proxy for a service interface from the PI ESR (NW 7.40 SP8) using transaction SPROXY,

 

but I got error message SPXN 008 “type is not set”, without any hint as to which element of the external XSD description is not OK.

 

When I open the external structure tree and navigate down the tree,

 

 


I set a breakpoint at class CL_PROXY_UIX_TREE_DATA, method CREATE_TYPE_NODES, at the CATCH cx_pxn_fault statement, and display the variable L_ID.

 

Now I can see the name of the wrong XSD element and can fix it in the XSD description using XMLSpy or Notepad:

After doing so, everything was OK!

 

sproxy.jpg

How to Easily Create a User-Defined Function for Message Mapping Using NWDS


Dear SCN Friends,

 

If you have been working with the PI dual-stack architecture and have now moved to SAP Process Orchestration, or you are starting to work with the Process Orchestration suite, this blog provides a basic understanding of how to create a UDF with different execution types, by comparing with the Swing tool used in the Enterprise Services Repository (ESR). The focus here is on execution types and input/output variable types.


Whether in the Swing tool (ESR) or in NWDS, a UDF has the execution types and input/output variable types below.


Execution Types:


1)Single Values

2)All Values of a Context

3)All Values of Queue


Input/Output Variable Types:


1)Argument

2)Parameter

3)Result

 

I'm taking a couple of examples to explain in brief.

 

Example 1: Covers execution type "Single Values"


UDF Creation for mapping in Swing tool(ESR) :

SingleValues.jpg

 

UDF Creation for mapping in NWDS:

 

To create the same UDF in NWDS we need to follow the steps below.

 

1) Add the import statements below.

 

import com.sap.ide.esr.tools.mapping.core.LibraryMethod;

import com.sap.ide.esr.tools.mapping.core.ExecutionType;

import com.sap.ide.esr.tools.mapping.core.Argument;

 

2) To the existing default annotations (Init, Cleanup), we need to add the annotation LibraryMethod, which acts as a header for our main method.

 

@LibraryMethod(title="insertDoubleQuotes", description="", category="ReusableUserDefinedFunctions", type=ExecutionType.SINGLE_VALUE)


3) Main method creation --> this method contains the main logic:


public String insertDoubleQuotes(
        @Argument(title="") String input, Container container)
        throws StreamTransformationException {

    // UDF logic starts here

    // Assign a double quote to the String "doubleQuote"
    String doubleQuote = "\"";
    // Create the variable "outString"
    String outString = "";
    // Concatenate doubleQuote with the input value and assign to outString
    outString = doubleQuote.concat(input).concat(doubleQuote);

    // UDF logic ends here

    return outString;
} // close brace for insertDoubleQuotes method
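Outside the PO runtime, the body of this UDF can be exercised as plain Java. The following is a hypothetical standalone sketch; the Container and @Argument types are omitted because they exist only in the server libraries.

```java
public class QuoteDemo {

    // Standalone version of the UDF body above: wrap the input in double quotes.
    public static String insertDoubleQuotes(String input) {
        String doubleQuote = "\"";
        return doubleQuote.concat(input).concat(doubleQuote);
    }

    public static void main(String[] args) {
        // prints the input surrounded by double-quote characters
        System.out.println(insertDoubleQuotes("value"));
    }
}
```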

 

 

 

Example 2: Covers execution type "All Values of a Context"

 

UDF Creation for mapping in Swing tool(ESR) :

AllVC.jpg
UDF Creation for mapping in NWDS:


To create the same UDF in NWDS we need to follow the steps below.


Step 1 will remain the same.


2)


@LibraryMethod(title="splitByDelimiter", description="", category="ReusableUserDefinedFunctions", type=ExecutionType.ALL_VALUES_OF_CONTEXT)


3) Main method creation --> this method contains the main logic:


public void splitByDelimiter(
        @Argument(title="") String[] Input,
        @Argument(title="") String[] delimiter,
        ResultList outFirstString,
        ResultList outLastString,
        Container container) throws StreamTransformationException {

    // UDF logic starts here

    String[] output;
    output = Input[0].split(delimiter[0]); // split the input by the delimiter
    if (output.length == 0) {
        outFirstString.addValue(""); // set an empty string since the input value is empty
        outLastString.addValue("");  // set an empty string since the input value is empty
    } else if (output.length >= 2) {
        outFirstString.addValue(output[0]); // set the result for outFirstString
        outLastString.addValue(output[1]);  // set the result for outLastString
    } else {
        outFirstString.addValue(output[0]); // set the result for outFirstString
        outLastString.addValue("");         // set the result for outLastString
    }

    // UDF logic ends here

} // close brace for splitByDelimiter method
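As with Example 1, the core logic can be tested as plain Java outside the server. This is a hedged standalone sketch in which the server-only ResultList arguments are replaced by a returned String array, purely for illustration.

```java
import java.util.Arrays;

public class SplitDemo {

    // Standalone version of the UDF logic above: split the input on a
    // delimiter (treated as a regex by String.split) and return the first
    // and second parts, padding with empty strings when parts are missing.
    public static String[] splitByDelimiter(String input, String delimiter) {
        String[] output = input.split(delimiter);
        if (output.length == 0) {
            return new String[] {"", ""};
        } else if (output.length >= 2) {
            return new String[] {output[0], output[1]};
        } else {
            return new String[] {output[0], ""};
        }
    }

    public static void main(String[] args) {
        // prints [first, last]
        System.out.println(Arrays.toString(splitByDelimiter("first;last", ";")));
    }
}
```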

 

 

For execution type “All Values of a Queue”, the only modification required compared to the “All Values of a Context” type is to replace the ExecutionType.ALL_VALUES_OF_CONTEXT statement with ExecutionType.ALL_VALUES_OF_QUEUE.

 

 

Example 3: Covers using the input variable type "Parameter" and the Java type "Channel" as an input to the UDF:


Step 1 will remain the same. (Most of you might wonder whether we need an import statement for this. The answer: it is optional. The annotation is not required for arguments of type Channel and ResultList unless you want to provide a title for the argument.)


Step 2 will remain the same.


@LibraryMethod(title="getHike", description="", category="GetHikeDet", type=ExecutionType.SINGLE_VALUE)


3) Main method creation --> This method contains the main logic


 

  public String getHike (  @Argument(title="")  String grade,  @Argument(title="")  String region,  Channel soapHikeDet,

  Container container)  throws StreamTransformationException{

 

               //UDF logic

 

                    return something;                        

}

 

 

Here is the reference link for how to create a UDF for message mapping in NWDS: Blog by Lee

 

Regards

Venkat

SMQ2 Queue Monitoring Enhancement: Show Receiver Business System


Hello,

 

I made an enhancement to SMQ2 to also show the receiver business system name, which is sometimes useful to see what is in the queues:

 

Change report RSTRFCM3 (or copy it to Z_...)

...

      write: sy-vline no-gap,
             (24) trfcqview-firsttid color col_key no-gap,
             sy-vline no-gap, (10) trfcqview-fdate  color col_key no-gap,
             sy-vline no-gap, (8)  trfcqview-ftime  color col_key no-gap,
             sy-vline no-gap, (32) trfcqview-dest   color col_key no-gap,
             sy-vline no-gap, (24) trfcqview-wqname color col_key no-gap,
             sy-vline no-gap.
      hide: trfcqview-firsttid, trfcqview-dest, trfcqview-wqname.
    endif.
  endif.
  hide: valid_line.
*{   INSERT                                                 1
* Enhancement for XI monitoring - output the receiver system
  data: ls_sxmsqueue_rcv type sxmsqueue_rcv.

  select single *
    from sxmsqueue_rcv
    into ls_sxmsqueue_rcv
    where shortname eq trfcqview-qname+4(4).
  if sy-subrc eq 0.
    write: ls_sxmsqueue_rcv-bsn_system.
  endif.

*}   INSERT

 

after that it looks like this:

smq2.gif

 

where D02 is an example (R/3) Business System

 

Hope you also like this  :-)

Type Ahead F4 Search Help in SXI_MONITOR (Autocomplete)


From release 7.40, we have the new 'Type Ahead' F4 help feature, which is also available on non-HANA systems.

 

So if you don't know the interface names or the sender/receiver business service names, I thought this might also be handy in PI monitoring.

 

Example: while starting to type ORDERS..., I already get proposals (like Google...). Sometimes use a * in front, so *05 will also return results.

 

sxi_monitor.gif

 

As SAP did not (yet) change this in the standard, I had to copy the search help and assign it to a customer structure:

si_monitoring_structure.gif

 

sxi_monitor_f4_help.gif

 

select checkbox

 

With HANA you can use fuzzy search here.

 

 

Where else could this feature be useful?

ESR: Message and Java Mappings - in a Single Message Mapping!!


In this blog, I would like to share my recent experience with graphical mapping with regard to having Java mapping in it.

 

For those who don't know the concept of 'having a java mapping within graphical mapping', please check Sunil Chandra's blog: Write Java Mapping directly in ESR!

 

We can also call graphical mapping using the super.transform(TransformationInput in, TransformationOutput out) method from the same graphical mapping, as per Daniel Graversen's blog: Hacking: To get error from the PI Mapping realtime.

 

 

So now the question is, can we club the above 'two blog concepts in a single graphical mapping' ?

(or)

Rather I would say, 'can we have below operation mapping patterns in a single graphical mapping' ?

 

Pattern1:

1) JavaMapping 2) GraphicalMapping

 

Pattern2:

1) GraphicalMapping 2) JavaMapping

 

Pattern3:

1) JavaMapping One 2) GraphicalMapping 3) JavaMapping Two

 

 

Well, it is not possible directly, and this is where this blog comes into the picture.

 

The solution is, we should override standard TransformationInput, TransformationOutput, InputPayload and OutputPayload classes with our own implementation code.

 

With a little bit of experimentation, I was able to achieve the above patterns in a single graphical mapping. Please note that the Java code should be put under Graphical mapping --> Functions --> 'Attributes and Methods' section.

 

Please use only the required pattern and remove/comment out the rest of the patterns as per your needs.

 

Java Code:-

 

InputAttachments inputAttachments;

InputHeader inputHeader;

InputParameters inputParameters;

DynamicConfiguration dynamicConfiguration;

 

 

OutputAttachments outputAttachments;

OutputHeader outputHeader;

OutputParameters outputParameters;

 

 

public void transform(TransformationInput in, TransformationOutput out) throws StreamTransformationException{

  try

  {

  dynamicConfiguration = in.getDynamicConfiguration();

  inputAttachments = in.getInputAttachments();

  inputHeader = in.getInputHeader();

  inputParameters = in.getInputParameters();

 

 

  outputAttachments = out.getOutputAttachments();

  outputHeader = out.getOutputHeader();

  outputParameters = out.getOutputParameters();

 

 

  InputStream is = (InputStream) in.getInputPayload().getInputStream();

 

 

  /*

  * **************************** GuideLines for Java Mapping ********************************************************************

  * You can have java mapping code INLINE if it is few lines. And if it is a big and complex code, then I recommend to isolate

  * java mapping logic and develop it separately as a NORMAL java program. Import it as 'imported archive' and refer it in graphical mapping

  * And then call the externally developed public method here

  * Recommendation for external method signature: public ByteArrayOutputStream YourExtBussLogicMethod(InputStream is, ...)

  * **********************************************************************************************************************************

  */

 

  /*BEGIN ************************************************* PATTERN 1 (JM - GM) **************************************************/

 

  //Java Mapping: YourExtBussLogicMethod(is) code is nothing but your java mapping logic. You can also have INLINE code here

  ByteArrayOutputStream baos = YourExtBussLogicMethod(is);

 

  InputStream newInputStream = new ByteArrayInputStream(baos.toByteArray());

  InputPayloadImpl payloadInObj = new InputPayloadImpl(newInputStream);

  TransformationInputImpl transformInObj = new TransformationInputImpl(payloadInObj);

  //Graphical mapping called here

  super.transform(transformInObj, out);

 

 

  /*END ************************************************* PATTERN 1 (JM - GM) **************************************************/

 

 

  /*BEGIN ************************************************* PATTERN 2 (GM - JM)  **************************************************/

 

 

  InputPayloadImpl payloadInObj = new InputPayloadImpl(is);

  TransformationInputImpl transformInObj = new TransformationInputImpl(payloadInObj);

  OutputStream os = new ByteArrayOutputStream();

  OutPayloadImpl payloadOutObj = new OutPayloadImpl(os);

  TransformationOutputImpl transformOutObj = new TransformationOutputImpl(payloadOutObj);

  // Graphical mapping called here, but the transformed stream is written to intermediate ByteArrayOutputStream 'os'

  super.transform(transformInObj, transformOutObj);

  OutputPayload outPayload = transformOutObj.getOutputPayload();

  ByteArrayOutputStream baos1 = (ByteArrayOutputStream) outPayload.getOutputStream();

  InputStream is1 = new ByteArrayInputStream(baos1.toByteArray());

  //Java Mapping: This function code is nothing but your java mapping logic. You can also have INLINE code here

  ByteArrayOutputStream baos2 = YourExtBussLogicMethod(is1);

  // Finally write it to actual mapping runtime outputstream fetched from TransformationOutput

  out.getOutputPayload().getOutputStream().write(baos2.toByteArray());

 

 

  /*END ************************************************* PATTERN 2 (GM - JM) **************************************************/

 

 

 

 

  /*BEGIN ************************************************* PATTERN 3 (JM1 - GM - JM2) **************************************************/

 

 

  //Java Mapping1: This function code is nothing but your java mapping logic. You can also have INLINE code here

  ByteArrayOutputStream baos = YourExtBussLogicMethod1(is);

 

 

  InputStream newInputStream = new ByteArrayInputStream(baos.toByteArray());

  InputPayloadImpl payloadInObj = new InputPayloadImpl(newInputStream);

  TransformationInputImpl transformInObj = new TransformationInputImpl(payloadInObj);

 

 

  OutputStream os = new ByteArrayOutputStream();

  OutPayloadImpl payloadOutObj = new OutPayloadImpl(os);

  TransformationOutputImpl transformOutObj = new TransformationOutputImpl(payloadOutObj);

 

 

  // Graphical mapping called here, but the transformed stream is written to intermediate ByteArrayOutputStream 'os'

  super.transform(transformInObj, transformOutObj);

 

 

  OutputPayload outPayload = transformOutObj.getOutputPayload();

  ByteArrayOutputStream baos1 = (ByteArrayOutputStream) outPayload.getOutputStream();

  InputStream is1 = new ByteArrayInputStream(baos1.toByteArray());

 

 

  //Java Mapping2: This function code is nothing but your java mapping logic. You can also have INLINE code here

  ByteArrayOutputStream baos2 = YourExtBussLogicMethod2(is1);

  //Finally write it to actual mapping runtime outputstream 'out' fetched from TransformationOutput

  out.getOutputPayload().getOutputStream().write(baos2.toByteArray());

 

 

  /*END ************************************************* PATTERN 3 (JM1 - GM - JM2) **************************************************/

  }

  catch (Exception e){

  throw new StreamTransformationException(e.getMessage());

  }

}

 

 

class InputPayloadImpl extends InputPayload{

  InputStream in;

    public InputPayloadImpl(InputStream in){

  this.in = in;

  }

  @Override

    public InputStream getInputStream(){

  return in;

  }

}

 

 

class TransformationInputImpl extends TransformationInput{

 

 

  InputPayload payload;

 

 

  public DynamicConfiguration getDynamicConfiguration(){

  return dynamicConfiguration;

  }

 

 

  public TransformationInputImpl(InputPayload payload){

  this.payload = payload;

  }

 

 

  @Override

  public InputAttachments getInputAttachments(){

  return inputAttachments;

  }

 

  @Override

  public InputHeader getInputHeader(){

  return inputHeader;

  }

 

 

  @Override

  public InputParameters getInputParameters(){

  return inputParameters;

  }

 

 

  @Override

  public InputPayload getInputPayload(){

  return payload;

  }

 

 

}

 

 

class OutPayloadImpl extends OutputPayload {

    OutputStream ou;

 

    public OutPayloadImpl(OutputStream ou){

  this.ou = ou;}

  @Override

    public OutputStream getOutputStream(){

  return ou;}

}

 

 

class TransformationOutputImpl extends TransformationOutput {

    OutputPayload payload;

 

 

    public TransformationOutputImpl(OutputPayload payload){

  this.payload = payload;

  }

 

 

  @Override

    public void copyInputAttachments(){ }

 

 

  @Override

    public OutputAttachments getOutputAttachments(){

  return outputAttachments;

  }

 

 

  @Override

    public OutputHeader getOutputHeader(){

  return outputHeader;

  }

 

 

  @Override

    public OutputParameters getOutputParameters() {

  return outputParameters;

  }

 

 

  @Override

    public OutputPayload getOutputPayload(){

  return payload;

  }

}

The advantage of isolating and developing the public ByteArrayOutputStream YourExtBussLogicMethod(InputStream is) Java classes externally is that we don't have to rely on SAP's Java mapping API to compile the code.
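A minimal sketch of such an externally developed class, using the method name from the blog (the upper-casing here is only placeholder business logic; substitute your own transformation):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;

public class ExtBussLogic {

    // Reads the payload stream and upper-cases it -- stand-in for real mapping logic.
    // No SAP APIs are needed, so this class compiles and tests as plain Java.
    public static ByteArrayOutputStream yourExtBussLogicMethod(InputStream is) {
        try {
            ByteArrayOutputStream baos = new ByteArrayOutputStream();
            byte[] buf = new byte[4096];
            int n;
            while ((n = is.read(buf)) != -1) {
                // ASCII payload assumed; multi-byte characters could split across chunks
                baos.write(new String(buf, 0, n).toUpperCase().getBytes());
            }
            return baos;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        InputStream in = new java.io.ByteArrayInputStream("<a>hi</a>".getBytes());
        System.out.println(yourExtBussLogicMethod(in).toString()); // <A>HI</A>
    }
}
```

Package this class into an imported archive, reference it from the graphical mapping, and call it from the transform method as shown in the patterns above.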

 

Hope this concept will be helpful. Please share your valuable feedback.

SAP Process Integration JDBC Format XSD Tool


This is a continuation of my previous blog on creating XSD structures through a PHP tool: SAP Process Integration XSD Tool Using PHP

This blog explains how to create a JDBC document format required in PI for various JDBC scenarios like Insert, Stored Procedure and INSERT_UPDATE using PHP.

Prerequisites

  • Basic knowledge of PHP.
  • Basic knowledge of SAP PI.

Tool Overview

The JDBC Format XSD Converter tool was mainly developed to reduce the effort of manually creating the data types in SAP PI.

The Converter tool will take the input as a CSV file.

The CSV file contains the list of nodes to be created in SAP PI with element name and data types.

There is a different format for each scenario: Insert, Procedure and Insert_Update respectively.

The CSV file should be similar to below template format.

SAP PI JDBC Format XSD Tool Using PHP -  Browse /Readme at SourceForge.net

Process

  • The code will read the entire CSV file and generate the XSD file in database format, which contains elements with types like integer, string, date, etc.
  • After downloading the tool, extract the CSV_XI_DB_XSD zip files to the root folder /www/
  • We can access the tool's main page through http://localhost/CSV_XI_DB_XSD/upload_csv.php
  • Select the appropriate template (maintain CSV) to load for creating insert or procedure structure.
  • Upload the CSV file using Browse option and provide the details like IR- data type name, namespace, table Insert/procedure name and select the XSD structure.
  • After submitting we can get a XSD file.
  • We can import the XSD file to the Data types using the import file option available in the XI middleware tool.
  • Created XSD file can be imported into SAP PI.


  • After importing we can add / edit the elements manually if required.
  • The INSERT_UPDATE structure is not supported for now.
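For orientation, the generated XSD ultimately describes SAP PI's JDBC receiver document format. For an INSERT it looks roughly like this (the message type, table and column names are placeholders; StatementName and dbTableName are arbitrary element names in this format):

```xml
<ns:MT_CustomerInsert xmlns:ns="urn:example">
  <StatementName>
    <dbTableName action="INSERT">
      <table>CUSTOMERS</table>
      <access>
        <CUST_ID>1001</CUST_ID>
        <NAME>Smith</NAME>
      </access>
    </dbTableName>
  </StatementName>
</ns:MT_CustomerInsert>
```

The action attribute (INSERT, UPDATE, UPDATE_INSERT, EXECUTE for stored procedures, etc.) tells the JDBC adapter which SQL operation to build from the access block.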

Code
Installables can be downloaded from

http://sourceforge.net/projects/sappijdbcformattool/files/CSV_XI_DB_XSD.zip/download

 

 

Please feel free to share your thoughts and suggestions in the comments below!


 


Value conversion in SAP interface communication – architectural thinking



This article discusses the SAP technologies and key points that need to be addressed when an SAP solution is needed for value conversion in business process integration scenarios across systems. The article is based on SAP recommendations which have been enriched with additional architectural key points. The first part of the article explains value conversion in general. The next chapter discusses SAP technologies for value conversion. The third chapter analyzes architectural key points of value conversion that need to be taken into consideration. In the last chapter, the SAP technologies are explained in different architecture scenarios which are based on SAP’s recommendations and the key points discussed in chapter three.

 

Value conversion basics

Many different types of IT systems can still be found in many organizations. Each system contains its own data and semantics, especially in enterprises that are spread across multiple companies and geographies. Problems are likely to occur when data is exchanged between these systems via interfaces.

 

A simple example of a value conversion issue could be the representation of country codes in different systems. One system could use the letter “D” for Germany while another uses the common notation “GER”. When data including country codes is exchanged between these systems, the operations will fail because each system works only with its own representation. A mechanism is needed to convert the sender value to the expected receiver value. This kind of value mapping is classified as a functional value conversion, because the content is critical for correct transactional processing inside each system.

 

Other value conversions, such as the conversion of system names, are needed only for technical reasons. This type of mapping can be technically necessary to enable communication between the sender and receiver system. The functional departments are not concerned about these types of technical details. They will also not feel responsible if errors occur in this type of mapping. The separation between technical and functional value mapping is important when we discuss different value mapping approaches later on.

 

The definition of one single representation of a value (e.g. unique customer numbers) in the whole company could be considered a solution from an architectural point of view which would make value conversion unnecessary. In practice, this is often not applicable, however, because each business unit is interested in keeping its own representation in its system. Making changes would also incur additional costs and the users would need to get used to the new representation. In a worst case scenario, it would be impossible to make any changes for certain terms as they are hard-coded by a third-party vendor. In such cases, another solution will be needed for value conversions.

 

This section briefly discussed what “value conversion” is about and why a concept is needed to handle it. The next chapter explains the technologies that SAP recommends.

 

 

SAP Technologies

In this section, we discuss the SAP Process Integration (PI) and SAP Application Interface Framework (AIF). Both are intended to connect interfaces with each other and both explicitly provide value conversion functionality. Other SAP middleware products, such as SAP gateway, are not discussed here because they do not focus mainly on interface integration and value mapping.

 

PI is the middleware which is able to deal with different technologies (like file, SOAP, RFC, etc.) and the mapping of different interfaces on message type level. The middleware seems to be designated for message transformation and value conversion. The source system sends data to the middleware, maps the values (e.g. from “D” to “GER”), and delivers the message to the receiver.

SAP PI provides different standard concepts that can be used to map values between interfaces. These concepts, which are “Fix values”, “Value Mapping”, “Value Mapping Mass Replication” and “RFC look-Up”, will be discussed briefly.

 

 

 

Figure 1 SAP PI as a middleware.png

Figure 1: SAP PI as a middleware

 

The most basic option in PI is “Fix Values”. The value pair is entered and used for the specific target field mapping only within the enterprise service repository (design time). The value pair cannot be re-used in other message mappings or target field mappings. Maintenance and look-up of the value pairs are done in PI because no mass upload of values is possible. You can only enter source and target values and there is a 1:1 relationship between them. This option is recommended if you only want to map very few values by hand and the mapping is not to be re-used in other message mappings. The following figure shows a “Fix Value” example in PI.

 

 

Figure 2 Fix Values.png

 

Figure 2: Fix value in SAP PI
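Conceptually, a fix-value mapping behaves like a small hard-coded lookup table. A plain-Java sketch (the country codes are illustrative; passing the original source value through when no pair matches is one of the fallback strategies PI offers):

```java
import java.util.Map;

public class FixValueDemo {

    // Hypothetical fix-value pairs, as they would be entered in the target field mapping
    static final Map<String, String> COUNTRY = Map.of(
        "D", "GER",
        "A", "AUT");

    // Return the mapped value, or pass the source value through when no pair matches
    static String lookup(String source) {
        return COUNTRY.getOrDefault(source, source);
    }

    public static void main(String[] args) {
        System.out.println(lookup("D")); // GER
    }
}
```

Because these pairs live only inside one target field mapping, the same table would have to be re-entered in every other mapping that needs it, which is exactly the limitation "Value Mapping" addresses.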

 

“Value Mapping” is another more complex option in PI. The value pair is entered in the integration directory (configuration time) and not directly in the target field mapping, like fix values. It can be re-used in all message mappings. Maintenance and look-up of the value pairs are done in PI. The creation of value mapping pairs is done manually and you need to specify additional meta-data, such as scheme, agency, and a context, for each individual value, which requires a sophisticated naming convention. Inside the Enterprise Service Repository (ESR) in the target field mapping, you must enter the source scheme and source agency of the source value and the target scheme and target agency for the target value. You must also define the context.

All values that belong together are classified as a “Value Mapping group.” In this group, there is no predefined source and target value. You can convert values in both directions based on the semantics that you use for source and target scheme and agency in the target field mapping. Compared to fix values, you can have more than two values in one group.

The following table is an example of a representation of the same person, Mr. Smith, in different systems:

 

Context

Agency

Scheme

Value

Smith

System A

Employee

Mr Smith

Smith

System B

Customer

0175965

Smith

System C

Member

3885

Table 1:  Entity representation with value mapping

 

Based on the determination for context, agency, scheme and source value, the looked-up target value may differ.
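The group semantics above can be sketched in plain Java: within one value mapping group, any (agency, scheme) pair can act as source or target. The names come from Table 1; the flat map below is an illustration of the lookup semantics, not PI's actual implementation:

```java
import java.util.HashMap;
import java.util.Map;

public class ValueMappingDemo {

    // One value mapping group (context "Smith"), keyed by "agency|scheme"
    static final Map<String, String> GROUP = new HashMap<>();
    static {
        GROUP.put("System A|Employee", "Mr Smith");
        GROUP.put("System B|Customer", "0175965");
        GROUP.put("System C|Member", "3885");
    }

    // Convert a value between any two agency/scheme representations of the group.
    // Returns null when the source value does not belong to this group.
    static String convert(String srcAgency, String srcScheme, String srcValue,
                          String tgtAgency, String tgtScheme) {
        if (!srcValue.equals(GROUP.get(srcAgency + "|" + srcScheme))) {
            return null;
        }
        return GROUP.get(tgtAgency + "|" + tgtScheme);
    }

    public static void main(String[] args) {
        System.out.println(convert("System A", "Employee", "Mr Smith",
                                   "System B", "Customer")); // 0175965
    }
}
```

Note that the conversion works in both directions: swapping the source and target agency/scheme arguments converts the value back, which is what distinguishes value mapping groups from the 1:1 fix-value pairs.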

 

“Value Mapping Mass Replication” is an extension of “Value Mapping”. Mass upload of value pairs is done via java inbound interfaces on PI. This allows any back-end system to store and maintain value pairs which can then be replicated to PI.

It is impossible to replicate values from PI Cache to the back-end. Only back-end to PI replication is possible.

Caution: Back-end and PI Cache should always be in sync. If the conversion rule base is updated in the back-end, the information needs to be replicated to PI. In the meantime, the value mapping cache in PI will not be up-to-date. If interfaces are running during this timeframe, inconsistency and false mappings may occur.

 

When using “RFC Look-up”, the value mapping pairs are saved in a back-end system and looked up dynamically in the back-end during run-time. The look-up API supports RFC, SOAP and JDBC. The inbound and outbound interfaces of the look-up need to be configured in the ESR, and the communication channels need to be set up in the Integration Directory.

 

PI is one solution in which value conversion can be performed. SAP also introduced AIF as another technology, an add-on that is to be installed on each SAP back-end system. It enables interface development, monitoring and error handling. The functionality of the framework is performed before the actual inbound or outbound interface is executed in the SAP back-end system during run-time.

 

 

Figure 3 AIF and PI.png

 

Figure 3: SAP AIF combined with SAP PI

 

This SAP framework provides functionality to implement value conversion in interfaces without or with very little coding effort via customizing transactions. These value mapping objects point to different SAP tables which are to be looked-up during run-time to retrieve the value needed. If a more complex mapping rule is needed, a custom function module can be developed and called by AIF to do the conversion. Another advantage is the re-use of value mapping objects in several interfaces. AIF even provides advanced error handling capabilities such as automated notifications and mass correction, which are also useful in correcting value conversion errors. The error notifications can be directed to different users based on their roles to speed up error correction and reduce coordination effort. That allows the functional end-user to do the correction directly. Mass correction of the same type of error in several messages can be done to make things easier. AIF can be combined with SAP PI, but it can also be used alone. Because AIF is only to be installed on each SAP back-end system, AIF cannot be used in pure non-SAP scenarios that only involve legacy systems. The AIF interface development, functional mapping and monitoring functionality complements the interface transformation, technical mapping and routing functionality of PI.

Requirements for a value mapping architecture

When developing a solution for value conversion, the following points need to be taken into consideration before choosing the appropriate SAP technology:

 

System landscape: Are SAP systems involved? AIF cannot be used in a scenario that involves only non-SAP systems. Nevertheless, PI can always be used as a middleware technology. If value conversion has already been implemented in PI for such scenarios, it might be appropriate to re-use it in scenarios where SAP systems are involved instead of re-implementing value mapping in AIF, if it is introduced afterwards. On the other hand, if there is at least one SAP system involved per scenario, you can consider moving functional value mapping to AIF.

SAP recommends that AIF be designed for functional value mapping, functional error correction and user notification. PI is designed for message transformation of different interfaces, message conversion of different protocols and routing of messages to different target systems. Technical interface monitoring is also to be done in PI.

This recommendation makes sense as both technologies have strengths and weaknesses and can be combined in an effective manner.

 

Maintenance: Who should maintain the value conversions? The technical team is responsible for IT-systems, like the middleware. Typically, IT teams do not update functional conversion rules. They expect the functional departments to maintain these values because it is their area of responsibility. However, the functional departments do not want to work with any other system besides their own. They consider another system, such as a middleware engine, to be an additional burden and double work. A solution is needed which is acceptable for both departments, functional and IT. This again emphasizes the split between technical and functional mapping rules for which each department is responsible. If AIF and PI are combined, then PI should attend to technical conversions and AIF functional.

 

Possession of data: As described, an easy solution would be to save the value conversions directly in the middleware. Other solutions would be to save the conversions rules in the sender or receiving system. It could also be possible to replicate conversion rules between several systems. In the end, the concept must take into consideration who the owner of the conversion rules is and who is responsible for updating them. A master system needs to be defined. The master system should be the system that is to be used over the long-term.

 

Interface dependencies: There can be situations in which dependencies between interfaces must be considered. Let’s say there are two interfaces between two ERP systems. One interface replicates vendor master data and the other vendor account payable bookings. Critical for the target system is that the vendor number of the sender is always converted to the vendor number of the receiver system, as they have different number ranges. If a booking is passed before the vendor master data was replicated, the booking might fail because the vendor number does not exist yet in the target system. Even if the vendor master data was replicated before, the conversion rule must be updated first, before the other interface is executed. This means that if PI contains the conversion rule, it needs to be updated with the new vendor master data by the receiving ERP system before it can process the interface.

 

Timeliness of data: The example involving the vendor master data demonstrates that the conversion rule base must always be up-to-date in order to be able to execute other interfaces successfully. On the other hand, country codes are rather static as they do not change very often. In this case, a solution might be sufficient by which the conversion rule base is not updated that often.

 

Performance: Performance is very important in cases where the end-user expects real-time processing. Other scenarios do not demand high performance, e. g. when data is replicated at night and the user expects the updates the next day. A middleware usually leads to more processing overhead and less performance. The performance drops even more when the middleware also needs to do look-ups in other systems, e.g. to find a conversion rule for country codes.

 

Re-usability: When the conversion rule base is replicated between several systems there is redundancy and always a risk in terms of obsolete data. Implementing a central conversion rule base eliminates redundancy and increases re-usability. However, it might lead to worse performance because each conversion request needs to be routed to the central conversion rule base. The non-functional requirements need to be taken into consideration here and the right trade-off between performance and redundancy needs to be evaluated.

 

Error handling: How errors are handled represents an important aspect. A distinction is made between functional errors (wrong values) and technical errors (incompatible message types). A decision must be made as to who is responsible for certain types of errors and how they should be corrected. Audit requirements, e.g. for financial bookings, demand transparency in this process which usually results in a copy of the original broken message. Especially when it comes to defining role-based error handling, one must distinguish between technical and functional errors.

 

Development effort: Different solutions for value conversions also mean different degrees of effort for implementation. This must be taken into consideration during evaluation to find a solution that is worthy of being implemented. The more SAP technologies are used and combined, the higher the costs for implementation, licenses and operations.

SAP architectures

When considering the architectures of several systems that communicate with each other, we must distinguish between a middleware and a point-to-point architecture.

 

In a point-to-point connection, the sender and the receiver systems communicate with each other directly, without PI. If the number of interfaces increases, point-to-point connections will be difficult to maintain. A middleware often reduces both the effort and the costs. Nevertheless, point-to-point connections still often exist because they do not generate additional overhead compared to a middleware. Point-to-point connections usually result in very good performance. In terms of conversion rules, either one or both systems may establish a conversion rule base. As for communication that involves SAP, the AIF is designated for such scenarios. It still allows for high performance and conversion for inbound and outbound interfaces. When communication between two SAP systems takes place, one system needs to be defined as the master and do the value mapping. The systems also remain autonomous and the users do not have to work with additional systems. To reduce redundancy, as few as possible systems should contain the conversion rule base.

 

Figure4.png

Figure 4: Point-to-point connections

 

By using PI as a middleware, the central mapping instance is highly re-usable, but the performance is worse compared to point-to-point. The middleware needs to be maintained with the conversion rules, which can lead to more effort compared to local implementation with AIF, for example. When PI is used and SAP back-ends are involved, AIF can also be used, which is what SAP recommends. As both technologies enable interface development and operations, one needs to discuss whether tasks can be split usefully, especially because value mapping is available in both products. The technical IT is usually aware of technical mappings and they also maintain the technical system details. It makes sense to assign responsibility for the technical mapping to them and execute it in PI directly. On the other hand, functional mapping rules are usually defined in the back-end systems that the functional departments responsible use.

 

Figure5.png

Figure 5: SAP PI as central middleware

 

Conclusion

Implementing a value mapping concept in SAP is a complex topic that needs to take several key points into consideration. Different scenarios are possible depending on the requirements and the usage of SAP PI and/or AIF. Ideally, SAP PI and AIF are combined to differentiate between technical and functional value conversion and to allow for role-based error handling. If only one of them is used, another approach will be needed. Also, different value mapping concepts can be applied within PI. Each one has strengths and weaknesses which need to be evaluated for the problem statement. The following table shows where specific types of mapping should take place.

 

 

PI only

AIF only

PI + AIF

Functional mapping

PI

AIF

AIF

Technical mapping

PI

AIF

PI

Table 2: Usage of SAP PI and AIF

 

The following table also shows the difference between PI and AIF in terms of value conversion as well as the advantages and disadvantages of each mapping concept in PI:

 

 

 

Mapping with AIF only

Fix Values in PI

Value Mapping in PI

Value Mapping Replication in PI

RFC Look-up in PI

SAP Standard

Yes

Yes

Yes

PI only

Back-end custom development

PI only

Back-end custom development

Performance

Good

Good

Good

Good

Poor

Development effort

Low - Middle

Very low

low

Middle

Middle

Re-usability of values in message mapping

Limited – non-central approach

No

Yes

Yes

Yes

Amount of pairs to be handled

High

Low

Low-Middle

Middle-High

High

Default exception handling when no matching value was found

No restrictions

  • Pass original source value to receiver
  • Set default value
  • Throw exception

Accuracy of data

Always up-to-date

Need to be updated by hand

Depending on replication interval, data can be outdated

Always up-to-date

Conversion rule

maintenance

Back-end

PI

PI

Back-end

Back-end

Table 3: Overall recommendation

Blog 1: Tracing Capability in SAP HANA Cloud Integration (HCI-PI)

$
0
0
Hello Integration Community!

In this blog, I shall explain how you can use the tracing capability of SAP HANA Cloud Integration (HCI-PI).

 

What is Tracing?


Tracing is an awesome feature that allows you check the payload values after each and every integration flow step. Tracing details appear in little yellow envelopes in an integration flow (a screenshot the one below).


Blog1_1.JPG


How can you enable Tracing in HCI-PI?

 

Tracing can be enabled in two simple steps:

 

  1. Explicitly request the tracing capability to be enabled on the tenant. Currently, you have to request SAP to enable this feature as it is disabled by default. Create a message on LOD-HCI requesting the tenant capability to be enabled.
  2. Configure at the integration flow level. In the integration flow properties, you have to enable integration flow at the properties level.

Blog1_2.JPG

 

How to view Tracing?


The traces of the message payload can be accessed from the Message Monitoring screen. You have two options: View MPL and Export MPL.

The Export MPL feature allows you to export the traces (optionally, with the integration flow) and send it to another colleague. You can import the traces and check out the data, too.


Blog1_3.JPG

 

Keep in mind the following points:

 

1. Disable tracing after you have completed the tests. Especially, if you are doing a data load. You do not want to overload your tenant storage!

2. Also, Tracing data is available only for 24 hours.

 

Video on the Tracing Feature

 

Check the following video on how you can enable and use tracing.


 

So, go ahead - use the feature and let us know your feedback.

Happy Integrating!

 

Best Regards,

Sujit

Dev 10 Blog: Ten Developer Blogs on SAP HANA Cloud Integration (HCI-PI)

$
0
0

Hello Integration Community,

 

Now that the TechEd days for this year has come to a close, I want to extend the knowledge sharing sessions of SAP HANA Cloud Integration (HCI) a bit more. Over the next few weeks - in a series of 10 blogs, I shall go into more depth on the features of SAP HANA Cloud Integration (HCI-PI). These blogs are meant to more insight on the features of SAP HCI. They are not of an introductory nature, and I am targeting mainly Integration Developers when I write.

 

In this blog, I shall keep updating the links to all the ten blogs. So, read the first blog and let me know your comments !

 

Blog 1: Tracing Feature in SAP HANA Cloud Integration

 

Warm Regards,

Sujit

Sanity checks on ERP"ECC" System towards PI Sytem after System copy or Upgrade

$
0
0

There are cases where after an upgarde or a system copy of SAP ERP System, connection issues to SAP PI systems popup or you want to make sure all the settings on SAP ERP point to the correct SAP PI Integration Server after an upgarde.It could also be that you are setting up setting up  an SAP ABAP System to connect to PI system for the first time.


Performing  the below checks/steps on the SAP ERP system, should ideally resolve the connectivity isuues to PI System.


  • Check the following RFC connections which are of Type T:TCP/IP connections : LCRSAPRFC & SAPSLDAPI.These RFC destinations are used for the SLD connections. They use the SLD access data maintained with transaction SLDAPICUST. The RFC destination LCRSAPRFC is used to read the exchange profile and SAPSLDAPI is used by the ABAP API.

1.JPG


2.JPG

  • HTTP Connections to Ext. Server  : IS_ENGINE . Recommended user for this RFC is PIISUSER. Status code of test should return 500.

3.JPG

  • Maintain SLD Access Data: SLTDAPICUST . DONOT USE XIAPPLUSER or PIAPPLUSER !!! Recommended USER = PISUPER

4.JPG

  • Check connection to PI SLD. Transaction : SLDCHECK. Below lines in green are of concern.

Summary: Connection to SLD works correctly

Summary: Connection to the XI Profile works correctly

5.JPG

  • SLD Integration Transaction : RZ70

6.JPG

Also check the below RFCs required for RZ70

7.JPG

 

8.JPG

Integration Engine Configuration: Transaction SXMB_ADM

9.JPG

 

Check whether the Integration Server is corrcectly point to HTTP destination created in the above steps.

10.JPG

 

  • Transaction SICF:The following services should be active ( should not be black)

11.JPG

 

  • Connection Test: Transaction : Sproxy . All the displayed software components should be active.

         To do connection test GoTo=> Connection Test

12.JPG




 

13.JPG

 

Finally run the below reports to see the check the connection issues.

 

  • Run report SPROX_CHECK_IFR_ADDRESS.
  • Run report SPROX_CHECK_HTTP_COMMUNICATION
  • Run report SPROX_CHECK_IFR_RESPONSE.

 

Good Luck and Happy troubleshooting.

Amber Badam

SFTP using custom adapter module Part2 - Sender

$
0
0

The second blog on sftp adapter module, which I wanted to publish soon after the first one but couldn't do it and here I am now. In first blog I wrote about Receiver module which was quite easy as we just have to invoke our receiver channel to deliver the message which will in turn invoke the SFTP receiver module and write file on SFTP server. But in case of Sender scenario where we need to read files from SFTP server, our sender SFTP module won't work by itself because it cannot poll files and thus for that we will have to use "Trigger file". The trigger file is nothing but an empty file available on local drive, The sender File adapter will poll this trigger file on test mode at defined polling interval which will invoke SFTP module to read files from remote sftp server at same polling interval.


Now that we have read file from sftp server using module the standard file adapter would still try to send triger file content into message queue, whereas we want the content of file read from sftp server. For this we will have to replace the message content polled by standard file adapter with the content read from SFTP server and put that into messaging queue. The code below shows how that is handled, The remaining module parameters and other necessary functionality like file archiving are also needs to be included in module parameter and in the module code.



/**
*
*/
package com.sap.adaptermodule;
import java.io.BufferedInputStream;
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.io.StringWriter;
import java.io.Writer;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Vector;
import javax.ejb.CreateException;
import javax.ejb.SessionBean;
import javax.ejb.SessionContext;
import com.jcraft.jsch.Channel;
import com.jcraft.jsch.ChannelSftp;
import com.jcraft.jsch.JSch;
import com.jcraft.jsch.Session;
import com.jcraft.jsch.SftpATTRS;
import com.sap.aii.af.lib.mp.module.Module;
import com.sap.aii.af.lib.mp.module.ModuleContext;
import com.sap.aii.af.lib.mp.module.ModuleData;
import com.sap.aii.af.lib.mp.module.ModuleException;
import com.sap.aii.af.service.auditlog.Audit;
import com.sap.engine.interfaces.messaging.api.Message;
import com.sap.engine.interfaces.messaging.api.MessageKey;
import com.sap.engine.interfaces.messaging.api.MessagePropertyKey;
import com.sap.engine.interfaces.messaging.api.XMLPayload;
import com.sap.engine.interfaces.messaging.api.auditlog.AuditLogStatus;
/**
* @author ambharti
*
*/
public class SenderSFTPBean implements SessionBean, Module {    // private Log logger = LogFactory.getLog(getClass());    private SessionContext myContext;    public void ejbRemove() {    }    public void ejbActivate() {    }    public void ejbPassivate() {    }    public void setSessionContext(SessionContext context) {        myContext = context;    }    public void ejbCreate() throws CreateException {    }    // Start custom function of SFTP    @SuppressWarnings("unchecked")    public ModuleData process(ModuleContext mc, ModuleData inputModuleData)            throws ModuleException {        Object obj = null;        Message msg = null;        MessageKey amk = null;        InputStream bis = null;        String lv_del_flag = null;        try {            // Retrieves the current principle data, usually the message ,            // Return type is Object            obj = inputModuleData.getPrincipalData();            // A Message is what an application sends or receives when            // interacting with the Messaging System.            
msg = (Message) obj;            // MessageKey consists of a message Id string and the            // MessageDirection            amk = new MessageKey(msg.getMessageId(), msg.getMessageDirection());            // Reading file name from message header            MessagePropertyKey mpk = new MessagePropertyKey("FileName",                    "http://sap.com/xi/XI/System/File");            String filename = msg.getMessageProperty(mpk);            Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS, "Filename is "                    + filename);            Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                    "Input file read successfully");            // Archiving target file on FTP server            {                String HostName = (String) mc.getContextData("HostName");                String Port = (String) mc.getContextData("Port");                String Directory = (String) mc.getContextData("Directory");                String Filename = (String) mc.getContextData("Filename");                String Username = (String) mc.getContextData("Username");                String Password = (String) mc.getContextData("pwd");                String PrivateKey = (String) mc.getContextData("PrivateKey");                String HostKey = (String) mc.getContextData("HostKey");                //Added for Archive                String Archive = (String) mc.getContextData("ArchivePath");                //Added for Archive                Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                        "Connecting to SFTP location " + HostName);                int Portno = Integer.parseInt(Port);                JSch jsch = new JSch();                Session session = null;                // Use key authentication if it is set, else use password                // authentication                if (PrivateKey != null && PrivateKey != "") {                    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                            "Authenticating key to 
SFTP");                    if (Password != null && Password != "") {                        byte[] passphrase = Password.getBytes();                        jsch.addIdentity(PrivateKey, passphrase);                    } else {                        jsch.addIdentity(PrivateKey);                    }                    session = jsch.getSession(Username, HostName, Portno);                } else if (Password != null && Password != "") {                    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                            "Authenticating password to SFTP");                    session = jsch.getSession(Username, HostName, Portno);                    session.setPassword(Password);                }                if (HostKey != null) {                    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                            "Authenticating Hostkey of SFTP server");                    jsch.setKnownHosts(HostKey);                    session = jsch.getSession(Username, HostName, Portno);                } else {                    session.setConfig("StrictHostKeyChecking", "no");                }                session.setTimeout(15000);                // Connecting to SFTP                Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                        "Authenticating to SFTP");                session.connect();                Channel channel = session.openChannel("sftp");                channel.connect();                ChannelSftp sftpChannel = (ChannelSftp) channel;                sftpChannel.cd(Directory);                Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                        "Connection to SFTP location Successful");                          // Check if file exists                Boolean fileExists = false;                try {                Vector<ChannelSftp.LsEntry> list = sftpChannel.ls(Filename);                Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                        "File listing is Successful");   
                       for (ChannelSftp.LsEntry entry : list) {                    Filename = entry.getFilename();                    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                    "File found is : " + Filename );                    fileExists = true;                    continue;                }                          } catch (Exception ex) {                    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                    "File listing is Unsuccessful");                }                          Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                "File avaialbale :" + fileExists );                          if (fileExists == true)                          {                    Vector<ChannelSftp.LsEntry> list = sftpChannel.ls(Filename);                    Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                            "File listing is Successful");                              // Read file                    try {                        for (ChannelSftp.LsEntry entry : list) {                            bis = new BufferedInputStream(sftpChannel.get(entry                                    .getFilename()));                            Filename = entry.getFilename();                            continue;                        }                        // bis = new                        // BufferedInputStream(sftpChannel.get(Filename));                        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                                "File read sucessfully : " + Directory + "/"                                        + Filename);                        // If file is available set message property filename                        MessagePropertyKey msz = new MessagePropertyKey(                                "FileName", "http://sap.com/xi/XI/System/File");                        msg.setMessageProperty(msz, Filename);                    } catch (Exception e) {                        
Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                                "Module Exception caught:");                        // Set flag to not perform delete operation                        lv_del_flag = "X";                        Audit.addAuditLogEntry(amk, AuditLogStatus.WARNING,                                "No such file exist");                    }                }                try {                    XMLPayload xmlpayload = msg.getDocument();                    String sXML = convert(bis);                    byte[] docContent = sXML.getBytes();                    if (docContent != null) {                        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                                "Inside DocContent");                        xmlpayload.setContent(docContent);                        inputModuleData.setPrincipalData(msg);                    } else {                        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                                "Data not parsed");                    }                } catch (Exception e) {                    ModuleException me = new ModuleException(e);                    throw me;                }                if (fileExists == true) {                                  try {                                          if (Archive != null && Archive != "") {                        // Archive file                        String pathname = sftpChannel.realpath(Directory) ;                        pathname = pathname + "/" + Filename;                        String archname = sftpChannel.realpath(Archive) ;                        /* Write time stamp to file*/                        Date date = new Date(); // get the current date                        SimpleDateFormat dateFormatter = new SimpleDateFormat(                                "yyyyMMddHHmmssSSS"); // set the format for date                        String dfmt = dateFormatter.format(date);                        dfmt = dfmt + "_";            
            archname = archname + "/" + dfmt + Filename;                        //Filename = archname;                        sftpChannel.rename(pathname, archname);                        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                        "File Archived in " + archname);                                          }                        else{                        // delete file from server                        sftpChannel.rm(Filename);                        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                        "File " + Filename + " Deleted from " + Directory);                        }                    } catch (Exception e) {                        Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                                "Module Exception caught while deleting or archiving the file:" + Filename);                        ModuleException me = new ModuleException(e);                        throw me;                    }                }                sftpChannel.exit();                session.disconnect();            }        } catch (Exception e) {            Audit.addAuditLogEntry(amk, AuditLogStatus.SUCCESS,                    "Module Exception caught:");            ModuleException me = new ModuleException(e);            throw me;        }        return inputModuleData;    }    public String convert(InputStream is) {        char[] buff = new char[1024];        Writer stringWriter = new StringWriter();        try {            Reader bReader = new BufferedReader(new InputStreamReader(is,                    "UTF-8"));            int n;            while ((n = bReader.read(buff)) != -1) {                stringWriter.write(buff, 0, n);            }        } catch (Exception e) {            e.printStackTrace();        }        return stringWriter.toString();    }
}

 

The module parameter entries would like like -

 

 

And the communication channel logs -

 

 

 

References :

 

SFTP using custom adapter module Part1 - Receiver

http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0b39e65-981e-2b10-1c9c-fc3f8e6747fa?overridelayout=t…

JSch - Java Secure Channel - Examples

Viewing all 676 articles
Browse latest View live


<script src="https://jsc.adskeeper.com/r/s/rssing.com.1596347.js" async> </script>