Channel: SCN : Blog List - Process Integration (PI) & SOA Middleware

From Swing to Eclipse: My two months experience on NWDS 7.31


Update 9 Feb 2015: There is actually a setting in Preferences to trigger iFlow deployment upon activation

 

In my recent role, I have had the opportunity to work on PO 7.4 SP08, nearly the latest and greatest version of PO to date. My previous experience with NWDS was on version 7.11 CE, which was mainly used for development of Java mappings and custom adapter modules. As there have been strides of improvement on the NWDS front over the last few years, like Eclipse Tool for ESR in NW PI and Introducing iFlow in PI 7.31 Configuration, I decided to try it out to see whether I am able to fully perform all the development/configuration tasks in NWDS or whether I still need to use the traditional Swing clients.

 

Here is my experience after two months of working on NWDS 7.31.

 

 

Installation

To begin with, head over to the link below to download the correct version of NWDS. Note that the download location requires an S-user authentication.

NWDS Download Links - Java Development - SCN Wiki

 

Once the NWDS ZIP file has been downloaded, installation is rather straightforward as it just involves unzipping the file to any destination on the local machine. However, getting NWDS to work nicely (read: least buggy!) in subsequent usage can be a little bit tricky, and some of the problems can be traced back to the installation itself.

 

Here are some general tips on getting it right:

 

  • The general consensus is that the NWDS version, support package level and patch level should EXACTLY match the PI/PO server details. For those on PI/PO 7.4, SAP Note 1791485 mentions that the corresponding NWDS 7.31 version can be used. To get the correct version, note that the NWDS 7.31 support package level is exactly 5 higher than the PI/PO 7.4 one, i.e. for my PO 7.4 SP08 Patch 0000 system, I use NWDS 7.31 SP13 Patch 0000.
  • OS can be either 32-bit or 64-bit
  • As PI/PO 7.3x/7.4 is running on SAP JVM 6.1, a JDK 1.6 version is required on the local machine. I found that some functions did not work correctly (cannot open Message/Operation Mapping) when I used Sun/Oracle's JDK 1.6. Using SAP's own JVM 6 provides a more stable experience - the installation file can be downloaded from the download location; go to Installing the Developer Studio -> Prerequisites -> JDK Version

 

Once the installation issues have been sorted out, the experience is so much better and I can actually begin to work on it for my daily tasks.

 

 

Likes

 

  • The NWDS IDE is lightweight, responsive and running it is less resource/memory intensive than running the Swing ESR/ID clients. I am able to connect to both ESR and iFlow within the same IDE. However, it can only connect to a single server system at one time. The screenshot below shows that memory used by a PI-connected NWDS session is slightly less than combined usage of Swing clients for ESR and ID.

memory.png

  • It is a complete Eclipse experience when developing Java code in UDFs or Function Libraries. All the nice IDE features like always-on syntax check, inline JavaDoc reference, code helpers, etc, make for a pleasant Java development experience.
  • There is an additional wizard to create the message type when creating a service interface. Definitely nice to be able to quickly complete that step.
  • You gotta love iFlows! - it is definitely a more intuitive approach to configuring integration scenarios. My wishlist item would be an option to both activate and deploy an iFlow at the same time (I often forget to deploy the activated iFlow!) Update: I've since found out that there is an option to activate and deploy an iFlow at the same time via the Preferences section below.

activateiflow.png

  • The transport mechanism for iFlows is done pretty well. In particular, the export functionality provides a good view of the objects to be exported. I would have liked it though if there were an option to hide the deleted iFlow objects.

lib.png

 

 

Dislikes

The downside to my NWDS experience is mostly due to features from the Swing client that are missing. AFAIK, these are some of them, although it is possible that the features are there but just not that obvious to me!

 

  • No option to connect to multiple systems at the same time. Therefore, there is no way to do eye-ball comparison of the same object in different systems (i.e. Dev vs QA).
  • No transport system for ESR objects. Export and import still needs to be performed from the Swing client.
  • No copy object functionality for ESR objects.
  • The Data Type editor has only basic functionality - no function to import/export XSD, no way to restructure fields (move fields up/down).
  • Functionality to import external definition does not work well - I couldn't get it to work importing an external WSDL.
  • No option to view history of an object.
  • No functionality to view details of an imported archive.
  • No functionality to download the XML version of a communication channel.
  • Unable to add an SAP AS Java entry in Preferences for deployment of Modules and BPM. I have gone through SCN threads and even the first point in the NWDS Troubleshooting wiki but am still unable to add an entry. The weird thing though is that after adding one, although it does not appear in the list, I can still actually deploy to that system. So it looks like it is retained in memory for the current NWDS session; once I close the program, it is no longer there.

 

The features (or lack of them!) around Message Mapping deserve a section of their own!

 

  • The one feature I missed most from the Swing client is the "Display Queue" function in Message Mapping. IMHO, a solid understanding of graphical mapping's queues and contexts and the ability to display queue at any point in the mapping logic are key ingredients to developing robust and accurate mapping logic. It is the single feature that causes me to switch back to the ESR swing client most.
  • Another feature that I miss is the ability to check the consistency of the source input. This can be easily done via the "Test" functionality of the Swing client. By copying the sample in Source Text View and switching back to Tabular Tree View, the little green or red icons help to immediately flag any inconsistencies. This is very useful especially in cases with tricky namespaces or typo errors.
Match: good.png
Mismatch: err.png
  • The feature to copy the logic of an entire mapping flow is also not available in NWDS. This is an often-used feature, as there are a lot of similarities between the mappings for different target fields, and it helps to reduce the effort of rebuilding the logic for each target field.

copy.png

  • Also missing is the ability to ungroup a target field from a grouped mapping flow. Sometimes, after the mapping flow has been developed for a group of target fields, it might be necessary to ungroup one of the fields, and yet retain the logic for that field. The only option in NWDS now is to delete the target field from the grouped mapping but all the associated logic for that target field is also deleted.

ungroup.png

  • Although writing the code for a UDF or Function Library is better due to the Eclipse-based editor, there is no wizard to create the input/output parameters for the UDF. Creating UDF for Message Mapping Using NWDS is a manual process (and prone to error) as the input arguments and parameters have to be declared via annotations. The correct import statements also need to be manually included in the code (see the skeleton below).
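For illustration, here is a minimal skeleton of such a manually annotated Function Library method. It mirrors the annotated pattern used for NWDS-based UDFs; the annotation import packages shown are my assumption and may differ slightly by release, and the method itself is a trivial placeholder.

import com.sap.aii.mapping.api.StreamTransformationException;
import com.sap.aii.mappingtool.tf7.rt.Container;
import com.sap.ide.esr.tools.mapping.core.Argument;
import com.sap.ide.esr.tools.mapping.core.ExecutionType;
import com.sap.ide.esr.tools.mapping.core.LibraryMethod;
import com.sap.ide.esr.tools.mapping.core.Parameter;

public class DemoFunctionLibrary {

    // The wizard from the Swing client is missing, so the argument/parameter
    // metadata is declared by hand via annotations.
    @LibraryMethod(title = "concatWithPrefix",
                   description = "Concatenates a configurable prefix to the input value",
                   category = "FL_Demo", type = ExecutionType.SINGLE_VALUE)
    public String concatWithPrefix(
            @Argument(title = "Input value") String input,
            @Parameter(title = "Prefix") String prefix,
            Container container) throws StreamTransformationException {
        return prefix + input;
    }
}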

 

 

Bugs

Although the experience has been relatively stable, there are occasional bugs. One of them is that when trying to make a change, NWDS does not prompt for a new Change List, so I can't save my changes. The workaround is to quit and restart NWDS.

 

Another bug that I've come across is when using the Condition Editor for an iFlow. If there are any unintentional white spaces in the XPath condition, the generated condition on the iFlow is incorrect (it has an additional XPath prefix in front) and does not work. It took me a while to nail down why the condition did not work, only to find out that it was because of a little white space!

Good Condition

good_xpath.png

good_cond.png

Bad Condition

bad_xpath.png

bad_cond.png

 

 

Conclusion

iFlow seems like a complete experience, although the same cannot be said about the ESR perspective, especially with regards to Message Mapping. All in all, I do enjoy developing on NWDS. I do know that some of the missing features are scheduled in the roadmap and I'm looking forward to those and more enhancements in the future (maybe someone from SAP Labs might come across this and help to push things along.) I've certainly embraced it as my primary IDE for developing in PI and BPM, but I'm still far from dropping the Swing clients completely.


WebSphere MQ JMS adapter configurations for SAP PI




Business requirements sometimes call for integration with other EAI technologies, or clients may use other EAI tools (like IIB9). In such cases we may get the chance to work on inbound or outbound scenarios between queues (MQ) and SAP.


In some cases IBM mainframe systems need to exchange data from/to SAP. The client might then use another EAI technology to place/get the data from the host (mainframe system) to IIB (MQ), and PI needs to pick the data from the queues (MQ) and route it to SAP. In such cases we need to configure the JMS adapter.


Prerequisites:

1) Before starting scenario development, we need to deploy the respective JAR files on the PI server using NWDS (as part of the com.sap.aii.adapter.lib.sda SDA file).

For more information on deploying JAR files, please go through the link below:

http://scn.sap.com/community/netweaver-administrator/blog/2014/01/28/nwds-step-by-step-in-the-loving-memory-of-sdm


Below are Receiver JMS adapter Configurations:


RCVR.jpg


Processing Tab:

 

Process.jpg

 

Sender JMS adapter:

 

SN.jpg

 

Processing Tab:

 

PC.jpg

 

Common Errors & Solutions while working with the JMS adapter:

 

1) Error: A channel error occurred. Detailed error (if any): java.lang.NoClassDefFoundError: com/ibm/mq/jms/MQQueueConnectionFactory: cannot initialize class because prior initialization attempt failed at java.lang.Class.forName0(Native Method)

Solution: SAP NetWeaver Administrator → Java System Properties → Applications, select the com.sap.aii.adapter.jms.app application and modify the property value.

For client library MQ version 7.5, set the "preloadClasses" property as: com.ibm.mq.MQEnvironment,com.ibm.mq.internal.MQCommonServices,com.ibm.mq.jms.MQQueueConnectionFactory,com.ibm.mq.jms.MQTopicConnectionFactory.

2) Error: Error creating JMS connection. The JMS provider gave the error message as JMSWMQ2013: The security authentication was not valid that was supplied for QueueManager 'SAPPIQM' with connection mode 'Client' and host name 'xx.xx.xx.xx (IP)'. JMSCMQ0001: WebSphere MQ call failed with compcode '2' (MQCC_FAILED) reason '2035' ('MQRC_NOT_AUTHORIZED'), and the error code as JMSWMQ2013.

Solution: Execute the following commands wherever the MQ instance is located.

Open a command prompt (run as administrator) and execute:

runmqsc LOGGER_QM_DB(QUEUE Manager name)

ALTER QMGR CHLAUTH(DISABLED)

3) Error: Transmitting the message to endpoint <local> using connection File_http://sap.com/xi/XI/System failed, due to: com.sap.engine.interfaces.messaging.api.exception.MessagingException: com.sap.aii.adapter.jms.core.fsm.DFA$InvalidTransitionException: No transition found from state: ERROR, on event: process_commence for DFA: CC_RCVR_F2J:5e9985d8d2cb3f75a5ba830e6b51b1ac

Solution: Same as above.

Adapter Module Development. Set QoS EOIO with Queue Name based on XPath expression


This adapter module allows you to set EOIO QoS and the Queue ID in a sender channel (even if it was configured as EO). The Queue ID can be set as a fixed value, or as a value from the payload (using XPath).

The module has two parameters: QueueName and XPath.

SenderChannel.PNG

The QueueName parameter is mandatory. Even if you fill the XPath parameter, you should set QueueName (with any non-empty value).

XPath example for PI message:

For example, we have this message:

<ns0:MT_1 xmlns:ns0='someuri'>
  <item>
    <ThisIsWillBeUsedForQueueName>1</ThisIsWillBeUsedForQueueName>
  </item>
</ns0:MT_1>

Xpath is: //item/ThisIsWillBeUsedForQueueName

In the message Data log you will see (even though it is an EO sender channel):

    Quality of Service Exactly Once in Order

    Sequence ID : <value from your QueueName or XPath result>

The code of the module is in the attachment (this is for PI 7.4; for other releases there may be some differences). A rough sketch of the core logic is shown below.
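Since the attachment itself is not reproduced here, the following is only a rough sketch of what the core process() logic of such a module can look like, not the attached original: the EJB plumbing is omitted, the XPath handling is kept minimal, and the setter names on the messaging API (setDeliverySemantics, setSequenceId) are what I understand the public com.sap.engine.interfaces.messaging.api.Message interface to offer.

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;

import com.sap.aii.af.lib.mp.module.Module;
import com.sap.aii.af.lib.mp.module.ModuleContext;
import com.sap.aii.af.lib.mp.module.ModuleData;
import com.sap.aii.af.lib.mp.module.ModuleException;
import com.sap.engine.interfaces.messaging.api.DeliverySemantics;
import com.sap.engine.interfaces.messaging.api.Message;
import com.sap.engine.interfaces.messaging.api.XMLPayload;

public class SetEoioQueueBean implements Module {

    public ModuleData process(ModuleContext ctx, ModuleData data) throws ModuleException {
        try {
            Message msg = (Message) data.getPrincipalData();

            // Module parameters: QueueName is mandatory, XPath is optional.
            String queueName = ctx.getContextData("QueueName");
            String xpathExpr = ctx.getContextData("XPath");

            if (xpathExpr != null && xpathExpr.trim().length() > 0) {
                // Evaluate the XPath against the main payload and use the result as the queue id.
                XMLPayload payload = msg.getDocument();
                Document dom = DocumentBuilderFactory.newInstance()
                        .newDocumentBuilder().parse(payload.getInputStream());
                XPath xpath = XPathFactory.newInstance().newXPath();
                String value = xpath.evaluate(xpathExpr, dom);
                if (value != null && value.length() > 0) {
                    queueName = value;
                }
            }

            // Switch the message to EOIO with the determined sequence id.
            msg.setDeliverySemantics(DeliverySemantics.ExactlyOnceInOrder);
            msg.setSequenceId(queueName);

            data.setPrincipalData(msg);
            return data;
        } catch (Exception e) {
            throw new ModuleException(e.getMessage(), e);
        }
    }
}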

Useful links to create Module:

http://scn.sap.com/community/pi-and-soa-middleware/blog/2014/08/08/pi-74--adapter-module-creation-using-ejb-30

And check comments

Note: even if you select the XPI Libraries, you can't add Audit logging, because one important file, com.sap.aii.af.svc_api.jar, is missing from those libraries.

You can get it from the PI server. A very good blog for this: http://scn.sap.com/community/pi-and-soa-middleware/blog/2012/10/19/how-to-locate-java-library-resource-during-java-development-in-pi

Another useful link: http://wiki.scn.sap.com/wiki/display/NWTech/Custom+Adapter+Module+Development+-+SAP+PI+7.1

There I found how to get the payload.

Have fun

PS. If you know how to optimize it or do it better, please write in the comments. Thanks.

PPS. Another blog about this: http://scn.sap.com/community/pi-and-soa-middleware/blog/2015/01/29/create-sap-pi-adapter-modules-in-ejb-30-standard

How to create Java Mapping in SAP PI / PO


This blog will complement other blogs on 'Java Mapping' in SCN.

 

 

Sample Java Mapping used in the video: -

package com.map;
import java.io.*;
import com.sap.aii.mapping.api.*;
public class Test_JavaMapping extends AbstractTransformation {

    @Override
    public void transform(TransformationInput transformationInput, TransformationOutput transformationOutput)
            throws StreamTransformationException {
        try {
            InputStream inputstream = transformationInput.getInputPayload().getInputStream();
            OutputStream outputstream = transformationOutput.getOutputPayload().getOutputStream();
            // Copy input content to output content
            byte[] b = new byte[inputstream.available()];
            inputstream.read(b);
            outputstream.write("Prefixing this line to input. Test_JavaMapping. \r\n".getBytes());
            outputstream.write(b);
        } catch (Exception exception) {
            getTrace().addDebugMessage(exception.getMessage());
            throw new StreamTransformationException(exception.toString());
        }
    }
}

Sample text file used in the video:-

Name,ID,Phone
AAAA,11,123456789
世界,22,987654321
BBBB,33,777777777

Helpful links: -

NWDS Download Links - Java Development - SCN Wiki

SAP Support Portal S user id is required to download NWDS.

Not well-formed XML - & issue

JAVA Mapping - Managing Services in the Enterprise Services Repository - SAP Library

PI Mapping and Lookup API - SAP Javadocs

Trail: Java API for XML Processing (JAXP) (The Java Tutorials)

Dynamic file name for pass-through scenario - Process Integration - SCN Wiki

Multi-Mapping using Java Mapping - Process Integration - SCN Wiki

Reading Binary File Using ICO

Java Mapping in Exchange Infrastructure (download the mwv file).

Adapter module dynamic Queue Name, n xml elements


This is an easy way to configure a sender channel to assign a dynamic name to the queue; you can use n elements of the XML payload to assemble the queue name.

 

The module is an EJB (SAP EJB J2EE 1.4 Project); see document [1], which is a very good guide. There is one small mistake in it: the type of the EJB should be stateless, while the image wrongly shows stateful (take care).

 

 

The parameters the module needs with this solution:

 

ElementPath1: Xpath of node

ElementPath2: Xpath of node

...

ElementPath(n): Xpath of node

 

Separator: character for separating the elements, use one like "-"

 

QName: starting part of the queue name, something like ORD, MAT or PORD

 

This is the configuration:

 

2015-02-10 10_57_50-SAP Process Integration Designer - SoapSenderEOIOPedidos - SAP NetWeaver Develop.png

 

The parameters:

 

This is a sample of Monitor view (pimon):

 

2015-02-10 11_09_53-Message Monitor - SAP NetWeaver Administrator.png

 

 

Use this code for the module:

------------------------------------------------------

GetQueueBean.java.html (Attach file)

-----------------------------------------------------
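The attached GetQueueBean code is not included in this extract, so here is a small hypothetical helper that only shows the idea of assembling the queue name from the parameters described above (QName, Separator, ElementPath1..n). The XPath evaluation and the surrounding module code are assumed to follow the standard adapter module pattern from document [1].

import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;

import com.sap.aii.af.lib.mp.module.ModuleContext;

public class QueueNameBuilder {

    public static String buildQueueName(ModuleContext ctx, Document payload) throws Exception {
        XPath xpath = XPathFactory.newInstance().newXPath();
        StringBuilder name = new StringBuilder(ctx.getContextData("QName"));
        String separator = ctx.getContextData("Separator");

        // Read ElementPath1, ElementPath2, ... until no further parameter is configured.
        for (int i = 1; ; i++) {
            String expr = ctx.getContextData("ElementPath" + i);
            if (expr == null || expr.trim().length() == 0) {
                break;
            }
            name.append(separator).append(xpath.evaluate(expr, payload));
        }
        return name.toString();
    }
}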

 

have a nice day

 

Reference URL

[1]

How to Create Modules for the JEE Adapter Engine

 

[2]

How XPath Works (The Java™ Tutorials > Java API for XML Processing (JAXP) > E…

Deploying SQL JDBC driver in PI through NWDS


I am writing this blog as we faced multiple issues while deploying JDBC driver in our PI system through SUM and finally NWDS made it easy. I hope it will be useful for others as well.

 

Our requirement was to connect our PI system to SQL 2012 DB used for staging tables. Major steps involved in this are- Downloading JDBC driver from Microsoft site, Bundling SCA file and Deploying through NWDS.

 

1. Download sqljdbc4.jar from Microsoft.

http://www.microsoft.com/en-in/download/details.aspx?id=11774

 

2. Download  SAPXI3RDPARTY07_0-10008107.SCA  (SCA file for NW 7.3) from SAP  service market place, depending on your PI system version.

TIP : Use ‘Search for Software’ in Support packages and patches.

 

3. Rename SAPXI3RDPARTY07_0-10008107.SCA to SAPXI3RDPARTY07_0-10008107.zip.

 

4. Open with winzip

S1.jpg

 

Double click on DEPLOYARCHIVES folder.

S2.jpg

Double click on com.sap.aii.adapter.lib.sda and again open with winzip.

S3.jpg

Click on lib and then Add

S5.jpg

Add sqljdbc4.jar downloaded from Microsoft site

S6.jpg

S7.jpg

Click on OK

 

Now click on server

S8.jpg

Right click on  provider.xml and edit

S9.jpg

 

Add the sqljdbc4 jar entry in provider.xml as per screen shot.

S11.jpg

Save this file and close WinZip (where you had opened com.sap.aii.adapter.lib.sda); it will ask to save the changes made. Click on YES and then OK.

S12.jpg

S13.jpg

 

5. Rename SAPXI3RDPARTY07_0-10008107.zip to SAPXI3RDPARTY07_0-10008107.SCA

 

6. Now open NWDS.

 

Click on Window and then on Preferences.

S14.jpg

 

Click on SAP AS Java and click on Add

S15.jpg

 

Add SAP PI system by providing below details-

 

S16.jpg

Click OK.

It may take 1 to 5 minutes to add the system in NWDS; it will ask for the administrator user and password. Once added, close the Preferences window.

 

Now click on Window, then Open Perspective and Other.

 

S18.jpg

S19.jpg

 

Select deployment and click on OK.

 

S20.jpg

 

Click on import.

 

Select File System and click on Finish.

S21.jpg

 

Select SCA file which we created and click on Open.

S22.jpg

 

7. Select File and click on Start

S23.jpg

 

Once imported successfully, it will give Import successful message.

 

-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------

 

You may face the below errors or issues while deploying the SCA file through NWDS.

 

1. While adding PI system in NWDS, it is asking for operating system user and password

- Refer SAP Note#1834579 and make changes accordingly.

 

 

2. Got error as JVM vendor is not supported.

Er1.jpg

- You can ignore this error, or install JDK 1.6 patch 26 or above. If you install it, restart NWDS and select javaw.exe from the proper path.

 

 

3. If you are using Windows 7 64bit and get error ‘JVM terminated. Exit code =-1’ on starting NWDS, as below-

Er2.jpg

 

- Refer SAP note#1566977 - NWDS on Windows 7 – Solution is to install 32bit JDK.

 

 

4. Deployment failed with error : [ERROR CODE DPL.DCAPI.1031] AllItemsAlreadyDeployedValidationException. Reason: ASJ.dpl_dc.003456 All batch items are marked as AlreadyDeployed because of Version check.

 

- In NWDS, before starting deployment of the file, click on Settings and change the setting below; it forces deployment, like what we use while deploying through Telnet/SUM.

 

Er3.jpg

 

 

Important SAP notes and KBA on this topic-

 

SAP KBA 1829286 - PI: How to check which drivers are installed for the JMS and JDBC Adapters

SAP KBA 1770206 PI: How to extract the archive for deploying 3rd party drivers

SAP KBA 1770304 PI: How to prepare the com.sap.aii.adapter.lib.sda for deployment

SAP KBA 1770384 Cannot deploy the archive com.sap.aii.adapter.lib.sda to PI

SAP KBA 1816456 PI: Deploying new JMS/JDBC drivers overwrites existing deployed drivers

SAP note 1138877 How to Deploy External Drivers JDBC/JMS Adapters

Time zone conversion in PI mapping


Introduction

I had a requirement recently that involved handling of time zone conversion between the sender and the receiver systems. I had hoped that this was achievable by the standard mapping functions in PI, however there were certain limitations. Searching on SCN did not yield any suitable solutions, so I turned to good old Mr Google. I soon found out that Joda-Time was the de facto standard for handling date/time in Java prior to Java SE 8, as the date/time classes before that were considered poor.

 

After a bit of fiddling with the Joda-Time classes in a mapping UDF, I finally got it to work according to my requirement. In the section below, I will detail the solution I came up with as well as the options available when dealing with time zone conversion in PI.

 

 

Option 1 - Converting to local time zone

For conversions from other time zones to the local time zone (on the PI server), this can be achieved by the standard DateTrans function of the Date group. The input/output date formats in DateTrans are based on Java's SimpleDateFormat, which caters for time zone related letters (z, Z, X) in the pattern.

 

In the following example, the input timestamp in Pacific Standard Time (PST, stated in general time zone format z) is converted to a timestamp in the local server time zone GMT+8 (+0800, stated in RFC 822 time zone format Z).

datetrans.png

 

 

Option 2 - Converting from any time zone to any other time zone

Since option 1 does not cater for conversions to time zones other than the local time zone, it had to be handled by a custom mapping UDF. Following is the source code of the UDF.

 

Prerequisite:

Download the library files for Joda-Time from Joda-Time's repository.

Import JAR file into an Imported Archive object.

Add Imported Archive in Archives Used section of Message Mapping or Function Library.

 

Import statements for UDF

import org.joda.time.DateTime;
import org.joda.time.DateTimeZone;
import org.joda.time.format.DateTimeFormat;
import org.joda.time.format.DateTimeFormatter;

 

Public method with Execution Type = 'Single Values', having 1 String input argument and 3 configurable parameters.

@LibraryMethod(title="convertTimeZone", description="Convert input time from one timezone to another",
               category="FL_DateTime", type=ExecutionType.SINGLE_VALUE)
public String convertTimeZone(
        @Argument(title="Input timestamp") String timestamp,
        @Parameter(title="Timestamp format") String format,
        @Parameter(title="From timezone") String fromTZ,
        @Parameter(title="To timezone") String toTZ,
        Container container) throws StreamTransformationException {
    // ----------------------------------------------------------
    // Convert input time from one timezone to another
    // - utilizes Joda-Time libraries
    // ----------------------------------------------------------
    // timestamp - input timestamp
    // format    - pattern format of input and output timestamp, similar to the allowed patterns in
    //             http://docs.oracle.com/javase/7/docs/api/java/text/SimpleDateFormat.html
    // fromTZ    - from timezone
    // toTZ      - to timezone
    //           - expected timezone in long format, timezone IDs listed on the following site:
    //             http://en.wikipedia.org/wiki/List_of_tz_database_time_zones
    // ----------------------------------------------------------
    DateTimeFormatter formatter = DateTimeFormat.forPattern(format);
    DateTimeZone originalTZ = DateTimeZone.forID(fromTZ);
    DateTime fromDateTime = new DateTime(DateTime.parse(timestamp, formatter), originalTZ);
    DateTime toDateTime = fromDateTime.withZone(DateTimeZone.forID(toTZ));
    return formatter.print(toDateTime);
}

 

The UDF expects the from/to time zone in long ID format (available in the TZ column of the Wikipedia site listed in the Reference section.)

 

In the following example, the input timestamp is converted from the Asia/Kuala_Lumpur time zone to the timestamp in UTC.

udf.png

 

 

Reference

Joda-Time

SimpleDateFormat

Wikipedia list of time zones


Idoc adapter in PI 7.4


a)          Configurations on the PI side - NWA.

 

Open the NWA page (http://<host>:<port>/nwa)

1) Click on:     Configuration > Infrastructure > Application Resources

 

Imagen1.png

 

Filter by key name in column Resource Name: ‘inboundRA’:

 

Imagen2.png

 

Here we have to select: Resource Name: “inboundRA” – Resource Type: “Resource Adapter” and click on Properties. Next, configure these parameters:

  • GatewayService: set in accordance to PI configuration (see trx code: SMGW).
  • GatewayServer: set in accordance to PI configuration (see trx code: SMGW).
  • MaxReaderThreadCount: 5
  • ProgramID: “XI_IDOC_DEFAULT_PID” where “PID” refers to PI system ID. Set in accordance to local configuration. This Program ID name will be used in ECC system when we create RFC dest. type “T”.
  • DestinationName: XI_IDOC_DEFAULT_DESTINATION
  • Local: false
  • BindingKey: PI_AAE_IDOC

  Remaining values will be kept as they were.

 

_Imagen3.png

 

2) Again from NWA, create a destination pointing to ECC using the name XI_IDOC_DEFAULT_DESTINATION

 

Imagen7.png

 

Set the Connection Parameters and Logon Data for ECC system where Idocs will be sent. Do a connection test.

 

Imagen8.png

 

3) We also have to create another destination identical to the previous one (XI_IDOC_DEFAULT_DESTINATION), but in this case we add the R3 System ID at the end of the name. For example: XI_IDOC_DEFAULT_DESTINATION_ECC

Note: This destination will be used to transport the Idoc metadata.

 

b)          Configurations in Sender System – R3 side.

 

 

1)          Create RFC Destination Type T: (Client Independent. Use Customizing client)

  1. Go to trx. code SM59
  2. Create a new RFC Dest. Use a name that refers to PI AAE, for example: “PIDCLNT001_AAE”.
  3. Select radio button “Registered Server Program”
  4. In Program ID, enter the same Program ID name already configured in the resource adapter inboundRA in NWA.
  5. Enter the "Gateway Host" and "Gateway Service" from the PI server.
  6. In the "MDP Unicode" tab, activate the Unicode radio button.
  7. Make a connection test to check connectivity.

   

Imagen9.png

 

Imagen10.png

 

2)          Create Port: (Client Dependent. Use the client where Idocs will be sent)

  1. Go to trx. code WE21.
  2. Display “Ports” and click on “Transactional RFC”
  3. Click on “Create”
  4. Specify a name and a description.
  5. In RFC destination, specify the RFC destination name created in the previous step.

   

 

3)          Create Logical System (Client Independent. Use Customizing client)

  1. Enter trx. code BD54
  2. Specify a name like <SID>CLNT<Client.Nr>, for example "PIDCLNT001" where PID is the PI System ID and "001" is the client number.
  3. Save.

   

 

4)          Create Partner Profile (Client Dependent. Use the client where Idocs will be sent)

  1. Enter trx. code: WE20
  2. Select Partner Type LS
  3. Click on button “Create”
  4. Specify Logical System Name created in previous step as “Partner No.”
  5. Create an Outbound Parameter.
    1. Select the desired “Message Type”
    2. Select the Receiver Port (Created on Step #2)
    3. Activate the option “Transfer IDoc Immediately”
    4. Configure credentials in the "Post Processing Permitted Agent" tab

 

 

Imagen11.png

 

Imagen12.png

 

c)          Scenario configuration in Integration Directory.

 

Create and configure an Integrated Configuration (ICO):

i)          Create a sender Communication Channel of type IDOC_AAE.

 

Imagen13.png

 

ii)          Create an ICO, choosing the Sender System and Service Interface name (IDoc name)

 

Imagen14.png

 

iii)          Select the Communication Channel previously created.

 

Imagen15.png

 

iv)          Finish the configuration by entering the Receiver System, Receiver Interfaces and destination Communication Channel in their respective tabs. Save & Activate.

Why I hate... I mean avoid using RFC in PI


Warning: This blog is essentially a rant and an opinion piece!

 

I have repeatedly mentioned my dislike of the RFC functionality, especially in terms of PI-related integration. RFC is a technology from the old days of SAP that amazingly (and unfortunately) is still surviving in these days of SOA, cloud and mobile!

 

Even SAP themselves have publicly come out to discourage the use of RFCs in the following guideline published years ago, but unfortunately the use of RFC is still widespread today.

SAP Guidelines for Best-Built Applications that Integrate with SAP Business Suite

SOA-WS-3. If access is needed to SAP application functionality that has not yet been service-enabled, SAP recommends wrapping remote function calls (RFCs) or BAPI® programming interfaces as web services. Direct access to RFCs or BAPIs is possible, but it is not encouraged.

 

In my humble opinion, it is possible to design and architect an integration landscape that is free from the usage of RFC in all its forms. However, this requires an architectural approach that considers integration holistically (factoring in all the parts - middleware and backend systems) instead of the silo approach of limiting integration to the sole responsibility of the middleware/PI/PO.

 

Here are my thoughts on the common usage of RFCs in PI-related integration and some possible alternatives.

 

 

RFC as a lookup

The introduction of the RFCLookup functionality in PI 7.1 has unfortunately contributed to the proliferation of RFC usage in PI, much to the detriment of the SOA guiding principles. A graphical SOAP Lookup functionality would have been more appropriate but until now there is no indication if we will ever have that.

 

As I mentioned before in this blog, the RFCLookup functionality is tolerable at best and buggy at worst!

 

The common use case for RFCLookup is for data enrichment in PI - enriching the contents from the source prior to delivery to the target system. The usage of RFCLookup implies that either the sender or the receiver is an SAP backend system. If we look at integration holistically, especially taking into consideration the debate of Integration Logic Vs Business Logic, it can be argued that data enrichment should be performed at the backend system - prior to sending for sender systems or after receiving for receiver systems. Sometimes, the argument for RFCLookup is to reduce ABAP development on the SAP backend system but this argument becomes invalid if a custom RFC-enabled function module needs to be developed to be used by the RFCLookup in PI. More often than not, this is true as commonly the data enrichment requirement is unlikely to be fulfilled by existing standard SAP RFCs.

 

Here are some options for performing data enrichment on the backend SAP systems.

BAdI, User Exits

If the interfaces used in the backend systems are standard SAP functionality (RFC, IDoc, Enterprise Services), data enrichment can be handled in the user exits or BAdIs.

 

Integration Framework

A more robust approach would involve having an integration framework in the backend system. Such frameworks provide capability for data enrichment/translation (fix values or value mapping), validation and logging. Below are some options of standard and custom frameworks:

 

Even if data enrichment lookups from PI cannot be avoided, it is still possible to perform a SOAP-based lookup, which provides a better logging mechanism (although it takes more effort). Below is a blog utilizing SOAP lookup via ABAP proxy.

Mapping-Lookup to ABAP-Proxy

 

 

RFC as an interface

As we can see from this example thread, there is no clue what happens after an RFC call is executed as there are no traces/logs in the called system. Even though the message status in PI is "delivered", it is hard to determine what happened. This makes troubleshooting and supporting such an interface difficult.


Instead of using RFC as an interface (either sender or receiver), there are other alternatives as listed below:

  • Exposing the RFC as a web service as mentioned by the guideline above - this uses the Web Service Runtime of the SAP backend system which provides better logging and error handling capabilities
  • Proxy interface - the SAP recommended outside-in approach for developing new custom interfaces based on ESR. The RFC function module can be called within the proxy logic
  • IDoc interface - If the RFC is a Business Object (BOR) function module, associated IDoc/ALE objects can be generated for it using BDBG. The IDoc basically just wraps around the RFC function module, and also provides better error handling capabilities

 

 

Conclusion

As mentioned in the arguments above, with some effort in design and architecture, it is possible to avoid the usage of RFC in PI-related integration. The end results are more robust interfaces and increased supportability. Wherever I go, I try my best to advocate such an approach. However, there are organizations where the usage of RFC is so deeply embedded into the architecture and design that it is hard to avoid it.

 

If you have other thoughts, whether for or against this, feel free to comment below - I'd love to hear your thoughts

Easy searching in your messages with SAP PI/PO


As a seasoned PI consultant, I wanted to create an archiving solution that would make a developer's work easier. Specifically, I wanted to create an archiving module that would store a lot of metadata on the file, while also adding a useful search function to it. This is how www.piarchiving.com was born; it is no longer being maintained because there are other solutions. Some of the newer user-defined message search modules also have different search criteria for text, and so does HANA.


I was talking with a friend about searching content and how easy it would be to implement your own search solution.

After taking a look at search APIs, such as Amazon's, and having long discussions with a friend, I came across this useful page: https://www.searchify.com/documentation/java-client

My friend mentioned that this would be very simple to use in a PI context for full-text searches, in case you wish to view and save certain documents.

 

 

After gathering my ideas, I created a module. It only takes one hour to learn and implement the code in an SAP PI module.

 

 

To get started, you need to add the API URL that should be used. The next step is to create and add the Index. After these steps, the content should be put into the text field and uploaded. A step-by-step explanation is shown in my video, with examples on how to find message IDs after the completion of keyword-based searches.
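To give an idea of how little client code this involves, here is a minimal sketch based on the Searchify Java client documentation linked above. The class names, index name, field name and API URL are assumptions taken from that documentation and should be checked against the current client.

import java.util.HashMap;
import java.util.Map;

import com.flaptor.indextank.apiclient.IndexTankClient;

public class SearchifyUploadSketch {

    public static void main(String[] args) throws Exception {
        // 1. Point the client at the private API URL from the Searchify dashboard (placeholder).
        IndexTankClient client = new IndexTankClient("http://:YOUR_API_KEY@xxxx.api.searchify.com");

        // 2. Open the index used for PI message content (placeholder name).
        IndexTankClient.Index index = client.getIndex("pi_messages");

        // 3. Put the payload text into a field and upload it under the PI message id,
        //    so that a later full-text search returns the message id as the hit.
        Map<String, String> fields = new HashMap<String, String>();
        fields.put("text", "<message payload as plain text>");
        index.addDocument("<PI message id>", fields);

        // 4. Searching is then a call to index.search(...) with the keyword;
        //    see the Searchify documentation for the exact query API.
    }
}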

 

 

This is an easy application for those who wish to search through all of their content using Searchify in a PI/PO context. It is good for scenarios where you don't have a HANA instance to save and search through messages.

 

It is also quite inexpensive, at only $59 per month, although that plan will not let you run many servers.

If this were to be used in a real scenario, there should be a focus on creating more stable code and an asynchronous process. There would obviously also be a need for a better front end.


For a more in-depth explanation, watch my video.


 

 

I posted the code for my module on my other blog: Searching SAP PI/PO message content with module development - SAP PI course

XML concepts/pre-requisites of XSLT mapping in SAP PI.



First let’s take a look at basic concepts:

1. What is XML-

It is the Extensible Markup Language. It is a platform-independent as well as manufacturer-independent data format.
It holds easy-to-read structured information, and there is a separation of information and semantics.
Moreover, it is a metalanguage for defining description languages.
XML defines a set of rules for encoding documents in a format which is both human readable and machine readable.

2. What is XSL-
It is a language for expressing style sheets: the Extensible Stylesheet Language.
It is a file that describes how to display an XML document of a given type.
This style sheet describes how a document (XML/HTML) should look on screen or in print.
Without it, there is no presentation for the data in XML.

3. What is XSL-FO-
It is another language for formatting XML data, and focuses on styles that work on both screen and print such as PDF.
This formatting allows you to create files that look fantastic on any screen or print neatly.
There is essentially no difference between XSL and XSL-FO.
The World Wide Web Consortium (W3C) expanded on XSL to create XSL-FO.

4. What is XSLT-
It stands for XSL Transformations and is the part of the XSL language that performs transformations.
XSLT defines the order/frequency of processing of the XML source document.
It has templates containing processing instructions.
It also has various features like variable declarations, repeat instructions, conditional checks etc.

5. X-Path-
It is a language for addressing specific parts of an XML document.
It models the XML document as a tree of nodes.
An XPath expression is the mechanism for navigating through and selecting nodes from XML documents (a small standalone example follows below).
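As a small standalone illustration (my own example, not from the SAP material), the following evaluates an XPath expression over an XML document with the standard Java XPath API:

import java.io.StringReader;

import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;

import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class XPathDemo {
    public static void main(String[] args) throws Exception {
        String xml = "<order><item><id>4711</id></item><item><id>4712</id></item></order>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new InputSource(new StringReader(xml)));

        XPath xpath = XPathFactory.newInstance().newXPath();
        // Select the id of the first item node: prints 4711
        System.out.println(xpath.evaluate("/order/item[1]/id", doc));
    }
}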

So to conclude, the relationship between the above components is: we use the XML language for the transfer of data because of its advantages mentioned above, and the XSL document is the reference using which XSLT performs the transformation and produces the final document.
It is just like when we execute a mapping in SAP PI: we take an input payload (say the XML), take a data type as reference (say the XSL), and produce the output using the mapping transformation (say XSLT).


We can separate content and output format using XML.
We can create numerous output documents using single source document by performing transformations.

Following are examples of applications for XML:


APPLICATIONS OF XML.png

Following is basic syntax 1 for XML:

SYNTAX 1 XML.png
Following is basic syntax 2 showing various elements in XML :


SYNTAX 2 XML.png

XML can also be viewed as tree structure.

 

XML TREE.png

The job of XSLT is to transform data from an XML file into another format like HTML/XHTML. Data is sent as XHTML (the format after the XSL transformation), and the web browser turns it into the required output to display (HTML).
XSL TRANSFORM.png

 


In the case of XSL-FO, we can use various formats which can be printed, such as PDF.

XSL FO TRANSFORM.png

SAP Business Connector:

SAP business connector.png

XML in SAP PI:

xml in SAP PI.png


Well-formed XML:
1. Must have XML declaration.
2. Root element should be defined.
3. All elements have start and end tag.
4. All attribute values are in inverted commas.
5. Element groups are correctly nested

 

Validation of XML:
1. All points mentioned above.
2. It must contain DTD/Schema.
3. Rules of the DTD/Schema are complied with.

DTD-

DTD stands for Document Type Definition. In a DTD, the main categories comprising the XML document are declared, e.g. elements, their attributes, ordering, nesting of elements etc.

Schemas-

An XML Schema has a namespace and its own DTD. It is more powerful than a DTD. It can describe anything from the most basic structures up to expanded, complex ones.

CSS-

CSS (Cascading Style Sheets) is designed around styling a document structured in a markup language (XML/HTML).
Cascading style sheets are used to style XML markup.

To conclude, these three are data type definitions. To transfer data, we use these structures, in which XML data is passed.

Advantages of XSL over CSS:
1. It is in XML syntax.
2. Sequence of source data can be changed.
3. Elements can be repeated any number of times in output.
4. Additional information can be included in output document.

DTD vs Schema:

DTD supports: element nesting, element repetition, attributes permitted, attribute types and standard values.

Not supported in DTD: special data types for elements and attributes; restrictions on the length and type of character strings.

Schema supports: simple and complex data types, type inheritance, element repetition, restrictions on the number and order of child elements.

Not supported in Schema: entity definitions for abbreviations.



XSLT mapping:


The most commonly heard term in SAP PI in this regard is XSLT mapping.
The different ways of achieving conversion in XI are:

  1. Message Mapping (Graphical Mapping using the Mapping Editor in XI).
  2. Java Mapping.
  3. ABAP Mapping.
  4. XSLT Mapping.

  Steps required for developing XSLT Mapping

  • Create a source data type and a target data type
  • Create Message types for the source and target data types.
  • Create Message Interfaces, including the Inbound Message Interface and the Outbound Message Interface.
  • XSLT Mapping does not require creation of Message mapping, so don’t create any Message mapping.
  • Create an .XSL file which converts source data type into target data type.
  • Zip that .xsl file and import it into Integration Repository under Imported Archives.
  • In Interface Mapping choose mapping program as XSL and specify this zip program. (Through search help you will get XSL Mapping programs that you imported under Imported Archives, select your corresponding XSL Program)
  • Test this mapping program by navigating to the Test tab (a simple way to test the stylesheet locally first is shown below).
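As referenced in the last step above, the stylesheet can also be tested locally before importing it, with a few lines of standard JAXP code; the file names here are placeholders.

import java.io.File;

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class LocalXsltTest {
    public static void main(String[] args) throws Exception {
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new File("mapping.xsl")));
        // Apply the stylesheet to a sample source message and write the target message.
        transformer.transform(new StreamSource(new File("source.xml")),
                              new StreamResult(new File("target.xml")));
    }
}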







Image courtesy: SAP e-learning hub (Introduction to XML).

How to create XSLT Mapping in SAP PI / PO


This blog will complement other blogs on 'XSLT Mapping' in SCN.

 


Direct link: How to create XSLT Mapping in SAP PI / PO - YouTube


     I recommend practicing how to use XSL elements like xsl:template, xsl:if, xsl:choose, xsl:for-each, xsl:output, xsl:sort and xsl:namespace-alias in SAP NWDS, Eclipse or NetBeans. Please search the internet for more examples.


Sample XSLT used in the video.

<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:output encoding="cp1252" />
  <xsl:template match="/">
    <xsl:copy-of select="." />
  </xsl:template>
</xsl:stylesheet>


Sample XML used in the video.

<?xml version="1.0" encoding="utf-8"?>
<hr>
   <Resources>
      <EmployeesRecords>
         <Record>
            <Name>AAA € </Name>
            <ID>111</ID>
            <Phone>1111111111</Phone>
         </Record>
         <Record>
            <Name>BBB</Name>
            <ID>222</ID>
            <Phone>2222222222</Phone>
         </Record>
      </EmployeesRecords>
      <ContractorsRecords>
         <Record>
            <Name>ZZZ</Name>
            <ID>999</ID>
            <Phone>9999999999</Phone>
         </Record>
      </ContractorsRecords>
   </Resources>
</hr>


Helpful Links: -

XML Path Language (XPath)

XSL Transformations (XSLT)

XSLT Mapping - SAP Library

Parameterized XSLT Mappings - SAP Library

Using the Lookup API in an XSLT Program - SAP Library

How to.. Import and use XSLT 2.0 mappings in SAP PI/PO

XSLT Mapping in Exchange Infrastructure (download the mwv file).

How did you learn SAP XI/PI/PO


I was talking to my little brother yesterday. He is a medical doctor and he is currently finishing his higher education. In this phase, he is working as a general practitioner - but he has supervision. So every day he has 30 minutes to talk about some of the experiences he is going through in this important period. This is quite a privilege when trying to master a craft. I don’t think that I had that option when I started learning SAP XI, 10 years ago.

kozzi-survey_on_clipboard_shows_very_good-306x424.jpg

 

I also went through a learning process. There were two standard SAP XI courses and some e-learning. I had to become an expert in my own field. That was a bit challenging. I was supposed to know how everything would work, and how to design and solve issues. I really learned a lot during that phase. One of my advantages was that I had a few colleagues I could call to discuss my strange questions. I had many questions, I did not know anything about design back then. Those conversations really helped me get started.

 

We all have our different learning paths that help us become who we want to be. There are a lot of things we must do to fully understand the concepts of SAP Integration.

 

I’m getting quite a lot of requests regarding new SAP PI/PO roles at the moment, from all over the globe. It must mean that there is a requirement for SAP Customers to get new PI developers. I can tell that customers aren’t keen on having novice SAP PI developers working for them. There are some concepts that you must understand before you can become an expert PI developer.

 

I’m looking into effective methods of improving various people’s skills in the area of PI/PO development, so they can get started faster. I would like to hear about your learning journey. What was important for you? What is the best path to follow in order to become an expert developer? By ensuring that new developers can receive this crucial information, we can make sure they speed up their learning process.

 

Another part of the survey is focused on how you currently work. Even though you have been working with SAP XI/PI/PO for many years, there are still new use cases and different ways to solve new issues. It is also possible that a new, improved feature has been introduced. Sharing your objectives with your colleagues and receiving feedback on your projects is always a good idea. I have also put some of this into the survey.

 

 

 

So please take the time to help with the survey, so I can learn how you managed to learn SAP Integration.

 

Java 8 Update 20->40 - Security Error prevents ESR/ID Opening


Introduction

 

So perhaps like me you couldn't help yourself and updated your local Java Runtime Environment to Update 40 of Java 8. That runtime is central to your loading of the development and configuration tools for PI. And just like me, you've probably tried really hard not to update the JRE, but for whatever reason some other dependent application on your computer required it and you had no choice.

 

This blog describes what happens when you do, due to changes to the Java security requirements as of Java 8 Update 20.

 

Symptom

 

Apart from the funky new blue colour of the Java logo, you have run into a problem where the ESR or ID would not open. The following screenshot is likely what you saw:

 

scnJavaError.jpg

 

You click on the link Security level settings in the Java Control Panel and find out that:


Starting with Java 8 Update 20, the Medium security level has been removed from the Java Control Panel. Only High and Very High levels are available.

 

Perhaps like me you smacked yourself in the head, thinking what a fantastic way to start the day.

 

Resolution

 

But don't fear! Launch the Java Control Panel.

 

scnJavaCP.jpg

 

Navigate to the Security tab and click on 'Edit Site List' - this allows you to add the URL from where you are launching your ESR and ID.

 

scnJavaFQDN.jpg

 

It is really important to ensure that you fully type in the address. Wildcard-prefixed addresses do not work, e.g. http://*.mydomain.com/ has no positive impact (which seemed OK in earlier versions with Medium security enabled).

 

Once done, relaunch the ESR and ID and then continue on your merry way.


How to create Java Mapping in SAP PI / PO - using DOM parser


This blog will complement other blogs on 'Java Mapping using DOM parser' in SCN.




Direct link: How to create Java Mapping in SAP PI - PO using DOM parser - YouTube


Attached text file has: -

1. XML used in Java class.

2. Java class with main method (to practice/debug the DOM parser; a minimal standalone sketch is shown after this list).

3. XML used in grouping example.

4. Java Mapping to group records.

5. XML used in sorting example.

6. Java Mapping to sort records.
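Since the attached file is not included in this extract, here is a minimal standalone sketch (my own, using a hypothetical XML structure and file name) of the kind of main-method class mentioned in point 2, parsing an XML document with the DOM parser and reading element values:

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class DomParserPractice {

    public static void main(String[] args) throws Exception {
        // Parse a sample file (placeholder name) into a DOM tree.
        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        Document doc = builder.parse("employees.xml");

        // Iterate over all <Record> elements and print the value of their <Name> child.
        NodeList records = doc.getElementsByTagName("Record");
        for (int i = 0; i < records.getLength(); i++) {
            Element record = (Element) records.item(i);
            String name = record.getElementsByTagName("Name").item(0).getTextContent();
            System.out.println(name);
        }
    }
}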


Helpful Links: -

W3C Document Object Model

Lesson: Document Object Model (The Java Tutorials Java API for XML Processing (JAXP))

org.w3c.dom (Java Platform SE 7 )

Parsing an XML file with DOM - Java Programming - YouTube

USEFUL UDFs


Dear SCN Users,

 

Collected some UDFs which will be used in most of the scenarios.

1. UDF to get the input filename and write the output file with the input filename

Capture.JPG

DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
// Key for the adapter-specific attribute "FileName" of the File adapter
DynamicConfigurationKey key = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
// Read the incoming file name, prefix it, and write it back so the receiver channel uses it
String filename = "output_" + conf.get(key);
conf.put(key, filename);
return filename;

2. UDF to add two timestamps (the second input is added to the first date as a number of days)

 

import java.text.DateFormat;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Calendar;
import java.util.Date;
import java.util.GregorianCalendar;

int date1 = var1; // first argument in UDF (date in yyyyMMdd format)
int date2 = var2; // second argument in UDF (number of days to add)
Date date = null;
String date1String = "" + date1;
DateFormat df = new SimpleDateFormat("yyyyMMdd");
try {
    date = df.parse(date1String);
} catch (ParseException pe) {
    pe.printStackTrace();
}
GregorianCalendar cal = new GregorianCalendar();
cal.setTime(date);
cal.add(Calendar.DAY_OF_YEAR, date2);
Date newDate = cal.getTime();
String newstring = new SimpleDateFormat("MM/dd/yyyy").format(newDate);
return newstring;

 

3. Getting the last 3 digits (if there are only 2 then only the last 2, and so on)

 

if(EmpId.length()>3)

{

EmpId=EmpId.substring(EmpId.length()-3,EmpId.length());

return EmpId;

}

else

return EmpId;

 

Capture.JPG

 

4. Padding the string to a particular number of characters (in this case I am making it 10 characters, adding 'x' to fill the spaces)

 

if(empName.length()<10)

{

for(int i=empName.length();i<10;i++)

{

empName=empName+"x";

}

return empName;

}

else

return empName;

 

Capture.JPG

 

5. UDF to get the last value in the queue/context

 

int i=EmpTime.length;

result.addValue(EmpTime[i-1]);

 

Capture.JPG

 

6. UDF to check for duplicate entries in the queue (in this case "True" is populated if the two entries match, otherwise "False")

 

if (EmpDetails.length == 2) {
    // use equals() to compare string contents (== would only compare object references)
    if (EmpName[0].equals(EmpName[1])) {
        result.addValue("True");
    } else {
        result.addValue("False");
    }
}

 

Capture.JPG

7. UDF to add a context change (in this case after every 2 values)

 

int count = 0;
int j = 0;
for (int i = 0; i < EmpGift.length; i++) {
    j++;
    // after every 2 values (except at the very end), emit a context change
    if (j == 2 && i < EmpGift.length - 1) {
        result.addValue(EmpGift[i]);
        result.addContextChange();
        count++;
        j = 0;
    } else {
        result.addValue(EmpGift[i]);
    }
}

 

Capture.JPG

 

8. UDF to sort elements in the queue

// Requires java.util.Arrays in the UDF imports; var1 is assumed to be a numeric input queue
Arrays.sort(var1);
for (int i = 0; i < var1.length; i++) {
    result.addValue("" + var1[i]);
}

 


 

9. Remove decimal point (this UDF can be used to remove other special characters as well)

String Out = EmpAllowances.replace(".", "");
return Out;

 


 

10. Rounding off to a fixed number of decimal places (here two; change the first argument of setScale for a different precision)

// Rounds half-up to 2 decimal places
java.math.BigDecimal bd = new java.math.BigDecimal(var1).setScale(2, java.math.BigDecimal.ROUND_HALF_UP);
String s = bd.toString();
return s;

 


 

 

 

 

 

 

Integration between SAP and SharePoint using Advantco REST Adapter


Requirement

This blog demonstrates how the Advantco REST Adapter can be used to retrieve files from or place files on a SharePoint server using Microsoft Azure integration.


REST (Representational State Transfer) is a software architecture style for designing client-server applications based on HTTP which is less complex than the SOAP (Simple Object Access Protocol) approach.


SharePoint is a web application framework and platform developed by Microsoft that integrates intranet, content management, and document management.

The Microsoft Azure Platform provides an API built on REST, HTTP, and XML that allows a developer to interact with the services provided by Microsoft Azure. Microsoft also provides a client-side managed class library which encapsulates the functions of interacting with the services.


The Advantco REST adapter for SAP PI enables integration between SAP and non-SAP backend systems: it can connect to, retrieve files from, or place files in any SharePoint folder using REST together with OAuth, an open standard for authorization.


Configuration Required

The major challenge in this integration is configuring the authentication to connect to the SharePoint site. This is solved by retrieving a refresh token, which the REST adapter uses to obtain the access token that is then used for the actual calls.

 

The following steps have to be performed:

1. Go to the URL: https://login.windows.net/<tenant_id>/oauth2/authorize?response_type=code&client_id=<client id>

2. Get the code from the URL: log in, then copy the resulting URL into a notepad and collect the <CODE> parameter from it.

3. Get the refresh token: now use the Advantco REST Client to retrieve the rest of the token information (see the sketch below for the underlying HTTP call).
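
For reference, the "get the refresh token" step is a standard OAuth 2.0 authorization-code exchange against the Azure AD token endpoint. The sketch below is not specific to the Advantco REST Client; it only illustrates the raw HTTP request behind that step, and all placeholder values (<tenant id>, <code>, <client id>, <client secret>, <redirect uri>, <App ID>) are assumptions to be replaced with your own values.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class AzureTokenRequest {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://login.windows.net/<tenant id>/oauth2/token");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        // Exchange the authorization code for access and refresh tokens
        String body = "grant_type=authorization_code"
                + "&code=" + URLEncoder.encode("<code>", "UTF-8")
                + "&client_id=" + URLEncoder.encode("<client id>", "UTF-8")
                + "&client_secret=" + URLEncoder.encode("<client secret>", "UTF-8")
                + "&redirect_uri=" + URLEncoder.encode("<redirect uri>", "UTF-8")
                + "&resource=" + URLEncoder.encode("<App ID>", "UTF-8");

        OutputStream os = conn.getOutputStream();
        os.write(body.getBytes("UTF-8"));
        os.close();

        // The JSON response contains the access_token and refresh_token fields
        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line);
        }
        in.close();
    }
}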


 

Channel Configuration

Use the above token to create the channel configuration:

  • Authorization Grant Type: Refresh Token
  • Client Id: <client id>
  • Client Secret: <client secret>
  • Token Endpoint: https://login.windows.net/<tenant id>/oauth2/token?client_id=<client id>&client_secret=<URL-encoded client secret>&resource=<App ID>
  • Refresh Token: <Refresh token obtained above>
  • Scope: (left empty)

 

 

 


 

Testing

A file present in the SharePoint folder is retrieved and brought into the PI system.

(Screenshots: the file that was used; the channel used; the message in SXMB_MONI; the resulting file in the SAP folder.)

 

Summary

Connecting to SharePoint can be complex because of the Microsoft authentication mechanism. The Advantco REST Adapter reduces the complex integration with Microsoft Azure/SharePoint to simple configuration steps without any custom development.


References

For more details on Advantco REST Adapter, please refer: http://scn.sap.com/community/pi-and-soa-middleware/blog/2011/11/08/rest-adapter-for-netweaver-sap-pi

For the Advantco Rest Adapter website, please refer: https://www.advantco.com/product/REST

Read Entire Input file data as-is in Single node and pass to Output - With Java Mapping in ESR


Hi All,

 

In one of our projects, we had a requirement to read the incoming .CSV file and pass its entire content as-is into one of the target structure nodes.

 

As in a normal scenario, I first decided to use FCC (File Content Conversion) to achieve this requirement. But when I started testing, I noticed that the entire CSV data cannot be read into one single target field. Instead, a new Record node is created for each line of the CSV file (whenever a newline is encountered in the source file, a new Record node is created; even placing an unlikely value for endSeparator did not help).

 

Thus with FCC, we were not able to achieve this.

 

As a next step I looked at other alternatives: either XSLT mapping or Java mapping. I decided to go for Java mapping over XSLT because it is lightweight and more efficient.

 

To start with Java mapping, we would normally have to import the relevant JAR files and have a local IDE (Eclipse or NWDS) available for development. Since there were some challenges in getting these, our development was stuck until I stumbled across the blog Write Java Mapping directly in ESR! written by Sunil Chandra.

 

I decided to extend the same logic to my case and wrote the Java logic in the Message Mapping Functions tab, under Attributes and Methods, as follows:

 

// The following code reads the flat file as-is from the source into the target node and maps constants to the rest of the target XML nodes:

 

 

 

public void transform(TransformationInput in, TransformationOutput out)
        throws StreamTransformationException {
    try {
        String source = "";
        String targetxml = "";
        String line = "";

        // Read the entire input payload line by line into a single string
        InputStream ins = in.getInputPayload().getInputStream();
        BufferedReader br = new BufferedReader(new InputStreamReader(ins));
        while ((line = br.readLine()) != null)
            source += line + '\n';
        br.close();

        String XML_START_TAG = "<?xml version=\"1.0\" encoding=\"UTF-8\"?><ns1:SendFile xmlns:ns1=\"http://test.com/\">";
        String XML_END_TAG = "</ns1:SendFile>";

        // Build the target XML, placing the complete file content into the FileContent node
        targetxml = XML_START_TAG +
                "<ns1:EM>" +
                "<ns1:Security>" +
                "<ns1:CompanyName>ABC</ns1:CompanyName>" +
                "<ns1:UserId>test</ns1:UserId>" +
                "</ns1:Security>" +
                "<ns1:Action>New</ns1:Action>" +
                "<ns1:File>" +
                "<ns1:FileContent>" + source + "</ns1:FileContent>" +
                "<ns1:FileType>CSV</ns1:FileType>" +
                "</ns1:File>" +
                "</ns1:EM>" +
                XML_END_TAG;

        out.getOutputPayload().getOutputStream().write(targetxml.getBytes());
    } catch (Exception e) {
        throw new StreamTransformationException(e.getMessage());
    }
}

 

//End of the code
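
One caveat worth adding: because the file content is concatenated directly into the target XML, any XML-reserved characters in the CSV (&, <, >) would produce malformed output. A small helper (the name escapeXml is hypothetical, not part of the original code) applied to source before building targetxml is one way to guard against this:

// Hypothetical helper to be added alongside the transform method;
// escapes reserved characters so the CSV content cannot break the target XML
private String escapeXml(String s) {
    return s.replace("&", "&amp;")
            .replace("<", "&lt;")
            .replace(">", "&gt;");
}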

 

 

Following are the screenshots attached for the Function Mapping and Testing in ESR:

 

1. Message Mapping - Functions Tab:


 

2. ESR Testing:


 

3. ESR Testing:



We are able to see & verify the required output in ESR Message Mapping Testing.

 

The rest of the configuration remains the same as in other scenarios. However, there is one important configuration setting to take care of: since we are not parsing the input file (no conversion from CSV to XML), we need to keep the sender SWCV empty in the configuration scenario, otherwise we might get the runtime error "Content is not allowed in prolog", as mentioned in the blog The Mystery of 'Content is not allowed in prolog'.

 

We use a dummy sender interface in this case, since we are not converting the CSV file into an XML structure but passing it as-is to the target.

 

Advantages of this approach:

1. No need to worry about getting the JAR files or having an IDE on your local system.

2. No need to do FCC conversion in the configuration.

3. The Java code can be tested instantly in the ESR message mapping test tab.

 

 

The same logic and steps can be further extended to suit other requirements as well.

 

This blog covers one of the possible ways to read the entire source file into a single node field. A few other blogs provide alternative solutions, such as: Read Input Text File as a Single Field. - Process Integration - SCN Wiki

Whole Payload to a XML field - Process Integration - SCN Wiki

 

Hope this blog is helpful and informative as one of the alternatives to achieve this requirement.

 

Please let me know your valuable feedback and suggestions.

 

Regards,

Azhar


dynamicSubstring - simplifying your substring logic


Introduction

A common mapping requirement that crops up every now and then is the requirement to extract a portion of data from a source field. This can be easily achieved with the standard function substring from the Text category. However, the substring function can only be configured with static start position and character count parameter values. If the input to the function is a dynamic content of variable length, it is possible to encounter the familiar "java.lang.StringIndexOutOfBoundsException: String index out of range" exception when the input is shorter than expected.

 

Below are some common approaches to handle such scenarios:

  • Create a custom substring UDF based on java.lang.String's substring method with arguments as input instead of parameters.
  • Create a mapping logic to determine length of input and reconstruct it prior to input of substring function.

 

In this blog, I will share an alternative approach using a simple UDF named dynamicSubstring. The usage and configuration are similar to the standard substring function; however, it dynamically limits the extraction boundaries to the length of the input value so that the substring operation does not trigger the out-of-range exception.

 

 

Source code

The UDF is a Single Values execution type UDF, having 1 String input argument and 2 configurable parameters.

 

@LibraryMethod(title="dynamicSubstring", description="Get substring of input", category="FL_TextPool", type=ExecutionType.SINGLE_VALUE)
public String dynamicSubstring(
        @Argument(title="Input String") String input,
        @Parameter(title="Start index") int start,
        @Parameter(title="Length of output") int length,
        Container container) throws StreamTransformationException {
    // ----------------------------------------------------------
    // Extracts the substring of an input String
    // ----------------------------------------------------------
    // input  - input string
    // start  - position index for start of string
    // length - length of substring
    // ----------------------------------------------------------
    int startPos = start;
    int endPos = start + length - 1;
    String output = "";
    if (startPos < 0) {
        // (1) Start position is before the start of the input, return empty string
        output = "";
    } else if (startPos >= 0 && startPos < input.length()) {
        if (endPos < input.length()) {
            // (2) Start and end positions are within the input, return the full requested substring
            output = input.substring(startPos, endPos + 1);
        } else if (endPos >= input.length()) {
            // (3) Start position is within the input but the end position is past its end, return from start till end of input
            output = input.substring(startPos, input.length());
        }
    } else if (startPos >= input.length()) {
        // (4) Start position is after the end of the input, return empty string
        output = "";
    }
    return output;
}
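
Once the UDF is added to a function library (FL_TextPool in the annotation above), it appears alongside the standard functions in message mapping and is used like the standard substring: connect the source field to the Input String argument and maintain the Start index and Length of output parameters.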

 

 

Testing Results

Scenario 1

Start position and end position are both less than the length of the input. This scenario can be handled by the standard function without any exception.


 

Scenario 2

The input value is now shorter than in scenario 1, such that the start position is less than the input length but the end position is beyond it. For the standard function this causes an exception. For dynamicSubstring, however, it just extracts from the start position until the end of the input.


 

Scenario 3

Finally, the input value has shortened significantly such that the start position is beyond the input length. Again this causes an exception with the standard function, whereas dynamicSubstring simply returns an empty string.


 

 

Conclusion

With dynamicSubstring, we can now simplify mapping logic that involves substring operations on content of variable length.
