Channel: SCN : Blog List - Process Integration (PI) & SOA Middleware

Webinar Series: Innovate and Monetize with SAP HANA Cloud Integration as an SAP Partner


I recently mentioned it here, but it's so true that it's worth repeating: cloud opportunities are up for grabs by everyone, and especially by SAP's ecosystem...

 

Not sure about it? Join us for a series of webinars between April 9 and May 8, and let's have a chat about SAP HANA Cloud Integration, SAP’s cloud integration middleware solution and core application service of SAP HANA Cloud Platform.

 

If you already belong to the SAP partner family and the developer community at large, or if you feel like becoming part of it, come over and learn from SAP product managers and early-adopter SAP partners about the potential of SAP HANA Cloud Integration and the opportunities it creates for you.

Cloud integration is the linchpin that can make continuity, simplicity and innovation happen for our customers as they journey into the cloud. Whether you are "just connecting" cloud and ground apps using SAP HANA Cloud Integration as an iPaaS, or building custom cloud apps and app extensions via SAP HANA Cloud Platform, there is an opportunity for you and a role for you to play, if you want it.

 

We are going to kick off this 5-webinar series with the big picture of SAP's cloud integration strategy, solution and roadmap; then we'll deep-dive into SAP HCI, looking under the bonnet of developing integration flows and connectivity adapters. Finally, right after ASUG|SAPPHIRENOW, we'll wrap up by exploring the opportunities for partners to innovate in the cloud with SAP HANA Cloud Integration and to monetize cloud integration solutions with SAP.

 

Still in two minds? Check out the agenda below and register here.

 

Without further ado, here's what we plan to talk about…

 

Webinar Agenda | Register for the series

 

Date | Webinar Summary
FIRST-DATE.jpg

Getting started with SAP HANA Cloud Integration

Cloud is the ticket to innovation and agility, but the ride can be bumpy if organizations address integration just as an afterthought.

Join this webinar to understand SAP's cloud integration strategy, solution, and roadmap to address cloud integration challenges.

 

Speakers: Vladimir Dinev & Katrin von Ahsen, SAP

When: Thu, April 9, 2015 | 16:00 CET | Duration: 45 minutes

4_16.png

SAP HANA Cloud Integration: Deep Dive in Process Integration Capabilities

As organizations journey into the cloud – public, private, or hybrid – it is all too easy to accidentally build application silos and fragment business processes.

Join this webinar to learn how SAP HANA Cloud Integration can help integrate processes and data for a smooth experience across cloud and ground applications and application extensions.


Speaker: Katrin von Ahsen, SAP

When: Fri, April 16, 2015 | 16:00 CET | Duration: 45 minutes

4_23.png

Developing Integration Flows for Process Integration with SAP HANA Cloud Integration

Do you integrate cloud and on-premises applications for a living? Do you want to build and monetize custom cloud apps and app extensions on SAP HANA Cloud Platform, too? Whatever your specialization, pre-packaged integration content shortens time-to-value for you and your customers.

Join this webinar to learn first-hand from SAP's partner QforIT how simple and pleasant it can be to create integration flows with SAP HANA Cloud Integration's native toolset.

 

Speakers: Udo Paltzer, SAP; Igor Mitrovic & Guido Koopmann, QforIT

When: Fri, April 23, 2015 | 16:00 CET | Duration: 45 minutes

FOUTH-DATE (1).jpg

Developing Connectivity Adapters and HCI based solutions

We all know the value that a well-built connectivity adapter brings to the table, don’t we? But have you ever imagined that developing connectivity adapters can be that easy – just a piece of cake?

If you are hungry to know more, join this webinar to learn first-hand from Itelligence, an SAP partner, about their experience in developing a Salesforce.com adapter with SAP HANA Cloud Integration's adapter SDK.

 

Speakers: Sujit Hemachandran, SAP; Nico Bredenbals, itelligence

When: Thu, April 30, 2015 | 16:00 CET | Duration: 45 minutes

FIFTH-DATE.jpg

Business Opportunities with SAP HANA Cloud Integration for SAP Partners

Cloud is big, and growing. Cloud opportunities are up for grabs by everyone, and specifically by SAP's ecosystem...

Join this webinar to explore how you can innovate in the cloud with SAP HANA Cloud Integration and to monetize your cloud integration solutions with SAP.


Speakers: Vladimir Dinev & Katrin von Ahsen, SAP

When: Fri, May 8, 2015 | 16:00 CET | Duration: 45 minutes

Register for the series

 

 

Webinar Crew:

 

Speakers
KatrinVonAhsen.jpgKatrin von Ahsen, SAP, is a member of the product management team for SAP HANA Cloud Integration (HCI) and SAP Process Orchestration (PO) at SAP SE. She is responsible for the trial program for HCI and takes care of partner-related activities for HCI and PO. Prior to this position she worked in various roles, always in the context of SAP integration technologies (e.g. SAP Consulting, Solution Management, Enterprise SOA Program), and is therefore a domain expert for SAP integration technology products. She has been with SAP SE for more than 14 years.
VladimirDinev.jpgVladimir Dinev, SAP, is with Product Management in the Process & Network Integration unit at SAP UK Ltd. He focuses on cloud topics and specializes in cloud integration. Vladimir has been with SAP for 10 years in a variety of roles, where he has worked on go-to-market and forward-looking topics and has built expertise in different domains, including integration, business process management, and enterprise architecture.
SujitHemachandran.jpgSujit Hemachandran, SAP, is a Product Expert on SAP HANA Cloud Integration (SAP HCI). As a lead for the Market Entry Team for SAP HCI, he focuses on how customers and stakeholders can adopt SAP HCI for their businesses. He is also the Product Owner for the Adapter Development Kit of SAP HCI (where he eagerly awaits HCI partners developing cool adapters!).
UdoPaltzer.jpgUdo Paltzer, SAP, is Product Manager for SAP's integration middleware portfolio, including SAP HANA Cloud Integration, SAP Financial Services Network, and SAP Process Orchestration. Udo has been with SAP for 13 years. He loves working with customers and is committed to helping SAP customers succeed in their cloud journey.
IgorMitrovic.jpg

Igor Mitrovic, QforIT, believes that with SAP HANA Cloud Integration you can easily facilitate the integration of your business processes and data, whether you are in the cloud or on-premise.

This means that you can consume your integration as a service, which opens endless possibilities for future development. Igor Mitrovic is an enthusiastic, energetic, hardworking team player who believes that a problem well stated is a problem half solved. He also believes that SAP HANA Cloud Integration will change the whole perspective on integration as we know it. Igor is an SAP Integration Consultant who knows how to improvise and prioritize: a practical mind with strong analytical skills and 7 years of experience in delivering end-to-end integration services to multiple customers.

GuidoKoopmann.jpgGuido Koopmann, QforIT, truly is a cloud believer, and he considers SAP an example for all cloud believers. With SAP HANA Cloud Integration a new integration era has started, one that will offer customers Integration as a Service. Guido is a certified all-round SAP professional with a focus on SAP integration. He believes that quality is not an accident; it is always the result of high intention and sincere effort. In that spirit, Guido delivers end-to-end solutions to the best quality standard, from functional design through go-live.
NicoBredenbals.jpegNico Bredenbals, itelligence, is a business integration developer with the Technology Consulting & Products unit at itelligence AG. He specializes in integration technologies, specifically SAP Process Orchestration and SAP HANA Cloud Integration. In his 9 years with itelligence AG, Nico has supported and led a large number of EDI and adapter implementation projects.

FormatConversionBean - One Bean to rule them all!


Introduction

Over the past months, I have introduced various custom adapter modules that are configurable and reusable, aiming to handle various format conversions to/from XML that are not available in the standard adapter modules.

 

I have recently refactored the different modules and consolidated them into a single module, FormatConversionBean. The benefits of this refactoring are:

  • Single point of entry for the different types of conversion
  • Refactoring of the factory class utilizing ConverterFactory allows dynamic instantiation of converters
  • Easily extensible to new converter formats by extending class AbstractModuleConverter
  • Conforms to the Open/Closed Principle, whereby new converters can be added without modification to existing code
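The factory idea behind these points can be sketched as follows. This is a minimal illustration of dynamic converter instantiation under the Open/Closed Principle; all class and method names here are illustrative assumptions, not the module's actual API (see the GitHub repositories below for the real code):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Base class that every converter extends (name borrowed from the blog;
// the method shape is a simplification for illustration).
abstract class AbstractModuleConverter {
    abstract String convert(String input);
}

// A sample converter with placeholder logic.
class Json2XmlConverter extends AbstractModuleConverter {
    String convert(String input) { return "<root>" + input + "</root>"; }
}

// Converters are registered against a conversion-type key, so adding a new
// format means registering a new class without touching the dispatch logic.
class ConverterFactory {
    private final Map<String, Supplier<AbstractModuleConverter>> registry = new HashMap<>();

    void register(String conversionType, Supplier<AbstractModuleConverter> supplier) {
        registry.put(conversionType, supplier);
    }

    AbstractModuleConverter newConverter(String conversionType) {
        Supplier<AbstractModuleConverter> s = registry.get(conversionType);
        if (s == null) throw new IllegalArgumentException("Unknown conversion: " + conversionType);
        return s.get();
    }
}
```

With this shape, the single module entry point only looks up the configured conversion type; existing code never changes when a new converter class is registered.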

 

 

Converter Class List

Below is the comprehensive list of the converter classes that are part of FormatConversionBean now. This list will be updated as and when new converter classes are introduced in the future.

 

Converter Class | Source Format | Target Format | Reference
Excel2XMLTransformer | Excel | XML | ExcelTransformBean Part 1: Convert various Excel formats to simple XML easily
XML2ExcelTransformer | XML | Excel | ExcelTransformBean Part 2: Convert simple XML to various Excel formats easily
DeepPlain2XMLConverter | Deep Plain Text | XML | DeepFCCBean - The better FCC at meeting your deep (structure) needs! (Part 2 - Flat File to Deep XML)
XML2DeepPlainConverter | XML | Deep Plain Text | DeepFCCBean - The better FCC at meeting your deep (structure) needs! (Part 1 - Deep XML to Flat File)
JSON2XMLConverter | JSON | XML | JSONTransformBean Part 1: Converting JSON content to XML
XML2JSONConverter | XML | JSON | JSONTransformBean Part 2: Converting XML to JSON content

 

 

Source Code

The Java source code for the module is located in the following GitHub repositories.

GitHub repository for com.equalize.xpi.af.modules

GitHub repository for com.equalize.xpi.util (prerequisite library for com.equalize.xpi.af.modules)

 

The EAR file for deployment (compiled on NWDS 7.31 SP13 Patch 0) is also available from the latest repository release below.

Latest release for repository com.equalize.xpi.af.modules

 

 

Further Reference

If you plan to download the source code into your own NWDS installation to make changes and/or develop new modules, the following blog would be useful for performing testing in NWDS prior to deployment to the PI system.

Standalone testing of Adapter Module in NWDS

Extracting data for ESR Objects in a SAP PI System


This wiki link already covers how to extract information from ESR/ID objects. This works fine for checking a single object. However, sometimes we may need to extract metadata of all PI objects (for instance, to find all objects modified after a certain date) or even analyse the source code (to find all objects using a certain Java function / value mapping).

 

Through this blog, I'll show how we can automate reading SimpleQuery.

 

Looking at the SimpleQuery structure in terms of HTML pages in the image below, the information we need is in the step highlighted in the green box.

 

 

image1.png

 

How do we automate filling in this information?

 

Here we'll use HtmlUnit. From its homepage description: HtmlUnit is a "GUI-Less browser for Java programs". It models HTML documents and provides an API that allows you to invoke pages, fill out forms, click links, etc... just like you do in your "normal" browser.

 

For our example, I'm interested in filling up the below page.

 

 

1.png

 

 

2.png

 

We will need to run some tests to see what the default values are for each of the fields.

 

The general sequence of steps is:

 

- Create an instance of WebClient

 

WebClient webClient = new WebClient();

 

- Encode username:password using Base64 encoding and update the request header


  setCredentials(webClient);


It can be implemented as

private static void setCredentials(WebClient _webClient) {
    // combinePasswd is a static "user:password" String and base64Encode a helper,
    // both defined at the top of the source file
    String base64encodedUsernameAndPassword = base64Encode(combinePasswd);
    _webClient.addRequestHeader("Authorization", "Basic " + base64encodedUsernameAndPassword);
}
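base64Encode above is a helper from the blog's own source file; with the plain JDK, the same Basic Authorization header value can be built as below. This is a sketch, not the blog's exact code — the class name BasicAuth is my own:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

class BasicAuth {
    // Build the value of an HTTP Basic "Authorization" header
    // from a user name and password: "Basic " + base64("user:password")
    static String headerValue(String user, String password) {
        String combined = user + ":" + password;
        return "Basic " + Base64.getEncoder()
                .encodeToString(combined.getBytes(StandardCharsets.UTF_8));
    }
}
```

The result of headerValue(...) is exactly what gets passed to webClient.addRequestHeader("Authorization", ...).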

 

- Create an instance of HtmlPage and get the form by checking the body

 

HtmlPage currentPage = webClient.getPage(url);

HtmlForm form = (HtmlForm) currentPage.getByXPath("/html/body/form").get(0);

 

 

- Once we have the form, select different objects, set/ unset values

 

// Get SWC
HtmlRadioButtonInput choseSWC = form.getInputByValue("All software components");
choseSWC.click();

// Unclick changeList user
HtmlCheckBoxInput changeListUser = form.getInputByName("changeL");
changeListUser.click();

 

 

- Click the Start Query button to get the result page.

 

HtmlSubmitInput submit = form.getInputByValue("Start query");

HtmlPage resultPage = submit.click();

 

 

This page will have all the links and we can convert it nicely to a String.

 

String resultPageText = resultPage.asXml(); // asXml() already returns a String

 

I wanted to keep this blog brief and focus on automating getting the result page. This github link has the source code. Please make sure you have the HtmlUnit library added to your project. You can run the source code - just adapt the static variables at the beginning of the file.

 

We can now build on this knowledge to get more information. I'll take two use cases:

- Compare map versions between two PI systems.

- Extract graphical maps using value mappings.

Compare map versions between two PI systems


In this blog, I'll extend what we learned in this blog (link) about populating SimpleQuery parameters and getting metadata about ESR objects. We'll get the maps in system 1 and compare them with the map versions in system 2.

 

 

SimpleQuery gives us information in the below format, with R being a hyperlink to the metadata of the graphical map.

3.png

 

Clicking the link we get information in the below format.

 

Here, the idInfo element's attribute VID holds the message map ID, which represents a unique version of the map.

 

4.png

It's interesting to get some more information apart from the version ID, so we can create a class to hold the attributes we're interested in.

 

 

 

class MappingObject {
    String SID;
    String mapName;
    String version;
    String modifBy;
    String modifAt;
    String SWCV;
    String namespace;
}

 

Now we have the required information to build our program. Before moving ahead, let me introduce one more friend: jsoup.

 

 

From its webpage: jsoup is a Java library for working with real-world HTML. It provides a very convenient API for extracting and manipulating data, using the best of DOM, CSS, and jquery-like methods.

 

It's my personal rule of thumb: for filling out pages I prefer HtmlUnit, and for just extracting information from a web page I prefer jsoup. Of course, we could use only HtmlUnit, but I'll leave that as an exercise for you :-) .

 

 

The structure of the program will be something along these lines:

 

- Retrieve all the links

 

String resultPageTextSys1 = fillSimpleQueryParams(usrSys1, combinePasswdSys1);

Document docSys1 = Jsoup.parse(resultPageTextSys1);
Elements linksSys1 = docSys1.select("a[href]");

 

- for each link, get the HTTP response

 

- "Cleanse" the output, as jsoup doesn't work well with the <p1:>, <tr:> etc. namespace-prefixed tags we have in the response. It could be implemented as follows:

 

static String removeExtraTags(HttpEntity responseEntity) throws IOException {
    String content = EntityUtils.toString(responseEntity);
    content = content.replace("<p1:", "<");
    content = content.replace("</p1:", "</");
    content = content.replace("<tr:", "<");
    content = content.replace("</tr:", "</");
    return content;
}
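The string replacement at the heart of removeExtraTags can be exercised in isolation, without an HTTP response. A small sketch (the class name TagCleaner is my own, introduced only so the cleansing can be demonstrated on a plain String):

```java
class TagCleaner {
    // Same replacements as removeExtraTags above: strip the "p1:" and "tr:"
    // namespace prefixes so jsoup can select the elements by their local names.
    static String cleanse(String content) {
        return content.replace("<p1:", "<")
                      .replace("</p1:", "</")
                      .replace("<tr:", "<")
                      .replace("</tr:", "</");
    }
}
```

For example, cleanse("<p1:idInfo>x</p1:idInfo>") yields "<idInfo>x</idInfo>", which docResponse.select("idInfo") can then find.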

 

- Retrieve a Document and parse it to retrieve SWCV, namespace, version ID, modifBy and modifAt.

 

for (Element e : docResponse.select("xiObj")) {
    MappingObject map = new MappingObject();
    for (Element e1 : e.select("idInfo")) {
        for (Element e2 : e1.select("vc")) {
            String SWCV = e2.attr("caption");
            map.setSWCV(SWCV);
            break;
        }
        // ... extract namespace, version ID, modifBy and modifAt the same way
    }
}

 

 

- Add this map information to a List.

 

- Repeat the same process for information from the second system as well.

 

If m is the number of maps from sys1 and n the number from sys2, a naive pairwise comparison will take time in the order of O(m*n).

 

To make processing faster, sort the smaller list and use a binary search. For instance, when comparing the information between dev and prod systems, we'll sort the prod list, since we iterate over the dev list and search the prod list to check whether the prod version is the same. We'll also have maps in the development system which may be missing in the prod system, and iterating over dev while searching the prod list ensures that we're not missing any maps.

 

- To allow sorting of the list, implement a Comparator. As maps can span SWCVs and namespaces, we need to consider those in the sort order as well; otherwise we can get unexpected results.

 

Collections.sort(sys2Maps, MappingObject.MapNameComparator);

public static Comparator<MappingObject> MapNameComparator = new Comparator<MappingObject>() {

    public int compare(MappingObject map1, MappingObject map2) {
        String mapName1 = map1.getMapName().toUpperCase();
        String mapName2 = map2.getMapName().toUpperCase();
        String ns1 = map1.getNamespace();
        String ns2 = map2.getNamespace();
        String swcv1 = map1.getSWCV();
        String swcv2 = map2.getSWCV();
        String str1 = mapName1.concat(ns1.concat(swcv1));
        String str2 = mapName2.concat(ns2.concat(swcv2));

        // ascending order
        return str1.compareTo(str2);
    }
};
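As a side note, comparing a concatenated key can be ambiguous ("ab"+"c" and "a"+"bc" concatenate to the same string); comparing the three keys separately avoids that. A sketch using Java 8 comparator chaining — MappingKey here is a stand-in I introduce for illustration, with the same three fields the blog's MappingObject exposes:

```java
import java.util.Comparator;

// Minimal stand-in for the blog's MappingObject, holding only the sort keys.
class MappingKey {
    final String mapName, namespace, swcv;
    MappingKey(String mapName, String namespace, String swcv) {
        this.mapName = mapName; this.namespace = namespace; this.swcv = swcv;
    }
}

class MappingKeyOrder {
    // Compare by upper-cased map name, then namespace, then SWCV —
    // each key separately, so no concatenation ambiguity can occur.
    static final Comparator<MappingKey> BY_NAME_NS_SWCV =
            Comparator.comparing((MappingKey m) -> m.mapName.toUpperCase())
                      .thenComparing(m -> m.namespace)
                      .thenComparing(m -> m.swcv);
}
```

The same chained comparator works with Collections.sort and Collections.binarySearch exactly like the anonymous Comparator above.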

 

 

- Now try to find the corresponding map for the map at the current iterator value.

 

MappingObject sys2Map = new MappingObject();

index = Collections.binarySearch(sys2Maps, sys1Map, MappingObject.MapNameComparator);

 

 

- We can use the Apache POI libraries to write the information to a spreadsheet.

The information will appear in the below format.

 

6.png

 

This github link has the full source code. Make sure you have the required libraries from Apache POI, HtmlUnit & jsoup, and you should be able to extract the information. You can run the source code - just adapt the static variables at the beginning of the file.

Get List of all Graphical Maps using Value Mapping


This is one more use case of automating the extraction of ESR information via SimpleQuery. This link has the introduction and this link has a use case comparing map versions between two systems.

 

Now, let's look at something more interesting. Here, we need to identify all graphical maps that use value mappings, and hence we need to scan the source code of each map.

5.png

 

- The information we want is in the third element, aptly named SourceCode.

 

- The initial steps are the same as in the other two blogs mentioned above.

 

- Once we get the SourceCode, we need to decode it using Base64 decoding.

 

It could be implemented as:

 

public static String decode(String _str) {
    byte[] valueDecoded = Base64.decodeBase64(_str);
    return new String(valueDecoded);
}


- However, one of the most painful parts when dealing with decoding files is that specifying the encoding doesn't really work (try it :-) ). This value is set when the JVM initialises, and hence we need to hack it to reset it to our encoding format.


System.setProperty("file.encoding", "Cp1256");
java.lang.reflect.Field charset = Charset.class.getDeclaredField("defaultCharset");
charset.setAccessible(true);
charset.set(null, null);



This is important.
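As an aside, if only the decoded String's interpretation matters (rather than the JVM-wide file encoding), an alternative to resetting the default charset via reflection is to pass the charset explicitly when constructing the String. A sketch using the JDK's own Base64 (the class name DecodeWithCharset is mine):

```java
import java.nio.charset.Charset;
import java.util.Base64;

class DecodeWithCharset {
    // Decode Base64 content and interpret the raw bytes with an explicit
    // charset, independent of the JVM default set by file.encoding.
    static String decode(String base64, String charsetName) {
        byte[] raw = Base64.getDecoder().decode(base64);
        return new String(raw, Charset.forName(charsetName));
    }
}
```

For example, decode(sourceCodeElement, "Cp1256") interprets the decoded bytes as Cp1256 without touching any global JVM state.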


- As we're going to deal with a lot of information, it's better to write it to disk. The file obtained after Base64 decoding is a normal zipped file.


 

8.png


- Unzipping it once, we get the below file.


9.png


- and it needs to be unzipped once more before we're able to scan the source code.

10.png


- Unzipping the file could be implemented as follows:


ZipFile zipFile = new ZipFile(zippedFileOut);
Enumeration<?> enu = zipFile.entries();
while (enu.hasMoreElements()) {
    ZipEntry zipEntry = (ZipEntry) enu.nextElement();
    InputStream is = zipFile.getInputStream(zipEntry);

    String fileNameUnZipped = zipEntry.getName();
    newFile = new File(unzipDir + File.separator + fileNameUnZipped);

    FileOutputStream fos = new FileOutputStream(newFile);
    byte[] bytes = new byte[1024];
    int length;
    while ((length = is.read(bytes)) >= 0) {
        fos.write(bytes, 0, length);
    }
    is.close();
    fos.close();
}

 

- The resulting file is still a zipped file and needs to be unzipped again, as shown in the above screenshots.

 

- Unzip the file and get the source code.

 

Once we have the source code, we can scan it to check for value mapping usage:

 

if (FileUtils.readFileToString(newFile1).contains("ValueMapService")) { ..
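FileUtils comes from Apache Commons IO; the same check can be sketched with plain java.nio, assuming a path to the extracted source file. The class name SourceScanner is my own, introduced only for illustration:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

class SourceScanner {
    // Return true when the map's extracted source file references the
    // value-mapping runtime marker ("ValueMapService"), mirroring the
    // FileUtils-based check above.
    static boolean usesValueMapping(Path sourceFile) throws IOException {
        String content = new String(Files.readAllBytes(sourceFile));
        return content.contains("ValueMapService");
    }
}
```

Running this over every unzipped source file yields the list of graphical maps that use value mappings.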

 

- The output will be as in the below image.

 

12.png

 

This approach can be extended to identify all maps using a particular Java statement, for instance when it needs to be updated.

 

The github link is here for the source code. Please make sure the HtmlUnit, jsoup and Apache POI libraries are added to the project. You can run the source code - just adapt the static variables at the beginning of the file.

Is it bye bye to SAP standard enterprise services? With no way to enhance in NWDS


Hi all,

This is just a warning to all fans of the standard SAP enterprise services. I am beginning to suspect that there are not that many of us, but I will keep hope alive. One of these days, I hope to see a statistic for how many standard SAP enterprise services are used in a large pool of productive PI/PO landscapes. How high do you think the percentage will be on average? From my own experience, I am beginning to suspect the number is concerningly low.

How many do you use in the productive environment you work with? Actual raw, untouched SAP standard enterprise services? When I take a look myself, I see 31 out of 259 unique Integrated Configuration interactions. Close to 12%. Is that a high or low percentage!? Either way, I hope SAP will not make it harder to push the number upwards by removing features and options supporting the SAP standard.

 

However, that hope got yet another setback (see my blog about the lack of a modelling option in the PO single stack) when I saw that the enhancement feature is not part of the NWDS ESR workbench.

 

Enhancement defined in the Swing ESR:

enhancement def.jpg

 

Added to Enterprise Service:

Bank ES.jpg

 

Mapping in the Swing ESR:

graphicalMapping.jpg

Mapping in NWDS ESR (note: no context menu for enhancements exist):

messageMaapingINWDS.jpg

This means that not only do we have no way of creating enhancements to existing data types directly in the NWDS ESR; additionally, if we want to work in the NWDS ESR, we also have to live with the following:

1) we cannot see existing enhancement definitions in NWDS ESR

2) existing enhancement mappings disappear when opening the message mapping in NWDS ESR

3) existing enhancements used in message mappings are not included in the output of the new feature "Export to spreadsheet" (a feature I am otherwise a big fan of)

 

On a related topic: I have also found that enhancements disappear when you import operation mappings with enhanced data types into BPM.

BPM_MM_afterIMport.jpg

The good news is that BPM can support imported services with enhancements. I have tried updating the service definition after importing the operation mapping in BPM, which does update the data type definition, so you can see the enhancement in the BPM Output Mapping configuration; but the mapping steps fail to deploy correctly after doing this. So I suspect that operation mappings which include enhancements to the data types are also a no-no within BPM.

 

My overall concern is that SAP is making it impossible to (convince business/clients to) use SAP standard enterprise services if we (developers) are not given the possibility of enhancing them. I have found that most customers still have some custom-developed functionality that they want included in their integrations.

I also mention this because I am now beginning to meet the first "generation" of PI developers who have developed PI exclusively in NWDS. How can SAP expect these new developers to embrace SAP standard enterprise services with the limitations described above?

 

Of course, I should consider the very likely possibility that I got it all wrong, and that it is or will become SAP's recommendation that we start building all/any service from scratch. Always. Hmmm, I can still feel that hope-thingy.

 

I am currently working on NetWeaver PO 7.4 SP9 single stack.

Have a good one, Emil

Whitepaper: Customer Use Cases in SAP HANA Cloud Integration


Hello Colleagues!

 

We see an explosion of cloud applications in the market today. And the ability to quickly and accurately integrate your existing on-premise systems with the cloud applications is a legitimate business advantage. In this context, SAP HANA Cloud Integration has seen a growing adoption with customers.

 

In this whitepaper, I am providing the major use cases of how SAP HCI has been used in the past year. Here is a quick snapshot of the use cases that you will see in the whitepaper:

 

  • In the customer cloud space, customers have integrated their new SAP Cloud for Customer applications with their on-premise CRM and ERP systems. This is a great example of how real-time integration helps maximize the value of all systems in your landscape.

 

  • In the HR space, customers are using SAP HANA Cloud Integration as THE integration layer. SuccessFactors is a very interesting component in the integration story. With SAP HANA Cloud Integration, SuccessFactors can exchange compensation data with on-premise systems, recruit with the help of 3rd-party recruiting systems (SHL, People Answers), track the learning requirements of all employees, and manage many more HR processes with an ecosystem of partners.

 

  • In business with governments, SAP HANA Cloud Integration provides a cost-effective and secure means of exchanging business-to-government (B2G) e-invoices for legal and tax compliance.

 

Check out the whitepaper to know more about the customer use cases!

Link to whitepaper on slideshare: http://www.slideshare.net/SAPcloud/sap-hana-cloud-integration-customer-use-cases

 

Happy Integrating!

 

Best Regards,
Sujit

Data integration with Salesforce and SAP PI/PO using Outbound Messaging.


Introduction:

One of the challenges for any Salesforce and SAP PI/PO integration project is getting data from Salesforce back to SAP in a reliable and secure way. Salesforce provides different options, like the Streaming API, queries using the SOAP API, or Apex callouts. This blog covers one specific option: Outbound Messaging.

 

Outbound Messaging:

Using Outbound Messaging, you can define fields within Salesforce for which a notification will be sent to SAP PI/PO when the values of these fields change. Outbound messaging is part of a workflow rule defined in Salesforce for a specific object.

One important feature of Outbound Messaging is that it guarantees delivery of messages that are not older than 24 hours. This means that if for some reason the PI/PO system is temporarily inaccessible, Salesforce will retry delivering the data at a later time. However, keep in mind that the data may arrive at the PI/PO system out of order.

 

Example of Outbound Messaging:

We will use the Opportunity object as an example of how to set up Outbound Messaging on the Salesforce side and how to use the Advantco SFDC adapter to listen for messages from Salesforce.

1.       Salesforce - Create a new workflow rule

pic1.jpg

 

2.       Salesforce - Define a new Outbound message

       Create a new Workflow rule for the Opportunity object.

 

pic2.jpg

Create an outbound message and assign it to the workflow rule.

pic3.jpg

 

Workflow with Outbound message action activated.

pic4.jpg

 

3.       Salesforce – Download WSDL

pic5.jpg

 

 

4.       SAP PI/PO – Import WSDL and generate the PI/PO OBM Web service deployment file.

       pic6.jpg

          A web service deployment file is created based on the Salesforce Outbound message WSDL file.

       pic7.jpg

          

 

5.       SAP PI/PO - Deploy and Configure OBM Web service

Deploy and configure the Web service object to listen for incoming notifications from Salesforce.

pic17.jpg

 

 

6.       SAP PI/PO – Import WSDL into ESR to generate message structure

       Import the Outbound message WSDL from Salesforce into the ESR to define the source message structure.

      pic9.jpg

 

7.       SAP PI/PO – Create sender SFDC channel and create OBM mapping

        Create a new sender SFDC channel of type Salesforce OBM.

        pic10.jpg

           Map the Opportunity object to the sender OBM channel.

           pic8.jpg

 

 

8.       SAP PI/PO – Complete Configuration Scenario

pic11.jpg

 

9.       Salesforce – Update Outbound message endpoint

        Copy the endpoint of the Web service OpportunityNotificationPort.

     pic12.jpg

 

        Go to the Outbound Message screen and update the endpoint with the correct endpoint from the PI system.

        pic13.jpg

 

 

10.   Salesforce – Test Opportunity update

       Create a new Opportunity object to trigger an outbound message.

        pic14.jpg

 

11.   SAP PI/PO – Check incoming message from Salesforce.

       Channel monitor: new message coming into the OBM sender channel

       pic15.jpg

 

       Payload of the Outbound message as received by PI.

       pic16.jpg

 

Summary:

Outbound Messaging is a very powerful feature to get near real-time information from Salesforce to SAP. The Advantco SFDC adapter supports this method of data replication with minimal configuration effort and without the need to implement custom Java code.


SFTP Adapter - Handling Large File


The SFTP adapter includes the following options for handling large files:

  • Bypass PI Runtime For File Transfer
  • Chunk Mode

Bypass PI Runtime for File Transfer

This option allows the user to choose a directory on the PI system where the incoming message or file is saved. The intended receiver receives a file in the output directory containing a notification of where the file is saved on the PI system.

Chunk Mode

This option allows the user to divide an incoming message into chunks ranging from 1 MB to 50 MB in size, as configured.


Prerequisites

You need to install the ESR content patch (minimum patch 02 of SP 04 for XI CONTENT SFTP ADAPTER 1.0) provided as part of SAP Note 2144272. Upon successful deployment of the ESR content patch, apply at least patch number 8 of PIB2BSFTP SP04 to obtain the fixes.


Procedure


Enabling Bypass PI Runtime for File Transfer

1. Choose Edit in the Edit Communication Channel screen.

2. Open the Processing tab under the Parameters tab.

LFH_BYP_1.png

3. In the Large File Handling area, perform the following:

3.1 In the Special Processing Mode field, set the value as Bypass PI Runtime For File Transfer.

3.2 In the Absolute File Path (including Filename) field, specify a valid path along with the file name for saving the incoming file on the PI system.

Example for the file path: //host/../../../../<file_name>.<file_type>

Note: To include a time stamp in the file name, add the “%TS” sequence to the file name to avoid overwriting the file.

LFH_BYP_2.png

4. Perform the following steps to verify the configuration and the file transfer:

4.1 Open Communication Channel Monitor.

4.2 Start the Sender Channel.

LFH_BYP_3.png

4.3 Go to the storage location specified in the Large File Handling section during configuration and verify that the incoming message has been saved.

4.4 Also verify that a file containing the details of the storage location has been placed in the output directory of the receiver channel.

Example: An XML file will be created in the storage location with the details as given below:

LFH_BYP_5.png

Parameters such as source file location, target file location, file size, and time of creation are available in the XML file.

4.5 You can also verify the audit log as shown below.

LFH_BYP_4.png

 

 

Enabling Chunk Mode

 

1. Choose Edit in the Edit Communication Channel screen.

2. Open the Processing tab under the Parameters tab.

3. In the Large File Handling area, perform the following:

3.1 In the Special Processing Mode field, set the value as Chunk Mode.

This value appears for selection only when the Quality of Service is Exactly Once In Order (asynchronous).

3.2 In the Maximum Size (MB) field, select the chunk size into which the incoming message should be divided.

LFH_Chunk_1.png

4. Perform the following steps to verify the configuration and chunking of message:

4.1 Open Communication Channel Monitor.

4.2 Start the Channels.

LFH_Chunk_2.png

4.3 Verify in the audit log that the chunking happens.

LFH_Chunk_3.png

The adapter creates a new XI message for each chunk.

4.4 Match the received file with the source file to verify the integrity of the message.

Add External JAR Files in Your UDF - Working with APIs


Background

 

Writing a UDF has always been easier than writing Java code outside the ESR and bringing it back as an imported archive. You can find all the details about implementing Java mappings directly in the ESR in this blog: Write Java Mapping directly in ESR!


It has always been straightforward to add external JAR files in the Java IDE (such as Eclipse or SAP NWDS) you use to develop your Java mappings. But unless you are using NWDS as your Service Repository, you always have to perform a couple of extra steps for Java mappings compared with UDFs (User Defined Functions). This blog will be helpful for users of the Swing client.


Also, we normally add these JAR files locally to compile our Java code, and they have to be available again later for further enhancements. With the Function Library feature, the JAR is available throughout the SWCV.

 

With the availability of various open-source APIs and Java libraries, you will often want to add them to your code to reduce development effort or to achieve specific functionality.


Example API

In PI, we commonly use string-related functions. org.apache.commons.lang.StringUtils is a very powerful API that provides nice utilities for working with strings, such as checking for numeric/alphanumeric content, finding differences, or reversing, in addition to the common string functionality. I will take it as an example.


You can download Jar file from below website:

https://commons.apache.org/proper/commons-lang/download_lang.cgi


Now you need to download the correct API depending on your PI server's JDK version (not the JRE): Commons Lang 2 is for Java 1.2+ and Commons Lang 3 is for Java 6.0+. I will be using Lang 2.


Pic1.JPG


Now the file 'commons-lang-2.6-bin.zip' is downloaded. Unzip it and you will see the following:

Pic2.JPG

Keep in mind that we need to upload the ".jar" file to the PI ESR, not the ".zip" file.


Now open the ESR, create an Imported Archive ('Apache_StringUtils' in the picture) and upload 'commons-lang-2.6.jar'. You can see the class files available within the library:

 

Pic3.JPG

 

Use of Jar inside Function Library UDF

Create a Function Library as described in the blog: PI 7.1 Concept of Function Library in Process Integration - Process Integration - SCN Wiki


Go to 'Archives Used':

 

Pic4.JPG

Select your Archive created above.

Pic5.JPG

Now, in your 'Import Instructions' tab, add the package name just as you would in your Java code.

Pic6.JPG

It is ready to be used in a UDF. I am going to use the "equals" function to test it. Note that all StringUtils functions are null-safe.

 

Sample UDF Code

public String StringManipulation(String firstStr, String secondStr, Container container) throws StreamTransformationException {
    if (StringUtils.equals(firstStr, secondStr))
        return "YES";
    else
        return "NO";
}

 

Pic10.JPG
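To see what the null-safety buys us, here is a minimal plain-Java equivalent of StringUtils.equals (a sketch for illustration only; the real library method covers more cases):

```java
public class NullSafeEquals {
    // Null-safe comparison: two nulls are equal, null never matches a non-null string.
    public static boolean equals(String a, String b) {
        return (a == null) ? (b == null) : a.equals(b);
    }

    public static void main(String[] args) {
        System.out.println(equals(null, null));   // true, no NullPointerException
        System.out.println(equals(null, "YES"));  // false
        System.out.println(equals("YES", "YES")); // true
    }
}
```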


Access the code inside Message Mapping

You can use your Function Library inside any Message Mapping throughout the SWCV. It would look like below:

Pic11.JPG

 


With the use of open-source APIs you can find easy solutions for difficult requirements, some of which are beyond the capability of the basic Java functions.

Advantco receiver SFTP adapter and Variable Substitution


Recently, I discovered a nice feature of the Advantco SFTP adapter which we may have missed checking in the other vendors' SFTP adapters or even the SAP SFTP adapter. The feature I would like to highlight in this blog is the extensibility of the variable substitution option in the Advantco SFTP adapter.

 

Often, when variable substitution is mentioned in the context of file scenarios in PI, the two parameters which come to mind are the file name and directories, but the Advantco SFTP adapter offers more than that. Below are the parameters which can be declared as variables in the Advantco SFTP adapter:

 

Server Name, User Name, Private Key File, Keystore of Private Key, Alias to Private Key, Known Hosts File, Proxy Host Name, Proxy User Name, Target Directory, File Name Scheme, Directory to Save Attachments

 

As we can see, there are a couple of parameters which can come in handy when deciding on our integration approach if our PI landscape uses the Advantco SFTP adapter. How these parameters are utilised obviously depends on the requirement and the individual, but below is one application scenario where I utilised them.

 

In one of my project scenarios, I was required to send files to an SFTP application. The user we used to connect to the application was actually a Windows user, and the directory where files should be placed was "/outbound", which is effectively equivalent to "c:/users/<myuser>/Documents" for a normal Windows user. In the image below, PGPHOME is the root directory under which the directories of all the users are present, pgp is the SFTP user I needed to use in my SFTP communication channel, and the Outbound directory highlighted in yellow is where I needed to place the file after connecting to the server. The application was designed such that when I log in with a user, the default root directory becomes E:\PGPHOME\<sftpuser>, and in the channel I just needed to provide the extension of the outbound folder, i.e. /outbound, in the Directory parameter.

SFTPblog_1.jpg

 

My scenario was that, based on the file name (bank type) from the sender, I was required to use different users to connect to the receiver SFTP application, so that different bank files are delivered to the corresponding outbound directories of the different banks. For example, if the sender file was for RBC bank, the receiver directory on the SFTP application would be E:\PGPHOME\RBC\Outbound, while for PNC bank it would be E:\PGPHOME\PNC\Outbound, where RBC and PNC in the absolute file paths are nothing but the users I needed to use to connect to the SFTP application. From the communication channel perspective, the directory would be "/Outbound" for both bank files; it is the user (RBC/PNC) which varies.

 

From an interface design perspective, I had a couple of options before I came to know about Advantco's variable substitution possibility:

1. Ask the SFTP application team if they could create a user (a kind of admin) with authorisations for the directories of all the bank users, use variable substitution for the directory in the communication channel, and build the directory by means of a UDF.

2. Create one receiver and one communication channel per bank file received from the sender, and filter the receiver based on the file name picked up by PI from the sender.

 

From a re-usability perspective, option 2 was a complete no, because each new bank would require 6 new objects in PI, so I was not interested in following it. My discussion with the SFTP application team about option 1 left me disappointed, as they had security limitations preventing the creation of any super user which could access the directories of all the other users. That is where I decided to consult the Advantco SFTP adapter documentation to see if there was any possibility, and the option to use "username" as a variable caught my attention. My effort of going through the documentation proved fruitful, and I decided to use the same channel to connect to the SFTP application. Below is my brief interface design.

 

ESR:

 

1. Created a UDF which reads the sender file name and uses value mapping to identify the SFTP user. I assigned SFTPUSER to the BaseFileName header attribute of the SFTP adapter.

SFTPblog_2.jpg

 

Here is the code of UDFs

a.  identifyfileType

 

String fileName = "";
String fileType = "";
String bankType = "";
String paymentType = "";
// Note: currDate and currTime were not declared in the original snippet;
// they are assumed to hold the current date and time, for example:
String currDate = new java.text.SimpleDateFormat("yyyyMMdd").format(new java.util.Date());
String currTime = new java.text.SimpleDateFormat("HHmmss").format(new java.util.Date());
DynamicConfiguration conf = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
DynamicConfigurationKey file = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
// read source file name
fileName = conf.get(file);
bankType = fileName.substring(0, 3);
paymentType = fileName.substring(4, 7); // determine payment type
if (bankType.equals("PNC")) {
    fileType = fileName.substring(0, 10);
} else if (bankType.equals("RBC")) {
    if (paymentType.equals("ACH")) {
        fileType = fileName.substring(0, 14);
        fileName = fileType + "_" + currDate + "_" + currTime + ".txt";
    } else {
        fileType = fileName.substring(0, 17);
    }
}
// set target file name
conf.put(file, fileName);
return fileType;

b. setTargetUsername

 

DynamicConfiguration conf1 = (DynamicConfiguration) container.getTransformationParameters().get(StreamTransformationConstants.DYNAMIC_CONFIGURATION); 
DynamicConfigurationKey user = DynamicConfigurationKey.create( "http://advantco.com/xi/XI/SFTP/SFTP","BaseFileName"); 
conf1.put(user,userName); 
return userName;

 

Integration Directory:

 

1. Used ASMA for filename in sender adapter

SFTPblog_3.jpg

2. Created value mappings in the ID for maintaining the filename–SFTPUSER mapping


3. Created one receiver Advantco SFTP channel and used the variable %username% in the User parameter of the communication channel

SFTPblog_4.jpg

 

4. Maintained variable substitution in Advanced tab of the communication channel

 

SFTPblog_5.jpg

This blog is just a small effort to add a drop to the sea of SAP knowledge. I hope you find it useful!

Did you ever want to link an XI/PI message with a business document?


Did you ever want to link a message with a business document? Often, messages are generated in the background and in hidden places like a BAdI, making them difficult to trace.

 

In my current project, for example, an item that has been dunned generates an outbound message. The message is generated in a function module called in the dunning activity, and this is not seen in the job logs. Wouldn’t it be interesting to make a link between the message and the item, so we don’t need to search in the message payload? Or the other way around: an inbound message creates a document in my system, and I would like to link the document with that message.

 

If the transaction that displays the ‘document’ benefits from the Generic Object Services (GOS), this link is really easy. All we have to do is create a link between the message and the BOR object used in the GOS. There are plenty of blogs/posts about it, so I won’t cover it here.

 

I’ve done this often with IDocs; it’s quite easy and quite handy, so I wanted to achieve the same, this time with XI/PI messages and …

 

it turned out to be already foreseen in the standard. Just have a look at the methods CL_PROXY_ACCESS=>WRITE_INBOUND_MESSAGE_LINK and CL_PROXY_ACCESS=>WRITE_OUTBOUND_MESSAGE_LINK.

 

The class documentation already provides some examples so I won’t duplicate that here. Instead I’ll just show the result.

 

This is transaction FPE3, which displays an FI-CA document. It has the GOS toolbar enabled, and clicking the ‘Relationships’ menu displays the link to the PI messages.

example_GOS_FPE3.png

A double-click on the line takes the user to the message monitor transaction.

example_GOS_FPE3_list.png

Create two files with same time stamp using StreamTransformationConstants


Introduction

The requirement is to create two files with different file names from one source message, but we have to keep the same time stamp for both files (e.g. ABC_20140409-1030-897.txt and XYZ_20140409-1030-897.xml). One approach is multi-mapping with dynamic configuration using a custom adapter module, as described in the blog below.

A new approach: Multi-mapping Dynamic Configuration using a generic custom module

In this blog I want to show how to do this using a Dynamic Configuration UDF with the TIME_SENT constant of StreamTransformationConstants, without using an adapter module.

 

Documentation of TIME_SENT:

Time stamp specifying when the message was sent by the sender. The format of the time stamp is as follows: YYYY-MM-DDTHH:MM:SSZ. The letter 'T' separates the date from the time, which is generally specified in UTC. If it is a local time, the closing 'Z' is omitted.

More information can be found here.

 

Source Code

We can use this UDF in both mappings and map it to a variable under the root node.

public String setFileName(Container container) throws StreamTransformationException {
    Map<String, Object> mapParameters = container.getInputHeader().getAll();
    // access dynamic configuration
    DynamicConfiguration conf = (DynamicConfiguration) mapParameters.get(StreamTransformationConstants.DYNAMIC_CONFIGURATION);
    // read TIME_SENT constant
    String timeSent = (String) mapParameters.get(StreamTransformationConstants.TIME_SENT);
    // change the format of TIME_SENT
    SimpleDateFormat input = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss");
    SimpleDateFormat output = new SimpleDateFormat("yyyyMMdd-HHmmss-SSS");
    Date date = null;
    try {
        date = input.parse(timeSent);
    } catch (ParseException e) {
        throw new StreamTransformationException("Parse Exception: " + e.getMessage());
    }
    timeSent = output.format(date);
    DynamicConfigurationKey KEY_FILENAME = DynamicConfigurationKey.create("http://sap.com/xi/XI/System/File", "FileName");
    // read file name
    String fileName = conf.get(KEY_FILENAME);
    String name = fileName.substring(0, fileName.lastIndexOf("."));
    String extension = fileName.substring(fileName.lastIndexOf("."), fileName.length());
    fileName = name + "_" + timeSent + extension;
    // set new file name
    conf.put(KEY_FILENAME, fileName);
    return fileName;
}


Configuration

Create one ICO with a sender proxy interface and assign two receiver interfaces and two operation mappings under the Receiver Interfaces tab, as below.

InterfaceDetermination.PNG

Assign two receiver file communication channels under outbound processing.

 

Testing Results
The two files are created in the target directory with the same time stamp.

FileNames.PNG

 

Conclusion

With this approach, we can create files with the same time stamp when we split a message into two or more messages.

I hope this helps.

Using PGP module in advantco receiver SFTP adapter and need FCC? - Do not use 'XML-To-Plain Content Conversion'


When it comes to FCC errors at runtime, it always becomes tricky to identify the point of failure — whether the file content is the problem, or whether some parameters are missing in the channel for a special case of FCC — and we try all sorts of things! But I was surprised to discover the issue highlighted in this blog, which started popping up when I simply added the PGP module to an already tested FCC scenario in the Advantco receiver SFTP adapter. By the way, this is also where I learnt one more use case of the MessageTransformBean.

 

To understand the problem I want to highlight about the Advantco content conversion option: if you are new to the use of the PGP module in PI, I would suggest you first make yourself comfortable with the basics of the PGP module and its use in PI from these nice blogs by Vijay and Li, which also explain the PGP protocol with illustrations.

 

Here is my initial setup of the receiver SFTP adapter with the message protocol 'XML-To-Plain Content Conversion'. The scenario was working properly, and I can see that the file was converted properly, as shown in the audit log.

 

FCC_Blog_1.jpg

 

FCC_Blog_3.jpgFCC_Blog_5.jpg

 

Here is the file after content conversion

 

FCC_Blog_4.jpg

 

Now, since I also wanted the message to be converted to PGP format, I just added and configured the PGP module as suggested in the standard Advantco guide as well as in the blogs mentioned above, and that is when I started getting this strange error in the audit log.

 

FCC_Blog_2.jpg

 

Here is the audit log of the error, which at first glance relates to FCC.

 

FCC_Blog_8.jpg

Strange this was!!! FCC was something I had already tested before adding the PGP module, so I thought this could not be an FCC error — something must be wrong with the PGP parameters. I checked the logs to see whether PGP was successful or not, and a bigger surprise came when I saw that PGP was indeed successful, as shown below.

 

FCC_Blog_9.jpg

 

Then a little observation revealed what the possible issue could be.

 

FCC_Blog_7.jpg

 

If you observe the screenshot above, you can identify that the root cause of the FCC issue was not the FCC parameters maintained in the channel, but the sequence of the module processing! Here the PGP module was processed before the standard MP exit module of the Advantco SFTP adapter. Since the content (XML data) of the message had already been encrypted by the PGP module, it was no longer XML, which is what normally gets converted into text format when we use the message protocol 'XML-To-Plain Content Conversion' in the Advantco MP exit module. This is why the FCC error appeared in the audit log: the exit module was trying to convert the output of the 'AdvantcoOpenPGPSender' module, which was called before it.

 

This led me to think about how I could do PGP after FCC, and I thought of using the 'MessageTransformBean'. But I did not even have to add that explicitly, because Advantco itself provides the two options below in the message protocol:

 

1. XML-To-CSV Conversion Simple

2. XML-To-CSV Conversion Structural

 

FCC_Blog_10.jpg

 

My purpose was solved: selecting option 1 above added the MessageTransformBean in the Module tab and removed the Content Conversion tab. In the Module tab, I just had to update parameters like fieldSeparator, endSeparator, etc. Further, I added the PGP module after this module, since the input the PGP module now receives has already been converted to text format by its predecessor, the MessageTransformBean. Below is my final configuration of the SFTP channel.

 

FCC_Blog_14.jpg

FCC_Blog_12.jpg

 

FCC_Blog_13.jpg

 

Below is the audit log of the execution after this configuration.

 

FCC_Blog_15.jpg

Working With Hermes JMS To Access JMS Queues



Dear SCN Users,

 

In my project we needed to access TIBCO JMS queues. While working on this, I came to know about Hermes JMS, which comes with SOAP UI. Hermes JMS is very useful for accessing JMS queues.

 

Steps:

 

1. Install SOAP UI.

 

2. Go to Tools -> Hermes JMS

 

Untitled1.png

 

3. The Hermes JMS screen then opens, as below.

 

Untitled1.png

 

4. Configure the JMS queue server by selecting Options -> Configuration

 

Untitled1.png

 

5. The first step in the configuration is to upload the required JARs in the provider tab. Create a group and select the required JARs for accessing the JMS server. In this case, to establish a connection to TIBCO JMS, all the TIBCO JARs were uploaded.

 

Untitled1.png

6. Next, in the sessions tab, select and configure the required parameters as below.

 

Untitled1.png

 

 

7. After configuring, select the session and click on Discover to see all the queues.

 

Untitled1.png

 

8. Messages can be sent directly using the "Messages" tab.

 

Untitled.png

 

9. Messages can also be sent from SOAP UI by configuring a JMS endpoint in the project.

 

Untitled.png

Hermes JMS is a very helpful tool for accessing and testing JMS queues.


USEFUL UDFS - 2



Dear SCN Users,

 

In continuation of http://scn.sap.com/community/pi-and-soa-middleware/blog/2015/03/16/useful-udfs, here are some more UDFs.

 

11. UDF to strip leading zeros

 

q.jpg

    

Code:

 

if (str == null) {
    return null;
}
int i;
char[] chars = str.toCharArray();
for (i = 0; i < str.length(); i++) {
    if (chars[i] != '0') {
        break;
    }
}
return (i == 0) ? str : str.substring(i);

 

Output:

 

q.jpg
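For reference, the same result can also be achieved with a one-line regular expression. This is an alternative sketch, not the function library code; note that, unlike the UDF above (which returns an empty string for an all-zero input), it keeps a single zero:

```java
public class StripLeadingZeros {
    // Remove leading zeros, but keep a single zero if the string is all zeros.
    public static String strip(String str) {
        if (str == null) {
            return null;
        }
        return str.replaceFirst("^0+(?!$)", "");
    }

    public static void main(String[] args) {
        System.out.println(strip("000123")); // 123
        System.out.println(strip("0000"));   // 0
    }
}
```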

12. UDF to add Values in Queue/Context:

 

  q.jpg

Code:

 

String output = "";
double total = 0.0;
for (int i = 0; i < var1.length; i++)
    total = total + Double.valueOf(var1[i].trim()).doubleValue();
output = total + "";
result.addValue(output);

 

Output:

 

q.jpg
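One caveat with summing via double is binary floating-point noise (e.g. 0.1 + 0.2 printing as 0.30000000000000004). Where exact decimal totals matter, BigDecimal is a safer choice — an alternative sketch, not the UDF above:

```java
import java.math.BigDecimal;

public class SumValues {
    // Sum decimal strings exactly using BigDecimal instead of double.
    public static String sum(String[] values) {
        BigDecimal total = BigDecimal.ZERO;
        for (String v : values) {
            total = total.add(new BigDecimal(v.trim()));
        }
        return total.toPlainString();
    }

    public static void main(String[] args) {
        System.out.println(sum(new String[] { "0.1", "0.2" })); // 0.3
    }
}
```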

 

13. UDF to remove duplicate/repeated entries in Context/queue

 

q.jpg

 

Code:

 

int i;
String val = "";
for (i = 0; i < input.length; i++) {
    if (input[i].equals(ResultList.CC)) {
        result.addValue(ResultList.CC);
        val = "";
        continue;
    }
    if (!val.equals(input[i])) {
        result.addValue(input[i]);
        val = input[i];
    }
}

 

Output:

q.jpg

 

NOTE: Sort the values before passing them into the UDF
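If sorting the input beforehand is not possible, a set-based variant removes duplicates regardless of order while keeping the first occurrence. This is a plain-Java sketch outside the mapping runtime (it does not handle context changes):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedHashSet;
import java.util.List;

public class Dedupe {
    // Keep the first occurrence of each value, preserving input order.
    public static List<String> dedupe(String[] input) {
        return new ArrayList<String>(new LinkedHashSet<String>(Arrays.asList(input)));
    }

    public static void main(String[] args) {
        System.out.println(dedupe(new String[] { "A", "B", "A", "C", "B" })); // [A, B, C]
    }
}
```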

 

14. UDF to make 2 digits after the decimal point

 

If there is only one digit after the decimal point in the input value, this UDF adds a zero at the end to make 2 digits after the decimal. Example: for the input value 1.1, this UDF outputs 1.10.

 

q.jpg

 

Code:

 

int len = value.indexOf(".");
int tot_len = value.length();
if (len > -1) {
    if ((tot_len - len) == 2) {
        // exactly one digit after the decimal point: append a zero
        return value + "0";
    }
    return value;
}
return value;

 

Output:

q.jpg
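A library-based alternative is DecimalFormat. Note that, unlike the UDF above (which leaves values with two or more decimal digits untouched), it also rounds longer inputs — an assumption to weigh before swapping it in:

```java
import java.math.RoundingMode;
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.util.Locale;

public class TwoDecimals {
    // Format a numeric string with exactly two digits after the decimal point.
    // US symbols pin the decimal separator to '.' regardless of default locale.
    public static String pad(String value) {
        DecimalFormat df = new DecimalFormat("0.00", new DecimalFormatSymbols(Locale.US));
        df.setRoundingMode(RoundingMode.HALF_UP);
        return df.format(Double.parseDouble(value));
    }

    public static void main(String[] args) {
        System.out.println(pad("1.1"));   // 1.10
        System.out.println(pad("1.234")); // 1.23 (rounded, unlike the UDF)
    }
}
```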

 

15. UDF to change SUPPRESS to an empty value

 

q.jpg

Code:

 

for (int i = 0; i < value.length; i++) {
    if (value[i].equals(ResultList.SUPPRESS))
        result.addValue("");
    else
        result.addValue(value[i]);
}

 

Output:

 

q.jpg

 

Will be adding more UDFs soon.

 

 

 

Async/Sync Bridge : IDoc (Async) - SOAP (Sync) - Proxy (Async)


Introduction:

We were busy with a C4C (Cloud for Customer) project, and one requirement was to send equipment data to C4C, using SAP PO as the middleware. C4C comes with SAP standard objects and mappings, so it was easy to just implement and send through the data. The interface is from IDoc (ECC) to SOAP (C4C). We had a problem, though: when an update to C4C failed, it sent back an error, but we were not doing anything with the response, because an IDoc can only be asynchronous. Luckily for us, the SOAP (C4C) equipment web service is synchronous. So this blog shows how we handled the problem of going from an asynchronous interface to a synchronous one and back to asynchronous.

 

We decided to go with the following strategy: send out the IDoc to the synchronous web service and then back into ECC using a proxy.

IDOC(Async) -> SOAP(Sync) -> Proxy(Async)

I'm not going to go through how the basics are created; I will only explain how to create an async/sync bridge between the 2 interfaces.

 

Requirements:


You will need to create 2 independent interfaces. The one I am using is the one discussed in the introduction.

The first interface will be IDoc - SOAP. In the SOAP receiver adapter you will configure the adapter modules specified below.

Second interface will be from SOAP - Proxy. This interface will be triggered by the first interface.


SOAP Adapter Module:

Adapter Module : Processing Sequence

Module Name                    | Type                  | Module Key
-------------------------------|-----------------------|-----------
AF_Modules/RequestResponseBean | Local Enterprise Bean | request
AF_Modules/ResponseOnewayBean  | Local Enterprise Bean | response

 

 

Adapter Module : Module Configuration

Module Key | Parameter Name     | Parameter Value
-----------|--------------------|------------------------------------------
request    | passThrough        | true
response   | interface          | <interface_name_for_second_interface>
response   | interfaceNamespace | <interfacenamespace_for_second_interface>
response   | replaceInterface   | true

 

Please see the message log below.

 

Capture.PNG

Configuration of First Interface - Step by Step

 

1. Sender Comm Channel

01_FirstInterface.PNG

03_FirstInterface.PNG

04_FirstInterface.PNG

 

05_FirstInterface.PNG

 

06_FirstInterface.PNG

 

07_FirstInterface.PNG

 

Second Interface - Step by Step

 

01_SecondInterface.PNG

 

02_SecondInterface.PNG

 

03_SecondInterface.PNG

 

 

 

Let me know if you would like me to add any additional information that I might have missed.

 

Regards,

 

Jannus Botha

SAP PI/PO Learning survey


I am proud to announce that the answers given by 139 developers were compiled into a detailed analysis. Reading all the answers helped me discover some interesting correlations. The report containing the detailed analysis of the responses can be found here: http://picourse.com/learningsurvey


http://d30n0m8j8xjig1.cloudfront.net/wp-content/uploads/2015/04/SAP_book_cover_final_transparent-212x300.png
Conducted in March 2015, the survey proved to be a real success. Respondents were eager to share the details of their learning path and career in the field of SAP. In total, 139 professionals participated in the survey. From their answers, I was able to deduce that real experts are always ready to learn more, and to share their knowledge as well.


There is no universally accepted way of learning SAP XI/PI/PO - in fact, several methods seem to work just fine. Options include regular classroom courses, online classes and in-house training sessions, such as those offered by consulting companies.


However, maintaining a mentorship with a more experienced consultant or colleague seems to lead to the best results. In fact, several questions in the survey focused on the effect such professional collaborations have on the learning process. It seems that most beginners have a positive attitude toward mentorship.


Time is also a crucial aspect of learning. Naturally, the speed of learning and acquiring new skills differs from one person to another. Still, the survey has revealed that most developers need at least 1 to 2 years of assistance before being able to work independently. Unfortunately, the responses have also shown that courses are merely detailed introductions - most of us need a helping hand with specific issues when starting out in this field.


I also looked at continuous improvement. It turns out that the majority of developers (especially the more advanced ones) review their work. Furthermore, those working at bigger companies tend to be even more keen on reviewing their own projects.


The entire report is available here: http://picourse.com/learningsurvey/

Recompile com.equalize.xpi.af.modules as EJB 2.1 modules in NWDS 7.1x


Introduction

FormatConversionBean and the other modules in com.equalize.xpi.af.modules are developed in NWDS 7.31 SP13 Patch 0 as EJB 3.0 modules. To be able to use these modules in earlier PI versions, the EJB and EAR projects need to be recompiled in an NWDS version matching the PI system where they will be deployed.

 

As the source code is publicly available in the GitHub repository, this blog is a guide on how to recompile the modules. As EJB 3.0 is only applicable for PI 7.3x onwards, the module code needs to be refactored for EJB 2.1.

 

This guide is also applicable for those who wish to download the modules to make further changes according to their own requirement. Ignore EJB 3.0 -> 2.1 changes if they are not applicable.

 

 

Prerequisite

Following are the prerequisites in order to recompile the modules.

  • Download and install appropriate version of NWDS (refer NWDS Download Links Wiki)
  • For NWDS 7.1x and earlier, the download links only provide a kernel with minimum functionality. Ensure that the NWDS is updated to have the appropriate functionality for PI development.
  • Install version of JDK that is compatible with the NWDS (i.e. JDK 1.5 for NWDS 7.1x) - refer wiki for the JVM versions used by each XI/PI version

 

Note: I was not able to update the NWDS 7.11 CE kernel; maybe the update site is no longer available (I'm not sure). Therefore the examples below use NWDS 7.31, and some of the screenshots might look different. However, the procedure is still more or less the same.

 

 

Source Codes and Libraries

As a preparation step, the following source codes and libraries need to be downloaded first.

 

 

Extract all the Zip files into appropriate directories in the local file system.

 

 

Step by Step Procedure

The steps listed here are based on the steps in Section 4 of the following document. This blog only describes the differences for each step where applicable. Otherwise, the steps listed in the document should be followed.

How to Create Modules for the JEE Adapter Engine

 

4.1 & 4.2

Follow steps listed in document

 

4.3

Create EJB 2.1 Project following on steps listed with the following details

  • EJB project name: com.equalize.xpi.af.modules.ejb
  • EAR project name: com.equalize.xpi.af.modules.app

proj.png

 

4.4

Ignore steps listed in document.

 

Create package com.equalize.xpi.af.modules under ejbModule.

package.png

pack1.png

Import the downloaded source code for com.equalize.xpi.af.modules into the newly created package.

import1.png

Browse to the directory where the extracted source codes are and select the following files and folders to import.

files1.png

Repeat steps for importing com.equalize.xpi.util and org.java source codes.

import2.png

 

import3.png

 

4.5

Replace content of ejb-jar.xml file based on following criteria:-

  • Compiling for EJB 2.1 - Use following file > EJB 2.1 ejb-jar.xml
  • Compiling for EJB 3.0 - Use file included in com.equalize.xpi.af.modules ZIP extract

 

4.6

Replace content of ejb-j2ee-engine.xml file based on following criteria:-

  • Compiling for EJB 2.1 - Use following file > EJB 2.1 ejb-j2ee-engine.xml
  • Compiling for EJB 3.0 - Use file included in com.equalize.xpi.af.modules ZIP extract

 

4.7

Follow steps listed in document to include PI AF library files.

 

Additionally, include external Apache POI libraries in the build path.

lib1.png

Add the following 5 files as External JARs.

extjar.png

 

4.8

Ignore steps listed in the document.

 

At this point, most of the imported source code should no longer have any syntax errors. There remain a few more tweaks in order for the remainder of the source code to be error-free and ready for compilation.

 

The message logger class, com.sap.engine.interfaces.messaging.api.logger.MessageLogger, is only available from 7.3x onwards. Therefore, when compiling for 7.1x and below, the MessageLoggerHelper class needs to be deleted as shown below.


 

Additionally, the AbstractModule class needs to be changed as follows:

  • Remove logic using MessageLoggerHelper
  • Class needs to implement javax.ejb.SessionBean interface (and corresponding methods) as required by EJB 2.1
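
For orientation, the shape of the required change looks roughly like this. It is a self-contained sketch: the SessionBean interface is stubbed locally so the snippet compiles on its own, whereas in NWDS the class implements the real javax.ejb.SessionBean (where setSessionContext takes a javax.ejb.SessionContext) provided by the server's EJB 2.1 libraries.

```java
// Sketch of the EJB 2.1 plumbing AbstractModule must gain.
// SessionBean is stubbed here for self-containment; in NWDS, implement
// javax.ejb.SessionBean instead.
interface SessionBean {
    void ejbActivate();
    void ejbPassivate();
    void ejbRemove();
    void setSessionContext(Object ctx); // real signature: javax.ejb.SessionContext
}

abstract class AbstractModuleSketch implements SessionBean {
    // EJB 2.1 lifecycle callbacks - typically empty no-ops for adapter modules
    public void ejbActivate() { }
    public void ejbPassivate() { }
    public void ejbRemove() { }
    public void setSessionContext(Object ctx) { }
    // Stateless session beans additionally need a no-arg create method
    public void ejbCreate() { }
}
```

The module's actual processing logic (the process method) stays as it is; only this container plumbing is added.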

 

The source code for AbstractModule can be replaced with the following source code > EJB 2.1 AbstractModule.java

 

At this point, the EJB project should be error free.

 

4.9

Skip steps listed in the document.

 

4.10

The steps listed in the document can be followed, or optionally just replace the content of application-j2ee-engine.xml with the same file included in the com.equalize.xpi.af.modules ZIP extract.
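
For orientation, this descriptor mostly declares hard references to the Adapter Framework services and libraries the module uses. The reference names below are illustrative only — the exact set and naming (e.g. the *.facade variants on 7.3x) depend on the PI release, so the file in the ZIP extract is authoritative.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<application-j2ee-engine xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <!-- Hard references to AF components; names vary by PI release -->
  <reference reference-type="hard">
    <reference-target provider-name="sap.com"
        target-type="service">com.sap.aii.af.svc</reference-target>
  </reference>
  <reference reference-type="hard">
    <reference-target provider-name="sap.com"
        target-type="library">com.sap.aii.af.lib</reference-target>
  </reference>
  <provider-name>sap.com</provider-name>
</application-j2ee-engine>
```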

 

4.11 & 4.12

Follow steps listed in the document to complete compilation and deployment.

 

 

Reference

To test the modules prior to deployment, refer to the following article on how to perform standalone testing in NWDS:

Standalone testing of Adapter Module in NWDS

SAP HCI Trial - My Experiences


I recently had the great opportunity to use the SAP HANA Cloud Integration (HCI) platform through the trial program offered by SAP. It is a really nice tool that makes cloud-based integration much easier for companies. There are some obvious use cases: if you don't already have an integration tool but would like to invest in one, HCI makes sense. If you are already running Process Orchestration (PO) or Process Integration (PI) in-house, it might not be a good fit unless you wish to retire those systems. If you only have a few small integrations, then moving to the cloud-based platform also makes sense.

 

The trial period lasted 30 days. Unfortunately, the program is not yet available to everybody, which is a shame because SAP has really created an amazing product. It currently ties up some resources at SAP, and that is why it is not yet in wider use.

 

SAP HANA Cloud integration overview

I would like to give you some insight into the HANA Cloud Integration platform. HCI is SAP's cloud-based integration platform, designed to help companies evolve with cloud-based solutions while doing much of the integration work in cloud scenarios.

 

When you sign up for the service, you see an overview page. From there, you can select the pre-delivered content you need: just copy it and you are ready to use it. You can edit the content and change the integration flow to suit your needs, e.g. add a mapping between data structures. The mapping works the same way as in PI/PO. You can also add functionality with user-defined functions.

 

You are basically using the same tool as in regular message mapping. It is similar to handling mapping in BPMN, but unlike in PI, you can use multiple different mappings and transformations within one message process. HCI gives you more flexibility in mapping and in saving your projects. When you are done, press 'Deploy' to deploy your project to the system.

 

You can see what types of scenarios are available on your system. If you want, you can also see which processes are running - you get a list of everything that has run over a certain period of time. If you wish to see the data that has been sent through the system, you can find all the details in the Message Processing Log. This is quite a nice tool that lets you see processing times and other details. You will probably do most of your work in this web-based tool.

 

Let’s take a look at the Eclipse-based tool as well. This tool is designed for more complicated tasks. It basically works the same way: you can add and edit mappings; however, you still need to be very precise about where you place the different elements you are working with. The good thing is that you can easily import PI content (message mappings and interfaces): use Import ES Repository Interfaces to download everything. User-defined functions are not yet supported, though - if your mapping contains some, you might have issues.

 

 

You can also work with XSLT and Groovy scripts. Most of this works the same way as user-defined Java mapping, but XML handling is easier in Groovy. If you want a scripting language with first-class XML support, Groovy should be a top choice: you can navigate down into your content with dot notation instead of chaining getChild calls.
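
As a rough illustration of what that dot notation looks like, here is a hedged sketch of an HCI Groovy script. It assumes the standard HCI script entry point (a processData method receiving a com.sap.gateway.ip.core.customdev.util.Message); the payload structure (order/item/name) is invented purely for the example.

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    def body = message.getBody(java.lang.String)
    // XmlSlurper lets you walk the XML tree with dot notation
    // instead of chained getChild calls
    def order = new XmlSlurper().parseText(body)
    message.setHeader("itemName", order.item.name.text())
    return message
}
```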

 

All in all, I really like this platform, and I can find many use cases for it. You will notice some similarity to Business Connector (BC), an older SAP integration tool. And if you are integrating with a cloud application, HCI is a good way of doing it, because the integration itself runs in the cloud.

 

I really like the product. It gives you a lot of flexibility as a developer. There are a few bugs, but I am sure these will be sorted out in the following releases. The Eclipse tooling has already been updated, so you will see that it works just fine.

 

This blog is crossposted to SAP HCI Trial - SAP PI Course


How do you see HCI? Please share below in the comments.
