
Message Alerting in SAP PI/PO: Sending Alerts as Push Notifications to Mobile Device


Intro

The Component-Based Message Alerting (CBMA) functionality of the Advanced Adapter Engine (AAE) of SAP PI/PO already ships with an alert consumer that delivers alerts to recipients as e-mail messages. With the increased popularity of smart mobile devices in alerting and notification solutions, some organizations see demand for alternative alert delivery channels and are interested in bringing alerts to mobile devices. Even though e-mail alerting is still very popular in the enterprise IT world, one of the commonly raised questions in the area of SAP PI/PO alerting during the last few years has been the feasibility of delivering alerts via SMS. SMS messages are surely more lightweight than e-mail messages, but a large number of SMS alerts on interface errors may overflow the SMS inbox and cause other SMS messages to go unnoticed among the many alerts, unless some kind of SMS sorting and filtering is applied. Moreover, there is an even more lightweight technique available on many smart devices: push notifications. What if an alert from an SAP PI/PO system were delivered to a mobile device as a push notification?

 

The standard shipment of an SAP PI/PO system doesn't provide this kind of functionality by means of pure customizing, so let me share an option for how it can be introduced to a system with minor custom development.

 

It should be stated upfront that I haven't yet found a technical solution for this requirement that doesn't involve an intermediate service responsible for processing alert messages and delivering them to the mobile device. A high-level interaction diagram of the components involved in the described solution is depicted in the illustration below:

Components diagram.png

The solution works as follows:

  • A custom alert consumer is registered for an alert rule in the SAP PO system;
  • A custom alert consumer job (a Java Scheduler job) is developed, deployed, and scheduled for periodic execution in the SAP PO system: the job consumes alerts produced for the registered consumer, suppresses alerts that correspond to the same interface and the same error, and sends requests containing alert notifications to a RESTful service exposed by an external gateway service;
  • On a mobile device (in the demo described in this blog, a phone), an application provided by the gateway service provider is installed, and the device is registered as a recipient for messages produced by the gateway service.


As a result, when an alert is produced by the SAP PO system and consumed by the alert consumer job, an alert message is sent to the gateway service, which forwards it to the mobile device. The device processes it and renders it as a push notification.


Please note that this mechanism of delivering alert notifications to a recipient is in no way intended as a replacement for e-mail notifications or other alert subscribers. Push notifications by nature deliver very lightweight information, commonly consisting of only a few lines of text. In contrast, e-mail notifications or other alert subscribers / alert aggregation and processing systems (like SAP Solution Manager, if it is used as a central engine for alert processing) normally handle and deliver a more extended version of an alert, providing much more detail about it. Having said that, I would rather encourage considering push notifications for alerts as a complementary technique to the existing and commonly used alerting approaches.


The remaining part of the blog covers the major configuration steps, technical details of the solution implementation, and a demo.



Prerequisites: AAE part

CBMA is activated in AAE and an appropriate alert rule is configured and enabled. If you need more information on this, there are decent SCN materials that describe the concept, technical details, and configuration of CBMA in general, and alert consumption and e-mail notifications in particular:

 

The architecture and configuration of CBMA are also described in the SAP Help documentation: Component-Based Message Alerting - Administering Process Integration (PI) - SAP Library.

 

 

Prerequisites: Gateway service part

There are multiple services available to individuals and organizations that mediate incoming messages and deliver them as push notifications to various platforms running on mobile devices (and some of them serve not only mobile devices, but also desktops) - here are just a couple of commonly used ones:

Please note that some services require a paid subscription, so it is advisable to refer to the corresponding provider's official materials for details regarding the terms and conditions of their services.

 

In the demo below, I'm using the gateway service provided by Pushover - some parts of the described job implementation are specific to the Pushover API and are not service agnostic (they should be adapted if you would like to use another gateway service provider, or if you host and run such a service within your organization). The Pushover service exposes a REST API (RESTful service) to which registered users can send POST requests over HTTPS. The message transmitted in a request is then forwarded to a registered device (a mobile device running iOS or Android, or a desktop). The mobile device processes the received message and hands it to the device's notification center software in the form of a push notification.

 

An account was registered, and an application on behalf of which SAP PO alerts are going to be delivered to the mobile device was created. For a registered account, the Pushover platform generates a random user key, which will be required later when sending alert notification messages to the gateway service. Similarly, for every created application, the Pushover platform generates a random application key (token), which will also be required later when sending notification messages to the gateway service.

 

 

Prerequisites: Mobile device part

Gateway services may require the installation of specific applications on mobile devices to enable communication between the gateway service and the device, before messages can be delivered and rendered by the device as push notifications.

 

The corresponding Pushover application was installed on the mobile device used in the demo (an iPhone running the latest iOS version).

 

After this was done, the application was started for the first time, logged into the Pushover service using the user account credentials created earlier, and a device ID was specified. The device can later be used to subscribe to specific applications, or be explicitly specified as a recipient when sending notification messages to the gateway service.

 

 

The screenshots below highlight the major objects created in the Pushover service account that will be used in the demo:

  • User key - masked due to security considerations;
  • Application named "SAP", application key - masked due to security considerations;
  • Device named "Vadim".

Pushover service setup - general.png

Pushover service setup - application.png

 

 

Register custom alert consumer

A custom alert consumer named "PUSH_NOTIFICATION" was assigned to the alert rule:

Alert rule.png

 

 

Develop custom alert consumer job

The steps required to develop a custom Java Scheduler job are well described in the SAP Help documentation: Creating the Hello Job Definition - Using Central Development Services - SAP Library.

 

A custom Java Scheduler job named "AlertConsumerPushNotificationJob" was developed. The job logic comprises the following steps:

  1. Consume alerts from the local alerts store of AAE for the specified alert consumer. The job implementation provides two alert consumption methods:
    • Using the SOAP service AlertRetrieveAPI_V2 provided by AAE. The Alert Engine provides an Alerting API exposed as a SOAP service, which can be used to develop custom external consumers, consume alerts, and process them in a custom-specific way. Technical details about the usage of this API can be found in the SAP Help documentation: Alerting API on Alert Engine - Administering Process Integration (PI) - SAP Library. The WSDL definition of this SOAP service can be retrieved from the SAP PO system at http://<host>:<port>/AlertRetrieveAPI_V2_Service/AlertRetrieveAPIV2ImplBean?wsdl;
    • Using the consumer-specific JMS queue jmsqueues/alertingVP/jms/queue/xi/monitoring/alert/<consumer ID> registered in the JMS Provider of AAE. It should be noted that even if an alert consumer is registered for an alert rule, the corresponding JMS queue is only created automatically when a first alert is generated and due for delivery to that consumer. This means that access to the local alerts store may fail because the corresponding JMS queue is not found, and not necessarily due to an error, but simply because no alerts have been produced for that consumer yet;
  2. Parse the consumed alerts and retrieve the alert payload, which is stored in the local alerts store in JSON format;
  3. Suppress individual alerts so that only one notification is generated for multiple alerts that correspond to the same scenario and relate to the same error. Generally speaking, this step is optional, but it was introduced to avoid flooding the mobile device with push notifications and degrading the user experience in case many alerts are produced by the SAP PO system for the same error in the same scenarios within a short period of time;
  4. Prepare and send requests to the RESTful service exposed by the gateway service (see the sketch right after this list). For each notification, an individual request containing a reduced, very basic and minimal set of the original alert's attributes is sent. Extensive documentation on the REST API of the Pushover gateway service is available at https://pushover.net/api.
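To make step 4 more concrete, below is a minimal sketch of such a notification request, assuming the Jersey 1.x client API (which runs on JDK 6 and matches the build environment mentioned in the Downloads section). The Pushover form parameters token, user, device, and message come from the service's public API documentation; the class and method names are illustrative and do not reproduce the actual job implementation:

import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.MultivaluedMap;

import com.sun.jersey.api.client.Client;
import com.sun.jersey.api.client.ClientResponse;
import com.sun.jersey.api.client.WebResource;
import com.sun.jersey.core.util.MultivaluedMapImpl;

public class PushoverClient {

    // Sends one push notification and returns the HTTP status code (200 on success).
    public static int sendNotification(String serviceUrl, String token, String user,
                                       String device, String message) {
        Client client = Client.create();
        try {
            WebResource resource = client.resource(serviceUrl); // https://api.pushover.net/1/messages.json

            // Pushover expects a form-encoded POST body
            MultivaluedMap<String, String> form = new MultivaluedMapImpl();
            form.add("token", token);     // application key (PushNotificationToken)
            form.add("user", user);       // user key (PushNotificationUser)
            form.add("device", device);   // optional recipient device (PushNotificationDevice)
            form.add("message", message); // reduced set of the original alert's attributes

            ClientResponse response = resource
                    .type(MediaType.APPLICATION_FORM_URLENCODED)
                    .post(ClientResponse.class, form);
            return response.getStatus();
        } finally {
            client.destroy();
        }
    }
}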

 

The following 3rd-party libraries were used in the implementation and are dependencies of the developed job:

  • Google Gson (GitHub - google/gson: A Java serialization library that can convert Java Objects into JSON and back) - used for conversion of Java objects to / from their JSON representation. The alert payload is persisted in the local alert store of AAE in JSON format; conversion from the original JSON-formatted message containing the alert payload to a Java object is used internally in the alert consumer job in order to further process the consumed alerts (a short sketch follows below);
  • Jersey (Jersey) - used to enable RESTful client capabilities in the alert consumer job and send notification messages to the RESTful service of the gateway service.

 

Additionally, the Java class reflecting the structure of the JSON message was generated using http://www.jsonschema2pojo.org/.
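For illustration, here is a minimal sketch of that conversion with Gson; the attribute names shown are a hypothetical subset of the alert payload, since the actual class should be generated from the alert JSON of your own system:

import com.google.gson.Gson;
import com.google.gson.annotations.SerializedName;

// Hypothetical subset of the alert payload attributes; generate the real class
// from your system's alert JSON (for example, with jsonschema2pojo).
public class AlertPayload {

    @SerializedName("MsgId")
    private String msgId;

    @SerializedName("ErrText")
    private String errText;

    public String getMsgId() { return msgId; }

    public String getErrText() { return errText; }
}

// Usage: AlertPayload alert = new Gson().fromJson(alertJsonText, AlertPayload.class);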

 

Both mentioned alert consumption methods (calling the SOAP service and consuming from the JMS queue) have advantages and disadvantages - here are a few of them:

 

Authentication
  • SOAP: SOAP service consumption requires explicit authentication, which has to be embedded into the alert consumer job logic. In the described demo, basic authentication support is implemented; the user and password required for authentication are exposed as job parameters, which is not secure enough (they are stored in plain text in the job details and can be retrieved later by looking into the job parameters of a scheduled job). More complex logic for retrieving valid authentication credentials can be implemented instead, if required.
  • JMS: The JMS Provider and the corresponding JMS queue registered in it are accessed using an initial context and JNDI lookup. Additional explicit authentication is not required for server-side JNDI clients such as applications running in the same system - for example, a deployed Java Scheduler job.

Performance
  • SOAP: The SOAP service internally accesses the JMS queue associated with the specified alert consumer and consumes alert messages from it - in other words, it acts as a SOAP wrapper on top of that JMS queue. Thus, very minor extra latency may be expected when using the SOAP service in contrast to direct access to the JMS queue.
  • JMS: The core components of the local alert store persistence layer are the consumer-specific JMS queues - in this way, alert consumption from the JMS queue corresponds to lower-level access to the local alert store and may result in slightly reduced processing time of the alert consumption logic.

Implementation efforts
  • SOAP: Usage of the JAX-WS Web Service Client generation wizard in NWDS with the provided WSDL simplifies creation of the required Java proxy interfaces and classes. Invocation of the required SOAP-related functionality (like creating the request, calling the SOAP service operation, receiving and handling the response) is abstracted and simplified.
  • JMS: Some more effort is required to implement the JMS consumer and the respective resource cleanup.


These aspects may or may not be relevant in a given environment and for specific requirements, so the implemented job supports both alert consumption methods, and whoever schedules the job may decide which option suits them better.
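For illustration, here is a minimal sketch of the JMS consumption path; the queue name follows the pattern documented above, while the connection factory JNDI name is an assumption and may differ in your system:

import javax.jms.JMSException;
import javax.jms.Queue;
import javax.jms.QueueConnection;
import javax.jms.QueueConnectionFactory;
import javax.jms.QueueReceiver;
import javax.jms.QueueSession;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;
import javax.naming.NamingException;

public class AlertQueueConsumer {

    public void consume(String consumerId) throws NamingException, JMSException {
        // Server-side JNDI client: no explicit credentials are required, since the
        // job runs in the same system as the JMS Provider.
        InitialContext ctx = new InitialContext();
        QueueConnectionFactory factory = (QueueConnectionFactory)
                ctx.lookup("jmsfactory/alertingVP/QueueConnectionFactory"); // assumed JNDI name
        Queue queue = (Queue) ctx.lookup(
                "jmsqueues/alertingVP/jms/queue/xi/monitoring/alert/" + consumerId);

        QueueConnection connection = factory.createQueueConnection();
        try {
            QueueSession session = connection.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
            QueueReceiver receiver = session.createReceiver(queue);
            connection.start();

            TextMessage message;
            // receiveNoWait() returns null once the queue has been drained
            while ((message = (TextMessage) receiver.receiveNoWait()) != null) {
                String alertJson = message.getText(); // alert payload in JSON format
                // ... parse with Gson, suppress duplicates, send push notifications
            }
        } finally {
            connection.close();
        }
    }
}

Remember that the lookup may fail if no alert has been produced for the consumer yet, as explained above.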

 

The outcome of all major job steps is reflected in the job log. If the job completes successfully, its return code is set to "0"; if an error occurs during alert consumption, parsing / suppression, or push notification generation, the return code is set to "-1". Further details describing the error reason can be found in the job log.

 

 

Demo

After the EAR file containing the custom alert consumer job has been successfully deployed to the SAP PO system, we are ready to schedule the job and test its behavior.

 

The corresponding job definition for which the job task is to be scheduled is "AlertConsumerPushNotificationJob":

Job scheduler - task.png

Currently, the following job parameters are valid for this job definition:

 

  • AlertConsumerMethod - Method used to consume alerts from the local alerts store: either via the SOAP service (parameter value: SOAP) or from the consumer-specific JMS queue (parameter value: JMS)
  • AlertConsumer - Alert consumer ID (the one that was used when registering the new consumer in the alert rule)
  • AlertLanguage - Alert language (for example, EN)
  • AlertMaximumNumber - Maximum number of alerts consumed from the local alerts store. This is used as a safety measure preventing potential performance degradation in case an extensive number of alerts has been produced for the specified alert consumer
  • PushNotificationService - URL of the gateway service to which alert notifications are sent. In the case of Pushover, the following fixed URL exposed by Pushover is used: https://api.pushover.net/1/messages.json
  • PushNotificationUser - User key generated by Pushover for the registered user account, used for authentication of incoming requests containing alert notifications
  • PushNotificationToken - Application key (token) generated by Pushover for the created application, used to differentiate incoming requests produced for different applications. Generally speaking, it is possible to create multiple applications within the same Pushover account and associate requests with different applications
  • PushNotificationDevice - Recipient device to which push notifications are delivered by the Pushover service. If necessary, multiple devices can be specified, separated by commas. The parameter is optional and can be left blank; if so, push notifications are delivered to all devices subscribed to the application for which the notifications are generated
  • AAEUser - Mandatory only if the SOAP alert consumption method is selected: the user account used for authentication when calling the SOAP service. If the JMS method is used, the parameter can be left blank (it is ignored by the job logic)
  • AAEPassword - Mandatory only if the SOAP alert consumption method is selected: the password used for authentication when calling the SOAP service. If the JMS method is used, the parameter can be left blank (it is ignored by the job logic)

 

Below is an example of the job parameters provided for a scheduled job task (the push notification user key and application token are masked due to security considerations):

Job scheduler - job parameters.png

 

After the job is scheduled, I produce several errors for one of the scenarios assigned to the alert rule created earlier for this demo.

 

If the job executes successfully, the job log will look similar to the one below:

Job log - successful.png

 

Let me also demonstrate some error cases by creating the job task with various combinations of job parameters. If job execution encounters errors, it is worth checking the job log first to identify the problematic area or the step where the error occurred. Here are a few examples of how error cases are reflected in the job log:

  • An alert consumption method other than SOAP or JMS was specified in the job parameters:

Job log - error - unsupported method.png

  • When using the JMS consumption method, alert consumption failed because a wrong consumer ID was specified (the reason why a missing JMS queue produces a warning rather than an error log entry has been explained earlier):

Job log - warning - JMS lookup.png

  • When using the SOAP consumption method, alert consumption failed because an invalid password was specified for the AAE user account in the job parameters:

Job log - error - SOAP authentication.png

  • Message delivery to the gateway service failed because an invalid Pushover user key and/or application token was specified in the job parameters:

Job log - error - gateway service.png

 

If the job executed successfully, then after it completed and the notification messages reached the gateway service, the respective push notifications containing selected alert information are delivered and shown on the mobile device:

Mobile - notification.PNG

Mobile - Pushover notifications overview.PNG

Mobile - Pushover notification details.PNG

 

 

Downloads

You can download the SAP EAR file bundling the described job implementation, and the source files from which it was built, from the following locations:


The projects were built using NWDS 7.31 and compiled with JDK 6. The most recent stable versions of the mentioned dependency libraries available at the time of writing were embedded into the assembled EAR file.


The job was tested on an SAP PO 7.31 SP16 system. It has only been used and tested as a prototype and has not yet been used in a productive environment.


SAP HANA Cloud Integration :: Webservice SuccessFactor - A Walkthrough


Overview

 

We are aware that today's technology world is shifting towards cloud platforms, and many companies offer cloud-compatible business solutions. One major product is SuccessFactors, SAP's cloud solution for HCM. Hence, the integration scope for cloud-to-cloud and cloud-to-on-premise systems has already expanded. SAP PI/PO, the middleware technology, caters to all A2A/B2B integration needs but supports on-premise system connectivity only. SAP's HANA Cloud Integration (HCI) platform provides integration featuring cloud-to-cloud and cloud-to-on-premise systems.

 

About

 

This blog walks you through a scenario featuring connectivity between an on-premise system and a cloud solution via HCI.

 

(On premise) Webservice (SOAP) --> SAP HCI (Cloud) --> SuccessFactors (Cloud Solution)


 

Tools used

  • Eclipse Luna (4.4.2) IDE
  • SOAP UI 5.2.0
  • KeyStore Explorer 5.1.1

Pre-requisite

  •   The above-mentioned tools should already be downloaded onto your system
  •   HCI development tools downloaded from https://tools.hana.ondemand.com/luna
  •   And of course an SAP HANA Cloud / HCI instance (trial version).

 

Scenario

A web service call with user name details passed to HCI is triggered from SOAP UI, and the corresponding user details are fetched from SuccessFactors. I have used an XPath expression to get the user details dynamically from SuccessFactors. The step-by-step approach is explained below.

 

 

Overall Flow

Integration Scenario.JPG

 


Source Structure

 

The synchronous source structure was created using an on-premise PI system <<HCI Integration Flow help>>, and the same has been imported here into the HCI system.

 

        UserDetail_Source.JPG

        SOAP Adapter.JPG

Connection Details


Address --> /userquery is scenario-specific; it is added as a suffix to the endpoint URL generated for the scenario.
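For illustration, the resulting endpoint typically looks something like the following (the host part is hypothetical and depends on your tenant; only the /userquery suffix comes from the channel configuration above):

https://<tenant>-iflmap.hcisbp.<data center>.hana.ondemand.com/cxf/userquery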

 

The remaining Service and Endpoint fields can be selected manually, or, by choosing the respective WSDL, they are auto-populated.

 

Success Factor Adapter Configuration

 

The SuccessFactors adapter supports the SOAP and OData protocols for SFAPI and the OData API respectively. We have opted for SFAPI in our case.

 

 

SFAdatper.JPG

SFAdapterconfig.JPG



Screenshots of the configuration of the connection to the SuccessFactors server are captured in the attached document <<HCI Integration Flow help>>.

 

The target XSD was generated by connecting to the SuccessFactors server and choosing the required entity, the User entity in our case.

 

XPath is used in the query to get the dynamic user name at runtime.


Connectivity between SuccessFactors and HCI


Connectivity between SuccessFactors (the cloud solution for HCM) and HCI requires certificate-based authentication.


Authentication mode.JPG

Basic authentication would be the preferred mode for inbound messages to SAP HCI, but here we need to go with certificate-based authentication: since HCI calls SuccessFactors (a push from HCI) to establish connectivity, certificate-based authentication is required. Expect the error below otherwise.

 

Even if SuccessFactors is at the sender end, certificate-based authentication is required.

 

 

"Error = javax.xml.ws.WebServiceException: Could not send Message., cause: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target"


Basic Credentials and Certificate Deployment Steps


The following are the different authentication types currently available in SAP HCI. Different scenarios require different authentication types.
For example, for connectivity to Twitter we need to maintain Secure Parameter type IDs for authentication purposes.


Our case requires configuration and deployment of User Credentials (username/password) and a keystore (for certificate deployment).


The wizard is available by right-clicking on the cloud instance, which can be accessed from the bottom-left corner (Node Explorer view).


                Node explorer.JPGArtifacts deployment.JPG



User Credentials


The system's user credentials are maintained in the format below and, once deployed to the HCI server, can be reused multiple times for connectivity to the same server. In our case, we need to maintain the SuccessFactors cloud instance username and password, configured at the SuccessFactors adapter level.

system credz.JPG


Keystore Certificates

 

KeyStore Explorer is used to store the certificates we obtained from the vendor (SuccessFactors here: root, intermediate, and CA) in a keystore in .jks format, which is then deployed to the HCI server. Certificates are maintained in the single standard keystore system.jks, and you need to ensure that all the old and new certificates are available in it.

 

To do this, the system.jks file should be downloaded from the HCI server, and after appending the latest certificates we can deploy system.jks again.
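For illustration, appending a certificate to the downloaded keystore can also be done from the command line with the JDK's keytool (the alias, file name, and password are illustrative):

keytool -importcert -alias successfactors_root -file sf_root.cer -keystore system.jks -storepass <keystore password>

Repeat the import for each of the remaining certificates, then deploy the updated system.jks as described.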

 

A few other blogs talk about the certificate download, so they are pointed out in the reference section to avoid redundancy.

 

keystore explorer.JPG

 

Message Mapping

 

We maintain a simple message mapping between source and target, and the user details XPath value is passed to the query (dynamic user details) to fetch details from SuccessFactors.

 

message mapping.JPG

 

 

Overall Integration Flow (iFlow)



IFlow.JPG

Scenario Deployment

 

Each and every iFlow must be deployed to the server, and an endpoint URL is generated for each one.

 

We can copy the endpoint URL, or download the WSDL, and share it with the SOAP clients.

 

SOAP UI Call


Using the WSDL URL, a test project is created in SOAP UI.


A simple SOAP UI request call is made to invoke the service in HCI (with basic authentication).


SOAP UI Request.JPG


Once the request is triggered from SOAP UI, the call is made to HCI, SuccessFactors is connected, and the response obtained from SuccessFactors is passed back to SOAP UI via HCI.


SOAP UI Call.JPG


Monitoring


Message Monitoring at HCI level


The screenshot below depicts message flow monitoring at the HCI server level.


HCI _monitoring.JPG


SuccessFactors-level Monitoring


Log on to the SuccessFactors cloud instance and choose the SFAPI audit log to monitor the message flow.


Successfactor Monitoring.JPG


A detailed log of the request and response can also be viewed at the SuccessFactors level.

 


 

SF audit log.JPG

I understand that many blogs related to SuccessFactors are available on the net; however, I believe this blog gives you an idea of, and step-by-step guidance for, web service to SuccessFactors connectivity via HCI.

 

Hope it helps. Happy Learning

 

Reference :

SuccessFactors Adapter in SAP HANA Cloud Integration (SAP HCI)

Successfactor Integration with HCM system Using SAP PI

Stop using the wrong NWDS version! © - Have a smoother Eclipse experience


Introduction

As SAP's product team continually closes the gaps between the functionality in NWDS and that of the traditional Swing-based clients, usage of NWDS as the development environment for PI/PO is increasing in popularity. This is especially true when working on features that are only available in single-stack environments, such as iFlows or NW BPM.

 

Despite the various NWDS installation materials found on SCN, it is still not uncommon to find discussion threads popping up in the forum related to NWDS issues. Such issues can be found here, here, here, here, here, here and here (phew!!!).

 

The most important thing in achieving a correct setup is ensuring compatibility between NWDS and the PI/PO system. This means having an exact match between NWDS and the PI/PO system in terms of:-

  • Version
  • SP level
  • Patch level

 

This means, don't use an NWDS with a version/SP/patch level higher than the PI/PO system, i.e.:-

  • Don't use NWDS 7.31 on a lower PI system, like PI 7.11

 

  • Don't use NWDS 7.31 SP17 on a PI system with a lower SP, like PI 7.31 SP12

 

  • Don't use the latest and greatest NWDS 7.5 on a PI/PO 7.31/7.4 system

 

While it is fine to use an NWDS version with a lower SP level, it is recommended to use an exact match to benefit from any new features or bug fixes available.

 

 

Checking the version/SP/patch level of the PI/PO system

As compatibility is of utmost importance, the first step is to check the version/SP/patch level of the PI/PO system. The steps are mentioned in SAP Note 1381878 and are reproduced here in graphical form.

 

Navigate to NWA > Troubleshooting > Java > System Information

sysinfo.png

 

Select Components Info tab, and filter Name column by XIESR. The version is listed in the Version column.

Note: XIESR component is just one example of the many components in the system, and this assumes all the other XI components are consistent across the system.

info.png

 

The way to interpret the version number is as follows:-

version.png

 

 

Downloading the correct NWDS version

Once the PI/PO system's version has been identified, proceed to the following Wiki article to download the corresponding NWDS version. Apart from the latest NWDS 7.5 which is available from Service Marketplace, the other versions are downloadable from the corresponding update site.

NWDS Download Links - Java Development - SCN Wiki

 

For those working on PI/PO 7.4, there is no corresponding NWDS 7.4 available. This is because NetWeaver 7.4 uses the same code base as 7.31, and therefore an equivalent NWDS 7.31 installation can be used. This is mentioned in SAP Note 1791485 - NWDS 7.3 EHP1 as development environment for SAP NetWeaver 7.4. Following is a snippet from the note on how to determine the equivalent NWDS 7.31 version to use:-

If you are using SAP NetWeaver 7.4 SP lower that the latest, please install SAP NetWeaver Developer Studio 7.3 EHP1 with SP version five levels higher than the number of your SAP NetWeaver 7.4 SP.

For example: You are using SAP NetWeaver 7.4 SP 4. You have to install SAP NetWeaver 7.31 EHP1 SP 9.

 

So based on the example above:-

  • PI/PO system = 7.40 SP08
  • NWDS = 7.31 SP13 (8 + 5)

 

And therefore, the following should be the corresponding compatible version of NWDS.

download.png

 

Additionally, it is also important to have the correct JDK when using NWDS. This should be the same as the JVM version of the PI/PO system.

 

i) PI/PO 7.31 or 7.4

Runs on JVM 6.1, therefore it requires a JDK 1.6. This can be downloaded from SAP or Oracle. Personally, I prefer using SAP's as it provides a more stable experience. It can be downloaded from the update site itself as shown in the screenshot below.

jvm.png

 

ii) PI/PO 7.5

Runs on JVM 8, therefore it requires a 64-bit JDK 8.

 

 

Conclusion

Working with NWDS as the development environment is great, and I personally enjoy the various benefits of working on the Eclipse platform. However, getting the setup right is essential to enjoying a smooth experience. Hopefully this blog will help the community be aware of the prerequisites when installing NWDS, in order to avoid unnecessary incompatibility issues.

SFTP addon installation in PI NW 7.5 using SUM 15


Hello Guys,

 

Recently I installed the SFTP add-on in NW PI 7.5 using the SUM 15 tool. In this blog I am going to describe how to install it using SUM 15. I hope this will be helpful to everyone.

 

Take an offline backup of the PI system before starting to apply the add-on.

 

Capture1.PNG

 

Download the SFTP adapter from the path below. For more information, refer to SAP Note 1695521.

 

http://service.sap.com/swdc> Installation and Upgrades > Browse our Download Catalog > SAP NetWeaver and complementary products > PI SFTP PGP ADDON > PI SFTP PGP ADDON 1.0 > 51047870.ZIP

 

Capture2.PNG

 

 

Place the appropriate SFTP adapter source files (the PIB2BSFTP*.SCA component, the SAPCAR executable, etc.) on the PI system.

 

Capture3.PNG

 

SAPCAR and the SUM component: make sure appropriate permissions are set on the SUM component as per the screenshot. Ideally, the owner should be <sid>adm:sapsys.

 

Capture4.PNG

 

Extract the SUM tool into the /usr/sap/<SID> directory.
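As a sketch, the extraction can be done with SAPCAR (the archive name is illustrative and depends on the SUM version you downloaded):

SAPCAR -xvf SUM15SP<xx>_<patch>.SAR -R /usr/sap/<SID>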

 

Capture5.PNG

 

Capture6.PNG

 

The SUM directory is created under /usr/sap/<SID>.

 

Capture7.PNG

 

Go into the SUM directory and execute the command below as the root user.

 

./STARTUP confighostagent <SID>

 

Capture8.PNG

 

Open the URL below in a browser; it will prompt for a user ID and password.

 

http://<hostname>:1128/lmsl/sumjava/<SID>/index.html

 

Capture9.PNG

 

Capture10.PNG

 

Capture11.PNG

 

Capture12.PNG

 

Provide the path of the manually prepared directory in which the SFTP sources reside.

 

Capture13.PNG

 

Capture14.PNG

 

Provide the Administrator password.

 

Capture15.PNG

 

Capture16.PNG

 

The manually provided directory is scanned, and the SFTP component is identified.

 

Capture17.PNG

 

 

Capture18.PNG

 

The SUM tool identifies the parameters of your SAP system.

 

Capture19.PNG

 

Capture20.PNG

 

Capture21.PNG

 

When you click the Next button, the SFTP component starts deploying, and the Java system restarts automatically after successful deployment.

 

Capture22.PNG

 

Capture23.PNG

 

SFTP component deployed successfully.

 

Capture24.PNG

 

Capture25.PNG

 

Capture26.PNG

 

After the SFTP component has been successfully deployed, go to http://<hostname>:50000/nwa/sysinfo

 

In the component info, search for SFTP; it will show the deployed SFTP component.

 

Capture27.PNG

 

The PIB2BSFTP component is deployed, which means SFTP is installed in PI.

 

Now, go to http://<hostname>:50000/nwa

 

Capture28.PNG

 

Capture29.PNG

 

The SFTP adapter should be in started state.

 

Capture30.PNG

 

Up to this point, we have deployed the SFTP add-on in the PI system.

 

To use SFTP in configuration with the IB (Integration Builder) and ESR (Enterprise Service Repository), we must also deploy the SFTP adapter content in the ESR; otherwise it will not be usable.

 

I will create a new blog soon on how to deploy the SFTP content in the ESR.

 

 

Regards,

Harshil Shah

SAP Integration workshop with the Swedish user group


Early last week I was in Stockholm meeting with the Swedish user group. The purpose of the meeting was a workshop on the integration topic that I ran together with my colleague Udo Paltzer.

 

We had representatives from key customers and partners in the region, and the goal for them was to better understand SAP's strategy around integration in general, to understand SAP HANA Cloud Integration in depth through deep hands-on sessions, and to understand our offering for SAP API Management as well as the roadmap for SAP Process Orchestration.

 

10257688_10209225053687419_8550766828211764266_n (1).jpg

 

Since there were quite a few 'when to use what integration technology from SAP' types of questions, I also believe that the Integration Solution Advisor methodology played a key role during the entire workshop in narrowing down the choice based on a very simple methodology and framework. If you are interested in learning more about it, check out http://scn.sap.com/community/pi-and-soa-middleware/blog/2016/03/04/int203-integration-solution-advisor-methodology-isa-m-sap-teched-lecture-of-the-week

 

The best part of the workshop, which ran on 15 and 16 March at the CGI office in Kista, Stockholm (supposedly the Silicon Valley of Sweden), was the open and very constructive communication and sharing of knowledge that helped everybody get on the same page.

Many of the customers and partners had a strong Process Orchestration / Process Integration footprint and were in the midst of cloud integration projects, or were in the process of deciding which cloud integration middleware would serve their use cases best.

 

The hands-on sessions on SAP HANA Cloud Integration that we went through jointly showcased how simple it is to leverage the packaged integration content from SAP and to model and adapt integration flows to your specific needs. Everybody was thrilled to explore the new content packages from SAP, which can be found at https://cloudintegration.hana.ondemand.com/#shell/catalog

 

IMG_6879 (002).JPG

 

The SAP API Management solution clearly showed everybody how you can drive topics like API governance and usage tracking of the APIs in your landscape, especially when the innovation cycles on the consumer side grow so much faster than the innovation cycles of your backend applications.

 

All in all, a great workshop. To top it all, a big thank you to Pontus Borgström (SAPSA, SKF), Estelius Johan (CGI), Åsa Jonnson (SAPSA) and Lena Hartung (CGI) for hosting us, for working with us over the last months to put together the format of this workshop, and of course for their great hospitality.

Mass-Switch between Central and De-Central Adapter Engine with the Migration Tool


Planned or unplanned, there are a number of situations where it can be necessary to quickly switch your scenarios between your central and non-central Adapter Engine. Doing that manually may take a long time... time that, especially in critical situations, is not available.

 

This post will show you how to do a fast switch from one machine to another with the PI Migration Tool.

 

Open the PI monitoring tool (PIMON http://host:port/pimon) and go to the Migration Tool.

 

1.png

 

As soon as it's open, it should look like this. Now you have to open the channel migration.

 

2.png

Now you have to enter the source and target system. In this case, the source and target system are the same.

 

3.png

After you have entered your user credentials and clicked the Next button, you will be able to get a list of all available communication channels.

 

4.png

You can simply click on the Search Button to get a list of all channels or you can use the filter function.

 

5.png

In the result list it's possible to mark several communication channels. Mark all that should be switched.

6.png

As soon as the channels are marked, you will see the changeable parameters in the property window. Flag the checkbox at the "Adapter Engine" parameter and choose the engine on which the channels should run.

 

7.png

 

As soon as you have made your selection, click the Apply button, and after that the Next button (you may have to scroll down).

 

On the next screen you can define the name of the change list to which the changes will be added.

8.png

Click on Create to perform the changes.

9.png

When all changes have been made, you get a log of the changes. Click on Finish and start your Integration Builder.

 

10.png

 

A problem can be how to identify the channels that have to be moved, especially when you want to move the channels back later. For that it can be useful to include a channel's default machine in its name (e.g. CC_AEX_MYCHANNEL), or to identify channels via the communication component: for example, all channels of component A normally run on the central engine, while channels of component B run on the decentral engine.

 

This process can also be used for other mass manipulations, like deactivating a huge number of channels.

Configuring HANA Cloud Connector with HCI to Use OData Adaptor


This blog explains how to use HCC to connect the HCI OData adaptor with SAP Gateway. HCC enables HCI adaptors to communicate with SAP on-premise systems.

2016-03-22_22-07-59.jpg

Setting up HANA Cloud Connector

  • Download HCC from here.
  • After extracting the HCC zip file, open the directory and run the batch file go.bat to start cloud connector.
  • Next, open the URL https://localhost:8443/

2015-11-23_14-24-49 (1).png

  • For the username and password, enter Administrator / manage (case sensitive).
  • Next, choose Master type installation and click Apply.

2015-11-23_14-28-23 (1).png

  • On the next screen, enter your HCI tenant details. To find the details in your HCI Eclipse environment, click IFLMAP.
  • Note the Account ID and URL and provide it in the setup configuration window.

2016-03-23_08-28-14.jpg

2016-03-23_08-31-03.jpg

 

  • In the new window, click on Access Control, then click on Add to add a backend system to be used with HCI.

2015-11-24_02-34-14 (1).png

  • Choose backend system type as SAP Gateway.

gateway (1).png

  • Select HTTPS protocol.

2015-11-23_21-49-03 (1).png

  • Then, provide SAP Gateway internal hostname and port.

2015-11-23_21-51-33 (1).png

  • Next, provide a virtual host and port. You could give any values here.

2016-03-22_22-44-26.jpg

  • Choose Principal Propagation Type as None.

2015-11-23_21-53-20 (1).png

  • Click on Finish.

2016-03-22_22-46-59.jpg

  • Then, choose the system we just created, and click on Add.

2016-03-22_22-53-20.jpg

  • Provide the details below. Here I am choosing the URL path "/", which allows HCP to access all the OData services available in SAP Gateway. If you need to give access to only specific services, mention only those.

2015-11-23_21-57-12 (1).png

  • Now we are good to go. You should see a green icon under Connector State.

2015-11-23_15-55-42 (1).png

OData Adaptor in HCI

  • In the example below, I am using the OData receiver adaptor.

2016-03-22_23-03-11.jpg

  • In the adaptor settings, you should provide the address with the virtual hostname and port, as given below.
  • Also, the Proxy Type should be On-Premise when using HCC.

2016-03-22_23-05-06.jpg
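For illustration, if the virtual host configured in HCC were mygateway.virtual with port 44300, the address could look like the following (virtual host, port, and service name are hypothetical; /sap/opu/odata/SAP/... is the standard Gateway OData path):

https://mygateway.virtual:44300/sap/opu/odata/SAP/ZPRODUCTS_SRV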

 

To know more about the OData adaptor read this blog: OData Adapter in SAP HANA Cloud Integration

If you are interested to know how to do batch request in OData HCI adaptor read this blog: Batch Request in SAP HCI OData Adaptor

 

Regards, Midhun

SAP Technology RIG

Batch Request in SAP HCI OData Adaptor


Everyone knows what the OData protocol is - it is for accessing diverse data in a common way. With the OData adaptor in HCI, you can connect to any OData service provider and implement the required integration scenario.

 

In this blog I am explaining how to do a batch request with multiple operations on different OData collections.

 

If you are new to OData adaptor read this blog: OData Adapter in SAP HANA Cloud Integration

 

An OData batch request allows you to execute multiple operations in a single HTTP request.


Let's take an example.

  • Assume that you have two OData collections - Products and SalesOrders.
  • You should be able to do a batch with multiple operations as given below.

     2016-03-23_10-34-51.jpg


  • But if you are using the OData adaptor, you will find that you don't have an option to choose multiple OData collections, as shown below.

 

     2016-03-23_10-23-48.jpg

     2016-03-23_10-24-25.jpg

  • This doesn't mean that HCI stops you from doing a batch request with multiple operations on different OData collections.

 

  • The solution is to send the batch request in the payload, irrespective of what you configured in the model operation.
  • To show you how it works, I have created a simple iFlow. (I am not going into the basics of creating an iFlow; if you are new to HCI, read this blog.)

     2016-03-23_10-39-06.jpg

  • I have two OData collections - Products, SalesOrders.
  • In my batch request I want to create a Product and a SalesOrder.

     2016-03-23_13-19-49.jpg

  • Hence, the equivalent batch request payload to be used in the HCI content modifier body is given below.

     2016-03-23_13-33-07.jpg

  • For your reference, I have attached the body used in the Postman REST client; download it here.

 

  • The format of the batch request body to be used is given below.

     2016-03-24_08-36-06.jpg
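As an illustration of this format, here is a minimal sketch of a batch body that creates one Product and one SalesOrder in a single changeset (boundary names, entity set names, and JSON properties are illustrative and must match your service's metadata):

--batch_1
Content-Type: multipart/mixed; boundary=changeset_1

--changeset_1
Content-Type: application/http
Content-Transfer-Encoding: binary

POST Products HTTP/1.1
Content-Type: application/json

{"ProductId": "HT-1000", "ProductName": "Notebook"}
--changeset_1
Content-Type: application/http
Content-Transfer-Encoding: binary

POST SalesOrders HTTP/1.1
Content-Type: application/json

{"SalesOrderId": "500000001", "CustomerName": "ACME"}
--changeset_1--
--batch_1--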

  • The details for the payload can be taken from the metadata of the OData service.
  • Below, I have highlighted the entitySetName and entityTypeName in my metadata.

     2016-03-23_13-44-01.jpg

  • If your backend is an on-premise system (e.g. SAP Gateway), you need to configure HANA Cloud Connector with your HCI as mentioned in this blog.

 

  • In the adaptor settings, the address should refer to the virtual host and port configured in HCC, and the Proxy Type should be On-Premise.
  • Choose a model operation with batch processing enabled. The OData collection chosen can be independent of the batch payload we are using.

    2016-03-23_13-57-33.jpg

  • Now we can deploy the project and see how the two create operations (create Product and create SalesOrder) work through a single batch request.

 

     2016-03-22_20-19-02.jpg

 

Regards, Midhun

SAP Technology RIG


Middleware Solutions Rebooted: SAP Process Orchestration


logo_1_Rebooted Orchestration.jpg

 

middleware platform1.jpg

 

 

After my last post, Tim S. e-mailed a question that's very relevant to our discussion of middleware solutions. He wanted to know, "Just what is Process Orchestration and how is it different from Process Integration, or an AEX?" Obviously, there are some excellent courses on this topic, including SAP NetWeaver Process Integration (BIT400) and Process Orchestration Overview (BIT800). But before you dive into training, I'd like to use this week's blog to explain the concept of process orchestration, and offer a brief (but necessary) history lesson.

 

From a generic perspective, process orchestration is the idea that you can take business processes that are inefficient and time consuming, and use middleware tools to convert them to run more effectively across your business.

 

SAP offers an on-premise middleware solution called SAP Process Orchestration that helps automate and optimize business processes, and transform them from simple workflows to integrated processes that work across multiple applications and organizational boundaries. At a high level, this technology makes it possible to:

 

  • Develop custom process applications based on models
  • Exchange data across SAP and non-SAP applications
  • Automate decisions and ensure compliance with policies

 

Of course I haven’t answered the second part of Tim’s question. That is, how is process orchestration different from Process Integration, or an AEX? I know we technology geeks don’t spend a lot of time studying history.  But to answer this, I think we should take a trip back in time and see the progression of process orchestration middleware.

 

 

Process orchestration:  A history lesson


XPIPO time line.jpg

 

In the beginning there was Exchange Infrastructure (XI).  XI was a dual stack application. The adapter engine, Integration Repository, and Integration Directory were on the JAVA stack. The Central Integration Engine and the Business Process Engine (BPE) were on the ABAP stack.

 

 

The Adapter Engine was responsible for two actions: connecting to and from XI, and converting a message format to a usable SAP XML formatted message. The Central Integration Engine was responsible for routing and transferring messages. The Business Process Engine was based on the SAP Workflow Engine; the BPE was a very system-centric BPM tool based on BPEL4WS.

 

SAP XI_System.jpg

 

The components of Exchange Infrastructure included:

 

  • Integration Repository – Central design
  • Integration Directory
  • Integration Server, which was made up of 3 main engines:
  • - Adapter Engine
  • - Central Integration Engine
  • - Business Process Engine

XI 3.0 grew into Process Integration (PI) 7.0. For developers, Basis, and architects, not much changed other than the name.

 

There were big changes in 2007. For Basis, the changes were not very big, but for developers and architects the changes were worth taking a look at. The Integration Repository grew and became the Enterprise Service Repository, and the Service Registry was added to design time.

 

  • The Adapter Engine grew into the Advanced Adapter Engine. With this new component, you could map and route through the Adapter Engine. Now, a message could remain solely on the JAVA stack.
  • In the Integration Directory, we see a new configuration object called the "Integrated Configuration." This allowed the message to remain in the Advanced Adapter Engine to be routed and transformed.
  • You now had two engines to route and map messages: the Central Integration Engine and the Advanced Adapter Engine.

 

SAP PI system.jpg

Components of Process Integration (PI) included:

 

  • Enterprise Service Repository (ESR) / Service Registry (SR)
  • Integration Directory (ID)
  • Integration Server, which consisted of three main engines:
  • - Business Process Engine
  • - Central Integration Engine
  • - Advanced Adapter Engine

 

In 2010 things got interesting for Basis and architects. SAP introduced the Advanced Adapter Engine Extended. You had a choice to run your middleware either on a traditional ABAP/JAVA stack, or on a JAVA stack only. One concern was that you'd lose the ccBPM capability, but you gained a lower cost of ownership. This was an option if you just needed a robust enterprise service bus to transform messages to and from your SAP environment.

 

SAP_AEX_system.jpg

 

Components of the Advanced Adapter Engine Extended included:

  • Enterprise Service Repository
  • Service Registry
  • Integration Directory
  • Integration Server, which was made up of one main engine – the Advanced Adapter Engine

 

 

 

Wow, in 2011 things got fun for everyone. We now had PI 7.31, PI AEX 7.31 and the new kid on the block, Process Orchestration! Now it’s easy to see this as the next version of PI, because it does message transformation, but it also does much more.

 

 

SAP_PO system.jpg

Components of Process Orchestration include three existing SAP programs:

 

  • Advanced Adapter Engine Extended (AEX) – Routes and transforms messages.
  • SAP Business Process Management (SAP BPM) – Orchestrates business processes, both human- and system-centric. This is something that the traditional PI configuration cannot do.
  • SAP Business Rules Management (SAP BRM) – Composes, executes, and maintains business rules.

 

 

Understanding what the components of SAP Process Orchestration can do

 

With that background, here’s more detail on what the components of this middleware solution can help you accomplish:

 

  1. Develop and deploy custom process applications quickly (SAP BPM).

Support process improvement projects by making it easier for business and IT teams to jointly compose executable processes using standardized notation.

 

  • Foster collaboration with a shared environment for process modeling, design, and development.
  • Streamline end-to-end process modeling from initial definition to specs and execution.
  • Improve efficiency by leveraging service-oriented architecture and reusable services.
  • Enable business on the go with intuitive access to process tasks on mobile devices.
  • Gain deep transparency into business processes with powerful analytical capabilities.

 

  2. Master the growing demand for connected systems across your business network (AEX).

Connect heterogeneous systems and achieve application-to-application (A2A) and business-to-business (B2B) integration.

  • Use a mediation layer to exchange information across distributed business applications.
  • Take advantage of packaged adapters to support B2B integration.
  • Get predefined integration scenarios to jumpstart your integration.
  • Manage and govern the complete lifecycle of Web services.

 

  3. Empower your business and IT teams to manage business rules (SAP BRM).

Compose, execute, and maintain business rules across your enterprise – so you can improve agility and decision making.

  • Create and amend business rules in your organization's natural language.
  • Validate and deploy rules with speed and reliability.
  • Leverage reusable Web services to make rules available to multiple applications.
  • Empower users with a Web-based collaborative environment and familiar tools.

 

More than routing and transforming messages

 

Here’s my take:  Process orchestration (PO) is more than just routing and transforming messages. If that is all I want to do then I would look at a PI configuration. But let’s face it, the days of simply taking a message and routing it to another system are few and far between. With the speed of today’s business environment coupled with new technologies, a company needs to be able to make changes and know the impact of those changes – quickly!

 

Ask yourself these two questions: First, on the last interface you created or changed, if you had to change it tomorrow, would you know what the impact would be to the business process? And second, the big one... did you document it? Be truthful!

 

When you want to take messages and make them part of a business process that may involve system- and human-centric tasks crossing multiple systems, both SAP and non-SAP (and you want to keep control on premise), SAP Process Orchestration is your tool.

 

“But,” you may ask, “what if I want to do cloud integration?”  That’s easy.  Check out my next blog to learn about SAP HANA Cloud Integration.

 

I hope you've been rebooted, with a better understanding of where SAP Process Orchestration fits. If you have questions, please feel free to ask them below as a reply to this blog!

Any questions? Drop me a line: j.valladares@sap.com

 

Resources to help you

 

Get “how to” training:

 

Check out these reference sites on the SAP Community Network:

 

 


 

SAP PI/PO REST Sender Adapter configuration for Web Proxy/oData services


Many of you are aware that the SAP PI/PO REST adapter has been out in the market for quite some time, but there are still many unanswered questions for those who are very new to SAP technology or the REST/UI5 architecture. The takeaways from this blog are:

The purpose of the REST adapter.

How to configure the REST sender adapter in SAP PI/PO (7.31 SP14 / 7.4 SP09).

How to consume an ECC ABAP web service proxy and read OData services built in ECC.

Which testing tools are available in the market to test REST/OData services.

 

The reason small SAP customers prefer REST is that if they want to upgrade their old front-end applications (Java / .NET / Web Dynpro Java) to UI5, a fast-rendering and easy-to-design front-end tool, they can take this route by configuring the REST sender adapter on SAP PI/PO: UI5 speaks JSON, the REST adapter converts from JSON to XML and back to JSON, and it can consume the web service proxy in ECC through the SOAP adapter.

Application --> REST Adapter (PI/PO) --> SOAP adapter --> SAP ECC Web Proxy.
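For illustration, the conversion performed by the REST adapter along this flow looks roughly like this (the message name, field, and namespace are hypothetical):

JSON received from the UI5 application:

{"UserQueryRequest": {"userName": "JDOE"}}

XML handed over to the SOAP adapter and the ECC web service proxy:

<ns0:UserQueryRequest xmlns:ns0="urn:example:userquery">
   <userName>JDOE</userName>
</ns0:UserQueryRequest>

On the way back, the XML response from ECC is converted to JSON again for the UI5 application.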

 

SAP's recommendation is to use OData, but SAP has not released an OData sender adapter yet. As said above, some small customers don't want to spend time/money converting their existing ECC web services to OData services, but still want to use UI5 to consume ECC proxies. In that scenario this is the best approach, robust and scalable, because once the budget is approved in the near future, they can easily unplug the web services and plug in OData services.


Here are the steps to consume the web services. The iFlow connector / NWDS is not required to configure and use the REST adapter.


Step1:


All the logic resides in the SAP PI/PO ESR objects; focus on the mapping and service interface, and it can be achieved easily.


Go to the data types and download the .xsd files for request/response. Create external definitions for request/response from those .xsd files.

Create a message mapping (MM) for the same request/response external definitions, with a simple 1:1 mapping between source and target.

Create an operation mapping (OM) using the MM as the mapping program. Save and activate. Please note we are not touching/changing any existing OM/MM/SI.


Here comes the actual REST sender adapter configuration in the SAP PI/PO ID objects.

Rest config 1.JPG

Rest config 2.JPG

REST config 3.JPG

REST config 4.JPG

REST config 5.JPG

REST config 6.JPG

REST config 7.JPG

 

Screenshots 4 and 5 are very important because of the endpoint URL and custom pattern; they are used to call your service during testing.


You can test with the REST Client add-on for Firefox, with Google Postman, or with the REST features of SoapUI.



Testing REST.JPG

SoapUI REST.JPG

Good Luck....

Questions/comments/likes are most welcome.








HCI: Integrate On Premise ERP with HCI IDoc Adapter using HANA Cloud Connector & Client Authentication


Recently, we had a requirement to integrate HCI with an on-premise ERP instance using standard SAP IDocs. The ground rules for the integration were:

  • Only Client Authentication aka 2 Way SSL is to be used.
  • HANA Cloud Connector to be used for Interfacing from Cloud to the On-Premise ERP.

 

So what's different here, you might ask, considering that HCI supports client authentication natively for its IDoc adapter?

The short answer:

When using HANA Cloud Connector, your IDoc adapter configuration requires the Proxy Type "On-Premise". When your Proxy Type is "On-Premise", HCI does not allow you to provide an option for client authentication.

 

Below is what my initial configuration looked like:

1.png

 

When trying to deploy my iFlow with this configuration, HCI prompts an error: Certificate based authentication is not supported for Proxy Type On-Premise


2.png

Does this mean that HCI does not allow client authentication when integrating with an ERP system through a receiver IDoc adapter using HANA Cloud Connector? The answer, my friends, lies in the details.

The Long Answer

The Scenario


3.png


HANA Cloud Connector Configuration


Configure your HCC Account

Configure your HCC Account by providing the required HCI Details

4.png

5.png

Access Control

Set Up your access control by mapping to an On-Premise ABAP System


6.png

Provide the Protocol. In my case HCC connects to On-Premise ERP using HTTPS Protocol.

          7.png

Provide the Internal SAP System Host Name and the corresponding HTTPS Port.

8.png

Provide the Virtual HostName that should be used in HCI in your IDoc Adapter. In this case I have called it: bhavesh.hcc.com


9.png


The next step is the most critical, as it enables client authentication between HCC and the on-premise ERP system. Select the option for Principal Type as: X.509 Certificate.


10.png


This setting makes sure that the Connectivity between HCC and the On-Premise ERP System now uses Client Authentication.

11.png

12.png


Add Resource

Click on Add Resource

          13.png

Below is now what your Access Control should look like,

          14.png


Add System Certificate for Client Authentication

To enable Client Authentication you would need to ensure your Private Key is added to the System Certificate in your HANA Cloud Connector. Navigate to Settings --> System Certificate. Select your Key-pair in a P12 File Format.

15.png

Click on Import; the KeyPair should be imported successfully.

16.png

Backend SAP Configuration for User Mapping

Go to SM30 : Table Name : VUSREXTID

17.png

External ID Type: DN

18.png


Create a new entry by importing the public certificate of the Key Pair you imported into HANA Cloud Connector and providing a User ID for the same.

19.png

 

HCI IDoc Adapter Configuration

Configure your IDoc Adapter with the below options:

  • ProxyType : OnPremise
  • Authentication : Basic Authentication Enabled
  • Credentials: Provide any Credentials. This is not going to be used in the runtime. In my case I created a Dummy Credentials with a Dummy User / Password.

20.png

 

Save and Deploy your Integration Flow.

Your scenario should now use Client Authentication and Authenticate itself to the BackEnd ERP System!

 

So What happens Behind the scenes?

 

What you will notice is that if you remove the KeyPair from Settings --> System Certificate in your HANA Cloud Connector, the IDoc Adapter will try to use Basic Authentication. If you have maintained valid credentials, the login goes through and the IDoc gets posted. If you have maintained invalid credentials, an HTTP 401 Unauthorized error is returned.

 

In Summary, HANA Cloud Connector has been instructed to use a X509 Certificate to authenticate itself to the Back End ERP System. Hence, when the IDoc from HCI is sent to HCC, HCC uses the X509 Certificate to authenticate itself which leads to a Client Authentication aka 2 Way SSL with HANA Cloud Connector & HCI’s IDoc Adapter!


References / Additional Reading


HCI Securing your communications

HANA Cloud Connector SetUp



HCI: Deciphering HCI Keystore


Background

SAP provides 2 major operating models for HCI Installations,

  • SAP Managed Operating Model – Major tasks for tenant-related administration are handled by SAP, including management of the Keystore & Known_Hosts for SSH / SFTP connections.
  • Customer Managed Operating Model – Major tasks for tenant administration are handled by the customer. This model is currently applicable only for the SAP HCI Partner Edition.

 

As most HCI Installations are on SAP Managed Operating Model –

  • Have you ever wondered what the KeyStore from HCI contains?
  • How does SSH / SFTP connectivity occur?

 

If yes, read on..


As-Is HCI Tenant Setup

As we start this journey to understand the setup, key important notes on the HCI Tenant:

  • No Keystore is Deployed
  • No Known_Hosts is deployed ( For SSH / SFTP Connections )

Learning#1 - Component CXF-endpoint-IFLMAP-hcibsp

Let’s start our journey of discovery with the component CXF-endpoint-IFLMAP-hcibsp. Have you looked at the components under your Tenant Management Node? Wondered what the component CXF-endpoint-IFLMAP-hcibsp does?

 

As per SAP Documentation, the role of this Component is to simulate an external SSL Call every 30 seconds to your HCI Tenant Run-time node from the Tenant Management Node via the SAP HCI Load Balancer. The status of this SSL call helps HCI confirm if the runtime node is up and running.

 

So what happens when you do not have a Keystore deployed on your HCI Tenant? Well, this Component goes into error with the message: “Cannot connect, no Keystore deployed” as SSL connections require a Keystore.

 

1.png

Create your Keystore – Use a Self-Signed Certificate

Well, let’s just go ahead and create our Keystore then, we thought. In our case it was a “Test” tenant, and as we were keen to get our hands dirty on a “fresh” HCI tenant, we thought: let’s create a self-signed keypair and deploy it to the Keystore.

The steps to create the Keystore are described in this blog: Starting with hana cloud integration-keep this in mind . In our case we used a Self-Signed Certificate and deployed the same on the Keystore. This is what our Keystore looked like, i.e., just a single Self Signed Keypair,

 

2.png

 

Post deployment of the Keystore, the status of CXF-endpoint-IFLMAP-hcibsp changed to “javax.net.ssl.SSLPeerUnverifiedException: peer not authenticated”.

 

Learning#2 - Add HCI Tenant’s SSL Certificate to your Keystore

The SAP documentation clearly states a SSL call is made to the HCI Load Balancer which led us to believe that the SSL call was failing as the HCI LoadBalancer’s SSL Certificates were not trusted in the Keystore. This brought us to the next step, where we added our HCI LoadBalancers Certificate Chain to the Keystore.

 

Download BaseCertificate from your HCI URL

3.png

4.png

 

Download Intermediate Certificate from your HCI URL

5.png

 

Download Root Certificate from your HCI URL

6.png

 

Import all 3 Certificates into your Keystore. This is what our Keystore now looked like.

7.png


Deploy your Keystore with the hope that the error vanishes!


Learning#3 - Self-Signed Certificates – Sorry Not Accepted

Alas, the error continued!

We had the feeling at the back of our mind that Self Signed Certificates would probably not be accepted in the Keystore. SAP provides a list of CA’s whose Certificates are accepted by the Load Balancer for authentication which meant we had to now get our certificate signed by a Trusted CA as listed on SAP documentation here.


Update & Deploy Keystore with a Trusted CA

The Keystore was then updated and deployed with a Keypair signed by a Trusted CA. Subsequently the error disappeared, and the CXF-endpoint-IFLMAP-hcibsp turned green.

 

8.png



SSH / SFTP, Known_Hosts and Your Keystore

So, what role does the Keystore play when you need to deal with SSH/SFTP Connections? How do you generate a known_hosts file?

In the case of a SAP Managed Operating Model, these complexities are hidden to HCI Developers but what does SAP do behind the scenes?


Prerequisite:

The SFTP Server IP Address is open in the SAP Firewall. This action had to be taken by SAP and a ticket to SAP is the only way to go about the same.

 

Test your SSH Connection

Use the Test Outbound Connection feature as described in the blog HCI: Testing Outbound Connections from HCI to test your connection to an SSH SFTP server. The server returns the error “Retrieving known.hosts from cloud storage failed due to KeyStoreNotFoundException: Keystore with name: 'known.hosts'”. The error clearly states that known_hosts is not deployed.

 

Learning#4 – Creating known_hosts file

A known_hosts file is a list of SFTP servers and their public keys. This file lets your HCI tenant know the list of hosts to which an SSH / SFTP connection is allowed. To generate a known_hosts file, execute the below command from an SSH terminal. In my case I use Cygwin.

 

Command: ssh-keyscan -t rsa <<IPAddress/HostName of SFTP Server>>


9.png

Copy the output of this command to a Text File and then deploy the same on the server.
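
For illustration, each entry produced by ssh-keyscan is a single line of the form <host> <key-type> <base64-encoded public key>. A shortened, purely illustrative example:

sftp.example.com ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAAB...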

10.png

11.png

12.png

13.png

SFTP Authentication & Keystore

So, now that our known_hosts is deployed, let’s try a test connection, we said. Alas, the error changed, this time to “Auth Fail”. Our SFTP server had been set up for key-based authentication with the exact KeyPair that was loaded into our Keystore, as described in the wiki: Generating SSH Keys for SFTP Adapters - Type 2.

 

So why is our SFTP server not authenticating us when the key is available in the Keystore? How does HCI know which keypair to use for SFTP authentication, we wondered, considering that there is no field to provide the alias name of your private key.

HCI does not provide you with an option to provide a private key alias name for key-based authentication for SSH / SFTP. Instead, HCI looks for keys with the alias id_rsa or id_dsa in the Keystore and uses these to authenticate itself.

 

Learning#5 - Update Keystore to have a entry for Keypair with id_rsa

To avoid any impact to existing scenarios, we copied the existing Keypair entry into a new entry called “id_rsa” and deployed the keystore.
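
If you maintain the keystore offline before deployment, below is a minimal Java sketch of how such a duplicate entry could be created with the standard KeyStore API. The keystore file name, source alias "sftpkey" and password are assumptions for illustration; only the target alias id_rsa is what HCI looks for.

import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.security.Key;
import java.security.KeyStore;
import java.security.cert.Certificate;

public class DuplicateKeyAlias {
    public static void main(String[] args) throws Exception {
        char[] password = "changeit".toCharArray();  // assumption: keystore password
        KeyStore keystore = KeyStore.getInstance("JKS");
        try (FileInputStream in = new FileInputStream("system.jks")) {  // assumption: keystore file
            keystore.load(in, password);
        }
        // Read the existing keypair entry (assumption: stored under alias "sftpkey")
        Key privateKey = keystore.getKey("sftpkey", password);
        Certificate[] chain = keystore.getCertificateChain("sftpkey");
        // Store the same key again under the alias HCI expects for SSH / SFTP authentication
        keystore.setKeyEntry("id_rsa", privateKey, password, chain);
        try (FileOutputStream out = new FileOutputStream("system.jks")) {
            keystore.store(out, password);
        }
    }
}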

14.png

Now when the SSH connection is tested, it works like a charm!

 

15.png



Conclusion for the HCI KeyStore

 

  1. Ensure your Keypair is signed by a Trusted CA. ( No Self Signed Certificates )
  2. Ensure your tenant’s Load Balancer Certificates are loaded into your Keystore.
  3. Ensure a known_hosts file is created for SFTP connections. Continue to append any new server to the existing file.
  4. For SFTP / SSH ensure your KeyPair has the alias: id_rsa or id_dsa

HCI -Integrating SalesForce (SFDC) using HCI -Part 1


Background

 

Integration between SFDC and a Back End SAP ERP has been a common Interface requirement for many PI consultants. A typical SFDC integration using PI would involve -

  • Check sessionExpiryTime from Value Mapping
  • If session expired,
    • Make a Login Call to SFDC.
    • Persist the SessionID, sessionExpiryTime and URL into PI Value Mapping ( Use Integration Directory API )
  • Make corresponding API call to SFDC by adding SessionID to the Soap Header Using XSLT / Java Mappings or use Adapter Modules for the same.

 

So how do you do a SFDC Integration with HCI using the SOAP Adapter HCI provides? Are there better ways to do this Integration on HCI?

 

Note: There exists a Partner Adapter for SFDC from a SAP Partner for HCI. This blog does not go into the merits of using an out of the box adapter vs the below custom process.

 

Pre-Requisite:

There are detailed documents available on SCN for SFDC integration using PI. In case you are new to SFDC integration and would like to understand how an SFDC integration flow works, I would recommend reading the below first. The next sections of this blog assume you are aware of the “how-to” of SFDC integration and will focus only on how HCI can be used for such integrations.

 

Salesforce.com Integration Using SAP PI: A Case Study

Salesforce Integration Using PI: How to Perform Query and Other DML Operations Using the Enterprise WSDL

 

 

HCI Scenario under Consideration

 

  • SAP ERP triggers a Material Master change to HCI using an ARTMAS IDoc.
  • The same has to be updated (UPSERT Operation) to SFDC.

 

1.png

 

HCI Implementation Process Flow

To make the design modular when implementing this process flow on HCI, 2 separate Integration Flows have been created:

 

  1. Integration Flow#1 – Login Process
  2. Integration Flow#2 – Actual Material Master Replication Process as described above. Integration Flow#2 will use Integration Flow#1 in case the session has expired.

Due to the complexity of this flow, this blog deals with Integration Flow#1. Integration Flow#2 will be published in a subsequent blog.

 

Integration Flow#1 – Login Process

Process Flow

 

2.png

 

Integration Flow Trigger
  • The Integration Flow is triggered using a SOAP Sender Channel. The SOAP Sender Channel uses the SFDC WSDL as described in the prerequisite documents. The authentication mode is set to Basic Authentication.

Content Modifier

  • Saves the Input Payload to a Property.
  • Sets the SFDC Login Request XML with dummy values for user & password.

Script

  • Groovy Script to read the SFDC Login Credentials deployed on the HCI Tenant.
  • Read User Name and Password and replace them into the SFDC Request XML.

Request Reply

  • Perform a Login Request using SOAP Adapter.
  • Get the Login Response.

RemoveXMLNamespace

  • XSLT Mapping to remove XML Namespaces from SFDC Response.

Write Variables

  • Persist SessionID, SessionURL from response XML into Global Variables.
  • Persist sessionTime using Camel Expression into Global Variable.

Content Modifer

  • Set Body of Integration Flow Response as the Initial Input Payload that was persisted in 1st Step

Content Modifier Settings

Set Body to match the SFDC Request XML. The %user% and %pass% will be replaced with the actual SFDC User and Password in the next step.

4.png

<?xml version="1.0" encoding="UTF-8"?><ns0:login xmlns:ns0="urn:enterprise.soap.sforce.com"><ns0:username>%user%</ns0:username><ns0:password>%pass%</ns0:password></ns0:login>

 

Set Property inputPayload to persist the actual Input to the Integration Flow. This will be used in the last step of the Integration Flow to return the response back to the calling Integration Flow.

3.png

 

Groovy Script

The Groovy script reads the user name and password from deployed credentials called SFDCCredentials. Make sure that the corresponding credentials are deployed on the HCI tenant. If the name of the credentials is different, adjust the code accordingly.

 

import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
import com.sap.it.api.ITApiFactory;
import com.sap.it.api.securestore.SecureStoreService;
import com.sap.it.api.securestore.UserCredential;

def Message processData(Message message) {
    /* Read credentials. Name of deployed credentials: SFDCCredentials */
    def messageLog = messageLogFactory.getMessageLog(message);
    messageLog.setStringProperty("Info1", "ReadLoginCredentials Script Called..");
    String payload = message.getBody(java.lang.String);
    def service = ITApiFactory.getApi(SecureStoreService.class, null);
    def credential = service.getUserCredential("SFDCCredentials");
    if (credential == null) {
        throw new IllegalStateException("No credential found for alias 'SFDCCredentials'");
    }
    messageLog.setStringProperty("Info3", "SFDC Credentials Retrieved");
    String userName = credential.getUsername();
    String password = new String(credential.getPassword());
    message.setProperty("userName", userName);
    message.setProperty("password", password);
    // Replace the placeholders in the login request XML with the real credentials
    payload = payload.replaceAll('%user%', userName);
    payload = payload.replaceAll('%pass%', password);
    message.setBody(payload);
    return message;
}

Request Reply

Configure Request Reply with a SOAP receiver channel. The URL of the SOAP receiver channel points to the SFDC login URL. Use the SFDC WSDL as described in the documents in the prerequisite section.

 

6.png

RemoveXMLNamespace - XSLT Mapping

The Login Response returned by SFDC is as shown in the image below.

7.png


The sessionId and serverUrl are persisted into global variables from the LoginResponse XML in the next step using XPath. As the login response contains the highlighted XML namespace declaration, the XPath expression does not provide the required results. The XSLT mapping below is used to remove all XML namespaces so that the XPath expressions evaluate successfully in the next step.


<?xml version="1.0"?>
<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:output indent="yes" method="xml" encoding="utf-8" omit-xml-declaration="yes"/>
    <xsl:template match="*">
        <xsl:element name="{local-name()}">
            <xsl:apply-templates select="@* | node()"/>
        </xsl:element>
    </xsl:template>
    <xsl:template match="@*">
        <xsl:attribute name="{local-name()}">
            <xsl:value-of select="."/>
        </xsl:attribute>
    </xsl:template>
    <xsl:template match="comment() | text() | processing-instruction()">
        <xsl:copy/>
    </xsl:template>
</xsl:stylesheet>
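
For illustration, the transformation turns a namespace-qualified response element such as <ns:loginResponse xmlns:ns="urn:enterprise.soap.sforce.com"> into a plain <loginResponse>, so that the prefix-free XPath expressions /loginResponse/result/sessionId and /loginResponse/result/serverUrl used in the next step match (element and namespace shortened here for readability).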


Write Variables

The Write Variables step is used to persist the sessionID,serverUrl and sessionTime as Global Variables.

This step takes the output of the XSLT mapping, i.e., the login response without namespace prefixes, and then parses it to set the global variables sessionID and serverURL using XPath expressions.


As the underlying framework of HCI is Apache Camel, the variables available in Apache Camel are also available in HCI. The sessionTime global variable is set using a Camel expression. Refer to the section Variables in the Camel documentation to understand what other options are available.


 

Name          Data Type         Type        Value                             Global Scope
sessionID     java.lang.String  XPath       /loginResponse/result/sessionId   Enabled
serverURL     java.lang.String  XPath       /loginResponse/result/serverUrl   Enabled
sessionTime   java.lang.String  Expression  ${date:now:dd-MM-yyyy HH:mm:ss}   Enabled


8.png

Content Modifier

The Content Modifier step is used to return the input XML of this Integration Flow as its output. The property inputPayload that was persisted in the first Content Modifier step is now used to set the response. Set Body to ${property.inputPayload}.

 

9.png

 

Test Integration Flow#1 – Login Process

Test your flow by triggering the interface using SOAP UI. SOAP UI should return the same data as your request.

10.png

 

Navigate to DataStoreViewer and go to sap_global_store. The Global variables are persisted in this DataStore and you should be able to see the corresponding entries in this Data Store as shown below.

 

11.png

 

12.png

 

Your Integration Flow#1 for Login to SFDC and persisting the session to Global Variables is now in place and can be triggered from any external system or any other HCI Integration Flow.

 

We will look at the Integration Flow#2 in the next blog of this series and how all pieces fit in together!

Encryption/Decryption in SAP PI with Seeburger PGP Module


Summary

Pretty Good Privacy (PGP) is a data encryption and decryption program that provides privacy and authentication for data communication. It is primarily used where the data is sensitive and needs to be encrypted before it is sent to third-party vendors / external applications via SAP PI, and vice versa. In this blog, the Seeburger PGP module configuration with the SFTP adapter is described. The Seeburger PGP module provides composition and decomposition of OpenPGP messages in SAP NW PI. The messages comply with RFC 4880 (OpenPGP Message Format), which is based on PGP 5.x. With PI 7.11+, PGP is available as part of the SAP NetWeaver Process Orchestration Secure Connectivity Add-On. PGP is an adapter user-module which can be used with any Java adapter, e.g. File/FTP/SFTP, JDBC, SOAP, RFC, HTTP etc.

The PGP module uses the public key encryption method to secure the content of the business document.  It allows us to encrypt/decrypt and digitally sign or verify a message.

Features

Composing

  • Signing a message with a Private Key
  • Encrypting a message with Public Key or Pass-phrase
  • Signing and Encrypting a message

Decomposing

  • Verifying a received OpenPGP message.
  • Decrypting a received OpenPGP message.

Below is a simple integration diagram of a scenario where the source system is an internal FTP server and the target system is an external / third-party application. SAP PI is the middleware used for interfacing.

image001.png

Procedure

1.    Pre-requisite for using PGP

The file SeePGPModulePI.sca has to be available. Deploy the file SeePGPModulePI.sca with JSPM (Java Support Package Manager). This step was performed by the Basis team.

2.   Installation Steps

A user with sufficient permissions should be configured for the PGP module under the Property Store’s front-end settings.

Namespace                        Name      Values
http://seeburger.com/xi/PGPKey   pgpUser   PGPUSER
http://seeburger.com/xi/PGPKey   pgpPass

image002.png

3.   SAP NetWeaver Administrator (NWA) Settings

Log in to NWA and navigate to Identity Management. Create a user with the same user name and password as the one registered in the Seeburger Property Store front end in the previous step, i.e. PGPUSER. Assign the role view-creator.<PGPViewName> to the newly created user.

Manage Role of PGP front end settings has to be assigned in NWA. (This step was performed by the basis team)

image003.png

4.  PGP Key Management

The PGP keys can be generated directly from the Seeburger Key Center front end, or they can be generated with publicly available websites & programs and the generated key pair imported into the Seeburger key store. Both options are described below.

Encryption: Your private key is used to digitally sign a message when sending it to your business partner, and the corresponding public key needs to be provided to your business partner. Your business partner will use the public key to verify the digital signature.

Decryption: Your business partner will use the public key (provided by you) to encrypt the message. For decryption, your private key will be used to decrypt the file sent by your business partner.

A Test view is created in NWA (for testing purposes) under Configuration Management -> Certificates & Keys. The key pair will be generated under this view once all the steps are completed.

Option 1: Create the PGP keys from Seeburger Key Store Management. Navigate to the Key Center front end and click on Create (PGP Key Management).

image004.png

Complete the PGP keyring details and provide all the key-specific information like key size, user name, email address etc. If the generate public key option is selected, a corresponding public keyring will be created and stored as TRUSTED/<view>/<cert>_pub; otherwise only the private keyring will be created and stored as TRUSTED/<view>/<cert>_sec in the view ‘Test’ which was created in NWA. Click on Confirm and the key pair will be generated.

image005.png

Option 2: You can also use any publicly available programs, websites, and open tools for key generation. Reference: https://www.igolder.com/pgp/generate-key/

Set the pass-phrase for the key-pair and click on Generate PGP Keys.

image006.png

The generated key pair has to be imported in the Seeburger Workbench -> Key Center front end.

image007.png

After the key pair is created by either of the methods, the public & private keys can be viewed and downloaded from NWA. Navigate to PI NWA -> Configuration Management -> Certificates and Keys -> select the ‘Test’ view.

The public key _pub is renamed to pgpencrypt_pub.

The private key _sec is renamed to pgpdecrypt_sec.

(Any naming convention can be used here)

Capture.PNG

5.  PGP Module Configuration in the Communication Channel

The PGP module is configured in a given adapter channel to allow secured data transmission according to the PGP protocol. A new module needs to be inserted in the module tab of the communication channel.

Module Name: localejbs/Seeburger/PGP

Module Type: Local Enterprise Bean

The module position is important: it should be placed after all attachments have been created and after unpacking any zip document.

Encryption Process

Action        Description
Sign          Signs the message using a PGP private key from the trusted keystore
Encrypt       Encrypts a message using PGP public keys from a keystore
Sign-Encrypt  Combination of Sign & Encrypt






The below channel is configured as per the ‘encrypt’ action

image009.png


image010.png

Explanation

  • Encryption algorithm: CAST5. The algorithm used to encrypt a document with a session key. The supported algorithms are CAST5 (default), TRIPLEDES, BLOWFISH, AES-128, AES-192 and AES-256.
  • Encrypt Mode: encrypt
  • Encrypt Key: the keystore path where the encryption public key is located/stored (public key format: TRUSTED/<view>/<certificate>).
  • EncryptPass: character sequence used to encrypt a message. The receiver needs the same pass-phrase for decrypting the message.

Encrypted file received at recipient side: the target system receives the encrypted file and uses its private key (the counterpart of the shared public key) to decrypt the file.

image011.png


Decryption Process: There are three actions that can be performed in the decryption process.

Action          Description
Verify          Verifies the signature of a PGP document with public keys
Decrypt         Decrypts the messages with PGP private keys from the trusted keystore
Decrypt-Verify  Combination of Decrypt & Verify

The below channel is configured as per the ‘decrypt’ action

 

image012.png


Explanation

  • Decrypt Key: keystore entry that stores the private key. The entry has to be specified as TRUSTED/<view>/<certificate>.
  • Decrypt key pass: this parameter is the pass-phrase to access the private key.
  • Mode: decrypt
  • Decrypt pass: pass-phrase to decrypt the session key.

When PI receives the encrypted file from the vendor, the private key stored in the trusted keystore is used to decrypt the file, and then a plain text file is sent to the source system.

Additional Information

References

1. Seeburger PGP Manual

2. https://help.sap.com/saphelp_nw-secure-connect102/helpdata/en/8b/11483856d04f6b9c7bf378ecd1670c/content.htm


System Details where the scenario was successfully tested

Component                Release
SAP Process Integration  7.1, SP10

HCI -Integrating SalesForce (SFDC) using HCI -Part 2


Background

 

In Part 1 of this blog series, a Login Integration Flow was created that persisted the sessionID, serverURL and sessionTime to the Global Variables of HCI.

 

  • How is this Integration Flow#1 used when the IDoc from SAP arrives to HCI?
  • How can a Global Variable be read from the HCI DataStore?
  • How easy is it to add SOAP Headers to your SOAP Message?

 

This blog covers this and the end to end flow. Read on!

 

Integration Flow#2 – Material Master Replication Process

ProcessFlow

 

1.png

 

Step Type / Description
Integration Flow Trigger
  • The Integration Flow is triggered using a IDoc Sender Channel.
  • The authentication mode is set to Basic Authentication.
Content Modifier
  • Content Modifier sets the Header properties SAP_ApplicationID, SAP_Sender, SAP_Receiver
  • SAP_ApplicationID is set to the IDoc number XPath to enable search for the Interface with the IDoc Number
  • SAP_Sender is set to SNDPRN XPath to enable set Sending System Details
  • SAP_Receiver is set to RCVPRN XPath to enable set Receiver System Details
  • This is an optional step and can be skipped.
Mapping
  • The Message Mapping to Map the Source IDoc to the UPSERT Webservice call is triggered.
  • This blog does not go into the details of this mapping; it is covered in the prerequisite section of Part 1 of this blog series.
Content Modifier
  • Set the Property - sessionID, sessionURL, sessionTime.
  • These Properties are read from the Global Variables persisted in the IntegrationFlow#1- Login.
  • Setting these properties in this step enables these to be accessible as local variables within the Integration Flow
Script
  • The script checks if the sessionID is valid. SalesForce sessionIDs are valid for 2 hours from the time the Login is made.
  • This Groovy script checks if the current time is within sessionTime + 2 hours:
    • If yes, property sessionValid is set to True
    • If no, property sessionValid is set to False
  • In summary, this Script checks if the session continues to be valid and sets a Flag accordingly.
Gateway
  • If Session InValid -> Branch 1
  • If Session Valid -> Branch 2 (Default Branch)

SessionInValid -

Branch 1

Request Reply

  • Make a SOAP Call to the Integration Flow#1 Login
  • This Request Reply step triggers the call to the Integration Flow #1, if the session is Invalid and a Login call has to be made to get a new sessionID.
Content Modifier
  • Set the Property - sessionID, sessionURL, sessionTime.
  • These Properties are read from the Global Variables persisted in the IntegrationFlow#1- Login.
  • Setting these properties in this step enables these to be accessible as local variables within the Integration Flow
Script
  • Add sessionID to the SOAP Header
  • Perform certain XML Namespace Manipulations - Reasons for this are as described in the Prerequisite documents of Blog 1 of this series.
Request - Reply
  • Make the SOAP call to SalesForce to perform Upsert operation

 

 


Content Modifier

Set the Headers to enable monitoring with the corresponding IDoc Headers. The documentation on this can be read in the blog: End2End monitoring of HCI message flow in Standard Content - Made easy

2.png

 

Mapping

Mapping from the Source IDoc to the SFDC Upsert Request. The details of the mapping are not discussed in this blog.

3.png

Content Modifier

Content Modifier enables to set the Property sessionID,sessionURL,sessionTime. The values for these properties are read from their corresponding Global Variables.

4.png

Script

The below script is used to validate if the session is valid. If session is Valid, property sessionValid is set to "True" else set to "False"

import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.HashMap;
import java.util.Date;
import java.text.SimpleDateFormat;

def Message processData(Message message) {
    def messageLog = messageLogFactory.getMessageLog(message);
    def propertyMap = message.getProperties();

    // Read sessionTime from the properties
    String sessionTime = propertyMap.get("sessionTime");
    // Set a log message for monitoring purposes
    messageLog.setStringProperty("sessionTime ", sessionTime);

    // Convert sessionTime to a calendar date
    Calendar cal = Calendar.getInstance();
    SimpleDateFormat sdf = new SimpleDateFormat("dd-MM-yyyy HH:mm:ss");
    cal.setTime(sdf.parse(sessionTime));

    // Add 2 hours to sessionTime
    Calendar cal1 = Calendar.getInstance();
    SimpleDateFormat sdf1 = new SimpleDateFormat("dd-MM-yyyy HH:mm:ss");
    cal1.setTime(sdf1.parse(sessionTime));
    cal1.add(Calendar.HOUR_OF_DAY, 2);

    // Get the current time
    Date now = new Date();
    String sessionValid = "";

    // Validate if the session is still valid
    if (now.before(cal1.getTime())) {
        sessionValid = "true";
    } else {
        sessionValid = "false";
    }
    message.setProperty("sessionValid", sessionValid);
    messageLog.setStringProperty("sessionValid ", sessionValid);
    return message;
}
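
As a worked example, matching the test results shown later in this blog: a login at 08:53:56 sets sessionTime to 08:53:56, so the session is treated as valid until 10:53:56; IDocs arriving at 08:53:58 and 08:54:00 therefore find sessionValid = true and skip the login call.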


Parallel MultiCast & Gateway

Parallel multicast is used to provide a Join step that merges the multiple branches of the Gateway step. If you would like to understand these step types further, I would suggest reading the blog: Multicast Pattern in Integration Flows (HCI-PI)

 

Gateway Properties are as below

  • If sessionValid = 'False' -> Make a Request/Reply call to HCI Integration Flow #1 - Join Step
  • If sessionValid = 'True' -> Default Branch as no call to SFDC for Login is required -  Join  Step

5.png

Request Reply

As mentioned previously, this Request Reply step uses the SOAP Adapter to make a call back to Integration Flow#1.

The SOAP Adapter treats this like any other Webservice and the Proxy-Type used has been Set to "Internet".


At this moment, I am not aware of any other means to trigger an Integration Flow of HCI, hence this approach has been documented. It would be great if an inbuilt feature were provided to do this, to avoid traffic over the internet.

 

6.png

 

7.png

 

 

Content Modifier

 

This Content Modifier step performs the same task as the earlier Content Modifier step, i.e., it sets the properties sessionID, sessionURL, and sessionTime. The values for these properties are read from their corresponding global variables.

 

4.png

 

Script

This script is used to insert SOAP Headers into your Message. In this case, we insert the sessionID into the SOAP Header.

Note: This script also performs some rudimentary XML namespace manipulations to address the incompatibilities of the SFDC WSDL with the PI graphical mapping editor. There are better ways to do this, including an XSLT / parsing technique. This blog does not delve into the details of the same.

 

import com.sap.gateway.ip.core.customdev.util.Message;
import java.util.ArrayList;
import java.util.List;
import javax.xml.namespace.QName;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.apache.cxf.binding.soap.SoapHeader;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

def Message processData(Message message) {
    def messageLog = messageLogFactory.getMessageLog(message);

    // Read sessionID from the message properties
    def propertyMap = message.getProperties();
    String sessionID = propertyMap.get("sessionID");

    // Perform certain XML namespace manipulations. Note this is a very rudimentary
    // way of doing this; an XSLT or a parser are better options.
    String payload = message.getBody(java.lang.String);
    payload = payload.replaceAll('type', 'xsi:type');
    payload = payload.replaceAll('xmlns:ns0="urn:enterprise.soap.sforce.com"', 'xmlns:ns0="urn:enterprise.soap.sforce.com" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"');
    payload = payload.replaceAll('xsi:type="Product2"', 'xsi:type="urn1:Product2"');
    message.setBody(payload);

    // Build the SOAP header element carrying the sessionId
    DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
    dbf.setNamespaceAware(true);
    dbf.setIgnoringElementContentWhitespace(true);
    dbf.setValidating(false);
    DocumentBuilder db = dbf.newDocumentBuilder();
    Document doc = db.newDocument();
    Element authHeader = doc.createElementNS("urn:enterprise.soap.sforce.com", "SessionHeader");
    doc.appendChild(authHeader);
    Element clientId = doc.createElement("sessionId");
    clientId.setTextContent(sessionID);
    authHeader.appendChild(clientId);

    // Register the header in the CXF header list so the SOAP adapter adds it to the envelope
    SoapHeader header = new SoapHeader(new QName(authHeader.getNamespaceURI(), authHeader.getLocalName()), authHeader);
    List headersList = new ArrayList<SoapHeader>();
    headersList.add(header);
    message.setHeader("org.apache.cxf.headers.Header.list", headersList);
    messageLog.setStringProperty("SessionID ", sessionID);
    return message;
}

Request Reply


Now that the SOAP message is formed with the correct sessionId in the SOAP header, this Request Reply makes the SOAP call to SFDC to perform the corresponding operation. Unlike in the PI world, where a Do Not Use SOAP Envelope mode is required, no such mode is required in HCI, as HCI natively provides a means to manipulate the SOAP header.

 

8.png

 

Testing Your Flow

 

  • Prior to testing this Integration Flow, make sure that the Integration Flow#1 Login has been triggered at least once manually through SOAP UI. Reason: this Integration Flow assumes that the global variables sessionID, sessionURL and sessionTime exist in the SAP data store.
  • Trigger the IDoc from SAP.

 

Results

  • IDoc Triggered at 8:53:56 AM, Processing Time ~ 6 Seconds
    • This Integration Flow makes the Call to the Login Integration Flow.
    • The Logs also show the call to the Login Integration Flow ( Once )

9.png

 

  • IDocs triggered at 8:53:58 and 8:54:00, Processing Time ~ 1 Second
    • This Integration Flow re-uses the SessionID as the sessionID is valid.

11.png

 

Global Data Store Updated via the Login Integration Flow

16.png


 

Additional Details

 

  • Search using the IDoc number works as required. Note: as the variable was declared as an Integer, only the number is to be used.
  • Likewise, if the variable is declared as a String, the same needs to be prefixed with zeroes, or a wildcard prefix is to be used.

12.png

13.png

 

Logs Show the sessionID Status as per the Groovy Script


15.png

 

Final Note

 

The above 2 blogs show a means to implement an integration with SFDC. Like in the PI world, there are multiple design options for this integration requirement. The idea behind this blog was not to delve into the various other options, but to show users how the most common approach used for persistence of sessionIDs can be implemented using HCI.

 

Other patterns / models can be added into the comment sections!


SAP Integration Strategy Document


SAP Integration Strategy - I am writing this blog to share my experience of creating integration strategy documents for multiple clients, and of using this document to provide details of the future integration solution to clients and development teams.

 

As we always say, "he who is well prepared has half won the battle". I believe in this, and have always believed that a proper integration strategy document is the building block for any successful integration project.

 

Why do we need an integration strategy document? The integration strategy is one of the most important documents for any integration project because it provides a glimpse of the current integration challenges, the future integration solution and its benefits, and the integration architecture principles.

It serves as a blueprinting document for the technical team to understand the current and future integration solution, and also helps developers understand the different technical integration options available for interfaces.

 

What should we cover in an integration strategy document? It is very important as an integration architect that we balance the content of the document in such a way that it is not a very lengthy and exhaustive document. We should always try to meet the client's expectations in terms of clearly illustrating the current and future solution and the principles applied in achieving it, while also providing enough detail for the integration team to understand and build the interfaces.

 

I have also tried to put the below contents in my integration strategy document:


  • ARCHITECTURE VISION
    • Current Integration Architecture Overview
    • Current Integration Challenges
    • Future Integration Objectives
    • Business and Technical Guiding Principles for Future Integration Architecture
    • Future Integration Architecture Overview
    • Benefits of Future Integration Architecture

  • TECHNICAL INTEGRATION SOLUTION
    • Batch Integration Strategy
    • Real-Time Integration Strategy
    • Interface Patterns
    • Error Management
    • Integration Performance Considerations
    • Integration Security Details

  • TECHNOLOGY ARCHITECTURE
    • Details of Integration Tools
    • Positioning of Integration Tools
    • Capabilities of Integration Tools
    • Best Practices

In summary, this blog is just a reference to point out some of the useful contents that can be put together inside an integration strategy document.



How to tune the queue consumer threads for specific adapter in PI/PO system


You may find messages stuck in the Adapter Engine in a hanging status, such as "To be delivered". In the Engine Status -> Additional Data tab, you see that all the assigned threads for a specific adapter and a specific queue (for example, SOAP_http://sap.com/xi/XI/SystemCall) are in use. You may be wondering how to increase the value of Maximum Number of Threads.

 

Here I will introduce how to tune the queue consumer threads for different adapters in a PI/PO system.

 

1) For some normal adapters (SOAP, RFC, JDBC, JMS, File, HTTP_AAE, IDoc_AAE......), you can add a new property set for the specific Adapter in the property "messaging.connectionDefinition". This is appended after the global AFW entry.

 

You can set the custom value for property "messaging.connectionDefinition" in


-> NWA

-> Configuration

-> Infrastructure

-> Java System properties

-> Services

-> XPI Service: AF Core

 

91Capture.PNG

 

or

 

-> ConfigTool (- \usr\sap\<SID>\<Instance ID>\j2ee\configtool)

-> template - Usage_Type_All_in_One

-> services

-> com.sap.aii.af.svc

 

90Capture.PNG

 

****Please restart the system after saving these changes in NWA or in ConfigTool. Then the change will take effect and the Maximum Number of Threads for these adapters in Engine Status will be updated as expected.*****

 

  • For example, increasing the value of Call.maxConsumers for SOAP adapter from default value 5 to 10

 

(name=global, messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval=60000, pollAttempts=60, Send.maxConsumers=5, Recv.maxConsumers=5, Call.maxConsumers=5, Rqst.maxConsumers=5) (name=SOAP_http://sap.com/xi/XI/System, messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval=60000, pollAttempts=60, Send.maxConsumers=5, Recv.maxConsumers=5, Call.maxConsumers=10, Rqst.maxConsumers=5)

 

  • For example, increasing the value of maxConsumers for all the queues of File adapter from 5 to 8

 

(name=global, messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval=60000, pollAttempts=60,  Send.maxConsumers=5, Recv.maxConsumers=5, Call.maxConsumers=5, Rqst.maxConsumers=5) (name=File_http://sap.com/xi/XI/System,  messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval=60000, pollAttempts=60,  Send.maxConsumers=8, Recv.maxConsumers=8, Call.maxConsumers=8, Rqst.maxConsumers=8)

 

****Please pay attention to the name part; it is best to check the correct adapter name in Engine Status, as the name is case-sensitive. If you use name=FILE_http://sap.com/xi/XI/System, the custom value can be seen in NWA or in ConfigTool; however, it does not really take effect, and the Maximum Number of Threads for the File adapter in Engine Status will not be updated as expected.*****

 

  • For example, increasing the value of Send.maxConsumers for SFTP adapter from 5 to 10

 

(name=global, messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval=60000, pollAttempts=60,  Send.maxConsumers=5, Recv.maxConsumers=5, Call.maxConsumers=5, Rqst.maxConsumers=5) (name=SFTP_http://sap.com/xi/XI/SFTP, messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval= 60000, pollAttempts=60, Send.maxConsumers=10, Recv.maxConsumers=5, Call.maxConsumers=5, Rqst.maxConsumers=5)

 

****Please pay attention to the name part, it would be better to check the correct adapter name in Engine status. It should be "name=SFTP_http://sap.com/xi/XI/SFTP", not "name=SFTP_http://sap.com/xi/XI/System".****


  • For example, increasing the value of maxConsumers for File adapter and SFTP adapter together


(name=global, messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval=60000, pollAttempts=60,  Send.maxConsumers=5, Recv.maxConsumers=5, Call.maxConsumers=5, Rqst.maxConsumers=5) (name=File_http://sap.com/xi/XI/System,  messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval=60000, pollAttempts=60,  Send.maxConsumers=8, Recv.maxConsumers=8, Call.maxConsumers=8, Rqst.maxConsumers=8) (name=SFTP_http://sap.com/xi/XI/SFTP, messageListener=localejbs/AFWListener, exceptionListener=localejbs/AFWListener, pollInterval= 60000, pollAttempts=60, Send.maxConsumers=10, Recv.maxConsumers=5, Call.maxConsumers=5, Rqst.maxConsumers=5)


sftp11Capture.PNG


2) For Integrated Configuration Object (ICO) scenarios, message processing occurs exclusively in the Messaging System sender queues (Send.maxConsumers for asynchronous outbound and Call.maxConsumers for synchronous outbound). To ensure sufficient worker threads, you can consider increasing the number of Send.maxConsumers and Call.maxConsumers for the specific adapter. The values of Recv.maxConsumers and Rqst.maxConsumers are not considered in ICO scenarios.


*******

Related Notes/Documents:


SAP Note 1623356 - "To be delivered" messages in Adapter Engine

SAP Note 1557036 - Integrated Configuration Objects (ICO) scenarios use Messaging System Sender Queues only

Messaging System queue properties after XI 3.0 SP19 / XI 7.0SP11

B2B Addon compared with Seeburger


A month ago I created a webinar on how SAP's B2B Add-on stacks up against Seeburger. It was quite a fun webinar to create, and there was good participation.

 

So I decided to take the presentation and record it as online videos so I could share it with a broader audience. And now I have the time to share it here also.

 

Video 1: An overview of what the B2B Add-on is about.

This video covers the different things that you get in the B2B Add-on package.

 

Video 2: B2B add-on vs. Seeburger

This video covers the differences between the B2B Add-on and the Seeburger tools.

 

Video 3: Migration from Seeburger

This video covers how to migrate from Seeburger to B2B Add-on.

 

 

The videos originally appeared on my blog: https://picourse.com/b2b-add-on-series/

They were created as part of the launch of my training on using the B2B Add-on.

Load Testing with JMeter: Test Results Visualization Using Kibana Dashboards


Intro

Apache JMeter (Apache JMeter - Apache JMeter™) is one of the popular tools for load testing, which has become a "Swiss Army knife" in the area of integration scenario performance testing. It is open source, lightweight, supports a variety of load generator types (called samplers in JMeter's terminology), and is flexibly extensible. Most commonly, extension plugins are developed for samplers, so that JMeter can be used to generate load for specific technologies and protocols. But there is another area which is worth paying attention to: processing, presentation and visualization of test results. In JMeter, listener components are responsible for this task. JMeter is already equipped with some of them, and 3rd-party plugins enrich listener capabilities by bringing new listeners, but in the majority of cases, listeners either visualize test results in JMeter or export them to external files. Test results visualization in JMeter is limited to the user interface technologies used by JMeter; export to external files requires further data processing and rendering.

 

For those who would like to familiarize themselves with JMeter, there are technical materials describing its basic concepts and the development and usage of test plans and their major components: a good starting point is the documentation published on JMeter's official web site, complemented by examples of JMeter test plans, which can be found over the Internet and on SCN.

 

In this blog, I would like to describe an alternative approach, which employs external data visualization tools for presentation of test results obtained from JMeter test plan executions in real time. I will use Kibana (Kibana: Explore, Visualize, Discover Data | Elastic), which is a part of the ELK stack, but generally speaking, the approach can be easily adopted and other data visualization / dashboard tools utilized instead.


In conjunction with Kibana, which is used for data visualization, I will utilize Elasticsearch for storing and indexing JMeter test results data (samples data). A principal flow diagram depicting the involved components is provided in the illustration below:

Flow.png

The custom-developed JMeter listener for Elasticsearch sends test results data to an Elasticsearch server; Kibana uses the indexed data persisted in Elasticsearch when visualizing it in a prepared dashboard.

 

 

JMeter Listener for Elasticsearch

The developed JMeter listener for Elasticsearch is bundled as a JAR file. A Java class implementing a listener has to extend JMeter's abstract class AbstractBackendListenerClient. Details on AbstractBackendListenerClient are provided in the JavaDocs for JMeter and can be found at https://jmeter.apache.org/api/index.html?org/apache/jmeter/visualizers/backend/AbstractBackendListenerClient.html. Below are some key points regarding important methods of this class that have to be implemented by custom listeners:

  • getDefaultParameters(). Implements logic to set parameters and their default values. It is necessary to specify parameters programmatically so that they appear on the UI of the listener, default values can be overwritten later in JMeter in the listener UI. In the developed listener, parameters are used to specify connectivity and authentication configuration for a called Elastichsearch server, index to be used and timezone to be used for timestamps adjustment (if necessary);
  • setupTest(). Implements initialization logic. Called once at the beginning of an executed test plan. In the developed listener, an Elasticsearch server ping test is conducted during initialization - if the ping fails (for example, due to network connectivity issues between JMeter and the Elasticsearch server, the Elasticsearch server being down, or incorrectly specified listener parameters), the listener will not attempt to send samples data to the Elasticsearch server;
  • teardownTest(). Implements cleanup logic. Called once at the end of an executed test plan. Not used in the developed listener;
  • handleSampleResults(). Implements listener logic that is used to handle and process each sampler execution results. Called at each iteration of the test for each thread in the thread group. In the developed listener, sample data is marshalled into JSON format and sent to a specified Elasticsearch server by invoking its corresponding REST API over HTTP(S).
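
To illustrate the skeleton such a listener follows, below is a minimal sketch of a backend listener client. The parameter names and the JSON document structure are simplified assumptions for illustration, not the actual plugin code (which is linked further below).

import java.util.List;
import org.apache.jmeter.config.Arguments;
import org.apache.jmeter.samplers.SampleResult;
import org.apache.jmeter.visualizers.backend.AbstractBackendListenerClient;
import org.apache.jmeter.visualizers.backend.BackendListenerContext;

public class MinimalBackendListener extends AbstractBackendListenerClient {

    private String serverUrl;

    @Override
    public Arguments getDefaultParameters() {
        // Parameters declared here appear on the listener UI with these defaults
        Arguments arguments = new Arguments();
        arguments.addArgument("elasticsearch.url", "http://localhost:9200");  // assumption: parameter names
        arguments.addArgument("elasticsearch.index", "jmeter");
        return arguments;
    }

    @Override
    public void setupTest(BackendListenerContext context) throws Exception {
        // Called once at test start: read parameters, ping the server, etc.
        serverUrl = context.getParameter("elasticsearch.url");
        super.setupTest(context);
    }

    @Override
    public void handleSampleResults(List<SampleResult> sampleResults, BackendListenerContext context) {
        // Called with batches of sample results while the test runs
        for (SampleResult result : sampleResults) {
            String document = String.format("{\"label\":\"%s\",\"elapsed\":%d,\"success\":%b}",
                    result.getSampleLabel(), result.getTime(), result.isSuccessful());
            // ... send 'document' to serverUrl via an HTTP client ...
        }
    }

    @Override
    public void teardownTest(BackendListenerContext context) throws Exception {
        // Called once at test end: release resources if needed
        super.teardownTest(context);
    }
}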

 

The following 3rd-party libraries were used in the implementation of the listener for Elasticsearch:

  • Gson - for marshalling sample data into JSON
  • Jersey - as the REST client used to invoke the Elasticsearch API over HTTP(S)
Additionally, generation of a Java class reflecting structure of test results / sample data was done using http://www.jsonschema2pojo.org/.
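
As a rough illustration of how these libraries fit together, the sketch below marshals a simplified sample POJO with Gson and posts it to Elasticsearch with the Jersey JAX-RS client. The POJO fields, index name and server URL are assumptions, not the structures used by the actual plugin.

import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.client.Entity;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;
import com.google.gson.Gson;

public class ElasticsearchSender {

    // Simplified stand-in for the generated class reflecting the sample data structure
    static class Sample {
        String label;
        long elapsed;
        boolean success;
    }

    public static void main(String[] args) {
        Sample sample = new Sample();
        sample.label = "SOAP Request";
        sample.elapsed = 250;
        sample.success = true;

        String json = new Gson().toJson(sample);    // marshal sample data into JSON
        Client client = ClientBuilder.newClient();  // Jersey JAX-RS client
        Response response = client
                .target("http://localhost:9200/jmeter/sample")  // assumption: Elasticsearch server and index
                .request(MediaType.APPLICATION_JSON)
                .post(Entity.json(json));
        System.out.println("Indexed with HTTP status " + response.getStatus());
        client.close();
    }
}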

 

Source code of the developed listener can be found at GitHub - vadim-klimov/SCN_JMeterListenerForElasticsearch. The compiled and built plugin (which is distributed as a JAR file) can be downloaded from SCN_JMeterListenerForElasticsearch/Listener4Elasticsearch.jar at master · vadim-klimov/SCN_JMeterListenerForElasticsearch. At least JRE 8 is required to use it.

 

Utility and dependency libraries (Gson and Jersey) shall be placed in directory <JMeter home>/lib.

JMeter listener plugin binary shall be placed in directory <JMeter home>/lib/ext.

 

 

Demo

I have prepared a JMeter test plan that consists of two samplers, which are responsible for generating SOAP requests (sent to a SAP PO system to trigger a simple interface) and HTTP requests (sent to a public Internet service for retrieving the current time).

 

Below is a screenshot of the main components of the test plan and the configuration of the developed listener for Elasticsearch. As can be noticed, the custom listener is added to a test plan using the component Backend Listener and selecting the class that implements the required listener:

Listener configuration.png

The test plan configuration implies execution of the samplers several times; having done so, we are able to collect a volume of test data sufficient for demonstrating visualization in Kibana.

 

The developed listener produces log entries that can be accessed from JMeter: info / error messages for major events, and debug messages in case details of listener execution need to be collected:

JMeter log.png


Configuration of Elasticsearch and Kibana, as well as development of visualizations and dashboards in Kibana, are out of scope of this blog. Please refer to the official documentation available on the Elastic web site and tutorials published on the Internet in case details are required regarding these topics.

 

Let us now start the test plan execution in JMeter and simulate both successful and failed sample executions in one of the target systems. Below is a look at the results visualized in a Kibana dashboard:

Kibana dashboard.png


It shall be noted that just a few attributes of the retrieved test results were visualized and used in the demonstrated Kibana dashboard; there is much more data sent by the listener and indexed by Elasticsearch, which can be used for more precise analysis of test results. Below is a sample indexed document (test sample) containing various metrics and test sample request / response data, which were submitted by JMeter, persisted in Elasticsearch, and are available for visualization in Kibana:

Indexed document.png
