Testing SAP CPI with a mock data service

Once you start making changes to your SAP CPI system, you need a way to test your integration flows, for example when you modify your scripts. In most cases, that requires access to the same backend systems and the ability to recreate the same requests.

Figaf has supported testing of SAP CPI for about a year. We were able to do it in the following way:

  1. Switch the integration flow into trace mode.
  2. Let the user send some messages.
  3. Figaf IRT could then pick up all the trace data and use it as a baseline.

To test an integration flow, it did the following:

  1. Switch the integration flow into trace mode.
  2. Figaf would send the input data.
  3. Figaf IRT would then pick up all the trace data.
  4. Compare the data with the original baseline data.

This provided an easy way to test your iflows and enabled users to compare the data to see if anything had changed.
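To make the comparison step concrete, here is a minimal Groovy sketch that normalizes two XML payloads and checks whether they match. It only illustrates the idea; the file names and the plain string comparison are assumptions made for the example, not how Figaf IRT actually compares trace data.

```groovy
// Minimal illustration of comparing a test run against a baseline payload.
// The file locations are hypothetical; Figaf IRT reads the data from the CPI trace.

// Collapse whitespace between tags so formatting differences do not count as changes
String normalize(String xml) {
    xml.replaceAll(/>\s+</, '><').trim()
}

def baseline = new File('baseline/payload.xml').text
def current  = new File('testrun/payload.xml').text

if (normalize(baseline) == normalize(current)) {
    println 'Payloads match - no change detected'
} else {
    println 'Payloads differ - review the changes'
}
```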

Why we needed a new way of testing

The old way works well if your flows do not connect to many backend services and those services reply with the same data every time. If you have SuccessFactors, it may be difficult to create or fire the same employee multiple times. And if your interface expects a delta of data, you will not be able to fetch the correct data without making modifications.

We also want to be able to run the tests more often, so we can be sure nothing is impacted. One example is our new approach of adding shared resources to SAP CPI, where a modification in a script updates all the other iflows that use it.

Or, if possible, to run the tests on the next release of SAP CPI to check that nothing is affected.

How testing SAP CPI with mock data works

The testing works a little differently if you have any receiver adapters. Figaf creates a copy of the iFlow and replaces all receiver channels with an HTTP call to the Figaf IRT server. The call sends information about the current test case and which part of the process it is in. The API exposed on the Figaf server then returns the HTTP headers that were changed in that part of the process, along with the body.
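Conceptually, each substituted receiver call behaves roughly like the Groovy sketch below: it posts the test context to a mock endpoint and applies the returned headers and body to the message. Figaf does this replacement for you in the copied iFlow; the URL, query parameters, property names and the X-Mock- header convention here are invented purely to illustrate the mechanism.

```groovy
// Illustration only: what a receiver channel replaced by a mock call does in principle.
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    def testCaseId = message.getProperty('testCaseId')    // assumed property names
    def stepId     = message.getProperty('processStepId')

    // Call the mock service instead of the real backend system
    def url  = new URL("https://irt.example.com/mock?testCase=${testCaseId}&step=${stepId}")
    def conn = url.openConnection()
    conn.doOutput = true
    conn.setRequestProperty('Content-Type', 'application/xml')
    conn.outputStream.withWriter { it << message.getBody(String) }

    // Apply the recorded headers and body that the mock service returns
    conn.headerFields.findAll { name, values -> name?.startsWith('X-Mock-') }.each { name, values ->
        message.setHeader(name - 'X-Mock-', values.first())
    }
    message.setBody(conn.inputStream.text)
    return message
}
```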

As a user, the only thing you need to do is select that the test should be performed with mock data.

If there is a change in the iflow, Figaf automatically synchronizes the testing copy and deploys it.

You can see a demo of how the service works.

As you can see, it is pretty simple to get working: just one tick mark on the Testing Template. The solution also works with Cloud Connector if you run IRT as a local installation; you just need some settings to specify the URL and Cloud Connector information when configuring the iflows.

What are the limits of mock data

With mock data there are some limits. Some of them we can probably solve with more development and feedback from customers.

  • You will not be able to test your adapter.
  • You will not be able to test binary objects in the headers; it is not possible to recover them.
  • It does not support Enrich scenarios at the moment.

Try it yourself

You can download the Figaf tool and run it in your own landscape, or you can try it out in our cloud deployment.

Sharing SAP CPI resources

A common use case in SAP CPI is that you want to share scripts across multiple iflows. Currently, you have to manage this yourself by copying or uploading the scripts in all the places they are used. That makes it pretty difficult to maintain a good shared repository of resources.

With shared resources, you get a repository of all the different scripts you have and can easily check their status. If the resource gets a new version, you can update all iFlows where that resource is used. If you modify an object, you can then test to validate that nothing has changed in your processing.

You can watch a presentation of the flow here.

The flow is as follows:

  1. From an iFlow resource, you select to create a shared resource.
  2. Enter a name for the resource. Shared_ will be added as a prefix to make it easier to see what is shared.
  3. IRT will then add the new resource to your iFlow.
    1. Make manual changes to start using it (a sketch of what a shared script could look like follows this list).
  4. On other iFlows you can add the shared resource.
  5. If a user changes a resource in an iflow, the shared resource will get a new version.
  6. You can then apply the new shared version to the iFlows where it is used.
  7. Test that the modification does not affect processing.
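As a rough sketch of step 3, assuming the shared resource is a Groovy script, the prefixed script simply lives as a resource in each iFlow and is called from a Script step like any other script. The name Shared_SetCommonHeaders.groovy and the header values are invented for the example.

```groovy
// Shared_SetCommonHeaders.groovy - hypothetical shared script resource that IRT
// distributes to every iFlow that uses it.
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Example of logic you would want to keep identical across all interfaces
    message.setHeader('X-Company-Source', 'FigafDemo')
    message.setHeader('X-Processed-At', new Date().format("yyyy-MM-dd'T'HH:mm:ss"))
    return message
}
```

In the iFlow, the Groovy Script step keeps pointing at the same file name, so when IRT rolls out a new version of the shared resource the step picks up the updated content, and you can run the tests described above to confirm nothing broke.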

Limitations

There are some challenges with the current setup. You cannot share message mappings, primarily because they rely on source and target schemas and possibly groovy scripts that would also need to be shared. We can probably find some way around this; if it becomes a requirement, we can find a solution that allows you to share message mappings as well.

A standard solution from SAP

It is a common request, and SAP may implement something similar as a standard part of the development tooling. One thing required for this to work is the ability to test that a change does not affect other iFlows. It should be possible to adapt our solution so it enables versioning and supports testing of the shared resources.

Want to try it out

The feature will be released in the next version of Figaf IRT, probably in the week of 17 June. But you can already sign up now and try all the other great functions we have to make your SAP CPI life easier.

Watch the full flow of Figaf to handle SAP CPI Change Management

We have an integrated approach to handling change management for SAP CPI (Cloud Platform Integration, aka HCI). In this video (25 minutes) I show how the full application is connected. It gives you a central place to manage all your SAP integration development.

In the video I show how you can get a good change management process for SAP CPI with:

  • Full traceability for all modifications made during development, linked with the business request
  • Ability to view differences between objects and understand how they were changed because of a business request
  • Testing that the process has been developed correctly
  • Configuration of iFlows across the landscape, so there are no manual steps for the transport
  • Transport of individual iflows in the landscape
  • Documentation of all changes in the process

There is also a monitoring part that makes monitoring of CPI easier and enables you to set up alerting if a message is in a specific state.

If you like it, you can also see how we are able to support the change management process in SAP PI/PO.

If you want to try it out, then try our cloud offering, Figaf IRTCloud.

How do you document your SAP CPI iFlows

Documentation of integration has always been a strange thing. It has been pretty difficult to make sure we documented the details correctly and included enough information. Normally you just get a document designed to document something else and then try to adapt it to your integration. That never gives a good result. You may be compliant, but it is a waste of time if the documentation is not useful.

For SAP PI/PO we have for ages been using a Word template for the documentation of each interface. We support that with the Figaf IRT tool, so you can generate it fast. You can see an example of the SAP PI documentation here. The inspiration was to avoid documentation that was never updated and always stayed at the initial version.

SAP best practice

I did find an example of an SAP CPI template in the best practice guide. I did not like it for a few reasons:

  • It focused on the wrong things, being both too detailed and too generic. For a file conversion or a mapping it just has empty tabs that users need to fill in.
  • It could not be automated very well, which meant it required a lot of user governance to keep the information up to date.
  • It juggled 3 different adapters without showing the required details.

I wanted to improve the process and make it even easier to capture the most important parts of a CPI iFlow. I ended up with the following information:

  • Overview of the iFlow header information like name and description
  • History of changes logged with business requirements
  • Sender and receiver connections together with the relevant connection information
  • Test cases you have created in the tool
  • Flow description, which is a table representation of the iflow. It gives an overview of how the different steps are connected (see the sketch after this list)
  • Configuration parameters configured across the full landscape, so you get some information on what is being used and the relevant resources.
  • Resources and the last change date for them.
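To give an idea of how the flow description can be derived, here is a small Groovy sketch that reads the iFlow's BPMN2 definition (the .iflw file) and prints the steps and their connections. The file name is a placeholder, and the table Figaf IRT generates contains more detail than this.

```groovy
// Sketch: list the steps and connections of an iFlow from its BPMN2 XML.
// 'MyIflow.iflw' is a placeholder file name.
def flow = new XmlSlurper().parse(new File('MyIflow.iflw'))

// Typical BPMN2 elements that represent steps in an iFlow
def stepTypes = ['startEvent', 'endEvent', 'callActivity', 'serviceTask']
flow.'**'.findAll { it.name() in stepTypes }.each { step ->
    println "${step.@id}\t${step.@name}\t${step.name()}"
}

// Sequence flows describe how the steps are connected
flow.'**'.findAll { it.name() == 'sequenceFlow' }.each { sf ->
    println "${sf.@sourceRef} -> ${sf.@targetRef}"
}
```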

You can download an example of the document here for the SF to AD iflow.

There are ways to improve the document with more information. If you have anything that you think will provide value and make the documentation easier to understand, do let us know.

Automated documentation does have some limits, like being able to update it in different places and add the extra information that makes sense.

You can try it out on your own iflows and see how it performs in your landscape.

SAP PI/PO B2B Add-on alternative

Over the coming period we will see a large number of users performing migrations of their SAP PI/PO platform to 7.5. I have created a number of resources on SAP PI/PO migration. One of the big things in this migration is the move from Seeburger to B2B Add-on. We have a tool that allows you to automate the migration from Seeburger to B2B Add-on message mappings.

The B2B Add-on from SAP does work and can give you a way to manage your EDI messages. There are still some gaps in the B2B Add-on solution that could be greatly improved and where I see that an alternative solution could be better. They are in the areas of:

  • Logging and archiving of messages sent.
  • Partner management and settings usability
  • Configuration of partner connections via AS2, VANS or SFTP.

I had a webinar with Paw Pedersen from BizBrains, who created a product called Link. Link is a tool built to manage your EDI solution. It is currently being used to manage B2B for BizTalk, but the same system can also be used for SAP PI/PO. They have been running it for a number of customers, from small companies wanting to run EDI to big banks. You will learn what the solution can help with. You can see the replay here.

They have a number of different configuration options and security settings that are required for banks. They also have large customers with many partners, so they know how to make the application scale.

One of the reasons I think it could be useful for you is that if you switch to their platform, there is no need to upgrade to the PO license and you can stay on the PI license. It can save you a great deal of money and speed up your migration process.

There are a number of different ways the Link solution can tie into your SAP PI/PO solution, from:

  • Just as an archive of messages
  • Partner configuration
  • Partner configuration that updates your mappings
  • Using it also for SAP CPI.

If you find the product interesting, write to psp@bizbrains.com and get a trial of what the solution can do.

I will be a part of designing a good solution that will make your migration as simple as possible.