Connectivity is one of the big challenges in a SAP PI/PO migration. Even when the protocol is the same, it can behave very differently.
Channels in SAP PI have been developed and have evolved over a long period of time; they have been continuously improved to support the different use cases customers asked for.
Once you have a good understanding of the channels you will be using, you can easily create custom mappings between the PI channels and the Cloud Integration ones. See here how to create new mappings for such scenarios.
Channels can perform multiple actions in one step. This could be moving a payload once processing has completed, or updating a database table in the same transaction as the select.
This is a bit more complicated in Cloud Integration, because its channels have been designed with simplicity in mind. You do have options to modify the payloads as you like during processing, but the atomic operations may require a bit of extra work.
This could be converting a JSON payload into XML or handling content conversion.
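As a sketch, a JSON-to-XML conversion like this can be done in a few lines in a Groovy script step. The payload and element names below are just assumptions for illustration; in an iFlow script you would read the body from the Message object instead of a literal string.

```groovy
import groovy.json.JsonSlurper
import groovy.xml.MarkupBuilder

// Hypothetical JSON payload; in an iFlow script this would come
// from message.getBody(String).
def json = new JsonSlurper().parseText('{"orderNo":"4711","qty":2}')

// Build the XML representation with MarkupBuilder.
def writer = new StringWriter()
new MarkupBuilder(writer).Order {
    OrderNo(json.orderNo)
    Qty(json.qty)
}
println writer.toString()
```

In the iFlow you would then set the result back on the message body and continue with a normal XML message mapping.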
Modules in SAP PI/PO channels were a good way to enable users to run different kinds of code. You should consider whether a module can be replaced with a content converter or whether it is still needed.
If you still need it, you can build something similar in a Groovy script for most of your custom modules. If you add it as a Groovy script in a shared collection, you can map the module to it in the Figaf Migration Tool. Custom code is fairly easy to convert, but it is a manual effort.
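As an example of the pattern, the logic of a simple module that added a timestamped filename (something a DynamicConfigurationBean module often did) could be kept as a plain function in a shared Groovy collection. The naming convention below is an assumption; in the iFlow script you would apply the result with message.setHeader("CamelFileName", ...).

```groovy
import java.time.LocalDateTime
import java.time.format.DateTimeFormatter

// Sketch of the logic a filename-setting module might have performed.
// The "<base>_<timestamp>.xml" pattern is just an example.
def dynamicFileName(String baseName) {
    def stamp = LocalDateTime.now().format(DateTimeFormatter.ofPattern('yyyyMMdd-HHmmss'))
    return "${baseName}_${stamp}.xml"
}

println dynamicFileName('orders')
```

Keeping the logic as a plain function, separate from the Message API, makes it easy to reuse the same script across the iFlows that previously shared the module.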
The File protocol gives some challenges that are often mentioned. The challenge is that the SAP PI system has usually shared a file system with the ERP system, either an NTFS or Unix mount. SAP ERP could save a file on the shared drive, and SAP PI could then process this file. It made integration very smooth.
With the cloud this is no longer an option: you don't have a way to get access to the file system. There is a road map item to allow Edge Integration Cell access to a mapped network drive, but I would go a long way to avoid it because of the complexity of managing such a scenario. And there are quite a few alternatives.
1) Set up an SFTP or FTPS server
You probably already have an SFTP or FTPS server running in your network. If not, it is pretty quick to set one up. Your IT department should be able to get this running for you, and every server in your landscape will be able to use it. It can be secured well.
You can use Cloud Connector, so the server does not need to have a connection to the internet. SSH (SFTP) is the easiest option via Cloud Connector, since you don't need to have other ports open.
2) File to HTTP jobs
You can also create a program on your SAP ERP system that takes each file from one or more directories and sends it to Cloud Integration via HTTP. This takes a short time to develop and is also easy to monitor.
3) Go away from files
You can move away from files to another protocol like JMS/AMQP or HTTP in your use cases. This will simplify the setup and be easier to maintain in the future.
4) Move to online drives
You can also connect to SharePoint, OneDrive, AWS S3 or something similar for all your file exchanges. Using this will add some complexity, but it may also improve the processes you have for integration development.
If you are using the JMS adapter, you can easily change to the newer AMQP protocol. The adapter supports reusing your existing queue system; if it does not support AMQP, an upgrade of the integration broker will be needed. You can see the AMQP adapter configuration here. This is one of the scenarios where you will need to make your own adapter mapping.
You can also consider using SAP Event Mesh or SAP Advanced Event Mesh, depending on your use case for the scenario.
For the good old IDoc, SAP is moving to IDoc over SOAP. You just change the SM59 connection and you are good to go. The choice of HTTP endpoint does matter, though.
1) You can use an IDoc sender in each iFlow.
This requires one SM59 connection for each IDoc that needs to be processed in that flow. You will need a lot of management in place to handle this: if you change a certificate or security credential, you will need to update multiple SM59 destinations.
2) Use an IDOC dispatcher flow
The best option is to use one or more IDoc dispatcher flows. This enables you to take the IDoc and then send it to the iFlow that needs to process that particular message. You can use a value mapping object to handle dispatching to the right processes.
The SM59 setup is easiest if you just use one HTTP endpoint for all IDocs, or one per group of IDocs.
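A minimal sketch of the routing step in such a dispatcher flow: read the IDoc type from the control record and expose it so a Router step or value mapping can evaluate it. The EDI_DC40 structure below is the standard IDoc XML control record; the header name is an assumption.

```groovy
// Groovy 3+: on older Groovy runtimes this class is groovy.util.XmlSlurper.
import groovy.xml.XmlSlurper

// Read the IDoc type from the control record of an inbound IDoc XML.
// In an iFlow script you would then store it with
// message.setHeader("idocType", idocType(message.getBody(String)))
// and route on that header, e.g. to a ProcessDirect endpoint.
def idocType(String idocXml) {
    def doc = new XmlSlurper().parseText(idocXml)
    return doc.IDOC.EDI_DC40.IDOCTYP.text()
}

def sample = '<ORDERS05><IDOC><EDI_DC40><IDOCTYP>ORDERS05</IDOCTYP></EDI_DC40></IDOC></ORDERS05>'
println idocType(sample)
```

The value mapping can then translate the IDoc type (or type plus partner) to the address of the processing iFlow.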
For the receiver side there is not much difference: you can either just use a router flow or have the IDoc connection in a shared flow.
For the sender side you probably need an iFlow in Cloud Integration where you can dispatch the XI 3.0 messages to the correct location. This is the same challenge as for the IDoc: each SM59 destination only creates one URL, which corresponds to one iFlow endpoint.
You will need two (or more) flows: one for async/EO and one for sync/BE. They are covered by different settings in the adapter, and the split probably also makes sense for the subsequent processing.
You can migrate one interface at a time by using the following process.
Create two SM59 connections that point to your proxy dispatching iFlows.
1) Define sub-parameters based on interface name and namespace with t-code SXMSIF. Here you define a “Sender/Receiver ID” for the interface you want to migrate.
2) In the configuration in SXMB_ADM, define a sub-parameter for IS_URL where you use the Sender ID created before.
All interfaces without the sub-parameters defined will be sent to SAP PI. This way you can easily migrate without having to use the SAP PI/PO system for routing, and there is also an easy failover option: just update SXMB_ADM or the Sender ID.
Here you can handle the messages in a shared flow or in each iFlow that needs to send them.
Maintaining the existing proxies is a bit of a problem. You have the option of the Metadata Repository (MDR), which allows you to migrate the structures you want to maintain to a repository in SAP.
You will need to change some of the existing data types, because SAP does not support them in the new way. You should consider only moving the custom proxy interfaces you have created. This does limit your flexibility for building and using the processes.
The REST adapter is probably the newest widely used adapter. It has really advanced features compared to other adapters. It would normally be mapped to an HTTP adapter in SAP Cloud Integration, but then you will be missing functionality like parsing the URL, routing information, JSON-XML conversion and error handling, which can all be performed in the PI adapter.
In Cloud Integration you will need to understand what the main functions of the adapter were and then add them to the processing. You have most of the components in place; you just need to create a Groovy script to handle the logic to select the right data.
I do not think we will be able to create a good way to support all use cases for the REST adapter setup. There are simply too many scenarios to cover with our migration tool, so this is easiest with your own scripting.
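As a sketch, the URL parsing the PI REST adapter did for you can be recreated with a few lines of Groovy on top of the HTTP path header from the sender adapter (commonly CamelHttpPath). The resource/id split below is an assumption about paths like /orders/4711; adjust it to your own URL pattern.

```groovy
// Split an HTTP path into resource and id, similar to what the
// PI REST adapter's URL pattern handling did. In an iFlow script
// you would call this with message.getHeader("CamelHttpPath", String)
// and store the parts as headers for routing and mapping.
def parsePath(String httpPath) {
    def parts = httpPath.split('/').findAll { it }
    return [resource: parts ? parts[0] : null,
            id      : parts.size() > 1 ? parts[1] : null]
}

println parsePath('/orders/4711')
```

JSON-XML conversion and error handling can be added the same way as separate script or converter steps, so each piece of the PI adapter's behavior becomes an explicit step in the iFlow.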
The RFC sender protocol is a bit tricky. The easiest option is probably to change the RFC to RFC over SOAP and then use that protocol. There is a road map item to enable RFC from Edge Integration Cell.
The receiver side is not a problem. You will need to change the flow to use your Cloud Connector to handle the request.
If you have RFC lookups in your message mappings, Figaf will just create an empty RFC channel where you can enter your data. You can see how to implement this here.
The database protocol (JDBC) is pretty interesting. It enables you to have a much better standard connection with the backend platforms. In SAP PI/PO it was perfect for extracting documents/lines, though you needed to be careful with the transaction handling.
With JDBC you will need to make some modifications to your flows to allow them to run smoothly. The Cloud Integration JDBC adapter does not have a sender channel, so you have to create an XML query statement that you can use for it. Here you can add your select and then an update of the same records in one transaction.
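A sketch of such an XML statement in the JDBC adapter's document format, assuming a hypothetical ORDERS table with a STATUS column: the first statement selects the new rows, the second marks the same rows as processed in the same message, and thereby the same transaction.

```xml
<root>
  <selectStatement>
    <dbTableName action="SELECT">
      <table>ORDERS</table>
      <access>
        <ORDERNO/>
        <STATUS/>
      </access>
      <key>
        <STATUS>NEW</STATUS>
      </key>
    </dbTableName>
  </selectStatement>
  <updateStatement>
    <dbTableName action="UPDATE">
      <table>ORDERS</table>
      <access>
        <STATUS>PROCESSED</STATUS>
      </access>
      <key>
        <STATUS>NEW</STATUS>
      </key>
    </dbTableName>
  </updateStatement>
</root>
```

A timer-started iFlow can build this payload, call the JDBC receiver, and then process the selected rows, which replaces the polling behavior of the PI JDBC sender channel.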
You can also consider creating services on the database servers that send data via REST/SOAP whenever there is new data. This could probably simplify the setup, but it requires that you have someone who can handle the setup at their end.
You may have your own custom adapter or an adapter you have purchased through a partner. It depends on whether the structure of the messages is the same and whether the protocol works somewhat identically.
You can ask SAP to create an adapter for it. But I think in most cases you can use the standard Cloud Integration flows and then build the requests with regular HTTP calls, or create a custom adapter for it.
You can also ask a consultancy for it. They may be able to provide it as consulting or build it as a product. The biggest challenge for consulting companies is to find the right number of customers to sell a product to.
The Edge Integration Cell may have its own set of adapters. Some may be the same, but others are different. This impacts your migration process: you will need to map the migration differently depending on where you want to run the integration.