
R and The SAP Netweaver Gateway

This short article is in response to friends and others in the SAP world who have asked me, “I’m trying to wrap my head around data science. What can it do for me in SAP?” Well, let me show you with roughly 30 lines of R code.

For the purists, this is not truly data science. I believe to label it that, you need to employ models and machine learning. However, for those beginning to hack their way into this jungle…let’s not get caught up in semantics. “Data Science” is a broad term and we’ll use it here for this work. Don’t judge!

Recently I was shown a report that the sales team was using to show the past week’s sales and their geographic locations. It was a standard SAP ALV report. The user showing me the report said, “It would be great to map this in Google or something like that, but that’s kind of far-fetched.” “Is location by zip code good enough?” I asked. “Yeah, that would be awesome. We just want to see the general distribution nationwide for the previous week. Right now we just kind of imagine what a map would look like,” was the response.
“Ok, I’ll get you a prototype tomorrow.” I said. She laughed…then realized I wasn’t joking.

This is a breeze in R…let me show you step by step and you can go do the same thing in your system. You will see it is not a final product, but the idea here is to spark your curiosity around two things:

• SAP NetWeaver Gateway
• R

The first step is to create a data dictionary structure in SAP using transaction code SE11 for the fields you want to extract. Because we are going after sales data, these are the fields I selected. Honestly though, I only need Ship To Name and Postal code for this prototype.


The next step is to create an RFC function module to extract that data into this structure using transaction code SE37.


Put in the table structure you created in SE11 in the table output tab.


Put in some simple code to pull out the data. You’ll notice that I hardcoded the date range; this will become an input parameter if the prototype is a success.


FUNCTION zdf_sales_history.
"Local Interface:
"  TABLES: T_SALES

  DATA: wa_sales LIKE LINE OF t_sales,
        t_adrnr  TYPE adrnr,
        t_datum  TYPE datum,
        s_datum  TYPE datum.

  "Hardcoded date range for the prototype
  s_datum = '20151231'.
  t_datum = '20160110'.

  SELECT vbak~vbeln vbak~auart vbak~kunnr vbak~erdat kna1~name1
         vbap~posnr vbap~matnr vbap~kwmeng vbap~netpr vbap~prodh
    INTO CORRESPONDING FIELDS OF wa_sales
    FROM vbak JOIN vbap ON vbak~vbeln = vbap~vbeln
              JOIN kna1 ON vbak~kunnr = kna1~kunnr
    WHERE vbak~auart = 'ZOR'
      AND vbak~erdat >= s_datum AND vbak~erdat <= t_datum.

    "No need to worry about line item ship-tos because we don't use them.
    SELECT SINGLE kunnr adrnr INTO (wa_sales-kunwe, t_adrnr)
      FROM vbpa
      WHERE vbeln = wa_sales-vbeln AND parvw = 'WE'.

    SELECT SINGLE name1 street city1 region post_code1
      INTO (wa_sales-name1_we, wa_sales-stras, wa_sales-ort01,
            wa_sales-regio, wa_sales-pstlz)
      FROM adrc
      WHERE addrnumber = t_adrnr.

    "Pick up the customer group. If not found at item level,
    "fall back to the header entry on vbkd (blank item number).
    SELECT SINGLE kdgrp INTO wa_sales-kdgrp
      FROM vbkd WHERE vbeln = wa_sales-vbeln AND posnr = wa_sales-posnr.
    IF sy-subrc <> 0.
      SELECT SINGLE kdgrp INTO wa_sales-kdgrp
        FROM vbkd WHERE vbeln = wa_sales-vbeln.
    ENDIF.

    APPEND wa_sales TO t_sales.
  ENDSELECT.

ENDFUNCTION.

That is it on the SAP side. Now we move to the NetWeaver Gateway. Most people associate the NetWeaver Gateway with Fiori, and while that is correct, it is not limited to just that. You can use the Gateway to expose JSON or XML OData streams and consume those in applications or reports (such as Power BI). For those of you not familiar with working in the Gateway, I will walk through this step by step. This is the simplest of examples and I do not want it to detract from the power the Gateway has, but I also want to illustrate how approachable it is.
I’m assuming your NetWeaver Gateway is configured. If it is not, refer to your Basis folks; it is a pretty easy task. Or reach out to me and I can send you my config settings and steps.
1. Execute transaction code SEGW in the gateway client.
2. Click on the Create Project button.
3. Enter a project name and a description, and assign the project to a package. Then click on the Enter button.
I set this as a local project because I’m not transporting it.


4. Now you will have a project with nodes. Right-click on the Data Model node and select Import -> BOR/RFC Interface.


  5. Enter a name for the Entity Type you are creating. I’m calling it SalesHistory. Then select the system where you created the RFC-enabled function module earlier; for me it was PR1CLNT300. Next, enter the data source attributes: the type (Remote Function Call) and the name of the function module. Then click on the Next button.


  6. On this screen you will select the fields you want to import. Selecting the top node will select all of them; for the prototype, just do that. Then click on the Next button.


  7. The final screen is where you select the key fields. For our sales data these are the sales document and item number. Finish by clicking on the Finish button.


  8. Now you will see your defined Entity Type by opening the Properties folder. You should name the columns something meaningful, but we’re not doing that now because we’re lazy and this is a prototype.


  9. The next step is to map our newly created project to the actual function module in SAP. To do this, open the Service Implementation folder, right-click on the GetEntitySet node and select Map to Data Source.


  10. Enter the client where the function module exists, the type of data source and the name. This should be pretty familiar; it is the same as step 5. Then click on the Enter button.


  11. Now you have to map each field from the function module to the entity set. However, there is a magic button. Click on the Propose Mapping button and watch it do the work for you.


  12. Finally, generate the project by clicking on the Generate button. You will get a popup with the names of the classes to be generated. Accept the defaults and click Continue. You will have to assign this to a transport or make it local.

  13. Next we need to create the service itself and obtain the URL that we will use later. Open the Service Maintenance folder and then double-click on GW_EMBEDDED_SYSTEM.


  14. Click on the Register button.

  15. Enter the system alias you will be using, that is, the system you are calling.


  16. This step will add the service. Accept the default values and assign it as local or to a package. Then click on the Enter button.


  17. Click on the Maintain button.

  18. Confirm that you are being redirected.


  19. At the bottom of the screen you will see a Call in Browser button. Click on it.


  20. Your browser will open the service and list the entity sets within this entity type. Copy down your entity set name. It should be the same as the entity type with ‘Set’ appended, if you accepted all the defaults.


  21. Your URL should look something like /sap/opu/odata/sap/ZSALES_HISTORY_SRV/. Change it by appending the entity set name, e.g. /sap/opu/odata/sap/ZSALES_HISTORY_SRV/ZSALES_HISTORYSet (it is case sensitive).

Refresh the browser with this new URL and you should see your sales data showing up in XML format. Copy this URL; we’ll use it in R next!

That was by far the most difficult part of our process, just getting access to the SAP sales data.

I’m assuming you have R and RStudio installed locally. If you do not, just google it; there are so many tutorials on this that I can’t possibly add anything to them.

Open up RStudio and make sure the following packages are installed. I’ll make sure to comment this code as much as possible, but I think you’ll see R is fairly intuitive.

I use this piece of code at the start of all my R programs to make sure that the packages I’m going to need are installed and loaded.

needs = c("xml2", "leaflet", "magrittr", "zipcode") #Packages that I need
have = needs %in% rownames(installed.packages()) #Packages that I have
if(any(!have)) install.packages(needs[!have]) #If any needs are not in have, install them
invisible(lapply(needs, library, character.only = TRUE)) #Load all of them
rm(needs, have) #Remove when finished to tidy the workspace
data(zipcode) #Load the zips from the zipcode package

First, assign your new URL to a variable; let’s get crazy and call it url.

url = "http://<host>:<port>/sap/opu/odata/sap/ZSALES_HISTORY_SRV/ZSALES_HISTORYSet"
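If you would rather keep the host and port in variables, you can assemble the URL with sprintf(). The host and port below are hypothetical placeholders; substitute your gateway’s values.

```r
# Hypothetical gateway host and port -- replace with your own.
gw_host <- "gwhost.example.com"
gw_port <- 8000

# Build the full entity-set URL for our service.
url <- sprintf("http://%s:%d/sap/opu/odata/sap/ZSALES_HISTORY_SRV/ZSALES_HISTORYSet",
               gw_host, gw_port)
```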

Now let’s read the data from our backend system, through the NetWeaver gateway and into our R code. This functionality is part of the xml2 library loaded above.

pg = read_xml(url)

Now that we have all the XML in our pg object, let’s read out the tags that we want to work with. In our case, those tags are “Ship_To_Name” and “Postal_Code”.

st_name = xml_find_all(pg, "//d:Ship_To_Name")
pstlz = xml_find_all(pg, "//d:Postal_Code")

These two variables now have all the ship-to and postal code NODES. We have to take one more step and read out the text values in these nodes. The xml2 library makes this easy as well.

st_name_vals = trimws(xml_text(st_name))
pstlz_vals = trimws(xml_text(pstlz))
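If you want to try these xml2 calls without a live Gateway, here is a self-contained sketch against a hand-written stand-in feed. The d: namespace URI is the standard OData V2 one; the element names and sample values are assumptions chosen to match our service.

```r
library(xml2)

# A tiny stand-in for the Gateway's Atom/OData response.
sample_feed <- '<feed xmlns=""
      xmlns:d=""
      xmlns:m="">
      <d:Ship_To_Name> ACME Corp </d:Ship_To_Name>
      <d:Postal_Code> 02115 </d:Postal_Code>

doc <- read_xml(sample_feed)

# Same calls as above: find the nodes, then pull out and trim their text.
st_name_vals <- trimws(xml_text(xml_find_all(doc, "//d:Ship_To_Name")))
pstlz_vals   <- trimws(xml_text(xml_find_all(doc, "//d:Postal_Code")))
```

Note that the d: prefix works because xml2 picks up the namespace declarations from the document itself.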

If you look at these vectors you will see all the ship-to names and all the postal codes. Let’s put this in a nice data frame that we can work with. We will bind the columns (cbind) of these two vectors into a data frame.

df_vals =, pstlz_vals))

If you double click on the df_vals object in the global environment you will see the nice contents of our new data frame.


Our next step is to ‘clean’ our zip codes. The zipcode library makes this easy: it will strip off “ZIP+4” suffixes, restore missing leading zeroes, and replace anything containing non-digits with NA.

df_vals$pstlz_vals = clean.zipcodes(df_vals$pstlz_vals)
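For intuition, here is a rough base-R sketch of the kind of cleaning clean.zipcodes() performs. This is illustrative only, not the package’s actual implementation.

```r
# Illustrative stand-in for clean.zipcodes(): strip ZIP+4 suffixes,
# turn non-numeric entries into NA, and restore missing leading zeros.
clean_zip <- function(z) {
  z <- sub("-.*$", "", trimws(z))        # drop ZIP+4 suffixes like 02115-0031
  z[z == "" | grepl("\\D", z)] <- NA     # anything with non-digits becomes NA
  short <- ! & nchar(z) < 5
  z[short] <- paste0(strrep("0", 5 - nchar(z[short])), z[short])  # pad to 5 digits
  z
}

clean_zip(c("2115", "98101-1234", "N/A"))
```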

There is not a lot you can do with zip codes alone when it comes to mapping, so let’s combine our new set of data with the zipcode data frame from the zipcode library. This will give us the latitude and longitude of each zip code’s centroid.

geo = merge(df_vals, zipcode, by.x="pstlz_vals", by.y = "zip")
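One thing to watch: merge() performs an inner join by default, so any ship-to whose zip code is missing from the zipcode table silently drops out of geo. A toy example with made-up rows (pass all.x = TRUE if you want to keep the unmatched ones):

```r
# Two orders, only one of which has a zip known to our lookup table.
orders <- data.frame(pstlz_vals = c("02115", "99999"),
                     name       = c("A", "B"),
                     stringsAsFactors = FALSE)
zips   <- data.frame(zip = "02115", latitude = 42.34, longitude = -71.09,
                     stringsAsFactors = FALSE)

inner <- merge(orders, zips, by.x = "pstlz_vals", by.y = "zip")               # unmatched row dropped
left  <- merge(orders, zips, by.x = "pstlz_vals", by.y = "zip", all.x = TRUE) # unmatched row kept, NA coords
```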

If you take a look at your geo data frame now, you’ll see that the latitude and longitude have been associated with the zip codes. This is key to our next step…the exciting step…creating a map.


Leaflet is a JavaScript library that allows for many creative and interesting ways to map. We will work through two examples using the leaflet package in R.

salesMap = leaflet() %>%
addTiles() %>%
setView(-95.7129, 37.0902, zoom = 4) %>%
addMarkers(data = geo, lng = ~ longitude, lat = ~ latitude,
popup = geo$st_name_vals)

Let’s take a journey through this little code because it does a lot of great stuff. The function leaflet() initializes the map widget from the JavaScript library. The %>% is the pipe operator from magrittr and chains the steps together. addTiles() tells the map which tile set to use; if nothing is declared, as we have done, the default OpenStreetMap tiles are used. Try using addProviderTiles("Stamen.Toner") instead to get another interesting map; the leaflet documentation lists the available providers. setView() centers the map (here roughly on the continental US) at zoom level 4. The addMarkers() function creates the markers on our map: we declare that the data come from the geo data frame from earlier, the longitude is simply longitude and the latitude is simply latitude. Finally, we create a popup for each marker, which is simply our ship-to names (you can get more creative). Again, not a ton of value here, but the concept should by now be spurring some ideas on how you could easily adapt your data to this application.

When we execute this code we get this interesting map. This is an interactive map so you can zoom in and click on a marker.


When you zoom in and click on a marker you can see who made the sale.


Let’s try another map, because all these markers could quickly get out of control. Just for experimentation, let’s cluster them into circles and add a legend.

salesMap = leaflet() %>%
addTiles() %>%
setView(-95.7129, 37.0902, zoom = 4) %>%
addCircleMarkers(data = geo, lng = ~ longitude, lat = ~ latitude, radius = 5,
clusterOptions = markerClusterOptions()) %>%
addLegend("bottomright", colors= "red", labels="Sales Whoopie", title="Some Sales We made")

This code is very similar to our previous code; however, instead of plain markers, we are adding circle markers and setting the clustering to the default (markerClusterOptions()). When this code is executed you will see something a bit different.


If you hover over a circle you will see the cluster area to which it applies.


If you click on the circle it will take you to the next level.

And so on…you can keep drilling down to the count of sales for a zip code.


Thus concludes this foray into using R with the NetWeaver Gateway. After an evening’s work I returned to my colleague in sales, who was surprised and pleased with the results of the prototype. Now, of course, they want the production version tomorrow… I put myself into these situations. The prototype stage is over, and it will take more work to make this production ready; however, the lesson is that there is incredible power in combining R, the SAP NetWeaver Gateway and SAP. The R community is rich, helpful and very accommodating to newbies. Reflecting on how long this would have taken me in the past, something like this without R would have taken months to code. With R, it was 30 minutes… and, I’d humbly admit, better than anything I could have done before.
