Customers often need to augment and enrich SAP source data with other non-SAP source data. Such analytic use cases can be enabled by building a data warehouse or data lake. Customers can now use the AWS Glue SAP OData connector to extract data from SAP. The SAP OData connector supports both on-premises and cloud-hosted (native and SAP RISE) deployments. By using the AWS Glue OData connector for SAP, you can work seamlessly with your data on AWS Glue and Apache Spark in a distributed fashion for efficient processing. AWS Glue is a serverless data integration service that makes it easier to discover, prepare, move, and integrate data from multiple sources for analytics, machine learning (ML), and application development.
The AWS Glue OData connector for SAP uses the SAP Operational Data Provisioning (ODP) framework and the OData protocol for data extraction. This framework acts in a provider-subscriber model to enable data transfers between SAP systems and non-SAP data targets. The ODP framework supports full data extraction and change data capture through the Operational Delta Queues (ODQ) mechanism. As a source for data extraction from SAP, you can use SAP data extractors, ABAP CDS views, SAP BW or BW/4HANA sources, HANA information views in SAP ABAP sources, or any ODP-enabled data sources.
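Under the hood, an ODP-enabled OData service implements this provider-subscriber flow with a change-tracking preference and delta links. AWS Glue drives this exchange for you, but the following minimal sketch (not from the original post; host, service, and entity set names are placeholders) illustrates the protocol:

```python
import requests

BASE_URL = "https://<sap-host>:<port>/sap/opu/odata/sap/<ODP_SERVICE>"  # placeholder service
AUTH = ("<user>", "<password>")

# Initial load: the odata.track-changes preference asks the ODP framework to
# register this consumer as a subscriber on the Operational Delta Queue (ODQ).
response = requests.get(
    f"{BASE_URL}/<EntitySet>?$format=json",  # placeholder entity set
    auth=AUTH,
    headers={"Prefer": "odata.track-changes"},
)
payload = response.json()["d"]
initial_records = payload["results"]

# The response carries a delta link; calling it on a later run returns only
# the records that changed since the subscription was created.
delta_link = payload.get("__delta")
if delta_link:
    changed_records = requests.get(delta_link, auth=AUTH).json()["d"]["results"]
```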
SAP source systems can hold historical data and can receive constant updates. For this reason, it's important to enable incremental processing of source changes. This blog post details how you can extract data from SAP and implement incremental data transfer from your SAP source using the SAP ODP OData framework with source delta tokens.
Solution overview
Example Corp wants to analyze the product data stored in their SAP source system. They want to understand their current product offering, specifically the number of products they have in each of their material groups. This will include joining data from the SAP material master and material group data sources from their SAP system. The material master data is available for incremental extraction, while the material group data is only available as a full load. These data sources should be combined and made available to query for analysis.
Prerequisites
To complete the solution presented in this post, start by completing the following prerequisite steps:
- Configure operational data provisioning (ODP) data sources for extraction in the SAP Gateway of your SAP system.
- Create an Amazon Simple Storage Service (Amazon S3) bucket to store your SAP data.
- In the AWS Glue Data Catalog, create a database called sapgluedatabase.
- Create an AWS Identity and Access Management (IAM) role for the AWS Glue extract, transform, and load (ETL) job to use. The role must grant access to all resources used by the job, including Amazon S3 and AWS Secrets Manager. For the solution in this post, name the role GlueServiceRoleforSAP. Use the following policies:
  - AWS managed policies (for example, the AWSGlueServiceRole policy commonly attached to AWS Glue job roles)
  - An inline policy granting access to your S3 bucket and your secret, along the lines of the sketch after this list
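The post's exact policy documents aren't reproduced in this excerpt. As a rough sketch, an inline policy for this solution would grant read/write on the S3 bucket and read on the secret, along these lines (bucket name, Region, and account ID are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3AccessForSAPData",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject", "s3:ListBucket"],
      "Resource": ["arn:aws:s3:::<your-sap-bucket>", "arn:aws:s3:::<your-sap-bucket>/*"]
    },
    {
      "Sid": "ReadSAPSecret",
      "Effect": "Allow",
      "Action": "secretsmanager:GetSecretValue",
      "Resource": "arn:aws:secretsmanager:<region>:<account-id>:secret:ODataGlueSecret*"
    }
  ]
}
```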
Create the AWS Glue connection for SAP
The SAP connector supports both CUSTOM (which is SAP BASIC authentication) and OAUTH authentication methods. For this example, you will be connecting with BASIC authentication.
- Use the AWS Management Console for AWS Secrets Manager to create a secret called ODataGlueSecret for your SAP source. Details in AWS Secrets Manager should include the elements in the following code. You will need to enter your SAP system username in place of <your SAP username> and its password in place of <your SAP username password>.
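The secret body isn't included in this excerpt. A sketch of the expected shape follows; the key names are assumptions for the connector's BASIC authentication, so verify them against the AWS Glue SAP OData connection documentation:

```json
{
  "basicAuthUsername": "<your SAP username>",
  "basicAuthPassword": "<your SAP username password>"
}
```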
- Create the AWS Glue connection GlueSAPOdata to your SAP system by selecting the new SAP OData data source.
- Configure the connection with the appropriate values for your SAP source:
  - Application host URL: The host must have the SSL certificates for the authentication and validation of your SAP host name.
  - Application service path: /sap/opu/odata/iwfnd/catalogservice;v=2;
  - Port number: Port number of your SAP source system.
  - Client number: Client number of your SAP source system.
  - Logon language: Logon language of your SAP source system.
- In the Authentication section, select CUSTOM as the Authentication Type.
- Select the AWS secret created in the preceding steps, ODataGlueSecret.
- In the Network Options section, enter the VPC, subnet, and security group used for the connection to your SAP system. For more information on connecting to your SAP system, see Configure a VPC for your ETL job.
Create an ETL job to ingest data from SAP
In the AWS Glue console, create a new Visual Editor AWS Glue job.
- Go to the AWS Glue console.
- In the navigation pane under ETL Jobs, choose Visual ETL.
- Choose Visual ETL to create a job in the Visual Editor.
- For this post, edit the default name to be Material Master Job and choose Save.
On your Visual Editor canvas, select your SAP sources.
- Choose the Visual tab, then choose the plus sign to open the Add nodes menu. Search for SAP and add the SAP OData Source.
- Choose the node you just added and name it Material Master Attributes.
  - For SAP OData connection, select the GlueSAPOdata connection.
  - Select the material attributes service and entity set from your SAP source.
  - For Entity Name and Sub Entity Name, select the SAP OData entity from your SAP source.
  - From the Fields, select Material, Created On, Material Group, Material Type, Old Matl Number, GLUE_FETCH_SQ, DELTA_TOKEN, and DML_STATUS.
  - Enter limit 100 in the filter section to limit the data during design time.
Note that this service supports delta extraction, so Incremental transfer is the default selected option.
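For reference, the script that the visual editor generates for this node is roughly the following sketch (the node variable, connection name, and entity name are placeholders from this walkthrough):

```python
# Sketch of the generated read for the Material Master Attributes node.
MaterialMasterAttributes_node = glueContext.create_dynamic_frame.from_options(
    connection_type="SAPOData",
    connection_options={
        "connectionName": "GlueSAPOdata",
        "ENTITY_NAME": "<entityName for Material Attribute>",  # from your SAP service
    },
    transformation_ctx="MaterialMasterAttributes_node",
)
```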
After the AWS Glue service role details have been selected, the data preview is available. You can adjust the preview to include the three newly available fields, which are:

- glue_fetch_sq: A sequence field, generated from the epoch timestamp in the order the record was received, and unique for each record. This can be used if you need to know or establish the order of changes in the source system.
- delta_token: All records will have this field value blank, except for the last passed record, which will contain the value for the ODQ token to capture any changed records (CDC). This record is not a transactional record from the source and is only there for the purpose of passing the delta token value.
- dml_status: This will show UPDATED for all newly inserted and updated records from the source and DELETED for records that have been deleted from the source.

For delta-enabled extraction, the last record passed will contain the value DELTA_TOKEN, and its delta_token field will be populated as mentioned above.
- Add another SAP OData source connection to your canvas, and name this node Material Group Text.
  - Select the material group service and entity set from your SAP source.
  - For Entity Name and Sub Entity Name, select the SAP OData entity from your SAP source.
Note that this service supports full extraction, so Full transfer is the default selected option. You can also preview this dataset.
- When previewing the data, notice the language key. SAP passes all languages, so add a filter of SPRAS = 'E' to extract English only. Note that this uses the SAP internal value of the field.
to solely extract English. Notice this makes use of the SAP inner worth of the sector. - Add a rework node to the canvas Change Schema rework after the
Materials Group Textual content
.- Rename the fabric group subject in goal key to
matkl2
, so it’s completely different than your first supply. - Beneath Drop, choose ;spras, odq_changemode, odq_entitycntr, dml_status, delta_token and glue_fetch_sq.
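In the generated script, the Change Schema node corresponds to an ApplyMapping transform; a sketch under the assumption that the material group description field is named wgbez (extend the mappings to your actual fields):

```python
from awsglue.transforms import ApplyMapping

# Rename matkl to matkl2; fields left out of the mappings list (spras,
# odq_changemode, odq_entitycntr, dml_status, delta_token, glue_fetch_sq)
# are dropped.
ChangeSchema_node = ApplyMapping.apply(
    frame=MaterialGroupText_node,
    mappings=[
        ("matkl", "string", "matkl2", "string"),
        ("wgbez", "string", "wgbez", "string"),  # assumed description field
    ],
    transformation_ctx="ChangeSchema_node",
)
```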
- Add a Join transform to your canvas, bringing together both source datasets.
  - Ensure the node parents of both Material Master Attributes and Change Schema have been selected.
  - Select the Join type of Left join.
  - Select the join conditions as the key fields from each source:
    - Under Material Master Attributes, select matkl.
    - Under Change Schema, select matkl2.
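Because the DynamicFrame Join transform performs inner equijoins, a left join like this one is typically expressed through Spark DataFrames in the generated script; a sketch of the equivalent logic:

```python
from awsglue.dynamicframe import DynamicFrame

# Left join keeps every material master record, even when no material group
# text row matches.
master_df = MaterialMasterAttributes_node.toDF()
group_df = ChangeSchema_node.toDF()
Join_node = DynamicFrame.fromDF(
    master_df.join(group_df, master_df["matkl"] == group_df["matkl2"], "left"),
    glueContext,
    "Join_node",
)
```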
You can preview the output to ensure the correct data is being returned. Now, you are ready to store the result.
- Add the S3 bucket target to your canvas.
  - Ensure the node parent is Join.
  - For Format, select Parquet.
  - For S3 Target Location, browse to the S3 bucket you created in the prerequisites and add materialmaster/ to the S3 target location.
  - For the Data Catalog update options, select Create a table in the Data Catalog and on subsequent runs, update the schema and add new partitions.
  - For Database, select the name of the AWS Glue database created earlier, sapgluedatabase.
  - For Table name, enter materialmaster.
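In the generated script, this target corresponds roughly to the following getSink sketch (the bucket name is a placeholder for the one you created in the prerequisites):

```python
# Write Parquet to S3 and create/update the materialmaster table in the
# sapgluedatabase database on every run.
s3_sink = glueContext.getSink(
    path="s3://<your-sap-bucket>/materialmaster/",
    connection_type="s3",
    updateBehavior="UPDATE_IN_DATABASE",
    partitionKeys=[],
    enableUpdateCatalog=True,
    transformation_ctx="s3_sink",
)
s3_sink.setCatalogInfo(catalogDatabase="sapgluedatabase", catalogTableName="materialmaster")
s3_sink.setFormat("glueparquet")
s3_sink.writeFrame(Join_node)
```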
- Choose Save to save your job. Your job should look like the following figure.
Clone your ETL job and make it incremental
After your ETL job has been created, it is ready to be cloned and extended to include incremental data handling using delta tokens.
To do this, you need to modify the job script directly. You will modify the script to add a statement that retrieves the last delta token (stored in the job tags) and adds the delta token value to the request (or execution of the job), which will enable the delta-enabled SAP OData service when retrieving the data on the next job run.

The first execution of the job will not have a delta token value in the tags; therefore, the call will be an initial run, and the delta token will then be stored in the tags for future executions.
- Go to the AWS Glue console.
- In the navigation pane under ETL Jobs, choose Visual ETL.
- Select the Material Master Job, choose Actions, and select Clone job.
- Change the name of the job to Material Master Job Delta, then choose the Script tab.
- You need to add an additional Python library that will handle storing and retrieving the delta tokens for each job execution. To do this, navigate to the Job Details tab, scroll down, and expand the Advanced Properties section. In the Python library path, add the following path:
s3://aws-blogs-artifacts-public/artifacts/BDB-4789/sap_odata_state_management.zip
- Now choose the Script tab and choose Edit script in the top right corner. Choose Confirm to confirm that your job will be script-only.
Apply the following changes to the script to enable the delta token.
- Import the SAP OData state management library classes you added in step 5 above, by adding the following code at line 8.
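The import line itself isn't reproduced in this excerpt, and the library's API isn't documented beyond the artifact above, so the snippets that follow assume it exposes a single state-manager class with hypothetical names; check the contents of sap_odata_state_management.zip for the real ones.

```python
# Hypothetical import; replace with the class names the library actually exposes.
from sap_odata_state_management import StateManager
```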
- The next few steps will retrieve and persist the delta token in the job tags so it can be accessed by the next job execution. The delta token is added to the request back to the SAP source, so the incremental changes are extracted. If no token is passed, the load will run as an initial load, and the token will be persisted for the next run, which will then be a delta load. To initialize the sap_odata_state_management library, extract the connection options into a variable and update them using the state manager. Do this by adding the following code at line 16 (after the job.init statement).
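A sketch of that initialization, continuing the hypothetical StateManager API from the previous step: the manager looks up the delta token stored in the job's tags (if any) and merges it into the connection options the source node will use.

```python
# Hypothetical API: fetch the last delta token from this job's tags and
# inject it into the SAP OData connection options for the next read.
state_manager = StateManager(job_name=args["JOB_NAME"])
key = "<key of MaterialMasterAttributes node>"
connection_options = {
    "connectionName": "GlueSAPOdata",
    "ENTITY_NAME": "<entityName for Material Attribute>",
}
connection_options = state_manager.get_connection_options_with_state(key, connection_options)
```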
You’ll find the <key of MaterialMasterAttributes node>
and the <entityName for Materials Attribute>
within the current generated script underneath # Script generated for node Materials Grasp Attributes
. Make sure you exchange with the suitable values.
- Comment out the existing script generated for the node Material Master Attributes by adding a # at the start of each line, and add the following replacement snippet.
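A sketch of the replacement read: it mirrors the generated code but passes the options variable prepared above, which now carries the delta token when a previous run stored one.

```python
# Same read as the commented-out generated code, but delta-token aware.
MaterialMasterAttributes_node = glueContext.create_dynamic_frame.from_options(
    connection_type="SAPOData",
    connection_options=connection_options,
    transformation_ctx="MaterialMasterAttributes_node",
)
```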
- To extract the delta token from the dynamic frame and persist it in the job tags, add the following code snippet just above the last line in your script (before job.commit()).
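A sketch of that step: as described in the field list earlier, the token travels in the delta_token field of the final record, so it is read from the frame and handed to the (hypothetical) state manager, which writes it to the job's tags.

```python
# The last record of a delta-enabled extraction carries the ODQ token in
# delta_token; persist it in the job tags so the next run is a delta load.
token_rows = (
    MaterialMasterAttributes_node.toDF()
    .filter("delta_token is not null and delta_token != ''")
    .select("delta_token")
    .collect()
)
if token_rows:
    state_manager.save_token(key, token_rows[0]["delta_token"])  # hypothetical call
```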
This is what your final script should look like:
- Choose Save to save your changes.
- Choose Run to run your job. Note that there are currently no tags in your job details.
- Wait for your job run to complete successfully. You can see the status on the Runs tab.
- After your job run is complete, you will notice on the Job Details tab that a tag has been added. The next job run will read this token and run a delta load.
Query your SAP source data
The AWS Glue job run has created an entry in the Data Catalog, enabling you to query the data immediately.
- Go to the Amazon Athena console.
- Choose Launch Query Editor.
- Make sure you have an appropriate workgroup assigned, or create a workgroup if required.
- Select the sapgluedatabase and run a query (such as the following) to start analyzing your data.
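For example, the following query counts materials per material group; the column names assume the joined schema built above, so adjust them to how they appear in your table:

```sql
-- Number of materials in each material group (matnr/matkl2 are the assumed
-- column names for material and material group in the joined output).
SELECT matkl2 AS material_group,
       COUNT(DISTINCT matnr) AS material_count
FROM sapgluedatabase.materialmaster
GROUP BY matkl2
ORDER BY material_count DESC;
```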
Clean up
To avoid incurring charges, clean up the resources used in this post from your AWS account, including the AWS Glue jobs, SAP OData connection, Glue Data Catalog entry, Secrets Manager secret, IAM role, the contents of the S3 bucket, and the S3 bucket itself.
Conclusion
In this post, we showed you how to create a serverless incremental data load process for multiple SAP data sources. The approach used AWS Glue to incrementally load the data from an SAP source using SAP ODP delta tokens, and then loaded the data into Amazon S3.
The serverless nature of AWS Glue means that there is no infrastructure to manage, and you pay only for the resources consumed while your jobs are running (plus storage costs for outputs). As organizations increasingly become more data driven, this SAP connector can provide an efficient, cost-effective, performant, and secure way to include SAP source data in your big data and analytics outcomes. For more information, see AWS Glue.
About the authors
Allison Quinn is a Sr. ANZ Analytics Specialist Solutions Architect for Data and AI based in Melbourne, Australia, working closely with Financial Services customers in the region. Allison worked over 15 years with SAP products before concentrating her analytics technical specialty on AWS native services. She's very passionate about all things data, and about democratizing it so that customers of all kinds can drive business benefit.

Pavol is an Innovation Solution Architect at AWS, specializing in SAP cloud adoption across EMEA. With over 20 years of experience, he helps global customers migrate and optimize SAP systems on AWS. Pavol develops tailored strategies to transition SAP environments to the cloud, leveraging AWS's agility, resiliency, and performance. He assists clients in modernizing their SAP landscapes using AWS's AI/ML, data analytics, and application services to enhance intelligence, automation, and performance.

Partha Pratim Sanyal is a Software Development Engineer with AWS Glue in Vancouver, Canada, specializing in data integration, analytics, and connectivity. With extensive backend development expertise, he is dedicated to crafting impactful, customer-centric solutions. His work focuses on building features that empower users to effortlessly analyze and understand their data. Partha's commitment to addressing complex user needs drives him to create intuitive and value-driven experiences that elevate data accessibility and insights for customers.

Diego is an experienced Enterprise Solutions Architect with over 20 years' experience across SAP technologies, specializing in SAP innovation and data and analytics. He has worked both as a partner and as a customer, giving him a complete perspective on what it takes to sell, implement, and run systems and organizations. He is passionate about technology and innovation, focusing on customer outcomes and delivering business value.

Luis Alberto Herrera Gomez is a Software Development Engineer with AWS Glue in Vancouver, specializing in backend engineering, microservices, and cloud computing. With 7-8 years of experience, including roles as a backend and full-stack developer for several startups before joining Amazon and AWS, Luis focuses on developing scalable and efficient cloud-based applications. His expertise in AWS technologies enables him to design high-performance systems that handle complex data processing tasks. Luis is passionate about leveraging cloud computing to solve challenging business problems.