Must haves:
- Data Lake Storage
- Key Vault implementation
- Data storage
- Logic Apps / Data Factory
- PySpark
- Python
Technical Skills:
- Python: good hands-on experience with Python coding and array handling (must have)
- SQL: expert-level build experience (must have)
- Azure / Databricks / cloud certifications
- Azure Synapse, Azure Logic Apps
- Data modelling, data structures & common algorithms; understanding of the end-to-end data flow from ingestion to exploitation (must have)
- Azure Databricks / Azure Data Lake Storage (must have)
- Azure Data Factory: orchestration of pipelines (must have)
- Azure data engineering platform: must understand how the platform works (must have)
- Experience with SAP PIM (the work is roughly 90% SAP): nice to have, can be acquired on the job
- Python build/scripting experience
- API ingestion: not needed for the PIM team, but required for the Ingestion team
- At least 4 years' experience in a data engineering, software engineering, or data analyst/data science role, with the majority of that experience in developing business data products/business data models
- Very strong SQL and Python skills for data analytics
- Experience with SAP data is a plus
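To illustrate the "Python coding and array handling" requirement above, a minimal sketch of the kind of task involved: flattening nested records into flat rows before loading them into a data model. All record shapes and field names here are hypothetical, not taken from the role description.

```python
# Hypothetical ingestion-style array handling: expand each invoice's
# line-item array into one flat row per item, computing a derived amount.

def flatten_invoices(invoices):
    """Expand each invoice's line-item list into one flat row per item."""
    rows = []
    for inv in invoices:
        for item in inv.get("items", []):
            rows.append({
                "invoice_id": inv["id"],
                "vendor": inv["vendor"],
                "sku": item["sku"],
                "amount": item["qty"] * item["unit_price"],
            })
    return rows

# Illustrative input: one invoice with two line items.
invoices = [
    {"id": "INV-1", "vendor": "ACME",
     "items": [{"sku": "A", "qty": 2, "unit_price": 5.0},
               {"sku": "B", "qty": 1, "unit_price": 3.5}]},
]

rows = flatten_invoices(invoices)
# rows now holds two flat dicts, one per line item.
```

In a pipeline context the same reshaping would typically be expressed with PySpark's `explode` over an array column rather than a Python loop.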
Roles and Responsibilities:
Information Model object builds/enhancements (PO, Invoice, SES, RO, Vendor, etc.) based on functional design
Technical/Functional validation of IM objects
Troubleshooting issues with code and implementing fixes
Support daily data loads from SAP/non-SAP systems into the Information Model