Lead Azure Data Engineer

Job Location: Alexander City - USA

Salary: Not Disclosed

Job Description

Lead Azure Data Engineer
Location: Seattle, WA / Remote
Job Type: Long Term


Interview Mode: Hands-on coding round covering SQL, Python, and PySpark (a minimal illustrative sketch follows below)

Key Skills: SQL, Python, PySpark, Databricks, Synapse Analytics, ADF/ADLS, Data Warehousing, Data Modelling, Architecture Design.

12 years of experience is a must.
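
To give a flavor of the hands-on round, here is a minimal PySpark sketch of the kind of SQL-style aggregation such an exercise might involve. It is purely illustrative; the data, table, and column names below are invented for the example, not drawn from this posting.

    # Minimal PySpark sketch of a typical SQL-style aggregation exercise.
    # All data and column names are invented for illustration.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("interview-sketch").getOrCreate()

    orders = spark.createDataFrame(
        [(1, "seattle", 12.50), (2, "seattle", 7.25), (3, "tacoma", 3.00)],
        ["order_id", "store_city", "amount"],
    )

    # Equivalent SQL: SELECT store_city, COUNT(*) AS orders, SUM(amount) AS revenue
    #                 FROM orders GROUP BY store_city ORDER BY revenue DESC
    summary = (
        orders.groupBy("store_city")
        .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
        .orderBy(F.desc("revenue"))
    )
    summary.show()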

Job Description:

Lead large-scale, complex, cross-functional projects and build the technical roadmap for the WFM Data Services platform.
Lead and review design artifacts
Build and own the automation and monitoring frameworks that present reliable, accurate, easy-to-understand data pipeline quality metrics and operational KPIs to stakeholders (see the sketch after this list)
Execute proofs of concept on new technologies and tools to select the best tools and solutions
Support business objectives by collaborating with business partners to identify opportunities and drive resolution
Communicate status and issues to senior Starbucks leadership and stakeholders
Direct the project team and cross-functional teams on all technical aspects of the projects
Lead the engineering team to build and support real-time, highly available data pipelines and technology capabilities
Translate strategic requirements into business requirements to ensure solutions meet business needs
Define & implement data retention policies and procedures
Define & implement data governance policies and procedures
Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability
Enable the team to pursue insights and applied breakthroughs while also driving solutions to Starbucks scale
Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of structured and unstructured data sources using big data technologies.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
Perform root cause analysis to identify permanent resolutions to software or business process issues
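
As a concrete reference for the pipeline quality monitoring responsibility above, here is a minimal sketch, assuming PySpark on a Databricks/Synapse-style platform, of how a few quality metrics and pass/fail KPIs might be computed. The columns, data, and thresholds are hypothetical assumptions, not details from the posting.

    # Hypothetical sketch: compute basic data quality metrics (row count,
    # null rate, freshness) and emit pass/fail KPIs. In practice the input
    # would come from spark.table(...) on a managed table; a tiny inline
    # frame keeps the example self-contained.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("dq-monitoring-sketch").getOrCreate()

    events = spark.createDataFrame(
        [(101, "2024-05-01 08:00:00"), (None, "2024-05-01 09:15:00")],
        ["employee_id", "event_ts"],
    )

    metrics = events.agg(
        F.count("*").alias("row_count"),
        F.avg(F.col("employee_id").isNull().cast("int")).alias("null_rate"),
        F.max("event_ts").alias("latest_event_ts"),
    ).first()

    # Thresholds here are arbitrary illustrations of operational KPIs that
    # a dashboard or alerting job could consume.
    checks = {
        "row_count_nonzero": metrics["row_count"] > 0,
        "employee_id_null_rate_ok": metrics["null_rate"] < 0.01,
    }
    for name, passed in checks.items():
        print(f"{name}: {'PASS' if passed else 'FAIL'}")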

Basic Qualifications
10 years of experience with object-oriented/object-function scripting languages: Python, Java, etc.
8 years leading development of large-scale cloud-based services on platforms such as AWS, GCP, or Azure, and developing and operating cloud-based distributed systems.
Experience building and optimizing data pipelines, architectures, and data sets.

Knowledge of Incorta ETL pipelines
Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management
Strong computer science fundamentals in data structures, algorithm design, problem solving, and complexity
Working knowledge of message queuing, stream processing, and highly scalable big data stores (see the streaming sketch after this list).
Software development experience with big data technologies: Databricks, Hadoop, Hive, Spark (PySpark)
Familiarity with distributed systems and computing at scale.
Advanced working experience with SQL and NoSQL databases is required.
Proficiency in data processing using technologies like Spark Streaming and Spark SQL
Expertise in developing big data pipelines using technologies like Kafka and Storm
Experience with large-scale data warehousing, mining, or analytics systems.
Ability to work with analysts to gather requirements and translate them into data engineering tasks
Aptitude to independently learn new technologies.
Experience automating deployments with continuous integration and continuous delivery systems
Experience with DevOps automation using Terraform or similar products is preferred.
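
For the message queuing and stream processing items above, here is a minimal Spark Structured Streaming sketch that consumes a Kafka topic and maintains a running aggregate. The broker address, topic name, and event schema are assumptions for illustration; running it also requires the spark-sql-kafka connector package.

    # Hypothetical sketch: consume a Kafka topic with Spark Structured
    # Streaming and maintain per-store revenue. Broker, topic, and schema
    # are illustrative assumptions, not details from the posting.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.sql.types import StructType, StructField, StringType, DoubleType

    spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

    schema = StructType([
        StructField("store_id", StringType()),
        StructField("amount", DoubleType()),
    ])

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "localhost:9092")  # hypothetical broker
        .option("subscribe", "orders")                        # hypothetical topic
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Continuously updated aggregate; the console sink is for demonstration.
    revenue = events.groupBy("store_id").agg(F.sum("amount").alias("revenue"))

    query = revenue.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()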

Preferred Qualifications
Ability to apply knowledge of multidisciplinary business principles and practices to achieve successful outcomes in crossfunctional projects and activities
Effective communication skills
Excel at problem solving
Proficiency in debugging, troubleshooting, performance tuning, and relevant tooling
Proven ability to manage and deploy big data implementations
Experience building cloud-native enterprise software
Solid understanding of data design patterns and best practices
Proficiency in logging and monitoring tools, patterns, and implementations
Understanding of enterprise security, REST/SOAP services, and best practices around enterprise deployments
Proven ability and desire to mentor others in a team environment
Working knowledge of data visualization tools such as Tableau is a plus
Bachelor's degree in computer science, management information systems, or a related discipline

Cloud BC Labs Inc is a digital transformation organization aimed at creating seamless solutions for clients to effectively manage their business operations. The company specializes in Business and Management Consulting, AI/ML, Data Analytics & Visualization, Cloud Data Warehouse Migration, Snowflake Implementation, Informatica Implementation & Upgrade, Staffing Services, and Data Management Solutions.

Employment Type: Full Time
