Maintain and develop the data pipelines required for the extraction, transformation, cleaning, preprocessing, aggregation, and loading of data from a wide variety of data sources using Python, SQL, dbt, and other data technologies
Design, implement, test, and maintain data pipelines and new features based on stakeholders' requirements
Develop and maintain scalable, available, quality-assured analytical building blocks and datasets in close coordination with data analysts
Optimize and maintain workflows and scripts on existing data warehouses and ETL processes
Design, develop, and maintain components of data processing frameworks
Build and maintain data quality and durability tracking mechanisms to provide visibility into, and address, inevitable changes in data ingestion, processing, and storage
Translate technical designs into business-appropriate representations, and analyze business needs and requirements, ensuring that the implementation of data services directly aligns with the strategy and growth of the business
Focus on automation use cases, CI/CD approaches, and self-service modules relevant to data domains
Address questions from downstream data consumers through appropriate channels
Create data tools for the analytics and BI teams that help them build and optimize our product into an innovative industry leader
Stay up to date with data engineering best practices and patterns; evaluate and analyze new technologies, capabilities, and open-source software in the context of our data strategy to ensure we adapt our core technologies to stay ahead of the industry
Contribute to the analytics engineering process
Minimum Qualifications
Bachelor's degree in computer science, information systems, or a related discipline
5 years of experience in a Data Engineer role
Experience building processes that support data transformation, data structures, metadata, dependency, data quality, and workload management
Experience with Snowflake, including hands-on experience with Snowflake utilities such as SnowSQL and Snowpipe; must have worked on Snowflake cost-optimization scenarios
Solid overall programming skills and the ability to write modular, maintainable code, preferably in Python and SQL
Experience with workflow management solutions such as Airflow
Experience with data transformation tools such as dbt
Experience working with Git
Experience working with big data environments such as Hive, Spark, and Presto
Ready to work flexible hours
Preferred Qualifications
Snowflake
dbt
Fivetran
Airflow
CI/CD (Jenkins)
Basic understanding of Power BI
AWS environment, for example S3, Lambda, Glue, and CloudWatch
Basic understanding of Salesforce
Experience working with remote teams spread across multiple time zones
A hunger to learn and the ability to operate in a self-guided manner