Sr AWS Data Engineer - Airflow ETL AWS Pyspark Databricks
Job Location

India

Monthly Salary

Not Disclosed

Job Description

Who We Are

Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to customers.

Job Title: Sr. AWS Data Engineer - Airflow ETL (AWS, PySpark, Databricks)

Job Type: Full-Time / Contract

Experience: 3-20 years

Location: Remote

Responsibilities:

  • Experience in data engineering with a strong focus on ETL processes and data pipeline development.
  • Proficiency in using Airflow for orchestrating complex data workflows.
  • Extensive experience with AWS services (e.g., S3, Lambda, Glue, Redshift).
  • Strong programming skills in PySpark with a deep understanding of distributed data processing.
  • Hands-on experience with Databricks for data processing and analytics.
  • Experience with additional big data technologies such as Kafka, Hadoop, or Snowflake.
  • Familiarity with data visualization tools such as Tableau, Power BI, or similar.
  • Experience with CI/CD tools and practices for data pipelines.
  • Certifications in AWS or Databricks are a plus.
  • Familiarity with SQL and relational databases for data extraction and manipulation.
  • Proven experience with data modeling, data warehousing, and building scalable data solutions.
  • Knowledge of best practices in data management, including data governance and data security.
  • Strong problem-solving skills and the ability to troubleshoot complex data issues.
  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment.
  • Optimize data pipelines for performance, scalability, and reliability.
  • Implement best practices for data governance, data quality, and data security.
  • Monitor and troubleshoot ETL processes to ensure data accuracy and integrity.
  • Work in an agile environment, actively participating in sprint planning, daily standups, and retrospectives.
  • Document data workflows, processes, and technical specifications to ensure clear communication and knowledge sharing within the team.
Qualifications:
  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Employment Type

Remote

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala
