Role: AWS Data Engineer
Experience: Minimum 4 years
Location: Remote
Summary:
We are seeking a skilled AWS Data Engineer to join our dynamic team. As an AWS Data Engineer, you will be responsible for designing, building, and maintaining data pipelines and systems on the AWS cloud. You will work closely with data scientists, data analysts, and other engineering teams to ensure that our data infrastructure is optimized for scalability, reliability, and performance. This role requires expertise in AWS cloud services and experience with large-scale data processing frameworks.
Preferred Skills and Qualifications:
Proficiency in AWS cloud services and infrastructure.
Strong experience with Databricks and PySpark.
Extensive knowledge of data gathering, ingestion, manipulation, orchestration, and optimization.
Proven track record of leading multi-shore teams on sizable cloud and data projects.
Excellent communication, presentation, and client-interaction skills.
Ability to work effectively in a fast-paced, dynamic environment with multiple stakeholders.
Key Responsibilities:
Design, develop, and maintain scalable data pipelines and ETL processes using AWS services, Databricks, and PySpark.
Integrate and manage data from multiple sources, ensuring data quality and integrity.
Optimize data workflows for performance and cost efficiency on the AWS cloud.
Work with AWS services such as Redshift, Aurora, Glue, Lambda, and other AWS data-related technologies.
Implement and manage data warehousing solutions and cloud data storage.
Develop RESTful APIs to support data access and manipulation.
Utilize ETL tools and technologies for data extraction, transformation, and loading.
Leverage Spark and Python for data processing and analysis.
Participate in Agile ceremonies and contribute to sprint planning, retrospectives, and daily stand-ups.
Collaborate with cross-functional teams, including data scientists, analysts, and stakeholders, to deliver high-quality data solutions.
Lead multi-shore teams on large-scale cloud and data initiatives.
Provide technical guidance and mentorship to junior team members.
Communicate effectively with clients and stakeholders to understand requirements and present solutions.