AWS Cloud Data API Engineer (PySpark / Databricks) - Entity and Transaction APIs

Job Location

India

Monthly Salary

Not Disclosed

Job Description

Who We Are

Artmac Soft is a technology consulting and service-oriented IT company dedicated to providing innovative technology solutions and services to its customers.

Job Description

Job Title: AWS Cloud Data API Engineer (PySpark / Databricks - Entity and Transaction APIs)

Job Type: Full-Time / Contract

Experience: 3-20 years

Location: Remote

Responsibilities:

  • Experience in data engineering with a focus on AWS cloud technologies.
  • Proven experience in developing APIs using PySpark and Databricks.
  • Strong experience with AWS services such as Lambda, S3, EMR, Glue, and Redshift.
  • Proficiency in building and optimizing ETL pipelines.
  • Expertise in PySpark, Databricks, and AWS cloud services.
  • Strong knowledge of data structures, algorithms, and software design principles.
  • Proficiency in SQL, Python, and API development.
  • Experience with version control systems (e.g., Git) and CI/CD pipelines.
  • Familiarity with monitoring tools such as CloudWatch, Datadog, or similar.
  • Experience with data governance and data security best practices.
  • Familiarity with machine learning workflows and the integration of ML models into data pipelines.
  • Experience with real-time data processing and streaming technologies.
  • Ability to work in an Agile/Scrum environment.
  • Build and maintain Entity and Transaction APIs using PySpark and Databricks on AWS (a brief illustrative sketch follows this list).
  • Work closely with data architects, data scientists, and other engineers to integrate APIs into broader data solutions.
  • Set up monitoring, logging, and alerting for data pipelines and APIs; troubleshoot and resolve issues related to data integration and API performance.
  • Manage and maintain AWS services such as S3, Lambda, Glue, EMR, and Redshift as part of the data processing and API ecosystem.
  • Create and maintain comprehensive documentation for APIs, data pipelines, and cloud infrastructure.
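
As a rough illustration of the kind of work described above, the following is a minimal sketch of a PySpark job that might sit behind a Transaction API on Databricks: it reads raw transaction records from S3, cleans them, and writes a curated Delta table. The bucket names, paths, and column names are hypothetical, and a Databricks runtime (or a Spark session with Delta Lake configured) is assumed; this is not an Artmac Soft implementation.

```python
# Illustrative sketch only - bucket, path, and column names are hypothetical.
# Assumes a Databricks runtime or a Spark session with Delta Lake available.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("transaction-curation").getOrCreate()

# Read raw JSON transaction records landed in S3.
raw = spark.read.json("s3://example-raw-bucket/transactions/")

# Basic cleanup: drop rows missing key fields and normalize types.
curated = (
    raw.dropna(subset=["transaction_id", "amount"])
       .withColumn("transaction_ts", F.to_timestamp("transaction_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
)

# Persist as a Delta table that a downstream Transaction API can serve from.
(curated.write
    .format("delta")
    .mode("overwrite")
    .save("s3://example-curated-bucket/transactions/"))
```
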
Qualification:
  • Bachelor's degree in Computer Science, Information Technology, or a related field.

Employment Type

Remote

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala
