Senior Data Engineer
Job Location

Malvern, PA - USA

Monthly Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Senior Data Engineer (7 Years of Experience)

  • We are seeking a highly experienced Senior Data Engineer with 7 years of expertise in designing, building, and optimizing robust data solutions. The ideal candidate must possess top-tier skills in Python, AWS services, API development, and TypeScript, and have significant hands-on experience with anomaly detection systems.
  • The candidate should have a proven ability to work at both strategic and tactical levels, from designing data architectures to hands-on implementation.

Required Technical Skills: Python, SQL, TypeScript, AWS web services, Swagger/OpenAPI, REST API, LLM/AI, GraphQL

Core Programming Skills:

  • Expert proficiency in Python with experience in building data pipelines and backend systems.
  • Solid experience with TypeScript for developing scalable applications.
  • Advanced knowledge of SQL for querying and optimizing large datasets.

AWS Cloud Services Expertise:

  • DynamoDB, S3, Athena, Glue ETL, Lambda, ECS, Glue Data Quality, EventBridge, Redshift, Machine Learning, OpenSearch, and RDS.

API and Resilience Engineering:

  • Proven expertise in designing fault-tolerant APIs using Swagger/OpenAPI, GraphQL, and RESTful standards.
  • Strong understanding of distributed systems, load balancing, and failover strategies.

Monitoring and Orchestration:

  • Hands-on experience with Prometheus and Grafana for observability and monitoring.

Key Responsibilities:

Data Pipeline Development

  • Independently design, build, and maintain complex ETL pipelines, ensuring scalability and efficiency for large-scale data processing needs.
  • Manage pipeline complexity and orchestration, delivering high-performance data products accessible via APIs for business-critical applications.
  • Archive processed data products into data lakes (e.g., AWS S3) for analytics and machine learning use cases.

Anomaly Detection and Data Quality

  • Implement advanced anomaly detection systems and data validation techniques, ensuring data integrity and quality.
  • Leverage AI/ML methodologies including Large Language Models (LLMs) to detect and address data inconsistencies.
  • Develop and automate robust data quality and validation frameworks.

Cloud and API Engineering

  • Architect and manage resilient APIs using modern patterns, including microservices, RESTful design, and GraphQL.
  • Configure API gateways, circuit breakers, and fault-tolerant mechanisms for distributed systems.
  • Ensure horizontal and vertical scaling strategies for API-driven data products.

Monitoring and Observability

  • Implement comprehensive monitoring and observability solutions using Prometheus and Grafana to optimize system reliability.
  • Establish proactive alerting systems and ensure real-time system health visibility.

Cross-functional Collaboration and Innovation

  • Collaborate with stakeholders to understand business needs and translate them into scalable, data-driven solutions.
  • Continuously research and integrate emerging technologies to enhance data engineering practices.

Required Skills: Python

Background Check: No

Drug Screen: No

Employment Type

Full Time
