We are seeking a highly skilled Senior Data Engineer to join our dynamic team. This role is ideal for a passionate engineer who can hit the ground running, leveraging expertise in cloud technologies, data pipelines, automation, and APIs to drive impactful data solutions.
Requirements
Required Skills & Experience:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer, with a strong software engineering background.
- Expertise in building and optimizing data pipelines for large-scale systems.
- Strong experience with cloud platforms (AWS, Azure, GCP).
- Hands-on experience with automation, scheduling tools, and orchestration frameworks (Airflow, Prefect, etc.).
- Proficiency in SQL and at least one programming language (Python, Scala, Java).
- Experience working with APIs and integrating various data sources.
- Knowledge of Salesforce data structures (ideal).
- Familiarity with Data Vault modeling and principles (ideal).
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and ETL processes.
- Automate data workflows, ensuring seamless scheduling and orchestration.
- Optimize and maintain cloud-based data platforms for performance and reliability.
- Integrate APIs to streamline data exchange and improve system connectivity.
- Work with Salesforce data (ideal) to enhance analytics capabilities.
- Implement Data Vault methodologies (ideal) for scalable and adaptable data architecture.
- Collaborate with data scientists, analysts, and business stakeholders to support analytical and reporting needs.
- Ensure adherence to data quality, governance, and security best practices.
Education
Bachelor's degree in Computer Science, Engineering, or a related field.