This is a remote position.
We are seeking a Data Engineer (Snowflake, BigQuery, Redshift) to join our team. In this role, you will be responsible for developing and maintaining fault-tolerant pipelines spanning multiple database systems.
Responsibilities:
- Collaborate with engineering teams to create REST API-based pipelines for large-scale MarTech systems, optimizing for performance and reliability (see the sketch after this list).
- Develop comprehensive data quality testing procedures to ensure the integrity and accuracy of data across all pipelines.
- Build scalable dbt models and configuration files, leveraging best practices for efficient data transformation and analysis.
- Partner with lead data engineers in designing scalable data models.
- Conduct thorough debugging and root-cause analysis for complex data pipeline issues, implementing effective solutions and optimizations.
- Adhere to the group's standards, such as SLAs, code styles, and deployment processes.
- Anticipate breaking changes and implement backwards-compatibility strategies for API schema changes.
- Assist the team in monitoring pipeline health via observability tools and metrics.
- Participate in refactoring efforts as platform and application needs evolve over time.
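For context, here is a minimal sketch of the kind of REST API ingestion step these pipelines involve. The endpoint URL, field names, connection string, and target table are hypothetical examples; a production pipeline would add pagination, retries, and schema validation.

```python
# Minimal sketch of a REST API ingestion step (all names are hypothetical).
import requests
import pandas as pd
from sqlalchemy import create_engine

API_URL = "https://api.example.com/v1/campaign_events"  # hypothetical MarTech endpoint

# Pull a batch of events; fail fast so bad pulls surface in monitoring.
resp = requests.get(API_URL, params={"since": "2024-01-01"}, timeout=30)
resp.raise_for_status()

events = pd.DataFrame(resp.json()["events"])

# Land the raw pull in the warehouse; downstream dbt models handle transformation.
engine = create_engine("postgresql+psycopg2://user:pass@host:5432/analytics")
events.to_sql("raw_campaign_events", engine, if_exists="append", index=False)
```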
Requirements:
- Bachelor's degree or higher in Computer Science, Engineering, Mathematics, or a related field.
- 3+ years of professional experience with a cloud database such as Snowflake, BigQuery, or Redshift.
- 1+ years of professional experience with dbt (Cloud or Core).
- Exposure to various data processing technologies, such as OLAP and OLTP, and their applications in real-world scenarios.
- Experience working cross-functionally with other teams, such as Product, Customer Success, and Platform Engineering.
- Familiarity with orchestration tools such as Dagster/Airflow.
- Familiarity with ETL/ELT tools such as dltHub/Meltano/Airbyte/Fivetran and dbt.
- High-intermediate to advanced SQL skills (comfort with CTEs and window functions; illustrated below).
- Proficiency with Python and related libraries (e.g., pandas, sqlalchemy, psycopg2) for data manipulation, analysis, and automation.
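To illustrate the SQL and Python level described above, here is a minimal sketch combining a CTE and a window function with a pandas/SQLAlchemy read. The table, column names, and connection string are hypothetical.

```python
# Minimal sketch: windowed running totals pulled into pandas (hypothetical schema).
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:pass@host:5432/analytics")

# A CTE plus a window function -- the kind of SQL this role assumes.
QUERY = """
WITH daily_orders AS (
    SELECT customer_id,
           order_date,
           SUM(amount) AS daily_total
    FROM orders
    GROUP BY customer_id, order_date
)
SELECT customer_id,
       order_date,
       daily_total,
       SUM(daily_total) OVER (
           PARTITION BY customer_id
           ORDER BY order_date
       ) AS running_total
FROM daily_orders
"""

df = pd.read_sql(QUERY, engine)
print(df.head())
```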
Benefits:
- Work Location: Remote
- 5-day work week