DBT Engineer (8 Years Experience)
Overview
The DBT Engineer plays a critical role in our organization's data transformation and analytics pipeline. As the demand for data-driven decision-making grows, this role is essential to maintaining data integrity, quality, and accessibility. The DBT Engineer works closely with data analysts, data scientists, and other stakeholders to translate business requirements into efficient data models. The position requires a balance of technical expertise and creative problem-solving, enabling the organization to leverage data effectively.
Key Responsibilities
- Design and develop data transformation workflows using DBT.
- Optimize ETL processes for performance and efficiency.
- Create and maintain data models that ensure data integrity and quality.
- Collaborate with data analysts to understand business needs and translate them into functional data solutions.
- Implement version control best practices for DBT models and workflows.
- Work with data warehouses and cloud technologies to manage data storage solutions.
- Conduct thorough testing of models to ensure accuracy and reliability.
- Provide documentation for data models, transformation processes, and best practices.
- Monitor and troubleshoot issues arising from data pipelines and data quality.
- Engage in continuous improvement by identifying areas for optimization in existing workflows.
- Participate in code reviews and provide constructive feedback to peers.
- Stay current with industry trends, tools, and technologies relevant to data engineering.
- Participate in Agile ceremonies and contribute to team objectives.
- Assist in training junior engineers and interns in DBT practices.
- Foster a collaborative environment to enhance team productivity and efficiency.
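To illustrate the modeling and testing responsibilities above, a typical dbt deliverable pairs a SQL model with declarative tests. The sketch below is hypothetical (the `raw.orders` source and `stg_orders` model names are invented for illustration); it shows the `{{ source() }}` macro dbt uses to build its dependency graph:

```sql
-- models/staging/stg_orders.sql (hypothetical example)
-- Standardizes raw order data into a clean staging model.
with source as (
    select * from {{ source('raw', 'orders') }}
),

renamed as (
    select
        id                        as order_id,
        customer_id,
        cast(order_date as date)  as order_date,
        amount                    as order_amount
    from source
)

select * from renamed
```

Data quality checks for such a model are typically declared alongside it in a `schema.yml`, for example:

```yaml
# models/staging/schema.yml (hypothetical example)
version: 2
models:
  - name: stg_orders
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
```

Running `dbt test` then validates these constraints against the warehouse, which is how the testing and data-quality duties listed above are usually carried out in practice.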
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Minimum of 8 years of experience in data engineering or data analytics roles.
- Strong proficiency in SQL and data modeling techniques.
- Hands-on experience with DBT and ETL processes.
- Proficient in cloud technologies such as AWS, GCP, or Azure.
- Experience with data warehousing solutions (e.g., Snowflake, BigQuery, Redshift).
- Familiarity with version control systems such as Git.
- Experience with Agile methodologies in project management.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Experience with data visualization tools is a plus.
- Ability to work independently and as part of a team.
- Experience in performance tuning of SQL queries and ETL processes.
- Strong understanding of data governance and compliance standards.
- Previous experience in a fast-paced environment is preferred.
- Certifications in data engineering or cloud technologies are a plus.
Key Skills
DBT, SQL, data modeling, ETL, data warehousing (Snowflake, BigQuery, Redshift), cloud technologies (AWS, GCP, Azure), version control (Git), data governance, data visualization, Airflow, Agile methodologies