Job Title: Data Engineer with DBT Expertise (US Citizens Only)
Location: Remote
We are currently seeking candidates who meet the following qualifications:
Responsibilities:
- Design, develop, and optimize scalable data pipelines using DBT to transform and model large datasets.
- Collaborate with data analysts and stakeholders to understand business requirements and translate them into data models and ETL processes.
- Build and maintain DBT models, ensuring data integrity, accuracy, and timeliness.
- Work closely with data infrastructure teams to ensure the data pipeline architecture supports analytics and reporting needs.
- Implement data transformation workflows, including data validation, testing, and documentation.
- Optimize DBT models for performance and scalability in a cloud-based data warehouse environment (e.g., Snowflake, BigQuery, Redshift).
- Monitor and maintain the health of data pipelines, ensuring data is reliable and easily accessible.
- Troubleshoot and resolve data-related issues, and collaborate with cross-functional teams to continuously improve processes.
Qualifications:
- Proven experience as a Data Engineer, ETL Developer, or in a similar role, with hands-on experience working with DBT.
- Strong proficiency in SQL, with a deep understanding of database management and optimization.
- Experience with cloud data platforms (e.g., Snowflake, Google BigQuery, AWS Redshift).
- Knowledge of data pipeline orchestration tools (e.g., Airflow, Prefect, Dagster) is a plus.
- Experience working with version control systems (e.g., Git).
- Experience with data warehousing concepts and analytical databases.
- Strong problem-solving skills and attention to detail.
- Ability to work collaboratively in an agile, fast-paced environment.
- Excellent communication skills to work with both technical and non-technical teams.
- Federal experience is a plus.
- Security clearance required.
If you meet these qualifications, please submit your application via the link provided on LinkedIn.
Please do not call the general line to submit your application.