This role is for one of Weekdays' clients.
We are seeking a versatile and experienced Data Engineering Generalist to join our team. The ideal candidate will have a strong foundation in data engineering and backend development. You will be responsible for designing, building, and maintaining robust data pipelines, as well as collaborating with backend engineering teams to optimize data-driven applications. If you have a passion for solving complex data challenges and integrating backend solutions, this role is for you!
Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL processes to support data integration and analysis.
- Collaborate with backend teams to ensure seamless data flow between applications and databases.
- Build and maintain scalable data architectures that handle large volumes of data efficiently.
- Implement and maintain data models, ensuring data integrity and performance across systems.
- Optimize database performance and ensure high availability for real-time data processing.
- Work closely with data scientists, analysts, and backend developers to deliver data-driven features and insights.
- Troubleshoot and resolve data-related issues, ensuring the stability of data systems and pipelines.
- Develop and implement data governance practices to ensure data quality and compliance.
- Stay updated with industry trends and best practices in data engineering and backend development.
Required Skills and Qualifications:
- Education: Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Technical Skills:
  - Strong proficiency in SQL and NoSQL databases (e.g., PostgreSQL, MongoDB).
  - Hands-on experience with data pipeline tools (e.g., Apache Kafka, Apache Airflow, Spark).
  - Proficiency in programming languages such as Python, Java, or Scala.
  - Experience with cloud platforms (AWS, Azure, or GCP) for data storage, processing, and deployment.
  - Familiarity with backend frameworks (e.g., Node.js, Django) and RESTful API design.
  - Experience with containerization technologies (Docker, Kubernetes) is a plus.
- Soft Skills: Strong problem-solving skills, attention to detail, and the ability to collaborate effectively in a team environment.
Preferred Qualifications:
- Experience working in an Agile environment.
- Knowledge of data warehousing concepts and tools (e.g., Snowflake, Redshift).
- Familiarity with data governance frameworks and data security best practices.
Skills: Apache Kafka, backend development, Node.js, AWS, Scala, Python, Java, Kubernetes, design, RESTful API design, Docker, Spark, GCP, Apache Airflow, NoSQL databases, data engineering, Django, Azure, SQL