Roles & Responsibilities
- We are looking for a Data Engineer to join our Campaign Analytics team. The Campaign Analytics suite of products utilizes survey-based datasets to measure, analyze, and report on advertisement effectiveness.
- As a Data Engineer, you will work alongside data scientists and engineers to build a data platform on AWS that ingests data from external sources and performs ETL tasks. The role requires experience working with large datasets with complex schemas, a can-do approach towards automation, and an emphasis on implementing best-practice cloud security principles.
- Our culture is inclusive, and we have a healthy work-life balance.
- If you are passionate about problem solving, enjoy continuous learning, and like building new things, we would love to hear from you.
Responsibilities
- Collaborate with product owners to understand requirements and design new components.
- Collaborate in cross-functional teams to implement, test, and deploy features.
- Perform code reviews.
- Build and maintain development environments and CI/CD workflows.
Preferred Qualifications
- BS or MS in Computer Science.
- 3+ years of experience with mainstream programming languages (Python, C, C++, Java, etc.).
- Strong knowledge of programming concepts and paradigms.
- Basic understanding of data lakes and data warehousing.
- Experience with SQL and relational database systems.
- Experience writing automated tests.
- Excellent communication skills.
- Experience working in cloud environments.
- Experience working in an agile environment.
- Experience with popular AWS services (S3, RDS, EC2, etc.).
- Experience with popular developer tools (Git, Docker, GitHub/GitLab).
- Familiarity with orchestration tools such as Apache Airflow.
- Familiarity with big data processing tools such as Spark, EMR, and Presto.