
Data Engineer - Looker / GCP

Job Location

Banga - India

Monthly Salary

Not Disclosed


Vacancy

1 Vacancy

Job Description

Mandatory Skills: Advanced SQL, Snowflake, Looker, GCP, dbt, Airflow, AWS, Data Visualization

As part of the team, you will be responsible for building and running the data pipelines and services required to support business functions, reports and dashboards. We are heavily dependent on BigQuery/Snowflake, Airflow, Stitch/Fivetran, dbt and Tableau/Looker for our business intelligence, and we embrace AWS with some GCP.
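For illustration only, here is a minimal sketch of how a stack like this is commonly orchestrated: an Airflow DAG that loads raw sources and then runs and tests dbt models. The DAG id, schedule, paths and load script are hypothetical placeholders (not the employer's actual pipeline), and the sketch assumes Airflow 2.4+ with the dbt CLI installed.

```python
# Illustrative sketch only: DAG id, schedule, paths and the load script
# are placeholders, not the employer's real pipeline.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt_pipeline",      # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ style schedule argument
    catchup=False,
) as dag:
    # Land raw data first (in practice Stitch/Fivetran often handles this step).
    load_raw = BashOperator(
        task_id="load_raw_sources",
        bash_command="python /opt/pipelines/load_raw_sources.py",  # placeholder script
    )

    # Transform with dbt once the raw layer is loaded, then run dbt tests.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt_project && dbt test --target prod",
    )

    load_raw >> dbt_run >> dbt_test
```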

As a Data Engineer, you'll be:

  • Developing end-to-end ETL/ELT pipelines, working with the Data Analysts of each business function
  • Designing, developing and implementing scalable, automated processes for data extraction, processing and analysis in a Data Mesh architecture
  • Mentoring other junior engineers in the team
  • Being a go-to expert for data technologies and solutions
  • Providing on-the-ground troubleshooting and diagnosis of architecture and design challenges
  • Troubleshooting and resolving technical issues as they arise
  • Looking for ways of improving both what the department delivers and how data pipelines are delivered
  • Translating business requirements into technical requirements, such as entities that need to be modelled, dbt models that need to be built, timings, tests and reports
  • Owning the delivery of data models and reports end to end
  • Performing exploratory data analysis to identify data quality issues early in the process and implementing tests to prevent them in the future (see the sketch after this list)
  • Working with Data Analysts to ensure that all data feeds are optimised and available at the required times; this can include Change Data Capture, Change Data Control and other delta-loading approaches
  • Discovering, transforming, testing, deploying and documenting data sources
  • Applying, helping define and championing data warehouse governance: data quality, testing, coding best practices and peer review
  • Building Looker dashboards for use cases, if required
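As an illustration of the data-quality testing responsibility above, here is a minimal Python sketch that checks a staged dataframe for null keys, duplicate keys and stale loads. The column names and the two-day freshness threshold are invented for the example; in this stack such checks would more typically be expressed as dbt tests.

```python
# Illustrative data-quality checks; column names and thresholds are made up.
import pandas as pd


def run_basic_quality_checks(df: pd.DataFrame, key_column: str) -> list[str]:
    """Return human-readable failures found in a staged dataframe."""
    failures: list[str] = []

    # Primary-key style checks: the key must be present and unique.
    if df[key_column].isna().any():
        failures.append(f"{key_column} contains NULLs")
    if df[key_column].duplicated().any():
        failures.append(f"{key_column} contains duplicate values")

    # Simple freshness check (illustrative two-day threshold).
    if "loaded_at" in df.columns:
        newest = pd.to_datetime(df["loaded_at"]).max()
        if (pd.Timestamp.now() - newest).days > 2:
            failures.append("data is stale: newest loaded_at is over 2 days old")

    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2, 2], "loaded_at": ["2024-01-01"] * 3})
    print(run_basic_quality_checks(sample, key_column="order_id"))
```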

What makes you a great fit:

  • 3 years of extensive development experience using Snowflake or a similar data warehouse technology
  • Working experience with dbt and other technologies of the modern data stack, such as Snowflake, Apache Airflow, Fivetran, AWS, Git and Looker
  • Experience with agile processes such as Scrum
  • Extensive experience in writing advanced SQL statements and performance-tuning them (see the sketch after this list)
  • Experience in data ingestion techniques using custom or SaaS tools like Fivetran
  • Experience in data modelling and the ability to optimise existing and new data models
  • Experience in data mining, data warehouse solutions and ETL, and in using databases in a business environment with large-scale, complex datasets
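As a small, hypothetical illustration of the "advanced SQL and performance tuning" point above: a Snowflake-flavoured query that keeps only the latest event per user. The table and column names are invented; the comments note the tuning choices (filter on the date column so partitions can be pruned, deduplicate with QUALIFY rather than a self-join).

```python
# Hypothetical example of the advanced SQL + tuning skill described above.
# Table and column names are invented for illustration.
import textwrap

LATEST_EVENT_PER_USER = textwrap.dedent(
    """
    -- Filter on the date column first so the warehouse can prune partitions,
    -- then keep only the newest event per user with QUALIFY instead of a
    -- more expensive self-join or correlated subquery.
    SELECT
        user_id,
        event_type,
        event_ts
    FROM analytics.raw_events
    WHERE event_ts >= DATEADD(day, -30, CURRENT_DATE)
    QUALIFY ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY event_ts DESC) = 1
    """
)

if __name__ == "__main__":
    print(LATEST_EVENT_PER_USER)
```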

Qualification: BE/BTech in IT / MCA

Additional Information

This is a permanent remote opportunity.

Required Qualification

Bachelor of Engineering / Bachelor of Technology (B.E./B.Tech.)

Employment Type

Full Time

