Senior Data Engineer GCP

Job Location

Un - India

Salary

Not Disclosed

Vacancy

1 Vacancy

Job Description

Title: Senior Data Engineer (GCP)
Location:
Job Type: Full-Time
About Us: Logic Hire Software Solutions is an innovative, data-centric organization committed to leveraging advanced analytics and data engineering practices to inform strategic business decisions and improve client engagement. We are seeking a seasoned Senior Data Engineer to join our team and contribute to the optimization and expansion of our data infrastructure.
Job Description:
We are in search of a highly proficient Senior Data Engineer with over 12 years of hands-on experience in architecting and maintaining data infrastructure. The ideal candidate will possess extensive expertise in the Google Cloud Platform (GCP) and the BigQuery ecosystem, alongside a strong command of SQL, SSIS (SQL Server Integration Services), SSRS (SQL Server Reporting Services), and Python. This role requires a combination of technical acumen and strong interpersonal skills to successfully engage with business units while supporting the overall project lifecycle.
Key Responsibilities:
  • Architect, implement, and maintain high-performance data pipelines utilizing GCP services, particularly BigQuery, Cloud Storage, Cloud Functions, and Dataflow, ensuring optimal data flow and accessibility.
  • Design and write highly efficient, scalable SQL queries, including complex joins, CTEs, and aggregations, to enable robust data analysis and reporting across multiple operational facets (see the query sketch after this list).
  • Develop ETL (Extract, Transform, Load) processes using SSIS for operational data integration, and leverage SSRS for generating executive-level reporting and analytics dashboards.
  • Employ Python to create production-quality scripts and applications for data ingestion, transformation, and visualization, utilizing libraries such as Pandas, NumPy, or Apache Airflow for orchestrating workflows (a minimal Airflow sketch also follows this list).
  • Engage with cross-functional teams to elicit, document, and analyze business requirements, subsequently translating these into comprehensive technical specifications, data models, and workflows.
  • Implement and uphold data governance frameworks to ensure data integrity, quality control, and security protocols across all data engineering processes.
  • Monitor data pipelines and system performance metrics, identifying bottlenecks and implementing solutions to optimize throughput and minimize downtime.
  • Provide analytical insights and recommendations to project and client management, facilitating data-driven decision-making.
  • Mentor junior data engineering staff, cultivating an environment of knowledge sharing and professional development.
  • Stay abreast of the latest trends in data engineering technologies, tools, and methodologies to continually refine our data practices.
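To illustrate the kind of SQL and BigQuery work referenced above, the following is a minimal sketch, assuming the google-cloud-bigquery Python client and application-default credentials; the project, dataset, and table names are hypothetical placeholders, not anything specified in this posting. It runs a query with a CTE, a join, and an aggregation, and loads the result into a Pandas DataFrame.

    # Minimal sketch: run a BigQuery query (CTE + join + aggregation) and load it into Pandas.
    # Assumes `pip install google-cloud-bigquery[pandas]` and application-default credentials.
    # Project, dataset, and table names are hypothetical placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project="my-project")  # hypothetical project id

    QUERY = """
    WITH recent_orders AS (            -- CTE: restrict to the last 30 days
      SELECT customer_id, order_total
      FROM `my-project.sales.orders`
      WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    )
    SELECT c.region,
           COUNT(*)           AS order_count,   -- aggregation
           SUM(o.order_total) AS revenue
    FROM recent_orders AS o
    JOIN `my-project.sales.customers` AS c      -- join to a dimension table
      ON c.customer_id = o.customer_id
    GROUP BY c.region
    ORDER BY revenue DESC
    """

    def regional_revenue():
        """Run the query and return the results as a Pandas DataFrame."""
        return client.query(QUERY).to_dataframe()

    if __name__ == "__main__":
        print(regional_revenue().head())
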
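Similarly, for the Airflow orchestration mentioned in the Python responsibility, here is a minimal sketch of a daily DAG that chains two placeholder tasks. The DAG id and the extract/load callables are hypothetical stand-ins for real pipeline steps, and the `schedule` argument assumes Airflow 2.4 or later.

    # Minimal Apache Airflow sketch: a daily DAG with two chained Python tasks.
    # Task callables and IDs below are hypothetical placeholders for real pipeline logic.
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        """Placeholder: pull source data (e.g. from Cloud Storage) for the run."""
        print("extracting source data")

    def load():
        """Placeholder: load transformed data into BigQuery."""
        print("loading into BigQuery")

    with DAG(
        dag_id="daily_sales_pipeline",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",                  # Airflow 2.4+ scheduling argument
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)
        extract_task >> load_task           # run extract before load
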
Qualifications:
  • Bachelor's degree in Computer Science, Engineering, Data Science, or a related discipline; a Master's degree is highly desirable.
  • A minimum of 8 years of experience in the field of data engineering, particularly within GCP and the BigQuery architecture.
  • Profound experience in formulating and executing complex SQL queries, and a solid understanding of relational database design principles.
  • Advanced proficiency with SSIS for ETL processes and SSRS for business intelligence reporting.
  • Strong programming skills in Python, with a focus on data manipulation and the development of scalable ETL solutions.
  • Demonstrated ability in constructing, deploying, and maintaining data engineering pipelines utilizing modern best practices.
  • Strong verbal and written communication skills, complemented by an ability to liaise effectively between technical teams and business stakeholders.
  • Exceptional analytical and problem-solving capabilities, with a proactive approach towards diagnosing and resolving issues.
  • Working knowledge of data governance principles, compliance with data privacy regulations, and industry best practices.
Preferred Skills:
  • Familiarity with additional GCP services such as Cloud Dataflow for stream/batch processing, Dataproc for managing Hadoop/Spark clusters, or Pub/Sub for messaging services.
  • Understanding of machine learning concepts and frameworks (e.g., TensorFlow, scikit-learn) to integrate predictive analytics within data solutions.
  • Experience working within Agile environments and proficiency with project management tools (e.g., JIRA, Trello).
What We Offer:
  • A competitive salary and comprehensive benefits package.
  • Opportunities for continued professional development and advancement within a cutting-edge environment.
  • A collaborative workspace that encourages innovation and creativity.
  • Flexible working options to support work-life balance.
If you possess the expertise and are eager to advance your career by driving impactful data initiatives at Logic Hire, we invite you to apply. Please submit your resume and a cover letter detailing your relevant qualifications and accomplishments.

Proficient communication skills in English are required.

Experience in Data Engineering & Architecture (Data Modeling, ETL Processes, Data Pipeline Development, Data Integration, and Cloud Data Solutions (GCP))
Experience in Cloud Platforms (Google Cloud Platform (GCP), particularly BigQuery, Cloud Storage, Cloud Functions)
Experience in Big Data Tools (Hadoop, Spark, MapReduce, Pig, Hive, NoSQL, Apache Airflow)
Experience in Data Governance (data governance frameworks ensuring data integrity, quality control, and security protocols)
Experience in Data Visualization & Reporting (PowerBI, Tableau, SSIS, SSRS, Superset, Plotly)
Experience in Programming Languages (Python, SQL, R, Scala, C, C++, Java)
Experience in Database Technologies (Teradata, Oracle, SQL Server)

Note: Candidates must be Green Card holders or US Citizens.

Employment Type

Full Time

Company Industry
