Data/Platform Engineer (Remote)

Employer Active

1 Vacancy
Experience: 5 years

Job Location: India

Monthly Salary: Not Disclosed

Vacancy: 1

Job Description

This is a remote position.

About the job

At Minutes to Seconds, we match people who have great skills with tailor-fitted jobs to achieve well-deserved success. We know how to match people to the right job roles to create that perfect fit. This changes the dynamics of business success and catalyzes the growth of individuals. Our aim is to provide both our candidates and clients with great opportunities and the ideal fit every time. We have partnered with the best people and the best businesses in Australia in order to achieve success on all fronts. We're passionate about doing an incredible job for our clients and job seekers. Our success is determined by the success of individuals at the workplace.


We would love the opportunity to work with YOU!!


Minutes to Seconds is looking for a Data/Platform Engineer for a full-time position.


Requirements

Job Overview

The primary goals of our developers are efficiency, consistency, scalability, and reliability.

We are responsible for the platform: all the tooling, integrations, security, access control, data classification/management, orchestration, a self-service lab concept, observability, and reliability, as well as data availability (data ingestion).

We are NOT responsible for Data Modeling, Data Warehousing, or Reporting (Power BI), although we do work with the PBI team on access control from PBI to Snowflake.

Everything we do is achieved through code; nothing is manual (or ClickOps). Everything is automated through our CI/CD framework: GitHub, GitHub Actions, Terraform, and Python.

Orchestration is centrally managed using Managed Airflow (a minimal DAG sketch follows below).

We manage RBAC / access control.

We're responsible for tooling integrations and all the connectivity and authentication requirements.
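To illustrate the centrally managed orchestration described above, here is a minimal sketch of an Airflow DAG of the kind Managed Airflow (MWAA) would schedule. The DAG id, task, and schedule are hypothetical placeholders, not the team's actual pipelines.

# Minimal Airflow DAG sketch (assumes Airflow 2.4+, as available on MWAA).
# The dag_id, task, and schedule below are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def check_data_availability() -> None:
    # Placeholder: a real task would verify that upstream files/shares have landed.
    print("checking that upstream data has arrived")


with DAG(
    dag_id="example_platform_dag",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(
        task_id="check_data_availability",
        python_callable=check_data_availability,
    )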

• Ingestion Methods/Patterns (a hedged ingestion sketch follows this list):

  o Fivetran

  o Snowflake Snowpipe (file-based sources)

  o Snowflake Secure Data Share
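As referenced above, a sketch of the file-based pattern: issuing the same COPY INTO statement that Snowpipe automates, via the Python connector. The account, credentials, stage, and table names are all hypothetical placeholders.

# Sketch of file-based ingestion into Snowflake via COPY INTO (the statement
# Snowpipe automates). All identifiers and credentials below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder account locator
    user="ingest_svc",      # placeholder service user
    password="...",         # in practice: key-pair auth, sourced from Secrets Manager
    warehouse="INGEST_WH",
    database="RAW",
    schema="LANDING",
)
try:
    conn.cursor().execute(
        """
        COPY INTO RAW.LANDING.EVENTS
        FROM @RAW.LANDING.EVENTS_STAGE
        FILE_FORMAT = (TYPE = PARQUET)
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
        """
    )
finally:
    conn.close()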

• Solid software development (full SDLC) experience with excellent coding skills:

  o Python (required)

  o Good knowledge of Git and GitHub (required)

  o Good code-management experience/best practices (required)

• Understanding of CI/CD to automate and improve the efficiency, speed, and reliability of software delivery (a minimal test sketch follows this list):
  • Best practices/principles
  • GitHub Actions:
    • Automate workflows directly from GitHub repositories.
    • Automate building, testing, and deploying code, incl. code linting, security scanning, and version management.
  • Experience with testing frameworks
  • Good knowledge of IaC (Infrastructure as Code) using Terraform (required)
    • EVERYTHING we do is IaC.
• Strong verbal and written skills are a must, ideally with the ability to communicate in both technical and some business language.
• A good level of experience with cloud technologies, namely AWS: S3, Lambda, SQS, SNS, API Gateway (API development), networking (VPCs), PrivateLink, and Secrets Manager.
• Extensive hands-on experience engineering data pipelines, and a solid understanding of the full data supply chain: from discovery & analysis, through data ingestion, processing & transformation, to consumption/downstream data integration.
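As promised above, a minimal example of the kind of unit test a GitHub Actions workflow would run on every push (e.g. via pytest). The function under test is a made-up placeholder, not real platform code.

# Minimal pytest-style unit test of the kind CI (GitHub Actions) runs on each push.
# The function under test is a hypothetical example.
def normalise_source_name(name: str) -> str:
    """Normalise an ingestion source name to snake_case."""
    return name.strip().lower().replace(" ", "_")


def test_normalise_source_name() -> None:
    assert normalise_source_name("  Fivetran Source ") == "fivetran_source"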

• A passion for continuous improvement and learning, both in optimization (cost and efficiency) and in ways of working. Obsessed with data observability (a.k.a. data reconciliation), ensuring pipeline and data integrity (a reconciliation sketch follows below).
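A rough sketch of what such a data-reconciliation check can look like: comparing row counts between a source landing table and its downstream target after a load. The table names, connection details, and helper functions are assumptions for illustration only.

# Sketch of a simple reconciliation check: row counts must match between a
# landing table and its downstream target. All names below are hypothetical.
import snowflake.connector


def table_count(cursor, table: str) -> int:
    # Table names come from trusted pipeline config, not user input.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]


def reconcile(conn, source_table: str, target_table: str) -> None:
    cur = conn.cursor()
    src, tgt = table_count(cur, source_table), table_count(cur, target_table)
    if src != tgt:
        raise ValueError(f"Reconciliation failed: {source_table}={src}, {target_table}={tgt}")


conn = snowflake.connector.connect(account="my_account", user="obs_svc", password="...")
try:
    reconcile(conn, "RAW.LANDING.EVENTS", "ANALYTICS.STAGED.EVENTS")  # placeholders
finally:
    conn.close()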

• Experience working with large structured/semi-structured datasets
  • A good understanding of Parquet, Avro, JSON/XML
• Experience with Apache Airflow / MWAA or similar orchestration tooling
• Experience with Snowflake as a data platform:
  • Solid understanding of Snowflake architecture: compute, storage, partitioning, etc.
  • Key features such as COPY INTO, Snowpipe, object-level tagging, and masking policies
  • RBAC (security model) design and administration: intermediate skill required (an RBAC sketch follows this list)
  • Query performance tuning and zero-copy clone: nice to have
  • Virtual warehouse (compute) sizing
• T-SQL experience; the ability to understand complex queries and think about optimisation: advantageous
• Data modelling experience: advantageous
• Exposure to dbt (data build tool) for data transformations: advantageous
• Exposure to Alation or other Enterprise Metadata Management (EMM) tooling: advantageous
• Documentation: architectural designs, operational procedures, and platform configurations to ensure smooth onboarding and troubleshooting for team members.
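To ground the RBAC bullet above, a hedged sketch of access control administered as code, applying grants through the Python connector. The roles, database, and schema are placeholders; given the IaC requirement, a real setup would normally flow through Terraform rather than ad-hoc scripts.

# Sketch of Snowflake RBAC administration as code. Roles and objects below are
# hypothetical; in an IaC setup these grants would be managed via Terraform.
import snowflake.connector

GRANTS = [
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_READ",
    "GRANT USAGE ON SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_READ",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.MARTS TO ROLE ANALYST_READ",
]

conn = snowflake.connector.connect(account="my_account", user="sec_admin", password="...")
try:
    cur = conn.cursor()
    cur.execute("USE ROLE SECURITYADMIN")  # role grants are typically run as SECURITYADMIN
    for stmt in GRANTS:
        cur.execute(stmt)
finally:
    conn.close()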
Please send your resume to



Employment Type

Full Time

Company Industry

Key Skills

  • Apache Hive
  • S3
  • Hadoop
  • Redshift
  • Spark
  • AWS
  • Apache Pig
  • NoSQL
  • Big Data
  • Data Warehouse
  • Kafka
  • Scala