Job: Data Platform Engineer (Snowflake and Databricks)
Location: North Dublin
Working model: Hybrid (1-2 days on site)
Rate: €450-475 per day
Type: Contract
Duration: 12 months
We are working with an industry leader that is looking to add a talented Data Platform Engineer to its Engineering team in Dublin.
This role offers the chance to shape and support the team's data architecture, working on cutting-edge cloud technologies and driving the success of data-driven projects. You should have a strong background in Databricks, Snowflake, and AWS, and be proficient in MLOps to support seamless deployment and scaling of machine learning models. You'll play a critical role in enhancing data accessibility, streamlining data-sourcing pipelines, and optimizing performance for large-scale data solutions.
Key responsibilities & duties include:
- Architect and Implement Cloud-Native Data Solutions: Design and develop scalable data platforms, focusing on a cloud-native approach, data mesh architectures, and seamless integration across multiple data sources
- MLOps Pipeline Development: Build and maintain MLOps pipelines using tools like MLflow, ensuring efficient and reliable deployment of machine learning models to production environments
- Data Governance and Quality Management: Create and enforce data governance standards, ensuring robust data quality and compliance through tools such as Databricks Unity Catalog
- Data Integration & Migration: Lead migration projects from legacy data platforms to modern cloud solutions, optimizing cost and operational efficiency
- Performance Tuning and Optimization: Leverage tools such as Snowflake and Delta Lake to improve data accessibility, reliability, and performance, delivering high-quality data products that adhere to best practices
Key Projects/Deliverables:
- Data Mesh Architecture: Design and deployment of data mesh frameworks to streamline data integration and scalability across business domains
- MLOps Pipelines: Prototype and operationalize MLOps pipelines to enhance the efficiency of machine learning workflows
- Data Migration & Cost Optimisation: Migrate large-scale datasets to Azure and AWS platforms, with a focus on business-critical data sources and significant cost reductions
- Data Governance Applications: Develop applications to enforce data governance, data quality, and enterprise standards, supporting a robust production environment
Required Experience:
- Experience in Data Platform Engineering: Proven track record in architecting and delivering large-scale, cloud-native data solutions
- Proficiency in Databricks and Snowflake: Strong skills in data warehousing and lakehouse technologies, with hands-on experience in Databricks, Spark, PySpark, and Delta Lake
- MLOps Expertise: Experience with MLOps practices, ideally with MLflow for model management and deployment
- Cloud Platforms: Knowledge of AWS; additional experience in Azure is beneficial for multi-cloud environments
- Programming Languages: Strong coding skills in Python, SQL, and Scala
- Tooling Knowledge: Experience with version control (GitHub), CI/CD pipelines (Azure DevOps, GitHub Actions), data orchestration tools (Airflow, Jenkins), and dashboarding tools (Tableau, Alteryx)
- Data Governance: Some familiarity with data governance tools and best practices would be ideal
This is an exciting opportunity to take on a secure contract with a leading business undertaking a major digital transformation. You will play a critical part in its technical advancement and work alongside a skilled and friendly group of Engineers. If you are interested, please submit your CV via the link provided for immediate consideration.