Job Title: Senior GCP Data Engineer
Location: Nashville, TN (initially remote OK)
Duration: 6 Months
Mandatory skills:
1. Data Integration (Lawson to GCP)
a. GCP Dataflow
b. Dataproc in Python
c. BigQuery
Job Description:
We are seeking a skilled GCP Data Engineer with expertise in integrating data solutions with the Lawson ERP system. The successful candidate will be responsible for building, maintaining, and optimizing data pipelines that integrate data between Google Cloud Platform (GCP) and the Lawson ERP system, ensuring that all data flow is smooth, reliable, and accurate. As a GCP Data Engineer, you will collaborate with cross-functional teams, including business analysts, IT, and ERP specialists, to design and implement scalable and efficient data solutions. The ideal candidate will have a strong background in cloud-based data engineering, specifically on GCP, and experience working with Lawson and other enterprise systems.
Key Responsibilities:
Data Integration & ETL Development: Design and implement robust ETL (Extract, Transform, Load) processes to integrate data from the Lawson ERP system into GCP services (BigQuery, Dataflow, Dataproc, etc.) and other cloud data storage solutions.
Cloud Architecture Design: Work closely with cloud architects to design and deploy data pipelines, ensuring optimal performance, scalability, and cost-efficiency within GCP's cloud ecosystem.
Lawson Integration: Develop and maintain seamless integration between GCP and Lawson ERP, ensuring smooth data transfer, transformation, and synchronization between the systems.
Data Pipeline Optimization: Ensure data pipelines are optimized for high availability, fault tolerance, and scalability. Monitor data pipelines for performance, and troubleshoot and resolve issues promptly.
Collaboration & Documentation: Collaborate with business stakeholders and project teams to understand data requirements and provide technical recommendations. Document data engineering processes, integration workflows, and troubleshooting guides.
Data Security & Governance: Implement data security best practices and ensure compliance with relevant industry standards, including data privacy and governance policies, when integrating and handling sensitive business data.
Automation: Automate routine data processes to improve efficiency and reduce manual errors, leveraging tools and technologies such as Apache Airflow, Cloud Functions, and other GCP automation services.
Reporting & Monitoring: Set up data reporting and monitoring systems to track the performance of integrations, identify bottlenecks, and proactively improve data flows.
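For candidates unfamiliar with the Extract, Transform, Load pattern central to this role, a minimal sketch is shown below. It is purely illustrative: the Lawson source and the BigQuery sink are stubbed with in-memory data, and every name (extract_lawson_rows, load_to_warehouse, the field names) is a hypothetical placeholder, not part of any actual Lawson or GCP API.

```python
# Minimal ETL sketch. Source (Lawson) and sink (BigQuery) are stubbed
# in memory; all function and field names here are hypothetical.
from decimal import Decimal
from typing import Iterable


def extract_lawson_rows() -> list[dict]:
    """Stand-in for extracting GL records from Lawson (e.g. a flat-file export)."""
    return [
        {"GL_ACCT": "4000", "AMT": "1250.50", "DEPT": "HR "},
        {"GL_ACCT": "4100", "AMT": "80.00", "DEPT": "IT"},
    ]


def transform(rows: Iterable[dict]) -> list[dict]:
    """Normalize field names, strip fixed-width padding, cast amounts to Decimal."""
    return [
        {
            "gl_account": r["GL_ACCT"].strip(),
            "amount": Decimal(r["AMT"]),
            "department": r["DEPT"].strip(),
        }
        for r in rows
    ]


def load_to_warehouse(rows: list[dict], table: list[dict]) -> int:
    """Stand-in for a BigQuery load job; appends rows to an in-memory 'table'."""
    table.extend(rows)
    return len(rows)


warehouse: list[dict] = []
loaded = load_to_warehouse(transform(extract_lawson_rows()), warehouse)
```

In a real pipeline, the extract step would read from Lawson's integration layer, and the load step would be a managed service call (a BigQuery load job, or a Dataflow/Dataproc job for larger volumes); the transform step is where this role's data-quality responsibilities concentrate.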
Qualifications:
4 years of experience in data engineering or cloud data engineering with a strong focus on GCP (Google Cloud Platform).
Proven experience in integrating data systems (especially ERP systems like Lawson) with cloud platforms.
Proficiency in SQL, data warehousing, and cloud-based ETL tools (such as Apache Beam, Dataflow, or similar).
Strong knowledge of BigQuery, Cloud Storage, Cloud Functions, Cloud Pub/Sub, and other GCP services.
Familiarity with Lawson ERP architecture, data models, and integration methods.
Experience with data pipeline orchestration tools (e.g., Apache Airflow, Cloud Composer).