For further inquiries regarding the following opportunity, please contact our Talent Specialist
Lavanya at
Title: Data Engineer (REMOTE)
Location: Remote
Job Description
Client is seeking a dynamic, motivated, energetic, and driven Data Engineer!
Purpose and Scope:
Design, develop, and implement enterprise data movement, transformation, and storage to facilitate application data migration/transfer and reporting structures.
Utilizes technologies such as Spark notebooks, dataflows, data lakes, SQL Serverless, and data warehouses to facilitate enterprise data pipelines, processes, and artifacts from internal and external application data sources.
Essential Responsibilities:
Focuses on data as an entity and performs analysis on data in terms of content, integrity, and security.
Works on the design and implementation of data flows/pipelines, focusing on the control and optimization of data movement and transformation.
Works on data integration between different systems or source/sink requirements.
Specific Job Duties:
Design, develop, test, and manage the overall framework to facilitate analysis and processing of enterprise data, working closely with the Data Architect to direct and optimize the flow of data within the framework and ensure consistency of data delivery and utilization across multiple projects.
Design how data will be stored, accessed, used, integrated, and managed by different data regimes and digital systems, working with data users to determine, create, and populate optimal data architectures, structures, and systems.
Recommend and implement ways to improve data reliability, efficiency, and quality; evaluate, compare, and improve the different approaches, including design patterns, innovation, data lifecycle design, data ontology alignment, annotated datasets, and Elasticsearch approaches.
Process, clean, and verify the integrity, accuracy, completeness, and uniformity of enterprise data sets, integrating external or new datasets into existing datasets as required.
Design, implement, and operate data management systems for business intelligence needs.
Plan, design, and optimize data throughput and query performance.
Build data and analytics proofs that will offer deeper insight into datasets, allowing for critical discoveries surrounding key performance indicators and user activity.
Perform research and analysis of large data sets, including operational data, and perform data validation, visualization, and other statistical analyses.
Support change management activities for enterprise data analysis. Document all processes, models, and activities.
Perform all other position-related duties as assigned or requested.
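Several of the duties above center on verifying the integrity, completeness, and uniformity of enterprise data sets. As a minimal illustrative sketch only (the field names and rules below are hypothetical, not taken from this posting), a record-validation pass in Python might look like:

```python
# Hypothetical required fields and types for an enterprise record set.
REQUIRED_FIELDS = {"id": int, "name": str, "amount": float}

def validate_records(records):
    """Return (valid, errors): records passing completeness, type, and
    uniqueness checks, plus a list of (index, message) for failures."""
    valid, errors = [], []
    seen_ids = set()
    for i, rec in enumerate(records):
        missing = [f for f in REQUIRED_FIELDS if f not in rec]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        bad_types = [f for f, t in REQUIRED_FIELDS.items()
                     if not isinstance(rec[f], t)]
        if bad_types:
            errors.append((i, f"wrong types: {bad_types}"))
            continue
        if rec["id"] in seen_ids:  # uniqueness check on the key field
            errors.append((i, "duplicate id"))
            continue
        seen_ids.add(rec["id"])
        valid.append(rec)
    return valid, errors

if __name__ == "__main__":
    raw = [
        {"id": 1, "name": "a", "amount": 9.5},
        {"id": 1, "name": "b", "amount": 1.0},   # duplicate id
        {"id": 2, "name": "c"},                  # missing amount
    ]
    ok, errs = validate_records(raw)
    print(len(ok), len(errs))  # 1 valid record, 2 rejected
```

In practice these checks would run inside a pipeline step (e.g., a Spark notebook or dataflow), with the rejected records routed to a quarantine table for review.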
Work Environment, Physical Demands, and Mental Demands:
Typical office environment with no unusual hazards; occasional lifting of up to 20 pounds; constant sitting while using the computer terminal; constant use of sight abilities while reviewing documents; constant use of speech/hearing abilities for communication; constant mental alertness. Must possess planning/organizing skills and must be able to work under deadlines.
Quality: Quality is the foundation for the management of our business and the keystone to our goal of customer satisfaction. It is our policy to consistently provide services that meet customer expectations. Accordingly, each employee must conform to the Amentum Quality Policy and carry out job activities in compliance with applicable Amentum Quality System documents and customer contracts. Each employee must read and understand his/her Quality Management and Customer Satisfaction responsibilities.
Procedure Compliance: Each employee must read, understand, and implement the general and specific operational safety, quality, and environmental requirements of all plans, procedures, and policies pertaining to his/her job.
Minimum Position Knowledge, Skills, and Abilities Required:
Bachelor's degree in Computer Science or a related field and 2-4 years of experience.
Excellent communication and analytical skills; demonstrated working knowledge of and experience with several of the following technologies/tools is desired:
o Microsoft SQL (T-SQL)
o Oracle SQL (PL/SQL)
o Azure Synapse
o Azure ADLS
o Azure Pipelines
o Azure Spark Notebooks (Scala, Python, Spark SQL)
o REST API
o JSON Files
o Parquet Files
o Delta Lake Files
o Data Vault 2.0
o SQL Serverless
o Synapse Data Warehouse (MPP Dedicated)
o Microsoft DevOps and GitHub
o Agile Development
o Data Encryption / Decryption
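Among the tools listed above, Data Vault 2.0 has a well-known convention worth illustrating: hub keys are derived deterministically by hashing normalized business keys, so the same source entity always maps to the same hub record. A minimal sketch of that convention in Python (the business-key values here are hypothetical; SHA-256 is also commonly used in place of MD5):

```python
import hashlib

def hub_hash_key(*business_keys, delimiter="||"):
    """Derive a deterministic hash key from one or more business keys,
    following the common Data Vault 2.0 pattern: trim and upper-case
    each key, join with a delimiter, then hash the result."""
    normalized = delimiter.join(str(k).strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# The same business key in different source formatting yields the
# same hash key, so loads from multiple systems converge on one hub row.
a = hub_hash_key(" cust-001 ")
b = hub_hash_key("CUST-001")
assert a == b
```

The normalization step (trim, upper-case, fixed delimiter) is what makes the key stable across source systems; the composite-key form (`hub_hash_key("CUST-001", "US")`) is used for hubs with multi-part business keys.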