Work Location: Onsite, Warren, NJ
Responsibilities:
- Alphalab
  - Lead Alphalab projects focusing on innovative data integration and ETL tool testing
  - Design and implement prototypes and pilot projects to explore new data integration patterns and architectures
- Technology Evaluation
  - Evaluate and recommend modern ETL and data integration technologies that support Gen AI, data mesh, and data fabric architectures
  - Stay abreast of industry trends and developments in data integration, assessing tools for API connectivity, real-time replication, batch loading, and cross-cloud connectivity
- ETL and Data Integration
  - Design and develop robust ETL processes that integrate seamlessly with big data platforms and support large-scale data environments
  - Implement solutions for real-time data replication and batch data loading, ensuring high performance and reliability
- Cross-Cloud and Big Data Connectivity
  - Develop strategies for cross-cloud data integration, enabling seamless data flow across various cloud platforms
  - Architect solutions that connect with big data technologies, optimizing data processing and storage
- API and Real-Time Systems Integration
  - Design API interfaces and services for real-time data integration and consumption
  - Ensure the architecture supports event-driven mechanisms and streaming data platforms
- Collaboration and Leadership
  - Work closely with IT teams, data scientists, and business stakeholders to define and refine data requirements and integration strategies
  - Lead cross-functional teams in the deployment of data integration tools and platforms, providing expertise and guidance
- Documentation and Compliance
  - Document all architectural designs and changes, ensuring solutions meet compliance and security standards
  - Develop best practices and guidelines for data integration and ETL processes
Qualifications:
- Good knowledge of ETL tools
  - Primary: SAP BODS, Alteryx
  - Secondary: SLT, Apigee
- 6 to 10 years of proven experience as an ETL Architect, Data Engineer, or similar role, with a focus on data platforms
- Expertise in ETL tools and technologies, with a strong understanding of data integration patterns
- Experience with cloud services (AWS, Azure, Google Cloud) and big data technologies (Hadoop, Spark)
- Knowledge of API development, real-time data processing, and event-driven architectures
- Strong analytical and problem-solving skills, with the ability to lead projects and teams
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field