Hadoop Engineer

Job Location: Alexander City - USA

Monthly Salary: Not Disclosed

Job Description

Title: Hadoop Engineer / Architect

Location: Jersey City, NJ

Relocation: Acceptable

Contract Type: C2C (Corp-to-Corp)

Experience: 12 Years

Visa: Any

Job Summary:

We are seeking a talented Hadoop Engineer / Architect to join our team. The ideal candidate will have strong experience designing, building, and maintaining large-scale data solutions using the Hadoop ecosystem. This role involves working closely with cross-functional teams to architect, implement, and optimize data processing systems for big data analytics and storage.

Key Responsibilities:

Architect and design scalable, reliable, and high-performance Hadoop-based big data solutions.

Manage and maintain Hadoop clusters, ensuring optimal performance, scalability, and security.

Collaborate with data engineers and data scientists to design efficient data pipelines and ETL processes.

Develop, architect, and design solutions for data ingestion, processing, and storage using tools within the Hadoop ecosystem such as HDFS, Hive, HBase, MapReduce, Pig, Spark, Flume, and Kafka (see the sketch after this list).

Implement monitoring, tuning, and troubleshooting strategies for performance optimization.

Ensure data integrity and implement security protocols for sensitive data.

Provide thought leadership and recommend enhancements to the existing architecture based on the latest Hadoop technologies and best practices.

Assist with the migration of legacy systems and ensure seamless data integration with the Hadoop ecosystem.

Guide the bank in meeting its product goals, with a deep focus on big data architecture modernization, data monetization, data availability, and data management.

Collaborate with DevOps teams to ensure efficient deployment and automation of Hadoop solutions.
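
For illustration only (not part of the role description): a minimal PySpark sketch of the ingestion-to-storage pattern named above, streaming JSON events from Kafka into a Hive-backed table. The broker, topic, schema, and table names are hypothetical placeholders, not details from this posting.

    # Minimal PySpark sketch (hypothetical names throughout): read JSON
    # events from a Kafka topic, parse them, and append them to a Hive
    # table backed by HDFS.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import (StringType, StructField, StructType,
                                   TimestampType)

    spark = (
        SparkSession.builder
        .appName("ingest-events")   # hypothetical job name
        .enableHiveSupport()        # lets Spark write managed Hive tables
        .getOrCreate()
    )

    # Assumed event schema; a real pipeline would derive this from a
    # schema registry rather than hard-coding it.
    schema = StructType([
        StructField("user_id", StringType()),
        StructField("action", StringType()),
        StructField("ts", TimestampType()),
    ])

    events = (
        spark.readStream
        .format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
        .option("subscribe", "events")                     # placeholder
        .load()
        .select(from_json(col("value").cast("string"), schema).alias("e"))
        .select("e.*")
    )

    # Append each micro-batch to a Hive table; the checkpoint path is a
    # placeholder and must live on durable storage such as HDFS.
    query = (
        events.writeStream
        .option("checkpointLocation", "hdfs:///checkpoints/events")
        .toTable("analytics.events")
    )
    query.awaitTermination()

The sketch assumes Spark 3.1+ with the spark-sql-kafka package on the classpath and a Hive metastore configured for the cluster.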

Qualifications:

Bachelor's/Master's degree in Computer Science, Engineering, or a related field.

5 years of experience working with Hadoop ecosystem components (HDFS, Hive, HBase, etc.).

Proven expertise in data architecture and Hadoop cluster management.

Hands-on experience with Spark, MapReduce, and NoSQL databases.

Proficient in Java, Python, or Scala for data processing and scripting.

Strong understanding of distributed computing and parallel processing.

Experience with cloud platforms (AWS, Azure, GCP) and their big data services (e.g., Amazon EMR, Azure HDInsight); see the sketch after this list.

Knowledge of data governance, security protocols, and compliance.

Familiarity with DevOps practices, including automated deployments and scaling of solutions.

Excellent problem-solving skills and the ability to work in a fast-paced environment.
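
As a concrete (hypothetical) illustration of the cloud experience listed above: a minimal boto3 sketch that provisions a transient Amazon EMR cluster with Spark and Hive. The cluster name, instance sizing, and S3 log path are placeholders, and the sketch assumes the default EMR roles already exist in the account.

    # Minimal boto3 sketch of provisioning a transient EMR cluster with
    # Spark and Hive. All names, sizes, and paths are placeholders.
    import boto3

    emr = boto3.client("emr", region_name="us-east-1")

    response = emr.run_job_flow(
        Name="hadoop-poc",                  # placeholder cluster name
        ReleaseLabel="emr-6.15.0",          # an EMR release with Spark 3
        Applications=[{"Name": "Spark"}, {"Name": "Hive"}],
        LogUri="s3://my-bucket/emr-logs/",  # placeholder log bucket
        Instances={
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge",
                 "InstanceCount": 1},
                {"InstanceRole": "CORE", "InstanceType": "m5.xlarge",
                 "InstanceCount": 2},
            ],
            # Terminate the cluster once all submitted steps finish.
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        JobFlowRole="EMR_EC2_DefaultRole",  # default EMR instance profile
        ServiceRole="EMR_DefaultRole",      # default EMR service role
    )
    print("Cluster id:", response["JobFlowId"])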

Preferred Skills:

Experience with containerization technologies (Docker, Kubernetes).

Knowledge of machine learning tools and integration with Hadoop.

Experience in migrating on-prem Hadoop clusters to cloud platforms.

Familiarity with CI/CD pipelines for big data solutions.

A product mindset is a must; the candidate should have played a key role in formulating a product-centric data strategy for a financial services client.

Exposure to treasury areas such as liquidity management, payments, or capital management will be a huge plus.

Technical skills: big data, cloud data management, data lineage, data quality platforms, data distribution architectures, and enterprise data patterns across multiple data layers.

Employment Type

Full Time
