Python, PySpark & Databricks
Jersey City, NJ (must be within 35 miles)
Duties and Responsibilities
- Collaborate with the team to build out features for the data platform and consolidate data assets
- Build, maintain, and optimize data pipelines built using Spark (see the PySpark sketch after this list)
- Advise, consult, and coach other data professionals on standards and practices
- Work with the team to define company data assets
- Migrate the CMS data platform into Chase's environment
- Partner with business analysts and solutions architects to develop technical architectures for strategic enterprise projects and initiatives
- Build libraries to standardize how we process data
- Loves to teach and learn, and knows that continuous learning is the cornerstone of every successful engineer
- Has a solid understanding of AWS tools such as EMR or Glue, their pros and cons, and can intelligently convey that knowledge
- Implement automation on applicable processes
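For illustration only, a minimal sketch of the kind of Spark batch pipeline referenced above; the bucket, paths, and column names are hypothetical and not taken from this posting:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal batch pipeline sketch; bucket, paths, and column names are placeholders.
spark = SparkSession.builder.appName("pipeline-sketch").getOrCreate()

raw = spark.read.parquet("s3://example-bucket/raw/transactions/")  # hypothetical source

curated = (
    raw.dropDuplicates(["transaction_id"])           # de-duplicate on a business key
       .withColumn("ingest_date", F.current_date())  # stamp the load date for partitioning
       .filter(F.col("amount").isNotNull())          # drop rows missing a required field
)

(
    curated.write
    .mode("overwrite")
    .partitionBy("ingest_date")
    .parquet("s3://example-bucket/curated/transactions/")  # hypothetical target
)
```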
Mandatory Skills:
- 5 years of experience in a data engineering position
- Proficiency in Python (or similar) and SQL
- Strong experience building data pipelines with Spark
- Strong verbal & written communication
- Strong analytical and problem-solving skills
- Experience with relational datastores, NoSQL datastores, and cloud object stores
- Experience building data processing infrastructure in AWS
- Bonus: Experience with infrastructure-as-code solutions, preferably Terraform
- Bonus: Cloud certification
- Bonus: Production experience with ACID-compliant formats such as Hudi, Iceberg, or Delta Lake (see the Delta Lake sketch after this list)
- Bonus: Familiarity with data observability solutions and data governance frameworks
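As a rough illustration of the ACID-compliant table formats mentioned in the bonus items, a Delta Lake upsert sketch in PySpark; the paths, key column, and app name are placeholders, and the delta-spark package is assumed to be available on the cluster:

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Upsert sketch against a Delta Lake table; paths and keys are hypothetical.
spark = (
    SparkSession.builder
    .appName("delta-upsert-sketch")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

updates = spark.read.parquet("s3://example-bucket/staging/accounts/")        # hypothetical source
target = DeltaTable.forPath(spark, "s3://example-bucket/curated/accounts/")  # hypothetical Delta table

# ACID upsert: match on a business key, update existing rows, insert new ones.
(
    target.alias("t")
    .merge(updates.alias("s"), "t.account_id = s.account_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```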
Requirements
- Bachelor's Degree in Computer Science/Programming or similar is preferred
- Right to work
- Measure and optimize system performance with an eye toward pushing our capabilities forward, getting ahead of customer needs, and innovating to continually improve
- Provide primary operational support and engineering for the public cloud platform; debug and optimize systems and automate routine tasks (a boto3 sketch follows this list)
- Collaborate with a cross-functional team to develop real-world solutions and positive user experiences at every interaction
- Drive Game Days, resiliency tests, and chaos engineering exercises
- Utilize programming languages like Java, Python, SQL, Node, Go, and Scala; open-source RDBMS and NoSQL databases; container orchestration services including Docker and Kubernetes; and a variety of AWS tools and services
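As a small example of the routine-task automation mentioned above, a hedged boto3 sketch that reports EBS snapshots older than a retention window; the 30-day window and the report-only behavior are assumptions for illustration:

```python
from datetime import datetime, timedelta, timezone
import boto3

# Report-only sketch: list self-owned EBS snapshots older than a hypothetical 30-day window.
RETENTION_DAYS = 30
cutoff = datetime.now(timezone.utc) - timedelta(days=RETENTION_DAYS)

ec2 = boto3.client("ec2")
paginator = ec2.get_paginator("describe_snapshots")

for page in paginator.paginate(OwnerIds=["self"]):
    for snapshot in page["Snapshots"]:
        if snapshot["StartTime"] < cutoff:
            print(f"Candidate for cleanup: {snapshot['SnapshotId']} ({snapshot['StartTime']})")
```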
Required qualifications, capabilities, and skills
- Formal training or certification on software engineering concepts and 10 years of applied experience
- Hands-on practical experience delivering system design, application development, testing, and operational stability
- Advanced in one or more programming languages: Java, Python, Go
- A strong understanding of business technology drivers and their impact on architecture design, performance, and monitoring best practices
- Design and build web environments on AWS, including working with services like EC2, ALB, NLB, Aurora Postgres, DynamoDB, EKS, ECS Fargate, MFTS, SQS/SNS, S3, and Route 53
Advanced in modern technologies such as:
- Java 8, Spring Boot, RESTful microservices, AWS or Cloud Foundry, Kubernetes
- Experience using DevOps tools in a cloud environment such as Ansible, Artifactory, Docker, GitHub, Jenkins, Kubernetes, Maven, and SonarQube
- Experience and knowledge of writing Infrastructure-as-Code (IaC) and Environment-as-Code (EaC) using tools like CloudFormation or Terraform (a CloudFormation/boto3 sketch follows this list)
- Experience with high-volume, SLA-critical applications and building upon messaging and/or event-driven architectures; deep understanding of the financial industry and its IT systems
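To illustrate the IaC item above in the Python used elsewhere in this role (rather than Terraform HCL), a minimal sketch that deploys an inline CloudFormation template with boto3; the stack name, bucket name, and single-resource template are placeholders:

```python
import json
import boto3

# Minimal IaC-style sketch: a hypothetical CloudFormation template defined inline
# and deployed with boto3. Stack and bucket names are placeholders.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "RawDataBucket": {
            "Type": "AWS::S3::Bucket",
            "Properties": {"BucketName": "example-raw-data-bucket"},
        }
    },
}

cloudformation = boto3.client("cloudformation")
cloudformation.create_stack(
    StackName="example-data-platform-stack",
    TemplateBody=json.dumps(template),
)
```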
Preferred qualifications, capabilities, and skills
- Expert in one or more programming languages, preferably Java
- AWS Associate-level certification in Developer, Solutions Architect, or DevOps
- Experience in building AWS infrastructure such as EKS, EC2, ECS, S3, DynamoDB, RDS, MFTS, Route 53, ALB, and NLB
- Experience with high-volume, mission-critical applications and building upon messaging and/or event-driven architectures using Apache Kafka (see the streaming sketch after this list)
- Experience with logging, observability, and monitoring tools including Splunk, Datadog, Dynatrace, CloudWatch, or Grafana
- Experience in automation and continuous delivery methods using shell scripts, Gradle, Maven, Jenkins, and Spinnaker
- Experience with microservices architecture, high-volume SLA-critical applications, and their interdependencies with other applications, microservices, and databases
- Experience developing process tooling and methods to help improve operational maturity
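Tying the Kafka and event-driven items above back to the Spark focus of this role, a rough Structured Streaming sketch; the brokers, topic, schema, and S3 paths are assumptions, and the spark-sql-kafka connector is assumed to be on the cluster:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

# Streaming sketch: consume a Kafka topic with Spark Structured Streaming and land it on S3.
spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")  # hypothetical brokers
    .option("subscribe", "transactions")                # hypothetical topic
    .load()
    .select(from_json(col("value").cast("string"), event_schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/streams/transactions/")                  # hypothetical sink
    .option("checkpointLocation", "s3://example-bucket/checkpoints/transactions/")
    .start()
)
query.awaitTermination()
```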