Hybrid (3 days onsite, 2 days remote); local candidates are preferred.
Description:
Please note that candidates must have experience with web services and big data technologies (Hadoop/Hive/Spark or Python). They need to come from the financial/banking domain; risk experience is a plus.
Responsibilities:
- Focus on building the next-generation model development and execution platform, including data sourcing
- Platform will leverage a host of interesting technologies, including Spark and Hadoop, and utilize a unique user-configurable, workflow-driven model execution approach
- Perform data analysis as needed to design data models, data standardizations, data quality/integrity checks, etc.
- Highlight critical issues/risks across the program and work with partners to identify solutions
- Serve as a key contributor in Agile routines, including sprint and capacity planning, prioritizing tasks for monthly releases, and backlog maintenance
- Understand business requirements and lead their analysis and translation into technical requirements, including budgeting and technical documentation
- Review technical designs with the development team and ensure they meet business partner requirements and provide a scalable solution that supports the longer-term strategy
Requirements:
- 8-10 years of experience
- Undergrad/MS in Computer Science
- Understanding of the core principles of distributed technologies such as web services and big data technologies (Hadoop/Hive/Spark, Python); should have worked in the financial domain, preferably in risk
- Agile methodology (hands-on experience writing Epics and Stories)
- Analytical skills to perform technical and functional analysis
- Strong communication skills