Role: Data Warehouse Developer/Architect
Duration: 6 months
Location: Plano, TX (onsite Tuesday-Thursday; remote Monday and Friday)
Expectations:
- Understand existing workflows and underlying frameworks
- Migrate these existing workflows to the new Capital One-specific data ingestion framework
- Collaborate with and across Agile teams to gather metadata that meets current data governance standards
- Perform unit tests and conduct reviews with other team members to make sure your code is rigorously designed, elegantly coded, and effectively tuned for performance
- Build scripts/utilities to accelerate migration
- Analyze data & generate reports
- Learn, unlearn, and relearn concepts with an open and analytical mindset
- Apply troubleshooting & critical thinking skills
- Develop & review technical documentation for artifacts delivered.
Requirements:
- Minimum 7 years of experience in data engineering and data pipelines
- Minimum 5 years of extensive experience in Python programming
- Minimum 3 years of extensive experience in SQL and Unix/Linux shell scripting
- Hands-on experience writing complex SQL queries and exporting and importing large amounts of data using utilities
- Minimum 3 years of AWS experience
- Basic Knowledge of CI/CD
- Excellent communication skills and a strong customer-centric mindset
- Bachelor's degree in Computer Science or a related field, or an equivalent combination of industry-related professional experience and education
Nice to Haves:
- Prior experience with data migration projects
- Experience with Kafka Streams or building data-intensive streaming applications (stream processing, e.g., Kafka, Spark Streaming)
- Experience with or knowledge of Scala or Java programming
- Experience with at least one cloud data warehouse, such as Snowflake
- Experience with Distributed Computing Platforms