Job Title: Data Engineer/Architect
Job Location: Wilmington, DE 19803 (HYBRID 3 Days on-site)
Job Duration: Contract
- 4-5 years of experience with big data, preferably in a large, complex organization
- 4-5 years of Data Engineering/Data Analytics/Business Intelligence experience
- 4-5 years of legacy ETL experience
- 4-5 years of hands-on pipeline development
- Experience with Apache Spark
- Expert level SQL knowledge with strong complex query writing skills
- 4-5 years of experience with RDBMS such as Oracle, DB2, MySQL, Teradata, Hive
- 4-5 years of experience with NoSQL databases such as HBase, DynamoDB, MongoDB, Cassandra
- Experience with Cloud/Virtual warehouse systems such as Snowflake/Redshift.
- Distributed computational framework experience required
- AWS data solutions experience preferred, such as Glue
- Experience with distributed event-streaming platforms such as Kafka, Kinesis, etc.
What You Need:
- Minimum 5 years of Big Data experience
- Must have a solid ETL background
- Must have Java and Spark experience (Python is acceptable as well, but some Java is still required; Spark is essential)
- Must have AWS experience
- Must have experience building pipelines
- Experience with Kafka is a plus
- Ab Initio is a plus
What You'll Do:
- Deliver hands-on system design, application development, testing, and operational stability
- Build and maintain data pipelines
- Migrate legacy ETLs to a Spark-based processing framework
- Design and implement ETL solutions
- Apply practical cloud-native experience, preferably on AWS
- Apply experience with event-driven/streaming architecture
- Demonstrate working proficiency with a variety of software engineering toolsets