Hi,
Hope you are doing well!
I have an urgent position. Kindly go through the job description below and let me know if it is of interest to you.
Job Title: Data Warehouse (Hybrid)
Location: Downtown Chicago IL (3 days onsite per week)
Duration: 6 Months Contract
***While sharing your resume, please mention the consultant's location and visa status.***
Job Description:
Local candidates only.
Project Overview / Contractor's Role:
This position is for a Data Engineer role supporting the RCE initiative.
Experience level: 3 (7 years)
Qualifications (must-haves):
> Knowledge of Python, Snowflake, Apache Kafka, data warehousing, ETL tools, and Agile development methodology
> Knowledge of the full secure development life cycle
> Analytical and problem-solving skills
> A positive, goal-oriented attitude with a focus on delivery; experience working in an onsite-offshore setup
Skillsets:
Python framework design and development
Data ingestion using Kafka
Snowflake SnowSQL
Data modelling, design, and coding
Oracle/SQL
Unix/Linux
Shell scripting, Git
Nice to have:
> JavaScript and APIs
> DataStage v11.3 and v11.7
Tasks & Responsibilities:
Design and develop effective data load solutions using Python, Kafka, or other data integration methodologies
Load data using Snowflake SQL and stored procedures
Design technical solutions using the best available technologies
Design, develop, and unit test components as defined above
Consult with product analysts and/or business partners on requirements and synthesize them into technical requirements and designs
On the technical side, work with architects, designers, and enterprise framework groups on the best solution for both business and IT
Ensure that non-functional requirements such as security, performance, maintainability, scalability, usability, and reliability are considered when architecting solutions
Ensure code meets standards and is tested appropriately
Responsible for identifying and fixing environmental issues; regularly pulled in to resolve platform and production-related issues
Most important:
Data warehouse project entering its second phase (Corporate Warehouse Phase 2)
Python and SnowSQL are required
Kafka: not building the pipelines, but enough knowledge to be able to debug them
Airflow and Astro
DataStage is a nice-to-have, and so is PeopleSoft
Looking at existing stages
No data visualization tools are needed
Model, design, and code data pipelines
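For context only, here is a minimal, illustrative Python sketch of the kind of Kafka-to-Snowflake data load described above, assuming the confluent-kafka and snowflake-connector-python libraries; the broker, topic, table, and connection values are placeholders, not details of this project.

    import json

    from confluent_kafka import Consumer
    import snowflake.connector

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",  # placeholder broker
        "group.id": "dw-loader",                # placeholder consumer group
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["events"])              # placeholder topic name

    conn = snowflake.connector.connect(
        user="USER", password="PASSWORD", account="ACCOUNT",  # placeholders
        warehouse="WH", database="DB", schema="PUBLIC",
    )
    cur = conn.cursor()

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            record = json.loads(msg.value())
            # Row-at-a-time insert into a simple landing table; a production
            # loader would batch records and stage files for COPY INTO instead.
            cur.execute(
                "INSERT INTO raw_events (event_id, payload) VALUES (%s, %s)",
                (record.get("id"), json.dumps(record)),
            )
    finally:
        cur.close()
        conn.close()
        consumer.close()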
If you are interested, please share your updated resume and suggest the best number and time to connect with you.
Thanks & Regards
Abhishek Yadav
4645 Avon Lane Suite 210 Frisco TX 75033
Email: | Phone: EXT 103