About ProcDNA
ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 200 across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle; you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?
What we are looking for
You'll build and maintain systems for efficient data collection, storage, and processing, ensuring data pipelines are robust and scalable for seamless integration and analysis. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.
What you'll do
- Design and implement complex enterprise data solutions.
- Build and optimize ETL pipelines for data integration and enhance data warehouse systems through architectural reviews.
- Ensure strict data compliance, security, and cost optimization.
- Re-architect data solutions for scalability, reliability, and resilience.
- Manage data schemas and flows to ensure compliance, integrity, and security.
- Deliver end-to-end data solutions across multiple infrastructures and applications.
- Build strong partnerships with other teams to create valuable solutions.
Must have
- Min. 3 years of experience in a data engineering role, with a B.Tech/BE degree.
- Proficient in ETL/ELT pipeline implementation.
- Extensive experience handling large datasets, maintaining data quality, and automating ETL pipelines.
- Knowledge of data warehouses (Redshift, Snowflake, Databricks, Cloudera).
- Proficient in Python scripting, PySpark, and Spark.
- Experience with data ingestion, storage, and consumption.
- Skilled in SQL and data schema management.