About the job
At Minutes to Seconds, we match people with great skills to tailor-fitted jobs so they can achieve well-deserved success. We know how to match people to the right roles to create that perfect fit. This changes the dynamics of business success and catalyzes the growth of individuals. Our aim is to provide both our candidates and our clients with great opportunities and the ideal fit every time. We have partnered with the best people and the best businesses in Australia in order to achieve success on all fronts. We're passionate about doing an incredible job for our clients and job seekers. Our success is determined by the success of individuals in the workplace.
We would love the opportunity to work with YOU!!
Minutes to Seconds is looking for a Senior PySpark Developer for a contract position.
Requirements
Job Description:
5 years of experience in PySpark, including Hadoop, SQL, and other big data technologies. Databricks knowledge is a plus.
We are seeking a skilled PySpark Developer to join our dynamic team. The ideal candidate will have a strong background in big data processing and analytics using PySpark; knowledge of Databricks is also highly desirable. The PySpark Developer will be responsible for designing, implementing, and optimizing data pipelines, and for ensuring data quality and performance.
Key Responsibilities:
Design, develop, and maintain scalable data pipelines using PySpark.
Collaborate with cross-functional teams to gather requirements and deliver data solutions.
Optimize and tune PySpark jobs for performance and scalability.
Implement data quality checks and ensure data integrity.
Troubleshoot and resolve issues related to data processing and performance.
Work with Databricks to manage and optimize data workflows (nice to have).
Develop and maintain documentation for data pipelines and processes.
Stay updated with the latest trends and technologies in big data and analytics.
Nice to Have:
Experience with Databricks for managing and optimizing data workflows.
Knowledge of data streaming frameworks such as Kafka.
Experience Range:
5 - 8 years
Educational Qualifications:
B.Tech/B.E
Skills Required:
PySpark, Databricks, Hadoop, SQL, Kafka, Data Engineering.
Looking for an immediate joiner.
Please send your resume to
Job Description:
Extensive experience and ample hands-on work using the IBM Cognos tool.
Experience in designing and developing dashboards and complex reports.
Understanding of the India insurance domain is a plus.
Practical knowledge of connecting Cognos with on-prem databases (like DB2) and cloud database services such as GCP BQ and AWS S3/Redshift.
Good knowledge of using JavaScript in Cognos reports.
Knowledge of Cognos Administration is a plus.
Experience in collaborating with cross-functional teams.
Experience Range:
5 - 8 years
Educational Qualifications:
B.Tech/B.E
Skills Required:
IBM Cognos, AWS, Redshift, GCP BQ.
Click to apply for the role or get in touch with us at