This is a remote position.
Scope:
We are seeking a Data Engineer to join our fintech client's growing team. The Data Engineer will be responsible for implementing and managing data processes to integrate financial data across sources into our internal systems; developing, maintaining, and optimizing data pipelines and workflows; and collaborating closely with the rest of our engineering team to build world-class data systems to support our growth.
We're specifically looking for a candidate based in Pakistan who will work remotely. Priority will be given to underrepresented/diverse candidates. Working hours are 3 PM to 12 AM (Pakistan time) to align with the US Pacific Time Zone.
Job Responsibilities:
- Implement and manage ETL (Extract, Transform, Load) processes to efficiently integrate data from Plaid, QuickBooks Online, Xero, and other financial platforms into our PostgreSQL database.
- Collaborate with other engineers to execute the designed database schema, ensuring seamless integration with our existing Django-based infrastructure.
- Maintain and optimize current data pipelines.
- Ensure the reliability, scalability, and performance of the data infrastructure, handling large volumes of financial data in a secure and compliant manner.
- Develop, test, and maintain data workflows and pipelines that move and transform data as needed for various business applications.
- Monitor and troubleshoot data-related issues, ensuring timely resolution and minimal disruption to existing systems.
- Collaborate with the development team to ensure that data flows and integrations are aligned with application requirements.
Requirements:
- At least 3 years of experience as a Data Engineer with a strong focus on building and maintaining ETL pipelines.
- Expertise in working with relational databases, particularly PostgreSQL, and experience with large-scale data integration projects.
- Proficiency in Python and Django, with experience in writing scripts to automate data processing and transformation tasks.
- Strong problem-solving skills and the ability to optimize data pipelines for performance and scalability.
- Solid understanding of data security and compliance best practices, particularly in handling financial data.
- Ability to work independently as well as part of a cross-functional team.
Preferred Qualifications:
- Experience with Python and Django
- Experience with ETL processes, data warehousing, and data pipeline management
- Familiarity with creating and calling REST API endpoints
- Familiarity with the Django Rest Framework
- Experience with financial systems and data
- Familiarity with AWS infrastructure
Benefits:
- Compensation at market levels commensurate with experience.
- Unlimited PTO/Sick Leave
- Coworking / home office reimbursement.
- Computer reimbursement benefits included.