Senior Data Platform Architect
Location: Remote
Position Overview
A US company engaged in call analytics and customer engagement is looking for an experienced Data Platform Architect to drive the design, development, and deployment of scalable data architectures for large datasets. The role includes defining data structures, managing cross-system data flows, and overseeing ETL processes. Expertise in tools like Spark, Snowflake, Kafka, and Databricks is essential to guide the team in adopting best practices and optimal solutions.
You must be able to work US time zones (UTC−8 to UTC−5 / UTC−7 to UTC−4).
Key Responsibilities
- Architect, build, and implement scalable data structures to support extensive datasets, balancing performance and cost-effectiveness.
- Define and manage data structures and schemas tailored to meet business needs and analytics demands.
- Coordinate and oversee data flows between systems, ensuring seamless, accurate, and efficient data transfer, integration, and synchronization across platforms.
- Design and manage ETL processes to aggregate and transform data from diverse sources into centralized databases or data warehouses.
- Lead the adoption of technologies such as Spark, Snowflake, Kafka, ADX, Fabric, Databricks, and other cloud-based solutions, providing guidance on optimal tools and methodologies for each project.
- Deliver high-quality data architecture services to stakeholders, including future-state architecture design, product assessments, technical summaries, and roadmaps.
- Collaborate across teams to convert business and analytical requirements into a cohesive data strategy, including building data pipelines, aggregations, infrastructure, and tools to support strategic initiatives.
- Establish and maintain processes for monitoring and enhancing data quality, ensuring data accuracy, completeness, and timeliness.
- Oversee the management and optimization of structured and unstructured data, focusing on data integrity, accessibility, and optimal UI and API performance.
- Drive data platform enhancements and growth initiatives.
- Configure, fine-tune, and optimize data warehouse performance.
- Develop, implement, and enforce data governance policies and protocols.
Required Experience and Skills
- Bachelor's or Master's degree in Computer Science, IT, or a related field.
- 15 years in data architecture and management.
- Extensive experience with cloud databases (e.g., Snowflake, Azure Data Explorer).
- Proven ability to handle large datasets and optimize data structures.
- Background in PaaS and low-code development.
- Proficient in event streaming and real-time pipeline design.
- Strong grasp of modern SDLC and agile methodologies.
- Solid foundation in data modeling and manipulation.
- Conversational analytics experience is a plus.
- Excellent problem-solving, communication, and collaboration skills.
Working Conditions
This position is remote but may require occasional travel to other offices or business events. It involves regular PC, internet, and phone use throughout the day.
Interview Process
We will assist you with preparation, including mock interviews and coaching, to help you succeed. The steps typically are:
- Prescreen with recruiters
- Technical interview with the development team