About the role
As an Analytics Engineer at Cropster you will be part of the team that architects our data foundation, transforming complex and diverse data into structured, usable assets for insight and decision-making. You will be involved in the end-to-end lifecycle of our internal data warehouse development, beginning with the identification of data needs, through data modeling and database design, to the deployment and optimization of the data warehouse itself.
You will be a critical player in the entire data management process, engaging in activities such as data modeling, developing and managing ELT pipelines, and ensuring the integrity and availability of data for analysis. You will act as a bridge between raw data sources and the analysts or business users who need processed data, employing tools and technologies that ensure data is accessible, reliable, and scalable.
As an Analytics Engineer you will work closely with IT professionals and business stakeholders, ensuring that our data architecture not only meets current needs but is also forward-looking and adaptable to future demands. Your role is pivotal in creating a data environment that supports efficient data analytics and empowers decision-making across the company.
The preferred locations for this role are Innsbruck or Vienna; we have offices in both locations.
What you'll do
- Data Warehouse Design and Development: Architect, design, and develop a scalable and efficient data warehouse that supports the storage and analysis of data from various sources within the company (e.g. AWS). Ensure the data warehouse architecture aligns with business requirements and enables effective data analytics and business intelligence.
- Data Modeling: Develop and implement comprehensive data models that support the efficient organization, storage, and retrieval of data. Create logical and physical data models that facilitate clear data definitions, structure, and relationships. Optimize query performance while ensuring adherence to best practices in data cleanliness, integrity, testability, and documentation.
- Lead Modern Data Transformation Initiatives: Drive the adoption of advanced methodologies and tools to ensure clean, reliable data for analysis and decision-making, enhancing data quality and scalability.
- ELT Pipeline Design and Implementation: Design, build, and manage ELT (Extract, Load, Transform) processes that improve data quality and accessibility by collecting data from various internal and external sources, loading it into the data warehouse, and transforming it there according to business rules (see the sketch after this list).
- Data Quality Management: Implement processes and tools to monitor, validate, and ensure the accuracy and quality of data within the organization. Develop strategies to handle data anomalies, inconsistencies, and integrity issues.
- Data Governance and Compliance: Work closely with the data owners to implement policies and practices that ensure the security, privacy, and ethical use of data. Ensure data management practices comply with relevant laws, regulations, and company policies.
- Performance Tuning and Optimization: Monitor data warehouse and ELT performance, identify bottlenecks, and implement optimizations to improve performance and processing speed. Ensure the data warehouse can scale effectively to meet the growing data needs of the business.
- Collaboration and Stakeholder Engagement: Collaborate with business owners, IT teams, and other stakeholders to understand data needs and deliver solutions that support data-driven decision-making across the company. Translate business requirements into technical specifications and ensure the data warehouse architecture meets these requirements.
- Continuous Improvement: Stay on top of the latest trends and technologies in data management. Recommend and implement improvements to the data warehouse design, ELT processes, and data management practices to enhance functionality, efficiency, and innovation.
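As a minimal illustration of the load-before-transform flow and the dimensional modeling described above, the sketch below lands raw rows, derives a small star schema inside the warehouse, and runs a basic quality check. It uses Python's built-in sqlite3 module purely as a stand-in warehouse; every table and column name is hypothetical, not Cropster's actual schema.

```python
import sqlite3

# Illustrative only: sqlite3 stands in for a real warehouse (e.g. Redshift),
# and all table/column names below are hypothetical.
conn = sqlite3.connect(":memory:")

# 1. Load: land the raw extract as-is (ELT loads before transforming).
conn.execute("CREATE TABLE raw_orders (order_id, roastery, amount_eur)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "Roastery A", 120.0), (2, "Roastery B", None)],
)

# 2. Transform in-warehouse: derive a dimension and a fact table
#    (a minimal star-schema shape).
conn.executescript("""
    CREATE TABLE dim_roastery AS
        SELECT DISTINCT roastery AS roastery_name FROM raw_orders;
    CREATE TABLE fct_orders AS
        SELECT order_id, roastery AS roastery_name, amount_eur
        FROM raw_orders
        WHERE amount_eur IS NOT NULL;  -- business rule: drop incomplete rows
""")

# 3. Validate: a simple check of the kind dbt tests express declaratively.
nulls = conn.execute(
    "SELECT COUNT(*) FROM fct_orders WHERE amount_eur IS NULL"
).fetchone()[0]
assert nulls == 0, "null amounts leaked into the fact table"
```

The same shape carries over to a production stack: the load step targets the warehouse itself, and the transform and test steps are typically expressed as SQL models and tests in a tool such as dbt.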
Things you should do well
- Designing logical and physical data models that effectively represent business data, allowing for efficient storage, retrieval, and analysis
- Developing, implementing, and managing ELT pipelines
- Understanding the importance of data governance practices and compliance requirements
- Excelling in working collaboratively across teams, including IT, data science, and business units
We are looking for someone with
- A strong background in data modeling, warehouse design, and the development of scalable data infrastructure.
- Proficiency in SQL and experience with database technologies like MySQL, PostgreSQL, or Redshift.
- Experience with programming languages like Python for data analysis and automation.
- Experience with AWS cloud services and infrastructure for data engineering purposes.
- Familiarity with business intelligence tools (e.g. QuickSight, Tableau, Power BI) and data pipeline technologies (e.g. Airflow, dbt); see the orchestration sketch after this list.
- Expertise with dbt or similar data transformation tools, with a preference for candidates who have successfully implemented dbt in previous roles.
- Excellent problem-solving skills and the ability to work independently in a fast-paced environment.
- The ability to manage projects effectively.
- Strong communication skills, with the ability to translate complex data concepts into clear, actionable insights for non-technical stakeholders in an English-speaking work environment.

At this point, Cropster will only employ those who are legally authorized to work in Austria for this opening. For Non-EU applicants: please attach your work permit to your application.
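On the orchestration tooling named in the list above: a scheduler such as Airflow sequences the load and transform steps as a dependency graph. Below is a minimal sketch assuming the Airflow 2.x Python API; the DAG id and task bodies are placeholders, not an actual Cropster pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_load():
    # Placeholder: pull from source systems and land raw rows in the warehouse.
    print("raw data landed")

def transform():
    # Placeholder: run in-warehouse transformations (e.g. dbt models).
    print("models built")

with DAG(
    dag_id="elt_example",              # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # 'schedule_interval' on Airflow < 2.4
    catchup=False,
):
    load_task = PythonOperator(task_id="extract_load", python_callable=extract_load)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    load_task >> transform_task        # transform runs only after a successful load
```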
What you can expect
The salary range for this position is 49K to 62K gross/year, and we look at factors like your experience and individual qualifications to determine our offer, which includes benefits like an educational and wellness budget, remote work possibilities and working from home, flexible working hours, paid time for volunteer work, an endless stream of really great coffee, and much more. We also make our best possible offer upfront, no games.
Apply now