Data Engineer - Data Science Hub - Data&AI

Job Location: Warsaw, Poland

Monthly Salary: Not Disclosed

Vacancy: 1

Job Description

The salary range for this position (contract of employment) is:

mid: PLN in gross terms

A hybrid work model requires 1 day a week in the office.

We are seeking a passionate Data Engineer to join the newly forming A/B Testing Platform team in the Data Science Hub, where we apply analytical techniques, mathematics, and machine learning to solve a wide range of business problems.

About the team

The A/B Testing Platform team is a multidisciplinary group of product analysts, software engineers, and data engineers. Our mission is to strategically enhance our A/B testing platform, a critical tool that empowers data-driven decision-making on the rollout of new features by assessing their potential impact through user behavior analysis. Through this work, the team plays a pivotal role in shaping the overall user experience on Allegro, one of the world's largest eCommerce platforms.

We are looking for people who:

  • Have a Bachelor's or Master's degree in Computer Science, Mathematics, or a related field.

  • Know English at a minimum B2 level.

  • Have proven experience as a Data Engineer or in a similar role.

  • Possess the necessary data-related skill set, meaning you:

    • Are able to work fluently with SQL, preferably GCP BigQuery.

    • Have knowledge of Big Data tools in Google Cloud Platform, AWS, or Azure.

    • Have experience with message broker systems and streaming data processing, e.g. Pub/Sub, Apache Beam.

    • Are aware of data pipeline orchestration tools like Apache Airflow.

    • Have experience in Python programming and are familiar with software engineering best practices (PEP 8, clean architecture, code review, CI/CD, etc.).

  • Experience with Infrastructure as Code tools (Terraform) is welcome.

  • Have proven commercial experience in DevOps and CI/CD practices.

  • Have strong communication skills and are capable of conveying complex ideas in a clear, concise manner.

  • Are detail-oriented and capable of working in a fast-paced, dynamic environment.

  • Have a positive attitude and the ability to work in a team.

  • Are eager to constantly develop and broaden their knowledge.

In your daily work you will handle the following tasks:

  • Designing, developing, and maintaining robust, scalable data pipelines.

  • Collaborating closely with product managers, UX designers, data analysts, and software engineers to understand their requirements and deliver high-quality, prepared data to enable their work.

  • Building, testing, and maintaining data systems for accuracy and readiness as part of a larger pipeline containing streaming data flow.

  • Designing and implementing data schemas, data models, message brokers, and SQL/NoSQL databases.

  • Optimizing data systems and building them from the ground up to deliver insights for data analytical systems.

  • Implementing data pipelines and automated workflows required for the A/B testing platform.

  • Ensuring data privacy and compliance standards across all projects.

  • Operating across multiple platforms and technologies, such as Google Cloud Platform, Azure Cloud, and Allegro Data Centers.

  • Delivering solutions for multiple markets.

  • Balancing engagement across ad-hoc support for Product Managers' and Data Analysts' requests.

Why is it worth working with us:

  • Data plays a key role in the operation of Allegro; we are a data-driven technology company, and through the models and analyses you provide you will have a significant impact on one of the largest eCommerce platforms in the world.

  • Gain invaluable experience and deepen your skills through continuous learning and development opportunities.

  • Collaborate with a network of industry experts, enhancing your professional growth and knowledge sharing.

  • We are happy to share our knowledge. You can meet our speakers at hundreds of technology conferences, such as the Data Science Summit and the Big Data Technology Warsaw Summit. We also publish content on the allegro.tech blog.

  • Depending on teams and their needs, we use the latest versions of Java, Scala, Kotlin, Groovy, Go, Python, Spring, Reactive Programming, Spark, Kubernetes, and TensorFlow.

  • Microservices: a few thousand microservices and 1.8M rps on our business data bus.

  • In the Data&AI team you would be part of a group of over 200 data, ML, and product specialists that oversees dozens of products and a few hundred production ML models, and governs all data in Allegro (at a several-dozen-petabyte scale).

  • We practice Code Review, Continuous Integration, Scrum/Kanban, Domain-Driven Design, Test-Driven Development, and Pair Programming, depending on the team.

  • GenAI tools (e.g. Copilot, internal LLM bots) support our everyday work.

  • Our internal ecosystem is based on self-service and widely used tools such as Kubernetes, Docker, and GitHub (including CI/CD). This will allow you, from day one, to develop software using any language, architecture, and scale, restricted only by your creativity and imagination.

  • We actively participate in the life of the biggest user groups in Poland centered around the technologies we use at work (Java, Python, DevOps).

  • Technological autonomy: you get to choose which technology solves the problem at hand (no need for management's consent), and you are responsible for what you create.

  • Once a year you can take advantage of the opportunity to work in a different team, or more often if there's an internal business need (known as team tourism).

What we offer:

  • A hybrid work model that you will agree on with your leader and the team. We have well-located offices (with fully equipped kitchens and bicycle parking facilities) and excellent working tools (height-adjustable desks, interactive conference rooms).

  • An annual bonus of up to 10% of the annual gross salary (depending on your annual assessment and the company's results).

  • A wide selection of fringe benefits in a cafeteria plan; you choose what you like (e.g. medical, sports, or lunch packages, insurance, purchase vouchers).

  • English classes that we pay for, related to the specific nature of your job.

  • A 16" or 14" MacBook Pro with an M1 processor and 32 GB RAM, or a corresponding Dell with Windows (if you don't like Macs), and other gadgets that you may need.

  • Working in a team you can always count on; we have top-class specialists and experts in their areas of expertise on board.

  • A high degree of autonomy in terms of organizing your team's work. We encourage you to develop continuously and try out new things.

  • Hackathons, team tourism, a training budget, and an internal educational platform, MindUp (including training courses on work organization, means of communication, motivation to work, and various technologies and subject-matter issues).

  • If you want to learn more, check out this webpage or listen to the Allegro Tech Podcast episode about recent projects in the Data Science Hub.

Apply to Allegro and see why it is #dobrzetuby (#goodtobehere)


Remote Work: No

Employment Type: Full-time
