Shape the Future of Performance Marketing Through Data-Driven Strategies!
Success in performance marketing relies heavily on data-driven strategies, and our client excels at delivering exceptional results. With a rich history of innovation and client satisfaction spanning over two decades, BMG360 specializes in crafting tailored marketing solutions that optimize customer acquisition and engagement. Their deep understanding of industry dynamics enables them to develop effective strategies that resonate with target audiences. Our client is dedicated to driving measurable outcomes for businesses, ensuring that every marketing campaign achieves its goals. They foster a culture of excellence and inclusivity where collaboration and teamwork are at the forefront of their operations. By joining their team, individuals not only contribute to impactful marketing efforts but also benefit from a supportive environment that prioritizes professional development and recognizes each employee's contributions to the company's success.
Job Overview
As a Data Operations Engineer, you will join the BMG360 Business Intelligence & Technology department, contributing your expertise to a dynamic team on a project-based opportunity. In this role, you will be deeply involved in projects centered on data collection, ETL (Extract, Transform, Load) processes, and data governance/monitoring build-outs. Your responsibilities will include designing and implementing efficient data pipelines that support the data needs of multiple teams, systems, and products. The ideal candidate will possess a strong foundation in Python and SQL, along with familiarity with AWS ETL processes. Self-directed and proactive, you will thrive in a collaborative environment, ensuring data accuracy and accessibility to drive informed decision-making across the organization.
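To give a flavor of the day-to-day work, here is a minimal, illustrative ETL sketch in Python. The bucket, key, and column names are hypothetical placeholders, not BMG360's actual pipeline.

```python
# Minimal ETL sketch: pull a client-provided CSV from S3, clean it, and stage it.
# All bucket, key, and column names below are hypothetical placeholders.
import boto3
import pandas as pd

def extract(bucket: str, key: str) -> pd.DataFrame:
    """Download a CSV object from S3 and parse it into a DataFrame."""
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return pd.read_csv(obj["Body"])

def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Standardize column names and drop rows missing required fields."""
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    return df.dropna(subset=["campaign_id", "spend"])  # placeholder required columns

def load(df: pd.DataFrame, path: str) -> None:
    """Write cleaned data to a staging file; a warehouse load would replace this step."""
    df.to_csv(path, index=False)

if __name__ == "__main__":
    frame = transform(extract("client-data-bucket", "raw/spend_report.csv"))
    load(frame, "staged_spend_report.csv")
```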
Discover the Compass to Success:
Employment type: Indefinite Term Contract
Shift: Monday to Friday, 8:00 am to 5:00 pm (Colombia time)
Work setup: Remote/Work from home
Weave the Tapestry of Success: Embrace Responsibilities That Shape Outcomes
- Client-Provided Files/Data Project: Standardize and utilize diverse client-provided data.
- Kantar Data Integration Project: Leverage external media research data for enhanced market analysis.
- Competitor Data (Digital) Integration Project: Systematically integrate competitor data for strategic analysis.
- Streaming & Podcast Project: Efficiently integrate streaming and podcast listening data into the agency's analytics framework to enhance media campaign analysis and planning.
- Digital Data Funnel.io: Complete the integration of digital marketing data (Search, Social, SEO) into the organization's central data system, enabling comprehensive tracking and analysis of digital marketing efforts (a minimal load sketch follows this list).
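As a rough illustration of the integration work described above, the sketch below bulk-loads a marketing DataFrame into Snowflake using the snowflake-connector-python package. The account, credentials, and table names are placeholders, not production values.

```python
# Illustrative sketch: land a digital-marketing DataFrame in Snowflake.
# Connection parameters and table names are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def load_to_snowflake(df: pd.DataFrame) -> None:
    conn = snowflake.connector.connect(
        account="my_account",   # hypothetical account identifier
        user="etl_user",        # hypothetical service user
        password="***",
        warehouse="ETL_WH",
        database="MARKETING",
        schema="STAGING",
    )
    try:
        # write_pandas bulk-loads the DataFrame into an existing table
        write_pandas(conn, df, table_name="DIGITAL_SPEND")
    finally:
        conn.close()
```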
Requirements
Arm Yourself with the Tools of Excellence to Unlock Your Full Potential
- English Level: C1 - C2.
- Strong proficiency in Python and SQL.
- Familiarity with AWS technologies, such as S3, EC2, and Lambda.
- Understanding of ETL processes and best practices.
- Familiarity with Snowflake, database concepts, and data modeling.
Education:
- Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field.
Tech Stack:
- AWS SageMaker: Essential for building, training, and deploying machine learning models (see the sketch after this list).
- Python: Core language for writing and maintaining model code.
- AWS Step Functions: For orchestrating model pipelines (preprocessing, training, inference).
- R: Required for any R-based model components.
Plus:
- Terraform: Infrastructure as code for managing AWS resources.
- AWS Services: Familiarity with AWS ecosystem services (S3, Lambda, IAM, CloudWatch).
- CI/CD Tools: Tools for automating deployment pipelines (GitLab CI/CD, Jenkins, AWS CodePipeline).
- Containerization: Use of Docker for containerizing applications.
- Version Control: Proficient in Git for code and infrastructure versioning.
- Data Science/Machine Learning Frameworks: Experience with libraries like TensorFlow, PyTorch, or scikit-learn.
- Monitoring and Automation: Experience with ML monitoring and automation tools (MLflow, SageMaker Pipelines).
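For reference, here is a minimal sketch of launching a training job with the SageMaker Python SDK (v2), the kind of workflow the stack above supports. The image URI, IAM role, and S3 paths are hypothetical placeholders.

```python
# Minimal sketch: launch a managed SageMaker training job (SageMaker Python SDK v2).
# The image URI, IAM role ARN, and S3 paths are hypothetical placeholders.
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()

estimator = Estimator(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-model:latest",  # placeholder image
    role="arn:aws:iam::123456789012:role/SageMakerExecutionRole",              # placeholder role
    instance_count=1,
    instance_type="ml.m5.large",
    output_path="s3://my-models/output/",  # placeholder output location
    sagemaker_session=session,
)

# fit() starts the training job; the dict maps channel names to S3 input data
estimator.fit({"train": "s3://my-models/training-data/"})
```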
Benefits
Explore the Array of Compelling Benefits
- 5day work week
- Prepaid medicine benefits
- Remote/work from home arrangement
- Indefinite term contract
- Additional 5 days of vacation
- Direct client exposure
- Career growth opportunities
- Supportive work environment
- Prime office locations in Bogotá and Medellín
- Emapta Academy for upskilling
Your Future Team at Emapta Latam
Join Emapta Latam and contribute to our legacy of transforming global outsourcing. Since 2010, Emapta has pioneered personalized outsourcing solutions, empowering businesses to thrive with bespoke teams and seamless integration. Our commitment to excellence is reflected in our state-of-the-art facilities, competitive compensation, and a supportive work environment that fosters professional growth. With over 720 clients worldwide and a team of nearly 7,400 talented professionals, Emapta continues to set new standards in the industry. Apply now to be part of our success story in Colombia, where your skills are valued and your career ambitions are supported.
#EmaptaExperience