Data Operations Engineer

Job ID
2025-13038
# of Openings
1
Job Locations
Remote - Colombia
Additional Locations
BR-São Paulo
Category
Operations

Overview

The Data Operations Engineer 2 plays a key role in building and maintaining robust data pipelines that power analytics and machine learning initiatives. In this role, you’ll help ensure data is clean, reliable, and ready for use, while contributing to testing, documentation, and monitoring processes. Working closely with experienced team members, you’ll gain hands-on exposure to advanced data workflows and grow your technical expertise in a collaborative environment.


Responsibilities


  • Build and maintain data pipelines that prepare clean, structured data for analysis.

  • Deliver components of analytical and operational data products, contributing to solutions from ingestion through production deployment.

  • Apply established practices for data observability, quality checks, metadata tracking, and lineage documentation within workflows.

  • Work with other teams to help ensure proper data access, quality, and usability for reporting and analytics.

  • Develop data workflows that enable downstream ML/AI capabilities, ensuring datasets are well-prepared and versioned.

  • Observe and provide input on new tools and techniques evaluated by senior team members.

  • Implement solutions that align with internal data security standards, compliance policies, and governance frameworks.

  • Engage in team conversations about data engineering trends, tools, and process improvements.

  • Contribute to the team knowledge base by documenting work, learnings, and lessons from projects.

  • Actively seek feedback from senior engineers and contribute to team success through collaboration and support.

Basic Qualifications


  • Bachelor's Degree in Computer Science, Data Engineering, or a related field (or equivalent experience)

  • Strong English communication skills (oral and written)
  • Experience with dbt and cloud-based data orchestration tools (e.g., Azure Data Factory, Fivetran, Airflow, dbt Cloud)

  • Experience setting up API connections, handling authentication, and ingesting JSON data
  • Proficiency in Python, SQL, Spark, API workflows, and GitHub-based development workflows 
  • Strong knowledge of cloud platforms (e.g., Fabric, Azure, Snowflake, AWS)
  • Familiarity with Medallion Architecture and Lakehouse design principles 
  • Understanding of Delta/Parquet formats and modern data modelling concepts
  • Good knowledge of Microsoft operating systems and products
  • Excellent communication, collaboration, and problem-solving skills 
  • Attention to detail and strong critical thinking skills 
  • Good facilitation and project management skills
  • Ability to use original thinking to translate goals into the implementation of new ideas and design solutions
  • Good ability to develop and use engaging, informative, and compelling presentation methodologies

  • Good ability to handle sensitive information with discretion and tact

  • Good ability to establish rapport and gain the trust of others; effective at gaining consensus

  • No travel required
