BairesDev

Expires 21/05/2026

Data Engineer (Databricks) - Remote Work

About BairesDev

At BairesDev®, we've been leading the way in technology projects for over 15 years. We deliver cutting-edge solutions to giants like Google and the most innovative startups in Silicon Valley.

Our Team

Our diverse team of more than 4,000 professionals, representing the world's Top 1% of tech talent, works remotely on roles that drive significant impact worldwide.

Your First Step

When you apply for this position, you're taking the first step in a process that goes beyond the ordinary. We aim to align your passions and skills with our vacancies, setting you on a path to exceptional career development and success.

Position: Data Engineer (Databricks)

As a Data Engineer specializing in Databricks, you will develop and maintain scalable data pipelines and transformations within Databricks environments. You will act as a key link between raw data sources and analytics-ready datasets, leveraging lakehouse architectures to ensure data consistency and reliability.

Responsibilities

  • Design and implement end-to-end ETL pipelines using PySpark and SQL within the Databricks platform.
  • Build and optimize data transformations following the Medallion Architecture to create high-quality Bronze, Silver, and Gold layers.
  • Develop declarative pipelines and manage complex orchestration workflows to ensure reliable production deployments.
  • Collaborate with data scientists and analysts to provide clean, structured data for advanced analytics and machine learning workloads.
  • Ensure data integrity and performance through proactive monitoring, schema enforcement, and automated data quality checks.
  • Maintain and scale lakehouse environments to handle diverse structured and unstructured datasets efficiently.
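To make the Medallion Architecture mentioned above concrete: data lands raw in a Bronze layer, is cleaned and validated into a Silver layer, and is aggregated into analytics-ready Gold tables. The sketch below illustrates that flow with plain Python structures; in a real Databricks environment these steps would operate on PySpark DataFrames backed by Delta tables, and all field names here are hypothetical.

```python
# Illustrative Medallion flow: Bronze (raw) -> Silver (clean) -> Gold (aggregated).
# Plain Python stands in for PySpark/Delta; field names are hypothetical.

def to_silver(bronze_rows):
    """Clean raw Bronze records: enforce schema, normalize types."""
    silver = []
    for row in bronze_rows:
        if row.get("amount") is None or row.get("region") is None:
            continue  # schema enforcement: drop incomplete records
        silver.append({
            "region": row["region"].strip().lower(),  # normalize text
            "amount": float(row["amount"]),           # cast to numeric
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned Silver data into an analytics-ready Gold table."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"region": " EMEA ", "amount": "100.5"},
    {"region": "amer", "amount": "200"},
    {"region": None, "amount": "50"},  # malformed: filtered out in Silver
]
print(to_gold(to_silver(bronze)))  # {'emea': 100.5, 'amer': 200.0}
```

The same shape carries over to PySpark: the Silver step becomes `filter`/`withColumn` transformations and the Gold step a `groupBy().agg()`, typically written as Delta tables per layer.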

Requirements

  • 4+ years of experience in Data Engineering or Software Engineering.
  • Proven expertise in developing and optimizing data pipelines using Databricks and PySpark.
  • Strong proficiency in SQL and experience with data warehousing concepts.
  • Hands-on experience building and maintaining data lakehouse architectures.
  • Deep understanding of ETL/ELT processes and large-scale data transformations.
  • Advanced proficiency in English.

Benefits

  • 100% remote work (from anywhere).
  • Excellent compensation in USD or your local currency if preferred.
  • Hardware and software setup for you to work from home.
  • Flexible hours: create your own schedule.
  • Paid parental leave, vacation, and national holidays.
  • Innovative and multicultural work environment: collaborate and learn from the global Top 1% of talent.
  • Supportive environment with mentorship, promotions, skill development, and diverse growth opportunities.

Apply Now!

Apply now and become part of a global team where your unique talents can truly thrive!