Sr. Data Engineer (Snowflake/dbt)
Why You’ll Love This Role
- Work with cutting-edge cloud data technologies in a dynamic, collaborative environment
- Tackle enterprise-scale data challenges, working with billions of rows of data
- Opportunities for career growth and skill development through mentorship and certification programs
- Fully remote work flexibility
We are seeking a Senior Data Engineer with expertise in Snowflake and dbt and a strong focus on scalability and optimization. The ideal candidate has experience working with massive, enterprise-scale datasets and can tune Snowflake environments for performance, cost efficiency, and adherence to best practices.
Responsibilities
- Design and build scalable data pipelines in Snowflake and dbt, ensuring they can handle billions of rows of data efficiently (see the incremental model sketch after this list)
- Optimize Snowflake storage, compute performance, and query execution to improve processing speed and cost efficiency
- Lead the migration and refinement of legacy data processes into Snowflake and dbt, ensuring optimized transformations and modeling
- Collaborate with business and data teams to understand requirements and translate them into high-performance data solutions
- Implement Snowflake optimization best practices, including clustering keys, micro-partition pruning, search optimization, materialized views, and warehouse workload management
- Troubleshoot and resolve bottlenecks in existing Snowflake-based ETL/ELT workflows
- Provide technical leadership and mentorship, ensuring the team follows best practices for scalable data engineering
- Create and maintain technical documentation, including architecture diagrams and optimization guidelines
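As a flavor of the pipeline work above, here is a minimal sketch of an incremental dbt model with a Snowflake clustering key. The model name, source, and columns (`fct_events`, `raw.events`, `event_id`, and so on) are hypothetical, not part of any specific project:

```sql
-- models/marts/fct_events.sql (hypothetical model; names are illustrative)
{{
    config(
        materialized = 'incremental',
        unique_key = 'event_id',
        incremental_strategy = 'merge',
        cluster_by = ['event_date']   -- Snowflake clustering key for pruning
    )
}}

select
    event_id,
    user_id,
    event_type,
    event_timestamp,
    to_date(event_timestamp) as event_date
from {{ source('raw', 'events') }}

{% if is_incremental() %}
  -- On incremental runs, scan only rows newer than what is already loaded,
  -- rather than reprocessing billions of historical rows.
  where event_timestamp > (select max(event_timestamp) from {{ this }})
{% endif %}
```

Clustering on `event_date` keeps micro-partition pruning effective as the table grows, while the merge strategy bounds the cost of each run to new data.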
Requirements
- 3+ years of experience in data engineering, with a focus on cloud-based enterprise-scale data solutions
- Proven experience working with massive datasets (billions of rows) in Snowflake
- Hands-on expertise in Snowflake performance tuning, storage optimization, and cost management
- Deep experience with dbt for data transformation, testing, and workflow orchestration
- Strong proficiency in SQL and Python for data manipulation, automation, and optimization
- Ability to identify, diagnose, and optimize inefficient queries and processing workflows (see the diagnostic query sketch after this list)
- Experience optimizing Snowflake performance both independently and in collaboration with a data architect
- Strong understanding of data governance, security best practices, and role-based access control in Snowflake
- Excellent problem-solving and communication skills, with the ability to collaborate across teams
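For the query-diagnosis point above, one typical starting place is Snowflake's `ACCOUNT_USAGE.QUERY_HISTORY` view. The sketch below surfaces the most expensive recent queries and checks partition pruning; the table name in the clustering check is hypothetical:

```sql
-- Most expensive queries over the last 7 days.
-- Requires access to the SNOWFLAKE.ACCOUNT_USAGE share (data lands with some latency).
select
    query_id,
    user_name,
    warehouse_name,
    total_elapsed_time / 1000        as elapsed_seconds,
    bytes_scanned / power(1024, 3)   as gb_scanned,
    partitions_scanned,              -- high scanned/total ratio = poor pruning
    partitions_total,
    query_text
from snowflake.account_usage.query_history
where start_time >= dateadd('day', -7, current_timestamp())
  and execution_status = 'SUCCESS'
order by total_elapsed_time desc
limit 20;

-- Check how well a large table is clustered on its clustering key
-- (table and column names are hypothetical).
select system$clustering_information('analytics.fct_events', '(event_date)');
```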
Nice to Have
- Experience with orchestration tools like Airflow or Prefect
- Exposure to AWS, GCP, or Azure for cloud data integration
- Familiarity with streaming data pipelines (Kafka, Kinesis, etc.; see the Snowpipe sketch below)
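One common Snowflake pattern for the streaming item above: the Kafka connector or Kinesis Firehose lands files in cloud storage, and Snowpipe auto-ingests them. All object names below are hypothetical, and the storage integration and credentials are assumed to be configured separately:

```sql
-- Hypothetical names throughout; assumes the S3 bucket receives files from
-- the Kafka connector or Kinesis Firehose (auth/storage integration omitted).
create or replace stage raw.events_stage
  url = 's3://example-bucket/events/'
  file_format = (type = 'JSON');

-- Snowpipe picks up new files automatically via S3 event notifications.
create or replace pipe raw.events_pipe
  auto_ingest = true
as
  copy into raw.events_raw (payload, loaded_at)
  from (
      select $1, current_timestamp()
      from @raw.events_stage
  );
```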
C2C (Corp-to-Corp) is not available