Expires 23/03/2026
Data Engineer
Location: Remote (Chile)
Type: Full-time
About TruckerCloud
TruckerCloud is the leading telematics data platform for commercial auto insurance, unifying data from hundreds of telematics and camera providers into a single, easy-to-use solution. Our mission is to help insurers and risk teams streamline telematics programs, improve underwriting accuracy, enhance claims workflows, and leverage real-time and historical fleet insights—all without the technical overhead of custom integrations. With connections to 100+ telematics systems, deep analytics, automated reporting, and powerful behavioral insights, TruckerCloud enables data-driven decision-making and scalable risk management across the transportation ecosystem.
About the Role
TruckerCloud is seeking a Data Engineer to build and scale the data infrastructure powering our core products and analytics. Our platform manages hundreds of terabytes of data, and this volume is growing exponentially as our customer base expands.
A key aspect of this role is working very closely with business stakeholders and Data Scientists to translate analytical requirements, experiments, and modeling needs into production-grade datasets, transformations, and pipelines. You must be able to turn ideas into operational, reliable, and scalable data products.
We also require a high level of proficiency in AI tools and a solid understanding of security and compliance standards, given the critical nature of our data operations.
A core requirement for this position is deep expertise in Data Quality. This includes designing and implementing frameworks, validation layers, monitoring capabilities, and automated checks to ensure data accuracy, consistency, completeness, and reliability at scale.
Tech Stack You’ll Work With
- Programming: Python (primary), Java (optional)
- Cloud: GCP (BigQuery, Dataflow, Cloud Storage) & AWS (S3, Lambda, EC2)
- Databases: MySQL, BigQuery
- Scale: 100+ TB today with exponential growth
What You Will Do
- Design, build, and maintain ETL/ELT pipelines and ingestion systems at massive and fast-growing scale.
- Work closely with business teams and Data Scientists to:
  - Understand analytical and modeling requirements
  - Translate them into production-ready pipelines and curated datasets
  - Support experimentation and productization of data-driven features
- Use AI tools extensively to accelerate development, improve validation, and automate documentation.
- Implement strong data validation, automated testing, and monitoring frameworks.
- Apply and enforce security and compliance standards, such as:
  - Data access governance
  - Logging and auditability
  - Secure data handling and storage practices
- Own the deployment and reliability of data workflows, ensuring they scale as data volume grows.
- Optimize for performance, resiliency, and cloud cost.
- Contribute to architectural design and continuous improvement.
- Build and enforce Data Quality frameworks including validation rules, anomaly detection, monitoring dashboards, and automated testing for data pipelines.
- Implement and maintain systems that ensure data accuracy, completeness, consistency, timeliness, and lineage transparency.
- Lead the definition and adoption of data quality best practices, helping elevate the standards of the entire engineering team.
What We’re Looking For
- At least 3 years of experience as a Data Engineer.
- Strong expertise in Python for data processing.
- Excellent SQL skills and experience with MySQL.
- Hands-on experience with BigQuery and cloud-native data platforms.
- Experience working with large-scale datasets (hundreds of GB to many TB).
- Ability to translate business and Data Science needs into engineering solutions.
- High proficiency with AI tools.
- Solid understanding of security and compliance practices.
- Experience with CI/CD and automated testing for data systems.
- Ownership mentality and strong communication.
- Strong, hands-on expertise in Data Quality, including data validation frameworks, observability, QA automation for pipelines, and designing scalable quality controls in production environments.
- Fluency in Spanish and English (written and spoken).
Nice to Have
- Experience with orchestration tools like Airflow.
- Familiarity with infrastructure-as-code or containerization.
- Exposure to cloud optimization techniques.