EPAM Systems

6 days
Expires 09/01/2026

Lead Data Software Engineer

Job Overview

We are looking for an experienced Lead Data Software Engineer for a remote position to strengthen our team and advance our Data Science initiatives.

Role Responsibilities

  • Lead and mentor the Data Software Engineering team, fostering a culture of continuous improvement and professional growth.
  • Collaborate across disciplines to deliver high-quality data solutions aligned with project goals and deadlines.
  • Establish and maintain effective Data Software Engineering processes, emphasizing automation, efficiency, and innovation.
  • Implement and sustain scalable data solutions using AWS services.
  • Oversee optimization of workflows and data processing through Apache Airflow and Apache Spark.
  • Continuously evaluate industry trends and best practices to refine and adopt advanced Data Software Engineering methodologies.
  • Provide on-call support for data pipelines and data marts to ensure operational efficiency.
  • Direct the creation and deployment of REST APIs to enable seamless data integration and communication.
  • Engage with clients to understand their needs and deliver solutions tailored to their requirements.
  • Manage team structure and organization to ensure timely and efficient delivery of projects.
  • Work with stakeholders, demonstrating strong communication and leadership capabilities.

Requirements

  • At least 5 years of experience as a Data Software Engineer, working on complex projects and large-scale data infrastructures.
  • Minimum of 1 year of leadership experience, successfully managing and motivating a team of Data Software Engineers.
  • Expertise in Amazon Web Services (AWS), including designing and implementing scalable data solutions.
  • Extensive knowledge of Apache Airflow and Apache Spark for optimizing workflows and data processing.
  • Advanced proficiency with one or more CI/CD tools, such as Jenkins, to streamline data pipeline delivery.
  • Strong skills in Python and SQL for building and maintaining data pipelines and ETL processes.
  • Experience with Databricks and PySpark for data analysis and processing.
  • Familiarity with REST APIs for efficient data integration and communication.
  • Excellent analytical skills for problem-solving and decision-making in complex environments.
  • Strong client-facing abilities, ensuring effective collaboration and clear communication to achieve project objectives.
  • Exceptional organizational and team management skills for delivering projects efficiently.
  • Upper-intermediate English proficiency for effective communication, presentations, and discussions with stakeholders.

Nice to Have

  • Experience with Redshift for data warehousing and analysis.

What We Offer

  • International projects with top brands.
  • Work with global teams of highly skilled, diverse peers.
  • Healthcare benefits.
  • Employee financial programs.
  • Paid time off and sick leave.
  • Upskilling, reskilling and certification courses.
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses.
  • Global career opportunities.
  • Volunteer and community involvement opportunities.
  • EPAM Employee Groups.
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn.