Search job offers
Discover unique opportunities and apply for your ideal job today.
We are looking for a Senior Site Reliability Engineer to join our team and play a key role in ensuring the reliability, scalability, and performance of our systems. This position involves working across the entire service lifecycle, from design and deployment to monitoring and optimization. You will collaborate with global teams, tackle complex challenges, and implement automation strategies to improve system resilience and efficiency. Your expertise will be instrumental in maintaining the stability of critical systems and driving continuous improvement.

Responsibilities
- Participate in and enhance the full lifecycle of services, including design, deployment, operation, and refinement
- Analyze ITSM activities for the platform and provide feedback to development teams to address operational gaps and improve resiliency
- Support services pre-launch through system design consultation, capacity planning, and launch reviews
- Monitor live services by tracking availability, latency, and overall system health
- Scale systems sustainably through automation and advocate for changes that enhance reliability and velocity
- Lead application automation efforts to validate and promote software across environments while adhering to best practices
- Practice incident response with a focus on sustainable solutions and conduct blameless postmortems
- Take a proactive approach to problem-solving, connecting insights across the technology stack during production events to minimize recovery time
- Collaborate with global teams across multiple regions and time zones to ensure consistent support and operations
- Share expertise and provide mentorship to junior team members

Requirements
- Bachelor's degree in Computer Science or a related technical field involving coding (e.g., physics or mathematics), or equivalent practical experience
- At least three years of hands-on experience as a Site Reliability Engineer
- Experience with technologies such as COBOL, JCL, VSAM, DB2, CICS, and MQ
- Strong knowledge of algorithms, data structures, scripting, pipeline management, and software design
- A systematic approach to problem-solving combined with excellent communication skills and a strong sense of ownership and drive
- Proficiency in debugging and optimizing code, as well as automating routine tasks
- Experience working with diverse stakeholders and handling urgent situations while making effective decisions
- Interest and expertise in designing, analyzing, and troubleshooting large-scale distributed systems
- English proficiency at a B2 level or higher, with strong verbal and written communication skills

Nice to have
- Familiarity with cloud-native tools and platforms for enhancing system performance and scalability
- Experience implementing observability solutions to monitor and optimize distributed systems
- Knowledge of containerization and orchestration tools such as Docker and Kubernetes for managing application environments

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
We are seeking a highly skilled Senior Data DevOps Engineer to join our remote team, working on a cutting-edge project in the financial services industry. In this role, you will be responsible for designing, implementing, and maintaining the infrastructure and tools necessary for the development, testing, and deployment of data-driven applications. You will work closely with cross-functional teams to ensure the seamless integration of data pipelines, databases, and data analytics tools. If you are passionate about DevOps and data engineering, we invite you to apply for this exciting opportunity.

Responsibilities
- Design, implement, and maintain the infrastructure required for the development, testing, and deployment of data-driven applications
- Build, configure, and manage CI/CD pipelines using tools such as Jenkins, GitLab, or CircleCI
- Deploy and manage Kubernetes clusters for containerized applications and microservices
- Develop and maintain Docker images for data processing and analytics tools
- Configure and manage Amazon Web Services resources, including EC2 instances, S3 buckets, and RDS databases
- Automate infrastructure deployment and management using Terraform and other infrastructure-as-code tools
- Monitor and troubleshoot infrastructure issues, ensuring high availability and performance of data pipelines and databases
- Collaborate with data scientists and analysts to ensure seamless integration of data pipelines and analytics tools

Requirements
- A minimum of 3 years of experience in Data DevOps, demonstrating your expertise in designing and implementing data pipelines, databases, and data analytics tools
- In-depth knowledge of CI/CD pipelines and Helm
- Strong experience with Kubernetes, Docker, Amazon Web Services, Linux, and Terraform
- Familiarity with Elastic Stack
- Experience with distributed data processing frameworks such as Apache Spark, Kafka, or Flink
- Expertise in scripting languages such as Python, Bash, or PowerShell, allowing you to automate tasks and manage infrastructure as code
- Strong interpersonal and communication skills, enabling you to collaborate effectively with cross-functional teams and stakeholders
- Ability to work independently and manage multiple projects simultaneously, while maintaining a high level of performance
- Fluent spoken and written English at an Upper-intermediate level or higher (B2+)

Nice to have
- Experience with data governance and security practices, including data encryption, access control, and compliance requirements
- Knowledge of machine learning frameworks and tools, including TensorFlow, PyTorch, or Scikit-learn
- Experience with big data technologies such as Hadoop, Hive, or Presto
- Familiarity with data visualization tools such as Tableau, Power BI, or Grafana

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
We are looking for a Senior SAP CX Consultant with solid experience in the Sales & Services modules (version 2) to join our team on strategic regional projects. The ideal candidate will have deep knowledge of the configuration, implementation, and support of SAP Customer Experience solutions, demonstrating technical leadership and strong client communication skills.

Requirements
- Proven experience in SAP CX projects, especially in the Sales & Services v2 modules
- Functional and technical knowledge of sales processes, customer service, ticket management, workflows, and related activities
- Experience with SAP Field Service Management (FSM) is desirable: field service planning, technician assignment, integration with SAP S/4HANA or ECC
- Ability to lead client workshops, map requirements, and propose solutions aligned with best practices
- Experience in multi-country projects or regional rollouts is desirable
- Languages: fluent Spanish; intermediate or advanced English is a plus

Valued
- SAP CX or FSM certifications
- Experience in cloud environments (SAP BTP, SAP Integration Suite)
- Ability to work autonomously, with a focus on results and customer service
- Availability for occasional travel
Join our remote team as a Senior Data Software Engineer within a global leader in data-driven solutions. We are seeking a highly skilled and experienced individual to take ownership of building and implementing reusable DataBricks components for data ingestion and analytics. The successful candidate will collaborate closely with architects, technical leads, and other functional groups to ensure the creation of effective and efficient solutions. This role presents an opportunity to drive innovation and contribute to the optimization of the company's data solutions.

Responsibilities
- Design and implement reusable DataBricks components for data ingestion and analytics
- Ingest data via batch, streaming, and replication into the data lake and make data available for reporting and predictive modeling
- Establish security controls, integration with data governance, and clear, auditable data lineage
- Build collaborative partnerships with architects, technical leads, and key individuals within other functional groups
- Participate in code review and test solutions to ensure they meet best practice specifications
- Write project documentation
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Collaborate with cross-functional teams to achieve project goals

Requirements
- At least three years of experience in data software engineering
- Expertise in Databricks and PySpark for building and managing Big Data analytics applications
- Hands-on experience with Microsoft Azure, including Azure Synapse Analytics, SQL Azure, and ADLS, for designing and deploying scalable, available, and fault-tolerant systems
- Experience in building data ingestion pipelines, data warehousing, or database architecture
- Deep understanding of data modeling concepts and experience with modern Big Data components
- Strong coding experience with Python for building data solutions
- Awareness of compliance requirements such as PI, GDPR, and HIPAA for data security and privacy
- Experience actively participating in code review and testing solutions to ensure they meet best practice specifications
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher

Nice to have
- Experience with Power BI for data visualization and reporting

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
Join our remote team as a Senior Data Software Engineer within a global leader in digital transformation and technology services. We are looking for an experienced software developer with a strong background in data engineering to join our team. The ideal candidate will be responsible for implementing reusable DataBricks components for data ingestion and data analytics. They will have experience ingesting data into a data lake via batch, streaming, and replication and making data available for reporting and predictive modeling, as well as establishing security controls, integration with data governance, and clear, auditable data lineage. The Senior Data Software Engineer will work collaboratively with architects, technical leads, and key individuals within other functional groups to develop and test solutions that meet best practice specifications.

Responsibilities
- Implementing reusable DataBricks components for data ingestion and data analytics
- Ingesting data into a data lake via batch, streaming, and replication and making data available for reporting and predictive modeling
- Establishing security controls, integration with data governance, and clear, auditable data lineage
- Collaborating with architects, technical leads, and key individuals within other functional groups to develop and test solutions that meet best practice specifications
- Participating in code review and testing solutions to ensure they meet best practice specifications
- Writing project documentation so that other developers can easily understand and use the code
- Building collaborative partnerships with architects, technical leads, and key individuals within other functional groups
- Continuously updating skills and knowledge to keep up with industry trends and best practices

Requirements
- At least three years of experience in software engineering with a focus on data engineering
- Proficiency in Python coding and PySpark development
- Experience building data ingestion pipelines, data warehouse, or database architecture
- Hands-on experience with modern Big Data components like Databricks, SQL Azure, and Microsoft Azure
- Experience in designing, deploying, and administering scalable, available, and fault-tolerant systems in a cloud environment
- Strong ability to write clean, maintainable, and well-documented code
- Experience in data modeling, a data-oriented mindset, and awareness of compliance requirements such as PI, GDPR, and HIPAA
- Excellent communication skills in spoken and written English, at an upper-intermediate level or higher

Nice to have
- Experience with Power BI, Azure Synapse Analytics, and ADLS

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn
Company Description
We are more than 100,000 people who, day after day, dedicate our passion and energy to fulfilling our purpose of "Simplifying and Enjoying Life More." That purpose now lives through our physical and digital ecosystem across all our brands and countries (Chile, Peru, Colombia, India, Mexico, Brazil, China, Argentina, and Uruguay). We value different points of view because we understand that diversity is the key to our innovation. We want to go beyond any limit, constantly challenge ourselves, have fun doing what we love, and leave a mark in everything we do. And we know there is only one way to do it: as ONE TEAM. Join us to live the #ExperienciaFalabella and strengthen our transformation TOGETHER!

Role Mission
Store Operations Associate / Sodimac HC El Bosque / Full-time, 44 hours per week

Role Duties
Would you like to learn from the best and develop your full potential? Then this opportunity is for you! At Sodimac we are looking for our next Store Operations Associate to join the team at our Sodimac HC El Bosque store. Don't wait any longer and apply!

What challenges will you find in this position?
- Receive, inspect, and sort products, moving them to the points of sale, to the areas designated for product pickup or home delivery, or to the DAP and/or technical service area for pickup by carriers; manage the associated documentation, record the data in the system, produce daily operation reports, and follow up on any discrepancies found.
- Contact customers about product availability issues for home delivery, offer alternatives, and carry out the solution agreed with the customer.
- Perform reconciliation at the points of sale, changing and renewing price strips, technical sheets, self-service materials, product tear-offs (signage, graphics, informational banners) or service materials, and any other point-of-sale instrument, with or without a planogram.
- Keep store equipment and the work area clean and orderly, moving beams and waste from store and backroom operations to the corresponding compactors and deposits.
- Maintain and renew product samples and displays at the points of sale, complying with the company's safety standards.
- Set up and take down catalogs for the store's commercial campaigns.

What do we expect from you? (minimum requirements)
- Be over 18 years old
- Completed secondary education

Job demands
- Work performed mostly standing

Requirements
- Be over 18 years old
- Hold a completed secondary education certificate
- Availability to work rotating shifts
- Experience in restocking, dispatch, and/or receiving is desirable

Benefits
With the #ExperienciaFalabella you will enjoy:
- Up to 5 days off (in addition to vacation, per calendar year)
- Birthday off
- Christmas Day or New Year's Day off
- Discount on purchases at Sodimac stores and on Falabella.com
- Discount on purchases at zero-margin cost plus VAT
- National holidays and Christmas bonuses
- Christmas gift box
- Vacation bonus
- Marriage or civil union bonus
- Birth or adoption bonus
- Schooling bonus
- Possibility of study scholarships for employees
- Academic excellence award for employees and their children
- Lunch (company cafeterias)
- Supplementary health, dental, and catastrophic insurance (cost shared with the company)
- Life and disability insurance
Descripción Empresa Somos más de 100 mil personas que, día a día, dedicamos nuestra pasión y energía a cumplir nuestro Propósito de “Simplificar y Disfrutar más la Vida”. Propósito que hoy vive a través de nuestro ecosistema físico y digital en todas nuestras marcas y países (Chile, Perú, Colombia, India, México, Brasil, China, Argentina y Uruguay). Valoramos las distintas miradas porque entendemos que la diversidad es la clave de nuestra innovación. Queremos ir más allá de cualquier límite, desafiarnos constantemente, divertirnos haciendo lo que nos gusta y dejar huella en lo que hacemos. Y sabemos que existe sólo una forma de hacerlo: como UN EQUIPO. ¡Súmate a vivir la #ExperienciaFalabella y así potenciar JUNTOS nuestra transformación! Misión Del Cargo Operativo/a Tienda / Sodimac HC El Bosque / Jornada FT44hrs Funciones Del Cargo ¿Te gustaría aprender con los mejores y desarrollar todo tu potencial? Entonces, ¡esta oportunidad es para ti! En Sodimac buscamos a nuestro/a próximo/a Operativo/a de Tienda para sumarse al equipo de nuestra tienda en Sodimac HC El Bosque ¡No pierdas más tiempo y postula! ¿Qué desafíos encontrarás en esta posición? Recibir, revisar y clasificar los productos, trasladándolos a los puntos de venta, a las áreas destinadas para retiro de productos o despacho a domicilio o bien al área de DAP y/o servicio técnico para ser retirado por transportistas, administrando para ello la documentación asociada y registrando la data en el sistema, realizando reportes de la operación diaria y las gestiones correspondientes en caso de encontrar discrepancias. Contactar a clientes por problemas con disponibilidad de productos para despacho a domicilio, ofrecer alternativas y gestionar solución acordada con el cliente. 
- Reconcile the points of sale, changing and renewing price tags, spec sheets, self-service displays, product tear-offs (signage, graphics, informational banners) or service materials, and any other point-of-sale instrument, with or without a planogram.
- Keep the store equipment and work area clean and tidy, moving beams and operational waste from the store and back of store to the respective compactors and disposal areas.
- Maintain and refresh product samples and displays at the points of sale, complying with the company's safety standards.
- Set up and take down catalog displays for the store's commercial campaigns.

What do we expect from you? (minimum requirements)
- Be over 18 years old
- Have completed secondary education
Job demands: work performed mostly standing.

Requirements
- Be over 18 years old
- Hold a completed secondary education certificate
- Be available to work rotating shifts
- Experience in restocking, dispatch, and/or receiving is desirable

Benefits: With the #ExperienciaFalabella you will enjoy:
- Up to 5 days off (in addition to vacation, per calendar year)
- Your birthday off
- Christmas Day or New Year's Day off
- Discount on purchases at Sodimac stores and on Falabella.com
- Discount on purchases at cost plus VAT
- Fiestas Patrias and Christmas bonuses
- Christmas hamper
- Vacation bonus
- Marriage or civil-union bonus
- Birth or adoption bonus
- Schooling bonus
- Possible study scholarships for employees
- Academic excellence award for employees and their children
- Lunch (company cafeterias)
- Supplementary health, dental, and catastrophic insurance (cost shared with the company)
- Life and disability insurance
Become a key player in our remote team by taking on the role of a Senior Data Software Engineer dedicated to a project centered on Databricks workflows, APIs, analytical development, and data engineering. Your primary focus will be building and maintaining complex data pipelines and enabling smooth deployment to production. You will also craft end-to-end production solutions while working with cross-functional teams to deliver high-quality results.

Responsibilities
- Engage in the Agile development process (Scrum) to design and implement innovative features
- Prioritize and uphold high-quality standards throughout each development phase
- Ensure the reliability, availability, performance, and scalability of systems
- Troubleshoot and maintain code in large, complex environments
- Work with Developers, Product and Program Management, and senior technical professionals to deliver customer-centric solutions
- Provide technical input for new feature requirements in collaboration with business owners and architects
- Stay abreast of industry trends and emerging technologies for continuous improvement
- Champion the execution of solutions aligned with business objectives
- Guide and mentor less experienced team members, fostering skill growth and career development
- Participate in code reviews, ensuring adherence to standards and code quality
- Collaborate with cross-functional teams to achieve project objectives
- Actively contribute to architectural and technical discussions

Requirements
- A minimum of 3 years of hands-on experience in data software engineering
- Proficiency in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment in production
- Familiarity with Azure DevOps, GitHub (or alternative platforms), and version control for efficient project management
- Ability to develop comprehensive end-to-end production solutions
- Solid experience on one or more cloud platforms such as Azure, GCP, or AWS
- Proven track record of building resilient data pipelines
- Ability to integrate disparate components into solutions that span multiple systems
- Excellent spoken and written English, at an upper-intermediate level or higher

Nice to have
- Experience with REST APIs and Power BI

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek, and LinkedIn
Join our remote team as a Senior Data Software Engineer. We are seeking a hands-on, deeply technical engineer to collaborate closely with development peers, product leadership, and other technical staff to create innovative, impactful solutions. This role offers an opportunity to contribute significantly to the design, development, and optimization of features in a dynamic Agile environment, with a focus on Databricks workflows, APIs, and data engineering.

Responsibilities
- Design and develop new features using the Agile development process (Scrum)
- Prioritize and ensure high-quality standards at every stage of development
- Guarantee reliability, availability, performance, and scalability of systems
- Maintain and troubleshoot code in large-scale, complex environments
- Collaborate with Developers, Product and Program Management, and senior technical staff to deliver customer-centric solutions
- Provide technical input for new feature requirements, partnering with business owners and architects
- Ensure continuous improvement by staying abreast of industry trends and emerging technologies
- Drive the implementation of solutions aligned with business objectives
- Mentor and guide less experienced team members, helping them enhance their skills and grow their careers
- Participate in code reviews, ensuring code quality and adherence to standards
- Collaborate with cross-functional teams to achieve project goals
- Actively contribute to architectural and technical discussions

Requirements
- At least 3 years of production experience in data software engineering
- Expertise in Databricks, Microsoft Azure, PySpark, Python, and SQL for development and deployment to production
- Experience with Azure DevOps, GitHub (or similar platforms), and version control for effective project management
- Ability to develop end-to-end production solutions
- Strong experience on one or more cloud platforms such as Azure, GCP, or AWS
- Experience building robust data pipelines
- Ability to tie components together into solutions that span multiple systems
- Excellent spoken and written English, at an upper-intermediate level or higher

Nice to have
- Experience with REST APIs and Power BI

We offer
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling, and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek, and LinkedIn