About the role:
Would you like to be part of DHL IT Services? Our Data Lake Platform team is looking for a Senior DevOps Engineer star! You will develop and operate Kubernetes clusters as part of the DHL Data Lake ecosystem, which sits at the heart of DHL's digitalization effort. You will join a platform DevOps team on a mission to catapult DHL into a data-driven enterprise. Apply and be part of one of the most impactful and upstream IT teams in DHL.

Your work:
As part of a DevOps product team, you will take end-to-end care of the Kubernetes (Rancher) platform, which includes designing, building, operating and supporting it:
• Development: further the architecture; develop, test and automate the platform across technologies and infrastructures
• Operations: oversee daily operations, maintenance, monitoring and capacity adjustments for a 24×7, mission-critical platform
• Support: support and consult on use cases, solve incidents, coordinate changes, share on-call duty

Team & Tech
The Team: Work in a highly skilled, highly motivated international team of unique professionals. Learn very fast, have a high impact and enjoy a start-up feeling from day one. We follow Scrum principles, track work in Jira, communicate via Slack and live a high-trust DevOps culture.
Our Tech:
• Kubernetes (Rancher) platform as the underlying container infrastructure
• Red Hat Enterprise Linux as the OS layer
• Jenkins, Ansible, Git as the automation engine
• MapR (HPE) Hadoop cluster as storage, with Hive, Spark, Drill and Hue ecosystem services
• Data Science Workbench based on JupyterHub and Kubeflow
• Airflow to orchestrate data ingestion
• OpenSearch (Elasticsearch) for central logging and specific use cases
• Microsoft Azure and Google Cloud (GCP) for hybrid scenarios

You should have:
• Master's degree in computer science or a related field
• Deep knowledge of the Linux platform, preferably Red Hat Enterprise Linux
• Good expertise with Kubernetes, preferably the Rancher distribution
• Strong experience with network and infrastructure administration
• Hands-on knowledge of infrastructure automation using Ansible, Jenkins and Git
• Fair understanding of Hadoop, preferably the MapR (HPE) distribution
• Some experience with Elasticsearch
• Rudimentary knowledge of Microsoft Azure / GCP is a plus
• Ability to use English in daily communication
• Readiness to learn extremely fast in a very agile, high-pace environment

What do we offer:
• Young team of motivated professionals from across the globe, always ready to support you
• A great opportunity to advance your career to a global level, with lots of learning included
• Ongoing professional and industry-recognized technical trainings and certifications
• An extra week of holiday (25 days per year)
• 6 self-sickness days per year
• Company car / car allowance
• Full salary compensation for up to 10 days of absence due to illness per calendar year
• Meal vouchers fully covered by DHL
• Benefit cafeteria system by EDENRED
• A multicultural environment in modern offices
• Pension plan contribution
• Multisport card
• Employee Referral Program
• Company sponsorship of various sports and social clubs
• Smart casual dress code
• A huge number of internal growth opportunities

In DHL ITS Prague, you will have the opportunity to be part of over 1,650 highly skilled IT professionals of 67 nationalities. Would you like to develop your career in DHL ITS Prague, the largest data center operation in the Czech Republic? Start your application now!