Data Engineer

  • Full time
  • Prague
  • Posted 1 week ago

Datumo

  • Kafka (nice to have)
  • Kubernetes (nice to have)
  • Docker (nice to have)
  • Airflow (junior)
  • Big Data (regular)
  • Python (regular)
  • JVM (regular)
  • Cloud (regular)
  • Apache Spark (advanced)
Datumo specializes in providing Big Data and Cloud consulting services to clients from all over the world, primarily in Western Europe and the USA. The core industries we support include e-commerce, telecommunications and life science. We create dedicated solutions using a variety of tools, and our team consists of exceptional people whose knowledge and commitment allow us to deliver highly demanding projects.

Is the world of Big Data and Cloud a place where you enjoy spending your time? Are you looking for modern technologies and people with passion and an eagerness for innovation? Take a look at Datumo: find out about our sample projects, learn what we expect of our new colleagues and what we offer them, and join us as a Big Data Engineer!

Discover our sample projects:

Greenfield project

  • life science client  
  • co-creating architecture
  • implementation of data pipelines feeding reports or ML models
  • automation of batch data processing
  • automation of infrastructure creation for cloud projects

Cloud migration project

  • large e-commerce platform
  • migration of data processing from on-premise to cloud
  • development of internal tools and libraries supporting migration
  • solving performance problems
  • cost optimization

What we expect:

Must-have:
  • 3 years of commercial experience in Big Data
  • proven record of working with a selected cloud provider (GCP, Azure or AWS)
  • good knowledge of Python and a JVM language (Java/Scala/Kotlin)
  • strong understanding of Spark or a similar distributed data processing framework
  • experience working with Hive, BigQuery or a similar distributed datastore
  • designing and implementing Big Data systems in accordance with best practices
  • ensuring the quality of the solution (automated tests, CI/CD, code review)
  • sharing knowledge with the team, mentoring less experienced team members on technical matters
  • proven record of cooperating with businesses and corporations
  • knowledge of English at B2 level and communicative Polish

Nice to have:
  • experience with Airflow or a similar pipeline orchestrator
  • knowledge of Docker and Kubernetes technologies
  • knowledge of Apache Kafka
  • experience in Machine Learning projects
  • willingness to share knowledge by speaking at conferences, writing articles for the company blog, contributing to open-source projects

What’s on offer:

  • 100% remote work (though we invite you to visit one of our offices in Warsaw or Częstochowa from time to time)
  • flexible working hours
  • onboarding with a dedicated mentor
  • the possibility to switch projects after some time
  • 20 days of paid holiday, with additional days off depending on work experience
  • individual budget for training and conferences
  • benefits: Medicover private medical care, co-financing of the Multisport card
  • opportunity to learn English with a native speaker
  • regular company trips and informal get-togethers
If you like what we do and dream of shaping this world with us, don’t wait: apply now!

To apply for this job please visit cz.talent.com.