Data Engineer

  • Full time
  • Prague
  • Posted 19 hours ago
IaC (nice to have)
Terraform (nice to have)
Scala (nice to have)
Azure Services (nice to have)
SQL (regular)
PySpark (regular)
RDBMS (regular)
Spark (regular)
ETL (advanced)
Python (advanced)
We are a team of nerds dedicated to collecting other nerds who happen to be the best in their respective fields of broadly understood IT services and analytics. Together with them we go out of our way to create IT and Analytical Hubs that bring together the best experts from the IT world to help our partners become truly data-driven entities.

Currently we are looking for a Data Engineer, as we are supporting our partner in developing a Global Analytics unit: a global, centralized team with the ambition to strengthen data-driven decision-making and to develop smart data products for day-to-day operations. The team is meant to be a nucleus radiating a data-driven, entrepreneurial culture by acting as an incubator that realizes ideas by simply doing them: creating smart data products in all areas of the business, be it sales, marketing, purchasing, logistics or any other part of the company.

The team has a lot of freedom to shape this, especially in the use of tools and technology, but also by introducing new concepts, solutions and ways of working.
If you want to:
  • take part in the development and implementation of a complex system of smart data solutions 
  • have opportunity to work on bleeding-edge projects
  • have a chance to see your visions come true
  • be a member of an international and diverse team of ground-breaking data scientists, data engineers, BI developers, UX designers and analytics translators
  • carry out projects which address real business challenges
  • have a real impact on the projects you work on and the environment you work in
  • have a chance to propose innovative solutions and initiatives
  • have the opportunity and tools to grow, develop and drive your career forward,
it’s probably a good match.
Moreover, if you like:
  • flexible working hours
  • casual working environment and no corporate bureaucracy
  • having access to benefits such as Multisport and Luxmed
  • working in a modern office in the centre of Warsaw with good transport links, or working remotely as much as you want
  • a relaxed atmosphere at work where your passions and commitment are appreciated
  • vast opportunities for self-development (e.g. online courses and a library, experience exchange with colleagues around the world, partial funding of certifications),
it’s certainly a good match!
If you join us, your responsibilities will include:
  • structuring the entire process of data extraction, transformation and storage, using serverless Azure services to deploy models and analytical solutions
  • handling activities such as quality assurance, data migration and integration, and solution deployment to ensure the business gets the best value
  • writing and maintaining ETL processes in Python, designing database systems and developing tools for real-time and offline analytic processing
  • implementing, maintaining and further developing the functionality of Python packages for ETL processes, data lineage and operator inputs, including building the underlying logic
  • troubleshooting software and processes for data consistency and integrity 
  • integrating large scale data from a variety of sources for business partners to generate insights and make decisions 
  • designing and implementing data flows
  • writing unit and integration tests for Python modules
  • participating in mission critical processes of the data pipeline
  • technical support in understanding business problems and designing smart data products

We expect:

  • significant commercial experience in a similar position
  • strong data analytics skills using Python
  • excellent software engineering skills (including unit testing, integration testing, OOP)
  • proficiency with Python and/or Scala (ideally both)
  • experience working with large datasets through Spark and RDBMS
  • solid knowledge of PySpark, with the ability to write Spark applications and analyze data in a distributed environment
  • very good SQL skills
  • fluency in extracting information from databases
  • ability to write clean, efficient, documented and scalable code
  • experience working in organizations with an agile culture
  • team-player attitude
  • fluent English, as you will communicate in English almost all the time
Nice to have:
  • good working knowledge of Azure services
  • experience in building and releasing Infrastructure as Code, with working knowledge of tools such as Terraform
If interested, please let us get to know you by sending your CV using the “Apply” button.

Please add to your CV the following clause:
“I hereby agree to the processing of my personal data included in my job application by hubQuest spółka z ograniczoną odpowiedzialnością, located in Warsaw, for the purpose of the current recruitment process.”

If you want to be considered in the future recruitment processes please add the following statement:
“I also agree to the processing of my personal data for the purpose of future recruitment processes.”
