Data Engineer with GCP/AWS and Python

  • Full time
  • Prague
  • Posted 3 weeks ago
  • Docker (junior)
  • Kubernetes (junior)
  • Linux (junior)
  • Git (junior)
  • Snowflake (junior)
  • Postgres (junior)
  • GCP (regular)
  • AWS (regular)
  • Python (regular)
  • Database design (advanced)
Get to know us better
CodiLime is a software and network engineering industry expert and the first-choice service partner for top global networking hardware providers, software providers and telecoms. We create proofs-of-concept, help our clients build new products, nurture existing ones and provide services in production environments. Our clients include both tech startups and big players in various industries and geographic locations (US, Japan, Israel, Europe).
While no longer a startup (we have 300+ people on board and have been operating since 2011), we've kept our people-oriented culture. Our values are simple:

• Act to deliver.
• Disrupt to grow.
• Team up to win.
The project and the team
Our client is a networking company building an ETL solution that gathers telemetry data and exposes it through an API. The platform is currently being extensively extended and integrated with the customer's other subsystems.
What else you should know:

• The team consists of fewer than 10 people, including an architect, a project manager, developers, DevOps engineers, and multi-tech, multi-language engineers familiar with numerous APIs, data structuring and processing techniques, and with presenting output in multiple ways depending on the business need
• We use the Scrum/Agile methodology
• Our tech stack for the project includes: GCP (BigQuery, Cloud Functions), AWS, Azure, relational and non-relational databases, application logic written in Python
• The client is based in the US

We work on multiple interesting projects at a time, so we may invite you to interview for another project if we see that your competencies and profile are well suited to it.

Your role

As a part of the project team, you will be responsible for:
• Designing and implementing features in GCP's BigQuery and Cloud Functions (Python)
• Implementing solutions suited to both OLTP and OLAP workloads
• Investigating possible bottlenecks and improving overall database performance
• Taking part in technical design discussions
• Delivering automated tests for your code
• Validating the solution with the client (demo)
• Fixing discovered bugs efficiently and effectively
• Working in an agile methodology and collaborating with the team
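To give a flavor of the day-to-day work, here is a minimal sketch of the kind of Python application logic the role involves: flattening raw telemetry records into rows ready for a BigQuery streaming insert. All field names (`device_id`, `metric`, `value`, `ts`) are hypothetical; the real schema belongs to the client's project.

```python
from datetime import datetime, timezone

def telemetry_to_rows(records):
    """Flatten raw telemetry records into one row per metric.

    Each input record is assumed (hypothetically) to look like:
        {"device_id": "...", "ts": "...", "metrics": {"cpu": 0.5, ...}}
    The output rows are plain dicts, the shape expected by the
    BigQuery client's insert_rows_json() call.
    """
    rows = []
    for record in records:
        # Fall back to the current UTC time if the record has no timestamp.
        ts = record.get("ts") or datetime.now(timezone.utc).isoformat()
        for metric, value in record.get("metrics", {}).items():
            rows.append({
                "device_id": record["device_id"],
                "metric": metric,
                "value": float(value),
                "ts": ts,
            })
    return rows
```

In a real Cloud Function this would sit behind an HTTP or Pub/Sub trigger, with the rows handed to the BigQuery client library; the sketch only illustrates the transformation step.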

Do we have a match?

As a Data Engineer you must meet the following criteria:
• Very good understanding of database design concepts and approaches
• Hands-on experience with GCP's BigQuery and Cloud Functions (or their counterparts on another cloud, such as AWS Redshift and Lambda)
• Knowledge of Python and SQL
• Experience with CI/CD tools and processes
• A basic understanding of containerization technologies, with emphasis on Kubernetes and Docker, would be a plus
• Knowledge of AWS/Azure would be a plus
• Experience with non-relational databases would be a plus
• Knowledge of Git
• Good communication skills and English (B2 level); the ability to discuss technical solutions with the team and the customer's technical representatives in order to validate them with the client

More reasons to join us

• Flexible working hours and approach to work: fully remotely, in the office or hybrid
• Professional growth supported by internal training sessions and a training budget
• Solid onboarding with a hands-on approach to give you an easy start
• A great atmosphere among professionals who are passionate about their work
• The ability to change the project you work on

To apply for this job please visit