- 2+ years of experience working with Java or Python
- Good knowledge of SQL
- Knowledge of data pipeline and workflow management tools (e.g. Apache Airflow)
- Solid knowledge of DevOps processes, monitoring, and tooling, as well as application runtime environments with Docker and Kubernetes
- Experience with automated CI/CD pipelines and software testing practices
- Experience with data lake and data warehouse organization
- Knowledge of columnar OLAP stores (e.g. Amazon Redshift, Snowflake, or ClickHouse)
- Knowledge of PySpark and HDFS
- Intermediate spoken English
Nice to have:
- Experience in data science or implementation of performance-critical algorithms
- Understanding of optimization algorithms
- Experience with PostgreSQL
- Knowledge of the Hadoop stack
We offer:
- Projects in Western Europe and America.
- Career and professional growth.
- Product development.
- Full cycle projects.
- Corporate-funded training (functional training, foreign languages).
- Benefits (football, basketball, swimming pool, health insurance, sports compensation, corporate presents and events).
- Young and friendly team.
- Long-term, full-time employment.
- Stable pay (100% official).
- Flexible working hours.
- Mentor’s support and guidance to help you grow as a developer.