We are looking for a skilled Data Engineer with a proven track record of designing, building, and supporting data pipelines from multiple source systems. As a Data Engineer, you’ll contribute to projects across the company by providing custom end-to-end analytics integrations and improving efficiency through automated workflows. You’ll be a wizard with systems abstractions and a puzzle-master at building modular, reusable components that are critical to manageably supporting hundreds of pipelines as we scale.
In this role, you’ll work in close collaboration with analysts, product managers, and engineers to gather requirements and problem-solve as one team, then translate those requirements into robust, scalable solutions that work within the overall data architecture.
The candidate will be based in Europe and will be required to communicate with colleagues in fluent Ukrainian and English.
Responsibilities:
- Design, develop, and implement a large-scale, high-volume, high-performance data warehouse
- Build, deploy, and support both batch and real-time, fault-tolerant, resilient data pipelines
- Participate in peer development and code review; comfortably iterate, optimize, and proactively help solve problems
- Develop datasets and models to meet the requirements of analysts and key stakeholders
- Research and implement AI advancements that automate the generation of analytical insights
- Play a key role in maintaining data integrity, security, and governance
Required Qualifications:
- 3+ years of experience working in a data engineering capacity
- Experience with data modeling in large-scale OLAP data warehouses such as Google BigQuery
- Proficiency with object-oriented programming, REST APIs, and Python/R data science libraries
- Comfortable writing/debugging complex SQL scripts and ETL pipelines
- Experience building data engineering pipelines with Airflow (Astronomer, Google Cloud Composer, etc.)
- Experience with real-time distributed stream-processing frameworks such as Apache Kafka
- Experience working with large, unstructured datasets and cloud storage architecture
- Microservice and scalability-based mindset
- English level Upper-intermediate+
Preferred Qualifications:
- Exposure to any/some/all of the following: Looker (LookML), dbt, Docker, CI/CD systems, Snowplow
- Experience optimizing SQL scripts and data models for distributed databases such as CockroachDB
- Interest in blockchain technology and video games
Location: Ukraine or Portugal.
We offer:
- Team of like-minded professionals who share your passion for games;
- Work in an international product company — Mythical Games is a resident of Diia City;
- Competitive financial reward;
- Health care starting on your first working day;
- 22 paid vacation days, paid sick leaves, and other personal days in accordance with the company’s internal policies;
- Flexible working hours;
- Referral bonuses;
- Financial support and PTO in case of special occasions governed by the internal policies of the company.