iGBA
Data Engineer

11 SEP 2025

WHAT YOU’LL DO:
Design, develop, and maintain ETL/ELT pipelines to transform raw, multi-source data into clean, analytics-ready tables in Google BigQuery, using tools such as dbt for modular SQL transformations, testing, and documentation.
Integrate and automate affiliate data workflows, replacing manual processes in collaboration with the relevant stakeholders.
Proactively monitor and manage data pipelines using tools such as Airflow, with proper alerting and retry mechanisms in place.
Emphasize data quality, consistency, and reliability by implementing robust validation checks.
Build a Data Consistency Dashboard (in Looker Studio, Power BI, Tableau, or Grafana) to track schema mismatches, partner anomalies, and source freshness, with built-in alerts and escalation logic.
Ensure timely availability and freshness of all critical datasets, resolving latency and reliability issues quickly and sustainably.
Control access to cloud resources, implement data governance policies, and ensure secure, structured access across internal teams.
Monitor and optimize data infrastructure costs, particularly related to BigQuery usage, storage, and API-based ingestion.
Document all pipelines, dataset structures, transformation logic, and data contracts clearly to support internal alignment and knowledge sharing.
Build and maintain postback-based ingestion pipelines to support event-level tracking and attribution across the affiliate ecosystem.
Collaborate closely with Data Scientists and Product Analysts to deliver high-quality, structured datasets for modeling, experimentation, and KPI reporting.

WHAT WE EXPECT FROM YOU:
Strong proficiency in SQL and Python.
Experience with Google BigQuery and other GCP tools (e.g., Cloud Storage, Cloud Functions, Composer).
Proven ability to design, deploy, and scale ETL/ELT pipelines.
Hands-on experience integrating and automating data from various platforms.
Familiarity with postback tracking, attribution logic, and affiliate data reconciliation.
Skilled in orchestration tools like Airflow or similar.
Experience with visualization tools such as Looker Studio, Power BI, Tableau, or Grafana for building data quality monitoring and business dashboards.
Experience with Git for version control and with Docker.
Exposure to iGaming data structures and KPIs is a strong advantage.
Strong sense of data ownership, documentation, and operational excellence.
Good communication skills when working with different stakeholders.
Upper-intermediate English language proficiency.
Do you want to know some details about this position?
Valeriia will help!
YOUR JOURNEY WITH US:
Step 1: CV and a short questionnaire.
Step 2: Pre-screen with a recruiter.
Step 3: Test task.
Step 4: Interview.
Step 5: Final interview.
Step 6: Reference check & Job offer!
WHAT WE OFFER:
28 business days of paid time off.
Flexible hours and the possibility to work remotely.
Medical insurance and mental health care.
Compensation for courses and trainings.
English classes and speaking clubs.
Internal library and educational events.
Outstanding corporate parties and team-building events.
Haven’t found a vacancy that suits you? Maybe we will find something to offer you.
For CV / career questions
cv@boosta.co
For all other questions
pr@boosta.co