Senior Data Engineer - all genders - Google Cloud Platform
Senior Data Engineer on GCP: own end-to-end pipelines, partner with product and data teams, translate requirements into scalable data solutions using Python, Airflow, and BigQuery.
We usually respond within a day
The Role
This is a hands-on data engineering position embedded in a product-focused environment. The work spans the full data lifecycle: gathering requirements from stakeholders, designing technical solutions, and shipping reliable, scalable pipelines. Expect close collaboration with data architects, asset managers, and product managers. The role sits at the intersection of engineering and business outcomes.
What the Work Looks Like Day to Day
Translate product and business requirements into technical data solutions
Design and implement data pipelines and capabilities across product offerings
Work with data architects to ensure solutions are aligned with broader technical strategy
Identify gaps in internal processes and lead improvements
Write clean, reusable code that adheres to established engineering standards
Communicate technical decisions clearly to both technical and non-technical audiences
Technical Stack
The primary environment is Google Cloud Platform. Day-to-day tooling includes:
BigQuery for data analysis and processing
Cloud Composer / Airflow for workflow orchestration
Dataproc / PySpark for large-scale data processing
Vertex AI for machine-learning-adjacent workloads
Cloud Spanner and Cloud Run for additional platform needs
Python as the primary programming language, with GitHub Copilot integrated into the workflow
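To give a flavor of the orchestration work involved: in production these pipelines run on Cloud Composer/Airflow, but the underlying idea, tasks with dependencies executed in order, can be sketched with the standard library alone. All task names below are hypothetical and purely illustrative:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# A hypothetical pipeline modeled as a DAG, the way an
# Airflow/Cloud Composer workflow orders its tasks.
# Each key lists the tasks it depends on.
pipeline = {
    "extract_raw": set(),
    "clean": {"extract_raw"},
    "load_bigquery": {"clean"},
    "build_report": {"load_bigquery"},
}

def run_order(dag):
    """Return the tasks in a valid execution order."""
    return list(TopologicalSorter(dag).static_order())

if __name__ == "__main__":
    print(run_order(pipeline))
    # extract_raw runs first, build_report last
```

In Airflow the same shape would be expressed as operators (e.g. BigQuery and Dataproc tasks) wired together with dependency arrows; the scheduler then resolves the execution order exactly as the topological sort above does.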
What's Required
Degree in Computer Science, Engineering, or a related field, or equivalent demonstrated experience
Proven background in data engineering and architecture, including ownership of strategic technical initiatives
Strong Python skills with hands-on experience across the GCP services listed above
Familiarity with PySpark and big data processing patterns
Ability to explain complex technical concepts to varied audiences
Comfortable working in fast-moving environments where priorities shift
What Sets a Strong Candidate Apart
A track record of not just building pipelines, but improving how a team builds them: process thinking alongside technical depth. No need to be an expert in every tool listed. Intellectual curiosity and a structured approach to learning matter more than a perfect checklist match.
- Department: Python (AI/ML, data science, web via Django/Flask)
- Locations: Warsaw
- Remote status: Hybrid
- Monthly salary: PLN 18,000 – PLN 25,000
- Employment type: Full-time
- Employment level: Professionals
- Recruitment speed: 14 days