• Galaxus

(Senior) Data Engineer, Analytics and AI (f/m/d), 80 – 100%

  • Permanent position
  • Immediately or by agreement
  • Home office option
Our team at Digitec Galaxus ensures that the further development of the online shop is driven by data. To strengthen the team, we are seeking a motivated Analytics Engineer to help us build out a clean data ecosystem for behavioral and online shop data.

Key information

We welcome all applications, but can only consider applications submitted online. Applications submitted by post will be disposed of for data protection reasons and will not be returned.

What you'll do

  • Architect & Orchestrate: Design, build, and manage scalable ETL/ELT data flows using Python and Dataform to transform raw behavioral data into high-performance data objects.
  • Productionalize ML: Partner closely with Senior Data Scientists to transition model prototypes into robust production pipelines, assisting in the engineering and training of ML models.
  • Data Product Innovation: Develop efficient data marts and canonical datasets that serve as the backbone for data-driven shop features, machine learning, and generative AI applications.
  • Collaborative Leadership: Act as a technical consultant for analysts and data scientists, ensuring the integration of online shop data into the broader company platform.
  • Systems Integrity: Implement robust, fault-tolerant systems and maintain clear data logic to meet the growing demands of the business intelligence domain.

What you will bring


  • Mandatory – Experience: You have 5+ years of professional experience in data engineering, with at least 2 years specifically focused on productionizing machine learning models or working within AI infrastructure.
  • Mandatory – Education: Advanced degree in Computer Science or a data-focused field (ETH/UNI/FH) or equivalent professional expertise.
  • Mandatory – Pipeline Mastery: Extensive experience in sustainable data modeling and the design of complex, high-volume data pipelines.
  • Mandatory – Technical Stack: Expert SQL skills with Google BigQuery and proficiency in data workflow tools such as Dataform, dbt, Airflow, or Dagster.
  • Mandatory – Software Engineering: Strong programming skills in Python, including experience with Git and software engineering best practices such as unit testing and CI/CD.
  • Mandatory – ML & AI Foundations: Practical experience with ML modeling technologies (e.g., PyTorch, TensorFlow, or Vertex AI) and a deep understanding of data preparation for ML tasks.
  • Mandatory – Language: Fluent English is a must; German is a plus.
  • Advantageous – Bonus Skills: Knowledge of web tracking (Snowplow/Google Analytics) or data visualization tools such as Tableau.

The benefits we offer

Values instead of empty talk

The courage to think for yourself

Uncomplicated

Flexibility instead of contortion

Fast growth

Further benefits (Switzerland)

Flexible working hours

Flexible workplace / working from home

Maternity/paternity leave

Employee offers

Holidays

Individual continuing training

Initial and continuing vocational training

Personal responsibility & freedom

Room to manoeuvre and decision-making authority

Application and contact details

Maja Crljen

Information for recruitment agencies

Only direct applications will be considered for this position.