Job description
Stratos supports clients in transforming their information systems. We operate in high-stakes environments and drive operational and digital performance.
We are seeking a Snowflake Data Engineer to design and deploy scalable, high-performance data platforms in demanding contexts (tech scale-ups, SMEs/mid-caps in transition).
Your mission
Working alongside a Stratos Partner, you will contribute to some or all of the following workstreams:
- Scope business priorities and structure data requirements.
- Design modern data architectures on Snowflake.
- Develop high-performance ELT pipelines; maintain and continuously optimize them.
- Establish data governance (data quality, security, cross-entity sharing).
- Automate deployments (CI/CD, Infrastructure as Code) and monitor data flows.
- Coach internal teams or integrators and build structured documentation.
Profile
- At least 5 years’ experience in data engineering within a cloud environment.
- Proven, hands-on experience delivering Snowflake projects to production.
- Pragmatic and business-impact oriented (comfortable working under time constraints toward KPI-driven objectives).
Key Technical Skills
- Mastery of Snowflake (performance, pipelines, storage, security).
- Advanced SQL and Python scripting.
- Strong understanding of ELT workflows and how to productionize them (tools such as dbt, Fivetran, Matillion, etc.).
- Snowflake integration with AWS/Azure/GCP.
- DevOps/CI/CD practices: GitLab, Jenkins, Terraform, etc.
- Knowledge of governance and security (RBAC, masking, data sharing, etc.).
Nice to Have
- Snowflake certifications (SnowPro Core/Advanced).
- Experience with Spark/Scala.
- Familiarity with MDM tools, data catalogs, or governed-data solutions.
Engagement Details
- 2–3 days per week on average.
- Fully remote possible, with occasional travel to Paris.
- Direct collaboration with the project team (Data Analyst, Project Partner).