Versatility is the keyword when it comes to what we do in IT. Data Competency is one of our initiatives to support the digital transformation of the financial sector, enabling our customers to become truly data-driven companies. Our mission is to provide the capabilities needed to address the financial sector's data-related needs.
As a Snowflake Engineer, you will be responsible for building and optimizing our clients' data pipelines and data flows. You will work on various data initiatives and ensure optimal data delivery. You are an ideal candidate if you have experience with the Snowflake ecosystem. If you enjoy building data systems from scratch and modifying existing ones, you will have those opportunities at Sollers.
Key facts about Sollers Consulting:
We are a team of almost 700 professionals who build the Digital Future for the world’s largest insurance, banking, and leasing organizations. Our history of business advisory and software implementation goes back to the year 2000. Sollers Consulting’s roots are in Poland, but the company’s footprint is visible around the world. Working with us means taking part in many projects worldwide (Poland, Germany, Austria, Switzerland, UK, USA, Canada, Japan). Our teams are located in Warsaw, Cologne, Gdansk, Tokyo, Lublin, Wroclaw, Paris, and Poznań. Being agile across our company and our projects enables us to play an active role in industries with high digitalization needs.
Tools & technologies used on projects:
• Data architecture, data modeling, design patterns
• RDBMS and NoSQL databases
• DataOps, ETL technologies
• Real-time data streaming, Spark, Airflow, Kafka
• OLTP, OLAP, DWH, data lakes
• BI & predictive analytics; AI/ML
• Python, Java, Scala, R
You will have an opportunity to:
• Build reliable data streaming architectures, data models & automation for installation and configuration management.
• Advise on the use of appropriate tools and technologies.
• Guide the team with good Snowflake development practices.
• Develop automated ETL/ELT flows and transformations, feeding the data lake from source systems (Airflow, Spark).
• Recommend improvements and enhancements to existing data warehouse and reporting solutions.
• Collaborate with analysts, experts and tech leads in Agile methodology to meet clients' needs.
• Address aspects such as data privacy and security, compliance with regulations, and data integrity and availability.
• Monitor performance, optimize data lake provisioning processes, and troubleshoot failures.
We bet on you, so we expect you to:
• Have proven, hands-on experience with Snowflake.
• Understand DWH concepts, data lakes, data mesh, and data streaming.
• Be proficient with SQL and at least one data-wrangling programming language (Python, Java, Scala, R).
• Have commercial experience with ETL/ELT, data migration, and cloud deployments.
• Speak English (min. B2).
• Communicate effortlessly with clients and team members.
• Be able to work in the European Union.
We offer you:
• The opportunity to develop professionally at a fast pace.
• A clear career path and salary progression.
• An individual learning & development budget.
• German and French language classes.
• Comprehensive health care, life insurance, and travel insurance.
• Home office policy.
• Family support: wedding gifts, generous layette for newborns, family parties.
• A relocation package (if you are moving from another city).