Data Competency
Versatility is the keyword when it comes to what we do in IT. Data Competency is one of our initiatives to support the digital transformation of the financial sector, enabling our customers to become truly data-driven companies. Our mission is to build the capabilities needed to address the financial sector's data-related needs.
As a Data Engineer, you will be responsible for building and optimizing our clients’ data pipelines and data flows. You will work on various data initiatives and ensure optimal data delivery. You are an ideal candidate if you are experienced in data warehousing and data wrangling. If you enjoy building data systems from scratch and modifying existing ones, you will have those opportunities at Sollers.
Key facts about Sollers Consulting:
We are a Team of over 900 professionals who build the Digital Future for the world’s largest insurance, banking and leasing organisations. Our history of business advisory and software implementation goes back to the year 2000. Sollers Consulting’s roots are in Europe, but the company’s footprint is visible around the world.
As an international company with offices & projects around the world and Sollers of 20+ nationalities, we thrive in our multicultural environment. We guarantee you will feel like you belong here, whether you are from Poland, the West, the East, or another hemisphere.
Tools & technologies used on projects:
• Data architecture, data modeling, design patterns
• RDBMS and NoSQL databases
• DataOps, ETL technologies
• Real-time data streaming, Spark, Airflow, Kafka
• OLTP, OLAP, DWH, data lakes
• BI & predictive analytics; AI/ML
• Python, Java, Scala, R
Your responsibilities and challenges:
• Build scalable data processing pipelines.
• Guide the team with good development practices.
• Find potential improvements and enhancements for existing data processing solutions.
• Advise on the use of appropriate tools and technologies.
• Recommend potential improvements to existing data architecture.
• Monitor performance and optimize data processing flows.
• Work with Business Analysts, Subject Matter Experts, and Tech Leads to ensure business requirements are adequately addressed in the design and later in the development process.
• Address aspects like data privacy and security, compliance with regulations, integrity, availability, etc.
We bet on you, so we expect you to:
• Know at least one programming language (Python, Java, Scala, or R)
• Have experience with Spark and Kafka
• Be proficient in SQL and have knowledge of NoSQL databases
• Have experience with ETL/ELT processes and building data processing pipelines
• Speak English (min. B2)
• Communicate effortlessly with clients and team members
• Be eligible to work in Poland and the European Union
We also appreciate it if you:
• Are familiar with one or more Cloud data stacks (AWS, Azure, GCP)
• Have practical experience with Snowflake and Databricks
• Have experience with pipeline orchestration tools (e.g. Airflow)
• Are familiar with containerization
• Have working experience with various databases
• Have knowledge of popular file formats (Parquet, Avro, ORC)
Equipment:
• Intel Core i5/i7 notebooks, 32 GB RAM, Win10, 2x24’’ monitors
• Ergonomic chair, designed in Japan
• Motorized height-adjustable desk
• And many, many more…