Versatility is the keyword when it comes to what we do in IT. Data Competency is one of our initiatives supporting the digital transformation of the financial sector, enabling our customers to become truly data-driven companies. Our mission is to build the capabilities needed to address the financial sector's data-related needs.
As an AWS Data Engineer, you will be responsible for building and optimising our clients’ data pipelines in the AWS cloud, as well as implementing AWS ecosystem solutions. You will work on various data initiatives and ensure optimal data delivery. It is a great opportunity to show off your skills and learn new ones.
Speaking of skills, it is quite likely that, at some point, you will be using the following tools and technologies:
• Data architecture, data modelling, design patterns
• Azure Data Factory, Synapse Analytics, Event Hubs, HDInsight
• RDBMS and NoSQL databases, Azure SQL Database, Cosmos DB
• AWS S3, AWS Data Lake
• DataOps, ETL technologies
• Real-time data streaming, Spark, Airflow, Kafka, MSK
• OLTP, OLAP, DWH, data lakes
• BI & predictive analytics; AI/ML
• Python, Java, Scala, R
How exactly will you be applying all your skills and talents?
• Building scalable data processing pipelines, SQL database integrations and data services in the AWS cloud.
• Advising on the use of appropriate tools and technologies from the AWS Data ecosystem.
• Recommending potential improvements to the existing data architecture.
• Collaborating with analysts, experts and tech leads in Agile methodology to meet clients' needs.
• Addressing aspects like data privacy and security, compliance with regulations, integrity and availability, etc.
• Guiding the team with good AWS development practices.
• Defining feasible test strategies and troubleshooting failures.
Does that sound right up your alley? Let’s see if you have what it takes! And what it takes is:
• Proven, hands-on experience with the AWS Data toolset, in particular: RDS for PostgreSQL, Glue, DynamoDB, DocumentDB, ElastiCache for Redis, MSK, Lambda.
• Deep understanding of the serverless & hybrid architecture of big data pipelines.
• The ability to write useful abstractions to process similarly formatted datasets in a generic way.
• Familiarity with data-related design patterns.
• General ETL/DWH knowledge.
• Experience with Spark.
• Good command of English (min. B2).
• The ability to easily communicate with clients and team members.
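To illustrate the kind of "useful abstraction for similarly formatted datasets" mentioned above, here is a minimal, hypothetical sketch: a small composable pipeline that applies the same reusable transform steps to any iterable of dict-shaped records. All names are illustrative and not tied to any real Sollers project.

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Iterator

Record = dict  # one row of a dataset, e.g. a parsed CSV line

@dataclass
class Pipeline:
    """Applies a sequence of per-record transform steps in order."""
    steps: list  # each step: Callable[[Record], Record]

    def run(self, records: Iterable[Record]) -> Iterator[Record]:
        for record in records:
            for step in self.steps:
                record = step(record)
            yield record

def normalise_keys(record: Record) -> Record:
    # Make column names uniform so differently formatted sources line up.
    return {key.strip().lower(): value for key, value in record.items()}

def cast_amount(record: Record) -> Record:
    # Coerce a string field into a numeric type (assumes an "amount" column).
    record["amount"] = float(record["amount"])
    return record

pipeline = Pipeline(steps=[normalise_keys, cast_amount])
rows = [{" Amount ": "12.50", "Currency": "EUR"}]
print(list(pipeline.run(rows)))  # [{'amount': 12.5, 'currency': 'EUR'}]
```

Because each step is just a function on a record, the same `Pipeline` can be reused across datasets that share a shape but differ in details — only the step list changes.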
But enough about work. Let's talk benefits!
And there are many of those at Sollers. Check the graphics below and see for yourself. And if you need more info about Sollers Consulting, read all about us on our website – check the projects we have completed around the world and the integration events we enjoy together (also around the world!).
Key facts about Sollers Consulting
We are a Team of almost 900 professionals who build the Digital Future for the world’s largest insurance, banking and leasing organisations. Our history of business advisory and software implementation goes back to the year 2000. Sollers Consulting’s roots are in Europe, but the company’s footprint is visible around the world.
As an international company with offices and projects around the world, and Sollers representing more than 20 nationalities, we thrive in our multicultural environment. We guarantee you will feel like you belong here, whether you are from Poland, the West, the East or another hemisphere.