Danske Bank A/S Lietuvos filialas

Senior Data Engineer in Everyday Banking Individuals Tribe

4.280 - 6.420 €
Gross (before taxes)

Area

We are Everyday Banking Individuals, and our goal is to support the Forward28 strategy and the digital transformation it delivers to the services we provide to our customers.

This position is centred on customer journey solutions in the cards, accounts, and payments domain. The tribe meets our customers' evolving needs by providing embedded offerings and services such as card services, digital wallets, accounts and payment solutions, and many other innovative services that go beyond traditional banking while promoting sustainable practices. You will collaborate with a global team based in India, Lithuania, and Denmark.

You will work with advanced technology such as AWS, directly impacting customers by leveraging data, customer insights, and behaviour, with opportunities to explore machine learning and AI in the future.

  • Mission-critical challenges: You will be working at the forefront of developing and managing a system that is the lifeblood of everyday banking customer journey products. Your contributions will directly impact how cards, accounts, and payments services help our customers across the Nordic countries. You will have the opportunity to work on complex, high-stakes projects.
  • Global collaboration: Our team is distributed across Aarhus, Copenhagen, and India. We expect to grow in the coming months, and you will have the chance to collaborate with team members from Denmark, India, and Lithuania.

Depending on your experience and knowledge, we may offer you a different seniority for the role.

Mission

  • Design and Develop Data Lake Solutions: Create and implement data lake architectures on AWS, ensuring they are scalable, secure, and efficient
  • Migrate Legacy Systems: Plan and execute the migration of existing legacy systems to AWS, ensuring minimal disruption and data integrity
  • ETL Processes: Design, develop, and maintain ETL processes using tools like AWS Glue, Talend, or AWS Data Pipeline to ensure smooth data flow and transformation
  • Program and Script: Utilize Python, Spark, Kafka, and advanced SQL to develop data processing solutions and automate workflows
  • Database Management: Work with AWS databases, including Aurora, RDS, and S3, managing security and access controls effectively
  • Agile Development and CI/CD: Collaborate using Agile methodologies and employ AWS CloudFormation, AWS CodeBuild, and AWS CodePipeline for continuous integration and continuous deployment
  • DevOps Practices: Engage with DevOps practices and use deployment and orchestration tools to streamline processes and improve efficiency

Skills

  • Bachelor’s or equivalent degree in computer science, engineering, or a related field, with at least 3 years of hands-on experience, ideally supplemented with certifications from AWS, Microsoft, GCP, etc.
  • Ability to design and develop data lake solutions in a public cloud (ideally AWS), as well as prior experience migrating legacy systems to the public cloud
  • Hands-on experience with ETL processes and tools such as AWS Glue, Talend, or AWS Data Pipeline
  • Proficiency in Python, Spark, Kafka, and advanced SQL; proficiency with at least two technologies such as Aurora, RDS, or S3, including security and access management
  • Experience with at least one of the following: a relational database on AWS RDS (such as SQL Server), a NoSQL database such as DynamoDB, or a cloud data warehouse such as Amazon Redshift
  • Good understanding of Agile development, AWS CloudFormation templates, AWS CodeBuild, and AWS CodePipeline
  • Understanding of the Virtual Private Cloud environment, including VPC, IAM, EC2 instances, security groups, network access control lists, Elastic IPs, public and private subnets, etc.
  • Familiarity with building scalable, highly available production systems using Elastic Load Balancing and Auto Scaling across different regions and availability zones, as well as with DevOps tools and deployment and orchestration technologies
  • Upper-Intermediate English skills

We offer:

The exact salary offered will be based on your qualifications, competencies, professional experience, and the requirements of the corresponding job function (salary range from 4,280 to 6,420 EUR gross per month).

Your title in the job contract will be IT Software Engineer (Data Engineer), Senior.

Vivek Bhatnagar
City:

Vilnius

Remote work:

No

Working time:

Full time

Valid till:

2025-04-07
