Data Architect
5000 - 6400 €
Before taxes (gross)
Job description:
The IT Data&Analytics Architecture unit is happy to welcome a new team member. As a Data Architect, you will provide the end-to-end technical vision and hands-on guidance, ensuring that data assets, both internal and external, are stored, moved, transformed, and exposed in a way that is secure, governed, cost-efficient, and ready for analytics, AI/ML, and operational workloads. The role is primarily technical; however, you will also work with business stakeholders, translate strategy into a data architecture, and coach squads on best practices.
What are you going to do?
- Develop and continuously refine the target data-platform architecture, including data lake, data warehouse, and operational data stores aligned with enterprise architecture standards
- Maintain current state and roadmap blueprints that cover storage, processing, metadata, security, and cost optimization
- Help design and supervise robust, reusable pipelines for ingestion, transformation, and orchestration (Azure Data Factory / Databricks); a minimal sketch follows this list
- Partner with ML Engineers on Databricks MLflow, feature tables, and model deployment pipelines
- Translate complex data platform topics into business language for non-IT stakeholders
- Create artefacts (roadmaps, OKRs, etc.)
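To make the pipeline work above more concrete, here is a minimal PySpark/Delta sketch of the kind of ingestion-and-transformation step the role would supervise; the storage path, table name, and column names are illustrative assumptions, not part of the actual platform:

```python
# Minimal sketch, assuming a Databricks-style Spark session with Delta Lake available.
# The landing path, column names, and "silver" table are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ingest-example").getOrCreate()

# Ingest raw JSON files landed in the data lake (hypothetical ADLS Gen2 path).
raw = spark.read.json("abfss://landing@example.dfs.core.windows.net/orders/")

# Basic transformation: typed timestamp, deduplication, load audit column.
clean = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .dropDuplicates(["order_id"])
       .withColumn("_ingested_at", F.current_timestamp())
)

# Persist as a Delta table so analytics and ML workloads can consume it downstream.
clean.write.format("delta").mode("append").saveAsTable("silver.orders")
```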
Data Mesh
- Define the concept of "data product" for the Baltic organization in line with the group strategy
- Drive squads to build, publish, and maintain data contracts
- Monitor data contract compatibility and breaking changes (a minimal sketch follows this list)
- Liaise with Information Architects on data contract naming, lineage, and business semantics
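As an illustration of what such a compatibility check could look like, the Python sketch below compares a producer's schema against a published contract and reports breaking changes; the contract format and field names are assumptions for illustration, not the group's actual standard:

```python
# Minimal sketch of a data-contract compatibility check.
# The contract structure and field names are illustrative assumptions.
producer_schema = {"order_id": "string", "amount": "decimal", "currency": "string"}

contract = {
    "dataset": "orders",
    "version": "1.2.0",
    "fields": {"order_id": "string", "amount": "decimal", "currency": "string"},
}

def breaking_changes(contract_fields: dict, new_schema: dict) -> list:
    """Return descriptions of changes that would break downstream consumers."""
    issues = []
    for name, dtype in contract_fields.items():
        if name not in new_schema:
            issues.append(f"field removed: {name}")
        elif new_schema[name] != dtype:
            issues.append(f"type changed: {name} {dtype} -> {new_schema[name]}")
    return issues

print(breaking_changes(contract["fields"], producer_schema))  # empty list means compatible
```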
Technical Leadership
- Embed with development squads as the "data face" of architecture
- Review designs
- Troubleshoot issues
- Coach engineers on patterns and anti-patterns
- Interact with the Group data architects
Requirements:
- Microsoft Azure: ADLS Gen2, Data Factory, Databricks (incl. Delta Lake & MLflow)
- Relational: SQL Server
- Search/NoSQL: Elasticsearch (index modelling, relevance tuning)
- Programming/Scripting: Python (PySpark, Pandas, etc.) and strong SQL
- DevOps: Git, CI-CD (Azure DevOps or GitHub Actions), Terraform
Architecture Know-How regarding:
- Lakehouse and warehouse patterns; data-mesh/data-product thinking
- Streaming vs batch design; micro-(and swarm-) services
- Data modelling (star/snowflake, event-driven schemas, etc.)
AI/ML Collaboration:
- Feature-store design, ML-ready data curation, model observability, and MLOps principles
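To illustrate the MLOps side of that collaboration, the sketch below records a hypothetical training run with MLflow experiment tracking; the experiment name, feature table, and metric are illustrative assumptions rather than a description of the existing setup:

```python
# Minimal sketch of MLflow experiment tracking for model observability.
# Experiment name, feature table, and metric values are hypothetical.
import mlflow

mlflow.set_experiment("orders-churn-example")

with mlflow.start_run():
    # In practice the curated, ML-ready features would come from a governed
    # feature table; logged parameters stand in for that step here.
    mlflow.log_param("feature_table", "silver.orders_features")
    mlflow.log_param("model_type", "gradient_boosting")

    # Model training would happen here; only an observability metric is recorded.
    mlflow.log_metric("validation_auc", 0.87)
```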
Communication & Leadership skills:
- Excellent communication skills, especially in English, for working with colleagues from different countries
- Ability to influence across roles: engineers, product managers, and executives
- Workshop facilitation, documentation, mentoring, and conflict resolution
Company offers:
- Extra vacation days, annual bonus, great insurance benefits, discounts on our products for you and your family, gifts, etc.
- Strong company culture: knowledge sharing, company events, interesting speakers, and other inspiring initiatives
- Career and development opportunities
- Challenging and exciting projects with autonomy to plan own tasks
- Job location: Vilnius (flexible hybrid work model, with our office as the main workplace)
- Business trips to the Baltic and Nordic countries
City:
Vilnius
Remote work:
No
Working time:
Full-time
Valid until:
26/11/2025