31 Data Integration jobs in Vietnam
Data Integration Lead
Posted today
Job Description
Role Purpose:
- Leads, drives and coordinates all data & master data integrations in the scope of DBB
- Accountable for overseeing and leading all data deliverables, following the provided plan and roadmap. This includes activities such as cleansing, enrichment, migration and data governance
- Reports to the OpCo Transition Manager.
Roles & Responsibilities:
- Contributes to the creation of the integrated master data & data transition plan, with responsibility for assessing and understanding the full data and application landscape and its impact.
- Is the key point of contact for analysing data requirements and aligns with the central project team on scope definition.
- Leads and oversees all data activities, such as: cleansing, enrichment, migration and data governance
- Works with the project team to identify all areas of the D&T data landscape that cease post-project, and ensures these are covered within the HVN decommissioning plan, including applications, data architecture, and reporting
Skills & Experience:
Work experience
5-8 years
Demonstrated experience
Data management within a digital and technology environment, using Master data management tools.
Other fields of expertise
Familiarity with business process management would be useful, as would operational experience with IT systems.
Others
- Good planning and organising skills, with an eye for detail. A team player, who works collaboratively and delivers through others. Comfortable working within a clearly defined framework and someone who follows the rules to deliver high quality results.
Data Analyst- FCO Data Integration
Posted today
Job Description
About the job:
NAB Financial Crime Operations are on a significant transformation journey to uplift our data and technology ecosystem to protect our customers and meet our regulatory compliance obligations. This includes building centralised strategic financial crime data assets that will enable financial crime processes and controls. This role is critical to the successful delivery of NAB's financial crime strategy, working closely with Product Owners and subject matter experts to ensure the integrity of data being ingested into financial crime systems. In addition to delivering these strategic assets, you will also support project delivery by using your skill set to remove blockers related to technology and data deliverables. The person we are seeking is special. You will be expected to have the drive to continue the journey D&TI are on to deliver strategic assets and support the data revolution occurring in FC.
Main Responsibilities:
•Strong experience with analysing and manipulating complex, high volume data sets using SQL and/or Python
•Ability to work with operational teams to translate business requirements into data and tech deliverables
•Ability to manage and communicate across a broad range of stakeholders
•Ability to act quickly to identify and resolve potential blockers and issues in the data delivery space to ensure speed to outcome
•Responsible for providing and gathering data requirements for large scale data engineering or analytics projects
•Assisting with writing and executing UAT and BVT for data use cases (using SQL and/or Python)
•Trusted advisor and SME in technical requirement elicitation for data/tech or reporting delivery
•Partner with Product Owners, Business Analysts, Engineering teams on high priority data and tech projects across the FC transformation program to deliver and resolve issues at pace
•Providing recommendations and solutions to problems, leveraging the data analysis gathered
•Clearly communicating and documenting data quality issues and their impact on outcomes
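As a hedged illustration of the SQL/Python-based UAT and reconciliation checks mentioned in the list above, here is a minimal sketch that compares a staging table against a target table. It uses sqlite3 from the standard library as a stand-in for whatever database the team actually uses, and the table and column names are hypothetical.

```python
# Minimal UAT-style reconciliation sketch between a staging table and a target
# table. sqlite3 is only a stand-in database; table names are hypothetical.
import sqlite3

def reconcile(conn: sqlite3.Connection, source: str, target: str, key: str) -> dict:
    cur = conn.cursor()
    checks = {}
    # 1. Row counts should match between source and target.
    src_count = cur.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = cur.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    checks["row_count_match"] = (src_count == tgt_count)
    # 2. No source records should be missing from the target.
    missing = cur.execute(
        f"SELECT COUNT(*) FROM {source} s "
        f"LEFT JOIN {target} t ON s.{key} = t.{key} WHERE t.{key} IS NULL"
    ).fetchone()[0]
    checks["missing_in_target"] = missing
    # 3. The business key should never be NULL in the target.
    checks["null_keys_in_target"] = cur.execute(
        f"SELECT COUNT(*) FROM {target} WHERE {key} IS NULL"
    ).fetchone()[0]
    return checks

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE stg_transactions (txn_id TEXT, amount REAL);
        CREATE TABLE fc_transactions  (txn_id TEXT, amount REAL);
        INSERT INTO stg_transactions VALUES ('T1', 10.0), ('T2', 25.5);
        INSERT INTO fc_transactions  VALUES ('T1', 10.0), ('T2', 25.5);
    """)
    print(reconcile(conn, "stg_transactions", "fc_transactions", "txn_id"))
```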
Experience
•A strong understanding of data and technical concepts, and familiarity with big data implementation strategies.
•Knowledge of core banking data and technology systems and infrastructure, Agile software delivery lifecycle (SDLC), excellent presentation skills.
•Experience in data/tech related project delivery would be highly desirable
•Knowledge of API and UI design would be useful
Qualification Requirements
Bachelor's degree in business, data analytics, computer science, statistics or a related discipline
THE BENEFITS AND PERKS
We appreciate and reward our colleagues who do great work every day – from excelling for our customers, to taking ownership of an issue to get it resolved. Here's how we support our people with a range of exclusive benefits.
Generous compensation and benefit package
Attractive salary
- 20-day paid annual leave and 7-day paid sick leave
- 13th month salary and Annual Performance Bonus
- Premium healthcare for yourself and family members
- Monthly allowance for team activities
- Premium welcome kit and occasional gifts of appreciation
Extra benefits on your work anniversary
Exciting career and development opportunities
Large scale products with modern technologies in banking domain
- Clear roadmap for career advancement in both technical and leadership pathways
- Access to digital learning platforms such as Udemy
- Consistent and high-quality leadership training through the Distinctive Leadership program (DLP)
- Specialist capabilities and accreditations in key skill areas such as Cloud Engineering, Digital, Data, Security and SREs (Site reliability engineers)
- Sponsored English course with native teachers
Opportunity for training in Australia
Professional and engaging working environment
Hybrid working model and excellent work-life balance
- State-of-the-art & modern Agile office
- Food and beverages in the office pantry
- Employee Assistance Program to improve your physical and mental health
- Annual team activities and company events
- A solid and talented team behind you – great people who love what they do
A DIVERSE AND INCLUSIVE WORKPLACE WORKS BETTER FOR EVERYONE
We know that our people make us who we are. That's why we have built a culture of respect – where everyone feels valued and appreciated for being their true authentic selves at NAB. With our focus on inclusion and diversity, and in partnership with our Employee Resource Groups, NAB is a place where First Nations colleagues, colleagues of all genders, sexualities and ages, carers and colleagues with disability, and colleagues from all cultures, races and religions have the opportunity to thrive, connect and grow.
We are intent on providing an environment where you can work your way. Ask about our many flexible work options and please let us know if we can provide any adjustments throughout the recruitment process.
CLOUD-FIRST
NAB is undergoing an exciting "Cloud First" technology transformation by taking advantage of the latest tools and techniques used by leading technology and digital companies globally. But it's not just about the tech: we are also investing heavily in our people, so if you have an appetite to learn, grow and elevate others around you, this is the place for you.
If this excites you, let's have a chat over a cup of coffee
It's more than a career at NAB. It's about more opportunity, more moments to make a difference and more focus on you.
Your job is just one part of your life. When you bring your ideas, energy, and hunger for growth to us, you'll be recognised and rewarded for your contribution in return. You'll have our support to excel for our customers, deliver positive change for our communities and grow your career.
It's a good time to see what more you can find at NAB as a
It's more than just a career at NAB
We believe in people with ideas and dreams, and we want you to achieve your aspirations. More than just a career, NAB Vietnam offers you the flexibility to balance your work and life, the opportunity to grow as a professional and as a person, and a complete set of well-being offerings. If you have an appetite to learn, grow and elevate others around you, this is the place for you.
IT'S MORE THAN MONEY
We naturally also provide a very competitive remuneration package but a career with us is about a lot more than money. We believe in people with ideas and dreams, and we want you to achieve your aspirations. We will work together to deliver exceptional products and outcomes that push the limits of our own aspirations. Our passion for creating value and exceeding our customers' expectations means we are constantly striving to redefine our standards of excellence. You will have our backing to develop and our encouragement to explore, realize and reach your full potential.
Data Integration Engineer/ ETL Developer
Posted today
Job Description
(Salary: Negotiable)
- Design, build and optimise ingest/ETL flows using Apache NiFi or Apache Airflow (or a combination of both, depending on the batch/stream use case).
- Set up CDC ingestion from databases into Kafka (Debezium or an equivalent solution); ingest from APIs / files / logs / AI output.
- Map and standardise schemas across different sources, handling data types, character encodings, timezones and identifier normalisation.
- Build data quality checks (record count, schema drift, null ratio, referential checks) directly into the pipeline; see the sketch after this list.
- Monitor & alert: track the health of NiFi/Airflow pipelines (latency, error rate, back pressure); configure retries, fallbacks and reprocessing by partition/time slice.
- Branch data flows by environment (dev/stg/prod), domain (event/log/master data) or differing SLAs.
- Work with the Data Engineering / Analytics teams to hand over standardised data into the storage zones (raw, staged, curated).
- Write pipeline documentation, operating guides and incident-recovery procedures.
- Experience in high-security environments (banking, finance, telecom, government agencies): handling masking, tokenization, network segmentation.
- Has implemented near real-time / streaming ingestion.
- Experience with Kafka Connect, Debezium, Schema Registry.
- Knows Git, Jira and basic CI/CD for NiFi/Airflow deployment (template export, infrastructure as code, Helm/Ansible/Terraform).
- Basic understanding of NoSQL (MongoDB) and object storage (MinIO/S3) for the landing zone.
- Able to work in an on-premise environment with restricted internet access and offline artifacts.
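A minimal sketch of the in-pipeline quality gate described above, assuming a micro-batch arrives as a list of dicts and using an illustrative expected schema and thresholds; a real NiFi or Airflow pipeline would run equivalent checks inside a processor or task.

```python
# Illustrative data quality gate: record count, schema drift and null-ratio
# checks on one batch. EXPECTED_SCHEMA and the thresholds are assumptions.
from typing import Mapping

EXPECTED_SCHEMA = {"event_id", "event_time", "customer_id", "amount"}

def quality_gate(records: list[Mapping], min_rows: int = 1,
                 max_null_ratio: float = 0.05) -> list[str]:
    """Return a list of violations; an empty list means the batch may proceed."""
    violations = []
    # Record count: an empty or tiny batch usually signals an upstream failure.
    if len(records) < min_rows:
        violations.append(f"record_count={len(records)} below minimum {min_rows}")
        return violations
    # Schema drift: fields appearing or disappearing relative to the agreed contract.
    observed = set().union(*(r.keys() for r in records))
    if observed != EXPECTED_SCHEMA:
        violations.append(f"schema drift: extra={observed - EXPECTED_SCHEMA}, "
                          f"missing={EXPECTED_SCHEMA - observed}")
    # Null ratio per expected column.
    for col in EXPECTED_SCHEMA & observed:
        nulls = sum(1 for r in records if r.get(col) is None)
        if nulls / len(records) > max_null_ratio:
            violations.append(f"null ratio for {col} exceeds {max_null_ratio:.0%}")
    return violations
```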
Position: Staff/Specialist
Employment type: Full-time
Benefits:
Compensation:
Competitive income, with bonuses based on performance and capability; 100% salary during the 2-month probation period.
- Periodic salary increases based on work results and business performance.
- Recognition bonuses honouring outstanding individuals each quarter/year.
- Holiday and Tet bonuses, plus other welfare benefits (weddings, funerals, sickness support, etc.).
A fast-growing startup environment with unlimited development opportunities, in-depth training and the chance to try your hand at many areas.
Other benefits:
A lifetime online English course as a welcome gift upon joining.
- Full social insurance in accordance with the law, 12 days of annual leave per year, and public holidays as regulated.
- A modern, open-plan office that encourages creativity.
- Annual company trips, lively team building, and a rich calendar of internal events.
- Work with a young, talented and dynamic team in an environment that encourages creativity, empowerment and continuous growth.
Minimum qualification: Vocational/intermediate diploma
Job requirements:
- University degree in IT / Information Systems / Computer Science or equivalent (or strong practical experience in lieu).
- At least 3 years of experience with ETL/ELT or multi-source data integration.
- Proficient in SQL; understanding of PL/SQL or stored procedures for data pre-processing where needed.
- Hands-on experience deploying Apache NiFi and/or Apache Airflow: building flows, scheduling, parameter contexts, connection pooling, error handling.
- Understands and has implemented CDC / log-based replication (e.g. Debezium, debezium-connector, wal2json, binlog) from Postgres/MySQL (or an equivalent RDBMS) into Kafka; see the sketch after this list.
- Pipeline debugging skills: reading logs, tracing messages, analysing data drift, working around restricted access.
- Good teamwork and technical communication skills (writing docs, coordinating with devs/app owners, DBAs, SecOps).
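As a rough illustration of the log-based CDC requirement above, the sketch below registers a Debezium Postgres source connector through the Kafka Connect REST API. The Connect endpoint, hostnames, credentials and table names are placeholders, and the exact configuration keys vary with the Debezium version in use.

```python
# Hedged sketch: register a Debezium Postgres connector via Kafka Connect's REST API.
# All hosts, credentials and table names below are placeholders.
import requests

CONNECT_URL = "http://kafka-connect:8083/connectors"  # assumed Connect endpoint

connector = {
    "name": "orders-cdc",
    "config": {
        "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
        "plugin.name": "pgoutput",                 # logical decoding plugin
        "database.hostname": "postgres.internal",  # placeholder host
        "database.port": "5432",
        "database.user": "cdc_user",
        "database.password": "********",
        "database.dbname": "orders",
        "table.include.list": "public.orders",     # tables to capture
        "topic.prefix": "orders-db",               # Kafka topic namespace (Debezium 2.x key)
    },
}

resp = requests.post(CONNECT_URL, json=connector, timeout=30)
resp.raise_for_status()
print("Connector registered:", resp.json().get("name"))
```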
Gender requirement: Male/Female
Industry: Software IT, Java, Python, SQL
Data Engineering Manager
Posted today
Job Description
ZALORA Group is looking for a Data Engineering Manager to join its Data Science Team. As a Data Engineering Manager, you will lead a small team of two Data Engineers to design, build, and maintain robust, scalable, and efficient data pipelines and systems that support the Finance and Commercial departments.
WHAT WILL YOU DO
Team Leadership & Management
- Lead, mentor, and develop a team of two Data Engineers, providing technical guidance and career development support.
- Manage workload allocation, set priorities, and ensure timely delivery of team objectives
Data Infrastructure & Pipeline Development
- Design, implement, and maintain scalable ETL/ELT pipelines to support Finance and Commercial data requirements.
- Ensure pipelines are reliable, efficient, and optimized for performance.
- Provide architectural oversight and ensure best practices in data engineering are applied.
Data Quality & Governance
- Establish processes to ensure data accuracy, consistency, and reliability.
- Implement monitoring, testing, and alerting mechanisms for data pipelines.
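The pipeline and monitoring/alerting bullets above could look something like the following minimal Airflow sketch (Airflow 2.x style; argument names vary slightly between versions): a daily Finance extract task with a failure callback so broken runs raise an alert. The DAG id, task logic and alerting hook are illustrative assumptions, not ZALORA's actual pipelines.

```python
# Minimal Airflow sketch: a daily extract task with retries and an alerting
# failure callback. Task logic and the alert destination are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def notify_on_failure(context):
    # Placeholder: in practice this would post to Slack, PagerDuty, email, etc.
    print(f"ALERT: task {context['task_instance'].task_id} failed "
          f"for run {context['ds']}")

def extract_finance_data(**_):
    # Placeholder for the actual extract/load step.
    print("extracting finance data...")

with DAG(
    dag_id="finance_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "on_failure_callback": notify_on_failure},
) as dag:
    extract = PythonOperator(
        task_id="extract_finance_data",
        python_callable=extract_finance_data,
    )
```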
Business Support & Collaboration
- Work closely with Finance and Commercial stakeholders to define and align on data definitions, standards, and governance practices.
- Translate business requirements into technical solutions that align with long-term data strategy.
WHAT WILL YOU BRING TO THE TEAM
- Reliable and well-documented data pipelines supporting Finance and Commercial teams.
- Consistent, high-quality datasets for reporting and decision-making.
- Reduced downtime and faster resolution of data issues.
- Clear team processes, coding standards, and documentation.
- A motivated and high-performing data engineering team.
ZALORA Engineering
ZALORA is the leading e-commerce company providing fashion throughout South East Asia. ZALORA Engineering has teams in Singapore and Vietnam to serve millions of users in Malaysia, Singapore, Hong Kong, Taiwan, Indonesia, Philippines. This regional diversity presents a lot of interesting challenges that you and your colleagues will face. You will learn a lot by tackling these challenges.
ZALORA's e-commerce platform is built with love by the engineering team. The Ho Chi Minh office, 12 Ton Dan, D4 (ZALORA Group) focuses completely on engineering and sets a high standard for software development in the field of e-commerce. This team creates the tech stack that powers ZALORA's stellar shopping experience. This ranges from the online shop and the mobile apps to ZALORA's complex logistics solutions.
The ZALORA Story
ZALORA exists for the millions of fashion consumers in Asia seeking a shopping experience focused on their unique styles, trends and fit. As Asia's leading online fashion destination, ZALORA was founded in 2012 and has a presence in Singapore, Indonesia, Malaysia & Brunei, the Philippines, Hong Kong and Taiwan. ZALORA's localised sites offer an extensive collection of top international and local brands as well as our own in-house labels across apparel, shoes and accessories for men and women. ZALORA is part of Global Fashion Group, the world's leader in online fashion for emerging markets.
ZALORA is not obligated to accept resumes from any third parties on behalf of potential candidates for any position (advertised or otherwise) by any means, unless ZALORA has executed a written agreement with such third party and has expressly requested such third party for candidate referrals. Third parties who provide unsolicited resumes of candidate(s) shall waive and forfeit all rights to claim for any placement fees or referral fees in the event that such candidate is eventually engaged or employed by ZALORA or Global Fashion Group.
Data Engineering Manager
Posted today
Job Description
About Us
Intrepid Asia is a leading Ecommerce and Digital Solutions Provider in South East Asia. We offer end-to-end omni-channel ecommerce management, a wide range of Digital Marketing Services and advanced Market Intelligence, all powered by state-of-the-art in-house technology, to our client base of leading international brands across all key marketplaces and social platforms in all 6 SEA countries. Brands love our regional presence, our excellent data-driven and growth-focused services which are enabled by the strongest team in the industry, and our advanced marketing and tech capabilities.
We are growing rapidly and as the exclusive partner of Flywheel in SEA, we offer many exciting opportunities to work with leading brands across multiple categories and key industry players. By joining us, you will work on the cutting edge of digital commerce in SEA, and experience what it takes to drive a successful ecommerce business end to end.
The Role
We are seeking a seasoned and innovative Data Engineering Manager to lead our talented team of data engineers. You will take ownership of our entire data ecosystem, ensuring the reliability, scalability, and efficiency of our data pipelines and data warehouses. Your primary mission will be to maintain and enhance our current infrastructure built on Google Cloud Platform (GCP), Airflow, dbt, StarRocks, and Google BigQuery while architecting its evolution to support our ambitious goals in AI, SaaS, and enterprise-level analytics.
The ideal candidate is a hands-on leader with a strategic mindset, a passion for technology, strong business acumen, and a relentless drive for improvement. You will be instrumental in fostering a culture of excellence, agility, and cost-conscious innovation within the team.
Key Responsibilities
- Team Leadership & Mentorship:
- Lead, manage, and mentor a team of data engineers, fostering their professional growth and cultivating a collaborative, high-performance culture.
- Champion Agile methodologies (Scrum/Kanban) to manage project priorities, deadlines, and deliverables effectively.
- Data Platform & Pipeline Management:
- Oversee the end-to-end data lifecycle, ensuring the robustness and performance of our data pipelines orchestrated with Apache Airflow and transformations managed by dbt.
- Manage and optimize our dual data warehouse strategy, leveraging StarRocks for transformation and analytics, and Google BigQuery for large-scale data warehousing.
- Champion DataOps principles, implementing best practices for CI/CD, automated testing, and infrastructure as code.
- Data Quality & Observability:
- Design and implement a comprehensive data quality framework, establishing metrics and SLAs for our critical datasets.
- Implement and manage a data observability platform to proactively monitor, alert, and troubleshoot issues across the data stack, minimizing data downtime.
- Develop and own our data governance strategy to ensure data accuracy, security, and compliance.
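As a hedged illustration of the data observability responsibilities above, the sketch below runs a simple freshness and volume check against a BigQuery table. The table name, the loaded_at column and the two-hour SLA are assumptions, and a dedicated observability platform would normally own checks like this.

```python
# Hedged freshness/volume monitor against BigQuery. Table, column and SLA are
# illustrative assumptions; credentials come from application-default auth.
from google.cloud import bigquery

FRESHNESS_SLA_HOURS = 2
TABLE = "analytics.fct_orders"   # placeholder dataset.table

def check_freshness() -> None:
    client = bigquery.Client()
    sql = f"""
        SELECT
          TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(loaded_at), HOUR) AS lag_hours,
          COUNT(*) AS row_count
        FROM `{TABLE}`
    """
    row = next(iter(client.query(sql).result()))
    if row.lag_hours is None or row.lag_hours > FRESHNESS_SLA_HOURS:
        raise RuntimeError(
            f"{TABLE} is stale: lag={row.lag_hours}h, SLA={FRESHNESS_SLA_HOURS}h")
    print(f"{TABLE} OK: lag={row.lag_hours}h, rows={row.row_count}")

if __name__ == "__main__":
    check_freshness()
```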
- Data Architecture & Strategy:
- Design and evolve our data architecture to be future-ready for enterprise-grade data services and new business initiatives.
- Collaborate closely with data scientists, analysts, product managers, and business stakeholders to understand data requirements and deliver impactful solutions.
- Drive cost-optimization initiatives across our data stack, ensuring we achieve maximum value from our cloud and technology investments.
Required Skills & Experience
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 8+ years of hands-on experience in data engineering, with a proven track record of building and maintaining large-scale data pipelines.
- 3+ years of experience in a leadership or management role.
- Expert proficiency in SQL and Python.
- Exposure to building data infrastructure to support AI/ML workflows (e.g., feature stores, MLOps).
- Deep, hands-on experience with modern data stack tools, including:
- Workflow Orchestration: Apache Airflow
- Data Transformation: dbt
- Cloud Data Warehouses: Google BigQuery is a must; familiarity with other similar data warehouse technologies (e.g. Snowflake, Azure Synapse Analytics, Databricks) is a plus.
- Modern OLAP Databases: StarRocks or similar (e.g., ClickHouse, Druid).
- Familiarity with OLTP databases like MySQL and Postgres is required.
- Solid understanding of DataOps principles, data modelling, ETL/ELT design patterns, and data architecture.
- Proven experience working in an Agile/Scrum development environment.
Preferred Qualifications (Added Advantage)
- Experience within the e-commerce and/or digital marketing landscape.
- Hands-on experience with data observability platforms (e.g., Monte Carlo, Datadog, OpenTelemetry).
- Experience designing and building multi-tenant data architectures for SaaS products.
- Extensive experience with the Google Cloud Platform (GCP) ecosystem.
What we offer:
- Excellent and competitive compensation package
- Professional and open international working environment – culturally integrating the best of all cultures to take the best of each and ensure we build an energetic, commercial, and fun working atmosphere.
- You are one of the pioneers of a new and cutting edge Flywheel product underpinned by strong tech. You have a key position in the Intrepid Insights team working on a unique product solving complex data challenges.
- Ample opportunity for personal and professional development, both on the job and through regular training (Ecommerce topics, technical skills, soft skills and leadership training) made available on our proprietary learning platform Intrepid University
- You will work with many brilliant co-workers who are movers and shakers of the industry, as well as with leading brand and ecosystem partners across all categories to shape their presence across all ecommerce platforms in the years to come, and will have a broad view on the latest developments in the South East Asian e-commerce ecosystem
Note: we will not be accepting any unsolicited resumes or CVs from headhunting or recruitment agencies at this point. Any CVs or profiles shared with us will not be entertained, and in the event of dispute, Intrepid will not be liable for any material compensation to third parties
Data Engineering Lead
Posted today
Job Description
We are seeking a Data Engineering Lead to architect and drive scalable, secure, and resilient data infrastructure that powers our next-generation digital banking platform. You will lead a team of engineers in building robust pipelines, ensuring data quality, and solving complex financial data challenges.
This is a high-impact leadership role for someone who thrives in a fast-paced, regulated fintech environment and is passionate about data infrastructure at scale.
Key Responsibilities:
Technical Leadership
- Design and lead implementation of secure, scalable data pipelines using AWS-native services (e.g., S3, Glue, Redshift, Athena, Lambda, Kinesis…).
- Own and evolve the data architecture that supports transactional, analytical, and regulatory use cases in digital banking.
- Ensure data security, integrity, and governance, with a strong focus on regulatory and compliance standards in the financial sector.
- Solve complex data challenges including ledger reconciliation, payment processing analytics, customer profiling, and risk/compliance monitoring.
- Collaborate with backend, product, and risk teams to deliver real-time and batch data solutions for finance-specific workflows (e.g., KYC, AML, loan underwriting, credit scoring).
- Implement data quality validation, schema versioning, and lineage tracking across the entire data platform.
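One small slice of the AWS-native pipeline work described above might look like the following hedged sketch: a Lambda handler that drains a Kinesis batch and lands the raw events in S3 for downstream Glue/Athena processing. The bucket name, key layout and event shape are assumptions rather than the team's actual conventions.

```python
# Hedged sketch of a Kinesis-triggered Lambda landing raw events in S3.
# Bucket name and key layout are placeholders.
import base64
import json
import os
from datetime import datetime, timezone

import boto3

s3 = boto3.client("s3")
RAW_BUCKET = os.environ.get("RAW_BUCKET", "my-raw-events-bucket")  # placeholder

def handler(event, context):
    now = datetime.now(timezone.utc)
    records = []
    for rec in event.get("Records", []):
        payload = base64.b64decode(rec["kinesis"]["data"])  # Kinesis payloads are base64
        records.append(json.loads(payload))
    if not records:
        return {"written": 0}
    key = (f"raw/transactions/dt={now:%Y-%m-%d}/"
           f"{now:%H%M%S}-{context.aws_request_id}.json")
    s3.put_object(
        Bucket=RAW_BUCKET,
        Key=key,
        Body="\n".join(json.dumps(r) for r in records).encode("utf-8"),
    )
    return {"written": len(records), "key": key}
```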
Team & Project Leadership
- Lead and mentor a team of data engineers in delivering high-quality, production-grade pipelines and data products.
- Define and enforce best practices in data engineering processes, code quality, deployment automation, and monitoring.
- Drive cross-functional alignment between data, product, and engineering stakeholders to ensure timely delivery of data initiatives.
- Support agile delivery with strong documentation and a proactive problem-solving mindset.
Qualifications:
Education & Experience
- Bachelor's or Master's degree in Computer Science, Software Engineering, or related field.
- 5+ years of hands-on experience in data engineering, including at least 1–2 years in a tech lead role.
- Prior experience in fintech, banking, or other highly regulated industries is strongly preferred.
Technical Skills
- Deep experience with the AWS ecosystem:
- Data: S3, Glue, Redshift, Athena, DynamoDB
- Compute: Lambda, EC2, EMR, Kinesis
- Orchestration: Apache Airflow (self-hosted or managed)
- Proficient in Python for pipeline development and scripting.
- Strong SQL skills for data transformation and warehousing.
- Understanding of data lake, data warehouse, and real-time streaming architectures.
- Familiarity with data encryption, PII handling, access control (IAM, Lake Formation), and compliance (e.g., GDPR, SOC2) requirements.
Soft Skills
- Excellent communication and collaboration skills to partner effectively across technical and non-technical teams.
- Ability to break down complex fintech requirements into technical roadmaps and deliverables.
- Strong ownership mindset with a bias for action and continuous improvement.
- Comfortable leading through ambiguity in a fast-moving product-led organization.
Lead - Data Engineering
Posted today
Job Description
MoMo is the leading mobile payments provider in Vietnam, committed to improving the lives of every Vietnamese through technological innovation. As our business continues to expand, we're looking for an experienced Data Engineer to join our Data Platform team.
At MoMo, we emphasize smart, efficient, and excellent execution, with a strong focus on data quality. Our data platform delivers critical insights for:
- Business and app performance monitoring
- Machine learning products including recommendation systems, personalization, risk scoring, fraud detection, targeted promotions, and financial services
We're also building a next-generation hybrid data platform across multiple cloud providers, giving us greater control over both cost and technology
What you will do
With MoMo's AI-first mission, we are designing and building a self-serve data platform to empower both internal teams and external partners. This platform allocates resources based on users' needs to support:
- Ingesting data from diverse sources — either in batch or streaming, using both pull and push mechanisms
- Developing and deploying resilient data pipelines across the data lake, data warehouse, and streaming systems
- Delivering high-quality, derived datasets to downstream tools such as BI solutions (e.g., Apache Superset, Google Data Studio), via multiple delivery methods including APIs, datasets, and streaming data
- Monitoring data quality throughout all data pipelines in the platform to ensure high-quality data, resulting in better decision-making, accurate reporting, and reliable machine learning outputs
- Tracking and optimising resource usage for efficiency
Additionally, we are building Data Management Systems that enable the Data Governance team and data consumers to:
- Manage the full data lifecycle within the big data platform
- Explore the MoMo data ecosystem independently
- Provide a single source of truth with high data quality to downstream consumers
- Track and manage infrastructure costs across major projects, teams, and departments
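As a hedged illustration of the batch ingestion and data quality monitoring described above, here is a minimal PySpark sketch that reads a raw drop zone, applies a simple quality gate, and publishes the batch to a curated zone. The paths, column names and threshold are illustrative assumptions, not MoMo's actual layout.

```python
# Hedged PySpark batch ingestion sketch with a basic quality gate.
# Paths, column names and the 1% null threshold are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

RAW_PATH = "s3a://datalake/raw/payments/dt=2024-01-01/"   # placeholder
CURATED_PATH = "s3a://datalake/curated/payments/"         # placeholder

spark = SparkSession.builder.appName("payments_batch_ingest").getOrCreate()

df = spark.read.json(RAW_PATH)

# Quality gate: reject the batch if the business key is missing too often.
total = df.count()
null_keys = df.filter(F.col("payment_id").isNull()).count()
if total == 0 or null_keys / total > 0.01:
    raise RuntimeError(f"quality gate failed: rows={total}, null payment_id={null_keys}")

(df.withColumn("ingest_date", F.current_date())
   .write.mode("append")
   .partitionBy("ingest_date")
   .parquet(CURATED_PATH))
```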
Your Skills and Experience
- Bachelor's degree in Computer Science, Engineering, or a related field
- A problem solver with a strong sense of ownership and accountability — not just a task executor
- 5+ years of experience working as a Data Engineer and 1+ year of experience working as a leader
- Curious and committed to lifelong learning, with a passion for solving business problems through engineering, improving service quality and usability, and maintaining a strong customer focus
- Strong foundation in computer science fundamentals, including data structures, algorithms, database systems, and data modelling techniques
- Proficient in at least one of the following languages: SQL, Python, JVM-based languages
- Experience with databases such as PostgreSQL, MySQL, ClickHouse, DuckDB, etc
- Skilled in analysing, designing, implementing, and optimising Data Vault or Dimensional Modeling for performance and cost
- Hands-on experience with infrastructure platforms — cloud-based (e.g., GCP, AWS) or on-premise — and container orchestration using Kubernetes
- Experience with data storage and processing engines like Apache Spark, Apache Flink, and StarRocks
- Experience with Google Cloud Platform or Amazon Web Services is a plus.
Why You'll Love Working Here
We are passionate about new technologies and don't follow the old outsourcing track. We love our product and are ready to sacrifice everything for it. Do you often argue with your boss? Don't worry: here at M_Service you will get a salary increase if you win the argument with your boss. We love objections. Do you hate company trips to hotels and resorts? We only conquer the most difficult and hottest roads in our team building. Do you want to join?
What you will get:
- Competitive compensation package.
- Performance-based bonus.
- Insurance package.
- Chance to work with smart people with international experience.
Our benefits:
- Attractive compensation & benefits.
- 13th month salary bonus and yearly performance bonus.
- 14 paid days off per year
- Premium health care insurance
- Great allowances (lunch, parking, birthday, happy hours, etc.)
- Salary review at least once per year based on the employee's performance and contribution.
- Outing/team-building activities (company trips, soccer, English club, running club, etc.).
- Other benefits as stipulated by Vietnamese Labor Law
- Work with experienced & strong team.
- Friendly, dynamic & flexible working environments.
Data Engineering Specialist
Posted today
Job Description
We are seeking a highly skilled Senior Data Engineer to join our client's growing data team. The ideal candidate will have extensive experience in cloud platforms (GCP, AWS), data engineering frameworks, and strong expertise in ETL processes, data modeling, and orchestration tools. You will play a key role in designing, building, and optimizing data pipelines to support analytics, reporting, and machine learning initiatives.
*This opportunity is open for Local resident candidates only.
Key Responsibilities:
● Design, develop, and maintain scalable data pipelines using modern data engineering frameworks.
● Build and optimize ETL/ELT processes for structured and unstructured data from diverse sources.
● Develop and implement data ingestion frameworks ensuring data quality, integrity, and reliability.
● Work with DBT and Apache Airflow to create and manage data transformations and orchestrations.
● Manage and optimize Snowflake data warehouses, ensuring performance and cost efficiency.
● Leverage Kubernetes to deploy, scale, and manage containerized data processing workloads.
● Utilize Python and SQL to develop efficient, scalable, and high-performing data solutions.
● Design and implement data models to support analytical and operational use cases.
● Collaborate with cross-functional teams including Data Scientists, Analysts, and Software Engineers to support business intelligence and machine learning initiatives.
● Ensure data governance, security, and compliance best practices are upheld.
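As a hedged illustration of the Snowflake-centred ELT work described above, the sketch below rebuilds a small derived table with the official Snowflake Python connector. The connection parameters and object names are placeholders, and in practice dbt models orchestrated by Airflow would typically own such transformations.

```python
# Hedged Snowflake ELT sketch: rebuild a small derived table and sanity-check
# its row count. Warehouse, database, schema and table names are placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",     # placeholder warehouse
    database="ANALYTICS",
    schema="REPORTING",
)
cur = conn.cursor()
try:
    # CREATE OR REPLACE keeps the example idempotent; dbt would normally own this step.
    cur.execute("""
        CREATE OR REPLACE TABLE daily_order_summary AS
        SELECT order_date, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM RAW.ORDERS
        GROUP BY order_date
    """)
    cur.execute("SELECT COUNT(*) FROM daily_order_summary")
    print("rows in summary:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```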
Required Qualifications:
● 5+ years of experience in Data Engineering, with a strong background in ETL/ELT development.
● Proficiency in GCP and AWS, with hands-on experience in data storage, processing, and orchestration services.
● Strong experience with DBT for data transformation and Airflow for orchestration.
● Expertise in SQL for complex querying, performance tuning, and optimization
● Hands-on experience with Snowflake, including schema design, performance tuning, and cost optimization.
● Strong programming skills in Python for data processing and automation.
● Experience working with Kubernetes for managing containerized data workloads.
● Strong data modeling skills with a focus on designing scalable and maintainable data architectures.
● Experience building and maintaining data ingestion frameworks at scale.
● Excellent problem-solving skills and ability to work in a fast-paced environment.
● Excellent English communication skills.
Preferred Qualifications:
● Experience with streaming data frameworks (e.g., Kafka, Pub/Sub, Kinesis).
● Knowledge of data security, compliance, and governance best practices.
● Familiarity with CI/CD processes and DevOps methodologies.
What We Offer:
● Competitive salary and benefits package.
● Opportunity to work with cutting-edge technologies in a data-driven organization.
● Collaborative and innovative work environment.
● Professional growth and career advancement opportunities.
If you are passionate about building scalable data solutions and have the skills we are looking for, we would love to hear from you. Apply now to be part of our client's dynamic data team.
Head of Data Engineering
Posted today
Job Description
Reporting to the CEO, you, as a visionary Head of Data Engineering, are expected to lead the data architecture and empower a fast-growing team of data/ML engineers.
Key Responsibilities
- Define and execute company data strategy
- Lead data governance, privacy, and compliance
- Build modern data infrastructure (data lakes, vector DBs, feature stores)
- Drive AI/ML model deployment and integration
- Transition analytics to real-time, intelligent systems using LLMs and RAGs
- Champion data literacy and responsible AI innovation
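As a rough, dependency-free illustration of the retrieval step behind the RAG-style analytics mentioned above: embed documents (here with a toy bag-of-words vectoriser standing in for a real embedding model), then return the most similar documents for a query by cosine similarity. A production setup would use a real embedding model and a vector database instead.

```python
# Toy retrieval sketch for RAG: bag-of-words "embeddings" plus cosine similarity.
# A real system would use an embedding model and a vector database.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Stand-in for an embedding model: bag-of-words term counts.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]  # top-k documents to pass to the LLM as context

docs = [
    "monthly revenue report for the fintech product line",
    "guide to configuring the feature store",
    "incident postmortem for the data lake outage",
]
print(retrieve("revenue for fintech", docs))
```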
Requirements
- 8+ years in data engineering / AI/ML roles; 4+ years in a leadership position, ideally having served as a Data Engineering Manager
- Experience managing, empowering and leading teams of 10+ members
- Proven success in deploying AI/ML models at scale
- Strong data architecture (Lakehouse) experience and familiarity with modern/open-source data technologies (e.g. LlamaIndex, Apache Parquet, vector databases). Data access control and security are critical.
- Experience implementing data pipelines on AWS, including optimising for cost and performance
- Strong technical and hands-on leadership; provides review and coaching to the team
- Strong strategic thinking and executive communication; excellent communication skills, including in English
- Nice to have: experience in the e-commerce, fintech, or consumer industries
Due to the high volume of applications we are experiencing, our team will only be in touch with you if your application is shortlisted.