52 Big Data Hadoop jobs in Vietnam
Big Data Engineer (Spark/ Hadoop/ Scala)
Posted 7 days ago
Job Description
- Work on the data pipeline infrastructure that forms the backbone of our business
- Write elegant, functional Scala code to crunch TBs of data on Hadoop clusters, mostly using Spark
- Own data pipeline deployments to clusters, on-premises or in the cloud (AWS, GCP, or others)
- Manage Hadoop clusters end to end, from security to reliability to high availability (HA)
- Build a pluggable, unified data lake from scratch
- Automate and scale tasks for the Data Science team
- Constantly look for ways to improve frameworks and pipelines, so learning on the job is a given
- Our expertise and requirements include, but are not limited to: Spark, Scala, HDFS, YARN, Hive, Kafka, distributed systems, Python, datastores (relational and NoSQL), and Airflow
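The functional map/filter/reduce style described above can be sketched in miniature. This is an illustrative example in Python rather than Scala, with no real Spark cluster; the records and field names are hypothetical stand-ins for the TB-scale datasets a production Spark job would read from HDFS:

```python
from functools import reduce

# Hypothetical records: (user_id, bytes_processed) pairs standing in
# for partitioned data a Spark job would process at scale.
records = [("u1", 120), ("u2", 300), ("u1", 80), ("u3", 50), ("u2", 10)]

def aggregate_bytes(rows):
    """Sum bytes per user with pure filter/reduce, mirroring an
    RDD-style reduceByKey in miniature."""
    keyed = ((uid, n) for uid, n in rows if n > 0)  # drop bad rows
    def merge(acc, kv):
        uid, n = kv
        acc[uid] = acc.get(uid, 0) + n
        return acc
    return reduce(merge, keyed, {})

totals = aggregate_bytes(records)
print(totals)  # {'u1': 200, 'u2': 310, 'u3': 50}
```

In real Spark/Scala code the same shape would be a `map`/`filter` chain ending in `reduceByKey`, distributed across the cluster.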
Big Data Engineer (Spark/ Hadoop/ Scala)
Posted 11 days ago
Job Description
- Work on the data pipeline infrastructure that forms the backbone of our business
- Write elegant, functional Scala code to crunch TBs of data on Hadoop clusters, mostly using Spark
- Own data pipeline deployments to clusters, on-premises or in the cloud (AWS, GCP, or others)
- Manage Hadoop clusters end to end, from security to reliability to high availability (HA)
- Build a pluggable, unified data lake from scratch
- Automate and scale tasks for the Data Science team
- Constantly look for ways to improve frameworks and pipelines, so learning on the job is a given
- Our expertise and requirements include, but are not limited to: Spark, Scala, HDFS, YARN, Hive, Kafka, distributed systems, Python, datastores (relational and NoSQL), and Airflow
Big Data Engineer (Spark/ Hadoop/ Scala)
Posted 18 days ago
Job Description
- Work on the data pipeline infrastructure that forms the backbone of our business
- Write elegant, functional Scala code to crunch TBs of data on Hadoop clusters, mostly using Spark
- Own data pipeline deployments to clusters, on-premises or in the cloud (AWS, GCP, or others)
- Manage Hadoop clusters end to end, from security to reliability to high availability (HA)
- Build a pluggable, unified data lake from scratch
- Automate and scale tasks for the Data Science team
- Constantly look for ways to improve frameworks and pipelines, so learning on the job is a given
- Our expertise and requirements include, but are not limited to: Spark, Scala, HDFS, YARN, Hive, Kafka, distributed systems, Python, datastores (relational and NoSQL), and Airflow
Senior Big Data Engineer
Posted today
Job Description
- Research, build, and optimize the current data pipeline for the best performance
- Design efficient architectures to store and analyze petabytes of data
- Lead large-scale projects and mentor other developers
- Implement complex ETL data pipeline workflows
- Devise smart data formats that serve the product's functionality while minimizing cost
- Develop tools to help data scientists and the Machine Learning team
**Requirements**:
- Bachelor's degree or equivalent industry experience
- At least 5 years of work experience as a Database Engineer, preferably in a global company
- Strong experience with SQL queries and DBMSs such as PostgreSQL, SQL Server, MySQL, and MariaDB
- Strong experience with big database engines such as ClickHouse, Bigtable, BigQuery, and Redshift
- Excellent English, both oral and written
- Motivated, with a high desire to learn
- Results-driven, with a strong commitment
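The SQL/DBMS skills listed above boil down to query shapes like the one below. This is a minimal sketch using Python's built-in `sqlite3` as a stand-in for PostgreSQL or MySQL; the table and column names are hypothetical:

```python
import sqlite3

# In-memory SQLite stands in for a production DBMS; schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, amount) VALUES (?, ?)",
    [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
)
# Aggregate revenue per customer -- the bread-and-butter query shape
# behind most reporting workloads.
rows = conn.execute(
    "SELECT customer, SUM(amount) FROM orders GROUP BY customer ORDER BY customer"
).fetchall()
print(rows)  # [('alice', 150.0), ('bob', 75.5)]
```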
Senior Data Engineer - Big Data Analytics
Posted today
Job Description
Key responsibilities:
- Design, build, and maintain scalable data pipelines and ETL/ELT processes.
- Develop and manage data warehouses and data lakes.
- Optimize data infrastructure for performance and cost efficiency.
- Ensure data quality, integrity, and security.
- Collaborate with data scientists and analysts to support their data needs.
- Implement data governance and best practices.
- Monitor and troubleshoot data-related issues.
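The data-quality responsibility above often starts with a validation gate in each pipeline step. A minimal sketch, with hypothetical field names, might look like:

```python
def validate_rows(rows, required=("id", "ts")):
    """Split rows into valid/invalid based on required non-null fields --
    a minimal data-quality gate of the kind an ETL step might apply
    before loading into a warehouse."""
    valid, invalid = [], []
    for row in rows:
        if all(row.get(k) is not None for k in required):
            valid.append(row)
        else:
            invalid.append(row)  # in practice, route to quarantine/alerting
    return valid, invalid

batch = [{"id": 1, "ts": "2024-01-01"}, {"id": None, "ts": "2024-01-02"}]
good, bad = validate_rows(batch)
print(len(good), len(bad))  # 1 1
```

Real pipelines typically back this with a dedicated framework and monitoring, but the split-and-quarantine shape is the same.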
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of experience in data engineering or a similar role.
- Expertise in big data technologies (Spark, Hadoop, Kafka).
- Experience with cloud data platforms (Snowflake, Databricks, Redshift).
- Proficiency in SQL and programming languages (Python, Scala).
- Strong understanding of data modeling and database design.
- Excellent problem-solving and analytical skills.
Data Engineer
Posted today
Job Description
(Salary: Negotiable)
What you’re going to do
- Design, construct, install, test, and maintain data management systems
- Build data transformations, data models, dashboards, and reports
- Ensure that all systems meet the business/company requirements as well as industry practices
- Integrate up-and-coming data management and software engineering technologies into existing data structures
- Develop set processes for data mining, data modeling, and data production
- Research new uses for existing data
- Employ an array of technological languages and tools to connect systems together
- Install/update disaster recovery procedures
- Recommend different ways to constantly improve data reliability and quality
**Position**: Staff/Specialist
**Employment type**: Full-time
**Benefits**:
We are the sharp-minded IT experts who tackle the trickiest software and security challenges. With more than 630 employees in our locations in Zurich (HQ), Bern, Lausanne, Budapest, Lisbon, Singapore, and Ho Chi Minh City, we make the digital business of our clients work.
As a great team, we empower each other to share, grow and succeed. The unique Adnovum spirit across locations stands for helping each other at any time, having an open door and contributing to an appreciating and trustful atmosphere. We always enjoy having a laugh, a coffee or a drink together!
Apart from our unique «one Adnovum» spirit, we offer a solution-oriented engineering culture with flat hierarchies, which gives you the opportunity to contribute with your opinions and ideas. We embrace flexible working, like the possibility to work part-time and a hybrid work model. Your continuous education and development are key to us. Therefore, we actively encourage and support individual training opportunities.
We offer
- Different customer projects using technologies like Java 8, Java EE, EJB, JPA, Hibernate, Spring, JUnit, Mockito, Eclipse RCP, WebServices (RESTful, SOAP), JavaScript, Cordova, JSF, Angular (2), HTML5, CSS 3, JQuery, etc.
- Project assignments according to your skill set and development goals
- Working side by side with highly skilled and experienced software engineers
- Collaboration with colleagues in Switzerland, Hungary, Portugal and Singapore
- Friendly working atmosphere in a well-equipped and professional IT environment
- Long-term and stable job with flexible working hours
- A competitive salary plus a performance based bonus, a premium healthcare plan and free English classes
**Minimum education**: Bachelor's degree
**Job requirements**:
What we’re looking for
- Bachelor’s degree in computer science or similar
- Min. 3 years’ proven experience as a Data Engineer or similar
- Proficient in Data Modelling, Data Architecture, ETL, Data warehousing, Data Lake
- Proficient in Linux/Unix and shell scripting as well as in functional programming languages
- Proficient in one or more scripting languages (e.g. Python, R)
- Experience in Apache Hadoop-based analytics covering data processing, access, storage, governance, security, and operations
- Experience with cloud-based Big Data technologies
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Prior data streaming experience with Spark/Python
- Knowledge in deploying microservices
- Creative thinking backed by strong analytical and problem-solving skills
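The streaming experience asked for above usually centers on windowed aggregation. As a framework-free sketch in plain Python (a real implementation would use Spark Structured Streaming; the event shape here is hypothetical):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs=60):
    """Bucket (timestamp_secs, key) events into fixed non-overlapping
    windows -- the core idea behind a tumbling-window aggregation in
    Spark Structured Streaming, sketched without any framework."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (65, "view"), (70, "click")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'view'): 1, (60, 'click'): 1}
```

A production version adds event-time watermarks and late-data handling, but the window-assignment arithmetic is the same.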
**Gender**: Male/Female
**Industry**: IT - Software, Data Analytics, SQL
Bachelor's degree
Under 1 year
Data Engineer
Posted today
Job Description
You should have excellent communication skills to work with product owners to understand data requirements, and to build ETL procedures that ingest data into the data lake. You should be an expert at designing, implementing, and operating stable, scalable, low-cost solutions that move data from production systems into the data lake.
If you are self-directed and comfortable supporting the data needs of multiple teams, systems and products, we would like to meet you.
**YOUR MAIN DUTIES**:
- Manage and expand our existing AWS big data infrastructure
- Build ETL processes within the AWS environment
- Perform data extraction, cleaning, transformation, and flow
- Design, build, launch and maintain efficient and reliable large-scale batch and real-time data pipelines with data processing frameworks
- Integrate and collate data silos in a manner which is scalable, compliant and cost efficient
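The ETL duties above frequently take the form of small event-driven functions. A minimal sketch of an AWS Lambda-style handler in Python; the event shape and field names are hypothetical, and a real function would persist results via boto3 rather than return them:

```python
import json

def handler(event, context=None):
    """Sketch of a Lambda-style ETL step: parse incoming JSON records,
    keep well-formed ones, and report what would be loaded."""
    cleaned = []
    for rec in event.get("records", []):
        try:
            row = json.loads(rec)
        except json.JSONDecodeError:
            continue  # a real pipeline would route this to a dead-letter queue
        if "id" in row:  # hypothetical required field
            cleaned.append(row)
    return {"loaded": len(cleaned), "rows": cleaned}

result = handler({"records": ['{"id": 1}', "not json", '{"x": 2}']})
print(result["loaded"])  # 1
```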
**YOUR ATTRIBUTES**:
- English proficiency is a must
- 2+ years of experience in designing and developing solutions using AWS services such as Lambda, Glue, Athena, etc.
- Good knowledge of Python programming is a must
- Hands on working knowledge of Relational / NoSQL databases such as MySQL, PostgreSQL
- Experience with DB administration in a production environment; performance/scaling concepts and tuning best practices
- Experience working in agile development teams using Scrum methodologies
- Able to work autonomously and within a team
- Must be organized, efficient, and have good attention to detail
- Must be able to show initiative to get a job done with little / no supervision
**WHAT WE OFFER**:
- English speaking multi-cultural environment
- Flexible working time
- Competitive salary and benefits
- Premium medical insurance package (option to extend to family members)
- Annual employee’s health-check
- Generous annual leave days and paid sick leave days
- Bi-Annual Performance Review
- English classes / Toastmasters Clubs
- Various trainings on soft-skills and best practices
- Annual company trips, year end party and periodic team-building activities
**Job Types**: Full-time, Contract, Permanent
Data Engineer
Posted today
Job Description
Adnovum Vietnam
Python Data Analyst English
- Etown 2 Building, 364 Cong Hoa, Tan Binh, Ho Chi Minh - Flexible - 9 hours ago
**3 Reasons to Join the Company**:
- Competitive salary + bonus
- Premium healthcare
- Free English classes
**Job Description**:
**What you’re going to do**:
- Design, construct, install, test, and maintain data management systems
- Build data transformations, data models, dashboards, and reports
- Ensure that all systems meet the business/company requirements as well as industry practices
- Integrate up-and-coming data management and software engineering technologies into existing data structures
- Develop set processes for data mining, data modeling, and data production
- Research new uses for existing data
- Employ an array of technological languages and tools to connect systems together
- Install/update disaster recovery procedures
- Recommend different ways to constantly improve data reliability and quality
**Job Requirements**:
**What we’re looking for**:
- Bachelor’s degree in computer science or similar
- Min. 3 years’ proven experience as a Data Engineer or similar
- Proficient in Data Modelling, Data Architecture, ETL, Data warehousing, Data Lake
- Proficient in Linux/Unix and shell scripting as well as in functional programming languages
- Proficient in one or more scripting languages (e.g. Python, R)
- Experience in Apache Hadoop-based analytics covering data processing, access, storage, governance, security, and operations
- Experience with cloud-based Big Data technologies
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Prior data streaming experience with Spark/Python
- Knowledge in deploying microservices
- Creative thinking backed by strong analytical and problem-solving skills
**Why You'll Love Working Here**:
We are the sharp-minded IT experts who tackle the trickiest software and security challenges. With more than 630 employees in our locations in Zurich (HQ), Bern, Lausanne, Budapest, Lisbon, Singapore, and Ho Chi Minh City, we make the digital business of our clients work.
As a great team, we empower each other to share, grow and succeed. The unique Adnovum spirit across locations stands for helping each other at any time, having an open door and contributing to an appreciating and trustful atmosphere. We always enjoy having a laugh, a coffee or a drink together!
Apart from our unique «one Adnovum» spirit, we offer a solution-oriented engineering culture with flat hierarchies, which gives you the opportunity to contribute with your opinions and ideas. We embrace flexible working, like the possibility to work part-time and a hybrid work model. Your continuous education and development are key to us. Therefore, we actively encourage and support individual training opportunities.
**We offer**
- Different customer projects using technologies like Java 8, Java EE, EJB, JPA, Hibernate, Spring, JUnit, Mockito, Eclipse RCP, WebServices (RESTful, SOAP), JavaScript, Cordova, JSF, Angular (2), HTML5, CSS 3, JQuery, etc.
- Project assignments according to your skill set and development goals
- Working side by side with highly skilled and experienced software engineers
- Collaboration with colleagues in Switzerland, Hungary, Portugal and Singapore
- Friendly working atmosphere in a well-equipped and professional IT environment
- Long-term and stable job with flexible working hours
- A competitive salary plus a performance based bonus, a premium healthcare plan and free English classes
Data Engineer
Posted today
Job Description
Carousell Group is the leading online classifieds marketplace for secondhand goods in Southeast Asia, with a mission to inspire the world to sell and buy more sustainably. Chotot Data Engineering is a sub-team of Carousell Group Data Engineering, focused on building a self-service data infrastructure platform. Our team is dedicated to creating a robust platform that allows users to easily ingest, access, and transform data into valuable insights using familiar and comfortable tools. We are currently seeking an excellent Data Engineer to join our team; you will have the opportunity to work on group-level projects that deliver meaningful value and contribute to our mission in Southeast Asia.
**RESPONSIBILITIES**
- Develop and maintain data exploration, visualization platform, customer data (CDP) as well as experimentation platform to support business needs and enable data-driven decision making
- Develop and maintain MLOps tools, platforms for data scientists and ML engineers to automate, standardize, and manage ML projects
- Design solutions and build self-serve data infrastructure platforms for internal data users such as data ingestion, transformation, streaming processing, event data management. The goal is to improve the efficiency and effectiveness of using data across the group through self-service capabilities
- Manage common data infrastructures components, such as orchestration tool, logging and monitoring, CI/CD, data governance
- Responsible for Cloud cost control via cloud Finops best practices
- Build, maintain and efficiently scale complex data ETL, high-throughput real-time and offline data pipeline
**Requirements**
Must-have:
- 3+ years of experience in data engineering roles with BSc or MSc degree in Computer Science
- Experience building platforms for internal use, with a strong focus on self-service capabilities and efficient data processing
- Familiarity with SQL/BigQuery, Kafka, Kubernetes, Docker
- Strong proficiency in at least one of the following programming languages: Python, Scala
- Have experience using the Flask framework
- Excellent written and spoken English skills
- Ability to quickly learn and adapt to new technologies.
- Strong attention to detail, self-motivated, and responsible.
**Additional Information**
- Nice-to-Have:
- Experience with Golang
- Strong scripting ability in Bash
- Familiarity with designing and operating robust distributed systems
Data Engineer
Posted today
Job Description
We're seeking a talented Data Engineer to join our growing team! You will play a crucial role in designing, building, and maintaining data pipelines across two major cloud platforms: Azure and AWS. Your expertise will ensure our data is clean, accessible, and ready for analysis.
**What you'll bring**
- Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
- Minimum 3 years of experience as a Data Engineer (or similar role).
- Proven experience in designing, developing, and implementing data pipelines.
- Strong understanding of data warehousing and data lake concepts.
- Proficiency in Python and JavaScript for data manipulation and scripting.
- Experience with both Azure (Logic Apps, Function Apps, Blob Storage) and AWS (Step Functions, Lambda, S3, EC2, SNS) cloud platforms.
- Experience with GitHub Actions for CI/CD automation.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills in English.
**Championing diversity, equity, and inclusion**
**How we look after you**
We help take care of your today and tomorrow with industry-leading benefits, support, and services that look after your holistic health and wellbeing. We're also champions of life balance and offer flexible arrangements that work for you (role and location dependent). We're always looking for new ways of working that bring out our best, which leads to unexpected ideas. So here, you'll experience a sense of belonging, and discover autonomy, freedom, and ownership as you work alongside talented people you enjoy sharing knowledge with.
- We're proud to say we're an equal opportunity employer and welcome all applicants for employment without attention to race, colour, religion, sex, sexual orientation, gender identity, national origin, veteran, age, disability status_ or any other protected characteristic._ Should you need reasonable accommodations during the recruitment process, please let us know so that we can do our best to set you up for success.