HCL is setting up large delivery centres for digital transformation projects in Vietnam, and as part of this we are opening a delivery centre in Ho Chi Minh City for the data transformation initiative of a major global bank. The work involves migrating legacy data onto modern digital data technology hosted on Google Cloud Platform. This is an exciting project in which employees get the opportunity to experience a modern data tool stack. The project is delivered using agile ways of working, making it an excellent opportunity to work for a global leader.
As a Hadoop big data engineer, you will develop, operate, and drive a scalable and resilient data platform based on the Hadoop ecosystem to address the business requirements:
- Ensure industry best practices around data pipelines, metadata management, data quality, data governance and data privacy
- Design and implement business-specific large-scale data processing pipelines
- Work with complex data structures; manipulate and cleanse data, and perform transformations to derive insights from data
- Ingest data from files, streams, and databases; process the data with PySpark, Kafka, Hive, Hive LLAP, etc.
- Develop efficient software for multiple use cases built on the platform, leveraging Spark and other big data technologies
- Provide high operational excellence guarantees for the platform
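To give a flavour of the cleanse-and-transform work described above, here is a minimal sketch in plain Python. The record fields (`customer_id`, `name`, `balance`) are hypothetical; in the actual pipeline this kind of logic would typically run as PySpark DataFrame transformations over data ingested from files, Kafka streams, or databases, rather than over in-memory dicts.

```python
from typing import Optional

def cleanse(record: dict) -> Optional[dict]:
    """Drop records missing a key field; trim and normalise the rest.

    Hypothetical schema for illustration only -- the real data model
    belongs to the bank's legacy sources.
    """
    if not record.get("customer_id"):
        return None  # unusable without a key
    return {
        "customer_id": str(record["customer_id"]).strip(),
        "name": (record.get("name") or "").strip().title(),
        "balance": float(record.get("balance") or 0.0),
    }

raw = [
    {"customer_id": " 42 ", "name": "  ada lovelace ", "balance": "100.5"},
    {"customer_id": None, "name": "ghost"},  # dropped: no id
]
# Keep only records that survive cleansing
clean = [r for r in (cleanse(rec) for rec in raw) if r is not None]
```

In a Spark job the same rules would be expressed as column expressions or a mapped function over a DataFrame, so they scale across the cluster instead of a single process.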