Design, develop, and maintain a robust and efficient data architecture for Data Lake/Data Warehouse
Design, develop, and maintain data workflows (data pipelines) to move data efficiently through the ETL process
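As an illustrative sketch only (the posting does not prescribe any implementation), an extract-transform-load pipeline in Python might be structured as three small stages; the record fields and in-memory "warehouse" here are hypothetical:

```python
# Minimal ETL pipeline sketch (illustrative; source, fields, and target are hypothetical).
def extract():
    # Extract: in practice this would read from a source system (database, API, files)
    return [{"name": "widget", "price": "2.50"}, {"name": "gadget", "price": "10"}]

def transform(records):
    # Transform: cast types, normalize values, and derive new fields
    return [
        {"name": r["name"].upper(), "price": float(r["price"]), "taxed": float(r["price"]) * 1.1}
        for r in records
    ]

def load(records, target):
    # Load: in practice this would write to the Data Lake / Data Warehouse
    target.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

Keeping the three stages as separate functions makes each one independently testable, which also supports the data-quality duties described below.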
Ensure data quality and consistency by implementing data testing, validation, and cleansing in the ETL process
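A minimal sketch of what a validation-and-cleansing step might look like in Python; the column names and rules are hypothetical, not part of the role description:

```python
# Hypothetical validation/cleansing step in an ETL pipeline (illustrative only).
def clean_orders(rows):
    """Drop rows that fail basic quality checks and normalize the rest."""
    cleaned = []
    for row in rows:
        # Validation: required fields must be present and non-empty
        if not row.get("order_id") or row.get("amount") is None:
            continue
        # Validation: amount must parse as a non-negative number
        try:
            amount = float(row["amount"])
        except (TypeError, ValueError):
            continue
        if amount < 0:
            continue
        # Cleansing: trim whitespace and normalize casing
        cleaned.append({
            "order_id": str(row["order_id"]).strip(),
            "customer": str(row.get("customer", "")).strip().title(),
            "amount": round(amount, 2),
        })
    return cleaned

rows = [
    {"order_id": " A1 ", "customer": "alice smith", "amount": "19.99"},
    {"order_id": "", "customer": "bob", "amount": "5"},       # missing id -> dropped
    {"order_id": "A2", "customer": "carol", "amount": "-3"},  # negative amount -> dropped
]
print(clean_orders(rows))  # only the first row survives, normalized
```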
Troubleshoot and monitor data pipelines to ensure high availability and to detect and resolve errors quickly
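One common availability technique is to log failures and retry transient errors before alerting; a hedged sketch (the wrapper and retry policy are assumptions, not anything specified in the posting):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)

# Hypothetical retry wrapper for a pipeline task (illustrative only).
def run_with_retries(task, max_attempts=3, delay_seconds=1.0):
    """Run a task, retrying on failure so transient errors don't take the pipeline down."""
    for attempt in range(1, max_attempts + 1):
        try:
            return task()
        except Exception as exc:
            # Monitoring: each failure is logged for later troubleshooting
            logging.warning("attempt %d/%d failed: %s", attempt, max_attempts, exc)
            if attempt == max_attempts:
                raise  # surface to the scheduler/alerting after the final attempt
            time.sleep(delay_seconds)

calls = {"n": 0}
def flaky_task():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky_task, delay_seconds=0))  # succeeds on the second attempt
```

In practice an orchestrator such as a workflow scheduler typically provides this retry-and-alert behavior; the sketch only shows the underlying idea.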
Explore new technologies for data modeling and transformation scenarios, and provide optimal data engineering solutions
Create and maintain ETL process documentation and definitions, including data dictionaries and process flows
Experience with data modelling, ETL processes, and Big Data technologies
Experience with Data Lake and/or Data Warehouse architecture
Proficiency and experience in SQL and/or NoSQL is a must
Experience in one or more programming languages (Java, Scala, Python, etc.)