Responsible for Extract-Transform-Load (ETL) jobs for model building, dashboarding, pilots, and scale-up; ensures data validity for every use case in the AA Department and integrates models into the BAU process.
COMMUNICATION
Internal (within VPB FC): Extract-Transform-Load (ETL) data from related departments
External (outside VPB FC): Coordinate and work with vendors and outstaff under contract with VPB FC on data engineering and advanced analytics
KEY RESPONSIBILITIES
Perform Extract-Transform-Load (ETL) jobs for model building, dashboarding, pilots, and scale-up
Ensure data validity for every use case in the AA Department
Integrate models into the BAU process
Undertake tasks assigned by the Unit Head
JOB REQUIREMENTS
Degree in Computer Science, IT, or a similar field
2+ years of hands-on experience in Big Data engineering or a similar role.
Deep experience querying databases and using programming languages (e.g. C, C++, Python, R, Scala, Java, SQL) and tools such as Tableau
Demonstrable, deep understanding of SQL and analytical data warehouses
Hands-on experience implementing ETL (or ELT) and data pipeline best practices at scale.
Familiar with SQL, NoSQL, Git, CI/CD tools, etc.
Familiar with the AWS data stack and patterns (Kafka, Glue, EMR, etc.), Spark (SQL, DataFrames, etc.), and distributed data query engines.
Experience in performance tuning and optimization of Big Data programs
Ability to work independently and learn quickly, with good communication and organization skills.