Big Data Engineer (12-month contract)
Singapore
Responsibilities:
• Develop data processing pipelines for ingestion, modelling, analysis, mining and reporting within the
Enterprise Big Data Lake
• Write and maintain the code for the system's core modules
• Develop POCs and build data pipeline architecture within the software's overall technical framework
• Work closely with other teams to ensure timely delivery of assignments
Requirements:
• Possess good communication skills to understand our customers’ core business objectives and
build end-to-end data-centric solutions to address them
• Good critical thinking and problem-solving abilities
Must-have:
• Experience building large-scale enterprise data pipelines using commercial and/or open-source Big
Data platforms, whether Hadoop-based platforms from vendors such as Hortonworks/Cloudera and MapR,
or NoSQL platforms such as Cassandra, HBase, DataStax, Couchbase, Elasticsearch, Neo4j, etc.
• Hands-on experience with Spark, Scala, Impala, Hive SQL and Apache NiFi, as needed to build and
maintain complex queries and streaming and real-time data pipelines
• Data modelling and architecture skills, including a strong foundation in data warehousing concepts,
data normalisation, and dimensional data modelling (e.g. for OLAP)
• Undergraduate or graduate degree in Computer Science or equivalent
Chin (Reg No. R21100141)
We regret that only shortlisted candidates will be notified.