
## Big Data Developer (#NEXTDIGITAL2020)

Date: 29-Mar-2021

Location: Singapore

Company: Singtel Group

About NCS
NCS is a leading information and communications technology (ICT) and communications engineering services provider across the Asia-Pacific region. We are headquartered in Singapore and a wholly owned subsidiary of the Singtel Group. We have in-depth domain knowledge and unique capabilities that create business value for customers. We offer a broad range of services, including consulting, systems development and integration, business process outsourcing, infrastructure management and solutions, and technology solutions.

What is the opportunity?
NCS is looking for a big data engineer to design and implement the storage and data flow solution for Big Data platforms. This includes the setup of a Big Data platform, development of related data integration and management processes, and ensuring that the platform can handle the volume, velocity and variety of large amounts of structured and unstructured data.
The big data engineer will work closely with business users, project managers, technical teams such as data centre engineers and network infrastructure teams, and source system data owners to commission the platform and automate data acquisition and cleansing, sustaining analytics and AI initiatives.
The role requires the ability to translate business and technical requirements into data interfaces and data transformation jobs, and to design data models that power data lake solutions and support AI/ML initiatives. To support this, you may also need to establish data management processes such as data security and privacy classification, and advise our clients on the collection, storage and consumption of information in their organisations.


What will you do?
•    Plan, design and setup Big Data platforms in customer data centres or on the cloud
•    Design and implement relevant data models in Big Data and NoSQL platforms 
•    Build data pipelines that bring in information from source systems, then harmonise and cleanse the data to support analytics initiatives for core business metrics and performance trends
•    Work closely with project managers and technical leads to provide regular status reporting, and support them in refining issues/problem statements and proposing/evaluating relevant analytics solutions
•    Bring your experience and ideas to effective and innovative engineering, design and strategy
•    Work in interdisciplinary teams that combine technical, business and data science competencies, delivering work using waterfall or agile software development lifecycle methodologies
•    The range of accountability, responsibility and autonomy will depend on your experience and seniority, including:
o    Contributing to our internal networks and special interest groups
o    Mentoring to upskill peers and juniors


The ideal candidate should possess:
•    Good communication skills to understand our customers' core business objectives and build end-to-end data-centric solutions to address them
•    Good critical thinking and problem-solving abilities
•    Prior experience building large-scale enterprise data pipelines using commercial and/or open-source Big Data platforms from vendors such as Hortonworks/Cloudera or MapR for Hadoop-based platforms, or NoSQL platforms such as Cassandra, HBase, DataStax, Couchbase, Elasticsearch, Neo4j, etc.
•    Strong knowledge of data processing frameworks and query languages such as Spark, Scala, Impala, Hive SQL and Apache NiFi, necessary to build and maintain complex queries, streaming and real-time data pipelines
•    Good appreciation of, and operational experience with, Big Data administrative tools and skillsets (e.g. Linux shells, Apache Ambari, YARN, Ranger) to build scalable and resilient data platforms
•    Data modelling and architecture skills, including a strong foundation in data warehousing concepts, data normalisation, and dimensional data modelling techniques such as OLAP
•    Undergraduate or graduate degree in Computer Science or an equivalent field

•    Experience with other aspects of data management, such as data governance, metadata management, archival and data lifecycle management
•    Large-scale data loading experience moving enterprise or operational data from source systems to new applications or data analytics solutions
•    Experience leveraging cloud-based data analytics platforms such as:
o    AWS serverless architectures using Lambda, DynamoDB and EMR
o    Azure Data Lake, HDInsight
o    GCP BigQuery/BigTable, Cloud Dataprep/Dataflow/Dataproc



If you would like to be part of the winning team that does great work, apply today!


To All Agencies:
Agencies must have a valid fee agreement in place, and they must have been assigned the specific requisition to which they submit resumes by the Talent Acquisition team. Any resume submitted outside of this process will be deemed the sole property of Singtel Group, and in the event a candidate submitted outside of this policy is hired, no fee or payment of any kind will be paid.