
#NEXTDIGITAL2020 - Hadoop Administrator

Date: 08-Feb-2021

Location: Singapore

Company: Singtel Group

About NCS

NCS is a leading information and communications technology (ICT) and communications engineering services provider across the Asia-Pacific region. We are headquartered in Singapore and a wholly owned subsidiary of the Singtel Group. We have in-depth domain knowledge and unique capabilities that create business value for customers. We offer a broad range of services, including consulting, systems development and integration, business process outsourcing, infrastructure management and solutions, and technology solutions.


What is the opportunity?

  • NCS is looking for a Big Data Hadoop Administrator with 5+ years of experience to maintain and administer the Hadoop data lake cluster. This includes setting up a Big Data platform on cloud or on-premises infrastructure and overseeing ongoing administration of Hadoop infrastructure and systems.
  • The Big Data Hadoop Administrator will work closely with business users, project managers, and technical teams such as data centre engineers, network infrastructure teams and source system data owners to commission the platform and automate data acquisition and cleansing, in support of analytics and AI initiatives.


What will you do?

  • Plan, design and set up Big Data platforms in customer data centres or on the cloud
  • Align with the systems engineering team to propose and deploy new hardware and software environments, upgrades required for Hadoop, and expansion of existing environments


What do you need to succeed?

  • Possess good communication skills to understand our customers' core business objectives and build end-to-end data-centric solutions to address them
  • Good critical thinking and problem-solving abilities


The ideal candidate should:

  • Be able to scale the cluster up to meet ongoing requirements
  • Manage Big Data components/frameworks such as Hadoop, Spark, Storm, HBase, MySQL, Hadoop Distributed File System (HDFS), Pig, Hive, Sqoop, Flume, Oozie, Avro, Kafka, Kibana, Grafana, Solr, Elasticsearch, etc.
  • Handle cluster maintenance and the creation/removal of nodes using Cloudera Manager Enterprise
  • Perform POCs of new capabilities in the Hadoop platform
  • Handle performance tuning of Hadoop clusters and Hadoop routines
  • Screen Hadoop cluster job performance and handle capacity planning
  • Monitor Hadoop cluster connectivity and security
  • Manage and review Hadoop log files
  • Handle HDFS and file system management, maintenance, and monitoring
  • Partner with infrastructure, network, database, application and business intelligence teams to guarantee high data quality and availability
  • Collaborate with application teams to install operating system and Hadoop updates, patches, and version upgrades when required
  • Act as the point of contact for business and IT end-user escalations
  • Be responsible for maintaining Hadoop security services such as Sentry and Ranger
  • Hold an undergraduate or graduate degree in Computer Science or equivalent


Nice to have:

  • Experience administering cloud-based data analytics platforms such as:
    • AWS IaaS, Azure IaaS, GCP IaaS
    • Docker and Kubernetes deployment experience