
Senior Big Data Engineer

Date: 25-Jul-2019

Location: Macquarie Park, Australia

Company: Singtel


The Senior Big Data Engineer is responsible for developing and automating Big Data ingestion, transformation and consumption services; adopting new technology; and ensuring modern operations in order to deliver consumer-driven Big Data solutions.

The role

  • Implement requests for the ingestion, creation and preparation of data sources
  • Develop and execute jobs to import data from external sources periodically or in (near) real time
  • Set up streaming data sources to ingest data into the platform
  • Deliver the data sourcing approach and data sets for analysis, including data staging, ETL, data quality and archiving
  • Design a solution architecture to meet business, technical and user requirements
  • Profile source data and validate that it is fit for purpose
  • Work with the Delivery Lead and Solution Architect to agree on pragmatic means of data provision to support use cases
  • Understand and document end-user usage models and requirements

The perks

We offer all kinds of benefits, such as:

  • Work collaboratively in an open, agile environment with flexible working hours and location
  • Discounts with over 400 companies Australia-wide (technology, retail, home, fashion and more!)
  • Mobile and broadband staff discounts
  • Onsite facilities at Macquarie Park, including a gym, GP, mini-mart and cafes
  • Training, mentoring and further learning opportunities
  • Staff buses to and from Epping and Wynyard

About you

Preferred skills and experience include:

  • Bachelor's degree in maths, statistics, computer science, information management, finance or economics
  • 3–5 years' experience integrating data into analytical platforms
  • Experience with ingestion technologies (e.g. Sqoop, NiFi, Flume), processing technologies (Spark/Scala) and storage (e.g. HDFS, HBase, Hive)
  • Experience in data profiling, source-target mappings, ETL development, SQL optimisation, testing and implementation
  • Expertise in streaming frameworks (Kafka/Spark Streaming/Storm) is essential
  • Experience managing structured and unstructured data types
  • Experience in requirements engineering, solution architecture, design, and development / deployment
  • Experience in creating big data or analytics IT solutions
  • Track record of implementing databases, data access middleware, and high-volume batch and (near) real-time processing
  • A self-starter with the ability to work independently and multitask across several activities against critical deadlines in a high-pressure environment

About us

At Optus, we don’t sit back and let the future happen to us – we’re out there making it.  By expanding into new technology and relentlessly improving every day, we’re working to create a better tomorrow for all Australians.

Optus believes in the strength of a vibrant, diverse and inclusive workforce where backgrounds, perspectives and life experiences of our people help us innovate and create strong connections with our customers.


Heads Up!
Due to the fast-paced nature of our business, vacancy close dates may change, so make sure you apply today!
