Big Data Engineer
Macquarie Group
Manila

Here at Macquarie Bank we are building the Digital Bank of the future and we are doing it with our customers by our side.

We understand that banking in the future will be predominantly digital and are building our digital platforms from the ground up to match.

We use technology as a catalyst to bring people together and help them realise their life milestones through our products and solutions which are designed by humans for humans.

We are putting our customers first and changing how the world interacts with banks.

We are searching for passionate technologists who want to develop cutting-edge applications built on beautifully crafted, easily shipped and reusable code to join us here in our Engineering teams.

Our teams are some of the most advanced in the bank. We actively look to use emerging, cutting-edge technologies, we work very differently to others, and our results speak for themselves:

  • We were the first Australian bank to use Kubernetes at scale in production for all digital channels
  • We built the first Australian bank Open Banking API platform
  • Canstar awarded us the Innovation Excellence Award 2017 for our new digital banking offering
  • We are recognised for partnering with FinTechs across Australia to provide the core banking infrastructure they need
  • We are regularly featured as an example architecture at Silicon Valley tech events

We need a Big Data Engineer to help build the next generation of data technology at Macquarie Banking & Financial Services.

We are looking for a learner with a growth mindset to help us build a data vault on AWS. The role will use Talend and other ETL tools to provide data both internally and externally, using distributed systems technologies.

It will require some production support and on-call time as part of a cross-functional agile team. We want someone whose core strength is as a data developer / modeller, along with quality assurance and analysis capabilities.

Essential Skills and Experience

  • Data Warehousing / ETL concepts (DataStage or Talend), or experience working on similar projects
  • Strong Linux / Unix skills
  • Understanding of AWS, Hadoop and Spark cloud computing concepts
  • Oracle RDS / Strong SQL
  • Python
  • Big data querying tools like Hive
  • Working in an Agile environment
  • DevOps tools like JIRA, Bamboo, Ansible, Git (Stash or Bitbucket), Nexus, SVN

Desirable Skills and Experience

  • Spark Programming
  • Scala Programming
  • NoSQL databases like Cassandra / MongoDB
  • Docker, Kubernetes, OpenShift
  • Deep understanding of AWS ELB, EMR, EBS would be a plus
  • Security Concepts and best practices
  • Presto, Big SQL, PL/SQL
  • Java
  • Data pipeline tools like Apache Oozie, Luigi or Airflow
  • Maven, SBT

If this sounds like you, apply now!

The Corporate Operations Group (COG) brings together specialist support services including workplace, human resources, market operations and technology.

COG's purpose is to drive operational excellence through business-aligned services with a focus on quality, cost and risk.

COG comprises the following divisions: Business Improvement and Strategy, Business Services, Human Resources, Market Operations, and Technology.
