Senior Hadoop Administrator
AIG Shared Services
Valenzuela City, Philippines

Your future team

  • Our technology teams collaborate with their worldwide colleagues and partners every day to take on the challenge of providing IT support to one of the world's leading financial services firms. We're people who believe that with the right values and hard work, anything is possible.

    We know that when we're at our best, our customers can be at their best and realize their dreams and hoped-for successes.

  • The Information Technology group provides enterprise-wide IT solutions and strategic and procedural support across all of AIG's specialized disciplines, such as policy issuance, premium collection, claims handling, and administration. It enables AIG to deliver business strategies through efficient, world-class IT and operations services while ensuring the necessary IT risk management and security measures are in place.

    Your contribution at AIG

    As an influencer at AIG, you are the go-to source for help and support, thanks to your deep knowledge and expertise.

    As a more experienced team member, you are capable of driving continual improvement and impacting the way that things get done.

    Because of your influence, whether direct or indirect, we are able to deliver powerful outcomes for our clients.

    Senior Hadoop Administrator

    Job Description:

    An experienced Hadoop administrator will provide support and expertise for Hortonworks (required) and Cloudera (preferred) Hadoop deployments.

    Responsibilities include:

  • Working with business and application development teams to provide effective technical support
  • Assisting and supporting technical team members with automation, development, and security
  • Identifying the best solutions and conducting proofs of concept leveraging Big Data & Advanced Analytics to meet functional and technical requirements
  • Interfacing with other groups such as security, network, compliance, storage, etc.
  • Transforming data from RDBMS to HDFS using available methodologies, for example Sqoop (see the sketch after this list)
  • Administering and testing DR, replication & high-availability solutions
  • Implementing Kerberos, Knox, Ranger, and other security enhancements
  • Managing and reviewing Hadoop log files for audit & retention
  • Arranging and managing maintenance windows to minimize the impact of outages on end users
  • Moving (redistributing) services from one node to another to facilitate securing the cluster and ensuring high availability
  • Assisting in reviewing and updating all configuration & documentation of Hadoop clusters as part of continuous improvement processes
  • Mentoring & supporting team members in managing Hadoop clusters
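
    For illustration, here is a minimal sketch of the RDBMS-to-HDFS transfer mentioned above, using Sqoop driven from Python. It is a sketch under stated assumptions, not part of the posting: the sqoop CLI is assumed to be on PATH, and the JDBC URL, credentials file, table, and target directory are hypothetical placeholders.

      #!/usr/bin/env python3
      """Sketch: import an RDBMS table into HDFS with the `sqoop` CLI.

      Assumptions (not from the posting): sqoop is installed and on PATH;
      the JDBC URL, username, password file, table, and target directory
      are hypothetical placeholders.
      """
      import subprocess

      def sqoop_import(jdbc_url, username, password_file, table, target_dir, mappers=4):
          # Standard `sqoop import` invocation. --password-file keeps the
          # credential off the command line; --num-mappers sets how many
          # parallel map tasks read from the source table.
          cmd = [
              "sqoop", "import",
              "--connect", jdbc_url,
              "--username", username,
              "--password-file", password_file,  # permission-restricted file on HDFS
              "--table", table,
              "--target-dir", target_dir,        # HDFS directory for the imported data
              "--num-mappers", str(mappers),
          ]
          subprocess.run(cmd, check=True)

      if __name__ == "__main__":
          sqoop_import(
              jdbc_url="jdbc:mysql://dbhost:3306/sales",  # hypothetical source database
              username="etl",
              password_file="/user/etl/.db-password",
              table="orders",
              target_dir="/data/raw/orders",
          )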
    Job Requirements:

  • 4+ years of experience with Hadoop clusters, including a minimum of 2 years with the Hortonworks distribution
  • 8+ years of experience in Linux-based systems or database administration
  • Hands-on experience with production deployments of Hadoop applications
  • Strong understanding of best practices and standards for Hadoop application design and implementation
  • Hadoop administration experience that includes:
      • Configuring, monitoring, and administering large Hadoop clusters
      • Backup & recovery of HDFS
      • DR, replication & high availability of Hadoop infrastructure
      • Securing the cluster using Kerberos, LDAP integration, and/or Centrify
      • Managing & scheduling jobs
      • Managing Hadoop queues, access controls, user quotas, etc.
      • Capacity planning, configuration management, monitoring, debugging, and performance tuning
  • Hands-on experience with Big Data use cases & development
  • Experience with Hadoop tools/languages/concepts such as HDP 2.0+, HDFS, Hive, Oozie, Sqoop, Pig, Flume, Spark, Kafka, Solr, HBase, Ranger, Knox, ZooKeeper, MapReduce, etc.
  • Experience with scripting tools such as Bash shell scripts, Python, and/or Perl
  • Experience with storing unstructured data in a semi-structured format on HDFS using HBase (see the first sketch after this list)
  • Experience with advanced SQL, partitioning, and bucketing in Hive (see the second sketch after this list)
  • Understanding of enterprise ETL tools (e.g., DataStage, Talend)
  • Understanding of relational databases (RDBMS), SQL, and NoSQL databases
  • Knowledge of Java virtual machines (JVMs) and multithreaded processing
  • Ability to coordinate and prioritize multiple tasks in a fast-paced environment
  • Strong verbal/written communication and presentation skills
  • Ability to build relationships and work effectively with cross-functional, international teams
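
    Two short sketches follow for the HBase and Hive items above; both are illustrative only, with hypothetical names, not material from the posting. First, storing an unstructured record in a semi-structured column-family layout with HBase, via the happybase Python client (the choice of client library is an assumption; the posting does not name one):

      #!/usr/bin/env python3
      """Sketch: write an unstructured record into HBase in semi-structured form.

      Assumptions (not from the posting): an HBase Thrift gateway is running,
      the `happybase` package is installed, and an 'events' table with
      column family 'd' already exists.
      """
      import json
      import happybase

      conn = happybase.Connection("hbase-thrift-host")  # default Thrift port 9090
      events = conn.table("events")

      record = {"user": "u123", "action": "login", "meta": {"ip": "10.0.0.5"}}

      # Row key encodes user + timestamp; the free-form payload is stored as a
      # JSON blob under one column qualifier, i.e. a semi-structured layout on
      # HDFS via HBase rather than a fixed relational schema.
      events.put(b"u123|2024-01-01T00:00:00", {b"d:payload": json.dumps(record).encode()})
      conn.close()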
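
    Second, the Hive partitioning and bucketing item, sketched as a HiveQL DDL statement submitted through beeline; the HiveServer2 URL, database, table, and columns are hypothetical placeholders:

      #!/usr/bin/env python3
      """Sketch: create a partitioned, bucketed Hive table via beeline.

      Assumptions (not from the posting): `beeline` is on PATH, and the
      HiveServer2 URL, database, and schema below are placeholders.
      """
      import subprocess

      DDL = """
      CREATE TABLE IF NOT EXISTS sales.orders (
        order_id    BIGINT,
        customer_id BIGINT,
        amount      DOUBLE
      )
      PARTITIONED BY (order_date STRING)           -- one HDFS directory per day
      CLUSTERED BY (customer_id) INTO 16 BUCKETS   -- hash-bucketed for joins/sampling
      STORED AS ORC;
      """

      subprocess.run(
          ["beeline", "-u", "jdbc:hive2://hiveserver:10000/default", "-e", DDL],
          check=True,
      )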