Senior Hadoop Big Data/Data Science Engineer
Safeway Philtech, Inc.
Parañaque City, Philippines

SENIOR HADOOP BIG DATA / DATA SCIENCE ENGINEER

Safeway Philtech is looking for a Senior Hadoop Big Data / Data Science Engineer who will work in BGC, Taguig City.

Safeway Philtech Inc. is a wholly owned subsidiary of Albertsons-Safeway Inc. It functions as a Technology Center that provides application and infrastructure support and software development based on specifications from Albertsons-Safeway's internal IT department.

When Albertsons-Safeway Inc., one of the largest supermarket chains in North America, decided to build a site in Manila, it was in recognition of two things: the technical competency of Filipino IT workers and their ability to be global players, and the effectiveness of having non-US IT operations to sustain Albertsons-Safeway's robust and evolving business.

We were founded on the principle that quality service is at the core of how we do business.

Since 2003, Safeway Philtech has continuously provided legendary IT services at the best value through the development of world-class talent. The company has grown into a vital technology center of Albertsons-Safeway. Serving over 1,700 stores and corporate offices across the United States and Canada, we continue to deliver essential enterprise-enabling solutions covering all aspects of Albertsons-Safeway operations, including Retail, Supply Chain, Merchandising, Administration, Human Resources, Accounting, and Marketing.

We provide Albertsons-Safeway with the most secure, intelligent infrastructure and application solutions for managing and maintaining mission-critical information across the retail enterprise. Our solutions enable Albertsons-Safeway to:

  • Stay competitive in the marketplace
  • Optimize their operations: streamline processes, increase productivity
  • Create a first-class shopping experience: improve customer loyalty and boost profits

As a Senior Hadoop Big Data / Data Science Engineer, you are expected to perform the following tasks:

  • Responsible for the documentation, design, development, and architecture of Big Data applications
  • Convert complex techniques and functional requirements into detailed designs
  • Perform testing of software prototypes and transfer them to the operations team
  • Perform analysis of large data stores and derive insights
  • Provide technical assistance to application / business users
  • Analyze, review and alter code to increase operating efficiency or adapt to new requirements
  • May test, install or deploy code changes
  • Document processes and the code base for hand-over to the eventual support team
  • Propose best practices and standards
REQUIRED

  • Bachelor's degree in Computer Science or a related course
  • At least 2-4 years of relevant experience and proficiency in the following tools:
  • Python
  • UNIX Korn / Bourne shell scripting
  • ANSI-SQL
  • At least 2-4 years of relevant experience and proficiency in the following technologies:
  • Business Intelligence
  • Data Warehouse and Data Analytics
  • Data Science / Data Lake
  • Proficient in Teradata, Oracle, Exadata, DB2 and similar RDBMS
  • Must have excellent oral and written English communication skills
  • Strong analytical skills
  • Must be open to shifting schedules
DESIRED

  • Work experience in a support and development-oriented environment
  • Work experience in Data Warehouse or related technology
  • Work experience in Agile Scrum development
  • Knowledgeable in DataStage or a similar data integration tool such as Pentaho