Are you fascinated by data and by building robust data pipelines that process massive amounts of data at scale and speed to deliver crucial insights to the end customer?
This is exactly what we, the Engineering team, do. Our mission is to build Entrego's Data Assets into Data Products that deliver intelligence and real-time insights for our customers.
Once built, these will support a variety of applications used by internal teams, as well as provide business insights to our customers.
Our work spans a variety of data sets, such as transaction data, clickstream data (soon!), and web-scrape data, across a diverse technology stack ranging from Hadoop and Spark to Tableau and AWS.
For our first-ever Data Engineer hire, we are looking for someone passionate, creative, and innately curious to join a fresh team, excited to build out a next-generation Enterprise Data Warehouse platform to drive revenue opportunities, provide customer insights, and contribute operational efficiency through data for our logistics business.
What will you be doing?
- You will build, maintain, and scale efficient data infrastructure/EDW, ETL, and reporting pipelines
- You will work with data scientists and data analysts to get ML and deep learning models production-ready
- You will investigate and research data quality and integrity from data sources
- You will develop and maintain the data platform, business intelligence, and experimentation tools
- You will develop and maintain scalable platforms for tracking business intelligence, built for reliability and redundancy
- You will lead the management of data collection, organize the models, and forecast future needs
- You will coach and mentor junior data engineers to be more effective individual contributors

Who are we looking for?
- You are able to design, implement, and maintain the Logistics Platform, particularly the Enterprise Data Warehouse
- You are fluent in Python or a similar programming language
- You can develop data loading/ETL processes for the Entrego Logistics Solutions Platform
- You have experience with data engineering tools like Hadoop, Spark, BigQuery, Airflow, etc.
- You are data-driven and passionate about solving problems through data
- You are inquisitive and curious, eager to delve deep into data to investigate trends or anomalies
- You are detail-oriented and able to work efficiently in a fast-paced team environment
- You are keen on data technologies and on picking up new skills and tools along the way
- You have strong critical thinking and the ability to frame issues in a logical manner
- You have experience in deploying and scaling ML models
- We often conduct workshops to improve our individual skill sets and our workflow as a team
- We have a flexi-time work arrangement
- We have many of the best bits of a start-up, but with the resources of one of the oldest conglomerates in the Philippines
- We work hard to create a supportive, collaborative, and fulfilling place where you can progress your career
- You'll get to work with some spectacular people, both from the technology team and the wider business
- You will get to work with massive data sets and have opportunities to learn and apply the latest big data technologies
- We want everyone on our team to have the tools and resources to succeed in their career
- You will have autonomy in the role and in managing your own portfolio
- We get to do all of this in a lovely, comfortable office in a nice, central part of BGC
We don't hire just for the sake of it: we hire the best people and trust them to do what's right! With your knowledge and experience, you will work closely with product managers, user experience developers, creative designers, and software engineers, using multivariate testing, analytics, usability testing, and good old common sense to help make key decisions in the development of the platform.