You will use your expertise to:

  • Develop, construct, and automate data pipelines and computational architectures that enable leading-edge analytical models
  • Translate complex functional and technical requirements into detailed designs and high-performing, end-to-end solutions
  • Develop a deep understanding of our data and our business to create value for our business partners and our organization
  • Automate data quality processes in line with corporate data governance standards

We’d like to review your application if you have…

Must-haves (minimum requirements):

  • At least five years of experience building, maintaining and orchestrating Big Data pipelines and analytics solutions in an enterprise setting, using Agile development approaches
  • Strong experience in data management, data and environment provisioning, data masking, and test automation, using relevant Big Data technologies on Azure, AWS, or Google Cloud
  • Data warehousing experience: Hadoop, MapReduce, Hive, Pig, Apache Spark, Kafka
  • A Bachelor’s degree in Computer Science, Computer Engineering, Software Engineering or a related technical field
  • The ability to work in environments with significant ambiguity and to develop creative approaches to data problems, along with curiosity and passion for data and its business and industry implications
  • Expert knowledge of data modeling, with an understanding of different data structures and their benefits and limitations under particular use cases
  • Alignment with our values

Where you’ll be working, your work schedule, and other important information:

  • You will work out of our Calgary head office, located in the Suncor Energy Centre at 150 – 6th Ave S.W.
  • Hours of work are a regular 40-hour work week, Monday to Friday, with the potential for extended work hours based on business needs
  • We are currently growing our Data Analytics team and are hiring several DataOps/Data Engineers, as well as Platform Engineers