What You’ll Do

  • Design robust, reusable and scalable data solutions and pipeline frameworks to automate ingestion, processing and delivery of batch and real-time streaming data.
  • Build and maintain conceptual, logical and physical data models for transactional, BI and analytical platforms.
  • Lead the development and ongoing maintenance of our target state data architecture roadmap in collaboration with technology and business partners.
  • Ensure ongoing alignment of new and existing data initiatives with the target state data architecture.
  • Define standards and best practices for the enterprise data management framework.
  • Play an active role in all stages of a data solution life cycle: acquisition, ingestion, curation, publication, quality and metadata to ensure consistency with data management standards and best practices.
  • Assist with data-related technical challenges and perform root-cause analysis to answer specific business questions, and recommend and implement ways to improve data reliability, efficiency and quality.
  • Assist in the selection of data management tools required to expand our platform capability. Make recommendations to senior management.

What You’ll Bring

  • University degree in Computer Science, Information Systems or related field.
  • 5+ years of experience with data warehousing, big data and cloud technologies, with a focus on the data architecture discipline.
  • 5+ years of hands-on experience with:
    • Data modeling or architecture tools (ERWin)
    • BI visualization software (Power BI, QlikView)
    • Data catalog software (Informatica EDC, IBM Infosphere, Collibra)
    • Data quality management (IBM IA, Informatica IDQ)
  • 5+ years of hands-on experience in the architecture, design and development of enterprise data solutions that enable well-integrated transactional, BI and analytical platforms.
  • 5+ years of experience with relational databases (Teradata, Oracle, SQL Server) and big data technologies (Hadoop, Hive)
  • Deep experience with data modeling, analysis, metadata and quality.
  • Advanced knowledge of SQL queries
  • Experience with cloud data platform technologies: Azure Data Lake Store, Azure Databricks, Google Cloud Platform (Dataproc, Dataflow, BigQuery).
  • Strong communication and interpersonal skills
  • Superb analytical and conceptual thinking skills, with the ability not only to manipulate data but also to derive meaningful interpretations from it
  • Ability to take initiative, multi-task and work in a fast-paced environment
  • Ability to liaise with all levels across the enterprise on projects and ad hoc requests
  • Strong attention to detail
  • A team player and self-starter
