Data Engineer

Job reference: 1115058
  • Job type

    Contract
  • Location

    Toronto
  • Profession

    Data & Advanced Analytics
  • Industry

    Property & Real Estate
  • Pay

    Up to $100/hr

Great opportunity for Data Engineers!

Your new company

Our client is a leading real estate firm looking to hire a Data Engineer for a 6-month extendable contract with a hybrid work arrangement (four days in office).


Your new role

  • Design and develop data pipelines to ingest, store, and transform data from a variety of sources

  • Develop data models and algorithms to optimize data architecture

  • Monitor data quality and performance, and provide troubleshooting when needed

  • Accelerate data-informed decision-making to transform our product and engineering strategy

  • Architect scalable data models and build efficient and reliable ETL pipelines to bring data to our core data warehouse (a minimal pipeline sketch follows this list)

  • Design and develop highly scalable and reliable data pipelines and ETL processes to ensure smooth data flow and accessibility (Informatica IICS preferred)

  • Demonstrably deep understanding of SQL and analytical data warehouses (Snowflake preferred)

  • Professional experience using Python, Java, or Scala for data processing (Python preferred)

  • Knowledge of and experience with data-related Python packages

  • Develop and maintain data warehousing and storage solutions to ensure data is stored securely and efficiently

  • Collaborate with cross-functional teams to design and implement data models and data warehouse architecture that supports business needs

  • Optimize the performance of our data pipelines and data warehouse by implementing best practices and continuous improvement

  • Develop and implement data governance policies and procedures to ensure data accuracy, security, and compliance with industry regulations

  • Constantly improve product quality, security, and performance

  • Understand and implement data engineering best practices

  • Set, teach, and enforce standards for code maintainability and performance in code you submit and review

  • Make architecture recommendations and be able to implement them

  • Communicate effectively and regularly achieve consensus amongst teams

  • Document your work in Confluence
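
To give a flavour of the pipeline work described above, here is a minimal, illustrative sketch in Python using the snowflake-connector-python package. Every name in it (the source API URL, the staging table, the credentials) is a hypothetical placeholder, not part of the client's actual stack.

    # Minimal extract-transform-load sketch; all names are placeholders
    import requests
    import snowflake.connector

    def run_pipeline():
        # Extract: pull records from a hypothetical source API
        records = requests.get("https://api.example.com/listings", timeout=30).json()

        # Transform: keep only the fields the warehouse model needs
        rows = [(r["id"], r["address"], r["price"]) for r in records]

        # Load: insert into a hypothetical Snowflake staging table
        conn = snowflake.connector.connect(
            account="my_account",   # placeholder credentials; in practice,
            user="etl_user",        # read these from a secrets manager
            password="***",
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="STAGING",
        )
        try:
            conn.cursor().executemany(
                "INSERT INTO stg_listings (id, address, price) VALUES (%s, %s, %s)",
                rows,
            )
        finally:
            conn.close()

    if __name__ == "__main__":
        run_pipeline()

In production this would more likely be an Informatica IICS mapping or a Snowpipe ingestion rather than hand-rolled inserts; the sketch only shows the extract-transform-load shape of the work.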


What you'll need to succeed

  • A university degree in computer science or an approved equivalent combination of education and experience.

  • Minimum of five years of experience as a data engineer focusing on building scalable data pipelines and data warehouses.

  • Strong proficiency in Snowflake, Informatica IICS, Snowpipe, SQL, Unix, Python, and API integrations.

  • Strong knowledge of data warehousing and modelling concepts, with experience designing and building large-scale data solutions.

  • Experience developing and maintaining ETL/ELT pipelines and data integration solutions.

  • Strong SQL skills, with experience in performance tuning and optimization of complex queries.

  • Strong programming skills in Python, with experience developing and maintaining data processing scripts.

  • Experience in API development and integration with third-party systems.

  • Excellent communication and collaboration skills, with the ability to work effectively in a team environment.

  • Strong problem-solving skills with the ability to analyze complex data sets and identify trends and insights; think creatively and outside the box.

  • Excellent communication skills with the ability to communicate technical concepts to non-technical stakeholders.

  • Ability to work independently and in a team environment, prioritize tasks, and manage multiple projects simultaneously.

  • Experience with agile development methodologies such as Scrum or Kanban.

  • Design and optimize SQL syntax and queries for faster data processing.

  • Dimensional modelling experience is mandatory (see the star-schema sketch after this list).

  • Adoption of best development practices for batch processing.

  • Manage all work in Jira, including but not limited to story creation and defect resolution.

  • Identify opportunities for new architectural initiatives and recommend improvements to the scalability and robustness of platforms and solutions.

  • Extensive experience establishing data warehouses and data lakes, with solid experience in cloud-based data storage solutions (AWS S3 specifically).

  • Strong experience with a data stack such as Snowflake, Informatica IICS, and Informatica Axon, including their data models.

  • Good understanding of data-driven system integration (web services and ETL/batch jobs), experience establishing the organization’s canonical data models, and the ability to work with XML- and JSON-based data formats.

  • Expert scripting skills, with experience in description techniques for data architecture and the relationship of data to other architecture domains (e.g., data representations in business and application process models).
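
As a reference point for the dimensional modelling requirement above, here is an illustrative star-schema query, wrapped in Python for a Snowflake connection. The fact and dimension tables (fct_sales, dim_date, dim_property) are hypothetical examples, not the client's actual model.

    # Illustrative star-schema query; table and column names are hypothetical
    import snowflake.connector

    MONTHLY_SALES_SQL = """
        SELECT d.year_month,
               p.property_type,
               SUM(f.sale_amount) AS total_sales
        FROM fct_sales f
        JOIN dim_date d     ON f.date_key = d.date_key
        JOIN dim_property p ON f.property_key = p.property_key
        GROUP BY d.year_month, p.property_type
        ORDER BY d.year_month
    """

    def monthly_sales(conn: snowflake.connector.SnowflakeConnection):
        # fct_sales holds one row per sale (the grain); each dimension join
        # adds descriptive context, so aggregates roll up cleanly by month
        # and property type
        return conn.cursor().execute(MONTHLY_SALES_SQL).fetchall()

In a model of this shape, performance tuning usually starts at the fact table (for example, clustering on the most common join and filter keys) rather than with the query text itself.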


What you need to do now


If you're interested in this role, click 'apply now' to forward an up-to-date copy of your CV, or call us now.

If this job isn't quite right for you, but you are looking for a new position, please contact us for a confidential discussion on your career.


