Data Engineer | Level 4 (CAD) - RSR MSP

LanceSoft

  • Montréal, QC
  • Permanent
  • Full-time
  • 1 month ago
Job Title: Data Engineer
Experience Level: Level 4 (advanced), 7-15 years
Open positions: 1
Job Level: FTC
Location: Montreal (Day 1 onboarding onsite / in-office presence 3x per week)
We provide:
  • A robust career development path offering numerous opportunities for growth, learning and advancement.
  • A supportive, learning-oriented environment, with development carried out in fast-feedback agile delivery squads.
  • Collaborative work within cross-functional squads, following agile practices and utilizing both cloud and on-premises technology to deliver innovative solutions.
  • Encouragement for every developer to contribute their unique perspective - your ideas will be valued, and you’ll receive full support in their implementation!
  • Participation in an international environment with various multidisciplinary squads, working alongside customers, product experts, and SREs.
  • A dynamic environment where cutting-edge technology propels us.
  • State-of-the-art offices located in the City Centre designed to enhance collaboration.
Role Responsibilities
You will be responsible for:
  • Collaborating with cross-functional teams to understand data requirements and design efficient, scalable, and reliable ETL processes using Python and Databricks (a minimal sketch follows this list).
  • Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs.
  • Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing, transformation, and loading, ensuring accuracy and consistency.
  • Creating and managing data pipelines, ensuring proper error handling, monitoring, and performance optimization.
  • Working in an agile environment, participating in sprint planning, daily stand-ups, and retrospectives.
  • Conducting code reviews, providing constructive feedback, and enforcing coding standards to maintain high quality.
  • Developing and maintaining tooling and automation scripts to streamline repetitive tasks.
  • Implementing unit, integration, and other testing methodologies to ensure the reliability of the ETL processes.
  • Utilizing REST APIs and other integration techniques to connect various data sources.
  • Maintaining documentation, including data flow diagrams, technical specifications, and processes.
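For illustration only, the following minimal sketch shows the general shape of such an ETL job, assuming PySpark on Databricks with Delta tables; the table and column names (raw_orders, curated.orders, order_id, order_total, order_date) are hypothetical placeholders, not part of this role's actual stack.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# In a Databricks notebook a SparkSession named `spark` is already provided;
# getOrCreate() simply reuses it when run there.
spark = SparkSession.builder.getOrCreate()

def run_orders_etl() -> None:
    # Extract: read a raw table registered in the metastore (hypothetical name).
    raw = spark.read.table("raw_orders")

    # Transform: cleanse and standardize before loading downstream.
    cleaned = (
        raw.dropDuplicates(["order_id"])
           .filter(F.col("order_total").isNotNull())
           .withColumn("order_date", F.to_date("order_date"))
    )

    # Load: write to a curated Delta table; overwrite keeps the sketch idempotent.
    cleaned.write.format("delta").mode("overwrite").saveAsTable("curated.orders")

run_orders_etl()

In practice the same skeleton would gain error handling, monitoring hooks, and unit tests around the transformation step, as the responsibilities above describe.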
You have:
  • Proficiency in Python programming, including experience in writing efficient and maintainable code.
  • Hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines.
  • Proficiency in working with Snowflake or similar cloud-based data warehousing solutions.
  • Solid understanding of ETL principles, data modelling, data warehousing concepts, and data integration best practices.
  • Familiarity with agile methodologies and the ability to work collaboratively in a fast-paced, dynamic environment.
  • Experience with code versioning tools (e.g., Git).
  • Meticulous attention to detail and a passion for problem solving.
  • Knowledge of Linux operating systems.
  • Familiarity with REST APIs and integration techniques (a short example follows this list).
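As a brief, hedged illustration of the REST integration point above, the sketch below pages through an HTTP API with the requests library; the endpoint URL, pagination scheme, and token handling are assumptions made for the example only.

import requests

BASE_URL = "https://api.example.com/v1/orders"   # hypothetical endpoint
API_TOKEN = "..."                                # supplied via a secret store in practice

def fetch_all_orders():
    """Page through a REST endpoint and collect the records it returns."""
    headers = {"Authorization": f"Bearer {API_TOKEN}"}
    records, page = [], 1
    while True:
        response = requests.get(
            BASE_URL,
            headers=headers,
            params={"page": page},
            timeout=30,
        )
        response.raise_for_status()          # fail fast on HTTP errors
        batch = response.json().get("results", [])
        if not batch:
            break                            # no more pages
        records.extend(batch)
        page += 1
    return records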
You might also have:
  • Familiarity with data visualization tools and libraries (e.g., Power BI).
  • Background in database administration or performance tuning.
  • Familiarity with data orchestration tools, such as Apache Airflow (a minimal DAG sketch follows this list).
  • Previous exposure to big data technologies (e.g., Hadoop, Spark) for large-scale data processing.
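For the orchestration point above, a minimal Apache Airflow DAG sketch (assuming Airflow 2.4 or later) might wire three placeholder ETL steps together like this; the DAG id, schedule, and task bodies are illustrative, not a prescribed design.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")

def transform():
    print("cleanse and reshape the extracted data")

def load():
    print("load curated data into the warehouse")

with DAG(
    dag_id="orders_etl",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",            # the `schedule` argument requires Airflow 2.4+
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Run extract, then transform, then load, once per day.
    t_extract >> t_transform >> t_load

In a real pipeline the Python callables would hand off to the Databricks and Snowflake jobs described earlier.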
EEO Employer
Minorities / Females / Disabled / Veterans / Gender Identity / Sexual Orientation
