
Analytics Engineer (dbt)
- Canada
- $73,440-$86,400 a year
- Permanent
- Full-time
Responsibilities

- Data Modeling & Pipelines: Design, develop, and maintain complex data models in our Snowflake data warehouse. Use dbt (Data Build Tool) to create efficient data pipelines and transformations for our data platform (see the dbt sketch after this list).
- Snowflake Intelligence Integration: Leverage Snowflake Intelligence features (e.g., Cortex Analyst, Cortex Agents, Cortex Search, AISQL) to implement conversational data queries and AI-driven insights directly within our data environment. Develop AI solutions that harness these capabilities to extract valuable business insights (see the Cortex sketch after this list).
- Advanced SQL & Analysis: Write and optimize advanced SQL queries to retrieve and manipulate complex data sets. Dive deep into large datasets to uncover patterns, trends, and opportunities that inform strategic decision-making.
- Business Intelligence (BI): Develop, maintain, and optimize Looker dashboards and LookML to effectively communicate data insights. Leverage Looker's conversational analytics and data agent features to enable stakeholders to interact with data using natural language queries.
- Cross-Functional Collaboration: Communicate effectively with stakeholders to understand business requirements and deliver data-driven solutions. Identify opportunities for implementing AI/ML/NLP technologies in collaboration with product, engineering, and business teams.
- Programming & Automation: Write efficient Python code for data analysis, data processing, and automation of recurring tasks. Use shell scripting and command-line tools to support data workflows and system tasks. Ensure code is well-tested and integrated into automated workflows, e.g., via Airflow job scheduling (see the Airflow sketch after this list).
- Visualization & Presentation: Create compelling visualizations and presentations to deliver analytical insights and actionable recommendations to senior management and cross-functional teams. Tailor communication of complex analyses to diverse audiences.
- Innovation & Best Practices: Stay up to date with industry trends, emerging tools, and best practices in data engineering and analytics (with a focus on dbt features, Snowflake's latest offerings, and BI innovations). Develop and implement innovative ideas to continuously improve our analytics stack and practices.
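To make the data modeling responsibility concrete, here is a minimal sketch of a dbt Python model on Snowflake. dbt models are more commonly written in SQL, but dbt also supports Python models that run via Snowpark on this platform. The `stg_orders` staging model and its column names are hypothetical.

```python
# models/marts/customer_ltv.py -- hypothetical dbt Python model (Snowflake/Snowpark)
import snowflake.snowpark.functions as F


def model(dbt, session):
    # Materialize as a table; dbt handles the DDL and DAG dependencies.
    dbt.config(materialized="table")

    # dbt.ref() resolves another model in the project to a Snowpark DataFrame.
    orders = dbt.ref("stg_orders")  # hypothetical staging model

    # Aggregate to one row per customer: lifetime value and order count.
    return (
        orders.group_by("customer_id")
              .agg(
                  F.sum("order_total").alias("lifetime_value"),
                  F.count("order_id").alias("order_count"),
              )
    )
```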
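For the Snowflake Intelligence work, Cortex LLM functions are exposed through SQL, so one common pattern is calling them from Python via the Snowflake connector. This is a sketch only: the connection parameters are placeholders, and available model names vary by account and region.

```python
# Hypothetical sketch: calling a Snowflake Cortex LLM function from Python.
import snowflake.connector

# Connection parameters are placeholders; real values come from your account config.
conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="...",            # placeholder
    warehouse="ANALYTICS_WH",  # placeholder
)

prompt = "Summarize last quarter's sales trends in two sentences."
cur = conn.cursor()
# SNOWFLAKE.CORTEX.COMPLETE(model, prompt) runs a hosted LLM inside Snowflake;
# which model names are available depends on your account and region.
cur.execute(
    "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)",
    (prompt,),
)
print(cur.fetchone()[0])
conn.close()
```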
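And for the automation responsibility, a minimal Airflow sketch scheduling a daily dbt run followed by a Python task. The project path, task names, and notification step are illustrative, not a prescribed setup.

```python
# Hypothetical sketch: a daily Airflow DAG that runs dbt, then a Python follow-up.
from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.bash import BashOperator


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_analytics():
    # Shell out to dbt; in practice the project path and profile come from config.
    run_models = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run",  # placeholder path
    )

    @task
    def notify():
        # Placeholder for a post-run data quality check or Slack notification.
        print("dbt run finished")

    run_models >> notify()


daily_analytics()
```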
Qualifications

- Education: Bachelor's degree in Computer Science, Statistics, or a related field; Master's degree preferred.
- Experience: 2+ years of experience in data analytics or a related field, with significant exposure to AI and Machine Learning applications in analytics.
- SQL Expertise: Advanced SQL skills with experience in writing and optimizing complex queries on large-scale datasets.
- dbt Proficiency: Hands-on experience with dbt (Data Build Tool) and its features for building, testing, and documenting data models.
- Data Modeling: Expert-level knowledge of data modeling and data warehouse concepts (e.g., star schema, normalization, slowly changing dimensions).
- Snowflake & AI Capabilities: Experience with Snowflake's Data Cloud platform and familiarity with its advanced AI capabilities (Snowflake Intelligence: Cortex Analyst, Cortex Agents, Cortex Search, AISQL, etc.) are highly preferred.
- Business Intelligence Tools: Strong skills in Looker data visualization and LookML (including familiarity with Looker's conversational AI and data agent capabilities) or similar BI tools.
- AI Agents & Automation: Experience with AI agents or generative AI tools to optimize workflows and service delivery (such as creating chatbots or automated analytic assistants) is a plus.
- Real-Time & Streaming Data: Experience with real-time data processing and streaming technologies (e.g., Kafka, Kinesis, Spark Streaming) for handling continuous data flows.
- Programming: Proficient in Python for data analysis and manipulation (pandas, NumPy, etc.), with the ability to write clean, efficient code. Experienced with shell scripting and command-line tools for automating workflows and data processing tasks (see the pandas sketch after this list).
- ETL/Orchestration: Familiarity with ETL processes and workflow orchestration tools such as Apache Airflow (or similar scheduling tools) for automating data pipelines, along with Docker for local development and testing.
- Cloud Platforms: Experience with cloud platforms and services (especially AWS or GCP) for data storage, compute, and deployment.
- Version Control & CI/CD: Solid understanding of code versioning (Git) and continuous integration/continuous deployment (CI/CD) processes in a data engineering context.
- Agile Methodology: Familiarity with agile development methodologies and ability to work in a fast-paced, iterative environment.
- Soft Skills: Excellent communication and presentation skills, with critical thinking and problem-solving abilities. Proven track record of working effectively on cross-functional teams and translating business needs into technical solutions.
- Data Governance & Ethics: Experience implementing data governance best practices, ensuring data quality and consistency. Knowledge of data ethics, bias mitigation strategies, and data privacy regulations (e.g., GDPR, CCPA) with a commitment to compliance.
- Community & Open Source: Contributions to open-source projects or active participation in data community initiatives.
- AI/ML Skills: Experience applying Artificial Intelligence/Machine Learning techniques in analytics (e.g., building predictive models for forecasting, churn prediction, or fraud detection). Practical experience deploying models and using MLOps/DataOps practices for lifecycle management.
- Statistical Background: Solid foundation in statistics and probability, with the ability to apply various modeling techniques and design A/B tests or experiments (see the z-test sketch after this list).
- Additional Programming: Knowledge of additional programming or query languages (e.g., R, Scala, Julia, Spark SQL) that can be applied in analytics workflows.
- Certifications: Certifications in relevant data technologies or cloud platforms (such as Snowflake, AWS, GCP, or Looker) demonstrating your expertise.
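As referenced in the Programming bullet above, here is a small pandas sketch of the kind of analysis work involved. The data and column names are invented for illustration; a real extract would come from the Snowflake warehouse.

```python
# Hypothetical sketch: monthly revenue trend from a raw orders extract with pandas.
import pandas as pd

# Illustrative data; column names are assumptions, not a real schema.
orders = pd.DataFrame({
    "order_date": pd.to_datetime(["2024-01-05", "2024-01-20", "2024-02-11"]),
    "order_total": [120.0, 80.0, 200.0],
})

monthly = (
    orders
    .set_index("order_date")
    .resample("MS")["order_total"]  # month-start buckets
    .agg(["sum", "count"])
    .rename(columns={"sum": "revenue", "count": "orders"})
)
print(monthly)
```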
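And for the Statistical Background bullet, a sketch of a two-proportion z-test for an A/B conversion experiment, assuming statsmodels is available. The conversion counts and sample sizes are illustrative.

```python
# Hypothetical sketch: two-proportion z-test for an A/B conversion experiment.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 480]    # illustrative: conversions in control vs. variant
samples = [10_000, 10_000]  # illustrative: users per arm

stat, p_value = proportions_ztest(conversions, samples)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
# A small p-value suggests the conversion rates differ; in practice, also plan
# sample size for adequate power and correct for multiple comparisons.
```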