Senior Data Architect
- Calgary, AB
- Permanent
- Full-time
Responsibilities:
- Define the future-state architecture for ATCO's enterprise data environment, including data integration, modeling, and consumption patterns
- Provide hands-on architecture support by guiding, and where needed designing and validating, data pipelines and integration patterns (batch, streaming, and API-based), including orchestration patterns (e.g., Azure Data Factory, Databricks Workflows/Jobs)
- Translate business priorities into data domains, data products, and roadmaps that improve decision-making and analytics adoption
- Architect and scale the Databricks lakehouse and related patterns to support a data mesh direction, enabling domain teams to publish and operate trusted data products with clear ownership, SLAs, and governance, backed by shared platform guardrails and enablement
- Establish standards for core platform components (lakehouse/warehouse, orchestration, catalog/lineage, and BI/semantic layer) and how teams publish and consume trusted data
- Establish standards for data ingestion, transformation, lineage, metadata, retention, and quality across the data lifecycle
- Enable self-serve analytics through curated datasets, consistent KPI definitions, and scalable consumption patterns
- Partner with AI teams to ensure data foundations support AI/ML and GenAI use cases (e.g., trusted sources, governed access, structured and unstructured data readiness)
- Define pragmatic multi-cloud patterns across Azure and GCP, balancing security, cost, performance, and governance
- Provide architecture oversight and act as a design authority through design reviews across projects, vendors, and delivery teams
- Ensure alignment with ATCO cybersecurity, privacy, and enterprise architecture requirements
Qualifications:
- Post-secondary education in Computer Science, Engineering, Information Systems, or equivalent experience
- 8+ years of experience in data platform and/or enterprise data architecture
- Strong knowledge of data modeling, data warehousing/lakehouse concepts, and modern data integration patterns (ETL/ELT)
- Experience enabling business analytics (e.g., semantic/metrics layers, certified datasets, KPI governance, self-serve patterns)
- Hands-on experience with Databricks (lakehouse patterns, notebooks/jobs, and productionizing data pipelines) and familiarity with data mesh / data product operating models
- Working knowledge of cloud data services in Azure and/or GCP (e.g., Databricks, BigQuery, cloud storage)
- Understanding of data governance disciplines (metadata, lineage, retention, quality, access control)
- Familiarity with data patterns that support AI/ML and GenAI (without needing to be a model developer)
- Strong stakeholder management skills and the ability to communicate technical concepts in business terms
What we offer:
- A culture based on caring, integrity, agility, collaboration, and striving for excellence
- Competitive compensation
- Flex benefits
- Tuition assistance program
- Training and mentorship programs
- Charitable donation matching