
Automated Decision-Making Transparency

Updated: 2026-02-23

Plain English Translation

Quebec Law 25 requires organizations to be transparent when using automated systems to make decisions about individuals. If a decision is based exclusively on the automated processing of personal information, the organization must inform the individual at or before the time the decision is delivered. Furthermore, organizations must allow individuals to request an explanation of the factors leading to the decision, access the data used, correct any inaccurate information, and submit observations to a staff member capable of reviewing the automated decision.

Executive Takeaway

Ensure transparency and accountability for fully automated decisions by notifying users, explaining the algorithmic logic upon request, and offering a mechanism for human review.

Impact: High
Complexity: Medium

Why This Matters

  • Failing to provide automated decision-making transparency can lead to significant regulatory penalties and erode consumer trust.
  • Meeting the Law 25 section 12.1 requirements for decisions based exclusively on automated processing ensures ethical AI and data usage practices.

What “Good” Looks Like

  • User interfaces clearly display a Law 25 transparency notice for AI and automated decisions whenever a fully automated decision is presented.
  • A formalized Law 25 governance process for automated decision review is in place, connecting users to human reviewers who can explain the decision and override it if necessary; tools like WatchDog Security's Policy Management can help maintain the SOP and reviewer accountability.

Quebec Law 25 compliance requires organizations to inform individuals when a decision is based exclusively on automated processing, explain the factors involved upon request, allow them to correct inaccurate data, and provide an opportunity to submit observations to a human.

Under Quebec Law 25 section 12.1's automated decision-making rules, organizations must notify the individual no later than the time the individual is informed of the decision itself.

Organizations must provide the personal information used to render the automated decision, explain the reasons and principal factors that led to the outcome, and inform the individual of their right to have the underlying data corrected.

The specific transparency requirements of Law 25 section 12.1 for decisions based exclusively on automated processing apply only when no meaningful human intervention occurs before the decision is finalized.

Maintain strict data lineage and use output activity logs to trace exactly what personal information was fed into the algorithm or rules engine at the exact time the decision was rendered.
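The lineage requirement above can be sketched as an append-only snapshot of the inputs captured at the moment the decision is rendered. This is a minimal illustration, not a prescribed implementation: the function name `log_decision_lineage` and the dict-based store are assumptions, and a production system would write to a tamper-evident audit log instead.

```python
import json
import uuid
from datetime import datetime, timezone

def log_decision_lineage(subject_id, inputs, outcome, log_store):
    """Record exactly which personal information fed an automated
    decision, so a later Law 25 access request can be answered.

    log_store is any dict-like sink; in production this would be an
    append-only audit log (hypothetical interface).
    """
    record = {
        "decision_id": str(uuid.uuid4()),
        "subject_id": subject_id,
        "rendered_at": datetime.now(timezone.utc).isoformat(),
        # Snapshot of the inputs at decision time, not a live reference,
        # so later edits to the profile cannot rewrite history.
        "inputs": json.loads(json.dumps(inputs)),
        "outcome": outcome,
    }
    log_store[record["decision_id"]] = record
    return record["decision_id"]
```

The JSON round-trip is a cheap deep copy that also guarantees the snapshot is serializable for later disclosure.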

To properly explain the reasons and principal factors of an automated decision under Law 25, organizations must disclose the core logic, weightings, and primary variables the system used to reach its conclusion, presented in simple and understandable terms.
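For a simple weighted-score engine, the disclosure above could look like the sketch below: rank variables by how much they moved the score and phrase each in plain language. The weights, inputs, and threshold are hypothetical; a real system would derive factors from its own model or rules engine.

```python
def explain_decision(weights, inputs, threshold):
    """Build a plain-language explanation of a weighted-score decision:
    which variables mattered most, and in which direction.

    All parameters are illustrative assumptions, not a standard API.
    """
    contributions = {k: weights[k] * inputs[k] for k in weights}
    score = sum(contributions.values())
    # Principal factors first, ordered by absolute contribution.
    factors = sorted(contributions, key=lambda k: abs(contributions[k]),
                     reverse=True)
    return {
        "decision": "approved" if score >= threshold else "declined",
        "principal_factors": factors,
        "explanation": [
            f"{k} {'raised' if contributions[k] >= 0 else 'lowered'} your score"
            for k in factors
        ],
    }
```

The point is the shape of the output, not the scoring math: the individual receives the outcome, the principal factors, and a readable reason for each.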

Organizations must establish a structured Law 25 governance process for automated decision review, ensuring a qualified staff member is available to receive the individual's observations and has the authority to meaningfully review or override the system's output.

A proper Law 25 transparency notice for AI and automated decisions must clearly state that the decision was fully automated and inform the individual of their right to request more details, correct their data, or submit observations for human review.
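A minimal notice payload matching the requirements above might look like this. Law 25 fixes the substance (the fact of exclusive automation and the individual's rights), not the format, so the field names and wording here are assumptions for illustration.

```python
def build_transparency_notice(decision_id, outcome):
    """Notice delivered no later than the decision itself, stating that
    the decision was fully automated and listing the individual's rights.
    Field names and phrasing are illustrative only."""
    return {
        "decision_id": decision_id,
        "outcome": outcome,
        "automated": True,
        "notice": (
            "This decision was made exclusively by automated processing "
            "of your personal information."
        ),
        "rights": [
            "Request the personal information used to render the decision",
            "Request the reasons and principal factors behind it",
            "Have inaccurate personal information corrected",
            "Submit observations to a staff member who can review the decision",
        ],
    }
```

Rendering this payload directly in the UI keeps the on-screen disclosure and the auditable record in sync.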

Organizations must process the correction through their standard data subject request procedures, update the inaccurate data, and potentially re-run the automated decision if the inaccurate data materially impacted the initial outcome.
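The correction-and-rerun step above can be sketched as follows: apply the corrected value to the stored decision inputs and re-run the decision function only if the value actually changed. `decide` stands in for whatever model or rules engine produced the original outcome; the whole interface is an assumption for illustration.

```python
def handle_correction(record, field_name, corrected_value, decide):
    """Apply a data-subject correction to the stored inputs and re-run
    the automated decision when the correction changes an input.

    record: a stored decision record with "inputs" and "outcome" keys.
    decide: the hypothetical decision function (inputs -> outcome).
    """
    if record["inputs"].get(field_name) == corrected_value:
        # No material change to the inputs; the original outcome stands.
        return record["outcome"]
    record["inputs"][field_name] = corrected_value
    record["outcome"] = decide(record["inputs"])
    return record["outcome"]
```

In practice the materiality test may be richer than value equality, but the control is the same: corrections that could have changed the outcome trigger a re-decision.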

To demonstrate compliance with Law 25's automated decision requirements, CISOs should collect algorithmic transparency logs, UI screenshots showing the required automated processing disclosures, and documented SOPs for human review workflows.

Law 25 §12.1 requests often require timely retrieval of the data used, the rationale, and proof of a human review path. Tools like WatchDog Security's Compliance Center can help track evidence, map the control to Law 25, and surface gaps in workflows and artifacts needed to support consistent responses.

Consistency usually breaks down when teams rely on ad hoc screenshots, emails, and undocumented handoffs between support, privacy, and engineering. Tools like WatchDog Security's Policy Management can help maintain approved SOPs and acceptance tracking, while WatchDog Security's Secure File Sharing can support controlled sharing of request artifacts with audit logs.

Law 25 § 12.1

"Any person carrying on an enterprise who uses personal information to render a decision based exclusively on an automated processing of such information must inform the person concerned accordingly not later than at the time it informs the person of the decision. He must also inform the person concerned, at the latter’s request, (1) of the personal information used to render the decision; (2) of the reasons and the principal factors and parameters that led to the decision; and (3) of the right of the person concerned to have the personal information used to render the decision corrected. The person concerned must be given the opportunity to submit observations to a member of the personnel of the enterprise who is in a position to review the decision."

Version | Date | Author | Description
1.0.0 | 2026-02-23 | WatchDog Security GRC Team | Initial publication