Automated Decision-Making Transparency
Plain English Translation
Quebec Law 25 requires organizations to be transparent when using automated systems to make decisions about individuals. If a decision is based exclusively on the automated processing of personal information, the organization must inform the individual at or before the time the decision is delivered. Furthermore, organizations must allow individuals to request an explanation of the factors leading to the decision, access the data used, correct any inaccurate information, and submit observations to a staff member capable of reviewing the automated decision.
Technical Implementation
Required actions are grouped by organization size (startup, scaleup, enterprise) below.
Required Actions (startup)
- Add a prominent disclaimer to the UI when a decision is made exclusively by automated processing.
- Train support staff to handle basic inquiries about automated decisions and direct users to human reviewers.
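For a startup, the disclosure can be as simple as appending a fixed notice whenever a decision is delivered by fully automated means. A minimal sketch, assuming a hypothetical `DecisionNotice` payload (the field names and notice wording are illustrative, not prescribed by Law 25):

```python
from dataclasses import dataclass

# Hypothetical decision payload; field names are illustrative.
@dataclass
class DecisionNotice:
    decision: str     # the outcome text shown to the individual
    automated: bool   # True when no meaningful human intervention occurred

DISCLOSURE = (
    "This decision was made exclusively by automated processing of your "
    "personal information. You may request an explanation of the factors, "
    "have your data corrected, or submit observations for human review."
)

def render_decision_message(notice: DecisionNotice) -> str:
    """Attach the disclosure whenever the decision is fully automated, so the
    individual is informed no later than the time of the decision itself."""
    if notice.automated:
        return f"{notice.decision}\n\n{DISCLOSURE}"
    return notice.decision
```

The key design point is that the disclosure is bound to the decision delivery path itself, so it cannot be forgotten on a per-screen basis.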
Required Actions (scaleup)
- Implement a formal workflow to handle Law 25 requests for automated decision explanation.
- Ensure data pipelines log the specific data points used at the time an automated decision is rendered to support future inquiries.
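The logging requirement above means snapshotting the inputs at decision time, not looking them up later (the live data may have changed by the time a request arrives). A sketch, with assumed field names:

```python
import datetime
import json

def log_decision(log: list, subject_id: str, inputs: dict, outcome: str) -> dict:
    """Snapshot the exact personal information used at decision time so a
    later Law 25 explanation request can be answered from the record.
    Field names are illustrative assumptions."""
    record = {
        "subject_id": subject_id,
        "decided_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        # JSON round-trip makes a deep copy and guarantees serializability,
        # so later mutation of the live inputs cannot alter the record.
        "inputs": json.loads(json.dumps(inputs)),
        "outcome": outcome,
    }
    log.append(record)
    return record
```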
Required Actions (enterprise)
- Develop comprehensive algorithmic auditing tools to easily extract and explain the principal factors and parameters for any given automated decision.
- Integrate automated decision tracking directly into the central data subject request log.
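At enterprise scale, the integration point above amounts to indexing decision records by data subject so they surface alongside other requests. A minimal sketch of such a registry (the structure is an assumption; adapt it to your existing DSR tooling):

```python
from collections import defaultdict

class DecisionRegistry:
    """Minimal central log of automated decisions, indexed by data subject,
    so 12.1 requests can be answered from the same place as other DSRs."""

    def __init__(self) -> None:
        self._by_subject = defaultdict(list)

    def record(self, subject_id: str, decision: dict) -> None:
        self._by_subject[subject_id].append(decision)

    def decisions_for(self, subject_id: str) -> list:
        # Supports an explanation request: every automated decision
        # recorded about this person.
        return list(self._by_subject[subject_id])
```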
Quebec Law 25 compliance requires organizations to inform individuals when a decision is based exclusively on automated processing, explain the factors involved upon request, allow them to correct inaccurate data, and provide an opportunity to submit observations to a human.
Under the automated decision-making rules in section 12.1 of Quebec Law 25, organizations must inform the individual that the decision was automated no later than the time they are informed of the decision itself.
Organizations must provide the personal information used to render the automated decision, explain the reasons and principal factors that led to the outcome, and inform the individual of their right to have the underlying data corrected.
The specific transparency requirements of section 12.1 (in the French text, "décision fondée exclusivement sur un traitement automatisé") apply only when a decision is based exclusively on automated processing, meaning no meaningful human intervention occurs before the decision is finalized.
Maintain strict data lineage and activity logs so you can trace exactly what personal information was fed into the algorithm or rules engine at the moment the decision was rendered.
To explain the reasons and principal factors behind an automated decision as Law 25 requires, organizations must disclose the core logic, weightings, and primary variables the system used to reach its conclusion, presented in simple and understandable terms.
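For a simple weighted-sum scoring model, "principal factors" can be read off directly as the inputs with the largest contributions. A sketch under that assumption (more complex models need dedicated explainability tooling, such as SHAP-style attribution methods):

```python
def principal_factors(weights: dict, inputs: dict, top_n: int = 3) -> list:
    """Rank features of a linear scoring model by the absolute size of
    their contribution (weight * value). One way to surface the
    'principal factors and parameters' for disclosure; the model form
    here is an illustrative assumption."""
    contributions = {k: weights[k] * inputs.get(k, 0) for k in weights}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    return ranked[:top_n]
```

The ranked pairs can then be rewritten into plain-language statements ("your debt ratio was the main factor") before being shown to the individual.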
Organizations must establish a structured governance process for reviewing automated decisions, ensuring a qualified staff member is available to receive the individual's observations and has the authority to meaningfully review or override the system's output.
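The review path can be modeled as a ticket that is only considered open once it is assigned to someone in a position to change the outcome. A minimal sketch (all names and states are assumptions for illustration):

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ReviewCase:
    """Illustrative ticket for observations submitted under section 12.1."""
    decision_id: str
    observations: str
    status: str = "pending"
    reviewer: Optional[str] = None

def assign_reviewer(case: ReviewCase, qualified_staff: List[str]) -> ReviewCase:
    """Route the case to a staff member who can meaningfully review
    (and, if warranted, override) the automated decision."""
    if not qualified_staff:
        raise ValueError("No staff member is in a position to review the decision")
    case.reviewer = qualified_staff[0]
    case.status = "under_review"
    return case
```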
A proper Law 25 transparency notice for AI and automated decisions must clearly state that the decision was made exclusively by automated means and inform the individual of their right to request more details, correct their data, or submit observations for human review.
Organizations must process the correction through their standard data subject request procedures, update the inaccurate data, and potentially re-run the automated decision if the inaccurate data materially impacted the initial outcome.
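The "materially impacted" test above can be made concrete: re-run the decision only when a corrected field actually fed the original decision and its value changed. A sketch, assuming a decision record that captured its inputs and a caller-supplied `decide` function mapping inputs to an outcome (both are assumptions):

```python
def apply_correction_and_rerun(record: dict, corrections: dict, decide) -> dict:
    """After a data correction, re-evaluate the automated decision only if a
    corrected field was among the inputs actually used. `record` and `decide`
    are hypothetical shapes for illustration."""
    used = record["inputs"]
    material = {k: v for k, v in corrections.items() if k in used and used[k] != v}
    if not material:
        return record  # correction did not touch this decision's inputs
    new_inputs = {**used, **material}
    return {**record, "inputs": new_inputs, "outcome": decide(new_inputs)}
```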
To demonstrate compliance with Law 25's automated-decision requirements, CISOs should collect algorithmic transparency logs, UI screenshots of the automated processing disclosures, and documented SOPs for human review workflows.
Law 25 §12.1 requests often require timely retrieval of the data used, the rationale, and proof of a human review path. Tools like WatchDog Security's Compliance Center can help track evidence, map the control to Law 25, and surface gaps in workflows and artifacts needed to support consistent responses.
Consistency usually breaks down when teams rely on ad hoc screenshots, emails, and undocumented handoffs between support, privacy, and engineering. Tools like WatchDog Security's Policy Management can help maintain approved SOPs and acceptance tracking, while WatchDog Security's Secure File Sharing can support controlled sharing of request artifacts with audit logs.
"Any person carrying on an enterprise who uses personal information to render a decision based exclusively on an automated processing of such information must inform the person concerned accordingly not later than at the time it informs the person of the decision. He must also inform the person concerned, at the latter’s request, (1) of the personal information used to render the decision; (2) of the reasons and the principal factors and parameters that led to the decision; and (3) of the right of the person concerned to have the personal information used to render the decision corrected. The person concerned must be given the opportunity to submit observations to a member of the personnel of the enterprise who is in a position to review the decision."
| Version | Date | Author | Description |
|---|---|---|---|
| 1.0.0 | 2026-02-23 | WatchDog Security GRC Team | Initial publication |