To better uphold the privacy and dignity of individuals and communities, we argue for the recognition of privacy as a fundamental human right and for the application of an intersectional lens throughout draft Bill C-27. Our argument is grounded in research on the social impacts and potential harms of digital systems, as well as recent patterns of behaviour in the digital industries, which suggest that broader definitions of harm, the inclusion of intersectionality, expanded mitigation measures, and more robust, transparent, and autonomous oversight are necessary to achieve the goals of C-27. While some of our recommendations align with suggestions made by others, especially prior submissions by Bailey, Burkell, and McPhail, LEAF, and the CCLA, we further advise adding language that includes “groups with intersecting identities” where appropriate. We recommend that the requirements and procedures for various parties to assess and mitigate harms be developed by an independent regulator through public proceedings mandated to include relevant and most-affected groups, and that these proceedings involve intersectional analysis capable of recognizing group, intersectional, and cumulative harms, rather than only “high impact” harms. To these ends, we recommend that data collection by government institutions and political parties be made subject to oversight, and that passage of AIDA be delayed to enable a full public consultation that brings intersectionality to the forefront.