The Privacy Commissioner’s Draft Proposals on Online Reputation under PIPEDA

On January 26, 2018, the Office of the Privacy Commissioner of Canada (OPC) released its Draft Position on Online Reputation[1]. The goal of the draft is to give individuals greater control over information they post online about themselves and over information posted about them by others. The OPC has taken the position that, by selling ad space, online search engines and social media providers are conducting a commercial activity, and as such the information displayed in search engine results or on social media accounts is regulated by the Personal Information Protection and Electronic Documents Act (PIPEDA).[2] In consequence, search engines and social media providers are obligated to ensure the accuracy of posted information (which may include keeping information up to date, ensuring its completeness, or de-indexing or deleting it). The OPC proposal is amiable in purpose but may end up stifling freedom of expression. The OPC outlined in its position how the burden falls initially on search engines and social media providers to handle privacy complaints. In my opinion, administrative cost considerations will likely lead these complaints to the default remedies of de-indexing web pages (for search engines) and deleting information (for social media providers). The OPC has also taken the position that an individual should have almost unfettered control over information they have posted about themselves, and that this control should not be subject to a public interest exception. I find this position troubling, as it may allow public figures to whitewash previous posts about themselves or their opinions. Lastly, I will review the OPC proposals on the special circumstances of youth and online reputation control.


De-Indexing and Deleting Likely the Default Remedy for Privacy Complaints

For an individual to challenge information about themselves, whether posted by others or self-posted, the initial complaint must be made to the search engine or social media provider. In response to the complaint, the search engine or social media provider may be required to update, amend, de-index, or delete the information. The practicality of search engines or social media providers making updates or amendments seems doubtful, as in almost all cases the information is not authored by the search engine or social media provider but by a third party or a user. Additionally, in some instances the search engine or social media provider may not have the ability to alter the published information. Alternatively, the search engine may be required to de-index the page from future search results that originate in Canada (geo-fencing).

When considering the correct course of action, the search engine or social media provider is expected to evaluate the privacy and freedom of expression factors on a case-by-case basis and ask whether there is a public interest in maintaining the information. Notably, there is no duty on the search engine or social media provider to inform the author of the challenged information that a challenge has been made (and in some cases it may be impossible to notify the author). This means that in most cases the author of the information will not have a voice to make a counter-argument. Also, if the complainant does not get the desired outcome, he or she can simply file a complaint with the Privacy Commissioner. These factors are likely to result in search engines and social media providers simply deleting information and de-indexing websites or social media posts on receiving the initial complaint, as opposed to going through a burdensome and likely administratively costly analysis weighing privacy against freedom of expression. To the OPC's credit, it seems to indirectly address this issue by noting that the OPC's powers should be increased with regard to oversight at this initial stage:

By way of example, we believe it would be of significant benefit for the OPC to be able to proactively examine how organizations are responding to the de-indexing requests and challenges to accuracy described above.

However, the extent to which the OPC will police the initial complaint process, and assess whether a meaningful weighing of freedom of expression, privacy, and in some cases a public interest defense has actually taken place, remains to be seen.


Self-Posted Information, the Public Interest Defense, and Public Figures

Where a complainant is the author of the challenged information and wishes it to be de-indexed from search engines or deleted from a social media post, the OPC takes the position that this remedy is near absolute and should not be subject to a public interest balancing approach:

This ability to delete self-posted information should be near-absolute, except to the extent that it is subject to legal or contractual restrictions. In particular, we do not believe that this ability should be subject to a public interest test; individuals should be able to delete information they have posted about themselves regardless of whether others would prefer it remain available.

Notably, the only restrictions the OPC recognizes in this situation are legal or contractual ones. The real issue here is: should this be the position with regard to public figures? For example, blogs, articles, or social media posts written by public figures about themselves that they later find embarrassing or inconvenient would not be subject to a public interest exception. The OPC's position seems incongruent with how our society usually views privacy balancing for public figures and holding them to account for past positions or views.

The Provisions for Youth

The OPC notably discussed the special issues around reputation and control of information for youth. Specifically, the OPC noted:

They often have little or no option but to engage online (e.g. due to social pressures or requirements placed on them by schools). They are also in a time of experimentation, in which boundaries are being tested. It is thus critical that youth be provided with a means of reinventing themselves as they mature and enter adulthood – a fact recognized by the existence of “clean slate” and other protective mechanisms in Canada and elsewhere.

The OPC re-articulated its position that an organization captured by PIPEDA must have the consent of a guardian to collect, use, and disclose the information of children under 13, whereas for youth between the ages of 13 and 18 the organization's consent process must take into account the maturity of the youth. When the complainant is a youth, the organization is to weigh more heavily towards de-indexing the information or removing the content as a remedy. The OPC takes the position that a youth's ability to have self-posted information removed should be almost absolute, unrestricted by contractual limitations, and enshrined in statute. Additionally, the OPC has proposed that youth, on attaining the age of majority, should have some ability to remove information about themselves posted by guardians.


