IAPP CIPM – From Small & Medium Enterprise (SME) to Multinational Examples, Part 5

  1. Consent to Children’s Data – is it Legal?

Hi, guys. In this lesson, we’ll discuss consent for children’s data: is it legal or not? Special events involving children are highly sought after by children and parents alike, especially those that involve going through an assessment process to be selected for something as prestigious as national science fairs, debate contests, math competitions, coding conferences, or programmes taught at prestigious schools and universities. Parents and children alike are proud to be among those who get invited. It is then very disappointing, after the child has already been chosen, to be presented with a consent form that the parent must sign for the child to participate further in the desired programme. These forms typically bundle several different rights into a single consent. Typical are forms asking for consent to use a child’s personal data and image, combined with consent to ownership of any intellectual property generated by the child’s participation in the programme.

Worse, the intended processing requires the parent to agree that images taken of their child during the programme may be permanently retained and used on social media and websites as part of the programme’s publicity. The right not to consent, along with alternative ways to participate, is never provided, and the right to withdraw consent is not stated. On their face, these consent forms would appear to violate the Data Protection Directive and the GDPR, which require consent to be unambiguous, specific, informed, and freely given in order to be valid. In these cases, consent cannot be freely given, because of the coercive wording of the forms and the process by which consent is obtained.

As such, gathering consent in this manner, and the subsequent processing of the children’s personal data, would be illegal. Consent must be freely given to be valid. The Article 29 Working Party, in its Opinion 15/2011 on consent (WP187), states that consent can only be valid if the data subject is able to exercise a real choice and there is no risk of deception, intimidation, coercion, or significant negative consequences if he or she does not consent. If the consequences of consenting undermine an individual’s freedom of choice, consent is not free. In its working document on electronic health records (WP131), the Working Party states that free consent means a voluntary decision, by an individual in possession of all of his faculties, taken in the absence of coercion of any kind, be it social, financial, psychological, or other; any consent given under the threat of non-treatment or lower-quality treatment in a medical situation cannot be considered free. Reliance on consent should be confined to cases where the individual data subject has a genuine free choice and is subsequently able to withdraw the consent without detriment.

Because valid consent is not possible where significant negative consequences such as non-participation are present, the data controllers must either find their legal basis elsewhere or provide a valid method of obtaining consent. WP187 makes it clear that if consent is initially sought but another legal basis is then relied upon, doubts could be raised as to the original use of consent as the initial legal ground: if the processing could have taken place from the beginning using the other ground, presenting the individual with a situation where he is asked to consent to the processing could be considered misleading or inherently unfair. In addition, the ability to withdraw consent is required; as WP187 states, in principle consent can be considered deficient if no effective withdrawal is permitted. The situation of a child and their parent being forced to consent or lose the ability to participate is similar to that of an employee, as described in the Working Party’s Opinion 8/2001 on data processing in the employment context (WP48): where consent is required from a worker and there is a real or potential relevant prejudice that arises from not consenting, the consent is not valid in terms of satisfying either Article 7 or Article 8, as it is not freely given.

If it is not possible for the worker to refuse, it is not consent. An area of difficulty is where consent is a condition of employment: the worker is, in theory, able to refuse consent, but the consequence may be the loss of a job opportunity. In such circumstances, consent is not freely given and is therefore not valid. GDPR Recital 38 states that such specific protection should apply in particular to the use of children’s personal data for marketing purposes. The emotional aspects for children and their parents are described in this statement from WP187: while a situation of subordination is often the main reason preventing consent from being free, other contextual elements can influence the decision of the data subject; they can have, for instance, a financial, an emotional, or a practical dimension. These forms typically demand consent as a prerequisite for further participation, but offer no alternative way for a child to participate in the programme if they refuse to consent to having their image captured and processed online. None of the main activities the children participate in has anything to do with the capture of their images. Instead, these children’s images are not only captured but also further processed by being posted on publicly accessible social media and websites. It is well known what happens to images of children on the Internet when bad actors intervene.

There should always be a more granular ability to restrict the use of a child’s image to less durable media, such as broadcast television, printed newspapers, or Snapchat’s live capabilities, while withholding consent to more durable media such as social media, video portals, and other websites. Parents should also be able to withhold consent entirely to the online processing of their child’s images when that processing is not strictly necessary as part of the principal activity the children are involved in. Differentiation should also be made between photos of individual children or close-ups and photos of large groups of children or crowd scenes, as well as between photos labelled with the child’s or the school’s name and unlabelled photos. Videos that depict any child individually should require separate consent, given the additional invasive properties of video, and any audio recorded should be strictly necessary for participation in the programme. Finally, the consent forms combine intellectual property and personal data into a single consent, contrary to the GDPR requirement that separate consents be properly unbundled.

The forms also do not provide notification of the ability to withdraw consent previously granted. I contend that this coerced, and therefore illegal, consent may be the most widespread data protection problem today. So what exactly are the DPAs doing about it? Has anyone seen a specific instance of their DPA educating controllers or starting enforcement actions based on the coerced consent of parents regarding the processing of their children’s personal data? The GDPR requires, in Article 57(1)(b), that DPAs give specific attention to awareness activities addressed specifically to children. I asked local DPAs to educate specific controllers on this, and the controllers promised to do something, but not until, I don’t know, next year, a year and a half from now, or who knows when. When will all DPAs across the European Union start educating and investigating controllers about these widespread violations involving coerced consent to the processing of children’s personal data? That is the question we would all like to have answered.

  2. GDPR Right to Erasure and Backup Systems

Hi, guys. The GDPR requires organisations to delete personal data in certain circumstances, for example, when your organisation has received a valid erasure request, known as the “right to be forgotten,” and no exception under Article 17 of the GDPR applies. Additionally, data controllers must erase personal data when there is no longer a legal basis for processing it, when a deletion deadline under their data retention policies has been reached, or when a supervisory authority orders the controller to comply with a data subject’s erasure request. So, what personal data has to be deleted? While it is clear that this erasure obligation covers personal data in production information systems, organisations may well wonder whether it also requires them to delete personal data from backup systems and archives. Many companies keep database backups for disaster recovery purposes, and the truth is that it is often neither easy nor practical to remove a single record from the backups. Deleting a backup, or manipulating the files within it, can compromise the integrity of the backup as a whole.

For instance, in read-only files, the deletion of any of the data could corrupt other information not associated with the user. Besides, many backup files are compressed and do not allow their contents to be searched or manipulated without restoring the whole backup, making finding and deleting information about a specific individual difficult.

Finally, deleting an individual’s personal data without affecting other data that does not have to be deleted is not always feasible, because many backup products that allow searches within the backup cannot erase the individual’s data without deleting the whole file or record in which the information is contained. Depending on the number of archives containing personal data, the difficulty of restoring an environment, and the kind of disaster recovery tool used, erasing all personal data in a backup system without scrapping the backup entirely may cost an organisation thousands of dollars, and compliance with a single request could become somebody’s full-time job. For these reasons, it is crucial to clarify whether your organisation is obliged to erase personal data from backup systems. In short, yes, it is. The text of the GDPR does not mention any exception for personal data contained in backups, and it does not recognise that a company may be excused from honouring an erasure request merely because compliance proves difficult or would involve a disproportionate effort.

Organisations must delete the data in all its locations without undue delay. But don’t panic: enforcement authorities know how difficult it is to fulfil this obligation in practice. For example, the Danish supervisory authority issued guidance on data deletion explaining that personal data must be deleted from backups where technically possible. This may be the case when the backup consists of an uncompressed copy of a database that allows deletion to be performed in the same way as in the live system. If it is not technically possible to delete individual data in a backup, the organisation must ensure that any data deleted from the production system is deleted again if a backup is restored to production. A recommendation for this purpose is to keep a log of deletions performed in the live system. However, such logging should respect the data minimisation principle: instead of containing an explicit reference to a data subject, the log can indicate, for example, that a given row in a given table has been deleted at a given time.
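To make that recommendation concrete, here is a minimal, hypothetical Python sketch of such a minimised deletion log. It stores only a table name, a row identifier, and a timestamp, never the data subject’s name, and it can be replayed against the production database after a backup is restored. The table names, column names, and use of SQLite are illustrative assumptions, not anything prescribed by the Danish guidance.

```python
import sqlite3
from datetime import datetime, timezone

# Hypothetical sketch: the deletion log lives in its own database file so that
# it is NOT rolled back when the production database is restored from a backup.
# Table and column names are illustrative, not taken from the guidance.

def init_log(log: sqlite3.Connection) -> None:
    """Create the minimised deletion log table if it does not exist."""
    log.execute(
        "CREATE TABLE IF NOT EXISTS deletion_log ("
        "table_name TEXT NOT NULL, row_id INTEGER NOT NULL, deleted_at TEXT NOT NULL)"
    )
    log.commit()

def delete_and_log(live: sqlite3.Connection, log: sqlite3.Connection,
                   table: str, row_id: int) -> None:
    """Delete a row in the live system and record only table, row id, and time."""
    live.execute(f"DELETE FROM {table} WHERE id = ?", (row_id,))  # table name comes from trusted code only
    log.execute(
        "INSERT INTO deletion_log (table_name, row_id, deleted_at) VALUES (?, ?, ?)",
        (table, row_id, datetime.now(timezone.utc).isoformat()),
    )
    live.commit()
    log.commit()

def replay_deletions(live: sqlite3.Connection, log: sqlite3.Connection) -> None:
    """After a backup is restored to production, re-apply every logged deletion."""
    for table, row_id in log.execute("SELECT table_name, row_id FROM deletion_log"):
        live.execute(f"DELETE FROM {table} WHERE id = ?", (row_id,))
    live.commit()
```

For example, after restoring last week’s backup into production, running `replay_deletions(live, log)` would remove once more every row that had been erased in the meantime, without the log itself ever naming a data subject.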

The authority also recommended a fine against a company that failed, among other infringements, to ensure and demonstrate, beyond manually updated deletion logs, effective deletion of personal data, including in backup files. The regulator specified that a retention and deletion strategy must provide for deletion logs, together with systems and processes ensuring that deletion is carried out on the basis of those logs, in accordance with the requirements set out in internal procedures. Similarly, the UK’s supervisory authority, the ICO, released guidance on the right to erasure indicating that it is necessary to take steps to ensure erasure from backup systems. Such steps depend on the organisation’s particular circumstances, its retention schedule (particularly in regard to backups), and the technical mechanisms available to it.

Importantly, the ICO emphasises that organisations must be absolutely clear with individuals about what will happen to their data when the erasure request is fulfilled, including in respect of backup systems. It recognises that, while an erasure request can be fulfilled instantly in a live system, the data will remain in the backup environment for a set period of time until it is overwritten. In those cases, the ICO has clarified that it will be satisfied if the backup data is put beyond use, even if it cannot be immediately overwritten. This means the organisation must guarantee that it will not use the data within the backup for any other purpose, that the data is merely held on the systems until it is replaced in line with an established schedule, and that it commits to permanent deletion of the information if and when this becomes possible. Where data has been put beyond use in this way, the ICO considers it unlikely that its retention within backups poses a significant risk, although this will be context-specific. So how should this be done in practice?

As the authorities have stressed the importance of transparency, both your privacy notice and your communications with data subjects should be extremely clear about the limitations of deleting personal data from backups. Specify that, even when an individual has validly exercised their right to be forgotten, when there is no longer a legal basis for processing, or when it is time to delete the data according to your retention schedule, the data contained in backups will only be deleted or overwritten at a later time, according to the backup and retention schedules. Schedule backups in such a way that they are stored only for a specified, limited, and reasonable time. Implement an automatic deletion logging system that reminds administrators that certain data must be deleted again after a backup is restored to production, while still respecting the data minimisation principle. Protect your backups by means of encryption, secure offsite storage, and environmental controls, among other measures. The best solution will depend on your organisation’s technical capabilities; for example, some cloud solutions allow granular searching within backups.
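As one illustration of keeping backups only for a specified, limited time, here is a small, hypothetical Python sketch that prunes backup archives older than a defined retention period. The directory path, file pattern, and 35-day period are assumptions chosen for the example, not values taken from the guidance discussed above.

```python
from pathlib import Path
from datetime import datetime, timedelta, timezone

# Hypothetical sketch: remove backup archives that have exceeded the documented
# retention period, so erased personal data does not linger in old backups.
BACKUP_DIR = Path("/var/backups/app-db")   # illustrative location
RETENTION = timedelta(days=35)             # illustrative retention period

def prune_old_backups(backup_dir: Path = BACKUP_DIR,
                      retention: timedelta = RETENTION) -> list[Path]:
    """Delete backup files whose modification time is older than the retention period."""
    cutoff = datetime.now(timezone.utc) - retention
    removed = []
    for archive in backup_dir.glob("*.dump.gz"):
        mtime = datetime.fromtimestamp(archive.stat().st_mtime, tz=timezone.utc)
        if mtime < cutoff:
            archive.unlink()
            removed.append(archive)
    return removed

if __name__ == "__main__":
    for path in prune_old_backups():
        print(f"removed expired backup: {path}")
```

A job like this would typically run on a schedule (for instance via cron), so that expired backups, and the personal data inside them, do not accumulate beyond the retention period stated in your policy and privacy notice.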

  3. Video Surveillance Guidelines (part 1)

Hi, guys. In these two videos, we’ll discuss the video surveillance guidelines. The European Data Protection Board, or EDPB, adopted draft guidelines on processing personal data through video devices. The Guidelines explain how to apply the GDPR when data is processed as a result of video surveillance. Video devices used to process personal data by European Union competent authorities for the purposes of prevention, detection, or prosecution of criminal offences or the execution of criminal penalties, or for household purposes, do not fall under the scope of the Guidelines. The household exemption means that purely personal or household activities are out of scope: video surveillance activities that process personal data in the course of the private or family lives of individuals, and that are not made publicly accessible, fall under this exemption. The Guidelines reiterate that controllers must determine a legal basis under the GDPR in order to process personal data through video surveillance. However, they highlight some subtle differences in how a legal basis may be applied.

Firstly, video surveillance based on the mere purpose of “safety” is no longer sufficient or specific enough; the purpose of using video surveillance must be explicit and documented. Secondly, controllers who claim a legitimate interest and necessity under Article 6(1)(f) of the GDPR must consider whether their legitimate interest is compelling enough to override the interests, rights, and freedoms of the data subject. The reasonable expectations of data subjects will play a role in this balancing test: for instance, a data subject may reasonably expect not to be under surveillance in a sanitary facility, but may reasonably expect to be under surveillance at an ATM or a bank. Likewise, the video surveillance must be necessary, meaning that the purpose could not reasonably be achieved by less intrusive means. This covers not only the use of video surveillance itself, but also the storage of the data and what data is captured; for example, faces may need to be blurred in clips taken from the footage. The Guidelines stipulate that controllers must have taken, or at least considered, other measures before resorting to video surveillance.

Examples the EDPB gives include fencing the property, installing regular patrols of security personnel, using gatekeepers, providing better lighting, installing security locks, tamper-proof windows and doors, or applying anti-graffiti coatings or foils to walls. Thirdly, the Guidelines determine that there must be an existing issue justifying the processing of personal data through video surveillance: real-life threats or situations will, or may, dictate whether a controller employs video surveillance. Not only will controllers have to specify the purposes for processing data under the GDPR, they will also have to make a case for processing personal data using video surveillance before any processing takes place; previous robberies, or statistics on crime in or around the area, could be examples. Criteria one and two above, in particular, are a significant improvement over how the Dutch Data Protection Authority, to give one example, has assessed video surveillance to date. And lastly, consent is mentioned as a legal basis in the EDPB Guidelines, but this basis must be taken with a grain of salt and used only in exceptional cases: it is hard to believe that controllers using video surveillance systems covering large areas would collect the consent of data subjects before processing personal data.

Therefore, consent as a legal basis would most likely be used in exceptional cases, for example, the individual monitoring of an athlete. Any transfer or disclosure of footage is considered a separate processing activity, for which the controller needs a legal basis. Additionally, any footage disclosed to a third party, for instance law enforcement agencies, constitutes a new purpose. Where such disclosure is to law enforcement agencies, it is often done under a legal obligation, in which case the new processing purpose is unproblematic. However, this may be different if the disclosure is not made pursuant to a legal obligation. Moreover, aside from the controller determining a legal basis for the transfer, third-party recipients must also perform their own legal analysis and identify their own legal basis for receiving and processing the material.

Although video surveillance may collect special categories of personal data, for example, data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic or biometric data, or data concerning health, sex life, or sexual orientation, this may not necessarily be the original purpose or intent. In such cases, the captured data would not qualify as special category data. However, if data controllers wish to collect and process special categories of personal data, they must identify an exception for such processing under Article 9 of the GDPR. Video footage of an individual cannot, however, in itself be considered biometric data under Article 9 of the GDPR if it has not been specifically and technically processed to contribute to the identification of an individual. For example, for facial recognition to be considered processing of special category data, it must be carried out for the purpose of uniquely identifying a natural person. To determine this, three criteria must be considered. First, the nature of the data: it relates to the physical, physiological, or behavioural characteristics of a natural person.

Second, the means and method of processing: the data results from a specific technical processing. Third, the purpose of the processing: the data must be used to uniquely identify a natural person. Processing biometric data is problematic if individuals represented in the footage have not consented to their biometric data being captured. Controllers must take certain safeguards to ensure that data is stored safely and appropriately; for instance, they must consider appropriate places to store the data, retention periods, and accessibility, and should ensure that speech signals revealing what data subjects are saying cannot be identified. Also, consent will not be valid if there is a clear imbalance between the data subject and the controller, as evidenced by a very recent fine issued on August 21 by the Swedish Data Protection Authority against a school that used facial recognition to track students’ attendance. This was done as a pilot in one class on the basis of consent, but the Swedish DPA ruled that this consent was invalid in view of the clear imbalance between students and the school.

  4. Video Surveillance Guidelines (part 2)

 

Yes, let’s continue with the video surveillance guidelines. The Guidelines further reiterate data subjects’ rights under the GDPR; however, in the context of video surveillance, these rights are more limited. Data subjects have the rights of access, erasure, and objection to the processing of their personal data, but complying with these rights is not so straightforward. For instance, the right of access to footage is difficult to satisfy, as footage usually contains data about more than one individual, so if data subjects request access to such footage, or copies of it, controllers may not readily comply. However, the Guidelines stipulate an interesting solution: controllers may ask the data subject for more specifics, for example the relevant time frame, before searching for any footage. Moreover, the right to erasure does not necessarily mean that controllers will be able to erase data completely; instead, blurring pictures or images so that the data subject can no longer be identified, once there is no longer a legal basis for processing, will constitute erasure.

Further, if any footage has been made public, the controller has the obligation to take the necessary steps to inform other controllers of the request. Objections can be made before entering, while in, or after leaving the monitored area. According to the EDPB, the right to object means that, unless the controller has compelling legitimate grounds, monitoring an area where individuals could be identified is only lawful if the controller is able to immediately stop the camera from processing personal data, or if the monitored area is restricted in such a way that the controller can obtain the approval of the data subject prior to entering the area. The EDPB does not elaborate on how this would work in practice. My take is that, whenever the video surveillance is for safety and security reasons and this has been sufficiently clarified as per the discussion above, the controller would typically have compelling legitimate grounds to continue the video surveillance.

Data subjects should be aware that video surveillance is in operation, and the Guidelines identify two layers at which data subjects should be informed. The first layer is the most crucial, as this is how the controller first engages with the data subject: warning signs with an icon must be displayed to provide easily understood information about the processing that is taking place. Controllers may no longer display a sign that solely states that you are under video surveillance; instead, under the Guidelines, the first layer must identify the controller, the purposes of the processing, and data subjects’ rights. Not only is the information regarding the processing of personal data more detailed, it must also be strategically placed: according to the Guidelines, the warning sign must be positioned at a reasonable distance from the monitored area, so that data subjects are able to determine which area is under surveillance before they are captured. The second layer requires that data subjects also be able to access further information regarding the video surveillance and the processing of data, in hard copy and in the general vicinity of the area under surveillance.

Digital sources are also permitted and must be mentioned in the first layer, for example via a QR code. Here too, a number of practical issues arise, such as how to deal with cameras located at the entrance of a store or shopping centre; in such cases, it may not always be possible to provide the information before data subjects are captured. The Guidelines are unfortunately silent on this and other practical concerns. The Guidelines also determine the parameters for storing and accessing personal data collected through video surveillance. The duration of storage may vary per member state, as member states may have their own legislation on this matter, but the EDPB’s default position is that camera footage should be deleted after one or two days; the longer data is stored, the more argumentation is required for the legitimacy of the purpose and the necessity of the storage. This is a clear deviation from, for example, the current practice of the Dutch DPA in the Netherlands, which allowed a standard retention period of four weeks. Processing must be organised and secure, and data controllers have the obligation to ensure this; additionally, controllers should select privacy-enhancing technologies to implement data protection by design and by default.

Organisational measures must take into account the overall management and operation of the video surveillance system: for example, who can access the footage and where it is stored, who monitors the system, and what measures and procedures are in place for data breaches, incidents, maintenance, and recovery. Technical measures are vital to ensure that video surveillance systems are secure, meaning that systems should include data encryption features, firewalls, antivirus detection systems, and even measures to physically protect the video surveillance system from theft, vandalism, or other accidents. Lastly, controllers will also need to pay special attention to access controls, for example by ensuring that monitors are concealed, that procedures for granting, changing, or revoking access are defined, that user authentication methods are in place, et cetera. Under the GDPR, and as further determined in the Guidelines, controllers are required to undertake a data protection impact assessment, or DPIA, particularly if the processing constitutes systematic monitoring of publicly accessible areas on a large scale.

Given the data processed and the purposes of video surveillance, which can include the protection of people and property, the detection, prevention, and control of offences, the collection of evidence, and the biometric identification of suspects, many cases of video surveillance will require a DPIA. When considering processing personal data through video devices, controllers and potential controllers should consider the following: Under what legal basis can I use video surveillance? Is it a real necessity to have video surveillance in place? Will I be processing special category data? How do I meet the transparency obligations, in particular the need to provide the first layer of information before data subjects are captured by the cameras? How and for how long will I store the footage, and is it necessary to store the footage at all? How will the video surveillance system be equipped to handle and protect personal data? Should I perform a data protection impact assessment? These are all questions that you need to answer before considering or implementing video surveillance in your organisation or business.
