
Privacy Risks of Using XR in Education

Online learning became widespread and normalized during the pandemic. In a survey of about 3,500 full-time college students conducted from September to October 2020, 72% of students reported concern about remaining engaged while learning remotely. Extended Reality (XR) technologies, including Augmented Reality (AR) and Virtual Reality (VR), can improve student engagement and success in online education. Augmented Reality, as its name suggests, augments a user’s surroundings by placing digital elements in a live view, commonly through the user’s smartphone camera. Virtual Reality, by contrast, replaces the real world entirely: the user wears a headset and headphones that simulate an immersive environment.

Though XR technologies have not yet been widely adopted in education, their use can benefit a variety of disciplines, ranging from medicine to foreign languages. Among various legal uncertainties, universities that seek to provide XR in education should be aware of the privacy risks associated with these technologies.

Privacy Concerns with Computed Data

XR technologies comprise displays and sensors that must collect rich streams of data in order to provide the user with an immersive experience. That data can include a user’s location and biographic, biometric, and demographic information. More intrusive forms of collection include gaze tracking, a feature likely to be essential to XR technologies’ ability to provide deeply immersive experiences, for example by rendering in sharper detail the parts of the virtual world where users are actively looking.

The data that XR devices collect can be sorted into four broad categories: observable, observed, computed, and associated. Observable data is information that third parties can observe and replicate, such as digital communications between users. In contrast, observed data can be observed but not replicated, such as biometric data and location information. Computed data combines observable and observed data to draw new inferences about users, such as biometric information and advertising profiles. Finally, associated data, such as usernames, passwords, or IP addresses, does not reveal identifying information about a user on its own.
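To make the taxonomy concrete, the sketch below models the four categories in code. It is purely illustrative: every type name, field, and example record is an assumption made for this post, not part of any actual XR platform’s data model.

```typescript
// Illustrative sketch only: one hypothetical way to tag XR data records by
// the four categories described above. All names are assumptions, not an
// existing XR or university API.

type DataCategory = "observable" | "observed" | "computed" | "associated";

interface XrDatum {
  category: DataCategory;
  description: string;
  derivedFrom?: XrDatum[]; // computed data is inferred from other records
}

const chatLog: XrDatum = {
  category: "observable", // third parties could observe and replicate it
  description: "In-world messages exchanged between students",
};

const gazeTrace: XrDatum = {
  category: "observed", // sensed directly, but not replicable by others
  description: "Eye-tracking samples captured by the headset",
};

const interestProfile: XrDatum = {
  category: "computed", // an inference, which may be wrong
  description: "Inferred interests used for content or advertising",
  derivedFrom: [chatLog, gazeTrace],
};

const sessionRecord: XrDatum = {
  category: "associated", // reveals little about the user on its own
  description: "Username and IP address tied to the session",
};
```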

Computed data is uniquely concerning because XR devices in educational settings can collect and synthesize significant amounts of data drawn from involuntary and subconscious movements to make inferences or predictions about users, inferences that may not always be correct. For example, depending on the measures used, eye-tracking data can reveal a staggering amount of information about a user’s gender, ethnicity, drug consumption habits, and sexual preferences. Computed data thus contributes to a far more holistic user profile than would be possible with only the other three types of data.

Computed data can cause privacy harms through unintended use, unauthorized access, or malicious misuse. In the educational context, revealing sensitive computed information without a student’s consent or knowledge could cause harm ranging from embarrassment to significant reputational damage. Of the four commonly recognized privacy torts, public disclosure of embarrassing facts and false light invasion of privacy may be the most salient. Public disclosure of embarrassing facts usually requires a plaintiff to show 1) public disclosure of 2) facts concerning the plaintiff’s private life 3) where disclosure would be highly offensive to a reasonable person, 4) the facts are not of legitimate concern to the public, and 5) the disclosure results in mental distress or reputational injury. False light invasion of privacy typically requires 1) giving publicity 2) to a false representation 3) understood as concerning the plaintiff 4) that places the plaintiff in a false light which would be highly offensive to a reasonable person, 5) resulting in damage. Plaintiffs who are not public figures, a category into which students would likely fall, may prevail by proving the false statement was made merely negligently rather than recklessly.

Universities must carefully safeguard sensitive computed information about students to prevent its exposure to classmates, instructors, or prospective employers. One risky situation involves the disclosure of computed student information to employers, who may review comprehensive student profiles and make discriminatory hiring decisions on the basis of potentially incorrect inferences about sensitive matters such as drug use or health status. At a minimum, mitigation techniques should include education and consent: students should receive training on what data is collected, which applications use it, and how they can opt out of non-essential computing and aggregation practices built into XR devices. Universities should also establish clear guidelines on data storage, access, and retention to minimize harms from unauthorized access. And because computed data may draw incorrect inferences about users, a holistic mitigation approach should include procedures enabling students to review their profiles regularly and correct inferred information.
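As a purely hypothetical illustration of what such guidelines might look like once written down, the sketch below encodes retention, opt-out, and review settings as a simple policy object. None of the field names or values come from an actual university policy or XR product; they are assumptions for illustration only.

```typescript
// Hypothetical sketch of institutional data-handling rules for XR coursework;
// every field and value is an illustrative assumption, not an existing standard.

interface XrDataPolicy {
  rawSensorRetentionDays: number;  // how long raw sensor streams are kept
  allowStudentOptOut: boolean;     // opt out of non-essential aggregation
  allowProfileReview: boolean;     // students may review and correct inferences
  prohibitedRecipients: string[];  // parties who may never receive computed data
}

const examplePolicy: XrDataPolicy = {
  rawSensorRetentionDays: 30,
  allowStudentOptOut: true,
  allowProfileReview: true,
  prohibitedRecipients: ["prospective employers", "third-party advertisers"],
};
```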

Universities interested in adopting XR technologies to provide students with more engaging learning experiences must balance that data collection against effective and responsible mechanisms for safeguarding student privacy. The risks of careless adoption can negate any potential benefits in educational enrichment.

Emily Liu is an Associate Editor on the Michigan Technology Law Review.
