An era of remote learning raises questions about children’s data privacy. As COVID-19 spread through the United States this spring, school districts across the country scrambled to find a way to teach students remotely. Many turned to Zoom, the videoconferencing platform that has rapidly become a household name. But as the use of Zoom skyrocketed, the platform’s data privacy policies came under heightened scrutiny. Just a few weeks after Zoom CEO Eric Yuan gave K-12 schools in the U.S. free accounts in mid-March, the New York attorney general and Senators Ed Markey and Elizabeth Warren sent letters to the company requesting more information about its privacy and security measures. Both parties were particularly concerned about how Zoom handled children’s personal data now that so many minors were using the service for their education. Children’s online privacy in the United States is governed by the Children’s Online Privacy Protection Act (COPPA). Passed in 1998, COPPA is intended to protect the privacy of children under 13 by giving their parents control over the kind of information that is collected about them online. COPPA applies to web services that are either aimed at children under 13 or have “actual knowledge” that they collect and store personal information from children under 13. Personal information includes data like a child’s name, contact information, screen names, photos and videos, and geolocation information. To comply with COPPA, covered websites must publish their privacy policies, provide notice to parents and obtain their consent before collecting personal information from children, and give parents the opportunity to review and delete the information and opt out of further collection. The...
Machine algorithms can discriminate. More accurately, machine algorithms can produce discriminatory outcomes. It seems counterintuitive to think that dispassionately objective machines can make biased choices, but it is important to remember that machines are not completely autonomous in making decisions. Ultimately, they follow instructions written by humans to perform tasks with data provided by humans, and discrimination and bias can creep in at many points in this process. The training data fed to the machine algorithm may contain inherent biases, and the algorithm may then focus on factors in the data that are discriminatory towards certain groups. For example, the natural language processing algorithm “word2vec” learns word associations from a large corpus of text. After finding a strong pattern of men being associated with programming and women with homemaking in the large text datasets fed to it, the algorithm came up with the analogy: “Man is to Computer Programmer as Woman is to Homemaker.” Such stereotypical determinations are among the many discriminatory outcomes algorithms can produce. The European Union (EU), wary of the discriminatory effects decision-making algorithms can produce, included Article 22 when enacting the General Data Protection Regulation, which gives people “the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” Although what constitutes “solely automated processing” is debatable, the EU’s concern about algorithmic discrimination is evident. In the United States (U.S.), instead of passing laws that specifically target algorithmic discrimination, such concerns are handled largely under regular anti-discrimination laws,...
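The “Man is to Computer Programmer as Woman is to Homemaker” analogy arises from simple vector arithmetic over learned word embeddings: the model answers “a is to b as c is to ?” by computing b − a + c and finding the nearest word. A minimal sketch of that mechanism, using hypothetical 2-D vectors invented for illustration rather than a trained word2vec model:

```python
import math

# Hypothetical 2-D embeddings; real word2vec vectors have hundreds of
# dimensions and are learned from large text corpora, where biased
# associations in the data shape the geometry of the vectors.
embeddings = {
    "man":        [0.9, 0.1],
    "woman":      [0.1, 0.9],
    "programmer": [0.8, 0.2],
    "homemaker":  [0.2, 0.8],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via the vector offset b - a + c."""
    target = [bb - aa + cc
              for aa, bb, cc in zip(embeddings[a], embeddings[b], embeddings[c])]
    # Nearest word (excluding the three inputs) by cosine similarity.
    candidates = [w for w in embeddings if w not in (a, b, c)]
    return max(candidates, key=lambda w: cosine(embeddings[w], target))

print(analogy("man", "programmer", "woman"))  # the nearest remaining word
```

Because the toy vectors above place “woman” near “homemaker,” the offset arithmetic returns the stereotyped completion, which is exactly how biased training data surfaces as a biased output even though no step of the computation is itself “prejudiced.”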
As the use of Zoom videoconferencing has skyrocketed since the start of the coronavirus pandemic, the company’s security infrastructure and alleged interference in virtual events on the platform have repeatedly come under fire since global quarantines began in March 2020. With millions of Americans now using Zoom and other videoconferencing tools daily, any data breach may provide unprecedented access to otherwise confidential conversations between users, including any U.S. government and private sector professionals who utilize the app for their work. Furthermore, censorship of certain virtual gatherings may place dangerously restrictive limits on communication and social organizing as the pandemic demands that most of the population continue to conduct its daily business virtually. Most recently, the U.S. Department of Justice has charged former China-based Zoom executive Xinjiang Jin, also known as “Julien Jin,” with conspiracy to commit interstate harassment and unlawful conspiracy to transfer a means of identification after his alleged participation in a scheme to assist the People’s Republic of China in blocking virtual commemorations of the Tiananmen Square massacre in May and June 2020. News of this potential attempt to censor Chinese dissidents should remind users that their choice to route their communications through this (and other) videoconferencing apps has created new, special pandemic-era censorship concerns. Zoom has released a blog post and S.E.C. filing on its website acknowledging the charge and investigation, reaffirming its “support [for] the U.S. Government to protect American interests from foreign influence,” dedication “to the free and open exchange of ideas,” and ongoing, “aggressiv[e]” actions to “anticipate and combat…data security challenges.” Furthermore, the blog post details subpoenas received...
As colleges and universities reopened campuses to students last fall, a number of schools across the United States turned towards the use of location tracking apps, wearable technology, and other surveillance tools in the hope that they would facilitate contact tracing and potentially mitigate the spread of COVID-19 in residence halls and in-person classes. These efforts to monitor student health and track student activity have been met with skepticism from students and privacy advocates, who cite concerns about the invasive nature of such tools and the risk that the data they generate may be misused by unauthorized parties. In Michigan, Oakland University had announced earlier in August that it would require students living in residence halls to wear a BioButton, a coin-sized device that would monitor physiological data, such as skin temperature and heart rate, as well as physical proximity to others wearing BioButtons. Administrators had hoped that this would allow the university to pinpoint early-stage cases among the student body. The university soon withdrew the policy, however, after receiving significant backlash from students, who, citing privacy and transparency issues, petitioned the school to make usage optional. Albion College, a private liberal arts college in Michigan, had issued a similar requirement for students to install the Aura app on their phones before they could come on campus. As a contact-tracing app, Aura would record students’ real-time location using phone GPS services and alert students when they had been in close proximity with someone who had tested positive for the virus. Albion had intended for the Aura app to work in tandem with what some considered to be...
The digital age has made the world smaller and faster, with information and data transferred in an instant, ignoring any and all physical borders. While this digital highway is an essential pillar of our Internet age, it is not without its problems. One area of concern is data protection and privacy enforcement law.
Facial recognition technology continues to experience an onslaught of complications and backlash.
Social media influencers need to stay abreast of intellectual property laws so their content does not violate them. This post explores the U.S. legal issues implicated each time they create a video or post.
Are you a resident of California? Or are you a business owner whose business reaches consumers in California? If your answer to either of these questions is “yes,” then you should familiarize yourself with the California Consumer Privacy Act (“CCPA”).
On September 10, 2018, the Federal Circuit Court of Appeals (“Federal Circuit”) affirmed the ruling of the United States Patent Trial and Appeal Board (“the Board”) in Regents of the University of California v. Broad Institute, finding that there was no interference-in-fact between competing patents that claimed methods of using CRISPR-Cas9 to modify cellular DNA. Rather than settling the patentability issue, however, exhaustive litigation has continued, as both parties seek to protect the results obtained from costly research.
The number of people shot and killed by police officers in the past several years is disturbingly consistent: 987 in 2017, 992 in 2018, 1004 in 2019. People of color and those with mental illnesses are disproportionately the victims.
After serving seven years in prison, Lydell Grant was released on bond in November 2019 as a result of exonerating DNA evidence. Grant was convicted of murder in 2012 primarily on the basis of eyewitness testimony, despite the fact that Houston police could not conclude that the DNA mixture found on the victim belonged to Grant.
If you are a digital content creator in the US, whether it be through YouTube, Instagram or blogs, the Digital Millennium Copyright Act (DMCA) is important to you, as it governs what happens if you post content that may infringe on another person’s copyright.
When implemented in 2018, the European Union’s General Data Protection Regulation (GDPR) represented the most comprehensive privacy and data protection law in the world to date. Its territorial scope is quite staggering.
Anti-aging researchers and their investors are beginning to make bold claims about the future of their field. Bank of America predicted that the market for anti-aging products will grow to $610 billion by 2025, roughly six times what the market is today.
Plano, Texas used to be home to the third oldest Apple store ever built. This Dallas suburb’s median household income of $92,121 is 55% above the national average. The eventual construction of Apple’s 500 locations worldwide was in some ways a result of its early success in Plano.
Level 4 autonomous vehicles, vehicles that do not require human intervention in most circumstances, are predicted to be on the road as soon as 2021. Experts believe that as autonomous vehicles grow in popularity and availability, the prevalence of car ownership will dramatically decrease.
What is the line dividing nature and patentable invention in life sciences and biotechnology? On January 13, 2020, the U.S. Supreme Court refused to answer this question by denying all pending petitions concerning patent eligibility.
In 2018, the General Data Protection Regulation (GDPR) began to govern members of the European Union. The GDPR grants individuals a “right to erasure”: the ability to request that personal data be removed from the Internet. But the European Union’s top court recently stymied the regulation’s effect, ruling that search engine operators are not required to de-reference subjects globally. Thus the potential spillover question, whether a U.S. court ought to enforce a European de-referencing order, will not yet push a cascading privacy-rights debate into American discourse.
Questions surrounding data privacy, and what happens to our personal data once companies collect it, have risen to the forefront of public discussion in recent years more than ever before.
Increased development of virtual reality (“VR”) technology brings a host of legal questions surrounding both the intellectual property (“IP”) of the actual technology as well as unlawful activity within the VR space itself.
Massachusetts is a hot battleground for Right to Repair movements – first for cars, and now for smartphones.
Should a probationer be forced to submit to warrantless searches of their electronic devices at any time, including being forced to provide all electronic passwords to a probation officer to allow remote and continuous monitoring?
What do WeWork, Lyft, and Smile Direct Club have in common? They are “tech companies,” their IPOs underperformed or didn’t happen at all, and they all hired JP Morgan as their underwriter.
From the transferability of social media or email accounts to maintaining online accounts linked to a client’s virtual assets, estate planning issues regarding digital assets have existed for some time. But now that blockchain-based assets such as cryptocurrencies are more commonplace, there is an increased need to plan for the disposition of these digital assets. Estate planning for cryptocurrencies raises unique concerns, and the blockchain technology behind cryptocurrencies might provide potential solutions.