
Right To Know - January 2023, Vol. 2

January 4, 2023

Cyber, Privacy, and Technology Report

 

Welcome to your monthly rundown of all things cyber, privacy, and technology, where we highlight all the happenings you may have missed.

New Laws & Regulations:  

  • The New York City Department of Consumer and Worker Protection announced the postponement of the effective date of Local Law 144 – an amendment to the existing city code that will regulate the use of “automated employment decision tools” (AEDTs) by employers in the city. The Law, which was originally set to go into effect on January 1, 2023, prohibits employers from using artificial intelligence/machine learning tools that constitute AEDTs without first conducting an independent bias audit of the AEDT, publishing the results, and providing notice and an opportunity to opt out to job candidates and employees. Employers now have until at least April 15, 2023, to prepare for compliance with Local Law 144.
  • On December 21, 2022, the Colorado Attorney General’s office published revised draft Colorado Privacy Act (CPA) rules. Interested parties can submit comments during January, and a public hearing is scheduled for February 1, 2023. 
  • On December 16, 2022, during a board meeting of the California Privacy Protection Agency, the Agency indicated that final California Privacy Rights Act (CPRA) regulations will not be released until the end of January 2023 and will require a notice and comment period prior to formal adoption. Under this revised timeline, the CPRA regulations are expected to be effective no sooner than April 2023. 
  • In the run-up to its effective date, the CPRA draft regulations’ exclusion of “cross-context behavioral advertising” from the definition of “business purpose” for Service Providers, and the potential “sale” implications of such exclusion, have been the subject of much commentary from industry groups and businesses, including from the International Association of Privacy Professionals. 
  • Following the trend started by other states, Pennsylvania updated its breach notification law to expand the definition of “personal information.” The updated law has a more expansive definition of PI, which now includes medical information, health insurance information, and a username or email address in combination with a password or security question and answer that would permit access to an online account. The updated law, which goes into effect on May 2, 2023, also allows for expanded electronic notice, shortens the breach reporting time period, and imposes additional requirements on state agencies. 
  • New Jersey joined a growing list of states that seek to regulate the collection and processing of minors’ data. In December 2022, New Jersey proposed Act 4919, which would establish “The New Jersey Children’s Data Protection Commission.” In addition to creating the Commission, the proposed bill would prohibit companies from launching a new online service, product, or feature that is likely to be accessed by children without first: (1) conducting a Data Protection Impact Assessment; (2) documenting any risk of material detriment to children that arises from the data management practices identified in the Data Protection Impact Assessment and creating a timed plan to mitigate or eliminate those risks before the product is accessed by children; (3) configuring default privacy settings provided to children by the online service, product, or feature to settings that offer a high level of privacy; and (4) providing prominent, accessible, and responsive tools to help children exercise their privacy rights and report concerns, among other things. The bill, which is pending before the state assembly, includes a statutory penalty of $2,500 to $7,500 per affected child for noncompliance, enforceable by the state’s attorney general.

Federal Enforcement & Initiatives:   

  • The White House’s executive order implementing the US-EU Data Privacy Framework (DPF) has been deemed preliminarily adequate in a draft decision issued by the European Commission (EC), moving it one step closer to formal EC adoption. The draft adequacy decision concludes that the U.S. has taken steps to ensure that the current U.S. data privacy regime does not undermine the level of protection of personal data transferred from the EU to the U.S. under the relevant EU laws. The draft adequacy decision will now be reviewed by the European Data Protection Board and representatives from EU Member States and the European Parliament. Formal adoption of the decision will allow data transfers from the EU to US companies that self-certify, and annually re-certify, to the U.S. Department of Commerce, and publicly commit to comply with, the new DPF.
  • In December, the U.S.-EU Trade and Technology Council issued a Joint Roadmap on Evaluation and Measurement Tools for Trustworthy AI and Risk Management. This roadmap is intended to guide the development of tools, methodologies, and approaches to AI risk management and trustworthy AI. In addition, the Council reported that a joint study on the impact of AI in the workplace was finalized, which incorporates U.S. and E.U. case studies on hiring and logistics.
  • In a significant decision involving teen and minor privacy, the FTC announced a $275 million fine against Epic Games, the company behind the wildly successful Fortnite online game, for dark patterns that led to automatic billing sign-up and frictionless (and occasionally unintentional) in-game purchases, and for failing to protect the privacy of teen and pre-teen players of the game (the enforcement action also included $245 million in consumer refunds). Of particular note in the privacy realm, the FTC found that because the game allowed for live communications between adults, teens, and children, the game created three types of privacy risks for teens and children: 1) exposing information about minor users to others without clarity about what is exposed, 2) exposing minor users to the unfiltered communications of adults, which could include harassment and abuse, and 3) enabling minors to connect with adults in a public setting that could facilitate sexual exploitation and abuse. The FTC determined that on an interactive platform, other users are third parties that can observe information about each other, and that if a minor is present, their personal information must be protected by default and valid, complete consent must be obtained before open communication is enabled. By failing to protect its teen and child users, Epic was creating privacy risks.
  • The Federal Trade Commission released its Mobile Health App Interactive Tool. According to the FTC, the tool was developed for “anyone developing a mobile app that will access, collect, share, use, or maintain information related to an individual consumer’s health, such as information related to diagnosis, treatment, fitness, wellness, or addiction.” The tool is designed to identify app compliance requirements, including under HIPAA, the FTC Act, the 21st Century Cures Act, the Children’s Online Privacy Protection Act (COPPA), and the federal Food, Drug & Cosmetic Act.
  • Also this month, the FTC extended the deadline for financial institutions regulated by the Gramm-Leach-Bliley Act (GLBA) to comply with certain provisions of its final rule implementing changes to the Standards for Safeguarding Customer Information (“Safeguards Rule”). Specifically, the Rule’s requirements to: (1) designate a qualified individual to oversee an affected entity’s information security program; (2) develop a written risk assessment; (3) encrypt all sensitive customer information; (4) implement multi-factor authentication or equivalent protection for access to customer information; and (5) develop an incident response plan and train security personnel, have been extended from December 2022 to June 9, 2023. Interestingly, the FTC noted that its decision to extend the deadline was based in part on a Small Business Administration (SBA) letter documenting the shortage of qualified personnel to implement information security programs and supply chain issues that may lead to delays in obtaining necessary equipment for upgrading security systems.
  • The Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS) warned covered entities and business associates under the Health Insurance Portability and Accountability Act of 1996 (HIPAA) about the use of tracking technologies. In a bulletin issued on December 1, 2022, HHS OCR reminded covered entities that they are not permitted to use tracking technologies, which are scripts or code on a website or mobile app used to gather information about users as they interact with the website or mobile app, in a manner that would result in impermissible disclosures of PHI to tracking technology vendors or other violations of the HIPAA Rules. Some sensitive information that a covered entity shares with online tracking technology vendors could lead to unauthorized disclosures of PHI to the vendor, which would not only violate HIPAA’s Privacy Rule, but could also result in significant financial and personal harm to the individual or to others identified in the individual’s PHI. A covered entity’s failure to comply with the HIPAA Rules when using tracking technologies could result in a civil money penalty. Additional analysis available here.
  • Under the new Omnibus spending bill, federal employees are no longer allowed to have TikTok on their government-issued phones and devices, amidst growing concerns over TikTok’s sharing of user data with the Chinese government.
  • Following the collapse of the FTX exchange, on December 14, 2022, Senators Elizabeth Warren (D-Mass) and Roger Marshall (R-Kan) introduced the Digital Asset Anti-Money Laundering Act of 2022. The proposed bill would impose anti-money laundering (AML) obligations on custodial and unhosted wallet providers, cryptocurrency miners, and validators. The bill would also increase reporting requirements and prohibit financial institutions from transacting with digital assets that are intended to anonymize transactions. The full bill is available here.
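The mechanism behind the HHS OCR tracking-technologies bulletin above is easy to see in code. The following is a minimal, hypothetical sketch (in Python for illustration; real tracking snippets run as JavaScript in the browser, and every vendor uses its own field names) of the page-view record such a script typically assembles and transmits:

```python
from urllib.parse import urlparse, parse_qs

def build_tracking_payload(page_url: str, client_id: str) -> dict:
    """Assemble the page-view record a third-party tracking snippet
    might send to its vendor. All field names here are hypothetical."""
    parsed = urlparse(page_url)
    return {
        "client_id": client_id,                  # persistent per-browser identifier
        "page_path": parsed.path,                # may name a condition-specific page
        "query_params": parse_qs(parsed.query),  # search terms, appointment details
    }

# A patient browsing a hospital's appointment-scheduling page:
payload = build_tracking_payload(
    "https://hospital.example/appointments/oncology?condition=breast-cancer",
    client_id="GA1.2.12345",
)
```

Even without a name or medical record number, pairing a persistent identifier with a condition-specific path and query string can tie an identifiable individual to a health condition, which is precisely the kind of impermissible disclosure to tracking-technology vendors the bulletin warns covered entities about.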

Litigation & Noteworthy Settlements:  

  • On December 14, 2022, a first-of-its-kind federal class action lawsuit was filed against State Farm, alleging the insurer’s automated claims processing, based on artificial intelligence/machine learning (AI/ML), resulted in algorithmic bias against Black homeowners who submitted insurance claims. The putative class action, titled Huskey v. State Farm Fire & Casualty Company, 22-cv-7014 (N.D. Ill.), asserts that the insurer’s failure to mitigate and correct bias in its automated claims processing constitutes discrimination in violation of the federal Fair Housing Act.
  • NetChoice, a technology trade association, announced that it has filed a lawsuit against the state of California seeking to block the California Age-Appropriate Design Code Act from taking effect. The complaint alleges the recently passed legislation “presses companies to serve as roving censors of speech on the internet,” in violation of the First Amendment to the U.S. Constitution. The law, which is set to take effect July 1, 2024, includes requirements for privacy-by-default settings and data protection impact assessments for operators of websites used by minors in California.
  • In further evidence that courts are closely scrutinizing class action settlements in the privacy and cybersecurity realm, a judge in the Northern District of California refused to preliminarily approve Meta Platforms’ proposed $37.5 million settlement of a 2018 class action lawsuit alleging claims concerning its collection of IP addresses, estimated location, and geolocation data without user consent. Among other things, the judge is reported to have noted that not enough information was provided to understand the class size (estimated at 70 million people) and whether the claims rate was reasonable.

Cyber Insurance & Subrogation:  

  • On December 27, 2022, the Ohio Supreme Court in EMOI Servs., L.L.C. v. Owners Ins. Co., Slip Opinion No. 2022-Ohio-4649, narrowed coverage for ransomware attacks involving software. Specifically, the Court held that the insured, EMOI, a medical billing software developer, could not recover the costs of responding to a ransomware attack under its commercial property and liability insurance policy, reasoning that because software is an “intangible item,” it “cannot experience direct physical loss.” Therefore, the policy, which covered only “direct physical loss of or damage to” the insured’s property, did not apply to the insured’s software that was affected by the cyber incident. The scope and enforceability of insurance coverage continues to evolve in the ransomware and incident response space, with the EMOI decision adding a new layer for those incidents involving company software or data.
  • On December 6, 2022, a federal district court, applying Oregon law, found coverage for a ransomware payment under the Computer Fraud insuring agreement of a commercial crime insurance policy. The decision is entitled Yoshida Foods Int’l, LLC v. Fed. Ins. Co., No. 3:21-cv-01455-HZ, 2022 WL 17480070 (D. Or. Dec. 6, 2022). The Yoshida court held that the policyholder’s ransom payment was “a direct loss of Money” resulting from “Computer Fraud,” as defined in the policy. It also held that the policy’s exclusion for employee-approved payments did not apply because, among other things, the ransom payment was coerced and therefore not truly “approved.”
  • The Supreme Court of California recently held that commercial general liability policies could cover liability for right-of-seclusion violations brought under the Telephone Consumer Protection Act (TCPA). The decision, titled Yahoo Inc. v. National Union Fire Insurance Company of Pittsburgh, Pa., No. S253593 provides that CGL insurance may be available in certain circumstances at least for the defense of TCPA cases. The Supreme Court considered whether “a commercial general liability insurance policy that provides coverage for ‘personal injury,’ defined as ‘injury … arising out of … [o]ral or written publication, in any manner, of material that violates a person’s right of privacy,’ … trigger[s] the insurer’s duty to defend the insured against a claim that the insured violated the [TCPA] … by sending unsolicited text message advertisements that did not reveal any private information.”  Finding that the personal injury coverage provision at issue was ambiguous, the Supreme Court held it was reasonable to read the relevant policy as covering claims arising under the TCPA, where those claims arise from an allegation that the TCPA violation intrudes on a policyholder’s right of seclusion.  

International & Industry Highlights: 

  • In an approach known as “naming and shaming,” the UK Information Commissioner’s Office (ICO) has begun publishing details of personal data breaches, complaints, and civil investigations on its website. This action, plus the ICO’s threefold increase in fines in 2022, makes it an international regulator to watch in 2023.
  • The deadline for transitioning to the updated Standard Contractual Clauses (SCCs) was December 27, 2022. From this date forward, cross-border transfers may only occur where the SCCs adopted by the European Commission on June 4, 2021, are in place. Continuing to transfer personal data outside the EEA without updated SCCs in place may result in a breach of the GDPR.
  • Reflecting increasing concerns over regulatory scrutiny of international data transfers, Microsoft announced the phased roll out of its “EU Data Boundary,” which will allow cloud customers of its products to process and store parts of their customer data inside the EU, limiting the need for data transfers.
  • Apple announced the roll out of a handful of new security tools, including Advanced Data Protection, which allows for end-to-end encryption: “to provide Apple’s highest level of cloud data security, users have the choice to further protect important iCloud data, including iCloud Backup, Photos, Notes, and more.” In addition to providing added security in the event of a security breach, the move also comes amid ongoing concerns about technology companies sharing user data with law enforcement, and would effectively “prevent Apple from accessing iCloud phone backups in response to law enforcement requests.”
  • Effective December 2022, the Interactive Advertising Bureau (IAB)’s Multi-State Privacy Agreement (MPA) is officially available for use by ad-tech companies, advertisers, and publishers. The MPA is a contractual framework that seeks to help companies share Global Privacy Platform signals with online partners in compliance with state privacy laws, including those of California (CPRA), Colorado (CPA), Virginia (VCDPA), Connecticut (CTDPA), and Utah (UCPA).
  • According to a recent study, 96% of apps used in U.S. K-12 schools share personal information with third parties. The Internet Safety Labs study looked at 13 schools in each state, representing half a million students, and found that the sharing of student data with advertisers is a widespread problem. It also found that most schools had more than 150 such apps approved for use in classrooms, presenting a serious monitoring challenge to administrators and parents alike. The researchers found that the explosion of technology use that accompanied the quick shift to remote education during COVID drove the use of technologies that were not fully vetted and, in many cases, included apps that were not designed for use in schools or with children in mind. Yet the researchers also found that customized apps, designed for schools, tended to be less safe than the general pool of apps studied. Concerns in this area from regulators and privacy experts have been on the rise. For instance, the FTC issued guidance in May that education technology companies bound by federal privacy protections for children under 13 were prohibited from using the personal information collected from a child for any commercial purpose, even where the school authorized such collection.
