
Right To Know - May 2023, Vol. 6

May 3, 2023

Cyber, Privacy, and Technology Report

 

Welcome to your monthly rundown of all things cyber, privacy, and technology, where we highlight all the happenings you may have missed.

View previous issues and sign up to receive future newsletters by email here. 

 

State Actions:  

  • New Washington State Law Dramatically Expands Protections for Consumer Health Information. On April 17, the Washington State legislature passed the “My Health My Data Act” (“MHMDA”), which Governor Jay Inslee is expected to sign into law. If signed, the MHMDA’s new rules would take effect on March 31, 2024. In short, the MHMDA reflects a dramatic expansion of protections for consumer health data, including consumer health data not traditionally protected by the federal HIPAA statute. For more information, check out the Clark Hill Client Alert here.
  • Indiana and Iowa Pass Consumer Privacy Laws. On March 28, Iowa’s six-year effort to pass comprehensive consumer data privacy legislation finally came to fruition, making Iowa the sixth state to pass such a law. Just over two weeks later, Indiana’s legislature passed its own comprehensive consumer data privacy law, making Indiana the seventh state with such legislation. The new Iowa law (the Iowa Act Relating to Consumer Data Protection) is set to take effect on Jan. 1, 2025, and the Indiana law (the Indiana Consumer Data Protection law) a year later on Jan. 1, 2026. Not only were both laws passed within a short time of one another, but they also share quite a few similarities with each other and with other states’ comprehensive data privacy laws. For further analysis of both state laws, check out the Clark Hill Client Alert here.
  • Tennessee and Montana May Be Next to Pass Comprehensive Consumer Privacy Laws. On April 21, both Tennessee and Montana’s legislatures passed comprehensive consumer data privacy laws. Unsurprisingly, these laws are similar to those already enacted by other states. Montana’s law mirrors in many ways the Connecticut consumer data privacy law, and its drafters were particularly adamant about including a universal opt-out provision allowing consumers to opt out of all collection from all businesses. Tennessee’s law also shares a number of similarities with other data privacy laws but includes several key differences. Tennessee’s law has the narrowest application of any of the state privacy laws, applying only to businesses with more than $25 million in revenue that either control or process the data of more than 175,000 consumers, or control or process the data of more than 25,000 consumers and derive 50% or more of their gross revenue from the sale of personal data. Tennessee’s law also includes an affirmative defense to any enforcement proceeding if the business at issue is in “reasonable conformity” with the U.S. National Institute of Standards and Technology’s Privacy Framework.
  • New York Attorney General Releases Data Security Guide. On April 19, 2023, New York AG Letitia James released Protecting Consumers’ Personal Information, a guide to help businesses adopt effective data security measures to protect personal information. The guide is drawn from the AG Office’s experience investigating and prosecuting businesses following cybersecurity breaches. Its tips include secure authentication, encryption, ensuring security of service providers, an inventory of consumer information, guarding against automated attacks, and prompt and adequate breach notification. For businesses covered by New York law, the guide provides a roadmap to security measures that the AG Office expects and will enforce. For others, it provides good general guidance.
  • Multiple BIPA-Like Legislative Proposals Introduced. Recent legislative proposals to govern the collection and use of biometric information by businesses include Arizona’s “Act Relating to Biometric Information” (SB 1238), New York’s “Act Prohibiting Private Entities From Using Biometric Data for Advertising” (AB S02390), and Vermont’s “Act Relating to Protection of Personal Information” (121). Each proposal is progressing through its state’s chambers. Notably, the proposed Arizona law incorporates a private right of action similar to that in Illinois’ BIPA, which has produced multi-million-dollar settlements ($1,000 for negligent violations or $5,000 for intentional or reckless violations). These are ones to watch.
  • Montana Bans TikTok.  Montana is the first state to approve a statewide TikTok ban. The Montana House voted 54-34 in favor of the ban which, if signed by the state governor, would go into effect January 1, 2024. The ban prohibits TikTok from operating within the state and bars app stores from offering TikTok to users within the state. Entities would be fined $10,000 for violating the law. Critics have complained the ban amounts to censorship and violates free-speech rights, and have promised to file legal challenges. The ban follows Utah’s passage of a Social Media Law last month, sharply curtailing the use of social media platforms by minors in that state.
  • California Introduces Bill to Regulate Artificial Intelligence. California State Assembly member Rebecca Bauer-Kahan (D) has sponsored AI-related legislation (A.B. 331) that would impose assessment requirements on the private sector’s use of AI technologies. Drawing on the Biden Administration’s Blueprint for an AI Bill of Rights, the legislation requires creators and users to vet AI systems, and would require developers of automated or AI tools to submit annual impact assessments to the California Civil Rights Department by 2025.
  • New York City Issues Final Regulations under its Automated Employment Decision Tools Law (Local Law 144). This month, the New York City Department of Consumer and Worker Protection formally adopted its highly anticipated final rules implementing Local Law 144, which regulates the use of automated employment decision tools (AEDT) by employers and other entities in the City. Generally, LL144 prohibits the use of AEDTs by employers in the City unless the AEDT has first been subjected to a bias audit, information about the bias audit results has been publicly posted, and written notices and opt-out rights have been provided to employees or job candidates in advance of their use. The final regulations expanded the definitions of machine learning and artificial intelligence, and modified important requirements concerning the bias audit and the independence of the auditor conducting it. While the law has been effective since earlier this year, the final regulations have an effective and enforcement date of July 5, 2023.

Regulatory:  

  • Federal Government Announces Study of Artificial Intelligence Regulation. On April 11, 2023, the US began a study into possible rules to regulate artificial intelligence systems like ChatGPT. The Biden administration is seeking comments on potential accountability measures for artificial intelligence (AI) systems. The goal of the study and potential rules is to ensure “that AI systems are legal, effective, ethical, safe, and otherwise trustworthy.” Public comments are due by June 12, 2023.
  • USPTO Seeks Public Comment Regarding AI Inventions. Can an AI machine be an inventor for purposes of patent and related protections? To date, the USPTO and federal courts have generally said no, holding that an inventor must be a natural person. But the USPTO recently announced that it will hold stakeholder engagement sessions regarding inventorship and AI-enabled innovation, and is accepting public comments on certain questions it has posed regarding AI inventorship. More information can be found here.
  • CFPB Reports Data Breach. In the midst of the Bureau of Consumer Financial Protection (CFPB)’s Dodd-Frank Act Section 1033 rulemaking that would increase regulation of banks and fintech to enhance consumers’ rights to access their personal consumer data, the CFPB reported its own data breach. News outlets reported that the breach, occurring in mid-February 2023, impacted over 250,000 consumers and several financial institutions, and resulted from a former employee emailing spreadsheets containing consumer and banking information to their personal account. The CFPB will face questions concerning the incident when its representatives appear before Senate and House committees later this month.
  • Updates to GLBA Security Requirements Announced for Educational Institutions and Third-Party Service Providers. As of June 9, 2023, postsecondary education institutions and third-party service providers that administer financial aid associated with Title IV programs will need to comply with the final regulations amending the Gramm-Leach-Bliley Act Safeguards Rule, which were issued December 9, 2021. The changes to the Safeguards Rule expand on the minimum security requirements that should already be in place.
  • CISA Publishes Blog Post on Phishing-Resistant MFA. On April 12, 2023, the Cybersecurity and Infrastructure Security Agency (CISA) published “Phishing Resistant MFA is Key to Peace of Mind.” While multifactor authentication (MFA) is a strong safeguard, it is not foolproof, and attackers have found ways to sometimes bypass it with phishing attacks. The post explains these bypass techniques and how to protect against them.
  • CIS Publishes Incident Response Template. In March 2023, the Center for Internet Security (CIS) published an Incident Response Policy Template. The CIS Critical Security Controls, currently in Version 8, are a set of consensus best practices to strengthen cybersecurity. The Controls include 18 Top-Level Controls, with various Safeguards for each of them. The Template is for Control 17, Incident Response Management. It is a helpful tool, both for reviewing existing incident response policies and for preparing new ones.
  • Bipartisan Proposal Seeks to Hold Technology Companies Liable For CSAM. US Senators Lindsey Graham and Richard Blumenthal are reintroducing the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act in an attempt to hold technology companies liable for child sexual abuse materials and images distributed on their platforms. The EARN IT Act makes two substantial changes from prior versions to try to achieve this: 1) it strips companies of liability protections provided under Section 230 of the Communications Decency Act; and 2) it removes the knowledge standard for child sexual abuse materials, thereby making it easier for courts to hold tech companies liable for providing encryption on their platforms because they knew it could be used to transmit such materials. There is significant pushback against the act in the privacy and security communities, as it could weaken the end-to-end encryption used in communication applications. This is another episode in the communications-encryption debate that tech companies should keep an eye on.
  • Federal Bill Follows State Laws Limiting Use of Social Media by Minors. A new US Senate bill is being introduced to set age limits for kids on social media. This comes on the heels of new laws in Utah and Arkansas, signed in the past month, requiring anyone under 18 to get parental consent to join social media platforms. The new Senate bill would prohibit children under 13 from accessing social media and require those aged 13-17 to have parental permission. This could significantly limit how social media platforms are permitted to target children under 17, and specifically those under 13. It is not clear what age verification provisions will be put in place as part of the bill.
  • HHS Announces Amendment to Interoperability Regs. On April 11, 2023, the United States Department of Health and Human Services (HHS) Office of the National Coordinator for Health Information Technology (ONC) issued a proposed rule that would amend certain provisions of the 21st Century Cures Act and make several enhancements to the ONC Health IT Certification Program to advance interoperability, improve transparency, and support the access, exchange, and use of electronic health information. The proposed rule contains many technical requirements relevant to health IT developers, including the implementation of the Electronic Health Record Reporting Program as a new Condition of Certification for developers of certified health information technology under the ONC Health IT Certification Program. But the proposed rule will have the greatest impact on providers, patients, and payers in its modifications and exceptions to the information blocking regulations, such as defining what it means to “offer health information technology” for purposes of the information blocking regulations in the 21st Century Cures Act, and modifying the definition of “health IT developer of certified health IT.” Comments on the proposed rule are due June 20, 2023.
  • HHS Cyber Alert Issued Concerning DDoS Attack. On April 7, 2023, the Department of Health and Human Services’ Health Sector Cybersecurity Coordination Center (HC3) issued an alert to healthcare organizations regarding a distributed denial-of-service (DDoS) attack, which the agency has been tracking since November 2022. These attacks flood targeted networks and servers with fake Domain Name System (DNS) requests for non-existent domains (NXDOMAINs). Like other DDoS attacks, they are carried out by large botnets, which can consist of thousands of compromised devices located worldwide, making this type of DNS attack difficult to detect and block. As a result, NXDOMAIN DDoS attacks could negatively impact network providers, website owners, and end-users or customers, and shut down businesses’ websites. HC3 encourages businesses to exercise caution when blocking IPs, because this could result in legitimate users being prevented from accessing public services.
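For readers with in-house security teams, the attack pattern described above can be illustrated with a minimal sketch: because an NXDOMAIN flood drives the share of “non-existent domain” responses far above normal, counting response codes in a resolver log is one rough indicator. The log format, field order, and 50% threshold below are assumptions for illustration, not part of the HC3 alert.

```python
# Hypothetical sketch: flag a possible NXDOMAIN flood by measuring the
# fraction of DNS responses that came back NXDOMAIN. Assumes a simple
# log format of "timestamp queried-name response-code" per line.
from collections import Counter

def nxdomain_ratio(log_lines):
    """Return the fraction of logged DNS responses whose code is NXDOMAIN."""
    codes = Counter(line.split()[-1] for line in log_lines if line.strip())
    total = sum(codes.values())
    return codes["NXDOMAIN"] / total if total else 0.0

def flag_flood(log_lines, threshold=0.5):
    """Flag a window of log lines when the NXDOMAIN share exceeds threshold."""
    return nxdomain_ratio(log_lines) > threshold

# Example window: three randomized non-existent names and one real lookup.
sample = [
    "2023-04-07T12:00:01 qx1v9.example.com NXDOMAIN",
    "2023-04-07T12:00:01 www.example.com NOERROR",
    "2023-04-07T12:00:02 zk3pq.example.com NXDOMAIN",
    "2023-04-07T12:00:02 a8d0w.example.com NXDOMAIN",
]
print(flag_flood(sample))  # 3 of 4 responses are NXDOMAIN -> True
```

Consistent with HC3’s caution about blocking IPs, a ratio-based signal like this is better used to trigger investigation than to drive automated blocking, since legitimate typo traffic also produces NXDOMAIN responses.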

Litigation & Enforcement:  

  • BIPA Lawsuits Up 65% Following White Castle Decision. Bloomberg reports that the number of lawsuits alleging violations of Illinois’ Biometric Information Privacy Act increased 65% following a precedent-setting ruling by the Supreme Court of Illinois in Cothron v. White Castle, which held that every scan of biometric data is an independent violation of the statute. BIPA requires companies that collect biometric data, which includes fingerprints, eye scans, and facial recognition, to receive advance written consent from employees and customers and to develop a written policy about its collection, retention, and destruction, or face steep statutory penalties for their failure to do so.
  • Criminal Sentencing in Federal Medicare Distribution Case. In one of the first cases brought under the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA), a Florida man was sentenced on April 7, 2023 by a U.S. District judge in the Southern District of Florida to 41 months behind bars for illegally distributing Medicare beneficiary identification. In the case, United States v. McElwee, No. 22-cr-60202, judgment entered (S.D. Fla. Apr. 7, 2023), the man and his co-conspirators admitted to using “data mining” and “social engineering techniques” to collect Medicare beneficiary information, including the names, addresses, dates of birth, social security numbers, and Medicare beneficiary identification numbers of more than 2.6 million beneficiaries. MACRA makes it illegal to buy, sell, or distribute Medicare or Medicaid beneficiary numbers without authority. After collecting the Medicare beneficiary information, the Florida man posted an ad on Craigslist titled “Medicare data for sale” and fraudulently sold the data, netting about $310,000 through the scheme.
  • Illinois Appellate Court Finds Coverage for BIPA Class Action Under Cyber Policy. The Appellate Court of Illinois held that an insured was entitled to coverage for claims expenses it incurred in an Illinois Biometric Information Privacy Act (“BIPA”) class action lawsuit. In the case, Remprex, LLC v. Certain Underwriters at Lloyd’s London, 2021 IL App (1st) 211097, Remprex sought coverage for costs it incurred in two BIPA class actions filed by truck drivers who claimed their privacy rights had been violated when their fingerprints were collected in order to access automatic railyard gates. Remprex was never named as a defendant in the first action, therefore there was no “claim” and coverage was not owing. Remprex was, however, a defendant in the second action. For costs incurred in this action, the court held that Remprex was entitled to its claim expenses under the policy’s Media Liability section, which applied to claims alleging a violation of an individual’s right to privacy during the “course of creating media material.” Importantly, the court also ruled that coverage was not owed under the portion of the Media Liability section applying to the dissemination of material to the public, because collecting truck drivers’ fingerprints was not tantamount to disseminating them to the public. Further, the court found coverage was not owing under the policy’s Data & Network Liability section because collecting and storing fingerprints is not tantamount to a security breach and no personally identifiable information was lost.

International Updates:  

  • EDPB Annual Report Released. The European Data Protection Board (EDPB) released its 2022 Annual Report outlining undertakings and results from 2022. Among other information, the Annual Report includes the EDPB’s five-step methodology that Supervisory Authorities can use for calculating administrative fines under the General Data Protection Regulation.
  • European Parliament Urges Action on AI. On April 17, 2023, members of the European Parliament issued an open letter pushing the EU to impose a new set of rules to govern the use of AI tools. The EU is already considering a proposed Artificial Intelligence Act. However, the rapid development of AI tools, primarily the rapid rise in the popularity of ChatGPT, led the letter’s authors to push for a more rapid regulatory response. The letter is available here.
  • Italy Takes on ChatGPT. The Italian DPA temporarily banned the use of ChatGPT in the country based on privacy concerns and alleged violations of the GDPR. Shortly thereafter, on April 28, 2023, Italy lifted the ban after ChatGPT’s creator, OpenAI, implemented new privacy controls to address the Italian regulators’ concerns. The story of Italy’s ban and its subsequent lifting is available here.
  • France Looks into Connected Vehicles. France’s data protection authority, the Commission nationale de l’informatique et des libertés (CNIL), announced the results of the first meeting of its working group on geolocation tracking and connected vehicles. Called the “compliance club,” the group is set to provide practical and operational recommendations for the use of geolocation data in connected vehicles operating in the country.
