
Right To Know - July 2023, Vol. 7

July 12, 2023

Cyber, Privacy, and Technology Report


Welcome to your monthly rundown of all things cyber, privacy, and technology, where we highlight all the happenings you may have missed.

View previous issues and sign up to receive future newsletters by email here. 


State Actions:  

  • Enforcement of California’s Privacy Regulations Stayed Until 2024: In a surprise turn of events, the Superior Court of California for the County of Sacramento sided with the California Chamber of Commerce and held that the California Privacy Protection Agency (“Agency”) is stayed from enforcing its final regulations issued under the California Consumer Privacy Act, as amended by the California Privacy Rights Act (collectively, the “CCPA”), until March 2024. Further analysis of the decision and its impact on businesses is here.
  • Three States Make Moves on Comprehensive Privacy Laws:
    • Florida Enacts Digital Bill of Rights Act (“FDBR”). On June 6, 2023, Florida Gov. Ron DeSantis signed into law CS/CS/SB 262, which grants Florida consumers certain rights relating to the processing of their personal data by businesses. Parts of SB 262 will come into effect in 2023. The FDBR has different jurisdictional thresholds from other state privacy legislation, such as gross annual revenues of more than $1 billion combined with operating an app store or digital distribution platform offering at least 250,000 unique software applications for download.
    • Texas Data Privacy and Security Act Signed into Law: On June 19, 2023, Governor Greg Abbott signed into law the Texas Data Privacy and Security Act (TDPSA), adding to the growing list of comprehensive state data privacy laws in the U.S. With the exception of provisions related to universal opt-out mechanisms, the law is set to take effect on July 1, 2024, and will create a host of new privacy rights for Texas consumers similar to those found in other comprehensive consumer data privacy laws. These rights include the right to know what information is being collected, to correct that information, and to prohibit its sale. The law also requires covered entities to obtain consent before collecting sensitive personal information (such as race, health conditions, sexuality, citizenship status, and genetic or biometric data) or any information from anyone under the age of 13. Consent is likewise required before collecting precise geolocation data, meaning data identifying a user’s location within a radius of 1,750 feet.
    • Oregon Set to Finalize Comprehensive Privacy Bill: On June 22, 2023, the Oregon legislature passed its data privacy law, joining the 10 other states that have enacted or passed comprehensive consumer data privacy laws. Similar to those other laws, the Oregon law applies to entities that control or process the personal data of 100,000 Oregon consumers or that derive 50% of their revenue from selling the data of more than 25,000 Oregon consumers. The Oregon law, among other things, allows consumers to access personal data stored by an entity, correct inaccuracies, delete personal data, restrict the sale of personal data, and prevent personal data from being used to profile them. If signed by the Governor, the law would take effect on July 1, 2024.
  • Nevada and Connecticut Focus on Consumer Health Data Privacy:
    • Nevada Enacts Consumer Health Data Privacy Law: On June 16, 2023, Nevada Senate Bill 370 was signed into law by Governor Lombardo, imposing new requirements on the collection, use, and sale of consumer health data. The act will take effect on March 31, 2024, and generally prohibits the collection and sharing of consumer health data without the relevant consumer’s affirmative, voluntary consent; it similarly prohibits the sale of consumer health data without the consumer’s written authorization. SB 370 adopts a narrowed definition of consumer health data by focusing on data used to identify the past, present, or future health status of the consumer.
    • Connecticut’s Health Data Law Passes State Legislature: On June 2, 2023, the Connecticut state legislature voted to pass SB 3, which amends the Connecticut Data Privacy Act to give additional protections to consumer health information, which is not limited to gender-affirming and reproductive/sexual health data. In short, the amendment adds health data to the list of sensitive data and requires an opt-in before consumer health data may be sold or offered for sale. The bill also prohibits the use of a geofence to establish a virtual boundary within 1,750 feet of a mental health facility or reproductive/sexual health facility for the purpose of identifying, tracking, collecting data from, or sending notifications to a consumer regarding their health data.
  • Connecticut Enacts AI Law: On June 7, 2023, Connecticut Gov. Ned Lamont signed into law SB 1103, an act concerning artificial intelligence, automated decision-making, and personal data privacy. Among other things, SB 1103 prohibits the use of minors’ data in AI decision-making and targeted advertising, and establishes an Office of Artificial Intelligence and a task force to study AI and develop an AI bill of rights.
  • Florida Bans Maintenance of Patient Information Outside the United States: Also this month, Florida took steps to ban the maintenance of patient information for certified electronic health record technology in a physical or virtual environment outside the United States, its territories, or Canada. CS/CS/SB 264 amends the Florida Electronic Health Records Exchange Act to require health care providers who use certified electronic health record technology to ensure that patient information is physically maintained in the continental United States, its territories, or Canada. The law applies to “all patient information stored in an offsite physical or virtual environment,” including patient information stored through third-party or subcontracted computing facilities or cloud computing service providers, and to all qualified electronic health records stored using any technology that allows information to be electronically retrieved, accessed, or transmitted. The new law is limited to health care providers who use “certified electronic health record technology” (CEHRT), including but not limited to entities licensed by the Florida Agency for Health Care Administration (AHCA), such as hospitals, health care clinics, ambulatory surgical centers, and home health agencies. The new law also requires an entity licensed by AHCA to ensure that a person or entity holding a controlling interest in the licensed entity does not hold, directly or indirectly, an interest in an entity that has a business relationship with a “foreign country of concern,” such as the People’s Republic of China or the Russian Federation.

Federal Actions:  

  • White House Announces New NIST Public Working Group on AI: On June 22, 2023, the White House announced a new NIST public working group on AI. According to the announcement, “The Public Working Group on Generative AI will help address the opportunities and challenges associated with AI that can generate content, such as code, text, images, videos and music. The public working group will also help NIST develop key guidance to help organizations address the special risks associated with generative AI technologies.”
  • Federal Trade Commission Proposes Amendments to Health Breach Notification Rule: As a result of the recent explosion of health apps and connected devices, many of which aren’t covered by HIPAA, on June 8, 2023 the FTC proposed amendments to the Health Breach Notification Rule, including clarifying the rule’s applicability to health apps and other similar technologies. Under the proposed amendments, more entities will be subject to the Health Breach Notification Rule through the addition of two terms, “health care provider” and “health care services or supplies,” the latter of which includes any online service that provides health-related services or tools to track diseases, health conditions, medications, diet, sexual health, and more. A reportable breach would include not just data breaches but any disclosure not authorized by a consumer. The Rule would also permit notification to impacted consumers, with their consent, by text, in-app messaging, or electronic banner in an application.
  • Federal Trade Commission Continues Focus on Children’s Privacy: As the FTC continues to crack down on violations of the Children’s Online Privacy Protection Act (“COPPA”), the Future of Privacy Forum has released a report addressing some of the challenges of complying with one of COPPA’s most often violated provisions: verifiable parental consent. The report examines the challenges that often arise when attempting to obtain effective parental consent and proposes changes to regulatory approaches and consent models. The report and its accompanying infographic are helpful tools as organizations think through their parental consent processes.
  • GLBA Safeguards Rule Now Applicable to Post-Secondary Institutions: On June 9, 2023, the Department of Education began enforcing the GLBA Safeguards Rule under 16 C.F.R. Part 314 against post-secondary institutions and service providers administering Title IV aid. The Department of Education will enforce the Safeguards Rule by including information security in its audit plans going forward. As a reminder, colleges and universities agree to comply with the Safeguards Rule when entering into the Program Participation Agreement (PPA) and the Student Aid Internet Gateway (SAIG) Enrollment Agreement, which commit institutions and servicers to ensure that all federal student aid applicant information is protected from unauthorized disclosure and from threats to the security or integrity of student information. The Federal Student Aid Office of the Department of Education recently published a GLBA Enforcement Dear Colleague Letter reminding institutions to develop, implement, maintain, and test comprehensive, written information security programs (WISPs) with the seven required safeguard elements. Institutions or Title IV service providers processing the student information of 5,000 or more students/consumers must additionally establish an incident response plan and report regularly, and at least annually, to institutional leadership on the institution’s information security program (16 C.F.R. 314.4(h) and (i)).
  • HHS Enforcement Action Centers on Security Guard Activities: On June 15, 2023, the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) announced a settlement with Yakima Valley Memorial Hospital after completing a HIPAA investigation that began in May 2018 regarding allegations that 23 hospital security guards accessed medical records of 419 patients. Yakima voluntarily resolved the matter by agreeing to pay $240,000 and implement a corrective action plan to train its workforce and update its policies and procedures to safeguard its protected health information.

Litigation & Enforcement:  

  • Judge Vacates $228 Million Damages Award in First BIPA Trial: The damages award in the first Illinois Biometric Information Privacy Act (“BIPA”) trial has been vacated, and a new trial on the issue of damages ordered. In Rogers v. BNSF Railway Company, the Court originally calculated a $228 million damages award in favor of the Plaintiffs. Following cross-motions to amend the decision, the District Court vacated that portion of its decision in favor of BNSF so as to allow a jury to determine the appropriate amount of BIPA damages. In its decision, the District Court denied the Plaintiffs’ motion to reconsider its BIPA damages calculation following the Illinois Supreme Court’s White Castle decision, which held that every fingerprint scan was a separate violation for BIPA purposes and which could have significantly increased the damages award.
  • OpenAI Targeted in Multiple Lawsuits Concerning ChatGPT:
    • ChatGPT Alleged to Scrape Personal Information of Individuals without Consent: On June 28, 2023, anonymous Plaintiffs filed a 157-page federal class action complaint in the Northern District of California against ChatGPT creator OpenAI, alleging that the Defendants’ conduct in developing, marketing, and operating their AI products, including ChatGPT-3.5, ChatGPT-4.0, Dall-E, and Vall-E, was unlawful because they used scraping technology and methods to steal private information, including personally identifiable information, “from hundreds of millions of internet users, including children of all ages, without their informed consent or knowledge.” The Plaintiffs request a jury trial, several transparency and governance measures, and additional injunctive, equitable, and monetary relief, including actual damages for economic and non-economic harm and punitive damages in an amount to be determined at trial.
    • ChatGPT Alleged to Violate Copyright and IP Law: Two authors have brought a class action lawsuit against OpenAI over its ChatGPT product, alleging that OpenAI scraped and mined data from thousands of books, including the authors’ own publications, which were protected by copyright. According to the lawsuit, OpenAI acted without permission or license to do so, in violation of federal copyright and other laws.
  • FTC Files Sealed Amended Complaint Against Kochava After Losing First Round in Data Privacy Battle: In 2022, the FTC sued data analytics and marketing company Kochava, alleging that its data collection practices, in particular its collection and use of sensitive geolocation data and mobile device IDs, violate user privacy and constitute an unfair trade practice under Section 5 of the Federal Trade Commission Act. Among other things, the FTC sought a permanent injunction prohibiting Kochava from continuing these data collection practices. Last month, the Idaho Federal District Court overseeing the case granted Kochava’s motion to dismiss the FTC’s action, finding that the FTC failed to sufficiently allege substantial consumer harm. Specifically, the Court held that the risk of misuse of the sensitive consumer data at issue was theoretical at best and did not constitute a substantial injury as required under the FTC Act. Nonetheless, the Court allowed the FTC 30 days to amend its complaint to include such allegations. Earlier this month, the FTC filed its amended complaint against Kochava under seal. According to the FTC, the new complaint was filed under seal because the FTC “anticipates that defendant Kochava Inc. may take the position that some of the materials referenced, excerpted, or cited in the amended complaint constitute trade secrets” and was sealed “out of an abundance of caution.”
  • FTC Sues Amazon for Dark Patterns in Prime Membership Subscription: This month, the FTC sued Amazon in federal court in Washington state, alleging that its Prime subscription enrollment and cancellation processes were unfair trade practices under the FTC Act and violated the federal Restore Online Shoppers’ Confidence Act (ROSCA), the statute governing online subscription practices. In its Complaint, the FTC alleges Amazon worked for years to enroll consumers in its Prime service without their consent while knowingly making it difficult for them to cancel their subscriptions.

International Updates:  

  • Support for EU-US Data Privacy Framework Builds. Paving the way for adoption of the EU-US Data Privacy Framework, on June 30, 2023, the United States Department of Justice and the Director of National Intelligence announced that the United States had satisfied its commitments under President Biden’s Executive Order on the EU-US Data Privacy Framework, including the DNI’s adoption of policies and procedures pursuant to the Executive Order.  On the other side of the framework, 24 EU member states voted in favor of the framework, finding that it provides an adequate level of protection of personal data.  In the coming weeks, the European Commission is expected to adopt the proposed adequacy decision for the Framework, bringing US and EU businesses closer to being able to utilize the Framework as a basis for international data transfers, in addition to the Standard Contractual Clauses and Binding Corporate Rules.
  • WEF Issues AI Guidelines: On June 20, 2023, the World Economic Forum issued Guidelines for Procurement of AI Solutions in the Private Sector. The WEF created the guidelines to “establish standards and frameworks to ensure responsible AI practices and procurement” in light of the “exponential growth of the global AI market.” According to the WEF, “This report offers a structured framework for evaluating the implications of acquiring AI solutions, emphasizing transparency, accountability and human-centered design throughout the development and implementation process.”
  • Sweden Sanctions Spotify for Right-to-Access Violations: The Swedish Privacy Authority (IMY) has fined Spotify SEK 58 million over its handling of consumers’ right to access their personal information. The IMY determined that, after a consumer requests access to their personal information, Spotify does not adequately inform the consumer about how the company uses their data, and that the information Spotify does provide is unclear. Among the shortfalls identified was that the explanation may need to be provided in the individual’s own language.
