Full Evaluation Questions

These questions form part of the foundation of our evaluation process. You have several options for navigating these questions and learning more about data privacy. You can view the evaluation questions categorized into four concerns: Safety, Privacy, Security, and Compliance. In addition, this page contains all of the evaluation questions, with background and citations.

1: Transparency

1.1: Policy Version

1.1.1: Effective Date (Privacy)

Do the policies clearly indicate the version or effective date of the policies?

1.1.2: Change Log (Privacy)

Do the policies clearly indicate whether a changelog or past policy versions are available for review?

  • Indicator
    • Discloses a public archive or change log of previous policies.
  • Background
    • This indicator seeks evidence that a company provides publicly available records of previous terms so that people can understand how the company’s terms have evolved over time. See Ranking Digital Rights, F2, P2.
    • What is a changelog? A changelog is a file which contains a curated, chronologically ordered list of notable changes for each version of a project. Why keep a changelog? To make it easier for users and contributors to see precisely what notable changes have been made between each release (or version) of the project. Who needs a changelog? People do. Whether consumers or developers, the end users of software are human beings who care about what's in the software. When the software changes, people want to know why and how. See Keep a Changelog. An illustrative excerpt appears after this list.
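
A minimal changelog excerpt in the Keep a Changelog style might look like the following; the version numbers, dates, and entries are invented for illustration and are not drawn from any vendor's actual policies:

    # Changelog

    ## [2.1.0] - 2019-03-01
    ### Added
    - Section describing the third-party analytics providers the product uses.
    ### Changed
    - Clarified how long account data is retained after an account is deleted.

    ## [2.0.0] - 2018-05-25
    ### Changed
    - Rewrote the data processing sections for GDPR compliance.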

1.2: Policy Notice

1.2.1: Change Notice (Privacy)

Do the policies clearly indicate whether or not a user is notified if there are any material changes to the policies?

  • Indicator
    • Discloses notification will be provided to users about changes to the policies.
    • Discloses commitment to notify users about changes to the privacy policy.
  • Background
    • It is common for companies to change their terms of service as their business evolves. However, these changes can have a significant impact on how users can or cannot use the service, with potential impact on users’ freedom of expression rights. We therefore expect companies to commit to notify users when they change these terms and to provide users with information that helps them understand what these changes mean. See Ranking Digital Rights, F2, P2.

1.2.2: Method Notice (Privacy)

Do the policies clearly indicate the method used to notify a user when policies are updated or materially change?

  • Indicator
    • Discloses how users will be directly notified of changes to the policies.
  • Background
    • The FTC adopts the OECD principle that companies should be accountable for their privacy practices. Specifically, the FTC calls on companies to implement procedures – such as designating a person responsible for privacy, training employees, and ensuring adequate oversight of third parties – to help ensure that they are implementing appropriate substantive privacy protections. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 24; See also The Organisation for Economic Co-operation and Development (OECD) Privacy Framework (2013).
    • This indicator seeks clear disclosure by companies of the method and timeframe for notifying users about changes to their terms of service. We expect companies to commit to directly notify users prior to those changes coming into effect. The method of direct notification may differ according to the type of service. For services that contain user accounts, direct notification may involve sending an email or an SMS. For services that do not require a user account, direct notification may involve posting a prominent notice on the main page where users access the service. See Ranking Digital Rights, F2, P2.

1.3: Policy Changes

1.3.1: Review Changes (Privacy)

Do the policies clearly indicate whether or not any updates or material changes to the policies will be accessible for review by a user prior to the new changes being effective?

  • Indicator
    • Discloses the timeframe for notification prior to changes to the policies coming into effect.

1.3.2: Effective Changes (Privacy)

Do the policies clearly indicate whether or not any updates or material changes to the policies are effective immediately and continued use of the product indicates consent?

1.4: Policy Coverage

1.4.1: Services Include (Privacy)

Do the policies clearly indicate the products that are covered by the policies?

  • Indicator
    • Discloses what applications or services are covered by the company's policies.
  • Background
    • If the company offers multiple products and services, it should be clear to what products and services the policies apply. See Ranking Digital Rights, P1.

1.5: Policy Contact

1.5.1: Vendor Contact (Privacy)

Do the policies clearly indicate whether or not a user can contact the vendor about any privacy policy questions, complaints, and material changes to the policies?

1.6: Policy Principles

1.6.1: Quick Reference (Privacy)

Do the policies clearly indicate the vendor's privacy principles through short explanations, layered notices, a table of contents, or an outline of those principles?

1.7: Policy Language

1.7.1: Preferred Language (Privacy)

Do the policies clearly indicate they are available in any language(s) other than English?

  • Indicator
    • Discloses policies are available in the language(s) most commonly spoken by the user.
  • Background
    • This indicator expects companies to provide terms of service and privacy policy that are easy to find, are available in the languages of the primary markets in which the company operates, and to ensure that the policies are easy to understand. If the company offers multiple products and services, it should be clear to what products and services the policies apply. See Ranking Digital Rights, F1, P1.

1.8: Intended Use

1.8.1: Children Intended (Compliance)

Do the policies clearly indicate whether or not the product is intended to be used by children under the age of 13?

  • Indicator
    • Discloses the product is intended to be used by children under the age of 13.

1.8.2: Teens Intended (Compliance)

Do the policies clearly indicate whether or not the product is intended to be used by teens 13 to 18 years of age?

  • Indicator
    • Discloses the product is intended to be used by teens 13 to 18 years of age.
  • Citation
    • Children's Online Privacy Protection Act: (A mixed audience site is one directed to children that does not target children as its "primary audience," but rather teens 13 to 18 years of age or adults. An operator of a mixed audience site is required to obtain age information from a user before collecting any information, and if a user identifies themselves as a child under the age of 13, the operator must obtain parental consent before any information is collected) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
    • General Data Protection Regulation: (In relation to the offer of information society services directly to a child, the processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child. Member States may provide by law for a lower age for those purposes provided that such lower age is not below 13 years.) See General Data Protection Regulation (GDPR), Conditions Applicable to Child's Consent in Relation to Information Society Services, Art. 8(1)

1.8.3: Adults Intended (Compliance)

Do the policies clearly indicate whether or not the product is intended to be used by adults over the age of 18?

1.8.4: Parents Intended (Compliance)

Do the policies clearly indicate whether or not the product is intended to be used by parents or guardians?

1.8.5: Students Intended (Compliance)

Do the policies clearly indicate whether or not the product is intended to be used by students in preschool or K-12?

1.8.6: Teachers Intended (Compliance)

Do the policies clearly indicate whether or not the product is intended to be used by teachers?

  • Indicator
    • Discloses the product is intended to be used by teachers.

2: Focused Collection

2.1: Data Collection

2.1.1: Collect PII (Privacy)

Do the policies clearly indicate whether or not the vendor collects Personally Identifiable Information (PII)?

  • Indicator
    • Discloses Personally Identifiable Information (PII) is collected.
    • Discloses how the product collects personal information.
  • Citation
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
    • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
  • Background
    • FERPA defines the term personally identifiable information (PII) to include direct identifiers (such as a student's or other family member's name) and indirect identifiers (such as a student's date of birth, place of birth, or mother's maiden name). Indirect identifiers include metadata about a student's interaction with an application or service, and even aggregate information can be considered PII under FERPA if a reasonable person in the school community could identify individual students based on the indirect identifiers together with other reasonably available information, including other public information. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 2; See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, p. 2.
    • Companies collect a wide range of personal information from users—from personal details and account profiles to a user’s activities and location. We expect companies to clearly disclose what user information they collect and how they do so. See Ranking Digital Rights, P3.
    • The term “user information” appears in many indicators throughout the Privacy category. An expansive interpretation of user information is defined as: “any data that is connected to an identifiable person, or may be connected to such a person by combining datasets or utilizing data-mining techniques.” As further explanation, user information is any data that documents a user’s characteristics and/or activities. This information may or may not be tied to a specific user account. This information includes, but is not limited to, personal correspondence, user-generated content, account preferences and settings, log and access data, data about a user’s activities or preferences collected from third parties either through behavioral tracking or purchasing of data, and all forms of metadata. See Ranking Digital Rights, P3.

2.1.2: PII Categories (Privacy)

Do the policies clearly indicate what categories of Personally Identifiable Information are collected by the product?

2.1.3: Geolocation Data (Privacy)

Do the policies clearly indicate whether or not precise geolocation data are collected?

  • Indicator
    • Discloses location information is collected.
    • Discloses location information is derived from usage information.
  • Citation
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
    • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
  • Background
    • Location information collected in the mobile context is considered a persistent identifier that can be used to recognize a user over time and across different websites or online services. Geolocation data includes information sufficient to identify the latitude and longitude coordinates of a user that can correspond to a specific street, address, name of a city or town. If location data is collected and shared with third parties, companies should work to provide consumers with more prominent notice and choices about their geolocation data collection, transfer, use, and disposal practices. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 33; See also U.S. v. Jones, 132 S. Ct. 945, 955 (2012)("GPS monitoring generates a precise, comprehensive record of a person's public movements that reflects a wealth of detail about her familial, political, professional, religious, and sexual associations").
    • For mobile ecosystems, we expect companies to clearly disclose what options users have to control the collection of their location information. A user’s location changes frequently and many users carry their mobile devices nearly everywhere, making the collection of this type of information particularly sensitive. In addition, the location settings on mobile ecosystems can influence how other products and services access their location information. For instance, mobile apps may enable users to control location information. However, if the device on which those mobile apps run collects geolocation data by default and does not give users a way to turn this off, users may not be able to limit that mobile app's collection of their location information. For these reasons, we expect companies to disclose that users can control how their device interacts with their location information. See Ranking Digital Rights, P7. A worked illustration of coordinate precision appears after this list.
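
As a rough sense of scale for "precise" geolocation: one degree of latitude spans about 111 km, so each decimal place in a coordinate narrows the location by roughly a factor of ten. The sketch below shows coordinate coarsening, one common way to reduce location precision before storage or sharing; the function name and the chosen precision are illustrative assumptions, not requirements of any law cited above.

    # Sketch: reduce geolocation precision before storing or sharing.
    # One degree of latitude is ~111 km, so roughly:
    #   4 decimal places ~ 11 m   (building-level: "precise" geolocation)
    #   2 decimal places ~ 1.1 km (neighborhood-level)
    def coarsen(lat: float, lon: float, places: int = 2) -> tuple:
        """Round coordinates to `places` decimals, discarding street-level detail."""
        return (round(lat, places), round(lon, places))

    precise = (37.77493, -122.41942)  # hypothetical street-level coordinates
    print(coarsen(*precise))          # -> (37.77, -122.42): ~1 km grid cell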

2.1.4: Health Data (Privacy)

Do the policies clearly indicate whether or not any health or biometric data are collected?

  • Indicator
    • Discloses health or biometric related information is collected.
  • Citation
    • Family Educational Rights and Privacy Act: (A biometric record, as used in the definition of personally identifiable information, means a record of one or more measurable biological or behavioral characteristics that can be used for automated recognition of an individual. Examples include fingerprints; retina and iris patterns; voiceprints; DNA sequence; facial characteristics; and handwriting) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
    • General Data Protection Regulation: ("genetic data" means personal data relating to the inherited or acquired genetic characteristics of a natural person which give unique information about the physiology or the health of that natural person and which result, in particular, from an analysis of a biological sample from the natural person in question) See General Data Protection Regulation (GDPR), Definitions, Art. 4(13)
    • General Data Protection Regulation: ("biometric data" means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data) See General Data Protection Regulation (GDPR), Definitions, Art. 4(14)
    • General Data Protection Regulation: ("data concerning health" means personal data related to the physical or mental health of a natural person, including the provision of health care services, which reveal information about his or her health status) See General Data Protection Regulation (GDPR), Definitions, Art. 4(15)
  • Background
    • Biometric data are physical or behavioral characteristics which can be used to identify unique individuals. Biometric technologies measure these unique characteristics electronically and match them against existing records to create a highly accurate identity management system. Fingerprints, retina scans, or voice and facial recognition are examples of physical identification technologies. Facial recognition uses the layout of facial features and their distance from one another for identification against a "gallery" of faces with similar characteristics. See Privacy Best Practice Recommendations For Commercial Biometric Use, NTIA Discussion Draft (July 22, 2015), p. 1.
    • The ability of facial recognition technology to identify consumers based solely on a photograph, create linkages between the offline and online world, and compile highly detailed dossiers of information, makes it especially important for companies using this technology to implement privacy by design concepts with robust choice and transparency policies. Such practices should include reducing the amount of time consumer information is retained, adopting reasonable security measures, and disclosing to consumers that the facial data collected may be used to link them to information from third parties or publicly available sources. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 46.

2.1.5: Behavioral Data (Privacy)

Do the policies clearly indicate whether or not any behavioral data are collected?

  • Indicator
    • Discloses behavioral or usage information is collected.
  • Citation
    • Children's Online Privacy Protection Act: (An operator is prohibited from including behavioral advertisements or amassing a profile of a child under the age of 13 without parental consent) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Family Educational Rights and Privacy Act: (A biometric record, as used in the definition of personally identifiable information, means a record of one or more measurable biological or behavioral characteristics that can be used for automated recognition of an individual. Examples include fingerprints; retina and iris patterns; voiceprints; DNA sequence; facial characteristics; and handwriting) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • General Data Protection Regulation: ("biometric data" means personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data) See General Data Protection Regulation (GDPR), Definitions, Art. 4(14)

2.1.6: Sensitive Data (Privacy)

Do the policies clearly indicate whether or not sensitive personal information is collected?

  • Indicator
    • Discloses collection of sensitive information such as ethnic, racial, national origin, cultural, religious, or social personal information.
  • Citation
    • General Data Protection Regulation: (Processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person's sex life or sexual orientation shall be prohibited unless: (a) the data subject has given explicit consent to the processing of those personal data for one or more specified purposes, except where Union or Member State law provide that the prohibition ... may not be lifted by the data subject) See General Data Protection Regulation (GDPR), Processing of special categories of personal data, Art. 9(1)-(2)(a)

2.1.7: Usage Data (Privacy)

Do the policies clearly indicate whether or not the product automatically collects any information?

  • Indicator
    • Discloses non-personal usage information is collected.
  • Citation
    • Children's Online Privacy Protection Act: (Personally Identifiable Information under COPPA includes first and last name, photos, videos, audio, geolocation information, persistent identifiers, IP address, cookies, and unique device identifiers) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
    • Student Online Personal Information Protection Act: ("Covered Information" under SOPIPA is personally identifiable information that includes descriptive information or identifies a student that was created or provided by a student, parent, teacher, district staff, or gathered by an operator through the operation of the site) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(i)(1)-(3)
    • California Online Privacy Protection Act: (The term "Personally Identifiable Information" under CalOPPA means individually identifiable information about a consumer collected online by the operator from that individual and maintained by the operator in an accessible form, including any of the following: (1) A first and last name; (2) A home or other physical address, including street name and name of a city or town; (3) An e-mail address; (4) A telephone number; (5) A social security number; or (6) Any other identifier that permits the physical or online contacting of a specific individual) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22577(a)(1)-(6)
    • General Data Protection Regulation: (“personal data” means any information relating to an identified or identifiable natural person ("data subject"); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person) See General Data Protection Regulation (GDPR), Definitions, Art. 4(1)
  • Background
    • The Children's Online Privacy Protection Act (COPPA) defines "personal information" to include identifiers, such as a customer number held in a cookie, an IP address, a processor or device serial number, or a unique device identifier that can be used to recognize a user over time and across different websites or online services, even where such an identifier is not paired with other items of personal information. Companies should disclose in their privacy policy, and in their direct notice to parents, their collection, use or disclosure practices of persistent identifiers unless: (1) the company collects no other "personal information," and (2) persistent identifiers are collected on or through a company's site or service solely for the purpose of providing "support for the internal operations" of the site or service. See FTC, Complying with COPPA: Frequently Asked Questions, q. 6.
    • Persistent identifiers collected for the sole purpose of providing support for the internal operations of the website or online service do not require parental consent, so long as no other personal information is collected and the persistent identifiers are not used or disclosed to contact a specific individual, including through behavioral advertising; to amass a profile on a specific individual; or for any other purpose. See FTC, Complying with COPPA: Frequently Asked Questions, q. 5.
    • The data on students collected and maintained by Ed Tech can be extremely sensitive, including medical histories, social and emotional assessments, progress reports, and test results. Online services also collect new types of data, which were not contemplated by and may not be protected by federal privacy laws. New data types collected by Ed Tech include "metadata," such as a student’s location, how many attempts a student made to answer a question, and whether a student is using a desktop or a mobile device. Metadata can be put to good use to personalize learning and to improve educational products. It can also be used to influence or market to students or to their parents. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 3.
    • A vendor should describe the types or categories of student information that they acquire from schools, school districts, teachers, parents, or students. Data types may include behavioral data reflecting how a student used the site or service or what content the student has accessed or created through it, and transactional data, such as persistent unique identifiers, collected through the use of the vendor's site or service. While unique identifiers are evolving with technology, currently such identifiers include, but are not limited to, cookies, device IDs, IP addresses, and other data elements if used to identify devices or users. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 11. A hypothetical example of such a record appears after this list.
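
To make the categories above concrete, here is a sketch of the kind of record an Ed Tech service might log for a single question attempt, pairing behavioral data with persistent identifiers. Every field name is invented for illustration; no cited law or guidance prescribes this structure.

    # Hypothetical usage-data ("metadata") record for one question attempt.
    # Behavioral data sits alongside transactional data (persistent
    # identifiers), which is what can make such records linkable to a
    # specific student.
    from dataclasses import dataclass

    @dataclass
    class QuestionAttempt:
        # Behavioral data: how the student used the service
        question_id: str
        attempts: int           # tries before a correct answer
        seconds_on_task: float
        device_type: str        # "desktop" or "mobile"
        # Transactional data: persistent identifiers
        cookie_id: str
        device_id: str
        ip_address: str

    event = QuestionAttempt("q-17", attempts=3, seconds_on_task=42.5,
                            device_type="mobile", cookie_id="c-abc123",
                            device_id="d-xyz789", ip_address="203.0.113.7")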

2.1.8: Lunch Status (Privacy)

Do the policies clearly indicate whether or not the vendor collects information on free or reduced lunch status?

  • Indicator
    • Discloses free or reduced lunch information is collected.
  • Citation
    • The National School Lunch Act: (The NSLA defines penalties for the unauthorized sharing of personal information related to free and reduced lunch status for students) See The National School Lunch Act (NSLA), 42 U.S.C. §§ 1751-63
    • Family Educational Rights and Privacy Act: ("Personal Information" under FERPA includes direct identifiers such as a student or family member's name, or indirect identifiers such as a date of birth, or mother's maiden name, or other information that is linkable to a specific student that would allow a reasonable person in the school community to identify the student with reasonable certainty) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.3
  • Background
    • The National School Lunch Act (NSLA) requires school districts to provide free or reduced price lunches to all eligible children, including eligible children in schools that had not yet established school lunch programs. The NSLA aims to safeguard the health and well-being of children and defines penalties for the unauthorized sharing of personal information related to free and reduced lunch status for students. See 42 U.S.C. §§ 1751-63.

2.2: Data Source

2.2.1: Student Data (Compliance)

Do the policies clearly indicate whether or not the vendor collects personal information or education records from preK-12 students?

  • Indicator
    • Discloses education records are collected from preK-12 students.
  • Background
    • The Family Educational Rights and Privacy Act of 1974 (FERPA) provides parents of students the right to access their children's student data or education records, and students 18 years of age and older the right to access their own education records. In addition, FERPA provides the right to have the records amended, and the right to have some control over the disclosure of personally identifiable information (PII) in the education records. Furthermore, strict storage guidelines surround student data, which require organizations to maintain accurate and up-to-date records. See 20 U.S.C. § 1232g; 34 C.F.R. Part 99.1.
    • What are Education Records? FERPA defines educational records as records that are: (1) directly related to a student; and (2) maintained by an educational agency or institution or by a party acting for the agency or institution. These records include, but are not limited to, transcripts, class lists, student course schedules, health records, student financial information, and student disciplinary records. It is important to note that any of these records maintained by a third-party acting on behalf of a school or district are also considered education records. 20 U.S.C. § 1232g (a)(4)(A); 34 CFR § 99.3; See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 1; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 2.

2.2.2: Child Data (Compliance)

Do the policies clearly indicate whether or not the vendor collects personal information online from children under 13 years of age?

  • Indicator
    • Discloses personal information from children under 13 years of age is collected.
  • Background
    • The Children's Online Privacy Protection Act (COPPA) requires a privacy policy to list the kinds of personal information collected from children (for example, name, address, email address, hobbies, etc.), how the information is collected, and how the company uses the personal information. It also requires companies to indicate whether they disclose information collected from children to third parties. If so, the company must also disclose the kinds of businesses in which the third parties are engaged, the general purposes for which the information is used, and whether the third parties have agreed to maintain the confidentiality and security of the information. See 15 U.S.C. § 6502; 16 C.F.R. Part 312.
    • If a company knows that a user of the online website or service is under the age of 13, the Children's Online Privacy Protection Act (COPPA) will impose more stringent requirements on the collection of information from those users. COPPA requires that operators seeking to collect, use, or disclose personal information from children under the age of 13, must first obtain verifiable parental consent. Even where a user is 13 or older, COPPA remains a source of best practices for companies that collect personal information from users, particularly when those users are still minors. See 15 U.S.C. §§ 6501-6506; 16 C.F.R. Part 312.
    • COPPA permits the collection of limited personal information from children under 13 for the purposes of: (1) obtaining verified parental consent; (2) providing parents with a right to opt-out of an operator's use of a child's email address for multiple contacts of the child; and (3) protecting a child's safety on a website or online service. See 15 U.S.C. § 6502(b)(2); 16 C.F.R. § 312.5(c)(1)-(5).

2.3: Data Exclusion

2.3.1: Data Excluded (Privacy)

Do the policies clearly indicate whether or not the vendor excludes specific types of data from collection?

  • Indicator
    • Discloses specific types of information are excluded from collection.

2.3.2: Coverage Excluded (Privacy)

Do the policies clearly indicate whether or not the vendor excludes specific types of collected data from coverage under its privacy policy?

  • Indicator
    • Discloses specific types of information are collected, but excluded from coverage under the policies.

2.4: Data Limitation

2.4.1: Collection Limitation (Privacy)

Do the policies clearly indicate whether or not the vendor limits the collection or use of information to only data that are specifically required for the product?

3: Data Sharing

3.1: Data Shared With Third Parties

3.1.1: Data Shared (Privacy)

Do the policies clearly indicate whether collected information (including data collected via automated tracking or usage analytics) is shared with third parties?

  • Indicator
    • Discloses user information is shared with third parties.
    • Discloses the type of user information shared with third parties.
  • Background
    • Online educational services increasingly collect a large amount of contextual or transactional data as part of their operations, often referred to as "metadata." Metadata refer to information that provides meaning and context to other data being collected; for example, information about how long a particular student took to perform an online task has more meaning if the user knows the date and time when the student completed the activity, how many attempts the student made, and how long the student's mouse hovered over an item (potentially indicating indecision). See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 2-3.
    • Metadata that have been stripped of all direct and indirect identifiers are not considered protected information under FERPA, because the data are not PII. A provider that has been granted access to PII from education records under the "school official" exception may use any metadata that are not linked to FERPA-protected information for other purposes, unless otherwise prohibited by the terms of their agreement with the school or district. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 2-3.
    • Companies collect a wide range of personal information from users—from personal details and account profiles to a user’s activities and location. Companies also often share this information with third parties, such as advertisers, governments, and legal authorities. We expect companies to clearly disclose what user information they share and with whom. Company disclosure should specify if it shares user information with governments and with commercial entities. See Ranking Digital Rights, P4.

3.1.2: Data Categories (Privacy)

Do the policies clearly indicate what categories of information are shared with third parties?

  • Indicator
    • Discloses the categories of information shared with third parties.
  • Background
    • Consumers deserve more transparency about how their data is shared beyond the entities with which they do business directly, including "third-party" data collectors. This means ensuring that consumers are meaningfully aware of the spectrum of information collection and reuse as the number of firms that are involved in mediating their consumer experience or collecting information from them multiplies. The data services industry should follow the lead of the online advertising and credit industries and build a common website or online portal that lists companies, describes their data practices, and provides methods for consumers to better control how their information is collected and used or to opt-out of certain marketing uses. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 62.
    • What is the "School Official" Exception? In some cases, providers need PII from a student's education records in order to deliver the agreed-upon services. FERPA's school official exception to consent is most likely to apply to the schools' and districts' relationships with service providers. When schools and districts outsource institutional services or functions, FERPA permits the disclosure of PII from education records to contractors, consultants, volunteers, or other third parties, provided that the outside party meets specified requirements. See 34 C.F.R. § 99.31(a)(1)(i); See also PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 2; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 3-5.

3.2: Data Use by Third Parties

3.2.1: Sharing Purpose (Privacy)

Do the policies clearly indicate the vendor's intention or purpose for sharing a user's personal information with third parties?

3.2.2: Third-Party Analytics (Privacy)

Do the policies clearly indicate whether or not collected information is shared with third parties for analytics and tracking purposes?

3.2.3: Third-Party Research (Privacy)

Do the policies clearly indicate whether or not collected information is shared with third parties for research or product improvement purposes?

3.2.4: Third-Party Marketing (Privacy)

Do the policies clearly indicate whether or not personal information is shared with third parties for advertising or marketing purposes?

3.3: Data Not Shared With Third Parties

3.3.1: Exclude Sharing (Privacy)

Do the policies specify any categories of information that will not be shared with third parties?

3.4: Data Sold to Third Parties

3.4.1: Data Sold (Privacy)

Do the policies clearly indicate whether or not a user's personal information is sold or rented to third parties?

3.5: Third-Party Data Acquisition

3.5.1: Data Acquired (Privacy)

Do the policies clearly indicate whether or not the vendor may acquire a user's information from a third party?

  • Indicator
    • Discloses user information is collected from third parties.
    • Discloses how user information is collected from third parties.
    • Discloses the purpose for collecting user information from third parties.
  • Background
    • We expect companies to disclose what information about users they collect from third parties, which in this case typically means information collected from third-party websites or apps through technical means, for instance through cookies, plug-ins, or widgets. Company disclosure of these practices helps users understand if and how their activities are being tracked by companies even when they are not on a host company’s website. See Ranking Digital Rights, P9.

3.6: Outbound Links

3.6.1: Outbound Links (Safety)

Do the policies clearly indicate whether or not outbound links on the site to third-party external websites are age appropriate?

  • Indicator
    • Discloses outbound links to third-party external websites are moderated or made age appropriate with an age gate.
    • Discloses notification is provided to users that outbound links open third-party external websites.
  • Citation
    • Children's Internet Protection Act: (If an operator provides third-party links on its site that link to potentially non-age appropriate information for children, then the operator must provide notice upon clicking a third-party link that a user is leaving the website) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254
  • Background
    • If a vendor links or directs student users of the site or service to external, non-Ed Tech sites or services, the vendor should disclose any such referrals in their Privacy Policy and, where possible, include a link to the privacy policy of the referral site or service. If the vendor is also the operator of the external site or service, they should maintain the same privacy and security protections for their student users when they leave the Ed Tech site or service. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 12.
    • We expect the company to clearly disclose whether the privacy policies of the apps that are available in its app store specify what user information the apps collect and whether those policies comply with data minimization principles. See Ranking Digital Rights, P3.

3.7: Third-Party Data Access

3.7.1: Authorized Access (Privacy)

Do the policies clearly indicate whether or not a third party is authorized to access a user's information?

  • Indicator
    • Discloses an authorized third party may access a user's information.
    • Discloses the process for responding to non-judicial government requests.
    • Discloses the process for responding to requests made by private parties.

3.8: Third-Party Data Collection

3.8.1: Third-Party Collection (Privacy)

Do the policies clearly indicate whether or not a user's personal information is collected by a third party?

3.9: Third-Party Data Misuse

3.9.1: Data Misuse (Privacy)

Do the policies clearly indicate whether or not the vendor can delete a user's information from a third party if the third party is found to have misused it?

  • Indicator
    • Discloses user information may be deleted from a third party if misused.
    • Discloses user information may be deleted from a third party if the vendor's policies are violated.

3.10: Third-Party Service Providers

3.10.1: Third-Party Providers (Privacy)

Do the policies clearly indicate whether or not third-party services are used to support the internal operations of the vendor's product?

  • Indicator
    • Discloses third-party service providers may be used to support the product.
  • Background
    • Disclosure of personal information for the "internal operations" of the website or online service, means activities necessary for the site or service to maintain or analyze its functioning; perform network communications; authenticate users or personalize content; serve contextual advertising or cap the frequency of advertising; protect the security or integrity of the user, website, or online service; ensure legal or regulatory compliance; or fulfill a request of a child. See 16 C.F.R. 312.2; See also FTC, Complying with COPPA: Frequently Asked Questions, q. 5.

3.10.2: Third-Party Roles (Privacy)

Do the policies clearly indicate the role of third-party service providers?

3.11: Third-Party Affiliates

3.11.1: Third-Party Categories (Privacy)

Do the policies clearly indicate the categories of related third parties, such as subsidiaries or affiliates with whom the vendor shares data?

3.12: Third-Party Policies

3.12.1: Third-Party Policy (Privacy)

Do the policies clearly indicate whether or not the vendor provides a link to a third-party service provider, data processor, partner, or affiliate's privacy policy?

  • Indicator
    • Discloses links to the privacy policies for third-party service providers.

3.13: Third-Party Data Combination

3.13.1: Vendor Combination (Privacy)

Do the policies clearly indicate whether or not data collected or maintained by the vendor can be augmented, extended, or combined with data from third-party sources?

  • Indicator
    • Discloses user information is combined with information from third parties by the vendor.

3.13.2: Third-Party Combination (Privacy)

Do the policies clearly indicate whether or not data shared with third parties can be augmented, extended, or combined with data from additional third-party sources?

3.14: Third-Party Authentication

3.14.1: Social Login (Privacy)

Do the policies clearly indicate whether or not social or federated login is supported to use the product?

  • Indicator
    • Discloses social login is supported to authenticate with the product.
  • Citation
    • California Privacy of Pupil Records: (Prohibits schools, school districts, county offices of education, and charter schools from collecting or maintaining information about pupils from social media for any purpose other than school or pupil safety, without notifying each parent or guardian and providing the pupil with access and an opportunity to correct or delete such information) See California Privacy of Pupil Records, Cal. Ed. Code § 49073.6(c)

3.14.2: Social Collection (Privacy)

Do the policies clearly indicate whether or not the vendor collects information from social or federated login providers?

  • Indicator
    • Discloses user information is collected from social login third-party service providers.
  • Citation
    • California Privacy of Pupil Records: (Prohibits schools, school districts, county offices of education, and charter schools from collecting or maintaining information about pupils from social media for any purpose other than school or pupil safety, without notifying each parent or guardian and providing the pupil with access and an opportunity to correct or delete such information) See California Privacy of Pupil Records, Cal. Ed. Code § 49073.6(c)

3.14.3: Social Sharing (Privacy)

Do the policies clearly indicate whether or not the vendor shares information with social or federated login providers?

  • Indicator
    • Discloses user information is shared with social login third-party service providers.

3.15: De-identified or Anonymized Data

3.15.1: Data Deidentified (Privacy)

Do the policies clearly indicate whether or not a user's information is shared with or sold to third parties only in an anonymous or de-identified format?

  • Indicator
    • Discloses user information is shared in an anonymized or de-identified format.
    • Discloses user information is sold in an anonymized or de-identified format.
  • Background
    • There is nothing wrong with a provider using de-identified data for other purposes, because privacy statutes govern PII, not de-identified data. But because it can be difficult to fully de-identify data, as a best practice, an agreement between a company and a third party should prohibit re-identification and any future data transfers unless the third party also agrees not to attempt re-identification. It is also a best practice to be specific about the de-identification process. De-identification typically requires more than just removing any obvious individual identifiers, as other demographic or contextual information can often be used to re-identify specific individuals. Retaining location and school information can also greatly increase the risk of re-identification. See PTAC, Protecting Student Privacy While Using Online Educational Services: Model Terms of Service, p. 3.
    • Properly de-identified data can reduce the risk of a person's sensitive personal information being disclosed, but data de-identification must be done carefully. Simple removal of direct identifiers from the data to be released does not constitute adequate de-identification. Properly performed de-identification involves removing or obscuring all identifiable information until all data that can lead to individual identification have been expunged or masked. Further, when making a determination as to whether the data have been sufficiently de-identified, it is necessary to take into consideration cumulative re-identification risk from all previous data releases and other reasonably available information. See PTAC, Data De-identification: An Overview of Basic Terms, p. 3.
    • FERPA allows properly de-identified data to be used for other purposes, though providers planning to use de-identified student data should be clear about their methodologies for de-identification. If de-identified data will be transferred to another party, it is a best practice to contractually prohibit the third party from attempting to re-identify any student data. Providers should also acknowledge whether anonymized metadata (a type of de-identified or partially de-identified data) will be used, and for what purposes. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 3.
    • If a vendor shares covered information for the development and improvement of educational sites or services, they should de-identify and aggregate the information first. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14. A minimal sketch of this kind of de-identification follows this list.
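
A minimal sketch of the de-identification step the guidance above describes, assuming hypothetical field names: direct identifiers are dropped outright, and quasi-identifiers such as school and ZIP code are removed or coarsened. As the PTAC guidance notes, this alone does not guarantee that records cannot be re-identified, so contractual prohibitions on re-identification remain important.

    # Sketch: strip direct identifiers and coarsen quasi-identifiers
    # before sharing a record. All field names are hypothetical.
    DIRECT_IDENTIFIERS = {"name", "email", "student_id", "ssn"}
    QUASI_IDENTIFIERS = {"school", "zip_code", "date_of_birth"}

    def deidentify(record: dict) -> dict:
        out = {}
        for key, value in record.items():
            if key in DIRECT_IDENTIFIERS:
                continue            # drop outright
            elif key in QUASI_IDENTIFIERS:
                out[key] = None     # or coarsen, e.g. ZIP -> first 3 digits
            else:
                out[key] = value
        return out

    record = {"name": "Pat", "student_id": "s-42", "school": "Lincoln MS",
              "zip_code": "94110", "quiz_score": 88}
    print(deidentify(record))
    # -> {'school': None, 'zip_code': None, 'quiz_score': 88}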

3.15.2: Deidentified Process (Privacy)

Do the policies clearly indicate whether or not the de-identification process is done with a reasonable level of justified confidence, or the vendor provides links to any information that describes their de-identification process?

  • Indicator
    • Discloses the process or method in which user information is anonymized or de-identified.
  • Background
    • While data shared in the aggregate can reduce the risk of re-identifying anonymous individuals, it does not completely eliminate the risk, and sharing of aggregate data should be carefully reviewed. The aggregation of student-level data into school-level (or higher) reports removes much of the risk of disclosure, since no direct identifiers (such as a name, Social Security Number, or student ID) are present in the aggregated tables. Some risk of disclosure does remain, however, in circumstances where one or more students possess a unique or uncommon characteristic (or a combination of characteristics) that would allow them to be identified in the data table (this commonly occurs with small ethnic subgroup populations), or where some easily observable characteristic corresponds to an unrelated category in the data table (e.g., if a school reports that 100% of males in grade 11 scored at "Below Proficient" on an assessment). In these cases, some level of disclosure avoidance is necessary to prevent disclosure in the aggregate data table. See PTAC, Frequently Asked Questions—Disclosure Avoidance (Oct 2012), p. 2.
    • FERPA allows properly de-identified data to be used for other purposes, though providers planning to use de-identified student data should be clear about their methodologies for de-identification. If de-identified data will be transferred to another party, it is a best practice to contractually prohibit the third party from attempting to re-identify any student data. Providers should also acknowledge whether anonymized metadata (a type of de-identified or partially de-identified data) will be used, and for what purposes. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 3.
    • A company must take reasonable measures to ensure that the data is de-identified. This means that the company must achieve a reasonable level of justified confidence that the data cannot reasonably be used to infer information about, or otherwise be linked to, a particular consumer, computer, or other device. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 21.
    • Anonymous data is 'data that is in no way connected to another piece of information that could enable a user to be identified.' This expansive view is necessary to reflect several facts. First, skilled analysts can de-anonymize large data sets. This renders nearly all promises of anonymization unattainable. In essence, any data tied to an 'anonymous identifier' is not anonymous; rather, this is often pseudonymous data that may be tied back to the user’s offline identity. Second, metadata may be as or more revealing of a user's associations and interests than content data, thus this data is of vital interest. Third, entities that have access to many sources of data, such as data brokers and governments, may be able to pair two or more data sources to reveal information about users. Thus, sophisticated actors can use data that seems anonymous to construct a larger picture of a user. See Ranking Digital Rights, P3. A minimal sketch of the small-cell suppression described earlier in this list appears below.
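
The sketch below illustrates the disclosure-avoidance technique described in the PTAC FAQ above: before an aggregate table is released, any cell whose count falls below a minimum size is suppressed, since small subgroups are the easiest to re-identify. The threshold of 10 is an illustrative assumption, not a value mandated by FERPA or PTAC.

    # Sketch: suppress small cells in an aggregate table before release.
    MIN_CELL_SIZE = 10  # illustrative threshold; actual values vary by agency

    def suppress_small_cells(table: dict) -> dict:
        """Replace counts below the threshold with a suppression marker."""
        return {group: (count if count >= MIN_CELL_SIZE else "suppressed")
                for group, count in table.items()}

    below_proficient = {"Grade 11 male": 42, "Grade 11 female": 38,
                        "Small ethnic subgroup": 3}
    print(suppress_small_cells(below_proficient))
    # -> {'Grade 11 male': 42, 'Grade 11 female': 38, 'Small ethnic subgroup': 'suppressed'}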

3.16: Third-Party Contractual Obligations

3.16.1: Third-Party Limits (Privacy)

Do the policies clearly indicate whether or not the vendor imposes contractual limits on how third parties can use personal information that the vendor shares or sells to them?

  • Indicator
    • Discloses contractual obligations or restrictions are placed on third parties who receive user information.
  • Citation
    • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
    • Family Educational Rights and Privacy Act: (An exception for disclosing personally identifiable information without obtaining parental consent exists for sharing data with a third party who is considered a "school official" with a legitimate educational interest, and under direct control of the school for the use and maintenance of education records) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.31(a)(1)(i)(B)
    • Student Online Personal Information Protection Act: (An operator may disclose student information to a third party service provider, but the third party is prohibited from using the information for any purpose other than providing the service) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(4)(E)(i)
    • Student Online Personal Information Protection Act: (A third party service provider may not disclose student information to any subsequent third party) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(4)(E)(ii)
    • General Data Protection Regulation: (The processor shall not engage another processor without prior specific or general written authorisation of the controller. In the case of general written authorisation, the processor shall inform the controller of any intended changes concerning the addition or replacement of other processors, thereby giving the controller the opportunity to object to such changes.) See General Data Protection Regulation (GDPR), Processor, Art. 28(2)
    • General Data Protection Regulation: (Processing by a processor shall be governed by a contract or other legal act under Union or Member State law, that is binding on the processor with regard to the controller and that sets out the subject-matter and duration of the processing, the nature and purpose of the processing, the type of personal data and categories of data subjects and the obligations and rights of the controller.) See General Data Protection Regulation (GDPR), Processor, Art. 28(3)
    • General Data Protection Regulation: (Where a processor engages another processor for carrying out specific processing activities on behalf of the controller, the same data protection obligations as set out in the contract or other legal act between the controller and the processor ... shall be imposed on that other processor by way of a contract or other legal act under Union or Member State law, in particular providing sufficient guarantees to implement appropriate technical and organisational measures in such a manner that the processing will meet the requirements of this Regulation. Where that other processor fails to fulfil its data protection obligations, the initial processor shall remain fully liable to the controller for the performance of that other processor's obligations.) See General Data Protection Regulation (GDPR), Processor, Art. 28(4)
    • General Data Protection Regulation: (The processor and any person acting under the authority of the controller or of the processor, who has access to personal data, shall not process those data except on instructions from the controller) See General Data Protection Regulation (GDPR), Processing under the authority of the controller or processor, Art. 29
  • Background
    • A company that transfers data from one company to another should not place emphasis on the disclosures themselves, but on whether a disclosure leads to a use of personal data that is inconsistent with the context of its collection or a consumer's expressed desire to control the data. Thus, if a company transfers personal data to a third party, it remains accountable and thus should hold the recipient accountable—through contracts or other legally enforceable instruments. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.
    • A company's data would not be "reasonably linkable" to a particular consumer or device to the extent that the company implements three significant protections for that data: (1) a given data set is not reasonably identifiable, (2) the company publicly commits not to re-identify it, and (3) the company requires any downstream users of the data to keep it in de-identified form. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 21.
    • The ability to re-identify "anonymous" data supports applying the FTC's framework to data that can be reasonably linked to a consumer or device, because consumers' privacy interest in data goes beyond what is strictly labeled PII. There exists a legitimate interest for consumers in having control over how companies collect and use aggregated or de-identified data, browser fingerprints, and other types of non-PII. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 18-19.
    • Properly de-identified data can reduce the risk of a person's sensitive personal information being disclosed, but data de-identification must be done carefully. Simple removal of direct identifiers from the data to be released does not constitute adequate de-identification. Properly performed de-identification involves removing or obscuring all identifiable information until all data that can lead to individual identification have been expunged or masked. Further, when making a determination as to whether the data have been sufficiently de-identified, it is necessary to take into consideration cumulative re-identification risk from all previous data releases and other reasonably available information (a k-anonymity check along these lines is sketched after this list). See PTAC, Data De-identification: An Overview of Basic Terms, p. 3.
    • A vendor should contractually require service providers who receive covered information acquired through the site or service to use the information only to provide the contracted service, not to further disclose the information, to implement and maintain reasonable security procedures and practices as required by law, and to return or delete covered information at the completion of the contract. A vendor should also include a requirement that any service providers notify it immediately of any unauthorized disclosure of the student information in their custody, and then act promptly to provide proper notice as required by law, and should make clear to service providers that they may separately face liability for the mishandling of student data. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 13.
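
As the PTAC guidance above notes, removing direct identifiers is not adequate de-identification on its own. The sketch below shows one common check, k-anonymity over quasi-identifiers; the quasi-identifier list, the value of k, and the records are illustrative assumptions, not a prescribed standard.

```python
# A minimal k-anonymity check, assuming direct identifiers have already been
# removed; field names and k are illustrative choices.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k=5):
    """Return True if every combination of quasi-identifier values is shared
    by at least k records, so no record is unique on those fields."""
    combos = Counter(
        tuple(r[q] for q in quasi_identifiers) for r in records
    )
    return all(count >= k for count in combos.values())

records = [
    {"zip": "94110", "birth_year": 2008, "grade": 9},
    {"zip": "94110", "birth_year": 2008, "grade": 9},
    {"zip": "94112", "birth_year": 2007, "grade": 10},  # unique: fails k=2
]
print(is_k_anonymous(records, ["zip", "birth_year", "grade"], k=2))  # False
```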

3.16.2: Combination Limits (Privacy)

Do the policies clearly indicate whether or not the vendor imposes contractual limits that prohibit third parties from re-identifying the data the vendor shares or sells to them, or from combining it with other data sources?

  • Indicator
    • Discloses contractual obligations are placed on third parties prohibiting re-identification of anonymized or de-identified data.
  • Citation
    • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
  • Background
    • When data are collected in one context and combined with data from other sources or different contexts, it increases the potential for an individual's privacy to be compromised. Combining data from multiple sources is part of the process of creating a digital profile of a student. Combining data from multiple sources can also be used to re-identify data sets that have been de-identified, or to identify individuals within data sets that have been shared as anonymous aggregated data. A privacy policy that prohibits third-parties from re-identifying anonymous aggregated data provides an additional level of privacy protection for users. See PTAC, Data De-identification: An Overview of Basic Terms.
    • The FTC recommends that third-party data brokers take reasonable precautions to ensure that downstream users of their data do not use it for eligibility determinations or for unlawful discriminatory purposes. Of course, the use of race, color, religion, and certain other categories to make credit, insurance, and employment decisions is already against the law, but data brokers should help ensure that the information does not unintentionally go to unscrupulous entities that would be likely to use it for unlawful discriminatory purposes. Similarly, data brokers should conduct due diligence to ensure that data that they intend for marketing or risk mitigation purposes is not used to deny consumers credit, insurance, employment, or the like. See FTC, Data Brokers: A Call For Transparency and Accountability (May 2014), pp. 55-56.
    • A company that transfers data from one company to another should not place emphasis on the disclosures themselves, but on whether a disclosure leads to a use of personal data that is inconsistent with the context of its collection or a consumer's expressed desire to control the data. Thus, if a company transfers personal data to a third party, it remains accountable and thus should hold the recipient accountable—through contracts or other legally enforceable instruments. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.
    • The FTC's framework applies to data that, while not yet linked to a particular consumer, computer, or device, may reasonably become so. There is significant evidence demonstrating that technological advances and the ability to combine disparate pieces of data can lead to identification of a consumer, computer, or device even if the individual pieces of data do not constitute PII. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 20.
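
To illustrate the combination risk described above, the toy sketch below joins a "de-identified" release to an auxiliary dataset on shared quasi-identifiers; every record, name, and field in it is fabricated.

```python
# A toy linkage attack: joining a "de-identified" release to auxiliary data
# on quasi-identifiers. All data here is fabricated for illustration.

deidentified = [
    {"zip": "94110", "birth_year": 2008, "score": "Below Proficient"},
    {"zip": "94112", "birth_year": 2007, "score": "Proficient"},
]
auxiliary = [  # e.g., a public roster or social profile
    {"name": "A. Student", "zip": "94112", "birth_year": 2007},
]

def link(release, aux, keys=("zip", "birth_year")):
    """Join two datasets on quasi-identifiers; a unique match re-identifies."""
    for a in aux:
        matches = [r for r in release
                   if all(r[k] == a[k] for k in keys)]
        if len(matches) == 1:  # a unique match ties the record to a person
            yield a["name"], matches[0]

for name, record in link(deidentified, auxiliary):
    print(f"{name} re-identified as {record}")
```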

4: Respect for Context

4.1: Data Use

4.1.1: Purpose Limitation (Privacy)

Do the policies clearly indicate whether or not the vendor limits the use of data collected by the product to the educational purpose for which it was collected?

  • Indicator
    • Discloses use of information is limited to the purpose for which it was collected.
    • Discloses user information is only used if it is directly relevant or necessary for the product.
  • Citation
  • Background
    • Any PII from a student's education record that the provider receives under FERPA's "school official" exception may only be used for the specific purpose for which it was disclosed (i.e., to perform the outsourced institutional service or function), and the school or district must have direct control over the use and maintenance of the PII by the provider receiving the PII. Further, under FERPA's school official exception, the provider may not share or sell FERPA-protected information, or re-use it for any other purposes, except as directed by the school or district and as permitted by FERPA. See PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, p. 5.
    • Companies should publicly commit to the principle of use limitation, which is part of the OECD privacy guidelines, among other frameworks. See Ranking Digital Rights, P5.

4.1.2: Data Purpose (Privacy)

Do the policies clearly indicate the context or purpose in which data are collected?

4.2: Data Combination

4.2.1: Combination Type (Privacy)

Do the policies clearly indicate whether or not the vendor would treat Personally Identifiable Information (PII) combined with non-personally identifiable information as PII?

  • Indicator
    • Discloses any collected information combined with personal information is treated as Personally Identifiable Information (PII).
  • Citation
  • Background
    • When data are collected in one context and combined with data from other sources or different contexts, it increases the potential for an individual's privacy to be compromised. Combining data from multiple sources is part of the process of creating a digital profile of a student. Combining data from multiple sources can also be used to re-identify data sets that have been de-identified, or to identify individuals within data sets that have been shared as anonymous aggregated data. A privacy policy that prohibits third-parties from re-identifying anonymous aggregated data provides an additional level of privacy protection for users. See PTAC, Data De-identification: An Overview of Basic Terms.

4.3: Data Notice

4.3.1: Context Notice (Privacy)

Do the policies clearly indicate whether or not notice is provided to a user if the vendor changes the context in which data are collected?

4.4: Data Changes

4.4.1: Change Consent (Privacy)

Do the policies clearly indicate whether or not the vendor will obtain consent if the practices under which data are collected change or become inconsistent with contractual requirements?

  • Indicator
    • Discloses consent will be obtained if the context in which data are collected or used changes.
  • Citation
    • General Data Protection Regulation: (Where the processing for a purpose other than that for which the personal data have been collected is not based on the data subject's consent or on a Union or Member State law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 23(1), the controller shall, in order to ascertain whether processing for another purpose is compatible with the purpose for which the personal data are initially collected, take into account several factors.) See General Data Protection Regulation (GDPR), Lawfulness of Processing, Art. 6(4)(a)-(d)

4.5: Policy Enforcement

4.5.1: Community Guidelines (Privacy)

Do the policies clearly indicate whether or not the vendor may terminate a user's account if they engage in any prohibited activities?

  • Indicator
    • Discloses what type of user content or activities are prohibited on the product.
    • Discloses clear examples to help the user understand what the rules are and how they are enforced.
    • Discloses violations of the rules may result in the restriction or termination of a user's account.
    • Discloses what mechanisms are used to identify accounts that violate the rules.
    • Discloses an appeal process is available to reinstate accounts alleged to have violated the rules.
  • Background
    • We expect companies to clearly disclose what these rules are and how they enforce them. This includes information about how companies learn of material or activities that violate their terms. See Ranking Digital Rights, F3.

5: Individual Control

5.1: User Content

5.1.1: User Submission (Privacy)

Do the policies clearly indicate whether or not a user can create or upload content to the product?

  • Indicator
    • Discloses user content may be created or uploaded to the product.

5.2: Data Consent

5.2.1: Opt-In Consent (Privacy)

Do the policies clearly indicate whether or not the vendor requests opt-in consent from a user at the time information is collected?

5.3: Remedy Process

5.3.1: Complaint Notice (Compliance)

Do the policies clearly indicate whether or not the vendor has a grievance or remedy mechanism for users to file a complaint after the vendor restricts or removes a user's content or account?

  • Indicator
    • Discloses notification is provided to users if their account or content is restricted.
    • Discloses notification is provided to users who attempt to access content that has been restricted.
    • Discloses users can file a complaint if their account or content is restricted.
    • Discloses the reasons why a user's account or content may be restricted.
    • Discloses an appeal process for users to request their account or content be restored.
    • Discloses data about the number of accounts it restricts or closes on its own initiative.
    • Discloses data about the number of accounts it restricts or closes as a result of a government request.
    • Discloses data about the number of accounts it restricts or closes as a result of a request from private third-parties.
  • Citation
  • Background
    • Companies often set boundaries for what content users can post on a service as well as what activities users can engage in on the service. Companies can also restrict a user’s account, meaning that the user is unable to access the service, for violating these rules. For mobile ecosystems, this can include restricting access to an end-user’s account or a developer’s account. See Ranking Digital Rights, F3.
    • We also expect companies to clearly disclose whether they have a policy of granting priority or expedited consideration to any government authorities and/or members of private organizations or other entities that identify their organizational affiliation when they report content or users for allegedly violating the company’s rules. See Ranking Digital Rights, F3.
    • This indicator focuses on whether companies clearly disclose that they notify users when they take these types of actions (whether due to terms of service enforcement or third-party restriction requests). A company's decision to restrict or remove access to content or accounts can have a significant impact on users' freedom of expression and access to information rights. We therefore expect companies to disclose that they notify users when they have removed content, restricted a user's account, or otherwise restricted users' abilities to access a service. If a company removes content that a user has posted, we expect the company to inform that user about its decision. If a different user attempts to access content that the company has restricted, we expect the company to notify that user about the content restriction. We also expect companies to specify reasons for their decisions. This disclosure should be part of companies' explanations of their content and access restriction practices. See Ranking Digital Rights, F8.

5.4: Data Settings

5.4.1: User Control (Privacy)

Do the policies clearly indicate whether or not a user can control the vendor's or a third party's use of their information through privacy settings?

  • Indicator
    • Discloses how users can control the collection, use, or disclosure of their information.
  • Background
    • While notice and consent remains fundamental in many contexts, it is important to examine whether a greater focus on how data is used and reused would be a more productive basis for managing privacy rights in a big data environment. It may be that creating mechanisms for individuals to participate in the use and distribution of his or her information after it is collected is actually a better and more empowering way to allow people to access the benefits that derive from their information. Privacy protections must also evolve in a way that accommodates the social good that can come of big data use. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 61.

5.5: Data Disclosure

5.5.1: Disclosure Opt-Out (Privacy)

Do the policies clearly indicate whether or not a user can opt out from the disclosure or sale of their data to a third party?

5.5.2: Disclosure Request (Privacy)

Do the policies clearly indicate whether or not a user can request the vendor to provide all the personal information the vendor has shared with third parties?

  • Indicator
    • Discloses what types or categories of information users can obtain from a request.
    • Discloses users can obtain a copy of all their information collected by the product.
    • Discloses users can obtain a copy of all their information shared with third parties.
    • Discloses users can obtain their information in a structured data format.
  • Citation
  • Background
    • Users should be able to obtain all information that companies hold about them. We expect companies to clearly disclose what options users have to obtain this information, what data this record contains, and what formats users can obtain it in. See Ranking Digital Rights, P8.

5.5.3: Disclosure Notice (Privacy)

Do the policies clearly indicate whether or not the vendor will provide the affected user, school, parent, or student with notice in the event the vendor receives a government or legal request for their information?

  • Indicator
    • Discloses users are notified when government entities (including courts or other judicial bodies) request their user information.
    • Discloses notification is provided to affected individuals of a government or private request for information.
    • Discloses the number of legal requests for information received.
    • Discloses situations when the company might not notify users, including a description of the types of government requests it is prohibited by law from disclosing to users.
    • Discloses the number of legal requests the company is prohibited by law from disclosing.
    • Discloses commitment to carry out due diligence on requests before deciding how to respond and to deny unlawful requests.
    • Discloses guidance or examples of its process of providing notice.
  • Citation
    • Family Educational Rights and Privacy Act: (An educational agency or institution may disclose information for lawful reasons if they make a reasonable effort to notify the parent or eligible student of the order or subpoena in advance of compliance, so that the parent or eligible student may seek protective action) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.31(a)(9)(ii)
    • California Electronic Communications Privacy Act: (Prohibits a government entity from compelling the production of or access to electronic communication information or electronic device information, without a search warrant, wiretap order, order for electronic reader records, or subpoena issued pursuant to specified conditions, except for emergency situations) See California Electronic Communications Privacy Act, Cal. Pen. Code § 1546-1546.4
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies: ... (d) the personal data have been unlawfully processed) See General Data Protection Regulation (GDPR), Right to erasure, Art. 17(1)(d)
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller restriction of processing where one of the following applies: ... (d) the data subject has objected to processing pursuant to Article 21(1) pending the verification whether the legitimate grounds of the controller override those of the data subject.) See General Data Protection Regulation (GDPR), Right to restriction of processing, Art. 18(1)(d)
  • Background
    • We expect companies to clearly disclose a commitment to notifying users when governments and private parties request data about users. We acknowledge that this notice may not be possible in legitimate cases of an ongoing investigation; however, we expect companies to specify what types of government requests they are prohibited by law from disclosing. See Ranking Digital Rights, P12.

5.6: Intellectual Property

5.6.1: Data Ownership (Compliance)

Do the policies clearly indicate whether or not a student, educator, parent, or the school retains ownership to the Intellectual Property rights of the data collected or uploaded to the product?

  • Indicator
    • Discloses copyright ownership of content remains with the user who created or uploaded the content to the product.
    • Discloses the company does not retain any control or ownership over the operation, use, inputs, or outputs of the product after it has been purchased by the consumer.
  • Citation
  • Background

5.6.2: Copyright License (Compliance)

Do the policies clearly indicate whether or not the vendor may claim a copyright license to the data or content collected from a user?

5.6.3: License Limits (Compliance)

Do the policies clearly indicate whether or not the vendor limits its copyright license to a user's data?

  • Indicator
    • Discloses the company may limit its copyright license to users' information or content in certain situations.

5.6.4: Infringement Notice (Compliance)

Do the policies clearly indicate whether or not the vendor provides notice to a user when their content is removed or disabled because of alleged infringement or other Intellectual Property violations?

  • Indicator
    • Discloses processes for receiving copyright infringement complaints.
    • Discloses notification is provided to users when copyright infringement complaints are made against their content.
  • Citation
    • Digital Millennium Copyright Act: (The provider of a service or application that has removed or disabled access to material or activity claimed to be infringing must take reasonable steps to promptly notify the subscriber that it has removed or disabled access to their material) See Digital Millennium Copyright Act (DMCA), 17 U.S.C. § 512(g)(2)(A)
  • Background
    • The Digital Millennium Copyright Act (DMCA) establishes procedures for proper notification and rules to take down a user's content that violates the copyrights of others. Under the notice and takedown procedure, a copyright owner submits a notification under penalty of perjury, including a list of specified elements to the service provider's designated agent. If, upon receiving a proper notification, the service provider promptly removes or blocks access to the material identified in the notification, the provider is exempt from liability. However, the service provider is required to provide adequate notice to the affected user, who then has the ability to respond to the notice and takedown by filing a counter notification. See U.S. Copyright Office Summary, The Digital Millennium Copyright Act (DMCA), p. 12; See also 17 U.S.C. § 512(c)(3); 17 U.S.C. § 512(g)(1).

6: Access and Accuracy

6.1: Data Access

6.1.1: Access Data (Compliance)

Do the policies clearly indicate whether or not the vendor provides authorized individuals a method to access a user's personal information?

6.1.2: Restrict Access (Compliance)

Do the policies clearly indicate whether or not the vendor provides mechanisms (permissions, roles, or access controls, etc.) to restrict what data is accessible to specific users?

6.1.3: Review Data (Compliance)

Do the policies clearly indicate whether or not the vendor provides a process available for the school, parents, or eligible students to review student information?

6.2: Data Integrity

6.2.1: Maintain Accuracy (Compliance)

Do the policies clearly indicate whether or not the vendor takes steps to maintain the accuracy of data they collect and store?

6.3: Data Correction

6.3.1: Data Modification (Compliance)

Do the policies clearly indicate whether or not the vendor provides authorized individuals with the ability to modify a user's inaccurate data?

  • Indicator
    • Discloses processes for the correction or modification of users' information.
  • Citation
    • California Online Privacy Protection Act: (If the operator maintains a process for a consumer to review and request changes to any of their personally identifiable information they must provide a description of that process) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(2)
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her. Taking into account the purposes of the processing, the data subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement.) See General Data Protection Regulation (GDPR), Right to rectification, Art. 16

6.3.2: Modification Process (Compliance)

Do the policies clearly indicate whether or not the vendor provides a process for the schools, parents, or eligible students to modify inaccurate student information?

6.3.3: Modification Notice (Compliance)

Do the policies clearly indicate how long the vendor has to modify a user's inaccurate data after being given notice?

  • Indicator
    • Discloses a timeframe in which to modify users' information after provided notification of the request.
  • Citation
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her. Taking into account the purposes of the processing, the data subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement.) See General Data Protection Regulation (GDPR), Right to rectification, Art. 16

6.4: Data Retention

6.4.1: Retention Policy (Compliance)

Do the policies clearly indicate the vendor's data retention policy, including any data sunsets or any time-period after which a user's data will be automatically deleted if they are inactive on the product?

  • Indicator
    • Discloses a timeframe in which the company may retain user information.
    • Discloses users' information is automatically deleted after a specified timeframe.
    • Discloses users' information is retained for different timeframes based on the type of data collected.
  • Citation
  • Background
    • A vendor should retain student information acquired through the site or service only as long as allowed or required by the school or district. A vendor should also describe their data retention policy, including how long they retain student information and why. A vendor's default retention period for covered information should not be indefinite. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 12.
    • Companies collect a wide range of personal information from users in exchange for the use of and access to the company's products and services. This information can range from personal details, profiles, and account activities to information about a user's activities and location. We expect companies to clearly disclose how long they retain user information and the extent to which they remove identifiers from user information they retain. Users should also be able to understand what happens when they delete their accounts. Companies that choose to retain user information for extended periods of time should take steps to ensure that data is not tied to a specific user. Acknowledging the ongoing debates about the efficacy of de-identification processes, and the growing sophistication around re-identification practices, we still consider de-identification a positive step that companies can take to protect the privacy of their users. If companies collect multiple types of information, we expect them to provide detail on how they handle each type of information. See Ranking Digital Rights, P6.
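
To make the retention-schedule idea above concrete, here is a minimal sketch of an automated retention sweep with per-category timeframes; the categories, periods, and record shape are illustrative assumptions rather than requirements drawn from the cited guidance.

```python
# A minimal sketch of an automated retention sweep with a hypothetical
# per-category schedule; values are illustrative only.
from datetime import datetime, timedelta, timezone

RETENTION = {  # retention varies by type of data, as described above
    "account_profile": timedelta(days=365),
    "usage_logs": timedelta(days=90),
    "support_tickets": timedelta(days=180),
}

def expired(records, now=None):
    """Yield records older than the retention period for their category."""
    now = now or datetime.now(timezone.utc)
    for rec in records:
        limit = RETENTION.get(rec["category"])
        if limit and now - rec["created_at"] > limit:
            yield rec

records = [
    {"id": 1, "category": "usage_logs",
     "created_at": datetime.now(timezone.utc) - timedelta(days=120)},
]
for rec in expired(records):
    print("delete:", rec["id"])  # a real sweep would delete, then log the action
```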

6.4.2: Retention Limits (Compliance)

Do the policies clearly indicate whether or not the vendor will limit the retention of a user's data unless a valid request to inspect data is made?

  • Indicator
    • Discloses limits on the retention of users' information if an inspection or legal request is made.
  • Citation
    • Family Educational Rights and Privacy Act: (An educational institution must annually notify parents of their rights to inspect and review a student's education records, make corrections, delete, or consent to the disclosure of information) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.7(a)
    • General Data Protection Regulation: ([Data is] kept in a form which permits identification of data subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the personal data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by this Regulation in order to safeguard the rights and freedoms of the data subject) See General Data Protection Regulation (GDPR), Principles relating to processing of personal data, Art. 5(1)(e)
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller restriction of processing where one of the following applies: ... (c) the controller no longer needs the personal data for the purposes of the processing, but they are required by the data subject for the establishment, exercise or defence of legal claims) See General Data Protection Regulation (GDPR), Right to restriction of processing, Art. 18(1)(c)

6.5: Data Deletion

6.5.1: Deletion Purpose (Compliance)

Do the policies clearly indicate whether or not the vendor will delete a user's personal information when the data are no longer necessary to fulfill its intended purpose?

  • Indicator
    • Discloses users' information will be deleted when no longer necessary for the purpose for which it was collected.
  • Citation
    • Children's Online Privacy Protection Act: (An operator may retain information collected from a child only as long as necessary to fulfill the purpose for which it was collected and must delete the information using reasonable measures to prevent unauthorized use) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.10
    • California AB 1584 - Privacy of Pupil Records: (A local educational agency that enters into a contract with a third party must ensure the contract contains a certification that a pupil's records shall not be retained or available to the third party upon completion of the terms of the contract and a description of how that certification will be enforced) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code § 49073.1(b)(7)
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies: (a) the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed) See General Data Protection Regulation (GDPR), Right to erasure, Art. 17(1)(a)

6.5.2: Account Deletion (Compliance)

Do the policies clearly indicate whether or not a user's data are deleted upon account cancellation or termination?

6.5.3: User Deletion (Compliance)

Do the policies clearly indicate whether or not a user can delete all of their personal and non-personal information from the vendor?

  • Indicator
    • Discloses users can delete their information collected from the product.
  • Citation
    • California Online Privacy Protection Act: (If the operator maintains a process for a consumer to review and request changes to any of their personally identifiable information they must provide a description of that process) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(2)
    • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
    • General Data Protection Regulation: (Where the controller has made the personal data public and is obliged ... to erase the personal data, the controller, taking account of available technology and the cost of implementation, shall take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data.) See General Data Protection Regulation (GDPR), Right to erasure, Art. 17(2)
  • Background
    • We expect companies to clearly disclose what options users have to control the information that companies collect and retain about them. Enabling users to control what information about them that a company collects and retains would mean giving users the ability to delete specific types of user information without requiring them to delete their entire account. We therefore expect companies to clearly disclose whether users have the option to delete specific types of user information. See Ranking Digital Rights, P7.

6.5.4: Deletion Process (Compliance)

Do the policies clearly indicate whether or not the vendor provides a process for the school, parent, or eligible student to delete a student's personal information?

6.5.5: Deletion Notice (Compliance)

Do the policies clearly indicate how long the vendor may take to delete a user's data after being given notice?

  • Indicator
    • Discloses a timeframe in which to delete users' information after notification of the request.
  • Citation
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her. Taking into account the purposes of the processing, the data subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement.) See General Data Protection Regulation (GDPR), Right to rectification, Art. 16
    • General Data Protection Regulation: (The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay ...) See General Data Protection Regulation (GDPR), Right to erasure, Art. 17(1)

6.6: Data Portability

6.6.1: User Export (Privacy)

Do the policies clearly indicate whether or not a user can export or download their data, including any user created content on the product?

6.6.2: Legacy Contact (Security)

Do the policies clearly indicate whether or not a user may assign an authorized account manager or legacy contact to access and download their data?

  • Indicator
    • Discloses processes to assign an account manager or trusted contact if the account becomes inactive.
  • Citation

7: Data Transfer

7.1: Data Handling

7.1.1: Transfer Data (Privacy)

Do the policies clearly indicate whether or not the vendor can transfer a user's data in the event of the vendor's merger, acquisition, or bankruptcy?

7.1.2: Data Assignment (Privacy)

Do the policies clearly indicate whether or not the vendor can assign its rights or delegate its duties under the policies to a successor vendor without notice to or consent from the user?

  • Indicator
    • Discloses the company may assign its rights and obligations of the policies to a third party.
    • Discloses the company will provide notification to users before assigning its rights and obligations of the policies to a third party.
    • Discloses the company will seek user consent before assigning its rights and obligations of the policies to a third party.

7.1.3: Transfer Notice (Privacy)

Do the policies clearly indicate whether or not the vendor will notify users of a data transfer to a third-party successor, in the event of a vendor's bankruptcy, merger, or acquisition?

  • Indicator
    • Discloses notification is provided and consent is obtained from users before data is transferred to a third party.

7.2: Transfer Request

7.2.1: Delete Transfer (Privacy)

Do the policies clearly indicate whether or not a user can request to delete their data prior to its transfer to a third-party successor in the event of a vendor bankruptcy, merger, or acquisition?

  • Indicator
    • Discloses users may request deletion of their information before data is transferred to a third party.

7.3: Onward Contractual Obligations

7.3.1: Contractual Limits (Privacy)

Do the policies clearly indicate whether or not the third-party successor of a data transfer is contractually required to provide the same privacy compliance required of the vendor?

  • Indicator
    • Discloses contractual obligations are imposed on third-party data transfer successors to provide the same privacy protections as the company.
  • Citation
    • Children's Online Privacy Protection Act: (An operator must take reasonable steps to release a child's personal information only to service providers and third parties who are capable of maintaining the confidentiality, security, and integrity of the information, and provide assurances that they contractually maintain the information in the same manner) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.8
    • Student Online Personal Information Protection Act: (An operator may transfer a student's personal information to a third party in the event of a merger, acquisition, or bankruptcy, but the successor entity is subject to the same onward data privacy and security obligations) See Student Online Personal Information Protection Act (SOPIPA), Cal. B.&P. Code § 22584(b)(3)
    • General Data Protection Regulation: (Any transfer of personal data which are undergoing processing or are intended for processing after transfer to a third country or to an international organisation shall take place only if, subject to the other provisions of this Regulation, the conditions laid down in this Chapter are complied with by the controller and processor, including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation. All provisions in this Chapter shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.) See General Data Protection Regulation (GDPR), General principle for transfers, Art. 44
  • Background

8: Security

8.1: User Identity

8.1.1: Verify Identity (Security)

Do the policies clearly indicate whether or not the vendor or vendor-authorized third party verifies a user's identity with personal information?

  • Indicator
    • Discloses users are required to verify their identity with a government issued identification or with other forms of identification that could be connected to their offline identity.
    • Discloses users are required to verify their identity with personal information for parental consent purposes.
  • Citation
  • Background
    • The ability to communicate anonymously is essential to freedom of expression both on and offline. The use of a real name online, or requiring users to provide a company with identifying information, provides a link between online activities and a specific person. This presents human rights risks to those who, for example, voice opinions that don't align with a government's views or who engage in activism that a government does not permit. It also presents risks for people who are persecuted for religious beliefs or sexual orientation. We therefore expect companies to disclose whether they might ask users to verify their identities using government-issued ID or other forms of identification that could be connected to their offline identity. We acknowledge that users may have to provide information that could be connected to their offline identity in order to access paid features of various products and services. However, users should be able to access features that don't require payment without needing to provide information that can be tied to their offline identity. See Ranking Digital Rights, F11.

8.2: User Account

8.2.1: Account Required (Security)

Do the policies indicate whether or not the vendor requires a user to create an account with a username and password in order to use the product?

  • Indicator
    • Discloses users are required to create an account to use the product.

8.2.2: Managed Account (Security)

Do the policies clearly indicate whether or not the vendor provides user managed accounts for a parent, teacher, school or district?

  • Indicator
    • Discloses managed accounts are provided for parents, teachers, schools, or district staff.
    • Discloses accounts are created for students by parents, teachers, schools, or district staff.

8.2.3: Two-Factor Protection (Security)

Do the policies clearly indicate whether or not the security of a user's account is protected by two-factor authentication?

  • Indicator
    • Discloses user accounts can be protected with two-factor authentication.
    • Discloses managed accounts can be protected with two-factor authentication.
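
For concreteness, below is a minimal sketch of the time-based one-time password (TOTP) scheme used by most authenticator-app second factors (RFC 6238); the secret, time step, and digit count shown are common illustrative defaults, not vendor requirements.

```python
# A minimal TOTP (RFC 6238) sketch using only the standard library.
import hmac, hashlib, struct, time

def totp(secret: bytes, step: int = 30, digits: int = 6) -> str:
    counter = int(time.time()) // step            # current time window
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

secret = b"shared-secret-provisioned-at-enrollment"  # illustrative only
# The server and the user's authenticator app derive the same code
# independently; a real server would also tolerate small clock skew.
print(totp(secret))
```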

8.3: Third-party Security

8.3.1: Security Agreement (Security)

Do the policies clearly indicate whether or not a third party with access to a user's information is contractually required to provide the same level of security protections as the vendor?

8.4: Data Confidentiality

8.4.1: Reasonable Security (Security)

Do the policies clearly indicate whether or not reasonable security standards are used to protect the confidentiality of a user's personal information?

  • Indicator
    • Discloses security protections in place for users' information are based on industry standards and best practices.
    • Discloses complex passwords and failed login lockouts protect user information.
    • Discloses advanced authentication methods are provided by the company to prevent fraudulent access.
    • Discloses users can view their recent account activity and login information.
    • Discloses users are notified about unusual account activity and possible unauthorized access to their accounts.
  • Citation
  • Background
    • A vendor should provide a general description of the technical, administrative, and physical safeguards it uses to protect student information from unauthorized access, destruction, use, modification, or disclosure. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.
    • A vendor should implement and maintain reasonable security measures appropriate to the nature of the student information, including covered information, acquired through its site or service. A vendor should designate and train someone responsible and use a risk management process: identify data assets, assess threats and vulnerabilities, apply appropriate controls, monitor their effectiveness, and repeat the process. As discussed in the California Data Breach Report, the Center for Internet Security’s Critical Security Controls is a good starting point for high-priority security controls. The Federal Trade Commission’s Start with Security also offers helpful guidance. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.
    • This indicator is applicable to internet and mobile ecosystem companies. Companies hold significant amounts of user information, making them targets for malicious actors. We expect companies to help users protect themselves against such threats. Companies should clearly disclose that they use advanced authentication techniques to prevent unauthorized access to user accounts and information. We also expect companies to provide users with tools that enable them to secure their accounts and to know when their accounts may be compromised. See Ranking Digital Rights, P17.
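
The indicators above name complex-password and failed-login-lockout protections. A minimal sketch of both follows, under the assumption of an in-memory attempt store; a production service would persist attempts and prefer a dedicated password hash such as bcrypt or argon2.

```python
# A minimal sketch of password hashing plus a failed-login lockout.
import hashlib, os, time

MAX_ATTEMPTS = 5
LOCKOUT_SECONDS = 15 * 60
_failures = {}  # username -> recent failure timestamps (in-memory only)

def hash_password(password, salt=None):
    """Derive a slow, salted hash; real services should prefer bcrypt/argon2."""
    salt = salt or os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)

def login_allowed(username):
    """Refuse authentication while recent failures exceed the threshold."""
    now = time.time()
    recent = [t for t in _failures.get(username, []) if now - t < LOCKOUT_SECONDS]
    _failures[username] = recent
    return len(recent) < MAX_ATTEMPTS

def record_failure(username):
    _failures.setdefault(username, []).append(time.time())

for attempt in range(6):
    if not login_allowed("alice"):
        print("account locked; require reset or wait out the window")
        break
    record_failure("alice")  # pretend each password check failed
```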

8.4.2: Employee Access (Security)

Do the policies clearly indicate whether or not the vendor implements physical access controls or limits employee access to user information?

  • Indicator
    • Discloses security processes are used that limit or monitor employee access to users' information.
    • Discloses physical access controls are used to limit employee access to users' information.
  • Citation
    • California AB 1584 - Privacy of Pupil Records: (A local educational agency that enters into a contract with a third party must ensure the contract contains a description of the actions the third party will take, including the designation and training of responsible individuals, to ensure the security and confidentiality of pupil records) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code § 49073.1(b)(5)
  • Background

8.5: Data Transmission

8.5.1: Transit Encryption (Security)

Do the policies clearly indicate whether or not all data in transit is encrypted?

  • Indicator
    • Discloses the transmission of user communications is encrypted using Secure Socket Layer (SSL).
    • Discloses the transmission of user communications is encrypted using unique keys.
    • Discloses users can secure information with their own user supplied encryption keys.
    • Discloses user communications are encrypted by default.
  • Citation
    • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5
    • General Data Protection Regulation: (Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data) See General Data Protection Regulation (GDPR), Security of processing, Art. 32(1)(a)
  • Background
    • Encryption is an important tool for protecting freedom of expression and privacy. The UN Special Rapporteur on Freedom of Expression has stated unequivocally that encryption and anonymity are essential for the exercise and protection of human rights. We expect companies to clearly disclose that user communications are encrypted by default, that transmissions are protected by “perfect forward secrecy,” that users have an option to turn on end-to-end encryption, and whether the company offers end-to-end encryption by default. For mobile ecosystems, we expect companies to clearly disclose that they enable full-disk encryption. See Ranking Digital Rights, P16.
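
For concreteness, the sketch below shows what encryption in transit looks like at the client level using Python's standard library. The host name is a placeholder, and the minimum protocol version is an illustrative hardening choice; modern deployments negotiate TLS through the same interfaces historically labeled "SSL."

```python
# A minimal sketch of a certificate-validated TLS connection.
import socket, ssl

context = ssl.create_default_context()            # validates certificates
context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print("negotiated:", tls.version(), tls.cipher()[0])
```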

8.6: Data Storage

8.6.1: Storage Encryption (Security)

Do the policies clearly indicate whether or not all data at rest is encrypted?

  • Indicator
    • Discloses user information is encrypted or inaccessible while in storage.
    • Discloses user information on mobile devices is encrypted with full disk encryption.
    • Discloses user information is encrypted if stored with third parties.
    • Discloses user information is encrypted while archived.
  • Citation
    • California Data Breach Notification Requirements: (A person or business that owns, licenses, or maintains personal information about a California resident is required to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, and to protect the personal information from unauthorized access, destruction, use, modification, or disclosure) See California Data Breach Notification Requirements, Cal. Civ. Code § 1798.81.5
    • General Data Protection Regulation: (Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: (a) the pseudonymisation and encryption of personal data) See General Data Protection Regulation (GDPR), Security of processing, Art. 32(1)(a)
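
As an illustration of encryption at rest, the sketch below encrypts a record before storage using the third-party cryptography package's Fernet recipe (AES-CBC with an HMAC integrity check). The record contents are fabricated, and key management, the hard part in practice, is only stubbed here.

```python
# A minimal sketch of application-level encryption at rest.
# Assumes the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, load from a key-management service
fernet = Fernet(key)

record = b'{"student_id": 42, "notes": "..."}'  # fabricated example record
stored = fernet.encrypt(record)                 # ciphertext safe to write to disk
assert fernet.decrypt(stored) == record         # readable only with the key
```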

8.6.2: Data Control (Security)

Do the policies clearly indicate whether or not personal information is stored outside the control of the vendor?

  • Indicator
    • Discloses user information is under the direct control of the vendor.
    • Discloses user information is stored in the same jurisdiction as the vendor.
    • Discloses user information is under the direct control of the student, parent, teacher, or school.
    • Discloses user information is under the direct control of a third party.
  • Citation

8.7: Data Breach

8.7.1: Breach Notice (Security)

Do the policies clearly indicate whether or not the vendor provides notice in the event of a data breach to affected individuals?

  • Indicator
    • Discloses processes for notification of users affected by a data breach.
    • Discloses notification is provided to relevant legal authorities without unreasonable delay when a data breach occurs.
    • Discloses steps taken by the company to remedy the impact of a data breach on users.
  • Citation
  • Background
    • The breach notification laws in California and the 46 other states are similar in many ways, because most are modeled on the original California law. All of them require notifying individuals when their personal information has been breached, prefer written notification but allow using the "substitute method" in certain situations, allow for a law enforcement delay, and provide an exemption from the requirement to notify when data is encrypted and the keys required to decrypt the data are still secure. However, there are some differences, primarily in three areas: (1) the notification trigger, (2) the timing for notification, and (3) the definition of covered information. See CA DOJ, California Data Breach Report (2016).
    • A vendor should develop and describe the process for notifying schools or school districts, parents, legal guardians, or eligible students, as well as any appropriate government agencies, of any unauthorized disclosure of student information. Determine whether the incident and the types of data involved also require notification under California's breach notification law, and if so, take appropriate action. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 15.
    • When the security of users' data has been compromised due to a data breach, companies should have clearly disclosed processes in place for addressing the security threat and for notifying affected users. Given that data breaches can result in significant threats to an individual's financial or personal security, in addition to exposing private information, companies should make these security processes publicly available. Individuals can then make informed decisions and consider the potential risks before signing up for a service or giving a company their information. Company press releases or blog posts addressing a data breach after it has occurred do not qualify as sufficient disclosure for this indicator. We expect companies to have formal policies in place regarding their handling of data breaches if and when they occur, and to make information about these policies and commitments public. See Ranking Digital Rights, P15.
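
As a rough illustration of the decision logic the state laws above share, including the encryption safe harbor, here is a simplified sketch; the fields and rules are fabricated for illustration and are not legal advice, since actual notification triggers and timing vary by state.

```python
# A simplified sketch of common breach-notification decision logic.

def notification_required(incident):
    """Encrypted data whose keys remain secure is typically exempt
    (the 'encryption safe harbor'); otherwise, notify if personal
    information was involved."""
    if incident["data_encrypted"] and incident["keys_secure"]:
        return False
    return incident["personal_info_involved"]

incident = {
    "personal_info_involved": True,
    "data_encrypted": True,
    "keys_secure": False,  # keys compromised, so the safe harbor fails
}
print(notification_required(incident))  # True: affected individuals must be notified
```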

8.8: Data Oversight

8.8.1: Security Audit (Compliance)

Do the policies clearly indicate whether or not the data privacy or security practices of the vendor are internally or externally audited to ensure compliance?

  • Indicator
    • Discloses internal privacy or security staff conduct assessments or audits.
    • Discloses third party privacy or security audits are performed.
    • Discloses privacy and security risk assessments are performed on a regular schedule.
  • Citation
    • General Data Protection Regulation: (The controller shall be responsible for, and be able to demonstrate compliance with ... [processing of personal data]) See General Data Protection Regulation (GDPR), Principles relating to processing of personal data, Art. 5(2)
    • General Data Protection Regulation: (Taking into account the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for the rights and freedoms of natural persons, the controller shall implement appropriate technical and organisational measures to ensure and to be able to demonstrate that processing is performed in accordance with this Regulation. Those measures shall be reviewed and updated where necessary.) See General Data Protection Regulation (GDPR), Responsibility of the controller, Art. 24(1)
    • General Data Protection Regulation: (Where proportionate in relation to processing activities, the [responsibility of the controller] ... shall include the implementation of appropriate data protection policies by the controller.) See General Data Protection Regulation (GDPR), Responsibility of the controller, Art. 24(2)
    • General Data Protection Regulation: (Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, including inter alia as appropriate: ... (d) a process for regularly testing, assessing and evaluating the effectiveness of technical and organisational measures for ensuring the security of the processing.) See General Data Protection Regulation (GDPR), Security of processing, Art. 32(1)(d)
  • Background
    • Companies have access to immense amounts of information about users and should take the highest possible measures to keep this information secure. Just as companies should clearly disclose their oversight processes related to freedom of expression and privacy, they should also provide information about their oversight processes for keeping user information secure. We therefore expect companies to clearly disclose that they have systems in place to limit and monitor employee access to user information. We also expect the company to clearly disclose that it deploys both internal and external security teams to conduct security audits on its products and services. See Ranking Digital Rights, P13.

9: Responsible Use

9.1: Social Interactions

9.1.1: Safe Interactions (Safety)

Do the policies clearly indicate whether or not a user can interact with trusted users?

  • Indicator
    • Discloses users can have social interactions with trusted or other known users.
    • Discloses users can have social interactions with students in the same classroom or school.
  • Citation

9.1.2: Unsafe Interactions (Safety)

Do the policies clearly indicate whether or not a user can interact with untrusted users?

  • Indicator
    • Discloses users can have social interactions with unknown users in the product.
    • Discloses users can have social interactions with unknown individuals outside the product across the Internet.
  • Citation

9.1.3: Share Profile (Safety)

Do the policies clearly indicate whether or not information must be shared or revealed by a user in order to participate in social interactions?

  • Indicator
    • Discloses what type of user profile information can be shared for social interactions.
    • Discloses user profile information must be shared to use the product.
  • Citation

9.2: Data Visibility

9.2.1: Visible Data (Safety)

Do the policies clearly indicate whether or not a user's personal information can be displayed publicly in any way?

  • Indicator
    • Discloses users' personal information can be made publicly visible.
  • Citation

9.2.2: Control Visibility (Safety)

Do the policies clearly indicate whether or not a user has control over how their personal information is displayed to others?

  • Indicator
    • Discloses users can control how their personal information is displayed to others.

9.3: Monitor and Review

9.3.1: Monitor Content (Safety)

Do the policies clearly indicate whether or not the vendor reviews, screens, or monitors user-created content?

  • Indicator
    • Discloses processes to review, screen, or monitor user-created content.

9.3.2: Filter Content (Safety)

Do the policies clearly indicate whether or not the vendor takes reasonable measures to delete all personal information from a user's postings before they are made publicly visible?

  • Indicator
    • Discloses processes to filter and delete users' personal information before it is made publicly visible.
  • Citation
    • Children's Online Privacy Protection Act: (An operator may prevent collection of personal information if it takes reasonable measures to delete all or virtually all personal information from a child's postings before they are made public and also to delete the information from its records) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
  • Background
    • Companies may employ staff to review content and/or user activity or they may rely on community flagging mechanisms that allow users to flag other users’ content and/or activity for company review. See Ranking Digital Rights, F3.

9.3.3: Moderating Interactions (Safety)

Do the policies clearly indicate whether or not social interactions between users of the product are moderated?

9.3.4: Log Interactions (Safety)

Do the policies clearly indicate whether or not social interactions are logged by the vendor and available for review or audit?

  • Indicator
    • Discloses social interactions between users are logged by the company.

9.4: Report Content

9.4.1: Block Content (Safety)

Do the policies clearly indicate whether or not an educator, parent, or a school has the ability to filter or block inappropriate content or social interactions?

  • Indicator
    • Discloses processes for the school, parent, or educator to filter or block inappropriate content.
  • Citation
    • Children's Internet Protection Act: (A K-12 school under E-Rate discounts is required to adopt a policy of Internet safety for minors that includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access to visual depictions that are obscene, child pornography, or harmful to minors) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254(h)(5)(B)
    • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
    • The Communications Decency Act of 1996: (A provider of an interactive computer service shall notify the customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors) See The Communications Decency Act of 1996 (CDA), 47 U.S.C. 230(d)

9.4.2: Report Abuse (Safety)

Do the policies clearly indicate whether or not a user can report abusive behavior or cyberbullying?

  • Indicator
    • Discloses processes for users to report abusive or cyberbullying conduct.

9.5: Internet Safety

9.5.1: Safe Tools (Safety)

Do the policies clearly indicate whether or not the vendor provides tools and processes that support safe and appropriate social interactions on the product?

  • Indicator
    • Discloses resources, tools, and processes that can help support safe and secure social interactions.
  • Citation
    • Children's Internet Protection Act: (A K-12 school under E-Rate discounts is required to adopt a policy of Internet safety for minors that includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access to visual depictions that are obscene, child pornography, or harmful to minors) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254(h)(5)(B)
    • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
    • The Communications Decency Act of 1996: (A provider of an interactive computer service shall notify the customer that parental control protections (such as computer hardware, software, or filtering services) are commercially available that may assist the customer in limiting access to material that is harmful to minors) See The Communications Decency Act of 1996 (CDA), 47 U.S.C. 230(d)
  • Background
    • Companies hold significant amounts of user information, making them targets for malicious actors. We expect companies to help users protect themselves against such risks. This can include materials on how to set up advanced account authentication; adjust privacy settings; avoid malware, phishing, and social engineering attacks; avoid third-party tracking; avoid or address bullying or harassment online; and what “safe browsing” means. Companies should present this guidance to the public using clear language, ideally paired with visual images, designed to help users understand the nature of the risks companies and users can face. These can include tips, tutorials, how-to guides, or other resources and should be presented in a way that users can easily understand (for instance with visuals, graphics, bullet points, and lists). See Ranking Digital Rights, P18.

10: Advertising

10.1: Vendor Communications

10.1.1: Service Messages (Privacy)

Do the policies clearly indicate whether or not a user will receive service- or administrative-related email or text message communications from the vendor or a third party?

  • Indicator
    • Discloses users will receive service-related non-marketing communications from the company.
    • Discloses users will receive service-related notifications by email or other method.

10.2: Traditional Advertising

10.2.1: Traditional Ads (Privacy)

Do the policies clearly indicate whether or not traditional advertisements are displayed to a user based on a webpage's content, and not that user's data?

  • Indicator
    • Discloses traditional advertisements are displayed to users on the product.
    • Discloses advertisements are displayed to users without the use of any collected personal information.
  • Citation

10.3: Behavioral Advertising

10.3.1: Behavioral Ads (Privacy)

Do the policies clearly indicate whether or not behavioral advertisements based on a user's personal information are displayed?

  • Indicator
    • Discloses behavioral advertisements are displayed to users on the product.
    • Discloses advertisements are displayed to users based on their personal or non-personal information.
  • Citation
  • Background
    • Online behavioral or targeted advertising is the practice of collecting information about consumers' online interests in order to deliver targeted advertising to them. This system of advertising revolves around ad networks that can track individual consumers—or at least their devices—across different websites. When organized according to unique identifiers, this data can provide a potentially wide-ranging view of individual use of the Internet. These individual behavioral profiles allow advertisers to target ads based on inferences about individual interests, as revealed by Internet use. Targeted ads are generally more valuable and efficient than purely contextual ads and provide revenue that supports an array of free online content and services. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), pp. 11-12.
    • The FTC recommends that affirmative express consent is appropriate when a company uses sensitive data for any marketing, whether first or third-party. When health or children's information is involved, for example, the likelihood that data misuse could lead to embarrassment, discrimination, or other harms is increased. This risk exists regardless of whether the entity collecting and using the data is a first-party or a third-party that is unknown to the consumer. In light of the heightened privacy risks associated with sensitive data, first parties should provide a consumer choice mechanism at the time of data collection. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 47.
    • The FTC believes affirmative express consent for first-party marketing using sensitive data should be limited. Certainly, where a company's business model is designed to target consumers based on sensitive data – including data about children, financial and health information, Social Security numbers, and certain geolocation data – the company should seek affirmative express consent before collecting the data from those consumers. On the other hand, the risks to consumers may not justify the potential burdens on general audience businesses that incidentally collect and use sensitive information. See FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), pp. 47-48.
    • If a vendor displays targeted advertising they should not use any information, including covered information and persistent unique identifiers, acquired through the site or service as a basis for targeting advertising to a specific student or other user. This includes both advertising delivered on the site or service that acquired the information and advertising delivered on any other site or service based on that information. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 12.

10.4: Ad Tracking

10.4.1: Third-Party Tracking (Privacy)

Do the policies clearly indicate whether or not third-party advertising services or tracking technologies collect any information from a user of the product?

10.4.2: Track Users (Privacy)

Do the policies clearly indicate whether or not a user's information is used to track users and display targeted advertisements on other third-party websites or services?

10.4.3: Data Profile (Privacy)

Do the policies clearly indicate whether or not the vendor allows third parties to use a student's data to create an automated profile, engage in data enhancement, conduct social advertising, or target advertising to students, parents, teachers, or the school?

10.5: Filtered Advertising

10.5.1: Filter Ads (Privacy)

Do the policies clearly indicate whether or not the vendor or third party filters inappropriate advertisements (e.g., alcohol, gambling, violent, or sexual content)?

  • Indicator
    • Discloses advertisements that are inappropriate for minors are filtered.
    • Discloses the criteria used to filter advertisements that are inappropriate for minors.
  • Citation
    • California Privacy Rights for Minors in the Digital World: (Prohibits an operator from marketing or advertising non age-appropriate types of products or services to a minor under 18 years of age and from knowingly using, disclosing, compiling, or allowing a third party to use, disclose, or compile, the personal information of a minor for the purpose of marketing or advertising non age-appropriate types of products or services. Also, a minor is permitted to request to "erase" or remove and obtain removal of content or information posted on the operator's site) See California Privacy Rights for Minors in the Digital World, Cal. B.&P. Code §§ 22580-22582
    • Children's Internet Protection Act: (A K-12 school under E-Rate discounts is required to adopt a policy of Internet safety for minors that includes monitoring the online activities of minors and the operation of a technology protection measure with respect to any of its computers with Internet access that protects against access to visual depictions that are obscene, child pornography, or harmful to minors) See Children's Internet Protection Act (CIPA), 47 U.S.C. § 254(h)(5)(B)
  • Background
    • Advertising to children in school presents a variety of legal issues. Several states have laws that place restrictions on the advertising of products or services that have inappropriate content for children, such as alcohol and firearms. Additionally, contextual advertising would likely be permissible as 'support for internal operations' of a service or application, in contrast to behaviorally targeted advertising, which implicates several privacy laws such as CIPA, COPPA, and FERPA that restrict the use of personal information without parental consent.
    • The FTC restricts advertisements that may be misleading to children. Advertising to children under the age of 13 is particularly scrutinized, as research shows that these children are especially vulnerable because they are unable to understand an advertisement's persuasive intent. Self-regulatory guidelines are also published by the Children's Advertising Review Unit, which is a branch of the advertising self-regulatory program administered by the Council of Better Business Bureaus. The guidelines generally provide that any advertising to young children should be clearly distinguishable from the other content. See Children's Advertising Review Unit, Self-Regulatory Program for Children's Advertising.
    • Third-party data brokers should implement better measures to refrain from collecting information from children and teens, particularly in marketing products. As to children under 13, COPPA already requires certain online services to refrain from collecting personal information from this age group without parental consent. The principles underlying COPPA could apply equally to information collected offline from children. As to teens, the FTC previously has noted that they often lack the judgment to appreciate the long-term consequences of, for example, posting personal information on the Internet. See FTC, Data Brokers: A Call For Transparency and Accountability (May 2014), p. 55.

10.6: Marketing Communications

10.6.1: Marketing Messages (Privacy)

Do the policies clearly indicate whether or not the vendor may send marketing emails, text messages, or other related communications that may be of interest to a user?

10.6.2: Third-Party Promotions (Privacy)

Do the policies clearly indicate whether or not the vendor may ask a user to participate in any sweepstakes, contests, surveys, or other similar promotions?

10.7: Unsubscribe

10.7.1: Unsubscribe Ads (Privacy)

Do the policies clearly indicate whether or not a user can opt out of traditional or behavioral advertising?

  • Indicator
    • Discloses users can opt out of having their information used for advertising purposes.
    • Discloses users can contact third-party advertisers to control whether their information is used for advertising purposes.
  • Citation
  • Background
    • We expect companies to enable users to control the use of their information for the purpose of targeted advertising. Targeted advertising requires extensive collection and retention of user information that is tantamount to tracking. Companies should therefore clearly disclose whether users have options to control how their information is being used for these purposes. See Ranking Digital Rights, P7.

10.7.2: Unsubscribe Marketing (Privacy)

Do the policies clearly indicate whether or not a user can opt out of or unsubscribe from vendor or third-party marketing communications?

10.8: Do Not Track

10.8.1: DoNotTrack Response (Privacy)

Do the policies clearly indicate whether or not the vendor responds to a "Do Not Track" signal or other opt-out mechanisms from a user?

  • Indicator
    • Discloses the company responds to 'Do Not Track' requests.
  • Citation
    • California Online Privacy Protection Act: (An operator is required to disclose how they respond to Web browser "Do Not Track" signals or other mechanisms that provide consumers the ability to opt-out of the collection of personally identifiable information about their online activities over time and across third-party Web sites) See California Online Privacy Protection Act (CalOPPA), Cal. B.&P. Code §22575(b)(5)
  • Background
    • A Do Not Track system should be implemented universally to cover all parties that would track consumers. The choice mechanism should be easy to find, easy to understand, and easy to use. Any choices offered should be persistent and should not be overridden if, for example, consumers clear their cookies or update their browsers. A Do Not Track system should be comprehensive, effective, and enforceable. It should opt consumers out of behavioral tracking through any means and not permit technical loopholes. Finally, an effective Do Not Track system should go beyond simply opting consumers out of receiving targeted advertisements; it should opt them out of collection of behavioral data for all purposes other than those that would be consistent with the context of the interaction (e.g., preventing click-fraud or collecting de-identified data for analytics purposes). See California Business and Professions Code §§ 22575(b)(5)-(6); See also FTC, Protecting Consumer Privacy in an era of rapid change: recommendations for business and policy makers (2012), p. 53.
    • Even as we focus more on data use, consumers still have a valid interest in "Do Not Track" tools that help them control when and how their data is collected. Strengthening these tools is especially important because there is now a growing array of technologies available for recording individual actions, behavior, and location data across a range of services and devices. Public surveys indicate a clear and overwhelming demand for these tools, and the government and private sector must continue working to evolve privacy-enhancing technologies in step with improved consumer services. See Exec. Office of the President, Big Data: Seizing Opportunities, Preserving Values (2014), p. 62.
    • One prominent user-generated signal is the “Do Not Track” standard. Also known by the acronym “DNT,” this refers to a setting in a user’s browser preferences which tells entities not to “track” them. In other words, every time a user loads a website, any parties that are involved in delivering the page (of which there are often many, primarily advertisers) are told not to collect or store any information about the user’s visit to the page. However, this is merely a polite request; a company may ignore a DNT request, and many do. See Ranking Digital Rights, P9.
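
To make the mechanism concrete: a browser with the setting enabled sends the HTTP request header "DNT: 1" with every request. The following minimal sketch, assuming a Flask-style application with hypothetical route and cookie names, shows one way a service could honor the signal by declining to set a tracking cookie; it is a sketch of the general technique, not any particular vendor's implementation.

    from flask import Flask, make_response, request

    app = Flask(__name__)

    @app.route("/")
    def index():
        response = make_response("Hello")
        # Browsers with Do Not Track enabled send the header "DNT: 1".
        # Honoring it is voluntary; this handler skips the tracking cookie
        # whenever the signal is present.
        if request.headers.get("DNT") != "1":
            response.set_cookie("tracking_id", "placeholder-value")
        return response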

10.8.2: DoNotTrack Description (Privacy)

Do the policies clearly indicate whether the vendor provides a hyperlink to a description, including the effects, of any program or protocol the vendor follows that offers consumers a choice not to be tracked?

  • Indicator
    • Discloses hyperlinks to other 'Do Not Track' opt-out mechanisms.
  • Citation

11: Compliance

11.1: Children Under 13

11.1.1: Actual Knowledge (Compliance)

Do the policies clearly indicate whether or not the vendor has actual knowledge that personal information from children under 13 years of age is collected by the product?

  • Indicator
    • Discloses the company has actual knowledge that users of the product are under the age of 13.
    • Discloses a user's age or birthday is collected upon account registration.
    • Discloses the product utilizes an age-gate or other mechanism to verify the age of a user.
    • Discloses the product is directed or would appeal to children under 13 years of age.
    • Discloses the product provides features intended for children under 13 years of age.
  • Citation
    • Children's Online Privacy Protection Act: (A general audience site is where the operator has no actual knowledge that a child under the age of 13 has registered an account or is using the service, and no age gate or parental consent is required before collection of information) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Children's Online Privacy Protection Act: (A mixed audience site is where the site is directed to children, but does not target children as its "primary audience," but rather teens 13-to-18 years of age or adults. An operator of a mixed audience site is required to obtain age information from a user before collecting any information and if a user identifies themselves as a child under the age of 13, the operator must obtain parental consent before any information is collected) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Children's Online Privacy Protection Act: (A site directed to children is where the operator has actual knowledge the site is collecting information from children under the age of 13 and parental consent is required before any collection or use of information) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.2
    • Children's Online Privacy Protection Act: (A vendor who may obtain actual knowledge that it is collecting information from a child must not encourage the child to disclose more information than reasonably necessary through an age verification mechanism. An age gate should be: age-neutral; not encourage falsification; list day, month, and year; have no prior warning that children under 13 will be blocked; and prevent multiple attempts) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.3(d) (a sketch of such an age gate follows the background notes below)
  • Background
    • The Children's Online Privacy Protection Act (COPPA) requires an operator to post a link to a notice of its information practices on the homepage of its web site or online service and in each area of its web site where it collects "Personal Information" from children. An operator of a general audience web site with a separate children's area must also post a link to its privacy policy on the homepage of the children's area. See 15 U.S.C. §§ 6501-6506; 16 C.F.R. Part 312
    • COPPA applies anytime an operator of a website or online service has actual knowledge that it collects, maintains, uses, or discloses personal information from a child under 13. In these situations an operator is generally required to obtain verified parental consent.
    • COPPA requires companies to establish and maintain reasonable procedures to protect the confidentiality, security, and integrity of personal information collected from children. Companies should minimize what they collect in the first place and take reasonable steps to release personal information only to service providers and third-parties capable of maintaining its confidentiality, security, and integrity. Always obtain assurances that third-parties will live up to their contractual privacy responsibilities. Also, companies should hold on to personal information only as long as is reasonably necessary for the purpose for which it was collected. They should securely dispose of it once they no longer have a legitimate reason for retaining it. See FTC, Six-Step Compliance Plan for Your Business.
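
To illustrate the age-gate properties listed in the 16 C.F.R. Part 312.3(d) citation above, the following minimal Python sketch accepts a full birth date collected in a neutral way and refuses repeated attempts within the same session. The session dictionary and field names are assumptions for illustration, not a prescribed implementation.

    from datetime import date

    def age_gate(session: dict, year: int, month: int, day: int) -> bool:
        """Return True when registration may proceed."""
        # Prevent multiple attempts: once a birth date has been submitted,
        # refuse a different one in the same session.
        if session.get("age_gate_attempted"):
            return False
        session["age_gate_attempted"] = True
        # The prompt is age-neutral: it asks for day, month, and year with
        # no prior warning that users under 13 will be blocked.
        birthday = date(year, month, day)
        today = date.today()
        age = today.year - birthday.year - (
            (today.month, today.day) < (birthday.month, birthday.day)
        )
        # Users under 13 would be routed to a parental-consent flow rather
        # than simply told to retry with a different date.
        return age >= 13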

11.1.2: COPPA Notice (Compliance)

Do the policies clearly indicate whether or not the vendor describes: (1) what information is collected from children under 13 years of age, (2) how that information is used, and (3) its disclosure practices for that information?

  • Indicator
    • Discloses COPPA or children's privacy is applicable to the product.
    • Discloses how the company collects, uses, and discloses information from children under 13 years of age.
  • Citation

11.1.3: Restrict Account (Compliance)

Do the policies clearly indicate whether or not the vendor prohibits creating an account for a child under 13 years of age?

  • Indicator
    • Discloses restrictions are in place for account creation from children under 13 years of age.
    • Discloses a user's age or birthday is provided or collected upon account registration.
    • Discloses the product utilizes an age-gate or other mechanism to verify the age of a user.
  • Citation

11.1.4: Restrict Purchase (Compliance)

Do the policies clearly indicate whether or not the vendor restricts in-app purchases for a child under 13 years of age?

  • Indicator
    • Discloses restrictions are in place for company store or in-app purchases from children under 13 years of age.
    • Discloses parental consent or adult verification is required for in-app purchases from children under 13 years of age.
  • Citation

11.1.5: Safe Harbor (Compliance)

Do the policies clearly indicate whether or not the product participates in an FTC-approved COPPA safe harbor program?

  • Indicator
    • Discloses the company participates in a COPPA safe harbor program.
  • Citation
  • Background
    • An operator may satisfy its obligations under COPPA by participating in a safe harbor program. The safe harbor programs are self-regulatory frameworks developed by industry groups and approved by the FTC. FTC-approved COPPA safe harbor programs offer parental notification and consent systems for operators who are members of their programs. In addition, the FTC recognizes that these and other common consent mechanisms could benefit operators (especially smaller ones) and parents if they offer a proper means for providing notice and obtaining verifiable parental consent, as well as ongoing controls for parents to manage their children's accounts. The FTC recommends operators use a common consent mechanism to assist in providing notice and obtaining consent, because they are ultimately responsible for ensuring that the notice accurately and completely reflects their information collection practices and that the consent mechanism is reasonably designed to reach the parent. See 78 Fed. Reg. 3972, 3989; See FTC, Complying with COPPA: Frequently Asked Questions, q. 13.

11.2: Students in K–12

11.2.1: School Purpose (Compliance)

Do the policies clearly indicate whether or not the product is primarily used, designed, and marketed for preschool or K-12 school purposes?

11.2.2: Education Records (Compliance)

Do the policies clearly indicate the process by which education records are entered into the product? For example, are data entered by district staff, school employees, parents, teachers, students, or some other person?

11.2.3: School Contract (Compliance)

Do the policies clearly indicate whether or not the vendor provides a contract to a Local Educational Agency (LEA) or otherwise provides notice to users of additional rights?

  • Indicator
    • Discloses a separate agreement or contract is provided to schools or districts of their rights.
    • Discloses notification is provided to schools or districts of their rights.
  • Citation
    • Family Educational Rights and Privacy Act: (An educational institution must annually notify parents of their rights to inspect and review a student's education records, make corrections, delete, or consent to the disclosure of information) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.7(a)
    • Family Educational Rights and Privacy Act: (Any rights to access, modify, or delete student records may transfer to an "eligible" student who is over 18 years of age) See Family Educational Rights and Privacy Act (FERPA), 34 C.F.R. Part 99.5(a)(1)
    • General Data Protection Regulation: (The controller shall, at the time when personal data are obtained, provide the data subject with the following further information necessary to ensure fair and transparent processing: ... (e) whether the provision of personal data is a statutory or contractual requirement, or a requirement necessary to enter into a contract, as well as whether the data subject is obliged to provide the personal data and of the possible consequences of failure to provide such data) See General Data Protection Regulation (GDPR), Information to be provided where personal data are collected from the data subject, Art. 13(2)(e)
    • California AB 1584 - Privacy of Pupil Records: (Authorizes a Local Educational Agency (LEA) to enter into a third party contract for the collection and use of pupil records that must include a statement that the pupil records continue to be the property of and under the control of the local educational agency, a description of the actions the third party will take to ensure the security and confidentiality of pupil records, and a description of how the local educational agency and the third party will jointly ensure compliance with FERPA) See California AB 1584 - Privacy of Pupil Records, Cal. Ed. Code §§ 49073.1
  • Background
    • FERPA is a Federal law that protects personally identifiable information in students' education records from unauthorized disclosure. It affords parents the right to access their child's education records, the right to seek to have the records amended, and the right to have some control over the disclosure of personally identifiable information from the education records. When a student turns 18 or enters a postsecondary institution at any age, the rights under FERPA transfer from the parents to the student ("eligible student"). 20 U.S.C. § 1232g; 34 C.F.R. Part 99; See also PTAC, Responsibilities of Third-Party Service Providers under FERPA, pp. 1-3.
    • FERPA denies federal funding to educational agencies or institutions that have a practice or policy of permitting the release of student information without parental consent. There is an exception where such information is released to "school officials" who have been determined by the educational agency or institution to have a legitimate educational interest.
    • A vendor should describe the procedures for a parent, legal guardian, or eligible student to review and correct covered information. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.

11.2.4: School Official (Compliance)

Do the policies clearly indicate whether or not the vendor is under the direct control of the educational institution and designates itself a 'school official' under FERPA?

11.3: Parental Consent

11.3.1: Verifiable Consent (Compliance)

Do the policies clearly indicate whether or not the vendor or third party obtains verifiable parental consent before they collect or disclose personal information?

11.3.2: Limit Consent (Compliance)

Do the policies clearly indicate whether or not a parent can consent to the collection and use of their child's personal information without also consenting to the disclosure of the information to third parties?

  • Indicator
    • Discloses parental consent can be limited with respect to use with third parties.
    • Discloses parental consent can be given for the collection and use of information with the company separate from use with third parties.
  • Citation
    • Children's Online Privacy Protection Act: (An operator cannot condition a child's participation in the service on sharing any collected information with third parties. A parent is required to have the ability to consent to the collection and use of their child's personal information without also consenting to the disclosure of the information to third parties) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(a)(2)

11.3.3: Withdraw Consent (Compliance)

Do the policies clearly indicate whether or not the vendor responds to a request from a parent or guardian to prevent further collection of their child's information?

11.3.4: Delete Child-PII (Compliance)

Do the policies clearly indicate whether or not the vendor deletes personal information from a student or child under 13 years of age if collected without parental consent?

  • Indicator
    • Discloses the company will delete personal information from a student or child under 13 years of age if collected without parental consent.
  • Citation

11.3.5: Consent Method (Compliance)

Do the policies clearly indicate whether or not the vendor provides notice to parents or guardians of the methods to provide verifiable parental consent under COPPA?

  • Indicator
    • Discloses the parental consent method(s) that are available for submission of consent by a parent or guardian.
  • Citation
    • Children's Online Privacy Protection Act: (An operator is required to provide direct notice to parents describing what information is collected, how information is used, its disclosure practices and exceptions) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.4(b)
    • Children's Online Privacy Protection Act: (Existing methods to obtain verifiable parental consent include: (i) Providing a consent form to be signed by the parent and returned to the operator by postal mail, facsimile, or electronic scan; (ii) Requiring a parent, in connection with a monetary transaction, to use a credit card, debit card, or other online payment system that provides notification of each discrete transaction to the primary account holder; (iii) Having a parent call a toll-free telephone number staffed by trained personnel; (iv) Having a parent connect to trained personnel via video-conference; (v) Verifying a parent's identity by checking a form of government-issued identification against databases of such information, where the parent's identification is deleted by the operator from its records promptly after such verification is complete) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(b)(i)-(v)
    • Children's Online Privacy Protection Act: (If an operator does not “disclose” children's personal information, they may use an email coupled with additional steps to provide assurances that the person providing the consent is the parent. Such additional steps include: Sending a confirmatory email to the parent following receipt of consent, or obtaining a postal address or telephone number from the parent and confirming the parent's consent by letter or telephone call. An operator that uses this method must provide notice that the parent can revoke any consent given in response to the earlier email.) See Children's Online Privacy Protection Act (COPPA), 16 C.F.R. Part 312.5(b)(vi)
  • Background
    • Under most circumstances an operator is required to obtain verified parental consent before the collection, use, or disclosure of personal information from children under the age of 13. The method used to obtain parental consent must be reasonably calculated (taking into account available technology) to ensure that the person providing consent is actually the child's parent.

11.3.6: Internal Operations (Compliance)

Do the policies clearly indicate whether or not the vendor can collect and use personal information from children without parental consent to support the 'internal operations' of the vendor's product?

  • Indicator
    • Discloses personal information from children under 13 years of age may be used by the company for its own internal operations under a COPPA exception.
  • Citation

11.3.7: COPPA Exception (Compliance)

Do the policies clearly indicate whether or not the vendor collects personal information from children without verifiable parental consent for the sole purpose of trying to obtain consent under COPPA?

11.3.8: FERPA Exception (Compliance)

Do the policies clearly indicate whether or not the vendor may disclose personal information without verifiable parental consent under a FERPA exception?

11.3.9: Directory Information (Compliance)

Do the policies clearly indicate whether or not the vendor discloses student information as 'Directory Information' under a FERPA exception?

  • Indicator
    • Discloses student information can be shared without parental consent as Directory Information.
    • Discloses what type of student information can be shared as Directory Information under a FERPA exception.
  • Citation
  • Background
    • What is the "Directory Information" Exception? An exception to parental consent that permits the disclosure of PII from education records under FERPA. Information designated by the school or district as directory information may be disclosed without consent and used without restriction in conformity with the policy, unless the parent, guardian, or eligible student opts out. Examples of directory information about students include name, address, telephone number, email address, date and place of birth, grade level, sports participation, and honors or awards received. Before a school or district can disclose directory information, it must first provide public notice to parents and eligible students of the types of information designated as directory information, the intended uses for the information, and the right of parents or eligible students to "opt out" of having their information shared. See PTAC, Responsibilities of Third-Party Service Providers under FERPA, p. 3; See also PTAC, Protecting Student Privacy While Using Online Educational Services: Requirements and Best Practices, pp. 3-4.

11.3.10: School Consent (Compliance)

Do the policies clearly indicate whether or not responsibility or liability for obtaining verified parental consent is transferred to the school or district?

  • Indicator
    • Discloses the obligation to obtain verifiable parental consent from a parent or guardian is transferred to the school or district.
    • Discloses the school or district is required to provide verifiable parental consent records to the company upon request.
  • Background
    • Where a school has contracted with an operator to collect personal information from students for the use and benefit of the school, and for no other commercial purpose, the operator is not required to obtain consent directly from parents, and can presume that the school’s authorization for the collection of students’ personal information is based upon the school having obtained the parents’ consent . . . As a best practice, the school should consider providing parents with a notice of the websites and online services whose collection it has consented to on behalf of the parent. Schools can identify, for example, sites and services that have been approved for use district-wide or for the particular school. See FTC, Complying with COPPA: Frequently Asked Questions, M. COPPA AND SCHOOLS, 2-4.

11.4: Legal Requirements

11.4.1: Policy Jurisdiction (Compliance)

Do the policies clearly indicate the vendor's jurisdiction that applies to the construction, interpretation, and enforcement of the policies?

  • Indicator
    • Discloses the domestic or foreign jurisdiction that applies to the enforcement of the policies.
  • Background
    • A court’s general authority to hear and/or “adjudicate” a legal matter is referred to as its “jurisdiction.” In the United States, jurisdiction is granted to a court or court system by statute or by constitution. A court is competent to hear and decide only those cases whose subject matter fits within the court’s jurisdiction. A legal decision made by a court that did not have proper jurisdiction is deemed void and nonbinding upon the litigants. See The Concept Of Jurisdiction.

11.4.2: Dispute Resolution (Compliance)

Do the policies clearly indicate whether or not the vendor requires a user to waive the right to a jury trial or settle any disputes by Alternative Dispute Resolution (ADR)?

  • Indicator
    • Discloses the requirement that users must waive the right to a jury trial.
    • Discloses the requirement that users must settle any disputes by Alternative Dispute Resolution (ADR).
  • Background
    • The vendor's policies should describe the legal process used to determine how disputes will be resolved. Any dispute arising out of an alleged breach of the policies will likely be settled by Alternative Dispute Resolution (ADR) through binding arbitration before judicial arbitration or mediation services, such as the American Arbitration Association, or a similar arbitration service.
    • ADR allows parties to save time and money (compared to litigation), decide their own procedures during the mediation or arbitration, prevent setting precedent/lawmaking (if desired), and, in mediation, reach more inventive solutions. ADR, however, provides more opportunity for arbitrator or mediator bias, less extensive discovery, no right to appeal the decision, and an increased chance of no resolution (which ultimately leads to litigation). See ADR: A Litigator's Perspective.

11.4.3: Class Waiver (Compliance)

Do the policies clearly indicate whether or not the vendor requires the user to waive their right to join a class action lawsuit?

  • Indicator
    • Discloses the requirement that users must waive any rights to join a class action lawsuit.
  • Background
    • A class action is a lawsuit filed on behalf of a group, or 'class,' of individuals with similar legal claims. A plaintiff, or small group of named plaintiffs (plaintiffs whose names appear on the face of the legal filings), submits a lawsuit representing a larger group of plaintiffs (of which the potential size may be several thousands of individuals) who are unnamed. The unnamed plaintiffs are identified as members of a class by virtue of a shared legal grievance with the named plaintiff(s) . . . Class actions provide a means through which plaintiffs may bring small legal claims that would be too costly to litigate individually. Additionally, class actions can equalize the difference in power between robust entities and individuals without significant resources . . . While the size of a class provides strength to its members, it also limits their choices and options. Unnamed plaintiffs (those who chose not to opt-out), have the least amount of control over the case . . . [T]he complexity of class action cases makes them difficult and expensive to litigate, requiring greater time and resources of the attorneys and the court. See Justia.

11.4.4: Law Enforcement (Compliance)

Do the policies clearly indicate whether or not the vendor can use or disclose a user's data under a requirement of applicable law, to comply with a legal process, respond to governmental requests, enforce their own policies, for assistance in fraud detection and prevention, or to protect the rights, privacy, safety or property of the vendor, its users, or others?

  • Indicator
    • Discloses user information may be shared with government or legal authorities.
    • Discloses processes for responding to government or legal requests.
    • Discloses processes for responding to private requests.
    • Discloses its process for responding to court orders.
    • Discloses its process for responding to requests from foreign jurisdictions.
    • Discloses explanations that include the legal basis under which it may comply with government or private requests.
    • Discloses the company carries out due diligence on government or private requests before deciding how to respond.
    • Discloses the company's commitment to push back on inappropriate or overbroad government or private requests.
    • Discloses clear guidance or examples of implementation of its process of responding to government or private requests.
  • Citation
  • Background
    • Disclosure of personal information for the "internal operations" of the website or online service, means activities necessary for the site or service to maintain or analyze its functioning; perform network communications; authenticate users or personalize content; serve contextual advertising or cap the frequency of advertising; protect the security or integrity of the user, website, or online service; ensure legal or regulatory compliance; or fulfill a request of a child. See 16 C.F.R. 312.2; See also FTC, Complying with COPPA: Frequently Asked Questions, q. 5.
    • A vendor should only disclose covered information acquired through the site or service to ensure legal compliance, respond to judicial process, or protect an individual's safety or the security of the site or service. See Ready for School, Recommendations for the Ed Tech Industry to Protect the Privacy of Student Data (November 2016), CA. D.O.J., p. 14.
    • Companies often receive requests to remove, filter, or restrict access to content and accounts. These requests can come from government agencies or courts (both domestic and foreign), as well as from private entities (i.e., non-governmental and non-judicial entities). We expect companies to publicly disclose their process for responding to requests from governments and courts, as well as to private requests that come through some type of defined or organized process. Private requests can come through a process established by law (e.g., requests made under the U.S. Digital Millennium Copyright Act, the European Right to be Forgotten ruling, etc.) or a self-regulatory arrangement (e.g., company agreements to block certain types of images). This indicator evaluates whether the company clearly discloses how it responds to government and private requests to remove, filter, or restrict content or accounts. The company should disclose the legal reasons why it would remove content. In some cases, the law might prevent a company from disclosing information referenced in this indicator’s elements. See Ranking Digital Rights, F5.

11.5: Certification

11.5.1: Privacy Badge (Compliance)

Do the policies clearly indicate whether or not the vendor has signed any privacy pledges or received any other privacy certifications?

  • Indicator
    • Discloses commitment to a third-party privacy certification, badge, award, or principles of a privacy pledge.
  • Citation
  • Background
    • Privacy protection depends on companies being accountable to consumers as well as to agencies that enforce consumer data privacy protections. However, compliance goes beyond external accountability to encompass practices through which companies prevent lapses in their privacy commitments or detect and remedy any lapses that may occur. Companies that can demonstrate that they live up to their privacy commitments have powerful means of maintaining and strengthening consumer trust. A company's own evaluation can prove invaluable to this process. The appropriate evaluation technique, which could be a self-assessment and need not necessarily be a full audit, will depend on the size, complexity, and nature of a company's business, as well as the sensitivity of the data involved. See Exec. Office of the President, Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy (2012), p. 22.

11.6: International Laws

11.6.1: GDPR Jurisdiction (Compliance)

Do the policies clearly indicate whether or not a user's data are subject to international data transfer or jurisdiction laws, such as a privacy shield or safe harbor framework that protects the cross-border transfer of a user's data?

  • Indicator
    • Discloses the company is subject to GDPR jurisdiction.
    • Discloses transfer of user information is subject to international data jurisdiction laws.
  • Citation
    • General Data Protection Regulation: (The EU General Data Protection Regulation (GDPR) replaces the Data Protection Directive 95/46/EC and was designed to harmonize data privacy laws across Europe, to protect and empower all EU citizens data privacy, and to reshape the way organizations across the region approach data privacy) See General Data Protection Regulation (GDPR)
    • General Data Protection Regulation: (This Regulation applies to the processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing takes place in the Union or not.) See General Data Protection Regulation (GDPR), Territorial Scope, Art. 3(1)
    • General Data Protection Regulation: (This Regulation applies to the processing of personal data of data subjects who are in the Union by a controller or processor not established in the Union, where the processing activities are related to: (a) the offering of goods or services, irrespective of whether a payment of the data subject is required, to such data subjects in the Union; or (b) the monitoring of their behaviour as far as their behaviour takes place within the Union.) See General Data Protection Regulation (GDPR), Territorial Scope, Art. 3(2)(a)-(b)
    • General Data Protection Regulation: (This Regulation applies to the processing of personal data by a controller not established in the Union, but in a place where Member State law applies by virtue of public international law.) See General Data Protection Regulation (GDPR), Territorial Scope, Art. 3(3)
    • General Data Protection Regulation: (“binding corporate rules” means personal data protection policies which are adhered to by a controller or processor established on the territory of a Member State for transfers or a set of transfers of personal data to a controller or processor in one or more third countries within a group of undertakings, or group of enterprises engaged in a joint economic activity) See General Data Protection Regulation (GDPR), Definitions, Art. 4(20)
    • General Data Protection Regulation: (“cross-border processing” means either: (a) processing of personal data which takes place in the context of the activities of establishments in more than one Member State of a controller or processor in the Union where the controller or processor is established in more than one Member State; or (b) processing of personal data which takes place in the context of the activities of a single establishment of a controller or processor in the Union but which substantially affects or is likely to substantially affect data subjects in more than one Member State.) See General Data Protection Regulation (GDPR), Definitions, Art. 4(23)(a)-(b)
    • General Data Protection Regulation: (Where personal data relating to a data subject are collected from the data subject, the controller shall, at the time when personal data are obtained, provide the data subject with all of the following information: ... (f) where applicable, the fact that the controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means by which to obtain a copy of them or where they have been made available.) See General Data Protection Regulation (GDPR), Information to be provided where personal data are collected from the data subject, Art. 13(1)(f)
    • General Data Protection Regulation: (Where personal data have not been obtained from the data subject, the controller shall provide the data subject with the following information: ... (f) where applicable, that the controller intends to transfer personal data to a recipient in a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or ... reference to the appropriate or suitable safeguards and the means to obtain a copy of them or where they have been made available.) See General Data Protection Regulation (GDPR), Information to be provided where personal data have not been obtained from the data subject, Art. 14(1)(f)
    • General Data Protection Regulation: (Where personal data are transferred to a third country or to an international organisation, the data subject shall have the right to be informed of the appropriate safeguards pursuant to Article 46 relating to the transfer.) See General Data Protection Regulation (GDPR), Right of access by the data subject, Art. 15(2)
    • General Data Protection Regulation: (Any transfer of personal data which are undergoing processing or are intended for processing after transfer to a third country or to an international organisation shall take place only if, subject to the other provisions of this Regulation, the conditions laid down in this Chapter are complied with by the controller and processor, including for onward transfers of personal data from the third country or an international organisation to another third country or to another international organisation. All provisions in this Chapter shall be applied in order to ensure that the level of protection of natural persons guaranteed by this Regulation is not undermined.) See General Data Protection Regulation (GDPR), General principle for transfers, Art. 44
    • General Data Protection Regulation: (A transfer of personal data to a third country or an international organisation may take place where the Commission has decided that the third country, a territory or one or more specified sectors within that third country, or the international organisation in question ensures an adequate level of protection. Such a transfer shall not require any specific authorisation.) See General Data Protection Regulation (GDPR), Transfers on the basis of an adequacy decision, Art. 45(1)
    • General Data Protection Regulation: (Where [Article 3(2)] applies, the controller or the processor shall designate in writing a representative in the Union.) See General Data Protection Regulation (GDPR), Representatives of controllers or processors not established in the Union, Art. 27(1)
    • General Data Protection Regulation: (The obligation [to designate a representative] shall not apply to: (a) processing which is occasional, does not include, on a large scale, processing of special categories of data ... or processing of personal data relating to criminal convictions and offences ... and is unlikely to result in a risk to the rights and freedoms of natural persons, taking into account the nature, context, scope and purposes of the processing) See General Data Protection Regulation (GDPR), Representatives of controllers or processors not established in the Union, Art. 27(2)
    • General Data Protection Regulation: (The representative shall be established in one of those Member States where the data subjects are and whose personal data are processed in relation to the offering of goods or services to them, or whose behaviour is monitored.) See General Data Protection Regulation (GDPR), Representatives of controllers or processors not established in the Union, Art. 27(3)
    • General Data Protection Regulation: (The representative shall be mandated by the controller or processor to be addressed in addition to or instead of the controller or the processor by, in particular, supervisory authorities and data subjects, on all issues related to processing, for the purposes of ensuring compliance with this Regulation.) See General Data Protection Regulation (GDPR), Representatives of controllers or processors not established in the Union, Art. 27(4)
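
Taken together, Arts. 13(1)(f), 14(1)(f), and 15(2) define a concrete set of disclosures an evaluator can look for whenever a policy mentions cross-border transfers: that transfers occur, that an adequacy decision (Art. 45) or appropriate safeguards (Arts. 46, 47, 49) are referenced, and that a copy of those safeguards can be obtained. As a rough illustration only (the field names and pass/fail logic below are our own sketch, not part of the GDPR or of any vendor's policy), a reviewer's worksheet for this question might be recorded as a small structured check:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TransferDisclosure:
    """Hypothetical worksheet for reviewing a policy's transfer disclosures.

    Fields mirror the elements of Arts. 13(1)(f)/14(1)(f): whether transfers
    to third countries are disclosed, whether an adequacy decision (Art. 45)
    is referenced, and whether appropriate safeguards (Arts. 46/47/49) are
    described along with a way to obtain a copy of them.
    """
    transfers_disclosed: bool            # policy states transfers occur (or do not)
    adequacy_decision_referenced: bool   # Art. 45 adequacy decision mentioned
    safeguards_referenced: bool          # Art. 46/47/49 safeguards described
    safeguards_copy_location: Optional[str] = None  # where a copy can be obtained

def meets_transfer_disclosure(d: TransferDisclosure) -> bool:
    """Adequate (under this sketch's reading) if transfers are disclosed and
    either an adequacy decision or the applicable safeguards, together with
    the means to obtain a copy, are cited."""
    if not d.transfers_disclosed:
        return False
    return d.adequacy_decision_referenced or (
        d.safeguards_referenced and d.safeguards_copy_location is not None
    )

# Example: a policy citing Standard Contractual Clauses with a link to them.
policy = TransferDisclosure(
    transfers_disclosed=True,
    adequacy_decision_referenced=False,
    safeguards_referenced=True,
    safeguards_copy_location="https://example.com/privacy/sccs",  # hypothetical URL
)
print(meets_transfer_disclosure(policy))  # True
```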

11.6.2: GDPR Role (Compliance)

Do the policies clearly indicate whether or not the vendor is categorized as a Data Controller or a Data Processor, and whether it has designated a Data Protection Officer (DPO), for the purposes of GDPR compliance?