Our Privacy Principles
We believe that everyone deserves the right to privacy.
Our privacy ratings are designed to protect all users of a product and to flag any privacy risk that applies to any intended user of the product. We believe this approach best protects children and students using products whose privacy practices differ by type of user: rather than give a false impression of safety for all users when only one group of users is afforded better privacy protections, we surface potential issues if any users are at risk. Ultimately, this best-in-class privacy rating approach holds companies accountable and allows parents, educators, and consumers to feel more confident and informed about a product's overall privacy risks up front. Our comprehensive approach is also designed to give users the opportunity to learn more about how a product's privacy risks may affect their own decision to use it, based on their unique needs and concerns.
Moreover, our privacy rating approach allows parents and educators to make an informed decision for themselves, with all the available information, about whether a product may be appropriate to use in their context, because products often protect parents' and educators' personal or behavioral information differently than children's and students'. Our privacy rating approach also takes into account the possibility that a child's or student's personal or behavioral information in a child profile may be extrapolated by proxy or association with a managed account, such as a parent's or teacher's account that has a student's account associated with it. Parental or managed accounts with different roles may not have the same privacy practices, and often have fewer legal protections. For example, if a product's mixed privacy practices include "worse" practices for adult users, such as targeted ads or tracking, but not for child profile accounts, the product may still be able to indirectly track or target children or students through a parent's or educator's managed account.
Intended Users
An application or service can have many intended users or just one specific type of intended user. For example, some products are designed for a general audience that does not include kids, while other products are designed to be used exclusively by children or students. In addition, some products are designed for a mixed audience and are intended to be used by anyone, including children, teens, students, parents, educators, and consumers. Our privacy ratings are a measure of the minimum commitment to privacy a product is willing to make for all of its users.
General Audience Products
A general‐audience product is a product intended for adults where the company has no actual knowledge that a child under the age of 13 has registered an account or is using the service, and no age gate or parental consent is required prior to the collection or use of personal or behavioral information. For example, a product that is not intended for children and would not likely appeal to children under 13, such as a tax preparation service, would be considered a general‐audience product.
However, a general-audience product may be considered directed to children if the product would appeal to children under 13 years of age, a determination that takes several factors into consideration, such as: the subject matter, visual content, the use of animated characters or child-oriented activities and incentives, music or other audio content, the age of people depicted in the product, the presence of child celebrities or celebrities who appeal to children, language or other characteristics of the product, or whether advertising promoting or appearing on the product is directed at children. Therefore, a general-audience application or service that uses animated cartoon characters and child-oriented activities while collecting personal or behavioral information from users would likely be a child-directed product.
Mixed‐Audience Products
A mixed-audience product is directed at children but does not target children as its "primary audience" – rather, it targets teens 13 to 18 years of age or adults. A mixed-audience product is required to obtain age information from any user before collecting any personal or behavioral information. In addition, if a user indicates they are a child under the age of 13, the company must obtain parental consent before any personal or behavioral information is collected or used. For example, an education or consumer product that allows parents or teachers to log in through a separate account to use the product, or to monitor or manage their children's or students' accounts, would be a mixed-audience product.
Child‐Directed Products
A product directed at children is a product where the company has actual knowledge it is collecting information from children under the age of 13 because children are targeted as the primary audience, and, as a result, parental consent is required before the collection or use of any personal or behavioral information. For example, an application or service that teaches ABCs or basic numbers with animated cartoon characters would be a child‐directed product.
Selective Privacy
The Common Sense Privacy Program evaluates all types of products, including products intended for children and teens as well as general-audience and mixed-audience products used by parents, educators, and consumers. A child- or student-directed product typically has a separate privacy policy and website, and the application or service provides the same strong privacy protections for both children and students.
However, general-audience and mixed-audience products with different types of users often have different privacy practices and protections based on the category of user, but combine all users into a single privacy policy or product. This type of “selective privacy” allows companies to establish privacy protections that apply only to a specific subset of users, such as children or students, but not to others. For example, some products may sell data and display targeted advertising to parents, teachers, and consumers, but refrain from doing so for children or students only while those users are signed in to clearly identified protected profiles.
If children under the age of 13, or teens younger than 18, use an app or service, the company should disclose better privacy protections for children and teens in its privacy policy. In addition, if an app or service has worse privacy practices for adult users or consumers, such as selling their data to third parties or displaying targeted advertising, that could inadvertently apply to child users, then the company should disclose that additional protections are in place to protect those children or teens by default. Even if an app or device is not intended to be used by children or teens, it should still disclose in its privacy policy how it will handle information inadvertently collected from children or teens who are not logged in to protected profiles with stronger privacy protections, until that information can be deleted. For example, suppose Nancy, a 12-year-old girl, uses an app without logging in to an account, and the product’s policy indicates that it uses data from all users except children for targeted advertising by default. The fact that the company may not have actual knowledge that Nancy is a child under 13 does not change the fact that it has collected, and intends to use, Nancy's information in ways that are inconsistent with its claims of protecting children’s privacy.
A product's privacy rating is intended to be used by all users of a product to help them better understand any potential issues with the product's privacy practices. This is an important feature of our privacy evaluations and rating: if a general-audience or mixed-audience product is used by both children and adults but has different privacy practices for adults than for kids, our rating reflects any "worse" practices, because those practices apply to some intended or unintended users of the product. This is an important distinction from other third-party privacy seals or badges, whose approaches are limited to verifying only that child or student data is protected for minimum compliance while ignoring any worse privacy practices that may apply to other types of users, such as parents, educators, or consumers.
Additionally, child or student users may automatically change user classification as they grow older and lose privacy protections that were formerly in place. If a product has greater privacy protections for kids under 13, a child who turns 13 may no longer benefit from those additional protections. For example, many products that allow targeted advertising to adults do not clarify in their privacy policy that they will not retroactively use data collected from children for targeted advertising after those children become teens or adults. As a result, our privacy evaluations focus on the details that apply generally or to all users, since a user may not have control over the conditions that determine which protections they and their data are afforded.