Intended Users

An application or service can have many types of intended users or just one specific type. For example, some products are designed for a general audience that does not include kids, while others are designed to be used exclusively by children or students. Still other products are designed for a mixed audience and are intended to be used by anyone, including children, teens, students, parents, educators, and consumers.

General Audience Product

A general audience product is a product intended for adults, where the company has no actual knowledge that a child under the age of 13 has registered an account or is using the service, and where no age gate or parental consent is required prior to the collection or use of information. For example, a product that is not intended for children and would be unlikely to appeal to children under 13, such as a tax preparation service, would be a general audience product.

However, a general audience product may be considered directed to children if it would appeal to children under 13 years of age. This determination considers several factors, including: the subject matter; visual content; the use of animated characters or child-oriented activities and incentives; music or other audio content; the age of models; the presence of child celebrities or celebrities who appeal to children; the language or other characteristics of the product; and whether advertising promoting or appearing on the product is directed at children. Therefore, a general audience application or service that collects personal information from users to teach them their ABCs or basic numbers with animated cartoon characters would likely be considered a child-directed product.

Mixed-Audience Product

A mixed-audience product is directed to children but does not target children as its primary audience; rather, it targets teens 13 to 18 years of age or adults. A mixed-audience product is required to obtain age information from every user before collecting any personal information, and if a user identifies as a child under the age of 13, the company must obtain parental consent before any information is collected or used. For example, an education or consumer product that allows parents or teachers to log in through a separate account to use the product themselves, or to monitor or manage their children's or students' accounts, would be a mixed-audience product.
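The age-gate and consent requirement above can be sketched as a simple decision rule. This is a minimal, hypothetical illustration, not any product's actual implementation; the `Registration` type and `may_collect_personal_info` function are names invented here, and a real product would need verifiable parental consent obtained through a separate flow.

```python
from dataclasses import dataclass

# Hypothetical sketch of a mixed-audience age gate. Assumes parental
# consent, when required, is obtained through some separate verification
# step before this check is evaluated.
COPPA_AGE_THRESHOLD = 13  # parental consent required below this age

@dataclass
class Registration:
    age: int
    parental_consent: bool = False

def may_collect_personal_info(reg: Registration) -> bool:
    """Return True only when the product may collect personal information.

    A mixed-audience product must ask for age before collecting anything;
    users under 13 additionally require verifiable parental consent.
    """
    if reg.age >= COPPA_AGE_THRESHOLD:
        return True
    return reg.parental_consent
```

Under this rule, a 15-year-old may proceed directly, while a 10-year-old is blocked until parental consent is recorded.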

Child-Directed Product

A product directed at children is a product where the company has actual knowledge that it is collecting information from children under the age of 13 because children are targeted as the primary audience; as a result, parental consent is required before the collection or use of any information. For example, an application or service that teaches ABCs or basic numbers with animated cartoon characters would be a child-directed product.

Differential Privacy

The Privacy Program evaluates only products intended for a mixed audience that includes kids, or products directed at children and students. A child-directed product typically has a dedicated privacy policy and website, and the application or service provides the same privacy protections for both children and students. Mixed-audience products with various types of users, however, often have different privacy practices and protections depending on the category of user. (Note that "differential privacy" here refers to differing privacy practices across user categories, not the formal statistical technique of the same name.) This type of differential privacy practice allows a company to establish privacy protections that apply only to a specific subset of users, and in practice a company's goal is often to limit those protections to as few individuals as possible. For example, some products may sell user data and display behavioral advertising to parents, teachers, and consumers but not to children or students.

The Privacy Program evaluates products along multiple dimensions, including an overall score, a rating, and evaluation concerns, as described on our Evaluation Details page. A product's overall score helps all intended users understand its privacy protections and more easily compare products by how well they protect the privacy of all users. A product's rating, in turn, helps all intended users understand potential issues with a product's privacy practices. This is an important feature of our privacy evaluations: if a mixed-audience product is intended for both children and adults but has different privacy practices for adults than for kids, our rating reflects any "worse" practices (for the purposes of our evaluation process) because those practices apply to some intended user of the product. Additionally, users may automatically change category as they use a product and lose protections that were formerly in place. For example, if a product has greater protections for kids under 13, a kid who turns 13 may no longer benefit from those additional protections. As a result, our evaluations focus on the details that apply generally, or to all users, because a user may not have control over the conditions that determine which protections they and their data are afforded.

Protecting Users

Our ratings are designed to protect all users and flag a privacy risk if it applies to any intended user of the product. The following three examples illustrate the different ratings a mixed-audience product could receive:

1) No rating flags. If none of the rating criteria have been flagged with an alert icon, the answers to all the rating questions have been disclosed in the product's policy with "better" responses for the purposes of our evaluation. This product would receive a Pass rating.

2) Rating flags apply to all users. If one or more of the rating criteria have been flagged as a privacy risk, the product would be rated Warning. For example, a product's terms may state that personal information from any user can be sold to third parties, used to display behavioral advertisements, or used for tracking purposes.

3) Rating flags apply to only a specific type of user. If one or more of the rating criteria have been flagged as a privacy risk, the product would be rated Warning, even if the privacy risks apply only to a specific type of intended user, such as a parent or educator, and do not apply to children and students. This approach alerts all intended users to the potential privacy risks but also indicates in the product's overall summary any additional protections provided for other intended users. For example, a product's terms may state that no personal information collected from children or students may be sold to third parties or used to display behavioral advertisements, while other intended users, such as parents or educators, do not have similar protections.
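The three cases above amount to a "worst practice wins" aggregation rule. The sketch below illustrates that rule under stated assumptions; the function name, user-group labels, and flag names are hypothetical, not part of the actual Privacy Program tooling.

```python
# Hypothetical sketch of the rating rule described above: a product is
# rated Warning if ANY rating criterion is flagged for ANY intended user
# group, even when other groups are afforded stronger protections.
def overall_rating(flags_by_user_type: dict[str, set[str]]) -> str:
    """Map each user group (e.g. 'children', 'parents') to its set of
    flagged rating criteria, and return the product's overall rating."""
    if any(flags for flags in flags_by_user_type.values()):
        return "Warning"
    return "Pass"
```

So a product that sells data only from parents and educators, while fully protecting children and students, still rates Warning, which is exactly case 3 above.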

We believe this approach better protects children and students who use products with different privacy practices for different types of users: rather than give a false impression of safety for all users when only one group of intended users is afforded protections, we display the potential issues whenever any intended user is at risk. This allows parents and educators to be better informed about a product's overall privacy risks up front and provides them the opportunity to learn how those risks may affect their own decision to use the product, based on their unique concerns. It also allows parents and educators to make an informed decision, with all the available information, about whether a product that protects the personal information of children and students differently may still be appropriate to use in their context.