Responsibly Designed AI

AI-powered edtech must be designed responsibly if all students and educators are to benefit from its innovation.

Earning the Responsibly Designed AI certification demonstrates that a product's developers designed its AI to reduce algorithmic bias, provide transparency about data collection and security, label generative AI content, and give educators agency when engaging with AI outputs.

Learn more about how the Responsibly Designed AI product certification was developed in this blog post.

Certification Requirements

A product that earns this certification must submit evidence to demonstrate the following:

  1. The applicant has a public, free-to-access privacy policy, or similar document, to help current and prospective users understand:
    • What nonpublic personal information (NPI), including personally identifiable information (PII), is collected, how it is stored, and how long it is stored before being destroyed;
    • How this data is or may be used (e.g. to train the AI model);
    • Whether this data is shared with or sold to third parties; and
    • What control the user has over this data.
  2. The applicant has a public, free-to-access policy or other document that describes how the product ensures data security, includes a data breach or other incident response plan, and states which relevant data privacy laws the policy complies with (e.g. FERPA, COPPA, GDPR).
  3. The applicant has documentation describing how the product team has sought to acknowledge and reduce bias in the AI model from development through implementation. The applicant has provided two examples of instances where these monitoring processes identified bias and described the changes made to mitigate it. Additionally, users can report instances of bias directly through the product.
  4. There are labels within the product to indicate when AI has been used to generate content that a user might otherwise mistake for content created or shared by a human (e.g. chat communication, images, word problems, literature, narrative feedback).
  5. Educators or other individual users can override at least one AI decision or recommendation made to support learning (a hypothetical sketch of such labeling and override features follows this list).
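
To make requirements 4 and 5 more concrete, here is a minimal, hypothetical TypeScript sketch of how a product might tag AI-generated content with a visible provenance label and let an educator override an AI recommendation. The type and field names (ContentItem, Recommendation, provenance, overriddenBy) are illustrative assumptions, not part of the certification requirements or any Digital Promise specification.

```typescript
// Hypothetical sketch: one way a product could label AI-generated content
// (requirement 4) and support an educator override of an AI recommendation
// (requirement 5). All names here are assumptions for illustration only.

type Provenance = "human" | "ai-generated";

interface ContentItem {
  id: string;
  text: string;
  provenance: Provenance; // surfaced to users as a visible label (requirement 4)
  modelName?: string;     // optionally note which model produced the content
}

interface Recommendation {
  studentId: string;
  suggestedActivityId: string; // what the AI recommends next
  overriddenBy?: string;       // educator who overrode it, if any (requirement 5)
  finalActivityId: string;     // what will actually be assigned
}

// Label content produced by the AI so users do not mistake it for human-authored text.
function labelAiContent(text: string, modelName: string): ContentItem {
  return {
    id: crypto.randomUUID(),
    text,
    provenance: "ai-generated",
    modelName,
  };
}

// Let an educator replace the AI's suggested activity with their own choice,
// recording who made the override.
function overrideRecommendation(
  rec: Recommendation,
  educatorId: string,
  replacementActivityId: string
): Recommendation {
  return {
    ...rec,
    overriddenBy: educatorId,
    finalActivityId: replacementActivityId,
  };
}
```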

This certification expires after two years. Earners are welcome to re-apply to retain certification status.

What other certifications are there? Learn about the Research-Based Design: ESSA Tier 4 product certification here.
