AI systems used in video surveillance or security have faced criticism for higher error rates when identifying people with darker skin tones and certain races, genders, or body types. More broadly, AI models have been criticized for bias when they: • Underrepresent people with disabilities • Stereotype certain groups • Generate biased or stereotypical characters • Assign certain jobs or roles to specific genders
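The "higher error rates" criticism is typically grounded in audits that compare misidentification rates across demographic groups. A minimal sketch of that comparison, using entirely hypothetical audit data and group names (nothing here comes from a real system):

```python
from collections import defaultdict

def per_group_error_rates(records):
    """Compute the identification error rate for each demographic group.

    `records` is an iterable of (group, correct) pairs, where `correct`
    is True when the system identified the person correctly.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit results; the *disparity* between groups,
# not the absolute rates, is what bias audits flag.
audit = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", True),
]
rates = per_group_error_rates(audit)
# group_a errs on 1 of 4 samples, group_b on 2 of 4
```

In real audits the groups, sample sizes, and error definitions (false match vs. false non-match) are specified by the evaluation protocol; this sketch only shows the per-group aggregation step.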