Colorado’s new brain-wave law shows that biometric privacy legislation (and its impact on product teams) won’t stop at faces and voices
A recently passed bill in Colorado classifies brain waves as sensitive personal information that must be protected with the same safeguards as fingerprints and facial recognition data.
Most (though not all!) user research doesn’t collect neural data, but we should be prepared for legislation that restricts our ability to collect and store user data in unanticipated ways.
Before GDPR, product teams didn’t worry too much about storing personally identifying information as long as participants had signed a boilerplate consent form.
Before BIPA and similar biometric privacy laws, we didn’t worry about showing faces or recording voices in our user research videos. That’s changing as we speak.
Looking ahead, what other information will legislators and courts deem sensitive? Distinctive moles? Tattoos?
How can we future-proof the user research recordings we make today, so that we won’t have to re-redact them (or, more likely, destroy them) every time there’s a new law?
The solution I’ll explore in my demo for Designing with AI 2024 replaces the user’s entire body with an AI-generated avatar, removing most biometric identifiers outright and better protecting our users in the process.