Emerging technology creates perplexing problems for user-centered design. How do you take a user-centered approach when it’s too early to have a defined user? Does that mean you should throw out user-centered design, or choose your target user based solely on total addressable market (TAM)?
With all the hype around AI, it’s hard to tell which issues deserve attention. Last week’s Rosenfeld Media community workshop on AI was a valuable opportunity to see which questions are top of mind for the UX community.
AI is a technology with rippling, systems-level effects. Collaborating across diverse disciplines is the only way to begin to understand the full implications of our AI design decisions. This is the focus of an upcoming community workshop on artificial intelligence that I’ll be moderating.
In Rosenfeld Media’s upcoming Advancing Research community workshop on artificial intelligence, I’ll be talking with Rachael Dietkus, Nishanshi Shukla, and David Womack about what researchers and tech professionals can do to mitigate the issues AI tool use creates, from psychological harm to users to damage to our knowledge bases.
Analysts argue we’re entering the “Trough of Disillusionment” for AI. That may be bad news for investors, but for product teams, it offers new opportunities to build products that solve more meaningful problems for their users.
Startup Archetype AI is fusing physical sensor data with LLMs to create an AI model that will "encode the entire physical world." In this approach, natural language becomes a translation layer used both to interpret input (i.e., sensing) and to issue commands (e.g., controlling a robot arm). This work is exciting but also hard to access. What can a regular person do to start experimenting with and preparing for this type of tech?
An article in this month’s issue of The Fabricator underscores that physical industry workers are not experiencing the same AI boom as information workers. Are we so accustomed to designing for information workers that we’re overlooking opportunities to serve new audiences?
Aman Ibrahim created TerifAI (as in “terrify”), a bot that can clone your voice after only a minute or so of conversation. It’s an incredible demonstration of the rapid development (and growing threat) of voice cloning AI, and a clear example of why biometric voice redaction is becoming more urgent.
As part of a recent biometric data redaction demo, we used ElevenLabs to replace the voices of participants in recordings with AI-generated speech. Now ElevenLabs has announced new voice models based on classic actors. This raises new questions about trade-offs between participant privacy and stakeholder perceptions.
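For teams wondering what that re-voicing step can look like in practice, here is a minimal sketch of just the synthesis portion, calling ElevenLabs’ text-to-speech REST endpoint to speak a transcribed participant quote in a substitute voice. The voice ID, model ID, and file names are placeholders, and the rest of the demo workflow (transcription, alignment, and splicing the new audio back into the recording) is not shown.

```python
# Minimal sketch: re-voice a transcribed participant quote with ElevenLabs TTS.
# Assumes an ELEVENLABS_API_KEY environment variable; the voice ID below is a
# placeholder, and this covers only synthesis, not the full redaction pipeline.
import os
import requests

API_KEY = os.environ["ELEVENLABS_API_KEY"]
VOICE_ID = "placeholder-stock-voice-id"  # hypothetical; use a licensed stock voice

def revoice(transcript_text: str, out_path: str = "redacted_quote.mp3") -> None:
    """Synthesize the participant's words in a substitute AI-generated voice."""
    response = requests.post(
        f"https://api.elevenlabs.io/v1/text-to-speech/{VOICE_ID}",
        headers={"xi-api-key": API_KEY, "Content-Type": "application/json"},
        json={"text": transcript_text, "model_id": "eleven_multilingual_v2"},
        timeout=60,
    )
    response.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(response.content)  # MP3 audio bytes by default

if __name__ == "__main__":
    revoice("I found the checkout flow confusing on my first try.")
```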
We’ve started to recognize the quality and legal issues AI brings into our own product ecosystems. But incorporating AI into our products also means we are integrating ourselves into much larger systems: systems with far-reaching and often hard-to-understand consequences. As human-centered professionals, how can we tackle this difficult problem?
Manuel Herrera did this fabulous sketchnote version of my AI doppelgangers demo at Designing With AI 2024. It's a concise summary of the limited biometric data redaction options available for recordings of humans today, and of some immediate issues with the AI tools that could take over this task in the future.
In a recent blog post, Simon Willison asked whether Image Playground represents an “ethical approach to AI generated images.” His arguments are very similar to our case for using avatars to de-identify users in research recordings.
The new Apple Intelligence model for AI data privacy prioritizes local processing, with strict controls for when and how cloud compute is used. What would it look like to use Apple’s model for processing UX and research data with AI?
What if you could eliminate data privacy concerns with your user research videos, security footage, etc., by automatically removing faces and voices? This project showcases a way to use AI to protect individuals’ privacy rather than threaten it.
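As a rough illustration of the face-removal half of that idea (not necessarily the approach this project takes), here is a minimal OpenCV sketch that detects faces frame by frame with a bundled Haar cascade and blurs them before writing the video back out. The file names are placeholders, and production-grade redaction would also need face tracking, stronger detectors, and voice handling.

```python
# Minimal sketch: blur detected faces in a recording with OpenCV.
# The Haar cascade is used for illustration only; real redaction would need
# stronger detection, tracking across frames, and separate voice redaction.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture("session_recording.mp4")  # placeholder input file
fps = cap.get(cv2.CAP_PROP_FPS)
size = (
    int(cap.get(cv2.CAP_PROP_FRAME_WIDTH)),
    int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT)),
)
out = cv2.VideoWriter("session_redacted.mp4", cv2.VideoWriter_fourcc(*"mp4v"), fps, size)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        # Replace each detected face region with a heavy Gaussian blur.
        frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
    out.write(frame)

cap.release()
out.release()
```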
Product teams are being told they can't keep research recordings they need because of privacy laws. This is a solvable problem, but the solution we identified requires domain knowledge in law, audio and video editing, and 3D art, in addition to research know-how.