Apple calls back to our foundational spatial computing work in Vision Pro unveiling
In a blink-and-you’ll-miss-it moment in Monday’s WWDC 2023 keynote, Apple called back to some foundational spatial computing work I did at Microsoft, as part of their unveiling of the Vision Pro.
With all the demonstrated advancements in eye tracking and on-device natural language processing, the line “…you can look at a search field and just start dictating” may have seemed obvious. But it wasn’t obvious seven years ago, when Microsoft filed for one of my patents, “Replacing pronouns with focus-specific objects in search queries.”
Apple announced they have filed for over 5,000 patents of their own on the technology in Vision Pro, and in one of them, issued just eleven weeks ago, my patent is listed as a cited reference: 10,262,036, Paine et al.
Apple joins Qualcomm in citing it directly in a patent filing, and I’m really proud to see the industry continue to intentionally build on our mixed reality work from 2016.
(A second Apple patent, from 2021, also references it, but there, as with citations from HTC, IBM, LG, Canon, and others, the reference came from the examiner rather than from the application or search process.)