Based on what was presented yesterday at WWDC, Vision Pro is an amazing product. But what really fascinates me is EyeSight. Here is why.
EyeSight is an outward-facing display. Its (intended? initial?) use will be to show your eyes while you are wearing the Vision Pro, so that people around you know whether you're looking at them, at someone else, or are immersed in an experience of your own.
This is obviously very important, because it preserves the communication bandwidth provided by facial expressions. We all know that eyes and eye contact are an important, high-bandwidth communication channel for humans.
"I saw it in her eyes", "Eyes don't lie", "Let the eyes do the talking" —others have said it better than me.
But I have the feeling that EyeSight can be more than this. In addition to preserving the existing communication bandwidth, it may also increase it! If Apple allows it, I can see a future where the outward display shows more than your eyes.
Imagine being able to perform a hand gesture, or even an eye gesture, like blinking in a specific pattern, that changes the display and shows emojis (I'm really bad at design; I'm sure any designer could make these look much better). Or even animations, like fireworks or any animated meme GIF!
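Just to make the idea concrete: EyeSight has no public API today, so this is only a hypothetical sketch of the kind of gesture-to-display mapping such a feature might use. Every name here (`EyeGesture`, `outwardDisplayContent`) is invented for illustration, not an actual Apple API.

```swift
// Hypothetical sketch only: there is no public EyeSight API.
// Models a mapping from an (imagined) eye gesture to what the
// outward display would show instead of the wearer's eyes.

enum EyeGesture {
    case doubleBlink   // two quick blinks
    case longWink      // a held wink
    case none          // no deliberate gesture
}

/// Returns the emoji the outward display would show for a gesture,
/// or nil to keep showing the wearer's eyes as usual.
func outwardDisplayContent(for gesture: EyeGesture) -> String? {
    switch gesture {
    case .doubleBlink: return "😂"  // double blink -> laughing emoji
    case .longWink:    return "🎆"  // long wink -> fireworks animation
    case .none:        return nil   // default: just show the eyes
    }
}
```

The key design point is that the default case stays the eyes themselves, so the expressive layer only kicks in on a deliberate gesture.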
I know the idea may sound silly to many, but there is a generation of people who are really good at communicating ideas and feelings with visual cues when they are in front of a keyboard, yet have no way of doing it IRL. (Just visit any Discord server.) Being able to express themselves IRL, with images displayed exactly where the person they are talking to is looking, could be a killer feature.
And the most interesting part: if this turns out to be a killer feature, it's one that will drive adoption, because it has built-in virality. When I use it, other people see how cool it is and want it too.