Control iPhone with the movement of your eyes

(support.apple.com)

143 points | by 9woc 99 days ago

25 comments

  • augstein 96 days ago
    Amazing how well eye tracking works on my phone (15 Pro).

    Unfortunately, there seems to be no way to press buttons by blinking, only by "dwelling" on an item for a few seconds, which makes using my phone feel quite hectic and prone to inadvertent inputs for me.
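The "dwell" mechanism described above can be sketched as a small state machine: a selection fires only after the gaze holds still near one point for a fixed time. This is a hypothetical illustration; the constants and names are invented, not Apple's implementation:

```python
# Hypothetical sketch of dwell-to-select logic (not Apple's implementation).
import math

DWELL_SECONDS = 2.0   # assumed hold time before a "click" fires
RADIUS = 40.0         # assumed gaze tolerance in points

class DwellSelector:
    def __init__(self, dwell_seconds=DWELL_SECONDS, radius=RADIUS):
        self.dwell_seconds = dwell_seconds
        self.radius = radius
        self.anchor = None    # where the current dwell started
        self.elapsed = 0.0

    def update(self, gaze, dt):
        """Feed a gaze sample (x, y) and the elapsed time step.
        Returns the anchor point when a dwell completes, else None."""
        if self.anchor is None or math.dist(gaze, self.anchor) > self.radius:
            # gaze moved outside the tolerance: restart the dwell timer
            self.anchor = gaze
            self.elapsed = 0.0
            return None
        self.elapsed += dt
        if self.elapsed >= self.dwell_seconds:
            fired = self.anchor
            self.anchor = None   # reset after firing
            self.elapsed = 0.0
            return fired
        return None
```

Any gaze movement larger than the radius restarts the timer, which is why the interaction feels slow, while a short dwell time makes it prone to accidental clicks.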

    • nomel 96 days ago
      I think interfering with a biological necessity that is there to maintain eye health probably isn't a good candidate for HID input. I suspect the user would end up with very dry eyes, as they subconsciously/consciously refrained from blinking, even if it were only long blinks (which I do often if my eyes feel the need).

      Now I could see a wink working! Left wink, right wink. And, with a wink, you don't lose tracking during the action (just half the signal).

      • azinman2 95 days ago
        This is built for accessibility for people with extreme impairments. Think quadriplegic, ALS, nervous system disorders, etc.
      • FollowingTheDao 95 days ago
        Talk about eye health... are there any studies of long-term IR exposure on the eyes? (I am assuming they are using IR sensors.)
        • gruez 95 days ago
          It's unclear why an iPhone would be worse than a 60W light bulb, or the sun, both of which emit so much IR you can feel the heat with your hands.
      • willio58 92 days ago
        Why not double blink? Or a flare of the nose?
    • somedude895 95 days ago
      Interesting, it's practically useless on my iPhone 13 (non-Pro) due to being extremely jittery and inaccurate
    • helsinki 96 days ago
      Mine doesn’t work at all on a 16 Pro. Maybe due to astigmatism?
      • yzydserd 95 days ago
        It’s terrible for me too. Mild astigmatism and amblyopia. It’s now got me wondering how well I’d get along with Vision Pro.
        • helsinki 93 days ago
          I also have the Vision Pro and it’s not perfect for me. It definitely struggles with precision. Actually, now that you mention it, it felt very similar to my poor experience with the 16 Pro.
    • cush 96 days ago
      It's really bad on my iPhone 13. Surprised they released it here. After one or two clicks it needs to recalibrate. There doesn't seem to be a way to not click on things either. No way to change apps. After the third recalibration, I selected "yes" to "would you like to disable eye tracking", and while eye tracking was disabled, I also lost access to swiping down on the control center or swiping up on the app switcher. Had to restart the phone to get things back to a usable place.
    • sen 95 days ago
      Just tried it on a 16 Pro Max and it’s insanely good. I really expected it to be a lot more random and buggy, but had no problem navigating the phone and “clicking” buttons etc.

      This is basically the sort of “future tech” I imagined as a kid. We can now talk to computers and they talk back (ChatGPT, Apple Intelligence, etc.) and flawlessly navigate our portable supercomputers just by looking at them.

    • bdavbdav 96 days ago
      Hectic is a good way to describe it. I was frantically trying to turn it off when it seemed to just start clicking on things at random.
  • OptionOfT 95 days ago
    It is amazing how well iOS supports these accessibility features but doesn't consider blocking video autoplay on websites, something that is incredibly distracting for people with ADHD.
    • endofreach 95 days ago
      Since when? Safari used to be the only browser that forced user interaction prior to autoplay. Are you sure you didn't manually activate a feature flag or have an extension installed?
    • nomagicbullet 95 days ago
      There is a specific accessibility setting to reduce motion and prevent videos from autoplaying on iOS.

      Go to Settings > Accessibility > Motion and turn on "Reduce Motion"

      • willio58 92 days ago
        Yeah this is super useful.

        Also audio doesn’t autoplay on any websites. You have to interact with the page first for that to happen

    • yetanotheruser8 95 days ago
      You can just turn on low power mode and it stops autoplay videos.
    • chatmasta 95 days ago
      Hmm… I definitely have that blocked, but not sure the source of it. Probably it’s an extension.
    • _kyran 95 days ago
      Try the Safari extension StopTheMadness.
  • w10-1 96 days ago
    Cursor tracks ok, but the implementation seems to replace a low-level pointing device. I.e., it's very precise and jittery - all attribution and no salience.

    Also maybe like Siri it should be modal. E.g., dwell away to silence, and then dwell leading corner to say "Hey, listen..."

    Holding the phone seemed to cause problems ("you're not holding it right"). Probably best with fixed positioning, e.g., attached to a screen (like a continuity camera), assuming you're lying down with a fixed head position.

    Tracking needs a magnetic (gravity well?) behavior, where the pointer is drawn to UI features (and the user can retract by resisting). Salience weighting could make it quite useful.

    It's possible that weighting could piggyback on existing accessibility metadata, or it might require a different application programming model.

    Similarly, it would be interesting to combine it with voice input that prioritized things near where you are looking.

    I'm willing to try, and eager to see how it gets integrated with other features.
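The "magnetic" or gravity-well behavior proposed above could be sketched as a pull toward nearby UI targets, weighted by salience and distance. Everything here (function names, radius, weighting) is invented for illustration, not from any Apple API:

```python
# Hypothetical sketch of a "gravity well" pointer: each UI target pulls
# the raw gaze point toward it, with stronger pull for closer and more
# salient targets. Constants are invented tuning parameters.
import math

def snapped_pointer(gaze, targets, pull_radius=80.0):
    """gaze: (x, y); targets: list of ((x, y), salience in 0..1).
    Returns the gaze point blended toward the best nearby target."""
    best, best_score = None, 0.0
    for center, salience in targets:
        d = math.dist(gaze, center)
        if d >= pull_radius:
            continue
        # closer and more salient targets score higher
        score = salience * (1.0 - d / pull_radius)
        if score > best_score:
            best, best_score = center, score
    if best is None:
        return gaze   # nothing nearby: pointer is unaffected
    # blend: a high score pulls the pointer most of the way to the target,
    # so the user can still "resist" a weak pull by looking away
    t = best_score
    return (gaze[0] + t * (best[0] - gaze[0]),
            gaze[1] + t * (best[1] - gaze[1]))
```

Salience weights could plausibly come from existing accessibility metadata, as the comment suggests, though ranking elements well likely needs more signal than element bounds alone.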

  • Willingham 96 days ago
    I too tried this for a short while and was not impressed. However, I can’t help but wonder how ‘good’ I could get at using it if I invested more time in it. Would love to hear from someone who truly uses this tool regularly. Flying a plane is also quite cumbersome the first 15 minutes.
  • smitelli 95 days ago
    There are some downright neat things in the iOS accessibility options. Example: you can set it so that a triple tap on the rear of the phone turns the flashlight on/off. People think it’s witchcraft how fast I can pull the phone out and switch it on without looking down.
    • notatoad 95 days ago
      I turned that on because it sounded cool, but it ended up making my flashlight turn on all the time.
      • mmh0000 95 days ago
        Ha. I had the exact same experience.

        Thought wow cool! Quick flashlight! Instead the flashlight would frequently turn itself on while in my pocket.

        And on the flip side; when I wanted the flashlight to come on the phone frequently wouldn’t recognize my tapping.

        It just ended up being quicker and more convenient to turn on the flashlight from the lock screen.

      • dmix 95 days ago
        How does the tapping happen by accident?
        • mhh__ 95 days ago
          Either fidgeting or it just being too sensitive. I love weird right-brained UX but the problem is that I drum with my fingers all the time (the two aren't unrelated).
    • rkagerer 95 days ago
      I'm on Android and just wish I had a simple button (or switch) dedicated to the flashlight.

      Back when I ran LineageOS it was easy to commandeer a physical button and repurpose it for this. Is it still difficult to do under the stock OS? (Note I'm already using a double tap of the power button to activate the camera.)

      E.g., it was frustrating how for the longest time Samsung wouldn't let you repurpose the Bixby button.

      • nosrepa 95 days ago
        I just double tap the back of my phone to turn the flashlight on. Might be reserved for pixel phones though.
        • rkagerer 95 days ago
          Thanks. Do you ever experience any 'false positives' i.e. flashlight turns on when you didn't intend? (A few other people in the comments mentioned that)
    • crazygringo 95 days ago
      Yup, I use the triple-tap for smart invert.

      That way when a website blinds me at night because they didn't implement dark mode, I do a quick triple-tap and magic-presto, poor man's dark mode! And after I close it, another triple-tap to go back to normal.

      I'm really just waiting for the OS and browser to do "fake dark mode" whenever it detects something with a white/light background. Seems like it's about time.

    • idbehold 95 days ago
      The physical switch on my old iPhone to toggle silent mode stopped working, and sometimes it will toggle itself. I had to set up the triple tap to toggle silent mode because the alternative is like 20 taps deep in settings.
    • stavros 95 days ago
      On Android I get the flashlight by double pressing the "lock" button. It's my single most useful shortcut, my flashlight is on literally before my phone is out of my pocket.
    • amelius 95 days ago
      It's always fun to tap on someone's phone 3x when it's in their pocket.
  • orobinson 95 days ago
    I love how many neat and handy features are baked into the iOS accessibility settings. Apple should make more noise about these features. I recently discovered there’s a white noise generator baked in which meant I could get rid of the annoying white noise app I had been using to keep my napping baby asleep. Another recent discovery was the on screen anti-motion sickness things that move around with the accelerometer. I get terrible motion sickness and they make a massive difference.
    • dagmx 95 days ago
      They do mention them in their press releases, they’re featured during accessibility day and they’re occasionally in ads too. I’m not sure what more noise would be relevant for these features.
    • joezydeco 95 days ago
      The "reduce white point" is a great hidden feature in Accessibility that dims your backlight even further for nighttime reading. You can even assign it to a triple-click or backtap (another great hidden feature).
  • dt3ft 96 days ago
    This is rather buggy. You can get locked out of your device.
  • dyeje 96 days ago
    I played around with this a bit. It doesn’t work amazingly on my iPhone model (SE 3rd gen), but it’s pretty cool. I don’t think there’s an API to use it in apps yet, but I would love to make an eye-controlled mobile game.
  • yaky 95 days ago
    I am surprised this wasn't done much sooner, especially for "flagship" smartphones. When I was in university ~15 years ago, I met a grad student [0] who implemented impressive eye tracking software and even played WoW with it as a demo.

    0: He is a professor now, with many publications in eye tracking, https://scholar.google.com/citations?user=i96gkYMAAAAJ&origi...

  • crazygringo 95 days ago
    Fascinating reading the comments to see some people say it works flawlessly and others say it's terrible and jittery.

    I'm wondering if it's lighting conditions? Bright light will allow the phone to read your eye position far more accurately than the noisy pixels in a dark room.

    Or did Apple do a big upgrade to their front camera recently that accounts for the difference?

  • ciceryadam 96 days ago
    So finally somebody made the Opera face control from 15 years ago a reality: https://www.youtube.com/watch?v=kkNxbyp6thM
  • maxglute 96 days ago
    If we were to expand on the face control scheme further, what face gestures would be used for clicking/tapping? Click-holding? What would be least exhausting to the face muscles, and what would look least ridiculous?
    • layer8 96 days ago
      Double-blinking for tapping seems the most obvious. Closing your eyes for a second for tap-and-hold (and a second time to release, when necessary, e.g. for drag and drop).
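The mapping proposed above (double-blink for tap, long closure for hold) could be recognized from a stream of eye-closure intervals. A toy classifier, with invented thresholds:

```python
# Toy classifier for the proposed blink gestures (thresholds invented):
# two quick blinks -> "tap", one long closure -> "hold".

def classify_blinks(closures, double_gap=0.5, long_hold=1.0):
    """closures: ordered list of (start_time, end_time) eye-closure
    intervals in seconds. Returns the list of recognized gestures."""
    gestures = []
    i = 0
    while i < len(closures):
        start, end = closures[i]
        if end - start >= long_hold:
            gestures.append("hold")
            i += 1
        elif (i + 1 < len(closures)
              and closures[i + 1][0] - end <= double_gap
              and closures[i + 1][1] - closures[i + 1][0] < long_hold):
            gestures.append("tap")
            i += 2
        else:
            # a lone short blink is ignored as a natural blink
            i += 1
    return gestures
```

Ignoring lone short blinks is what would keep natural blinking from triggering input, which addresses the eye-health concern raised earlier in the thread.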
      • inlined 96 days ago
        I think the hardest gesture to do would be to scroll. Cheekily I nominate sticking your tongue out and virtually licking the UX up/down
        • layer8 95 days ago
          I'm still of the opinion that smartphones should have page-up/page-down buttons, so that you can scroll easily with minimal finger movements while holding the phone one-handed.
  • GrumpyNl 96 days ago
    I see lots of people walking and scrolling on their phones. Every once in a while they look up and continue. What will happen when you control it with your eyes and you look up? Will it scroll?
    • nonameiguess 96 days ago
      It obviously can't work in all circumstances. Walking is one. Driving, whether you're controlling the phone via CarPlay or just have it mounted, is another. Using it in the dark, presumably the phone on its own doesn't light your face well enough. You can't use it with sunglasses on.
      • MBCook 95 days ago
        I would think this would use the attention detection stuff that’s part of Face ID and lights your face with infrared instead of just using the standard selfie camera.
      • entontoent 95 days ago
        Looool Best comment
    • pndy 96 days ago
      When I set it up and placed it on the table, I got a "bring your face into view" banner at the top of the screen.

      Still, on my 13 Pro it doesn't seem to follow my eyes correctly halfway through the screen: I need to look a little bit up, away from the center of my field of view, to get the pointer to select anything. I tried setting it up a few times and the same thing happens over and over. Scrolling seems to be done via the widget, similar to the AssistiveTouch one, unless there's some "eye gesture" I haven't figured out yet.

      It's a really interesting feature, surely helpful for people with mobility issues, but for the majority it's rather a novelty you can show your friends. The feature I do use quite often is Voice Control, which can be activated without training and helps when you have busy or dirty hands and don't want to touch the device.

  • ClassyJacket 96 days ago
    I can't wait for this to be implemented on other devices and used to make sure you actually look at the screen during ads! Google is going to love it.
    • jerlam 95 days ago
      The Samsung Galaxy S4 (released eleven years ago) had some kind of eye-tracking feature that paused a video (maybe ads too) if you weren't looking at it.
  • obahareth 95 days ago
    I had a very bad experience with it, it’s almost trying to select the complete opposite of what I’m looking at. Tried on iPhone 15 Pro.
  • hardlianotion 95 days ago
    Sounds great technically, but I don't want my phone paying attention to me when I am not paying attention to it.
    • MonkeyClub 94 days ago
      I don't think that's going to be an option moving forward, and in fact hasn't been one for a while now.
  • nytesky 95 days ago
    So can this do dynamic scrolling through a long document? I can’t seem to get it to do anything close to useful.
  • Jhsto 96 days ago
    Anyone know if there's a chance that the color of your eyes affects how well this feature works?
    • willwade 95 days ago
      Probably not. Typical eye gaze tracking using IR has two main ways of working: bright eye or dark eye. This, though, won't care as much; it's based on an inference model.
  • Diti 96 days ago
    Doesn’t work in a dark room (visual hypersensitivity from ASD requires me to do that).
    • bythreads 96 days ago
      It works by tracking reflective spots (glare) on your eyes, I would guess. Try some light you can tolerate?
  • chakintosh 96 days ago
    Tried this a while ago, it's really bad. Often freezes or overshoots the target.
  • gnrlst 96 days ago
    It doesn't mention how scrolling would work - is that supported?
    • canuckintime 96 days ago
      Only supported through the Dwell Control + AssistiveTouch features for scroll gestures
    • flurdy 96 days ago
      Eye rolling takes on a new purpose...
      • xyst 96 days ago
        imagine doom scrolling with your eyes. Eye fatigue from screens. Eye fatigue from too much rolling.
  • Thoreandan 96 days ago
    Kinda feels like an unfinished project?
  • hulitu 95 days ago
    > Control iPhone with the movement of your eyes

    Next: Control iPhone with the movement of your hips.

    (Hips don't lie - Shakira) /s

    Apple can sell any surveillance tool as a (security) feature.

  • xyst 96 days ago
    How long until this is turned on silently across all devices and adtech folks, native mobile apps, and website operators are able to use your eye tracking data for A/B testing?

    The selfie normalized surveillance. Social media normalized "transparency" (i.e., posting every little dumb detail about yourself). Advertisements have invaded all aspects of life (TV, radio, streaming services, ads in your taskbar).

    • crooked-v 96 days ago
      Apple has been pretty thorough about not allowing actual eye tracking data through to apps (just the resulting interactions), to the point that a lot of Vision devs have complained about it getting in the way of immersive UX design.
  • canuckintime 96 days ago
    Eye tracking is very impressive technology, and foveated rendering is an excellent application, but eye control is poor UX. Yes, I'm saying the emperor has no clothes.

    Imagine having a jittery cursor in the center of your vision at all times. If I had a mouse/trackpad working like that it would immediately be replaced, but that's Apple's eye control. Imagine scrolling a page and everywhere you glance there's a spotlighted/popup control or ad. That's Apple eye control utilizing dwell and snap-to-item.

    It's telling that the 'best window' apps/use cases for Vision Pro are video watching and Mac virtual display, which have low reliance on eye control during usage. Trying to browse the web with Apple's eye control is a clear regression compared to touch/keyboard/mouse/trackpad; it's only useful as an accessibility feature.
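The jitter several commenters mention is typically tamed by low-pass filtering the gaze stream before hit testing. A minimal exponential-smoothing sketch (the class name and alpha value are invented for illustration):

```python
# Minimal exponential smoothing of a noisy gaze stream. Lower alpha
# means a smoother but laggier pointer; this lag-vs-jitter tradeoff is
# part of why eye pointers tend to feel either shaky or sluggish.

class GazeSmoother:
    def __init__(self, alpha=0.2):
        self.alpha = alpha    # 0..1 blend weight for the newest sample
        self.state = None

    def update(self, sample):
        """Feed a raw (x, y) gaze sample; returns the smoothed point."""
        if self.state is None:
            self.state = sample   # first sample initializes the filter
        else:
            a = self.alpha
            self.state = (a * sample[0] + (1 - a) * self.state[0],
                          a * sample[1] + (1 - a) * self.state[1])
        return self.state
```

Production eye trackers usually use something adaptive (smoothing more when the eye is still, less during fast movements) rather than a fixed alpha, which may explain why the same feature feels flawless to some users and jittery to others.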

    • cush 96 days ago
      > eye control is poor UX... a jittery cursor in the center of your vision

      For a lot of folks with tremors or other mobility issues, their eyes may be much more stable than their fingers. It might be helpful to weigh the tradeoffs you're presenting with alternatives including a jittery inaccurate finger in the center of their vision, or even just not being able to use the UI at all.

      > only useful as an accessibility feature

      For the above reasons, that's exactly how they're marketing it

      • canuckintime 96 days ago
        > For a lot of folks with tremors or other mobility issues, their eyes may be much more stable than their fingers. It might be helpful to weigh the tradeoffs you're presenting with alternatives including a jittery inaccurate finger in the center of their vision, or even just not being able to use the UI at all.

        I did not suggest otherwise

        > For the above reasons, that's exactly how they're marketing it

        That's not how it's being received (see other HN users in this very topic) nor how Apple is marketing it for Vision Pro

        • cush 95 days ago
          > That's not how it's being received

          If you don't need the feature, and don't like the feature, and you have to dig through accessibility settings to enable it, then it's a strong indication that the feature probably wasn't built for you

          If you're curious about who this was built for, look up iPhone Switch Control [1] and you'll see how people with mobility issues otherwise use a touchscreen

          [1](https://youtu.be/HBo2BZ-Zzwg?t=119)

          • canuckintime 95 days ago
            Again, I understand the accessibility benefits of the feature and I'm not critiquing that. I'm responding to a Hacker News audience, most of whom want to hack on this feature, and I'm telling them not to bother
            • Angostura 95 days ago
              You said ‘the emperor has no clothes’, without caveats. For the intended audience, the emperor is indeed very snug
              • canuckintime 95 days ago
                I literally started and ended with caveats:

                > Eye tracking is very impressive technology, and foveated rendering is an excellent application...

                > useful as an accessibility feature

        • comex 95 days ago
          Vision Pro is different. It has a finger gesture to "tap".

          The iPhone eye tracking mode relies on dwelling with your eyes, making it much slower than tapping, therefore not a good option for people without disabilities. Unsurprisingly, the setting to enable it is under Accessibility.

          • canuckintime 95 days ago
            > Vision Pro is different. It has a finger gesture to "tap".

            That doesn't rescue it from being a poor control scheme.

            • dpig_ 95 days ago
              Eye tracking in an HMD and eye tracking through a selfie camera are not on par.
              • canuckintime 95 days ago
                It’s not the quality of the tracking I’m referring to. The eye tracking on Vision Pro is great for foveated rendering, for example. But as UI control it really doesn’t matter how good the tracking is; eye tracking is just a poor UX
                • cush 94 days ago
                  > eye tracking is just a poor UX

                  You’re literally just arguing your personal opinion by continually restating your personal opinion. Some people love this UX. By definition that makes it an excellent UX for some people.

                  It’s not a poor UX, it’s just a poor UX for the type of user you are.

                  • canuckintime 93 days ago
                    I gave reasons why it's poor UI:

                    > Imagine having a jittery cursor in the center of your vision at all times? If I had a mouse/trackpad working like that it would immediately be replaced but that's Apple's eye control. Imagine scrolling a page and everywhere you glance there's a spotlighted/popup control or ad? That's Apple eye control utilizing dwell and snap to item.

                    Nobody has responded to those but instead keep saying I'm attacking accessibility.

                    • cush 93 days ago
                      I don't think the cursor is an issue
    • kube-system 96 days ago
      Yeah, disability can be frustrating. But good on Apple for giving people some options here.
      • canuckintime 96 days ago
        Hey, I did not say otherwise. It's a good accessibility feature. It becomes frustrating when Apple makes it the main control scheme, as on the Vision Pro