November 10, 2017
Part II of my iPhone X Review. Check out Part I
FaceID setup was over before I remembered to pay much attention to it. The process was incredibly easy, even if it made me feel a bit silly. It happened during device setup, and while I've wanted to go back and reset FaceID so I can pay closer attention to the process, I'm holding back because, unlike TouchID, FaceID is backed by machine learning: not only the original training images but every successful unlock afterwards contributes to its education. After all, while our fingerprints rarely change, our faces are always changing and growing. Regardless, the process was much easier than TouchID setup which, while simple and clear, required far more attention paid to make sure the scanner was getting a good reading of your fingers. FaceID, in comparison, required only two rotations of my head, and then I was all set.
I remember when the iPhone 5S first came out there were articles about how you could trick the sensor into recognizing 10 fingers, instead of the usual 5, by swapping in a second finger during the scanning process. I would be curious to see if something like that would work with FaceID: I do the first face rotation and swap in my wife for the second. I imagine that would have non-trivial security implications, but I think I could live with reducing the reliability of FaceID back into the range of TouchID for the benefit of not having my wife ask for my passcode every time she wants to access my phone. We both made sure to set up the other's fingerprint on our phones in the past, which I imagine many couples do, so it would be great to regain that lost convenience.
FaceID has otherwise been working really well for me. I've heard a lot of commentators use the word 'flawless' and then describe all of the restrictions and caveats, so I'm gonna try to avoid that. Bottom line: FaceID works, and the new benefits easily make up for the compromises compared to TouchID. So what compromises are those?
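From the developer side, apps don't talk to FaceID directly; the same LocalAuthentication calls that handled TouchID now surface Face ID, with a new `biometryType` property to tell them apart. A minimal sketch (the prompt string and print statements are just for illustration):

```swift
import LocalAuthentication

// Ask the system which biometry is available before showing
// "Unlock with Face ID" vs "Unlock with Touch ID" in the UI.
let context = LAContext()
var error: NSError?

if context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics, error: &error) {
    switch context.biometryType {
    case .faceID:
        print("Face ID available")
    case .touchID:
        print("Touch ID available")
    default:
        print("No biometry enrolled")
    }

    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your data") { success, _ in
        // success is true after a biometric match; on failure the app
        // would typically fall back to asking for the passcode.
        print(success ? "Unlocked" : "Failed")
    }
} else {
    print("Biometry unavailable: \(error?.localizedDescription ?? "unknown")")
}
```

The nice part is that existing TouchID code paths pick up FaceID for free; the `biometryType` check only matters for wording the UI correctly.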
First of all there's the mentioned restriction to one face, but also there are positions I use my device in that just don't work with FaceID. One is when I'm lying on my side in bed and holding my phone in portrait orientation. Another is when I have my phone resting on a table and want to look over and read some notifications or check Twitter. I can lean over the device to unlock it (tap to wake makes this experience so much easier than it sounds, by the way), but sometimes I'm not quick enough or not at the right angle to get it right the first time and it requires some more work (more on that later).
The last big case where I notice FaceID consistently doesn't work is when I'm using my phone with my glasses off. It's not that it doesn't recognize me without them (it works fine with contacts, for instance), but I am very nearsighted, to the point where using my phone with my natural eyesight means holding it so close to my face that it becomes easier to interact with the screen with my nose than to squeeze a finger between my phone and my face. When your face is that close to the device FaceID just isn't able to get a good enough look to be certain it's you, so I find myself entering my passcode with my nose a lot first thing in the morning.
But there are many benefits to FaceID that augment the TouchID experience. For one, while I really liked the lock screen changes in iOS 10 that gave it distinct behavior between locked and unlocked states I felt that the action of unlocking was too subtle to do consistently. You had to make sure the screen was on and rest your finger on the TouchID sensor without pressing it to unlock the device, and you would know it works because a tiny lock icon in the status bar would animate to an unlocked position. That experience is far easier to trigger now that the acts of unlocking your phone with a glance and opening the lock screen with a swipe are two distinct actions which can be performed either together or independently.
With winter finally coming to Boston, I also got to experience the amazing benefit of using my phone with gloves on this week. With TouchID, if I was using my phone a lot I got into the habit of keeping my gloves on but leaving my thumb out in the cold air so I could trigger the sensor. The tech gloves I use are pretty good but fail at entering a passcode quite often. With FaceID the process is just a simple swipe while the sensors authenticate me, scarf, hat, and all.
I haven't been able to trip FaceID up with any accessory changes. It works with my glasses on or off, with my highly reflective polarized MVMT sunglasses, with various hats, etc. It also has a built-in method of training: if it fails to recognize you with enough accuracy to unlock the device but you then unlock it with your passcode, it will add that new reading of your face to its training set, allowing it to actually learn from its mistakes and get better over time. This is the reason I've been hesitant to reset FaceID and try to train my wife's face along with mine; I don't want it to lose the work it's already done to get to know me. This is something I need to get better at, though. When it does fail, for whatever reason, I find myself putting the device to sleep for a moment and waking it back up to try again. Most of the time that's enough to get it to work, but then I remember that the right thing to do would be to just enter my passcode. Maybe once the novelty of unlocking with my face wears off I'll remember to fall back to the passcode, and hopefully that happens less and less often.
FaceID is one part of the iPhone X's new camera array, which includes two iSight cameras on the back of the device for the first time in a non-Plus iPhone model, and the FaceTime camera augmented by FaceID's IR sensor array. For the first time both the iSight and FaceTime cameras are able to produce Portrait mode photos with the digital bokeh effect which debuted last year with the 7 Plus model, and man is it fun. I was really bummed to miss out on this in 2016 when I switched from the Plus model to the 4.7” iPhone 7; the dual-camera system was the only thing that ever tempted me about the larger iPhone 7 Plus. Looking forward to the iPhone X, the new cameras were by far the thing I was most excited about.
I’m certainly not a professional, or even a well-educated amateur, photographer by any means, but I’ve always enjoyed taking photos. I would take my family’s DSLR camera out and spend hours finding little details of our yard to capture. For the first time in years that habit has returned. The photos are coming out beautifully, and because of the new HEIC (High Efficiency Image Container) image format in iOS 11, which stores the original photo and the associated metadata and depth map to render the photo with various effects, every picture I take can be edited after the fact to remove portrait mode, change the portrait effects, or to crop or edit the image in a non-destructive manner. So even when a Portrait image turns out not so great I can just go back and change it after the fact to get a standard image.
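The after-the-fact editing works because the depth map is captured alongside the image rather than baked into it. For apps that want to participate, iOS 11's capture API exposes this as an opt-in; a rough sketch (session and device wiring elided, names as in AVFoundation):

```swift
import AVFoundation

// Opt into depth-map delivery on a photo output. This only succeeds when
// the output is attached to a session using a depth-capable camera
// (the dual iSight cameras or the TrueDepth FaceTime camera).
let photoOutput = AVCapturePhotoOutput()

if photoOutput.isDepthDataDeliverySupported {
    photoOutput.isDepthDataDeliveryEnabled = true
}

// HEVC-encoded capture is what lands on disk as a .heic file,
// keeping the image, its metadata, and the depth map together.
let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
```

Because the depth data travels with the photo, the Portrait effect is just a render on top of the original pixels, which is exactly why Photos can remove or re-apply it non-destructively.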
Portrait mode certainly still seems to have trouble in non-standard situations, standard here meaning actually taking a portrait of a person. The machine learning that powers the technology is tuned to focus on human faces, so taking a photo of my cat in profile for instance creates many false edges that stick out like a sore thumb if you look for them. But in most cases, even when not taking an actual portrait, the photos certainly have that ‘good enough’ feel to them. Anecdotally, Portrait mode photos are the only thing my wife absolutely loves about the new phone compared to her iPhone 8.
I won't spend too much more time here giving my uninformed opinion or comparing the photos to what you would get from an SLR camera with a larger lens and body. Suffice it to say that I'm incredibly happy with my iPhone X cameras and have very much enjoyed this new opportunity to get back into Instagram, and it's so nice being able to zoom to 2X without destroying the photo quality.
The awesome photos go so well with this new incredible OLED screen. I’m not all that observant of color profiles and such, and I don’t have a handful of other devices laying around to compare the screen to, so I’ll just talk about what I can notice on my iPhone X.
I watched Atomic Blonde over the week to check out some HDR content. That is a very dark movie and the black level was very impressive. The colors that did peek through were so vibrant it made it a really fun experience to watch even on my phone. And the lack of any ‘depth’ between the glass and display is hard to believe. These two things together create the illusion when I look at the infamous ‘notch’ (or ‘sensor housing’) that it’s just made of black pixels resting at the same level as the rest of the screen.
Much has been made of the Pixel 2 XL’s OLED screen color-shifting, and the burn-in noticeable after just a week of using the device. Those are known concerns on any OLED display, and things that Apple and Google both had to keep in mind when developing these devices. So how’d they do?
Well, by all accounts Apple has tuned their displays to reduce the shifting color profile as much as possible. According to reports Apple is individually(!!!) calibrating each display that comes off of the assembly line, rather than just creating a calibration profile intended to work with all iPhone X OLED displays. This attention to detail has resulted in practically no red-shifting, but blue-shifting quickly becomes visible viewing 30-40 degrees off center. No one at work has a Pixel 2 XL yet though, so I can't compare how much better or worse it is at that.
As for burn-in, it looks like Apple is being very proactive with the software to reduce the risks. Android is not helped by its persistent software buttons at the bottom of the screen, which is allegedly where burn-in on the Pixel 2 XL is most noticeable, but now that Apple has the Home Indicator at the bottom of the screen pretty much at all times it's something to watch here as well. The software is working overtime, though, subtly moving and recoloring the indicator to avoid the issue, which should certainly help. And other features of iOS that used to be 'nice to haves' are now critical for display longevity, such as perspective-shifting wallpapers and auto-brightness adjustment.
I've been a fan of Night Shift since it debuted in iOS 9.3, and have never considered turning it off once. With True Tone arriving on the iPhone X after two years on the iPad Pro lineup, the iPhone finally adjusts light temperature to match the ambient room lighting automatically, an effect that in my opinion doesn't work well at all with Night Shift enabled. For the first time, using Night Shift to read in bed after the lights had gone off caused more eye strain than was comfortable, but after disabling it and just using True Tone everything seems fine. I'll experiment more with this, potentially with a less warm setting, but it seems like with True Tone my days of using Night Shift are over.
Aside from a single experiment in the Plus Club with the 6S Plus I’ve stuck with the 4.7” iPhone model since 2014, and have gotten very used to the size. When I had the 5.5” phone I really enjoyed having the ‘Regular’ size class while in landscape and tried to use the phone in landscape whenever I could. I liked the multi-pane view and made sure to keep Plus size landscape screens in mind whenever I designed an app. But for all other uses, on the train, in bed, while walking, the phone was just a bit too much to keep a good grip on. It was hard going back to the 4.7” screen, because at the time it was clear that there were compromises in getting the smaller screened phone.
Finally, with the iPhone X, not only have they brought these features typically restricted to the largest phones down to a form factor more or less the same as the standard-sized devices, but because of the higher price point we also get new features, like OIS (Optical Image Stabilization) on the telephoto iSight lens, which the Plus phones could probably have fit, but at too high a cost for Apple to maintain its desired profit margins.
The difference in device size compared to an iPhone 7 or 8 is hard to notice. It's slightly wider, but still feels really comfortable to hold in my hand. And the fact that nearly every square centimeter of the device is screen is amazing. Every now and then I will look down at the device while the screen is off and it just doesn't seem right how big I know the screen is.
That being said, I'm honestly not finding the screen all that noticeably bigger during everyday use. Because it has the same width in points as the iPhone 7, and because a significant portion of the extra screen height is taken up by the Home Indicator, padding, and the extra-tall status bar, the iPhone X still has a 'Compact' size class when running in landscape, even though it technically has a few more usable points of vertical real estate than the 5.5” screen devices. It's not the biggest deal; rich landscape support wasn't a thing I desperately missed after switching back to smaller phones, but it's a 'nice to have' that I really enjoyed and whose absence I feel on the iPhone X. When comparing the phone to my wife's iPhone 8, I can see that I'm basically getting an extra table view cell at a time, or a couple dozen extra points of space in other apps, but it's not enough to be a noticeable difference in everyday use for me.
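In code, this means any layout that branches on size class treats the iPhone X exactly like the 4.7" phones. A quick sketch (the view controller name is hypothetical):

```swift
import UIKit

// The iPhone X reports a Compact horizontal size class in landscape,
// unlike the 5.5" Plus models, which report Regular. So any size-class
// branch groups it with the smaller phones.
class FeedViewController: UIViewController {
    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)

        if traitCollection.horizontalSizeClass == .regular {
            // Plus-sized phones in landscape (and iPads) land here,
            // e.g. to show a multi-pane layout.
        } else {
            // The iPhone X in landscape stays here, with the
            // single-pane Compact layout.
        }
    }
}
```

This is why apps that shipped Plus-style landscape layouts don't show them on the X; the trait environment never asks for them.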
Where the extra real estate really shines is on the home screen, when watching videos in overflow mode (like previous iPhone models, videos have two scale modes: fit the screen without losing any video content, or fill every pixel of the screen even if a bit of the video overflows), looking at photos, and reading long content (such as this review!) in Safari or News. As nice as those experiences are, though, I still find myself slightly underwhelmed by what apps are doing with the largest display Apple has ever shipped on an iPhone.
That’s all for now, Part III coming soon!