In my hand, the iPhone X feels very much like the iPhone 8—the glass makes it much easier to grip than the iPhone 6 or 7. This phone is a little bit wider than the 8 (3.5 mm, or 0.14 inch), and after a day I can already tell that I’m going to need to retrain some muscle memory and readjust my grip. Still, as someone who found the iPhone Plus design simply too wide for my hands (the Plus is 7.2 mm wider than the iPhone X), this phone feels comfortable.
If there’s an ergonomic issue I’m going to have with the iPhone X, it’s the height of the device, not its width. Holding my iPhone 8 in one hand, I can barely reach my thumb up to the top of the screen. Not only is the iPhone X 5 mm taller, but its screen extends almost all the way to the top. I can’t keep the bottom of the iPhone X braced with my pinky and use my thumb to tap items high up in the interface. I suppose over time I will either change how I hold the phone, get used to shimmying my hand up the phone to reach the top, or get used to not one-handing my iPhone as often as I currently do.
I really got a feel for how much the stabilization in the telephoto lens affected my shots when taking pictures of landmarks at night. These shots of the Guardians of the Galaxy tower really highlight the difference in sharpness that you see with a stabilized lens.
The second big way that a stabilized telephoto lens improves your images is in Portrait Mode, especially in anything but bright sunlight. The stabilized lens gives you more confidence to flip into Portrait Mode in any light that supports the feature. Adding stabilization essentially allows you to shoot all the way down to the low-light cap on the portrait effect itself, which is great.
As I bummed around the park testing the iPhone X, I found myself defaulting to the 2x mode a lot. It allowed for some great, sharp captures inside rides at a zoom that simply weren’t possible before. I’ve gotten lucky a handful of times with phones in the past, but never with a telephoto lens. The train vignettes, Pirates, and other rides are so incredibly dark and dramatically lit that they’re a huge stress test for a zoom lens on a phone. The results were very impressive.
Landscape mode on the iPhone X is generally pretty messy: the notch goes from being a somewhat forgettable element in the top status bar to a giant interruption on the side of the screen, and I haven’t seen any apps really solve for it yet. And the home bar at the bottom of the screen often sits over the top of content, forever reminding you that you can swipe to go home and exit the chaos of landscape mode.
The other problem is actually much more interesting: almost all of the early questions about Face ID centered on how it would work in the dark, but it turns out that was exactly backwards. Face ID works great in the dark, because the IR projector is basically a flashlight, and flashlights are easy to see in the dark. But go outside in bright sunlight, which contains a lot of infrared light, or under crappy fluorescent lights, which interfere with IR, and Face ID starts to get a little inconsistent.
I took a walk outside our NYC office in bright sunlight, and Face ID definitely had issues recognizing my face consistently while I was moving until I went into shade or brought the phone much closer to my face than usual. I also went to the deli across the street, which has a wide variety of lights inside, including a bunch of overhead fluorescent strips, and Face ID got significantly more inconsistent there, too.
For unlock and on-device authentication, it's so fast it's almost like it doesn't even exist. Where Touch ID was always unmistakably active — you knew you had to put your finger on the sensor — Face ID seems almost ambient. You look, therefore you've unlocked.
It's not perfect, though. The biggest problem people will have with it is that it requires attention. You really have to look at your phone to unlock it. Not think you're looking at it. Not kind of look at it. Really eye-of-the-tiger look at it. The problem with an attention-aware interface is that you absolutely have to be paying attention.
But seriously, so far, so great. I'm forgetting that Face ID is even there most of the time until I look and see iPhone X is already unlocked. Even with App Store apps like 1Password, Face ID is so fast it makes Touch ID suddenly seem slow and onerous by comparison.
So yeah, some app experiences are a little less than elegant right now. Thankfully, navigating through the iPhone X's interface is generally a breeze. Since there's no home button, cruising through iOS happens with a series of swiping gestures. Slide a finger across a bar at the bottom to switch between running apps, swipe up from the bottom of the screen and hold for a moment (you'll feel a haptic pulse) to display all of your currently running apps, or simply swipe up to go back to the home screen. Despite hitting the reset button on almost a decade of iPhone behavior, Apple has built a version of iOS that handily proves home buttons aren't necessary anymore.
That said, it's not perfect. Using the new app switcher seems just a hair slower than double-tapping the home button, and trying to close an app takes a little more effort than it should. Instead of swiping up on an app window to dismiss it, you have to press and hold the window, then tap a close button in the corner. The move was necessary since swiping up does something else now, but the process takes an extra, mildly annoying step. The stock iOS keyboard also has a lot of empty space beneath it, and while Apple uses that area for buttons that switch layouts and fire up voice dictation, it's pretty ugly.
So far, the biggest drawback to the extra screen space and lack of home button has been third-party app design. Many of my apps haven't been redesigned for the new screen size, so they show up with thick black bars on the top and bottom to mimic the same aspect ratio you'd get on a regular iPhone screen. It looks like a lot of wasted space.
Other apps have been refitted for the iPhone X screen, but have made a bunch of funky design choices. For example, some have large chunks of unused space at the bottom near the home bar. And I saw at least one app that showed the home bar bleeding into the menu icons at the bottom of the screen.
Apple's not totally innocent either. There were a few cases where I saw large chunks of unused space at the bottom of the screen in some of Apple's own in-house apps, such as the iPhone's built-in Mail app, especially when the keyboard popped up.
Simply put: Face ID is really fucking impressive. But that’s because it’s invisible.
You pick up your phone, swipe up, and you’re in. You open your password manager, a little orb swirls, and you’re in.
Android has had a face unlock feature since 2011 and Samsung introduced theirs earlier this year, but neither of those implementations work quite as seamlessly as Apple’s.
For a normal human who isn’t aware of the 30,000 invisible dots being projected on their face or the 3D map of their head encrypted somewhere deep inside their phone, there’s nothing “futuristic” about these interactions. Using Face ID is what life without a passcode — life before we all became paranoid technofreaks — felt like.
With the iPhone X, Apple has had to rethink many of iOS's core gestures. The new device features a brand new design with a taller display, Face ID, and no home button.
If you plan on buying a new iPhone X, it’s going to take a while to get used to these new metaphors. So here’s a list of some not-so-obvious features in the iPhone X.
New emojis. Do you need to hear anything else? Apple just released an iOS update. iOS 11.1 is the first feature update for iOS 11. It adds a couple of new things, starting with dozens of new emojis.
Finally, iOS 11.1 comes with a bunch of bug fixes and security updates. In particular, it fixes the KRACK vulnerability.
Messaging, Lists, and Notes are the only SiriKit domains that will work with HomePod at launch, but it's likely we'll see more added over time. This means the HomePod will be able to do things like send messages in Telegram, create tasks in Todoist or Things, and create notes in Evernote.
Mrs. Archer said that her daughter can use her iPad to talk about everything she likes and dislikes, but there are still some difficulties in communication.
“She doesn’t use it to say, ‘Hey Tammy, good to see you.’ I don’t know if that will ever come. I hope it will, that she doesn’t always need mom beside her to help her function in society, but we don’t know where she’ll end up at, or how that will work out.”
Image recognition was introduced with iOS 10 in June 2016, when the Photos app was updated with deep learning for object and scene detection. Apple stressed during the keynote, as well as on its website, that all object detection is done completely locally on the device.
In a Medium post, developer Kenny Yin detailed all of the facial expressions and objects the Photos app recognized, which he found through a few lines of code in the framework of macOS Sierra’s Photos app. At the time it was released, the app was capable of recognizing seven different facial expressions, and a total of 4,432 keywords, “brassiere” included.
The reviews definitely gave me confidence about Face ID. I think it will feel weird to use other still-has-a-home-button devices, such as iPads and iPad Pros, after using the iPhone X.
I am pretty sure I will continue to curse and swear at 'designers' who think that small gray fonts on a gray background are cool, even when reading on the iPhone X.
By skipping from the iPhone 6 to the iPhone X, I probably missed the golden era when anybody who mentioned the name Siri would trigger my phone.
Thanks for reading.