So like, I have no idea what proportion of my music listening data Apple is getting or using productively, but I do baaaaaasically have the musical lifestyle of my dreams right now. If the whole point of the algorithmic part is to find new stuff, I probably shouldn’t be seeding it with 10 years of data about old stuff; eventually it’ll learn which new stuff I like, and the recommendations will just keep getting better. And you know what? When people follow me on Apple Music, I check out their profile, and if I like the look of their sounds, I follow them back.
My question is, are we eventually going to be able to post stuff on Apple Music? Will we be able to share music into some kind of feed, maybe with a comment, surely with some emoji? Will we be able to post photos? Location? Fitness data? Other life story stuff alongside what we’re listening to right now?
Much more importantly, what about our own music? If I record a song in Logic or GarageBand — or, hell, any music app — will I be able to post it for my followers to hear? Will I be able to sell it?
Cue revealed that HomePod will adjust its sound to different music without offering user-facing bass and treble controls. We already knew that HomePod tunes itself based on its position in a room, but Cue described Apple’s smart speaker adjusting its output even when it stays in one place.
So how do Cardiogram’s algorithms make good guesses without directly measuring the amount of sugar in someone’s blood? Nobody really knows.
“Diabetes is very clearly a cardiovascular condition, but it’s not one with an obvious physiological connection to heart rate variability,” says Mark Pletcher, one of the principal investigators of the Health eHeart study and a co-author on the paper presented Wednesday. When you train machine learning algorithms on data without knowing the mechanisms behind the underlying patterns, you often get a signal without understanding why. “It makes me nervous, frankly. We’ve had a lot of internal discussions about whether this could be picking up medications diabetics use or some other extraneous factor. But we haven’t come up with anything.”
Someone just posted to GitHub what experts say is the source code for a core component of the iPhone’s operating system, which could pave the way for hackers and security researchers to find vulnerabilities in iOS and make iPhone jailbreaks easier to achieve.
The GitHub code is labeled “iBoot,” which is the part of iOS that is responsible for ensuring a trusted boot of the operating system. In other words, it’s the program that loads iOS, the very first process that runs when you turn on your iPhone. It loads and verifies the kernel is properly signed by Apple and then executes it—it’s like the iPhone’s BIOS.
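The chain of trust described above — load the kernel, check that Apple signed it, and only then hand over control — can be sketched in miniature. This is an illustrative model only: the function names are made up, and the HMAC here stands in for Apple’s actual public-key signature scheme.

```python
import hashlib
import hmac

# Stand-in for Apple's signing key; real iBoot verifies a public-key
# signature baked into the hardware root of trust, not a shared secret.
SIGNING_KEY = b"not-the-real-key"

def sign_kernel(kernel_image: bytes) -> bytes:
    """Produce a signature for a kernel image (models Apple's signing step)."""
    return hmac.new(SIGNING_KEY, kernel_image, hashlib.sha256).digest()

def boot(kernel_image: bytes, signature: bytes) -> str:
    """Refuse to run any kernel whose signature doesn't check out."""
    if not hmac.compare_digest(sign_kernel(kernel_image), signature):
        raise RuntimeError("kernel signature invalid: refusing to boot")
    # In a real boot stage, control would jump into the kernel here.
    return "kernel executed"

kernel = b"darwin-kernel-image"
print(boot(kernel, sign_kernel(kernel)))  # kernel executed
```

The point of the sketch is why the leak matters: the verification logic itself is what researchers can now read line by line, looking for a path around that `if`.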
In iOS 11, with Live Photos enabled, the Camera app actually captures 1.5 seconds of video before you press the shutter button. So when you choose a new key photo, it’s like editing your photo with a highly precise time machine that can show you every moment of the second and a half before the photo was taken.
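Capturing video from *before* the button press is less magical than it sounds: the camera is already recording into a rolling buffer, and the shutter just freezes it. A minimal sketch of that idea, assuming a frame rate and buffer length for illustration (these are not Apple’s actual internals):

```python
from collections import deque

FPS = 30            # assumed camera frame rate
PRE_SECONDS = 1.5   # Live Photos keeps roughly 1.5 s before the shutter

class PreCaptureBuffer:
    """Ring buffer that always holds the most recent PRE_SECONDS of frames."""

    def __init__(self):
        # deque with maxlen silently drops the oldest frame on overflow
        self.frames = deque(maxlen=int(FPS * PRE_SECONDS))

    def on_frame(self, frame):
        # Called for every frame the camera produces, even before the
        # user touches anything; old frames fall off the far end.
        self.frames.append(frame)

    def shutter_pressed(self):
        # Whatever is in the buffer right now becomes the pre-roll.
        return list(self.frames)

buf = PreCaptureBuffer()
for frame_number in range(100):   # camera has been running for 100 frames
    buf.on_frame(frame_number)
pre_roll = buf.shutter_pressed()
print(len(pre_roll), pre_roll[0])  # 45 55
```

Because the buffer is always full of the freshest frames, “choosing a new key photo” is just picking a different index into it, which is exactly the time-machine feeling the editing UI gives you.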
Kap is an open source screen recording app for macOS built with web technologies. It sits in your menu bar and offers a number of presets for starting a recording in a few clicks: pick a preset or an app from the list, click the record button, adjust the capture area if needed, and start recording.
Today, Apple sent developers a note saying that a processing error caused the problem, and that in the future it will send alerts only by email; developers will have to log in to their accounts to see any actual numbers or other details.
In December, I converted my one-bedroom apartment in San Francisco into a “smart home.” I connected as many of my appliances and belongings as I could to the internet: an Amazon Echo, my lights, my coffee maker, my baby monitor, my kid’s toys, my vacuum, my TV, my toothbrush, a photo frame, a sex toy, and even my bed.
Why? Why would I do this? For convenience? Perhaps. It was appealing to imagine living like the Beast in the Disney movie, with animated objects around my home taking care of my every need and occasionally serenading me. As a result of the apartment upgrade, I could watch what was happening in the house when we weren’t there. I could use voice commands to turn on the lights, coffee maker, and music. I could exchange voice messages with our toddler (and her caregiver) through a toy. I got reminders from my toothbrush to brush and tips on how best to do it. If I got cold in the night, my bed could warm me up. And I no longer had to push a vacuum around the house, instead activating a robot to do it for me with a press of a smartphone button.
Thanks to the Internet of Things, I could live in my very own tech-mediated Downton Abbey. That’s the appeal of smart homes for most people, and why they’re projected to be a $27 billion market by 2021. But that wasn’t my primary motivation. The reason I smartened up my house was to find out whether it would betray me.
I don't need a smart home. I just need a smarter brain.
Thanks for reading.