Let me cut to the chase: sadly, I don’t have a new iPad Pro to review today on MacStories.
I was able to try one in London last week, and, as I wrote, I came away impressed with the hardware. However, I didn’t get a chance to use a new iPad Pro over the past six days ahead of today’s review embargo.
I know that many of you were expecting a deeper look at the iPad Pro on MacStories this week, but that will have to come later. I still plan on upgrading to a 13” iPad Pro myself; I’ve decided I want to return to the larger size after a few months with the 11” iPad Pro. If you’re interested in checking out reviews of the new iPad Pros from heavy iPad users like yours truly right now, I highly recommend reading and watching what my friends Jason Snell and Chris Lawley have prepared.
Still, as I was thinking about my usage of the iPad and why I enjoy using the device so much despite its limitations, I realized that I have never actually written about all of those “limitations” in a single, comprehensive article. In our community, we often hear about the issues of iPadOS and the obstacles people like me run into when working on the platform, but I’ve been guilty in the past of taking context for granted and assuming that you, dear reader, also know precisely what I’m talking about.
Today, I will rectify that. Instead of reviewing the new iPad Pro, I took the time to put together a list of all the common problems I’ve run into over the past…checks notes…12 years of working on the iPad, before its operating system was even called iPadOS.
My goal with this story was threefold. First, as I’ve said multiple times, I love my iPad and want the platform to get better. If you care about something or someone, sometimes you have to tell them what’s wrong so they can improve and find a new path forward. I hope this story can serve as a reference for those with the power to steer iPadOS in a different direction in the future.
Second, lately I’ve seen some people argue on Mastodon and Threads that folks who criticize iPadOS do so because their ultimate goal is to have macOS on iPads, and I wanted to clarify this misunderstanding. While I’m on the record as thinking that a hybrid macOS/iPadOS environment would be terrific (I know, because I use it), that is not the point. The reality is that, regardless of whether macOS runs on iPads or not, iPadOS is the ideal OS for touch interactions. But it still gets many basic computing features wrong, and there is plenty of low-hanging fruit for Apple to pick. We don’t need to talk about macOS to cover these issues.
Lastly, I wanted to provide readers with the necessary context to understand what I mean when I mention the limitations of iPadOS. My iPad setup and workflow have changed enough times over the years that I think some of you may have lost track of the issues I (and others) have been experiencing. This article is a chance to collect them all in one place.
Last month, AltStore was finally made available on iOS for everyone living in the European Union. Not only does the first alternative app marketplace on iOS ship with the great Delta videogame emulator, but it also lets you install Clip, a clipboard manager unlike any other on the iPhone.
What makes the app unique is simple: it’s the first clipboard manager on the iPhone that can actually run in the background and continuously monitor your clipboard, regardless of the app you’re in. And while the app is pretty bare-bones right now, this core ability alone makes a huge difference in everyday use, enough to crown Clip the best clipboard manager to ever ship on iOS.
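As a rough illustration of what that background monitoring involves, here’s a minimal Swift sketch of the polling approach a clipboard manager could use while its process is alive. The `ClipboardMonitor` class and the one-second interval are illustrative assumptions, not Clip’s actual implementation, and the sketch doesn’t cover the genuinely hard part: keeping the process alive in the background at all.

```swift
import UIKit

// Hypothetical sketch only; not Clip's actual code.
// While the process is running, compare UIPasteboard's changeCount
// on a timer and grab new text when it increments.
final class ClipboardMonitor {
    private var lastChangeCount = UIPasteboard.general.changeCount
    private var timer: Timer?

    func start(onNewText: @escaping (String) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            let pasteboard = UIPasteboard.general
            // changeCount increments whenever the pasteboard's contents change.
            guard pasteboard.changeCount != self.lastChangeCount else { return }
            self.lastChangeCount = pasteboard.changeCount
            if let text = pasteboard.string {
                onNewText(text) // e.g. save the snippet to the app's history
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```

Worth noting: iOS also surfaces a system banner (and, on recent versions, a permission prompt) when an app reads the pasteboard programmatically, which is one reason a truly continuous clipboard history has been so hard to pull off on the iPhone.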
Apple Music kicked off a 10-day event today celebrating its newly compiled list of the 100 Best Albums of all time. Apple’s press release explains that the list was created by:
Apple Music’s team of experts alongside a select group of artists, including Maren Morris, Pharrell Williams, J Balvin, Charli XCX, Mark Hoppus, Honey Dijon, and Nia Archives, as well as songwriters, producers, and industry professionals.
Apple also clarifies that its 100 Best Albums list is an editorially created list that isn’t based on streaming statistics.
There are multiple ways to navigate Apple Music’s 100 Best Albums list.
Each day, Apple Music is revealing 10 new albums, starting with albums 91-100.
Rachel Newman, Apple Music’s senior director of content and editorial, had this to say about the list:
100 Best brings together all the things that make Apple Music the ultimate service for music lovers — human curation at its peak, an appreciation for the art of storytelling, and unparalleled knowledge of music and an even deeper love for it. We have been working on this for a very long time, and it’s something we are all incredibly proud of and excited to share with the world.
Each album includes editorial content too.
The 100 Best Albums list can be accessed at 100best.music.apple.com. From there, you can listen on the web, add an album to your library, share it, or stream it with the Music app. Each album has its own page including written material that puts the album into context and lists each track, too.
Apple is also celebrating the 100 Best Albums list on Apple Music Radio and giving the creators of each album an award:
All 100 Best Albums recipients will be given an award comprised of blasted anodized aluminum, sourced entirely from recycled Apple products, in a unique polished PVD gold. The design on the back of the award takes its cues from a vinyl LP record and is inscribed with the artist’s name, the album title, and the album’s year of release.
I’ve enjoyed browsing through the first ten albums in this collection and appreciate that it’s being rolled out in stages, allowing listeners to explore a manageable number of albums each day. This will be a nice treat to look forward to for the next nine days.
Now, don’t get offended, but you aren’t as good at clocking deepfakes as you think you are.
And it’s not just you; nobody’s that good at it. Not your mom, or your boss, or anyone in your IT department.
To make matters worse, you probably think you can spot a fake. After all, you see weird AI-generated videos of celebrities on social media and they give you that uncanny valley tingle. But it’s a different ballgame when all you’ve got to go on is a voice.
In real life, people only catch voice clones about 50% of the time. You might as well flip a coin.
And that makes us extremely vulnerable to attacks.
In the “classic” voice clone scam, the caller is after an immediate payout (“Hi it’s me, your boss. Wire a bunch of company money to this account ASAP”). Then there are the more complex social engineering attacks, where a phone call is just the entryway to break into a company’s systems and steal data or plant malware (that’s what happened in the MGM attack, albeit without the use of AI).
As more and more hackers use voice cloning in social engineering attacks, deepfakes are becoming such a hot-button issue that it’s hard to tell the fear-mongering (for instance, it definitely takes more than three seconds of audio to clone a voice) from the actual risk.
To disentangle the true risks from the exaggerations, we need to answer some basic questions:
How hard is it to deepfake someone’s voice?
How do hackers use voice clones to attack companies?
And how do we guard ourselves against this… attack of the clones?
Like a lot of modern technologies, deepfake attacks actually exploit some deep-seated fears. Fears like, “your boss is mad at you.” These anxieties have been used by social engineers since the dawn of the scam, and voice clones add a shiny new boost to their tactics.
But the good news is that we can be trained to look past those fears and recognize a suspicious phone call, even if the voice sounds just like someone we trust.
If you want to learn more about our findings, read our piece on the Kolide blog. It’s a frank and thorough exploration of what we should be worried about when it comes to audio deepfakes.
Our thanks to Kolide for sponsoring MacStories this week.
On Tuesday, Apple introduced its new iPad Pros with a video called Crush! that was meant to convey how much the device can do. The trouble was the way the video delivered the message, depicting musical instruments, books, a record player, paints, a TV, and many other creative tools being crushed by a hydraulic press. When the press opened, it revealed the new iPad Pro.
Crush! was widely criticized by the creative community, including actor Hugh Grant, director Reed Morano, and many others. Within hours, the story had spread beyond the tech industry to all corners of the mainstream media.
Today, as reported by The Verge, Apple vice president of marketing Tor Myhren made a statement to Ad Age apologizing for the video, saying:
Creativity is in our DNA at Apple, and it’s incredibly important to us to design products that empower creatives all over the world. Our goal is to always celebrate the myriad of ways users express themselves and bring their ideas to life through iPad. We missed the mark with this video, and we’re sorry.
Earlier today, Federico and I covered the firestorm caused by the video on MacStories Unwind+ for Club MacStories members. During the episode, which will be generally available tomorrow, Federico predicted this outcome, and I think an apology is the right move given the strong, widespread reaction to the video.
This week’s “Let Loose” Apple event was filmed on the iPhone and edited on the Mac and iPad. During the event, filmmaker Stu Maschwitz noticed that some scenes featured a shallower depth of field than is possible with the iPhone’s cameras. Although he doesn’t cite a source, Maschwitz says he figured out how Apple got those shots:
“Let Loose” was shot on iPhone 15 Pro Max, and for several shots where a shallow depth-of-field was desired, Panavision lenses were attached to the iPhones using a Panavision-developed mount called the “Lens Relay System.” This rig is publicly available for rent from Panavision today, although not currently listed on their website.
With Panavision’s new system, the iPhone’s own lens captures the aerial image created by any Panavision lens you like. The iPhone provides the image capture, in ProRes Apple Log, of course.
In fact, “Let Loose” is the first Apple Event finished and streamed in HDR, pushing the iPhone’s capture abilities even further than “Scary Fast.”
Or think of it this way: Apple confidently intercut footage shot with the most elite cinema lenses available with footage shot with unadorned iPhone lenses.
I appreciate Maschwitz’s perspective on the capabilities of the iPhone’s cameras. Having rewatched this week’s event a couple of nights ago, I would never have suspected it was shot on a mobile phone if I hadn’t known to look for the note at the end of the video.