John Voorhees

906 posts on MacStories since November 2015

John is MacStories' Managing Editor. He has been writing about Apple and apps since joining the team in 2015 and today runs the site alongside Federico. John also co-hosts four MacStories podcasts: AppStories, which covers the world of apps; MacStories Unwind, which explores the fun differences between American and Italian culture and recommends media to listeners; Ruminate, a show about the weird web and unusual snacks; and NPC: Next Portable Console, a show about the games we take with us.



An iPad Pickle

This week, Federico and John reflect on where the iPad fits within their workflows after the announcement of iPadOS 26.

Then, on AppStories+, they explore the potential for an Apple automation renaissance built on the features announced at WWDC.


We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.


AppStories Episode 442 - An iPad Pickle (29:14)


Rethinking Where the iPad Fits After iPadOS 26


Leave Feedback for John and Federico

Follow us on Mastodon

Follow us on Bluesky


WWDC 2025: The Benefits of Not Trying Too Hard

This year’s WWDC was very different from last year’s. It’s tempting to say it was a return to form, but I don’t think that’s entirely it. What made it so much better is that Apple stopped trying too hard. So much of WWDC 2024 felt off. The perception was that Apple was behind on AI,...


App Debuts

Pixeldrop is a new iPhone and iPad app for pixelating or redacting sections of an image. I’ve done this for years with Annotable, which is still my go-to, but if you’re looking for a simple and free alternative, Pixeldrop works well. Controller for HomeKit, which is available on the iPhone, iPad,...


Swift Assist, Part Deux

At WWDC 2024, I attended a developer tools briefing with Jason Snell, Dan Moren, and John Gruber. Later, I wrote about Swift Assist, an AI-based code generation tool that Apple was working on for Xcode.

That first iteration of Swift Assist caught my eye as promising, but I remember asking at the time whether it could modify multiple files in a project at once and being told it couldn’t. What I saw was rudimentary by 2025’s standards, compared to tools like Cursor, but I was glad to see that Apple was working on a generative tool for Xcode users.

In the months that followed, I all but forgot that briefing and story, until a wave of posts asking, “Whatever happened to Swift Assist?” started appearing on social media and blogs. John Gruber and Nick Heer picked up on the thread and came across my story, citing it as evidence that the MIA feature was real but curiously absent from any of 2024’s Xcode betas.

This year, Jason Snell and I had a mini reunion of sorts during another developer tools briefing. This time, it was just the two of us. Among the Xcode features we saw was a much more robust version of Swift Assist that, unlike in 2024, is already part of the Xcode 26 betas. Having been the only one who wrote about the feature last year, I couldn’t let the chance to document what I saw this year slip by.




Hands-On: How Apple’s New Speech APIs Outpace Whisper for Lightning-Fast Transcription

Late last Tuesday night, after watching F1: The Movie at the Steve Jobs Theater, I was driving back from dropping Federico off at his hotel when I got a text:

Can you pick me up?

It was from my son Finn, who had spent the evening nearby and was stalking me in Find My. Of course, I swung by and picked him up, and we headed back to our hotel in Cupertino.

On the way, Finn filled me in on a new class in Apple’s Speech framework called SpeechAnalyzer and its SpeechTranscriber module. Both the class and module are part of Apple’s OS betas that were released to developers last week at WWDC. My ears perked up immediately when he told me that he’d tested SpeechAnalyzer and SpeechTranscriber and was impressed with how fast and accurate they were.

It’s still early days for these technologies, but I’m here to tell you that their speed alone is a game changer for anyone who uses voice transcription to create text from lectures, podcasts, YouTube videos, and more. That’s something I do multiple times every week for AppStories, NPC, and Unwind, generating transcripts that I upload to YouTube because the site’s built-in transcription isn’t very good.

What’s frustrated me with other tools is how slow they are. Most are built on Whisper, OpenAI’s open source speech-to-text model, which was released in 2022. It’s cheap at under a penny per one million tokens, but isn’t fast, which is frustrating when you’re in the final steps of a YouTube workflow.

An SRT file generated by Yap.

I asked Finn what it would take to build a command line tool to transcribe video and audio files with SpeechAnalyzer and SpeechTranscriber. He figured it would only take about 10 minutes, and he wasn’t far off. In the end, it took me longer to get around to installing macOS Tahoe after WWDC than it took Finn to build Yap, a simple command line utility that takes audio and video files as input and outputs SRT- and TXT-formatted transcripts.
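
To give a sense of what’s involved, here’s a minimal sketch of a file-to-text function built on SpeechAnalyzer and SpeechTranscriber. This isn’t Yap’s actual code; the initializer parameters, asset-installation call, and result handling are assumptions based on the current betas, so the specifics may differ.

    import Foundation
    import AVFoundation
    import Speech   // SpeechAnalyzer and SpeechTranscriber live in the Speech framework

    // Hypothetical helper, not Yap's code: transcribe a local audio file with the
    // new SpeechAnalyzer APIs from the macOS Tahoe beta. Names and parameter
    // labels follow the beta as I understand it and may change.
    func transcribe(fileAt url: URL, locale: Locale = .current) async throws -> String {
        // The transcriber module turns audio into text; options are left empty here.
        let transcriber = SpeechTranscriber(locale: locale,
                                            transcriptionOptions: [],
                                            reportingOptions: [],
                                            attributeOptions: [])

        // Download the on-device model for this locale if it isn't installed yet.
        if let request = try await AssetInventory.assetInstallationRequest(supporting: [transcriber]) {
            try await request.downloadAndInstall()
        }

        // The analyzer drives one or more modules over an audio source.
        let analyzer = SpeechAnalyzer(modules: [transcriber])

        // Collect the transcriber's results as they stream in.
        let collector = Task {
            var text = ""
            for try await result in transcriber.results {
                text += String(result.text.characters)
            }
            return text
        }

        // Feed the whole file through the analyzer, then finalize.
        let audioFile = try AVAudioFile(forReading: url)
        if let lastSample = try await analyzer.analyzeSequence(from: audioFile) {
            try await analyzer.finalizeAndFinish(through: lastSample)
        } else {
            await analyzer.cancelAndFinishNow()
        }

        return try await collector.value
    }

A real tool also needs to handle video input and turn timed results into SRT cues; this sketch only covers the basic audio-to-text path.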

Yesterday, I finally took the Tahoe plunge and immediately installed Yap. I grabbed the 7GB 4K video version of AppStories episode 441, which is about 34 minutes long, and ran it through Yap. It took just 45 seconds to generate an SRT file. Here’s Yap ripping through nearly 20% of an episode of NPC in 10 seconds:


Next, I ran the same file through VidCap and MacWhisper, using the latter’s Large V2 and Large V3 Turbo models. Here’s how each app and model did:

App                           Transcription Time
Yap                           0:45
MacWhisper (Large V3 Turbo)   1:41
VidCap                        1:55
MacWhisper (Large V2)         3:55

All three transcription workflows had similar trouble with last names and words like “AppStories,” which LLMs tend to separate into two words instead of camel casing. That’s easily fixed by running a set of find and replace rules, although I’d love to feed those corrections back into the model itself for future transcriptions.
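
As a rough illustration of that cleanup step, a small script along these lines does the job; the correction rules and file name here are examples rather than my actual list.

    import Foundation

    // Example rules for names that transcription models tend to split into two words.
    let corrections: [String: String] = [
        "App Stories": "AppStories",
        "Mac Stories": "MacStories",
    ]

    // Example path; point this at whichever SRT or TXT file you generated.
    let url = URL(fileURLWithPath: "appstories-441.srt")
    var transcript = try String(contentsOf: url, encoding: .utf8)

    // Apply each rule to the whole transcript.
    for (wrong, right) in corrections {
        transcript = transcript.replacingOccurrences(of: wrong, with: right)
    }

    try transcript.write(to: url, atomically: true, encoding: .utf8)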

Once transcribed, a video can be used to generate additional formats like outlines.

What stood out above all else was Yap’s speed. By harnessing SpeechAnalyzer and SpeechTranscriber on-device, the command line tool tore through the 7GB video file a full 2.2× faster than MacWhisper’s Large V3 Turbo model, with no noticeable difference in transcription quality.

At first blush, the difference between 0:45 and 1:41 may seem insignificant, and it arguably is, but those are the results for just one 34-minute video. Extrapolate that to running Yap against the hours of Apple Developer videos released on YouTube with the help of yt-dlp, and suddenly, you’re talking about a significant amount of time. Like all automation, picking up a 2.2× speed gain one video or audio clip at a time, multiple times each week, adds up quickly.
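
Here’s a loose sketch of what that kind of batch loop could look like. The yt-dlp output flag is real, but the yap invocation is a guess at the tool’s interface, so check its GitHub page for the actual usage.

    import Foundation

    // Run a command line tool found on the PATH and wait for it to finish.
    func run(_ tool: String, _ arguments: [String]) throws {
        let process = Process()
        process.executableURL = URL(fileURLWithPath: "/usr/bin/env")
        process.arguments = [tool] + arguments
        try process.run()
        process.waitUntilExit()
    }

    // Placeholder URLs standing in for a list of session videos.
    let videos = [
        "https://www.youtube.com/watch?v=EXAMPLE1",
        "https://www.youtube.com/watch?v=EXAMPLE2",
    ]

    for (index, video) in videos.enumerated() {
        let file = "session-\(index).mp4"
        try run("yt-dlp", ["-o", file, video])   // download the video with yt-dlp
        try run("yap", [file])                   // hypothetical yap invocation
    }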

Whether you’re producing video for YouTube and need subtitles, generating transcripts to summarize lectures at school, or doing something else, SpeechAnalyzer and SpeechTranscriber – available across the iPhone, iPad, Mac, and Vision Pro – mark a significant leap forward in transcription speed without compromising on quality. I fully expect this combination to replace Whisper as the default model for transcription apps on Apple platforms.

To test Apple’s new model, install the macOS Tahoe beta, which currently requires an Apple developer account, and then install Yap from its GitHub page.


A Behind the Scenes Peek at WWDC Week

This week, Federico and John catch listeners up on their whirlwind WWDC week, which was chaotic in the best possible way.

On AppStories+, Federico and John get excited about what the WWDC announcements say about the direction of automation on Apple’s platforms.


We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.


AppStories Episode 441 - A Behind the Scenes Peek at WWDC Week (34:22)

This episode is sponsored by:

  • Notion – Try the powerful, easy-to-use Notion AI today.


More WWDC Coverage on MacStories


Leave Feedback for John and Federico

Follow us on Mastodon

Follow us on Bluesky


WWDC 2025: A First Look at Everything Apple Announced

For our second WWDC episode of AppStories, Federico and John dig into the details they’ve learned about what Apple announced this week at WWDC 2025.


We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.


AppStories Episode 440 - WWDC 2025: A First Look at Everything Apple Announced (57:14)

This episode is sponsored by:

  • Clic for Sonos – No lag. No hassle. Just Clic.
  • Elements – A truly modern, drag-and-drop website builder for macOS.


A Look at Everything Apple Announced at WWDC


Leave Feedback for John and Federico

Follow us on Mastodon

Follow us on Bluesky


WWDC 2025: The AppStories Interviews with Apple Design Award Winners

For their first WWDC 2025 AppStories episode, Federico and John interview finalists and winners of the Apple Design Award.


We deliver AppStories+ to subscribers with bonus content, ad-free, and at a high bitrate early every week.

To learn more about an AppStories+ subscription, visit our Plans page, or read the AppStories+ FAQ.


AppStories Episode 439 - WWDC 2025: The AppStories Interviews with Apple Design Award Winners (58:16)



Interviews with Apple Design Award Finalists and Winners


Leave Feedback for John and Federico

Follow us on Mastodon

Follow us on Bluesky