
Inside Apple: Sonic Accessibility

Original artwork by Michael Zhang.

This episode was written & produced by Nikolas Harter and Casey Emmerling.

If you want to know where the future of sound is headed, accessibility is a great place to look. And for decades, Apple has been leading the charge in accessible technology. In this episode, the Apple team breaks down the philosophy and craft behind their most impactful accessibility features. Along the way, we reveal how these innovations have transformed the way we interact with our devices, and could even lead to a revolution in hearing health. Featuring Sarah Herrlinger, Deidre Caldbeck, Ron Huang, and Eric Treski.

Enter the “Sound Off” Story Contest at 20k.org/soundoff. Submissions close on May 7th, 2025.

Get in touch with Apple’s accessibility team by writing accessibility@apple.com.

Explore the all-new Defacto Sound website, and click the Contact Form to get in touch.

If you know what this week's mystery sound is, tell us at mystery.20k.org.

Follow Dallas on Instagram, TikTok, YouTube and LinkedIn.

Join our community on Reddit and follow us on Facebook.

Follow You’ll Hear It, the #1 jazz podcast, on Apple Podcasts, Spotify or YouTube.

Sign up for a one-dollar-per-month trial at shopify.com/20k.

Cut your current cloud bill in half with OCI at oracle.com/20k.


You’re listening to Twenty Thousand Hertz. I’m Dallas Taylor.

[music in: Keith Kenniff - Senses]

I’ve always been fascinated with accessibility when it comes to sound. Because so many incredible innovations in the world of audio began as accessibility efforts.

Take for instance, voice commands and text to speech. They were originally designed for people with visual or motor impairments... But today, we take for granted that we can talk to our devices, and they can talk back to us. Closed captions were created for people with hearing impairments… Yet regardless of your hearing, many of us use captions all the time. Even audiobooks were originally created by blindness advocacy groups, way back in the 1930s. And now, audiobooks are a mainstream, multi-billion dollar industry.

The point is, if you want to know where the world of sound is headed, accessibility is a great place to look. I firmly believe that many of the greatest future achievements in sound will come from accessibility efforts now.

[music in: Keith Kenniff - Dearest]

Sarah Herrlinger: Accessibility is something that's incredibly important to us. It is a part of the process in everything that we build.

That's Sarah Herrlinger.

Sarah Herrlinger: I'm the Senior Director of Global Accessibility Policy and Initiatives at Apple.

I met with Sarah at Apple Park in Cupertino, California.

Sarah Herrlinger: My team's job is to ensure that every way that Apple presents itself to the world, whether that be through our products, our services, our stores, our workplace, events, you name it, that we are living our core value of accessibility as a basic human right.

To embody this idea of accessibility as a basic human right, Apple depends on the people who use these features in their day to day lives.

Sarah Herrlinger: It starts with adherence to the disability community mantra of, “Nothing about us without us.” You don't build for a community, you build with them. And the first step of that for us is the hiring of people with lived experience on our teams to help drive the development of our different types of accessibility features.

Sarah Herrlinger: We look at accessibility as kind of falling into five main pillars, which is vision, hearing, physical/motor, cognitive, and speech. And we build features to support each one of those areas.

For decades now, Apple has been continually refining its approach to those five pillars.

Sarah Herrlinger: It’s also not something that’s new to us. Our first office of disability actually started in 1985.

That was just one year after Steve Jobs first introduced the Macintosh computer. And during that announcement, it was actually an accessibility feature that stole the show.

[clip: Macintosh Announcement]

Steve Jobs: But today, for the first time ever, I’d like to let Macintosh speak for itself.

Macintalk: Hello, I am Macintosh. It sure is great to get out of that bag.

It was called Macintalk, and it was an early text-to-speech engine, otherwise known as TTS. TTS engines enable programs to read text aloud. They're helpful for the visually impaired, and for people with learning or cognitive disabilities. Basically, anyone who might have trouble reading.

Macintalk: So it is with considerable pride that I introduce a man who’s been like a father to me, Steve Jobs.

Macintalk made the Macintosh look irresistibly cool, like something out of 2001: A Space Odyssey.

[clip: 2001: A Space Odyssey]

HAL: The 9000 Series is the most reliable computer ever made.

[music in Keith Kenniff - Tactile]

Starting in 1987, many of the Mac’s accessibility features were bundled under a label called Easy Access. This included things like Sticky Keys, which made it easier to use keyboard shortcuts, and Mouse Keys, which let you control your cursor with your keyboard.

But maybe the most important was a third-party program made by a company called Berkeley Systems. It was a screen reader called OutSpoken.

Now, screen readers don't just read text. Instead, they take in everything on screen, and turn all of that visual information into described audio. Here's an example of someone using a screen reader on a school's website:

Screen Reader: Navigation region. Heading, Level 2 link. Services for children. List of four items.

As they move around the screen with their keyboard arrows, the screen reader explains whatever element is highlighted. In this case, a set of links.

Screen Reader: Link: early years. Visited Link: Primary School Years. Link: High School and Beyond.

People who use screen readers often get used to listening at incredible speeds, some up to around a thousand words per minute. Here’s a demo on YouTube.

YouTube Demo: Like this.

Screen Reader: [Unintelligible]

YouTube Demo: Now, I’ve been told that’s a little fast. So…

Screen Reader: Seventy percent. Sixty. Forty percent.

YouTube Demo: How’s that?

[music in: Keith Kenniff - Just Your Luck]

Today, screen readers are really common. But up until the mid aughts, the good ones were all super expensive. The leading screen reader for Microsoft computers was called JAWS, which could cost over a thousand dollars.

JAWS: Internet Explorer. Space. Enter. Region: Search Engine. Hypertext.

But in 2005, Apple changed the game by rolling out VoiceOver, their first built-in screen reader.

Agnes: Isn’t it nice to have a computer that will talk to you?

It took a couple of years to work out the kinks, but by 2007, Apple had developed a serious competitor to JAWS. That same year, VoiceOver got a new, more natural sounding TTS voice named "Alex." In particular, people noticed how he breathed.

Alex: This is Alex. I'm programmed with over 150 different breath sounds. You can still find me in the system settings.

VoiceOver was a new addition to a broader set of accessibility features that were now called Universal Access. To represent Universal Access, Apple designed a symbol of a blue Vitruvian Man. Basically, a little blue person with their arms and legs outstretched. At Apple, they call him Vito. And now, it’s come to symbolize accessibility all around the world.

[music in: Keith Kenniff - Building Buildings]

Universal Access marked the beginning of a new era. Accessibility would no longer be led by third party companies who sold their software as expensive upgrades. Instead, Apple itself would take the lead and bake these features right into their products.

Now, part of it may have been a business decision. Because the more accessible a device is, the more marketable it is to schools, libraries, and other public institutions. But regardless of the financials, people like Steve Jobs and Tim Cook just thought it was the right thing to do.

Here’s Tim Cook with interviewer James Rath.

Tim Cook: If you think back to how Apple was founded, and it's still the case today, we make tools for people to do incredible things, and change the world with them. And that's everybody.

Tim Cook: I've never, ever, in the 20 years of being at Apple, ever looked at a, “What's our return on investment here?” It wouldn't be Apple without doing this. I mean, it's a part of our values that we will not compromise on.

Universal Access was a huge step forward for accessibility on computers. And yet, in the mid aughts, most cell phones still weren't very accessible. Many people with disabilities were limited to the most basic functions of a phone: making calls, and sometimes texting. And like with JAWS on Windows, these often required expensive add-ons. But all of that changed in 2009, with the iPhone 3GS.

[Keith Kenniff - Invention]

Sarah Herrlinger: When we designed it, we actually had to rethink all of the ways that one interacts with the device to make it a safe environment for someone in the blind community so that a single touch wouldn't make them do something they didn't intend to do.

Up to that point, touch screens were much less friendly to the visually impaired than old-fashioned, tactile buttons. But once VoiceOver came to the iPhone, it transformed that experience, just like it did with the Mac.

Sarah Herrlinger: So, for example, on an iPhone, it will do everything from read your text to you, to tell you how many bars of cell coverage you have, or “What time is it?” You can just move your finger on top of all of the visuals on the screen, the icons, the words, and have it read back to you.

VoiceOver: Apps, iTunes, Settings, Voice, Notes, Stocks, Maps, Weather. Weather: New York. High 88 degrees Fahrenheit. Low 71 degrees Fahrenheit. Currently: Partly Cloudy, 82 degrees Fahrenheit.

Along with VoiceOver, Apple also introduced an early version of what’s now called Voice Control. If you have a disability that makes fine motor movements difficult, the ability to control your phone with your voice is vital. Now keep in mind, Siri still hadn’t come out yet. But in this 2009 YouTube demo of Voice Control, we can hear something that sounds a lot like it.

[YouTube Voice Control demo]

Youtuber: You just touch and hold down the Home button for three seconds and it’ll pop up. Help.

VoiceControl: Using iPhone VoiceControl, you can tell iPhone to call contacts, play playlists.

Youtuber: Play songs by Collective Soul.

VoiceControl: Playing songs by Collective Soul.

Then, when Siri was officially introduced in 2011, it opened up even more of the iPhone to voice commands. Here's Apple's Scott Forstall showing off what Siri could do.

Scott Forstall: Do I need a raincoat today?

Siri: It sure looks like rain today.

For many users, Siri was mostly a cool, futuristic new feature. But for people with limited mobility or vision, it was a game changer.

[Siri Demo]

Scott Forstall: You can send and receive text messages, you can create notes, you can search the web… You stick something in the oven, you're gonna bake it, and need to take it out in 30 minutes? Just take your phone and ask Siri to set a timer for 30 minutes, and you're done.

[music in: Keith Kenniff - Sea [extended]]

Sarah Herrlinger: We were the first company to make a consumer touch screen accessible to someone in the blind community. It's not just VoiceOver, it's zoom, it's inverting colors, it's dynamic text. It's all these different things that are there so that whatever your personal unique need is, you can set up your device to work for you.

And right away, these features really caught people's attention. Around 2011, there was an outpouring of gratitude from the disabled community, especially users with visual impairments. Many of them jumped from barely having access to cell phones to having almost full access to the world's leading smartphone. At a 2011 event in Los Angeles, Stevie Wonder personally thanked Steve Jobs.

[Stevie Wonder clip]

Stevie Wonder: And I want you all to give a hand for someone who, and his company, took the challenge in making his technology accessible to everyone… Steve Jobs. Because there's nothing on the iPhone or the iPad that you can do that I can't do.

[music out into music in: Keith Kenniff - Ascend]

By this point, Apple had been releasing built-in accessibility features for over twenty-five years. But this was only the beginning. Today, Apple is using AI, augmented reality, and all sorts of technology to make their devices more accessible and useful than ever before. These innovations are already changing how millions of people experience the sounds around them... and they have the potential to revolutionize the world of hearing health.

That's coming up after the break.

MIDROLL

[music in: Keith Kenniff - Ascend]

Apple’s accessibility features go back to the 1980s, but they really accelerated in the aughts with Universal Access and the iPhone 3GS. Then, when the Apple Watch came out in 2015, it included a suite of features that started to blur the line between accessibility and health…

For instance, there’s a Noise app that constantly monitors the decibel levels around you. If it detects unsafe levels of sound, you’ll get an alert.
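Apple hasn't published how the Noise app decides when to fire an alert, but the basic idea, averaging sound levels over a window and alerting when the average stays above a threshold, can be sketched in a few lines. The threshold and window size here are illustrative, not Apple's actual tuning:

```python
from collections import deque

class NoiseMonitor:
    """Toy sketch of sustained-loudness alerting (numbers are illustrative)."""

    def __init__(self, threshold_db=80.0, window=180):
        self.threshold_db = threshold_db      # alert level in decibels
        self.readings = deque(maxlen=window)  # one sound-level reading per second

    def add_reading(self, db: float) -> bool:
        """Record a one-second sound level; return True if an alert should fire."""
        self.readings.append(db)
        if len(self.readings) < self.readings.maxlen:
            return False  # not enough history to judge sustained exposure
        avg = sum(self.readings) / len(self.readings)
        return avg >= self.threshold_db

# Five seconds of loud readings with a short five-second window for the demo.
monitor = NoiseMonitor(threshold_db=80.0, window=5)
levels = [82, 85, 81, 84, 83]
alerts = [monitor.add_reading(db) for db in levels]
```

Averaging over a window, rather than alerting on any single spike, matches the feature's goal: it's sustained exposure that damages hearing, not one passing siren.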

[music out with Watch Notification]

Deidre Caldbeck: We build our health features with the objective of making an impact on people's lives.

That’s Deidre Caldbeck.

Deidre Caldbeck: I am the Senior Director of Product Marketing for Apple Watch and Health.

Deidre says that the impacts of these features can sometimes be surprising, even to them.

[music in: Keith Kenniff - Chances]

Deidre Caldbeck: When we first introduced the Noise app, we heard from a father who said it's really changed how his autistic son experiences his life at school, because he didn't know how loud he was speaking, and it sort of turned some people off that he communicated with.

Deidre Caldbeck: And so the Noise app was helping give him insight, real-time, with how loud he was speaking, and he could kind of bring the level of his voice down. And that was not the way the feature was designed, but those are the stories where we know, "Okay, really want to invest more in hearing, we really want to invest more in all areas of health."

One issue they've become especially focused on is hearing loss.

Deidre Caldbeck: A lot of people don't know they have mild to moderate hearing loss. About a billion people around the world suffer from mild to moderate hearing loss, but 80% of those go undiagnosed.

[music in: Jesse Brown - what dreams]

Sarah Herrlinger: The average person who needs hearing assistance generally doesn't get it for, at times, up to a decade after they should have.

That's Sarah Herrlinger again. Sarah says that even after someone seeks treatment for hearing loss, they can find themselves in a world that's not very friendly to the hearing impaired. For example, for a long time, hearing aids couldn't connect to a cell phone without installing a coil of copper wire called a telecoil.

Sarah Herrlinger: And the experience was not optimal. We were getting emails from customers saying, you know, “I know it's just telecoil, but I still can't find a way to make it really work well. And so I've stopped talking to my grandkids.” You know, and people just got very insulated because they couldn't use the devices well.

Isolation is a common symptom of hearing loss, especially when it goes untreated. After all, it's hard to be social when you can barely hear or understand a conversation. The same goes for watching a movie, going to a live event... or any social setting where you use your ears.

Sarah Herrlinger: And what we looked at was how Bluetooth was really a great solution, but it wasn't something that was currently an option within the hearing aid world.

So Sarah's team worked with the major hearing aid companies to bring Bluetooth to their products.

Sarah Herrlinger: We actually wrote the first Bluetooth Protocol for hearing, and did it specifically for hearing aids, and built in a bunch of features like Live Listen that were specifically for hearing aid users, and launched it in end-of-2013, and it really revolutionized the hearing aid market.

Live Listen uses the iPhone's microphone as a directional mic, and sends that audio directly into your hearing aids.

[loud restaurant in]

So let's say you go to a loud restaurant with a friend.

Friend [Overpowered]: What do you think you’re gonna order?

Your friend can hold your iPhone up to their mouth, and their words will be beamed into your Bluetooth hearing aids.

Friend [Clear]: What do you think you’re gonna order?

[loud restaurant under]

And just like the Noise app, users have found creative new ways of using Live Listen. At one point, Sarah heard from someone whose mother had been losing her hearing, which made it hard to watch movies together like they used to.

[clip: White Christmas sneaks in]

Sarah Herrlinger: It was always something he loved to do, and he went home for the holidays that year. And so what this guy did was he took her iPhone, turned on Live Listen, and put it next to the speaker of the TV. And then they were able to start watching old Christmas movies together.

[White Christmas up, then under]

Deidre Caldbeck: I think that's been the most rewarding part of this process is to hear how much these products and features mean to people.

[White Christmas out]

Live Listen isn't just for hearing aids... It also works with AirPods and Beats. And like many accessibility features, it can be a useful life hack even if your hearing is perfect. For instance, you can turn up the TV for yourself without bothering anyone else, or listen to it from the next room over.

[music in: Keith Kenniff - Scenes 3]

In recent years, Apple has been using AI and other innovations to push these features even further. For instance, there's Live Captions, which generates live subtitles of any speech, whether it's a podcast playing on your phone, or someone talking to you in-person. There's also Sound Recognition, another potentially life-changing feature for the hearing impaired.

Sarah Herrlinger: The iPhone and the Watch are able to listen for environmental sounds around you, everything from a doorbell, a fire alarm, a dog barking, a baby crying, water running, and present you with a visual alert that says, “There is a sound behind you, it may be your water running."

Now, you don't have to be deaf to find a use for Sound Recognition. For instance, you can use it to get alerted about your doorbell ringing when you have headphones on. But for people in the deaf community, it can be really impactful.

[music in: Jesse Brown - lavender]

Sarah Herrlinger: I remember when it first launched, just seeing people who talked even specifically about that element of a baby crying, and that incredibly human moment of having the realization, "Your child is crying," and to be able to go, and pick them up, and comfort them.

Sarah Herrlinger: And I think that one, more than maybe a dog barking, or the doorbell ringing, it really just brings that human connection.

Over in the Magnifier app, there were similar features for the visually impaired, called People Detection and Door Detection. These allowed people to hold up their phone’s camera, and have it alert them about the presence of doors and other people.

[Door Detection Demo]

Door Detection: Open door six feet away, turn handle or knob. Swing. Two doors detected. Door five feet away, turn handle or knob. Swing.

[music in: Steven Gutheinz - Places]

More recently, Apple released something called Scenes. Now, instead of just detecting doors and people, the iPhone can describe many more details about your surroundings. It's like a screen reader for the real world.

[Scenes Demo]

Scenes: A group of people sitting in chairs in front of a desk with a laptop and a lamp. A person standing next to a glass door. A room with a couch, a table and other items.

Door Detection and Scenes are great examples of augmented reality... a catch-all term for when computer-generated information gets overlaid on top of the real world. But there’s another Apple device that’s really pushing the envelope in this area, and making augmented reality common for all users. And that’s the AirPods.

Ron Huang: It does so many more things in your life than just media.

That’s Ron Huang, Apple’s Vice President of Sensing & Connectivity.

Ron Huang: Our users tend to put it in and leave them in for a much longer time. And so that's why we build things like Adaptive Audio.

Adaptive Audio came out in 2023, and it combines two older AirPods features into one.

The first is Active Noise Cancellation, which cancels out sound waves as they enter your ear.

[AirPods ANC tone + sound canceled]

The second is Transparency Mode, which is basically the opposite of Noise Cancellation. It's for when you want to be aware of your surroundings.

[AirPods Transparency tone + sound amplified]

Ron Huang: So Adaptive Audio dynamically blends Active Noise Cancellation and Transparency Modes based on the environment you're in.

[entering restaurant]

Ron Huang: So for example, if you walk into a louder restaurant, we automatically ramp up the amount of noise cancellation we do to lessen the noise.

[AirPods Adaptive Audio tone + sounds reduced]

Ron Huang: And when you walk out of that restaurant, we then lessen that Active Noise Cancellation so you get more of the transparency effect directly.

[Leaving restaurant, gentle city sounds up]

Ron Huang: Same thing goes if a truck drives by, right?

[truck passing by]

Ron Huang: Truck gets closer to you, Active Noise Cancellation level raises…

Ron Huang: And when it drives away, we fall back down to Transparency.

[truck is gone, city sounds resume]

Ron Huang: It's not just a mode switch. It's literally a dynamic, gradual shift between the two modes. And the end effect is really special because what we hear from our customers over and over again is, when they finally take the AirPods out…

[AirPods removed]

Ron Huang: …they have this OMG moment, which is like, "I had no idea that the streets were so loud!"

[music in: Keith Kenniff - Scenes 1]

Features like Adaptive Audio elevate the AirPods from classic earbuds into the world of sonic augmented reality. Essentially, they change our sensory input to make the outside world friendlier to our ears.

When you’re in a noisy environment, the world can feel like a bad mix, where some of the instruments are super overpowering. But now, we finally have some control over that mix.
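Apple doesn't disclose how Adaptive Audio is implemented, but Ron's description, a gradual glide between Transparency and Noise Cancellation driven by ambient loudness, suggests a simple control loop. This sketch is a guess at the shape of that logic; the dB endpoints and smoothing rate are made up for illustration:

```python
def target_blend(ambient_db: float, quiet_db=55.0, loud_db=80.0) -> float:
    """Map ambient loudness to a blend: 0.0 = full Transparency, 1.0 = full ANC."""
    t = (ambient_db - quiet_db) / (loud_db - quiet_db)
    return min(1.0, max(0.0, t))  # clamp to the valid range

def smooth(current: float, target: float, rate=0.2) -> float:
    """Move a fraction of the way toward the target each step,
    so the change is a gradual shift rather than a mode switch."""
    return current + rate * (target - current)

blend = 0.0  # start in full Transparency on a quiet street
for db in [55, 70, 85, 85, 70, 55]:  # walk into a loud restaurant and back out
    blend = smooth(blend, target_blend(db))
```

The smoothing step is what makes the experience feel "dynamic and gradual" in Ron's words: a passing truck nudges the blend toward cancellation, then it relaxes back toward transparency once the truck is gone.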

But just because you have AirPods in, it doesn't mean you have to be closed off to social interaction. Two features that address this are Conversation Boost and Conversation Awareness.

[train scene in]

So let’s say you’re on the train for your morning commute, and you’re jamming out to some tunes…

[music beat begins: Keith Kenniff - Build Up]

Then, you decide to ask someone a question.

Ron Huang: And then so what Conversation Awareness does automatically is that we real time detect when you start speaking. Microphones detect speech sounds, obviously, but there's potentially a lot of people around you also talking.

Ron Huang: And then, so we combine that with the accelerometer, so we know that it's from your jaw, it is you speaking, and therefore it is your intent to speak. And that's when we apply the ducking or the pausing of the audio to help you talk.

Train Rider: Excuse me, do you know what the next stop is?

Once you start talking, the AirPods know that there's probably a reply coming that you want to hear.

Ron Huang: So we beam from the mics to the conversation in front of you, and actually use machine learning techniques to amplify the speech sound, but not the rest of the noise, so you can have a much better conversation.

Other Train Rider: Uh, it should be Franklin. I’m actually getting off there.

Train Rider: Got it! Thanks.

The system even has a way of knowing when the conversation is over. To do so, it tracks your conversation partner using beamforming microphones and motion sensors.

Ron Huang: And when I'm done, as I walk away, we combine the fact that we detect you're walking away to also see that, "Oh, you're likely ending that conversation” and therefore resuming the audio back to you.
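The decision logic Ron describes, requiring both microphone speech detection and accelerometer jaw vibration before ducking the audio, can be sketched as two tiny functions. The gain values and function names are hypothetical; this is just the shape of the idea, not Apple's implementation:

```python
def is_wearer_speaking(mic_detects_speech: bool, jaw_vibration: bool) -> bool:
    """Speech in the mics alone could be a bystander talking nearby;
    accelerometer vibration from the jaw confirms it's the wearer."""
    return mic_detects_speech and jaw_vibration

def playback_gain(wearer_speaking: bool, conversation_active: bool) -> float:
    """Duck media volume while a conversation is happening (values illustrative)."""
    if wearer_speaking or conversation_active:
        return 0.2  # duck to 20% so speech stays audible over the music
    return 1.0      # otherwise play at full volume

# A crowded train: the mics hear plenty of speech, but only the
# accelerometer can tell whether it's the wearer who is talking.
bystander_talking = is_wearer_speaking(True, False)   # False: don't duck
wearer_talking = is_wearer_speaking(True, True)       # True: duck the audio
```

Fusing the two sensors is the key design choice: either signal alone would cause false triggers, but jaw vibration plus detected speech is a strong indicator of intent to talk.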

Now for these features, all of the processing is happening inside the AirPods themselves. But of course, the processing power of an iPhone is much greater. And in some cases, it makes sense to utilize that. That’s what Apple did with their Voice Isolation feature, which blocks ambient noise to make your voice clearer when you call or FaceTime someone.

Eric Treski: We realized, especially with something like voice, and all of this machine learning capability is, we have so much more compute power on the phone that we can take advantage of.

That's Eric Treski, who directs product marketing for AirPods.

Eric Treski: So to remove that background noise, we actually now send a raw signal of your voice directly from AirPods down to the phone. The phone does all the processing, and then of course, that just goes out to your person on the other end.

[wind + garbled voice]

When I'm on a business call, or FaceTiming with a loved one, I don't want my voice to sound like a noisy, garbled mess. And these new algorithms go a long way towards improving that.

Eric Treski: And that's in windy conditions, that's in loud environments, so it's an incredible capability that we now have.

[music in: Keith Kenniff - The Boat]

Today, Apple is leaning into hearing health more than they ever have in the past.

Deidre Caldbeck: So the three pillars with our new hearing health features are Protection, Awareness, and Assistance.

On the Awareness side, there are those loud noise warnings, which even apply to the things you’re choosing to listen to.

Deidre Caldbeck: We wanted to make sure they had some awareness around how loud they might be listening to their favorite music or their favorite podcast, and giving them the ability to automatically reduce those loud sounds so that they're always listening to their media at a safe listening level.

On the Protection side, they've added automatic Hearing Protection across all three of the AirPods' Noise Control modes. This can reduce the environmental sound hitting your ears by up to 30 decibels.

Deidre Caldbeck: So maybe you are in a windy city [sfx: wind] or it's people who tend to be in situations where they don't even realize how loud their environment noise is, like a subway [sfx: subway noise] or if they have a profession where there's a lot of loud sounds around, construction, [sfx: construction sounds] et cetera.

Deidre Caldbeck: Hearing Protection will actually ensure that background noise is suppressed, so that your hearing is protected over a long period of time.

Eric Treski: We talk about really what AirPods can do in people's lives beyond just music listening. And we think about that exhaustively.

Eric Treski: So I really like the story of, not only are we notifying you, and able to protect your hearing from what you're listening to from a media perspective, but we're also now protecting your hearing from an environmental sound perspective. So you're sort of covered both ways.

But for me, the most exciting part of this is what Apple is doing with the third pillar, Hearing Assistance. In a hearing study they conducted...

Deidre Caldbeck: We learned that about 75 percent of the people diagnosed with hearing loss were not using any sort of assistance. "Well, that seems like an area where we can really make an impact."

So Apple developed a new, clinically validated hearing test, and built it into the iPhone and AirPods Pro.

[music in: Keith Kenniff - Aquatic]

Deidre Caldbeck: This feature brings together engineers, clinicians, audiologists, designers to build what is really the first of its kind hearing aid.

Deidre Caldbeck: So you can use your iPhone, you can take about a five minute test in the comfort of your own home. You will get a personalized hearing profile as a result, and this feature will seamlessly transform your AirPods Pro 2 into a hearing aid. Kind of adjust based on your personal specified needs so that you can hear the world around you much better.

It's hard to overstate how big of a deal this is. Millions of people buy AirPods… And statistically, many of them have hearing loss and don’t even know it. But by taking this test, and potentially using their AirPods as hearing aids, they can avoid some of the downstream effects of hearing loss, like social isolation and cognitive decline. In other words, it can give people who might never have gone to the doctor life-changing information.

Deidre Caldbeck: The thing that makes us really excited about this feature is that, in the way that they can just use these AirPods that they rely on in their everyday life, they can now use to really help impact their lives in a way they, I don't think they would have imagined.

Deidre Caldbeck: It also has the potential of reducing the stigma, because you're seeing people wear AirPods, and that's the same set of headphones that you use to listen to your favorite podcast, and you can use to have that hearing assistance that you might need.

Deidre Caldbeck: I have a list of people that I plan to have try this. And if they're listening, they know who they are.

[music in: Keith Kenniff - Acres All Above]

For years now, I've been talking about how hearable technology was eventually going to combine headphones, ear plugs, hearing aids, virtual assistants, and more into one earbud-like device that we can theoretically leave in all day.

This is the kind of technology that I'm most passionate about, because it goes so far beyond just convenience or entertainment. It's the stuff that literally changes people's lives, and helps people connect with each other through sound.

Deidre Caldbeck: For me, when I started to work on Apple Watch, and then soon after, Health, to be able to hear some of these stories we've been sharing today, I just felt very fortunate that this was my actual job. This is my profession that I get to work with these brilliant people that come up with these features that anyone can use, and anyone has the potential of having their lives changed.

Now, designing for accessibility comes with a lot of challenges. But when you approach those challenges with empathy and creativity, the result is often a better product for everyone.

Sarah Herrlinger:  We're all unique in the world, and accessibility features may be life hacks to one person, and they may be necessities to another. But we're always just trying to make sure that we have features that work for everyone.

Here's Tim Cook again.

Tim Cook: And so that basic thought of democratizing things so everyone can create whatever they would like to create, or solve whatever problem they would like to solve, that's what we're about. That's why we're here.

Sarah Herrlinger: For everything we build, it's really just about giving people opportunity. I'd like to think that whoever finds the cure for cancer probably isn't going to look like what Hollywood has told us they should look like. And so if we can build a feature that unlocks someone's capability to express themselves, or to learn something, or to do whatever it might be that gets us one step closer to that, I'm all in.

[music out into music in: Keith Kenniff - Goldengrove]

Twenty Thousand Hertz is produced out of the sound design studios of Defacto Sound. Hear more at defactosound.com.

Other Voices: This episode was written and produced by Nikolas Harter and Casey Emmerling, with help from Grace East. It was sound designed and mixed by Jesus Arteaga and Brandon Pratt.

Thanks to our guests, Sarah Herrlinger, Deidre Caldbeck, Ron Huang and Eric Treski. And thanks to everyone at Apple who invited me in, and made this episode possible.

Now, accessibility is something that Apple is very open and responsive about.

Sarah Herrlinger: We are very open to feedback from customers, and have a lot of ways for people to be able to communicate directly with our team.

If you want to help drive these efforts forward, you can email the accessibility team directly at Accessibility at Apple dot com.

I'm Dallas Taylor. Thanks for listening.
