Actual difference between UIAccessibilityLayoutChangedNotification and UIAccessibilityScreenChangedNotification?

Luke · Jan 6, 2015 · Viewed 11.9k times

I'm trying to ascertain what exactly happens differently when posting a UIAccessibilityLayoutChangedNotification versus a UIAccessibilityScreenChangedNotification. From what I can see, I can use them interchangeably everywhere and nothing different happens.

The Apple documentation simply says to use LayoutChanged when (for example) an element has been hidden or shown, and to use ScreenChanged if the entire screen changes, but I’m interested in what THEY do when I provide this information, and what I should see differently when using one or the other.

Can anyone give a clear explanation of implementation differences between the two?

Answer

ChrisCM · Feb 5, 2015

These two notifications are for communicating dynamic content changes in your views to VoiceOver, for screen reader users. There is little difference between them apart from their default behavior and the silly little "boop beep" that accompanies ScreenChanged notifications.

In both instances, the argument

UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, arg);

Represents either a string to be read out or an on-screen element that VoiceOver will shift its focus to. In the event of dramatic context changes, it is important to send focus to a place that makes sense, or to announce that such changes have taken place. Either approach is acceptable from an accessibility point of view, though I prefer whichever involves the least amount of change. For simple layout changes, it is almost always best just to announce the change and leave focus where it was. Sometimes, though, the element that caused the context change is itself hidden, and then it is clearly necessary to direct VoiceOver to the new content, because the default behavior in that case is undefined, or perhaps deterministic, but determined by a framework that knows absolutely nothing about your app!

The difference between the two notifications, given that they otherwise do exactly the same thing, is in their default behavior. If you supply nil with UIAccessibilityLayoutChangedNotification, it is as if you have done nothing. If you supply nil with UIAccessibilityScreenChangedNotification, VoiceOver sends focus to the first object in your view hierarchy that is marked as an accessibility element, once all view hierarchy changes and drawing are complete.

UIAccessibilityLayoutChangedNotification

A good use case example for UIAccessibilityLayoutChangedNotification is dynamic forms. You want to let users know that, based on decisions they've made in the form, new options are available. For example, if you select in a form that you are a Veteran, additional areas of the form may pop up to collect more input, but these areas may have been hidden from users who did not need them. So you could shift focus to these elements after the user interaction:

UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, firstNewFormElement);

Which would shift focus to the provided element and announce its accessibilityLabel.
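
For concreteness, here is a minimal sketch of that focus-shifting variant. The switch action and the veteranDetailsView / branchOfServiceField outlets are hypothetical names I've made up for illustration, not anything prescribed by UIKit:

    // Hypothetical toggle handler: reveal the veterans section, then move
    // VoiceOver focus to the first newly visible element.
    - (void)veteranSwitchToggled:(UISwitch *)sender {
        self.veteranDetailsView.hidden = !sender.isOn;

        if (sender.isOn) {
            // Focus the first new field; VoiceOver reads its accessibilityLabel.
            UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification,
                                            self.branchOfServiceField);
        }
    }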

Or just tell them that the new form elements are there:

UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, @"Veterans form elements available");

Which would leave focus where it is, but VoiceOver would announce "Veterans form elements available".

Note: This particular behavior is bugged on my iPad (8.1.2).

Or finally you could do this:

UIAccessibilityPostNotification(UIAccessibilityLayoutChangedNotification, nil);

Which does absolutely nothing :). Seriously, I don't even think the a11y framework backend cares. This particular line of code is a complete waste!

UIAccessibilityScreenChangedNotification

A good use case example for UIAccessibilityScreenChangedNotification is a customized tabbed browsing situation, where the entire screen, with the exception of your navigation area, changes. You want to let VoiceOver know that essentially the entire screen changed, but NOT to focus the first element (your first tab); instead, focus should go to the first content element:

UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstNonGlobalNavElement);

Which would play the "boop beep" sound and then shift focus to just beneath your global navigation bar. Or you could do this:

UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, @"You're on a new tab");

Which would wait for the new tab to load, play the "boop beep" sound, announce "You're on a new tab" in VoiceOver, shift focus to the first element on the screen, and then announce that element's accessibilityLabel. (PHEW! That's a lot! It is jarring for screen reader users; avoid this scenario unless absolutely necessary.)

And finally you can of course do this:

UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, nil);    

Which is equivalent to:

UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification, firstA11yElement);

Both of which will play the "boop beep" sound, shift VoiceOver focus to the first accessibility element on the screen, and then announce it.
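
To make the tab scenario concrete, here is a rough sketch; the tab action, loadContentForTab:, and firstContentElement are hypothetical placeholders for whatever your navigation code actually uses:

    // Hypothetical tab handler: swap the content area, then tell VoiceOver the
    // screen changed, landing focus on the first content element rather than
    // the first tab button.
    - (void)tabSelected:(UIButton *)sender {
        [self loadContentForTab:sender.tag];

        UIAccessibilityPostNotification(UIAccessibilityScreenChangedNotification,
                                        self.firstContentElement);
    }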

Finally

In a comment somebody mentioned caching, and I occasionally remark in this answer about things the a11y backend may or may not care about. While it is certainly possible that there is some backend magic happening, I don't believe that, in either of these circumstances, the backend cares at all. Here is why:

If you've ever used the UIAccessibilityContainer protocol, you can watch as your container of views gets queried. There is no caching going on. Even the accessibilityElementCount property gets pinged each time VoiceOver moves focus to a new accessibility element within your container. Then it goes through the process of checking which element it is on, asking for the next element, and so on. The protocol is designed at its core to handle dynamic situations: if you were to insert a new element into your container after an interaction, it would still run through all of these queries and be perfectly fine with it. Furthermore, if you override the properties of the UIAccessibility protocol to provide dynamic hints and labels, you can see that those get called every time as well.

As such, I believe the a11y framework backend gleans ABSOLUTELY ZERO information from these notifications. The only information VoiceOver needs to do its job is its currently focused accessibility element and that element's accessibility container. The notifications are simply there for you to make your app more usable for VoiceOver users.
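
If you want to watch those queries yourself, a small sketch along these lines will do it. The container view and its a11yElements array are hypothetical; the overridden methods are the informal UIAccessibilityContainer API:

    @interface ContainerView : UIView
    @property (nonatomic, strong) NSArray *a11yElements;   // hypothetical element storage
    @end

    @implementation ContainerView

    - (BOOL)isAccessibilityElement {
        return NO;   // the container itself is not an element; its children are
    }

    - (NSInteger)accessibilityElementCount {
        NSLog(@"accessibilityElementCount queried");
        return (NSInteger)self.a11yElements.count;
    }

    - (id)accessibilityElementAtIndex:(NSInteger)index {
        NSLog(@"accessibilityElementAtIndex: %ld", (long)index);
        return self.a11yElements[index];
    }

    - (NSInteger)indexOfAccessibilityElement:(id)element {
        NSLog(@"indexOfAccessibilityElement queried");
        return (NSInteger)[self.a11yElements indexOfObject:element];
    }

    @end

Run this under VoiceOver and the logs fire every time focus moves, which is the behavior described above.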

Imagine, if this weren't the case, how many times Safari would post these notifications! :)

These particular statements can only be confirmed by someone with backend knowledge of the framework, who works with the code, and should be viewed as conjecture. It could be the case that this is highly version/implementation dependent. Definitely open to discussion on these points! The rest of this post is pretty concrete.

For Your Reference

Most of this comes from experience working with the frameworks, but here is a useful reference if you wish to dig further.

https://developer.apple.com/documentation/uikit/accessibility/uiaccessibility

https://developer.apple.com/documentation/uikit/uiaccessibilitylayoutchangednotification

https://developer.apple.com/documentation/uikit/uiaccessibilityscreenchangednotification

And finally, an open source repo of the silly little app I put together to test all this stuff.

https://github.com/chriscm2006/IOS-A11y-Api-Test