The Shifting Ways In Which We’ll Interact With Mobile Apps

Shift Key

This week’s column takes a look at how the iOS lock screen and app experience could evolve in the future

‘Tis the season to reorganize the apps on my iPhone. With iOS 7 and the shift from Spotlight search to swipe-down-to-search for apps, I’ve noticed my app behavior has changed a bit: it’s become easier to search for apps with the new UI, and therefore less important where an app is located on my phone. For instance, the Amazon app is buried in my Shopping folder (on my second page of apps), but if I need the app, I just swipe down to search.

This shift got me thinking about other ways I’d like to search for, launch, and interact with mobile apps, so I came up with this list — let me know what you think and if I’m missing any:

Launch apps with Siri. I didn’t realize Siri could do this until yesterday. I’d mostly given up on Siri, but maybe I missed the memo here, so I’ll give it another whirl.

Folders aren’t perfect. I keep folders on my second page of apps, and as a rule, don’t keep any more pages. As a result, my second page is all folders, sorted by category, and some of those folders contain nine or more apps. This isn’t perfect, but it confines my iPhone to two screens of apps, and placement becomes less important as I search for apps and/or enter them via push notifications.

Speaking of push, notifications are how I launch and enter many apps. Again, “where” an app sits on my phone becomes less relevant if the app’s push notifications are engaging and/or require me to do something inside the app.

iOS Control Center changes the placement of Apple’s native apps. Now that I can launch the camera, alarm clock, and a few others via Control Center, I’ve bunched all of Apple’s native apps into one folder on the second page, as I don’t use them often.

This is just for iOS. Certainly, there is more creativity and innovation happening around contextual presentation of apps on Android.

Dynamically sorting apps based on high-frequency usage. I have two pages of apps: my home screen, which has 20 apps plus the bottom bar, with no folders, and then the second page, which is all folders. I’d like a control in Settings that lets the OS dynamically arrange my home screen, surfacing the apps I spend the most time in based on my usage data.
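The sorting I have in mind is simple to sketch. Here’s a minimal illustration in Python, assuming the OS exposed per-app usage data; the app names and minute counts are invented for the example:

```python
def sort_by_usage(apps, usage_minutes):
    """Return apps ordered by time spent in each, most-used first."""
    return sorted(apps, key=lambda app: usage_minutes.get(app, 0), reverse=True)

# Hypothetical usage data the OS would collect.
usage = {"Mail": 95, "Twitter": 240, "Maps": 30, "Camera": 60}
home_screen = ["Camera", "Mail", "Maps", "Twitter"]

print(sort_by_usage(home_screen, usage))
# → ['Twitter', 'Mail', 'Camera', 'Maps']
```

The real feature would of course need smoothing so icons don’t jump around every hour, which is one reason I’d want it behind an opt-in setting.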

Contextually presenting apps on the lock screen. What if the OS could predict, based on my location, time of day, usage patterns, and other inputs, what apps I wanted to use at specific times, or even by what angle I was holding the phone? On the lock screen, much like a push notification, it could present a floating row of apps it thinks I’d like to use. (This is akin to what companies like Cover are creatively doing on Android today, where such advances are currently possible.)
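One way to picture this prediction is as a vote across context signals. The toy sketch below is purely illustrative, with invented signals, weights, and app names, not how Cover or anyone else actually does it:

```python
from collections import Counter

def predict_apps(signals, top_n=3):
    """Sum per-signal app scores and return the top_n candidate apps."""
    scores = Counter()
    for app_weights in signals.values():
        scores.update(app_weights)
    return [app for app, _ in scores.most_common(top_n)]

# Hypothetical signals: each maps candidate apps to a vote weight.
signals = {
    "location:office": {"Mail": 3, "Calendar": 2},
    "time:morning":    {"Coffee Finder": 2, "Mail": 1},
}

print(predict_apps(signals))  # Mail scores highest (3 + 1 = 4)
```

A real system would learn the weights from usage history rather than hard-coding them, but the lock-screen row would just be the top of a ranking like this.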

Larger-format push notifications. Right now, push notifications are short messages, like tweets. Some of them, like native SMS with a picture, preload the images. What if push notifications increased in format size? For instance, consider a push notification from Twitter that points to a tweet containing a picture; in this new UI, the picture would be presented in the lock-screen alert. This is one area I’d like to see Apple conservatively open up so developers can become more creative about the format of a notification, even though they ultimately want users to be inside their apps.
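To make the idea concrete, here’s a hypothetical sketch of what a larger-format payload might look like, modeled loosely on the JSON dictionaries push notifications use today. The `attachment` key is my invention, not a real API; current payloads carry only the alert text and a few flags:

```python
import json

notification = {
    "aps": {
        "alert": 'Photo from @friend: "Sunset over the bay"',
        "badge": 1,
    },
    # Hypothetical extension: inline media the lock screen could render.
    "attachment": {
        "type": "image",
        "url": "https://example.com/sunset.jpg",
    },
}

payload = json.dumps(notification)
print(len(payload))  # small payload caps are why Apple would open this up conservatively
```

Even a modest extension like this would let the lock screen render the picture without the user ever unlocking the phone.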

Deep-linking between apps. I really hope this happens sooner rather than later. App silos mean productivity can take a hit without seamless multitasking. A simple example is toggling between mail and calendar. There are more apps I’d probably visit if they were integrated with complementary apps, though I realize developers don’t have full control here, and it’s hard to identify a few apps to integrate with, as those integrations cost precious development time.

More interaction in the lock screen. I’ve been thinking it would be convenient to interact with some apps right from the lock screen, without having to dive into the app. SMS is the best example. I often just want to reply to a text with a single word or an emoji, but I have to swipe it open, launch the app, and then reply. Why not just let me type in the lock screen from the push?

I don’t know what Apple has in mind for iOS, but gestures like swipe-down-to-search give me confidence they’re thinking about their UI and keeping tabs on what folks are experimenting with on Android right now. While all of the suggestions I’ve laid out sound nice, none of them are deal-breakers that would make me switch, and I want to be clear about that distinction. And while some of them may be better experiences for me as a user, some of them could present issues for app developers. For instance, if some of these changes happened, users would potentially go inside apps less, and for apps built on addressable minutes and other attention-based metrics, these types of changes could present harsh realities about just how much time someone really needs to spend in an app.