Boring? Are you kidding me? Time to look under the hood…
Somewhere in the land of the media herd there was a familiar refrain: iOS 15 and iPadOS 15 are “boring”. Apparently the idea behind this is that there is no single feature that changes the entire experience of the iPhone or iPad – no “killer app” or killer upgrade.
The “boring” crowd has focused on things like “you can banish your ex from Memories in iOS 15”. I saw a slew of articles with variations on that title.
The biggest problem with that attitude, which must have originated with someone who has not actually gone hands-on with any of the new iOS software (still available only as a developer beta), is that it isn’t true. (A public beta is expected in July, but it is not recommended unless you are a developer testing on “non-critical” devices.)
Why? Because there are so many killer upgrades that the sheer avalanche of new features and improvements is overwhelming. This article will try to illustrate that by focusing on just one feature inside one built-in app: Memories, inside the Photos app.
First, a short digression. We have been testing on several devices: a 15” MacBook Pro from 2017, an original first-generation iPad Pro (2015) and an iPhone XS Max from 2018. None of these machines has one of the new Apple Silicon chips, so they can only show the upgraded features that don’t require one.
That makes the improvements possible without buying any new hardware even more amazing. Stunningly, of the three devices we upgraded, the MacBook Pro was the most stable right out of the gate. Any beta software will have bugs, glitches and the occasional crash, but that does not prevent you from testing the new features.
The iPad Pro, as a purely non-technical observation, almost appears as if its screen resolution has been increased. That is obviously not possible, but, as you will read below, it could be part of a stunning emphasis on beauty, sensuality and a more luxurious feel in the new suite of OSs.
Memory movies on iPadOS 15 are an amazing example of how AI and machine learning are evolving
For those not familiar with “Memories”: they are auto-generated film clips that can be found in the “For You” tab of the Photos app on iPhone and iPad. While you are sleeping, this feature scans everything in your photo library and uses artificial intelligence, machine learning and neural networks to choose and edit the clips, as the name says, for you.
One unconfirmed but almost certain technical backdrop to this is that the learning is improving even between updates to the OS. Not only that, but Apple devices everywhere are “cooperating” to help each other learn. That’s a powerful force spread across more than 1.65 billion devices.
This feature was added in iOS 12 but started to function at a much higher level in iOS 14. If you had tested and used the feature over the last few years, as we have, you would have noticed that the AI’s ability to “see” and select the photos and videos to include was limited and, at times, comical. Not any more.
Much of the data that clues the software in to which photos belong together comes from embedded metadata. Date, time and location information tells the AI that you took a group of images or videos on a particular day in a particular place.
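To make that concrete, here is a rough sketch, in Swift, of how date and location metadata alone could cluster a library into candidate “moments”. The types and thresholds are simplified stand-ins; Apple has not published how the real grouping works.

```swift
import Foundation

// Hypothetical, simplified metadata record; the real pipeline presumably
// reads EXIF/location data through the Photos and ImageIO frameworks.
struct PhotoMeta {
    let timestamp: Date
    let latitude: Double
    let longitude: Double
}

// Group shots into candidate "moments": photos taken on the same calendar day
// and within roughly the same area (a coarse ~1 km grid in this sketch).
func groupIntoMoments(_ photos: [PhotoMeta]) -> [[PhotoMeta]] {
    let calendar = Calendar.current
    var buckets: [String: [PhotoMeta]] = [:]
    for photo in photos {
        let day = calendar.startOfDay(for: photo.timestamp)
        // ~0.01 degrees is roughly 1 km; crude, but enough to illustrate the idea.
        let latCell = Int(photo.latitude / 0.01)
        let lonCell = Int(photo.longitude / 0.01)
        let key = "\(day.timeIntervalSince1970)-\(latCell)-\(lonCell)"
        buckets[key, default: []].append(photo)
    }
    return Array(buckets.values)
}
```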
The difference in iPadOS 15 (and on iPhone too, of course) is in the harder tasks: recognizing the subjective quality of one photo versus another (humans often take several photos of the same scene to try to capture the best of the bunch) and, more importantly, recognizing who and what are the subject of a photo.
All of this began to get interesting in iPadOS 14, when many groups of photos and videos were already being chosen, edited and enhanced by the software to a fairly impressive level.
AI and aesthetics collide and the result is a joy to witness
Something that is becoming a thread, and a definitive direction Apple is taking, particularly with the iPad Pro series, is, true to the name, a pro level of visual production and manipulation throughout the OS.
Center Stage, for example, along with many other video- and photo-related upgrades, was one of the big features of the newest generation of iPad Pro. Those are great, but they require a new iPad along with the OS upgrade.
When it comes to the memory movie clips, what we found is that even on the oldest iPad Pro from 2015, the software, thanks to the AI’s constant learning, has already taken a huge step forward: it does everything it was already doing, only much better.
Apple’s upgrade took that and gave it an additional kick up a notch with something the company is known for: good taste.
What has changed specifically?
In iPadOS and iOS 14 there were a few things that felt awkward in the way the movies were created. The biggest shortfall was the software’s ability to deal with various aspect ratios.
These days, when we shoot photos and videos with an iPhone, it is tempting and, at times, wonderful to use the vertical orientation. Other times, for landscapes and other scenes, we might prefer a traditional film aspect ratio, or even use the panorama feature to get an ultra-wide-screen “CinemaScope” style.
Until now the software dealt with this very poorly. Mostly the photos would constantly zoom in (the so-called “Ken Burns” effect), and if a vertical portrait shot was shown without zooming, it would get ugly side bars (like a vertical letterbox).
The zooming, and most of the effects in general, degraded the resolution, and therefore the quality, of many photos by enlarging them.
Additionally, the effects that were added, while cute and fun, were little more than novelties and not what a human editor would likely use. All of this and more gave the whole process a novelty feel: nice to have, but many people never even bothered to look at the movies the software created for them.
That’s about to be over.
A whole new array of options for the AI to use while trying to entertain
In iPadOS 15, as can be seen in the photos and videos in this article, the way the software solves the aspect ratio issue described above is genius and, dare I say it, beautiful.
The AI and the software now have a whole new bag of tricks to draw on and, boy, does it work. One fantastic feature is the letterbox generator for wide-screen photos of any aspect ratio.
How this works is that the software takes the iPad’s aspect ratio, uses the photo in its original form at 100% full resolution, and then adds a letterbox. But these are not the usual plain black bars we are all familiar with – the software and AI are able to see and analyze the photo and create a custom gradient letterbox that can be any shade or color.
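Apple has not said how this is implemented, but the basic idea of sampling the photo’s own edge colors and building the bars from them can be sketched in a few lines of Swift. Everything here (the pixel type, the darkening factor) is an illustrative guess, not Apple’s method.

```swift
import Foundation

// Simplified RGB pixel; the real feature presumably works on the decoded
// bitmap via Core Image or Metal, but the idea shows up fine on a plain buffer.
struct RGB { var r: Double; var g: Double; var b: Double }

// Average the pixels along one edge of the frame to pick a letterbox tint
// that blends with the photo instead of using flat black bars.
func edgeAverage(_ row: [RGB]) -> RGB {
    let n = Double(row.count)
    let sum = row.reduce(RGB(r: 0, g: 0, b: 0)) {
        RGB(r: $0.r + $1.r, g: $0.g + $1.g, b: $0.b + $1.b)
    }
    return RGB(r: sum.r / n, g: sum.g / n, b: sum.b / n)
}

// Given the top and bottom rows of a wide photo, return the two gradient
// endpoints for the bars above and below it (darkened slightly so the photo
// itself stays the brightest element in the frame).
func letterboxGradient(topRow: [RGB], bottomRow: [RGB]) -> (top: RGB, bottom: RGB) {
    func darken(_ c: RGB, by factor: Double) -> RGB {
        RGB(r: c.r * factor, g: c.g * factor, b: c.b * factor)
    }
    return (darken(edgeAverage(topRow), by: 0.6),
            darken(edgeAverage(bottomRow), by: 0.6))
}
```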
The effect is often astoundingly tasteful and frequently makes the original photo look even better. We tested it on award-winning photos (video above) and the result is, basically, art. Our own “nice” photos, chosen 100% by the AI and software, look amazing as well.
Actually, all the photos and videos in the clips generated from the library look much better than I remembered. That turns out to be because the software and AI now do automated color grading on all the photos and videos in every generated memory!
Color grading, also known as color correction, especially for video, has traditionally required an expensive expert and high-end software (and hardware) to enhance and color-match photos and clips that were often taken at different times and places, under varying lighting conditions, and sometimes with different cameras.
The AI and machine learning software in iPadOS 15 (and iOS 15) now acts as a virtual colorist, actively adjusting, enhancing and color-matching your shots while you sleep. That is basically insane. It is probably why the photos, and even the iPad itself, appeared to have been upgraded.
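What that colorist actually does under the hood is anyone’s guess. One classic, lightweight way to color-match two shots is to line up the mean and spread of each color channel with a reference shot (the idea behind Reinhard-style color transfer); the Swift sketch below shows only that general idea, not Apple’s pipeline.

```swift
import Foundation

// Shift a source channel so its mean and spread match a reference channel.
// Whatever Apple actually runs is unknown; this is only a sketch of the
// general statistics-matching idea on a flat array of channel values.
func matchChannel(source: [Double], toReference reference: [Double]) -> [Double] {
    func stats(_ xs: [Double]) -> (mean: Double, std: Double) {
        let mean = xs.reduce(0, +) / Double(xs.count)
        let variance = xs.reduce(0) { $0 + ($1 - mean) * ($1 - mean) } / Double(xs.count)
        return (mean, max(sqrt(variance), 1e-6))
    }
    let s = stats(source), r = stats(reference)
    // Normalize the source channel, then rescale it to the reference statistics.
    return source.map { ($0 - s.mean) / s.std * r.std + r.mean }
}
```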
OK, I could go on and on about that one feature, but let’s move on. There are also new effects that vary with each memory (and there are a lot more clips being generated, including various versions of the same idea to choose from).
In our experiments so far, the effects are clearly better and more subtle than in iOS 14. Again, in many cases I found myself saying the word “beautiful” when I tried to find an adjective to describe the results.
For shots that have a vertical bias there is a vertical, geometric split-screen effect, often with a thin black border, and it has a kind of ’60s-on-steroids feel, with the bars sliding in and out and resizing into place.
Another effect not seen in iOS 14 is a kind of circular rotation – great for landscapes. It is not a common effect, probably because it is computationally complex, but for the AI it’s a snap. Sometimes a blur-dissolve is added, which makes it fun and, again, still tasteful.
Not only are the effects better, and the bag of them larger, they also appear to evolve and adapt to the content: the speed and depth of each changes with the music combined with the photo and video content.
Oh, and the music. OMG. Each clip has six songs pre-selected, and the entire clip adapts, in real time (!), when you change the song, showing you various styles and looks that match. Apparently Apple Music is also connected if you have a subscription.
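How that real-time re-edit works is not something Apple has documented, but the behavior suggests the edit is parameterized by the track rather than baked in. Here is a toy Swift sketch of that idea; every name and number in it is an assumption, not Apple’s API.

```swift
import Foundation

// Derive per-photo display time and transition length from a song's tempo
// so that cuts land on beats. Purely illustrative; not how Apple does it.
struct MemoryTiming {
    let secondsPerPhoto: Double
    let transitionSeconds: Double
}

func timing(forTempoBPM bpm: Double, beatsPerPhoto: Int = 4) -> MemoryTiming {
    let secondsPerBeat = 60.0 / bpm
    return MemoryTiming(
        secondsPerPhoto: secondsPerBeat * Double(beatsPerPhoto),
        transitionSeconds: secondsPerBeat / 2.0  // half a beat for the dissolve
    )
}

// Example: a 120 BPM track gives 2.0 s per photo with 0.25 s transitions,
// while a slower 80 BPM ballad stretches each shot to 3.0 s.
```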
As a matter of fact, it is hard to be certain, as we have had only a few hours to test this, but nearly everything appears to be “live” and constantly evolving in real time. To “freeze” a version of a memory you have to “favorite” it (with the typical heart symbol) and then “add to memories” in order to edit it (change the name, choose more images, or remove anything that is not to your liking).
There is so much more not yet mentioned here: this article could probably be a book
The AI is also getting creative with names and “concepts” for the clips. For example, if you had lunch (or took photos) over the years in the same city (for me it was Knoxville, TN), it might notice that you tended to take photos around midday in that town and then create a memory clip called “Lunch in Knoxville over the Years”. Or, for example, the clip at the head of this article: “Golden Hour Over the Years”.
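The mechanics behind those titles are pure speculation on my part, but the behavior could fall out of fairly simple rules applied to the clustered metadata. A hypothetical Swift sketch, nothing more:

```swift
import Foundation

// A toy sketch of the naming idea, assuming each "moment" already carries a
// reverse-geocoded place name, a year, and an hour of day. If the same place
// keeps showing up around midday across several different years, call the
// clip "Lunch in <place> Over the Years". Entirely hypothetical logic;
// Apple has not published how the real titles are composed.
struct Moment {
    let placeName: String
    let year: Int
    let hourOfDay: Int
}

func suggestTitle(for moments: [Moment]) -> String? {
    // Keep only midday moments, then group them by place.
    let midday = moments.filter { (11...14).contains($0.hourOfDay) }
    let byPlace = Dictionary(grouping: midday, by: { $0.placeName })
    // Look for a place that recurs across more than two distinct years.
    for (place, visits) in byPlace {
        if Set(visits.map { $0.year }).count > 2 {
            return "Lunch in \(place) Over the Years"
        }
    }
    return nil
}
```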
This is an early and primitive foretaste of the literary ambitions of AI. In the new Photos app in iOS 15, it is beginning to “think” about when, where and why humans take photos and videos, and then to conceive a story that fits the behavior it is witnessing.
Other titles go beyond the basic “Amsterdam in 2016” and start to use the software’s understanding and visual ability to “see” what is in the photo to create a clip like “Playing in the Snow at Christmas”. Snow? Does it know it’s cold? Maybe just that it’s white and happens in the northern hemisphere in December. This is just the very beginning of something that will evolve, hourly, from now on. I can’t wait.