Category Archives: iOS 15

iPhone ’13’ and New AirPods are a September thing according to… Everybody

Above: Photo Collage / Lynxotic / Apple

Potential launch dates tease Apple customers as September anticipation builds

According to claims found on the Chinese e-commerce site IT Home, Apple may plan to launch the iPhone 13 (all models, including mini, Pro and Pro Max) on September 17 and the next-generation AirPods on September 30.

The dates are purely speculative; other, earlier dates have also been floated, and there has yet to be any official confirmation of launch dates from the company. The two separate September launch dates could also align with previous rumors that Apple will hold up to three events in the same month.

However, Apple has traditionally released new products starting in September, so the timeline could be accurate. In previous years, Apple has held events on September 7, 14 and 21.

Find books on Music, Movies & Entertainment and many other topics at our sister site: Cherrybooks on Bookshop.org

Enjoy Lynxotic at Apple News on your iPhone, iPad or Mac.

Lynxotic may receive a small commission based on any purchases made by following links from this page

Apple Video showcases creative simplicity to capture incredible portraits using an iPhone 12

Above: Photo / Apple / Mark Clennon

How to take an iPhone camera and make the most of it using taste and talent

Apple offers “Today at Apple” to inspire and educate iPhone users to help them learn and be creative with their devices. The online videos range from music to photography and are centered around Apple’s latest technologies.

This most recent video teaches users how to get the most out of the iPhone in Portrait mode, but also how to use the various powerful features already built into the iPhone 12 and even the iPhone 11.

For example, the three lenses allow ultra-wide, 1x and 2x (telephoto) shots at any time, just by flipping between the presets. And once a group of photos exists, simple creative cropping and framing has huge potential to bring out the most attractive and interesting features of each shot, and the result can be immediately saved as a separate image.

Of course, with such high-resolution image capture in the first place, cropping retains enough image data that even an ultra-close crop can preserve beautiful depth and detail.
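
To put rough numbers on that claim, here is a back-of-the-envelope sketch in Python. The 12 MP (4032 × 3024) sensor resolution is an assumption about typical recent iPhones, not a figure taken from the video:

```python
# Back-of-the-envelope check: how much resolution survives an aggressive crop?
# Assumes a 12 MP sensor (4032 x 3024 pixels), typical of recent iPhones.

def crop_megapixels(width, height, linear_crop_factor):
    """Megapixels remaining after cropping to a fraction of each dimension."""
    w = int(width * linear_crop_factor)
    h = int(height * linear_crop_factor)
    return w * h / 1_000_000

full = crop_megapixels(4032, 3024, 1.0)  # ~12.2 MP
half = crop_megapixels(4032, 3024, 0.5)  # ~3.0 MP

print(f"full frame: {full:.1f} MP, 50% linear crop: {half:.1f} MP")
```

Even after discarding half of each dimension (three quarters of the pixels), the crop still exceeds the roughly 2.1 MP a 1080p screen can display.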

Learning how to use all the tools, and most of all a photographer’s eye

The five-minute video features self-taught New York City photographer Mark Clennon. In the clip he shows and explains how he sets up, shoots and edits his images, mainly in-camera (that is to say, “in-iPhone-camera”), to capture his most powerful portraits.

With iOS 15 (already released in public beta) and the iPhone 13 (or whatever its actual name turns out to be) both expected to be revealed in early September, Portrait mode may also be extended to video in addition to photos, and many other upgrades and improvements are on deck.

Free and extensive software upgrades, along with the not-free and not-inexpensive new hardware, have been a yearly ritual for Apple since the very first iPhone was released in 2007. Recently, with Apple Silicon and the gradual merging of functionality across macOS, iOS and iPadOS, the upgrades seem to be in overdrive.

We have cataloged some of the more interesting changes in standalone articles, but still have more to come, as this year’s upgrades and changes are particularly extensive.

This video is an example of how it’s possible to take even one feature, designed to assist in one form of photographic expression, dig deeper into it with talent and intelligent use of experience, and take the resulting images to a whole other level:


iPadOS 15 Preview: Get Ready for AI and Machine Learning that will Blow You Away

Boring? Are you kidding me? Time to look under the hood…

Somewhere in the land of media herding there was a familiar refrain. iOS 15 and iPadOS 15 are “boring”. Apparently the idea behind this is that there is no single feature that changes the entire experience of the iPhone or iPad – no “killer app” or killer upgrade.

The “boring” crowd have focused on things like “you can banish your ex from memories in iOS 15”. I saw a slew of articles with a variation on that title.

The biggest problem with that attitude, which must have originated with someone who has not really been hands-on with any of the new iOS software (still in developer-only beta), is that it’s not true. (A public beta is expected in July, but it is not recommended unless you are a developer testing on non-critical devices.)

Why? Because there are so many killer upgrades that the avalanche of amazing new features and improvements is, frankly, overwhelming. This article will attempt to illustrate that by focusing on only one feature inside one built-in app: Memories, inside the Photos app.

First, a short digression. We have been testing on several devices, including a 15” MacBook Pro from 2017, an original first-generation iPad Pro (2015) and an iPhone XS Max from 2018. None of these machines has the new Apple Silicon chips, and for that reason they can only run the upgraded features that don’t require them.

That makes the improvements possible without buying any new hardware even more amazing. Stunningly, of the three devices we upgraded, the MacBook Pro was the most stable right out of the gate. Any beta software will have bugs, glitches and occasional crashes, but that does not prevent one from testing out the new features.

On the iPad Pro, to make a non-technical observation, it almost appears as if the screen resolution has been increased. That is obviously not possible, but, as you will read below, the impression could be part of a stunning emphasis on increased beauty, sensuality and luxurious feel in the new suite of OSes.

Memory movies in iPadOS 15 are an amazing example of how AI and machine learning are evolving

For those not familiar with “Memories” they are auto-generated film clips that can be found in the “For You” tab in your photos app on iPhone and iPad. While you are sleeping this feature scans everything in your photos library and uses artificial intelligence, machine learning and neural networks to choose and edit the clips, as the name says, for you.

One unconfirmed but almost certain technical backdrop to this is that the learning improves even between OS updates. Not only that, but all Apple devices on Earth are “cooperating” to help each other learn. That’s a powerful force spread across over 1.65 billion devices.

This feature was added in iOS 12 but started to function on a much higher level in iOS 14. If you had tested and used the feature over the last few years, as we have, you’d have noticed that the AI’s ability to “see” and select photos and videos to include was limited and, at times, comical. Not anymore.

Much of the data that clues the software in as to which photos belong together comes from the embedded metadata. The date, time and location information tells the AI that you took a group of images or videos on a particular day in a particular place.
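
Apple hasn’t published how Memories clusters a library, but the metadata-driven first pass described above can be sketched in a few lines of Python. Everything here (the field names, the one-day/same-city grouping rule) is a hypothetical illustration, not Apple’s actual algorithm:

```python
from collections import defaultdict
from datetime import date

# Each photo record carries the metadata the article mentions:
# capture date and location. (Field names are invented for illustration.)
photos = [
    {"file": "IMG_001.jpg", "taken": date(2021, 6, 5), "city": "Knoxville"},
    {"file": "IMG_002.jpg", "taken": date(2021, 6, 5), "city": "Knoxville"},
    {"file": "IMG_003.jpg", "taken": date(2021, 6, 7), "city": "Amsterdam"},
]

def group_candidate_memories(photos):
    """First pass: photos from the same day in the same city belong together."""
    groups = defaultdict(list)
    for p in photos:
        groups[(p["taken"], p["city"])].append(p["file"])
    return dict(groups)

memories = group_candidate_memories(photos)
# The two Knoxville shots from June 5 end up in one candidate memory;
# the Amsterdam shot forms its own.
```

The harder steps the article goes on to describe, judging photo quality and recognizing subjects, would then refine these metadata-based candidate groups.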

The difference in iPadOS 15 (and on iPhone too, of course) lies in the more difficult tasks, such as recognizing the subjective quality of one photo versus another (humans often take several photos of the same scene to try to capture the best of the bunch) or, more importantly, identifying who and what are the subject of a photo.

All of this began to get interesting in iPadOS 14, when many groups of photos and videos were already being chosen, edited and enhanced by the software to a fairly impressive level.

AI and aesthetics collide, and the result is a joy to witness

Something that is becoming a thread, and a definitive direction Apple is taking, particularly with the iPad Pro series, is, true to the name, a Pro level of visual production and manipulation throughout the OS.

Center Stage, for example, and many other video- and photo-related upgrades were among the big features of the newest generation of iPad Pro. Those are great, but they require a new iPad along with the OS upgrade.

When it comes to the memory movie clips, what we found is that even on the oldest iPad Pro from 2015, the evolution of the software, driven by the AI’s constant learning, has already taken a huge step forward: it does all the things it was already doing, only much better.

Apple’s upgrade takes that and gives it an additional kick up a notch with something the company is known for: good taste.

What has changed specifically?

In iPadOS and iOS 14 a few things felt awkward about the way movies were created. The biggest shortfall was in the software’s ability to deal with various aspect ratios.

These days when we shoot photos and videos with an iPhone it is tempting and, at times, wonderful to use the vertical orientation. Other times, for landscapes and other scenes we might prefer a traditional film aspect or even use the panorama feature to get an ultra-wide screen “cinema-scope” style.

Until now this was handled very poorly by the software. Mostly the photos would constantly zoom in (the so-called “Ken Burns” effect), and if shown without zooming, a vertical portrait shot would get ugly side bars (a pillarbox, i.e. vertical letterbox, effect).

The zooming, and most of the effects in general, degraded the resolution, and therefore the quality, of many photos by enlarging them.

Additionally, the effects that were added, while cute and fun, were little more than novelties, and not what a human editor would likely use. All of this gave the whole process a novelty feel: nice to have, but many never even bothered to look at the movies the software created for them.

That’s about to be over.

A whole new array of options for the AI to use while trying to entertain

In iPadOS 15, as can be seen in the photos and videos in this article, the way the software solves the aspect-ratio issue described above is genius and, dare I say it, beautiful.

Between the AI and the software itself, there is now a whole new bag of tricks, and, boy, does it work. One fantastic feature is the letterbox generator for wide-screen photos in any aspect ratio.

How this works is that the software takes the iPad’s aspect ratio, uses the photo at its original 100% resolution, and then adds a letterbox. But these are not the usual plain black bars we are all familiar with: the software and AI analyze the photo and create a custom gradient letterbox that can be any shade or color.
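
Apple hasn’t said how the gradient is derived, but one plausible approach can be sketched as a pure-Python toy. The edge-sampling heuristic below is purely an assumption for illustration, not Apple’s actual method: sample the pixel colors along the photo’s top and bottom edges, then blend the bars between their averages.

```python
def average_color(pixels):
    """Mean RGB of a list of (r, g, b) tuples."""
    n = len(pixels)
    return tuple(sum(p[i] for p in pixels) // n for i in range(3))

def letterbox_gradient(top_edge, bottom_edge, steps=4):
    """Blend from the top edge's average color down to the bottom edge's.

    Returns one RGB color per horizontal band of the letterbox, so the
    bars fade smoothly instead of being flat black.
    """
    top = average_color(top_edge)
    bottom = average_color(bottom_edge)
    bands = []
    for s in range(steps):
        t = s / (steps - 1)  # 0.0 at the top band, 1.0 at the bottom band
        bands.append(tuple(round(top[i] * (1 - t) + bottom[i] * t) for i in range(3)))
    return bands

# A warm sky along the top edge, darker ground along the bottom:
bands = letterbox_gradient(
    [(200, 150, 100), (220, 160, 110)],
    [(40, 40, 60), (60, 50, 70)],
)
# bands[0] matches the top edge's average color, bands[-1] the bottom's,
# so each bar appears to grow out of the photo itself.
```

A real implementation would of course work on full image buffers, but the idea is the same: the bars take their colors from the photo, which is why they blend so gracefully.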

Photos in clip above courtesy of The 2021 International Portrait Photographer of the Year
Copyright © 2021. www.internationalportraitphotographer.com

The effect is often astoundingly tasteful and frequently makes the original photo look even better. We tested it on award-winning photos (video above) and the result is, basically, art. Our own “nice” photos, chosen entirely by the AI and software, look amazing as well.

Actually, all the photos and videos in the clips generated from the library look much better than I remembered. That turns out to be because the software and AI now perform automated color grading on all the photos and videos in every generated memory!

Color grading, also known as color correction, especially for video, has traditionally required an expensive expert and high-end software (and hardware) to enhance and color-match various photos and clips that were often taken at different times and places, under varying lighting conditions, and sometimes with different cameras.

AI and machine learning software in iPadOS 15 (and iOS 15) now has a virtual colorist actively adjusting your shots, enhancing and color-matching them while you sleep. That is basically insane, and it’s probably why it appeared that the photos, and even the iPad itself, had been upgraded.
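
We can only guess at Apple’s pipeline, but the classic automated approach to color matching, and a plausible stand-in for what a “virtual colorist” does, is statistical transfer: shift and scale each color channel of a clip so its mean and spread match a reference shot. A minimal per-channel sketch (a short list of brightness values stands in for a full image):

```python
from statistics import mean, pstdev

def match_channel(source, reference):
    """Shift/scale `source` values so their mean and spread match `reference`.

    This is the per-channel core of Reinhard-style color transfer, used here
    as an illustrative stand-in for automated color grading, not as Apple's
    actual method.
    """
    s_mu, s_sigma = mean(source), pstdev(source)
    r_mu, r_sigma = mean(reference), pstdev(reference)
    scale = r_sigma / s_sigma if s_sigma else 1.0
    return [(v - s_mu) * scale + r_mu for v in source]

# A dim, flat clip re-graded to match a brighter, punchier reference:
dim = [50, 60, 70]
reference = [100, 140, 180]
graded = match_channel(dim, reference)
# `graded` now has the reference's mean and spread, so the two shots
# cut together without a visible jump in exposure or contrast.
```

Run per channel across every clip in a memory, a transform like this is cheap enough to apply overnight, which fits the “while you sleep” behavior described above.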

OK, I could go on and on about that one feature, but let’s move on. There are also new effects that vary with each memory (and there are a lot more clips being generated, including multiple versions of the same idea to choose from).

In our experiments so far the effects are clearly better and more subtle than in iOS 14. Again, in many cases I found myself saying the word “beautiful” when I tried to find an adjective to describe the results.

For shots with a vertical bias there’s a vertical geometric split-screen effect, often with a thin black border; it has a kind of ’60s-on-steroids feel, with the bars sliding in and out and resizing into place.

Another effect not seen in iOS 14 is a kind of circular rotation, great for landscapes. It’s not a common effect, probably because it is computationally complex, but for the AI it’s a snap. Sometimes it comes with a blur-dissolve added, which makes it fun and, again, still tasteful.

It appears that not only are the effects better and more numerous, but they also evolve and adapt to the content; that is to say, the speed and depth of each changes with the music and with the photo and video content.

Oh, and the music. OMG. Each clip has six songs pre-selected, and the entire clip adapts, in real time (!), when you change the song, showing you various styles and looks that match. Apple Music is apparently also connected if you have a subscription.

As a matter of fact, it is hard to be certain, as we have only had a few hours to test this, but nearly everything appears to be “live” and constantly evolving in real time. To “freeze” a version of a memory you have to “favorite” it (with the typical heart symbol) and then “add to memories” in order to edit it (change the names, choose more images, or remove anything not to your liking).

There is so much more not yet mentioned here: this article could probably be a book

The AI is also getting creative with names and “concepts” for the clips. For example, if you had lunch (or took photos) over the years in the same city (for me it was Knoxville, TN) it might look at the coincidence that you tended to take photos around midday in that town and then create a memory clip called “Lunch in Knoxville over the Years”. Or for example the clip at the head of this article: “Golden Hour Over The Years”.

This is an early and primitive foretaste of the literary ambitions of AI. In the new Photos app in iOS 15 it is beginning to “think” about when, where and why humans take photos and videos, and then to conceive a story that fits the behavior it is witnessing.

Other titles go beyond the basic “Amsterdam in 2016” and start to use the AI’s understanding and visual ability to “see” what is in the photo to create a clip like “Playing in the Snow at Christmas.” Snow? Does it know it’s cold? Maybe just that it’s white and happens in the northern hemisphere in December. This is just the very beginning of something that will evolve, hourly, from now on. I can’t wait.


Apple’s free upgrades are inviting you into the Metaverse: iOS 15, macOS Monterey & iPadOS 15

Above: Photo Credit – Apple / Lynxotic

Sci-fi sounding, inevitable upgrade for today’s online communications

You might have heard lately about the “metaverse” and yet that can mean a variety of different things to different people. Often, it’s a term that relates back to gaming and 3D augmented reality enhancements of networked communications.

There are even crypto and blockchain related projects using this term and concept. While all of these various factors are welcome, and potentially part of this next phase of convergence of communication via networked technology, there’s something else happening under the surface.

”The pandemic, with its requirements of physical distancing, has brought people into online digital environments for a growing range of shared human experiences.” — Wired UK from “The Metaverse is coming” by David Baszucki

The acceleration in AI application, machine learning, and converging use cases for all communications tech has created a situation where the entry-portal to the emerging metaverse is already here.

One often overlooked aspect of a transition to a more complete digital life is the need for humans to adapt to the idea and its potential benefits. This is happening via many routes, including Apple’s constant, synergistic upgrade cycles, which have just entered a new, bigger phase with the migration to a unified OS structure built around Apple Silicon.

The gradual increases in iOS functionality and user sophistication are changing how we interact

iOS 15, previewed this week at WWDC 2021, is rolling out literally dozens of new features, many based on machine learning, neural networks and AI, that offer a new level of highly sophisticated options for communicating with video, photos and text.

While this mixture of “basic” media has been the staple of our current modes of online communication, particularly via social media, the vastly increased depth of new options and functionality in iOS 15, iPadOS 15 and macOS Monterey will make all modes of communication feel completely new.

In the evolution of online media and enriched communication (OMEC, to coin an ’80s-sounding acronym), the slow and uneven progress depends on many factors. The first is always user adoption and sophistication.

Second is the quality of the hardware devices and software upgrades each user around the world has access to. In the case of Apple’s platforms (iPhone, iPad & Mac), the rapid adoption of upgrades is a large factor on the plus side, helping new innovations arrive in general use more quickly.

The last factor, a huge one, is access to fast ubiquitous internet data connections, and, in the US at least, this is less consistent than ever (or our expectations are rising faster than the build out).

However, particularly in Asia, 5G is beginning to make a dent. Satellite broadband, like Starlink, should also start to be a factor as early as 2022, and government infrastructure build-out funding and subsidies in the US are on the way in 2021.

Augmented 3D features are still growing but will merge with 2D

The upshot is that the “2D” factors and increasingly sophisticated manipulation and interactive features already coming in iOS 15 will bring us all closer to the entry-portal, stone-age version of the metaverse.

We all depend more and more on communications and on our devices for work-from-home, personal, business and hybrid activities (such as the emerging content-creator class). Often, as a result, we have fewer opportunities to go offline for “organic” RL (real-life) interactions.

The increasingly sophisticated capabilities available are beginning to make even face to face communications, particularly in work situations, feel “un-enhanced” as we become accustomed to and dependent on the digital enhancements and potential of a full media rich interaction.

This is an example, one could say, of the subtle encroachment of the emerging metaverse onto the “real world” and how the boundaries are blurring and even beginning to disappear.

Rather than a sudden “jump” into a metaverse, similar to the cliché sci-fi plots of films like “Ready Player One”, what is happening is a nearly imperceptible transition to metaverse-like experiences that will become commonplace, initially in a primitive form, and then eventually become the norm. It’s like the proverbial frog in a pot, with water that warms so slowly the frog doesn’t notice until it finds itself swimming in a pot that is already boiling.

The misconception that a “killer app” or sudden shift into an online, virtual reality world, is the future, and that a big leap will happen nearly all at once, is harmlessly superimposed on the real transition that has already begun.

When Apple’s 2007 launch of the iPhone changed communication forever: the journey began

The new “Digital Legacy Program”, also announced at WWDC 2021, is another hint that we are already living in an extremely primitive version of the metaverse. Our online identity, data, and even behaviors and experiences are so essential and all-pervasive that it has become necessary to keep a digital key to the huge trove of personal data we will leave behind for our living loved ones after we are gone.

That means the metaverse is not only creating a parallel digital universe for us to live in, in an ever more complete and sophisticated way; we are also already setting up the eternal storage of our virtual life experiences to be passed down to future generations.

Though nearly invisible while in such a relatively primitive iteration, the concept, an example of overlapping advancements in innovation, is a tiny step towards digital immortality.

The metaverse could help to save us all

It’s not just professional and work-related communication that drives the gradual deepening of networked communication options; even more so, casual and leisure communication and interaction are key.

TikTok and other video communication trends are at the forefront of user evolution and metaverse activity expansion. When people feel motivated to find new and better ways to communicate using richer media and augmented techniques for fun, and to gain more recognition in online societies, that advances digital sophistication.

This process of the evolution of user comfort and sophistication, while existing and interacting in the metaverse, is the fastest way for the augmentation to become more effective.

There’s a mostly unseen benefit to, and need for, this otherwise seemingly pointless global development

The challenges the world faces are many: encroaching, devastating fallout from global warming and excess carbon in the atmosphere, political corruption and inequality, disinformation and cybercrime, and so on.

Ultimately, unlike at any other time in human history, we are facing a challenge in which the survival of our species, and even the planet, is at stake.

In the years and decades to come it will become more and more obvious that only two paths are possible: one toward a kind of Utopia, and another that will lead, inexorably, to Oblivion.

Though the metaverse is scary in many ways, and does not always appear as a way to a better life, augmented and enhanced communication is one of the most desperately needed ways that solutions could eventually be discovered and implemented.

And that would put this progression and evolution of tech more in service of Utopia, and could be at the heart of a rescue plan to prevent Oblivion, before it’s too late.

https://www.apple.com/newsroom/videos/universal-control/Apple-Universal-Control-cc-us-_1280x720h.mp4
Above: Craig Federighi Demo Video at WWDC 2021



iOS 15: It’s not just about the new Weather Animations, there’s a lot more

For what seems like a long time many of us have been living inside our iPhone, immersed in a metaverse of our digital lives.

And the deeper into Apple’s walled garden we are submerged, the more monumental the yearly OS upgrades become. That’s because, when you live a digital life, well, lots of things are worse than in the “real” world. The sensory experience is built of fractions of the full sensory bandwidth of life.

But there’s one saving grace about the metaverse: since it’s artificial and human-engineered, it can, and does, improve.

In the case of Apple’s universe, the yearly upgrades and constant, sometimes nearly imperceptible changes in a thousand different parameters add together, over time, and suddenly, the world comes alive with vibrant, super sensual satisfaction …

Sure, the weather animations just got kicked up another level of better-ness, that’s true. But add this to the thousands of other better feelings and deeper interactions with ourselves and our devices, and you will find: the future.

Photo credit: Apple

WWDC 2021 was a pure upgrade fest with a lot of detail to sift through

We are in the middle of our ongoing coverage of the Apple event and all that was revealed. There are so many features, and so many important details and interdependent uses for these features, that it is more easily digested in bites.

What we are witnessing is the growing interdependence and interoperability of iOS 15, iPadOS 15 and macOS 12 Monterey, particularly in the built-in Apple apps they all share.

Safari, though still with slight variations between the three OSes, is becoming more powerful everywhere; FaceTime got a huge upgrade in the new systems; and utilities connected to iCloud, such as the Find My network, have also been extensively revamped.

While some find the sheer breadth of Apple’s hardware, software and services conceptually off-putting, it is, at this early stage of the monumental changes being wrought by Apple Silicon, a wonder to behold how all the various products, and the underlying software for those products, are evolving in a way that is constant and deep.

As put forth in articles published by Lynxotic years ago, the changes underway are vast and were conceived and put into motion based on Steve Jobs’ core concepts for the future of Apple many years ago. Tim Cook and the rest of Apple have not deviated from that vision, and in fact are reaping benefits on behalf of users that could barely be imagined a decade ago.

One bite we’ve started to delve into is the pair of interdependent features in macOS Monterey: AirPlay to Mac and Universal Control. It turns out that complete interoperability for AirPlay to Mac is still in the future; the models and vintages it functions on are as follows:

  • a 2018 or later MacBook Pro or MacBook Air
  • a 2019 or later iMac or Mac Pro
  • an iMac Pro
  • a 2020 Mac mini

As you can see, this is a fairly exclusive list. Most conspicuously missing is the possibility of using an older Mac, such as a 2018 27” 5K iMac, to take advantage of its beautiful screen.

Universal Control, meanwhile, appears to work with most devices that run iPadOS 15 or macOS Monterey. It allows you to use a single mouse and keyboard and flow from iPad to Mac and back, pretty much as you would imagine using the cursor and keyboard for either, and, thankfully, there is no setup required.

FaceTime just got a Facelift

FaceTime’s big jump ahead is somewhat more complex, since the iPhone, iPad (various models of both) and the Mac each have a UX and screen size that vary, as well as different computing advantages. One interesting note on the technical enhancements announced at WWDC 2021: pretty much across the board, M1 and other Apple Silicon devices get the biggest boost from all the new capabilities.

Rather than being a marketing ploy (at least so far there’s no evidence of that kind of approach), this is an organic by-product of the underlying “big picture” goal: to unify the experience and potential of the three device categories even as they cross-pollinate one another.

The list of OS flavors is gradually becoming hard to track: macOS 12 Monterey, iOS 15, iPadOS 15, tvOS 15 and watchOS 8, plus all the accessories that benefit from the upgrades, such as AirPods Pro spatial audio, the HomePod mini’s liaison with Apple TV 4K and tvOS 15, and SharePlay, which lets multiple FaceTime users share streaming audio or video content for a synchronized experience.

Please stay tuned for the many articles to come that will further dive into the changes and improvements that are on the way, free of charge, for Apple users with this massive roll-out that will culminate in fall 2021.

As per Apple:

Redesigned Weather and Notes Apps

Weather includes more graphical displays of weather data, full-screen maps, and dynamic layouts that change based on conditions. Beautifully redesigned animated backgrounds more accurately reflect the sun’s position and precipitation, and notifications highlight when rain or snow starts and stops. Video animation below:

https://www.apple.com/newsroom/videos/apple-iphone12pro-ios15-weather-app/large_2x.mp4

Notes adds user-created tags that make it easy to quickly categorize notes, and mentions allow members of shared notes to notify one another of important updates. An all-new Activity view shows the recent history of a shared note.



New iPhone Separation Alerts will warn you if you’re about to lose your AirPods Pro

Above: Photo Credit / Apple

This is a feel-good story if it works and then it’s Hallelujah!

OK, other than to be a benign stalker, or to monitor your girlfriend’s actual ETA, do you even use the Apple “Find My” network? Were you aware that it applies to people, devices and now “items” using AirTags? And that you can also find yourself, in case you don’t know where you are?

Well, all that has been expanded into a new realm, at least for those that want to hang onto expensive audio gadgets as long as possible. You guessed it, Apple has now added AirPods Pro and AirPods Max to the devices list.

This was announced at WWDC 2021, but it is not, I repeat, not, something you need to wait for a software upgrade to use. It’s live now if you have iOS 14.

Above: Photo Credit / Apple

Why the jubilation? I think because so many of us suffer from extended anxiety, even early symptoms of PTSD from the thought of losing those $250 or yikes(!) $549 babies.

Losing both, or even one, of these could be a life-changing event, triggering depression, doubt and sadness. And Apple appears to understand this; indeed, the company assures us that not only is the Find My network supposed to geolocate your AirPods, there is even a new feature called “Separation Alerts”.

They specifically describe this feature thusly: “Separation Alerts notify a user if they leave an AirTag, Apple device, or Find My network accessory behind in an unfamiliar location…”

“Separation Alerts”, admittedly, sounds like a term for childhood trauma but, again, if you walk out of a Starbucks on 5th Ave. in NYC and you get a notification that you are sans your AirPod Pros, hallelujah, all is not lost, instead, you are running back in to grab those babies, STAT! (From the Latin word statim, meaning “immediately.”)

These subtle, free and unheralded improvements are seemingly the wave of the future, if you are an inhabitant of the Apple ecosystem, which is on the verge of turning into a kind of pre-metaverse training ground for future digital life pioneers.

Hardware Schwardware updates, who needs ’em?

WWDC2021 was, perhaps cleverly, devoid of hardware news or release announcements. This trick worked like a charm. Various top and sub-top media outlets spent the weeks ahead of the show and keynote, touting leaks, speculation and sources regarding, mostly, what turned out to be non-existent hardware updates.

Those will no doubt still be forthcoming, just not at the WWDC2021 keynote, as that ship has sailed.

Instead, and this is a good thing, we can all begin the journey to learn more about what is in store for the annual pilgrimage to the next generation of software enhancements for the Apple ecosystem, meaning macOS 12 Monterey, iOS 15, iPadOS 15 and all the rest too numerous to name here.

See our latest posts from the show for more details.

As per Apple release:

”Find My introduces new capabilities to help locate a device that has been turned off or erased, as well as live-streaming locations for family and friends who choose to share their location. Separation Alerts notify a user if they leave an AirTag, Apple device, or Find My network accessory behind in an unfamiliar location, and the Find My network now supports AirPods Pro and AirPods Max. A new Find My widget offers an at-a-glance view directly from the Home Screen.”

Links to latest related posts:


Find books on Music, Movies & Entertainment and many other topics at our sister site: Cherrybooks on Bookshop.org

Enjoy Lynxotic at Apple News on your iPhone, iPad or Mac.

Lynxotic may receive a small commission based on any purchases made by following links from this page

FaceTime gets Portrait Mode in iOS 15 to give the look of DSLR prime lens systems

The “Pro-Vlogger” look popularized on YouTube now available to all…

Above: The stunning Portrait mode from the Camera app is now optimized for video calls in FaceTime.
Photo Credit / Apple

If you are a prolific FaceTime user, or if you don't use it as much as you would if the aesthetics were a bit better (read: more flattering selfie styles), you are in luck. In a welcome twist, the tech that was initially created to make Portrait mode a reality in the iPhone camera app is now coming to FaceTime on iPhone and iPad.

The maturation of features across platforms is paying big dividends

Portrait mode was added to the camera app as a way to get a DSLR-style "prime lens" look with "bokeh," a Japanese term for the beautiful out-of-focus background blur that a long lens focused on a foreground subject will produce.

The computational fireworks required to produce this effect are nothing short of… well, check out Apple's description:

“It’s a depth-guided, people-focused segmentation mask generated from a proprietary Apple neural network trained to detect people. It separates an individual in the foreground from whatever is in the background, with greater detail and clarity than with the depth map alone. It achieves this clarity in part because the matte image has higher resolution than the depth map.”


So this effect, which has been in the iPhone camera app since iOS 12, can now, likely due to the ever-beefier potential of the proprietary Apple neural network, be applied to video starting with iOS 15 and iPadOS 15. Live.

New features in FaceTime help users look and sound their best. – credit: Apple

This, like so many other upgrades revealed today at WWDC2021, is a great idea. High-end YouTube videographers know it's a great idea, which is why they buy special DSLR cameras with prime lenses just to get the very beautiful and flattering effect of a sharp subject in the foreground and a compressed, blurred "bokeh" effect in the background.


Not only visual but also audio upgrades are coming

Apple is also adding another obviously useful feature, "spatial audio," which creates the effect of having the perceived source location of each speaker match where they appear on the screen.

This is combined with "new microphone modes," which can reduce background noise and audio interference in a chaotic sound environment or, alternatively, when appropriate, pick up an entire soundscape all at once.

All in all, these improvements to both the visual and audio experience are a much-needed change from the often ugly reality of bad-webcam, Zoom-style meetings we all endured during 2020.

And with the front-facing camera, lighting and software beautifications constantly getting better, those of us with great internet and high-end devices can look forward to a much more aesthetically pleasing level of FaceTime interaction.

Additional new upgraded features for FaceTime include, but are not limited to:

A new grid view that displays participants in a Zoom-like stack of equal-size tiles.

SharePlay, a somewhat odd-sounding option for sharing experiences: listening to Apple Music together, watching a TV show or movie in sync, or sharing a screen to view apps together. Sharing works with anyone using an iPhone, iPad or Mac, and if shared playback controls are active, any of the parties in the session can play, pause or jump ahead.


There's an expanding list of sources that can be used, including, of course, Apple TV, but also third-party services that opt in. According to Apple, the list already includes Disney+, ESPN+, HBO Max, Hulu, MasterClass, Paramount+, Pluto TV, TikTok, Twitch, and many others.

https://www.apple.com/newsroom/videos/apple-ios15-shareplay-music/large_2x.mp4

FaceTime calls that use all of these new features will continue to be end-to-end encrypted, so privacy is not compromised.


Recent related links:


Find books on Music, Movies & Entertainment and many other topics at our sister site: Cherrybooks on Bookshop.org

Enjoy Lynxotic at Apple News on your iPhone, iPad or Mac.

Lynxotic may receive a small commission based on any purchases made by following links from this page