Tag Archives: Memoji

iOS 14 Brings Pins, Threads, Mentions, Memoji and More to Messages App

https://video-lynxotic.akamaized.net/iOS14-Group-Messages-29Sec.mp4
Excerpt from Apple presentation video for Messages from WWDC 2020

New ways to stay connected with Group Messages

Announced in late June at Apple's Worldwide Developers Conference and currently in beta testing ahead of its release this fall, iOS 14, the upcoming operating system for the iPhone, will offer users plenty of new experiences and options. Some of the most exciting changes arrive in the Messages app, where people go to send and receive most of their texts.

For years, iMessage has been a simple and straightforward platform for sharing and exchanging messages on the iPhone. Group chats have always been an option in Messages, but users often opt to download Slack, GroupMe, or Facebook Messenger for more sophisticated or efficient group messaging.

Read More: AirPods Pro will have Spatial Audio and Seamless Switching in iOS 14 update, coming this Fall

iOS 14 aims to change that, as its biggest upgrades in Messages focus on refining the group message experience. First, users will finally have the option to pin conversations in the Messages conversation list. You can select up to nine conversations to keep at the top of the list when you open the app, which will help keep your closest contacts and most important groups highly accessible.

Within conversations, users will also have the ability to create threads and mention people directly. Much like the "Reply" option in Facebook's comment sections, the new inline-reply threads in Messages will let people respond directly to an individual message within a larger group chat. This saves users the trouble of starting a separate conversation when they want to address someone in the group directly. It also makes group chats more efficient, as topics will no longer pile up and interrupt one another without organization.

Mentions and Memoji Updates

https://video-lynxotic.akamaized.net/iOS14-Memoji.mp4
Excerpt from Apple presentation video for Memoji from WWDC 2020

As for the mention feature, people will now be able to call out individuals within the group. Much like how on Slack or Facebook you can highlight someone by typing "@" before their name, Messages under iOS 14 will let users do the same, with no "@" symbol necessary: the phone will simply recognize the name when typed. When someone is mentioned within a chat, iOS 14 will send that person a separate notification, so the mentioned individual knows when he or she is brought up and becomes more aware of the conversation. It will be particularly useful when you are mentioned in a thread that you would otherwise ignore or keep on mute.

Lastly, for bells, whistles, and aesthetics' sake, the new Messages will let group chats set a Group Photo and will offer a variety of new Memoji. Group Photos give conversations a fun visual identity as they sit in the app, while the new Memoji add more visual communication options in Messages. They include some 20 new hairstyle and headwear options, face coverings, and additional age options. There will also be three brand-new stickers: a hug, a fist bump, and a blush.

Read More: Tons of Changes in Apple WatchOS 7: “Dance” in Re-named Fitness App at top of list

Many of these new features and more will also be available on the Mac under the new macOS Big Sur, which will similarly come out later this year.

iOS 14 will also feature an updated home screen with widgets and a more organized App Library, as well as a built-in Translate app, digital car keys in the Wallet app, picture-in-picture video, and an upgraded Apple Maps with a long-overdue cycling option. All of this is exciting, but as mentioned, Messages is (and always has been) one of the iPhone's premier utilities. Messaging is at the very core of what makes a smartphone a smartphone.

Therefore, under iOS 14, it looks like the smartphone is about to get a little smarter.


Check out all Lynxotic Apple Coverage

Subscribe to our newsletter for all the latest updates delivered directly to your inbox.

Find books on Big Tech, Sustainable Energy, Economics and many other topics at our sister site: Cherrybooks on Bookshop.org

Enjoy Lynxotic at Apple News on your iPhone, iPad or Mac.

Lynxotic may receive a small commission based on any purchases made by following links from this page.

iPhone 11 Pro Max: Night Mode for all and Elon Musk’s Cyborgs come to life

On Friday we took the new iPhone 11 Pro Max out for a spin and got some Manhattan Beach photos and video footage. Saturday night we ran some Night mode tests, and the results were amazing. Below are reactions and examples.

First, the autopsy. According to Apple, intelligent software and the A13 Bionic chip are what make Night mode possible. Software processing, which in the past would have been considered a compromise, is now an integral part of this new technique for shooting photos in very low light.

In traditional photography, a long shutter-speed setting leaves the shutter open, collecting light, for several seconds, or even hours for very advanced experimental shots. Although the results can be spectacular, the setup requires virtually perfect conditions, making this type of photo "pro only" for the most part.

The long shutter opening is the starting point for Night mode, but that is where the similarities mostly end. Ultimately, the way the iPhone 11 Pro produces ultra-low-light images is the beginning of what is bound to be a long and interesting road: software-assisted real-time image processing using machine learning and all manner of highly sophisticated enhancements only possible within Apple's unique ecosystem of chips (hardware), software and AI.
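To make that idea concrete, here is a minimal conceptual sketch in Swift of the simplest trick behind computational low-light photography: capture several short exposures and average them to suppress sensor noise. This is our own illustration of the general technique, emphatically not Apple's actual Night mode pipeline, which layers frame alignment, intelligent weighting and Neural Engine processing on top.

```swift
// Toy multi-frame fusion, NOT Apple's Night mode algorithm.
// Each "frame" is a flat array of pixel brightness values (0.0-1.0) from a short exposure.
// Averaging N noisy frames reduces random sensor noise by roughly a factor of sqrt(N).
func fuseFrames(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    var sum = [Double](repeating: 0.0, count: first.count)
    for frame in frames {
        for (index, pixel) in frame.enumerated() {
            sum[index] += pixel
        }
    }
    return sum.map { $0 / Double(frames.count) }
}

// Example: four noisy captures of the same dim scene fuse into one cleaner result.
let captures: [[Double]] = [
    [0.10, 0.52, 0.31],
    [0.14, 0.48, 0.29],
    [0.09, 0.55, 0.33],
    [0.11, 0.50, 0.30]
]
let fused = fuseFrames(captures)  // ≈ [0.11, 0.51, 0.31]
```

The hard part, and the part that leans on the A13's machine learning hardware, is aligning hand-held frames and deciding how to weight them; simple averaging is only the starting point.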

September 2019 keynote featuring the iPhone 11 Pro image processing systems using the A13 Bionic chip, GPU and Neural Engine

But in the "real" world it is precisely the simplicity that takes this little upgrade into a whole other head space. First, no tripod is needed (unless you are doing an ultra-long exposure, and even then you can get by with a steady hand). For anyone who has tried long exposures without stabilization, this alone is astounding. Night mode senses when a photo can be enhanced and turns on automatically, and the results can be seen (approximately) live before shooting. Once you shoot there is a delay, generally 1 to 12 seconds but usually 3 to 5, while the light is gathered and the software goes crazy. At the end is an almost noiseless photo that appears as if light was there that really wasn't. It's very odd, but often breathtakingly beautiful.

In the image below you can see the camera app layout during Night mode. At the bottom left is the automatic setting (in this case 3 seconds). Tap that icon and the manual controls pop up (shown on the right side of the image), where you can adjust the exposure time or, if you want to disable Night mode and take a "normal" low-light shot, set it to zero.

iPhone 11 Pro camera app with Night mode active and an auto setting of 3 seconds

Here is where it gets interesting, and a little confusing: newer iPhones already automatically enhance and increase the "simulated exposure" settings. This means that, to get a photo of what the human eye actually sees in a given low-light situation, you either have to use different software with manual settings or simulate the darkness in Lightroom, Photoshop, etc.

So, all the "with and without" Night mode comparisons all over the internet? They are all misrepresenting the "real" situation. To illustrate, we have done a mock-up with the three levels: on the far left is the simulated "human" light situation, in the middle is the shot with Night mode off, and on the right is the Night mode end result:

Below, you see the "human vision simulation" on the left and the basic Night mode enhanced version on the right. The room had essentially zero light in it except the very dim LED Christmas lights on the tree that is the subject.

What are we looking at here? We are talking about some sort of Navy SEAL, ninja, "let's take pictures in the dark 'cause we can" kind of thing! The Apple examples below, while beautiful and fantastic, don't even begin to address the Pandora's box we are opening here:

https://video-lynxotic.akamaized.net/apple-night-mode.mp4
Apple Instagram night mode video

What one feels when exploring this tool is that an increased range of lighting situations suddenly becomes available and viable. The "usual" thought process of what a photo reveals versus what the eye sees is gone, and one has to start thinking in a completely different way about what, when and where a photo can be taken.

Shot in virtually total darkness with the iPhone 11 Pro Max

Nothing is remarkable about the photo above, except that it was pitch black and the sign was unreadable to the naked eye. If this were a shot needed for a lifeguard video and the sun had already set, no worries: sun not required. The end result is "normal," but the idea of shooting in the darkest corners and having the result look "normal" is quite strange indeed.

The shot below is a more traditional type of long-exposure setting. A girl with a flashlight happened to run by, and that is how the decorative line was created. So far, other than a shot in nothing but moonlight, such as the Apple Joshua Tree in the Mojave example in the video above, using Night mode to recreate high-contrast "traditional" long-exposure photos is far less effective than rethinking the entire photo-taking process to incorporate settings that heretofore were simply not viable for any camera, let alone a "cell phone" camera.

We are all Cyborgs and the Door is opening to the Second Wave

Elon Musk said recently that "we are already Cyborgs" because we carry cellphones. The idea, a la Marshall McLuhan, is that our iPhones (and computers and media and all tech) are extensions of our senses and, ultimately, ourselves. So now, just like that, we have Night Vision. Perhaps the use of the word "Bionic" for the A13 chip is no accident. But it is we, iPhone users, who are becoming Bionic.

“We are already Cyborgs”

Elon Musk at “Super-intelligence: Science or Fiction” conference

Here at Lynxotic, we are dedicated to exploring all the new "languages" that can be used online. Photos led to film and film to video, and now, via the internet, we also use various evolving hybrids: animated GIFs, Live Photos, emoji, Animoji, Memoji, and now "Slofies" and day-for-Night-mode shots that represent an experience only possible as a Cyborg or bionically enhanced human.

More and more, so many of us, particularly those of us that work in tech and media, live “inside” our computers and devices. Although our blurry eyes attest to some of the downsides of this life, upgrades, particularly to software relating to the sensual, sensory experiences we share daily, are a very big upside. When you are bionic and live inside your “phone” an upgrade to your vision is nothing less than a tune-up for your soul.

"When you are bionic and live inside your 'phone' an upgrade to your vision is nothing less than a tune-up for your soul."

– D.H.

We believe we are in the ultra-early days of the Second Wave. Just as a grunting caveman was "pre-Shakespearean" in his communication skills, we are searching for new ways to reach out to one another over digital networks, and every new tool or method can change our digital lives irreversibly. Night mode will be one such tool.

Photo shot using the iPhone 11 Pro Max in Manhattan Beach, CA on 9-20-19 with a 52mm-equivalent lens


New Memoji Videos Released In Time For 2019 Grammy Awards

Apple Produces More Singing Heads

Three new Apple Memoji videos hit the street today and, like previous Animoji clips, are basically karaoke music videos. Timed to coincide with the 2019 Grammy Awards, today's batch features heads crafted to look like the original artists themselves, as a way to emphasize the "me" in Memoji.

In previous "Animoji" video clips, Apple used anonymous people who were carefully chosen to match pre-set characters such as the Cat or the Unicorn. Generally, the entire Animoji project appears to be a fringe benefit of the Face ID security system, and a gift from the Steve Jobs Pixar legacy. True or not, in any case, they look like Toy Story-style characters.

It's a mystery why the talking heads always sway from side to side. Perhaps this is meant to show off the head-tracking response, but by now one would think that's a given.

Although the Memoji are extremely cute, and more responsive and adaptable all the time, it looks like third-party apps will have to fill the void if virtual actors with more range are to emerge.

Read More: Disney, Universal and Pixar Films available to Stream in advance of original VOD release date

It would be great to see a talking avatar that has more realism or at least a personality that goes beyond “cute” and “super cute”.

When this concept finally goes beyond fun, games and Karaoke, a wide variety of potential uses could become possible. Just ask Max Headroom.

All the angst over AI taking over service and manufacturing jobs notwithstanding, by now you'd think someone would have created an automated Megyn Kelly. The $69 million payout she received when she was fired could have recouped any prior outlay for research and development. And, hey, at least the animated anchor would have stayed on script.

In November of last year the world's first AI news anchor started working in China. The Chinese- and English-speaking bot "…can read texts as naturally as a professional news anchor," the state-run Xinhua news agency, which created the bot, claimed in a statement. Judge for yourself:

If you know of any third-party apps trying to graduate from cute when it comes to Memoji recording, please contact our video department. Here at Lynxotic News we could definitely use a few new spokes-bots.

