iPhone 11 Pro Max: Night Mode for all and Elon Musk’s Cyborgs come to life
On Friday we took the new iPhone 11 Pro Max out for a spin and captured some Manhattan Beach photos and video footage. Saturday night we ran some Night mode tests, and the results were astonishing. Below are reactions and examples.
First, the autopsy. According to Apple, intelligent software and the A13 Bionic chip are what make Night mode possible. What would once have been considered a compromise is now an integral part of this new technique for shooting photos in very low light.
In traditional photography, a long shutter-speed setting leaves the shutter open, collecting light, for several seconds, or even hours for very advanced experimental shots. Although the results can be spectacular, the setup, requiring virtually perfect conditions, makes this type of photo “pro only” for the most part.
The long shutter opening is the starting point for Night mode, but that is largely where the similarities end. Ultimately, the way the iPhone 11 Pro produces ultra-low-light images is the beginning of what is bound to be a long and interesting road: software-assisted real-time image processing using machine learning and all manner of highly sophisticated enhancements only possible within Apple’s unique total ecosystem of chips (hardware), software and A.I.
But in the “real” world it is precisely the simplicity that takes this little upgrade into a whole other head space. First: no tripod needed (unless you are doing an ultra-long exposure, and even then you can get by with a steady hand). For anyone who has tried long exposures without stabilization, this alone is astounding. Night mode senses that a potential photo can be enhanced and turns on automatically, and the results can be seen (approximately) live before shooting. Once you shoot there is a delay, generally 1–12 seconds (but usually 3–5), while the light is gathered and the software goes to work. At the end is an almost noiseless photo that appears as if light was there that really wasn’t. It’s very odd. But often breathtakingly beautiful.
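Apple has not published the details of its pipeline, but the general computational-photography idea behind features like Night mode is well known: capture many short, noisy exposures and fuse them, so random sensor noise averages away while the faint signal remains. Below is a minimal, purely illustrative sketch of that principle using simple frame averaging (real systems also align frames, reject motion, and tone-map); all names and values here are our own assumptions, not Apple’s code.

```python
import numpy as np

def stack_frames(frames):
    """Illustrative 'night mode' core: average many short, noisy
    exposures to approximate one long, clean exposure.
    (Real pipelines also align frames and handle motion.)"""
    stack = np.stack(frames).astype(np.float64)
    return stack.mean(axis=0)

# Simulate a dim scene plus independent per-frame sensor noise.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 10.0)  # the "true" low-light signal
frames = [scene + rng.normal(0, 5, scene.shape) for _ in range(64)]

single = frames[0]             # one noisy short exposure
stacked = stack_frames(frames) # fused result: same scene, far less noise
```

With N frames, the noise in the averaged result shrinks roughly as 1/√N, which is why the phone needs those 1–12 seconds: it is gathering enough frames for the software to work with.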
In the image below you can see the camera app layout during Night mode. On the bottom left is the automatic setting (in this case set to 3 seconds). Tapping that icon brings up the manual controls (on the right side of the image below), where you can adjust the exposure time or, if you want to disable Night mode and take a “normal” low-light shot, set it to zero.
Here is where it gets interesting. And a little confusing. Confusing because newer iPhones already automatically enhance and boost the “simulated exposure” settings. This means that, to capture what the human eye actually sees in a given low-light situation, you either have to use different software with manual settings or simulate the darkness in Lightroom, Photoshop, etc.
So, all those “with and without” Night mode comparisons all over the internet? They misrepresent the “real” situation. To illustrate, we have done a mock-up with three levels: on the far left is the simulated “human” light situation, in the middle is the Night mode off setting, and finally the Night mode end result:
Below, you see the “human vision simulation” and, on the right, the basic Night mode enhanced version. The room had essentially zero light in it except the very dim LED Christmas lights on the tree in the frame.
What are we looking at here? We are talking about some sort of Navy SEAL, ninja, “let’s take pictures in the dark ’cause we can” kind of thing! The Apple examples below, while beautiful and fantastic, don’t even begin to address the Pandora’s box we are opening here:
What one feels when exploring this tool is that an increased range of lighting situations suddenly becomes available and viable. The usual thought process of what a photo reveals vs. what the eye sees is effectively gone, and one has to start thinking in a completely different way about what, when and where a photo can be taken.
What is remarkable about the photo above is nothing, except that it was pitch black and the sign was unreadable to the naked eye. If this were a shot needed for a lifeguard video shoot and the sun had already set, no worries: sun not required. The end result is “normal,” but the idea of shooting in the darkest corners and getting a “normal”-looking result is quite strange indeed.
The shot below is a more traditional type of long-exposure setting. A girl with a flashlight happened to run by, and that is how the decorative line of light was created. So far, other than a shot in nothing but moonlight, such as Apple’s Joshua Tree example from the Mojave in the video above, using Night mode to recreate high-contrast “traditional” long-exposure photos is actually far less effective than rethinking the entire photo-taking process to incorporate settings that heretofore were simply not viable for any camera, let alone a “cell-phone” camera.
We are all Cyborgs and the Door is opening to the Second Wave
Elon Musk recently said “we are already cyborgs” because we carry cellphones. The idea, à la Marshall McLuhan, is that our iPhones (and computers and media and all tech) are extensions of our senses and, ultimately, ourselves. So now, just like that, we have night vision. Perhaps the use of the word “Bionic” for the A13 chip is no accident. But it is we, the iPhone users, who are becoming bionic.
Here at Lynxotic, we are dedicated to exploring all the new “languages” that can be used online. Photos led to film, film to video, and now, via the internet, we also use various evolving hybrids: animated GIFs, Live Photos, emojis, Animoji, Memoji, and now “slofies” and Day-for-Night-mode shots that represent an experience only possible as a cyborg or bionically enhanced human.
More and more, so many of us, particularly those of us who work in tech and media, live “inside” our computers and devices. Although our blurry eyes attest to some of the downsides of this life, upgrades, particularly to software relating to the sensory experiences we share daily, are a very big upside. When you are bionic and live inside your “phone,” an upgrade to your vision is nothing less than a tune-up for your soul.
We believe that we are in the ultra-early days of the Second Wave. Just as a grunting caveman was “pre-Shakespearean” in his communication skills, we are searching for new ways to reach one another over digital networks, and every new tool or method can change our digital lives irreversibly. Night mode will be one such tool.
Find books on Big Tech, Sustainable Energy, Economics and many other topics at our sister site: Cherrybooks on Bookshop.org
Enjoy Lynxotic at Apple News on your iPhone, iPad or Mac and subscribe to our newsletter.
Lynxotic may receive a small commission based on any purchases made by following links from this page.