UST stands for Ultra Short Throw

Chinese XGIMI has released Aura, a 4K Ultra Short Throw Laser Projector.

Claiming “Your Next (150″!) TV is Not a TV,” the company states:

“Simply put, AURA revolutionizes the home cinematic experience. This space-saving, stylish laser projector utilizes a laser-powered UST projection 17.3” from any wall, remarkable 4K UHD resolution, and insanely bright 2400 ANSI lumens to provide you a luxurious TV-like experience — without the TV.”

Buy it here: https://ca.xgimi.com/products/aura for only $4K!

My take: Hmm. If you have a spare four grand in your pocket, would you replace your current TV with this, or jet off to Mexico for an all-inclusive holiday? This unit sounds nice, but I’d have to see it in person to judge how large the picture is, and how bright it is in daylight — I’d want to be able to use it without having to close the curtains at noon.

Samsung’s new Freestyle digital projector

Samsung has just introduced a fantastic 1080p digital projector: the Freestyle.

Janko Roettgers of Protocol reports:

  • “The new Samsung Freestyle is a portable projector capable of projecting video from 30 inches to 100 inches. It offers access to the very same UI and apps as any of the company’s other 2022 smart TVs, but that’s pretty much where the similarities to a traditional TV end.
  • Weighing 830 grams, the Freestyle is designed for portability. “It’s about the same weight as a coconut or cauliflower,” Samsung Senior Director of Lifestyle TV Product Marketing Stephen Coppola told me recently. The projector can be powered via USB-C from a wall plug or external battery pack.
  • The Freestyle can be angled to use any free wall space as a screen, including the ceiling. It automatically calibrates the image to keep it in focus, level it and keystone it. “This is the magical feature on this device,” Coppola said.
  • The projector ships with a modified smart TV remote, but can also be controlled with voice commands via a far-field microphone after a voice assistant (Google, Alexa or Bixby) has been enabled.
  • The Freestyle ships with a lens cap that turns it into an ambient light projector, which is a pretty ingenious way of using a TV-like device for something that’s definitely not at all like a TV.
  • Later this year, Samsung wants to sell an optional light bulb socket adapter, further doubling down on this “my TV is a mood light in its spare time” idea.
  • There’s also a built-in speaker, which comes in handy in combination with far-field voice control. “There’ve been smart speakers, but never really a smart speaker with a 100-inch screen attached to it,” Coppola said.”

My take: I think the optional screw-in base is brilliant. Imagine using a goose-neck lamp in your living room to drive this! How long before they come up with a higher-resolution model? I can imagine plenty more uses.

Robot advances

Emma Roth reports on The Verge that robots are making advances: a humanoid robot is making eerily lifelike facial expressions. It’s interesting and a little scary.

She writes:

“Engineered Arts, a UK-based designer and manufacturer of humanoid robots, recently showed off one of its most lifelike creations in a video posted on YouTube. The robot, called Ameca, is shown making a series of incredibly human-like facial expressions.”

But wait, there’s more! Meet Mesmer, even more life-like:

This, of course, builds on the research of Dr. Paul Ekman and his exploration of expression.

His FACS (Facial Action Coding System) is used by major animation studios to bring emotion to their creations.

My take: I wonder if robots will ever develop to the point where we can cast them in movies. I mean, we’re halfway there with CG VFX.

New lightfield lens records depth info

John Aldred reports on DIYPhotography that the K|Lens One lens is about to be released on Kickstarter.

He says:

“The K|Lens One lens, teased earlier this year by German company K|Lens, is finally about to be released on Kickstarter. They say that this is the world’s first light field lens that can be used with regular DSLR and mirrorless cameras — and it works for both stills and video. Designed for full-frame cameras, the lens is a “ground-breaking mix of state-of-the-art lens and software technology” which K|Lens says will open up new worlds of creativity to users.”

The lens shoots nine images at once, with each taking up 1/9th the area of the sensor in a 3×3 grid. Custom software then manipulates those images into the desired result.
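The geometry of that first step is easy to picture. Here’s a minimal NumPy sketch (hypothetical code, not K|Lens’s actual software) that carves a full sensor frame into the nine sub-views of the 3×3 grid:

```python
import numpy as np

def split_into_subviews(frame: np.ndarray, rows: int = 3, cols: int = 3):
    """Carve a full-sensor frame into a rows x cols grid of sub-views,
    each covering 1/9th of the sensor area (for the default 3x3)."""
    h, w = frame.shape[:2]
    sh, sw = h // rows, w // cols
    return [
        frame[r * sh:(r + 1) * sh, c * sw:(c + 1) * sw]
        for r in range(rows)
        for c in range(cols)
    ]

# A stand-in "sensor" frame -- just zeros, not a real capture.
sensor = np.zeros((600, 402), dtype=np.uint8)
views = split_into_subviews(sensor)
print(len(views), views[0].shape)  # 9 sub-views, each 200 x 134 pixels
```

Each of the nine views sees the scene from a slightly different angle, which is what makes depth estimation possible downstream.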

Because this lens turns any camera into a 3D camera it might have application for specific tasks like Visual Effects, where having depth information is vital for compositing.

Aldred adds:

“Interestingly, while all of the software was developed in-house, the lens itself, they say, was developed in cooperation with Carl Zeiss Jena GmbH, who they say will also be doing all of the manufacturing. So, while K|Lens might be a company that few have heard of, it will essentially be a Zeiss lens. And not just their name stamped on somebody else’s product as Huawei did with Leica, as they’re actually making the thing.”

See the company website.

My take: I’ve blogged about the light field a few times over the last decade and I really like the promise. Could it be the end of out-of-focus shots forever? All we need now is a similar “sound field” that would let us capture every sound source at once and later dive into the soundscape to re-record those sources much closer. Right? (Hmm. Is this that?)

Colour Display AR Smart Glasses

Deirdre O’Donnell reveals on Notebookcheck some of the most advanced Smart Glasses yet.

She writes:

“Thunderbird is an augmented reality (AR) -focused start-up supported by the display-centric OEM TCL. Now, the two brands have unveiled something apparently three years in the making: the new Smart Glasses Pioneer Version, with a groundbreaking color micro-LED display geared toward an optimal AR experience. This pair of spectacles is, as the name suggests, the kind of ‘true’ smart glasses that integrate a working, partially transparent display capable of overlaying a mixed-reality display over the wearer’s real-world surroundings. Thunderbird and TCL make the new device sound like a blend of features from the Facebook Ray-Bans and Xiaomi’s own concept Smart Glasses. They do integrate a camera — obtrusively found on the nose-piece — and touch controls on the outside of the ear-hooks to interact with the glasses and the content, phone-like apps, smart-home and -car controls they are rated to sync with.”

My take: These are much better than Google Glass and Snap Spectacles. They’re still too nerdy for me, but they might appeal to someone who wears a Smart Watch. BONUS: here’s the excellent music from the Thunderbird video: Black Math’s Point Blank (Alternate).

Google AI can now enhance low res pix

Remember those laughable TV episodes in which someone asks, “Can you enhance that?”

Well, laugh no more. Google AI has mastered “high fidelity image generation.”

You can just about hear it: “HAL, unlock the enhancing algorithm.”

Google explains their new method:

“Diffusion models work by corrupting the training data by progressively adding Gaussian noise, slowly wiping out details in the data until it becomes pure noise, and then training a neural network to reverse this corruption process. Running this reversed corruption process synthesizes data from pure noise by gradually denoising it until a clean sample is produced.”

Add noise to the picture, and then denoise it?
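Exactly, and the forward half of that recipe is simple enough to sketch. This toy NumPy example (a generic illustration of the diffusion idea, not Google’s SR3 or cascaded models) progressively corrupts a tiny one-dimensional “image”; training a network to undo each step is the hard part the papers address:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_diffuse(x0, betas):
    """Progressively corrupt x0 with Gaussian noise (the forward process).

    At step t the sample is sqrt(alpha_bar_t) * x0 plus noise scaled by
    sqrt(1 - alpha_bar_t), where alpha_bar_t = prod(1 - beta_i) for i <= t.
    """
    alphas_bar = np.cumprod(1.0 - np.asarray(betas))
    return [
        np.sqrt(a) * x0 + np.sqrt(1.0 - a) * rng.standard_normal(x0.shape)
        for a in alphas_bar
    ]

x0 = np.linspace(-1.0, 1.0, 8)        # a toy 8-pixel "image"
betas = np.linspace(1e-3, 0.5, 50)    # noise schedule: gentle early, harsh late
xs = forward_diffuse(x0, betas)
# By the last step sqrt(alpha_bar) is around 1e-3 or smaller, so almost
# nothing of the original signal survives -- it is effectively pure noise.
```

The trained denoiser then runs this movie in reverse: starting from pure noise, it strips a little noise away at each step until a clean sample emerges.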

Here is the Super-Resolution via Repeated Refinement paper.

And the Cascaded Diffusion Models for High Fidelity Image Generation paper.

My take: It was Arthur C. Clarke who said, “Any sufficiently advanced technology is indistinguishable from magic.” Google has just given us more magic. And we so smugly said those enhancing programs couldn’t add resolution back into a pixelated picture. Looks like we were wrong, yet again.

Bourdain speaks from the beyond in new doc

Roadrunner: A Film About Anthony Bourdain, directed and produced by Morgan Neville, was released in the United States on July 16, 2021 by Focus Features. Celebrity chef and TV presenter Anthony Bourdain died by suicide on June 8, 2018, in France while on location, and this film explores his complex psyche.

But a controversy has erupted over the director’s inclusion of an AI-generated voiceover.

Helen Rosner reviewed the film in The New Yorker and noticed something strange:

“There is a moment at the end of the film’s second act when the artist David Choe, a friend of Bourdain’s, is reading aloud an e-mail Bourdain had sent him: “Dude, this is a crazy thing to ask, but I’m curious” Choe begins reading, and then the voice fades into Bourdain’s own: “…and my life is sort of shit now. You are successful, and I am successful, and I’m wondering: Are you happy?” I asked (director) Neville how on earth he’d found an audio recording of Bourdain reading his own e-mail. Throughout the film, Neville and his team used stitched-together clips of Bourdain’s narration pulled from TV, radio, podcasts, and audiobooks. “But there were three quotes there I wanted his voice for that there were no recordings of,” Neville explained. So he got in touch with a software company, gave it about a dozen hours of recordings, and, he said, “I created an A.I. model of his voice.” In a world of computer simulations and deepfakes, a dead man’s voice speaking his own words of despair is hardly the most dystopian application of the technology. But the seamlessness of the effect is eerie. “If you watch the film, other than that line you mentioned, you probably don’t know what the other lines are that were spoken by the A.I., and you’re not going to know,” Neville said. “We can have a documentary-ethics panel about it later.””

Well, the panel has been convened.

In a follow-up article, Rosner writes: “Neville used the A.I.-generated audio only to narrate text that Bourdain himself had written” and reveals the director’s “initial pitch of having Tony narrate the film posthumously à la Sunset Boulevard — one of Tony’s favorite films.”

People seem offended that the director has literally put words into Bourdain’s mouth, albeit his own words. Personally, I don’t have an issue with this but think there should have been a disclaimer off the top revealing, “Artificial Intelligence was used to generate 45 seconds of Mr. Bourdain’s voiceover in this film.”

My take: what I want to know is, how can I license the Tony Bourdain AI to narrate my movie?

Portal installation links two city centres

Futuristic-looking round visual portals have appeared in Vilnius, Lithuania, and Lublin, Poland, allowing citizens to see each other in real time.

The two portals connect Vilnius’s Train Station with Lublin’s Central Square, about 600 km away.

Benediktas Gylys, initiator of PORTAL says:

“Humanity is facing many potentially deadly challenges; be it social polarisation, climate change or economic issues. However, if we look closely, it’s not a lack of brilliant scientists, activists, leaders, knowledge or technology causing these challenges. It’s tribalism, a lack of empathy and a narrow perception of the world, which is often limited to our national borders. That’s why we’ve decided to bring the PORTAL idea to life – it’s a bridge that unifies and an invitation to rise above prejudices and disagreements that belong to the past. It’s an invitation to rise above the us and them illusion.”

PORTAL is a collaboration of the Benediktas Gylys Foundation, the City of Vilnius, the City of Lublin and the Crossroads Centre for Intercultural Creative Initiatives.

More portals are planned, connecting Vilnius with London, England, and Reykjavik, Iceland.

See the official website.

My take: back in the early Nineties (before the Internet caught the public eye) I conceived of a similar network of interconnected public spaces, called Central Square. My vision was similar to Citytv‘s Speakers’ Corner but was to be located in large public outdoor spaces and used to broadcast citizen reports, rants or demonstrations. It would have included sound, which PORTAL seems to have overlooked. I think it was to have appeared on television sets on some of the high-numbered channels. Of course, once increased bandwidth could support Internet video, web cams took off instead. See EarthCam.com for a list.

Pushing drone footage to the next level

Drone footage. You’ve seen lots of dreamy sequences from high in the sky. But on March 8, 2021, a small Minneapolis company released a 90-second video with footage the likes of which you’ve never seen before. Here’s the local KARE-TV coverage:

Trevor Mogg of Digital Trends adds:

“Captured by filmmaker and expert drone pilot Jay Christensen of Minnesota-based Rally Studios, the astonishing 90-second sequence, called Right Up Our Alley, comprises a single shot that glides through Bryant Lake Bowl and Theater in Minneapolis. The film, which has so far been viewed more than five million times on Twitter alone, was shot using a first-person-view (FPV) Cinewhoop quadcopter, a small, zippy drone that’s used, as the name suggests, to capture cinematic footage.”

Here’s their corporate website and the original tweet.

Oscar Liang has a great tutorial on Cinewhoops.

Johnny FPV has a great first person view overview.

My take: ever had dreams of flying? This might be even better.

How NFTs will unleash the power of the Blockchain

NFT. WTF?

Let’s break this down to the individual letters.

F = Fungible. “Fungible” assets are exchangeable for similar items. We can swap the dollars in each other’s pockets or change a $10 bill into two $5 bills without breaking a sweat.

T = Token. Specifically, a cryptographic token validated by the blockchain decentralized database.

N = Non. Duh.

So NFT is a Non-Fungible Token, or in other words, a unique asset validated by the blockchain. This solves the real-world problem of vouching for the provenance of that Van Gogh in your attic; in the digital world, the blockchain records changes in an asset’s price, ownership, and so on in a distributed ledger that is extremely difficult to tamper with. (Just don’t lose your crypto-wallet.)
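That tamper-resistance is worth a closer look. Here’s a toy Python ledger (purely illustrative; a real blockchain adds consensus, signatures, and decentralization) in which every record hashes the one before it, so rewriting any past entry breaks every later hash:

```python
import hashlib
import json

class Ledger:
    """A toy append-only ledger: each entry includes the hash of the
    previous entry, so altering any past record is detectable."""

    def __init__(self):
        self.entries = []

    def record(self, token_id: str, owner: str, price: float):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        payload = json.dumps(
            {"token_id": token_id, "owner": owner,
             "price": price, "prev": prev_hash},
            sort_keys=True,
        )
        self.entries.append({
            "payload": payload,
            "hash": hashlib.sha256(payload.encode()).hexdigest(),
        })

    def verify(self) -> bool:
        prev = "0" * 64
        for e in self.entries:
            if json.loads(e["payload"])["prev"] != prev:
                return False
            if hashlib.sha256(e["payload"].encode()).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True

ledger = Ledger()
ledger.record("van-gogh-attic-001", "alice", 1_000_000.0)  # unique asset
ledger.record("van-gogh-attic-001", "bob", 1_250_000.0)    # ownership transfer
print(ledger.verify())  # True -- chain is intact
```

Rewriting history would mean recomputing every later hash, which on a real distributed chain would require overpowering the whole network.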

Early 2021 has seen an explosion in marketplaces for the creation and trading of NFTs. Like most asset bubbles, it’s all tulips until you need to sell and buyers are suddenly scarce.

But I believe NFTs hold the key to unleashing the power of the blockchain for film distribution.

Cathy Hackl of Forbes writes about the future of NFTs:

“Non-fungible tokens are blockchain assets that are designed to not be equal. A movie ticket is an example of a non-fungible token. A movie ticket isn’t a ticket to any movie, anytime. It is for a very specific movie and a very specific time. Ownership NFTs provide blockchain security and convenience, but for a specific asset with a specific value.”

What if there was an NFT marketplace dedicated to streaming films? Filmmakers would mint a series of NFTs and each viewer would redeem one NFT to stream the movie. This would allow for frictionless media dissemination and direct economic compensation to filmmakers.
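A hypothetical sketch of that mint-and-redeem flow (all names invented for illustration; a real marketplace would implement this as a smart contract on-chain):

```python
class ScreeningRun:
    """Mint a fixed series of single-use tokens for one film;
    each redemption grants one stream and burns the token."""

    def __init__(self, film: str, editions: int):
        self.film = film
        self.unredeemed = {f"{film}#{i}" for i in range(1, editions + 1)}

    def redeem(self, token_id: str) -> bool:
        if token_id in self.unredeemed:
            self.unredeemed.discard(token_id)  # burn: single use
            return True
        return False  # unknown token, or already used

run = ScreeningRun("my-indie-film", editions=3)
print(run.redeem("my-indie-film#2"))  # True  (first use grants the stream)
print(run.redeem("my-indie-film#2"))  # False (already redeemed)
```

Because each token is single-use and individually identified, a filmmaker could cap the edition size, price tiers differently, or let viewers resell unredeemed tokens.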

Here’s a tutorial on turning art into NFTs.

My take: while I think NFTs hold promise for film distribution, the key will be lowering the gas price: the fee paid when minting NFTs in the first place.