Pushing drone footage to the next level

Drone footage. You’ve seen lots of dreamy sequences from high in the sky. But on March 8, 2021, a small Minneapolis company released a 90-second video with footage the likes of which you’ve never seen before. Here’s the local KARE-TV coverage:

Trevor Mogg of Digital Trends adds:

“Captured by filmmaker and expert drone pilot Jay Christensen of Minnesota-based Rally Studios, the astonishing 90-second sequence, called Right Up Our Alley, comprises a single shot that glides through Bryant Lake Bowl and Theater in Minneapolis. The film, which has so far been viewed more than five million times on Twitter alone, was shot using a first-person-view (FPV) Cinewhoop quadcopter, a small, zippy drone that’s used, as the name suggests, to capture cinematic footage.”

Here’s their corporate website and the original tweet.

Oscar Liang has a great tutorial on Cinewhoops.

Johnny FPV has a great first person view overview.

My take: ever had dreams of flying? This might be even better.

How NFTs will unleash the power of the Blockchain

NFT. WTF?

Let’s break this down to the individual letters.

F = Fungible. “Fungible” assets are exchangeable for similar items. We can swap the dollars in each other’s pockets or change a $10 bill into two $5 bills without breaking a sweat.

T = Token. Specifically, a cryptographic token validated on a blockchain, a decentralized database.

N = Non. Duh.

So NFT is a Non-Fungible Token, or in other words, a unique asset that is validated by the blockchain. This solves the real-world problem of vouching for the provenance of that Van Gogh in your attic; in the digital world, the blockchain records changes in an asset’s price, ownership, and other attributes in a distributed ledger that is extremely difficult to tamper with. (Just don’t lose your crypto-wallet.)
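A toy sketch makes the fungible/non-fungible distinction concrete. This is illustrative only (real blockchains implement ledgers very differently): fungible value is just a balance, while a non-fungible token is a distinct asset carrying its own identity and ownership history.

```python
# Fungible: only the balance matters. Any 5 units equal any other 5,
# just like swapping a $10 bill for two $5 bills.
balances = {"alice": 10, "bob": 5}
balances["alice"] -= 5
balances["bob"] += 5  # Bob can't tell WHICH 5 units he received

# Non-fungible: each token is a unique asset with its own identity
# and provenance (an append-only history of owners).
nft_ledger = {
    "token-001": {"asset": "attic-van-gogh.jpg", "owners": ["alice"]},
}
nft_ledger["token-001"]["owners"].append("bob")  # the transfer is recorded

print(balances)                           # {'alice': 5, 'bob': 10}
print(nft_ledger["token-001"]["owners"])  # ['alice', 'bob']
```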

Early 2021 has seen an explosion in marketplaces for the creation and trading of NFTs. Like most asset bubbles, it’s all tulips until you need to sell and buyers are suddenly scarce.

But I believe NFTs hold the key to unleashing the power of the blockchain for film distribution.

Cathy Hackl of Forbes writes about the future of NFTs:

“Non-fungible tokens are blockchain assets that are designed to not be equal. A movie ticket is an example of a non-fungible token. A movie ticket isn’t a ticket to any movie, anytime. It is for a very specific movie and a very specific time. Ownership NFTs provide blockchain security and convenience, but for a specific asset with a specific value.”

What if there were an NFT marketplace dedicated to streaming films? Filmmakers would mint a series of NFTs and each viewer would redeem one NFT to stream the movie. This would allow for frictionless media dissemination and direct economic compensation to filmmakers.
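The mint-and-redeem flow above can be modeled in a few lines. Everything here is hypothetical (the class, its methods, and the placeholder stream URL are mine, not any real marketplace’s API): a filmmaker mints a fixed series of tokens, and each token unlocks exactly one stream.

```python
# A minimal model of the proposed marketplace: mint a fixed series of
# tokens for a film, then let each token be redeemed exactly once.
class FilmNFTMarket:
    def __init__(self):
        self.tokens = {}  # token_id -> {"film": str, "redeemed": bool}

    def mint_series(self, film: str, count: int) -> list:
        """Create `count` unique tokens for one film."""
        ids = [f"{film}-{i:04d}" for i in range(count)]
        for tid in ids:
            self.tokens[tid] = {"film": film, "redeemed": False}
        return ids

    def redeem(self, token_id: str) -> str:
        """Redeem a token for a (placeholder) stream URL; one use only."""
        token = self.tokens[token_id]
        if token["redeemed"]:
            raise ValueError("token already redeemed")
        token["redeemed"] = True
        return f"stream-url-for-{token['film']}"

market = FilmNFTMarket()
ids = market.mint_series("right-up-our-alley", 3)
print(market.redeem(ids[0]))  # prints stream-url-for-right-up-our-alley
```

Redeeming the same token twice raises an error, which is the property that makes one token worth exactly one viewing.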

Here’s a tutorial on turning art into NFTs.

My take: while I think NFTs hold promise in film distribution, the key will be to lower the gas price, the fee paid when minting NFTs in the first place.

Digital Humans coming soon!

Epic Games and Unreal Engine have announced MetaHuman Creator, coming later in 2021.

“MetaHuman Creator is a cloud-streamed app designed to take real-time digital human creation from weeks or months to less than an hour, without compromising on quality. It works by drawing from an ever-growing library of variants of human appearance and motion, and enabling you to create convincing new characters through intuitive workflows that let you sculpt and craft the result you want. As you make adjustments, MetaHuman Creator blends between actual examples in the library in a plausible, data-constrained way. You can choose a starting point by selecting a number of preset faces to contribute to your human from the diverse range in the database.”

Right now, you can start with 18 different bodies and 30 hair styles.

“When you’re happy with your human, you can download the asset via Quixel Bridge, fully rigged and ready for animation and motion capture in Unreal Engine, and complete with LODs. You’ll also get the source data in the form of a Maya file, including meshes, skeleton, facial rig, animation controls, and materials.”

Got that? See documentation.

The takeaway is that your digital humans can live in your Unreal Engine environment. Is this the future of movies?

My take: This reminds me of my experiments in machinima ten years ago. I used a video game called The Movies, which had a character generator (it synced mouth movements with pre-recorded audio), environments, and scenes I could use to record shots and then assemble into movies. See Cowboys and Aliens (The Harper Version) for one example. You know, in these COVID times, I wonder if Unreal Engine’s ability to mash together video games and VFX will become a safer way to create entertainment, one that doesn’t require scores of people to film together in the same studio at the same time.

Shoot your next film in Virtual Unreality

Oakley Anderson-Moore reports for No Film School on How One Studio Is Thriving During COVID (and Why It’s a Big Deal for Indies).

(The studio tour proper starts just before 14 minutes in this promotional video.)

“During the pandemic, one studio stayed open when most others closed. How? L.A. Castle Studios has developed ‘a better way to shoot.’ And owner Tim Pipher believes it’s the way of the future — perhaps no more so than for independent film. ‘I guess some of it comes down to luck,’ explained Pipher to No Film School. His studio has been slammed with work in the midst of the shutdowns. ‘COVID or no COVID, we think we’ve got a better way to shoot.'”

What sets this green-screen studio apart from others is the ability to shoot with a live-composited set.

Simply put, you and your actor can now create inside virtual reality.

How is this possible? It’s achieved by marrying movie making and video game 3D environments. The core software is Epic Games’ Unreal Engine.

See the Unreal Engine website and its Marketplace.

Check out L.A. Castle Studios.

My take: I love this technology! Basically, it’s Star Trek’s Holodeck with green instead of black walls. Keep in mind that, as a filmmaker, you still have to address every component other than location: casting, costumes, makeup, props, blocking, lighting, shot selection, and performance. Do I know any Unreal Engine gurus?

Intel Labs creates photorealistic 3D VR from photos

Jacob Fox on PCGamesN suggests that new tech from Intel Labs could revolutionise VR gaming.

He describes:

“A new technique called Free View Synthesis. It allows you to take some source images from an environment (from a video recorded while walking through a forest, for example), and then reconstruct and render the environment depicted in these images in full ‘photorealistic’ 3D. You can then have a ‘target view’ (i.e. a virtual camera, or perspective like that of the player in a video game) travel through this environment freely, yielding new photorealistic views.”

David Heaney on Upload VR clarifies: “Researchers at Intel Labs have developed a system capable of digitally recreating a scene from a series of photos taken in it.

“Unlike with previous attempts, Intel’s method produces a sharp output. Even small details in the scene are legible, and there’s very little of the blur normally seen when too much of the output is crudely ‘hallucinated’ by a neural network.”

Read the full paper.

My take: this is fascinating! This could yield the visual version of 3D Audio.

Disney scientists perfect deep fakes

We propose an algorithm for “fully automatic neural face swapping in images and videos.”

So begins a startling revelation by Disney researchers Jacek Naruniec, Leonhard Helminger, Christopher Schroers, and Romann M. Weber in a paper delivered virtually at the 31st Eurographics Symposium on Rendering in London recently.

Here’s the abstract:

“In this paper, we propose an algorithm for fully automatic neural face swapping in images and videos. To the best of our knowledge, this is the first method capable of rendering photo-realistic and temporally coherent results at megapixel resolution. To this end, we introduce a progressively trained multi-way (comb network) and a light- and contrast-preserving blending method. We also show that while progressive training enables generation of high-resolution images, extending the architecture and training data beyond two people allows us to achieve higher fidelity in generated expressions. When compositing the generated expression onto the target face, we show how to adapt the blending strategy to preserve contrast and low-frequency lighting. Finally, we incorporate a refinement strategy into the face landmark stabilization algorithm to achieve temporal stability, which is crucial for working with high-resolution videos. We conduct an extensive ablation study to show the influence of our design choices on the quality of the swap and compare our work with popular state-of-the-art methods.”

Got that?

My advice: just watch the video and be prepared to be wowed.

My take: Deep fakes were concerning enough. However, this technology actually has production value. I envision a (very near) future where “substitute actors” (sub-actors?) are the ones who give the performances on set and then this Disney technology replaces their faces with those of the “stars” they represent. In fact, if I were an agent, I’d be looking for those sub-actors now so I could package the pair. A star who didn’t want to mingle with potential COVID-19 carriers could send their doubles to any number of projects at the same time. All that would be left is a high-resolution 3D scan and some ADR work. Of course, Jimmy Fallon already perfected this technique five years ago:

TikTok emerges as worthy Vine replacement

Joshua Eferighe posits on OZY that The Next Big Indie Filmmaker Might Be a TikToker.

Joshua’s key points:

  • “The social media platform is shaping the future of filmmaking.
  • Novice filmmakers are using the platform’s sophisticated editing tools to learn the trade and test their work.
  • Unlike Instagram, TikTok’s algorithm allows users without many followers to go viral, adding to its popularity.”

What is TikTok? The Chinese app claims to be “the leading destination for short-form mobile video. Our mission is to inspire creativity and bring joy.”

Why is TikTok valuable to filmmakers? Consider the hashtag #cinematics, which has 3.7 billion views.

See these risks and this safety guide.

My take: Shorter is better! Remember Vine?

Some SmartTVs to become obsolete

Catie Keck reports in Gizmodo: Here’s Why Netflix Is Leaving Some Roku and Samsung Devices.

She says,

“Select Roku devices, as well as older Samsung or Vizio TVs, will soon lose support for Netflix beginning in December…. With respect to Roku devices in particular, the issue boils down to older devices running Windows Media DRM. Since 2010, Netflix has been using Microsoft PlayReady. Starting December 2, older devices that aren’t able to upgrade to PlayReady won’t be able to use the service.”

Netflix says,

“If you see an error that says: ‘Due to technical limitations, Netflix will no longer be available on this device after December 1st, 2019. Please visit netflix.com/compatibledevices for a list of available devices.’ It means that, due to technical limitations, your device will no longer be able to stream Netflix after the specified date. To continue streaming, you’ll need to switch to a compatible device prior to that date.”

Antonio Villas-Boas writes on Business Insider:

“This has surfaced one key weakness in Smart TVs — while the picture might still be good, the built-in computers that make these TVs ‘smart’ will become old and outdated, just like a regular computer or smartphone. That was never an issue on ‘dumb’ TVs that are purely screens without built-in computers to run apps and stream content over the internet.”

He concludes, “You should buy a streaming device like a Roku, Chromecast, Amazon Fire TV, or Apple TV instead of relying on your Smart TV’s smarts.”

My take: does this happen to cars as well?

The Internet turns 50!

On October 29, 2019, the Internet turned 50 years old.

We’ve grown from the 1970 topology:

to this in 2019:


Okay, here’s a real representation of the Internet.

What’s next? The Interplanetary Internet of course.

My take: It’s important to note that the World Wide Web is not the same thing as the Internet. (The Web wouldn’t be invented for another 20 years!) The Internet is the all-important backbone for the numerous networking protocols that traverse it, HTTP(S) being only one.
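The distinction is easy to demonstrate: HTTP, the Web’s protocol, is just structured text sent over a TCP connection, one application protocol among many riding on the Internet. A minimal sketch, hand-rolling the exact request bytes a browser would write to a socket (illustrative only):

```python
# HTTP/1.1 is plain text over TCP: a request line, headers, a blank line.
def build_http_get(host: str, path: str = "/") -> bytes:
    request = (
        f"GET {path} HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        "Connection: close\r\n"
        "\r\n"
    )
    return request.encode("ascii")

req = build_http_get("example.com")
print(req)  # b'GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n'
```

Email (SMTP), file transfer (FTP), and the rest are sibling protocols built on the same TCP/IP backbone.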

Meet the world’s smallest stabilized camera

Insta360 has released the world’s smallest action camera, the GO. It is so small it’s potentially a choking hazard.

They call it the “20 gram steady cam.”

Here are some specs:

  • Standard, Interval Shooting, Timelapse, Hyperlapse, and Slow Motion modes
  • 8GB of on-board storage
  • iPhone or Android connections
  • IPX4 water-resistant
  • Charge Time: GO: approx. 20 min; Charger Case: approx. 1 hr
  • MP4 files exported via app at 1080p@25fps; Timelapse and Hyperlapse at 1080p@30fps; Slow Motion shot at 1600×900@100fps and output at 1600×900@30fps
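Those specs invite a back-of-the-envelope question: how much footage fits in 8GB? The bitrate below is an assumed figure for illustration only; the actual encoder bitrate isn’t given in the specs above.

```python
# Rough capacity estimate for the GO's 8 GB of on-board storage.
def recording_minutes(storage_gb: float, bitrate_mbps: float) -> float:
    storage_megabits = storage_gb * 8 * 1000  # GB -> megabits (decimal units)
    return storage_megabits / bitrate_mbps / 60

# At an ASSUMED 40 Mbps average bitrate:
print(round(recording_minutes(8, 40), 1))  # 26.7 minutes
```

Whatever the real bitrate, the point stands: this is a camera for short clips, not long takes.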

Some sample footage:

See some product reviews.

You can buy it now for $270 in Canada.

My take: this is too cool! My favourite features are the slow motion and the barrel roll you can add in post. This technology sparks lots of storytelling ideas!