Filmed on iPhone

Apple’s October 30, 2023 Event closed with this end credit: “This event was shot on iPhone and edited on Mac.”

That would be filmed on the iPhone 15 Pro Max with the free Blackmagic Camera app and Apple Log, then edited in Premiere Pro and graded and finalized in DaVinci Resolve.

Michael Tobin adds further insight and commentary:

My take: Of course, now Apple has to film everything they do like this!


Using AI for good

Alyssa Miller reports on No Film School that Deepfake Technology Could Help You Change Inappropriate Dialogue in Post.

Flawless AI’s tools include TrueSync, which can be used to replace dialogue or even change the spoken language, all while preserving correct mouth and lip movement.

Flawless TrueSync from Flawless AI on Vimeo.

Lara Lewington from BBC’s Click explores the technology well:

Flawless on BBC Click from Flawless AI on Vimeo.

Other Flawless AI tools include AI Reshoot (“allows you to make dialogue changes, synchronizing the on-screen actors’ mouths to the new lines, without needing to go back to set”) and DeepEditor (“enables filmmakers to edit and transfer an actor’s performance from one shot to another, even if the shots were filmed from different angles”).

My take: this is powerful technology, but I’m not sure how I feel about Robert De Niro’s face speaking German that some other actor is voicing. (Of course, the next iteration of this tech is to clone De Niro’s voice and use that to speak the German. But now we’re really offside.)

The Revolution Will Be Televised!

Blackmagic has introduced Digital Film for iPhone.

“Blackmagic Camera unlocks the power of your iPhone by adding Blackmagic’s digital film camera controls and image processing! Now you can create the same cinematic ‘look’ as Hollywood feature films. You get the same intuitive and user friendly interface as Blackmagic Design’s award winning cameras. It’s just like using a professional digital film camera! You can adjust settings such as frame rate, shutter angle, white balance and ISO all in a single tap. Or, record directly to Blackmagic Cloud in industry standard 10-bit Apple ProRes files up to 4K! Recording to Blackmagic Cloud Storage lets you collaborate on DaVinci Resolve projects with editors anywhere in the world, all at the same time!”
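One of those controls, shutter angle, often puzzles people coming from still photography, where exposure is set as a shutter speed. The conversion is simple arithmetic; here’s a quick sketch (this is the standard film-camera relationship, not anything specific to the Blackmagic app):

```python
def shutter_speed(fps: float, shutter_angle: float) -> float:
    """Exposure time per frame, in seconds, for a rotary-shutter camera.

    A 360-degree shutter exposes for the whole frame interval (1/fps);
    smaller angles expose for a proportional fraction of it.
    """
    return (shutter_angle / 360.0) * (1.0 / fps)

# The cinematic default: 24 fps at a 180-degree shutter = 1/48 s.
print(shutter_speed(24, 180))  # 0.0208333... (1/48 of a second)
```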

The tech specs are impressive.

This is a great way to learn Blackmagic’s menu system.

It’s also a great way to get introduced to Blackmagic’s Cloud.

And it’s a great way to explore the Sync Bin in DaVinci Resolve.

Oh, and by the way, it’s free.

My take: In this update, Grant Petty mentions how multiple filmers at “protests” could use the Blackmagic Camera app on their iPhones and work with an editor to create almost instant news stories; I think this technique could be used during concerts as well.

So, have things changed from fifty years ago? The revolution will be live.

What is the difference between USB-C and Thunderbolt?

Now that Apple has brought a USB-C port to the iPhone 15, it’s time to review how USB-C differs from Thunderbolt.

Similarities:

  • They look alike.
  • They are compatible with each other.

How to identify similar cables:

  • A Thunderbolt connector will have a lightning bolt symbol on it.
  • No lightning bolt? Then it’s a USB-C connector.

Keep in mind:

  • Use Thunderbolt cables between Thunderbolt devices to get the fastest transfer speeds.
  • Thunderbolt 4 runs at 40Gbps, up to four times faster than a typical USB-C connection (USB 3.2 Gen 2 tops out at 10Gbps).
  • USB-C cables marked SS (for SuperSpeed) are faster than USB 2.0 cables.

(See Cult of Mac for more.)

My take: turns out, a cable is not just a cable! I wonder if there’s a plug-in device to tell you if your cable is legit and what speeds it supports. But whatever you do, don’t check out this video if you’re paranoid.
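There’s no cable tester in your pocket, but macOS will at least tell you what link speed each connected device actually negotiated, which is a decent proxy for whether a cable is performing as labeled. A minimal sketch (macOS only; it parses the human-readable output of the built-in system_profiler tool, so the exact formatting may vary by OS version):

```python
import subprocess

# Ask macOS for the USB device tree; SPThunderboltDataType works the
# same way for Thunderbolt devices.
report = subprocess.run(
    ["system_profiler", "SPUSBDataType"],
    capture_output=True, text=True, check=True,
).stdout

device = "?"
for line in report.splitlines():
    text = line.strip()
    if text.endswith(":"):            # bus or device header, e.g. "USB 3.1 Bus:"
        device = text.rstrip(":")
    elif text.startswith("Speed:"):   # e.g. "Speed: Up to 5 Gb/s"
        print(f"{device} -> {text.split(':', 1)[1].strip()}")
```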

Sony tests cameraless virtual production

Jourdan Aldredge notes on No Film School that Sony is Testing Real-Time Cameraless Production for a New Ghostbusters Movie.

He writes:

“Sony Pictures Technologies has unveiled its latest developments in real-time game engine technology with this new proof-of-concept project…. Its “cameraless” virtual production style… intends to allow developers to use this real-time game engine to produce a scene live on a motion capture set.”

Jason Reitman, who wrote and directed the two-minute scene in one day, says:

“I love filmmaking in real places with real actors. So for me, this is not a substitute. If I were to make Juno again today, I would make Juno exactly the same way. What I see here, what thrills me, is if I wanted to make a movie like Juno that took place in ancient Rome, I wouldn’t be able to do that because it would cost $200 million to make Juno. And that’s not cost effective. There’s no studio who would sign up for that. You can make Ghostbusters for a lot of money, but you can’t make an independent film in an unexpected location. You can’t make an independent film underwater, on the moon or, you know, a thousand years ago or into the future, and what thrills me about it is the possibility of independent filmmakers who want to tell their kind of stories, but in environments that they don’t have access to with characters that they don’t have access to, and the possibility of getting a whole new wave of stories that you could read in a book, but you would never actually get in a film.”

My take: While I agree with Jason Reitman that this technology is promising, I think their finished scene is underwhelming. It’s just not believable. For instance, the folks on the sidewalks are obviously from a video game. The traffic is not real world either. And the actor is not human; he’s a marshmallow! However, this might be where superhero comic book movies are going: totally computer-generated, with the faces of the stars composited onto the quasi-lifelike animation. (My nightmare situation: those faces and voices are AI generated from scans and recordings!)

AI delivers on “Fix it in post!”

Michael Wynne, an audio mastering engineer, just claimed on his In The Mix channel: I Found The Best FREE AI Noise Reduction Plugin in 2023.

The tool is a free AI de-noising and de-reverb plugin called GOYO by South Korea’s Supertone AI.

Michael begins:

“I’ve used many of the sort of more premium and expensive dialogue restoration and denoising softwares, and those are very good, but I haven’t come across a free tool that even comes close to what is offered by those. So I was really curious, downloaded it, tried it in my DAW and video editor, and was just completely shocked by the results.”

There are three dials: Ambience, Voice and Voice Reverb. You can solo, mute, decrease or increase each band. Simple and powerful!

His expert tip:

“My favourite way to use these sorts of tools is to dial them in and then print the results. So I would control this to the amount I’d like. I would export that as a new WAV file, take a listen and then work with that so that I know that the next time I open up the session, it’s going to be exactly the same.”

The How to Use page from Supertone is quite sparse so Michael’s examples are great.

My take: the perennial joke on set has always been, “We’ll fix it in post.” Well, now that’s possible for sound! I’ve used this on a short and can attest to its ease of use and incredible results. I concur with Michael that it’s best to print each voice track to a WAV file and re-synchronize it to the timeline, because I found that the effect either did not persist between sessions, reset its values to zero, or was present while the numbers displayed as zero. My other tip is to use only the graphical user interface (not the Inspector), as this seemed to work best. After all, this is a free beta!
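GOYO itself lives inside a DAW or NLE, but the “dial it in, then print it” workflow can be scripted too. Here’s a minimal sketch of a comparable denoise-and-print step using the open-source noisereduce library (no relation to GOYO; assumes a mono WAV and the pip-installable noisereduce and soundfile packages):

```python
import soundfile as sf
import noisereduce as nr

# Load the raw dialogue track (assumed mono).
audio, sample_rate = sf.read("dialogue_raw.wav")

# Reduce broadband noise. prop_decrease plays roughly the role of GOYO's
# Ambience dial: 1.0 removes the estimated noise fully, 0.0 leaves it alone.
cleaned = nr.reduce_noise(y=audio, sr=sample_rate, prop_decrease=0.8)

# "Print" the result to a new WAV so the next session is bit-identical.
sf.write("dialogue_denoised.wav", cleaned, sample_rate)
```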

A Better Green Screen

Devin Coldewey reports on TechCrunch that Netflix’s AI-assisted green screen bathes actors in eye-searing magenta.

Netflix researchers have described an experimental way to create more accurate green screen mattes. They propose lighting subjects with magenta light in front of a green background. Devin says:

“The technique is clever in that by making the foreground only red/blue and the background only green, it simplifies the process of separating the two. A regular camera that would normally capture those colors instead captures red, blue and alpha. This makes the resulting mattes extremely accurate, lacking the artifacts that come from having to separate a full-spectrum input from a limited-spectrum key background.”

Once the mattes are created, green information needs to be added back to the subjects. The solution? AI. It learns how to do this task more accurately than a simple green filter.
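The arithmetic behind the matte itself is easy to demo. Here’s a toy numpy sketch of the core idea (emphatically not Netflix’s implementation, and with a naive channel average standing in for the AI’s learned green restoration):

```python
import numpy as np

def magenta_key(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Toy magenta-screen matte on a float RGB frame with values in [0, 1].

    The foreground is lit with red + blue (magenta) light and the backing
    is pure green, so the green channel is effectively an inverse alpha.
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    alpha = 1.0 - g                    # green = backing light showing through
    green_guess = (r + b) / 2.0        # crude stand-in for the learned restore
    foreground = np.stack([r, green_guess, b], axis=-1)
    return foreground, alpha
```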

Read the full paper here.

My take: Not quite ready for prime time, especially if actors need to perform under magenta lights.

Original Wilhelm Scream Digitally Preserved

Craig Smith, blogging on Freesound, relates the fascinating story of how he discovered and digitally preserved over 1,000 vintage sound recordings, including the infamous Wilhelm Scream.

Each original sound was on a roll of 35mm magnetic acetate film; these were transferred to audio tape in 1990. Craig explains what happened next:

“I got the SSE tapes from the USC Archive in 2016. It was immediately clear that these tapes had a big problem. They were recorded onto used Ampex tape from the 1980s. Tape manufacturers changed their formulations in the early ’80s, and it turned out these new tapes were very unstable. They started to display what became known as Sticky Shed Syndrome. (Google it.) When this happens, the glue that binds the magnetic oxide to the plastic base becomes sticky, and separates. This makes the tapes virtually unplayable. Fortunately, there’s a temporary fix. Tapes can be baked for several hours at a low temperature in an oven. So that’s what I did. Each tape was baked at 150ºF for four hours, then cooled for four hours. This made the tapes stable enough to transfer using my Nagra 4.2 full track recorder.”

Here’s the direct link to the web page with Man Eaten by Alligator, Screams: https://freesound.org/people/craigsmith/sounds/675810/

By the way, here are the movies listed in the compilation above:

0:15, 1:03 – The Venture Bros. (2004, 2008)
0:21 – Aeon Flux (2005)
0:27 – Star Wars I: The Phantom Menace (1999)
0:33 – Team America: World Police (2004)
0:39 – Star Wars IV: A New Hope (1977)
0:46 – Spaceballs (1987)
0:54 – Lethal Weapon 4 (1998)
1:09 – Hellboy (2004)
1:18 – Star Wars VI: Return of the Jedi (1983)
1:26 – The Animatrix (2003)
1:33 – Sin City (2005)
1:39 – Batman Returns (1992)
1:46 – Lord of the Rings: The Return of the King (2003)
1:52 – Howard the Duck (1986)
1:59 – Family Guy episode “North by North Quahog” (2005)
2:06 – Raiders of the Lost Ark (1981)
2:14 – Star Wars Holiday Special (1978)
2:22 – King Kong (2005)
2:29 – Toy Story (1995)
2:37 – Indiana Jones and the Temple of Doom (1984)
2:43 – Wallace & Gromit: The Curse of the Were-Rabbit (2005)
2:51 – Angel episode “The Cautionary Tale of Numero Cinco” (2003)
2:57, 3:16 – Kill Bill, Vol. 1 (2003)
3:04 – Lord of the Rings: The Two Towers (2002)
3:11 – Angel episode “A New World” (2001)
3:23 – Drawn Together (2004)

And, finally, here’s the original in context, in Distant Drums (1951):

My take: the original meme! I’ve used it too, in a pitch video of all things!

DaVinci Resolve 18.5 Beta: Relight tool can replace power windows

The new Relight FX in Blackmagic Design’s DaVinci Resolve Studio is amazing!

“The new Relight FX lets you add virtual light sources into a scene to creatively adjust environmental lighting, fill dark shadows or change the mood. Light sources can be directional to cast a broad light, a point source, or a spotlight and be adjusted for surface softness and specularity control.”
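Blackmagic doesn’t publish how Relight works internally, but directional relighting of flat footage generally requires estimating surface orientation from the image. As a toy illustration of the shading step only, here’s a Lambertian fill light in numpy, assuming you already had a per-pixel normal map (which Resolve presumably derives for you):

```python
import numpy as np

def add_fill_light(image: np.ndarray, normals: np.ndarray,
                   light_dir: np.ndarray, gain: float = 0.5) -> np.ndarray:
    """Add a simple Lambert-shaded directional fill light to an image.

    image:   H x W x 3 floats in [0, 1]
    normals: H x W x 3 unit surface normals
    """
    direction = light_dir / np.linalg.norm(light_dir)
    lambert = np.clip(normals @ direction, 0.0, 1.0)   # N · L per pixel
    lit = image + gain * lambert[..., None] * image
    return np.clip(lit, 0.0, 1.0)
```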

My take: wow! This looks like so much fun. I can see using Relight instead of a power window to punch up illumination on the subject, drawing the eye exactly where you want it to go. This tool brings new meaning to the phrase, “We’ll fix it in Post!”

Disney software is virtual fountain of youth

The fountain of youth is a spring that is said to restore the youth of anyone who drinks or bathes in its waters. This idea has been mentioned in many different cultures throughout history, often as a symbol of eternal youth and rejuvenation. In some stories, the fountain is guarded by a powerful being, such as a nymph or a fairy, and must be sought out by brave adventurers. Despite many people searching for the fountain throughout history, it has never been found and is generally considered to be a mythical concept.

Until now.

Disney Research has created production-ready face re-aging for visual effects.

Andrew Liszewski writing on Gizmodo explains their approach:

“To make an age-altering AI tool that was ready for the demands of Hollywood and flexible enough to work on moving footage or shots where an actor isn’t always looking directly at the camera, Disney’s researchers, as detailed in a recently published paper, first created a database of thousands of randomly generated synthetic faces. Existing machine learning aging tools were then used to age and de-age these thousands of non-existent test subjects, and those results were then used to train a new neural network called FRAN (face re-aging network). When FRAN is fed an input headshot, instead of generating an altered headshot, it predicts what parts of the face would be altered by age, such as the addition or removal of wrinkles, and those results are then layered over the original face as an extra channel of added visual information. This approach accurately preserves the performer’s appearance and identity, even when their head is moving, when their face is looking around, or when the lighting conditions in a shot change over time. It also allows the AI generated changes to be adjusted and tweaked by an artist, which is an important part of VFX work: making the alterations perfectly blend back into a shot so the changes are invisible to an audience.”
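The key design choice in that description is that FRAN predicts a difference rather than a replacement face. A toy sketch of why that preserves the performance (purely illustrative; the real network and its outputs are far richer than a single residual image):

```python
import numpy as np

def apply_reaging(frame: np.ndarray, delta: np.ndarray,
                  strength: float = 1.0) -> np.ndarray:
    """Layer a predicted re-aging residual over the original frame.

    Wherever the network predicts no age-related change, delta is ~0 and
    the original pixels (and performance) pass through untouched; an
    artist can also scale the whole effect with `strength`.
    """
    return np.clip(frame + strength * delta, 0.0, 1.0)
```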

At five seconds per frame, FRAN can age or de-age one minute of footage in two hours.
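To check the math, assuming 24 fps footage: one minute is 60 × 24 = 1,440 frames, and 1,440 frames × 5 seconds per frame = 7,200 seconds, which is exactly two hours.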

That’s got to be cheaper than Hollywood VFX.

My take: Imagine if they had this technology for The Curious Case of Benjamin Button!