“We propose an alternative generator architecture for generative adversarial networks, borrowing from style transfer literature. The new architecture leads to an automatically learned, unsupervised separation of high-level attributes (e.g., pose and identity when trained on human faces) and stochastic variation in the generated images (e.g., freckles, hair), and it enables intuitive, scale-specific control of the synthesis. The new generator improves the state-of-the-art in terms of traditional distribution quality metrics, leads to demonstrably better interpolation properties, and also better disentangles the latent factors of variation.”
Of course, we’ve seen something similar before. Way back in 1985, Godley & Creme released a music video for their song Cry; the evocative black-and-white video used analogue wipes and fades to blend a myriad of faces together, predating digital morphing. Here’s a cover version and video remake by Gayngs, including a cameo by Kevin Godley:
My take: Definitely scary. But if that’s the current state of the art, I think it means we are _not_ living in the Simulation — yet, even though Elon Musk says otherwise.
The new company offers an end-to-end solution for the secure holding, quality control, conversion and delivery of film assets to clients around the globe, promising a one-stop solution for digital storage, protection and delivery of film and media content.
“The core of the service is designed with protection of content from unauthorized sharing and piracy in mind, by using a blockchain-based forensic encoding technology.”
“You never want your film to be on The Pirate Bay. You probably don’t want your film being shared on campus networks. You don’t want the guy from the local paper that you asked to review your movie sending it to his friends. We get people from all over the world to anonymously tell us when they find your film where it shouldn’t be. We have blockchain magic, and we will find them.”
“My own feature film that I Executive Produced and directed (Black Water, starring Jean-Claude Van Damme and Dolph Lundgren) was leaked online several months before its official world release date. Since then, I have been researching technologies that could help track every delivery and download of the master files of the film.”
My take: it’s a smart solution that helps filmmakers keep tabs on their screeners. It’s probably worth the cost, as the price of piracy keeps growing in terms of lost revenue and compliance. Arrgh!
“Deepfake, a portmanteau of “deep learning” and “fake”, is an artificial intelligence-based human image synthesis technique. It is used to combine and superimpose existing images and videos onto source images or videos.”
My favourite technology show, BBC Click, explains it well:
Back to Siwei:
“Because these techniques are so new, people are having trouble telling the difference between real videos and the deepfake videos. My work, with my colleague Ming-Ching Chang and our Ph.D. student Yuezun Li, has found a way to reliably tell real videos from deepfake videos. It’s not a permanent solution, because technology will improve. But it’s a start, and offers hope that computers will be able to help people tell truth from fiction.”
The key?
Blinking.
“Healthy adult humans blink somewhere between every 2 and 10 seconds, and a single blink takes between one-tenth and four-tenths of a second. That’s what would be normal to see in a video of a person talking. But it’s not what happens in many deepfake videos.”
They analyze the rate of blinking to decide the veracity of the video.
“Our work is taking advantage of a flaw in the sort of data available to train deepfake algorithms. To avoid falling prey to a similar flaw, we have trained our system on a large library of images of both open and closed eyes. This method seems to work well, and as a result, we’ve achieved an over 95 percent detection rate.”
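The blink-rate idea can be sketched in a few lines. This is a toy illustration built on a generic eye-aspect-ratio (EAR) signal and the 2-to-10-second blink interval quoted above; it is not the researchers’ actual detector, which uses a trained model rather than a simple threshold:

```python
def count_blinks(ear_series, closed_thresh=0.2):
    """Count open-to-closed transitions in a per-frame eye-aspect-ratio
    signal. The 0.2 threshold is an illustrative value, not tuned."""
    blinks = 0
    closed = False
    for ear in ear_series:
        if ear < closed_thresh and not closed:
            blinks += 1          # eye just closed: one blink begins
            closed = True
        elif ear >= closed_thresh:
            closed = False       # eye has reopened
    return blinks

def looks_synthetic(ear_series, fps, min_per_min=6, max_per_min=30):
    """Flag a clip whose blink rate falls outside roughly 6-30 blinks
    per minute (one blink every 2-10 seconds, per the quote above)."""
    minutes = len(ear_series) / fps / 60
    rate = count_blinks(ear_series) / minutes
    return rate < min_per_min or rate > max_per_min
```

A 60-second clip at 30 fps in which the subject never blinks would be flagged by `looks_synthetic`, while a clip with a blink every five seconds would pass.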
My take: Wow! So, basically, now you can no longer believe what you read, hear or see. Interestingly, this means that IRL will take on added value. (Oops, it seems that technology has already moved on: now deepfakes can include blinking.)
Irish band U2 has always embraced technology and continues to do so on their latest tour by embracing AR.
AR, or Augmented Reality, superimposes digital information on top of your phone’s camera image.
Fans attending the shows will be able to hold up their phones to reveal a huge iceberg and a virtual singing Bono.
You can download the U2 eXPERIENCE app here. To test drive it, point it at the album cover for Songs of Experience. A virtual cover will float on top of the picture of the cover, shatter into shards as music begins to play and then an animated Bono will begin to sing.
As you move your phone side to side or up and down, you’ll see different angles of the holographic representation.
My take: this is pretty cool and might be many folks’ first experience of AR.
“That animal brain is not aware of anything, I am very confident of that. Hypothetically, somebody takes this technology, makes it better, and restores someone’s activity. That is restoring a human being. If that person has memory, I would be freaking out completely.”
My take: this subject gets murky very quickly. Witness the ethical issues the scientists raise in Nature. Another scientist questions whether a brain deprived of stimuli would amount to torture. Heady stuff.
My take: basically, lack of a suitable camera is no longer an excuse for not filming. But everything else stays the same, starting with a great script and a smart plan.
“The new Pocket Cinema Camera 4K has a ton of features that’ll appeal to that market — like a mini XLR connector, LUT support, and 4K recording at 60 fps — but it still has limitations that’ll keep the camera confined to a niche audience (which, to be fair, is kind of true of every camera). Basically, unless you’re a filmmaker who’s typically in control of lighting and the overall environment they’ll be filming in, this camera probably isn’t for you. It doesn’t have in-body stabilization, and the small sensor will struggle in low light and require adaptors to get the depth of field you’d get from full frame or even Super 35 cameras. That might not matter to some filmmakers, but it could be an issue for people on fast shoots or traveling to unfamiliar locations.”
“The 65-inch display sits flat and sturdy on your wall, like a normal television, until you’re done with it. With one push of a button, the display descends down into its stand, rolling around a coil like wrapping paper. The screen can roll up completely for safe storage and easy transportation, or you can leave a small section of it sticking up, at which point the screen automatically shifts into a widgetized, information-providing display with weather and sports scores. LG’s device has almost nothing in common with most TVs, other than its size. Functionally, it’s more like a really big tablet.”
Fully unrolled, the aspect ratio is 16:9.
But wait, there’s more! It can roll down to 21:9, eliminating the black bars above and below widescreen movies.
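The arithmetic behind that trick is simple: at full width, a 21:9 picture needs fewer pixel rows than the panel has, so the TV can roll the unused rows into its base. A back-of-the-envelope sketch, assuming a UHD 3840×2160 panel (the article only gives the 65-inch size, so the resolution is an assumption here):

```python
def rolled_height(width, target_ratio):
    """Active pixel rows needed to show content of `target_ratio`
    (width / height) at the panel's full width."""
    return round(width / target_ratio)

# Illustrative numbers for an assumed UHD 3840x2160 panel.
panel_w, panel_h = 3840, 2160
h_widescreen = rolled_height(panel_w, 21 / 9)   # 1646 active rows
rows_rolled = panel_h - h_widescreen            # 514 rows tucked into the base
```

On those assumed numbers, roughly a quarter of the screen rolls away when you switch to 21:9.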
My take: I want one! I would hang it upside down from the ceiling, so it would mimic a cinema screen of yore.
Get ready for an onslaught of new immersive video cameras.
YouTube launched the VR180 format last year, and parent company Google has just partnered with Lenovo to make the world’s simplest point-and-shoot camera, the Mirage.
‘180’ is shorthand for VR180, Google’s format for stereoscopic (3D) 180-degree video. The two front-facing lenses approximate your eyes, creating depth.
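To make the two-lens format concrete, here is a minimal sketch of splitting one stereo frame into per-eye views. The side-by-side packing and the left-half-is-left-eye layout are assumptions for illustration; real VR180 files carry metadata describing the actual frame packing, so check that rather than assuming:

```python
def split_sbs(frame):
    """Split a side-by-side stereo frame into left-eye and right-eye
    halves. `frame` is a list of pixel rows; which half feeds which
    eye is an assumption here -- consult the file's stereo metadata."""
    half = len(frame[0]) // 2
    left = [row[:half] for row in frame]
    right = [row[half:] for row in frame]
    return left, right
```

A VR180 player does the same split and presents one half to each eye, which is where the sense of depth comes from.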
Lenovo has published the camera’s specs, but the biggest drawback I see is the lack of a view screen. It truly is a point-and-shoot camera, although you could use the onboard Wi-Fi to send the picture to your smartphone for viewing.
“VR180, like most things in VR right now, is the simple-but-usable version of what will someday be much cooler. It exists for a few reasons: because 360-degree video is actually really complicated to do well, because there aren’t many great ways to watch 360 video, and because even when they do watch super-immersive footage, viewers don’t tend to look around much. With VR180, your camera can look and operate more like a regular point-and-shoot, and viewers get a similarly immersive feel without having to constantly spin around.”
There’s also the YI Horizon VR180 coming soon, and it includes a view screen, higher resolution and, I believe, HDMI out. See Think Media’s review:
My take: I’m a big fan of 180 and can’t wait to play around with both of these cameras. (Also, I wish the ‘VR’ label would just go away since this technology is not “virtual reality” but basically “reality”. Virtual Reality to me means computer-generated environments; video games are a prime example. 180 is as close as we’re going to come to reality other than actually being there.)