Lytro reveals revolutionary studio camera

Although you’ll never be able to afford one, Lytro introduced its Lytro Cinema Camera at NAB on April 19, 2016.

This is a huge studio camera with a foot-and-a-half-wide lens tethered to its own server farm. It captures “755 RAW Megapixels” at 300 fps in up to 16 stops of dynamic range.

That’s about 15 times more resolution than a full-frame DSLR at 50MP.

It doesn’t actually record images though. It captures the “light field” — the lightscape of reflected light rays in front of the lens. Behind the front lens, an array of microlenses allows Lytro to “capture a light field, compute the ray angles and then replicate that light field in a virtual space.”

In other words, this camera captures a virtual hologram of the scene in front of it.

With this computational model, Lytro can, after capture, i.e. “in post”:

  • refocus and change depth of field
  • adjust frame rate and shutter angle
  • pull a key based on depth and not green screen
  • stabilize camera movement based on actual movement in space
  • natively create 3D footage from one shot
  • as a DI, output optimal deliverables for any format
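The refocus-in-post trick at the top of that list is the signature light field move. As a rough illustration only, and not Lytro's actual pipeline, refocusing can be modelled as shifting each microlens sub-aperture view in proportion to its position on the lens and then averaging; the shift amount selects the focal plane. A toy sketch in Python/NumPy, with made-up data shapes:

```python
import numpy as np

def refocus(subapertures, positions, alpha):
    """Toy synthetic refocus: shift each sub-aperture view by an
    amount proportional to its position on the lens, then average.
    subapertures: list of (H, W) grayscale arrays
    positions:    list of (dy, dx) microlens offsets
    alpha:        refocus parameter (0 keeps the original focal plane)
    """
    acc = np.zeros_like(subapertures[0], dtype=float)
    for img, (dy, dx) in zip(subapertures, positions):
        # shifting by alpha * offset re-aligns rays from a chosen depth
        shifted = np.roll(img, (round(alpha * dy), round(alpha * dx)), axis=(0, 1))
        acc += shifted
    return acc / len(subapertures)
```

Sweeping `alpha` walks the focal plane through the scene, which is exactly why focus and depth of field become post-production decisions.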

Watch the No Film School interview and video.

My take: With its Cinema Camera, Lytro has replaced image capture with lightscape hologram capture. If I were a Hollywood producer, I’d use this camera on 3D shoots and to simplify keys for composite work. And to fix those pesky out-of-focus shots. But wait! There’s more! They’re also promising a Light Field VR Camera called Lytro Immerge.

Kodak announces a new Super 8 camera

At CES 2016 in Las Vegas last week, Kodak stunned the world by announcing it is making a new Super 8 camera for release this fall.

After emerging from bankruptcy two years ago, Kodak decided to go all in on film, even though film represents only 10% of its business.

Hollywood filmmakers, many of whom grew up shooting Super 8, convinced Kodak to bring back the narrow-gauge format.

Kodak believes Super 8 can join the Maker Movement and ride the analogue trend.

Check out the camera specs.

My take: I’m also one of the filmmakers who got their start shooting Super 8. I have two concerns with Kodak’s new camera. While the viewfinder and SD card are appreciated, what were they thinking with the microphone? Super 8 cameras are typically noisy! My other concern is with the jitter inherent in Super 8. Logmar of Denmark has solved this — but their camera costs ten times as much. What I do think is brilliant is Kodak getting back into the film processing business and combining it with film scanning. That combination is the real news here and could make more people consider shooting on Super 8. But only if your pockets are very deep or your shot list is (super) short.

The Most Technologically Advanced Book Ever Published

Chuck Salter writes in FastCoDesign about a publishing company that continues to innovate in the personal book field.

First came ‘The Little Girl Who Lost Her Name’ and ‘The Little Boy Who Lost His Name’. Now comes ‘The Incredible Intergalactic Journey Home’.

“This time, a lost boy or girl navigates his or her way from outer space back home. Spoiler alert: to the reader’s actual home. The wayward space ship swoops into his or her city and arrives in the child’s neighborhood. The image, the book’s big reveal, incorporates the corresponding satellite photos. That degree of personalization required even more algorithms and developers than Lost My Name’s first book, along with help from NASA, Microsoft, satellite makers, and other unlikely children’s book partners.”

The creators are Lost My Name of East London. What a wondrous book and a steal at $30.

My take: I love this concept and the marvellous execution! (The new book does remind me slightly of Arcade Fire’s Chrome Experiment, The Wilderness Downtown, which may or may not still be working.) Now imagine this in the video realm. I see no reason, with the state of CGI, digital production and online streaming, that my likeness could not be inserted into productions and animated, for my entertainment only. Maybe not in real-time initially and probably not voice. But imagine your own channel on Netflix, starring or co-starring you! That might be fun.

To 4K or not to 4K

Near the beginning of your indie film project, you need to determine the video format you will be shooting.

Will it be 4K, or 2K, or — gasp — even 1080p?

This chart shows those display resolutions and many more.

More and more cameras shoot 4K, so why even debate it?

There are some very good reasons not to shoot in 4K. Mentorless has three:

“#1 – Nobody Can Tell the Difference
#2 – It Will Stretch Your Budget And Take More Time
#3 – The Delivery Side Can’t Hold Its Part of the Contract (Yet)”

On the other hand, the main reasons touted to shoot in 4K are to allow for digital cropping, stabilization and zooming in post-production and to ‘future-proof’ your production, in anticipation of the day when everyone has 4K TVs and no caps on their Internet data.
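Mentorless’s budget point is easy to make concrete with back-of-envelope math. The numbers below are my own assumptions (uncompressed rates, 24-bit colour, 24 fps), not figures from their article:

```python
def raw_data_rate_mbps(width, height, fps, bits_per_pixel=24):
    """Uncompressed video data rate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

uhd = raw_data_rate_mbps(3840, 2160, 24)  # UHD "4K"
hd = raw_data_rate_mbps(1920, 1080, 24)   # 1080p
print(round(uhd / hd, 1))  # → 4.0: four times the pixels, four times the data
```

Whatever codec compresses that stream, the 4x raw-pixel multiple follows you through cards, backups, editing proxies and renders. That is where the budget and time stretch comes from.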

My take: I like to shoot fast, so I’m partial to smaller form factors; 1080p is fine for me, for now. But the new DJI Osmo really caught my eye. The 4K footage is amazing and incredibly smooth for such a small camera.

Strides in VR filmmaking

Discovery has launched a new project: Discovery VR.

Although there are only 10 VR videos on the site right now, you can pan each of them through a full 360 degrees with your mouse.

There’s diving with sharks, skateboarding in San Francisco and a surfing lesson.

I found that changing the control setting to Mouse Grab from the default Mouse Movement gave me more natural movement.

In addition to the website, there’s an app for iPhones and Android devices. Pair it with a Google Cardboard or Samsung Gear VR headset.

The company behind the magic is Littlstar.

My take: I remember the initial release of QuickTime VR in 1994, which gave me my first glimpses of ‘virtual reality’. GameSpot has an interesting history of VR. I think the application to narrative film will be fascinating. For instance, see Intimate Strangers: Chapter 1 — camera placement and mise en scène become very important. I like the way the ‘dream’ is projected onto the ceiling above the woman. A tip for VR directors: place the camera just to one side of the ‘line’ and let the viewer pan from one actor to the other and back.

U2 shows us the way with live mobile streaming

First, YouTube enabled anyone to post moving images to the Internet, democratizing the movies.

Now mobile streaming apps are revolutionizing live broadcasting, once the domain of television.

Both launched within the last three months, Meerkat and Periscope enable anyone with a smartphone to broadcast live video to the world.

Meerkat (iOS and Android) wants you to first log in to Twitter. The left column lists upcoming streams, comments are on the right and the stream is featured vertically in the middle. Meerkat loves the colour yellow.

Periscope (iOS and Android) was purchased by Twitter shortly after Meerkat debuted. Comments are superimposed in the bottom left-hand corner, and you can show some ‘love’ with hearts that float up the right side of the vertical screen.

You can search Twitter to find live Meerkat streams or live Periscope streams.

Or GLOW, a New York digital & social agency, offers two ways to sample multiple streams at once.

Rock band U2 have embraced Meerkat. During the current i+e Tour, according to The Hollywood Reporter,

“The band invites an audience member onto the B stage to shoot a stripped-down number — on this night, ‘Angel of Harlem’ — to be broadcast live via the fledgling Meerkat platform. ‘This goes out across the globe — to about 150 people, until it catches on,’ Bono quipped.”

My take: I think this is truly revolutionary. The ‘airwaves’ for traditional TV broadcasters are strictly controlled by the FCC in America and the CRTC in Canada. Now, everyone with a smartphone has a ‘TV’ camera in their pocket and can begin broadcasting to the world at any time, for free! Journalism and entertainment may never be the same again. Interestingly, both apps use a mobile-friendly vertical orientation, which is decidedly uncinematic.

Disney posits computer-aided editing

Disney researchers are working on an algorithm that automatically edits footage from multiple cameras into coherent narratives.

It maps the common attention point in space for all the cameras as a proxy for the common subject. It then applies editing rules such as the 180 degree rule, jump cut avoidance and cutting on action — things your editor does now.
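To make those rules concrete, here is a hypothetical, drastically simplified shot selector — my own illustration, not Disney’s algorithm. It favours the camera that best frames the shared attention point, but refuses any cut that would read as a jump cut (a small change in camera angle, conventionally under about 30 degrees):

```python
def pick_camera(cameras, prev_cam, prev_angle, min_angle_change=30):
    """Pick the next shot from candidate cameras.
    cameras: list of (name, angle_deg, attention_score) tuples,
             where attention_score says how well the camera frames
             the scene's common attention point.
    Staying on the current camera is never a cut, so it is always legal.
    A real system would also enforce the 180-degree rule and cut on action.
    """
    for name, angle, score in sorted(cameras, key=lambda c: -c[2]):
        if name == prev_cam:
            return name  # best view is the current camera: hold the shot
        if abs(angle - prev_angle) >= min_angle_change:
            return name  # big enough angle change: a clean, legal cut
    return prev_cam  # every cut would be a jump cut: keep the shot
```

Run per segment of footage, a loop like this yields a rough first-pass edit that a human editor could then finesse.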

Their video is convincing.

See their take on interactive synchronization as well.

My take: This would be fascinating to see applied to news or documentary footage. It might also be applied to down-and-dirty multi-camera narrative work. The editor of the future’s job might evolve into finessing these cuts, choosing appropriate cutaways and organizing the order of scenes.

Hyperlapse solves shaky time-lapse footage

Fascinating news from Microsoft Research: they can fix your shaky GoPro time-lapse footage.

The technique is called Hyperlapse and they say an application is coming soon, perhaps in a few months.

Better yet, they show you how it’s done.
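For intuition only: the ‘smoothing’ half of the idea can be faked with a moving average over the camera path. Microsoft’s actual method reconstructs the scene in 3D, plans a new smooth path and re-renders frames along it; this sketch, with assumed (x, y) positions per frame, is just the crudest possible stand-in:

```python
def smooth_path(positions, window=5):
    """Crude camera-path stabilization: replace each position with
    the average of its neighbours inside a sliding window.
    positions: list of (x, y) camera positions, one per frame
    """
    half = window // 2
    smoothed = []
    for i in range(len(positions)):
        lo, hi = max(0, i - half), min(len(positions), i + half + 1)
        xs = [p[0] for p in positions[lo:hi]]
        ys = [p[1] for p in positions[lo:hi]]
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys)))
    return smoothed
```

Averaging damps the frame-to-frame jitter that makes raw time-lapse unwatchable; the hard part Microsoft solves is re-rendering the imagery to match the new path.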

My take: Thanks for sharing, Microsoft. I love that you’ve released the technical know-how as well. I predict a Google Street View-style multiple-camera rig capturing overlapping footage to generate a rich ‘picture-scape’, combined with this software, to create immersive, real-time, viewer-defined camera movement. Movies, meet video games.

Super 8 is about to make a comeback

The consumer film format called Super 8 was dominant in the sixties, seventies and eighties until the upstart technology called ‘video’ challenged it in the nineties and vanquished it from the marketplace in the new millennium. HD video now rules. With the right lens use and lighting, we can shoot economical, cinematic images.

Nevertheless, are you nostalgic for the real film look? It’s too expensive to actually shoot on film, right? 35mm, even 16mm, is out of reach. But what about Super 8? Is it possible to shoot on Super 8 and transfer to video for post?

My memory of the look of Super 8 is slightly soft, jittery Kodak Kodachrome, with its very warm tone and super-saturated reds. I shot my first films on Super 8, physically splicing the shots together and projecting the original reversal stock, which would jump slightly as the cuts chattered through the projector gate.

One of Super 8’s strengths was also one of its weaknesses. The cartridges were extremely user-friendly but their design meant that the film was held steady during exposure by a simple pressure plate. Jitter, therefore, was built into all Super 8 cameras.

Now, a Danish company called Logmar plans to re-engineer the Super 8 camera. Their idea is to pull the film out of the cartridge and pin-register it during exposure. The footage is rock-steady.

What about Super 8 film and developing? North American rights, film and processing will be handled by Pro8mm of Burbank, California.

My take: At 5 grand, this will be an expensive camera. I love the modern technology Logmar is bringing to a mid-century medium, like the digital monitor and SD sound recording. Neat that they can scale this up to 16mm and 35mm as well. And I love the discipline of film versus video. But film! I thought it was dead! That sample footage does look more like 16mm than the Super 8 I remember. Perhaps if they address the dust on the negative and the dirt in the gate by the frame lines…

‘Sharknado 2’ can control your lights

Whether or not you appreciated the concept behind last year’s ‘Sharknado’ — sharks falling out of the sky — you should appreciate a cool technological tie-in tonight’s outing brings. (Syfy, Space 9 e, 6 p)

According to Mashable, ‘Sharknado 2: The Second One’ — now set in New York instead of LA — will be able to control your lights. Think flashing during lightning and drenching your room in red during the shark attacks.

That is, if you have Philips hue lights. Combining LED lights, the Internet and smartphone control gives you ‘personal wireless lighting.’

Plus, you need the Syfy Sync app on your smart device.

“The secret sauce of the whole experience is the Syfy Sync app, which typically brings the viewer second-screen information, such as actor profiles and trivia. Similar to Shazam, the app uses audio tagging to identify what the viewer is watching, delivering the right content at the right moment. But the Philips integration takes it to another level.”

My take: Kinda cool. With 5.1 sound and responsive lighting, in the proper hands, this could make for very immersive experiences.