Apple Log on iPhone

Apple Log on iPhone is Not a Gimmick, according to Malaysia's ZY Cheng.

In a fast-paced 10 minutes he covers:

  • the difference between conventional filming and filming in log
  • how exposure changes in log (1:12)
  • using the iPhone Camera App to film in log (5:06)
  • using the Blackmagic Camera App to film in log (5:47)
  • the log profile and black and white levels (6:31)
  • log exposure tips for day and night filming (7:53)

Apple Log is available on the iPhone 15 Pro and Pro Max. Note that Apple will most likely be upgrading these come September 2024.
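The point of a log profile is to spend the signal's code values evenly across stops of exposure rather than across linear light. A minimal sketch of the idea (illustrative math only, not Apple's published Apple Log transfer function):

```python
import math

def log_encode(linear, black=0.001, stops_range=10.0):
    """Map scene-linear light to a 0..1 log signal.

    Illustrative curve only -- NOT the actual Apple Log transfer function.
    Values at or below `black` clip to 0; `stops_range` stops above black
    fill the 0..1 output range.
    """
    stops_above_black = math.log2(max(linear, black) / black)
    return min(stops_above_black / stops_range, 1.0)

# Each doubling of light (one stop) moves the encoded value by the same
# fixed step, so highlights aren't starved of code values the way they
# are in a linear encoding.
mid_grey = log_encode(0.18)
one_stop_brighter = log_encode(0.36)
```

Because every stop gets an equal slice of the signal, middle grey lands well above the middle of the range, which is why ungraded log footage looks flat and washed out.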

My take: I want this. I need this!

“air head” is one of the first AI-generated short films

Canada’s shykids_ has produced one of the first AI-generated short films using Sora:

OpenAI says on their blog:

“Based in Toronto, shy kids are a multimedia production company who utilized Sora for their short film about a balloon man. ‘We now have the ability to expand on stories we once thought impossible,’ shares the trio made up of Walter Woodman, Sidney Leeder and Patrick Cederberg. Walter, who directed Air Head, remarks that ‘as great as Sora is at generating things that appear real, what excites us is its ability to make things that are totally surreal. A new era of abstract expressionism.’ Speaking to the wider industry, ‘people from all over the world with stories ready to burst out of their chests finally have the opportunity to show the world what’s inside.'”

The BTS video is quite illustrative, as it does look like some post work was done on at least one clip:

 

A post shared by shy kids (@shykids_)

I love this quote: “Sora is very much a slot machine as to what you get back.”

By the way, these are the folks behind 2013’s brilliant, genre-launching Noah (nsfw):

My take: sure, it’s not prime time yet, but Sora will only get better. Check out these other “text to video” tools on Future Tools. Oh, and I think the contrast in “air head” is too flat.

ScriptReader.ai offers script analysis for $9.99

Jason Hellerman reports on No Film School that AI Says My Screenplay is Better Than ‘Silence of the Lambs’ and ‘Schindler’s List’.

He begins, “This morning, a really lovely NFS reader reached out to me and wanted my opinion on a program called ScriptReader.AI, which charges $10 for a complete screenplay breakdown.”

He uploads one of his scripts and in two hours gets extensive coverage. It appears the AI’s method is to rate each scene and then average the scores. His script gets a “Recommend.”
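If the method really is "score each scene, then average," the verdict logic is trivial to sketch (the 0-10 scale and cutoffs below are invented for illustration; ScriptReader.ai does not publish its thresholds):

```python
def coverage_verdict(scene_scores):
    """Hypothetical 'average the scene scores' coverage verdict.

    The 0-10 scale and the cutoffs are assumptions for illustration only.
    """
    avg = sum(scene_scores) / len(scene_scores)
    if avg >= 8.0:
        return avg, "Recommend"
    if avg >= 6.0:
        return avg, "Consider"
    return avg, "Pass"

# e.g. coverage_verdict([9, 8, 7]) -> (8.0, "Recommend")
```

The obvious weakness, and likely what Jason is reacting to, is that a per-scene average has no notion of structure: ten well-scored scenes can still add up to a story that doesn't work.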

However, Jason is not impressed. He concludes:

“Computers cannot think and feel. They have zero sense of nuance. And you cannot take their coverage seriously. No matter how nice it is. AI coverage is worthless!”

My take: the ScriptReader.ai website woefully undersells its capabilities. It certainly looks sketchy. But you can test out three scenes for free.

And then there’s a link in the top right corner of the website labelled “SHALL WE PLAY A GAME?_” I recognized this as a reference to WarGames, in which Matthew Broderick unwittingly nearly triggers WWIII.

With some trepidation, I clicked on the link.

This brought me to a page titled “SceneProv_” with two modes, Fun and Training.

I decided to have some fun and generate a scene for a RomCom. The AI, suddenly my writing partner, set the scene and populated it with two characters. My interactions are bolded.

“Genre selected: romance

INT. COFFEE SHOP – DAY

JEN sits at a table with a book, sipping on her latte. She notices AIDEN walk in and their eyes meet.

JEN:
(smiling)
Hey!

AIDEN:
(nervously)
Hi! Can I sit here?

JEN:
(surprised)
Uh, sure!

AIDEN sits across from Jen and they strike up a conversation.

AIDEN:
I noticed you’re reading “Do Androids Dream of Sheep?”.

JEN:
Yup. It’s my favourite book.

Her eyes glitch.

AIDEN:
Do you want to go back to my place?

JEN:
(surprised)
Excuse me?

AIDEN:
(smiling)
Not like that. I have a first edition copy of that book. I’d love to show it to you.

JEN:
(relieved)
Oh, sure. That sounds great.

INT. AIDEN’S PLACE – NIGHT

Aiden and Jen are sitting on the couch, looking at the first edition. Her eyes glitch again.

AIDEN
Jen, why do your eyes “glitch?”

JEN
What do you mean?

AIDEN
Well, every once in a while they kinda spark.
Are you hiding something from me?

Jen looks at Aiden angrily and stands up quickly.

JEN:
(angrily)
No, I’m not hiding anything! It’s just a glitch in my synthetic retinas. I’m an android.

AIDEN:
(surprised)
An android? I had no idea.

JEN:
(sadly)
I know, it’s not exactly a desirable trait in the dating world.

AIDEN:
(smiling)
Well, I’m not like most guys. I think it’s pretty cool, to be honest.

Jen’s eyes glitch again, this time in surprise.

JEN:
(surprised)
Really?

AIDEN:
(nodding)
Really. Plus, I always thought I’d make a great cyborg myself.

Jen smiles and takes Aiden’s hand, feeling grateful for the first time to have found someone who accepts her for who she really is.

They continue to look at the book, their fingers touching.

Aiden turns to Jen, looking into her eyes.

AIDEN:
(softly)
You know, I may not be able to dream like you do, but
I have a feeling this is the beginning of something special.

JEN:
(smiling)
I have that feeling, too.

They lean in and share a kiss, the start of their unconventional love story.”

And then it rated “our” scene a B+. This AI passes the Turing Test for sure.

GAME OVER! OpenAI’s SORA just won the text-to-video race

The inevitability of script-to-screen technology is closer than ever.

OpenAI released test footage and announced, “Introducing Sora, our text-to-video model. All the clips in this video were generated directly by Sora without modification. Sora can create videos of up to 60 seconds featuring highly detailed scenes, complex camera motion, and multiple characters with vibrant emotions.”

See openai.com/sora for more.

“Sora is able to generate complex scenes with multiple characters, specific types of motion, and accurate details of the subject and background. The model understands not only what the user has asked for in the prompt, but also how those things exist in the physical world.”

“The model has a deep understanding of language, enabling it to accurately interpret prompts and generate compelling characters that express vibrant emotions. Sora can also create multiple shots within a single generated video that accurately persist characters and visual style.”

See the Technical Research.

Beyond text-to-video, “Sora can also be prompted with other inputs, such as pre-existing images or video. This capability enables Sora to perform a wide range of image and video editing tasks — creating perfectly looping video, animating static images, extending videos forwards or backwards in time, etc.”

Sora can even replace the whole background in a video: “Diffusion models have enabled a plethora of methods for editing images and videos from text prompts…. One of these methods, SDEdit, … enables Sora to transform the styles and environments of input videos zero-shot.”

My take: this is powerful stuff! Workers in media industries might want to start thinking about diversifying their skills….

Kodak reinvents the Super 8 camera

Jourdan Aldredge invites us, on No Film School, to Take a Look at the First Footage From the Kodak Super 8 Camera.

“Pulling from our friends at B&H, it looks like their crew featuring Nick and Dave were able to take the new Kodak Super 8 camera through the paces as they ran around New York City.”

Here are timestamps for each section:

  • 0:00 – Main Specs
  • 0:48 – Lenses & LCD Screen
  • 1:52 – Menu System
  • 2:15 – Filmstock, Presets, & FPS
  • 3:12 – Extended Gate & Resolution
  • 3:35 – Sound & Connectivity
  • 4:10 – Who Is This Camera For?

The kicker? Kodak’s new Super 8 camera, based on sixty-year-old technology, costs $5,500 USD.

My take: This is way too expensive for me! And the image still bounces around like crazy. IMHO Kodak missed an opportunity to improve their technology by bypassing the cartridge’s internal pressure plate and figuring out how to stabilize each frame during exposure, kinda like the “true pin registration” that the Logmar GENTOO GS8 does. Pity.

Got an old undeveloped roll of film? Who you gonna call?

Film Rescue International to the rescue!

Joe O’Connor, writing for the Financial Post, explains How a business in small-town Saskatchewan is exposing lost history.

He states:

“Film Rescue has a global reach — and reputation — with a collection office in the Netherlands and another in northern Montana. Every few weeks, a batch of 200 to 400 rolls of old camera and home movie film appears at the Indian Head office, which is housed in a 125-year-old bank building on the main highway through town.”

That’s Indian Head, Saskatchewan, east of Regina.

“The film rescuers consist of Miller, Gostick and five employees…. One employee, Gerald Freyer, is a European-trained digitization expert. He was beavering away for a German museum when Miller recruited him to move to the Prairies. His latest star acquisition is Heather Harkins, who trained at the renowned Selznick School of Film Preservation in Rochester, N.Y., and was doing contract work for museums and archives throughout North America when she got a cold-call job offer.”

As Film Rescue says on their website: “We are the revealers of lost and found treasures. Since 1999.”

My take: Ooh! I see they also offer movie film scanning as well.

Super Fast Screenplay Coverage

Jason Hellerman writes on No Film School that I Got My Black List Script Rated By AI … And This Is What It Scored.

Jason says, “An AI-driven program called Greenlight Coverage gives instant feedback on your script. You just upload it, and the AI software spits coverage back to you. It rates different parts of the script on a scale from 1-10 and then gives a synopsis, positive comments, and notes on what would make it better. The program even creates a cast list and movie comps, allowing you to have an AI question-and-answer session to ask specific questions about the script.”

His script Himbo, which made The Black List in 2022 and was rated incredibly high by ScriptShadow, scored 6/10 on Greenlight Coverage.

He concludes:

“The truth is, I could see a read like this coming from a human being. Is it the best coverage? No. But as someone who has tested many services out there, I felt it gave better coverage than some paid sites, which are hit-and-miss depending on the person who reads your script. I look at AI as a tool that some writers may decide to use. I was happy I tried this tool, and I honestly was surprised by the feedback of the coverage.”

My take: I also participated in the beta test of Greenlight Coverage and asked the creator Jack Zhang the following questions via email.

Michael Korican: For folks used to buying coverage for their scripts, what are the main features of Greenlight Coverage that set it apart?
Jack Zhang: The speed, accuracy, consistency as well as reliability. Also the ability to ask follow up questions that can provide guidance on how to pitch to investors and financiers, all the way to how to further develop certain characters. In the future, we will also include data from Greenlight Essentials.

MK: Writers sometimes wait weeks if not longer for coverage. How fast is Greenlight Coverage?
JZ: 15 mins to 2 hours when they first upload their screenplay, depending on their subscribed package. The follow up questions are answered instantly.

MK: In your testing of Greenlight Coverage, how have produced Hollywood scripts rated?
JZ: It’s a mixed bag; the universally critically acclaimed ones usually get a very high score, 8.5 to 9+, like The Godfather, Shawshank, etc. The bad ones like The Room got 3/10. It really depends on the screenplay and the film.

MK: Greenlight Coverage uses a neural network expert system; the coverage section is highly structured whereas the question section is open-ended. How is this done and what LLM does Greenlight Coverage use?
JZ: We are using large language models to power our back end and it is not one, but a few different ones as well as our proprietary model that was fine tuned based on industry veteran feedback.

MK: Why should those folks who hate AI give Greenlight Coverage a try for free?
JZ: I totally see where they are coming from and personally I also agree that in such a creative industry, the human touch is 100% needed. This is just a tool to help give quick feedback and an unbiased opinion on the screenplay. It is useful as another input to the script, but not the end all and be all for it.

btw, not to brag but Greenlight Coverage gave my latest script, The Guerrilla Gardeners, 8/10. Wanna produce it?

Kodak releases ‘new’ Super 8 camera — the price will surprise you!

Brian Hallett, writing on ProVideo Coalition, was one of the first to report on The Brand New KODAK Super 8mm Film Camera.

Yes, this camera films on those boxy Super 8 cartridges, but it does so much more:

  • It has a 4″ LCD viewfinder with aspect ratio overlays, interactive menus and camera settings.
  • It has an extended gate, now 13.5:9 full frame (versus the traditional 4:3 or 12:9 aspect ratio) that will make cropping down to 16:9 a breeze.
  • It comes equipped with a detachable wide-angle 6 mm 1:1.2 C-mount lens, but you can screw on any C-mount lens.
  • It has an on-board sound recorder that captures audio via a 3.5 mm input to an integrated SD card.
  • It runs crystal sync at 24 and 25 fps, plus over and under cranks at 18 and 36 fps.
  • It has a distinctive top handle and a pistol grip.
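The extended-gate claim is easy to sanity-check: cropping a 16:9 frame from the full width of a 13.5:9 gate keeps about 84% of the frame height, versus 75% from a traditional 4:3 (12:9) gate. Quick arithmetic, not Kodak's published numbers:

```python
def height_kept(gate_w, gate_h, target_w=16, target_h=9):
    """Fraction of gate height kept when cropping the target aspect
    ratio from the gate's full width (assumes the target is wider)."""
    return (gate_w / gate_h) / (target_w / target_h)

extended = height_kept(13.5, 9)  # 13.5:9 extended gate -> ~0.844
classic = height_kept(12, 9)     # traditional 4:3 gate -> 0.75
```

In other words, the extended gate recovers roughly nine percentage points of frame height that a 4:3 gate would throw away in a 16:9 delivery.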

Here’s the full datasheet. And here’s their sizzle reel:

Curiously, Kodak first announced this camera in January 2016 at CES for “between $400 and $750.”

Funny, both the released camera and the prototype look an awful lot like the Logmar Humboldt S8.

Logmar’s next Super 8 camera, the Chatham S8, had a superior Latham loop mechanism that resulted in rock-steady registration, never seen before on Super 8. Check out this sample that actually looks like 16 mm footage:

Logmar’s latest C-mount Super 8 camera is the Gentoo GS8 that “uses standard Kodak 50 ft cartridges in combination with a re-usable spacer providing true pin registration.”

Oh, and by the way, the MSRP of Kodak’s “new” Super 8 camera is only $5,495 USD. That does include one cartridge of Super 8 film. No word on if that includes processing though.

My take: [shakes head] Kodak, Kodak, Kodak. You’re just trying to get this project off your books, right?

Finally, new plots unlocked for Hallmark movies!

Ryan Morrison of Tom’s Guide, reveals I asked ChatGPT to create a Hallmark Christmas movie — and it went better than expected.

He begins:

“Part of my job is testing AI products to find out how well they work, what they can be used for and just how good they are at different tasks. So, inspired by my mom’s favorite genre of movie I decided to ask ChatGPT to write a Christmas story in the style of Hallmark.”

His ChatGPT 4 prompt? “Can you help me come up with the plot for a Hallmark-style Christmas movie?”

The resulting basic plot? “In ‘Christmas Carousel’, a New York architect discovers love and the value of tradition when she teams up with a local carousel restorer to save a cherished holiday attraction in a small town.”

There’s a more detailed plot, characters and even dialogue.

The AI even suggests filming in Cold Spring, New York State.

Oops! There is a real 2020 Hallmark movie called “A Christmas Carousel” with this plot: “When Lila is hired by the Royal Family of Ancadia to repair a carousel, she must work with the Prince to complete it by Christmas.”

My take: even though it appears ChatGPT 4 came extremely close to ripping off the title of an existing Hallmark movie, I like its plot better than the real one. Go figure.

YouTube’s Dream Track and Music AI Tools

Sarah Fielding of Engadget reports that YouTube’s first AI-generated music tools can clone artist voices and turn hums into melodies.

The YouTube technology is called Dream Track for Shorts.

The mock-up asks you for a text prompt, then writes lyrics and music and has a voice-cloned artist sing the result:

YouTube is also testing Music AI Tools:

This is all possible due to Google DeepMind’s Lyria, their most advanced AI music generation model to date.

Cleo Abram also explains the real issue with AI music: When do artists get paid?

My take: AI is just a tool — I use it occasionally and will be exploring it more in 2024. What we as a society really need to figure out is what everyone’s going to do, and how they’ll get (not earn) enough money to live, when AI and Robotics make paid work redundant.