Filmmakers, writers, musicians and other creative professionals have long used technology in their work. But can machines produce a kind of intelligence that helps storytellers at a deeper creative level?
That’s the premise of the IBM-sponsored “Storytellers With Watson” competition, launched in partnership with the Tribeca Film Festival. Over the last two months, the contest solicited and reviewed submissions about how media and entertainment pros can use artificial intelligence in new ways.
On Tuesday, after an all-day session at IBM’s Think Lab in New York that culminated in “Shark Tank”-style pitches, the IBM and Tribeca jury selected a winner from five finalists: filmmaker and producer Seth Grossman and his “Rip-o-matic With Watson” (pictured above), a tool that would recognize meaning in images and language in order to, in theory, automatically generate a sizzle-reel preview of a movie or TV pilot based on its script.
Grossman’s idea is to use AI to analyze, index and splice together “rips” from videos that represent a filmmaker’s vision. By recognizing what appears in each image and classifying its meaning against the written script, Watson would find and assemble the content that best matches the screenplay, including specific lines, time periods and locations.
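To make the concept concrete, here is a minimal, purely illustrative sketch of the matching step: given a library of clips tagged with visual-recognition labels, pick the clip whose tags best overlap each script line. All names, data and the keyword-overlap heuristic are invented for illustration; Watson’s actual visual recognition and language services are far more sophisticated than this.

```python
# Hypothetical sketch of the "Rip-o-matic" matching step.
# Clip tags stand in for visual-recognition output; the matching
# heuristic is simple keyword overlap, chosen only for clarity.

def keywords(text):
    """Lowercase word set for a script line, ignoring common stop words."""
    stop = {"a", "an", "the", "in", "on", "at", "of", "and", "to"}
    return {w.strip(".,!?").lower() for w in text.split()} - stop

def best_clip(script_line, clip_tags):
    """Return the clip whose tag set overlaps the line's keywords most."""
    line_kw = keywords(script_line)
    return max(clip_tags, key=lambda cid: len(line_kw & set(clip_tags[cid])))

# Invented example library and script.
clips = {
    "clip_01.mp4": ["desert", "sunset", "car"],
    "clip_02.mp4": ["city", "night", "rain"],
}

script = [
    "A car races across the desert at sunset.",
    "Rain hammers the city at night.",
]

rip = [best_clip(line, clips) for line in script]
print(rip)  # ['clip_01.mp4', 'clip_02.mp4']
```

A real pipeline would replace the keyword overlap with semantic similarity between recognized image content and the script’s language, which is the part Grossman proposes Watson would supply.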
“Ultimately, the people who will benefit the most will be the audience, because better Rip-o-matics will lead to better movies and TV shows,” Grossman wrote in submitting his idea for the contest.
The jurors said they chose Grossman’s “Rip-o-matic With Watson” concept “for his vision to tell a story, taking away barriers with a strong vision to push the envelope of visual recognition utilizing Watson.”
Of course, for now, the “Rip-o-matic” is just hypothetical. And even if it works as envisioned, it’s not clear how useful such an application ultimately would prove to be to filmmakers.
As the winner, Grossman will receive IBM mentorship to develop the idea and potentially launch it as a product. In addition, he’s getting a free trip for two to the 2018 Tribeca Film Festival. The other finalists had proposed using IBM Watson to assist in choreography, to simplify the script-review process, to improve film marketing, and to enable real-time language translation in a VR environment.
“Technology helps all of us to find opportunities where we didn’t know it existed,” said Rob High, IBM fellow and VP and CTO of IBM Watson. “We can use cognitive computing to help us to make better decisions.”
The “Storytellers With Watson” jury members were IBM’s High; Paula Weinstein, Tribeca EVP, creative and programming; Sharon Badal, Tribeca VP, filmmaker relations and shorts programming; and Rashmy Chatterjee, CMO, IBM North America.
The jury also gave an honorable mention to Sadaf Amouzegar, data scientist at End Cue, for “ScriptAloud,” which would use Watson Text to Speech and Tone Analyzer to transform written scripts into audio files, letting casting directors and producers listen to a script instead of having to read it.
The three other contenders were: Mary John Frank’s “Watson Dance Assistance,” which would produce dance-choreography visualizations; Kevin Cornish’s “Human Conversation Project,” offering real-time translation via Watson Language Translator; and Billee Howard’s “Watson Brand Studio Model: Powering the Content Brands of Tomorrow,” which aims to use AI to find the key elements of successful marketing campaigns based on “emotional levers.”