{"id":822206,"date":"2025-01-26T11:12:03","date_gmt":"2025-01-26T17:12:03","guid":{"rendered":"https:\/\/newsycanuse.com\/index.php\/2025\/01\/26\/coming-to-theaters-an-ai-generated-bollywood-movie\/"},"modified":"2025-01-26T11:12:03","modified_gmt":"2025-01-26T17:12:03","slug":"coming-to-theaters-an-ai-generated-bollywood-movie","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2025\/01\/26\/coming-to-theaters-an-ai-generated-bollywood-movie\/","title":{"rendered":"Coming to Theaters: An AI-Generated Bollywood Movie"},"content":{"rendered":"<p>Moviemakers used AI image generators to create characters, then fed those characters into video generators. <\/p>\n<div data-headline=\"Coming to Theaters: An AI-Generated Bollywood Movie\">\n<p><strong>By now, you\u2019ve likely<\/strong> seen the short videos produced using AI video-generation tools, which make it possible to create photorealistic clips of several seconds from a simple text prompt. An Indian startup is now pushing the technology to its limits: It plans to release, by the end of 2025, a feature-length movie created almost entirely with <a href=\"https:\/\/spectrum.ieee.org\/tag\/generative-ai\" target=\"_self\">generative AI<\/a> tools.\n<\/p>\n<p><a href=\"https:\/\/intelliflicks.com\/\" rel=\"noopener noreferrer\" target=\"_blank\">Intelliflicks Studios<\/a>, based in Chandigarh, is the brainchild of author <a href=\"https:\/\/x.com\/Singhkhushwant\" target=\"_blank\">Khushwant Singh<\/a> and <a href=\"https:\/\/www.linkedin.com\/in\/gurdeep-pall-0aa639bb\/\" rel=\"noopener noreferrer\" target=\"_blank\">Gurdeep Pall<\/a>, president of AI strategy at Qualtrics, in Seattle, and former corporate vice president of AI incubations at <a href=\"https:\/\/spectrum.ieee.org\/tag\/microsoft\" target=\"_self\">Microsoft<\/a>. 
The studio is creating a screen adaptation of Singh\u2019s 2014 novel <a href=\"https:\/\/books.google.co.in\/books\/about\/Maharaja_in_Denims.html?id=i1iboAEACAAJ&#038;redir_esc=y\" target=\"_blank\"><em>Maharaja in Denims<\/em><\/a>, which tells the story of a young man in the present day who believes he is a reincarnation of <a href=\"https:\/\/en.wikipedia.org\/wiki\/Ranjit_Singh\" target=\"_blank\">Maharaja Ranjit Singh<\/a>, the founder of the 19th-century Sikh Empire.\n<\/p>\n<p>\n\tSingh says studio bosses in Bollywood have twice purchased film rights for the book, but the complexity and cost of telling a story spanning several time periods meant the movie never got made. So when Pall, a childhood friend of Singh\u2019s, told him about the rapidly improving capabilities of AI video generators, the pair decided to join forces and create what they say will be the first feature-length <a href=\"https:\/\/spectrum.ieee.org\/tag\/generative-ai\">generative AI<\/a> movie. \u201cWe are trying to take a pathbreaking step to show the capability of the technology,\u201d says Singh.\n<\/p>\n<h2>What generative AI tools are they using?<\/h2>\n<p>\n\tThe company is using a suite of commercial and open-source AI tools to make the movie, according to Pall, and is developing its own software to manage the novel workflows. It\u2019s using image-generation models to produce character designs, scenes, and objects that are then fed into video-generation models. Other AI tools are used to create audio, lip-sync dialogue, and sharpen images. Pall says his team is also using conventional video production tools for simpler jobs like matching lighting and color between scenes.\n<\/p>\n<p>\n\tThe developers are primarily using pretrained models, and Pall says they have also fine-tuned some models on <a href=\"https:\/\/spectrum.ieee.org\/tag\/india\" target=\"_self\">India<\/a>-specific data. But in some cases, fine-tuning isn\u2019t enough. 
One scene involves a woman performing Kathak, a classical dance of northern India, and Pall says that gathering enough data to train a model to generate the dance would be impractical. Instead, they plan to record a real Kathak performance and use AI to swap in the face of an AI-generated character.<\/p>\n<p><span data-rm-shortcode-id=\"5ae092624cc2a06d59279784d22eea67\"><iframe frameborder=\"0\" height=\"auto\" type=\"lazy-iframe\" scrolling=\"no\" data-runner-src=\"https:\/\/www.youtube.com\/embed\/KA4D2B0XNHQ?rel=0\" width=\"100%\"><\/iframe><\/span><small placeholder=\"Add Photo Caption...\">Intelliflicks Studios released this trailer for the AI-generated feature film that it plans to release this year. <\/small><small placeholder=\"Add Photo Credit...\">Intelliflicks Studios<\/small><\/p>\n<p>The biggest challenge the team has faced is consistency, according to Pall. Generative AI is inherently probabilistic, so a model\u2019s response to a particular prompt will be different every time. This can make things tricky when a character must have the same appearance throughout a feature-length film.<\/p>\n<p>This challenge became significantly more manageable in the last year, as many models can now add a digital tag to each output. This tag can be added to future prompts to ensure that the model follows a similar style when it generates a new clip. The re-creations are never perfect, though, Pall says, adding that his team is adapting to the constraints of the technology. \u201cYou have to look at it like a new medium,\u201d he explains. 
\u201cYou can\u2019t paint the same thing with watercolors as you can with oil.\u201d<\/p>\n<h2>What do outside experts think?<\/h2>\n<p><a href=\"https:\/\/www.linkedin.com\/in\/jamie-umpherson-03579710\/\" rel=\"noopener noreferrer\" target=\"_blank\">Jamie Umpherson<\/a>, head of creative at the AI video startup <a href=\"https:\/\/runwayml.com\/\" target=\"_blank\">Runway<\/a>, in New York City, says the most successful AI video projects are those that understand the technology\u2019s limitations and lean into them to enhance the storytelling. Yet the technology is constantly improving, he adds, so some of these limitations may be short-lived.<\/p>\n<p>Still, creating a feature-length film with today\u2019s technology is a bit of a stretch. Umpherson says most of Runway\u2019s customers\u2014which include film studios, advertising agencies, and independent artists\u2014use the technology to rapidly iterate ideas early in the creative process or to generate visual effects that supplement live action. \u201cTo create an entirely generated film is definitely possible,\u201d he declares, but it will require \u201can incredible amount of artistry.\u201d<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" alt=\"An image of a person in a stream.  
\" data-rm-shortcode-id=\"3f8f66db940ae28988fefad7ab843c0b\" data-rm-shortcode-name=\"rebelmouse-image\" src=\"https:\/\/spectrum.ieee.org\/media-library\/an-image-of-a-person-in-a-stream.png?id=55387140&#038;width=980\" height=\"1935\" id=\"70f78\" lazy-loadable=\"true\" width=\"3456\"><small placeholder=\"Add Photo Caption...\">Many of today\u2019s video generators now provide a tag with each generated clip, which can be added to the next prompt to improve continuity.<\/small><small placeholder=\"Add Photo Credit...\"><br \/>\n            Intelliflicks Studios<br \/>\n        <\/small><\/p>\n<p>Part of the challenge, says<br \/>\n\t<a href=\"https:\/\/abedavis.com\/\" target=\"_blank\">Abe Davis<\/a>, an assistant professor of computer science at Cornell University, is that these tools are designed to generate high-fidelity video with minimal input from the user\u2014they take control of the details that would normally require human decision-making. That automation lets a layperson quickly generate a clip, but it can frustrate someone with expertise and a vision. \u201cPeople underestimate the number of relevant or important decisions that a filmmaker actually wants to make,\u201d says Davis.\n<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" alt=\"An image of a woman dancing.\" data-rm-shortcode-id=\"a7715b46fffe598f8d0cb3c5ae3c5860\" data-rm-shortcode-name=\"rebelmouse-image\" src=\"https:\/\/spectrum.ieee.org\/media-library\/an-image-of-a-woman-dancing.png?id=55387143&#038;width=980\" height=\"1934\" id=\"f666e\" lazy-loadable=\"true\" width=\"3456\"><small placeholder=\"Add Photo Caption...\">The AI-generated movie is set both in the modern world and the 19th century.  <\/small><small placeholder=\"Add Photo Credit...\">Intelliflicks Studios<\/small><\/p>\n<p>\n\tTake, for example, a decision about how an actor should deliver a line; that direction may be hard to articulate in a text prompt. 
And yet all these details need to remain consistent throughout the video, Davis adds, which becomes increasingly difficult as the video gets longer.\n<\/p>\n<p>\n\tSingh admits that the first AI-generated feature film is likely to be distinctly different from those produced conventionally. But he\u2019s hopeful that this technology will break down the structural barriers that prevent people from expressing their creativity. AI is a game changer, Singh says: \u201cI think this will democratize filmmaking in a huge way.\u201d<\/p>\n<\/div>\n<p><a href=\"https:\/\/spectrum.ieee.org\/ai-movie\" class=\"button purchase\" rel=\"nofollow noopener\" target=\"_blank\">Read More<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Moviemakers used AI image generators to create characters, then fed those characters into video generators. By now, you\u2019ve likely seen the short videos produced using AI video-generation tools, which make it possible to create photorealistic clips of several seconds from a simple text prompt. 
An Indian startup is now pushing the technology to its limits:<\/p>\n","protected":false},"author":1,"featured_media":822207,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1156,2283],"tags":[7787,14932],"class_list":{"0":"post-822206","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-coming","8":"category-theaters","9":"tag-coming","10":"tag-theaters"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/822206","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/comments?post=822206"}],"version-history":[{"count":0,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/822206\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media\/822207"}],"wp:attachment":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media?parent=822206"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/categories?post=822206"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/tags?post=822206"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}