Digital
Cutting-edge filmmaker EiPi Media’s love affair with generative AI
MUMBAI: Rohit Reddy may be lounging in his monochrome threads and bucket hat like it’s a lazy Sunday, but don’t let the chill vibe fool you: this man’s schedule is more packed than a Mumbai traffic jam. From his third-floor creative bunker in the chaos capital of Mumbai, he is juggling deadlines like flaming batons. Between AI wizardry and influencer fire drills, Reddy barely has time to glance at the traffic blazing past his floor-to-ceiling windows, let alone sip his coffee while it’s still hot.
As one is ushered into his conference room, he flashes a grin and shoots straight from the hip, “Sorry to keep you waiting! Got caught in an important business call.”
No fluff, no filter, just the kind of honest hustle you’d expect from a man steering the ship as advertising agency EiPi Media’s founder & CEO. Keeping pace with him are creative brain Tapoja Roy, who scripts the stories, and Nikhil Chhabria, the agency’s go-to GenAI expert.
EiPi Media isn’t just making noise, it is orchestrating a full-blown content symphony. Whether it’s slick influencer videos, jaw-dropping CGI, or its latest AI-fueled experiments, this crew’s rewriting the rules of brand storytelling—one pixel at a time.
Forget your preconceived notions of a typical production house. EiPi Media didn’t exactly start with cameras and a clapperboard. Instead, it kicked off in 2018 as a social media marketing agency, leveraging the television connections of Rohit’s wife, co-founder and actress Anita Hassanandani, to exclusively manage TV artists.
“There was nobody managing TV stars at scale at that time,” recalls Rohit, who spent a good dozen years in finance and insurance before turning entrepreneur. “So we began the agency, but we didn’t want to onboard any talent as such. We were doing this whole brokering deal.”
Initially, its client roster was a cosy club of four or five friends in marketing, including big names like Domino’s and Neo, to whom it supplied artists for a fee. But the pandemic, a rude awakening for many, proved to be EiPi Media’s unlikely launchpad.

“In 2020, when the pandemic happened, all these clients of ours, they cancelled the POs,” the founder explains. “That time I realised that I cannot be dependent on just a few clients.”
This realisation sparked an aggressive sales drive, leading to inroads with giants like Nestle and P&G. As brands shifted television budgets to digital, EiPi Media found itself in the sweet spot, growing a whopping 8x in 2020. The team quickly evolved from merely supplying talent to offering creative ideation and, eventually, full-blown production. “Brands had a lot of comfort because they had to only talk to one person,” he notes, highlighting the firm’s end-to-end, in-house model as a key differentiator.
EiPi Media’s ascent wasn’t just about diversification; it was about embracing cutting-edge tech. 2020 marked its deep dive into visual effects, earning the agency a reputation as a “marketing tech company.”
It was creating “frugal productions” at a time when big production houses wouldn’t touch small budgets.
Then came 2021, and with it, a massive leap into CGI. EiPi Media got busy crafting dinosaurs and animations for internal projects. “I was very sure that I needed to take this CGI initiative to brands,” Reddy asserts.
Its big break came with Adidas, which had just signed the Indian cricket team. EiPi Media delivered a CGI video for the jersey launch. Since then, it has churned out over 60 large and hundreds of smaller CGI campaigns.
“For us, CGI was always an extension of VFX,” he clarifies, “it was always improving the content.” This foresight gave EiPi a two-year head start on the competition.
Just as others were jumping on the CGI bandwagon, EiPi Media was already pivoting to AI. Its first AI video dropped in October 2023, well before brands even grasped its potential.
“We kind of saw the vision that AI would actually be a very, very strong pillar to production,” he states.
The real game-changer with AI, he believes, is its ability to “actually replace production.” While pre-production (concept, script, storyboard, casting) and post-production (editing, music, colour grading) remain manual, the entire production phase is now happening on computers. “There is no casting. There is no hair makeup. There is no costume. There is no actor. There is no director,” he enthuses. This significantly slashes costs and turnaround times, a true relief for clients.
HDFC Bank and Fenesta Windows were among its first clients to embrace generative AI commercials, alongside international brands like South African noodle company Indomie and Lenovo.
What excites Reddy the most about generative AI?
“There’s no limit to creativity now,” he shrugs. “Ideas that were once too expensive or physically impossible to execute in traditional or even CGI production are now within reach. Imagine a conversation on Mars with participants levitating; that becomes possible at a fraction of the cost and time.”
While some might argue AI stifles creativity, he believes the current limitations lie with the nascent technology, not human ingenuity. Its viral “Mahabharata 1.0” video, made in just six hours as an in-house project, proved the concept. The recently released “Mahabharata 2800,” an upgraded version, showcases the rapid advancements in tools.
The decision to create a generative AI film based on the Mahabharata was driven by its boundless storytelling potential. “It has so many layers, so many characters,” he explains. The epic’s fantastical elements also play well with current AI capabilities. Though its first Mahabharata trailer went viral, sparking calls from major publications, he cautions that making a full-length film with generative AI is “not at all easy” yet, as the tools aren’t quite there for complex storytelling. Disconnected content, like trailers, is where AI shines for now.
The demand for generative AI content is skyrocketing, but supply is scarce, points out Reddy.
“We are the only people supplying good content,” he claims, attributing the agency’s advantage to its extensive experience in traditional filmmaking. “We understand storytelling. We understand scripting. We have everything in-house.”
This blend of creative and tech expertise positions EiPi perfectly to ride the AI wave.

Reddy predicts a hybrid future for TV commercials, where elements like exterior shots or traffic scenes might be generated by AI, while core scenes will be traditionally filmed. Smaller budget ad films (those around Rs 10-15 lakhs) are ripe for a full AI shift, potentially reducing costs to Rs 3-4 lakhs.
EiPi Media’s traditional filmmaking team boasts around 30 people, while its burgeoning AI team, currently eight strong, is focused on learning and experimenting. Hiring is less about age and more about interest and strong English skills for effective prompting. He laments the lack of impressive AI-generated visuals in India, partly due to the unavailability of tools like Google’s Veo 3. Fortunately, EiPi Media’s Ohio office gives the outfit early access to such innovations.
Its creative team, a lean but experienced trio, has penned over 3,000 scripts. For CGI, it outsources to Iran and Russia when the need arises, favouring the artists’ attention to detail and quality over Indian talent, who are more often than not tied up with Hollywood projects.
On the gen AI front, the team leverages a suite of tools, including Midjourney (now generating videos), Halo, Google Veo 3 (praised for its lip-sync function), and Runway.
He believes the playing field for generative AI is level globally. “The only people having an advantage are people who are investing more time than the others.”
While he foresees AI complementing and eventually replacing traditional filmmaking in genres like mythology and fantasy, he believes it will take about five years for the technology to fully mature for comprehensive storytelling.

EiPi Media’s focus will remain firmly on branded content, leveraging video as the primary communication medium. Its future plans involve significant investment in an R&D department dedicated to “just experimenting tools, going crazy, basically.” He anticipates that within the next two years, directors and producers will increasingly outsource specific scenes and elements to AI, particularly those that are not cost-effective, or are too time-consuming, to shoot traditionally.
It does not take too much intelligence to guess who will end up getting the fruits of this transition.
Digital
IDS 2026: AI rewires media value chain, says JioStar’s Prashant Khanna
BENGALURU: Artificial intelligence is rapidly becoming the operating backbone of the media industry, transforming everything from content creation to distribution, said Prashant Khanna, JioStar head of sports and live experiences, production technology and services, at the India Digital Summit 2026.
Speaking at a panel on automating the content value chain organised by IAMAI, Khanna said AI was no longer a peripheral tool but a core layer enabling scale, precision and personalisation across media workflows.
Live sports, he noted, requires unparalleled accuracy, with tens of millions of viewers watching in real time. AI-driven systems are now helping production teams move from reactive execution to predictive storytelling, using data, context and historical patterns to anticipate visuals, graphics and narrative elements before they are needed.
This shift, Khanna said, allows creative professionals to focus more on storytelling while automation handles manual processes.
Beyond production, AI is reshaping distribution by enabling the same live content to be delivered across multiple formats, from vertical video and short highlights to extended recaps and full-length broadcasts, tailored to different viewing preferences.
According to Khanna, seamless automation across the value chain is increasingly central to acquiring viewers and deepening engagement. He added that AI is also democratising premium production experiences, making features such as high-quality language commentary, advanced camera work, auto-framing and real-time adaptation accessible at scale.
Addressing the rise of AI-generated content, Khanna said technology lowers barriers to entry but does not replace the need for strong storytelling. Its true power lies in expanding creative possibilities rather than substituting narrative craft.
Looking ahead, he predicted a more immersive and interactive future for live entertainment, driven by virtual reality, second-screen experiences and personalised data layers, allowing fans to curate their own viewing experiences.
In Khanna’s view, AI’s true impact on media will be measured not by novelty, but by how seamlessly it integrates creativity, certainty and scale, turning the entire content lifecycle into a more intelligent, responsive and inclusive system.
Digital
Why AI’s Next Big Flex is Knowing When to Zip It
MUMBAI: We’ve all been sold the same sci-fi fever dream for decades: the invisible digital butler. The Jarvis to our Tony Stark, if you will. An intelligence that doesn’t wait for a prompt but simply exists in the periphery, whispering the right answer before you’ve even finished forming the question.
Recent moves from the tech giants suggest we’re finally crossing the threshold into “personal intelligence,” a system that pulls context across your entire digital life. We have, thankfully, graduated from the “goldfish amnesia” phase of early LLMs. Context windows and memory features have given AI a decent short-term recall, but we are still languishing in the uncanny valley of partial context. You’ve likely had that moment where you stare at a generated response and wonder, “What on earth made you think that was what I wanted?” Custom instructions and pinned memories can only do so much heavy lifting when the AI is still looking at your life through a keyhole.
But as AI moves from a tool we “talk to” to a system that essentially lives in our OS, the industry is obsessed with the wrong metric. We’re still counting parameters and bragging about reasoning capabilities. The real breakthrough isn’t going to be how much the AI knows; it’s going to be how much it chooses to ignore.
From “Helpful” to “Opinionated”
When AI starts linking context across your life, it ceases to be a neutral tool and starts becoming an opinionated system. This is where the “intelligence” narrative gets spicy. At their core, Large Language Models still function as high-speed autocomplete. They predict the next word in a sequence based on a generic world-view, and that isn’t fundamentally changing. What is changing is the rise of agentic AI. Agents sit around the model, interacting with tools, data, and the environment to observe context, react to signals, and take action. Personal intelligence, then, becomes about how those predictions get applied to your specific history.
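To make that “agent around the model” idea concrete, here is a minimal, purely illustrative Python sketch of the loop. Everything in it is an assumption for illustration only: the model call is a hard-coded stub and the calendar tool is a hypothetical placeholder, not any vendor’s actual API.

from dataclasses import dataclass, field
from typing import Callable


@dataclass
class Agent:
    """Wraps a next-word predictor with the context and tools it can act through."""
    tools: dict[str, Callable[[str], str]]
    memory: list[str] = field(default_factory=list)

    def model(self, prompt: str) -> str:
        # Stand-in for an LLM call. A real agent would send the prompt (plus
        # memory) to a language model and parse its reply; here the behaviour
        # is hard-coded so the sketch runs on its own.
        if "calendar ->" in prompt:
            return "ANSWER You have two meetings and are free after 4pm."
        if "calendar" in prompt.lower():
            return "USE_TOOL calendar today"
        return "ANSWER Nothing to act on."

    def step(self, user_request: str) -> str:
        # Observe: fold the request and accumulated context into one prompt.
        prompt = "\n".join(self.memory + [user_request])
        decision = self.model(prompt)

        # Act: either call a tool and remember the observation, or answer.
        if decision.startswith("USE_TOOL"):
            _, tool_name, arg = decision.split(maxsplit=2)
            observation = self.tools[tool_name](arg)
            self.memory.append(f"{tool_name} -> {observation}")
            return self.step(user_request)  # react to the new signal
        return decision.removeprefix("ANSWER ").strip()


if __name__ == "__main__":
    # Hypothetical calendar tool; any real integration would replace this stub.
    agent = Agent(tools={"calendar": lambda day: "two meetings, free after 4pm"})
    print(agent.step("What does my calendar look like today?"))

The point of the toy is the shape of the loop: the model only predicts text, while the agent supplies context, calls tools and folds the results back in before answering. Personal intelligence lives in that wrapper, not in the predictor itself.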
If these agents know your budget, your health goals, and your calendar, and you ask for a dinner recommendation, does it give you what you want or what it thinks you need? Imagine a scenario where you’ve had a brutal day at work, and you just want a greasy burger. However, your AI “sees” your high cortisol levels and the fact that you’ve missed your last three gym sessions. Does it “helpfully” bury the burger joint in the search results and prioritize a salad bar instead?
At what point does “helpful context” become a digital nanny? This isn’t just a UI challenge; it’s a fundamental shift in the power dynamic between human and machine. As these systems grow more proactive, governance can’t just be about data privacy; it has to be about agency. We need to ensure that as AI gets better at recognizing our needs, it doesn’t start dictating them to us. A system that “knows best” is only one bad update away from becoming a system that “knows better than you.” If an AI becomes too opinionated, it doesn’t solve friction; it creates a new kind of psychological tax where the user feels they have to “fight” their assistant to get what they actually want.
Designing the Invisible (and Avoiding the Creepy)
There is a razor-thin line between an AI that feels like a superpower and one that feels like a digital stalker. The tech industry has a pathological need to show its work. Usually, when a system gains a new capability, the marketing instinct is to broadcast it. But in the world of personal intelligence, this “Are you proud of me?” approach to software engineering is a fast track to the uncanny valley.
The goal for personal intelligence should be to become digital wallpaper: essential, but unnoticed. The moment an AI “interrupts” to show off how much it knows about you, it has failed. To make AI feel invisible rather than invasive, we have to master the art of the “nudge.” This requires a deep understanding of human psychology and, by extension, the art of shutting up.
The Ultimate Advantage: Strategic Restraint
The “hero narrative” of AI has always been about more: more data, more speed, more answers. But as we move into the era of personal intelligence, the ultimate competitive advantage is going to be restraint. This is a concept we rarely talk about in Silicon Valley, where “growth” and “engagement” are the primary gods. However, for a system to be truly personal, it must respect the sanctity of the user’s focus.
In the real world, the smartest person in the room is rarely the loudest; it’s the one who knows exactly when to chime in and when to stay silent. The same applies to our silicon counterparts. The engineering challenge is no longer just about building a model that can pass the Bar Exam or write a sonnet in the voice of a 17th-century pirate. The real challenge is building a model that has access to your deepest digital secrets and has the “wisdom” to do absolutely nothing with them until the exact moment it actually matters.
This brings us to the core question: Is the next AI advantage about intelligence, or about knowing when not to act on personal data?
If a company can prove that its AI has the discipline to stay in the background, it will win the one thing that is currently in shortest supply: trust. We are reaching “intelligence saturation.” Every major player has a model that is “smart.” What they don’t all have is a philosophy of silence. Knowing when not to act is the highest form of intelligence because it requires a level of contextual nuance that goes beyond pattern matching. It requires an understanding of human boundaries.
Digital
Stockholding rolls out StockFin 2.0 app to simplify investing nationwide
MUMBAI: When investing meets a software refresh, ease is the real upgrade. Stockholding Services Limited has rolled out Stockfin 2.0 nationwide, positioning the revamped investing app as a one-stop, mobile-first platform aimed at widening retail participation across India.
Designed to work as smoothly in metro markets as in fast-growing tier II and tier III cities, Stockfin 2.0 reflects the changing profile of India’s investors. Built on a future-ready architecture, the app features upgraded performance, a refreshed interface and a simplified structure intended to make market participation less intimidating and more intuitive.
The platform brings together equities, derivatives, stock SIPs, mutual funds, ETFs, SME stocks and IPOs within a single interface. Product-wise grouping allows users to navigate quickly, while a clean dashboard offers real-time snapshots of market indices, portfolio value, top gainers and losers, and profit and loss positions.
For investors seeking deeper insight, Stockfin 2.0 includes screeners, technical indicators, research calls and detailed reports. Short-term traders are catered to with a dedicated ‘Buy Today, Sell Tomorrow’ section, while goal-based mutual fund flows aim to simplify long-term financial planning.
The app also focuses on execution and security. Best price routing directs trades to the exchange offering the most competitive price, while MPIN, biometric login and OTP-based verification reinforce account safety. Personalisation options, including themes, font sizes and saved order settings, add flexibility to the user experience.
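Best price routing itself is simple to illustrate: compare live quotes for the same instrument across exchanges and send the order to the venue with the most favourable price. The Python sketch below is a hypothetical toy under that assumption, not Stockfin 2.0’s actual implementation; the exchange names and quote figures are made up.

from dataclasses import dataclass


@dataclass
class Quote:
    exchange: str
    bid: float  # best price a buyer on that venue is currently offering
    ask: float  # best price a seller on that venue is currently asking


def route_order(quotes: list[Quote], side: str) -> Quote:
    """Pick the venue with the most competitive price for the given side."""
    if side == "buy":
        return min(quotes, key=lambda q: q.ask)  # cheapest venue to buy from
    return max(quotes, key=lambda q: q.bid)      # best venue to sell into


if __name__ == "__main__":
    # Made-up quotes for one stock on two exchanges, purely for illustration.
    quotes = [
        Quote("NSE", bid=101.10, ask=101.20),
        Quote("BSE", bid=101.05, ask=101.15),
    ]
    best = route_order(quotes, side="buy")
    print(f"Route buy order to {best.exchange} at {best.ask}")

In practice a broker’s router would also weigh liquidity, charges and regulatory constraints, but the price comparison step is the core of the feature as described.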
Speaking at the launch, officials highlighted the role of technology-led platforms in expanding financial inclusion and supporting India’s broader digital and self-reliance goals. Company leadership described Stockfin 2.0 as more than a cosmetic upgrade, positioning it as a step towards making investing more accessible, informed and dependable for retail participants nationwide.
Backed by StockHolding’s long-standing presence in financial services, the new app is aimed at investors who want real-time insights, secure access and the ability to manage multiple asset classes on the move, all without losing clarity in a fast-moving market.