Ever since Karthi Purushothaman launched 1hat, an AI-first healthcare startup, he hasn’t worried that patients won’t trust an algorithmic doctor. He’s worried the algorithm might not trust him.

The algorithm isn’t his. 1hat offers agentic AI services: an invisible layer of healthcare made up of voice notes that turn into records, doctor “twins” that remember patient histories, bots that nudge people to take their pills, and even co-pilots for insurers. But it all runs on top of large language models (LLMs) built by OpenAI, Anthropic, Google, and Meta. Take your pick.

The plug, the power, the brain—it all belongs to Big AI.

Any one of these companies could become inaccessible or too expensive, or decide to compete directly. Worse, they could hold startups like Karthi’s captive to their own stacks, “while regulators are still drafting the guardrails”.

It’s the kind of dependency that’s making India’s antitrust watchdog nervous.

The Competition Commission of India (CCI) in October released its first market study on AI—its attempt to make sense of a world where hundreds of local startups depend on a handful of global models for survival. And its big idea is to ask everyone to “self-audit” their algorithms. Keep a paper trail of data sources, model inputs, and objectives.

“The Commission is asking you to audit every touchpoint where you consume data,” said Shashank K, co-founder of Redacto, a privacy-infrastructure startup that just raised Rs 12 crore from Peer Capital and Antler India.

It’s a reasonable idea. Modern, even. Except that the same report also found 90% of Indian AI firms don’t have reliable access to data in the first place. It’s a little like asking tenants to account for the furniture in a landlord’s house.

And this is the paradox the CCI is walking into: India’s AI boom is built almost entirely on borrowed infrastructure. Almost all of the roughly 900 GenAI startups in India counted by Nasscom are doing some version of what Karthi is doing: wrapping a global foundation model in local skin.

But the technical term for this—“verticalising”—only works until the model owner turns up.