Will infra bottlenecks sour LLM dreams?
As India accelerates its efforts to build a homegrown large language model (LLM), a critical question emerges: is the country's computing infrastructure geared up to support such AI workloads? After all, training large AI models requires massive processing capability, an area where the US and China currently lead with their advanced supercomputing infrastructure.
What about data quality? While India generates humongous volumes of data, most of it is unstructured and needs extensive cleaning and organisation before it can be used for AI training.