‘Hard to find high-quality data for complex reasoning’
China’s breakthrough with DeepSeek will not affect investments in GenAI science or core research in reasoning and multimodal capabilities; it only demonstrates that models can be built faster and cheaper, said Niki Parmar, 34, currently a member of technical staff at Anthropic, developer of the Claude large language model. One of the eight Google scientists who co-authored the landmark transformer paper, Attention Is All You Need, Parmar told Himanshi Lohchab that the boundaries between research and engineering have blurred. Parmar went from research scientist at Google Brain to co-founding two AI startups, Adept AI and Essential AI, before taking her current full-time role at Anthropic. Edited excerpts: