ET Infographic | Running out of data

With foundation models running out of fresh data to train on, companies are betting on synthetic data to train their models. When Meta released the latest iteration of its open-source model, Llama 3.1 405B, it also updated the model's licence to permit using its outputs to generate synthetic data, which can then be used to train smaller proprietary models. On paper, this sounds plausible.
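A minimal sketch of what that workflow could look like: a large "teacher" model (such as Llama 3.1 405B served behind an OpenAI-compatible endpoint) generates synthetic question/answer pairs, which are saved as a dataset for fine-tuning a smaller "student" model. The endpoint URL, model identifier, topics, and prompt format below are illustrative assumptions, not details from the article.

```python
import json
from openai import OpenAI

# Assumed: any OpenAI-compatible server hosting the large teacher model locally or remotely.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-for-local")

SEED_TOPICS = ["database indexing", "HTTP caching", "unit testing"]

def generate_pair(topic: str) -> dict:
    """Ask the teacher model to produce one synthetic instruction/response pair."""
    prompt = (
        f"Write one concise technical question about {topic}, then answer it.\n"
        "Format:\nQ: <question>\nA: <answer>"
    )
    resp = client.chat.completions.create(
        model="llama-3.1-405b-instruct",  # assumed model identifier on the serving endpoint
        messages=[{"role": "user", "content": prompt}],
        temperature=0.9,  # higher temperature encourages more varied synthetic examples
    )
    return {"topic": topic, "text": resp.choices[0].message.content}

if __name__ == "__main__":
    # Write the synthetic examples to a JSONL file, a common input format
    # for fine-tuning a smaller model.
    with open("synthetic_data.jsonl", "w") as f:
        for topic in SEED_TOPICS:
            f.write(json.dumps(generate_pair(topic)) + "\n")
```

In practice the generated pairs would also be filtered and deduplicated before being fed into the smaller model's training run.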
