Kyle Wiggers / TechCrunch:
Google’s PaLM 2 paper is forthcoming about many of the LLM’s major limitations, but doesn’t reveal which data or hardware setup the company used to train the model — At its annual I/O conference, Google unveiled PaLM 2, the successor to its PaLM large language model for understanding and generating multilingual text.