Interesting news.
https://www.hpcwire.com/2023/11/13/training-of-1-trillion-parameter-scientific-ai-begins/
What's interesting is not even that a 1T-parameter model is being trained (if it's MoE, there have been bigger ones), but that it's not being done on Nvidia hardware. Could this finally be real competition?
"Argonne National Laboratory (ANL) is creating a generative AI model called AuroraGPT and is pouring a giant mass of scientific information into creating the brain.
The model is being trained on its Aurora supercomputer, which delivers more than half an exaflop of performance at ANL. The system has Intel's Ponte Vecchio GPUs, which provide the main computing power."
...
"Brkic said its Ponte Vecchio GPUs outperformed Nvidia’s A100 GPUs in another Argonne supercomputer called Theta, which has a peak performance of 11.7 petaflops."
Source: gonzo-обзоры ML статей
2023-11-15 21:56:38