Sweden's Berzelius supercomputer is getting an AI performance boost from additional Nvidia systems.
The supercomputer at Linköping University has already been used to create a family of large language models, known as GPT-SW3, that can generate text in Swedish.
More:
- Now, Berzelius will gain another 34 Nvidia DGX A100 systems for AI workloads, adding to the 60 it already has.
- Each DGX A100 system contains eight Nvidia A100 GPUs with 80GB of memory apiece; the expanded machine will rank among the world's 100 fastest AI supercomputers.
- Berzelius previously trained a Swedish large language model with 20 billion parameters. According to Nvidia, it could now help create a model of nearly 175 billion parameters that would also be fluent in Danish and Norwegian.
- Berzelius is operated under the Wallenberg Artificial Intelligence, Autonomous Systems and Software Program, a Swedish research initiative funded largely by the Knut and Alice Wallenberg Foundation.