News

We’re creating a lot of buzz!

Read our latest news, press releases, and interviews. Subscribe to our monthly newsletter and follow us on social media to stay up to date with our announcements.

November 12, 2018 – BeeGFS wins the HPCwire award for Best HPC Storage Product or Technology at SC18

ThinkParQ’s leading parallel cluster file system, BeeGFS, has been recognized in the annual HPCwire Readers’ and Editors’ Choice Awards, presented at the 2018 International Conference for High Performance Computing, Networking, Storage and Analysis (SC18) in Dallas, Texas.

October 25, 2018 – BeeGFS-based burst buffer enables world-record hyperscale data distribution

ThinkParQ announced today that BeeGFS, the leading parallel cluster file system, enabled a world-record hyperscale data distribution with AIC and Zettar. For the third time running since 2016, BeeGFS has been instrumental in enabling three of Zettar’s world records, including the latest ‘Holy Grail’ record run: a long-distance data transfer of 1 PB in 29 hours (1 PiB = 1.1259 PB).

August 1, 2018 – Exclusive Platinum Partnership with Advanced HPC

Advanced HPC, a leading HPC specialist and solutions provider, announced today that it has become the first and only U.S. Platinum Partner of BeeGFS, the globally renowned parallel cluster file system.

March 27, 2018 – ThinkParQ teams up with Nyriad for GPU-accelerated storage

At NVIDIA GTC in San Jose, Nyriad and ThinkParQ announced a partnership to develop a certification program for high-performance, resilient storage systems that combine the BeeGFS parallel file system with NSULATE, Nyriad’s solution for GPU-accelerated storage processing.

March 8, 2018 – TUK in Germany installs NEC LX Supercomputer with Intel Omni-Path

NEC Deutschland GmbH has delivered an LX series supercomputer to Technische Universität Kaiserslautern (TUK), one of Germany’s leading Universities of Technology. The storage solution is based on the widely deployed BeeGFS parallel file system made in Germany.
