High Performance Computing (HPC) is the application of "supercomputers" to computational problems that are either too large for standard computers or would take too long to solve on them.
HPC clusters are characterized by many cores and processors, large amounts of memory, high-speed networking, and large data stores – all shared across many rack-mounted servers.
We develop optimization strategies that improve the efficiency of the HPC cluster.
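The speedup that clusters with many cores deliver comes from splitting one large computation into independent pieces that run concurrently. The following minimal sketch (illustrative only, not the lab's actual code) shows the idea on a single machine using Python's standard `multiprocessing` pool; on a real cluster the same pattern is typically expressed with MPI across nodes:

```python
# Illustrative sketch: split one large computation across worker processes.
# Here we sum a large range in parallel chunks, one chunk per worker.
from multiprocessing import Pool

def partial_sum(bounds):
    # Sum one contiguous chunk of the range.
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    # Split [0, n) into `workers` chunks; the last chunk absorbs the remainder.
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        # Each chunk is summed in its own process; combine the partial results.
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum(1_000_000))  # equals sum(range(1_000_000))
```

The chunking step is where optimization strategies matter in practice: balancing chunk sizes against communication overhead is what keeps all cores busy.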
HPC in Natural Sciences
We use the HPC cluster for:
Calculating unusual magnetic and electronic properties of carbon materials
Studying peculiarities of crystal growth
Predicting climate change
H2020 Project (1.09.2017 – 1.09.2018), Teaming Phase 1
The HPC Lab team is participating in the project Big Data for Smart Society, funded under the European Commission's Horizon 2020 Framework Programme.
Topic: WIDESPREAD‐04‐2017
Call: H2020‐WIDESPREAD‐2016‐2017 (WIDESPREAD)
Type of action: CSA (Coordination and Support Action)
The main research objective is to advance the state of the art across the whole Big Data Value Chain, including the development of advanced methods and tools for data collection from a variety of structured and unstructured sources; data consistency checking and cleaning; data aggregation and linking; data processing, modelling and analysis; and data delivery that provides both accessibility and proper visualisation. Following the challenges of Horizon 2020 and the Bulgarian Innovation Strategy for Smart Specialisation 2014–2020, the project team selected the most promising data-driven innovation pillars: data-driven government (public services based on open data); data-driven industry (manufacturing and production); data-driven society (smart cities); and data-driven science.
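The value-chain stages named above (collection, cleaning, aggregation, analysis, delivery) form a pipeline, which can be sketched as a chain of simple functions. This is a hypothetical illustration: the function names and the sample air-quality records are assumptions, not part of the project's actual tooling:

```python
# Hypothetical sketch of the Big Data Value Chain as a function pipeline:
# collection -> cleaning -> aggregation/analysis -> delivery.

def collect():
    # Data collection from structured/unstructured sources (stubbed sample).
    return [{"city": "Sofia", "pm10": "35"},
            {"city": "Sofia", "pm10": None},      # inconsistent record
            {"city": "Plovdiv", "pm10": "48"}]

def clean(records):
    # Consistency checking and cleaning: drop records with missing values.
    return [r for r in records if r["pm10"] is not None]

def aggregate(records):
    # Aggregation and linking: average the measurements per city.
    totals = {}
    for r in records:
        totals.setdefault(r["city"], []).append(float(r["pm10"]))
    return {city: sum(vals) / len(vals) for city, vals in totals.items()}

def deliver(summary):
    # Delivery: render an accessible textual view of the analysis.
    return [f"{city}: avg PM10 {value:.1f}"
            for city, value in sorted(summary.items())]

print(deliver(aggregate(clean(collect()))))
```

In a production setting each stage would of course be a distributed component rather than an in-memory function, but the staged structure is the same.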