
High-performance computing (HPC) is more than just a buzzword.
Over the years, science has used existing resources to create models, theories, and formulas capable of explaining human curiosity and the most diverse phenomena in the universe. Through a variety of methods and instruments, the scientific community has been able to analyze the world around us, far beyond what we can see with our own eyes.
But this ability to innovate and make discoveries has only been possible because there are numerous repositories with documented academic and scientific work, accessible in countless libraries around the world. For this very reason, information and knowledge about the past are two of the essential ingredients in the innovation process, as they make it possible to reuse data that has already been produced and choose research paths that are more likely to be successful.
Proof of this is the fact that any master's or doctoral thesis, report, or activity plan usually includes several references to articles, studies, documents, and facts already produced and validated by the community (academic, scientific, journalistic, or civil society), allowing new areas of research to develop. But while the digitization and mass adoption of information technology in recent decades have solved some problems related to the format, preservation, and sharing of information, the truth is that new risks related to its security, availability, and veracity have emerged.

High-performance computing: a threat, a solution or both?
Unfortunately, technology developed to solve humanity's problems has also been exploited by people and groups for less legitimate purposes. For example, relatively recently, a cyber-attack compromised more than twenty energy-sector organizations in Denmark, currently one of the most advanced countries in the field of cybersecurity. The attack exploited one or more vulnerabilities in the security appliances (in this case, firewall and VPN systems) of a well-known technology company.
Going back a few years, we can recall the emergence of peer-to-peer technology, which revolutionized the way data was exchanged in academic and scientific circles. But the same basic technology also enabled piracy through the illegal sharing of multimedia content, shaking the film industry. Or consider email, which allowed people to exchange messages almost instantaneously across geographical locations, but which also enabled the mass sending of unsolicited advertising (spam) and, more recently, elaborate phishing campaigns (malicious emails designed to harvest credentials, credit card numbers, or other personal and confidential information).
Advances in technology and the globalization of its use have enabled the creation of virtual currencies, supported by distributed databases and transactions (blockchain), challenging the traditional, centralized banking sector. At the same time, this technology is used for untraceable ransomware payments, for purchases on dark-web marketplaces, and for other illicit activities.
People, companies, and the media have found social networks to be a quick way to reach friends, clients, and the public, allowing two-way communication in real time. However, the creation of fake profiles, the production of deepfake videos, and the dissemination of fake news are becoming a reality all over the world, sowing discomfort and distrust in society.
This has been the case over the years and will probably remain so, because the success of the internet (as we know it) rests on global connectivity and decentralized governance: an apparent anarchy in which everyone can do or say whatever they want, without filters or censorship.
Currently, developments in high-performance computing, the commercialization of supercomputers, the development of artificial intelligence dynamics and the first steps in quantum computing are some of the technological developments that most excite the scientific community in the field of technology.
High-performance computing has the potential to revolutionize countless areas of science and technology, from chemistry to cryptography, medicine, genetics, and meteorology, solving complex problems quickly and efficiently. However, it also comes with numerous challenges, particularly in cybersecurity.
Despite the various initiatives developed by international organizations such as ICANN (Internet Corporation for Assigned Names and Numbers), the IETF (Internet Engineering Task Force), and the W3C (World Wide Web Consortium), we have seen the exponential growth of large-scale computer crime, online disinformation campaigns, publication of unauthorized personal data, harassment, and cyberbullying, among other realities incompatible with the initial purpose of this technological resource.

The potential for revolution through high-performance computing
It is in this context that we find one of today's greatest challenges for science (in terms of technology): solving problems and failures in the confidentiality, integrity and availability of information systems and technologies, controlling uncertain parameters, and limiting unwanted agents (while maintaining a global and accessible communication resource such as the internet).
Some of these problems could be solved with existing technologies, such as:
- a qualified digital signature could be mandatory when creating or using a nominal user profile on a social network;
- the incorporation of blockchain technology in the sale of tickets to sports venues or cultural shows would undermine the parallel market, characterized by price speculation and the sale of counterfeit tickets;
- requiring SPF (Sender Policy Framework) and DMARC (Domain-based Message Authentication, Reporting & Conformance) records on domains that send email could sharply curb the sender spoofing behind many of the spam and phishing campaigns that invade users' mailboxes.
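As an illustration of the last point: SPF and DMARC policies are published as DNS TXT records on the sending domain. A minimal, purely hypothetical configuration (the domain, IP address, and report mailbox below are placeholders) might look like this:

```
example.com.         IN TXT  "v=spf1 mx ip4:203.0.113.10 -all"
_dmarc.example.com.  IN TXT  "v=DMARC1; p=reject; rua=mailto:dmarc-reports@example.com"
```

The SPF record declares which servers may send mail for the domain ("-all" rejects everything else), and the DMARC record tells receivers to reject messages that fail authentication and where to send aggregate reports.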
However, making any of these measures mandatory would change the paradigm of the free and decentralized internet. It would call into question the principle of freedom and require a worldwide commitment balancing usability and information security. This is why the new Web 5.0 poses a great challenge to artificial intelligence: filtering content (separating the wheat from the chaff), automatically interconnecting IoT devices, and responding to needs that humans themselves have not yet identified.
In this field, developments in high-performance computing and the emergence of quantum computers will completely change the way we learn, communicate, and manage knowledge in modern societies. But at the same time, imagining a quantum computer in the hands of a criminal group is a frightening scenario, given that a sufficiently powerful quantum machine running algorithms such as Shor's could break today's strongest public-key encryption in a fraction of the time a classical computer would need, reducing an effectively impossible computation to a feasible one.
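To make that threat concrete: the security of RSA-style public keys rests on the classical difficulty of factoring large numbers. The toy Python sketch below (using deliberately tiny primes; real moduli are 2048 bits or more) shows the brute-force search a classical attacker faces, whose cost grows exponentially with key size, whereas Shor's algorithm on a quantum computer would factor the modulus in polynomial time.

```python
def trial_division(n: int) -> int:
    """Return the smallest prime factor of n by classical brute force.

    The loop runs up to sqrt(n) candidate divisors, which is why doubling
    the bit-length of n squares the work needed -- the exponential wall
    that protects RSA against classical attackers, but not quantum ones.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime


# A tiny "RSA-like" modulus built from two secret primes.
p, q = 1009, 2003
n = p * q
print(trial_division(n))  # recovers the smaller secret prime, 1009
```

At 2048-bit scale the same search would take longer than the age of the universe on classical hardware, which is exactly the asymmetry quantum computing threatens to erase.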

High-performance computing vs. the future: what to expect?
It is estimated that between 2030 and 2040, quantum computers will be available for home use, at lower cost and in smaller sizes. By then, 5G will certainly be obsolete, and a new data-communication technology will support the transmission and sharing of information and knowledge in real time even more effectively. Society's challenges will certainly be different from today's, and science will play an even more important role, because knowledge will likely remain one of the most important assets in the process of creating value in organizations and in society itself.
Several questions will certainly arise.
Will we be able, through science, to control or limit natural phenomena, prevent wars between nations or eliminate cyber-attacks?
Will high-performance computing (including quantum computing) be the new milestone of civilization, after the invention of fire, writing, electricity, or the internet?
Will future generations be conditioned in their creativity and intellectual evolution because of the mass use of artificial intelligence?

Carlos Domingues
IT & Security Coordinator