The Culture of Security – Part 2


Italy is facing cyber threats that are growing in both number and complexity, yet cybersecurity does not seem to be evolving at the same pace. Let us try to delve into this topic.

That voice is not mine!

April 2023, in the boardroom of a large company based in Rome, the General Manager’s phone starts ringing. A budget report is in progress and there is some tension over problems with the allocation of funds; at first the Director does not realise the phone is ringing but, a few moments later, he grabs it and turns it over (it was lying face down). The number calling him is his own, as is the photograph that appears on the screen. Ring after ring, the table of participants falls silent, while the Director, without realising it, takes on an expression that is puzzled to say the least. Cautiously he runs his thumb across the screen and brings the smartphone to his ear.

“Good morning Director, we inform you that while you are on the phone the files of the entire organisation are being deleted. We are able to provide you with a copy at a price that…”

There was something worse than the message itself, and that was the voice. The voice on that phone call was not a stranger’s but his own: the Director had not only been called by himself, he was being told, in his own voice, of the exfiltration and encryption of the files. A few hours later, when asked whether he wanted to add anything to his reconstruction of events, he simply said:

“That voice is not mine.”

What you have read is not a scene from a film, nor an invention: it really happened. Years later, the phenomenon repeated itself, more effectively, and ended up in the headlines of a national newspaper, the Corriere della Sera.

The phone call sounded like it came from the Ministry of Defence. And they must have sounded very credible. In one case at least, the voice, perhaps reproduced by software, sounded like that of Minister Guido Crosetto. Instead, it was a hoax.

The human factor

The ‘Crosetto case’ (if we want to call it that) and the episode described above are very similar: at the heart of both is the human response to a computer incident. The human factor, too often forgotten, proves to be the only element capable of making a real difference in borderline and emergency situations. Why, then, is it underestimated?

The human factor requires, first of all, economic resources: people must be trained to cope with cyber incidents. It will not have escaped the experts that the European NIS 2 Directive dedicates a specific measure to this, which reads as follows:

Basic hygiene practices and computer security training. (Legislative Decree 138/2024, Art. 24, para. 2, point (g))

We keep writing in every piece of legislation that training is important, yet training often lacks quality. The human factor is what reduces the risk of total unpreparedness in the face of a problem; it is what makes us cautious, responsive and disciplined in handling an emergency.

On 13 February 2025, Sapienza University of Rome organised a PCTO (a pathway for transversal skills and orientation) entitled ‘Consapevoli e cybersicuri. A pathway for moving with awareness on the net’. On that occasion, I was able to clarify some points that it is good not to forget.

From ‘mechanics’ to ‘strategists’

Let us be clear: the time of the ‘mechanical’ systems engineer, responsible for assembling, configuring and controlling systems, is over. Today, systems engineers are asked to do something more: to prepare an operating strategy that answers not the question ‘does the system work?’ but rather ‘does the system work securely?’

Today’s systems are functional from the moment they are unpacked: the operating system comes pre-installed, the main drivers are already in place, default configurations are loaded. Of course some finishing touches are still needed but, let’s face it, compared with the past it is now easier and faster to configure a system from scratch.

What is more difficult is making it work securely: setting all the parameters needed so that incidents have a reduced impact and responses are timely. This needs to be understood once and for all:

  • IT security requires discipline, because it must be maintained over time, even when doing so is annoying and frustrating (a minimal sketch of what this can mean in practice follows this list);
  • cybersecurity requires multidisciplinary collaboration, because without interaction with lawyers, managers and consultants not much gets done, and an individualistic view does not pay;
  • cybersecurity requires transparency, and there is a certain weight in writing this sentence, because everyone knows how much the term has been abused over the years. The truth, however, is that today many data breaches are met with a deafening silence on the part of companies, and of many complacent DPOs who do not do their job properly.
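To give an idea of what ‘maintained over time’ can mean in practice, here is a minimal sketch of a recurring baseline check: a small script, meant to be run on a schedule (for example via cron), that compares the current configuration of a host against an agreed baseline, so that drift is noticed before an incident notices it for us. The checks shown (two SSH hardening directives and a default file path) are illustrative assumptions, not a standard.

```python
#!/usr/bin/env python3
"""Minimal sketch of a recurring security-baseline check.

The specific checks are illustrative; the point is the discipline:
run the script on a schedule so that configuration drift is caught early.
"""
from pathlib import Path

# Illustrative baseline: SSH hardening directives we expect to find.
EXPECTED_SSHD = {
    "PermitRootLogin": "no",
    "PasswordAuthentication": "no",
}


def check_sshd(config_path: str = "/etc/ssh/sshd_config") -> list[str]:
    """Return a list of deviations from the expected baseline."""
    try:
        text = Path(config_path).read_text()
    except OSError as exc:
        return [f"cannot read {config_path}: {exc}"]
    findings = []
    for key, expected in EXPECTED_SSHD.items():
        ln = next((line for line in text.splitlines()
                   if line.strip().startswith(key)), None)
        if ln is None or ln.split()[-1].lower() != expected:
            findings.append(f"{key} is not set to '{expected}'")
    return findings


if __name__ == "__main__":
    for message in check_sshd() or ["baseline OK"]:
        print(message)
```

Trivial as it is, a check like this only has value if it keeps running; the moment it is abandoned, the discipline mentioned above goes with it.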

No rule will hold

If these attitudes cannot be changed, IT security will never prevail; it will never ‘win’ against an incident in terms of effectiveness. There are certainly virtuous cases, but they are few. A fine article written by Andrea Lisi on LinkedIn reads:

Between national and European regulations that seem to weave themselves into an inextricable skein, and digital technologies that surprise us every day, it seems impossible today to guarantee a linear path to digital compliance… Yet it is enough to proceed without overindulging, using one’s brain and managing innovation through interdisciplinary teams, with patience and foresight.

We have a profusion of regulations that risk overlapping and jostling each other, changing parameters, deadlines and objectives, with the risk of creating more noise than anything else. We jump from AgID’s Circular 2/2017 to the GDPR, passing through Law 90/2024, without forgetting the NIS2 Directive and without leaving out the DORA Regulation, and we are forced to build traceability matrices that allow a harmonious reading of every aspect of all these rules.
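To make the idea of a traceability matrix concrete, here is a minimal sketch: a simple mapping from each internal control to the regulations it is meant to satisfy, which can then be queried regulation by regulation to spot gaps. The control names are invented and the article references are deliberately left as placeholders; a real matrix obviously requires precise citations verified with legal counsel.

```python
# Minimal sketch of a compliance traceability matrix.
# Control names are invented; article references are placeholders.

controls: dict[str, dict[str, str]] = {
    "encrypted-offsite-backups": {
        "NIS2": "Art. …",
        "GDPR": "Art. …",
        "DORA": "Art. …",
    },
    "incident-response-plan": {
        "NIS2": "Art. …",
        "Law 90/2024": "Art. …",
    },
    "supplier-security-clauses": {
        "NIS2": "Art. …",
        "DORA": "Art. …",
    },
}


def coverage(regulation: str) -> list[str]:
    """List the controls that claim to cover a given regulation."""
    return [name for name, refs in controls.items() if regulation in refs]


if __name__ == "__main__":
    # Regulations with no mapped controls are exactly the gaps
    # that the matrix is meant to surface.
    for reg in ("NIS2", "GDPR", "DORA", "Law 90/2024", "Circular 2/2017"):
        mapped = coverage(reg)
        print(f"{reg}: {', '.join(mapped) if mapped else 'no mapped controls'}")
```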

But wasn’t IT supposed to simplify things?

Jurisprudence chases technology: this is one of the mantras Andrea Lisi often repeats, and it is true; the rule chases too, especially when it is specific. When, instead, it deals with general principles, it is more agile and works better, but it requires using one’s brain, and more besides… A norm works if it is applied and respected; otherwise it is just a sterile dictum that someone took the trouble to write. A norm generally has a purpose, and that purpose must be rediscovered with a view to valuing ethics and transparency. The alternative is to keep producing rules, but to what end? To create more bureaucracy? To block organisational processes even further? To increase confusion out of all proportion? Regulating is an art: not only in the writing, but in understanding what to keep and what to repeal. The CAD (the Italian Digital Administration Code) is an example of this: from a text that was initially readable and comprehensible to everyone, it has turned over time into a complex tangle of cross-references, amendments and sub-articles that have made it hostile, complicated and essentially useless.

Between algorithms and computer hygiene

It should be of concern that NIS2 includes, at its core, an obligation concerning ‘basic hygiene practices and cybersecurity training’, because it is as if everyday life needed a legislative decree requiring us to breathe in order to live. Having to write such an obligation into law in 2025, when hackers are using artificial intelligence algorithms to build multi-stage attacks in hybrid contexts, is a bit like stating the obvious. Over the years we have seen giants fall under attacks of a certain complexity (the Colonial Pipeline case, or the data breach suffered by Sony), so this is not only a domestic problem, although the part that concerns us appears even more critical.

In the deepest silence

In recent years, following data breaches, I have been approached by employees of many public and private entities with messages such as the following.

“...the DPO never replied to our clarification emails, we are worried and do not know where to turn…”

Or:

“…the company refused to tell us what happened and told the DPO not to give any indication of what happened…”

And again:

“...personally I was intimidated by my director when I told him that I would like to ask the DPO for more information…”

And finally:

“…the director of information systems, Mr. XXXXX, blatantly provided me with false data on the condition that I stop asking questions about the infrastructure under analysis and, when I pointed out to him that the situation he recounted was technically impossible, he stopped answering my phone calls and e-mails…”

Readers will recall the case of the data breach at ASL 1 in L’Aquila, which was followed by a legal initiative, as recounted by Wired:

For the time being, the claims of the two lawyers, who work in tandem, are not claims for compensation or class actions, as Colantoni pointed out to Wired: “We have started requests to ASL 1 to obtain information, because there is none. Only at the end will we be able to evaluate possible litigation.”

In 2023, a data breach at a very well-known public entity left employees without any information for days. Some of them, out of apprehension, even closed their bank accounts and cut off communication with some suppliers. The response of many organisations to data breaches can be summarised as ‘stand still and stir up as little dust as possible’, without realising that the dust is often stirred up by the silence itself. Hackers publish the data on the internet, and this makes it possible to see how the data were stored and to spot any inadequacies. Then come (often belatedly) bizarre and incomplete statements from the organisations, which minimise the problem and sometimes even misrepresent what happened, as if this served to restore a more dignified image of them.

What to do then?

Demanding that one’s data be treated well is not an option: it is a right. To those who, after an IT incident, find the courage to come forward and assert their rights, it is important to make it clear that they are doing the right thing and are not in the wrong; if anyone is, it is the organisation that refuses to respond to their emails, or the DPO who, at the behest of the company, withholds details and answers. Change is only possible if there is respect for the law and real transparency.

Certainly the supervisory authorities are essential, but if they take an indulgent, paternal approach, the risk is that they never go beyond a pat on the back and a mild warning.

Finally, in many cases it is possible to establish prolonged, continuous and constant violations by negligent and indifferent employees and suppliers. The question is: why should they not be held accountable? In any healthy company, repeated errors get addressed, but in cybersecurity this is very difficult. Personally, I have known information systems directors who, after repeated breaches of data protection rules and the incidents that followed, were not only not removed from their post but were even given a raise: a distortion whose only result was to silence those who would have wished for greater quality and transparency of service.

What to do? Demand quality, even if this requires greater effort and a larger commitment of economic and human resources. This is the only way to ensure a real and appreciable change in the cybersecurity landscape.