‘Are we in love with cyber insecurity?’ (Eric Luiijf, 2014)

The following opinion piece was written by Eric Luiijf, Principal Consultant for Cyber Operations and Critical (Information) Infrastructure Protection at the Dutch knowledge institute TNO, and published (paywalled) in the International Journal of Critical Infrastructure Protection, Volume 7, Issue 3 (September 2014). An extended Dutch version of this text is available here. With the author’s permission, here is a copy of the English text:

Are we in love with cyber insecurity?

Almost 40 years ago, as a student, I earned extra money by coding and testing programs for the administrative processes of a building hardware company. It was still the time of punch cards. In addition to coding programs in RPG3, I had to design sets of test data and describe the expected outputs to show the system designers and code reviewers that the test sets covered all the decision logic branches of the programs.

However, a successful run on a test set after debugging a program was just a first step. When I announced that a program was ready, the system designer walked to the garbage bin adjacent to the card punch machines and collected a stack of a few hundred cards that were rejected because they contained errors. This was the second test set. If even one of those cards was not rejected by my program, I had a difficult time. The coding standard was to perform rigorous input validation of each and every data field before any data could be moved to the company databases. This was one of the early examples of the principle of self-protection.
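As an illustration of that principle, here is a minimal sketch in C (the original programs were written in RPG3; the record layout and field names below are purely hypothetical): every field of a card is checked before the record may proceed to the database, and a single bad field rejects the whole card.

```c
#include <ctype.h>
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical fixed-width punched-card record: an order line with
   an article code, a quantity and a price, all stored as text. */
struct order_card {
    char article[8];   /* alphanumeric article code    */
    char quantity[5];  /* digits only                  */
    char price[7];     /* digits only, amount in cents */
};

static bool all_digits(const char *field, size_t len)
{
    for (size_t i = 0; i < len; i++)
        if (!isdigit((unsigned char)field[i]))
            return false;
    return true;
}

/* Validate each and every field before anything reaches the
   database; one bad field rejects the whole card. */
bool card_is_valid(const struct order_card *c)
{
    for (size_t i = 0; i < sizeof c->article; i++)
        if (!isalnum((unsigned char)c->article[i]))
            return false;
    return all_digits(c->quantity, sizeof c->quantity)
        && all_digits(c->price, sizeof c->price);
}
```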

In 1978, an analysis of the almost daily crashes of our mainframe operating system led us to conclude that most of the input buffers of the system programs were unguarded. Even simple user program errors would cause buffer overflows and overwrite executable code, which in turn crashed the entire mainframe. In a major effort, we patched and secured more than 100 system utilities. The code was sent to the system manufacturer via a non-standard software error reporting route. In hindsight, this could be considered to be a form of responsible disclosure.
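The pattern behind those crashes is the classic unguarded read into a fixed-size buffer. A sketch in C (not the original mainframe code) of the before and after:

```c
#include <stdio.h>

enum { BUF_LEN = 80 };  /* fixed input buffer, e.g. one card image */

/* Unguarded: copies the input line however long it is. An oversized
   line runs past the end of buf and overwrites adjacent memory --
   on that mainframe, often executable code, crashing the system. */
void read_line_unguarded(char *buf)
{
    int c, i = 0;
    while ((c = getchar()) != EOF && c != '\n')
        buf[i++] = (char)c;          /* no bound check */
    buf[i] = '\0';
}

/* Guarded: the same loop with a length check. An oversized line is
   truncated instead of trampling whatever lies beyond the buffer. */
void read_line_guarded(char buf[BUF_LEN])
{
    int c, i = 0;
    while ((c = getchar()) != EOF && c != '\n' && i < BUF_LEN - 1)
        buf[i++] = (char)c;
    buf[i] = '\0';
}
```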

Gaining access to the mainframe at that time was easy. All it took was a new username and a simple password. All your data gone? You probably made a typo in your username.

Some years later, our system required passwords to be changed every three to six months. But all you had to do was enter the old password and use it as the “new” password again. Or you could change to an intermediate password and then change back to the old one. In the mid-1980s, we made many modifications to the password change program to block the use of simple passwords and the recycling of old passwords. But then came minicomputers and later personal computers; now we have smartphones and tablets. Every time a new cyber wave starts, there is either no protection at all or only weak attempts to provide it.
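A sketch, in C for illustration, of the kind of checks we added to that password change program (the rules shown here are assumptions, not the actual 1980s code); note that blocking the intermediate-password trick additionally requires keeping a history of previous passwords:

```c
#include <ctype.h>
#include <stdbool.h>
#include <string.h>

/* Reject a proposed new password if it merely recycles the old one
   or is trivially simple. The rules are illustrative; blocking the
   change-twice trick also needs a stored password history. */
bool acceptable_new_password(const char *old_pw, const char *new_pw)
{
    size_t len = strlen(new_pw);

    if (strcmp(new_pw, old_pw) == 0)  /* the "new" password is the old one */
        return false;
    if (len < 8)                      /* too short to be anything but simple */
        return false;

    bool has_alpha = false, has_digit = false;
    for (size_t i = 0; i < len; i++) {
        if (isalpha((unsigned char)new_pw[i])) has_alpha = true;
        if (isdigit((unsigned char)new_pw[i])) has_digit = true;
    }
    return has_alpha && has_digit;    /* demand at least some variety */
}
```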

At the end of the 1980s, Unix and networking developed great momentum. Data was moved from one system to another without much authentication. Usernames and passwords were transmitted in the clear over networks. Protocols were designed for benign environments and did not expect anyone to send one byte less or one byte more of data, let alone deliberately attempt to overrun input buffers; the Ping-of-Death is just one example.
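The Ping-of-Death exploited exactly this assumption: fragments of an ICMP echo request could be reassembled into a datagram larger than the 65,535-byte IP maximum, overflowing fixed-size reassembly buffers. A hedged C sketch of the missing check (the structure and names are invented for illustration):

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

#define MAX_DATAGRAM 65535  /* maximum size of a reassembled IP datagram */

/* Hypothetical reassembly state for one incoming datagram. */
struct reassembly {
    unsigned char data[MAX_DATAGRAM];
    size_t        length;
};

/* A benign-world parser would trust the header and memcpy straight
   away; here the claimed offset and length are checked against the
   buffer first, so an oversized datagram is dropped, not overflowed. */
bool add_fragment(struct reassembly *r, size_t offset,
                  const unsigned char *frag, size_t frag_len)
{
    if (offset > MAX_DATAGRAM || frag_len > MAX_DATAGRAM - offset)
        return false;                        /* would overrun the buffer */
    memcpy(r->data + offset, frag, frag_len);
    if (offset + frag_len > r->length)
        r->length = offset + frag_len;
    return true;
}
```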

Yes, many of the security problems were patched, but always after the fact. A 1968 NATO report about the IBM 360 operating system concluded that it was unreliable because of its size. The report noted that there were about 1,000 errors in every new release – and this number seemed to be “reasonably constant.” Could this be a law of software coding? If so, what do we do about Android with its 1.2 million lines of code? Let us not even think about Windows!

Why are we so much in love with cyber insecurity?

First of all, we like progress and, therefore, we cannot get enough of the new functionality that information and communications technologies bring us. We love the ease of use. And we absolutely adore new gadgets. More cyber security hinders all these things. Information Security? It is a department at headquarters, is it not?

Second, designers and software developers of new information and communications technologies – think tablets, smart watches, Google Glass – do not look backwards as they drive innovation forward. For this reason, they fail to learn from past cyber security failures. Unfortunately, we really cannot blame the innovators, because we too do not learn cyber security lessons or apply good cyber security engineering principles and practices.

It is this lack of learning from the cyber insecurity examples identified over the past 50 years that has resulted in advanced security controls not being implemented properly in systems. It is what causes manufacturers to hardwire passwords deep in process control systems. It is what causes information technology systems to be deployed with the same manufacturer-supplied default passwords. It is what causes maintenance engineers to use simple and identical passwords for all their remotely-accessed customer systems. It is what causes programmers to develop new features again and again, without buffer overflow protection and without proper validation of all input data.

Odd incidents have occurred in critical infrastructures. For instance, the Australian power transmission grid once registered a jump in demand when there was actually an increase in supply, causing a five-fold jump in the spot energy price. Similar cases where illogical data had system-wide effects have been reported by NEMMCO, the former operator of the Australian National Electricity Market.

Some years ago, a Dutch flood barrier automatically initiated its closure sequence when two cables were inadvertently swapped during maintenance. The system incorrectly concluded that the water level in one town had risen a few meters above flood level.

In both cases, the old, good practice of self-protection by rigorous validation of input data had not been learned and applied. As we move towards chains of information flows controlling mission-critical processes and critical infrastructures, such as smart grids that rely on information from less trusted sources, adhering to the old principles – as I saw firsthand in the 1970s – is vital.
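A minimal sketch of such a plausibility check in C (the thresholds and names are invented for illustration): a reading is rejected if it falls outside the physically possible range or jumps implausibly far from the previous accepted value.

```c
#include <math.h>
#include <stdbool.h>

/* Invented thresholds: what counts as a physically plausible water
   level and rate of change for this particular sensor location. */
#define LEVEL_MIN_M  -5.0   /* metres relative to the local datum */
#define LEVEL_MAX_M  10.0
#define MAX_STEP_M    0.5   /* largest plausible change per sample */

/* Accept a reading only if it lies in the physical range and does
   not jump implausibly far from the previous accepted value -- such
   as a few metres in one step because two cables were swapped.
   Implausible readings should be flagged for operator review rather
   than acted upon automatically. */
bool level_is_plausible(double previous_m, double reading_m)
{
    if (reading_m < LEVEL_MIN_M || reading_m > LEVEL_MAX_M)
        return false;
    return fabs(reading_m - previous_m) <= MAX_STEP_M;
}
```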

Are we ready for the next wave of cyber insecurities?

Innovative functionalities empowered by information and communications technologies are rapidly being embedded in critical infrastructures and, indeed, in all the veins and capillaries of modern society. We are talking about smart refrigerators and washing machines, smart cars, smart grids, even smart cities.

But we have not been so smart because we have not learned from and applied the cyber security lessons identified in the past. We can, therefore, expect major disruptions of our smart gadgets and infrastructures due to cyber security threats, by accident as well as by deliberate action.

Is there a way out?

Yes, but it will take time.

We must collect and codify all the cyber insecurity lessons learned in the past and perform root cause analyses. We must convert these lessons into cyber security engineering principles, cyber security architectural design principles, and good and best practice approaches. We must understand and apply (cyber security) safety factors such as defense-in-depth just as rigorously as structural engineers and nuclear engineers apply theirs. We must transform the art of mitigating cyber security risk into the scientific application of a well-balanced, self-amplifying set of mitigation measures.

Finally, we must impart the relevant knowledge to manufacturers, managers, programmers and, yes, even end users. They must learn this knowledge; they must apply it; they must be self-aware. Cyber technologies are embedded in all aspects of modern society; cyber security and cyber security awareness must be all pervasive.

Otherwise, in 2050 we will be facing the same cyber security failures we have seen in the past. We can also look forward to celebrating 100 years of buffer overflows, bad coding practices and cyber insecurity with horrific “smart darkness events” involving our gadgets, our infrastructures and our society.

EOF
