Homologation Criteria for Information Systems Security
(Les critères d'homologation de la sécurité des systèmes d'information)

This paper was written by Dr. David Brewer and was presented at the Eurosec 98 conference in Paris on 16 March 1998. Copyright © Gamma Secure Systems Limited, 1998.

Overview

In English, the French word "homologation" means "officially recognised". In the context of security and information technology, homologation describes a process whereby a security authority grants permission for an information technology (IT) system to be used for operational purposes with live data. The process is usually applied in the context of information confidentiality, implying that a system is not permitted to handle sensitive information until the homologation process is complete. Indeed, the process is traditionally perceived as a state transition: at some point in time a non-homologated system is instantaneously transformed into a homologated one. Homologation is often called "accreditation" in military or national security circles.

This paper investigates the evolution of homologation criteria and examines their utility in real-world situations. Our conclusions force us to challenge the simple state transition model and replace it with the more radical idea that homologation is really a complex, continuous, life-cycle process. We show that there is an alternative: an appropriate marriage of the emergent Common Criteria and BS7799 goes part way to solving the problem, and real-time security management tools are a necessary ingredient for success.


Problems of Homologation

The US Orange Book was conceived in an era of computing technology when the mainframe computer was king. In a world of paper-tape and punched cards, computing was confined to the computer room and any sense of computer security was the dominion of the operating system. As far as security was concerned, the first obstacle that a spy would have to overcome would be to gain access to the site on which the computer installation was located. The spy would then have to gain access to the computer room, and then to the computer operating system itself. In this paradigm there are three distinct levels of defence: "site security", "room security" and "operating system security". The Orange Book identified a comprehensive set of security evaluation criteria. However, from a homologation perspective, they applied only to the operating system. Contemporary European work accepted the US work as a partial solution and initially concentrated on completing the picture by developing criteria to govern the site and room security. This gave rise to the concept of a "System Security Policy", or SSP, that would determine the site, room and operating system safeguards necessary to protect the data held within the computer system from unauthorised access.

By the time the Orange Book was published (1985), remote mainframe access via dumb terminals was commonplace. Clearly, homologation now had to take cognisance of communication security (COMSEC) and electromagnetic radiation (TEMPEST), but the basic ideas of site, room and operating system security, and the SSP, still applied.

With hindsight we can recognise that we had just been given our first glimpse of a fundamental problem with homologation: the need for homologation criteria to keep pace with technology. We were soon to learn of other factors, in particular cost and time. Indeed, as has been found with product evaluations, by the time the homologation process is complete, system requirements may have changed sufficiently to warrant restarting the homologation process all over again. There are also procurement issues, particularly concerning outsourcing and private finance initiatives.

The Evolution of Homologation Solutions

From its beginnings in the mid-1980s to this day, homologation has been characterised by:

  • assessing the risk and thereby choosing an appropriate level of assurance;

  • applying the chosen assurance criteria to evaluate the system in the context of a SSP to identify exploitable vulnerabilities.

The second characteristic is elegantly embraced by the ITSEC. For example, when applied in a system as opposed to a product context, the ITSEC explicitly calls up a SSP as a major component of the Security Target. The ITSEC provides criteria to test the IT within the context of the SSP but falls short of providing criteria to test the effectiveness of the physical, personnel and procedural safeguards specified by the SSP. Surprisingly, this has never been regarded as a major weakness. Technology had moved on, of course: by the time the ITSEC was published (1991) data exchange between systems had become commonplace. This gave rise to the idea of a "System Interconnection Security Policy" (SISP) to govern bilateral data exchange between systems and a Community Security Policy (CSP) when three or more systems enter into a common data exchange agreement.

A greater challenge has arisen from the growth of the domestic PC market. The ability of individuals to buy off-the-shelf IT for home use, often with greater computing power than they enjoy at work, has prompted businesses and government departments to streamline their acquisition processes. Coupled with modern risk management disciplines, this has led to an evolutionary approach to IT acquisition, with which ITSEC-based homologation processes can hardly keep pace. While the complete application of ITSEC in a system context may take 9 months or longer, a typical evolutionary approach may introduce major system upgrades, with a corresponding change in user security profiles, every three months or so.

An alternative approach might be to apply ITSEC in conjunction with a Certificate Maintenance Scheme (CMS). However, the UK CMS was developed as a way to reduce the need for product re-evaluations. It is not intended to address changes in security threats, objectives or safeguards.

A more radical alternative would be to make use only of certified Security Enforcing Functions (SEFs). In other words: firstly, the security components of an IT system are constructed solely from ITSEC-evaluated products and, secondly, the SSP only specifies those security features that were certified during the product evaluations. All other required security functionality is then provided by physical, personnel or procedural means. This approach has been applied successfully in several Microsoft NT-based LANs, including one for an Anglo-French-Italian project. In this case, its effect was to reduce the ITSEC part of the homologation process to the submission of a security target and effectiveness documentation to the homologation authority, and to obviate the need for a system evaluation.
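
As an illustration of this approach, the short sketch below (written in Python purely for exposition; the product names and SEF labels are invented and do not come from any real certificate) checks that every security enforcing function called up by the SSP is covered by the certificate of at least one evaluated product, and lists whatever remains to be met by physical, personnel or procedural means.

    # Illustrative sketch only: hypothetical products and SEF labels.
    # Certified SEFs per evaluated product, as listed in their certificates.
    certified_sefs = {
        "workstation_product": {"identification", "authentication",
                                "discretionary_access_control", "audit"},
        "gateway_product":     {"packet_filtering", "audit"},
    }

    # Security enforcing functions that the SSP calls up.
    ssp_sefs = {"identification", "authentication", "audit", "secure_media_disposal"}

    covered = set().union(*certified_sefs.values())
    print("Covered by certified products:", sorted(ssp_sefs & covered))
    print("To be met by non-IT means:    ", sorted(ssp_sefs - covered))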

However, none of these approaches explicitly recognises that information security is not only a technical problem but also a management problem. We will therefore introduce the idea of an Information Security Management System (ISMS) to address the management issues. Later we will show that the ISMS is perhaps an essential component of the homologation process.

The Impact of the World Wide Web

The popular uptake of the World Wide Web (WWW) has rendered computer access "user centric". Many people now enjoy a computing environment where they transparently connect to virtually any computer in the world to access information. Alternatively, people have borrowed the Internet concept to set up their own "private" intra- or extranet facilities.

From a homologation perspective there is no longer a many-to-one relationship between users and computer systems but a many-to-many relationship. Indeed, in this context, the concept of a computer system, as originally conceived by the Orange Book, is severely strained. It is perhaps wiser to think in terms of service providers and client communities. However, the impact of this approach on the homologation process is not straightforward, as Brewer and Wilsher intimated at Eurosec 97. In the traditional sense, the homologation process is carried out by the data owner, which is why, in the "Insurefast" case, Insurefast's clients wanted to audit the Insurefast service before they used it. However, in practice they cannot be allowed to do this, and a different homologation process is required. To understand why, consider the interconnections of a service provider S and three client communities A, B and C. There are three cases:

  • Single enterprise. In this case, S could be an operational risk management system, and A, B and C could be different departments within, say, the same bank. The data owner is the bank and the homologation process is performed by the bank.

  • Co-operative enterprises. In this case, S could be a command and control system and A, B and C could be a city's police, fire and ambulance services respectively. There is an agreement to share information belonging to A, B, and C to co-ordinate emergency services. In this case the homologation process is performed jointly by A, B and C.

  • Competing enterprises. In this case, S could be Insurefast and A, B and C its clients. There is no agreement to share information. Indeed, the reverse is true: S must ensure the strict separation of data belonging to A, B and C. A, B or C cannot be party to the homologation process for S, lest this strict separation rule be broken (e.g. by A unwittingly gaining access to B or C's data). Instead, the homologation process is perhaps best performed by some third party T in accordance with some publicly recognised standard. (The separation rule itself is illustrated by the sketch following this list.)
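
In the competing-enterprises case, the separation rule that T would have to test can be pictured with a minimal sketch, written here in Python purely for illustration (the class, the record contents and the community names are hypothetical, not taken from any Insurefast system): the provider labels every record with its owning community and only ever returns records to that community.

    # Illustrative sketch only: a service provider S holding data for competing
    # client communities A, B and C, with strict separation by owning community.
    class ServiceProvider:
        def __init__(self):
            self._records = []                    # (owner, data) pairs held by S

        def store(self, owner, data):
            self._records.append((owner, data))

        def retrieve(self, requester):
            # A community only ever receives records that it owns itself.
            return [data for owner, data in self._records if owner == requester]

    s = ServiceProvider()
    s.store("A", "quotation for community A")
    s.store("B", "quotation for community B")
    print(s.retrieve("A"))                        # ['quotation for community A']
    print(s.retrieve("C"))                        # []  (C owns nothing yet)

A third party T could exercise exactly this behaviour during homologation without itself being given sight of A, B or C's data.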

The Impact of Acquisition

Viewed simply, the homologation process is the answer to the cautious buyer's prayer. The buyer acquires a system to keep the buyer's transactions secret. From the buyer's perspective, there can be no absolute guarantee that the supplier is not, at least, the unwitting agent of the criminal or spy. The buyer therefore invokes the homologation procedure to determine whether the system is safe to use before using it.

Of course, the buyer ought to pay the supplier for the system before invoking the homologation process. However, since there is a risk that the supplier may be required to modify the system before it can pass homologation, the buyer will invariably defer payment until after successful homologation. Unfortunately, the homologation process is not under the supplier's control. Seen from the supplier's perspective, homologation could be a means for the buyer to obtain use of a system without paying for it. The homologation process therefore has a tendency to cause friction between the buyer and the supplier. This friction is at its greatest when large amounts of private finance are at stake.

The Need for Real-Time Security

The homologation process need not be restricted to issues of confidentiality. Let us introduce the notion of information quality to capture the concepts of fitness-for-purpose and the timeliness of information. From a management perspective, the quality of information is often its fitness for the purpose of making an informed decision, and can be measured by the ability of the manager to spot mistakes and correct them before that decision has been made. Just as the homologation process can tell people that their systems will keep their secrets secret, so, when applied in the context of information quality, it will tell them that their systems can be relied upon to help them spot mistakes.

The ability to spot mistakes is, of course, a real-time activity; the collapse of Barings is a classic example of what happens when mistakes are not spotted in time. Good management requires the continuous monitoring of business information. Likewise, in the context of open systems, world knowledge of vulnerabilities increases daily, as does technology's ability to deliver new safeguards. Gone are the days when security systems could be introduced, homologated and then forgotten about. In today's world we need to react rapidly to changes in threat, exposure and vulnerability. A "step-function" homologation process is therefore probably insufficient: no sooner is it complete than changes in threat, exposure and vulnerability may force it to start all over again. On this view, homologation needs to be a continuous process.

Technology Solutions

If homologation is a continuous process, then in addition to a SSP and ITSEC (or ITSEC-like criteria) we undoubtedly require mechanisms to audit the actual state of the subject of homologation (e.g. actual network topology) and the state of the art in information security attack and defence. The chosen mechanisms would be IT-based and, if not deployed continuously, would be deployed regularly on a daily, weekly or monthly basis. Such mechanisms, one of which is sketched after the list below, might include:

  • auditing and accounting tools to identify successful and unsuccessful attempts at information compromise, assist in damage assessment/limitation and hold people and organisations accountable for their actions;

  • network and modem discovery tools to identify the actual topology of networks, external connectivity, network protocols, services (e.g. anonymous ftp, sendmail, etc.), operating systems used and installed patches, etc.;

  • vulnerability identification and safeguard prioritisation tools to identify real network vulnerabilities, corresponding safeguards and a means to map these to the SSP so that the deployment of safeguards can be ranked, e.g. in terms of risk reduction;

  • "ethical" hacking to provide practical demonstration of the existence of vulnerabilities and the effectiveness of safeguards.

A variety of commercial off-the-shelf tools exist in support of all of these categories, the most useful addressing two or more of them. A potentially canonical example used to be the "Expert" risk assessment product. This product, which no longer seems to be available, combined network and modem discovery with vulnerability identification and safeguard prioritisation. The utility of this tool was further enhanced by its ability to connect to an on-line, regularly updated library of vulnerabilities and safeguards. This particular feature categorised it as a management tool, able to assist a service provider or system manager in regularly checking compliance with the SSP and taking action accordingly. In this sense, homologation becomes a day-to-day management problem. We therefore return to the idea of an Information Security Management System (ISMS), which has the objective of assisting management to carry out this process. Of course, such an ISMS is merely an information quality system with the sole purpose of facilitating informed decisions about the security of that enterprise (the "target enterprise") within the scope of the ISMS. In this sense, it would be sensible to demand homologation for the ISMS. The homologated ISMS then ensures continued homologation for the target enterprise.
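
The safeguard prioritisation performed by tools of this kind can be reduced to a very small worked example (the safeguard names and all of the figures below are invented): each candidate safeguard is scored by the risk reduction it buys, taken here as the fall in likelihood multiplied by the impact, and deployment is ranked accordingly.

    # Illustrative sketch only: rank candidate safeguards by estimated risk
    # reduction, with risk taken as likelihood x impact. All numbers invented.
    safeguards = [
        # (name, likelihood before, likelihood after, impact)
        ("apply mail server patch",    0.30, 0.05, 100_000),
        ("disable anonymous ftp",      0.20, 0.02,  40_000),
        ("remove unregistered modems", 0.10, 0.01, 250_000),
    ]

    def risk_reduction(entry):
        name, before, after, impact = entry
        return (before - after) * impact

    for name, *_ in sorted(safeguards, key=risk_reduction, reverse=True):
        print(name)

Ranked output of this kind is what turns compliance checking into the day-to-day management activity described above.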

We must therefore shift our concept of homologation from the target enterprise, e.g., the traditional IT system as defined by ITSEC, to the ISMS that manages the security of that target enterprise. Fortunately, criteria now exist which facilitate the homologation of an ISMS.

BS7799

BS7799:1995 Part 1 was originally conceived as a code of practice for information security management. It catalogues a whole host of good security controls with near-universal applicability for multi-national organisations. The 1995 attempt to adopt BS7799 as an ISO standard failed, which was perhaps a good thing, as it has focused attention on the importance of the ISMS, as opposed to the technology involved. Two particular developments have taken place:

  • The code of practice has been adopted by the Netherlands, Australia and New Zealand. Other nations are showing keen interest.

  • The UK and the Netherlands have established EN45012 accredited certification schemes.

The Dutch advanced the idea of allowing organisations to decide not only which BS7799 security controls applied to their target enterprise and which did not, but also which additional security controls not covered by BS7799 applied. The UK embraced this "supersetting" idea wholeheartedly in formulating BS7799:1997 Part 2 to address accredited certification against the standard. BS7799 Part 2 requires the creation of an ISMS, rather like ISO 9000 requires the creation of a quality management system. As in the ISO 9000 case, BS7799 certification certifies the ISMS.
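
A Statement of Applicability produced under this "supersetting" rule can be thought of as a simple selection over a catalogue of controls. The sketch below (in Python, with invented control names and justifications that do not reproduce the BS7799 catalogue) marks controls as applicable or not, records the justification for any exclusion, and adds organisation-specific controls alongside them.

    # Illustrative sketch only: a "superset" Statement of Applicability with
    # invented control names and justifications.
    bs7799_controls = {
        "information security policy document": True,
        "clear desk policy": False,
        "virus controls": True,
    }
    exclusions = {"clear desk policy": "all work areas are continuously manned"}

    # Organisation-specific controls not found in BS7799 (the "superset").
    additional_controls = {"quarterly ethical hacking exercise": True}

    statement_of_applicability = {**bs7799_controls, **additional_controls}
    for control, applicable in statement_of_applicability.items():
        note = "" if applicable else f" (excluded: {exclusions[control]})"
        print(f"{control}: {'applicable' if applicable else 'not applicable'}{note}")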

Table 1 compares the mandatory requirements of BS7799:1997 Part 2 with those required for the traditional homologation process practised by the British government (IM5). The compatibility is striking for two reasons:

  • IM5 was produced by government with a specific Defence/National Security orientation. BS7799 was produced independently by industry, primarily by computer managers in industries such as oil, banking, retailing and insurance.

  • Both standards address security management, albeit that IM5 assumes that security management is a relatively static process with infrequent review while BS7799 assumes a more dynamic process with continuous review.

BS7799:1997 Part 2 requirement | IM5 System Security Policy requirement
Security Policy                | The overarching statement concerning the classification of information and the steps to be taken (nation-wide) to protect it.
Scope Statement                | System Description (Chapter 2)
Risk Assessment                | Definition of Threat, and determination of ITSEC E-level (Chapter 3)
Statement of Applicability     | Identification of the technical and non-technical safeguards under ITSEC's 8 generic headings (Chapters 4-11)
ISMS                           | Security management, in particular re-homologation conditions (Chapter 12)
Table 1: A comparison of the mandatory BS7799 requirements and those required for the traditional homologation process practised by the British government.

However, it is BS7799's ability to invoke other standards through the "superset" rule that renders BS7799 of paramount importance as far as homologation is concerned. With reference to Trust Services (for example the provision of basic cryptographic certification services), a combination of PKIX and BS7799:1995 provides a useful basis for designing a BS7799-certifiable ISMS for a Trust Service Provider. In the context of the "competing enterprise" homologation scenario previously described, this ought to provide the mechanism for asserting adherence to the desired publicly recognised standard.

The Common Criteria's Advantage

But, of course, homologation is not just a management problem: we need the right technical solution to be in place as well as a reliable management system to deploy that solution, monitor its effectiveness and react accordingly.

The Common Criteria (CC) provides criteria for the evaluation of IT security measures that harmonise the European (ITSEC) and North American (US Federal Criteria and the Canadian Criteria) approaches. Its advantage over the ITSEC is threefold:

  • It differentiates between the generic (the Protection Profile) and the specific (the Security Target) and sensibly partitions the application of the ITSEC effectiveness criteria to each.

  • It provides a comprehensive catalogue of security functionality with explicit dependencies: given any one choice of security functionality, the CC identifies what other security functionality is required in support (a small illustrative sketch follows this list).

  • It allows the assurance criteria to be chosen in accordance with their merit in a given situation, whilst allowing the result of that selection to be mapped onto a common scale of evaluation assurance levels (EALs).
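
The explicit dependencies mentioned in the second bullet can be pictured as a walk over a small graph. In the sketch below (Python, with invented component names that stand in for CC functional components rather than reproducing the catalogue), choosing one piece of security functionality pulls in the transitive closure of everything it depends on.

    # Illustrative sketch only: resolve supporting functionality for a chosen
    # security function by following dependencies transitively.
    dependencies = {
        "user data protection": {"access control", "identification"},
        "access control":       {"identification"},
        "identification":       {"authentication"},
        "authentication":       set(),
        "security audit":       {"reliable time stamps", "identification"},
        "reliable time stamps": set(),
    }

    def required(component, deps=dependencies):
        """Return the chosen component plus everything it transitively depends on."""
        needed, stack = set(), [component]
        while stack:
            current = stack.pop()
            if current not in needed:
                needed.add(current)
                stack.extend(deps[current])
        return needed

    print(sorted(required("security audit")))
    # ['authentication', 'identification', 'reliable time stamps', 'security audit']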

Like ITSEC, the CC assumes a static world in which security products and systems are built and last forever. This is a false world: in reality, products and systems evolve rapidly. Thus, if ITSEC dies, so will the CC; the advantages of the CC over ITSEC will not sustain it. However, a "marriage" of the CC and BS7799 would allow the CC to assume the dynamic world in which products and systems are forever changing. The CC's dowry would, in particular, be the extension of its library of functional dependencies into the non-technical areas required by an ISMS. The ability to mix and match assurance criteria and declare generic ISMSs would undoubtedly be advantageous. The "marriage" could, for example, consummate the concept of using only certified SEFs.

The AccredIS Solution

BS7799 and the CC ought to form the basis for the modern-day homologation process. AccredIS provides a means of achieving this.

AccredIS is really three things:

  • a means to design an ISMS which takes prior account of the responsibility and liability issues that concern the target enterprise before considering the IT issues, i.e. it puts business before technology;

  • a means to select the appropriate safeguards and assurance criteria;

  • a means to ensure that the ISMS satisfies the information quality principles.

The AccredIS process was described at Eurosec 97. At that time we merely hoped that BS7799 certification would produce the required public statement of assurance in support of the modern-day homologation process. The inclusion of the "superset" rule in BS7799:1997 Part 2 makes this hope a reality.

Summary and Conclusions

In this paper we have investigated the evolution and use of homologation criteria. We have concluded that, with the rapid advances in technology and the migration towards a more user-centric culture, the homologation process should move towards certifying the ISMS rather than the target enterprise. An appropriate marriage of the emergent Common Criteria and the BS7799 standard goes part way to solving the problem, but real-time security management tools are likely to be a necessary ingredient for success.