Security Systems - Module 2


Security specialists (e.g., Anderson [6]) have found it useful to place potential security violations in three categories.

1) Unauthorized information release: an unauthorized person is able to read and take advantage of information stored in the computer. This category of concern sometimes extends to "traffic analysis," in which the intruder observes only the patterns of information use and from those patterns can infer some information content. It also includes unauthorized use of a proprietary program.

2) Unauthorized information modification: an unauthorized person is able to make changes in stored information--a form of sabotage. Note that this kind of violation does not require that the intruder see the information he has changed.

3) Unauthorized denial of use: an intruder can prevent an authorized user from referring to or modifying information, even though the intruder may not be able to refer to or modify the information. Causing a system "crash," disrupting a scheduling algorithm, or firing a bullet into a computer are examples of denial of use. This is another form of sabotage.

Design principle 2. User Acceptability

1. Another kind of cost of security: it is annoying and gets in the way.
2. Examples: password requirements, multi-factor authentication.
3. Related observation: a system is only as secure as its weakest link, and people are often the weak link.
4. Make the right assumptions about people so that they are not the weak link.

Design principles for secure systems help us understand

1. The cost of security, including its impact on usability
2. The importance of safe defaults and of granting only the privileges that are necessary
3. Open design and keeping the system simple

Defender security cost and benefit

1. Cyber risk = attack likelihood * attack impact (see the sketch below)
2. Reduce risk to an acceptable level
3. Weigh defense cost against response cost
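A minimal sketch of the risk calculation in item 1, in Python. The scenario names, likelihood and impact numbers, and the acceptable-risk threshold are illustrative assumptions, not figures from the lecture.

```python
# Sketch: cyber risk = attack likelihood * attack impact,
# compared against the level of risk the defender is willing to accept.

def cyber_risk(likelihood: float, impact_dollars: float) -> float:
    """Expected annual loss for one attack scenario."""
    return likelihood * impact_dollars

# Hypothetical scenarios: (annual likelihood, impact in dollars).
scenarios = {
    "phishing":     (0.30, 50_000),
    "ransomware":   (0.05, 500_000),
    "insider_leak": (0.01, 2_000_000),
}

ACCEPTABLE_RISK = 30_000  # assumed acceptable expected loss per scenario

for name, (p, impact) in scenarios.items():
    risk = cyber_risk(p, impact)
    verdict = "acceptable" if risk <= ACCEPTABLE_RISK else "needs more defense"
    print(f"{name:12s} expected loss ${risk:>9,.0f} -> {verdict}")
```

The point of the comparison is item 2: defenses only need to push each scenario's expected loss below the acceptable level, not to zero.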

Detecting Login Trojan with defense in depth

1. Use two independently developed compilers; assume that no more than one source is likely to introduce a Trojan.
2. Assume the Trojan is present in compiler A (we don't know which one it is).

Design Principle 4: Least Privilege

1. The user is authenticated.
2. Privilege = access rights.
3. Least privilege: at any point while a program is running, it should hold only the fewest privileges needed for the resources it is accessing (see the sketch below).
4. Do common systems do this? Unix: a single UID means all of the user's files are accessible even while just browsing the web. Android: different UIDs for different apps, so they don't all have to hold the same privileges.
5. Related principle: separation of privileges.
6. Related principle: fail-safe defaults.
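A minimal sketch of least privilege on a Unix-like system, assuming a hypothetical server that needs root only to bind a low port: it acquires that one resource, then drops to an unprivileged UID/GID (values assumed here) before doing any other work.

```python
# Sketch: hold root only for the single step that needs it, then drop it,
# so the rest of the program runs with the fewest privileges necessary.
import os
import socket

UNPRIVILEGED_UID = 1000  # assumed non-root account
UNPRIVILEGED_GID = 1000

def open_privileged_port() -> socket.socket:
    # Binding to port 80 is the only step that requires root on Linux.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("0.0.0.0", 80))
    s.listen()
    return s

def drop_privileges() -> None:
    if os.getuid() == 0:
        os.setgid(UNPRIVILEGED_GID)  # drop group first, then user
        os.setuid(UNPRIVILEGED_UID)

if __name__ == "__main__":
    listener = open_privileged_port()  # privileged step
    drop_privileges()                  # everything after this runs unprivileged
    print("serving as uid", os.getuid())
```

This is the same idea behind Android's per-app UIDs in item 4: compromising the unprivileged part no longer gives the attacker every right the process ever held.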

Design Principle 3: Keep it simple. Economy of Mechanism

1. Prefer a few simple mechanisms.
2. Open design: don't rely on obscurity or on the attacker's ignorance.
3. Do widely used systems follow this principle?
4. OS vs. hypervisor as the TCB: the hypervisor is more compact.

Separation of privilege

A protection mechanism is more flexible if it requires two separate keys to unlock it, allowing for two-person control and similar techniques to prevent unilateral action by a subverted individual. The classic examples include dual keys for safety deposit boxes and the two-person control applied to nuclear weapons and Top Secret crypto materials. Figure 3 (courtesy of the Titan Missile Museum) shows how two separate padlocks were used to secure the launch codes for a Titan nuclear missile.

Economy of mechanism

A simple design is easier to test and validate.

Complete mediation

Access rights are completely validated every time an access occurs. Systems should rely as little as possible on access decisions retrieved from a cache. Again, file permissions tend to reflect this model: the operating system checks the user requesting access against the file's ACL. The technique is less evident when applied to email, which must pass through separately applied packet filters, virus filters, and spam detectors.
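A minimal sketch of complete mediation: every read goes through the same ACL check, and no "already allowed" decision is cached, so a revocation takes effect on the very next access. The users, files, and ACL contents are hypothetical.

```python
# Sketch: check authority on every access to every object; never trust a
# cached decision, so revocation is effective immediately.

acl = {  # object -> set of users allowed to read it (hypothetical data)
    "/payroll.txt": {"alice"},
    "/notes.txt": {"alice", "bob"},
}

def read_file(user: str, path: str) -> str:
    # The check sits inside the only code path that can reach the object.
    if user not in acl.get(path, set()):
        raise PermissionError(f"{user} may not read {path}")
    return f"<contents of {path}>"

print(read_file("bob", "/notes.txt"))   # allowed
acl["/notes.txt"].discard("bob")        # revoke bob's right
try:
    read_file("bob", "/notes.txt")      # denied at once -- no stale cache
except PermissionError as e:
    print(e)
```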

How much value do I place on what needs to be secured?

Assets

How do we make it unprofitable for the attacker?

Attackers will spend no more on an attack than they expect to gain from it, so defenses should push the cost of attack above the attacker's expected payoff (the lower, darker part of the cost-benefit curve in the lecture figure).

Open design

Baran (1964) argued persuasively in an unclassified RAND report that secure systems, including cryptographic systems, should have unclassified designs. This reflects recommendations by Kerckhoffs (1883) as well as Shannon's maxim: "The enemy knows the system" (Shannon, 1949). Even the NSA, which resisted open crypto designs for decades, now uses the Advanced Encryption Standard to encrypt classified information.

Fail-safe defaults

Base access decisions on permission rather than exclusion.

Fail-safe Default

The default is deny (fail rather than allow potentially insecure access). A default password of 12345? A popular router shipped with this vulnerability.

Enterprise network uses both a firewall and IDS/IPS

Defense in depth

Lock and key, alarms, guards, unarmed vs armed, armies

Defenses

Corollary:

Diverse mechanisms are less likely to share the same vulnerability

eight design principles, which tend to reduce both the number and the seriousness of any flaws

Economy of mechanism, fail-safe defaults, complete mediation, open design, separation of privilege, least privilege, least common mechanism, and psychological acceptability.

Complete mediation

Every access to every object must be checked for authority.

Least privilege

Every program and every user of the system should operate using the least set of privileges necessary to complete the job

Least privilege

Every program and user should operate while invoking as few privileges as possible. This is the rationale behind Unix "sudo" and Windows User Account Control, both of which allow a user to apply administrative rights temporarily to perform a privileged task.

Another Tradeoff: Complexity vs. Security

Picture features/functionality on the y-axis and security assurance on the x-axis: more features mean a larger attack surface, while an overly conservative design may lack needed functionality and usability.

Fail-safe defaults

Figure 2 shows a physical example: outsiders can't enter a store via an emergency exit, and insiders may only use it in emergencies. In computing systems, the safe default is generally "no access," so that the system must specifically grant access to resources. Most file access permissions work this way, though Windows also provides a "deny" right. Windows access control list (ACL) settings may be inherited, and the "deny" right gives the user an easy way to revoke a right granted through inheritance. However, this also illustrates why "default deny" is easier to understand and implement, since it's harder to interpret a mixture of "permit" and "deny" rights.
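A minimal sketch in the Windows-ACL spirit of the paragraph above: the default answer is "no access," allow entries can be inherited from a parent folder, and an explicit deny entry overrides an inherited allow. The folder layout, groups, and users are made up.

```python
# Sketch: fail-safe (default-deny) access check with inherited "allow"
# entries and explicit "deny" entries that override them.

# Hypothetical ACLs: object -> list of (principal, "allow" | "deny") entries.
acls = {
    "/projects":            [("staff", "allow")],
    "/projects/secret.doc": [("bob", "deny")],  # revokes bob's inherited allow
}

groups = {"alice": {"staff"}, "bob": {"staff"}}  # assumed group membership

def entries_for(path: str):
    """Collect ACL entries from the object and its parent folders (inheritance)."""
    parts = path.split("/")
    for i in range(len(parts), 1, -1):
        yield from acls.get("/".join(parts[:i]), [])

def can_access(user: str, path: str) -> bool:
    principals = {user} | groups.get(user, set())
    decisions = [kind for who, kind in entries_for(path) if who in principals]
    if "deny" in decisions:       # an explicit deny always wins
        return False
    return "allow" in decisions   # no matching entry at all -> fail-safe deny

print(can_access("alice", "/projects/secret.doc"))  # True  (inherited allow)
print(can_access("bob",   "/projects/secret.doc"))  # False (explicit deny)
print(can_access("carol", "/projects/secret.doc"))  # False (default deny)
```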

Suppose every location in memory were tagged with an extra bit.

If the bit is OFF, the word in that location is an ordinary data or instruction word. If the bit is ON, the word is taken to contain a value suitable for loading into a protection descriptor register.

Psychological acceptability

It is essential that the human interface be designed for ease of use, so that users routinely and automatically apply the protection mechanisms correctly.

Compromise recording

It is sometimes suggested that mechanisms that reliably record that a compromise of information has occurred can be used in place of more elaborate mechanisms that completely prevent loss

Prevention=

Keep the attacker out. This is what we primarily focus on.

Economy of mechanism

Keep the design as simple and small as possible

User or psychological acceptability design principle

Make the right assumptions about people so that they are not the weak link.

Imposing a higher cost on people

Makes the system less secure

Least common mechanism

Minimize the amount of mechanism common to more than one user and depended on by all users

Will the defender spend more on security than the asset itself is worth?

No

Do systems pay attention to design principles?

Not always. The US Postal Service exposed the data of 60 million users; the Mirai botnet exploited default credentials; the Morris worm exploited a remote debug feature. By contrast, an enterprise network that uses both a firewall and an IDS/IPS is applying defense in depth.

What value should be set on security?

Not more than the value of the asset itself

US Postal Service exploit

Once logged in, a user could read anyone's data. There was no access control; least privilege was not enforced.

Security=

Prevention + Detection + Response/Remediation

Steps to determine if compiler is bad

Compile the same source with both compilers to produce two binaries, X' and Y'. Should X' = Y'? They should be functionally the same, but they may not match byte for byte (see the sketch below).
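A minimal sketch of that cross-check, assuming two independently developed C compilers (gcc and clang stand in for them here) and a hypothetical login.c: build the same source with both, then compare the binaries' behavior on test inputs rather than their bytes.

```python
# Sketch: defense in depth against a compromised compiler -- compile the
# same source with two compilers and compare how the binaries behave.
import os
import subprocess
import tempfile

SOURCE = "login.c"            # hypothetical source file
COMPILERS = ["gcc", "clang"]  # assumed independent toolchains on PATH
TEST_INPUTS = [b"alice\npassword1\n", b"root\nhunter2\n"]  # illustrative

def build(compiler: str, source: str) -> str:
    out = os.path.join(tempfile.mkdtemp(), f"login_{compiler}")
    subprocess.run([compiler, source, "-o", out], check=True)
    return out

def run(binary: str, stdin: bytes) -> bytes:
    return subprocess.run([binary], input=stdin, capture_output=True).stdout

if __name__ == "__main__":
    x_prime, y_prime = (build(c, SOURCE) for c in COMPILERS)
    for case in TEST_INPUTS:
        if run(x_prime, case) != run(y_prime, case):
            print("Binaries diverge on", case, "- one toolchain may be compromised")
            break
    else:
        print("Binaries agree on all test inputs (functionally equivalent so far)")
```

Agreement on test inputs is evidence, not proof: the binaries only need to be functionally equivalent, and the comparison is only as strong as the chosen test inputs.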

Work factor

Stronger security measures pose more work for the attacker. The authors acknowledged that such a measure could estimate trial-and-error attacks on randomly chosen passwords. However, they questioned its relevance since there often existed "indirect strategies" to penetrate a computer by exploiting flaws. "Tiger teams" in the early 1970s had systematically found flaws in software systems that allowed successful penetration, and there was not yet enough experience to apply work factor estimates effectively.
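A small worked example of the work-factor estimate the passage refers to: the expected number of trials to guess a uniformly random password is about half the size of the password space. The character-set sizes, lengths, and guessing rate are illustrative assumptions.

```python
# Sketch: work factor for trial-and-error guessing of a random password.
# Expected trials ~ (charset_size ** length) / 2.

def expected_crack_days(charset_size: int, length: int, guesses_per_second: float) -> float:
    """Expected time in days to guess a uniformly random password."""
    search_space = charset_size ** length
    return (search_space / 2) / guesses_per_second / 86_400

RATE = 1e10  # assumed offline attacker: 10 billion guesses per second
for charset, size in [("digits", 10), ("lowercase", 26), ("printable ASCII", 95)]:
    for length in (8, 12):
        days = expected_crack_days(size, length, RATE)
        print(f"{charset:15s} length {length:2d}: ~{days:.3g} days")
```

As the passage notes, this estimate says nothing about indirect strategies that bypass the password entirely by exploiting flaws.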

Open design

The design should not be secret

Compromise recording

The system should keep records of attacks even if the attacks aren't necessarily blocked. The authors were skeptical about this, since the system ought to be able to prevent penetrations in the first place. If the system couldn't prevent a penetration or other attack, then it was possible that the compromise recording itself may be modified or destroyed.

Psychological acceptability

This principle essentially requires the policy interface to reflect the user's mental model of protection, and notes that users won't specify protections correctly if the specification style doesn't make sense to them.

What does it need to be protected from and why are we a target?

Threat actors

All users of the memory system would use the same addressing descriptors, and these descriptors would have no permission bits--only a base and a bound value

True

Complex systems are challenging to make trustworthy

True

Design principles should be informed by cost and effectiveness of defenses

True

For a given functional level, most existing protection systems differ in the way they handle protection dynamics

True

If the supervisor switches control of a real processor from one virtual processor to another, it would first reload the protection descriptors; the processor address space thus is different for different users, while the system address space remains the same for all users

True

It is usually possible to design the normal access controls to accommodate most system functions without privileges

True

One vulnerability is too many

True

A TCB that follows secure design principles is more likely to meet its requirements

True

The key to the control of the design process is the security architecture—a detailed description of all aspects of the system that relate to security, along with a set of principles to guide the design.

True

The more you spend on defense, the lower the expected response cost

True

Thus accountability in ticket-oriented systems can be difficult to pinpoint.

True

When program A is in control, it can have access only to itself and the math routine; similarly, when program B is in control, it can have access only to itself and the math routine. Since neither program has the power to change the descriptor register, sharing of the math routine has been accomplished while maintaining isolation of program A from program B.

True

a virtual processor is not permitted to load its own protection descriptor registers

True

capability systems are ticket-oriented, while access control list systems are list-oriented

True

privileges should not be used as a catch-all to make up for deficient and inflexible access controls

True

the principle of least privilege provides a rationale for where to install the firewalls. The military security rule of "need-to-know" is an example of this principle.

True

• Security should not affect users who obey the rules.
• It should be easy for users to give access.
• It should be easy for users to restrict access.

True

Should X prime = Y prime?

Two functionally equivalent programs should produce the same output given the same input; if they do not, one of them is buggy (or compromised). The defense-in-depth principle requires more than one defense but provides greater security.

Systematic way of building a system

Use Design Principles

Least common mechanism

Users should not share system mechanisms except when absolutely necessary, because shared mechanisms may provide unintended communication paths or means of interference.

Cyber Risk?

What is the likelihood of an attack, and what will its impact (cost) be?

Separation of privilege

Where feasible, a protection mechanism that requires two keys to unlock it is more robust and flexible than one that allows access to the presenter of only a single key.
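A minimal sketch of the two-key idea: a protected action proceeds only when two distinct authorized key holders approve it, so a single subverted individual cannot act alone. The officer names are hypothetical.

```python
# Sketch: separation of privilege -- two separate authorized approvers
# (two-person control) are required before the action is permitted.

AUTHORIZED_OFFICERS = {"officer_a", "officer_b", "officer_c"}  # hypothetical

def action_authorized(approvals: set[str]) -> bool:
    valid = approvals & AUTHORIZED_OFFICERS
    return len(valid) >= 2          # two *distinct* authorized keys required

print(action_authorized({"officer_a"}))               # False: one key is not enough
print(action_authorized({"officer_a", "intruder"}))   # False: second key not authorized
print(action_authorized({"officer_a", "officer_b"}))  # True: two separate keys
```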

Attacker cost and benefit

Work factor (cost of attack) vs gain

memory system itself would be addressed with two-component addresses:

a unique segment identifier (to be used as a key by the memory system to look up the appropriate descriptor) and an offset address that indicates which part of the segment is to be read or written
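A minimal sketch of that two-component addressing: the memory system looks up the segment's descriptor by its identifier, checks the offset against the bound, and only then forms the address. The segment table contents are made up.

```python
# Sketch: resolve a (segment id, offset) address through a descriptor that
# holds a base and a bound; offsets outside the bound are refused.

descriptors = {            # segment id -> (base, bound); hypothetical values
    3: (0x1000, 0x0200),   # segment 3: 0x200 words starting at 0x1000
    7: (0x4000, 0x0040),
}

def translate(segment_id: int, offset: int) -> int:
    base, bound = descriptors[segment_id]  # look up the descriptor by its key
    if not 0 <= offset < bound:            # bound check on every reference
        raise MemoryError(f"offset {offset:#x} is outside segment {segment_id}")
    return base + offset                   # address actually presented to memory

print(hex(translate(3, 0x10)))   # 0x1010
try:
    translate(3, 0x300)          # beyond the 0x200-word bound
except MemoryError as e:
    print(e)
```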

Implementations of protection mechanisms that permit sharing fall into the two general categories described by Wilkes [37]

a) "List-oriented" implementations, in which the guard holds a list of identifiers of authorized users, and the user carries a unique unforgeable identifier that must appear on the guard's list for access to be permitted. A store clerk checking a list of credit customers is an example of a list-oriented implementation in practice. The individual might use his driver's license as a unique unforgeable identifier. b) "Ticket-oriented" implementations, in which the guard holds the description of a single identifier, and each user has a collection of unforgeable identifiers, or tickets,17 corresponding to the objects to which he has been authorized access. A locked door that opens with a key is probably the most common example of a ticket-oriented mechanism; the guard is implemented as the hardware of the lock, and the matching key is the (presumably) unforgeable authorizing identifier.

At least four levels of functional goals for a protection system can be identified

all-or-nothing systems, controlled sharing, user-programmed sharing controls, and putting strings on information

3. Design Principle: Economy of Mechanisms

Complex systems have many mechanisms; simpler systems have fewer.

The term "privacy"

denotes a socially defined ability of an individual (or organization) to determine whether, when, and to whom personal (or organizational) information is to be released.

The term "security"

describes techniques that control who may use or modify the computer or the information contained in it.

Separation of privileges

Example: separate keys for different secure areas (each office has a different key). Fine-grained access control: different resources are accessed with different privileges.

Remote debug feature exploited by the Morris worm

Fail-safe defaults: the remote debug feature should have been off by default.

What can be done to facilitate detection, response, and remediation?

Reduce false alerts and security-analyst overload (related to the first principle). Build in modularity for patching (remediation). Be secure in design, by default, and in deployment.

descriptors

have been introduced here for the purpose of protecting information, although they are also used in some systems to organize addressing and storage allocation

Economics of Security

How do we decide how much to spend on security?

Design Principle: Defense in Depth

I.e., multiple checkpoints. No single defense is perfect; the attacker must penetrate several layers of defenses.

Examples of security techniques sometimes applied to computer systems are the following:

- labeling files with lists of authorized users,
- verifying the identity of a prospective user by demanding a password,
- shielding the computer to prevent interception and subsequent interpretation of electromagnetic radiation,
- enciphering information sent over telephone lines,
- locking the room containing the computer,
- controlling who is allowed to make changes to the computer system (both its hardware and software),
- using redundant circuits or programmed cross-checks that maintain security in the face of hardware or software failures,
- certifying that the hardware and software are actually implemented as intended.

Mirai botnet

An IoT botnet of routers and cameras; the bots were used to mount DDoS attacks. Devices were compromised by trying a small set of possible default usernames/passwords.

Examples of administrative functions that are not security-relevant include

mounting tapes, taking dumps, starting and stopping printer queues, bringing up and bringing down various background system processes, reconfiguring hardware, and entering certain user attributes.

This particular tagged architecture is known as a capability system,

one that lets the user place protection descriptor values in memory addresses that are convenient to him. A memory word that contains a protection descriptor value (in our simple tagged system, one that has its tag bit ON) is known as a capability.
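A minimal sketch of the tag-bit rule: each memory word carries a tag, and only words with the tag ON may be loaded into a protection descriptor register. The addresses and word values are illustrative.

```python
# Sketch: capability (tagged) memory -- the tag bit decides whether a word
# may be loaded into a protection descriptor register.

class Word:
    def __init__(self, value: int, tag: bool = False):
        self.value = value
        self.tag = tag        # ON means "this word holds a capability"

memory = {
    0x10: Word(42),                        # ordinary data word
    0x11: Word(0x1000_0200, tag=True),     # capability (descriptor value)
}

def load_descriptor_register(address: int) -> int:
    word = memory[address]
    if not word.tag:          # ordinary words can never become capabilities
        raise PermissionError(f"word at {address:#x} is not a capability")
    return word.value

print(hex(load_descriptor_register(0x11)))  # OK: tag bit is ON
try:
    load_descriptor_register(0x10)          # data word: refused
except PermissionError as e:
    print(e)
```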

Prevention only goes this far

A popular cybersecurity axiom: everyone is compromised; some know they are, and the others will find out in the future. Some major breaches went undetected for years.

Separation of privilege

systems providing user-extendible protected data types usually depend on separation of privilege for their implementation.

in a system based on a security kernel,

we often go so far as to make it impossible for a person with administrator privilege to affect security.

