CO 21 - 23


Note: Not all PKI certificates are directly received from a CA. A registration authority (RA) is a subordinate CA and is certified by a root CA to issue certificates for specific uses.

Class - Description
0 - Used for testing in situations in which no checks have been performed.
1 - Used by individuals who require verification of email.
2 - Used by organizations for which proof of identity is required.
3 - Used for servers and software signing. Independent verification and checking of identity and authority is done by the certificate authority.
4 - Used for online business transactions between companies.
5 - Used for private organizations or government security.

Sandboxing is a technique that allows suspicious files to be executed and analyzed in a safe environment. Automated malware analysis sandboxes offer tools that analyze malware behavior. These tools observe the effects of running unknown malware so that features of malware behavior can be determined and then used to create defenses against it.

Cuckoo Sandbox is a popular free malware analysis system sandbox. It can be run locally and have malware samples submitted to it for analysis. A number of other online public sandboxes exist. These services allow malware samples to be uploaded for analysis. Some of these services are VirusTotal, Joe Sandbox, and CrowdStrike Falcon Sandbox.
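
For example, a sample that has already been analyzed can be looked up by its hash through the VirusTotal REST API. The following minimal Python sketch assumes the VirusTotal API v3 file endpoint and uses a placeholder API key and hash; it is an illustration, not an official client.

import requests  # third-party HTTP library

API_KEY = "YOUR_VT_API_KEY"            # placeholder; issued with a VirusTotal account
SAMPLE_SHA256 = "<sha256-of-sample>"   # placeholder hash of the suspicious file

response = requests.get(
    f"https://www.virustotal.com/api/v3/files/{SAMPLE_SHA256}",
    headers={"x-apikey": API_KEY},
)
report = response.json()
# last_analysis_stats summarizes how many engines rated the file malicious, suspicious, or harmless
print(report["data"]["attributes"]["last_analysis_stats"])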

PROTECT
Identity Management and Access Control
Information Protection Processes and Procedures
Maintenance
Protective Technology

DETECT
Anomalies and Events
Security Continuous Monitoring
Detection Processes

RESPOND
Response Planning
Communications
Analysis
Mitigation
Improvements

RECOVER
Recovery Planning
Improvements
Communications

Data Encryption Standard (DES) This is a legacy symmetric encryption algorithm. It uses a short key length that makes it insecure for most current uses.

3DES (Triple DES) This is the replacement for DES and repeats the DES algorithm process three times. It should be avoided if possible because it is scheduled to be retired in 2023. If implemented, use very short key lifetimes.

The US Government Federal Information Processing Standard (FIPS)

A digital certificate is equivalent to an electronic passport. It enables users, hosts, and organizations to securely exchange information over the Internet. Specifically, a digital certificate is used to authenticate and verify that a user who is sending a message is who they claim to be. Digital certificates can also be used to provide confidentiality for the receiver with the means to encrypt a reply.

iptables - This is an application that allows Linux system administrators to configure network access rules that are part of the Linux kernel Netfilter modules.
nftables - The successor to iptables, nftables is a Linux firewall application that uses a simple virtual machine in the Linux kernel. Code is executed within the virtual machine that inspects network packets and implements decision rules regarding packet acceptance and forwarding.
TCP Wrappers - This is a rule-based access control and logging system for Linux. Packet filtering is based on IP addresses and network services.
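
As an illustration of the rule syntax (a minimal sketch; the subnet and port shown are assumed for the example), an administrator might use iptables to permit SSH only from a management subnet and drop it from everywhere else:

iptables -A INPUT -p tcp --dport 22 -s 10.0.0.0/24 -j ACCEPT
iptables -A INPUT -p tcp --dport 22 -j DROP

The equivalent intent in nftables, assuming an inet table named filter with an input chain already exists, could be expressed as:

nft add rule inet filter input ip saddr 10.0.0.0/24 tcp dport 22 accept
nft add rule inet filter input tcp dport 22 drop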

A host-based intrusion detection system (HIDS) is designed to protect hosts against known and unknown malware. A HIDS can perform detailed monitoring and reporting on the system configuration and application activity. It can provide log analysis, event correlation, integrity checking, policy enforcement, rootkit detection, and alerting. A HIDS will frequently include a management server endpoint.

After the risks are identified, they may be scored or weighted as a way of prioritizing risk reduction strategies.

A mandatory activity in risk assessment is the identification of threats and vulnerabilities and the matching of threats with vulnerabilities in what is often called threat-vulnerability (T-V) pairing. The T-V pairs can then be used as a baseline to indicate risk before security controls are implemented. This baseline can then be compared to ongoing risk assessments as a means of evaluating risk management effectiveness. This part of risk assessment is referred to as determining the inherent risk profile of an organization.

An interesting online tool is ANY.RUN. It offers the ability to upload a malware sample for analysis like any online sandbox. However, it also offers very rich interactive reporting that is full of details regarding the malware sample. ANY.RUN runs the malware and captures a series of screenshots if the malware has interactive elements that display on the sandbox computer screen.

A means of capturing just the right period for baseline measurement is known as sliding window anomaly detection.

Check:
Monitor implementation
Compile reports
Support external certification audit

Act:
Continually audit processes
Continually improve processes
Take corrective action
Take preventive action

ISMSs are a natural extension of the use of popular business models, such as Total Quality Management (TQM) and Control Objectives for Information and Related Technologies (COBIT), into the realm of cybersecurity. An ISMS is a systematic, multi-layered approach to cybersecurity. The approach includes people, processes, technologies, and the cultures in which they interact in a process of risk management.

An ISMS often incorporates the "plan-do-check-act" framework, known as the Deming cycle, from TQM. It is seen as an elaboration on the process component of the People-Process-Technology-Culture model of organizational capability.

Patch management software is available from companies such as SolarWinds and LANDesk. Microsoft System Center Configuration Manager (SCCM) is an enterprise-level tool for automated distribution of patches to a large number of Microsoft Windows workstations and servers.

An Information Security Management System (ISMS) consists of a management framework through which an organization identifies, analyzes, and addresses information security risks. ISMSs are not based in servers or security devices. Instead, an ISMS consists of a set of practices that are systematically applied by an organization to ensure continuous improvement in information security. ISMSs provide conceptual models that guide organizations in planning, implementing, governing, and evaluating information security programs.

It can be said that host-based security systems function as both detection and prevention systems because they prevent known attacks and detect unknown potential attacks. A HIDS uses both proactive and reactive strategies. A HIDS can prevent intrusion because it uses signatures to detect known malware and prevent it from infecting a system. However, this strategy is only good against known threats. Signatures are not effective against new, or zero day, threats. In addition, some malware families exhibit polymorphism.

Anomaly-based - Host system behavior is compared to a learned baseline model of normal behavior. Significant deviations from the baseline are interpreted as the result of some sort of intrusion. If an intrusion is detected, the HIDS can log details of the intrusion, send alerts to security management systems, and take action to prevent the attack.
Policy-based - Normal system behavior is described by rules, or the violation of rules, that are predefined. Violation of these policies will result in action by the HIDS. The HIDS may attempt to shut down software processes that have violated the rules and can log these events and alert personnel to violations.
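
To make the anomaly-based approach concrete, the following minimal Python sketch flags samples that deviate from a learned baseline held in a sliding window. The monitored metric, window size, and threshold are illustrative assumptions rather than the behavior of any particular HIDS product.

from collections import deque
from statistics import mean, stdev

WINDOW = 60        # number of recent samples that form the baseline (assumed)
THRESHOLD = 3.0    # flag samples more than 3 standard deviations from the mean (assumed)

baseline = deque(maxlen=WINDOW)

def check_sample(value):
    """Return True if the new sample deviates significantly from the learned baseline."""
    anomalous = False
    if len(baseline) == WINDOW:
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(value - mu) > THRESHOLD * sigma:
            anomalous = True   # a real HIDS would log the event and alert the management server
    baseline.append(value)     # sliding window: the oldest sample drops off automatically
    return anomalous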

One way of decreasing the attack surface is to limit access to potential threats by creating lists of prohibited applications. This is known as block listing.

Application block lists can dictate which user applications are not permitted to run on a computer. Similarly, allow lists can specify which programs are allowed to run.
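
The following simplified Python sketch illustrates one way an agent could enforce an application block list by file hash; the hash value and file path handling are placeholders, and real products also match on file paths, publishers, and signatures.

import hashlib

# Hypothetical block list of SHA-256 hashes of prohibited executables
BLOCK_LIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder hash
}

def is_blocked(path):
    """Return True if the file's SHA-256 hash appears on the block list."""
    with open(path, "rb") as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    return digest in BLOCK_LIST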

According to NIST, vulnerability management is a security practice that is designed to proactively prevent the exploitation of IT vulnerabilities that exist within an organization. The expected result is to reduce the time and money spent dealing with vulnerabilities and the exploitation of those vulnerabilities. Proactively managing vulnerabilities of systems will reduce or eliminate the potential for exploitation and involve considerably less time and effort than responding after an exploitation has occurred.

Asset management involves the implementation of systems that track the location and configuration of networked devices and software across an enterprise.

Examples of protocols that use asymmetric key algorithms include:
Internet Key Exchange (IKE) - This is a fundamental component of IPsec VPNs.
Secure Socket Layer (SSL) - This is now implemented as IETF standard Transport Layer Security (TLS).
Secure Shell (SSH) - This protocol provides a secure remote access connection to network devices.
Pretty Good Privacy (PGP) - This computer program provides cryptographic privacy and authentication. It is often used to increase the security of email communications.

Asymmetric algorithms are substantially slower than symmetric algorithms. Their design is based on computational problems, such as factoring extremely large numbers or computing discrete logarithms of extremely large numbers.

Rivest Cipher (RC) series algorithms This series of algorithms was developed by Ron Rivest. Several variations have been developed, but RC4 was the most prevalent in use. RC4 is a stream cipher that was used to secure web traffic. It has been found to have multiple vulnerabilities which have made it insecure. RC4 should not be used.

Asymmetric algorithms, also called public-key algorithms, are designed so that the key that is used for encryption is different from the key that is used for decryption. The decryption key cannot, in any reasonable amount of time, be calculated from the encryption key, and vice versa.

The first step in the CA authentication procedure is to securely obtain a copy of the CA's public key. All systems that leverage the PKI must have the CA's public key, which is called the self-signed certificate. The CA public key verifies all the certificates issued by the CA and is vital for the proper operation of the PKI. Note: Only a root CA can issue a self-signed certificate that is recognized or verified by other CAs within the PKI.

Authentication no longer requires the presence of the CA server, and each user exchanges their certificates containing public keys.

Asymmetric algorithms use a public key and a private key. Both keys are capable of the encryption process, but the complementary paired key is required for decryption. The process is also reversible. Data that is encrypted with the public key requires the private key to decrypt. Asymmetric algorithms achieve confidentiality and authenticity by using this process.

Because neither party has a shared secret, very long key lengths must be used. Asymmetric encryption can use key lengths between 512 and 4,096 bits. Key lengths greater than or equal to 2,048 bits can be trusted, while key lengths of 1,024 bits or shorter are considered insufficient.

Today, symmetric encryption algorithms are commonly used with VPN traffic. This is because symmetric algorithms use less CPU resources than asymmetric encryption algorithms. This allows the encryption and decryption of data to be fast when using a VPN. When using symmetric encryption algorithms, like any other type of encryption, the longer the key, the longer it will take for someone to discover the key. Most encryption keys are between 112 and 256 bits. To ensure that the encryption is safe, a minimum key length of 128 bits should be used. Use a longer key for more secure communications.

Block ciphers transform a fixed-length block of plaintext into a common block of ciphertext of 64 or 128 bits. Common block ciphers include DES with a 64-bit block size and AES with a 128-bit block size.

Symmetric encryption algorithms such as Data Encryption Standard (DES), 3DES, and Advanced Encryption Standard (AES) are based on the premise that each communicating party knows the pre-shared key.

Data confidentiality can also be ensured using asymmetric algorithms, including Rivest, Shamir, and Adleman (RSA) and the public key infrastructure (PKI).

The authentication objective of asymmetric algorithms is initiated when the encryption process is started with the private key. The process can be summarized using the formula: Private Key (Encrypt) + Public Key (Decrypt) = Authentication

Diffie-Hellman (DH) is an asymmetric mathematical algorithm that allows two computers to generate an identical shared secret without having communicated before. The new shared key is never actually exchanged between the sender and receiver. However, because both parties know it, the key can be used by an encryption algorithm to encrypt traffic between the two systems.
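
A minimal sketch of a DH key agreement using the Python cryptography library is shown below. The 2048-bit parameter size matches DH group 14 mentioned later in this section, and deriving a session key with HKDF is an illustrative choice rather than part of DH itself.

from cryptography.hazmat.primitives.asymmetric import dh
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Both parties must agree on the same public DH parameters (generator and prime)
parameters = dh.generate_parameters(generator=2, key_size=2048)

alice_private = parameters.generate_private_key()
bob_private = parameters.generate_private_key()

# Each side combines its own private key with the peer's public key
alice_shared = alice_private.exchange(bob_private.public_key())
bob_shared = bob_private.exchange(alice_private.public_key())
assert alice_shared == bob_shared   # identical shared secret, never transmitted

# Derive a fixed-length symmetric session key from the shared secret
session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"vpn session").derive(alice_shared)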

Digital signatures are a mathematical technique used to provide authenticity, integrity, and nonrepudiation. Digital signatures have specific properties that enable entity authentication and data integrity. In addition, digital signatures provide nonrepudiation of the transaction. In other words, the digital signature serves as legal proof that the data exchange did take place. Digital signatures use asymmetric cryptography.

Digital signatures are commonly used in the following two situations:
Code signing - This is used for data integrity and authentication purposes. Code signing is used to verify the integrity of executable files downloaded from a vendor website. It also uses signed digital certificates to authenticate and verify the identity of the site that is the source of the files.
Digital certificates - These are similar to a virtual ID card and are used to authenticate the identity of a system with a vendor website and establish an encrypted connection to exchange confidential data.

Digital signatures are commonly used to provide assurance of the authenticity and integrity of software code. Executable files are wrapped in a digitally signed envelope, which allows the end user to verify the signature before installing the software.

Digitally signing code provides several assurances about the code:
The code is authentic and is actually sourced by the publisher.
The code has not been modified since it left the software publisher.
The publisher undeniably published the code. This provides nonrepudiation of the act of publishing.

Plan:
Understand relevant business objectives
Define scope of activities
Access and manage support
Assess and define risk
Perform asset management and vulnerability assessment

Do:
Create and implement risk management plan
Establish and enforce risk management policies and procedures
Train personnel, allocate resources

Cross-certified CA topologies - This is a peer-to-peer model in which individual CAs establish trust relationships with other CAs by cross-certifying CA certificates. Users in either CA domain are also assured that they can trust each other. This provides redundancy and eliminates a single point of failure.

Hierarchical CA topologies - The highest-level CA is called the root CA. It can issue certificates to end users and to a subordinate CA. The sub-CAs could be created to support various business units, domains, or communities of trust. The root CA maintains the established "community of trust" by ensuring that each entity in the hierarchy conforms to a minimum set of practices. The benefits of this topology include increased scalability and manageability. This topology works well in most large organizations. However, it can be difficult to determine the chain of the signing process.

IDENTIFY - Develop an organizational understanding to manage cybersecurity risk to systems, assets, data, and capabilities.
PROTECT - Develop and implement the appropriate safeguards to ensure delivery of critical infrastructure services.
DETECT - Develop and implement the appropriate activities to identify the occurrence of a cybersecurity event.
RESPOND - Develop and implement the appropriate activities to act on a detected cybersecurity event.
RECOVER - Develop and implement the appropriate activities to maintain plans for resilience and to restore any capabilities or services that were impaired due to a cybersecurity event.

IDENTIFY
Asset Management
Business Environment
Governance
Risk Assessment
Risk Management Strategy

ISO is the International Organization for Standardization. ISO's voluntary standards are internationally accepted and facilitate business conducted between nations. ISO partnered with the International Electrotechnical Commission (IEC) to develop the ISO/IEC 27000 series of specifications for ISMSs.

ISO/IEC 27000 Information security management systems - Overview and vocabulary - Introduction to the standards family, overview of ISMS, essential vocabulary.

ISO/IEC 27001 Information security management systems - Requirements - Provides an overview of ISMS and the essentials of ISMS processes and procedures.

ISO/IEC 27003 Information security management system implementation guidance - Critical factors necessary for successful design and implementation of ISMS.

ISO/IEC 27004 Information security management - Monitoring, measurement, analysis and evaluation - Discussion of metrics and measurement procedures to assess effectiveness of ISMS implementation.

ISO/IEC 27005 Information security risk management - Supports the implementation of ISMS based on a risk-centered management approach.

Server Profile Element - Description
Listening ports - These are the TCP and UDP daemons and ports that are normally allowed to be open on the server.
Logged in users and accounts - These are the parameters defining user access and behavior.
Service accounts - These are the definitions of the type of service that an application is allowed to run.
Software environment - These are the tasks, processes, and applications that are permitted to run on the server.

In addition to statistical and behavioral approaches to anomaly detection is rule-based anomaly detection. Rule-based detection analyzes decoded packets for attacks based on pre-defined patterns.

Host-based antivirus protection is also known as agent-based. Agent-based antivirus runs on every protected machine. Agentless antivirus protection performs scans on hosts from a centralized system.

In addition to the protection functionality provided by host-based security products is the telemetry function. Most host-based security software includes robust logging functionality that is essential to cybersecurity operations. Some host-based security programs will submit logs to a central location for analysis.

Mobile device management (MDM), especially in the age of BYOD, presents special challenges to asset management. Mobile devices cannot be physically controlled on the premises of an organization. They can be lost, stolen, or tampered with, putting data and network access at risk. Part of an MDM plan is acting when devices leave the custody of the responsible party. Measures that can be taken include disabling the lost device, encrypting the data on the device, and enhancing device access with more robust authentication measures.

MDM systems, such as Cisco Meraki Systems Manager

This is known as a cipher suite. The key components of the cipher suite are the Message Authentication Code Algorithm (MAC), the encryption algorithm, the key exchange algorithm, and the authentication algorithm.
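
For example, the TLS cipher suite name below encodes each of these components; the mapping shown follows standard TLS naming, with this particular suite chosen only for illustration:

TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256

Here ECDHE is the key exchange algorithm, RSA is the authentication algorithm, AES_128_CBC is the encryption algorithm, and SHA256 identifies the HMAC used as the MAC algorithm.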

Network monitoring becomes more challenging when packets are encrypted. However, security analysts must be aware of those challenges and address them as best as possible. For instance, when site-to-site VPNs are used, the IPS should be positioned so it can monitor unencrypted traffic. However, the increased use of HTTPS in the enterprise network introduces new challenges. Since HTTPS introduces end-to-end encrypted HTTP traffic (via TLS/SSL), it is not as easy to peek into user traffic.

Diffie-Hellman uses different DH groups to determine the strength of the key that is used in the key agreement process. The higher group numbers are more secure, but require additional time to compute the key. The following identifies the DH groups supported by Cisco IOS Software and their associated prime number value:
DH Group 1: 768 bits
DH Group 2: 1024 bits
DH Group 5: 1536 bits
DH Group 14: 2048 bits
DH Group 15: 3072 bits
DH Group 16: 4096 bits

Note: A DH key agreement can also be based on elliptic curve cryptography. DH groups 19, 20, and 24, which are based on elliptic curve cryptography, are also supported by Cisco IOS Software.

The Public Key Infrastructure (PKI) consists of specifications, systems, and tools that are used to create, manage, distribute, use, store, and revoke digital certificates. The certificate authority (CA) is an organization that creates digital certificates by tying a public key to a confirmed identity, such as a website or individual. The PKI is an intricate system that is designed to safeguard digital identities from hacking by even the most sophisticated threat actors or nation states.

PKI is needed to support large-scale distribution and identification of public encryption keys. The PKI framework facilitates a highly scalable trust relationship. It consists of the hardware, software, people, policies, and procedures needed to create, manage, store, distribute, and revoke digital certificates.

Here are two of the most common methods of revocation:
Certificate Revocation List (CRL) - A list of the serial numbers of certificates that the CA has revoked and invalidated before their scheduled expiration. PKI entities regularly poll the CRL repository to receive the current CRL.
Online Certificate Status Protocol (OCSP) - An internet protocol used to query an OCSP server for the revocation status of an X.509 digital certificate. Revocation information is immediately pushed to an online database.

PKI-related issues that are associated with security warnings include:
Validity date range - The X.509v3 certificates specify "not before" and "not after" dates. If the current date is outside the range, the web browser displays a message. Expired certificates may simply be the result of administrator oversight, but they may also reflect more serious conditions.
Signature validation error - If a browser cannot validate the signature on the certificate, there is no assurance that the public key in the certificate is authentic. Signature validation will fail if the root certificate of the CA hierarchy is not available in the browser's certificate store.
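
A minimal sketch of checking the validity date range of a certificate with the Python cryptography library is shown below; the file name is a placeholder, and a complete validation would also check the signature chain and revocation status.

from datetime import datetime
from cryptography import x509

with open("server.pem", "rb") as f:    # placeholder certificate file
    cert = x509.load_pem_x509_certificate(f.read())

now = datetime.utcnow()    # certificate dates are expressed in UTC
if now < cert.not_valid_before or now > cert.not_valid_after:
    print("Certificate is outside its validity date range")
else:
    print("Certificate dates are valid until", cert.not_valid_after)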

Note: An enterprise can also implement PKI for internal use. PKI can be used to authenticate employees who are accessing the network. In this case, the enterprise is its own CA.

PKIs can form different topologies of trust. The simplest is the single-root PKI topology.

Patch management is related to vulnerability management. Vulnerabilities frequently appear in critical client, server, and networking device operating systems and firmware. Application software, especially internet applications and frameworks such as Acrobat, Flash, and Java, is also frequently found to have vulnerabilities. Patch management involves all aspects of software patching, including identifying required patches; acquiring, distributing, and installing patches; and verifying that the patches are installed on all required systems. Installing patches is frequently the most effective way to mitigate software vulnerabilities. Sometimes, it is the only way to do so.

Patch management is required by some compliance regulations, such as Sarbanes-Oxley (SOX) and the Health Insurance Portability and Accountability Act (HIPAA).

Base Metric Group - Exploitability Metrics
Criteria - Description
Attack vector - This is a metric that reflects the proximity of the threat actor to the vulnerable component. The more remote the threat actor is to the component, the higher the severity. Threat actors close to your network or inside your network are easier to detect and mitigate.
Attack complexity - This is a metric that expresses the number of components, software, hardware, or networks that are beyond the attacker's control and that must be present for a vulnerability to be successfully exploited.
Privileges required - This is a metric that captures the level of access that is required for a successful exploit of the vulnerability.
User interaction - This metric expresses the presence or absence of the requirement for user interaction for an exploit to be successful.
Scope - This metric expresses whether multiple authorities must be involved in an exploit. This is expressed as whether the initial authority changes to a second authority during the exploit.

The table lists the impact metric components.
Term - Description
Confidentiality Impact - This is a metric that measures the impact to confidentiality due to a successfully exploited vulnerability. Confidentiality refers to the limiting of access to only authorized users.
Integrity Impact - This is a metric that measures the impact to integrity due to a successfully exploited vulnerability. Integrity refers to the trustworthiness and authenticity of information.
Availability Impact - This is a metric that measures the impact to availability due to a successfully exploited vulnerability. Availability refers to the accessibility of information and network resources. Attacks that consume network bandwidth, processor cycles, or disk space all impact the availability.

Rating - CVSS Score
None - 0
Low - 0.1 - 3.9
Medium - 4.0 - 6.9
High - 7.0 - 8.9
Critical - 9.0 - 10.0
In general, any vulnerability that exceeds 3.9 should be addressed. The higher the rating level, the greater the urgency for remediation.
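
As a worked example (the vulnerability itself is hypothetical), a flaw that is exploitable over the network (AV:N) with low attack complexity (AC:L), requiring no privileges (PR:N) and no user interaction (UI:N), with unchanged scope (S:U) and high confidentiality, integrity, and availability impacts (C:H/I:H/A:H) produces the vector string:

CVSS:3.0/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H

This combination yields a base score of 9.8, which falls in the Critical range and indicates the highest urgency for remediation.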

Penetration Testing This type of test uses authorized simulated attacks to test the strength of network security. Internal personnel with hacker experience, or professional ethical hackers, identify assets that could be targeted by threat actors. A series of exploits is used to test the security of those assets. Simulated exploit software tools are frequently used. Penetration testing not only verifies that vulnerabilities exist, it actually exploits those vulnerabilities to determine the potential impact of a successful exploit. An individual penetration test is often known as a pen test. Metasploit is a tool used in penetration testing. CORE Impact offers penetration testing software and services.

Risk analysis - Individuals conduct comprehensive analysis of the impacts of attacks on core company assets and functioning. Tools: internal or external consultants, risk management frameworks.
Vulnerability assessment - Patch management, host scans, port scanning, and other vulnerability scans and services. Tools: OpenVAS, Microsoft Baseline Security Analyzer, Nessus, Qualys, Nmap.
Penetration testing - Use of hacking techniques and tools to penetrate network defenses and identify the depth of potential penetration. Tools: Metasploit, CORE Impact, ethical hackers.

Common Vulnerabilities and Exposures (CVE) This is a dictionary of common names, in the form of CVE identifiers, for known cybersecurity vulnerabilities. The CVE identifier provides a standard way to reference known vulnerabilities.

Risk management involves the selection and specification of security controls for an organization. It is part of an ongoing organization-wide information security program that involves the management of the risk to the organization or to individuals associated with the operation of a system.

Risk avoidance Stop performing the activities that create risk. It is possible that as a result of a risk assessment, it is determined that the risk involved in an activity outweighs the benefit of the activity to the organization. If this is found to be true, then it may be determined that the activity should be discontinued.

Risk reduction Decrease the risk by taking measures to reduce vulnerability. This involves implementing management approaches discussed earlier in this chapter. For example, if an organization uses server operating systems that are frequently targeted by threat actors, risk can be reduced through ensuring that the servers are patched as soon as vulnerabilities have been identified.

Risk sharing Shift some of the risk to other parties. For example, a risk-sharing technique might be to outsource some aspects of security operations to third parties. Hiring a security as a service (SECaaS) CSIRT to perform security monitoring is an example. Another example is to buy insurance that will help to mitigate some of the financial losses due to a security incident.

Risk retention Accept the risk and its consequences. This strategy is acceptable for risks that have low potential impact and relatively high cost of mitigation or reduction. Other risks that may be retained are those that are so dramatic that they cannot realistically be avoided, reduced, or shared.

Network Profile Element - Description
Session duration - This is the time between the establishment of a data flow and its termination.
Total throughput - This is the amount of data passing from a given source to a given destination in a given period of time.
Ports used - This is a list of TCP or UDP processes that are available to accept data.
Critical asset address space - These are the IP addresses or the logical location of essential systems or data.

Server profiling is used to establish the accepted operating state of servers. A server profile is a security baseline for a given server. It establishes the network, user, and application parameters that are accepted for a specific server.

Specifically, there are two internal LAN elements to secure:
Endpoints - Hosts commonly consist of laptops, desktops, printers, servers, and IP phones, all of which are susceptible to malware-related attacks.
Network infrastructure - LAN infrastructure devices interconnect endpoints and typically include switches, wireless devices, and IP telephony devices. Most of these devices are susceptible to LAN-related attacks including MAC address table overflow attacks, spoofing attacks, DHCP related attacks, LAN storm attacks, STP manipulation attacks, and VLAN attacks.

Signature-based - This approach recognizes various characteristics of known malware files.
Heuristics-based - This approach recognizes general features shared by various types of malware.
Behavior-based - This approach employs analysis of suspicious behavior.

Advanced Encryption Standard (AES) AES is a popular and recommended symmetric encryption algorithm. It offers combinations of 128-, 192-, or 256-bit keys to encrypt 128, 192, or 256 bit-long data blocks.
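
A minimal sketch of AES encryption with the Python cryptography library is shown below; the 256-bit key length and the GCM mode of operation are illustrative choices, and the plaintext is a placeholder.

import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # the same pre-shared key is used by both parties
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # a unique nonce is required for every message

ciphertext = aesgcm.encrypt(nonce, b"confidential VPN payload", None)
plaintext = aesgcm.decrypt(nonce, ciphertext, None)   # decryption uses the identical key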

Software-Optimized Encryption Algorithm (SEAL) SEAL is a faster alternative symmetric encryption algorithm to AES. SEAL is a stream cipher that uses a 160-bit encryption key and has a lower impact on the CPU compared to other software-based algorithms.

Note: DES is a legacy algorithm and should not be used. 3DES should be avoided if possible.

Symmetric algorithms use the same pre-shared key to encrypt and decrypt data. A pre-shared key, also called a secret key, is known by the sender and receiver before any encrypted communications can take place.

There are a number of HIDS products on the market today. Most of them utilize software on the host and some sort of centralized security management functionality that allows integration with network security monitoring services and threat intelligence. Examples are Cisco AMP, AlienVault USM, Tripwire, and Open Source HIDS SECurity (OSSEC).

The OSSEC server, or Manager, can also receive and analyze alerts from a variety of network devices and firewalls over syslog. OSSEC monitors system logs on hosts and also conducts file integrity checking. OSSEC can detect rootkits and other malware, and can also be configured to run scripts or applications on hosts in response to event triggers.

An attack surface is the total sum of the vulnerabilities in a given system that are accessible to an attacker. The attack surface can consist of open ports on servers or hosts, software that runs on internet-facing servers, wireless network protocols, and even users.

The SANS Institute describes three components of the attack surface:
Network Attack Surface - The attack exploits vulnerabilities in networks. This can include conventional wired and wireless network protocols, as well as other wireless protocols used by smartphones or IoT devices. Network attacks also exploit vulnerabilities at the network and transport layers.
Software Attack Surface - The attack is delivered through exploitation of vulnerabilities in web, cloud, or host-based software applications.
Human Attack Surface - The attack exploits weaknesses in user behavior. Such attacks include social engineering, malicious behavior by trusted insiders, and user error.

The Common Vulnerability Scoring System (CVSS) is a risk assessment tool that is designed to convey the common attributes and severity of vulnerabilities in computer hardware and software systems. The third revision, CVSS 3.0, is a vendor-neutral, industry standard, open framework for weighting the risks of a vulnerability using a variety of metrics. These weights combine to provide a score of the risk inherent in a vulnerability. The numeric score can be used to determine the urgency of the vulnerability, and the priority of addressing it.

The benefits of the CVSS can be summarized as follows:
It provides standardized vulnerability scores that should be meaningful across organizations.
It provides an open framework with the meaning of each metric openly available to all users.
It helps prioritize risk in a way that is meaningful to individual organizations.

The Forum of Incident Response and Security Teams (FIRST) has been designated as the custodian of the CVSS to promote its adoption globally. The Version 3 standard was developed with contributions by Cisco and other industry partners.

This represents the characteristics of a vulnerability that are constant over time and across contexts. It has two classes of metrics:
Exploitability - These are features of the exploit such as the vector, complexity, and user interaction required by the exploit.
Impact metrics - The impacts of the exploit are rooted in the CIA triad of confidentiality, integrity, and availability.

Interoperability between a PKI and its supporting services, such as Lightweight Directory Access Protocol (LDAP) and X.500 directories, is a concern because many CA vendors have proposed and implemented proprietary solutions instead of waiting for standards to develop. Note: LDAP and X.500 are protocols that are used to query a directory service, such as Microsoft Active Directory, to verify a username and password.

To address this interoperability concern, the IETF published the Internet X.509 Public Key Infrastructure Certificate Policy and Certification Practices Framework (RFC 2527). The X.509 version 3 (X.509 v3) standard defines the format of a digital certificate.

Risk Analysis This is a discipline in which analysts evaluate the risk posed by vulnerabilities to a specific organization. A risk analysis includes assessment of the likelihood of attacks, identifies types of likely threat actors, and evaluates the impact of successful exploits on the organization.

Vulnerability Assessment This test employs software to scan internet-facing servers and internal networks for various types of vulnerabilities. These vulnerabilities include unknown infections, weaknesses in web-facing database services, missing software patches, unnecessary listening ports, etc. Tools for vulnerability assessment include the open source OpenVAS platform, Microsoft Baseline Security Analyzer, Nessus, Qualys, and FireEye Mandiant services. Vulnerability assessment includes, but goes beyond, port scanning.

Asymmetric algorithms are used to provide confidentiality without pre-sharing a password. The confidentiality objective of asymmetric algorithms is initiated when the encryption process is started with the public key. The process can be summarized using the formula: Public Key (Encrypt) + Private Key (Decrypt) = Confidentiality

When the public key is used to encrypt the data, the private key must be used to decrypt the data. Only one host has the private key; therefore, confidentiality is achieved. If the private key is compromised, another key pair must be generated to replace the compromised key.
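
A minimal sketch of this confidentiality process with the Python cryptography library is shown below; the 2048-bit key size and OAEP padding are illustrative choices, and the message is a placeholder.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone can encrypt with the public key...
ciphertext = public_key.encrypt(b"confidential message", oaep)

# ...but only the holder of the private key can decrypt it:
# Public Key (Encrypt) + Private Key (Decrypt) = Confidentiality
plaintext = private_key.decrypt(ciphertext, oaep)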

Advanced Malware Protection (AMP) - This provides endpoint protection from viruses and malware.
Email Security Appliance (ESA) - This provides filtering of SPAM and potentially malicious emails before they reach the endpoint. An example is the Cisco ESA.
Web Security Appliance (WSA) - This provides filtering of websites and block listing to prevent hosts from reaching dangerous locations on the web. The Cisco WSA provides control over how users access the internet and can enforce acceptable use policies, control access to specific sites and services, and scan for malware.
Network Admission Control (NAC) - This permits only authorized and compliant systems to connect to the network.

Windows Defender Firewall - First included with Windows XP, Windows Firewall (now Windows Defender Firewall) uses a profile-based approach to firewall functionality. Access to public networks is assigned the restrictive Public firewall profile. The Private profile is for computers that are isolated from the internet by other security devices, such as a home router with firewall functionality. The Domain profile is the third available profile. It is chosen for connections to a trusted network, such as a business network that is assumed to have an adequate security infrastructure. Windows Firewall has logging functionality and can be centrally managed with customized group security policies from a management server such as System Center 2012 Configuration Manager.

Configuration management addresses the inventory and control of hardware and software configurations of systems. Secure device configurations reduce security risk.

With the advent of cloud data centers and virtualization, management of numerous servers presents special challenges. Tools like Puppet, Chef, Ansible, and SaltStack enable efficient management of servers that are used in cloud-based computing.

There are three Digital Signature Standard (DSS) algorithms that are used for generating and verifying digital signatures:
Digital Signature Algorithm (DSA) - DSA is the original standard for generating public and private key pairs, and for generating and verifying digital signatures.
Rivest-Shamir-Adleman Algorithm (RSA) - RSA is an asymmetric algorithm that is commonly used for generating and verifying digital signatures.
Elliptic Curve Digital Signature Algorithm (ECDSA) - ECDSA is a newer variant of DSA and provides digital signature authentication and non-repudiation with the added benefits of computational efficiency, small signature sizes, and minimal bandwidth.
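
A minimal sketch of generating and verifying an ECDSA signature with the Python cryptography library is shown below; the P-256 curve and the sample message are illustrative choices.

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.exceptions import InvalidSignature

private_key = ec.generate_private_key(ec.SECP256R1())   # signer's private key
public_key = private_key.public_key()                    # distributed in the signer's certificate

code = b"example executable contents"
signature = private_key.sign(code, ec.ECDSA(hashes.SHA256()))

try:
    public_key.verify(signature, code, ec.ECDSA(hashes.SHA256()))
    print("Signature valid: the code is authentic and unmodified")
except InvalidSignature:
    print("Signature invalid: the code was altered or not signed by this publisher")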

publish Public-Key Cryptography Standards (PKCS)

