NS 15 - 17


The keyspace increases exponentially: A 2-bit (2^2) key length = a keyspace of 4, because there are four possible keys (00, 01, 10, and 11). A 3-bit (2^3) key length = a keyspace of 8, because there are eight possible keys (000, 001, 010, 011, 100, 101, 110, 111). A 4-bit (2^4) key length = a keyspace of 16 possible keys. A 40-bit (2^40) key length = a keyspace of 1,099,511,627,776 possible keys.
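A quick way to check these numbers (a minimal Python sketch, not part of the course material) is to compute 2 raised to the key length:

    # Keyspace = 2^(key length in bits)
    for bits in (2, 3, 4, 40):
        print(f"{bits}-bit key -> {2 ** bits:,} possible keys")
    # 40-bit key -> 1,099,511,627,776 possible keys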

Algorithm Full Name: Advanced Encryption Standard
Timeline: Official standard since 2001
Type of Algorithm: Symmetric
Key Size (in bits): 128, 192, and 256
Speed: High
Time to Crack (assuming a computer could try 2^55 keys per second): 149 trillion years
Resource Consumption: Low

Asymmetric algorithms, also called public-key algorithms, are designed so that the key that is used for encryption is different from the key that is used for decryption. The decryption key cannot, in any reasonable amount of time, be calculated from the encryption key and vice versa.

Asymmetric algorithms use a public key and a private key. Both keys are capable of the encryption process, but the complementary paired key is required for decryption. The process is also reversible. Data that is encrypted with the public key requires the private key to decrypt. Asymmetric algorithms achieve confidentiality and authenticity by using this process.
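As an illustration of this pairing (a hedged sketch, assuming the third-party Python cryptography package is installed; the message bytes are hypothetical), the public key encrypts and only the matching private key can decrypt:

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    public_key = private_key.public_key()

    # Anyone holding the public key can encrypt...
    ciphertext = public_key.encrypt(
        b"FLANK EAST ATTACK AT DAWN",
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )

    # ...but only the holder of the private key can decrypt.
    plaintext = private_key.decrypt(
        ciphertext,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    assert plaintext == b"FLANK EAST ATTACK AT DAWN"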

The rule "the longer the key, the better" is valid, except where performance is a concern. Shorter keys mean faster processing but less security; longer keys mean slower processing but more security.

Asymmetric and symmetric encryption are the two classes of encryption used to provide data confidentiality. These two classes differ in how they use keys.

A more scientific approach is to use the fact that some characters in the English alphabet are used more often than others. This method is called frequency analysis.
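For example, a short Python sketch (the ciphertext shown is a hypothetical Caesar-shifted sample) can tally ciphertext letter frequencies for comparison against typical English letter frequencies:

    from collections import Counter

    ciphertext = "IODQN HDVW DWWDFN DW GDZQ"   # hypothetical sample ciphertext
    counts = Counter(ch for ch in ciphertext if ch.isalpha())
    print(counts.most_common(3))   # the most frequent ciphertext letters likely map to E, T, A, ...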

Cryptanalysis is often used by governments in military and diplomatic surveillance, by enterprises in testing the strength of security procedures, and by malicious hackers in exploiting weaknesses in websites.

Cont...
Origin Authentication - Guarantees that the message is not a forgery and actually comes from whom it states. Many modern networks ensure authentication with algorithms such as hash-based message authentication code (HMAC).
Data Confidentiality - Guarantees that only authorized users can read the message. If the message is intercepted, it cannot be deciphered within a reasonable amount of time. Data confidentiality is implemented using symmetric and asymmetric encryption algorithms.
Data Non-Repudiation - Guarantees that the sender cannot repudiate, or refute, the validity of a message sent. Non-repudiation relies on the fact that only the sender has the unique characteristics or signature for how that message is treated.

Cryptography can be used almost anywhere that there is data communication. In fact, the trend is toward all communication being encrypted.

On average, an attacker has to search through half of the keyspace before the correct key is found. The time that is needed to accomplish this search depends on the computer power that is available to the attacker.

Current key lengths can render such an attempt futile, because the search would take millions or billions of years to complete when a sufficiently long key is used.
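The arithmetic behind such estimates can be sketched in a few lines of Python (assuming, as in the AES table above, an attacker testing 2^55 keys per second against a 128-bit key):

    keys_per_second = 2 ** 55            # assumed attacker speed
    expected_tries = 2 ** 127            # on average, half of the 2^128 keyspace
    seconds = expected_tries / keys_per_second
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{years:,.0f} years")         # on the order of 149 trillion years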

The most popular symmetric encryption algorithm is the Advanced Encryption Standard (AES). Symmetric encryption algorithms are based on the premise that each communicating party knows the pre-shared key.

Data confidentiality can also be ensured using asymmetric algorithms, including Rivest, Shamir, and Adleman (RSA) and the public key infrastructure (PKI). Asymmetric encryption algorithms are based on the assumption that the two communicating parties have not previously shared a secret and must establish a secure method to do so.

There are two primary methods for validating a source in network communications: authentication services and data nonrepudiation services. Authentication guarantees that a message comes from the source that it claims to come from. Authentication is similar to entering a secure personal identification number (PIN) for banking at an ATM, as shown in the figure. The PIN should only be known to the user and the financial institution. The PIN is a shared secret that helps protect against forgeries.

Data integrity ensures that messages are not altered in transit. With data integrity, the receiver can verify that the received message is identical to the sent message and that no manipulation occurred.

Asymmetric algorithms are substantially slower than symmetric algorithms. Their design is based on computational problems, such as factoring extremely large numbers or computing discrete logarithms of extremely large numbers.

Diffie-Hellman (DH) 512, 1024, 2048, 3072, 4096 The Diffie-Hellman algorithm allows two parties to agree on a key that they can use to encrypt messages they want to send to each other. The security of this algorithm depends on the assumption that it is easy to raise a number to a certain power, but difficult to compute which power was used given the number and the outcome.

Combining the two asymmetric encryption processes provides message confidentiality, authentication, and integrity.

Diffie-Hellman (DH) is an asymmetric mathematical algorithm that allows two computers to generate an identical shared secret without having communicated before. The new shared key is never actually exchanged between the sender and receiver. However, because both parties know it, the key can be used by an encryption algorithm to encrypt traffic between the two systems.
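The idea can be shown with a toy Python sketch using deliberately small, insecure numbers (real DH uses the large moduli of the DH groups listed above):

    # Toy Diffie-Hellman: p and g are public; a and b are each side's private value.
    p, g = 23, 5
    a, b = 6, 15
    A = pow(g, a, p)             # Alice sends A = g^a mod p
    B = pow(g, b, p)             # Bob sends B = g^b mod p
    alice_secret = pow(B, a, p)  # Alice computes B^a mod p
    bob_secret = pow(A, b, p)    # Bob computes A^b mod p
    assert alice_secret == bob_secret   # identical shared secret, never transmitted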

Digital signatures are a mathematical technique used to provide authenticity, integrity, and nonrepudiation. Digital signatures have specific properties that enable entity authentication and data integrity. Digital signatures use asymmetric cryptography.

Digital signatures are commonly used in the following two situations:
Code signing - This is used for data integrity and authentication purposes. Code signing is used to verify the integrity of executable files downloaded from a vendor website. It also uses signed digital certificates to authenticate and verify the identity of the site that is the source of the files.
Digital certificates - These are similar to a virtual ID card and are used to authenticate the identity of a system with a vendor website and establish an encrypted connection to exchange confidential data.
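A signing and verification round trip can be sketched as follows (assuming the third-party Python cryptography package; the file contents are hypothetical):

    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives import hashes

    signer_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    data = b"contents of a downloaded executable"   # hypothetical file contents

    signature = signer_key.sign(
        data,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )

    # verify() raises InvalidSignature if the data or the signature was altered.
    signer_key.public_key().verify(
        signature,
        data,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                    salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )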

ElGamal 512 - 1024 An asymmetric key encryption algorithm for public-key cryptography which is based on the Diffie-Hellman key agreement. A disadvantage of the ElGamal system is that the encrypted message becomes very big, about twice the size of the original message, and for this reason it is only used for small messages such as secret keys.

Elliptic curve techniques 224 or higher Elliptic curve cryptography can be used to adapt many cryptographic algorithms, such as Diffie-Hellman or ElGamal. The main advantage of elliptic curve cryptography is that the keys can be much smaller.

The certificate enrollment process is used by a host system to enroll with a PKI. To do so, CA certificates are retrieved in-band over a network, and the authentication is done out-of-band (OOB) by telephone.

Here are two of the most common methods of revocation:
Certificate Revocation List (CRL) - A list of the serial numbers of certificates that have been revoked and are therefore no longer valid. PKI entities regularly poll the CRL repository to receive the current CRL.
Online Certificate Status Protocol (OCSP) - An internet protocol used to query an OCSP server for the revocation status of an X.509 digital certificate. Revocation information is immediately pushed to an online database.

The Enigma machine was an electromechanical encryption device that was developed and used by Nazi Germany during World War II. The device depended on the distribution of pre-shared keys that were used to encrypt and decrypt messages. The Enigma ciphers were broken by the Allies, and numerous Enigma-encoded messages were decoded during the war. This provided a significant advantage to the Allies and is estimated to have greatly shortened the war and saved many lives.

In transposition ciphers, no letters are replaced; they are simply rearranged. An example of this type of cipher is taking the FLANK EAST ATTACK AT DAWN message and transposing it to read NWAD TA KCATTA TSAE KNALF. Modern encryption block cipher algorithms, such as AES and the legacy 3DES, still use transposition as part of the algorithm.

Because neither party has a shared secret, very long key lengths must be used. Asymmetric encryption can use key lengths between 512 and 4,096 bits. Key lengths greater than or equal to 2,048 bits can be trusted, while key lengths of 1,024 bits or shorter are considered insufficient.

Internet Key Exchange (IKE) - This is a fundamental component of IPsec VPNs.
Secure Sockets Layer (SSL) - This is now implemented as the IETF standard Transport Layer Security (TLS).
Secure Shell (SSH) - This protocol provides a secure remote access connection to network devices.
Pretty Good Privacy (PGP) - This computer program provides cryptographic privacy and authentication. It is often used to increase the security of email communications.

In a hierarchical CA topology, the highest-level CA is called the root CA. It can issue certificates to end users and to subordinate CAs. The sub-CAs could be created to support various business units, domains, or communities of trust. The root CA maintains the established "community of trust" by ensuring that each entity in the hierarchy conforms to a minimum set of practices. The benefits of this topology include increased scalability and manageability. This topology works well in most large organizations. However, it can be difficult to determine the chain of the signing process.

Interoperability between a PKI and its supporting services, such as Lightweight Directory Access Protocol (LDAP) and X.500 directories, is a concern because many CA vendors have proposed and implemented proprietary solutions instead of waiting for standards to develop. Note: LDAP and X.500 are protocols that are used to query a directory service, such as Microsoft Active Directory, to verify a username and password.

Key Verification - Almost all cryptographic algorithms have some weak keys that should not be used. With the help of key verification procedures, weak keys can be identified and regenerated to provide a more secure encryption.

Key Exchange - Key management procedures should provide a secure key exchange mechanism that allows secure agreement on the keying material with the other party, probably over an untrusted medium.

Key management is often considered the most difficult part of designing a cryptosystem. Many cryptosystems have failed because of mistakes in their key management, and all modern cryptographic algorithms require key management procedures. In practice, most attacks on cryptographic systems are aimed at the key management level, rather than at the cryptographic algorithm itself.

Key Generation - In a modern cryptographic system, key generation is usually automated and not left to the end user. The use of good random number generators is needed to ensure that all keys are equally likely to be generated, so that the attacker cannot predict which keys are more likely to be used.

Key Storage - On a modern multi-user operating system that uses cryptography, a key can be stored in memory. This presents a possible problem when that memory is swapped to the disk, because a Trojan horse program installed on the PC of a user could then have access to the private keys of that user.

Key Lifetime - Using short key lifetimes improves the security of legacy ciphers that are used on high-speed connections. In IPsec a 24-hour lifetime is typical. However, changing the lifetime to 30 minutes improves the security of the algorithms.

With hash functions, it is computationally infeasible for two different sets of data to come up with the same hash output. Furthermore, the hash value changes every time the data is changed or altered. Because of this, cryptographic hash values are often called "digital fingerprints". These fingerprints can be used to detect duplicate data files, file version changes, and similar applications. These values are used to guard against an accidental or intentional change to the data, or accidental data corruption.

Mathematically, the equation h = H(x) is used to explain how a hash algorithm operates. A hash function H takes an input x and returns a fixed-size string hash value h. If a hash function is hard to invert, it is considered a one-way hash. Hard to invert means that given a hash value h, it is computationally infeasible to find an input x such that h = H(x).
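This behavior is easy to observe with Python's standard hashlib module (a sketch; the input strings are arbitrary):

    import hashlib

    h1 = hashlib.sha256(b"FLANK EAST ATTACK AT DAWN").hexdigest()
    h2 = hashlib.sha256(b"FLANK EAST ATTACK AT DUSK").hexdigest()
    print(len(h1))      # always 64 hex characters (256 bits), regardless of input size
    print(h1 == h2)     # False - a small change in x produces a completely different h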

Here are two examples of instances when DH is commonly used:
Data is exchanged using an IPsec VPN
SSH data is exchanged

Note: A DH key agreement can also be based on elliptic curve cryptography. DH groups 19, 20, and 24, which are based on elliptic curve cryptography, are also supported by Cisco IOS Software.

The higher the class number, the more trusted the certificate. Therefore, a class 5 certificate is trusted much more than a lower-class certificate.
Class 0 - Used for testing in situations in which no checks have been performed.
Class 1 - Used by individuals who require verification of email.
Class 2 - Used by organizations for which proof of identity is required.
Class 3 - Used for servers and software signing. Independent verification and checking of identity and authority is done by the certificate authority.
Class 4 - Used for online business transactions between companies.
Class 5 - Used for private organizations or government security.

Note: An enterprise can also implement PKI for internal use. PKI can be used to authenticate employees who are accessing the network. In this case, the enterprise is its own CA.

Symmetric encryption algorithms such as Data Encryption Standard (DES), 3DES, and Advanced Encryption Standard (AES) are based on the premise that each communicating party knows the pre-shared key. Data confidentiality can also be ensured using asymmetric algorithms, including Rivest, Shamir, and Adleman (RSA) and the public key infrastructure (PKI).

Note: DES is a legacy algorithm and should not be used. 3DES should be avoided if possible.

Some examples of Certificate Authorities are IdenTrust, DigiCert, Sectigo, GlobalSign, and GoDaddy. These CAs charge for their services. Let's Encrypt is a non-profit CA that offers certificates free of charge.

Note: Not all PKI certificates are directly received from a CA. A registration authority (RA) is a subordinate CA and is certified by a root CA to issue certificates for specific uses.

Threat actors can use SSL/TLS to introduce regulatory compliance violations, viruses, malware, data loss, and intrusion attempts in a network.

PKI-related issues that are associated with security warnings include:
Validity date range - The X.509v3 certificates specify "not before" and "not after" dates. If the current date is outside the range, the web browser displays a message. Expired certificates may simply be the result of administrator oversight, but they may also reflect more serious conditions.
Signature validation error - If a browser cannot validate the signature on the certificate, there is no assurance that the public key in the certificate is authentic. Signature validation will fail if the root certificate of the CA hierarchy is not available in the browser's certificate store.

Several types of cryptographic keys can be generated:
Symmetric keys - Can be exchanged between two routers supporting a VPN
Asymmetric keys - Are used in secure HTTPS applications
Digital signatures - Are used when connecting to a secure website
Hash keys - Are used in symmetric and asymmetric key generation, digital signatures, and other types of applications

Regardless of the key type, all keys share similar issues. Choosing a suitable key length is one issue. If the cryptographic system is trustworthy, the only way to break it is with a brute-force attack. If the keyspace is large enough, the search requires an enormous amount of time, making such an exhaustive effort impractical. The table summarizes the key length required to secure data for the indicated amount of time.

Software-Optimized Encryption Algorithm (SEAL) SEAL is a faster alternative symmetric encryption algorithm to AES. SEAL is a stream cipher that uses a 160-bit encryption key and has a lower impact on the CPU compared to other software-based algorithms.

Rivest ciphers (RC) series algorithms This algorithm was developed by Ron Rivest. Several variations have been developed, but RC4 was the most prevalent in use. RC4 is a stream cipher that was used to secure web traffic. It has been found to have multiple vulnerabilities which have made it insecure. RC4 should not be used.

Digital Signature Standard (DSS) and Digital Signature Algorithm (DSA) 512 - 1024 DSS specifies DSA as the algorithm for digital signatures. DSA is a public key algorithm based on the ElGamal signature scheme. Signature creation speed is similar to RSA, but is 10 to 40 times slower for verification.

Rivest, Shamir, and Adleman encryption algorithm (RSA) 512 to 2048 RSA is a public-key cryptography algorithm based on the current difficulty of factoring very large numbers. It is the first algorithm known to be suitable for signing as well as encryption. It is widely used in electronic commerce protocols and is believed to be secure given sufficiently long keys and the use of up-to-date implementations.

MD5 with 128-bit digest - Developed by Ron Rivest and used in a variety of internet applications, MD5 is a one-way function that produces a 128-bit hashed message. MD5 is considered to be a legacy algorithm and should be avoided and used only when no better alternatives are available. It is recommended that SHA-2 or SHA-3 be used instead.

SHA-1 - Developed by the U.S. National Security Agency (NSA) in 1995. It is very similar to the MD5 hash functions. Several versions exist. SHA-1 creates a 160-bit hashed message and is slightly slower than MD5. SHA-1 has known flaws and is a legacy algorithm.

SHA-2 - Developed by the NSA. It includes SHA-224 (224 bit), SHA-256 (256 bit), SHA-384 (384 bit), and SHA-512 (512 bit). If you are using SHA-2, then the SHA-256, SHA-384, and SHA-512 algorithms should be used whenever possible.

SHA-3 - SHA-3 is the newest hashing algorithm and was introduced by the National Institute of Standards and Technology (NIST) as an alternative and eventual replacement for the SHA-2 family of hashing algorithms. SHA-3 includes SHA3-224 (224 bit), SHA3-256 (256 bit), SHA3-384 (384 bit), and SHA3-512 (512 bit). The SHA-3 family are next-generation algorithms and should be used whenever possible.

For as long as there has been cryptography, there has been cryptanalysis. Cryptanalysis is the practice and study of determining the meaning of encrypted information (cracking the code), without access to the shared secret key. This is also known as codebreaking.

Several methods are used in cryptanalysis:
Brute-force method - The attacker tries every possible key knowing that eventually one of them will work.
Ciphertext method - The attacker has the ciphertext of several encrypted messages but no knowledge of the underlying plaintext.
Known-Plaintext method - The attacker has access to the ciphertext of several messages and knows something about the plaintext underlying that ciphertext.
Chosen-Plaintext method - The attacker chooses which data the encryption device encrypts and observes the ciphertext output.
Chosen-Ciphertext method - The attacker can choose different ciphertext to be decrypted and has access to the decrypted plaintext.
Meet-in-the-Middle method - The attacker knows a portion of the plaintext and the corresponding ciphertext.

A digital certificate is equivalent to an electronic passport. It enables users, hosts, and organizations to securely exchange information over the internet. Specifically, a digital certificate is used to authenticate and verify that a user who is sending a message is who they claim to be. Digital certificates can also be used to provide confidentiality for the receiver with the means to encrypt a reply.

The Public Key Infrastructure (PKI) consists of specifications, systems, and tools that are used to create, manage, distribute, use, store, and revoke digital certificates. The certificate authority (CA) is an organization that creates digital certificates by tying a public key to a confirmed identity, such as a website or individual. The PKI is an intricate system that is designed to safeguard digital identities from hacking by even the most sophisticated threat actors or nation states.

SSL - Secure web servers use X.509v3 for website authentication in the SSL and TLS protocols, while web browsers use X.509v3 to implement HTTPS client certificates. SSL is the most widely used certificate-based authentication.
IPsec - IPsec VPNs use X.509 certificates when RSA-based authentication is used for internet key exchange (IKE).
S/MIME - User mail agents that support mail protection with the Secure/Multipurpose Internet Mail Extensions (S/MIME) protocol use X.509 certificates.
EAP-TLS - Cisco switches can use certificates to authenticate end devices that connect to LAN ports using 802.1x between the adjacent devices. The authentication can be proxied to a central ACS via the Extensible Authentication Protocol with TLS (EAP-TLS).

The first step in the CA authentication procedure is to securely obtain a copy of the CA's public key. All systems that leverage the PKI must have the CA's public key, which is called the self-signed certificate. The CA public key verifies all the certificates issued by the CA and is vital for the proper operation of the PKI. Note: Only a root CA can issue a self-signed certificate that is recognized or verified by other CAs within the PKI.

The US Government Federal Information Processing Standard (FIPS) Publication 140-3 specifies that software available for download on the internet is to be digitally signed and verified.

The purpose of digitally signed software is to ensure that the software has not been tampered with, and that it originated from the trusted source as claimed. Digital signatures serve as verification that the code has not been tampered with by threat actors and malicious code has not been inserted into the file by a third party.

To ensure secure communications across both the public and private infrastructure, the network administrator's first goal is to secure the network infrastructure, including routers, switches, servers, and hosts. This can be accomplished using device hardening, AAA access control, ACLs, firewalls, monitoring threats using IPS, securing endpoints using Advanced Malware Protection (AMP), and enforcing email and web security using the Cisco Email Security Appliance (ESA) and Cisco Web Security Appliance (WSA).

There are three primary objectives of securing communications:
Authentication - This guarantees that the message is not a forgery and actually comes from the authentic source. Modern networks ensure authentication using hash message authentication code (HMAC).
Integrity - This guarantees that no one intercepted the message and altered it; similar to a checksum function in a frame. This is provided by implementing the SHA-2 or SHA-3 family of hash-generating algorithms.
Confidentiality - This guarantees that if the message is captured, it cannot be deciphered. This is provided using symmetric or asymmetric encryption algorithms.

Therefore, hashing is vulnerable to man-in-the-middle attacks and does not provide security to transmitted data. To provide integrity against man-in-the-middle attacks, origin authentication is also required.

To add origin authentication and integrity assurance, use a keyed-hash message authentication code (HMAC). HMAC uses an additional secret key as input to the hash function. Note: Other Message Authentication Code (MAC) methods are also used. However, HMAC is used in many systems including SSL, IPsec, and SSH.
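A minimal sketch with Python's standard hmac module (the shared key and message are hypothetical):

    import hashlib
    import hmac

    key = b"pre-shared-secret"                     # hypothetical shared secret key
    message = b"FLANK EAST ATTACK AT DAWN"
    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    # The receiver recomputes the HMAC with the same key and compares the tags.
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    print(hmac.compare_digest(tag, expected))      # True only if key and message match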

Gilbert Vernam was an AT&T Bell Labs engineer who, in 1917, invented, and later patented, the stream cipher. He also co-invented the one-time pad cipher. Vernam proposed a teletype cipher in which a prepared key consisting of an arbitrarily long, non-repeating sequence of numbers was kept on paper tape.

To decipher the ciphertext, the same paper tape key was again combined character by character, producing the plaintext. Each tape was used only once; hence, the name one-time pad. As long as the key tape does not repeat or is not reused, this type of cipher is immune to cryptanalytic attack. This is because the available ciphertext does not display the pattern of the key.
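The XOR-based one-time pad idea can be sketched in Python (assuming a truly random key as long as the message, used only once):

    import os

    plaintext = b"FLANK EAST ATTACK AT DAWN"
    key = os.urandom(len(plaintext))                       # key as long as the message
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
    assert recovered == plaintext                          # same key deciphers the message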

Public Key (Encrypt) + Private Key (Decrypt) = Confidentiality

When the public key is used to encrypt the data, the private key must be used to decrypt the data. Only one host has the private key; therefore, confidentiality is achieved. If the private key is compromised, another key pair must be generated to replace the compromised key.

A rail fence cipher can be used with a key of 3: the plaintext is written in a zigzag pattern across three rails and the ciphertext is then read off row by row, as sketched below.
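A minimal Python sketch of that three-rail transposition (illustrative only):

    def rail_fence_encrypt(text, rails=3):
        """Write the message in a zigzag across the rails, then read row by row."""
        rows = [[] for _ in range(rails)]
        row, step = 0, 1
        for ch in text.replace(" ", ""):
            rows[row].append(ch)
            if row == 0:
                step = 1
            elif row == rails - 1:
                step = -1
            row += step
        return "".join("".join(r) for r in rows)

    print(rail_fence_encrypt("FLANK EAST ATTACK AT DAWN"))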

When using the Vigenère cipher, if the message is longer than the key, the key is repeated. For example, SECRETKEYSECRETKEYSEC is required to encode FLANK EAST ATTACK AT DAWN with the key SECRETKEY.
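A short Python sketch of this key repetition and the resulting shift-based encoding (a simplified stand-in for the Vigenère table lookup):

    from itertools import cycle

    def vigenere_encrypt(plaintext, key):
        letters = [ch for ch in plaintext.upper() if ch.isalpha()]
        out = []
        for p, k in zip(letters, cycle(key.upper())):     # the key repeats as needed
            out.append(chr((ord(p) + ord(k) - 2 * ord("A")) % 26 + ord("A")))
        return "".join(out)

    print(vigenere_encrypt("FLANK EAST ATTACK AT DAWN", "SECRETKEY"))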

Cryptography is dynamic and always changing. A security analyst must maintain a good understanding of cryptographic algorithms and operations to be able to investigate cryptography-related security incidents.

Where can PKI be used by an enterprise? The following provides a short list of common uses of PKIs: SSL/TLS certificate-based peer authentication Secure network traffic using IPsec VPNs HTTPS Web traffic Control access to the network using 802.1x authentication Secure email using the S/MIME protocol Secure instant messaging Approve and authorize applications with Code Signing Protect user data with the Encryption File System (EFS) Implement two-factor authentication with smart cards Securing USB storage devices

Using a hash function is another way to ensure data confidentiality. A hash function transforms a string of characters into a usually shorter, fixed-length value or key that represents the original string. The difference between hashing and encryption is in how the data is stored.

With the hash function, after the data is entered and converted using the hash function, the plaintext is gone. The hashed data is simply there for comparison.

The purpose of encryption and hashing is to guarantee confidentiality so that only authorized entities can read the message.

A scytale is a device used to generate a transposition cipher. A strip of paper or other material is wrapped around a rod of a known diameter, the message is written along the rod, and the strip is then unwound; the letters appear scrambled until the strip is rewrapped around a rod of the same diameter.

3DES (Triple DES) This is the replacement for DES and repeats the DES algorithm process three times. It should be avoided if possible as it is scheduled to be retired in 2023. If implemented, use very short key lifetimes.

Advanced Encryption Standard (AES) AES is a popular and recommended symmetric encryption algorithm. It offers combinations of 128-, 192-, or 256-bit keys to encrypt 128, 192, or 256 bit-long data blocks.
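A hedged sketch of AES in one common authenticated mode (AES-GCM), assuming the third-party Python cryptography package is installed:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # 128, 192, or 256 bits are valid AES key sizes
    aesgcm = AESGCM(key)
    nonce = os.urandom(12)                      # unique per message

    ciphertext = aesgcm.encrypt(nonce, b"FLANK EAST ATTACK AT DAWN", None)
    plaintext = aesgcm.decrypt(nonce, ciphertext, None)
    assert plaintext == b"FLANK EAST ATTACK AT DAWN"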

PKIs can form different topologies of trust. The simplest is the single-root PKI topology.

Cross-certified CA topologies - This is a peer-to-peer model in which individual CAs establish trust relationships with other CAs by cross-certifying CA certificates. Users in either CA domain are also assured that they can trust each other. This provides redundancy and eliminates the single point of failure.

Today, symmetric encryption algorithms are commonly used with VPN traffic. This is because symmetric algorithms use less CPU resources than asymmetric encryption algorithms. This allows the encryption and decryption of data to be fast when using a VPN. When using symmetric encryption algorithms, like any other type of encryption, the longer the key, the longer it will take for someone to discover the key. Most encryption keys are between 112 and 256 bits. To ensure that the encryption is safe, a minimum key length of 128 bits should be used. Use a longer key for more secure communications.

Data Encryption Standard (DES) This is a legacy symmetric encryption algorithm. It uses a short key length that makes it insecure for most current uses.

It also guaranteed authenticity based on the unique signet ring impression.

Data confidentiality ensures privacy so that only the receiver can read the message. This can be achieved through encryption. Encryption is the process of scrambling data so that it cannot be easily read by unauthorized parties.

Asymmetric algorithms, such as the Rivest, Shamir, and Adleman (RSA) algorithm, run slowly due to their large key lengths.

If the data that is being protected is worth significantly more than the $1 million needed to acquire a cracking device, then another algorithm should be used.

Key Revocation and Destruction - Revocation notifies all interested parties that a certain key has been compromised and should no longer be used. Destruction erases old keys in a manner that prevents malicious attackers from recovering them.

Key length - Also called the key size, this is measured in bits. In this course, we will use the term key length.
Keyspace - This is the number of possibilities that can be generated by a specific key length.

With modern algorithms that are trusted, the strength of protection depends solely on the size of the key. Choose the key length so that it protects data confidentiality or integrity for an adequate period of time. Data that is more sensitive and needs to be kept secret longer must use longer keys.

Performance is another issue that can influence the choice of a key length. The estimated funding of the attacker should also affect the choice of key length.

Note: Longer keys are more secure; however, they are also more resource intensive. Caution should be exercised when choosing longer keys because handling them could add a significant load to the processor in lower-end products.

The DES weak keys are those that produce 16 identical subkeys. This occurs when the key bits are:
Alternating ones and zeros (0101010101010101)
Alternating F and E (FEFEFEFEFEFEFEFE)
E0E0E0E0F1F1F1F1
1F1F1F1F0E0E0E0E

The Caesar Cipher is a type of substitution cipher in which each letter is replaced by another letter that is a set number of places away in the alphabet.
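A minimal Python sketch of a Caesar shift of 3 (illustrative only):

    def caesar_encrypt(plaintext, shift=3):
        out = []
        for ch in plaintext.upper():
            if ch.isalpha():
                out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
            else:
                out.append(ch)
        return "".join(out)

    print(caesar_encrypt("FLANK EAST ATTACK AT DAWN"))   # IODQN HDVW DWWDFN DW GDZQ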

The Vigenère cipher is a type of polyalphabetic substitution cipher. It was considered unbreakable until 1863. To use the cipher, a key text is generated that repeats for the length of the message to be encrypted. A combination of the plaintext letter and the corresponding key letter is used to locate the ciphertext value for the letter in a Vigenère cipher table.

Integrity - MD5 (legacy), SHA
Authenticity - HMAC-MD5 (legacy), HMAC-SHA-256
Confidentiality - 3DES (legacy), AES, RSA and DSA

These are the four elements of secure communications: Data Integrity - Guarantees that the message was not altered. Any changes to data in transit will be detected. Integrity is ensured by implementing either of the Secure Hash Algorithms (SHA-2 or SHA-3). The MD5 message digest algorithm is still widely in use. However, it is inherently insecure and creates vulnerabilities in a network. Note that MD5 should be avoided.

Cryptography - the development and use of codes
Cryptanalysis - the breaking of those codes
Cryptology - the science of making and breaking secret codes

This means that the security of encryption lies in the secrecy of the keys, not the algorithm.

The authentication objective of asymmetric algorithms is initiated when the encryption process is started with the private key. The process can be summarized using the formula: Private Key (Encrypt) + Public Key (Decrypt) = Authentication

When the private key is used to encrypt the data, the corresponding public key must be used to decrypt the data. Because only one host has the private key, only that host could have encrypted the message, providing authentication of the sender.

Hashes are used to verify and ensure data integrity. They are also used to verify authentication. Hashing is based on a one-way mathematical function that is relatively easy to compute, but significantly harder to reverse.

A hash function takes a variable block of binary data, called the message, and produces a fixed-length, condensed representation, called the hash. The resulting hash is also sometimes called the message digest, digest, or digital fingerprint.

There are three Digital Signature Standard (DSS) algorithms that are used for generating and verifying digital signatures:
Digital Signature Algorithm (DSA) - DSA is the original standard for generating public and private key pairs, and for generating and verifying digital signatures.
Rivest-Shamir-Adleman Algorithm (RSA) - RSA is an asymmetric algorithm that is commonly used for generating and verifying digital signatures.
Elliptic Curve Digital Signature Algorithm (ECDSA) - ECDSA is a newer variant of DSA and provides digital signature authentication and non-repudiation with the added benefits of computational efficiency, small signature sizes, and minimal bandwidth.

public-key cryptography standards (PKCS)

