1. Linux Overview


File servers

A Linux file server is a machine that has been set up and configured to let other machines store and retrieve files to and from a central location. In addition, using a file server can simplify backups and security. Using SMB shares (through programs such as Samba) or the Network File System (NFS), a Linux file server can share files with other Linux systems, as well as with non-Linux systems such as Windows and Mac.
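
As a rough sketch of how a Samba share might be set up (the package name, share name, path, and service name below are examples that vary by distribution, not details given above):

  # Install Samba on a Debian/Ubuntu-style system
  sudo apt install samba

  # Example share definition added to /etc/samba/smb.conf
  # [shared]
  #     path = /srv/shared
  #     read only = no

  # Restart the Samba daemon so the new share is published
  sudo systemctl restart smbd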

VPN

A VPN (Virtual Private Network) can be installed on a Linux host and is a type of network that uses encryption to allow IP traffic to travel securely over the TCP/IP network. A VPN is primarily used to support secure communications over an untrusted network (for example, connecting two remote sites by means of the internet).

Database

A database is a structured set of data held in a computer, especially one that is accessible in various ways. In simpler terms, a database is an organized collection of various forms of data. The information stored in a database is typically organized into rows, columns, and tables. Database information is also indexed to make it easier to find the information required. Many open-source databases are available for Linux, allowing you to manage large amounts of data securely and with high performance. Many databases can be installed on a Linux system, such as MySQL, Apache Derby, and PostgreSQL.
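
As a hedged example (the package name and the database and table names are placeholders, not details from the text above), a MySQL server could be installed and a small database created like this:

  # Install the MySQL server package on a Debian/Ubuntu-style system
  sudo apt install mysql-server

  # Create an example database with one table using the command line client
  mysql -u root -p -e "CREATE DATABASE orders; CREATE TABLE orders.customers (id INT PRIMARY KEY, name VARCHAR(50));"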

Certificate authority

A digital certificate is an electronic document that can be used as proof of identification. For example, digital certificates are used between an end user and a bank to establish a trusted connection. As end users, we trust digital certificates because we trust the entities that create the digital certificates. The entities that create these certificates are called certificate authorities (CAs). A few of the best-known public certificate authorities include GeoTrust, Comodo, DigiCert, Thawte, VeriSign, and GoDaddy. These CAs require the person or company applying for a certificate (such as your bank) to provide documents and information that prove they are who they claim to be. At times, you may find that using digital certificates within your own organization can be beneficial. For example, when using VPNs, you could use a digital certificate for authentication instead of a pre-shared key. Digital certificates could also be useful for such things as your development and staging systems. Rather than paying a public certificate authority for digital certificates for your internal needs, you can configure a Linux system to be a certificate authority. One method of doing this is to use OpenSSL, a free, open-source library.
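
As a minimal sketch of the OpenSSL approach mentioned above (the file names, validity periods, and subject name are illustrative assumptions):

  # Generate a private key for the internal CA
  openssl genrsa -out ca.key 4096

  # Create a self-signed CA certificate valid for roughly ten years
  openssl req -x509 -new -key ca.key -days 3650 -out ca.crt -subj "/CN=Internal Example CA"

  # Sign a server's certificate signing request (server.csr) with the internal CA
  openssl x509 -req -in server.csr -CA ca.crt -CAkey ca.key -CAcreateserial -days 365 -out server.crt

The resulting ca.crt would then be distributed to the systems that need to trust certificates issued by this internal CA.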

Hardware interface

A key function of the operating system is to ensure that one application running on the system does not try to use an area of memory that is already in use by another application. The operating system is also responsible for ensuring that a single application does not monopolize CPU time and prevent other applications running on the system from using the CPU.

Mail server

A mail server is a computer that sends, receives, and stores email for users. When a user creates an email, he or she does so using a mail user agent (MUA), such as Evolution, Mozilla Thunderbird, or Mutt. The MUA must be configured to send and receive mail by means of a mail server, such as a Linux system where a mail transfer agent (MTA) has been installed. It is the MTA's responsibility to then either save the message so it can be downloaded by another local user or, using the internet, send the email to the destination MTA where it will be stored for download by the intended user. Some Linux distributions have a default MTA that can be configured and used. If one does not exist or you want to use a different email system, other MTAs can be downloaded and installed. A few common MTAs include Postfix and Qmail.
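
As a hedged example (the package name and the test address are assumptions, not from the text above), Postfix could be installed and exercised like this:

  # Install the Postfix MTA on a Debian/Ubuntu-style system
  sudo apt install postfix

  # Send a quick test message through the local MTA
  echo "Test message body" | sendmail user@example.com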

Name server

A name server resolves (or maps) fully qualified domain names (FQDNs), such as www.TestOut.com, to their respective IP addresses and IP addresses to their respective FQDNs. For example, this lets a user access the TestOut site from their web browser by entering https://www.TestOut.com instead of something like https://104.16.32.53. In many cases, you need to download and install name server software on your Linux system to enable the name server features. The Berkeley Internet Name Domain (BIND) software is one of the most widely used DNS server packages on the internet.
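
As a rough illustration (the package name is distribution-specific and the queried server address is an example), BIND could be installed and a lookup tested like this:

  # Install BIND on a Debian/Ubuntu-style system
  sudo apt install bind9

  # Ask the local name server to resolve an FQDN to an IP address
  dig @127.0.0.1 www.TestOut.com A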

Drag the server role to its proper description.
Server roles: SNMP, Name server, Web server, SSH
Descriptions:
A protocol used to communicate with and monitor network devices and servers.
A protocol used to securely log on to remote systems using encryption.
Resolves (or maps) fully qualified domain names (FQDNs) to IP addresses.
A program responsible for accepting HTTP (Hypertext Transfer Protocol) requests from clients.

SNMP - A protocol used to communicate with and monitor network devices and servers.
SSH - A protocol used to securely log on to remote systems using encryption.
Name server - Resolves (or maps) fully qualified domain names (FQDNs) to IP addresses.
Web server - A program responsible for accepting HTTP (Hypertext Transfer Protocol) requests from clients.

SSH (Secure Shell or Secure Socket Shell) is a protocol used to securely log onto remote systems using encryption. SSH is the most common way to access a remote Linux system. OpenSSH is an open-source implementation of the Secure Shell (SSH) protocol and is implemented on most Linux distributions by default. A web server is the program responsible for accepting HTTP (Hypertext Transfer Protocol) requests from web browsers or clients and, in turn, sending the clients the files that form webpages. For example, webpages often consist of HTML (Hypertext Markup Language) documents and linked objects, such as images. A machine that has been dedicated to performing this role is also called a web server. A name server resolves (or maps) fully qualified domain names (FQDNs), such as www.TestOut.com, to their respective IP addresses and IP addresses to their respective FQDNs. This lets a user access the TestOut site from their web browser by entering https://www.TestOut.com instead of something like https://104.16.32.53. The Simple Network Management Protocol (SNMP) is a protocol designed for managing complex networks and is used to communicate with and monitor network devices, servers, and other devices through the IP protocol. SNMP lets network hosts exchange configuration and status information. For example, SNMP can be used to remotely retrieve the operational statistics of a router or a firewall. On a Linux machine, SNMP runs as a daemon.

Proxy

A proxy is a computer that provides indirect internet access to the computers in your network. In most cases, a proxy server is installed on the same computer as the firewall. Proxy servers provide increased performance and security by blocking direct access between two networks, such as the corporate network and the internet. A proxy can be configured in a variety of ways, such as using SSH tunneling or installing an application on a system that has been configured as a web server.
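
As one small example of the SSH tunneling approach mentioned above (the user name, host name, and port are placeholders):

  # Open a local SOCKS proxy on port 1080 that forwards traffic through proxyhost
  ssh -D 1080 -N user@proxyhost

A browser on the local machine could then be pointed at localhost:1080 as its SOCKS proxy so its traffic is carried over the encrypted SSH connection.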

Web server

A web server is the program responsible for accepting HTTP (Hypertext Transfer Protocol) requests from web browsers or clients and, in turn, sending the clients the files that form webpages. For example, webpages often consist of HTML (Hypertext Markup Language) documents and linked objects, such as images. A machine that has been dedicated to performing this role is also called a web server. A few examples of Linux web server implementations include Apache HTTP Server, Nginx, Lighttpd, Apache Tomcat, and Monkey HTTP Daemon.
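
A minimal sketch of bringing up Apache on a YUM-based system (the httpd package name also appears in a question near the end of this section; the service name and test URL are shown here as examples):

  # Install and start the Apache web server
  sudo yum install httpd
  sudo systemctl start httpd

  # Confirm the server answers HTTP requests
  curl http://localhost/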

Logging

An important Linux role is the ability to capture a timeline of events that have taken place on the computer in the form of a file, which is referred to as a log file. The process of creating these logs is known as logging. Logging is enabled by default, and logs are often captured for such things as services, the Linux operating system, and applications. Logging is useful for troubleshooting, security, and evaluating server performance. You can configure a centralized logging server, making it easier to evaluate and use the logs created on many systems. Although log files can be stored in a variety of places, most logs are stored in the /var/log directory or a subdirectory thereof.
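
As a quick illustration (log file names and the service unit queried vary by distribution, so treat these as examples):

  # View the most recent entries in a common system log under /var/log
  sudo tail -n 20 /var/log/syslog

  # On systemd-based distributions, query the journal for one service
  journalctl -u sshd.service --since today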

Linux Server Roles

As a result of the continued popularity, growth, and development of Linux, it can now fulfill many server roles. In most cases, a server role is an application or process installed on a Linux server. In some cases, the Linux server may be dedicated to running a single server role, while at other times, several server roles may be running on the same server. The following sections summarize and explain several of the Linux server roles.

Linux Operating System Overview

As an operating system, Linux provides the following key functions of a computer:

Tim, a system administrator, wants to simplify the provisioning and disabling of user accounts. Which of the following server roles should Tim install and configure?

Authentication server Linux centralized authentication using an authentication server makes provisioning and disabling user accounts easier. A proxy is a computer that provides indirect internet access to the computers in your network. Linux containers give you the ability to run an application (with all of the necessary libraries, dependencies, and files) in an isolated environment known as an image or container. When a company has back-end servers that receive a significant amount of traffic (such as Netflix, Hulu, or Airbnb), response time to these servers can be improved through load balancers by distributing the workload across the available servers.

Drag the server role to its proper description.
Server roles: Proxy, Logging, VPN, Load balancer
Descriptions:
Capturing a timeline of events that have taken place on the computer in the form of a file.
A type of network that uses encryption to allow IP traffic to travel securely over the TCP/IP network.
Improves response time to back-end servers by distributing the workload across the available servers.
A computer that provides indirect internet access to the computers in your network.

Logging - Capturing a timeline of events that have taken place on the computer in the form of a file.
VPN - A type of network that uses encryption to allow IP traffic to travel securely over the TCP/IP network.
Load balancer - Improves response time to back-end servers by distributing the workload across the available servers.
Proxy - A computer that provides indirect internet access to the computers in your network.

Proxy: A proxy is a computer that provides indirect internet access to the computers in your network. In most cases, a proxy server is installed on the same computer as the firewall. Proxy servers provide increased performance and security by blocking direct access between two networks, such as the corporate network and the internet. A proxy can be configured in a variety of ways, such as using SSH tunneling or installing an application on a system that has been configured as a web server.
Logging: An important Linux role is the ability to capture a timeline of events that have taken place on the computer in the form of a file, referred to as a log file. The process of creating these logs is known as logging. Logging is enabled by default, and logs are often captured for such things as services, the Linux operating system, and applications. Logging is useful for troubleshooting, security, and evaluating server performance. If desired, you can configure a centralized logging server, making it easier to evaluate and use the logs created on many systems. Although log files can be stored in a variety of places, most logs are stored in the /var/log directory or a subdirectory thereof.
VPN: A VPN (Virtual Private Network) can be installed on a Linux host and is a type of network that uses encryption to allow IP traffic to travel securely over the TCP/IP network. A VPN is used primarily to support secure communications over an untrusted network, for example, connecting two remote sites by means of the internet.
Load balancer: When a company has back-end servers that receive a significant amount of traffic (such as Netflix, Hulu, or Airbnb), response time to these servers can be improved through load balancers by distributing the workload across the available servers. Although load balancers can be purchased as a hardware appliance, software can be installed on a Linux server, making it a load balancer. Three common Linux load balancers include Linux Virtual Server (a free and open-source project), Nginx, and HAProxy, all of which run on top of Linux. Some of the load balancer software is free, and some must be purchased.

Your company recently set up a VPN and wants to use a digital certificate for authentication instead of a pre-shared key. Which of the following server roles would allow the company to provide this functionality internally instead of using an external provider? Name server Certificate Authority SSH SNMP

Certificate Authority At times, you may find that using digital certificates within your own organization can be beneficial. For example, when using VPNs, you could use a digital certificate for authentication instead of a pre-shared key. Digital certificates could also be useful for such things as your development and staging systems. Rather than paying a public certificate authority for digital certificates for your internal needs, you can configure a Linux system to be a certification authority. One method for doing this is to use OpenSSL, a free open-source library. A name server resolves (or maps) fully qualified domain names (FQDNs), such as www.testout.com, to their respective IP addresses, and IP addresses to their respective FQDNs. The Simple Network Management Protocol (SNMP) is a protocol designed for managing complex networks and is used to communicate with and monitor network devices, servers, and more by means of the IP protocol. SSH (Secure Shell or Secure Socket Shell) is a protocol used to securely log onto remote systems using encryption. SSH is the most common way to access a remote Linux system.

Your company is running a critical business application. The executive team wants to ensure the server is available at all times, even in the event of a server failure. Which of the following server roles would be used to provide a failover server in the event of a system failure? SNMP Proxy Load balancer Clustering

Clustering Clustering is often used to create a failover system, a load balance system, or a parallel processing unit. A failover cluster means that if one system fails, the other servers will take over the load, giving end users uninterrupted access to the desired data. There are many options for building a Linux cluster, including using free open-source software (such as OpenHPC) or purchasing a commercial product. When a company has back-end servers that receive a significant amount of traffic (such as Netflix, Hulu, or Airbnb), response time to these servers can be improved through load balancers by distributing the workload across the available servers. A proxy is a computer that provides indirect internet access to the computers in your network. In most cases, a proxy server is installed on the same computer as the firewall. The Simple Network Management Protocol (SNMP) is a protocol designed for managing complex networks and is used to communicate with and monitor network devices, servers, and more by means of the IP protocol.

Your company develops applications to run on Linux systems. You currently have four teams, each working on a different aspect of the same application. Which of the following server roles would give you the BEST method for testing all team members' code without affecting your part of the project or your operating system and personal files? Clustering Database Load balancer Containers Monitoring

Containers Linux containers give you the ability to run an application (with all of the necessary libraries, dependencies, and files) in an isolated environment known as an image or container. Due to this isolation, multiple containers can run on the same host without affecting each other or the main operating system. All containers utilize and share the same operating system kernel of the host machine, making them very lightweight and fast. Containers are highly portable. When you move or copy a container from one host to another, all of the files and changes necessary to run the applications within the container are moved or copied with it. Moving a container to a new host does not impact the host operating system. Although Linux containers are extremely portable, they must be compatible with the underlying system. For example, x86 Linux systems run x86 containers and ARM Linux systems run ARM containers, but an x86 Linux system cannot run an ARM Linux container. With clustering, two or more servers are grouped together to make them work like one. Clustering is often used to create a failover system, a load balance system, or a parallel processing unit. Load balancers distribute the workload across available servers, which improves response time. Monitoring refers to the process of monitoring the essential Linux services, including operating system metrics, process state, logs, service state, and file system usage. It also refers to monitoring servers' availability. A database is a structured set of data held in a computer, especially one that is accessible in various ways. In simpler terms, a database is an organized collection of various forms of data.

Users are complaining that they are unable to connect to any servers or the internet. Based on the symptoms they describe, you suspect that the users are not being assigned the correct IP addresses. Which of the following server roles would be the BEST role to work with to correct this issue? Proxy DHCP VPN SNMP

DHCP The Dynamic Host Configuration Protocol (DHCP) centralizes IP address assignment management by allowing a server (such as a Linux server) to dynamically assign IP addresses to clients. DHCP also allows users who move from network to network to easily obtain an IP address appropriate for the subnet they are connected to. Since the users are not able to connect to each other as well as to the internet, the most likely cause is an issue with the DHCP server. A proxy is a computer that provides indirect internet access to the computers in your network. In most cases, a proxy server is installed on the same computer as the firewall. Although the proxy could be part of the problem, since the users are not able to communicate with each other, it would not be the only or main issue. The Simple Network Management Protocol (SNMP) is a protocol designed for managing complex networks and is used to communicate with and monitor network devices, servers, and other devices through the IP protocol. SNMP lets network hosts exchange configuration and status information. A VPN (Virtual Private Network) can be installed on a Linux host and is a type of network that uses encryption to allow IP traffic to travel securely over the TCP/IP network. Although the VPN could be part of the problem, since the users are not able to communicate with each other, it would not be the only or main issue.

Alex, a webmaster, is implementing an order processing system on the company's website. Which of the following server roles should Alex implement with the order processing application? Clustering Database VPN Monitoring

Database A database server should be implemented with the order processing application to store the data gathered by the application. Monitoring refers to the process of monitoring the essential Linux services, including such things as operating system metrics, process state, logs, service state, and file system usage. Clustering is often used to create a failover system, a load balance system, or a parallel processing unit. A failover cluster means that if one system fails, the other servers will take over the load, giving end users uninterrupted access to the desired data. A VPN (Virtual Private Network) is a type of network that uses encryption to allow IP traffic to travel securely over the TCP/IP network.

Embedded Linux

Embedded Linux is the process of embedding Linux within intelligent devices, such as automation and control equipment, smart TVs, smart phones, and tablets. To accomplish this, the operating system is customized so it only provides the functions required by that particular device, and all the remaining unnecessary elements of the Linux kernel are removed. Once that's done, the kernel itself is embedded in flash memory chips on the given device.

Your company uses both Linux desktops and Windows desktops. Which of the following server roles should be used to provide a central location for users of both operating systems to share files? Proxy Authentication server File servers Database

File servers A Linux file server is a machine that has been set up and configured to let other machines store and retrieve files to and from a central location. In addition, using a file server can simplify backups and security. Using SMB shares and a variety of programs such as Samba or Network File System, a Linux file server can share files with other Linux systems, as well as with non-Linux systems such as Windows and Mac. Linux centralized authentication (an authentication server) can be accomplished in many ways, depending on the Linux distribution being used. A proxy is a computer that provides indirect internet access to the computers in your network. In most cases, a proxy server is installed on the same computer as the firewall. A database is a structured set of data held in a computer, especially one that is accessible in various ways. In simpler terms, a database is an organized collection of various forms of data.

Linux and cloud computing

In cloud computing, the hardware, software, and/or network resources that have historically been implemented on-site are moved offsite. When a new Linux system is required, you can use an internet cloud provider to deploy the new Linux virtual machine using a hypervisor at their site. You then pay that provider a fee to access this virtual machine through your organization's network connection. This process is referred to as Infrastructure as a Service (IaaS). Other cloud computing options for Linux include:
Software as a Service (SaaS) - Provides access to software and data through the cloud.
Network as a Service (NaaS) - Provides network connectivity through the cloud.
Storage as a Service (STaaS) - Provides access to storage devices through the cloud.

A few desktop application examples include:

LibreOffice - A free office software suite for word processing, spreadsheets, and presentations.
Apache OpenOffice - A free office software suite for word processing, spreadsheets, and presentations.
GIMP - GIMP is an acronym for GNU Image Manipulation Program. It is a free and open-source image editor similar to Photoshop.
Lightworks - An editing tool available in free and for-purchase versions.

Containers

Linux containers give you the ability to run an application (with all of the necessary libraries, dependencies, and files) in an isolated environment known as an image or container. Due to this isolation, multiple containers can be run on the same host without affecting each other or the main operating system. All containers utilize and share the same operating system kernel of the host machine, making them very lightweight and fast. Containers are highly portable. When you move or copy a container from one host to another, all of the files and changes necessary to run the applications within the container are moved or copied with it. Moving a container to a new host does not impact the host operating system. Although Linux containers are extremely portable, they must be compatible with the underlying system. For example, x86 Linux systems run x86 containers and ARM Linux systems run ARM containers, but an x86 Linux system cannot run an ARM Linux container.
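
The text above does not name a specific container runtime; assuming Docker as one common example, the isolation it describes can be seen with a couple of commands (the image name and command are illustrative):

  # Download a small image and run a command inside an isolated container
  docker pull alpine
  docker run --rm alpine uname -m   # prints the machine architecture, e.g. x86_64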

Linux on mobile devices

Linux has nearly taken over the mobile device market in the form of the Android operating system. The current Android operating system is a specialized Linux distribution created by Google. It was designed primarily for touch screen mobile devices, such as smart phones and tablet computers. Android benefits include:
Cost - Since Android is based on the Linux kernel, it is much less expensive than other mobile device operating systems, like iOS or Windows RT.
Performance - Android performs extremely well on mobile devices.
Application (app) support - There are many apps available for Android devices. In most cases, these apps allow Android devices to provide the same functionality as the more expensive devices from Apple and Microsoft.

Other Linux Implementations

Linux is also useful in the following implementations:

Your company has been expanding the number of servers in the company's data center, and there is an increased need to gather metrics, watch process states, work with logs, and watch service states and file system usage. Which of the following server roles should be installed to provide this functionality? Monitoring Database Containers Logging

Monitoring Monitoring refers to the process of monitoring the essential Linux services, including such things as operating system metrics, process state, logs, service state, and file system usage. It also refers to monitoring servers' availability. Depending on the Linux distribution, monitoring information can often be gathered manually using command line monitoring tools, such as top, lsof, tcpdump, and vmstat. Web-based utilities (such as Monit and Nagios) can also be installed; these usually provide some type of user interface that makes seeing and analyzing the information easier. A database is a structured set of data held in a computer, especially one that is accessible in various ways. In simpler terms, a database is an organized collection of various forms of data. An important Linux role is the ability to capture a timeline of events that have taken place on the computer in the form of a file, which is referred to as a log file. Linux containers give you the ability to run an application (with all of the necessary libraries, dependencies, and files) in an isolated environment known as an image or container.

Monitoring

Monitoring refers to the process of monitoring the essential Linux services, including such things as operating system metrics, process state, logs, service state, and file system usage. It also refers to monitoring servers' availability. Depending on the Linux distribution, monitoring information can often be gathered manually using command line monitoring tools, such as top, lsof, tcpdump, and vmstat. Web-based utilities (such as Monit and Nagios) can also be installed; these usually provide some type of user interface that makes seeing and analyzing the information easier.
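
The command line tools named above can be used directly; as a short example (the directory passed to lsof is just an illustration):

  # Interactive view of CPU, memory, and running processes
  top

  # Report memory, CPU, and I/O statistics every 2 seconds, 5 times
  vmstat 2 5

  # List open files under /var/log and the processes holding them
  sudo lsof +D /var/log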

Authentication server

Most enterprise networks require centralized user authentication and access controls for all system resources. This is not only convenient for users, but also allows an administrator to monitor and audit user types and the type of access they have on each machine. It also makes provisioning and disabling user accounts easier. Linux centralized authentication (an authentication server) can be accomplished in many ways, depending on the Linux distribution being used. Some options include installing and using OpenLDAP (Lightweight Directory Access Protocol) or purchasing programs that aid in the installation and management of centralized authentication, such as FreeIPA Identity & Access Manager.
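
As a hedged sketch of the OpenLDAP option mentioned above (the package names and base DN are examples that differ by distribution and directory layout):

  # Install the OpenLDAP server (slapd) and client tools on a Debian/Ubuntu-style system
  sudo apt install slapd ldap-utils

  # Query the directory to confirm the server is answering
  ldapsearch -x -H ldap://localhost -b "dc=example,dc=com"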

Users are complaining that the clocks for their operating systems do not match the current time for the location in which they live. Which of the following server roles is BEST for correcting this issue? NTP DHCP Proxy SSH

NTP The Network Time Protocol (NTP) is used to synchronize the time on your Linux system with a centralized NTP server. A local NTP server on the network can be synchronized with an external timing source to keep all the servers in your organization in sync with an accurate time. NTP uses a hierarchy of clocks and computers for synchronizing the current time. SSH (Secure Shell or Secure Socket Shell) is a protocol used to securely log onto remote systems using encryption. The Dynamic Host Configuration Protocol (DHCP) centralizes IP address assignment management by allowing a server (such as a Linux server) to dynamically assign IP addresses to clients. A Proxy server is used to access the internet using a shared connection on a LAN and cache internet content. A proxy server provides internet access control among other features.

Which of the following server roles would you implement to provide services offered by CUPS and IPP? SSH Monitoring Print server Proxy

Print server When a company wants to make a printer available to multiple users over a network, this goal is typically accomplished using a print server. Print servers accept the print jobs from users and store them in a queue. When the appropriate printer is available, the job is sent from the queue to the printer. In addition, a print server makes printer queue and status information available to end users and network administrators. The Common UNIX Printing System, or CUPS, is the most common Linux printing system in use today. CUPS manages print jobs and queues and provides network printing using the standard Internet Printing Protocol (IPP). A proxy is a computer that provides indirect internet access to the computers in your network. In most cases, a proxy server is installed on the same computer as the firewall. SSH (Secure Shell or Secure Socket Shell) is a protocol used to securely log on to remote systems using encryption. Monitoring refers to the process of monitoring the essential Linux services, including such things as operating system metrics, process state, logs, service state, and file system usage.

Linux can be implemented in many different ways. Drag the implementation type to the definition that matches it BEST.
Implementation types: Linux virtualization, Linux and cloud computing, Linux on mobile devices, Embedded Linux
Definitions:
Running Linux and Windows on the same physical computer.
Infrastructure as a Service (IaaS).
Used by Google on many of the physical products it sells.
Manages intelligent devices, such as automation and control equipment.

Linux virtualization - Running Linux and Windows on the same physical computer.
Linux and cloud computing - Infrastructure as a Service (IaaS).
Linux on mobile devices - Used by Google on many of the physical products it sells.
Embedded Linux - Manages intelligent devices, such as automation and control equipment.

Linux on mobile devices: Linux has nearly taken over the mobile device market in the form of the Android operating system. The current Android operating system is a specialized Linux distribution created by Google. It was designed primarily for touch screen mobile devices, such as smart phones and tablet computers.
Linux virtualization: Virtualization is the ability to install and run multiple operating systems concurrently on a single physical machine. The Linux operating system can be virtualized.
Embedded Linux: Embedded Linux is the process of embedding Linux within intelligent devices, such as automation and control equipment, smart TVs, smart phones, and tablets. To accomplish this, the operating system is reworked and customized so that it provides only the functions required by that particular device, and all the remaining unnecessary elements of the Linux kernel are removed. Once that's done, the kernel itself is embedded in flash memory chips on the given device.
Linux and cloud computing: In cloud computing, the hardware, software, and/or network resources that have historically been implemented on-site are moved offsite. When a new Linux system is required, you can use an internet cloud provider to deploy the new Linux virtual machine using a hypervisor at their site. You then pay that provider a fee to access this virtual machine through your organization's network connection. This process is referred to as Infrastructure as a Service (IaaS).

SSH

SSH (Secure Shell or Secure Socket Shell) is a protocol used to securely log onto remote systems using encryption. SSH is the most common way to access a remote Linux system. OpenSSH is an open source implementation of the Secure Shell (SSH) protocol and implemented by default on most Linux distributions. Two major components of SSH include the SSH client and the SSH server. The SSH client is a program that is typically only run as needed. Once installed, the SSH server is a daemon that constantly runs in the background.
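
A simple usage example (the user name and host name are placeholders):

  # Log on to a remote Linux system over an encrypted connection
  ssh admin@server01.example.com

  # Copy a file to the remote system over the same protocol
  scp report.txt admin@server01.example.com:/tmp/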

Linux Key Components

See the following:

Linux Distributions

Simply put, a Linux distribution (also known as a distro) is a unique compilation of the Linux kernel (free and open to all), utilities, desktop environments, applications, and more. Since the Linux operating system is not produced by a single organization, different organizations combine the desired components they want to use, sometimes creating their own unique features, and then compile them into their own flavor of a Linux operating system or distribution. This distribution is then often made available at no cost or, in some cases (usually for server versions of Linux), for a fee. Individuals can also create their own distribution, but the process of compiling the software can be time consuming, and it is difficult to make all of the different programs work together properly. There are hundreds of distributions available. Some of the most popular include Mint, Ubuntu, Debian, Fedora, openSUSE, Red Hat Enterprise Linux, Oracle Linux, and CentOS.

Which of the following is the primary role of a mail transfer agent (MTA)? Store messages so they can be downloaded or send email to a destination MTA. Provide redundant online storage for the mail server. Control the bandwidth used by the mail user agent (MUA). Transfer mail to a print server queue.

Store messages so they can be downloaded or send email to a destination MTA. It is the MTA's responsibility to either save the message so it can be downloaded by another local user or, using the internet, send the email to the destination MTA, where it will be stored for download by the intended user. The MUA is the email client, such as Evolution, Mozilla Thunderbird, or Mutt. Redundant online storage and transferring email to a print server are not performed by the MTA.

DHCP

The Dynamic Host Configuration Protocol (DHCP) centralizes IP address assignment management by allowing a server (such as a Linux server) to dynamically assign IP addresses to clients. DHCP also allows users who move from network to network to easily obtain an IP address appropriate for the subnet they are connected to. The DHCP server and the client use broadcasts to communicate with each other. In many cases, you need to download and install the DHCP server software. For example, for an Ubuntu server, enter $ sudo apt install isc-dhcp-server at the command prompt. Configuration of the server can then be completed as needed.
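
After installation, a minimal subnet declaration in /etc/dhcp/dhcpd.conf might look like the following sketch (all addresses are examples):

  # Example subnet served by the DHCP server
  subnet 192.168.10.0 netmask 255.255.255.0 {
      range 192.168.10.100 192.168.10.200;      # addresses handed out to clients
      option routers 192.168.10.1;              # default gateway offered to clients
      option domain-name-servers 192.168.10.1;  # DNS server offered to clients
  }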

Linux kernel (Operating system)

The Linux kernel is the core of the Linux operating system. It is the actual operating system itself. It is the component that fulfills the key operating system duties listed in the Linux Operating System Overview section above. Linux also provides libraries. Libraries contain pre-written code elements that programmers can use within their programs, such as code for interfacing with a hard disk. For example, when a programmer needs to write data to a hard disk, the programmer does not need to know whether the machine has a SATA, IDE, or SCSI drive installed. Instead, the programmer simply calls the appropriate library and tells the operating system that it needs to write data to whatever hard drive is installed in the system, and the operating system takes care of the rest using its libraries.

Linux Development History

The Linux operating system had its start in 1991 when, as a graduate student at Finland's University of Helsinki, Linus Torvalds began a project that later became the Linux kernel. Linus based his version of Linux on a Unix-like system named MINIX, which was released by Andrew S. Tanenbaum. Linux version 0.02 was released in October of 1991 and consisted of the Linux kernel and three basic utilities:
A Bash shell providing a command line interface.
An update utility used for flushing the file system buffers.
A GCC (GNU Compiler Collection) compiler system allowing individuals to write their own programs.
The source code for the Linux operating system was shared as freeware on the internet, and others were encouraged to enhance it and make it better. At this point, Linux took on a life of its own, and it became a worldwide collaborative development project with no secrecy or tightly guarded copyrights. Access to the source code was open to anyone who wanted it. This collaborative development continued for several years until 1994, when Linux version 1.0 was released. The Linux kernel is licensed under the GNU General Public License (GPL), which requires the source code to remain freely available to anybody who wants it. GNU is a recursive acronym for "GNU's Not Unix!"

Utilities

The Linux operating system includes a wide variety of utilities that can complete operating system management tasks, such as creating files and maintaining file systems, editing text files, managing the applications that are running on the system, installing new applications on the system, etc.

User interfaces

The Linux operating system provides the end user with a means of interacting with the operating system: the user interface. Linux provides two different user interfaces:
Graphical user interface - A Linux graphical user interface (GUI) is similar to the GUIs used in other operating systems, such as Windows. When users want to complete a task, they can click buttons or navigate through menus to accomplish the desired task.
Text-based command line interface - A text-based interface (often referred to as a terminal) provides a place where the user can type commands. This is similar to the Windows Command Prompt and PowerShell. A Linux system administrator needs to know how to perform tasks from the text-based interface because most Linux servers disable the graphical user interface to better utilize the system's memory and processor.

NTP

The Network Time Protocol (NTP) is used to synchronize the time on your Linux system with a centralized NTP server. A local NTP server on the network can be synchronized with an external timing source to keep all the servers in your organization in sync with an accurate time. NTP uses a hierarchy of clocks and computers for synchronizing the current time. When there is a large difference between the time provider's time and the time on your local system, NTP can step the clock to quickly close the gap; smaller discrepancies are corrected gradually, with the local clock adjusted in small increments (the time source is typically polled about once every 60 seconds) until the time becomes synchronized.
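
How synchronization is checked depends on which NTP implementation the distribution ships; chrony and the classic ntpd are common choices (neither is named above), so the commands below are examples:

  # With chrony installed
  chronyc tracking

  # With classic ntpd installed
  ntpq -p

  # On systemd-based systems, a quick summary of clock synchronization status
  timedatectl status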

SNMP

The Simple Network Management Protocol (SNMP) is a protocol designed for managing complex networks and is used to communicate with and monitor network devices, servers, and more by means of the IP protocol. SNMP lets network hosts exchange configuration and status information. For example, SNMP can be used to remotely retrieve the operational statistics of a router or a firewall. On a Linux machine, SNMP runs as a daemon. In many cases, you need to download and install SNMP. For example, to install SNMP on a CentOS system, enter yum -y install net-snmp net-snmp-utils at the command prompt.
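
Once the daemon and utilities are installed, a device can be queried; in this sketch the target address and the community string "public" are assumptions for illustration:

  # Walk the system group (sysDescr and related objects) on a remote device using SNMP v2c
  snmpwalk -v2c -c public 192.168.1.1 1.3.6.1.2.1.1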

Data storage

The operating system is responsible for providing an efficient and reliable means for storing data. This is usually done using some type of storage device, like a hard disk drive formatted with a particular file system. The file system's job is to organize the information on the hard disk in an easily retrievable format.

Security

The operating system is responsible for providing some degree of security for the data that's stored on its storage devices. For example, the system administrator can create rules and assign permissions that determine who can access what information on the system.

Application platform

The operating system provides a platform where applications can run.

Network connectivity

The operating system provides some type of connectivity between computer systems over a network connection. This can be done using a variety of network media and interfaces, such as an Ethernet connection between computer systems. Other standards can also be used to create network connections, such as mobile broadband wireless or Wi-Fi.

1.1.4 Server Roles Facts

This lesson covers the following topic: Linux Server Roles

Linux virtualization

Virtualization is the ability to install and run multiple operating systems concurrently on a single physical machine. This is typically accomplished using a hypervisor. A hypervisor is a thin layer of software that resides between the guest operating system and the hardware. A hypervisor allows virtual machines to interact with the hardware without going through the host operating system. The Linux operating system can be virtualized. A key benefit of virtualization is a more efficient use of system resources. All of the available computing capacity of the system hardware is allocated and distributed among all the virtual machines running on the system. Another benefit of virtualization is the ability to run multiple platforms at the same time. For example, you can run Windows at the same time you are running Linux. This can be a real benefit for Linux software developers and testers. It also makes it much easier to test how an application being developed will perform on different platforms or different versions of a given operating system.

Linux Expansion and Growth

When Linux was first released, it was considered an experimental operating system; something you might experiment with in the lab but would probably never consider putting into a production environment. Since that time, things have changed dramatically, and Linux is now a mainstay operating system, especially in server rooms. Using the wide variety of network services that are now available for the Linux operating system, you can configure Linux to perform almost any networking role that any competing server operating system can perform, including file server, print server, database server, web server, and email server. Linux is also slowly becoming more popular as a desktop operating system due to the many applications that are currently available. Many of these applications are free.

Load balancer

When a company has back-end servers that receive a significant amount of traffic (such as Netflix, Hulu, or Airbnb), response time to these servers can be improved through load balancers by distributing the workload across the available servers. Although load balancers can be purchased as a hardware appliance, software can be installed on a Linux server, making it a load balancer. Three common Linux load balancers include Linux Virtual Server (a free and open-source project), Nginx, and HAProxy, all of which run on top of Linux. Some of the load balancer software is free, and some must be purchased.
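
As a minimal HAProxy sketch (the back-end addresses and names are placeholders; a real /etc/haproxy/haproxy.cfg would also contain global and defaults sections):

  frontend web_in
      bind *:80
      default_backend web_servers

  backend web_servers
      balance roundrobin
      server web1 10.0.0.11:80 check
      server web2 10.0.0.12:80 check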

Print server

When a company wants to make a printer available to multiple users over a network, this goal is typically accomplished using a print server. Print servers accept the print jobs from users and store them in a queue. When the appropriate printer is available, the job is sent from the queue to the printer. In addition, a print server makes printer queue and status information available to end users and network administrators. The Common UNIX Printing System, or CUPS, is the most common Linux printing system in use today. CUPS manages print jobs and queues and provides network printing using the standard Internet Printing Protocol (IPP).
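
As a rough example of working with CUPS and IPP from the command line (the package name, queue name, and printer URI are placeholders):

  # Install CUPS on a Debian/Ubuntu-style system
  sudo apt install cups

  # Add a network printer over IPP
  sudo lpadmin -p office_printer -E -v ipp://192.168.1.50/ipp/print -m everywhere

  # Show printer and queue status
  lpstat -p -o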

Clustering

With clustering, two or more servers are grouped together in a way that makes them work like one. Clustering is often used to create a failover system, a load balance system, or a parallel processing unit.

A technician has been given a work order to install the Apache web server on a system configured with a YUM repository. Which of the following commands will install the web server? yum install apache2 yum install httpd rpm -ivh apache2 dnf install httpd

yum install httpd The command yum install httpd is used to install Apache on a system using a YUM repository. dnf install httpd would work on systems where dnf is used instead of YUM. yum install apache2 will return "No package apache2 available." rpm -ivh apache2 will return "No such file or directory." The rpm command needs the full .rpm file name.

