Within the realm of desktop computing, Linux is supposedly no large player (market investigations estimate a market share of about 3%). However, you most likely know two or more people who use Linux, some of them even exclusively. If you take that into consideration, either you know the personal OS preference of over a hundred people (you're popular) or the estimate should be taken with a grain of salt.
Nevertheless, 3% is still a lot (have you ever thought about how many desktop systems there are? I have, though I never found the answer). And if we take other markets into account (embedded systems, servers, network appliances and more), the Linux share increases.
But still, many people have no idea what Linux is or how to work with it. In this book, I offer a technical, quick introduction to the Linux operating system from a user's perspective. I'm not going to dive into the concept, advantages or disadvantages of Free Software (although a few paragraphs don't hurt) and I'm not going to talk about the history and evolution of Linux operating systems. For more resources on these subjects, I refer you to the Further Resources section at the end of this chapter.
For a book about the Linux operating system to be complete, it is important to inform the reader about operating systems in general. Linux is very modular and open, meaning that each component in a Linux operating system is visible to the user. Without an understanding of the structure of an operating system, it would be hard for a user to comprehend the reasoning behind each module in a Linux OS. For this reason alone, I devote an entire section to operating systems in general.
Once I have covered the tasks of an operating system, I continue with an explanation of the real Linux operating systems: Linux distributions.
Finally, each chapter in this book offers a set of exercises that you can attempt to solve. You will not be able to find the answer to every question in this book. Rather, see the exercises as a means to push you further and help you seek (and find) more topics related to Linux. At the end of the book, a list of tips and/or answers is given for each question.
An operating system is actually a stack of software, each item designed for a specific purpose.
The kernel is the core of an operating system: it manages communication between devices and software, manages the system resources (like CPU time, memory, network, ...) and shields off the complexity of device programming from the developer as it provides an interface for the programmer to manipulate hardware.
The system libraries contain program methods for developers to write software for the operating system. The libraries contain methods for process creation and manipulation, file handling, network programming, etc. They are a vital part of an operating system because you can't (or shouldn't) communicate with the kernel directly: the library shields off the complexity of kernel programming from the system programmer.
The system tools are built using the system libraries and enable administrators to administer the system: manage processes, navigate on the file system, execute other applications, configure the network, ...
The development tools provide the means to build new software on (or for) the system. Although they are not a required part of an operating system, I do like to mention them because with Gentoo, they are a requirement (we'll see later on why this is the case). These tools include compilers (which translate code to machine code), linkers (which collect machine code and bring it together into a working binary) and tools that ease the build process considerably.
Other libraries on the system enhance the developers' coding experience by providing access to methods other developers have already written. Examples of such libraries include graphical libraries (for manipulating windows) or scientific libraries. They aren't required on every operating system, but if you want to run a specific tool, it will require certain libraries to be installed on your system. On top of those additional libraries you will find the end-user tools (office suites, multimedia tools, graphical environments, ...) that you want to install on your operating system.
A kernel has several responsibilities: device management, memory management, process management and handling system calls.
The first responsibility is called device management. A computer system has several devices connected to it: not only the CPU and memory are available, but also disks (and disk controllers), network cards, graphical cards, sound cards, ... Because every device operates differently, the kernel is required to know what the device can do and how to address and manipulate each device so that it plays nice in the entire system. This information is stored in the device driver: without such driver, the kernel doesn't know the device and will therefore not be able to control it.
Next to the device drivers, the kernel also manages the communication between the devices: it governs access to the shared components so that all drivers can happily operate next to each other. All communication must adhere to strict rules and the kernel makes sure that these rules are followed.
The memory management component manages the memory usage of the system: it keeps track of used and unused memory, assigns memory to processes that require it and ensures that processes can't manipulate each other's data. To do this, it uses the concept of virtual memory addresses: the addresses one process sees are not the real addresses, and the kernel keeps track of the correct mappings. It is also possible for data not to be really present in memory even though it is present for a process: such data is stored in swap space. Because swap space is much, much slower than real memory, use of this space should be limited to data that isn't read often.
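The kernel's memory bookkeeping is visible on a running Linux system through the /proc pseudo file system. A quick illustration (the exact numbers will of course differ per machine):

```shell
# Ask the kernel for its memory accounting: total and free physical
# memory, plus the size and usage of the swap space it manages.
grep -E '^(MemTotal|MemFree|SwapTotal|SwapFree):' /proc/meminfo
```

On a system without any swap space configured, SwapTotal simply reads 0 kB.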
To ensure that each process gets enough CPU time, the kernel gives priorities to processes and gives each of them a certain amount of CPU time before it stops the process and hands over the CPU to the next one. Process management not only deals with CPU time delegation (called scheduling), but also with security privileges, process ownership information, communication between processes and more.
Finally, for a kernel to actually work on a system, it must provide the means for the system and the programmer to control it and to give or receive information on which new decisions can be made. Using system calls, a programmer can write applications that query the kernel for information or ask the kernel to perform a certain task (for instance, manipulate some hardware and return the result). Of course, such calls must be safe to use so that malicious programmers can't bring the system down with a well-crafted system call.
A Linux operating system, like Gentoo Linux, uses Linux as the kernel.
Because a kernel can't do much on its own, it must be triggered to perform tasks. Such triggers are made by applications, but these applications must of course know how to place system calls for the kernel. Because each kernel has a different set of system calls available (they are very system specific), programmers have created standards to work with. Each operating system supports these standards, which are then translated to the system-specific calls for that operating system.
One example of such a standard is the C library, probably the most important system library available. This library makes quite vital operations available to the programmer, such as basic input/output support, string handling routines, mathematical methods, memory management and file operations. With these functions a programmer can create software that can be built on every operating system that supports the C library. These methods are then translated by the C library to the kernel-specific system calls (if system calls are necessary). This way the programmer doesn't need to know the kernel internals and can even write software (once) that can be built for many platforms.
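You can see this dependency for yourself on a running Linux system: the ldd tool lists the shared libraries a binary needs, and for virtually every system tool the C library shows up in that list. A small sketch (the exact paths and library versions differ per distribution):

```shell
# List the shared libraries that the "ls" system tool is linked against;
# on a glibc-based system, libc.so.6 (the C library) appears in the list.
ldd /bin/ls
```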
There is no single specification of what a system library is. The author of this book believes that system libraries are whatever libraries are part of the default, minimal install of an operating system. As such, the system libraries of one operating system (and even of one Linux distribution) can and will differ from the system libraries of another. Most Linux distributions have the same system libraries, which is to be expected because all Linux distributions can run the same software and this software is of course built against these system libraries. Some distributions just don't mark a certain library as part of the default, minimal install while others do.
Just like with system libraries there is no single specification for system tools. But, unlike system libraries, system tools are quite visible to the end user. Because of this, almost all Linux distributions use the same system tools, or similar tools with the same features but different implementations.
But what are system tools? Well, with a kernel and some programming libraries you can't manipulate your system yet. You need access to commands, input you give to the system that gets interpreted and executed. These commands do primitive stuff like file navigation (change directory, create/remove files, obtain file listings, ...), information manipulation (text searching, compression, listing differences between files, ...), process manipulation (launching new processes, getting process listings, exiting running processes, ...), privilege related tasks (changing ownership of files, changing user ids, updating file permissions, ...) and more.
If you don't know how to deal with all this stuff, you don't know how to work with your operating system. Some operating systems hide these tasks behind complex tools, others have simple tools for each task and bundle the power of all these tools. Unix (and Linux) is one of the latter. Linux systems usually have the GNU Core Utilities for most of these tasks.
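To give an idea of what these primitive tasks look like in practice, here is a small session sketch using only GNU Core Utilities commands (the directory and file names are, of course, arbitrary examples):

```shell
# Work in a throw-away directory so nothing on the real system is touched
cd "$(mktemp -d)"

mkdir documents                  # file navigation: create a directory
echo "hello linux" > notes.txt   # create a file with some content
ls                               # obtain a file listing
grep "linux" notes.txt           # information manipulation: text searching
chmod 600 notes.txt              # privilege task: owner-only permissions
rm notes.txt                     # remove the file again
```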
With the above three components you have a running, working operating system. You might not be able to do everything you want, but you can update your system until it does what you want. How? By installing additional tools and libraries until you have your functional system.
These additional tools and libraries are of course written by programmers and they must be able to build their code so that it works on your system. Some systems, like Gentoo Linux, even build this software for you instead of relying on the prebuilt software by others. To be able to build these tools, you need the source code of each tool and the necessary tools to convert the source code to executable files.
These tools are collectively called a tool chain: a set of tools that are used as in a chain in order to produce a working application. A general tool chain consists of a text editor (to write the code in), a compiler (to convert code to machine-specific language), a linker (which combines the machine code of several sources - including prebuilt "shared" libraries - into a single, executable file) and libraries (those I just mentioned as being "shared" libraries).
A tool chain is of the utmost importance for a developer; it is a vital development tool, but not the only development tool. For instance, developers of graphical applications generally need tools to create graphics as well, or even multimedia-related tools to add sound effects to their program. A development tool is a general noun for a tool that a developer would need in order to create something, but isn't vital for an operating system of an average non-developer user.
Once a developer has finished creating his product, you have an end-user tool with accompanying libraries (which might be required by other tools that are built on top of this product). These end tools are what makes a system unique for a user: they represent what a user wants to do with his system. Although not required by the operating system, they are required by the end user and are therefore very important for his system.
Most operating systems don't install all or most of the end-user tools because there are just too many to choose from. Some operating systems don't even provide the means to install end-user tools to their users but rely on the ingenuity of each programmer to create an installer that sets up the tool on the system. Other operating systems bring a small but considerable subset of end-user tools with them so that their users can quickly update their system to whatever shape they want without requiring a long and difficult search across the Internet (or even worse, computer/software shop) to find the software they need.
Examples of end-user tools are well known, such as office suites, graphic design tools, multimedia players, communication software, Internet browsers, ...
The GNU Project is an effort of several programmers and developers to create a free, Unix-like operating system. GNU is a recursive acronym that stands for GNU is Not Unix, because it is Unix-like but contains no Unix code and is (and remains) free. The GNU foundation, the legal entity behind the GNU project, sees free as more than just the financial meaning of free: the software should be free to use for any purpose whatsoever, free to study and modify the source code and behaviour, free to copy and free to distribute the changes you made.
This idea of free software is a noble thought that is active in many programmers' minds: hence many software titles are freely available. Software is generally accompanied by a license that explains what you can and cannot do with it (also known as the "End User License Agreement"). Free Software also has such a license - unlike the EULAs they actually allow most things instead of denying it. An example of such license is the GPL - GNU General Public License.
When we look at a Linux operating system, its core component is its kernel. The kernel all Linux operating systems use is the Linux kernel, or just Linux. Yes, that's right: the Linux operating system is named after the kernel, Linux.
Now although all Linux Operating Systems use Linux as their kernel, many of them use a different flavour. This is because the kernel development has several branches. The most important one I call the vanilla kernel. This kernel is the main development kernel where most kernel developers work on; every other kernel is based on this kernel. Other kernels introduce features that the vanilla kernel doesn't want yet (or has tossed away in favour of another feature); still, these kernels are fully compatible with the vanilla kernel.
The Linux kernel saw its first light in 1991, created (and still maintained) by Linus Torvalds. It grew rapidly (in 1994, version 1.0.0 saw the light) both in size (1.0.0 had more than 175000 lines of code) and in popularity. Over the years, its development model has stayed the same: there are a few major players in the development who decide what goes in and what stays out of the kernel code, but the majority of contributions come from several hundreds of developers (kernel 4.8 had contributions from more than 1500 individuals).
The latest kernel version at the time of writing is 4.8.15. The first two numbers play the role of the major version, the third number is the minor version (mostly bugfix releases). Sometimes a fourth number is added when a one-off bug fix was needed. The Linux kernel development generally increments the major numbers (most of the time the second number) for functional improvement releases: for every increment, users (and developers) know that the kernel has new features.
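You can check which kernel version your own system runs with uname; splitting the version string shows the numbering scheme described above (the version shown in the comment is just an example):

```shell
kver="$(uname -r)"                  # e.g. something like 4.8.15-gentoo
echo "Running kernel:  $kver"
echo "Major version:   ${kver%%.*}" # everything before the first dot
```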
Once a new version of the Linux kernel is released, it isn't distributed to all of its users. No, this is where distributions come into play...
If an end user wanted to install a Linux operating system without additional help, he would need to build a Linux kernel himself, build the components of the operating system (like the libraries, end tools, ...) and keep track of changes in the free software world (like new versions or security fixes). And although all this is perfectly possible (look for the Linux From Scratch project), most users want something that is a bit more... user friendly.
Enter distributions. A distribution project (like the Gentoo Project) is responsible for a Linux operating system (the distribution) to such an extent that, for the end user, the distribution project is the point of contact for his Linux installation.
Distribution projects make choices regarding the software:
How should the users install the operating system?
Perhaps users are encouraged to perform as many steps as possible during the installation process (the "distribution" Linux From Scratch probably has the most intensive installation process). The exact opposite is an installation CD or USB image that doesn't even require any configuration or installation: it just boots the environment and you're ready to start using the Linux distribution.
What installation options are there (CD, DVD, network, Internet, ... ?)
Most Linux distributions offer an installation CD/DVD as it is the most popular method for acquiring software. But many other installation options exist. You can install a distribution from a network using net booting (a popular approach in enterprise environments as it makes unattended installations possible) or from within another operating system.
What software should be available to the user?
Popular desktop Linux distributions offer a wide range of software to the end users. This allows the distribution to become widely accepted as it fits the needs of many users. However, more advanced distributions exist that focus on a particular market (like set-top boxes for multimedia presentations, firewalls and network management, home automation appliances, ...) and of course, these distributions offer different software titles to the users.
How is the available software built (specific system, features ...)?
If a distribution wants the software to run on as many processor types as possible (Pentium, i7, Athlon, Xeon, Itanium, ...) it needs to build the software for a generic platform (say i686) rather than for a specific one (Itanium). Of course, this means that the software doesn't use all features that new processors provide, but the software does run on many more systems.
The same is true for features supported by certain software titles. Some software titles offer optional support for ipv6, ssl, truetype fonts, ... but if you want it, you need to compile this support into the application. Distributions that offer software in a binary format (most distributions do) need to make this choice for their users. More often than not, they attempt to offer support for as many features as possible, but not all end users need or even want this.
Is internationalization of the software important?
Some distributions are targeting specific user groups tied to their language and geography. There are distributions that are fully localized to a specific group (say "Belgian Dutch-speaking users" or "Canadian French-speaking users"), but also distributions that try to offer localization for as many groups as possible.
How should users update and maintain their system?
Many distributions offer an automated software update process, but not all distributions offer a live upgrade process (where, once installed, your installation gradually builds up and becomes the latest version of that distribution without any specific actions). Some distributions even require you to boot from the latest installation CD and perform an upgrade step.
How would a user configure his system?
If you are a graphical Linux user, you definitely don't want to hear about configuration file editing or command-line actions to be taken. So you will most likely look for distributions that offer a full graphical interface to configure your system. But some users do like the idea of editing the configuration files directly as it offers the largest flexibility (but also the steepest learning curve), and distributions often cater to these preferences. Some distributions don't even allow you to update the configuration files directly as they (re)generate those files anyway (overwriting your changes).
What is the target user group of the distribution?
Most desktop distributions target home/office users, but there are distributions that target children or scientists. Some distributions are made for developers and others for elder people. There are distributions for visually impaired people and distributions for people without Internet access.
What policies does the distribution put on its software?
Organizations like the FSF have a vision of how the (software) world should look. Many distributions offer a way of implementing these visions. For instance, some distributions only allow software that is licensed under an FSF-approved license. Other distributions allow users to use non-free software. There are distributions that implement a stricter security vision in the distribution, offering a more hardened approach to operating systems.
Should the distribution be freely available?
Of course, money is often a major decision point as well. Not all distributions are freely downloadable / available on the Internet, although the majority is. But even when the distribution is freely available, it might still be necessary to obtain commercial support, even just for the security updates of the distribution.
You'll find several distributions in the world; each of those distribution projects answers these questions a bit differently from the others. Hence, choosing the right distribution is often a quest where you have to answer many questions before you find the correct distribution.
Of course, when you're starting with Linux, you probably don't have a strong opinion about these questions yet. That's okay because, if you want to start using Linux, you should start with the distribution of which you'll have the best support. Ask around, perhaps you have friends who might help you with Linux. And be honest, what better support is there than personal support?
A distribution is a collection of software (called packages) bundled together into a coherent set that creates a fully functional environment. The packages contain software titles (built by other projects) and possibly patches (updates) specific to the distribution so that the package integrates better with other packages or blends in better with the overall environment. These packages are usually not just copies of the releases made by the other software projects but contain a lot of logic to fit the software into the global vision of the distribution.
Take KDE for example. KDE is a (graphical) desktop environment which bundles several dozens of smaller tools together. Some distributions provide a pristine KDE installation to their users, others change KDE a bit so that it has a different default look and such.
Another example would be MPlayer, a multimedia player especially known for its broad support of various video formats. However, if you want to view Windows Media Video files (WMV), you need to build in support for the (non-free) win32 codecs. Some distributions provide MPlayer with support for these codecs, others without. Gentoo Linux lets you choose if you want this support or not.
When you want to use a distribution, you can (but you don't have to) use tools built by the distribution project to ease several tasks:
to install the distribution you can use one or more installation tools provided by the distribution project
to install additional packages on your system you can use one or more software management tools provided by the distribution project
to configure your system you can use one or more configuration tools provided by the distribution project
I cannot stress enough the importance of the term can. You don't have to use the distributions' installation tools (you can always install a distribution differently), you don't have to install software using the software management tools (you can always build and install your software manually) and you don't have to configure your system with the configuration tools (you can always edit the configuration files of the various applications by hand).
Why then does a distribution put all this effort in these tools? Because they make it a lot easier for the user to use his system. Take software installation as an example. If you don't use a software management tool, you need to build the software yourself (which can be different depending on the software you want to build), keep track of updates (both bug fixes and security fixes), make sure you have installed all the dependent software (software this depends on software that which depends on library a, b and c ...) and keep track of the installed files so that your system doesn't clutter up.
Another major addition distributions provide are the software packages themselves. A software package contains a software title (think of the Mozilla Firefox browser) with additional information (such as a description of the software title, category information, depending software and libraries, ...) and logic (how to install the software, how to activate certain modules it provides, how to create a menu entry in the graphical environments, how to build the software if it isn't built already, ...). This can result in a complex package, which is one of the reasons why distributions usually cannot provide a new package on the same day the software project itself releases a new version.
For security fixes however, most information and logic stays the same so security fix releases by the software project usually result in a quick security fix release of the software package by the distribution project.
documentation about the distribution
infrastructure where you can download the distribution and its documentation from
daily package updates for new software
daily security updates
support for the distribution (which can be in the form of forums, e-mail support, telephone support or even more commercial contractual support)
Now, a distribution project is more than all that. By bundling all packaging into a single project, developers can work together to build an operating system that goes beyond the "commercial-grade" operating systems. To do this, most distribution projects have divisions for public relations, user relations, developer relations, release management, documentation and translations, etc.
I haven't talked about architectures yet, but they are important nevertheless. Let me first define the concept of instruction sets.
An instruction set of a CPU is the set of commands that that particular CPU understands. These commands perform a plethora of functions such as arithmetic operations, memory operations and flow control. Programs can be written using these commands directly, but programmers usually use a higher-level programming language, because a program written in the CPU-specific language (called the assembly language of that CPU) can only run on that CPU. That, and assembly is so low-level that it is far from easy to write a tool in it. The tools that still use assembly language are compilers (which translate a higher-level programming language to assembly), boot loaders (which load an operating system into memory) and some core components of operating systems (the Linux kernel has some assembly code).
Now, every CPU type has a different instruction set. The Intel Pentium IV has a different instruction set than the Intel Pentium III; the Sun UltraSPARC III has a different instruction set than the Sun UltraSPARC IIIi. Still, their instruction sets are very similar. This is because they are in the same family. CPUs of the same family understand a particular instruction set. Software tools built for that instruction set run on all CPUs of that family, but cannot take advantage of the entire instruction set of the CPU they run on.
Families of CPUs are grouped in architectures. Architectures are global and represent the concept of an entire system; they describe how disks are accessed, how memory is handled and how the boot process is defined. These define the large, conceptual differences between systems. For instance, the Intel compatible range of systems is grouped in the x86 architecture; if you boot such a system, its boot process starts with the BIOS (Basic Input-Output System). Sun Sparc compatible systems are grouped in the sparc architecture; if you boot such a system, its boot process starts with the Boot PROM.
Architectures are important because Linux distributions often support multiple architectures and you should definitely know what architecture your system uses. It is most probably the x86 or amd64 architecture (the two are quite equivalent) but you should understand that other architectures exist as well. You will even find tools that are not supported for your architecture even though they are available for your distribution, or some packages will have the latest version available on one architecture and not yet on the others.
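Finding out the architecture of a running Linux system takes a single command:

```shell
# Print the machine hardware name of this system
uname -m   # e.g. x86_64 on an amd64 system, i686 on older x86, aarch64 on 64-bit ARM
```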
Linux is frequently hyped in the media - sometimes with reason, most of the time without. Although I discussed what Linux is previously, a quick recap:
Linux is a generic term referring to the Linux Operating System, a collection of tools running under the Linux kernel and most of the time offered through a Linux distribution project.
Of course, this is often not clear to users unfamiliar with the world beyond Microsoft Windows. Although the best way of discovering what Linux is, is by using Linux, I feel it is important to debunk some myths before I continue with the rest of the book.
A myth is a story which is popular, but not true. Myths surrounding Linux will always exist. The next few sections try to offer my ideas behind many of these myths...
It is always possible for someone to point to a Linux distribution that is difficult to install. The Linux From Scratch "distribution" is actually a document explaining the entire process of setting up a Linux distribution by building compilers, building software, placing files, etc. Yes, this is hard, and it would be even harder if the documentation weren't kept up to date.
However, many distributions (most of them even) are simple to install. They offer the same installation approach as other operating systems (including Microsoft Windows) together with online help (on-screen help) and offline help (installation guides). Some distributions can even be installed with as little as two or three questions, and you can even use Linux without having to install it at all.
There were days that Linux had no commercial support, but that was in the previous century. You can now obtain the Linux operating system from major software vendors such as Novell or RedHat (with support), or use a freely downloadable Linux distribution and get a contract with a company that offers support for that distribution.
All distributions offer excellent free support as well (something I'll talk about in the next few chapters) and many have an active security follow-up, resulting in quick security fixes as soon as a vulnerability is found or reported. There is often no need for a desktop user to obtain commercial support as the freely available support channels offer a major advantage compared to some other, proprietary operating systems.
Actually, because it is free software, it is far more difficult for security holes to remain in the source code. There are too many eyes watching the source code, and many free software projects have a very active developer community that checks and rechecks source code changes over and over again before they are pushed to the end user.
The Linux kernel is not a graphical kernel, but the tools that run on top of the Linux kernel can be graphical. Even more, most distributions offer a full graphical interface for every possible aspect of the operating system: it boots graphically, you work graphically, you install software graphically, you even troubleshoot issues graphically. Although you can work with the command line exclusively, most distributions focus on the graphical environment.
This book is not a good counterexample to this myth, as it focuses on the command line. That, however, is due to the personal preference of the author.
For many Microsoft Windows titles, this is true. But there is almost certainly software available for Linux that offers the same features as the software you are referring to. Some software is even available natively for Linux: the popular browsers Firefox and Chrome are two examples, and the freely available office suite LibreOffice is another.
There are also compatibility layers and libraries that allow Microsoft Windows applications to run within Linux. I don't recommend relying on this software for every possible software title, though. It is more of a last resort for when you definitely require a certain software title but already perform the majority of your work within Linux.
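As a small illustration (my own sketch, not from the original text), the snippet below checks whether WINE, the best-known of these compatibility layers, is present before doing anything else. The program name in the comment (`program.exe`) is a placeholder, not a real file.

```shell
# Check whether the 'wine' command is available on this system.
# Actually running a Windows application would then look like:
#   wine program.exe     (program.exe is a placeholder name)
if command -v wine >/dev/null 2>&1; then
    wine --version
else
    echo "WINE is not installed on this system"
fi
```

Guarding with `command -v` first avoids a confusing "command not found" error on systems where WINE was never installed.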
This is also a myth. Linux is no more secure than Microsoft Windows or Apple's Mac OS X. Security is more than the sum of all vulnerabilities in software. It is based upon the competence of the user, the administrator, the configuration of the system and more.
Linux can be made very secure: there are distributions that focus intensively on security through additional settings, kernel configurations, software choices and more. But you don't need such a distribution to have a secure Linux system. A better approach is to read the security documentation of your distribution, update your system regularly, avoid running software you don't need and stay away from sites you know aren't legitimate.
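Regular updates go through your distribution's package manager, and the exact command differs per distribution. The sketch below (my own illustration, covering only a few common package managers) detects which one is present and prints the usual full-system update command; it deliberately does not run the upgrade itself.

```shell
# Detect a few common package managers and print the usual
# full-system update command for each. This is a sketch: it
# prints the command instead of running it.
if command -v apt-get >/dev/null 2>&1; then
    echo "Debian/Ubuntu: sudo apt-get update && sudo apt-get upgrade"
elif command -v dnf >/dev/null 2>&1; then
    echo "Fedora/RedHat: sudo dnf upgrade"
elif command -v pacman >/dev/null 2>&1; then
    echo "Arch: sudo pacman -Syu"
else
    echo "Package manager not recognized; consult your distribution's documentation"
fi
```

Most desktop distributions also ship a graphical update tool that wraps exactly these commands.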
Many groups describe Linux as fragmented because there are so many Linux distributions. However, a user of one distribution can easily work with users of other distributions (no issue here). A user of one distribution can also help users of other distributions, because their software is still the same (no issue here either). Moreover, software created on one distribution runs perfectly on another distribution (no issue here). The widespread availability of distributions is a strength, not a weakness, as it offers more choice (and more expertise) to the end user.
Perhaps people are referring to the various Linux kernel trees that exist. Yet all these trees are based upon the same mainline kernel (often called the "vanilla" kernel), and every time the mainline kernel releases a new version, these trees update their own code, so the branches never lag far behind. The additional trees exist for development purposes: patches for hardware not yet supported before they are merged into the mainline kernel, patches for specific virtualisation solutions that are otherwise incompatible or cannot be merged due to license issues, patches that are too intrusive and will take a while to stabilize, and so on.
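You can see which kernel a system actually runs, and often which distribution tree it came from, with the standard `uname` command. A small illustration (the example version string is hypothetical):

```shell
# Print the release string of the running kernel. On a distribution
# kernel this usually combines the mainline version with a
# distribution-specific suffix, e.g. "6.1.0-18-amd64" on Debian.
uname -r
```

The part before the first dash is the mainline version; anything after it identifies the distribution's own patched tree.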
Or perhaps people are referring to the various graphical environments (like KDE and GNOME). Yet they rarely mention the interoperability between those graphical environments (you can run KDE applications in GNOME and vice versa) or the standards that this diversity creates (standards for dealing with file formats, menu entries, object linking and more).
Controlled fragmentation is what Linux (and free software in general) offers. Controlled, because it is matched with open standards and free specifications that are well documented and that all software adheres to. Fragmented because the community wants to offer more choices to the end users.
Linux isn't an alternative, but a different operating system. There's a difference between the terms. Alternatives try to offer the same functionality and interface, but using different means. Linux is a different operating system, because it doesn't strive to offer the same functionality or interface of Microsoft Windows.
The fact that Linux is often used by people who have strong feelings about Microsoft does not make Linux itself anti-Microsoft. The Linux operating system wants nothing more than to be fully interoperable with any other operating system. Most software projects definitely want their software to run on any operating system, not only on Microsoft Windows or Linux.
Yet not all the information spread around consists of myths. Some points are real weaknesses that Linux still needs to work on.
True. Although there are many free software games around, many games are developed exclusively for Microsoft Windows, and not all of them can be run within Linux using compatibility layers like WINE (luckily, many can). It is hard to ask game developers to develop for Linux, as most of them focus their endeavours on libraries (like DirectX) that are only available for Microsoft Windows.
Recently, though, improvements have been made in this area. Valve has released Steam for Linux, bringing its gaming platform to Linux desktops. This gave a big boost to gaming on Linux.
However, another trend is also emerging: more and more games are released only on consoles, dropping the PC environment altogether. I personally don't know how games will evolve in the future, but I think that real action games will focus more on game consoles.
If the vendor of the hardware doesn't offer Linux drivers, it can take a while before support for that hardware is included in the Linux kernel. However, this is not a process spanning multiple years, but rather months. Chances are that a brand-new graphics card or sound card is supported within three to six months of its release.
The same is true for wireless network cards. Whereas this was a weakness previously, support for wireless network cards is now well integrated within the community. A major reason is that most vendors now officially support their wireless chipsets on Linux, offering drivers and documentation.
Create a list of Linux distributions you have heard of and check, for every one of them, how they perform in the fields you find important (for instance, availability of documentation, translations, support for specific hardware, multimedia, ...).
List 7 CPU architectures.
Why are new kernel releases not distributed to the end user immediately? What role do distributions play in this process?