
The GPU (Graphics Processing Unit), also called the graphics chip, is certainly the most important component of a video card. In a nutshell, it is a type of processor responsible for performing the calculations and routines that result in the images displayed on the computer's monitor.
As with CPUs, there is a wide variety of GPUs on the market, ranging from powerful models developed especially for processing complex 3D graphics (for games or filmmaking, for example) to simpler ones aimed at the low-cost computer market. Several companies manufacture GPUs, but the best-known names in the industry are NVIDIA, AMD (formerly ATI) and Intel, the first two being the most popular for more sophisticated graphics chips.
You may point out that you have seen video cards from other brands, such as Gigabyte, Asus, Zotac and XFX, among others. Note, however, that these companies manufacture the cards but do not produce the GPUs. It is up to them to mount the GPUs on their boards along with other resources, such as memory and connectors (subjects that will also be covered in this article). It is also worth noting that GPUs can be embedded directly on computer motherboards, in which case they are popularly called "onboard video cards".
Features of a GPU
The GPU was created to "relieve" the computer's main processor (CPU) of the heavy task of generating images. It is therefore capable of handling the large volume of mathematical and geometric calculations that 3D image processing requires (in games, computerized medical imaging, among other applications).
To generate images, the GPU executes a sequence of steps involving the construction of geometric elements, the application of colors, the insertion of effects and so on. Very briefly, this sequence consists of the GPU receiving a set of vertices (points that define the corners of geometric shapes in space); processing this information to obtain a geometric context; applying effects, colors and the like; and transforming all of this into elements formed by pixels (a pixel being the point that represents the smallest part of an image), a process known as rasterization. The next step is to send this information to the video memory (frame buffer) so that the final content can be displayed on the screen.
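The rasterization step described above can be sketched in a few lines of code. The example below is a deliberately simplified, software-only illustration (real GPUs do this in massively parallel hardware): given the 2D vertices of a triangle, it decides which pixels of a small frame buffer the triangle covers, using signed edge tests. All function names here are invented for the illustration.

```python
def edge(a, b, p):
    """Signed area test: positive if point p lies to the left of edge a->b."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def rasterize_triangle(v0, v1, v2, width, height):
    """Return the set of (x, y) pixels covered by the triangle v0-v1-v2."""
    covered = set()
    for y in range(height):
        for x in range(width):
            p = (x + 0.5, y + 0.5)  # sample at the pixel center
            w0 = edge(v1, v2, p)
            w1 = edge(v2, v0, p)
            w2 = edge(v0, v1, p)
            # the pixel is inside the triangle if all edge tests agree in sign
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x, y))
    return covered

# Rasterize one triangle into a tiny 10x8 "frame buffer"
pixels = rasterize_triangle((1, 1), (8, 2), (4, 7), 10, 8)
print(len(pixels), "pixels covered")
```

A real GPU runs this inside-or-outside decision for millions of pixels per frame, in parallel, which is exactly why dedicated hardware is needed.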
GPUs can rely on several resources to perform these steps, including:
– Pixel Shader: a shader is a program used to process image rendering effects. A Pixel Shader, therefore, is a program that generates effects on a per-pixel basis. This feature is widely used in 3D images (from games, for example) to produce lighting, reflection and shading effects;
– Vertex Shader: similar to the Pixel Shader, except that it works with vertices instead of pixels. A Vertex Shader is thus a program that operates on structures formed by vertices, that is, on geometric shapes. This feature is used to model the objects to be displayed;
– Render Output Unit (ROP): basically, it manipulates the data stored in video memory so that it "becomes" the set of pixels that will form the images shown on the screen. These units are responsible for applying filters, depth effects and the like;
– Texture Mapping Unit (TMU): a type of component capable of rotating and resizing bitmaps (basically, images formed by sets of pixels) in order to apply a texture onto a surface.
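To make the Pixel Shader idea above concrete, here is a toy "pixel shader" written as an ordinary Python function: something the GPU would run once per pixel, in this case computing simple diffuse (Lambertian) lighting. This is purely illustrative; real shaders run on the GPU itself and are written in languages such as GLSL or HLSL.

```python
import math

def normalize(v):
    """Scale a vector to unit length."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def pixel_shader(base_color, normal, light_dir):
    """Scale the pixel's base color by how directly the surface faces the light."""
    n = normalize(normal)
    l = normalize(light_dir)
    # dot product of the two unit vectors, clamped so back-facing surfaces get no light
    intensity = max(0.0, sum(a * b for a, b in zip(n, l)))
    return tuple(round(c * intensity) for c in base_color)

# A surface facing the light head-on keeps its full color...
lit = pixel_shader((200, 120, 40), (0, 0, 1), (0, 0, 1))
# ...while one turned 90 degrees away from it receives no light at all.
dark = pixel_shader((200, 120, 40), (1, 0, 0), (0, 0, 1))
print(lit, dark)
```

Running such a function for every pixel of every frame is exactly the kind of repetitive, independent workload the shader units of a GPU are built for.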
GPUs implement these features in components whose quantities vary from model to model. You saw above, for example, that there are units for Vertex Shaders and units for Pixel Shaders. Depending on the application, this scheme can be advantageous. However, there may be situations where there are too few units of one type or the other, creating an imbalance that hurts performance. To deal with this, many current graphics chips use stream processors: unified units that can take on either the Vertex Shader or the Pixel Shader role, according to the needs of the application.
In general, you can find details describing the use of these and other features on your video card’s GPU in the manual or on the manufacturer’s website.
GPU Clock
Among the specifications reported by GPU monitoring utilities you will notice a field called "GPU Clock". And what is this? Well, if the GPU is a type of processor, then it works at a certain frequency, that is, a clock. In general terms, the clock is a synchronization signal. When computer devices receive this signal to perform their activities, the event is called a "clock pulse". On each pulse, devices perform their tasks, stop and move on to the next clock cycle.
Clock measurement is done in hertz (Hz), the standard unit of frequency, which indicates the number of oscillations or cycles that occur within a given period of time, in this case one second. So when a device works at 900 Hz, for example, it is capable of handling 900 clock cycles per second. For practical purposes, the word kilohertz (kHz) is used to indicate 1,000 Hz, just as megahertz (MHz) indicates 1,000 kHz (or 1 million hertz). Likewise, gigahertz (GHz) is the name used for 1,000 MHz, and so on. Thus, if a GPU has, for example, a frequency of 900 MHz, it performs 900 million clock cycles per second.
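The unit arithmetic from the paragraph above can be made explicit with a few lines of code, converting a GPU clock figure into cycles per second:

```python
# Frequency unit multipliers, as defined in the text
KHZ = 1_000
MHZ = 1_000 * KHZ
GHZ = 1_000 * MHZ

# A GPU clocked at 900 MHz performs 900 million cycles per second
gpu_clock_hz = 900 * MHZ
print(gpu_clock_hz)        # cycles per second
print(gpu_clock_hz / GHZ)  # the same value expressed in GHz
```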
Therefore, the higher the frequency of a GPU, the better its performance, at least in theory, since actual performance depends on a combination of factors, such as the amount of memory and the bus speed. This makes the clock an important characteristic, but not one the user needs to worry much about, especially because, in newer video cards, certain components can work at frequencies different from the GPU's own, such as the units responsible for shader processing.
Resolution and colors
When purchasing a video card, an important feature usually described in the device's specifications is its maximum resolution. This refers to the grid of pixels that forms the horizontal and vertical lines of the screen. Take a resolution of 1600×900 as an example: each horizontal line contains 1,600 pixels and there are 900 such lines, for a total of 1,440,000 pixels per frame.
Of course, the higher the supported resolution, the greater the amount of information that can be displayed on the screen, as long as the monitor can handle the values supported by the video card. Within the maximum limit, the resolution can be changed by the user through the operating system's display settings, where it is also possible to change the number of colors the video card works with.
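The pixel arithmetic behind resolutions is simply width times height, which is why higher resolutions demand more work from the GPU for every frame:

```python
def total_pixels(width, height):
    """Total number of pixels the card must generate for one frame."""
    return width * height

print(total_pixels(1600, 900))  # the 1600x900 example from the text
print(total_pixels(640, 480))   # classic VGA resolution, for comparison
```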
For a long time, the combination of resolution and color information indicated the standard used by the video card. Here are the most common standards:
MDA (Monochrome Display Adapter): standard used in the first PCs, indicating that the card could display 80 columns by 25 lines of text characters in a single color. It was used at a time when computers worked essentially with command lines;
CGA (Color Graphics Adapter): a more advanced, and therefore more expensive, standard than MDA, generally supporting resolutions of up to 320×200 pixels (reaching 640×200) with up to 4 simultaneous colors out of a palette of 16;
EGA (Enhanced Graphics Adapter): standard used in the then revolutionary PC AT, generally supporting a resolution of 640×350 with 16 colors at the same time within 64 possible;
VGA (Video Graphics Array): standard introduced by IBM in 1987 that became ubiquitous on PCs, typically working at a resolution of 640×480 with 16 simultaneous colors, or 320×200 with 256 colors;
SVGA (Super VGA): considered an evolution of VGA, SVGA initially indicated the resolution of 800×600 pixels and, later, 1024×768. In fact, since SVGA, video cards have come to support even more varied resolutions and millions of colors, so it is considered the current standard.
As for the number of colors, it is determined by the number of bits assigned to each pixel: raise 2 to the number of bits per pixel. Thus, with 8 bits per pixel there are 256 possible colors (2 raised to 8 equals 256). With 32 bits, there are 4,294,967,296 combinations (in practice, 32-bit modes usually reserve 24 bits for color and 8 bits for an alpha/transparency channel).
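The color-depth rule from the paragraph above in code form: the number of representable colors doubles with every extra bit per pixel.

```python
def color_count(bits_per_pixel):
    """Number of distinct values representable with the given bits per pixel."""
    return 2 ** bits_per_pixel

print(color_count(8))   # 8-bit color, as in the text
print(color_count(16))  # common "high color" mode
print(color_count(32))  # 32 bits per pixel, as in the text
```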