In part I of the Evolution of Resolution, we discussed CGA, EGA, VGA/SVGA, and progressive scan. In part II, we need to get into the whirlwind history of computer graphics, but first, let’s conclude our discussion of resolutions. Once resolutions started getting higher, the nomenclature switched to terms like standard definition and high definition. The numbers can be confusing, so here’s the easy way to look at it: the first number of a resolution is the width of the frame in pixels, and the second is the height.

Standard Def = 480p (640x480 pixels).

High Def = 720p (1280x720 pixels).


Full High Def = 1080p (1920x1080 pixels).

The best choice for online streaming, 1080p is the standard on most cellphones and is usually what people mean when they say “high def.”


QHD or Quad High Def = 1440p (2560x1440 pixels).

QHD is typically for high-end cell phones and gaming monitors. 


2K = 1080p height (2048x1080 pixels).

2K is used mainly for larger displays.


UHD (Ultra High Definition) or 4K = 2160p (3840x2160 pixels).

Both 2K and 4K resolutions are for intense coloring, graphics, or theatrical viewing.


FUHD (Full Ultra High Definition) = 4320p (7680x4320 pixels).


8K, as FUHD is also known, is a super high-resolution option ideal for zooming in from a wide shot without pixelation or for creating stunning video effects.
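The width-times-height rule above also makes it easy to see how quickly pixel counts grow between standards. Here’s a minimal sketch (the labels and dimensions are taken straight from the list above) that totals the pixels in each frame:

```python
# Pixel counts for the resolution standards listed above.
# First number = frame width, second = frame height; the "p" label
# (480p, 1080p, etc.) comes from the height.
resolutions = {
    "480p (SD)": (640, 480),
    "720p (HD)": (1280, 720),
    "1080p (Full HD)": (1920, 1080),
    "1440p (QHD)": (2560, 1440),
    "2K (DCI)": (2048, 1080),
    "2160p (4K UHD)": (3840, 2160),
    "4320p (8K FUHD)": (7680, 4320),
}

for name, (width, height) in resolutions.items():
    total = width * height  # total pixels per frame
    print(f"{name}: {width}x{height} = {total:,} pixels")
```

Running this shows why 8K footage gives so much room to punch in: a 7680x4320 frame holds sixteen times the pixels of a 1080p frame.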


Computer Graphics from the 90’s to Now

Picking up the GPU timeline, ATI introduced their first VGA video card, the ATI VGA Wonder, in 1987, and by 1991, S3 had entered the video card market with their S3 911 and 911A graphics chips, which supported up to 256 colors. OpenGL, developed by Silicon Graphics, came along in 1992 and was used for rendering 2D/3D video game vector graphics, as well as VR, CAD, and more.

In 1996, 3dfx’s Voodoo 1 video card was introduced. Popular with gamers, it required a separate 2D video card installed in the computer and ran in tandem with it, providing 3D graphics rendering. The following year, NVIDIA released the RIVA 128 graphics accelerator chip, enabling video card manufacturers to incorporate 2D/3D graphics acceleration into their video cards. Unfortunately, its graphics rendering quality fell short of the Voodoo 1. In 1998, 3dfx released Voodoo2, the first video card to provide SLI support, enabling two video cards to work together for optimal graphics.

In 2000, ATI debuted the Radeon R100 series video cards (their initial cards were fully DirectX 7 compatible and featured ATI's HyperZ technology). Not to be outdone, NVIDIA introduced the GeForce 3 series (the first video cards with programmable pixel shaders) in 2001. AMD acquired ATI in 2006, and fast-forwarding to 2013, Sony’s PlayStation 4 and Microsoft’s Xbox One arrived, both consoles using GPUs based on the AMD Radeon HD 7790 and 7850 video cards.

By 2020, NVIDIA dominated the landscape, releasing its RTX 30 series GPUs. The Ampere GPU microarchitecture succeeded the Volta and Turing architectures, GeForce 30 series consumer GPUs debuted, and the 80GB A100 GPU was introduced. In 2021, mobile RTX graphics cards and the RTX 3060 were revealed. At the 2021 GPU Technology Conference, NVIDIA also announced Ampere's successors: Ampere Next, slated for release this year, as well as Ampere Next Next for 2024.

As creative applications and render engines become more advanced, so too must your hardware. From rendering 3D animation to decoding the human genome, capabilities which once seemed impossible are now commonplace. As each new generation of software arrives, more and more compute power is required, not just from the GPU, but from the CPU as well. You may have the best GPU on the market, but if you don’t have the right CPU to tell it what to do, you’re not maximizing your tech. The same can be said for liquid cooling, power supplies, custom chassis, and premium components tuned to maximize performance. BOXX exists for that reason. As the resolution evolution rolls on, so do we.