VRAM

    VRAM, an abbreviation of Video Random Access Memory or Video RAM, is a type of dual-ported DRAM (Dynamic Random-Access Memory) used to store the image data a computer displays. VRAM acts as the framebuffer between the CPU (Central Processing Unit) and the video adaptor (or video card).

    How does VRAM work?

    Before a picture appears on your screen, it is processed by the CPU and written to the VRAM of the graphics adaptor, from which it is sent to your monitor via an HDMI or DisplayPort connection.
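    As a rough illustration of that path, the C sketch below writes pixel data straight into the memory the display hardware scans out, using the Linux framebuffer device. This is only a minimal sketch under the assumptions that the framebuffer is exposed as /dev/fb0 and uses a 32-bit pixel format; actual device names, pixel formats, and driver behaviour vary.

        /* Minimal sketch: the CPU writes pixel data into the video memory that
         * the display controller scans out. Assumes a Linux framebuffer at
         * /dev/fb0 with a 32-bit (XRGB8888) pixel format; both are assumptions,
         * not guaranteed on every system. */
        #include <fcntl.h>
        #include <linux/fb.h>
        #include <stdint.h>
        #include <stdio.h>
        #include <sys/ioctl.h>
        #include <sys/mman.h>
        #include <unistd.h>

        int main(void)
        {
            int fd = open("/dev/fb0", O_RDWR);
            if (fd < 0) { perror("open /dev/fb0"); return 1; }

            struct fb_var_screeninfo var;
            struct fb_fix_screeninfo fix;
            ioctl(fd, FBIOGET_VSCREENINFO, &var);  /* resolution, bits per pixel */
            ioctl(fd, FBIOGET_FSCREENINFO, &fix);  /* scanline length, buffer size */

            /* Map the framebuffer memory into the CPU's address space. */
            uint8_t *fb = mmap(NULL, fix.smem_len, PROT_READ | PROT_WRITE,
                               MAP_SHARED, fd, 0);
            if (fb == MAP_FAILED) { perror("mmap"); close(fd); return 1; }

            /* Fill the visible area with solid blue; every write lands in the
             * memory the monitor is refreshed from. */
            for (uint32_t y = 0; y < var.yres; y++) {
                uint32_t *row = (uint32_t *)(fb + y * fix.line_length);
                for (uint32_t x = 0; x < var.xres; x++)
                    row[x] = 0x000000FF;
            }

            munmap(fb, fix.smem_len);
            close(fd);
            return 0;
        }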

    VRAM is a dual-ported memory with two sets of data pins that can be accessed simultaneously, allowing it to be read and written at the same time. The first port, the DRAM port, is accessed by the host computer; the second, the video port, provides a high-throughput, serialized data channel to the graphics chipset. All types of VRAM are special arrangements of dynamic RAM (DRAM).
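    The toy C model below mimics those two access paths in software: a dram_write() function stands in for the host-side random-access port, while load_row() and video_shift_out() stand in for the row transfer into a shift register and the serial video port. The names and structure are illustrative assumptions, not a description of real hardware.

        /* Toy software model of VRAM's two ports (illustrative only, not real
         * hardware): the host writes through a random-access "DRAM port" while
         * the display side streams whole rows out through a serial "video port". */
        #include <stdint.h>
        #include <string.h>

        #define ROWS 4
        #define COLS 8

        typedef struct {
            uint8_t cells[ROWS][COLS]; /* the DRAM array itself */
            uint8_t shift_reg[COLS];   /* row latched for serial read-out */
            int     shift_pos;         /* next cell clocked out of the video port */
        } vram_t;

        /* DRAM port: random-access write performed by the host computer. */
        static void dram_write(vram_t *v, int row, int col, uint8_t value)
        {
            v->cells[row][col] = value;
        }

        /* Row transfer: latch one whole row into the shift register in one step. */
        static void load_row(vram_t *v, int row)
        {
            memcpy(v->shift_reg, v->cells[row], COLS);
            v->shift_pos = 0;
        }

        /* Video port: pixels are clocked out serially for the display, without
         * blocking the host's random accesses to the array. */
        static uint8_t video_shift_out(vram_t *v)
        {
            uint8_t out = v->shift_reg[v->shift_pos];
            v->shift_pos = (v->shift_pos + 1) % COLS;
            return out;
        }

        int main(void)
        {
            vram_t v = {0};
            dram_write(&v, 1, 3, 0xAB);    /* host updates a pixel */
            load_row(&v, 1);               /* display side latches row 1 */
            for (int i = 0; i < COLS; i++) /* ...then streams it out serially */
                video_shift_out(&v);
            return 0;
        }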

    Like the RAM (Random-Access Memory) used by the rest of the computer, VRAM stores graphics data only temporarily. Unlike non-volatile memory such as a storage drive, VRAM is volatile: every time the computer is turned off, all the graphics data stored in it is lost.

    How is VRAM implemented?

    VRAM is used in dedicated video adaptors such as graphics cards. As stated above, it serves as the framebuffer between the CPU and the graphics card. VRAM is extremely fast and has high bandwidth, which makes it ideal for dedicated graphics devices.

    Different Types of VRAM

    Each system is unique and requires video memory suited to its specific operations; as a result, several types of VRAM have been developed:

    • MDRAM (Multibank Dynamic RAM)
    • RDRAM (Rambus DRAM)
    • SGRAM (Synchronous Graphics RAM)
    • WRAM (Windows RAM)

    History of VRAM

    VRAM was invented in 1980 by F. Dill, D. Ling and R. Matick at the IBM Research Center, and their idea was patented five years later. The first commercial video memory was used in a high-resolution graphics adapter for an RT PC system marketed by IBM in 1986, which set a new standard for graphics displays. Before VRAM was developed, dual-ported memory was expensive, and the high cost prevented higher-resolution bitmapped graphics from being widely used.

    As the technology became more affordable, the use of VRAM gradually increased. VRAM also improved overall framebuffer throughput, allowing for higher-resolution, high-speed, color graphics at a lower cost. Today, all modern GUI-based operating systems benefit from this technology.