A GPU, short for Graphics Processing Unit, is a processor designed specifically to handle a computer's display workload. It is an integral part of a graphics card, and this is the gist of the difference.
Graphics Card
A graphics card is the hardware responsible for producing the output sent to the monitor. Also referred to as a video card, video adapter, graphics-accelerator card, or display adapter, it connects to the monitor and to the computer's motherboard through dedicated connectors. A graphics card has its own memory modules and a graphics processing unit, which creates the display we see on the screen. Modern graphics cards have increasingly powerful processors, their own power input connectors, dedicated RAM, and cooling solutions.

Video cards are primarily used for gaming, but it is also possible to use an array of cards to speed up tasks that require parallel processing, such as password cracking (a short sketch at the end of this section illustrates this kind of workload). You might also add a graphics card to drive more monitors on a system. Many video cards offer added functions, such as accelerated rendering of 3D scenes, video capture, a TV-tuner adapter, MPEG-2 and MPEG-4 decoding, FireWire, light pen support, or TV output, while modern high-performance cards are used for graphically demanding purposes such as PC games.

Previously, an integrated GPU was a separate chip on the motherboard, connected to the rest of the system over PCI (Peripheral Component Interconnect), AGP (Accelerated Graphics Port), or PCIe (Peripheral Component Interconnect Express). These were mainly low-powered processors. The majority of current desktop and notebook computers have integrated GPUs, which are generally less powerful than those on a dedicated video card.
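To give a sense of why GPUs suit parallel workloads such as the password cracking mentioned above, here is a minimal sketch, assuming a CUDA-capable card and the nvcc compiler (the kernel and array sizes are illustrative, not taken from any particular application). It squares every element of an array, with each element handled by its own GPU thread, so thousands of operations run at once instead of one after another on the CPU.

// Minimal CUDA sketch: one thread per array element (illustrative only).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void square(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global index of this thread
    if (i < n) {
        data[i] = data[i] * data[i];                // each thread squares one element
    }
}

int main() {
    const int n = 1 << 20;                          // about one million elements
    float *h_data = new float[n];
    for (int i = 0; i < n; ++i) h_data[i] = static_cast<float>(i);

    // Allocate GPU memory and copy the input over.
    float *d_data;
    cudaMalloc(&d_data, n * sizeof(float));
    cudaMemcpy(d_data, h_data, n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements.
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    square<<<blocks, threads>>>(d_data, n);
    cudaDeviceSynchronize();

    // Copy the results back and check one value.
    cudaMemcpy(h_data, d_data, n * sizeof(float), cudaMemcpyDeviceToHost);
    printf("data[3] squared = %f\n", h_data[3]);    // expect 9.0

    cudaFree(d_data);
    delete[] h_data;
    return 0;
}

The same pattern, many independent threads each doing a small piece of work, is what lets a card (or an array of cards) chew through hash computations or 3D rendering far faster than a general-purpose CPU.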