There is a lot of terminology when it comes to computer hardware, and “integrated” vs. “dedicated” graphics cards is one example. In this post, we’ll clarify the difference and when you might want to seek out a laptop with an integrated or dedicated graphics card.
What does a graphics card do in the first place?
A graphics card, also known as a Graphics Processing Unit (GPU), is a special kind of processor well-suited to handling computer graphics and video. GPUs are made up of many cores, and those cores can work on lots of small tasks at the same time, which is ideal for graphical computing (e.g., 3D rendering, where millions of pixels can be processed independently).
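To make the “many cores, many small tasks” idea concrete, here’s a toy Python sketch. It has nothing to do with real GPU programming; it just brightens a list of pixel values one at a time and then in parallel across a few CPU cores, which is the same principle a GPU applies with thousands of cores.

```python
# Toy illustration of parallelism (not real GPU code).
from multiprocessing import Pool

def brighten(pixel):
    # A tiny, independent per-pixel task: raise brightness, capped at 255.
    return min(pixel + 40, 255)

if __name__ == "__main__":
    pixels = list(range(256)) * 1000  # stand-in for image data

    # Serial: one core works through every pixel in order.
    serial_result = [brighten(p) for p in pixels]

    # Parallel: the same independent tasks spread across several cores.
    with Pool(processes=4) as pool:
        parallel_result = pool.map(brighten, pixels)

    assert serial_result == parallel_result
```

Because each pixel can be processed without knowing anything about its neighbors, the work splits cleanly across cores, and that’s exactly the kind of workload a GPU is built for.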
What’s the difference between integrated and dedicated?
An integrated graphics card is on the same chip as the computer’s central processing unit (CPU). The graphics card and CPU share the same RAM (see this post for a primer on RAM).
Dedicated graphics cards, on the other hand, are entirely separate from the CPU and have their own RAM.
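If you’re curious which kind your current machine has, here’s a rough sketch, assuming a Linux laptop with the lspci utility available (other operating systems list this in their own system-information tools).

```python
# Rough sketch: list the display controllers the system reports (Linux + lspci assumed).
# An Intel or AMD entry that sits alongside the CPU usually means integrated graphics;
# a separate NVIDIA or AMD entry usually means a dedicated card.
import subprocess

output = subprocess.run(["lspci"], capture_output=True, text=True).stdout
for line in output.splitlines():
    if "VGA" in line or "3D controller" in line:
        print(line)
```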
So which do you need?
Because integrated graphics cards use system memory, they’re a lot more limited than dedicated graphics cards. That being said, they work perfectly fine for a lot of everyday tasks—web browsing, document editing, and even video streaming. If your computer usage consists of these more basic tasks, then you’ll be good to go with an integrated graphics card.
However, if you’re doing any heavy gaming or video/music editing, you’ll almost certainly want to pay extra for a dedicated graphics card.