With new monitors supporting higher resolutions, you might be wondering if it’s time to upgrade. In this article, I’ll compare what it’s like to game at 1080P vs 1440P vs 4K vs 8K, and everything in between.
Don’t buy a new monitor until you finish reading this article!
Understanding Display Resolutions:
If you’re not familiar with display resolutions, I’ll quickly break down the basics for you.
A display’s resolution is the total number of pixels it can show, counted horizontally and vertically (across and down). Most modern monitors and TVs use a 16:9 aspect ratio.
Multiplying the horizontal and vertical pixel counts gives you the total number of pixels on the screen: the more pixels, the more detailed the image.
In most cases, resolutions are referred to by their vertical pixel count alone. For example, 1920×1080 is known as 1080P (the P stands for progressive scan, not pixel) and so on.
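To see the totals for yourself, here's a quick Python sketch that multiplies the horizontal and vertical pixel counts of the common resolutions covered in this article:

```python
# Total pixel count is simply horizontal pixels x vertical pixels.
resolutions = {
    "720P":  (1280, 720),
    "1080P": (1920, 1080),
    "1440P": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}

for name, (width, height) in resolutions.items():
    total = width * height
    print(f"{name}: {width}x{height} = {total:,} pixels")

# 1080P, for example, works out to 1920 x 1080 = 2,073,600 pixels,
# while 4K works out to 3840 x 2160 = 8,294,400 pixels.
```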
1080P:
1080P is the most common resolution. It has 1920 horizontal pixels and 1080 vertical pixels, and it's usually called "1080P" or "Full High Definition" (Full HD).
Previously, 720P was known as High Definition (HD). Nowadays, 720P is only found on very small screens or budget laptops.
1080P has been the standard resolution for monitors and TVs for well over a decade. There's a good chance you're reading this article on a 1080P monitor.
1440P:
1440P is a display resolution with 2560 horizontal pixels and 1440 vertical pixels (2560×1440). It's also called Quad HD (QHD) or Wide Quad HD (WQHD), and sometimes "2K", but that label isn't accurate: 2K properly refers to resolutions with roughly 2,048 horizontal pixels, such as 2048×1080.
4K:
The exact resolution of a 4K display is 3840×2160. It's called "4K" instead of "2160P" because the horizontal pixel count is close to 4,000; it also packs 4x the total pixels of 1080P (Full HD). 4K displays are often called Ultra High Definition, or UHD for short.
8K:
At the moment, 8K is the highest consumer resolution, and only a handful of TVs support it. The exact resolution is 7680×4320, and the name comes from the horizontal pixel count being close to 8,000.
8K has 4x as many pixels as 4K and 16x as many as standard 1080P, making it the sharpest resolution available on a single display at the time of writing.
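To double-check those multiples, here's a short sketch comparing the total pixel counts:

```python
# Comparing total pixel counts: each tier is roughly 4x the one below it.
full_hd = 1920 * 1080   # 2,073,600 pixels
qhd     = 2560 * 1440   # 3,686,400 pixels
uhd_4k  = 3840 * 2160   # 8,294,400 pixels
uhd_8k  = 7680 * 4320   # 33,177,600 pixels

print(f"1440P vs 1080P: {qhd / full_hd:.2f}x")    # 1.78x (78% more pixels)
print(f"4K vs 1080P:    {uhd_4k / full_hd:.0f}x") # 4x
print(f"8K vs 4K:       {uhd_8k / uhd_4k:.0f}x")  # 4x
print(f"8K vs 1080P:    {uhd_8k / full_hd:.0f}x") # 16x
```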
Let’s move on to the pros and cons of gaming at these resolutions.
Resolution & Performance:
Resolution is one of the most demanding graphics settings in video games, because at higher resolutions the graphics card has to render more pixels per frame, which results in a more detailed image.
The pixel density increases, the image looks much sharper, and details become clearer. If you set a game's resolution below your monitor's native resolution, the graphics will look blurry and "blocky" because there are fewer pixels and each one is stretched larger.
Although the graphics card does most of the work, the computer also needs a good CPU and fast RAM to run games at higher resolutions. Your system needs processing power for the other settings as well, such as shadows, lighting, shaders, textures, and whatnot.
For that reason, increasing the resolution will be very demanding on your system, and your performance will be limited by the capabilities of your hardware. If your system does not have the power to push a certain resolution, the frame rate in your games will be very low. Of course, it also depends on the particular game and how it’s optimized.
In a nutshell, the visuals at higher resolutions are much better, but the performance is often much worse. It’s up to you on how you balance performance and visuals.
Gaming at 1080P vs 1440P vs 4K vs 8K:
Now let’s take a look at what it’s like to game at these resolutions:
Gaming at 1080P (Entry-level):
Gaming at 1080P is not very demanding because the pixel count is low by modern standards. Almost every current graphics card can render games at 1080P without struggling too much, depending on the game and its requirements.
Newer games might struggle to run at 1080P on older machines. The reason for that is not due to the pixel count but rather the additional graphics features included in new games (such as ray tracing and whatnot) that require more power.
Gaming at 1080P is perfectly acceptable, and you can usually push frame rates above 60 FPS in most titles without needing a powerful graphics card.
Just like how the majority of people have 1080P displays, most people play games at 1080P too.
Gaming at 1080P Pros:
- Not very demanding on systems.
- Can achieve higher frame rates in competitive games.
- Can benefit from high refresh rate monitors because of the increased frame rates.
- Most monitors are 1080P and no special HDMI cable is required.
- 1080P monitors are inexpensive.
1440P (Mid-range, Sweet Spot for Performance and Visuals):
Gaming at 1440P requires a more powerful graphics card because the pixel count is much higher than at 1080P; a 1440P monitor has 78% more pixels than a 1080P monitor.
At the moment, 1440P is the sweet spot for PC gamers because the hardware requirements needed to reach reasonable FPS are not too high.
Graphics cards released in the last five years or so should be able to run most games at 1440P at around 60 frames a second. However, you may need to lower some of the other graphics settings.
Bear in mind, moving from 1080P to 1440P will significantly impact your frame rate. It's not unusual to see a 20-30 FPS drop when increasing the resolution from 1080P to 1440P.
To play games at 1440P, you need a mid-range to high-end graphics card. A GTX 1070 or newer should be acceptable for gaming at 1440P. Most people agree 1440P is becoming the new standard for displays.
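If you want a rough sense of why that FPS drop happens, here's a back-of-the-envelope sketch. It assumes a purely GPU-bound game where frame rate scales inversely with pixel count, which is a big simplification, so treat the numbers as ballpark figures rather than benchmarks:

```python
# Back-of-the-envelope estimate, NOT a benchmark: in GPU-bound games,
# frame rate falls roughly in proportion to the increase in pixel count.
# Real results vary heavily by game, settings, and hardware.
def estimate_fps(fps_at_1080p: float, width: int, height: int) -> float:
    base_pixels = 1920 * 1080
    return fps_at_1080p * base_pixels / (width * height)

# A game that is GPU-bound at 60 FPS at 1080P:
print(f"1440P: ~{estimate_fps(60, 2560, 1440):.0f} FPS")  # ~34 FPS
print(f"4K:    ~{estimate_fps(60, 3840, 2160):.0f} FPS")  # ~15 FPS
```

Notice how a 60 FPS game at 1080P lands around 34 FPS at 1440P under this rule of thumb, which lines up with the 20-30 FPS drop mentioned above.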
Gaming at 1440P Pros:
- Mid-range graphics cards can usually output 1440P at a stable frame rate.
- 1440P has 78% more pixels than 1080P.
- 1440P is considered the “sweet spot” for performance and visuals.
- Displays that support 1440P are cheaper nowadays.
Concerns:
- Make sure your system can support 1440P.
- 1440P looks best on 27-inch and larger monitors.
- DisplayPort is the best connection option.
- To run 1440P at 144 Hz, you'll want DisplayPort or HDMI 2.0 or newer.
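To see why the connection type matters, here's a simplified bandwidth calculation. It multiplies resolution, refresh rate, and bits per pixel, and it ignores blanking intervals and compression, so real-world requirements are somewhat higher:

```python
# Uncompressed video bandwidth: width x height x refresh rate x bits per pixel.
# Ignores blanking overhead, so treat the results as lower bounds.
def bandwidth_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    return width * height * hz * bits_per_pixel / 1e9

print(f"1440P @ 144 Hz: {bandwidth_gbps(2560, 1440, 144):.1f} Gbps")  # ~12.7 Gbps
print(f"4K @ 60 Hz:     {bandwidth_gbps(3840, 2160, 60):.1f} Gbps")   # ~11.9 Gbps
print(f"4K @ 120 Hz:    {bandwidth_gbps(3840, 2160, 120):.1f} Gbps")  # ~23.9 Gbps
```

For reference, the approximate maximum data rates are around 14 Gbps for HDMI 2.0, 26 Gbps for DisplayPort 1.4, and 42 Gbps for HDMI 2.1, which is why 1440P at 144 Hz fits on HDMI 2.0 or DisplayPort, while 4K at high refresh rates pushes you toward HDMI 2.1.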
4K (High End, Requires Top Tier Hardware):
Playing games at 4K is possible, but only with the latest graphics cards. Gaming at 4K is incredibly demanding on your system.
Even with a powerful graphics card, you won't necessarily hold a stable 60 FPS at 4K. In most cases, only the latest top-tier RTX cards can run games at 4K without stuttering and frame rate drops.
For the average consumer, gaming at 4K is unrealistic because the hardware required is incredibly expensive. For example, not only do you need the latest graphics card, but you also need a 4K monitor or TV.
The upside is gaming at 4K looks incredible. The downside is not many games are optimized for 4K at 60 FPS at the moment.
Gaming at 4K Pros:
- An incredible amount of detail and image clarity.
Need to Know:
- 4K is very demanding on systems, and only the most recent graphics cards can handle games at that resolution.
- 4K monitors are expensive.
- Building a 4K compatible PC requires very expensive hardware.
- Requires HDMI 2.0 or newer (HDMI 2.1 for refresh rates above 60 Hz) or DisplayPort.
8K (Extreme High End, Unrealistic Hardware Requirements):
Currently, gaming on 8K is mostly done for demonstrations. It will be a long time before the average consumer can afford to play games at 8K with 60+ frames a second.
For starters, you would need an 8K display, and those are very rare, not to mention expensive. There are only a handful of 8K TVs on the market. On top of that, only top-tier graphics cards can run games at 8K with a reasonable frame rate.
The RTX 3090 can run some games at 8K at around 30-40 frames per second. The in-game graphics look amazing at 8K, but the stutters and low frame rates are not worth the trouble; the hardware requirements are far too high.
Several tech YouTubers have tested gaming at 8K, and while it's technically possible, it's mostly done for tech demonstrations.
Cost & Other Tips:
Price of Hardware & Display:
The main piece of hardware that you’ll need to push the resolution higher than 1080P is a graphics card.
On top of that, you need a monitor that supports that resolution too, with your preferred refresh rate. 1440P monitors are not nearly as expensive as they once were, and there are plenty of reasonable deals out there.
An RTX series card will handle 1440P with no issues. Even the older GTX 1070 or 1080 cards can support 1440P. Personally, I would aim for a PC that can run games at 1440P and not bother with 4K or higher.
If you're not sure, I recommend aiming for a 1440P 144 Hz monitor because it's the sweet spot. On another note, your monitor needs video ports that match your graphics card, usually HDMI or DisplayPort.
The price of 4K monitors has also come down quite a bit, but gaming at 4K is very demanding, and only high-end PC builds can support it.
Pixel Density:
An interesting fact about pixels is that they don't have a fixed physical size. You might think a 4K screen has to be physically larger than a 1080P screen because it has more pixels, but that's not the case.
For example, you could have a 27-inch 1080P monitor and a 27-inch 1440P monitor, and the image on the 1440P monitor will look much clearer.
The reason the 1440P monitor looks better, even though both screens are the same size, is that it has a higher pixel density, measured in Pixels Per Inch (PPI).
A 27-inch 1080P monitor has roughly 82 PPI, a 27-inch 1440P monitor has about 109 PPI, and a 27-inch 4K monitor has around 163 PPI. As you might have guessed, the higher the PPI, the more visual detail.
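If you want to compute PPI for your own monitor, the formula divides the diagonal resolution in pixels by the diagonal screen size in inches. Here's a quick sketch:

```python
import math

# PPI = diagonal resolution in pixels / diagonal screen size in inches
def ppi(width: int, height: int, diagonal_inches: float) -> float:
    diagonal_pixels = math.sqrt(width ** 2 + height ** 2)
    return diagonal_pixels / diagonal_inches

print(f"27-inch 1080P: {ppi(1920, 1080, 27):.0f} PPI")  # ~82 PPI
print(f"27-inch 1440P: {ppi(2560, 1440, 27):.0f} PPI")  # ~109 PPI
print(f"27-inch 4K:    {ppi(3840, 2160, 27):.0f} PPI")  # ~163 PPI
```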
It’s also important to consider your distance from the monitor. If the monitor is 1440P but you’re sitting further away from it, you might not be able to notice the improved resolution.
Conclusion:
We covered a lot of information in this article, and I hope it cleared up some of the misconceptions about gaming at certain resolutions.
To summarize, 1080P is the easiest milestone to reach, 1440P is better but requires a good graphics card, and 4K and up is not recommended at the moment.