1080i vs 1080p: Explaining the Key Differences and Their Impact on Image Quality

While 1080i and 1080p may appear to be the same, they’re actually quite different. One is significantly better than the other.

What’s the difference? Are 1080i and 1080p the exact same resolution? Which one should you enable for the best image quality?

1080i vs 1080p: Which Is Better?

Regarding image quality, 1080i is significantly worse than 1080p because each frame is built from two halves meshed together. The halves are often not in sync, which creates a blurry image.

During high-motion video, the “stitching”, also known as “combing”, of the two halves can be very noticeable.

“Combing” looks like horizontal bars clustered around a moving object. The effect is more noticeable on larger 1080i displays.

On some TVs, 1080i and 1080p appear identical because most modern TVs automatically apply de-interlacing as a post-processing step.

Either way, if you have to choose between 1080i and 1080p, 1080p is always the better choice.

Static images will look the same on both 1080i and 1080p.

1080i vs 1080p Explained

1080i and 1080p have the same total number of pixels, but they use different methods to draw those pixels.

1080i uses interlaced scan and 1080p uses progressive scan. 1080i is the dated method that was common in the CRT era.

Today, most TVs only support progressive scan because it’s the superior method.

Interlaced

[Image: example of an interlaced video signal]

Back in the day, interlaced scanning was the go-to method for displays to draw frames because it didn’t require much bandwidth.

Each frame was split into two fields: one with the even-numbered lines removed and the other with the odd-numbered lines removed.

A TV would then merge these two fields, like connecting two pieces of a puzzle, to form a complete image. The process happens so quickly that it’s not noticeable.

Another way to think of it: half the frame is drawn on the odd-numbered lines of the screen, and then the even-numbered lines are filled in.
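To make the odd/even split more concrete, here’s a minimal Python sketch (a toy illustration, not how any real TV or broadcast chain is implemented) that splits a tiny “frame” into its two fields and then weaves them back together:

```python
# Toy illustration of interlacing: split a frame into two fields,
# then weave them back together. Hypothetical example, not real broadcast code.

frame = [f"line {i}" for i in range(8)]   # stand-in for 8 rows of pixels

field_a = frame[0::2]   # one field: rows 0, 2, 4, 6
field_b = frame[1::2]   # the other field: rows 1, 3, 5, 7

# Weave the two fields back together to rebuild the full frame.
rebuilt = [None] * len(frame)
rebuilt[0::2] = field_a
rebuilt[1::2] = field_b

print(rebuilt == frame)   # True -- but only if both fields came from the same moment
```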

The problem with the interlaced format is that the two fields aren’t necessarily from the exact same moment.

So you could get one field that is slightly out of sync, which creates noticeable visual artefacts around moving objects.

It’s like trying to mesh two blurry pictures together. When motion is involved, 1080i delivers roughly 60% lower effective resolution than 1080p because the fields aren’t completely in sync with each frame.

Most cable companies still output interlaced signals.

If you notice black lines around objects, it’s most likely because the source material was using the interlaced format.

You can read more about interlaced signals on Wikipedia. 

What is 1080i60?

You may have seen that 1080i60 is a video mode in your TV settings. 1080i60 means 1080i at 60 fields a second.

Not to be confused with frames a second!

As you know, with interlaced video it takes two fields to create a frame, so 1080i60 is another way to say 1080i at 30 frames a second.

It was the most common format for TVs back in the day.

Likewise, 1080i50 is 1080i with 50 fields a second, which comes to 25 frames a second.
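For anyone who wants to double-check the maths, the field-rate-to-frame-rate conversion is just a division by two. A quick sketch (the helper function is purely illustrative):

```python
def fields_to_frames(fields_per_second):
    """Two interlaced fields combine into one full frame."""
    return fields_per_second / 2

print(fields_to_frames(60))   # 1080i60 -> 30.0 frames a second
print(fields_to_frames(50))   # 1080i50 -> 25.0 frames a second
```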

These formats are common in cable and satellite broadcasts. Many companies still broadcast in 480i!

Some TVs also use the term “Hz” to describe the number of fields per second.

The PAL broadcasting standard is advertised as 50 Hz, but that stands for 50 fields a second, which comes to 25 frames a second.

1080p Progressive Scan

Progressive scan is the modern method to draw an image.

Instead of drawing two halves (or fields) and then merging them together, progressive scan draws a complete frame every time.

Each line is drawn in order, one after another. It doesn’t draw the even and odd lines at separate times; they’re all drawn in a single sequence.

With progressive scan, each frame is drawn exactly as the source delivers it. Progressive scan generally delivers higher quality than 1080i because complete frames are drawn in sync with the frame rate. You won’t see “combing” with progressive scan.

1080p requires more bandwidth than 1080i, but the image quality is much better.
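A rough back-of-the-envelope comparison shows why: at the same refresh rate, an interlaced signal only carries half the lines per pass. The numbers below are illustrative only and ignore compression, blanking intervals, and colour encoding:

```python
# Rough line-rate comparison at 60 Hz (ignores compression, blanking, colour encoding, etc.)
lines_per_frame = 1080

interlaced_lines_per_second = (lines_per_frame // 2) * 60   # 540 lines x 60 fields = 32400
progressive_lines_per_second = lines_per_frame * 60         # 1080 lines x 60 frames = 64800

print(progressive_lines_per_second / interlaced_lines_per_second)   # 2.0 -- roughly double the raw data
```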

What About Screen Tearing?

In some cases, particularly in video games, you may notice an issue called screen tearing.

It occurs when two frames are out of sync and drawn over each other, making it look like the image is split in the middle. Screen tearing is similar to the “combing” effect of interlaced content, where two out-of-sync fields merge together.

Generally, screen tearing is more common on PC. Consoles usually lock the frame rate to the TV’s refresh rate, eliminating the screen tearing issue.

To remove screen tearing on a PC, you can enable V-Sync or G-Sync, but those features may introduce input lag.

The progressive scan method uses Hz to measure how many times the screen is drawn in a second, also called the refresh rate.

Nowadays the most common refresh rate is 60Hz but there are plenty of gaming monitors with higher refresh rates.

There’s also some confusion here because Hz was previously used to measure interlaced signal fields per second.

100 Hz TVs?

For example, back in the early 2000s, some TVs were advertised with 100 Hz capabilities.

What that meant was the TV was capable of 100 fields a second, which comes to 50 frames a second.

It may also refer to the “simulated” frame rate of TVs, a combination of features that make it seem like the frame rate is higher than it really is.

These go by different names, but the common features are “Motion Rate”, “Motion Flow”, and “Clear Motion”.

Some Sony TVs advertise a Motion XR Rate of 1200 Hz!

Try to avoid paying too much attention to these features.

What you need to focus on is the native refresh rate of a TV; it’s the true number of times the screen can refresh in a second.

Hint: Native refresh rates are always familiar even values: 60, 120, 144, 240, etc.

If you’re confused, I recommend reading our TV refresh rates ultimate guide.

Next-gen TVs can also support 4K at 120 Hz.

De-Interlacing

Today most displays do not support interlaced signals.

What happens when you want to watch a movie or programme that was created with interlaced signals?

Well, as I mentioned earlier, most cable and satellite broadcasts still use interlaced signals.

To make those compatible with modern TVs, the cable company uses set-top boxes that convert the interlaced signal into a progressive one.

The quality of that conversion really depends on a number of factors.

The process is called de-interlacing – “cleaning up” an interlaced signal for modern displays.

You might have seen a feature for de-interlacing in your TV’s picture settings.

If you’re watching interlaced content, I recommend enabling de-interlacing on your TV. It will clear up the “combing” effect of interlaced media.
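For the curious, here’s a toy sketch of one of the simplest de-interlacing strategies, often called “bob” de-interlacing: rebuild each frame from a single field by filling in the missing lines, instead of weaving in a possibly out-of-sync second field. Real TVs use far more sophisticated, motion-adaptive methods; this is only an illustration of the idea:

```python
# Toy "bob" de-interlacing: rebuild a full frame from a single field by
# duplicating each line into the gap left by the missing field.
# Real de-interlacers are motion-adaptive and far more sophisticated.

def bob_deinterlace(field):
    """field: the lines from one field (every other row of the frame)."""
    frame = []
    for line in field:
        frame.append(line)   # the row we actually received
        frame.append(line)   # fill the missing row with a copy
    return frame

one_field = ["row 0", "row 2", "row 4", "row 6"]   # one field of an 8-line frame
print(bob_deinterlace(one_field))                  # 8 rows, no combing, but softer vertical detail
```

Duplicating lines avoids combing entirely at the cost of some vertical detail, which is roughly the trade-off your TV’s de-interlacing setting makes for you.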

The Bottom Line

Long story short, 1080i and 1080p may look the same, but they’re not. 1080i is inferior to 1080p; in motion, it’s about the same as 720p.

When given the option, it’s always best to use 1080p or any other format that uses progressive scan.
