Should I use an interlaced refresh rate?

Have you upgraded your monitor to one with a higher refresh rate? Not feeling the difference? How about making your old 60Hz monitor faster? Changing your monitor's refresh rate can also save some energy, which is useful if you're running on battery, and it can even make your graphics card quieter.

The higher your monitor's refresh rate, the smoother everything will feel, from moving the mouse cursor to scrolling through documents and pages. It can also make you better at games that require quick reactions, simply because you'll see everything sooner. Windows won't always default to the maximum refresh rate your monitor is capable of, so it's worth checking your settings.

Note: A high refresh rate isn't the same as variable refresh rate, which synchronizes the monitor's refresh rate with a game's frame rate. Read more about FreeSync and G-Sync in our explainer here.

Changing the refresh rate

If you always use your PC with the same monitor, changing the refresh rate is simple. On the Windows desktop, right-click and choose "Display settings."

Scroll down a bit and choose "Advanced display settings."


Then, under Refresh Rate, choose your desired setting. If you're wondering, an "interlaced" refresh rate means that only half of the image's lines (alternating odd and even rows) are refreshed on each cycle.
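If you'd rather script this check than click through Settings, here's a minimal C sketch using the Win32 EnumDisplaySettings API to list every mode the driver reports for the primary display, flagging the interlaced ones. The output formatting is just an example.

```c
// A minimal sketch (not the Settings app's own mechanism): list the display
// modes Windows reports for the primary monitor, with refresh rate and an
// "interlaced" flag where the driver sets one.
// Build (MinGW): gcc list_modes.c -o list_modes -luser32
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA dm = {0};
    dm.dmSize = sizeof(dm);

    // NULL means "the primary display device"; iModeNum walks through every
    // graphics mode the driver exposes.
    for (DWORD i = 0; EnumDisplaySettingsA(NULL, i, &dm); i++) {
        printf("%lux%lu @ %lu Hz%s\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency,
               (dm.dmDisplayFlags & DM_INTERLACED) ? " (interlaced)" : "");
    }
    return 0;
}
```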


You may not always want to choose the highest number, though: a higher refresh rate requires more work from your GPU, and setting it to 144Hz may move it to a higher power consumption mode than 120Hz, for example. With some semi-passive graphics cards, that can be the difference between the fans spinning and not spinning when you aren't gaming.

More than one monitor

If you use your PC with more than one monitor, you may prefer to set the refresh rate for each of them individually. In that case, just above the Refresh Rate section, choose "Display adapter properties" for the desired monitor.

Under the "Monitor" tab, choose the rate you want and click "OK."


Overclocking a 60Hz monitor

Even if you only have a 60Hz monitor, chances are it can actually go a bit higher, especially if it's a cheap 1080p monitor with a TN panel. A 70Hz or 75Hz refresh rate will be a noticeable upgrade over 60Hz and won't put your monitor at a serious risk of overheating. On the other hand, if you have a 120Hz monitor you may want to set it to 90Hz to save energy, for example.

To achieve that, you'll need the AMD Radeon, Nvidia GeForce or Intel Graphics driver for your GPU.

We'll use AMD Radeon drivers for the demonstration, but the steps are surprisingly similar with Nvidia and Intel drivers...


First, launch the driver software. You can usually find your GPU driver settings by right-clicking on the desktop ("more options" in Windows 11), or by searching for its name in Windows Search.

Then, click the Settings button, and choose the "Display" tab. Scroll down, and next to "custom resolutions," click the "Create new" button (if you don't see it, click on "custom resolutions" to accept the EULA).

The next step is to change the refresh rate in the pop-up window; the related settings will update automatically. As with any overclock, we recommend being extra careful and increasing the rate by about 5Hz at a time. After you do, click "Create."


Repeat the steps above to set the new refresh rate in Windows' display settings. The new custom setting will show up in the list, though it may not work with your monitor.

If anything goes wrong, simply don't touch anything for 15 seconds and Windows will go back to its previous display settings. If you want to check that your overclock has worked and that your monitor isn't skipping frames, you can use Blur Busters' frame skipping test. If the monitor fails the test, go back to the highest setting that worked.
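If you prefer a quick sanity check from code, the sketch below asks the driver (via CDS_TEST) whether it would accept a candidate rate without actually switching to it, and prints the rate currently in use. The 70Hz value is just an example, and a driver accepting a mode doesn't guarantee the panel displays every frame, so the Blur Busters test is still the real verification.

```c
// A minimal sketch: CDS_TEST asks the driver whether it would accept a
// candidate rate without switching to it; the current-mode query shows what
// is actually active right now. 70 Hz is just an example value.
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODEA current = {0};
    current.dmSize = sizeof(current);
    EnumDisplaySettingsA(NULL, ENUM_CURRENT_SETTINGS, &current);

    DEVMODEA candidate = current;
    candidate.dmDisplayFrequency = 70;        /* candidate overclock in Hz */
    candidate.dmFields = DM_DISPLAYFREQUENCY;

    if (ChangeDisplaySettingsExA(NULL, &candidate, NULL, CDS_TEST, NULL)
            == DISP_CHANGE_SUCCESSFUL)
        puts("Driver reports 70 Hz as a valid mode.");
    else
        puts("Driver rejected 70 Hz.");

    printf("Currently running at %lu Hz.\n", current.dmDisplayFrequency);
    return 0;
}
```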

Have you ever wondered what the ‘p’ stands for in 720p? What about the ‘i’ in 1080i? They stand for progressive scan and interlaced respectively, but what do those terms actually mean? We’ll be defining both as well as weighing which is better and why. Let’s get started by comparing the definitions.

Interlaced vs progressive scan

Progressive vs interlaced explained

To see how these two terms compare and contrast, we’ll start by defining them individually. If you encounter any other unfamiliar terms, be sure to check out our guide to filmmaking vocabulary and our glossary of cinematography terms to learn more.

PROGRESSIVE AND INTERLACED DEFINITIONS

First, what is progressive?

Progressive is one of two primary scanning methods used in the transmission of broadcast television signals. Progressive scan describes the pattern in which lines of visual information are drawn across the viewer’s TV screen. With progressive scanning, all of the lines that make up a single frame are transmitted at once.

Second, what is interlaced?

Interlaced is the alternative scanning method to progressive scan. With interlaced scanning, lines of visual information are alternated as odds and evens. Only half of a frame’s visual information is broadcast at a time (i.e., the even lines will be displayed on a viewer’s screen, THEN the odd lines will be displayed; not simultaneously).

Difference Between Interlaced and Progressive Scan:

  • Two different methods for broadcast scanning
  • Progressive-scan displays all lines at once
  • Interlaced-scan displays half of the lines at a time
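To make the distinction concrete, here is a small, self-contained C illustration (a toy, not broadcast code) that splits a progressive frame into its even and odd fields and then weaves them back together, which is exactly the halving of lines that interlaced transmission performs.

```c
// A toy illustration: split a progressive frame into its even and odd fields,
// then weave them back into a full frame. Frame dimensions are arbitrary.
#include <stdio.h>

#define WIDTH  8
#define HEIGHT 6

// Copy every second row of the frame into a half-height field.
// parity 0 = even rows (0, 2, 4, ...), parity 1 = odd rows (1, 3, 5, ...).
static void extract_field(int frame[HEIGHT][WIDTH],
                          int field[HEIGHT / 2][WIDTH], int parity)
{
    for (int row = parity; row < HEIGHT; row += 2)
        for (int col = 0; col < WIDTH; col++)
            field[row / 2][col] = frame[row][col];
}

// Interleave the two fields back into a full frame ("weave" deinterlacing).
static void weave(int even[HEIGHT / 2][WIDTH], int odd[HEIGHT / 2][WIDTH],
                  int out[HEIGHT][WIDTH])
{
    for (int row = 0; row < HEIGHT; row++)
        for (int col = 0; col < WIDTH; col++)
            out[row][col] = (row % 2 == 0) ? even[row / 2][col]
                                           : odd[row / 2][col];
}

int main(void)
{
    int frame[HEIGHT][WIDTH], even[HEIGHT / 2][WIDTH],
        odd[HEIGHT / 2][WIDTH], rebuilt[HEIGHT][WIDTH];

    // Fill each row with its own row number so the split is easy to inspect.
    for (int row = 0; row < HEIGHT; row++)
        for (int col = 0; col < WIDTH; col++)
            frame[row][col] = row;

    extract_field(frame, even, 0);   /* even field: rows 0, 2, 4 */
    extract_field(frame, odd, 1);    /* odd field:  rows 1, 3, 5 */
    weave(even, odd, rebuilt);       /* progressive again */

    printf("Each field carries %d of the frame's %d rows.\n", HEIGHT / 2, HEIGHT);
    return 0;
}
```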

Interlaced Refresh Rate

Image scanning over the years

The distinction between these two scanning methods was more important during the days of CRT TVs and the height of broadcast television. Interlacing images began as something of a compromise to the high demands of television broadcasting of the time. In the digital age, interlaced scanning is a thing of the past in all but a few select areas.

Captain Disillusion explores interlacing  •  What is interlaced video

Progressive scan became the standard in the 1990s after decades where both scanning methods found consistent use on television screens. As technology improved and standard definition broadcasts were phased out, so too was interlaced scanning.


Interlaced vs Progressive Refresh Rate

The case for interlaced

Broadcasting interlaced video was cheaper than broadcasting progressive-scan images but also resulted in lower quality images with a higher risk of artifacting, flickering, and other visual imperfections.

With only 50% of the broadcast lines visible at any given time, the transitions between the odd and even lines often resulted in inferior images. But the switching between odd and even lines happened so quickly (about 60 times per second) that the human eye perceived a full image rather than two half-images shown in quick succession.

Interlacing also conserved bandwidth at a time when it could be in short supply. The technical savings were so significant that a station may have been able to broadcast interlaced footage at a 1080i resolution while only being capable of broadcasting progressive-scan footage at a 720p resolution. Interlaced scanning represented a compromise between bandwidth, accessibility, and image quality.
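As a rough back-of-the-envelope check on that trade-off (ignoring compression, chroma subsampling, and blanking, which all matter in real broadcasts), the raw pixel throughput works out like this:

```c
// Back-of-the-envelope pixel throughput, a rough proxy for uncompressed
// bandwidth. Real broadcast bandwidth also depends on compression, chroma
// subsampling, and blanking intervals.
#include <stdio.h>

static double mpixels_per_sec(int width, int lines_per_pass, int passes_per_sec)
{
    return (double)width * lines_per_pass * passes_per_sec / 1e6;
}

int main(void)
{
    // 1080i60 sends 60 half-height (540-line) fields per second;
    // the progressive formats send 60 full frames per second.
    printf("1080i60: %6.1f Mpixels/s\n", mpixels_per_sec(1920,  540, 60));
    printf("720p60 : %6.1f Mpixels/s\n", mpixels_per_sec(1280,  720, 60));
    printf("1080p60: %6.1f Mpixels/s\n", mpixels_per_sec(1920, 1080, 60));
    return 0;
}
```

1080i60 comes out to roughly 62 Mpixels/s, in the same ballpark as 720p60's 55 Mpixels/s and about half of 1080p60, which is why broadcasters treated 1080i and 720p as comparable options.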

The joys and sorrows of interlacing  •  How to check if video is interlaced or progressive

Whether the savings of interlaced scanning were worth the drawbacks often came down to a case-by-case decision. Certain types of programs, like soap operas and news broadcasts, were considered worth the costs of interlaced broadcast, while others, like major television shows or films broadcast on television, may have opted for progressive scan to retain the highest visual fidelity possible at the time.

Is Progressive Scan Better than Interlaced

The case for progressive-scan

Progressive scanning ate up more bandwidth and was more costly but also resulted in a much higher quality image presented for each individual frame. Progressive-scan broadcasts were the clear choice for anyone prioritizing quality and visual fidelity.

Progressive scan vs. interlaced

In the modern age, it’s not much of a choice between the two scanning methods. Interlaced scanning is best to be avoided whenever possible, and that’s pretty easy to do these days.

UP NEXT

What is the 4:3 Aspect Ratio?

You now know what the ‘p’ and ‘i’ stand for in TV displays. Even though the distinction between progressive scan and interlaced has become a bit moot over the years, not every television technical limitation of old has become obsolete. Learn why some filmmakers choose to continue using the 4:3 aspect ratio formerly required by television screens despite the widespread adoption of widescreen televisions, up next.

Is interlaced refresh rate better?

Given a fixed bandwidth and high refresh rate, interlaced video can also provide a higher spatial resolution than progressive scan.

Is interlaced Hz better for gaming?

60i (interlaced) is less demanding: instead of drawing full-resolution “frames” at 60Hz as in 60p (progressive), the graphics card only draws half-resolution “fields,” and a full frame is completed once both the odd and even fields have been displayed.

Is interlaced display better?

Broadcasting interlaced video was cheaper than broadcasting progressive-scan images, but it also resulted in lower quality images with a higher risk of artifacting, flickering, and other visual imperfections.

Which is better interlaced or not?

Interlaced monitors are easier to build and therefore cheaper, but as you can guess, they aren't as good as non-interlaced monitors. The problem is that, all things being equal, it takes twice as long to create the complete screen image on an interlaced monitor.