

That HDR is part of the new normal for monitors and gaming is well known by now. High dynamic range makes image quality brighter and more vivid. It adds realism to displays and helps colours pop with life. BenQ HDRi technology takes HDR to the next level by auto-adjusting display brightness based on ambient lighting, instead of sticking with static HDR metadata.
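To make the idea concrete, here is a minimal sketch in Python of what ambient-light-driven brightness adjustment can look like. The lux breakpoints and target nit values are illustrative assumptions for the example only, not BenQ's actual HDRi tuning.

```python
# Illustrative sketch only: map an ambient-light reading (lux) to a display
# brightness target (nits). The breakpoints below are assumed values chosen
# for the example and do not reflect any real HDRi algorithm.

def target_brightness_nits(ambient_lux: float) -> float:
    """Pick a peak-brightness target that suits the room's light level."""
    # Darker rooms need less brightness to look punchy; brighter rooms need
    # more so highlights still stand out against reflections and daylight.
    if ambient_lux < 50:      # dim room, evening viewing
        return 250.0
    if ambient_lux < 300:     # typical living-room lighting
        return 400.0
    if ambient_lux < 1000:    # bright office or daytime room
        return 600.0
    return 1000.0             # very bright, sunlit environment


if __name__ == "__main__":
    for lux in (10, 120, 650, 2500):
        print(f"{lux:>5} lux -> aim for about {target_brightness_nits(lux):.0f} nits")
```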

However, HDR also happens to be a rather delicate technology. The line between standard dynamic range (SDR) and high dynamic range isn't always that clear. Generally, VESA DisplayHDR certification requires a minimum peak brightness of 400 nits (candelas per square metre) to qualify as HDR, while formats such as HDR10 settle for slightly lower figures of around 350 nits. Anything below that looks essentially like SDR and functions as HDR in name only. For context, DisplayHDR certification currently goes up to 1400 nits, and several TV manufacturers make models capable of as much as 4000 nits. So don't worry, HDR isn't like staring at the sun. The advent of HDR has essentially made the brightness setting redundant, because monitors and TVs that want to deliver good HDR performance must always showcase their peak brightness. Note that even the most hardcore, top-of-the-line, expensive TVs have now essentially done away with brightness settings and only offer gamma scales, other than when displaying SDR content, in which case they can tone down brightness automatically without user intervention.
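As a rough illustration of where those brightness figures sit, the sketch below sorts a panel's peak brightness into the VESA DisplayHDR peak-luminance tiers (400, 500, 600, 1000 and 1400 nits); anything under 400 nits falls outside certification and is what we called HDR in name only. This is a simplification: real certification also tests black level, colour gamut, bit depth and dimming behaviour, which are ignored here.

```python
# Simplified sketch: classify a display's peak brightness against the VESA
# DisplayHDR peak-luminance tiers. Real certification covers much more than
# peak brightness; those extra criteria are deliberately left out here.

DISPLAYHDR_TIERS = [1400, 1000, 600, 500, 400]  # peak-luminance tiers, in nits

def classify_hdr(peak_nits: float) -> str:
    for tier in DISPLAYHDR_TIERS:
        if peak_nits >= tier:
            return f"meets the DisplayHDR {tier} peak-brightness requirement"
    if peak_nits >= 350:
        return "roughly HDR10-level brightness, below DisplayHDR 400"
    return "SDR-like brightness: HDR in name only"


if __name__ == "__main__":
    for nits in (300, 350, 450, 1000, 1400, 4000):
        print(f"{nits:>4} nits -> {classify_hdr(nits)}")
```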

Even the controls that remain don't do much, because in the age of HDR, reducing brightness makes about as much sense as trying to turn water a little drier. Games, movies, TV shows, and even photos are rendered and mastered with HDR baked in, so lowering their brightness would do nothing but detract from the way they were intended to be viewed.
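A quick back-of-the-envelope sketch shows why. If a grade places specular highlights around 1000 nits and reference white near 200 nits, a brightness slider drags every level down by the same factor and shrinks the headroom the content was mastered to use. The nit levels below are assumed, typical-looking values, not taken from any specific title.

```python
# Back-of-the-envelope illustration: uniformly lowering brightness on HDR
# content dims highlights and reference white alike, shrinking the absolute
# headroom of the grade. The levels below are assumptions for the example.

MASTERED_LEVELS_NITS = {
    "shadow detail": 1.0,
    "reference white": 200.0,
    "specular highlight": 1000.0,
}

def apply_brightness(levels: dict, scale: float) -> dict:
    """Scale every mastered level by the same factor, like a brightness slider."""
    return {name: nits * scale for name, nits in levels.items()}


if __name__ == "__main__":
    dimmed = apply_brightness(MASTERED_LEVELS_NITS, 0.5)  # brightness at 50%
    for name, nits in MASTERED_LEVELS_NITS.items():
        print(f"{name:<20} {nits:7.1f} nits -> {dimmed[name]:7.1f} nits")
    before = MASTERED_LEVELS_NITS["specular highlight"] - MASTERED_LEVELS_NITS["reference white"]
    after = dimmed["specular highlight"] - dimmed["reference white"]
    print(f"highlight headroom above white: {before:.0f} nits -> {after:.0f} nits")
```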
