Originally Posted by: NumberlessMath 
From what I understand, the main difference in HDR from SDR is tighter calibration standards. Local dimming can enhance contrast particularly in low-light viewing (which is the standard calibration environment). Wide-gamut is useful for content-creation, but not so much for consumers.
Until the majority of content is mastered to the same consistency as the new panel standards, and ambient-lighting compensation is standard on said panels, subjective experience in terms of vibrancy, accuracy, total quality, will be a roulette*.
Correct, the main difference between an HDR and an SDR display is tighter calibration standards.
An HDR display can offer darker blacks without black crush, whiter whites without an over-exposed image, and local dimming via a zoned adaptive backlight. An HDR display can also give you more accurate and vibrant colours than an SDR display. However, out of the box almost no display is accurate: they're designed to be eye-catching, vibrant and colourful, meant to stand out and draw your attention in showrooms.
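To put some numbers on "darker blacks and whiter whites": HDR10 signals are encoded with the SMPTE ST 2084 Perceptual Quantizer (PQ) curve, which maps the signal to absolute luminance all the way from black up to 10,000 nits, whereas SDR is a relative gamma curve typically graded around 100 nits. Here's a minimal sketch of the PQ EOTF (the constants are from the ST 2084 spec):

```python
# Sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10.
# Decodes a normalised 0..1 signal value to absolute luminance in nits,
# illustrating how one HDR signal spans true black up to 10,000 nits.

m1 = 2610 / 16384          # 0.1593017578125
m2 = 2523 / 4096 * 128     # 78.84375
c1 = 3424 / 4096           # 0.8359375
c2 = 2413 / 4096 * 32      # 18.8515625
c3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal: float) -> float:
    """Decode a PQ-encoded signal (0..1) to luminance in cd/m^2 (nits)."""
    e = signal ** (1 / m2)
    y = (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)
    return 10000.0 * y

print(round(pq_eotf(1.0)))   # 10000 -- peak of the PQ range
print(pq_eotf(0.0))          # 0.0   -- true black
```

No consumer panel actually reaches 10,000 nits; the display tone-maps the signal down to its own capabilities, which is exactly why panel-to-panel behaviour varies so much.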
When it comes to colour accuracy, however, there is no subjectivity: accuracy is accuracy. The more precisely your TV tracks its black, white and colour levels, the better the picture accuracy will be, and the more natural the image will look.
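"Accuracy is accuracy" is literally how calibrators treat it: a measured colour is compared against its reference target and the error is reported as a single delta-E number, with values below roughly 3 generally taken as invisible to the eye. A minimal sketch using the simple CIE76 formula (modern calibration software usually reports CIEDE2000, a more involved refinement, and the patch values below are just made-up illustrations):

```python
import math

# Minimal sketch of the CIE76 colour-difference formula (delta E*ab):
# the straight-line (Euclidean) distance between two colours in CIELAB space.

def delta_e_76(lab1, lab2):
    """Return the CIE76 delta E between two (L*, a*, b*) colours."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

reference = (50.0, 0.0, 0.0)   # hypothetical target: a neutral grey patch
measured  = (50.0, 3.0, 4.0)   # hypothetical meter reading from the TV

print(delta_e_76(reference, measured))  # 5.0 -- a visible error worth calibrating out
```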
A benefit of modern TVs is presets. They're a God-given gift that most overlook. A simple way to have a good picture for any viewing situation is to create multiple HDR and SDR presets and switch between them.