In really ancient times, before the NTSC color TV standard existed, there was NTSC monochrome. It dictated a 60 Hz, 2:1 interlaced field rate and a resulting 30 Hz frame rate. The horizontal sweep rate was 15,750 Hz, which gave a maximum of 15,750/30 = 525 horizontal lines. Not all of these were available for display: some were consumed by the vertical blanking interval, while the cathode ray tube's electron beam retraced from the bottom of the screen back to the top. And if the interlace was less than perfect (a common fault with el-cheapo B/W TVs), you were lucky to see some 200 lines actually displayed on the picture tube.

Later, with the NTSC color standard, both the horizontal and vertical scan frequencies were shifted slightly to stay backward compatible with monochrome sets, for reasons I won't go into here (they're complicated). But before that change, the 60 Hz field rate matched the AC power line frequency, so TV techs could sync their scopes to the field rate using the LINE sync provision. I guess most o'scope manufacturers left it in because it is easy to do, and most of us old timers expect it to be there.
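For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python (the monochrome numbers are the ones above; the color-era figures come from the published subcarrier relationship, not from anything specific to my story):

```python
# Monochrome NTSC numbers from the text:
H_MONO = 15_750             # horizontal sweep rate, Hz
FRAME_MONO = 30             # frame rate, Hz (60 Hz fields, 2:1 interlace)
print(H_MONO / FRAME_MONO)  # 525.0 lines per frame

# Color NTSC rederived the scan rates from the 3.579545 MHz color subcarrier:
F_SC = 3_579_545               # color subcarrier, Hz
H_COLOR = F_SC * 2 / 455       # ~15,734.26 Hz horizontal rate
FIELD_COLOR = H_COLOR / 262.5  # ~59.94 Hz field rate; no longer exactly 60 Hz
print(H_COLOR, FIELD_COLOR)
```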
So, today, there is very little use for LINE synchronization other than to make line-based noise or "hum" "pop" out of the background. Signals not in sync with the line frequency will not appear stable on the oscilloscope trace, while signals that are in sync, whether desirable or not (and usually not), will present a stable display, making it easier to track down the interference with the o'scope probe. It usually turns out to be a leaky electrolytic capacitor causing excessive ripple in the power supply of the problem equipment.
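If you want to see why the line-locked stuff "pops," here's a toy numerical model of LINE triggering (the 1.7 kHz frequency is just an arbitrary stand-in for an unrelated signal, not anything from a real case):

```python
import numpy as np

# Toy model of LINE triggering: every "sweep" starts at the same phase
# of the 60 Hz mains cycle, i.e. sweep start times are whole multiples
# of 1/60 s. A 60 Hz hum component then lands identically on every
# sweep; an unrelated 1.7 kHz signal lands at a different phase each
# time and would smear on a real scope display.

LINE_HZ = 60.0
t = np.linspace(0, 0.005, 500)          # one 5 ms sweep window

def sweep(start):
    hum = 0.5 * np.sin(2 * np.pi * LINE_HZ * (start + t))   # line-locked ripple
    sig = 1.0 * np.sin(2 * np.pi * 1700.0 * (start + t))    # unrelated signal
    return hum, sig

hum1, sig1 = sweep(0 / LINE_HZ)         # first triggered sweep
hum2, sig2 = sweep(1 / LINE_HZ)         # next sweep, one mains cycle later

print(np.allclose(hum1, hum2))  # True:  hum overlays itself, looks stable
print(np.allclose(sig1, sig2))  # False: 1700/60 isn't an integer, so it drifts
```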