Whether 4K is rare or not, just because a game renders at a 'lower' resolution doesn't mean the console is outputting to the TV at that lower resolution. There are games running at 4K, and with techniques like checkerboard (CB) rendering they can reconstruct a 4K image, and pushing that 4K image at 120Hz WILL need an HDMI 2.1 port to work. The Series X won't let you pick 120Hz if your TV won't accept it, and if you want to opt for a lower resolution instead, you'll have to go into the settings and turn down the output resolution to enable 120Hz - assuming the game even offers a 120Hz mode then... (rough sketch of the checkerboard idea below)
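To make the checkerboard point concrete, here's a minimal sketch of the basic idea in Python. It's not any console's or engine's actual pipeline (real implementations reproject with motion vectors, use MSAA tricks, etc.), just an illustration of how shading half the pixels per frame can still produce a full 4K image for the TV; all names here are made up for the example.

```python
import numpy as np

H, W = 2160, 3840  # full 4K output grid

def checkerboard_mask(frame_index):
    """True where this frame actually shades a pixel (alternates each frame)."""
    y, x = np.indices((H, W))
    return (x + y) % 2 == frame_index % 2

def reconstruct(current_shaded, previous_frame, frame_index):
    """Fill the unshaded half of the grid with pixels carried over from the last frame."""
    mask = checkerboard_mask(frame_index)
    return np.where(mask[..., None], current_shaded, previous_frame)

# Toy usage: random data stands in for the half of the pixels shaded this frame.
previous_frame = np.zeros((H, W, 3), dtype=np.float32)
current_shaded = np.random.rand(H, W, 3).astype(np.float32)
output = reconstruct(current_shaded, previous_frame, frame_index=0)
print(output.shape)  # (2160, 3840, 3) -> a full 4K image goes out over HDMI
```

The GPU only shades roughly half the pixels each frame, but the signal leaving the console is still a full 4K image, which is why the output resolution (and the HDMI bandwidth it needs) is a separate question from the internal render resolution.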
The only game I know of at 1440p/120 so far is Dirt 5, but there are a few at 4K and I wouldn't be surprised if we see more as the generation goes on. Devs will get better at optimising, engines will get better and more streamlined, and they'll use techniques like CB rendering, Variable Rate Shading, mesh shading, maybe even machine-learning upscaling (like DLSS), etc. Look at Battlefield 4 and the way later Frostbite engine games managed better visuals, higher resolutions and more consistent frame rates on a base XB1...
An Xbox One S can output at 4K regardless of what resolution the game is running at. You can have the console upscale to 4K if you want and your TV will be 'receiving' a 4K image. If the output is set to 4K, the console is sending a 4K signal no matter what the game renders at internally, and to get that 4K signal at 120fps you will need an HDMI 2.1 port.
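The bandwidth arithmetic behind that is straightforward. This is a rough back-of-the-envelope sketch, not an official spec calculation: it ignores blanking intervals and compression (DSC), so the real link requirement is a bit higher than shown, and the usable data rates are the commonly quoted ones (HDMI 2.0 at 18 Gbps raw / ~14.4 Gbps usable, HDMI 2.1 at 48 Gbps raw / ~42.6 Gbps usable).

```python
def video_bandwidth_gbps(width, height, fps, bits_per_pixel):
    # Raw pixel data rate, ignoring blanking intervals and any compression.
    return width * height * fps * bits_per_pixel / 1e9

HDMI_2_0_USABLE = 14.4  # Gbps (18 Gbps raw, 8b/10b encoding)
HDMI_2_1_USABLE = 42.6  # Gbps (48 Gbps raw, 16b/18b encoding)

for label, fps in [("4K60", 60), ("4K120", 120)]:
    needed = video_bandwidth_gbps(3840, 2160, fps, bits_per_pixel=24)  # 8-bit RGB
    print(f"{label}: ~{needed:.1f} Gbps | "
          f"fits HDMI 2.0: {needed <= HDMI_2_0_USABLE} | "
          f"fits HDMI 2.1: {needed <= HDMI_2_1_USABLE}")
```

Even at plain 8-bit colour, 4K120 works out to roughly 24 Gbps of pixel data, well beyond what HDMI 2.0 can carry but comfortably within HDMI 2.1; add 10-bit HDR and the gap only grows. That's why 4K60 is fine on the older port but 4K at 120fps isn't.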
You also need a display that can actually accept a 120Hz input and display it at 120Hz, not one using the fake motion-interpolation tricks marketed as '240', '360' or '480Hz'.
I really don't see why you'd settle for HDMI 2.0 and miss out on the features HDMI 2.1 offers when there are TVs on the market with HDMI 2.1. If you plan to upgrade again in a few years when there's more choice, maybe, but most people buy a TV for longer than that. It seems odd to me to want an HDMI 2.0 TV for 'next gen' gaming when BOTH consoles come with HDMI 2.1 and HDMI 2.1 features that games will utilise...