I'll try and keep it simple.
HDR10 is the baseline standard for HDR. This TV does accept an HDR10 input and will show PlayStation and Xbox games in HDR. UHD Blu-ray and streaming also use HDR10 as the base layer in their standards. Any TV labelled HDR accepts HDR10.
HLG is the broadcast standard for HDR (developed by the BBC & NHK). It combines the old SDR curve and the new HDR range in a single backwards-compatible signal, so broadcasters don't have to transmit two versions. BBC iPlayer uses this. (It stands for Hybrid Log-Gamma: gamma being the old system, log being the new.)
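Just to make the "hybrid" part concrete, here's a quick Python sketch of the actual HLG curve from ITU-R BT.2100 (constants straight from the spec): the bottom chunk of the signal is a plain square-root/gamma-style curve like SDR, and the top chunk switches to a log curve for the HDR highlights.

```python
import math

# HLG OETF per ITU-R BT.2100: scene light E in [0, 1] -> signal E' in [0, 1].
# Constants as defined in the spec:
a = 0.17883277
b = 1 - 4 * a                   # 0.28466892
c = 0.5 - a * math.log(4 * a)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Hybrid Log-Gamma: square-root ('gamma') part for the dark/SDR levels,
    logarithmic part for the bright/HDR levels."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # the old-school gamma-like half
    return a * math.log(12 * e - b) + c  # the new log half for highlights

# The two halves meet smoothly at E = 1/12:
print(hlg_oetf(1 / 12))  # 0.5  -> the lower half of the signal is SDR-like
print(hlg_oetf(1.0))     # ~1.0 -> full signal at peak scene light
```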
HDR10+ is an enhanced version of HDR10 with 'dynamic metadata', which means it can adapt the overall range on a scene-by-scene basis. (I think 'HDR+' is just another name for this, but I could be wrong about that.)
Why is dynamic metadata desirable? The whole point of HDR is to increase detail in dark and bright scenes to better match what you see in real life. There is an Ultra HD Premium certification for screens with an above-average dynamic range, but those sets are more expensive; most HDR sets still have a limited dynamic range compared with the source material. So if the TV has a dynamic range of 500 but the source runs 0 to 1000, the gamma curve without metadata has to be set once for the whole movie, and you lose detail because every two source levels get squeezed into one (it would actually clip in a non-linear way, but I'm trying to keep it simple!). With dynamic metadata, a dark scene can map the dark bits into the TV's range (showing 0 to 500), and a bright scene can slide the overall brightness way up, losing dark detail but gaining lots of bright detail (showing 500 to 1000).
This isn't really how it works, but it's the best way I've found for people to visualise it. It's like how your eyes get used to a dark room.
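If it helps, here's the same idea as a toy Python sketch. This is purely illustrative with made-up numbers: real tone mapping uses smooth non-linear curves, not the hard windowing shown here.

```python
# Toy illustration only -- real tone mapping is a smooth non-linear curve,
# not a hard window. Brightness values are made-up "nits".
TV_RANGE = 500      # what the panel can actually show
SOURCE_MAX = 1000   # what the movie was mastered to

def static_map(nits: float) -> float:
    """One curve for the whole movie: squeeze 0-1000 into 0-500 (2:1),
    so adjacent source levels get merged and detail is lost everywhere."""
    return nits * TV_RANGE / SOURCE_MAX

def dynamic_map(nits: float, scene_low: float, scene_high: float) -> float:
    """Per-scene metadata: map just this scene's range onto the panel.
    A dark scene keeps shadow detail; a bright scene keeps highlight detail."""
    span = scene_high - scene_low
    clipped = min(max(nits, scene_low), scene_high)  # outside the window clips
    return (clipped - scene_low) * TV_RANGE / span

# Dark scene (source only uses 0-500): static halves everything,
# dynamic shows it 1:1 with nothing lost.
print(static_map(400))              # 200.0 -- shadow detail crushed
print(dynamic_map(400, 0, 500))     # 400.0 -- shown as mastered

# Bright scene (source uses 500-1000): dynamic slides the window up instead.
print(static_map(900))              # 450.0
print(dynamic_map(900, 500, 1000))  # 400.0 -- highlight detail preserved
```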
Dynamic metadata is desirable in theory, but the results vary wildly out in the real world.
TL;DR: HDR10 and HLG are the ones you need for games, movies, and broadcast TV right now; anything else is a bonus.