Doesn't less blur make motion appear less smooth? E.g. a film sequence shot with a very short shutter-open time is going to look more 'choppy' (less smooth motion) than one with a longer shutter-open time (more motion blur)?
Less blur does, but the key is understanding what causes the blur. There is a set amount of time (a limitation of the technology) that an LCD takes to display an image. It doesn't matter how many frames per second the panel of the TV draws (Hz); the TV will always be limited by how long it takes to even display each image in the first place.
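To put a rough number on that point, here is a small sketch (my own illustration, not from this thread): on a sample-and-hold display, the perceived smear is roughly the object's speed multiplied by how long each frame stays lit (the persistence), which is why a slow panel blurs regardless of its advertised refresh rate.

```python
# Rough model: perceived blur trail on a sample-and-hold display is about
# object speed x persistence (how long each frame stays lit). The numbers
# below are illustrative, not measurements of any real TV.

def blur_width_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Approximate width of the blur trail, in pixels, for a moving object."""
    return speed_px_per_s * (persistence_ms / 1000.0)

# A 960 px/s pan on a full-persistence 60Hz LCD (each frame lit ~16.7 ms):
print(blur_width_px(960, 16.7))  # ~16 px of smear
# The same pan on a short-persistence, CRT-like impulse display (~2 ms):
print(blur_width_px(960, 2.0))   # ~2 px of smear
```

The same pan looks far sharper on the impulse display even though both examples assume the same frame rate, which matches the post's point that display speed, not Hz alone, drives the blur.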
What do you mean by "compared to just a led"? Do you mean compared to OLED?
The difference is in the technology: an LED or OLED is going to be able to light up and change a lot faster than an LCD with a backlight. You need to wait for the liquid crystals to respond, and this process takes a lot longer than just lighting up some phosphor or an OLED. That's why motion is so much better on CRT/Plasma/OLED compared to LCD, and why a lot of people just can't go back to LCD TVs after having a Plasma, for example.
With an LCD TV (with LED backlight) that has its local dimming (or similar tech) disabled, isn't it going to just keep the same backlight level on throughout (no time needed for it to adjust)? So then you've only got the liquid crystal response time (and picture/audio processing delay).
By no means am I technical enough to understand the details, but you can imagine that when you have LEDs behind a screen, rather than an individual LED per pixel (OLED) or phosphor (Plasma/CRT), it is going to take some time to even work out how bright each area has to be. The backlight may be at a constant level, but how does it decide how to light up a particular area of the screen when the ratio of LEDs to LCD pixels is so low? Does it show the whole picture at full brightness because just one part of the picture is bright while another is dark? These decisions, this processing, etc. take time. On an OLED it's on a per-pixel basis, so the whole process happens a hell of a lot faster.
This link explains everything a lot better than I do:
Motion Blur of TVs
Surely OLED and LCD are the ones that can potentially have naturally smoother motion than plasma, due to plasmas generally maxing out at 60Hz. Though most current film content will be 24p and, without interpolation, won't really get smoother motion (ignoring the blur produced by the panel or already in the source due to the shutter). True full-UHD-resolution 100/120Hz TVs might be out in 2017 (or maybe a year or two later?). Also, if a particular plasma always output at 60Hz (not a 24Hz multiple; I think some did use one, e.g. 48Hz), surely the LCDs/OLEDs that have a higher refresh rate will output without (or with less of) the pull-down judder of 24→60Hz.
Judder and motion blur are two different things. LCDs and plasmas, even ones with native 60Hz refresh rates, can display 24Hz content without judder. It's only older models that struggled and had to perform pulldown to match the 60Hz refresh rate.
I don't know if any of what I said makes sense, but I just know that a lot of people don't seem to understand the differences. They look for higher-Hz TVs thinking they will exhibit less blur. Yes, they may have less blur than the same TV with a slower refresh rate, but this doesn't mean there isn't a 60Hz or 50Hz TV out there with a faster response time that does a better job. If we could feed the TVs a signal that matches what the panels can do, this would change, but sadly we can't. I hope the new consoles can, and that future HDMI specs will come to accept 120Hz+.
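The point that refresh rate alone doesn't decide the winner can be illustrated with a deliberately simplified model (my own sketch, with made-up example numbers): treat the visible smear as roughly the frame hold time plus the pixel response time. A fast 60Hz panel can then come out ahead of a sluggish 120Hz one.

```python
# Very rough model (illustrative only): visible smear ~= frame persistence
# (1000 / refresh rate, in ms) plus the panel's pixel response time.
# The response-time figures below are hypothetical examples, not real TVs.

def total_blur_ms(refresh_hz: float, response_ms: float) -> float:
    """Approximate total smear time in milliseconds."""
    return 1000.0 / refresh_hz + response_ms

print(total_blur_ms(120, 12.0))  # slow 120Hz panel: ~20.3 ms of smear
print(total_blur_ms(60, 2.0))    # fast 60Hz panel:  ~18.7 ms of smear
```

Under these example numbers the 60Hz panel with the quicker response actually smears less, which is exactly the poster's argument against shopping on Hz alone.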