More detail in my link:
Guide: Motion explanation and Best Buys for motion 2017 Edition
Sony don't make it easy in the menus to understand how Motionflow works.
Motion Judder aka "Smoothness"
Film Mode controls the detection that decides whether motion interpolation is used. The TV scans the content to work out how it needs to process the motion. The higher the Film Mode setting, the harder the TV tries to detect changes in frame rate and the more likely it is to spot them, but also the more mistakes it will make when analysing the content.
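As a rough illustration of the kind of detection Film Mode performs (a hypothetical sketch, not Sony's actual algorithm; the function name and the 50Hz input rate are my own assumptions), a TV receiving frames at a fixed rate can count duplicate frames to infer the original frame rate:

```python
# Hypothetical cadence-detection sketch: frames arrive at a fixed input
# rate, and the TV counts how often a genuinely new image appears to
# guess the source frame rate. Illustrative only, not real firmware.

def infer_source_fps(frames, input_fps=50):
    """Guess the original frame rate from runs of duplicate frames."""
    if len(frames) < 2:
        return input_fps
    unique = 1
    for prev, cur in zip(frames, frames[1:]):
        if cur != prev:              # a genuinely new image arrived
            unique += 1
    duration = len(frames) / input_fps   # seconds of video analysed
    return round(unique / duration)

# 25fps content in a 50Hz stream: every image is sent twice.
stream_25p = [f for f in range(25) for _ in range(2)]   # 1 second
print(infer_source_fps(stream_25p))   # -> 25
print(infer_source_fps(list(range(50))))   # native 50fps -> 50
```

This is why the detection breaks when the duplicate pattern suddenly changes mid-stream, as happens with replays and studio cuts described below.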
If you run with Film Mode on High, the TV is constantly working out in the background whether it needs to interpolate the image or not. It does this by scanning the input signal and determining the original frame rate. This is usually handled a lot better by the built-in tuner, or by apps built into the TV, than by an external source. There are situations, though, where you get motion hiccups that are not the fault of the TV and will occur with every model:
1. The TV broadcaster is required to send information to the tuner when it is broadcasting. This information includes details such as the resolution of the broadcast, the frame rate and the scan type. The problem is that the broadcaster can often change the type of video being broadcast without updating this info.
To use a real-life example, this happens on every Match of the Day episode and probably every live football game too.
When a replay is shown, the frame rate of the image changes and it throws off the TV's Film Mode detection. Suddenly the TV goes from thinking you are watching a 50Hz football match to perhaps a 24Hz film. It gets confused because the information being fed to it says we are watching 50Hz football, yet the image itself is showing something at a much lower frame rate.
Again, in MOTD, when they return to the studio the broadcaster changes completely from an interlaced 1080i signal at 50Hz to a progressive 1080p one at 25Hz, so once again the TV has to detect the change and adjust its frame rate accordingly.
Another example is when a TV channel cuts to an advert break: most advert breaks are progressive, whilst most TV content is interlaced.
These changes in source motion do not help the TV, and there is nothing the TV manufacturer can do about it. If the broadcaster stuck with the same signal throughout the programme it wouldn't be a problem, or if they sent details about each change every time one happened, such as when they go back to the studio or show a replay, it wouldn't be a problem either. But for whatever reason, probably production cost and the demands of live TV, they can't.
2. When using an external source.
This is related to point number one, but can be controlled yet often isn't.
Not only does a broadcaster sometimes send the wrong information, the tuner box that you use to receive the video from the broadcaster and send it to the TV often interferes as well. Examples of this would be Sky Q, Sky HD, Virgin TiVo/V6 boxes, or any external tuner box for that matter.
Most of these boxes output only a fixed signal. Instead of passing the signal they receive to the TV exactly as broadcast, they process it themselves. This poses a problem for the TV: instead of just reading the information it receives about the video, it has to scan the video even harder to find discrepancies and correct them. The majority of boxes in Europe will send a fixed signal of either 2160p@50Hz for UHD or 1080p@50Hz, depending on whether you are connected to a UHD TV or not.
They will keep this signal fixed no matter what content is shown on the TV channel. You can be watching a TV programme that uses 576i resolution at 25Hz and the box will deinterlace and upscale it to the fixed resolution, scan type and frame rate.
You could be watching a TV programme or film that is mastered at 1080p@24Hz and the box will still output everything at the fixed settings.
Because the TV works to display the picture without judder, it uses Film Mode to detect when the signal it receives doesn't match the content actually being displayed. It then adjusts its panel refresh rate accordingly to compensate.
Sadly this is hard to do when, instead of passing on the signal the broadcaster sends, these boxes only ever output a fixed signal that doesn't change. That makes it even harder for the TV to work out which content is at which frame rate.
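To see why a fixed-output box makes life hard, here is a rough sketch (my own illustration, not any real box's firmware) of how source frames have to be repeated to fill a fixed output rate. When the rates divide evenly the repeats are uniform; when they don't, the repeat counts become uneven, and that uneven cadence is exactly what the TV's Film Mode then has to detect and undo:

```python
# Illustrative sketch: a fixed-output box repeats each source frame
# until the output clock catches up. 24fps into a fixed 50Hz output
# cannot repeat evenly, so some frames are shown twice and some three
# times, which is visible as judder.

def repeat_pattern(source_fps, output_fps, seconds=1):
    """How many times each source frame is shown at the fixed rate."""
    counts = []
    emitted = 0
    for i in range(source_fps * seconds):
        # hold frame i until output time catches up with source time
        target = round((i + 1) * output_fps / source_fps)
        counts.append(target - emitted)
        emitted = target
    return counts

print(repeat_pattern(25, 50))  # all 2s: even cadence, looks smooth
print(repeat_pattern(24, 50))  # mix of 2s and 3s: uneven, judders
```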
It is for this reason that this point usually causes more of an issue than point 1. Historically there have been known problems between a source and a TV's detection that were fixed in firmware updates. A good example was Sony's 2015 TVs and Sky HD: for whatever reason, Film Mode was incorrectly determining the frame rate from the Sky box; the way the Sky box processed its information and sent it at a fixed rate did not work with the TV's detection system.
Sony later were able to fix it by testing the problem whilst using the Sky HD box and releasing a firmware update.
Obviously there is nothing Sony can really do about these kinds of problems short of testing every single tuner box before they write their Film Mode software. That would be hard, as in the UK alone there are probably more than 10 different tuner boxes in use, and Sony sell to countries all around the world which no doubt use different TV standards and different tuner boxes.
This is a problem that exists on all TVs. Some may fare better with your source than others, but it's not avoidable unless by luck you find a TV that doesn't have these errors with your particular source.
Whilst point 2 is at least avoidable (for example via firmware fixes, or by using the TV's built-in tuner), the judder caused by point 1 isn't, and this will stay that way until broadcasters improve at sending the correct info, or some kind of technology in the broadcast chain aids this (such as maybe the upcoming HFR and HLG).
So to summarise how Film mode and motion interpolation works:
Higher Film Mode = more active detection of frame rate changes, so less judder overall, but also more mistakes where changes are detected that aren't there (e.g. slow-motion replays) and therefore more processing errors.
Higher Smoothness = more frames inserted when Film Mode decides they are needed, and therefore a smoother image, but at the expense of the Soap Opera Effect and motion artefacts.
Motion Blur aka "Clearness"
This is just the black frame insertion, or BFI, that we already mentioned. It works on all content, regardless of Film Mode detection, by quickly inserting blank black frames between the real frames, strobing the TV's backlight very fast. It is this strobing that causes the flicker, and it is the blackness of each inserted frame that darkens the picture. By inserting these frames quickly it tricks our eyes into perceiving less blur. Using BFI can also make it appear as though there are dark trails behind bright colours, as your eyes pick up the blackness in each frame. For most content though it's not noticeable, and some people are more sensitive to it than others.
A couple of other notes regarding the XE9005 in particular.
If you use BFI, you must also increase the backlight and maybe also the brightness of the TV. Do not just enable BFI and assume it isn't worth having because it darkens the image too much; adjust the image afterwards to reach a similar kind of brightness to before you enabled BFI.
If you have the backlight and brightness on a low setting, the TV darkens the image by flashing the backlight on and off. This means that if you run the brightness lower you will get more noticeable flicker. If you are sensitive to flicker, and especially when using BFI as noted above, do not have the backlight at a low level.
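The backlight advice above is really just duty-cycle arithmetic. Here is a minimal sketch (the numbers and function name are illustrative, not Sony's actual values) of why BFI darkens the image and why raising the backlight compensates:

```python
# Rough arithmetic: average light output scales with the fraction of
# each frame the backlight is actually on (the duty cycle). BFI cuts
# the duty cycle, so the backlight must be raised to compensate.

def perceived_brightness(backlight_pct, duty_cycle):
    """Average light output as a % of the no-BFI maximum."""
    return backlight_pct * duty_cycle

print(perceived_brightness(50, 1.0))   # no BFI: 50.0
print(perceived_brightness(50, 0.5))   # BFI half-on: 25.0, much dimmer
print(perceived_brightness(100, 0.5))  # raised backlight: back to 50.0
```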
I have rambled on for a while, I'll try to stop now.
With all this aside, what can you do about it?
If you are not happy with motion blur on the TV even with BFI enabled, then I would really recommend you look at OLEDs, as they natively have almost zero motion blur. You might also find that Samsung's implementation of BFI is better than Sony's, and therefore using BFI on a Samsung LCD is likely to work better.
In fact I think even Panasonic's BFI is better than Sony's.
Judder-wise, you are going to have this on every TV. The best thing you can do is demo the TV you choose with your own content before you buy it, to make sure you are happy. Everyone's perception of judder is different, and everyone likely has a different judder problem based on their own source too, so when demoing be sure to use material that is similar to what you watch at home, if not exactly the same.
I have to ask though, how are you testing this judder and blur? Are you using an external TV tuner like Sky or Virgin? Have you compared how that performs compared to the built in tuner of the television?
Hope this helps anyway. I know it's ridiculously complicated, but I can't think of a much simpler way to explain it.