I noticed the same thing with my ST60; however, if they were aiming for BT.1886 they did a fairly rubbish job.
When calibrating the ST60 I selected the 2.6 gamma preset as this gave the closest average gamma to 2.4, but it only really intersected the 2.4 curve at around 50 IRE. Gamma at 10 IRE was more like 1.8, at 80 IRE about 2.7, and at 90 IRE it actually dropped quite sharply to around 2.2.
While pattern sizes and ABL can have an effect on the gamma curve, I got fairly similar readings using a mixture of window patterns from 5%-12% as well as APL patterns.
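For anyone wondering where those per-point gamma numbers come from, each one is just the gamma implied by a single greyscale reading against peak white. Roughly, in Python (the 1.9 cd/m2 reading below is purely illustrative, not one of my actual measurements):

import math

def point_gamma(stimulus, y_measured, y_white):
    """Gamma implied by one greyscale reading: Y = Y_white * stimulus ** gamma."""
    return math.log(y_measured / y_white) / math.log(stimulus)

# e.g. a 10% window measuring ~1.9 cd/m2 against a 120 cd/m2 white works out to ~1.8
print(round(point_gamma(0.10, 1.9, 120.0), 2))   # 1.8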
A proper BT.1886 curve follows around 2.3 for most of its length; it's only below around 20 IRE that it drops fairly sharply towards 2.1. However, that depends on the black level of the display, as BT.1886 is calculated from the display's black level. Something like the ZT would very nearly be a straight 2.4 because of its extremely low black level.
These are the gamma targets for BT.1886 on my ST60, assuming a white level of 120 cd/m2 and a black level of 0.008 cd/m2 (there's a rough Python sketch after the table showing how they're worked out). The ST60 has good black levels, so the drop only becomes sharp below 10 IRE; a screen without such good black levels would curve down much earlier. This is why I prefer BT.1886: it takes into account the limitations of the individual display rather than a one-size-fits-all approach.
5% 2.16
10% 2.24
15% 2.28
20% 2.30
25% 2.31
30% 2.32
35% 2.32
40% 2.33
45% 2.33
50% 2.34
55% 2.34
60% 2.34
65% 2.35
70% 2.35
75% 2.35
80% 2.35
85% 2.35
90% 2.35
95% 2.36
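For anyone who wants to generate their own targets, this is roughly how the numbers above fall out of the BT.1886 formula. A quick Python sketch (the function names are just mine; plug in your own white and black levels):

import math

def bt1886_luminance(v, lw=120.0, lb=0.008, gamma=2.4):
    """BT.1886 EOTF: luminance in cd/m2 for a normalised signal level v (0-1)."""
    lw_g = lw ** (1 / gamma)
    lb_g = lb ** (1 / gamma)
    a = (lw_g - lb_g) ** gamma        # gain
    b = lb_g / (lw_g - lb_g)          # black lift, driven by the display's black level
    return a * max(v + b, 0.0) ** gamma

def effective_gamma(v, lw=120.0, lb=0.008):
    """Point gamma relative to peak white - the number a calibration report shows."""
    return math.log(bt1886_luminance(v, lw, lb) / lw) / math.log(v)

# 5% to 95% targets for Lw = 120 cd/m2, Lb = 0.008 cd/m2 (matches the table above)
for stim in range(5, 100, 5):
    print(f"{stim}%  {effective_gamma(stim / 100):.2f}")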
On my LG LED a straight 2.4 would never work because of the ridiculously bad black levels, but even 2.2 crushed dark detail in high-contrast scenes due to the display's dimming technology. BT.1886 helped with the issue, as it gave me a gamma of [email protected] and [email protected], and that brought out the dark details while still preserving the contrast in brighter scenes.
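Plugging a poorer black level into the same sketch above shows why. With 0.08 cd/m2 (just an assumed figure for a weak LED panel, not my measured black level) the low-end targets drop well below the plasma's:

print(round(effective_gamma(0.10, lw=120.0, lb=0.08), 2))   # ~2.03 vs 2.24 on the ST60
print(round(effective_gamma(0.50, lw=120.0, lb=0.08), 2))   # ~2.24 vs 2.34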