Pulling my hair out - Intel i3/i5 24fps issue

EnwezorN

I thought I'd cracked it. My HTPC spec list was ready to order, and then I heard about the 24fps issue that exists on i3/i5 processors!!

I like the idea of an i3/i5 processor as I get HD bitstream audio and HD video without the need for extra cards.

My personal requirements are for a low-power HTPC that copes well with heat, as it is going somewhere that can get pretty hot. I also want a lot of storage in a RAID 5 configuration.
With my chosen i3-550 and a Q57 motherboard, I could stick this in an Antec 902 case with 6 * 2TB drives and have a build that can take the heat and doesn't need a separate graphics or sound card.

But now I hear that Intel don't output at exactly 23.976fps, which by all accounts means you get a little micro-stutter every 40 seconds or so. There seem to be workarounds but nothing bullet-proof, and given that this problem has been around for ages, Intel appear to have done nothing.

So, I have 2 questions:

1. Is it really a problem during Blu-ray playback for those who have gone with i3/i5 Clarkdale?

2. What is my alternative processor, motherboard, other card (graphics/sound) combination that meets the following requirements?
- connect 6 SATA drives
- bitstream HD Audio
- HD video with no stutter
- ATX, micro ATX, mini-ITX form factor
- at least 1 PCIe and 1 PCI slot (for a satellite card & PCI/SATA adaptor for the Blu-ray drive)
- fairly low power

Any help appreciated! All I want is what my Panasonic Blu-ray player will do!
 
Whether the 23.976 vs 24.000Hz issue is a problem for you personally is, almost certainly, a personal matter.

I know some people are prepared to put up with - or indeed don't notice - some AV issues that make video and audio unwatchable to me. I had to replace my HDTV with a 24p-compatible model when I started watching Blu-rays because I found 3:2 pulldown content impossible to watch.

Similarly, I can't watch 50i content de-interlaced to 25p - but I know many others who can (or don't even notice the difference). I have to de-interlace 50i to 50p to retain the full motion.

Whether you spot the microstutter (at best a repeated frame every 40 seconds or so) will depend on how sensitive you are to this stuff. It is part of my job to spot this kind of thing at work - so I find it REALLY annoying - and I couldn't put up with a system that did it.

Others happily do.
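For what it's worth, the roughly-40-second figure falls straight out of the clock mismatch. A minimal sketch of the arithmetic (assuming the GPU outputs a flat 24.000Hz against the film rate of 24000/1001 fps):

```python
# Sketch of the drift arithmetic behind "a repeated frame every ~40s".
# Film on Blu-ray runs at 24000/1001 ≈ 23.976 fps; if the GPU outputs a
# flat 24.000 Hz, the renderer must repeat (or drop) a frame each time
# the two clocks drift apart by one whole frame.

source_fps = 24000 / 1001    # ~23.976 fps (NTSC-style film rate)
display_hz = 24.000          # what the GPU actually outputs (assumed here)

drift_per_second = abs(display_hz - source_fps)  # frames of drift per second
seconds_per_glitch = 1 / drift_per_second        # one repeated frame each...

print(f"{seconds_per_glitch:.1f} s between repeated frames")  # ≈ 41.7 s
```

So "every 40 seconds or so" is about right for a 24.000Hz output; a GPU that misses the rate by more would stutter more often.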

I think most people who find it a problem AND want HD audio bitstreamed are using ATI 5450 or similar cards for the GPU duties.
 
Thanks Stephen for that.
I think it will drive me a bit nuts. When I got my Blu-ray player and Onkyo 876, I noticed that the picture was a bit juddery on panning shots - drove me nuts. I didn't know why it was happening, and then I realised I hadn't set the player up for 24fps.
I think I'll have to go back to the drawing board with the motherboard now and build around a Radeon card doing the HD audio/video.

Cheers
 
Similarly, I can't watch 50i content de-interlaced to 25p - but I know many others who can (or don't even notice the difference). I have to de-interlace 50i to 50p to retain the full motion.

Can you elaborate on this a bit more?

Does a "standard" setup with 50Hz set in the graphics drivers output 50i content (e.g. live TV) at 25p (frame-doubled?) or 50p?
 
I have an i3 and only notice the issue very rarely. It's great otherwise. You could still get it and see, then add a passive ATI card if it bothers you.
 
Can you elaborate on this a bit more?

Does a "standard" setup with 50Hz set in the graphics drivers output 50i content (e.g. live TV) at 25p (frame-doubled?) or 50p?

The latter - ideally.

There are two distinct sources of 50i content.

1. 50i Native Interlaced camera material (News, Entertainment, some Soaps like EastEnders and Corrie). These capture 50 separate images per second and, through interlacing, send 50 separate fields each second. This means that each "frame" contains a pair of fields from two different points in time, 1/50th of a second apart. On interlaced displays they are displayed consecutively, not simultaneously. When you de-interlace them to a 50Hz progressive output you need to generate 50 progressive frames. There are lots of different ways of doing this, requiring differing amounts of processing power - some deliver better vertical resolution than others.

2. 25p Film and "film look" video material (Drama, most documentary etc.). These capture 25 separate images per second. They can then be sent as 50-field interlaced content - but the two fields in each frame are from the same point in time, and can thus be merged to create a 25p frame, which is then repeated to create a 50p sequence.

Detecting whether a source is 1. or 2. requires some motion detection techniques, and it is entirely possible for a video signal to contain both at the same time (think of a 50i caption rolling over content shot at 25p) - and it is in these complicated scenarios that clever de-interlacing, like vector-adaptive, pays dividends.
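To make the two cases concrete, here's a toy sketch (not any real de-interlacer - the field contents and line labels are made up for illustration). For film-in-50i the field pair weaves losslessly into one full frame; for native 50i each field has to become its own frame, here crudely line-doubled:

```python
# Toy illustration of the two 50i cases. A "field" is just a list of its
# line values: even fields carry lines 0,2,4..., odd fields lines 1,3,5...

def weave(even_field, odd_field):
    """Film-in-50i (case 2): both fields share one point in time, so
    interleaving them reconstructs the full frame with no resolution loss."""
    frame = []
    for e, o in zip(even_field, odd_field):
        frame += [e, o]
    return frame

def bob(field):
    """Native 50i (case 1): each field is a different instant, so each must
    become its own frame. Here we crudely line-double; real de-interlacers
    interpolate, or use motion/vector-adaptive methods to do better."""
    frame = []
    for line in field:
        frame += [line, line]
    return frame

even = ["L0", "L2", "L4"]   # lines captured in the even field
odd  = ["L1", "L3", "L5"]   # lines captured in the odd field

print(weave(even, odd))  # ['L0', 'L1', 'L2', 'L3', 'L4', 'L5'] - full frame
print(bob(even))         # ['L0', 'L0', 'L2', 'L2', 'L4', 'L4'] - half the
                         # vertical detail, but correct motion per field
```

The cadence-detection problem is deciding, per region of the picture, which of these two treatments to apply.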
 
I asked because, although I'm pretty sure my setup is doing VA de-interlacing properly, I'm still not 100% sure the picture I get is as good as a Sky or Virgin set-top box (not that I can compare them side by side, unfortunately). I can see - as can the wife - a slight motion blur on some content. It's very, very subtle though, and I may just be being too picky.

I would have just said it was low-bitrate SD looking naff on a 50" screen, but I see it with BBC HD content too - Top Gear in particular. As with any of this stuff though, it's incredibly difficult to describe.

Perhaps my HD4550 (soon to be replaced with an HD5450) just can't de-interlace content as well as the Kuro itself could, but it obviously never gets the chance as it's only ever fed a progressive signal.
 
Have you made sure that all noise reduction and edge enhancement is disabled in your Catalyst Control Centre? By default the noise reduction is enabled - and it smears motion horrendously - and the edge enhancement is horrid. You may also need to untick "Enforce Smooth Playback" or similar.

A good test for the de-interlacing is watching Sky News or BBC News and looking at the ticker at the bottom of frame, particularly if they are showing this over 25p rather than 50i content.

My ATI 4450 does a better de-interlace than my Sky HD box on SD content (though I have to have my Sky HD box set to Automatic, with 576i de-interlaced to 576p). I've never used the Sky box to scale, as it won't correctly handle 4:3 content in other modes.
 
Yep, all the "enhancements" are turned off. Ticker tapes on the news channels look smooth as silk. VA confirmed as (probably) on using the cheese slices test video.

It's probably just the way it is, and there'll be nothing I can do about it. Might even be something wrong with the Kuro, who knows :eek:
 
Well plasmas have their own artefacts - dithering, quantisation banding etc. - and the processing in some can be quite subtle.

They are great for some things - but have inherent processing required to actually work. Personally, whilst I love the motion rendition and black level of plasmas, I struggle with the other artefacts.

(An inevitable side effect of a technology that has no ability to vary the light emitted by individual pixels: plasmas achieve greyscale - arguably an illusion of it - by switching pixels on and off for varying amounts of time each frame.)
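The on/off trick can be sketched in a few lines. This is the crudest possible version (real panels use binary-weighted subfields and far cleverer dithering - the 8-subfield scheme here is just an illustration):

```python
# Toy sketch of plasma-style temporal greyscale: a cell is only ever fully
# on or off, so a grey level is emulated by the fraction of the frame's
# "subfields" during which the cell is lit (a duty cycle).

def subfield_pattern(grey_level, subfields=8):
    """Return an on/off pattern whose duty cycle approximates grey_level
    (0.0 to 1.0) over the given number of subfields per frame."""
    lit = round(grey_level * subfields)
    return [1] * lit + [0] * (subfields - lit)

pattern = subfield_pattern(0.5)       # 50% grey
print(pattern)                        # [1, 1, 1, 1, 0, 0, 0, 0]
print(sum(pattern) / len(pattern))    # 0.5 - the eye integrates this to grey
```

With only 8 subfields you can only hit 9 distinct levels, which is exactly why panels lean on dithering - and why you see the dithering and banding artefacts mentioned above.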
 
Digging up this old thread, would we still have the 24p sync problem with an i5 if we don't use DXVA hardware acceleration? The i5 should have enough CPU power to decode 1080p content right?

Nuno
 
Digging up this old thread, would we still have the 24p sync problem with an i5 if we don't use DXVA hardware acceleration? The i5 should have enough CPU power to decode 1080p content right?

Nuno

It's not the DXVA decoding causing the problems; it's the inability of the Intel GPU to actually output a 1920x1080 23.976Hz video signal with the correct timings. That won't change with where you do the video decoding - the output stage is downstream of it.

Sandy Bridge apparently gets very close to 23.976Hz if you disable UAC. Clarkdale is still a way off. Whether people actually see the issue - and, if they do, whether they find it annoying - is another matter.
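"Very close" matters a lot here, because the stutter interval is the reciprocal of the clock error. A quick sketch - the two output rates below are illustrative assumptions, not measured Clarkdale or Sandy Bridge figures:

```python
# The closer the GPU clock gets to 24000/1001 Hz, the rarer the repeated
# frame. Output rates below are assumed for illustration only.

source = 24000 / 1001  # ≈ 23.976 Hz, the Blu-ray film rate

for label, output_hz in [("way off   (24.000 Hz)", 24.000),
                         ("very close (23.975 Hz)", 23.975)]:
    interval = 1 / abs(output_hz - source)  # seconds between repeated frames
    print(f"{label}: one repeated frame every {interval:.0f} s")
```

With these assumed rates, the first case repeats a frame roughly every 42 seconds, the second only about every 16 minutes - which is why a near-miss clock can be effectively invisible while Clarkdale's isn't.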
 
I thought UAC was part of Win7's security controls - how does that play a part?
That is crazy; I wonder how the first person spotted that?
 
