Hi. I have a query that’s a little difficult to explain; hopefully you can follow me. I’m trying to do something with the Motion Blur layer. I reckon I’ve got a fair handle on how to use it, but something about how it works has caught my attention. As I understand it, the Motion Blur layer merges a set of, let’s call them, “subframes”, rendered into the past as far back as the aperture parameter specifies. (Aperture would probably be more correctly named “shutter speed”, but I digress.)
What I notice is that these subframes are merged together with linear decay; that is, the subframe at time aperture in the past is completely transparent, and the subframes between then and zero time (relative to the frame) are composited with linearly increasing opacity. When I look at the output of the layer, that does appear to be what is happening - the past motion of the frame fades out. In my opinion, it doesn’t look very believable.
It strikes me that this is not how film works at all. Film collects light for however long the shutter is open, but the “older” colours don’t “fade” away like they seem to here. For each frame of film, it doesn’t matter at what point during the exposure the light arrived - all times of arrival are treated equally. Look at this as an example.
Am I missing something? Perhaps there’s a subtlety in the moving shutters of film cameras, which I’m unaware of, that might justify the linear decay. But if not, I would think a constant opacity for each subframe would be more logical. The result would then simply be the average of all the subframes, rather than an (unnecessarily complex) linearly graded blend.
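To make the difference concrete, here’s a minimal sketch of the two weighting schemes as I understand them - this is purely illustrative Python with made-up function names, not the layer’s actual code:

```python
# Hypothetical sketch of two ways to merge motion-blur "subframes".
# Each sample is one pixel's value in a subframe, oldest first.

def blend_linear_decay(subframes):
    """What the layer seems to do: the oldest subframe gets weight
    near zero, the newest gets full weight, then normalize."""
    n = len(subframes)
    weights = [(i + 1) / n for i in range(n)]  # oldest -> newest
    return sum(w * s for w, s in zip(weights, subframes)) / sum(weights)

def blend_uniform(subframes):
    """What film does: every instant the shutter is open contributes
    equally, so the result is a plain average."""
    return sum(subframes) / len(subframes)

# A bright object entering a pixel halfway through the exposure:
samples = [0.0, 0.0, 1.0, 1.0]
print(blend_linear_decay(samples))  # 0.7 - newer samples dominate
print(blend_uniform(samples))       # 0.5 - all samples equal
```

With linear decay the pixel ends up brighter than 0.5 because the newer subframes dominate, which is exactly the “fading trail” I’m describing; the uniform blend gives the plain exposure average you’d get from film.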
Thoughts? In my opinion this is a bug, but I figured I’d better ask. If it’s not something to just fix, perhaps the kind of blur could become a new parameter on the Motion Blur layer?