There is a fairly annoying bug (?) in ExtendScript that I am hoping to rectify by digging into the C++ SDK and hopefully solving with a hybrid extension.
Here is the bug:
I have manually set my frame count to start from 0 in my project, per my facility's specs. This keeps our timecode consistent with the client's. However, while we start the count from 0, we like our master comp to start at frame 1.
So the comp is set to 1.
For reference, this comp is set to 29.97 fps.
If I query currentComp.displayStartTime I get:
0.03336670003337
One frame in, in seconds.
If I query currentComp.frameDuration I get (drum roll):
0.03336670003337
One frame's length, in seconds.
So to set it programmatically in the future, the code should be:
currentComp.displayStartTime = currentComp.frameDuration;
If I query displayStartTime after setting it, I get:
0.03336669877172
So it seems the JavaScript doubles are being converted to single-precision floats somewhere on the way into the AE engine.
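For reference, the whole repro is just this round trip (I'm assuming currentComp is the active comp):

var currentComp = app.project.activeItem; // assuming the comp is selected
$.writeln(currentComp.frameDuration); // 0.03336670003337
currentComp.displayStartTime = currentComp.frameDuration;
$.writeln(currentComp.displayStartTime); // 0.03336669877172 -- precision lost on the way in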
Why this is a gnarly bug: depending on the frame rate, running this sometimes sets the start time to 0 instead of 1 (29.97 goes to 0, 23.976 goes to 1).
I have seen fixes like adding (1/1000) to the value to nudge it just over the edge. While this gets the numbers I want, it seriously breaks rendering: if I submit to Royal Render it renders from frame 0 (a grey frame), then tries (and usually crashes) on the remaining frames except the last, all the while struggling to split the comp between machines. So I actually do need the setting to be as accurate as the getting, with NO fudging. (Why can't we script with INTEGER frames, ffs!?)
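For what it's worth, reading frames back out isn't the problem; since the error is tiny, rounding recovers the integer frame just fine. It's the setting that has to be exact:

var currentComp = app.project.activeItem;
// Rounding still recovers an integer frame number on the read side --
// but this doesn't help when *setting* displayStartTime.
var startFrame = Math.round(currentComp.displayStartTime / currentComp.frameDuration);
$.writeln(startFrame); // 1 if the start time survived, 0 if it got snapped back to zero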
I would love to fix this without C++ but have had no luck.
Why C++ seems to be the answer:
It seems the floats are quantized to certain step values on the C++ side of things (see time_scale and time_step).
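By pre-quantizing I mean something like the sketch below. This is pure speculation on my part, and rationalFrameDuration is my own helper, not anything in the AE API; the idea is to hand AE a value built from the exact NTSC rationals (1001/30000 for 29.97, 1001/24000 for 23.976) instead of whatever float frameDuration reports, and see whether it survives the round trip:

// Speculative: build the target time from an exact rational and check
// whether the engine hands back the same value we set.
function rationalFrameDuration(frameRate) {
    if (Math.abs(frameRate - 29.97) < 0.001) return 1001 / 30000;
    if (Math.abs(frameRate - 23.976) < 0.001) return 1001 / 24000;
    return 1 / frameRate; // integer rates are already exact enough
}

var currentComp = app.project.activeItem;
var target = 1 * rationalFrameDuration(currentComp.frameRate); // frame 1, in seconds
currentComp.displayStartTime = target;
$.writeln(currentComp.displayStartTime - target); // non-zero means it still got quantized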
Before I commit to having no life and woodshedding some C++, does anyone have a fix for this? Can I pre-quantize the floats in AE and set that? Can I X-Post to the scripting forum?