SGT Research

first posted 17/05/2020 23:46 UTC+1; updated 18/05/2020 17:16 to correct frame rate

Today we're looking into how the shine-get timer (sgt) deviates from a theoretical perfect timer.

data here (xlsx format; comparing the fastest runs of bianco 7)

Perfect Timer

the perfect timer is defined as follows:
set a base frame 0 with time exactly 0.
time for frame n := n / fps, rounded down to 2dp.
the frame rate is exactly fps = 30/1.001 ≈ 29.97003 per the NTSC standard.
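the definition above fits in a couple of lines. a minimal sketch – the integer form n * 1001 // 300 is my own rearrangement of floor(n / fps × 100), used so the round-down stays exact even where the true value lands right on a hundredth (e.g. frame 300 = 10.01 exactly):

```python
# perfect timer per the definition above: time for frame n = n / fps,
# rounded down to 2 decimal places, with fps = 30 / 1.001
def perfect_time(n):
    # n / fps * 100 = n * 1001 / 300; integer division gives an exact
    # round-down with no float rounding at the boundaries
    return (n * 1001 // 300) / 100

print(perfect_time(149))  # 4.97
print(perfect_time(479))  # 15.98
```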

look at the hundredths digit. this cycles every 3 frames.
theoretically, it should depend on the frame number n as follows:

n range digit cycle
0000–0099 0 3 6
0100–0199 0 3 7
0200–0299 0 4 7
0300–0399 1 4 7
0400–0499 1 4 8
0500–0599 1 5 8
0600–0699 2 5 8
0700–0799 2 5 9
0800–0899 2 6 9
0900–0999 3 6 9
1000–1099 0 3 6
1100–1199 0 3 7
 

since 300 / fps = 10.01 exactly, this pattern continues indefinitely – every 300 frames, each of the 3 digits goes up by one. hence, each cycle of 3 digits holds for 100 frames (~3.337s). how convenient :o
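for anyone who wants to check the table, a quick sketch that prints the 3-digit set for each 100-frame block (same integer trick, n * 1001 // 300, for an exact round-down):

```python
def hundredths(n):
    # hundredths digit of the perfect timer on frame n
    return (n * 1001 // 300) % 10

def cycle(start):
    # the 3-digit cycle in a 100-frame block, sorted like the table rows
    return tuple(sorted({hundredths(start + k) for k in range(3)}))

# prints the same rows as the table above, 0000-0099 through 1100-1199
for start in range(0, 1200, 100):
    print(f"{start:04d}-{start + 99:04d}", *cycle(start))
```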

notice that each change of cycle always increases one of the 3 digits by one – in fact always the digit that is 4 below the next highest (looping 10 around to 0). a useful invariant is that each digit is always 3 or 4 away from its neighbours on either side (cos of how the rounding works). these cycles will be essential to the analysis to follow.

Actual Timer

Let’s look at the hundredths digit cycles in 4 actual runs – guy/weeg/dd/trq bianco-7.

First we sync them up on the first frame the sgt is fully visible. It shows 4.96 or 4.97, so we match it to the nearest time given by the perfect timer, namely 4.97 (frame 149). Accordingly, all these runs re-time to 15.98 on the perfect timer (frame 479).

The sgt deviation column is the difference between the sgt visible in each run and the perfect timer. The sgt cycle deviation is the difference in cycle number between the run and the perfect timer; it's calculated as the sum of the 3 previous sgt deviations, which is equivalent because each step up in cycle number increases one of the 3 repeating digits by one.
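a hedged sketch of how I understand the cycle-deviation calculation – the wrap into [-5, 4] is my own addition to handle a digit rolling over from 9 to 0, the actual sheet may do it differently:

```python
def cycle_deviation(run_cycle, perfect_cycle):
    # sum of the per-digit differences; since each step up in cycle number
    # bumps exactly one digit by one, this counts steps between the cycles
    raw = sum(r - p for r, p in zip(sorted(run_cycle), sorted(perfect_cycle)))
    return (raw + 5) % 10 - 5  # wrap so a 9 -> 0 rollover still counts as +1

print(cycle_deviation((0, 3, 7), (0, 3, 6)))  # 1: run is one cycle ahead
print(cycle_deviation((0, 3, 6), (3, 6, 9)))  # also 1, via the 9 -> 0 rollover
```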

Observations

First of all, the sample means show a difference in average cycle deviation between the runs – guy < weeg < dd < trq. But looking at the full cycle deviation sample, we see that each run's deviation stays consistent throughout the run.

Some patterns:

Explanation

This is my theory based on the above and a lil computer intuition. The sgt actually takes timepoints in real time using the CPU clock, then subtracts the start timepoint from them.

When a timepoint is taken within a frame varies a bit depending on what’s going on, and is a bit sooner during an event presumably cos no physics is happening. This is actually demonstrable just by pausing the game – observe how your cycle decreases.

The time on the shine-get frame sometimes increases by only 0.02s over the previous frame. Theoretically this is also possible on a frame where a cutscene or text starts, but I haven't seen it there. I think it's much more likely when picking up a grounded shine (shadow mario) than otherwise, since I've not seen it in any other runs.

Other than this variation, the timing of timepoints being taken within a frame seems fairly consistent. There is random noise, possibly caused by preemptive multitasking (aka cpus are non-deterministic) or maybe just game engine lag.

There is one oversight however – what happens before the timer becomes visible? The start timepoint in particular is taken just as we come out of a load, which presumably has very inconsistent timing. The start timepoint doesn't necessarily have greater error than the others, but remember – the sgt is the difference between the current timepoint and the fixed start timepoint, so any error in the start becomes a systematic error in every visible sgt. That explains why the cycles these runs are on stay so consistent throughout.
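a toy model of this (all numbers hypothetical): if the start timepoint is taken late, every visible sgt shifts down by the same amount, which reads as a constant cycle offset rather than noise.

```python
import math

FRAME = 1.001 / 30  # real seconds per frame, i.e. 1 / fps

def visible_sgt(n, start_error):
    # sgt = current timepoint - start timepoint, floored to hundredths;
    # start_error models the start timepoint being taken late out of a load
    return math.floor((n * FRAME - start_error) * 100) / 100

print(visible_sgt(149, 0.000))  # 4.97 - on the perfect cycle
print(visible_sgt(149, 0.012))  # 4.95 - every reading shifted the same way
```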

I had a gander at some ricco 2 ILs where the start comes straight outta normal gameplay, and the cycle at the start looked a bit more consistent, all 0 3 6 or 0 3 7, with only Kaff getting a lucky 3 6 9 for a couple secs. This indicates the start has a bigger error than the end if there is a load, maybe twice as big from informal watching.

Conclusion (tl;dr)

The variation in sgt is mostly caused by the load you get and somewhat by random lag/non-determinism that hits you on your shine-get frame. I'd estimate the error from the load at the start to be ~2x bigger than the error at the end. It should be possible to estimate sgt well from the number of frames that passed and the 3 hundredths digits you can see as you start the level.
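the estimation idea could look like this sketch; frames counted from level start, and cycle_dev read off by comparing the 3 visible hundredths digits against the perfect cycle (both hypothetical inputs, not a tested retiming method):

```python
def estimate_sgt(frames, cycle_dev):
    # perfect time for the frame count, shifted by the observed cycle
    # deviation; each cycle step is worth 0.01s on the visible timer
    return (frames * 1001 // 300 + cycle_dev) / 100

print(estimate_sgt(479, 0))  # 15.98
print(estimate_sgt(479, 1))  # 15.99
```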

Further work for someone

How to contribute samples

  1. Download video using:
    • this for clips
    • this (website) or this (program) for VoDs
    • JDownloader or w/e for YouTube
  2. Open video in VirtualDub.
  3. Press F10 to turn off right video and right-click left video to select comfy zoom level.
  4. Frame-advance using the ← and → buttons. In excel (or a google sheet), copy the perfect timer columns, then try to match up the start frame. To automate data entry, fill out the data column you're making with =[cell 3 rows down], which will cause it to auto-repeat so u only have to type in numbers when the cycle changes.
  5. Put in the last digit of an sgt sample corresponding to each perfect timer time. Do this by comparing digits, not by counting frames – we don't care about frame pacing errors. If the frame is skipped, put in a -.
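if it helps with step 4, a sketch that prints the perfect timer columns (frame, time, hundredths digit) ready to paste into a sheet – 480 frames covers the bianco 7 runs above, adjust to taste:

```python
# print frame number, perfect time, and its hundredths digit, tab-separated
for n in range(480):
    t = n * 1001 // 300  # perfect time in hundredths, rounded down
    print(n, f"{t // 100}.{t % 100:02d}", t % 10, sep="\t")
```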