Specifically, either at the UI rendering rate or the FPS cap, whichever is lower, or simply at the UI rendering rate if there is no FPS cap.
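A minimal sketch of that rate selection, with made-up names (uiRate, fpsCap); this is an illustration, not the actual code:

    #include <algorithm>
    #include <optional>

    // Illustration only; the names are made up and don't match the real code.
    float EffectiveRate(float uiRate, std::optional<float> fpsCap)
    {
        // No cap: just follow the UI rendering rate.
        // Cap present: take whichever of the two rates is lower.
        return fpsCap ? std::min(uiRate, *fpsCap) : uiRate;
    }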
Also, always render the UI at the screen refresh rate; importantly, even when the simulation is unpaused. Previously, unpausing the simulation would lower the UI rendering rate to the FPS cap to keep rendering from getting too far out of sync with the simulation; this is no longer possible.
Also stop displaying 0 FPS when the simulation is paused. This now closely matches the way FPS used to be displayed, i.e. display simulation tick rate when the simulation is unpaused, display renderer tick rate when it's paused, and don't display SDL tick rate anywhere.
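Roughly the display rule described above, as a sketch with hypothetical names rather than the actual code:

    // Hypothetical names; illustrates the rule, not the real implementation.
    float DisplayedFps(bool simPaused, float simTickRate, float rendererTickRate)
    {
        // Unpaused: show how fast the simulation ticks.
        // Paused: show how fast the renderer ticks, never a hardcoded 0.
        return simPaused ? rendererTickRate : simTickRate;
    }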
Client wouldn't get ticked if the simulation was paused >_> Also fix some old bugs that allowed weird delay values to be derived when the tick schedule was given weird target FPS values.
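The kind of guard this amounts to, as a hedged sketch; the clamp bounds and names are assumptions, not the actual tick schedule code:

    #include <algorithm>
    #include <chrono>

    // Assumed bounds and names; not the actual tick schedule code.
    std::chrono::milliseconds DelayFor(float targetFps)
    {
        // Clamp the target so zero, negative, or absurd values can't yield
        // a division by zero or a nonsensical delay.
        auto fps = std::clamp(targetFps, 1.0f, 1000.0f);
        return std::chrono::milliseconds(int(1000.0f / fps));
    }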
It's really annoying when a stable workflow fails because nextVersion didn't get updated, even though we actively seek out this variable when planning a new version between stables.
And have ui.fpsCap set only GameView's FPS cap. This is the closest we'll get to the FPS cap being true to its name (i.e. actually controlling frames per second, rather than ticks per second) with this iteration of the user interface. Also disable SRT if the sim is paused. Together, these changes fix some old problems, such as non-sim user interfaces burning CPU for no reason when the FPS cap is removed.
This was ill-designed: "vsync" should be a cap on the draw rate, if anything. We can't currently think of a way to allow both vsync and a different, arbitrary FPS limit without banishing the simulation to another thread, so this gets shelved for now.
So now user interface ticks (event processing, animation, etc.) happen in tandem with drawing the user interface. This happens to solve some weird lag issues on Linux with IBus, and who knows how many other similar performance problems we're not aware of.
Namely, the ones caused by element numbers already invalid at the time of saving. This makes it impossible to intentionally have invalid element numbers in the low bits of properties listed in CarriesTypeIn, but these bits are considered to be under TPT's control anyway, so this is ok.
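As a sketch of the idea (the field width and names are assumptions; only CarriesTypeIn is taken from the actual code):

    #include <cstdint>

    constexpr int PMAPBITS = 9; // assumed width of the type field

    // Hypothetical sketch: clear an invalid element number from the low bits
    // of a property that carries a type, leaving the remaining bits alone.
    uint32_t SanitizeTypeCarrier(uint32_t prop, bool (*isValidType)(uint32_t))
    {
        auto typeBits = prop & ((1u << PMAPBITS) - 1u);
        if (!isValidType(typeBits))
        {
            prop &= ~((1u << PMAPBITS) - 1u); // drop the invalid type, keep the rest
        }
        return prop;
    }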
This would leave saves opened this way missing elements, but also not produce a warning in the preview, because by the time the preview is made, autorun.lua has already been run.
Some people really don't like it when an app phones home the moment they open it. This of course means losing automatic notifications and MOTDs, but oh well.
Broken since a38e1c48bb, where older saves were allowed to work off the legacy identity mapping of save numbers to element numbers. The problem was that even element numbers that the palette associated with unknown element identifiers could end up mapped this way. These elements would then fail to spawn, but would not show up as missing in the save preview.
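The intended mapping behaviour, sketched with hypothetical types; the real save loader obviously looks nothing like this:

    #include <map>
    #include <optional>
    #include <string>

    // Hypothetical sketch: save number -> element number, honouring the palette.
    std::optional<int> MapSaveType(
        int saveType,
        const std::map<int, std::string> &palette,       // save number -> identifier
        const std::map<std::string, int> &knownElements  // identifier -> element number
    )
    {
        auto it = palette.find(saveType);
        if (it == palette.end())
        {
            return saveType; // no palette entry: legacy identity mapping
        }
        auto jt = knownElements.find(it->second);
        if (jt == knownElements.end())
        {
            return std::nullopt; // unknown identifier: report as missing, don't map
        }
        return jt->second;
    }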
Still not perfect, see the TODO in the diff.
Like 5103db8288, this addresses the upload issues experienced on windows, but this time with the official fix from libcurl developers; see the relevant tpt-libs commit.
Broken since 462460b6b3, where the break in the old code was thoughtlessly moved to the new code, the issue being that create_part failing doesn't necessarily mean that we're out of particle IDs. I don't think pasting is a hot enough operation for such optimizations to be necessary, so continue should be fine.
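A sketch of the control flow in question, with a made-up SpawnParticle standing in for create_part:

    #include <vector>

    struct PastedParticle { int x, y, type; };

    // Hypothetical sketch: a failed spawn skips one particle instead of
    // aborting the whole paste, since failure doesn't imply we're out of IDs.
    void PasteParticles(const std::vector<PastedParticle> &clip,
                        bool (*SpawnParticle)(int x, int y, int type))
    {
        for (auto &p : clip)
        {
            if (!SpawnParticle(p.x, p.y, p.type))
            {
                continue; // was a break before; that ended the paste too early
            }
            // ... copy over the rest of the particle's fields ...
        }
    }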
Broken since c645269c86, where we started giving the renderer thread only the live part of Simulation::parts. Or at least that was the intent, but we ended up copying whatever amount we should have copied the previous time around. The flickering was hard to notice because the number of particles rarely changes much between frames.
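What the fix boils down to, as a hedged sketch (names are made up; the real code manages the copy differently):

    #include <vector>

    struct Particle { int type; /* ... */ };

    // Hypothetical sketch: size the copy by this frame's live particle count,
    // not by whatever count happened to be current the previous time around.
    void SnapshotLiveParts(const std::vector<Particle> &parts, int liveCount,
                           std::vector<Particle> &rendererCopy)
    {
        rendererCopy.assign(parts.begin(), parts.begin() + liveCount);
    }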
This should fix the long-standing issue of Create/ChangeType callbacks not being called in all cases when particles were created or changed type.
I'm eagerly awaiting reports of the horrendous crashes this will inevitably cause.
Broken since 02b679aec3 and extremely similar to 74386631e0, which makes sure that both LSI and GameView are alive when destroying Lua windows: in this case, they both need to be alive when destroying components attached to the main window. Also, the main window still has to exist.
AFTERSIMDRAW can be composited on the main thread on top of the frame that got rendered in parallel, so there is no reason to have it hinder SRT. This cannot be done with BEFORESIMDRAW, which is applied to the frame before the rest of the sim is rendered. If we had proper associative compositing with pre-multiplied alpha, it could be done, but that doesn't really work well with the colour space we're using right now (sRGB, with u8 components).
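For reference, the pre-multiplied alpha "over" operator that would make compositing associative, sketched with linear float colours, which is exactly what we don't have with sRGB u8:

    // Illustration only, with linear float colours rather than sRGB u8.
    struct Premul { float r, g, b, a; };

    Premul Over(Premul top, Premul bottom)
    {
        // With pre-multiplied alpha, "over" is top + (1 - top.a) * bottom,
        // and the operator is associative, so layers could be composited in
        // any grouping (e.g. on different threads) and merged later.
        auto k = 1.0f - top.a;
        return { top.r + k * bottom.r,
                 top.g + k * bottom.g,
                 top.b + k * bottom.b,
                 top.a + k * bottom.a };
    }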
Completely neglected to normalize the Rect's size. Somehow completely asymptomatic unless you're looking at the favourites menu when it's too long to fit on the screen, on macOS. Yeah.
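What normalizing amounts to, as a sketch with a stand-in Rect; the real type in the codebase is different:

    #include <utility>

    struct Point { int x, y; };
    struct Rect { Point topLeft, bottomRight; }; // stand-in, not the real Rect

    // Hypothetical sketch: swap coordinates where needed so the size implied
    // by the two corners is never negative.
    Rect Normalized(Rect r)
    {
        if (r.topLeft.x > r.bottomRight.x) std::swap(r.topLeft.x, r.bottomRight.x);
        if (r.topLeft.y > r.bottomRight.y) std::swap(r.topLeft.y, r.bottomRight.y);
        return r;
    }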
Another in the series of fixes for easily avoidable problems introduced in c2bb777212 by my infinite wisdom >_>. The previous (and thankfully, first) commit in the series is 8cab4ab738.