As in, for any two Transform3D objects A and B I might encounter, does Godot (4.1) always evaluate A * B == B * A as true?
Alternatively, is it at least approximately commutative, i.e. (A * B).is_equal_approx(B * A), in case there are situations where floating-point imprecision messes up the exact equality?
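For context, this is the kind of check I mean (just a sketch with arbitrary made-up values, not proof either way):

```gdscript
extends Node

# Sketch: compare both multiplication orders for two arbitrary transforms.
func _ready() -> void:
    var a := Transform3D(Basis(Vector3.UP, 0.5), Vector3(1, 2, 3))
    var b := Transform3D(Basis(Vector3.RIGHT, 1.2), Vector3(-4, 0, 7))
    print(a * b == b * a)                  # exact comparison; false for this pair
    print((a * b).is_equal_approx(b * a))  # tolerant comparison; also false here
```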
And I wanted to be sure that if I do set_base_transform(some_node, some_transform), I'd be guaranteed that get_base_transform(some_node) == some_transform afterwards.
But when I tried it, that code did not work out; at least I didn't get the result I expected. When I flipped it so that set_base_transform did node.transform = node.transform * base_transform.affine_inverse() instead, it did work out.
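To isolate what I think was going on (a sketch with made-up values, not my actual set_base_transform/get_base_transform code): undoing a composition only seems to work when the inverse is multiplied on the same side the original factor was.

```gdscript
extends Node

# Sketch: cancelling a factor only works on the side it was applied.
func _ready() -> void:
    var t := Transform3D(Basis(Vector3.UP, 0.8), Vector3(2, 0, 1))
    var base := Transform3D(Basis(Vector3.RIGHT, 0.3), Vector3(0, 5, 0))
    var composed := t * base

    # Right-multiplied, so right-cancelled: recovers t.
    print((composed * base.affine_inverse()).is_equal_approx(t))  # true

    # Cancelling on the wrong side gives base^-1 * t * base instead of t.
    print((base.affine_inverse() * composed).is_equal_approx(t))  # false in general
```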
It's still not hard proof though: maybe something else was messed up the first time, or it only looks like it works now and I'll discover the transform still isn't what I wanted it to be. Or maybe they do commute, but only under some constraint like no scale on any axis, and I just happened to satisfy it with all the transforms I used in my test.
So it would still be good to know for sure whether/when Transform3Ds commute.
EDIT:
I accidentally wrote the first line wrong; it said that they do commute. Actually, the experience I had, where things worked only after both functions did their multiplications in a compatible order, should indicate that they don't commute.
I'm writing this as someone who has not done much with Godot, but from the mathematical standpoint, two Transform3Ds do not commute in general. There are situations in which they will commute, though. If they are both pure rotations, they will commute if their rotation axes are the same.
Edit to add: This was based on thinking about a Transform3D as a transformation matrix acting on R3.
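In Godot terms, that claim would look something like this (my sketch, not an exhaustive test):

```gdscript
extends Node

func _ready() -> void:
    # Two pure rotations about the same axis commute (the angles just add)...
    var r1 := Transform3D(Basis(Vector3.UP, 0.4), Vector3.ZERO)
    var r2 := Transform3D(Basis(Vector3.UP, 1.1), Vector3.ZERO)
    print((r1 * r2).is_equal_approx(r2 * r1))  # true

    # ...but rotations about different axes generally do not.
    var r3 := Transform3D(Basis(Vector3.RIGHT, 0.4), Vector3.ZERO)
    print((r1 * r3).is_equal_approx(r3 * r1))  # false
```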
I likewise don't really use Godot, but for graphics in general, the 4th (homogeneous) coordinate is important, even if it is "usually" 1. Its importance is most obvious when you try to correctly interpolate a single rectangular texture near the poles of a sphere, but think for a minute about what "near" means there.
Back to the main point, though: the important properties we normally rely on for matrix math are associativity (particularly for exponentiation!) and the order reversal of inverses, (A * B)^-1 == B^-1 * A^-1 (beware definitions that are sloppy about "inverse").
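Concretely, assuming Transform3D behaves like a standard affine matrix (a sketch with my own values):

```gdscript
extends Node

func _ready() -> void:
    var a := Transform3D(Basis(Vector3.UP, 0.7), Vector3(1, 0, 0))
    var b := Transform3D(Basis(Vector3.RIGHT, 0.2), Vector3(0, 3, 0))
    var c := Transform3D(Basis(Vector3.FORWARD, 1.0), Vector3(0, 0, -2))

    # Associativity: grouping doesn't matter (order still does).
    print(((a * b) * c).is_equal_approx(a * (b * c)))  # true

    # Inverses reverse the order: (A * B)^-1 == B^-1 * A^-1.
    print((a * b).affine_inverse().is_equal_approx(b.affine_inverse() * a.affine_inverse()))  # true
```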