If I set 175% scaling in Gnome Settings, the value is saved as 1.7518248558044434 in ~/.config/monitors.xml:
<monitors version="2">
  <configuration>
    <logicalmonitor>...
It comes from a mix of two things: converting the actual display resolution to a virtual (scaled) resolution, and the use of single-precision floating point in the calculation.
Essentially, my understanding is that it's storing the value needed to convert your actual resolution's pixel count (2160 lines vertically, i.e. 2160p) to a virtual resolution's pixel count (2160 / 1.75), but that gives you a fraction of a virtual pixel. So instead of 1.75 it scales by 1.75182... to land on a whole number of virtual pixels to work with. On top of that, the stored figure is nudged slightly away from what you'd expect by floating point rounding.
If you take the actual vertical resolution, 2160, and divide it by the virtual resolution it's trying to use, 1233 pixels, you need a conversion factor of 1.75182... to avoid fractions of a pixel. If you used 1.75 you'd get 1234.2857... pixels. So GNOME is storing the fraction that gives you a clean whole-pixel conversion, accurate to a tiny fraction of a pixel.
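To make that arithmetic concrete, here's a rough sketch of the search in Python. The 3840x2160 panel size is assumed (the post only mentions 2160p), and `snap_scale` and its threshold are illustrative names and values, not mutter's actual code (which does this in C):

```python
def snap_scale(width, height, requested, threshold=0.1):
    """Look near the requested factor for a scale that turns BOTH dimensions
    into a whole number of virtual (logical) pixels.
    Sketch of the idea only, not mutter's exact algorithm."""
    base_w = int(width / requested)                  # floor(3840 / 1.75) = 2194
    for offset in range(200):
        for logical_w in (base_w - offset, base_w + offset):
            candidate = width / logical_w            # scale that makes the width exact
            if abs(candidate - requested) > threshold:
                continue
            # Is height / candidate a whole number? Check with exact integer math:
            # height / (width / logical_w) == height * logical_w / width
            if (height * logical_w) % width == 0:
                return candidate
    return requested                                 # give up, keep the requested factor

print(snap_scale(3840, 2160, 1.75))   # ≈ 1.7518248175  (240/137 -> 2192x1233 logical)
```

With 1.75 exactly you'd get 2194.29 x 1234.29 logical pixels; 240/137 is the nearby factor that makes both dimensions come out whole (2192 x 1233).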
Full credit to rakslice at Stack Exchange, who also goes into the details.
TBF the error can become that big if you do a bunch of unstable operations (i.e. operations that continue to increase the relative error), though that's probably not what is happening here.
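For anyone wondering what "unstable" means here, a tiny generic float example (nothing to do with mutter specifically):

```python
# Subtracting two nearly-equal doubles throws away most of the significant
# digits, so the relative error of the result jumps by orders of magnitude.
x = 1.0 + 1e-12
diff = x - 1.0
print(diff)                       # not exactly 1e-12; only a few digits survive
print(abs(diff - 1e-12) / 1e-12)  # relative error around 1e-4 instead of ~1e-16
```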
As the answer in the link explains, it's an adjustment of your scaling factor to the nearest whole pixel, plus a loss of precision from rounding between single- and double-precision floating point values.
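To see the single/double precision part on its own, here's the round trip in Python, assuming the snapped factor is 240/137 (i.e. 3840/2192) as worked out above:

```python
import struct

snapped = 3840 / 2192   # 240/137 ≈ 1.7518248175..., the whole-pixel factor

# Squeeze it through a 32-bit float and widen it back to a 64-bit double,
# roughly what happens between the compositor's floats and the XML writer.
f32 = struct.unpack('f', struct.pack('f', snapped))[0]
print(f32)              # 1.7518248558044434 -- the value seen in monitors.xml
```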
So I'm not really sure of the point of this post. It's not a question, as the link quite effectively answers it. It's more just "here's why your scaling factor looks weird in your gnome config file", and it's primarily the first reason - rounding to whole pixels.
Gnome is coded with JavaScript (lmao 🤣) so yeah, I think you are right.
EDIT: Actually, even if JavaScript and other languages have this issue, the value 1.7518248558044434 doesn't come from it. There is another reply that explains it and it makes total sense. But it's still pretty lame to know the desktop runs on JavaScript. (Yeah, I hate Gnome)
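For reference, the "issue" being referred to is the usual binary floating point representation problem, which looks the same in Python as it does in JavaScript, because both use IEEE 754 doubles:

```python
print(0.1 + 0.2)   # 0.30000000000000004 -- an IEEE 754 artifact, not a JS quirk
```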
It's not a "language" issue, it's a "computer" issue. This math is being done on the CPU.
IEEE 754
Some languages do provide "arbitrary precision math" (Java's BigDecimal, for example), but it's slower. Not what you want if you're multiplying a 4K matrix every millisecond.
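As an aside, here's what arbitrary precision looks like, using Python's fractions and decimal modules as a stand-in for BigDecimal (the specific API isn't the point):

```python
from fractions import Fraction
from decimal import Decimal, getcontext

# Exact rational arithmetic: the snapped scale is exactly 240/137.
exact = Fraction(3840, 2192)
print(exact)                          # 240/137
print(exact * 2192 == 3840)           # True -- no rounding anywhere

# Fixed-but-arbitrary decimal precision, the BigDecimal-style approach.
getcontext().prec = 30
print(Decimal(3840) / Decimal(2192))  # 1.75182481751824817518... to 30 digits

# Every operation here allocates and loops over digits, which is why
# compositors stick to hardware floats for per-frame math.
```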
And Gnome is far from the only desktop that uses JS; KDE Plasma, for example, also uses a lot of JavaScript.
It's weird when people bash Gnome for using JS, when practically everybody else uses it a lot too. Shows that they're just regurgitating "Gnome = bad!!!" nonsense.
We get it, you think disliking Gnome is a quirky, edgy personality trait.