If I set 175% scaling in Gnome Settings, the value is saved as 1.7518248558044434 in ~/.config/monitors.xml:
<monitors version="2">
  <configuration>
    <logicalmonitor>...
TBF the error can become that big if you do a bunch of unstable operations (i.e. operations that continue to increase the relative error), though that's probably not what is happening here.
To get to an error of 0.01, you'd need to accumulate trillions of trillions of floating point rounding errors; near 1.0, each one contributes at most about 1.1e-16. It will not happen solely because of floating point unless you're doing such crazy math that you shouldn't be using primitives in the first place.
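As a rough back-of-the-envelope check (my own arithmetic, not from the thread), here's a small C sketch of how many accumulated rounding steps it would take to build up an error of 0.01 near 1.0:

    #include <stdio.h>
    #include <float.h>

    int main(void)
    {
        /* Near 1.0, a single double-precision rounding step is at most half an ulp. */
        double max_step = DBL_EPSILON / 2;            /* ~1.1e-16 */

        /* Worst case: every error has the same sign and full magnitude. */
        double worst_case = 0.01 / max_step;          /* ~9e13 additions */

        /* Typical case: errors behave like a random walk, so the total error
         * grows like sqrt(n) * max_step; solving for n squares the count. */
        double random_walk = worst_case * worst_case; /* ~8e27 additions */

        printf("worst case : %.1e additions\n", worst_case);
        printf("random walk: %.1e additions\n", random_walk);
        return 0;
    }

The random-walk count is on the order of octillions, which is where "trillions of trillions" comes from.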
As the answer in the link explains, it's the adjustment of your scaling factor so the logical size lands on a whole number of pixels, plus a loss of precision from round-tripping the value between single- and double-precision floats.
So I'm not really sure of the point of this post. It's not a question, as the link quite effectively answers it. It's more just "here's why your scaling factor looks weird in your gnome config file", and it's primarily the first reason - rounding to whole pixels.
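Both effects are easy to reproduce in a few lines. Here is a minimal C sketch; the pixel counts (1920 physical px mapped to 1096 logical px) are hypothetical, chosen only because they happen to land on the exact value from monitors.xml, not taken from the thread:

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical pixel counts: 1920 physical px across 1096 logical px.
         * Rounding the logical size to whole pixels nudges 1.75 up a little. */
        double exact = 1920.0 / 1096.0;  /* 1.7518248175182482 */

        /* Squeeze it through single precision, then widen back to double,
         * the kind of round-trip the linked answer describes. */
        float  narrowed = (float)exact;
        double widened  = (double)narrowed;

        printf("after pixel rounding:   %.17g\n", exact);
        printf("after float round-trip: %.17g\n", widened);  /* 1.7518248558044434 */
        return 0;
    }

With those assumed numbers, the second printf prints exactly the 1.7518248558044434 stored in the config file.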
Gnome is coded with JavaScript (lmao 🤣) so yeah, I think you are right.
EDIT: Actually, even if JavaScript and other languages have this issue, the value 1.7518248558044434 doesn't come from it. There is another reply that explains it and makes total sense. But it's still pretty lame to know the desktop runs on JavaScript. (Yeah, I hate Gnome.)
Well, I started this thread saying it runs on JavaScript, and I mean that they need JS for most of the interactions with the desktop, like gesture or mouse events. Even if most of the code is C, we all know you have to write many more lines of C to do the same thing as in JS, so most of the logic in GNOME is handled by JS. We need some Rust here. 🦀 🦀 🦀 🦀
You don't get to decide what counts as too much JS in the project unless you actually work on it and have in-depth knowledge of it. I don't like JS, but it has its uses.
Many people conflate modern Electron bloatware with "JS bad", but things are not that simple.
There is less than 4% more C code than JavaScript code. That's quite a lot; many features of the GNOME desktop use JavaScript too, like gestures and mouse events.
Using JavaScript isn't inherently a bad thing. JavaScript can be very useful when used for scripting. Obviously anything with a need for performance will be done in C.
In my opinion, JavaScript isn't the best language to build a desktop interface in. It can be efficient, but you could see in bug reports (at least in the past) how bad the performance was, and they had to refactor, either replacing parts with C or improving the JavaScript. I'm just laughing and making fun of it using JavaScript, not saying it is slow; Gnome is pretty fast nowadays.
JavaScript was a toy created in the mid-'90s to make dumb interactive animations and add some sort of dynamic aspect to a web page. The world starting to code entire desktop programs and servers in it was a giant, horrific, societal mistake.
It's not a "language" issue, it's a "computer" issue. This math is being done on the CPU.
IEEE 754
Some languages do provide arbitrary-precision math (Java's BigDecimal, for example), but it's slower. Not what you want if you're multiplying a 4k matrix every millisecond.
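To underline that this is hardware behavior rather than a JavaScript quirk, the classic 0.1 + 0.2 check gives the same answer in C as in JS, since both use the CPU's IEEE 754 doubles:

    #include <stdio.h>

    int main(void)
    {
        /* The same IEEE 754 doubles JavaScript uses; the CPU does the math. */
        double a = 0.1, b = 0.2;

        printf("%.17g\n", a + b);        /* prints 0.30000000000000004 */
        printf("%d\n", a + b == 0.3);    /* prints 0: not equal */
        return 0;
    }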
And Gnome is far from the only desktop that uses JS; KDE Plasma, for example, also uses a lot of JavaScript.
It's weird when people bash Gnome for using JS, when practically everybody else uses it a lot too. Shows that they're just regurgitating "Gnome = bad!!!" nonsense.
We get it, you think disliking Gnome is a quirky, edgy personality trait.
Mostly C because you need to type more C code to do the same thing as in JavaScript, so I suppose most of the logic is in JavaScript.
The Plasma desktop has 2% JavaScript (https://invent.kde.org/plasma/plasma-desktop); it's not comparable.
There's a lot more to your UX than just the Plasma desktop. And you're also trying to pass off Gnome Shell as being the whole Gnome desktop. Pretty disingenuous.