> Most modern software still defaults to imperial units somewhere deep in its stack — APIs, IoT firmware, cloud dashboards, even DB schemas.
I find it hard to believe that bold claim of “most”. The article doesn’t give evidence, not even a single example, so it doesn’t convince me that I’m wrong.
If it has been designed in Silicon Valley, I would not be surprised by a fallback to imperial units.
Trust me bro
If globally we all used metric for every unit, then this becomes realistic. But while different units are in use and a person needs to view the data, conversion is going to happen.
If a sensor records in metric but its output needs to be viewed in imperial, it has to be converted. Whether it is better to convert at write time or at read time always depends on use case and geography. Or do we just force the human in the loop to know that the speed limit is 112 km/h even though the non-digital signpost says 70?
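For concreteness, here is a minimal sketch of the convert-at-read-time option, assuming the canonical stored value is SI (m/s); the locale handling and function name are invented for illustration:

```python
# Minimal sketch: store the sensor value once in SI (m/s) and only convert
# when a particular viewer asks for it. The locale list and function name
# are made up for illustration.

MPS_TO_KMH = 3.6            # exact: (3600 s/h) / (1000 m/km)
MPS_TO_MPH = 2.2369363      # approximate

def display_speed(speed_mps: float, locale: str) -> str:
    """Render a stored SI speed (m/s) in the unit the viewer expects."""
    if locale in ("en_US", "en_GB"):        # places signposting in mph
        return f"{speed_mps * MPS_TO_MPH:.0f} mph"
    return f"{speed_mps * MPS_TO_KMH:.0f} km/h"

# The same stored value, rendered two ways at read time:
print(display_speed(31.2, "de_DE"))   # 112 km/h
print(display_speed(31.2, "en_US"))   # 70 mph
```

Converting at write time instead would bake the viewer's region into the stored data, which is exactly the coupling the geography question above is about.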
Hours aren't SI/metric. That factor of 3600 makes any conversion to acceleration uglier than it should be (rounding a curve, for instance). Nearly as bad as 5280, in my opinion. And really, 30 meters/second would be a nice human-readable number.
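To make the 3600 complaint concrete, a rough sketch (the curve radius is just an example value): lateral acceleration is a = v²/r, which only comes out in clean m/s² after the km/h value has been divided by 3.6.

```python
# Rough illustration of that 3600 (i.e. 3.6) factor: lateral acceleration in
# a curve is a = v^2 / r, which needs the speed in m/s before it gives a
# clean m/s^2 result. The radius is an arbitrary example value.

KMH_TO_MPS = 1000 / 3600    # = 1 / 3.6

def lateral_acceleration(speed_kmh: float, radius_m: float) -> float:
    """Centripetal acceleration (m/s^2) for a given speed and curve radius."""
    v_mps = speed_kmh * KMH_TO_MPS      # the extra conversion step
    return v_mps ** 2 / radius_m

# 108 km/h is exactly 30 m/s, the "nice human readable number" above.
print(f"{lateral_acceleration(108, 300):.1f} m/s^2")   # 3.0 m/s^2
```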
Even in science though, the pros are happy to switch to electron volts or units where the speed of light and years are both 1.
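As a quick sanity check of the electron-volt habit (textbook constants only, nothing from the article): the electron's rest energy m·c² works out to the familiar ~511 keV, which is why masses get quoted in eV with c = 1 implied.

```python
# Electron rest energy m*c^2 expressed in eV, using standard constants.

M_ELECTRON = 9.1093837015e-31   # kg (CODATA 2018)
C = 299_792_458.0               # m/s (exact by definition)
J_PER_EV = 1.602176634e-19      # J per eV (exact by definition since 2019)

rest_energy_ev = M_ELECTRON * C ** 2 / J_PER_EV
print(f"{rest_energy_ev / 1e3:.1f} keV")    # 511.0 keV
```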
> And how this can spare 2%-10% CPU costs on your system
I don't believe the “2–10% CPU savings” claim for general computing or servers; maybe only for IoT devices. And you don't cite any source for that claim. What is your source?
> Most modern software still defaults to imperial units somewhere deep in its stack
I know that a lot of software is produced in the USA, but one would expect better from software developers.
This is probably because most software works with graphics somewhere (GUI, printing), and font sizes and resolutions are predominantly imperial (pt/dpi).
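A small sketch of how pt/dpi drags the inch into a rendering stack, using the standard definitions (1 pt = 1/72 inch, 1 inch = 25.4 mm); the helper names are made up:

```python
# A point is defined as 1/72 inch and dpi is dots per inch, so getting from
# a font size to device pixels (or millimetres) goes through 72 and 25.4.

POINTS_PER_INCH = 72.0
MM_PER_INCH = 25.4

def pt_to_px(size_pt: float, dpi: float) -> float:
    """Font size in device pixels at a given output resolution."""
    return size_pt * dpi / POINTS_PER_INCH

def pt_to_mm(size_pt: float) -> float:
    """The same size expressed in millimetres."""
    return size_pt * MM_PER_INCH / POINTS_PER_INCH

print(pt_to_px(12, 96))   # 16.0 px on a typical 96 dpi screen
print(pt_to_mm(12))       # ~4.23 mm
```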