I'd bet it actually simplifies at least as many things as it breaks. Basically all computers already keep track of time as a count of seconds since a UTC epoch anyway, and then do timezone conversions on top of that.
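To make the point concrete, here's a minimal sketch of that model (assuming Python 3.9+ with `zoneinfo` and tzdata available): the stored value is just epoch seconds, and the timezone only enters at display time.

```python
# The stored representation is a count of seconds since the UTC epoch;
# timezone conversion happens only when rendering for display.
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

epoch_seconds = 1700000000  # what the system actually stores

# One instant, rendered for different local clocks; only the display differs.
utc = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc)
berlin = utc.astimezone(ZoneInfo("Europe/Berlin"))
new_york = utc.astimezone(ZoneInfo("America/New_York"))

print(utc.isoformat())       # 2023-11-14T22:13:20+00:00
print(berlin.isoformat())    # 2023-11-14T23:13:20+01:00
print(new_york.isoformat())  # 2023-11-14T17:13:20-05:00
```

Abolishing DST would change the conversion rules in tzdata, but not the stored epoch seconds, which is why the internal bookkeeping is largely unaffected.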
Well, in essence yes. But I have seen enough mishandling and homegrown date-time calculation code that this could get interesting. I suspect there are a lot of systems where the TZ database is never updated, which will at least result in a shifted displayed local time.
Also, it is fun to get data from old programs, and from user input, where the actual offset has to be guessed from the time zone name. And if that conversion data is old, fun is had. It does not matter how time is represented internally in this case.
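A small sketch of why that guessing is fragile: the same abbreviation can mean several different UTC offsets, so a log line carrying only "CST" doesn't pin down an instant. The candidate table below is hand-written for illustration, not an authoritative mapping.

```python
# "CST" alone is ambiguous: it is used for (at least) three different offsets.
# This mapping is illustrative, not a complete or standard table.
from datetime import datetime, timedelta, timezone

CST_CANDIDATES = {
    "US Central": timezone(timedelta(hours=-6)),
    "China":      timezone(timedelta(hours=+8)),
    "Cuba":       timezone(timedelta(hours=-5)),
}

naive = datetime(2003, 1, 15, 12, 0)  # "2003-01-15 12:00 CST" from an old log

# Depending on which "CST" the source meant, the UTC instant differs by hours.
results = {}
for region, tz in CST_CANDIDATES.items():
    results[region] = naive.replace(tzinfo=tz).astimezone(timezone.utc)
    print(f"{region:>10}: {results[region].isoformat()}")
```

And if the source predates a rule change, even picking the "right" region can still yield the wrong offset.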
There are probably hundreds of thousands of devices out there that are smart enough to know about time zones but old enough that there's no chance of a software update; for example, APC UPSs and power strips used in data centres worldwide, years beyond end of support.
As long as they can get tzdata / Olson db updates before the first change, there's usually no problem. But I'm sure there are still devices out there using the OLD pre-2007 US switchover dates from before the Bush-era change, because they have a different, possibly hard-coded rule set.
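For reference, the rule change being alluded to: before the Energy Policy Act of 2005 took effect in 2007, US DST began on the first Sunday in April; since then it begins on the second Sunday in March. A quick sketch of how far apart those dates fall in a given year (hypothetical helper, not a library function):

```python
# Compare the pre-2007 and post-2007 US DST start rules for one year.
# A device with the old rule hard-coded shows the wrong local time
# for the whole window between the two dates.
from datetime import date, timedelta

def nth_sunday(year: int, month: int, n: int) -> date:
    """Date of the n-th Sunday of the given month."""
    first = date(year, month, 1)
    offset = (6 - first.weekday()) % 7  # days until the first Sunday
    return first + timedelta(days=offset + 7 * (n - 1))

year = 2024
old_start = nth_sunday(year, 4, 1)  # pre-2007 rule: first Sunday in April
new_start = nth_sunday(year, 3, 2)  # current rule: second Sunday in March

print(new_start, old_start, (old_start - new_start).days)
# 2024-03-10 2024-04-07 28 -> four weeks of being an hour off
```

(There's a similar mismatch window in the fall, since the end date moved from the last Sunday in October to the first Sunday in November.)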
It won't be too bad. If you mean computers used commercially: Unix keeps a running total of seconds since the epoch, and DST only affects what's displayed as the time. Nothing will fundamentally change for these systems. In Windows server products it won't be a problem, and some Windows clients use an automatic patch process. Air-gapped computers may have some hacky process around DST, but it's a pretty solved problem.