Not too surprising if the people making malware, and the people making the security software are basically the same people, just with slightly different business models.
This is, in a lot of ways, impressive. This is CrowdStrike going full "Hold my beer!" in response to everyone else's stories about bad production deploy fuckups.
The answer is obviously to require all users to change their passwords and make them stronger. 26 characters minimum; two capitals, two numbers, two special characters, cannot include '_', 'b' or the number '8', and must include Pi to the 6th place.
Great! Now when I brute force the login, I can tell my program not to waste time trying '_', 'b' and '8', and to put Pi to the 6th place in every candidate, along with 2 capitals, 2 numbers and 2 other special characters.
Furthermore, I don't need to check any password with fewer than 26 characters.
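The joke has real arithmetic behind it: every constraint the policy publishes shrinks the attacker's search space. A rough back-of-the-envelope sketch (the alphabet size, the 3.141592 substring interpretation, and the offset count are all assumptions for illustration):

```python
import math

ALPHABET = 70   # rough printable-character set size (assumption)
LENGTH = 26     # the policy's minimum length

# Unconstrained: every position is free.
naive = ALPHABET ** LENGTH

# With the policy: "3.141592" (8 chars) is known verbatim, and three
# symbols ('_', 'b', '8') are banned from the remaining positions.
free_positions = LENGTH - 8
# The known substring can sit at any of the remaining offsets
# (a loose upper bound; the capitals/numbers rules shrink it further).
constrained = (LENGTH - 8 + 1) * (ALPHABET - 3) ** free_positions

print(f"naive search space:       ~10^{math.log10(naive):.0f}")
print(f"constrained search space: ~10^{math.log10(constrained):.0f}")
```

The point isn't the exact numbers; it's that publishing "cannot include X, must include Y" hands the attacker over a dozen orders of magnitude for free.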
The modern direction is actually going the other way: tying identity to hardware and preventing access from unapproved or non-compliant devices. It has the advantage of allowing biometrics or simple PINs. In an ideal world, SSO would ensure that every single account, across the many vendors, has these protections, although we are far from a perfect world.
Well, hindsight is 20/20... But it's also a lesson many people have already learned. There's a reason people use canary deployments lol. Learning from other people's failures is important, so I agree, they should've seen the possibility.
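For anyone unfamiliar, the canary idea is just: push to a tiny cohort first, check health, and only then push to everyone. A minimal sketch (function names and the fixed 1% fraction are hypothetical; real systems also use bake time, metrics windows, and automatic rollback):

```python
import random

def canary_rollout(hosts, deploy, healthy, canary_fraction=0.01):
    """Deploy to a small random cohort first; abort the fleet-wide
    rollout if any canary host reports unhealthy."""
    cohort_size = max(1, int(len(hosts) * canary_fraction))
    canaries = random.sample(hosts, cohort_size)
    for host in canaries:
        deploy(host)
    if not all(healthy(h) for h in canaries):
        return False  # halt: do NOT push to the rest of the fleet
    for host in hosts:
        if host not in canaries:
            deploy(host)
    return True
```

Even this naive version would have turned "every Windows machine on Earth boot-loops" into "1% of machines boot-loop and the rollout stops".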
I saw one rumor that they uploaded a gibberish file for some reason. In another, a Windows update shipped just before they uploaded their well-tested update. The first is easy to avoid with a checksum. The second... I'm not sure... maybe only allow the installation if the Windows update versions match (checksum again) :D
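The checksum fix really is that cheap. A sketch of the client-side check, assuming the vendor publishes a SHA-256 digest alongside each update (the distribution scheme here is hypothetical):

```python
import hashlib
import hmac

def sha256_of(path, chunk_size=8192):
    """Stream the file so large updates don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def safe_to_apply(update_path, expected_hex):
    """Refuse the update unless it matches the published digest."""
    return hmac.compare_digest(sha256_of(update_path), expected_hex)
```

This only catches corruption in transit, of course; if the gibberish was produced before the hash was computed, you need validation at load time too.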
The kernel driver should have validated the update, or at a minimum verified a signature, before trying to load it.
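"Validate before loading" can be as simple as refusing anything that isn't structurally well-formed. A toy sketch in Python for illustration (the real channel-file format is proprietary; the magic bytes and header layout here are entirely made up):

```python
import struct

MAGIC = b"CHNL"  # hypothetical magic bytes
HEADER = struct.Struct("<4sII")  # magic, version, payload length (assumed layout)

def validate_channel_file(blob: bytes) -> bool:
    """Reject anything that isn't a well-formed update before any
    privileged code ever tries to interpret it."""
    if len(blob) < HEADER.size:
        return False
    magic, version, length = HEADER.unpack_from(blob)
    if magic != MAGIC or version == 0:
        return False
    # The length field must agree with the actual payload size.
    return len(blob) == HEADER.size + length
```

An all-zeros file, like the one in the rumor, fails the very first magic-bytes check instead of page-faulting inside the kernel.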
There should not have been a mechanism to bypass Microsoft's certification.
Microsoft should never have certified and signed a kernel driver that loads code without any kind of signature verification; arguably they shouldn't have certified it at all.
Many people say Microsoft is not at fault here, but I believe they share the blame: they are the ones who certify the kernel drivers that get shipped to customers.
I'm waiting for the post mortem before declaring this to have nothing to do with MS tbh. It's only affecting Windows systems, and it wouldn't be the first time dumb architectural decisions on their part have caused issues (why not run the whole GUI in kernel space? What's the worst that could happen?)
If you have to patch a security vulnerability, whose fault is the vulnerability? If the OS didn't suck, why does it take a 90-billion-dollar operation to unfuck it?
Because it isn't. Their Linux sensor also uses a kernel driver, which means they could have just as easily caused a looping kernel panic on every Linux device it's installed on.