If you're affected, here's CrowdStrike's advice:
1. Boot Windows into Safe Mode or the Windows Recovery Environment
2. Navigate to the C:\Windows\System32\drivers\CrowdStrike directory
3. Locate the file matching "C-00000291*.sys", and delete it.
4. Boot the host normally.
5. Repeat 50,000 times (enterprise only)

(Can you guess I added number 5? 😛)

Edit: Booting into Safe Mode might be harder if you have BitLocker enabled.

Update: 👇 There are reports of the CS patch being successfully downloaded before a BSOD. This can take anywhere from 3 to 50+ reboots. May the odds be ever in your favour.

Also, if you're recovering cloud systems:
1. Unmount the OS drive with the bad CS driver
2. Mount it to a new host
3. Delete the .sys file above
4. Mount it back to the original host

(A scripted sketch of the file deletion follows below.)
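The official steps above are manual; for anyone scripting the same cleanup across many machines or mounted cloud disks, a minimal sketch of the file deletion might look like the following. It assumes an elevated shell and a Python runtime on the recovery host (which Safe Mode or WinRE typically won't have, so treat it as illustrative), and the drive letter is just an example.

```python
# Illustrative sketch only: delete the bad CrowdStrike channel file.
# Assumes admin rights and a Python runtime on the host doing the cleanup.
from pathlib import Path

# For a cloud OS disk mounted on a recovery host, change the drive letter (e.g. "E:").
DRIVER_DIR = Path(r"C:\Windows\System32\drivers\CrowdStrike")


def remove_bad_channel_files(driver_dir: Path) -> None:
    if not driver_dir.is_dir():
        print(f"Directory not found: {driver_dir}")
        return
    for sys_file in driver_dir.glob("C-00000291*.sys"):
        print(f"Deleting {sys_file}")
        sys_file.unlink()  # requires admin rights on the volume


if __name__ == "__main__":
    remove_bad_channel_files(DRIVER_DIR)
```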
The fix requires admin rights and the BitLocker key. Can organisations afford to have an admin at each machine 🤔? And how about implementing the fix in Azure cloud services? Alternatively, using the firewall to block the connection to CS might help bring the devices back online 👍 (a rough sketch of that idea is below).
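A hedged sketch of the commenter's firewall idea: block outbound traffic to the CrowdStrike cloud so machines that do boot stop pulling the bad channel file. The IP range below is a placeholder, not a real CrowdStrike address range, and the sensor's tamper protection may interfere, so this is speculative workaround territory rather than official guidance.

```python
# Sketch: add a temporary outbound Windows Firewall block rule via netsh.
# PLACEHOLDER_CROWDSTRIKE_RANGE is hypothetical -- substitute the real
# CrowdStrike cloud addresses for your tenant. Run from an elevated prompt.
import subprocess

PLACEHOLDER_CROWDSTRIKE_RANGE = "203.0.113.0/24"  # hypothetical example range


def block_outbound(remote_range: str) -> None:
    subprocess.run(
        [
            "netsh", "advfirewall", "firewall", "add", "rule",
            "name=Block-CrowdStrike-Cloud-Temp",
            "dir=out",
            "action=block",
            f"remoteip={remote_range}",
        ],
        check=True,
    )


if __name__ == "__main__":
    block_outbound(PLACEHOLDER_CROWDSTRIKE_RANGE)
```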
Who doesn't have bitlocker and why not (well, for workstations)?
It's probably worth noting that once CrowdStrike is disabled you should get some other form of protection installed. I hope other vendors are stepping up to the plate here to make it as easy as possible to replace CrowdStrike.
You forgot steps 6 & 7:
6. Remove CrowdStrike from your platforms
7. Demand a refund.
Busy afternoon and evening ahead for security teams I imagine
Surely this isn't limited to Windows (and endpoint devices - cos surely the majority of CI runs on Linux)?
I like 5, I’ve said it a few times
Finally someone mentioned bitlocker!