In a previous post, we talked about Securing the Keys to the Kingdom and finding files with key passwords saved in clear text on our shares or within our virtual machines. Just as important as securing our most critical passwords is properly understanding all activity related to highly privileged accounts on our domain.
One of the account types that typically comes under routine scrutiny by internal and external audit teams is the account that is not specifically tied to or owned by a single person but carries elevated privileges. I call these service accounts; some call them managed accounts, others application accounts. Whatever the name, they are accounts created for a specific purpose, to carry out a specific task that requires elevated access. It is not uncommon to find these types of accounts tied to scheduled tasks, scripts, or jobs carrying out a particular process. They can often be found within applications, binding websites or modules to database back ends. They are often configured in a 'Set It and Forget It' fashion, and it is also not uncommon for their passwords to be set to 'Never Expire'. For all of these reasons it is important to limit where these accounts have access and to track their activity to be sure they are not being abused beyond the purpose for which they were intended.
Here is an example of one such abuse, which I have seen multiple times as an IT pro, relating to these types of accounts. In this example we will focus on a service account used to automate security and patch deployment to our Windows desktops and servers. The account is named MS Updates, it is a member of Domain Admins, and its password is set to never expire. Not a best practice, but more common than you might think, and the reasons for doing so typically stem from an immediate need without a focus on the long-term effects.
Normal Activity - Baseline
Below is the timeline of all activity performed by the MS Updates account on a virtual desktop that just underwent a series of monthly updates. This will serve as a baseline for what normal activity should look like for this account during a patching cycle: in essence, a series of reads, writes, and deletes within folders specific to Windows and the applications being updated.
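To make the idea of a baseline concrete, here is a minimal sketch of how one could be built from parsed file-activity events. The event tuples and field layout are assumptions for illustration; in practice these would come from Windows Security audit events (such as Event ID 4663) or a file-activity monitoring product.

```python
from collections import Counter

# Hypothetical parsed audit events: (account, operation, path, hour-of-day).
# The layout is an assumption; real events would be parsed from audit logs.
events = [
    ("MS Updates", "write",  "C:\\Windows\\SoftwareDistribution\\update.cab", 2),
    ("MS Updates", "read",   "C:\\Windows\\System32\\drivers\\etc\\hosts", 2),
    ("MS Updates", "delete", "C:\\Windows\\Temp\\patch.tmp", 3),
]

def build_baseline(events, account):
    """Count operations per (operation, top-level folder) for one account."""
    baseline = Counter()
    for acct, op, path, hour in events:
        if acct == account:
            # Take the top-level folder after the drive letter.
            top = path.split("\\")[1] if "\\" in path else path
            baseline[(op, top)] += 1
    return baseline

baseline = build_baseline(events, "MS Updates")
# For a patching cycle, every key ends up under the Windows folder,
# which is exactly what "normal" looks like for this account.
```

The point of the sketch is simply that a baseline is a summary of where and how an account normally operates, against which later activity can be compared.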
Reads on the HR Share
Below is an indication that something outside the normal activity for the MS Updates account is occurring. On the HR share, we see that the MS Updates account is flagged as one of the most active users. That is unusual, as this is a CIFS/SMB share that should not be accessed by this account. We also see at a quick glance that there has been an uptick in the number of reads occurring on this share during off hours.
Drilling into the timeline of activity for this share and filtering on the actions performed by the MS Updates account during this time period, we see that the account is performing a series of reads on the HR share. Suspicious, as these certainly don't look to be related to Microsoft patches or updates.
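The filtering step above can be sketched in a few lines. The record layout, share names, and business-hours window below are all assumptions for illustration, not the output of any particular monitoring tool.

```python
from datetime import datetime, time

# Hypothetical share-access records; in a real deployment these would be
# parsed from SMB audit logs or a file-activity monitoring platform.
records = [
    {"account": "MS Updates", "op": "read", "share": "HR",
     "ts": datetime(2016, 5, 10, 23, 40)},
    {"account": "jsmith", "op": "read", "share": "HR",
     "ts": datetime(2016, 5, 10, 14, 5)},
]

def off_hours_reads(records, account, share,
                    start=time(20, 0), end=time(6, 0)):
    """Return one account's reads on a share outside business hours."""
    hits = []
    for r in records:
        if r["account"] != account or r["share"] != share or r["op"] != "read":
            continue
        t = r["ts"].time()
        if t >= start or t < end:  # the off-hours window wraps past midnight
            hits.append(r)
    return hits

suspicious = off_hours_reads(records, "MS Updates", "HR")
```

Filtering on account, share, and time window together is what turns a noisy activity timeline into a short list worth investigating.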
Combing through a few of the other department shares, we can see that this activity is not isolated to HR: files are also being read or copied from the Marketing and Public shares by this account.
A quick double check against the patching baseline taken earlier on the virtual desktop shows no activity by MS Updates on that desktop at the time the other shares are having data copied from them. At this point, it is safe to say that this account is being used on these department shares in a fashion outside the typical activity pattern for which it was intended.
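That comparison against the baseline can be reduced to a simple set difference: any share the account touches that never appears in its baseline is a candidate for investigation. The share names and counts below are made up for illustration.

```python
def anomalous_shares(baseline_shares, current_counts, threshold=0):
    """Flag shares where an account is active but has no baseline history."""
    return {share: n for share, n in current_counts.items()
            if share not in baseline_shares and n > threshold}

# Hypothetical values: the patching baseline only ever touched OS folders,
# while the current period shows heavy reads on department shares.
baseline_shares = {"Windows", "SoftwareDistribution"}
current_counts = {"HR": 412, "Marketing": 188, "Public": 95, "Windows": 12}

flags = anomalous_shares(baseline_shares, current_counts)
# flags lists the HR, Marketing, and Public shares as outside the baseline.
```

A `threshold` parameter is included so that a handful of incidental touches can be tolerated while sustained access still gets flagged; where to set it is a judgment call per environment.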
While this example has been simulated for demonstration purposes, it shows the importance of establishing a baseline and being able to track activity for all account types. While only an example, it draws on real experiences I have seen and been involved in firsthand, where security breaches were uncovered through the exploitation of shared service accounts.