I often hear it stated by IT and Security staff alike, “we don’t know what we don’t know.” Well, it’s a true enough statement, but what do you do to protect an environment where you don’t know MUCH? How is it acceptable not to know what you are responsible for? How can you ascertain whether or not:
- Your corporate systems and all of their applications are patched in accordance with policy?
- All of your applications are appropriately licensed?
- There are no security tools on the network that shouldn’t be present?
- There are no illegal programs on your network?
That, my friends, requires lifting the lid and looking inside at all of the wriggling worms in that can. You will need to build a list of authorized software, and find out how many copies of each program are in use on the network. While you are at it, it would be wise to also look for software that IS NOT on that list, but is present.
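The core of that exercise is a simple set comparison: inventory on one side, the authorized list on the other. A minimal sketch of the idea, with hypothetical program names standing in for a real inventory:

```python
# Sketch: split a software inventory into authorized and unauthorized groups.
# The program names and counts below are illustrative placeholders.

AUTHORIZED = {"Microsoft Office", "Adobe Reader", "7-Zip"}

def classify(inventory):
    """Given {program_name: install_count}, return two dicts:
    programs on the authorized list, and everything else."""
    authorized = {}
    unauthorized = {}
    for program, count in inventory.items():
        if program in AUTHORIZED:
            authorized[program] = count
        else:
            unauthorized[program] = count
    return authorized, unauthorized

inventory = {"Microsoft Office": 120, "7-Zip": 45, "CD-Burner-Pro": 3}
ok, rogue = classify(inventory)
```

In practice the inventory side comes from whatever scanning tool you run, but the reporting step always reduces to this kind of allowlist check.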
In the good old days, I whipped up a few scripts (batch files, really) that were run at each login to query each computer and determine what was installed on it. Since I worked primarily in Windows shops, the scripts performed WMI “scans” that basically returned the names of all EXE, COM, and other executable files, along with relevant parts of the registry.
As you can imagine, running a script that looks for the presence of files on a large hard drive could slow down logins tremendously. The scripts ended up being broken into several parts, and each part would scrape data from a specific part of the computer, and then terminate. The next time it was run on that computer, it would start from where it left off, and repeat, until the entire script had been run.
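The trick those scripts relied on, scan a little, save your place, resume at the next login, can be sketched in a few lines. This is a hedged, platform-neutral illustration, not the original batch files: the checkpoint filename, chunk size, and extension list are all assumptions.

```python
import json
import os

# Sketch of a resumable scan: process a bounded chunk of directories per run,
# persist progress to a checkpoint file, and pick up there next time.
CHECKPOINT = "scan_state.json"  # hypothetical state file name
CHUNK = 100                     # directories to scan per invocation

def scan_chunk(root):
    """Scan up to CHUNK directories under root for executable-looking files,
    resuming from the saved checkpoint. Returns True when the scan is done."""
    state = {"pending": None, "found": []}
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            state = json.load(f)
    if state["pending"] is None:      # first run: seed the work list
        state["pending"] = [root]
    work, state["pending"] = state["pending"][:CHUNK], state["pending"][CHUNK:]
    for directory in work:
        try:
            for entry in os.scandir(directory):
                if entry.is_dir(follow_symlinks=False):
                    state["pending"].append(entry.path)
                elif entry.name.lower().endswith((".exe", ".com", ".bat")):
                    state["found"].append(entry.path)
        except PermissionError:
            pass                      # skip directories we cannot read
    with open(CHECKPOINT, "w") as f:
        json.dump(state, f)
    return not state["pending"]
```

Each login runs one chunk; once `scan_chunk` returns True, the checkpoint file holds the full list of executables found, ready to be collected centrally.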
This remained highly inefficient for several reasons, but beat the heck out of connecting to each computer remotely and doing an inventory, or worse, desk-side visits to do the same thing. Each of these had been done in the past, as this inventory was an audit essential.
Eventually, tools began to appear on the market. Some were just optimized scripts very similar to what I had written myself, but with benefits such as continuing to run in the background so as not to hold up login script processing, the ability to throttle the intelligence gathering to save network bandwidth and CPU resources on the local machine, and a structured output file that could be parsed and queried instead of a dumb text file that often grew too large to open.
As these tools matured, they evolved into Asset Management suites. If you don’t own one, you are still working in the ’80s. You need one. These tools will answer many other questions besides the four above. They can point to money-saving opportunities, support standardization, and identify rogue applications that you don’t want running uncontrolled on your network. Why is that accountant running Nmap? What is a pirate copy of CD-burner-Pro doing on that tax system? Can you survive a vendor audit from Adobe?
There is now no shortage of professional tools from recognized companies. Symantec offers Altiris, CyberArk has this capability, and CA and McAfee have offerings of their own; there are many. Pick one.