An industry reporter asked me a couple of pointed questions recently as part of an interview for a feature article. He wanted to know if I felt that Enterprise Information Security was broken, and what could be done to fix it.
“Given the increasing number of denial of service attacks, Java exploits, break-ins, malware delivered by spam, etc., is Enterprise Security broken?”
No, I don’t believe that Enterprise Security is broken. I do believe that some of the fundamental assumptions that we in the Information Technology industry made early in the development of IT and communications were flawed and are now being abused. Enterprise Information Security, a strategic model intended to formalize and promote security practices in a consistent manner across an organization, remains a fundamentally correct objective.
One of the biggest concerns that I have had over my 30+ year IT career has been that of consistency. Remember that Information Security as a recognized discipline didn’t exist when Information Technology was born; it came about well after IT and technology had started to mature. We built the communications protocols at the heart of TCP/IP to focus on resilience, continuity, and speed. The naive belief was that if a set of rules was cast that delivered reliable communication, the job was pretty much done. The entire concept was based on trust. What else could you possibly want?
What was missing was consideration of the human factor: an authentication layer, non-repudiation criteria, the guarantee of confidentiality, the assurance of data integrity, and the practices of controlled access and least privilege.
People are creative, curious, and in many cases, selfish creatures. If they find a weakness in an application, or a way to take advantage of a process that will provide them with notoriety, wealth, or some other desired benefit, I guarantee that it will be exploited. Look at how games get hacked for online gold, extra advantage, or simply bragging rights, to underline the problem. The abuser doesn’t consider, or perhaps even care, that the author views the game as years of work and a revenue stream, and doesn’t gauge the impact that player actions have on the developers’ livelihood. They just want the desired item.
Until we can replace or rebuild the TCP/IP suite with those missing pieces at its core, we need to put in place a governance and architectural model, policies, processes, standards, controls and guidance that, when taken together, provide a consistent information security architecture. That architecture should apply evenly across the enterprise, not only to this group or that region, and should be able to manage and adapt to the upcoming disruptive factors that will make up our IT world in the future.
“What are some of these recent disruptive factors?”
- BYOD – Employees recently fell in love with the idea of using their own smartphones and tablets for work. Management embraced the concept, since it enhanced the bottom line by eliminating the need to purchase and maintain hardware that tends to become obsolete within a calendar year anyway.
BYOD introduced consumer tech into the enterprise, and although I, like others, resisted it, we all knew it was inevitably going to happen. These new consumer devices come with all of the warts that you would expect from a consumer device: no standard image, little focus on security and data protection, few points of control, fewer points of integration, and no separation of personal versus corporate identities.
Employees are just now beginning to question how deep they will let work intrude into their personal lives. Did IT just turn their beloved smartphone into a tracking device? Can the company now monitor and examine their personal emails, chats, and browsing habits? Employees are beginning to resent that personal time is now becoming potentially unpaid work time. Managing these challenges must be part of the new Information Security Architecture.
- Malware – Malicious software has evolved from a nuisance to a plague. It’s been monetized, and has grown into a full blown industry unto itself. Malware is now custom developed, the developers are organized, and they coordinate their efforts. Some of them specialize, and offer their services to one another, mercenary style. Our vendors need to do the same, and change the model from signature based detection to signature, characteristic (white-listing), and behavior based protection. All of them, not one of them.
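The layered model argued for above — signature, characteristic (white-listing), and behavior, all three at once — can be sketched minimally. This is an illustrative toy, not any vendor’s actual engine; the hash sets, behavior names, and verdict labels are all hypothetical:

```python
import hashlib

# Hypothetical feeds; real products consume large, curated threat-intel data.
# (The hash below happens to be SHA-256 of empty input, used here as a stand-in.)
KNOWN_BAD_HASHES = {"e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"}
APPROVED_HASHES: set[str] = set()  # allowlist of hashes for approved binaries

# Hypothetical behavior flags a sandbox or endpoint agent might report.
SUSPICIOUS_BEHAVIORS = {"modifies_boot_record", "disables_av", "injects_into_process"}

def assess(file_bytes: bytes, observed_behaviors: set[str]) -> str:
    """Combine the three layers: signature, allowlist, and behavior."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    if digest in KNOWN_BAD_HASHES:        # signature-based: known malware
        return "block"
    if digest in APPROVED_HASHES:         # characteristic/allowlist: known good
        return "allow"
    if observed_behaviors & SUSPICIOUS_BEHAVIORS:  # behavior-based: unknown, acting badly
        return "quarantine"
    return "monitor"                      # unknown and quiet: keep watching
```

The point of the sketch is the ordering: no single layer decides alone, and an unknown, well-behaved file is watched rather than trusted.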
Vendors also need to move away from the “backwards compatible with everything” development model. Bloating code to support multiple Operating Systems, especially those that are no longer being developed or supported by their creators, perpetuates vulnerabilities on several fronts. It potentially brings all of the previous versions’ vulnerabilities into the new version, it perpetuates the existence of outdated software amongst businesses and home users, and it complicates business processes like asset and license management. All of these result in a larger attack surface to be exploited, and liabilities to customer organizations.
Malware distribution is undergoing a major shift: away from wide distribution for maximum effect on a target-rich environment and quick-in, acquire-target, quick-out blitzing strategies, toward custom-made payloads with no signature available, targeted at a specific industry, business, or user to limit solution development, and placed where they will be most effectively consumed by the target. The new malware is tweaked to avoid detection, doing nothing observably destructive and maintaining a discreet profile for as long as possible. It stays in the environment, collecting information, trickling out intelligence, and potentially offering backdoor access for its author or owner. These little nasties tend to stay embedded within an organization for years.
- Data Leakage – I used to worry about the impacts malware had, the downtime it incurred, the mess it made, and the time it took to clean up after an infection. Incident Response, Business Continuity and Disaster Recovery practices have matured, alleviating the bulk of those concerns, and now I don’t have to worry as much about what sort of malware gets into the environment. Over the years, I have adopted an attitude that concerns itself more and more with egress management. I now worry more about what data is getting out. In order to maximize my nightly pillow time, I develop or procure capabilities to monitor traffic flows, and to identify the types of documents, contents of documents, and other materials that should not be leaving the network.
The challenges here are accounting for every egress method, every potential removal vehicle, and every characteristic that makes a document sensitive, and dealing with each one in an appropriate and manageable fashion. Electronic communications are the low-hanging fruit; they are easily monitored. It is the physical devices that pose the greatest challenges.
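Identifying “contents of documents that should not be leaving the network” usually comes down to pattern matching on outbound text. A minimal sketch of that idea, assuming hypothetical patterns that a real deployment would tune to its own business:

```python
import re

# Hypothetical sensitivity markers; real DLP rules are far more extensive.
SENSITIVE_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),                 # US SSN shape
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),        # crude card-number shape
    "classification": re.compile(r"\bCONFIDENTIAL\b|\bINTERNAL ONLY\b", re.IGNORECASE),
}

def scan_outbound(text: str) -> list[str]:
    """Return the labels of sensitive content found in an outbound document."""
    return [label for label, pattern in SENSITIVE_PATTERNS.items() if pattern.search(text)]

def should_block(text: str) -> bool:
    """Block the transfer if any sensitive marker is present."""
    return bool(scan_outbound(text))
```

Even this toy shows why egress inspection is the easy half of the problem: the rules run wherever traffic can be seen, which is exactly what a document walking out on a USB stick avoids.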
- Next Generation Firewalls – The Internet Protocol suite was built to support communication using a set of rules, identifying specific ports and protocols, packet and frame sizes, and expecting specific content to be in each frame. The developers assumed that applications and people would operate within those rules. We also assumed that technology would present a perimeter that could be easily controlled and managed. If the protocol used matched the port designated for it, and that port/protocol set was allowed to pass through the firewall, it was all good. Unfortunately, attackers do not play by those rules. They use them against us.
Next Generation Firewalls are emerging that analyze relationships and behaviors. They inspect traffic to ensure that someone or something is accountable for each packet on the network, that it fits within an expected data request stream, conforms to much more granular rules based on expected and observed behavior, and that it is shaped and formed the way the rules expect it to be.
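The difference between the old port/protocol check and the application-aware inspection described above can be sketched in a few lines. The packet fields and rule names here are hypothetical simplifications, not any firewall product’s API:

```python
from dataclasses import dataclass

@dataclass
class Packet:
    # Hypothetical simplified summary a firewall engine might build per flow.
    dst_port: int
    protocol: str        # transport protocol, e.g. "tcp"
    app_protocol: str    # what the payload actually looks like, e.g. "tls", "ssh"

def legacy_allow(pkt: Packet) -> bool:
    """Classic rule: TCP to port 443 passes, no questions asked."""
    return pkt.protocol == "tcp" and pkt.dst_port == 443

def ngfw_allow(pkt: Packet) -> bool:
    """Next-generation rule: the payload on 443 must actually be TLS,
    catching tunneled traffic (e.g. SSH over 443) the legacy rule misses."""
    return legacy_allow(pkt) and pkt.app_protocol == "tls"
```

The legacy check trusts that traffic on a port obeys the port’s rules; the next-generation check verifies that the traffic is shaped the way the rules expect, which is exactly the assumption attackers abuse.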
- The Cloud – Every silver lining has a cloud, and every cloud has security implications. We experimented in the past with out-sourcing our IT worker bees in order to save costs. In some places that was successful, and not so successful in others. We are now doing the same thing with applications, services, data, and infrastructure. The risks to those assets remain the same, but we are now concentrating those assets along with many other assets in one place, and giving up visibility and control, while increasing the value of the hosting target.
The arguments make sense: we are not an IT company, so why do we need to invest in so much hardware, software, and staff to maintain it? Someone else can do this better, focus entirely on it, and save us money by providing it to the masses as a Service. The other side of the coin is that the risks don’t go away and the liabilities don’t go away, but the ability to directly control and manage the out-sourced entities becomes more difficult. Accountability becomes fuzzy, but ultimately lies with the data owner, not the hosting company. In a cloud-based model, you are trusting someone else to do a better job of managing and protecting your data, you are trusting them not to misuse your data, and you are trusting them to provide access to the right people while blocking access by the wrong folks. Audit and Compliance issues become evident.
Ultimately, if this new juicy data target is breached by someone attacking you or one of the many other customers that use this service, your data may be exposed, and your business is liable and accountable. Your data may not even be exposed, but if you use the breached vendor’s services, the perception may be that you were breached. Your customers won’t care if the breach happened at your data center or your provider’s. You were trusted with their data, and it was at risk of exposure on your watch. You may also increase your dependency on the cloud service, and that increases your susceptibility to denial of service attacks.
- Attacker Motivation & Capability – The enemy has found that those annoying virus and worm characteristics developed in the past for notoriety or destructive power can be used for financial gain and espionage, and they have gotten organized. The dark side has put significant effort into developing a diverse set of tools, expertise, and strategies. We need to model our defenses after those of the attackers. Vendors need to start integrating, working together, and providing the enterprise with consumable, actionable, accurate intelligence about what is going on inside and outside of their networks. SIEM is a step in the right direction, but let’s not stop walking forward.
“Do we need a fundamental change in the way enterprises approach/design security?”
Here, I would say yes, and I believe that this change has been cooking along for quite some time in a very slow, “bolt-it-on” fashion. Technology changes seem to be revolutionary, coming out of nowhere and establishing themselves quickly in response to disruptive factors and needs. Changes in protection capabilities tend to be evolutionary, taking their own sweet time to develop and mature in reaction to unforeseen circumstances that arise post-implementation of technology. Physicist Niels Bohr said, “Prediction is very difficult, especially if it’s about the future.”
We in IT as an industry, and businesses in general, need to realize that the perimeter is continuing to melt, to focus on monitoring the network and protecting the data, and to insist on integration, increased visibility, and built-in security from our products, vendors, service providers, and business partners. Enterprise Information Security offers a conduit, through architecture and governance, to provide a well-thought-out strategy that can adapt and react to disruptive advancements in technology. It lays the groundwork, and operates best by implementing consistent governance over people, processes and technology at the enterprise level, for the purpose of supporting the management, operation, and protection of information and assets.