Research universities offer unique computer system and network security challenges because of the breadth of their business, teaching, research, and clinical requirements, their organizational diversity and autonomy, and the scarcity of resources for computer system administration. C&C is attempting to assist units responsible for computer system security (and especially any sensitive data they host) by identifying protection strategies that work in decentralized organizations and are compatible with the operation of a high-availability, high-performance network utility.
In a perfect world, we would be able to prevent connectivity from and to those who would do our computer systems harm. However, the packets from attackers don't come with warning labels to make them easy to block, and opinions vary on how best to be safe. Many hold the view that "the network" is where the problem came from, so it's up to "the network" to solve the problem. The common phrase "network security" reflects this assumption, yet a growing number of security professionals recognize that a network-centric approach to security is at best inadequate, and at worst, very dangerous--especially in an environment where vast numbers of quasi-independent units with wildly differing computing needs share the same network infrastructure. Therefore, we claim that the term "network security" is best understood as: security of networked computers, using all appropriate methods to protect computer systems from attacks carried out via network connections.
While conventional wisdom calls for building an electronic moat--a "border firewall"--around the enterprise, this approach doesn't work in our environment (and large businesses are beginning to discover it doesn't work very well in theirs either). But if the conventional "network security" wisdom doesn't work, what can be done? A concise statement of our network security credo is: "Open networks, closed servers, protected sessions." The general strategy is to push security perimeters and policy definition as close to the organizations and computers to be protected as possible, and to make sure all sensitive traffic is encrypted. That's hard to do, and it means there are no "silver bullet" solutions. But if you believe that a border firewall surrounding the 55,000 computers on UW's network will make your computer truly secure... well, I've got some beach-front property in Arizona you might like to buy. Not only that, but border firewalls have some long-term negative consequences, such as encouraging people to tunnel all manner of applications through ports that are rarely blocked by firewalls, and increasing the time it takes to troubleshoot problems with networked applications.
The following seven axioms provide a basis for discussing how to go about achieving the goal of secure networked computers at the UW:
1. Network security is maximized when we assume there is no such thing.
In other words, don't assume that "the network" can solve "the network security problem." It cannot. Moreover, it's human nature: if we believe someone else is protecting us, we may not "get around to" fixing our own security problems.
2. Large security perimeters mean large vulnerability zones.
Multiple small moats provide more security than one large one because the large moat will need to have more bridges across it (more holes through the firewall) to accommodate the needs of the larger population behind it. Small moats (firewalls) can be tuned to the specific application or needs of each small group or individual, thus allowing tighter security. In other words, trying to make a large-perimeter (border firewall) strategy work requires imposing a one-size-fits-all security policy on the entire community--which is especially problematic in our world.
3. Firewalls are such a good idea, every computer should have one.
The fact that enterprise firewalls don't live up to the hype doesn't mean firewalling is not important. Current versions of all three major desktop operating systems come with integral firewall capabilities. If a particular computer's operating system is not securable and cannot be upgraded, then protect it with an add-on personal firewall. If one or more computers on a subnet cannot be individually secured for whatever reason, then consider deploying a "logical firewall"--a technique developed by C&C that makes it easy to deploy a small-perimeter "security moat."
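The essence of a small-perimeter firewall is a default-deny policy with a short allow-list tuned to one host's actual needs. The sketch below is purely illustrative (it is not C&C's logical-firewall tool, and the address range and ports are hypothetical examples); it shows how few rules such a policy needs when the perimeter encloses a single machine.

```python
from ipaddress import ip_address, ip_network

# Hypothetical per-host allow-list: (source network, destination port).
# Everything not explicitly listed is denied -- the opposite of a border
# firewall, which must open holes for an entire community's needs.
ALLOWED = [
    (ip_network("128.95.0.0/16"), 22),   # SSH, but only from an example campus range
    (ip_network("0.0.0.0/0"), 443),      # HTTPS from anywhere
]

def permit(src: str, dport: int) -> bool:
    """Return True only if some allow-list rule matches; deny by default."""
    src_addr = ip_address(src)
    return any(src_addr in net and dport == port for net, port in ALLOWED)
```

Because the policy covers one computer rather than 55,000, each rule can be as tight as that computer's owner wants it to be.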
4. Remote access is fraught with peril, just like local access.
Whatever mechanisms are deemed necessary and sufficient for remote users to access sensitive resources should be applied to local users as well. Session or transport-level protection is crucial, regardless of whether the connections go outside the enterprise border or stay within it.
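In modern terms, session or transport-level protection usually means TLS. A minimal sketch using Python's standard library (an illustration of the principle, not a mechanism the article prescribes): the same verified, encrypted channel is used whether the client is across the country or down the hall.

```python
import ssl

# create_default_context() turns on certificate verification and hostname
# checking by default -- the baseline for a "protected session."
context = ssl.create_default_context()

# To protect a session, wrap any TCP socket before sensitive data flows, e.g.:
#   import socket
#   with socket.create_connection(("server.example.edu", 443)) as raw:
#       with context.wrap_socket(raw, server_hostname="server.example.edu") as tls:
#           tls.sendall(b"...")
# The hostname "server.example.edu" is a placeholder.
```

The point of axiom 4 is that nothing about this setup should be relaxed just because a connection stays inside the enterprise border.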
5. One person's security perimeter is another's broken network.
Managing special cases within a network infrastructure--for example, department or application-specific router access lists--is extremely problematic and expensive in terms of network monitoring and management, policy database management, customer support, and time to repair. Hence the importance of moving the security policy decisions and enforcement perimeters as close as possible to the affected computers and their owners.
6. Private networks won't help.
Network security threats occur at the edges of the network, rarely in the core--so isolating or privatizing the core does little to reduce risk. There are of course threats to the network infrastructure itself, but the biggest problems we face are a result of huge numbers of end-systems of every vintage, and few resources available to keep them updated.
7. Network security is about psychology as much as technology.
Legitimate users, system managers, and policymakers all play a role in keeping systems safe. Achieving consensus on security requirements, priorities, funding, risk assessment, and acceptable levels of inconvenience is not primarily a question of technology.
Computer system security is a daunting problem with no easy answers, but there are strategies and tactics that can make a big difference. Key examples include: use of secure application protocols (SSL, SSH, Kerberos), strong two-factor authentication, moving critical or sensitive systems to "server sanctuaries", upgrading desktops to current operating system releases, and using proactive probing to identify vulnerable systems before the bad guys do.
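Proactive probing, the last tactic listed above, amounts to inventorying your own hosts' listening services before an attacker does. A minimal sketch (an illustration of the idea, not a specific UW tool; hostnames and ports are placeholders--and only probe systems you are authorized to scan):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Probe one TCP port; True if something accepts the connection."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def probe(host: str, ports: list) -> list:
    """Return the subset of ports on which host is listening."""
    return [p for p in ports if port_open(host, p)]
```

An unexpected entry in the result--say, a database port listening on a desktop--is exactly the kind of vulnerability you want to find first.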
For a more detailed look at these issues, read the complete "Network Security Credo" paper on the Web at staff.washington.edu/gray/papers/credo.html and let me know if you have any suggestions.
See also www.washington.edu/computing/security/ for steps you can take to better safeguard your personal information, your work, your computers, and the UW network.
University of Washington Computing & Communications
Windows on Technology, No. 27, June 2002
Modified: May 31, 2002