A student in a security class offered a link to Ranum’s blog. Not having drawn on the material there before, The Six Dumbest Ideas in Computer Security seemed like a good starting place. The next six articles will address the ideas presented by Ranum, with arguments explaining his mistakes and misconceptions. Below is the opening of the article in question.
#1) Default Permit
This dumb idea crops up in a lot of different forms; it’s incredibly persistent and difficult to eradicate. Why? Because it’s so attractive. Systems based on “Default Permit” are the computer security equivalent of empty calories: tasty, yet fattening.
The most fundamental concept of Information Security is the CIA triad: Confidentiality, Integrity, and Availability. “Default Deny” is more commonly called Implicit Deny in the security industry. In teaching and explaining Information Security it is helpful to use generic terms people are more familiar with, so do not mistake my mentioning this as an attack on the author. In fact, a simplification may help in explaining Implicit Deny.
Using a Subject, Verb, and Object relationship we can break down security concerns and see the “how and why” of each. Let us start with Implicit Deny and its components. No subject is allowed to verb an object unless there is a white list (a subject that is itself a list of subjects) showing that the subject in question (now an object on the list) is allowed (the verb). In essence: “no person is allowed in unless this list says they are allowed in.” Please note this is a function of access only, not a function of what you are allowed to do to an object. Implicit Deny concerns itself only with whether you are allowed access, NOT with what you are allowed to do with that access.
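To make that concrete, here is a minimal sketch of an Implicit Deny access check in Python (the subject names and the list itself are illustrative assumptions, not from Ranum’s article):

```python
# Implicit Deny: access is granted only when the subject appears on the
# white list; any subject not on the list is denied with no rule needed.
ALLOWED_SUBJECTS = {"alice", "bob"}  # the white list of subjects

def is_allowed(subject: str) -> bool:
    """Return True only if the subject is explicitly listed."""
    return subject in ALLOWED_SUBJECTS

print(is_allowed("alice"))    # True  -- explicitly on the list
print(is_allowed("mallory"))  # False -- denied by default
```

Note that the function answers only the access question; what an allowed subject may then do to the object is a separate concern.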
Now to explain the problem with Ranum’s thinking. The first problem occurs here:
The most recognizable form in which the “Default Permit” dumb idea manifests itself is in firewall rules.
By definition there is one difference between a firewall and a router. When a packet destined for an unknown network arrives at a router, the router has a default action: it passes the packet down a default route. This is, in effect, the “Default Permit” Ranum is talking about. A firewall, by comparison, has no such default action. When a firewall receives a packet, it checks the packet against a list of rules. If no rule says what to do with the packet, it dissipates as heat. That is, the firewall drops the packet.
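The contrast is easy to show in code. A minimal Python sketch, with made-up route and rule tables (an assumption for illustration, not any vendor’s actual behavior):

```python
# A router always has a default action: forward unmatched traffic
# down the default route (default PERMIT).
ROUTES = {"10.0.0.0/8": "eth1"}   # known networks -> interfaces
DEFAULT_ROUTE = "eth0"

def route(dest_network: str) -> str:
    return ROUTES.get(dest_network, DEFAULT_ROUTE)  # unknown? forward anyway

# A firewall has no default action: an unmatched packet is dropped
# (default DENY).
FIREWALL_RULES = {("tcp", 443): "accept"}  # explicit allow rules only

def filter_packet(proto: str, port: int) -> str:
    return FIREWALL_RULES.get((proto, port), "drop")  # unknown? drop it
```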
The author’s first example inherently contradicts the function of a firewall, which is based entirely on Implicit Deny. Now we see more confusion on the part of the author:
Back in the very early days of computer security, network managers would set up an internet connection and decide to secure it by turning off incoming telnet, incoming rlogin, and incoming FTP. Everything else was allowed through, hence the name “Default Permit.”
A router did this, NOT a firewall. A router would be configured with an ACL (Access Control List) that limited connections based on port number or source and destination addresses. This is why there was a default permit function: it was a router, NOT a firewall. It bothers me that a security researcher would make such an oversight.
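For illustration, that early-1990s router behavior might look like this in Python (the blocked services come from Ranum’s own example; everything else in the sketch is assumed):

```python
# A default-permit ACL in the style Ranum describes: deny a few known
# services, implicitly permit everything else.
BLOCKED_PORTS = {21, 23, 513}  # FTP, telnet, rlogin (Ranum's example)

def acl_check(dest_port: int) -> str:
    if dest_port in BLOCKED_PORTS:
        return "deny"
    return "permit"  # the router's default action: let it through

acl_check(23)    # 'deny'   -- incoming telnet is blocked
acl_check(8080)  # 'permit' -- anything unlisted sails through
```

Note how the final return is the exact inverse of a firewall’s implicit deny.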
Suppose a new vulnerability is found in a service that is not blocked – now the administrators need to decide whether to deny it or not, hopefully, before they got hacked.
Maybe like the web service? Should we shut the web server down? Is that REALLY the recommendation you are making, Ranum? What about availability? What do I do to make sure that my on-line business continues? What recommendation ARE you actually making?
A lot of organizations adopted “Default Permit” in the early 1990’s and convinced themselves it was OK because “hackers will never bother to come after us.” The 1990’s, with the advent of worms, should have killed off “Default Permit” forever but it didn’t.
Either you are not making one, or you are telling me I should just stop web traffic. I would also like to see some supporting documentation for your assertion that:
In fact, most networks today are still built around the notion of an open core with no segmentation.
The risk and liability alone would make this unbelievable. Please, offer some documentation for this assertion. Information Security practitioners are often skeptics, so do not be surprised by a request for the source of this claim.
Ranum moves on after this bit of fear-mongering. It is time for us to consider how this is more than just a problem with networks.
Another place where “Default Permit” crops up is in how we typically approach code execution on our systems. The default is to permit anything on your machine to execute if you click on it, unless its execution is denied by something like an antivirus program or a spyware blocker.
Mac OS X. Programs are NOT permitted to run by default. This has been a component of Unix/Linux/OS X for the better part of a decade and a half. Just because one operating system has a problem with this (the assumption here is that Ranum is commenting on Windows) does not mean that every OS has this problem. After this, Ranum continues on about what a bad idea it is. Maybe it would be a good idea to get in touch with Microsoft and mention this? Rather than make it sound like everyone has this problem, be specific. Specificity is another component of quality information security research.
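On a Unix-like system, execute permission is an explicit bit that must be set on a file before it will run. A quick standard-library Python check makes the point (illustrative only):

```python
import os
import stat

def is_executable(path: str) -> bool:
    """A file runs only if an execute bit was explicitly granted."""
    mode = os.stat(path).st_mode
    return bool(mode & (stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH))

# A freshly downloaded file has no execute bit set: default deny.
# Someone must grant permission explicitly, e.g.:
#   os.chmod(path, os.stat(path).st_mode | stat.S_IXUSR)
```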
Now we finally get a well-written anecdote that demonstrates what the author is talking about:
A few years ago I worked on analyzing a website’s security posture as part of an E-banking security project. The website had a load-balancer in front of it, that was capable of re-vectoring traffic by URL, and my client wanted to use the load-balancer to deflect worms and hackers by re-vectoring attacks to a black hole address. Re-vectoring attacks would have meant adopting a policy of “Default Permit” (i.e.: if it’s not a known attack, let it through) but instead I talked them into adopting the opposite approach. The load-balancer was configured to re-vector any traffic not matching a complete list of correctly-structured URLs to a server that serves up image data and 404 pages, which is running a special locked-down configuration. Not surprisingly, that site has withstood the test of time quite well.
Congratulations, you used a load-balancer as a firewall. The load-balancer was handed a complete white list of correctly-structured URLs, and everything that failed to match fell through to a locked-down server: Implicit Deny by another name. A minimal sketch of that approach (the URL patterns and server names are assumed for illustration, not taken from the article):
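```python
# Illustrative URL white-list routing in the spirit of Ranum's anecdote.
# Patterns and server names are assumptions, not details from the article.
import re

KNOWN_GOOD_URLS = [
    re.compile(r"/login"),
    re.compile(r"/accounts/\d+/balance"),
]

def route_request(url: str) -> str:
    # Default Deny: only URLs matching the complete white list reach
    # the real application servers.
    if any(pattern.fullmatch(url) for pattern in KNOWN_GOOD_URLS):
        return "app-server"
    return "blackhole-404-server"  # locked-down server for everything else
```

Now Ranum starts to wrap up his opinion: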
One clear symptom that you’ve got a case of “Default Permit” is when you find yourself in an arms race with the hackers. It means that you’ve put yourself in a situation where what you don’t know can hurt you, and you’ll be doomed to playing keep ahead/catch-up.
So the idea that a vulnerability could exist in the design even with Implicit Deny enabled is completely foreign to you? Has there ever been a problem with a TCP stack such that people got into a network even with a port closed (Implicit Deny enabled)? Yes, that has happened. Someone having to do their job and figure out what a “hacker” (I hate that term) did does not mean that there was a failure in Implicit Deny. It means there was a vulnerability. We are then treated to this platitude:
The opposite of “Default Permit” is “Default Deny” and it is a really good idea. It takes dedication, thought, and understanding to implement a “Default Deny” policy, which is why it is so seldom done. It’s not that much harder to do than “Default Permit” but you’ll sleep much better at night.
The problem is you cannot always deny access to everything that has a vulnerability:
- You may be using the service that has a vulnerability
- What if the vulnerability is a zero-day exploit?
- The attack can look like legitimate traffic (e.g., it arrives over a normally completed TCP handshake)
- This article focuses on 20% of the problem: external attacks. It does not address the other 80% of the issue: security failures from inside the company. Every illustration shows protection from attacks originating outside the company.
In short, this article does not address anything remotely related to a new security concept, or even make a useful recommendation. It is my hope that Ranum revises his article to be “more in line with reality,” as I think he put it.