4

This question has two parts.

First, if I had a list of websites I'd like to block, how do I tell my computer to block those sites and any relevant subdomains?

Secondly, how do I do this on a per-user basis? For example, telling the computer to block userA from accessing Facebook should not block userB from Facebook.

Bonus points if the answer is a command-line one.

Stefan
  • at first I was going to say use /etc/hosts, but that won't do per-user... maybe you could set up a proxy that requires some form of authentication so it knows who's who... other options would include browser plugins... – xenoterracide Nov 06 '10 at 15:31

1 Answer

4

In fact, there are three parts to your question:

  1. Decide on a blocking strategy at the network level: what connections are allowed?
  2. Implement that blocking strategy.
  3. … in a way that only affects certain users.

Blocking websites is not easy. In fact, I would say that it's impossible to completely block a website without completely blocking network access. All you can do is make the blocked user's life more difficult: if they really want to, they will still be able to reach the blocked site (with increased latency and decreased bandwidth), provided they have enough technical sophistication and perhaps access to an outside server. For ordinary browsing, users can read cached copies on Google or elsewhere. Users who control an outside server can use it as a proxy, or they can use existing open proxies (which come and go too fast to block usefully).

You can try blocking by domain name or by IP address. IP addresses might work for a big site like Facebook, although you'd have to keep up with all their server moves. This won't work for smaller sites that are co-hosted on shared IP addresses.
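As a sketch of the IP-address approach, the loop below turns a list of address ranges into `iptables` rules. The ranges here are documentation placeholders, not Facebook's real addresses; in practice you would have to refresh the list as the site's hosting changes. The script prints the rules rather than applying them, so you can review the output before running it as root.

```shell
#!/bin/sh
# Sketch: turn a list of IP ranges into iptables REJECT rules.
# 192.0.2.0/24 and 198.51.100.0/24 are placeholder documentation
# ranges, not the target site's real addresses.
BLOCKED_RANGES="192.0.2.0/24 198.51.100.0/24"

for range in $BLOCKED_RANGES; do
    # Print each rule instead of applying it; pipe to "sh" as root to apply.
    echo "iptables -A OUTPUT -d $range -j REJECT"
done
```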

A lightweight way to block some web sites is to block their DNS name resolution. This alone is likely to make the users' life annoying enough that they work around your block by using an external proxy (which does require some sophistication). But there's no practical way of tuning DNS resolution per-user (it's not impossible in principle, but you'd need to set up a working identd and find a DNS server that talks to it).
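The simplest form of DNS blocking is the hosts(5) file. Note that `/etc/hosts` does not support wildcards, so every relevant subdomain must be listed explicitly; the loop below just generates the entries in hosts-file format (the domain list is an illustrative example):

```shell
#!/bin/sh
# Sketch: emit hosts(5) entries that map blocked names to 0.0.0.0.
# /etc/hosts has no wildcard support, so each subdomain
# (www.facebook.com, m.facebook.com, ...) is listed by hand.
BLOCKED="facebook.com www.facebook.com m.facebook.com"

for name in $BLOCKED; do
    printf '0.0.0.0\t%s\n' "$name"
done    # append this output to /etc/hosts as root to apply it
```

Remember that, as noted above, this applies system-wide: there is no per-user variant of `/etc/hosts`.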

The natural way to block web sites is to block direct web access and allow only access through a web proxy. Squid is the de facto standard. You can set it up as a transparent proxy (all connections on ports 80 and 443 are routed to the proxy machine; the odd website on another port may or may not work depending on how you configure your firewall) or as an explicit proxy (users must configure their browser; only the machine with the proxy can connect to the outside).
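For the transparent-proxy variant, the sketch below shows the rough shape of the setup, assuming Squid 3.x on the gateway machine (directive names differ slightly between Squid versions, so check your version's documentation):

```
# squid.conf fragment: listen in interception mode.
http_port 3128 intercept

# On the gateway, redirect outgoing web traffic to Squid
# (run as root; shown here as a comment, not applied):
#   iptables -t nat -A PREROUTING -p tcp --dport 80 \
#       -j REDIRECT --to-port 3128
```

Note that transparently intercepting port 443 is much more intrusive, since the proxy cannot read HTTPS traffic without acting as a man-in-the-middle; many setups simply pass or block port 443 wholesale at the firewall.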

An easy way of implementing per-user settings is to require authentication in the proxy; then granting different levels of access is a job for the proxy's ACLs. To avoid the password requirement, you can also make the proxy use ident (though this adds latency to every request).
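As an illustration, a Squid configuration along these lines might look like the fragment below. The helper path and name are assumptions that vary by distribution and Squid version (older releases ship it as `ncsa_auth`, newer ones as `basic_ncsa_auth`); the leading dot in `.facebook.com` is what makes the ACL match all subdomains:

```
# squid.conf fragment: require a login, then deny Facebook
# (including subdomains) to userA only.
auth_param basic program /usr/lib/squid/basic_ncsa_auth /etc/squid/passwd
acl authed proxy_auth REQUIRED
acl blockedsites dstdomain .facebook.com
acl restricted proxy_auth userA
http_access deny restricted blockedsites
http_access allow authed
http_access deny all
```

The password file referenced by `auth_param` can be managed with `htpasswd` from the Apache utilities.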

Your task will be easier if you can run the proxy on a different machine (it can be a virtual machine). Doing everything on the same machine is possible but complicated on Linux, and I suspect it's also possible-but-complicated on other unices.
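If you do stay on a single Linux machine, one command-line avenue worth knowing is netfilter's `owner` match, which lets an `OUTPUT` rule apply only to traffic from a given local user. It can only block by destination IP, not by domain name, so it inherits all the caveats of IP-based blocking above. A sketch, with a placeholder account name and documentation address range:

```shell
#!/bin/sh
# Sketch: per-user blocking on one Linux box via iptables' owner match.
# BLOCKED_USER and the address range are placeholders.
BLOCKED_USER=userA

for range in 192.0.2.0/24; do
    # Print the rule; run the output as root to apply it.
    echo "iptables -A OUTPUT -m owner --uid-owner $BLOCKED_USER -d $range -j REJECT"
done
```

This only affects locally generated traffic (the `owner` match is unavailable in `PREROUTING`/`FORWARD`), which is exactly why it fits the single-machine case and not the gateway case.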