
I can't find a tutorial on how to set up a shared hosting server.

The part that I'm missing is the way privileges are set for the webmasters so that they don't see each other's directories.


Previous post:

How can OVH configure their SSH server to do this?

I'm trying to set up a multi-user web server, and I'd like each user to be able to connect with both SSH and SFTP but, most importantly, to see only their own directory. OVH managed to do that, but after six hours of searching and experimenting (creating a chroot jail), I don't see how they did it. Maybe it's trivial, but I simply don't see it.

Here is what I can do when I log into my OVH account:

  • pwd gives me my home dir (/homez.52/creak)
  • /homez.52/creak is actually a symlink to /home/creak
  • I can cd into all the common Linux directories (/bin, /usr, /home, ..) but each time ls gives me this error: ls: cannot open directory .: Permission denied
  • I can browse all my files in both /homez.52/creak and /home/creak

How did they manage to do that? chroot? ACLs?

Thanks

Creak
  • Why don't you just examine the directory permissions with ls -ald /usr and getfacl /usr? It doesn't sound like a chroot environment if they have to lock down the permissions like that. – user3188445 Aug 16 '15 at 21:39
  • I didn't know about getfacl! That's brilliant! So it might be just a simple ACL configuration? I don't know how ACL works, but now that I have a lead, I'll go there! – Creak Aug 16 '15 at 22:25
  • Well apparently ACL is one of the ways to have a proper shared hosting server. There must be something else because I don't see any other directories in /home. – Creak Aug 17 '15 at 00:05
  • see http://unix.stackexchange.com/questions/101263/what-are-the-different-ways-to-set-file-permissions-etc-on-gnu-linux for info on facl and more. – ctrl-alt-delor Aug 17 '15 at 05:50
  • Thanks @richard! I don't know if that will be enough to solve my case, but it is definitely an important link! How come there are no tutorials on building a shared hosting server on the internet? – Creak Aug 17 '15 at 12:52

2 Answers


In a shared web-hosting environment, there are a couple of issues you need to address right off the bat.

Regarding directory permissions and users only being able to access their own files: what you want is to set home directory permissions such that "others" have no permission whatsoever on each user's home. Remember that the eXecute bit is what's needed to cd into a directory, but by itself it won't let you read that directory's contents. Therefore, /home should be owned by root with mode rwxr-x--x (751), so users can only "blindly" go to their own home folder but can't have a peek around and learn how many users are on your system. It would look something like this (date, size, etc. omitted for clarity):

# ls -la /home
drwxr-x--x root root .
drwxr-xr-x root root ..
drwxr-x--- usr1 usr1 usr1
drwxr-x--- usr2 usr2 usr2
...
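A minimal sketch of how you might arrive at that layout (usr1 is a placeholder account; adapt names and paths to your system):

# chmod 751 /home                  # rwxr-x--x: others may traverse, but not list
# useradd -m -d /home/usr1 usr1    # create the user and their home directory
# chmod 750 /home/usr1             # rwxr-x---: "others" get nothing at all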

If you really don't want users to be able to read the contents of directories like /bin, simply remove the "read" permission bit for "others". It will not affect their ability to run the binaries inside, provided they know the full path beforehand.
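For example (a sketch; on distributions where /bin is a symlink to /usr/bin, the chmod applies to the target directory):

# chmod o-r /bin                   # others keep x (traverse) but lose r (list)
$ ls /bin                          # run as an ordinary user
ls: cannot open directory '/bin': Permission denied
$ /bin/echo "still works"          # executing by full path is unaffected
still works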

For SSH and FTP access: if you configure your filesystem permissions correctly, any decent SSH or FTP implementation will already be secure. I recommend vsftpd for FTP and, of course, OpenSSH for SSH, but by no means do I imply they're the only correct options available. Remember to tweak the configuration of those services (in particular, disallow root login through SSH, and probably disallow password login for anyone sudo-capable).
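For instance, a few commonly tweaked lines (an excerpt-style sketch, not a complete configuration):

# /etc/ssh/sshd_config
# never allow direct root logins; require keys for sudo-capable admins
PermitRootLogin no
Match Group sudo
    PasswordAuthentication no

# /etc/vsftpd.conf
# no anonymous FTP; only local users, each jailed in their own home directory
anonymous_enable=NO
local_enable=YES
chroot_local_user=YES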

The tricky part is configuring your web server correctly, especially if you have to run CGI scripts for dynamic websites. Everyone and their grandmother wants PHP these days, and you really can't have /home/dumbuser/public_html/php_shell.php running as the same user that spawned your Apache/Nginx, right?

A possible solution here, if you're running the Apache web server, is to use the suexec module, which will have your CGI script run as the user that owns the executable file (think setuid bit). To allow the HTTP server access to the actual files, consider adding the user the server runs as (typically www-data) to every user group on the system ("every user group" meaning the group of every user of your shared environment, not every user account on the system).
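As an illustration, assuming Apache with mod_suexec enabled and a placeholder account usr1 (names and the vhost are hypothetical):

# usermod -aG usr1 www-data        # www-data may now read usr1's group-readable files

# per-site Apache config (excerpt); CGI under this vhost runs as usr1, not www-data
<VirtualHost *:80>
    ServerName usr1.example.com
    DocumentRoot /home/usr1/public_html
    SuexecUserGroup usr1 usr1
</VirtualHost>

Note that suexec enforces strict checks on the target script (ownership, location, and permissions), so expect to read its documentation before this works.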

Note that this is barely scratching the surface of all that must be done to properly configure and harden a shared server. The configuration files for each running service must be completely understood, and you will probably have to modify them to suit your needs. In particular, you'll probably have to spend a good week reading configuration options for your web server and trying it out in a development/testing environment.

Nubarke

If you don't have r permission for a directory, but only x permission, you cannot "scan" the directory, but you can access any file in it that you know the name of.

If you run ls -ld /bin you might see the mode drwxr-x--x, which means that "others" can use the programs in /bin, such as /bin/ls, but cannot read (list) the directory itself.

Between users, i.e., on the /home/* directories, one might also drop the x permission for others so as to prohibit cross access completely.
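For example, with placeholder accounts usr1 and usr2 (output is illustrative):

# chmod o-x /home/usr1             # others lose even traversal rights
$ cat /home/usr1/.profile          # run as usr2, even knowing an exact name
cat: /home/usr1/.profile: Permission denied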