Thinking about making a web hosting service for folks who just want a website or can't afford servers... Any sysadmin tips to make it secure and safe if I ever make one?

@matthilde What kind of websites would you like to host? Because I don't think you'd have enough capacity to host something like Facebook.

@wiecek I would like to make an invite-only service (you say why you'd like to use it, agree to the rules, etc.) where people can host a static site (or maybe more, like PHP or smth, if I can).

@matthilde I would look at what Autistici/Inventati has been doing for 21 years now. But maybe I'm reading "safe and secure" differently.

@nikita Seems like what I am looking for, I'll check this out!

@matthilde there's a couple more like them, some donate-what-you-can, others on a more fair-price basis (I'd count IN-Berlin e.V. in there; they weren't really usable internationally last time I asked, but still a decent and fairly priced small project that's been running for a long time)
@matthilde make sure only people you know and trust can use the service for publishing. Providing free/publicly available web hosting is a good way to get yourself in trouble.

@kouett Not really public, it would be on application.

Say someone wants to make a website about, uhh... let's say their blog.
They send a mail asking for an account for their site, and I set it up for them.
There will be rules to follow.
But yeah, I think the project will begin with people I trust.

@matthilde what exactly do you want to host? websites or more? what kind of websites? (just static pages, or also stuff like fastcgi, or maybe even random webserver stuff?) would some kind of shared hosting suffice, or would you have to virtualize each user to keep them separate from one another?

@sys64738 Static serving and FastCGI are what I am planning for now.
I would make a shared server where everyone has their home directory to drop their files in.


@matthilde ok so, this means users will be able to run code (the fastcgi processes)

first you'll need some sort of infra to autogenerate webserver config and auto-request https certs for every domain/vhost you'll be serving; it can be hacked together w/ shell scripting
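that config generation could be hacked together roughly like this (a sketch; `SITES_DIR`/`OUT_DIR`, the nginx layout, and the certbot call are my assumptions, adapt to taste):

```shell
#!/bin/sh
# sketch: generate a minimal nginx vhost config per user site
set -eu

SITES_DIR="${SITES_DIR:-/home}"         # each user serves from ~/www
OUT_DIR="${OUT_DIR:-/etc/nginx/conf.d}" # where generated vhosts go

gen_vhost() {
    user="$1" domain="$2"
    cat > "$OUT_DIR/$domain.conf" <<EOF
server {
    listen 80;
    server_name $domain;
    root $SITES_DIR/$user/www;
    access_log /var/log/nginx/$domain.access.log;
    error_log  /var/log/nginx/$domain.error.log;
}
EOF
    # once DNS points at the box, request a cert (uncomment):
    # certbot --nginx -d "$domain"
}

# usage: gen_vhost alice alice.example.net && nginx -s reload
```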

for the webserver: both apache and nginx support fastcgi, though apache's fcgi support is more widely used and better-known (nginx is most often used for other purposes). you can always try your preferred webserver first & see what happens. apache might be useful for its .htaccess support (nginx doesn't do .htaccess, so per-user config tweaks would have to go through you)

so the main protection you want is for users to be unable to read (and write) other users' files. this wouldn't be very difficult to do if it weren't for the fact that the webserver user (often www-data) *also* needs to be able to access those files

there is, however, a hacky way to work around this issue: ACLs. with these, you can give specific users fine-grained access rights to files or folders, beyond the owner/group/others of standard unix perms. with this setup you could make the root user own the user home directories, set that folder's permissions to 700, give rwx to the user via ACLs, and r-x to www-data (or, if the web folder is a subfolder (eg. ~/www), give www-data --x on ~ and r-x on ~/www). you might want a ~/www setup anyway, so that a user's ~/.ssh folder etc. can't be accessed by the webserver at all (relevant if you're giving ssh access of any kind, which is also needed for sftp)
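the ACL setup above, as a script (usernames are examples; the real commands need root and an ACL-enabled filesystem, so by default this is a dry run that just prints what it would do):

```shell
#!/bin/sh
# sketch: per-user home dir permissions using POSIX ACLs
set -eu

# DRY_RUN=1 (default) echoes commands instead of running them
run() { if [ "${DRY_RUN:-1}" = 1 ]; then echo "$@"; else "$@"; fi; }

setup_user_dirs() {
    user="$1" home="/home/$1"
    run chown root:root "$home"
    run chmod 700 "$home"
    run setfacl -m "u:$user:rwx" "$home"        # the user: full access
    run setfacl -m "u:www-data:--x" "$home"     # webserver: traverse only
    run setfacl -m "u:$user:rwx" "$home/www"
    run setfacl -m "u:www-data:r-x" "$home/www" # webserver: read the site
}

# usage: DRY_RUN=0 setup_user_dirs alice   (as root)
```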

also, users would be able to see the full process list; you probably want to limit that a bit so at least program arguments aren't visible to other users
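one way to do that on Linux is procfs's hidepid mount option (a sketch; needs root, and hidepid=2 is the stricter setting that hides other users' processes entirely):

```shell
# /etc/fstab — remount /proc so users only see their own processes:
proc  /proc  proc  defaults,hidepid=2  0  0

# apply without rebooting (as root):
#   mount -o remount,hidepid=2 /proc
```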

maybe you could also try to make the fastcgi processes always run in some kind of jail so that they won't be able to leak eg. ~/.ssh; never done that myself, though
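one possible way (my assumption, not from the thread): if each user's FastCGI process runs as a systemd service, systemd's sandboxing options can act as a lightweight jail. the unit name, binary, and paths here are examples:

```ini
# /etc/systemd/system/fcgi-alice.service (fragment)
[Service]
User=alice
ExecStart=/usr/bin/php-cgi -b /run/fcgi/alice.sock
ProtectSystem=strict
ProtectHome=tmpfs                  # hide all of /home ...
BindReadOnlyPaths=/home/alice/www  # ... except the site itself
NoNewPrivileges=yes
PrivateTmp=yes
```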

if you'd ever have the need to scale it up, it might be a good idea to set up some stuff right now to make migration easier: ldap for managing user accounts, nfs for user file storage, and ansible for having the same config (eg. for the webserver) on multiple machines. then it won't be hard to have multiple webservers running, with a loadbalancer in front
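the ansible part could start as small as this (a sketch; the group name and file paths are examples):

```yaml
# site.yml — keep the webserver config identical on every host
- hosts: webservers
  become: true
  tasks:
    - name: Deploy shared nginx config
      ansible.builtin.copy:
        src: files/nginx.conf
        dest: /etc/nginx/nginx.conf
      notify: reload nginx
  handlers:
    - name: reload nginx
      ansible.builtin.service:
        name: nginx
        state: reloaded
```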

for your users, it might also be interesting to give them access to the webserver access and error logs *of their vhosts*, which can be useful for debugging. ofc you can redact some info in the access log (such as IP addresses) in the webserver config
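in nginx, that redaction could look like this (a sketch; the format name and log path are examples — the point is just to leave `$remote_addr` out of the format):

```nginx
# in the http {} block: a log format without the visitor's IP
log_format redacted '- - [$time_local] "$request" $status $body_bytes_sent "$http_referer"';

# in the user's server {} block:
access_log /var/log/nginx/alice.example.net.access.log redacted;
```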


@matthilde i mean you could also go the "lol containerize everything" way & put everything in docker & use traefik or so as the reverse proxy
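that route could start from something like this compose file (a sketch; image versions, paths, and the domain are assumptions — the labels follow traefik's docker provider conventions):

```yaml
services:
  traefik:
    image: traefik:v2.10
    command:
      - --providers.docker=true
      - --entrypoints.web.address=:80
    ports: ["80:80"]
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro

  alice-site:
    image: nginx:alpine
    volumes:
      - /home/alice/www:/usr/share/nginx/html:ro
    labels:
      - traefik.http.routers.alice.rule=Host(`alice.example.net`)
```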

