Is there a good alternative to github pages? I need just a static website up.
- I have a domain.
- I have my site (local machine)
- And that’s all I have.
- I have a machine that could be running 24/7 too.
I built this for my personal use: https://git.prisma.moe/aichan/simple_web_editor
There’s actually a surprising amount of free static website hosting out there. Besides GitHub, GitLab, Cloudflare, and Netlify come to mind offhand.
Codeberg does too
Codeberg isn't meant for hosting just any static website, though; it's for FOSS projects. Their FAQ addresses this.
If you've already got something at home to run it on and want it easy to set up and maintain, take a look at MkDocs.
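If it helps, here's a minimal sketch of what that looks like (the site name and theme here are just placeholders):

```yaml
# mkdocs.yml -- site_name is the only required setting
site_name: My Site
theme:
  name: mkdocs        # built-in theme; use "material" instead if you install mkdocs-material
```

```sh
pip install mkdocs
mkdocs serve   # live preview at http://127.0.0.1:8000
mkdocs build   # writes the finished static site into ./site/, ready for any web server
```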
Codeberg pages
Something that may help:
Why doesn’t GitHub Pages fit your use case? It’s nice to get free static hosting from them.
I don't want to serve my work up on a silver platter to their AI.
AI encroachment
In what way? Anything on the public internet is likely being used for AI training. I guess by using GitHub for free you give up any real way to object to the training.
Then again anywhere you host you sort of run into the same problem. You can use robots.txt, but things don’t have to listen to it.
If you're self-hosting there are some ways to fight back, and depending on your opinion of Cloudflare, they seem to be fairly effective at blocking AI crawlers.
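If nothing else you can signal your intent with a robots.txt; something along these lines covers a few of the better-known AI crawlers (the user-agent list changes over time, and as said above, nothing forces them to honor it):

```
# robots.txt -- a polite request, not an enforcement mechanism
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: PerplexityBot
Disallow: /
```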
Yep, on top of simply blocking, if you’re self hosting or using cloudflare, you can enable AI tarpits.
How do I do this? I don’t mind (and may prefer) to host not at home. My main concern with GH is that you become an AI snack whether you like it or not.
Which part? If you’re wanting to use cloudflare pages, it’s relatively straightforward. You can follow this and get up & running pretty quickly: https://www.hongkiat.com/blog/host-static-website-cloudflare-pages/
If you’re asking about the tarpits, there’s two ways (generally) to accomplish that. Even if you don’t use cloudflare pages to host your site directly (if you use nginx on your server, for example), you can still enable AI tarpits for your entire domain, so long as you use cloudflare for your DNS provider. If you use pages, the setup is mostly the same: https://blog.cloudflare.com/ai-labyrinth/#how-to-use-ai-labyrinth-to-stop-ai-crawlers
If you want to do it all locally, you could instead setup iocaine or nepenthes which are both self hosted and can integrate with various webserver software. Obviously, cloudflare’s tarpits are stupid simple to setup compared to these, but these give you greater control of exactly how you’re poisoning the well and trapping crawlers.
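For the self-hosted route the usual pattern is to run the tarpit as its own local service and have your web server hand suspected crawlers over to it. A rough nginx sketch, assuming the tarpit listens on 127.0.0.1:8893 (a placeholder; check the iocaine/nepenthes docs for the real port and their recommended matching rules):

```nginx
# Goes in the http {} context, e.g. /etc/nginx/conf.d/ai-bots.conf
# Placeholder list of known AI crawler user agents; keep it updated.
map $http_user_agent $is_ai_bot {
    default 0;
    ~*(GPTBot|ClaudeBot|CCBot|Bytespider|PerplexityBot) 1;
}

server {
    listen 80;
    server_name example.com;
    root /var/www/html;

    location / {
        # Suspected crawlers get the tarpit; everyone else gets the real site.
        if ($is_ai_bot) {
            proxy_pass http://127.0.0.1:8893;   # assumed local tarpit address
        }
        try_files $uri $uri/ =404;
    }
}
```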
Github, acquired by Microsoft, is now forcing AI on its user base.
That’s one of my main drivers to stay away from GH
If you want free static hosting then probably: https://wasmer.io/
If you have the machine at home then you could set up port forwarding to it, but you would need to do everything yourself (rough sketch after this list), like:
- running a web server like nginx
- setting up ssl for it with certbot
- storing the static files in /var/www/html for example
- port forwarding from your router to that machine
- using some service like DuckDNS to point a domain to your dynamic IP at home
- pointing a CNAME to the DuckDNS subdomain on your domain
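A rough sketch of the nginx + certbot part, assuming Debian/Ubuntu-style paths and a placeholder domain:

```nginx
# /etc/nginx/sites-available/mysite (hypothetical file name)
server {
    listen 80;
    server_name www.example.com;   # the name your CNAME / DuckDNS record resolves to
    root /var/www/html;
    index index.html;

    location / {
        try_files $uri $uri/ =404;
    }
}
```

```sh
sudo ln -s /etc/nginx/sites-available/mysite /etc/nginx/sites-enabled/
sudo nginx -t && sudo systemctl reload nginx
# certbot's nginx plugin fetches the certificate and adds the HTTPS server block for you
sudo certbot --nginx -d www.example.com
```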
If it's purely static and you don't need an easy way to generate new pages, simply use a web server.
So, uh…
DigitalOcean is pretty inexpensive at US$7 monthly for 1 vCPU/1 GB RAM with 1 TB transfer. Decent platform. US-based, alas.
(2025 September, for the archives)
Oracle Cloud will give you far more for free.
Oracle Cloud will also delete your shit for the price of admission.
Caveat emptor, hey?
Mine has been running for years now without any such deletions.
And I genuinely hope it stays that way for years more to come. Cheers.
I use nginx. You can have a config for each site (I use one file per site), set server_name to that site's domain in its server block, and serve either static files via a root folder or a running service via proxy_pass; nginx maps incoming domains to the matching server blocks, and you should also have a default. That way multiple domains can point to the same IP address. Keep in mind that home internet often has a dynamic IP, so you may need to update your DNS every so often; there are services to help with that, I think noip.com has a solution available, but feel free to look around.
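To illustrate, two server blocks on the same IP, one static and one proxied (the domains and backend port are placeholders):

```nginx
# e.g. /etc/nginx/sites-available/site-a
server {
    listen 80;
    server_name a.example.com;          # nginx picks this block when the Host header matches
    root /var/www/site-a;               # plain static files
}

# e.g. /etc/nginx/sites-available/site-b
server {
    listen 80;
    server_name b.example.com;
    location / {
        proxy_pass http://127.0.0.1:3000;   # an app already running locally
    }
}

# catch-all for requests that match no other server_name
server {
    listen 80 default_server;
    return 444;
}
```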
- Any of https://staticsitegenerators.bevry.me/
- Any webserver + virtualhost config that serves plain HTML pages
- a build/upload script
GitLab has their own version of Pages
You could port forward.
However, I'd buy a DigitalOcean droplet for 10 USD a month, point the domain's A record at it, and then use Caddy to handle SSL.
Caddy can serve static files over HTTP or reverse-proxy something running on localhost.
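A minimal Caddyfile sketch for both cases (the domains, path, and port are placeholders); Caddy obtains and renews the certificates automatically:

```
example.com {
    root * /var/www/mysite
    file_server
}

# or reverse-proxy an app already listening on localhost
app.example.com {
    reverse_proxy localhost:3000
}
```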
$10/month just for a static website is a lot, especially with free alternatives out there.
Neocities?
I also thought about it, but the custom domain feature only works on the $5 / month plan.
I have not deployed Garage S3, but it has a static-pages feature you could use: just build your static files with Jekyll or something, create a bucket, and set the permissions.
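Very roughly, assuming Garage is already running and exposing an S3 endpoint (the endpoint URL and bucket name are placeholders, and the step that turns on website serving for the bucket is described in the Garage docs):

```sh
jekyll build                                                        # static output lands in _site/
aws s3 mb s3://my-site --endpoint-url https://s3.example.internal
aws s3 sync _site/ s3://my-site --endpoint-url https://s3.example.internal
# then enable the bucket's static-website option as described in the Garage documentation
```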