Is there a good alternative to GitHub Pages? I just need a static website up.

  • I have a domain.
  • I have my site (local machine)
  • And that’s all I have.
  • I have a machine that could be running 24/7 too.
  • jqubed@lemmy.world · 10 days ago

    There’s actually a surprising amount of free static website hosting out there. Besides GitHub, GitLab, Cloudflare, and Netlify come to mind offhand.

  • meh@piefed.blahaj.zone · 10 days ago

    If you’ve already got something at home to run it on and want something easy to set up and maintain, take a look at MkDocs.

  • csm10495@sh.itjust.works · 10 days ago

    Something that may help:

    Why doesn’t GitHub Pages fit your use case? It’s nice to get free static hosting from them.

      • csm10495@sh.itjust.works · 10 days ago

        In what way? Anything on the public internet is likely being used for AI training. I guess by using free GitHub you can’t object to training.

        Then again anywhere you host you sort of run into the same problem. You can use robots.txt, but things don’t have to listen to it.
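        For what it’s worth, a minimal robots.txt naming a couple of well-known AI crawler user agents might look like this (advisory only, as noted above, and the list of crawlers is far from complete):

        ```text
        # robots.txt at the site root; purely advisory, crawlers can ignore it
        User-agent: GPTBot
        Disallow: /

        User-agent: CCBot
        Disallow: /
        ```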

        • jqubed@lemmy.world · 10 days ago

          If you’re self-hosting there are some ways to fight back, or, depending on your opinion of Cloudflare, they seem fairly effective at blocking the AI crawlers.

          • AmbiguousProps@lemmy.today · 10 days ago

            Yep, on top of simply blocking, if you’re self hosting or using cloudflare, you can enable AI tarpits.

            • iveseenthat@reddthat.com (OP) · 9 days ago

              How do I do this? I don’t mind (and may prefer) to host not at home. My main concern with GH is that you become an AI snack whether you like it or not.

              • AmbiguousProps@lemmy.today · 9 days ago · edited

                Which part? If you’re wanting to use cloudflare pages, it’s relatively straightforward. You can follow this and get up & running pretty quickly: https://www.hongkiat.com/blog/host-static-website-cloudflare-pages/

                If you’re asking about the tarpits, there are (generally) two ways to accomplish that. Even if you don’t use cloudflare pages to host your site directly (if you use nginx on your server, for example), you can still enable AI tarpits for your entire domain, so long as you use cloudflare for your DNS provider. If you use pages, the setup is mostly the same: https://blog.cloudflare.com/ai-labyrinth/#how-to-use-ai-labyrinth-to-stop-ai-crawlers

                If you want to do it all locally, you could instead set up iocaine or nepenthes, which are both self-hosted and can integrate with various webserver software. Obviously, cloudflare’s tarpits are stupid simple to set up compared to these, but these give you greater control of exactly how you’re poisoning the well and trapping crawlers.

  • Jeena@piefed.jeena.net · 10 days ago

    If you want free static hosting then probably: https://wasmer.io/

    If you have the machine at home then you could set up port forwarding to it, but you would need to do everything yourself like:

    • running a web server like nginx
    • setting up ssl for it with certbot
    • storing the static files in /var/www/html for example
    • port forwarding from your router to that machine
    • using some service like DuckDNS to point a domain to your dynamic IP at home
    • pointing a CNAME to the DuckDNS subdomain on your domain
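    A minimal nginx server block for that setup might look like this (the file name and DuckDNS domain are placeholders; running certbot --nginx afterwards rewrites it to add the SSL bits):

    ```nginx
    # /etc/nginx/sites-available/mysite (hypothetical name)
    server {
        listen 80;
        server_name example.duckdns.org;  # the DuckDNS name your CNAME points at

        root /var/www/html;               # the static files from your local machine
        index index.html;
    }
    ```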
  • foremanguy@lemmy.ml · 9 days ago

    If it’s purely static, without the need to easily generate new pages, simply use a web server.

  • SolidGrue@lemmy.world · 10 days ago

    So, uh…

    DigitalOcean is pretty inexpensive at US$7 monthly for 1 vCPU/1GB RAM with 1TB transfer. Decent platform. US-based, alas.

    (2025 September, for the archives)

  • S0UPernova@lemmy.world · 10 days ago

    I use nginx. You can have configs for different sites: set server_name to the domain in each server block (I use a file per site), and either serve static files via a root folder or use proxy_pass for actively running servers; nginx will map the domains to the server blocks. You should also have a default, and you can then have multiple domains pointing to the same IP address. Keep in mind that home internet often has a dynamic IP, so you may need to update it every so often. There are services to help with the dynamic IP; I think noip.com has a solution available, but feel free to look around.
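    As a sketch of that layout (hypothetical domains, one file per server block):

    ```nginx
    # default server block: catches requests for unmatched domains
    server {
        listen 80 default_server;
        return 444;  # close the connection without a response
    }

    # static site, served from a root folder
    server {
        listen 80;
        server_name blog.example.com;
        root /var/www/blog;
    }

    # actively running server behind proxy_pass
    server {
        listen 80;
        server_name app.example.com;
        location / {
            proxy_pass http://127.0.0.1:3000;
        }
    }
    ```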

  • Hawk@lemmynsfw.com · 10 days ago

    You could port forward.

    However, I’d buy a DigitalOcean droplet for 10 USD a month, point the A record of the domain to that, and then use Caddy to implement SSL.

    Caddy can run an HTTP server or reverse proxy something on localhost.
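    A minimal Caddyfile covering both cases might look like this (the domains are placeholders; Caddy obtains the certificates automatically once the A records resolve to the droplet):

    ```
    # static files
    example.com {
        root * /var/www/site
        file_server
    }

    # reverse proxy to something on localhost
    app.example.com {
        reverse_proxy localhost:8080
    }
    ```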

  • panda_abyss@lemmy.ca · 10 days ago

    I have not deployed Garage S3, but it has a static pages feature you could use: just build your static files with Jekyll or something, create a bucket, and set the permissions.
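    Assuming a running Garage cluster, that flow might look roughly like this (the bucket name and endpoint are placeholders; check the Garage docs for the exact flags):

    ```shell
    # build the static site (Jekyll assumed here, as above)
    jekyll build                      # output lands in _site/

    # create the bucket and enable website serving (Garage CLI)
    garage bucket create my-site
    garage bucket website --allow my-site

    # upload with any S3 client pointed at the Garage endpoint
    aws s3 sync _site/ s3://my-site/ --endpoint-url http://garage.local:3900
    ```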