"Buy Me A Coffee"

  • 3 Posts
  • 85 Comments
Joined 1 year ago
Cake day: June 13th, 2023

  • Yes it would. In my case, though, I know all of the users that should have remote access, and I’m more concerned about unauthorized access than ease of use.

    If I wanted to host a website for the general public to use, though, I’d buy a VPS and host it there, then use SSH with private-key authentication for remote management. This way, again, if someone hacks that server they can’t get access to my home LAN.
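
    For anyone unfamiliar, the key-auth setup is only a few commands. A minimal sketch (the hostname and username are placeholders, and hardening beyond key auth is out of scope):

    ```
    # Generate a key pair on your local machine (ed25519 is a good modern default)
    ssh-keygen -t ed25519

    # Copy the public key to the VPS (appends to ~/.ssh/authorized_keys)
    ssh-copy-id user@my-vps.example.com

    # On the VPS, disable password logins in /etc/ssh/sshd_config:
    #   PasswordAuthentication no
    # ...then reload sshd:
    sudo systemctl reload ssh
    ```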


  • Their setup sounds similar to mine. But no, only a single service is exposed to the internet: WireGuard.

    The idea is that you can have any number of servers running on your LAN, but in order to access them remotely you first need to VPN into your home network. This way the only thing you need to worry about, security-wise, is WireGuard. If there’s a security hole or vulnerability in one of the services you’re running on your network, or in nginx, etc., attackers would still need to get past WireGuard before they could access your network.

    But here is exactly what I’ve done:

    1. Bought a domain so that I don’t have to remember my IP address.
    2. Set up DDNS so that the A record for my domain always points to my home IP.
    3. Run a WireGuard server on my LAN.
    4. Forwarded the WireGuard port on my router to the WireGuard server.
    5. Created client configs for all remote devices that should have access to my LAN.

    Now I can just turn on my phone’s VPN whenever I need to access any one of the services that would normally only be accessible from home.

    P.S. There are additional steps I took to make sure that the VPN’s masquerading is disabled, that all VPN clients use my Pi-hole, and that I still get decent internet speeds while on the VPN. But that’s slightly beyond the original ask here.
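
    For reference, a client config along those lines might look like the sketch below. Every key, address, and subnet here is a placeholder, and the server-side masquerade changes aren’t shown; the DNS line and the split-tunnel AllowedIPs are what cover the Pi-hole and speed points from the P.S.

    ```
    # /etc/wireguard/wg0.conf (client side)
    [Interface]
    PrivateKey = <client-private-key>
    Address = 10.8.0.2/32
    # Send DNS queries to the Pi-hole on the home LAN
    DNS = 192.168.1.53

    [Peer]
    PublicKey = <server-public-key>
    # The DDNS name from step 2 plus the forwarded port from step 4
    Endpoint = vpn.example.com:51820
    # Split tunnel: only the home LAN and the VPN subnet go through WireGuard,
    # so regular internet traffic keeps full speed
    AllowedIPs = 192.168.1.0/24, 10.8.0.0/24
    PersistentKeepalive = 25
    ```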


  • A couple of options come to mind, as I just did this myself:

    You can use the CLI tool to “upload” them. You can even do this from the server itself, so uploads run as fast as your network card or your server can handle, whichever is slower. It does require that you create an API key for the user in question, though.
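
    If it helps, the CLI flow looks roughly like this (the server URL and path are placeholders, and exact flags may differ between Immich versions):

    ```
    # Install the CLI (Node-based)
    npm install -g @immich/cli

    # Authenticate with the API key you created for the user
    immich login http://immich.example.com/api <your-api-key>

    # Upload a directory tree
    immich upload --recursive /mnt/photos
    ```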

    Otherwise you can create an external library and link that to your account. Immich will still index this library, but it won’t move or manage the actual files. I’m not sure, though, whether it checks those files for duplicates (i.e. if you try to upload the same photo from your phone to the server). An external library also prevents deleting photos from within Immich, FYI.
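
    If you go the external-library route, the main prerequisite is that the files are visible inside the container, typically as a read-only bind mount; the paths below are placeholders, and the library itself is then registered through the admin UI:

    ```
    # docker-compose.yml (immich-server excerpt)
    services:
      immich-server:
        volumes:
          # Immich-managed uploads
          - ${UPLOAD_LOCATION}:/usr/src/app/upload
          # External library, mounted read-only so Immich can't modify it
          - /mnt/photos:/mnt/photos:ro
    ```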

    There might be other options that I’m not aware of, as I’ve only been using Immich for about a month now.


  • Correct, as I can only provide links to posts that are on your selected home instance. Eventually I’ll change this, but for now you’ll get a 404 page for links that aren’t on your home instance. Also see my P.S. below.

    P.S. There have been changes to the Lemmy API that have prevented me from getting updates for about a month now, so most of the results you’re seeing are from old posts. Until I can rebuild the crawler or find a new API there won’t be any new content.



  • Yep, that’s the new idea. The sad part is that with this method there’s no way to get historical data, only new posts. So if a server goes down, gets DDoSed, etc., I’ll lose those posts forever.

    Also, building an ActivityPub implementation from scratch isn’t trivial either, so that’ll take some time.

    I’ve got a few other ideas I’m playing with as well, like just assuming that internal post IDs are sequential and literally fetching them one by one. Or maybe some combination of both? A rough sketch of the sequential approach is below.
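
    Something like this against the Lemmy HTTP API; the instance URL, delay, and stopping rule are all placeholder choices:

    ```python
    import time
    import requests

    INSTANCE = "https://lemmy.example.com"  # placeholder instance

    def fetch_posts(start_id: int, max_misses: int = 100):
        """Walk post IDs sequentially until a long run of missing IDs."""
        misses, post_id = 0, start_id
        while misses < max_misses:
            resp = requests.get(f"{INSTANCE}/api/v3/post", params={"id": post_id})
            if resp.status_code == 200:
                misses = 0
                yield resp.json()["post_view"]
            else:
                misses += 1  # deleted/removed/unused IDs come back as errors
            post_id += 1
            time.sleep(0.5)  # be polite to the instance
    ```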







  • Think of Lemmy as email. Each post or comment is just an email sent to a distribution group (a community). If your email server goes down, all of those users and distribution groups are gone. I’ll still have the emails I sent you in my own mailbox, but you won’t be able to see them because your email server is offline. Sure, you could create a new account on a new server, but you’d have to tell everyone about your new address (federate), there’s nothing to associate your old user with your new one, and there’s no way to backfill data. I could reply-all or forward (comment) to your new address, but there’s still no way to associate those old posts with your new account.




  • I’m also running Ubuntu on my main machine at home. (I have a Mac and do Android development for my day job.)

    But at home, I do a lot of website and backend dev.

    1. Code in VSCode.
    2. Build using docker buildx.
    3. Test using a local container on my machine.
    4. Push the tested code to a feature branch on Git (self-hosted server).
    5. Pull that same feature branch onto a Raspberry Pi for QA testing.
    6. Merge that same code to develop, which kicks off a CI build that deploys a set of Docker images to Docker Hub.
    7. Merge that to main/master.
    8. That kicks off another CI build.
    9. SSH into my prod machine and run docker compose up -d.
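
    For steps 2 and 9, the commands are roughly as follows; the image name, platforms, and tag are placeholders:

    ```
    # Step 2: multi-arch build (arm64 so the same image also runs on the Pi)
    docker buildx build \
      --platform linux/amd64,linux/arm64 \
      -t myuser/myapp:develop \
      --push .

    # Step 9: on the prod machine
    docker compose pull && docker compose up -d
    ```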

  • There is a public API now. While I won’t support sorting, you can process the results and do what you will with them as-is. Currently I only support posts and communities.

    When you search for posts you’re just matching against the title or body. For communities, it searches the posts within that community.

    There are also more filters now: instance, community, author, since, and until, plus a safe-search option.

    So I’m not sure how close this comes to your idea, but I thought I’d share.
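
    Purely as a hypothetical example of what a query might look like (the host and endpoint are made up; only the filter names come from the description above):

    ```python
    import requests

    # Hypothetical endpoint; substitute the real API base URL
    resp = requests.get(
        "https://search.example.com/api/posts",
        params={
            "q": "wireguard",
            "instance": "lemmy.world",
            "since": "2023-06-01",
            "safe_search": "true",
        },
    )
    print(resp.json())
    ```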





  • I’ve already started to abstract Lemmy away from the search engine itself, so the first steps are in place. Once I get the kinks of the 0.4.x release knocked out, I plan on reading up on Kbin’s API, and then I’ll start working on the crawler. I can’t promise anything, but that should give you a rough timeline. The sketch below gives a rough idea of the shape of that abstraction.
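
    Names here are illustrative, not the actual code; the point is just that the engine indexes a platform-neutral record:

    ```python
    from abc import ABC, abstractmethod
    from dataclasses import dataclass

    @dataclass
    class CrawledPost:
        """Platform-neutral post record that the search engine indexes."""
        url: str
        title: str
        body: str
        author: str
        community: str

    class Crawler(ABC):
        """One implementation per platform: Lemmy today, Kbin next."""
        @abstractmethod
        def fetch_new_posts(self) -> list[CrawledPost]: ...

    class LemmyCrawler(Crawler):
        def fetch_new_posts(self) -> list[CrawledPost]:
            raise NotImplementedError  # wraps the existing Lemmy crawler

    class KbinCrawler(Crawler):
        def fetch_new_posts(self) -> list[CrawledPost]:
            raise NotImplementedError  # to be written against Kbin's API
    ```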

    If you have any programming skills I could always use a hand.