this post was submitted on 13 Sep 2025 to the Selfhosted community

Links are almost always base64 encoded now and the online url decoders always produce garbage. I was wondering if there is a project out there that would allow me to self-host this type of tool?

I'd probably network this container through gluetun because, yanno, privacy.

Edit to add: Doesn't have to be specifically base64 focused. Any link decoder that I can use in a privacy-respecting way would be welcome.

Edit 2: See if your solution will decode this link (the one in the image): https://link.sfchronicle.com/external/41488169.38548/aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM/6813d19cc34ebce18405decaB7ef84e41 (it should decode to this page: https://www.hotdogbills.com/hamburger-molds)

top 18 comments
[–] FreedomAdvocate@lemmy.net.au 2 points 7 months ago* (last edited 7 months ago) (1 children)

That URL isn't base64 encoded. You can tell by the fact that it's still a URL, and doesn't decode…

[–] possiblylinux127@lemmy.zip 2 points 7 months ago (1 children)

...that one little detail everyone missed

[–] Hawk@lemmy.dbzer0.com 2 points 7 months ago

There is no such thing as a base64-encoded URL. Part of a URL might hold base64-encoded data, but never the URL itself.

These online tools aren't working because you're using them wrong.

[–] hendrik@palaver.p3x.de 2 points 7 months ago (1 children)

There's base64 -d on the command line.

[–] ReedReads@lemmy.zip 1 points 7 months ago (1 children)

base64 -d

Right, but the / in the URL trips it up, and I'd like to just copy/paste the full URL and have it spit out the proper, decoded link.

[–] ExFed@programming.dev 3 points 7 months ago* (last edited 7 months ago) (1 children)

The / character isn't a part of the base64 encoding. In fact, only one part of the URL looks like base64. No plain base64 tool (whether via CLI, self-hosted, or otherwise) will be able to decode an entire URL like that. You'll first need to parse the URL to isolate the base64 part. This is literally solved with a single line of bash:

echo "https://link.sfchronicle.com/external/41488169.38548/aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM/6813d19cc34ebce18405decaB7ef84e41" | cut -d/ -f6 | base64 -d

See TIO for example.

edit: add TIO link

[–] ReedReads@lemmy.zip 0 points 7 months ago (3 children)
  1. Thank you for this
  2. You know more than I do re: bash. Where can I learn what | cut -d/ -f6 | means? I assume the cut is the parsing? But maybe that is wrong? Would love to learn how to learn this.
[–] 30p87@feddit.org 2 points 7 months ago* (last edited 7 months ago)

cut --help and man cut can teach you more than anyone here.

But: "|" takes the output of the former command, and uses it as input for the latter. So it's like copying the output of "echo [...]", executing "cut -d '/' -f 6", and pasting it into that. Then copy the output of "cut", execute "base64 -d" and paste it there. Except the pipe ("|") automates that on one line.

And yes, cut takes a string (so a list of characters, for example the URL), splits it at whatever -d specifies (eg. cut -d '/' splits at "/"), so it now internally has a list of strings: "https:", "", "link.sfchronicle.com", "external", "41488169.38548", "aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM" and "6813d19cc34ebce18405decaB7ef84e41". From that list it outputs whatever is specified by -f (so eg. -f 6 means the 6th of those strings, -f 2-3 means the 2nd to 3rd string, -f -5 means everything up to and including the fifth, and -f 3- means everything from the third onwards).

But all of that is explained better in the manpage (man cut). And the best way to learn is to just fuck around. So echo "t es t str i n g, 1" | cut ... and try various arguments.
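
For example (the sample string here is made up; the output is shown under each command):

$ echo "a/b/c/d e/f" | cut -d '/' -f 3
c

$ echo "a/b/c/d e/f" | cut -d '/' -f 2-4
b/c/d e

Note that cut only splits on the delimiter, so the space inside "d e" stays part of its field, and ranges are printed back out with the delimiter between fields.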

[–] krnl386@lemmy.ca 2 points 7 months ago* (last edited 7 months ago) (1 children)

Try explainshell.com - you can paste in any oneliner and the site will parse it and explain each part.

Here’s the link

[–] Enoril@jlai.lu 1 points 7 months ago

Really nice! Thanks for sharing this

[–] ccryx@discuss.tchncs.de 1 points 7 months ago* (last edited 7 months ago)

You can use man <command> (in this case man cut) to read a program's manual page. Appending --help (without any other arguments) will often produce at least a short description of the program and a list of the available options.

[–] Scripter17@lemmy.world 1 points 7 months ago* (last edited 7 months ago)

I've been working on a URL cleaning tool for almost 2 years now and just committed support for that type of URL. I'll release it to crates.io shortly after Rust 1.90 on the 18th.

https://github.com/Scripter17/url-cleaner

It has 3 frontends right now: a CLI, an HTTP server and userscript to clean every URL on every webpage you visit, and a discord bot. If you want any other integration let me know and I'll see what I can do.

Also, amusingly, you decoded the base64 wrong. You forgot to change the _ to / and thus missed the /burger-dog-mold and tracking parameter garbage at the end. I made sure to remove the tracking parameters.
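
For anyone who wants the quick fix without another tool, here's a rough shell sketch of that base64url conversion (assuming GNU coreutils and that the encoded chunk is the 6th /-separated field, as it is in the sample link):

#!/bin/sh
# sketch: grab the 6th path field, map base64url chars to standard base64, pad, decode
url="$1"
b64=$(printf '%s' "$url" | cut -d/ -f6 | tr '_-' '/+')
# these links omit '=' padding, so add it back until the length is a multiple of 4
while [ $(( ${#b64} % 4 )) -ne 0 ]; do b64="${b64}="; done
printf '%s\n' "$b64" | base64 -d
echo

Run against the link from the post, this prints the full target URL, tracking parameters and all; stripping those afterwards is the part url-cleaner handles.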

Edit: Published on crates.io and github under AGPL. Sadly the discord frontend couldn't be published to crates.io because to work around something (I forget exactly what) I changed a dependency from the one on crates.io to a more up-to-date version of it on github. Crates.io correctly rejects that kind of stuff. If you want to use the discord frontend, git clone the repository then run cargo build -r -p url-cleaner-discord-app.
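
In other words, roughly (assuming a working Rust toolchain; the package name comes from the comment above, not from testing):

git clone https://github.com/Scripter17/url-cleaner
cd url-cleaner
cargo build -r -p url-cleaner-discord-app
# the release binary lands under target/release/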

The offer to write extra frontends stands, btw. If you want a slack bot I'll make one.

[–] moseschrute@lemmy.world 1 points 7 months ago* (last edited 7 months ago)

Not that you should vibe code, but you could vibe code this so easily. Have it output a static website. Give the source code a scan if you’re paranoid. Check the network tab if you’re really really paranoid. But literally you could have it output this as a static index.html file that you drop into your browser of choice.

This is the only type of coding LLMs should ever be used for imo. A small, very clearly defined task that is very easy to verify if it works. And code that won’t infect a larger project.

Edit: as others pointed out, that url isn’t base64 encoded. You would have to clearly define what you are trying to do if you want this to work. For example, do all urls follow the same format as the above?

[–] e0qdk@reddthat.com 0 points 7 months ago (1 children)

There's something else going on there besides base64 encoding of the URL -- possibly they have some binary tracking data or other crap that only makes sense to the creator of the link.

It's not hard to write a small Python script that gets what you want out of a URL like that though. Here's one that works with your sample link:

#!/usr/bin/env python3

import base64
import binascii
import itertools
import string
import sys

# take the URL from the command line and split it at '/'
input_url = sys.argv[1]
parts = input_url.split("/")

# rebuild the URL's trailing chunks right-to-left and try to base64-decode each one
for chunk in itertools.accumulate(reversed(parts), lambda b,a: "/".join([a,b])):
  try:
    text = base64.b64decode(chunk).decode("ascii", errors="ignore")
    # keep only the printable prefix, dropping any binary garbage at the end
    clean = "".join(itertools.takewhile(lambda x: x in string.printable, text))
    print(clean)
  except binascii.Error:
    # not valid base64, try the next chunk
    continue

Save that to a file like decode.py and then you can run it on the command line like python3 ./decode.py 'YOUR-LINK-HERE'

e.g.

$ python3 ./decode.py 'https://link.sfchronicle.com/external/41488169.38548/aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM/6813d19cc34ebce18405decaB7ef84e41'
https://www.hotdogbills.com/hamburger-molds/burger-dog-mold

This script works by splitting the URL at '/' characters, then recombining the parts (right-to-left) and checking whether that chunk of text can be base64-decoded successfully. If it can, it takes the printable ASCII characters at the start of the decoded string and outputs them (to clean up the garbage characters at the end). If there's more than one possible valid interpretation as base64, it will print them all as it finds them.

[–] ReedReads@lemmy.zip 1 points 7 months ago

Wow, this is really helpful. Thank you!!

[–] countzukula@lemmy.world 0 points 7 months ago (1 children)
[–] qwerty@discuss.tchncs.de 1 points 7 months ago

I was about to install it on my server until I found out that it's developed by the UK government. Now I won't trust it even though it's open source.