this post was submitted on 03 Dec 2025
215 points (97.8% liked)

Selfhosted

53448 readers
498 users here now

A place to share alternatives to popular online services that can be self-hosted without giving up privacy or locking you into a service you don't control.

Rules:

  1. Be civil: we're here to support and learn from one another. Insults won't be tolerated. Flame wars are frowned upon.

  2. No spam posting.

  3. Posts have to be centered around self-hosting. There are other communities for discussing hardware or home computing. If it's not obvious why your post topic revolves around selfhosting, please include details to make it clear.

  4. Don't duplicate the full text of your blog or github here. Just post the link for folks to click.

  5. Submission headline should match the article title (don’t cherry-pick information from the title to fit your agenda).

  6. No trolling.

  7. No low-effort posts. This is subjective and will largely be determined by the community member reports.

Resources:

Any issues on the community? Report it using the report flag.

Questions? DM the mods!

founded 2 years ago

… by running your own instance of the free and open-source federated metasearch engine SearXNG on OpenBSD!

[–] N0x0n@lemmy.ml 4 points 4 days ago* (last edited 4 days ago) (9 children)

I used to self-host SearXNG for a while, but somehow the search results were always off and mixed with too many irrelevant results :/.

It's not about SearXNG itself... rather how the most relevant info gets drowned in AI slop and nonsense bullshit. The best blog posts/info are passed from person to person...

I'm kinda sad to admit that stupid AI "solved" this issue and gave better results :/

[–] sunstoned@lemmus.org 2 points 4 days ago (3 children)

You can self-host that too ;)

OpenWebUI + Ollama + SearXNG. OpenWebUI can do LLM web search using the engine of your choice (even a self-hosted SearXNG!). From there it's easy to set the default prompt to always give you the top (10, 20, whatever) raw results so you're not stuck with only the AI's answer. It's not quite as slick as duck.ai, but I think I can get there with some more tinkering.
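
Rough sketch of the plumbing, if it helps -- this assumes plain Docker, default ports, and a made-up network name (ai-stack) so the containers can reach each other; swap in whatever fits your setup:

# shared network so the containers can talk to each other by name
docker network create ai-stack

# Ollama serves local models on 11434
docker run -d --network ai-stack -p 11434:11434 -v ollama:/root/.ollama --name ollama ollama/ollama

# SearXNG on 8080 (its docs cover a proper settings.yml; this just gets it up)
docker run -d --network ai-stack -p 8080:8080 -v "${PWD}/searxng:/etc/searxng" --name searxng searxng/searxng

# OpenWebUI on 3000, pointed at the Ollama container by name
docker run -d --network ai-stack -p 3000:8080 -e OLLAMA_BASE_URL=http://ollama:11434 -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

Pull a model into the Ollama container afterwards (docker exec -it ollama ollama pull <model>) and OpenWebUI on port 3000 should pick it up.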

[–] dubyakay@lemmy.ca 3 points 4 days ago (1 children)

Is there a guide on how to do this on Linux + 16GB Radeon?

[–] sunstoned@lemmus.org 6 points 4 days ago* (last edited 4 days ago)

I mean, I could write one! I kind of just pieced it together from the guides for the three individual pieces.

Edit: the back-of-the-napkin guide below is basically in the OpenWebUI docs already! I use NixOS (btw), but docker/podman should work well.

OpenWebUI + Ollama setup -- tl;dr:

docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
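
For the Radeon question: Ollama ships a ROCm image, so on Linux something like this should get GPU inference going (assuming the card is supported by ROCm; the /dev/kfd and /dev/dri passthrough is what Ollama's Docker docs describe):

# ROCm build of Ollama, with the AMD GPU devices passed through
docker run -d --device /dev/kfd --device /dev/dri -p 11434:11434 -v ollama:/root/.ollama --name ollama ollama/ollama:rocm

If OpenWebUI doesn't find it on its own, add -e OLLAMA_BASE_URL=http://host.docker.internal:11434 to the command above (or put both containers on one docker network).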

OpenWebUI SearXNG guide -- a little more involved, but not difficult.
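
The short version, from memory (OpenWebUI has renamed some of these env vars across releases, so double-check the current web search docs): let SearXNG answer in JSON, then rerun OpenWebUI with web search enabled and aimed at it.

# 1. in SearXNG's settings.yml, allow the JSON format OpenWebUI queries:
#      search:
#        formats:
#          - html
#          - json
#
# 2. run OpenWebUI with web search enabled and pointed at the searxng container
#    (same docker network as SearXNG, here the ai-stack placeholder from above,
#    so the hostname resolves)
docker run -d --network ai-stack -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://ollama:11434 \
  -e ENABLE_RAG_WEB_SEARCH=True \
  -e RAG_WEB_SEARCH_ENGINE=searxng \
  -e RAG_WEB_SEARCH_RESULT_COUNT=10 \
  -e SEARXNG_QUERY_URL="http://searxng:8080/search?q=<query>" \
  -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main

RAG_WEB_SEARCH_RESULT_COUNT is the knob for the "top 10/20 raw results" part.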
