Long-term Linux operations guy who somehow became a Golang developer.
I also run the lemmy.serverfail.party instance
Yeah honestly, it's great so far. I tried searxng for quite a while and it did the trick somewhat, but damn SEO farms were my biggest pet peeve. The time I save is worth the money
Pretty great on the web browser front-end to be honest - haven't had an issue when I've used it on my phone. Not sure about the app side of things, since I've been trying to limit my doomscrolling to when I'm at a computer
Fired up a FreshRSS instance for myself when the Reddit API changes were announced. Reminds me of my Google Reader days - quite happy with it thus far. Most decent-quality news sites seem to have an RSS feed, at least in my experience so far.
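For anyone curious, it's not much work to stand up. A rough sketch of how I'd do it with Docker and the official freshrss/freshrss image - the host port, volume names, and cron schedule here are just example values, so double-check the image docs before copying:

# FreshRSS on host port 8080; data and extensions kept in named volumes
docker run -d --name freshrss \
  -p 8080:80 \
  -v freshrss_data:/var/www/FreshRSS/data \
  -v freshrss_extensions:/var/www/FreshRSS/extensions \
  -e CRON_MIN='13,43' \
  --restart unless-stopped \
  freshrss/freshrss

After that it's just the web setup wizard, then point your feed reader apps at it.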
I'd say any LTS release you can get a working Adobe setup on should be fine for them. 90% of what they're going to do is probably via a browser, so it's OS-agnostic. I'm fond of Debian since it's very stable, though the drawback is that packages get older as time goes on - you can pull in extra repos like backports for the important stuff.
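On the backports bit, this is roughly what I mean - assuming Debian 12 (bookworm) here; swap in your release's codename and whatever package you actually need:

# Enable the backports repo (bookworm is an assumption - use your release's codename)
echo 'deb http://deb.debian.org/debian bookworm-backports main' | sudo tee /etc/apt/sources.list.d/backports.list
sudo apt update

# Pull a newer package from backports, e.g. a more recent kernel
sudo apt install -t bookworm-backports linux-image-amd64

Regular packages still come from stable; only what you explicitly request with -t comes from backports.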
What do they plan to do with it? Just browse to gmail/facebook/etc? If so, really anything with a web browser that can stay up to date and they should be fine. LTS releases are good in that case.
If it's anything more than that, then you might have to be a bit more selective with the distro.
Yeah - this was a tad annoying at work today. Thank god for Terraform in case the outage had gotten more severe
In all honesty, there are a ton of us tech enthusiasts who have no problem paying $10-20 per month to run an instance out of our own pockets. We get the ability to subscribe to content we used to use Reddit for, and we can have a few folks hop on with us. Multiply that by a bunch, add in community-funded instances, and we'll be fine.
Gotta consider that server costs were only a fraction of Reddit's costs. Salaries are quite pricey, and we have lots of folks volunteering their time, which will make it all work.
Ran it around Christmas - was still an intense resource hog. Lots of features and great for corps, but too much otherwise
I just run a searxng instance for myself. It fetches results from multiple sources.
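If anyone wants to try it, a minimal sketch of how I'd spin one up with Docker using the official searxng/searxng image - the host port and hostname are placeholders, and the env var is from memory, so check the SearXNG docs for the current settings:

# SearXNG on host port 8888; config lives in /etc/searxng inside the container
docker run -d --name searxng \
  -p 8888:8080 \
  -v searxng_config:/etc/searxng \
  -e SEARXNG_BASE_URL=https://search.example.org/ \
  --restart unless-stopped \
  searxng/searxng

Stick a reverse proxy with TLS in front of it if you expose it beyond localhost.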
I've heard good things about Kagi, but it does require paying (though you can try a free tier to see if it'll work for you)
There are some folks in the lemmy_support area lurking around offering help on the technical side, if that’s what you’re after! Many don’t have time to dedicate to running a full instance themselves, but are happy to help with the setup
This is basically why I'm sticking around, besides being able to have a copy of the content I consume on servers I can do something about (i.e., back up).
Not expecting things to get better after IPO personally
This is my droplet with 1GB of RAM only running lemmy:
free -m
               total        used        free      shared  buff/cache   available
Mem:             964         386          68         141         509         219
Swap:           2047         310        1737
So expect at least 1GB for Lemmy with Postgres included, once you account for spikes etc.
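And since the swap is clearly doing some work there, you'll want some on a 1GB droplet. A rough sketch of how I'd add a swapfile - the 2G size is just what I happen to use, size it for your box:

# Create and enable a 2GB swapfile
sudo fallocate -l 2G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile

# Make it persist across reboots
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab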
Same. I know it's more work than Caddy etc., but I've been doing it for eons now, so it's muscle memory at this point.
Take a look at https://browse.feddit.de/
There's an auto-updating list that even shows popularity levels - helps a ton with finding them!
New communities have been popping up like crazy today and over the previous couple of days, so it's a bit much to keep track of.
The bigger instances are mostly fine on the auth side; it's primarily pictures and some slow SQL stuff that's still being worked on. So the best thing users on smaller instances can do is be aware that the bigger ones may go up and down a little, and content from communities on those instances may come in bursts.
Awesome, thanks for all the recommendations, folks! I'm going to try out calibre-web and Kavita and see how they are
Thanks! It’s my “fun” domain.
But yeah, you shouldn't have any issues with bandwidth if you don't have a massive amount of users. From what I've been seeing, the big instances are running into bottlenecks related to CPU/disk speed rather than network speed.
Judging from my DO usage network chart, with me subscribed to a ton of communities: minimal. Just a lot of API calls back and forth from federated servers.
Git clone completed - I've been seeing a lot of companies crack down on everything lately, so clone your favourite repos!
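If you want a full copy rather than just a working tree, a mirror clone is the way to go - the repo URL below is obviously a placeholder:

# Mirror clone grabs all branches, tags, and refs, not just the default branch
git clone --mirror https://github.com/someuser/somerepo.git

# Later, refresh the mirror with any new commits
cd somerepo.git && git remote update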
Generally, if you're in the same country you'd have to comply. As another example though: if your server was in Canada and some department in Alabama wanted your data, you could tell them to pound sand. Though they may put some sort of warrant out for you for failure to comply (which doesn't matter if you never go there)