Status update July 4th
Just wanted to let you know where we are with Lemmy.world.
Issues
As you might have noticed, things still don't work as desired. We see several issues:
Performance
- Loading is mostly OK, but sometimes things take forever
- We (and you) see many 502 errors, resulting in empty pages etc.
- System load: The server sits at roughly 60% CPU usage and around 25 GB of RAM usage. (That is, if we restart Lemmy every 30 minutes; otherwise memory usage climbs to 100%.)
Bugs
- Replying to a DM doesn't seem to work. When you hit reply, you get a box containing the original message, which you can edit and save (but saving does nothing)
- 2FA seems to be a problem for many people. It doesn't always work as expected.
Troubleshooting
We have many people helping us with (site) moderation, sysadmin work, troubleshooting, advice, etc. There are currently 25 people in our Discord, including admins of other servers. The sysadmin channel has 8 people in it. We do troubleshooting sessions with them, and sometimes with others. One of the Lemmy devs, @nutomic@lemmy.ml, is also helping with the current issues.
So, not everything is running as smoothly as we hoped yet, but with all this help we'll surely get there! Also, thank you all for the donations; they make it possible to get the hardware and tools needed to keep Lemmy.world running!
I expect the memory growth is outbound federation activity: every post/comment/like going out to subscribed servers is held in RAM. !lemmyperformance@lemmy.ml is a community focused on scaling Lemmy.
Lemmy also buffers failed posts/votes in memory to retry later, for example when a remote server is having issues like Lemmy.world and Lemmy.ml currently are.
Couple that with the server having to deal with a rapidly growing number of servers to federate with (our federation model currently links every server with every other server in the worst case, so the number of connections grows roughly with the square of the number of instances), and that's probably doing bad things to performance.
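To make that concrete, here's a rough Rust sketch of the kind of in-memory outbound queue being described. This is not Lemmy's actual code; the `OutboundQueue`/`Delivery` names and the numbers are made up for illustration. Each new activity gets queued once per linked instance, and deliveries that fail stay in RAM for a later retry, so memory grows with activity volume, with the number of linked instances, and with how many of those instances are currently unreachable.

```rust
use std::collections::VecDeque;

/// One federation activity (post/comment/like) destined for one remote instance.
/// Illustrative only, not Lemmy's actual types.
struct Delivery {
    target_instance: String, // e.g. "lemmy.ml"
    activity_json: String,   // the serialized ActivityPub activity
}

/// Simplified in-memory outbound queue.
struct OutboundQueue {
    pending: VecDeque<Delivery>,
}

impl OutboundQueue {
    fn new() -> Self {
        Self { pending: VecDeque::new() }
    }

    /// Every new activity is queued once per linked instance,
    /// so memory grows with (activities) x (linked instances).
    fn enqueue_for_all(&mut self, activity_json: &str, linked_instances: &[String]) {
        for instance in linked_instances {
            self.pending.push_back(Delivery {
                target_instance: instance.clone(),
                activity_json: activity_json.to_string(),
            });
        }
    }

    /// Try to send everything; deliveries that fail (remote server down,
    /// 502s, timeouts) are kept in RAM until a later retry succeeds.
    fn flush(&mut self, deliver: impl Fn(&Delivery) -> bool) {
        let mut failed = VecDeque::new();
        while let Some(d) = self.pending.pop_front() {
            if !deliver(&d) {
                failed.push_back(d);
            }
        }
        self.pending = failed;
    }
}

fn main() {
    // With ~1,000 linked instances, a single burst of 100 activities
    // already means 100,000 queued deliveries held in memory.
    let linked: Vec<String> = (0..1_000).map(|i| format!("instance-{i}.example")).collect();
    let mut queue = OutboundQueue::new();
    for _ in 0..100 {
        queue.enqueue_for_all("{\"type\":\"Like\"}", &linked);
    }
    // Pretend 1 in 10 instances is currently unreachable: those deliveries stay queued.
    queue.flush(|d| !d.target_instance.ends_with("3.example"));
    println!("deliveries still held in RAM: {}", queue.pending.len());
}
```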