
Missing comments from subscribed community on another instance?

I do the majority of my Lemmy browsing on my own personal instance, and I've noticed that some threads are missing comments; in some large threads, large quantities of them. Now, I'm not talking about comments not being present when you first subscribe to or discover a community from your instance. In this case, I noticed it with a lemmy.world thread that popped up less than a day ago, well after I subscribed.

At the time of writing, that thread has 361 comments. When I view the same thread on my instance, I can see only 118, which is a large swathe of missing content for just one thread. I can use the search feature to forcibly resolve a particular comment to my instance and reply to it, but that defeats a lot of the purpose behind having my own instance.
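For reference, the manual workaround is basically a resolve_object call against my own instance's API. A rough sketch in Python (the instance name, comment URL, and JWT below are placeholders, and on 0.17/0.18 the token is passed as an auth query parameter; newer versions use an Authorization header instead):

```python
# Rough sketch of the manual resolve workaround (not official tooling).
# The instance name, comment URL, and JWT are placeholders.
import requests

LOCAL = "https://my-instance.example"                # your own instance
COMMENT_URL = "https://lemmy.world/comment/1234567"  # the missing comment
MY_JWT = "..."                                       # JWT of a logged-in local user

resp = requests.get(
    f"{LOCAL}/api/v3/resolve_object",
    params={"q": COMMENT_URL, "auth": MY_JWT},
    timeout=30,
)
resp.raise_for_status()
# If the fetch worked, the response contains the comment as a local view.
print(resp.json()["comment"]["comment"]["ap_id"])
```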

So has anyone else noticed something similar happening? I know my instance hasn't gone down since I created it, so it couldn't be that.

18 comments
  • This arises from the good ol' issue of everybody just migrating to the same three or four big servers, which end up overloaded with their own users and can't send updates to other instances.

    I remember the same happening to Mastodon during the first few exoduses, until a combination of people not staying, stronger servers, and software improvements settled the issue.

    I can barely get updates from lemmy.ml, and lemmy.world isn't much better.

    Beehaw seems to perform okay.

    • About half of the communities on lemmy.ml I subscribed to are on "Subscribe Pending" and have been since I started this server.

  • Does your server have enough power and workers to handle all the federated messages? Or is it constantly at 100% CPU?

    • The machine is a dedicated server with 6 cores / 12 threads, all of which are usually under 10% utilization. Load averages are currently 0.35, 0.5, 0.6. Maybe I need to add more workers? There should be plenty of raw power to handle it.

      • Yeah, that sounds like enough to handle the load. How many workers do you use? And do you see any errors in your logs about handling messages? You could also try searching for that particular thread to see if all replies are handled correctly.
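
        For that last part, something along these lines is what I mean (a rough sketch; the hosts, post URL, and JWT are placeholders, and resolve_object normally needs a logged-in token):

        ```python
        # Compare the comment count the home instance reports for a thread with
        # what the local instance has federated so far. All names are placeholders.
        import requests

        REMOTE = "https://lemmy.world"
        LOCAL = "https://my-instance.example"
        POST_URL = f"{REMOTE}/post/123456"   # the thread being checked
        MY_JWT = "..."                       # JWT of a logged-in local user

        remote_count = requests.get(
            f"{REMOTE}/api/v3/post",
            params={"id": POST_URL.rsplit("/", 1)[-1]},
            timeout=30,
        ).json()["post_view"]["counts"]["comments"]

        local_count = requests.get(
            f"{LOCAL}/api/v3/resolve_object",
            params={"q": POST_URL, "auth": MY_JWT},
            timeout=30,
        ).json()["post"]["counts"]["comments"]

        print(f"home instance: {remote_count} comments, local copy: {local_count}")
        ```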

  • I seriously thought I was alone with this issue, but it seems it's fairly common for people hosting their own instance. Same as you guys, it won't sync everything; some communities are even "stuck" with posts from a day back, even though many new ones have been posted since.

    Kind of an off-topic question, but I guess it's related? Is there anyone else who can't pull a certain community from an instance? I can't seem to pull !asklemmy@lemmy.world or anything from that community, including posts and comments. No matter how many times I try, it won't populate on my instance.

    EDIT: Caught this in my logs:

    lemmy | 2023-06-20T08:48:21.353798Z ERROR HTTP request{http.method=GET http.scheme="https" http.host=versalife.duckdns.org http.target=/api/v3/ws otel.kind="server" request_id=cf48b226-cba2-434a-8011-12388c351a7c http.status_code=101 otel.status_code="OK"}: lemmy_server::api_routes_websocket: couldnt_find_object: Failed to resolve actor for asklemmy@lemmy.world

    EDIT2: Apparently it's a known issue with !asklemmy@lemmy.world, and a bug to be fixed in a future release.

  • I haven’t noticed it happening, but I haven’t checked much either.

    What I have noticed is that some of the overloaded, larger instances can be slow to post comments on, to subscribe to, to post threads on, etc., especially from a separate federated instance.

    Lemmy.world is easily one I have noticed, along with lemmy.ml and occasionally Beehaw (but much less so).

    My guess is that in general those instances may be slow to sync/update data or respond.

  • I've noticed something similar on my instance in some cases as well. Nothing obvious logged as errors either. It just seems like the comment was never sent. In my case CPU usage is minimal, so it doesn't seem like a resource issue on the receiving side.

    I suspect it may be a resource issue on the sending side; potentially it's not able to keep up with the number of subscribers. I know there was some discussion from the devs about the number of federation workers needing to be increased to keep up, so that's another possibility.

    It's definitely problematic though. I was contemplating implementing some kind of resync of an entire post and all its comments via the Lemmy API to get things back in sync. But if it is a sending-server resource issue, I'm also hesitant to add a bunch more API calls to the mix. I think some kind of resync functionality will be necessary in the end.
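
    The rough shape of what I had in mind, assuming that listing the comments from the sending instance and resolving each one locally is enough to backfill a thread (hosts, the post id, and the JWT are placeholders, and I haven't tested this at scale):

    ```python
    # Sketch of a per-thread resync: list every comment the home instance has
    # for a post, then ask the local instance to resolve any it is missing.
    # Hosts, the post id, and the JWT are placeholders; the sleep is there so
    # this doesn't hammer either side's rate limits.
    import time
    import requests

    REMOTE = "https://lemmy.world"
    LOCAL = "https://my-instance.example"
    REMOTE_POST_ID = 123456   # the post's id on its home instance
    MY_JWT = "..."            # JWT of a logged-in local user

    page = 1
    while True:
        batch = requests.get(
            f"{REMOTE}/api/v3/comment/list",
            params={"post_id": REMOTE_POST_ID, "limit": 50, "page": page},
            timeout=30,
        ).json()["comments"]
        if not batch:
            break
        for view in batch:
            # Resolving the comment's ActivityPub id makes the local instance
            # fetch it if it doesn't already have a copy.
            requests.get(
                f"{LOCAL}/api/v3/resolve_object",
                params={"q": view["comment"]["ap_id"], "auth": MY_JWT},
                timeout=30,
            )
            time.sleep(1)
        page += 1
    ```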

  • I've noticed the same situation in some threads on my own instance too. But I'm under the impression that it might just be backlogged on the responsible instance that's supposed to send out the federated content. I've noticed this when just having my home feed set to New and then suddenly seeing like thirty posts from lemmy.world come across all at once with widely varied timestamps.

    I suppose the best way to test if this is the case would be to note down any threads that are missing substantial amounts of comments on your local server and then check back with that thread periodically to see if and when they start to fill in.
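
    Something like this would do for that check, assuming you're fine with just polling your own instance's API (the host and post id are placeholders):

    ```python
    # Poll the local copy of a thread and log its comment count over time,
    # to see whether the backlog eventually fills in. Placeholders throughout.
    import time
    import requests

    LOCAL = "https://my-instance.example"
    LOCAL_POST_ID = 42   # the post's id on your own instance

    while True:
        count = requests.get(
            f"{LOCAL}/api/v3/post",
            params={"id": LOCAL_POST_ID},
            timeout=30,
        ).json()["post_view"]["counts"]["comments"]
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {count} comments locally")
        time.sleep(1800)  # check again every 30 minutes
    ```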
