I hate Facebook with a furious seething passion and have no vested interest in defending any of their shit, but this truly sounds like it made up a random number, and it's a total coincidence that the number happened to belong to another WhatsApp user.
I mean the latter statement is not true at all, and I'm not sure why you think this. A basic GPT model reads a sequence of tokens and predicts the next one. Any sequence of tokens is possible, and each digit 0-9 is likely its own token, as is the case in the GPT-2 tokenizer (see the quick check below).
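A quick way to see this, assuming you have the tiktoken package installed: encode a phone-number-like string with the GPT-2 encoding and print the pieces each token ID decodes to. The string itself is made up for illustration.

```python
# Inspect how GPT-2's BPE splits a digit string into tokens.
# Requires: pip install tiktoken
import tiktoken

enc = tiktoken.get_encoding("gpt2")

text = "My number is 4155550123"  # invented, phone-number-like string
token_ids = enc.encode(text)

# Print each token id alongside the text piece it represents, to see
# whether the digits come out as single characters or merged chunks.
for tid in token_ids:
    print(tid, repr(enc.decode([tid])))
```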
An LLM can't generate random numbers in the sense of a proper PRNG simulating draws from a uniform distribution; the output will probably have some kind of statistical bias. But it doesn't have to produce sequences contained in the training data.
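A minimal sketch of the point, not any real model: sample ten digits one at a time from a biased categorical distribution, the way an LLM samples one next token at a time. The probabilities here are invented stand-ins for model bias; the point is that every 10-digit string has nonzero probability, whether or not it ever appeared in training.

```python
import numpy as np

rng = np.random.default_rng()

digits = np.arange(10)
# Invented, non-uniform "next-digit" probabilities standing in for model bias.
probs = np.array([0.14, 0.13, 0.12, 0.11, 0.10, 0.09, 0.09, 0.08, 0.07, 0.07])

# Draw one digit at a time, like autoregressive sampling of tokens.
sample = "".join(str(d) for d in rng.choice(digits, size=10, p=probs))
print(sample)  # biased, but any 10-digit sequence is reachable
```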