Most Intel GPUs are great at transcoding. They're reliable, widely supported, and deliver a lot of transcoding throughput for very little electrical power.
I think the main thing I would check is what formats are supported. If the other GPU supports newer formats like AV1, it may be worth it (if you want to store your videos in these more efficient formats, or you have clients that can decode them and will appreciate the reduced bandwidth).
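If you're on Linux, a quick way to see what your current GPU actually supports (a sketch, assuming a VA-API setup; `vainfo` comes from the `libva-utils` package on Debian/Ubuntu, and package names vary by distro):

```shell
# List the codec profiles the GPU's VA-API driver exposes,
# filtered down to the formats people usually care about.
vainfo 2>/dev/null | grep -iE 'av1|hevc|h264' || echo "vainfo not available"

# Also check which hardware encoders your ffmpeg build was compiled with:
ffmpeg -hide_banner -encoders 2>/dev/null | grep -iE 'qsv|vaapi' || echo "ffmpeg not available"
```

If AV1 doesn't show up in either list, that's a point in favor of a newer card.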
But overall I would say that if you aren't having any problems, there's no need to bother. The onboard graphics are simple and efficient.
If the iGPU is getting the job done, I would leave that alone. You could add a GPU and pass it through to a gaming VM. But that is an entirely different project.
I would avoid it if you care at all about availability and downtime. The result will probably not be great: you need to make sure the server side still gets enough resources under load, and setting it up may mean constant restarts while you debug things that aren't immediately working as expected.
An old Nvidia card might also draw too much power to be worth it.
I was also using the integrated Intel for video re-encodes, and I got an Arc A310 for 80 bucks, which is about the cheapest you'll get a new card with AV1 support.
I ran a GTX 1650 Super for a while. At idle it added about 10W, and it would draw 30-40W while transcoding. I ended up taking it out because the increased power draw wasn't worth the slight performance increase for me.
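If you want to check those numbers on your own Nvidia card, a quick sketch (assuming the proprietary driver is installed, which is what ships `nvidia-smi`):

```shell
# Ask the driver for the card's current reported power draw in watts.
# Run once at idle and again mid-transcode to see the difference.
nvidia-smi --query-gpu=power.draw --format=csv,noheader 2>/dev/null \
  || echo "nvidia-smi not available"
```

A wall-plug meter will give you the real system-level number, since the driver only reports the card itself.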
I only have a GPU because my CPU doesn't have integrated graphics. I don't actually use it for anything, but the system needs it to boot. So I put in my crappiest spare GPU (a GTX 750 Ti) and call it good.
I wouldn't bother. If you end up needing one, it'll take maybe 15 minutes to install the card and get the drivers set up. No point dealing with it until you actually need it.