This post is part "hear me out" and part asking for advice.
Looking at the table above, AI GPUs look like a pure scam, and it would make much more sense (at least based on this) to use gaming GPUs instead, either through a Frankenstein build of PCIe switches or a high-bandwidth network.
So my question is whether anybody has built a similar setup and what their experience has been, what the expected overhead/performance hit is, and whether it can be made up for by simply having way more raw performance for the same price.
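To put some rough numbers on the overhead part, here is a back-of-envelope sketch (plain Python) of how much tensor-parallel all-reduce traffic a single generated token costs on different interconnects. The model shape, TP degree, and bandwidth figures are all illustrative assumptions, and it counts bandwidth only (it ignores link latency, which tends to dominate for the tiny messages of decode), so treat the results as a lower bound, not a benchmark.

```python
# Napkin math: tensor-parallel all-reduce cost per generated token.
# All model and bandwidth numbers below are illustrative assumptions.

def allreduce_seconds_per_token(hidden_size: int,
                                num_layers: int,
                                tp_degree: int,
                                bandwidth_GBs: float,
                                bytes_per_elem: int = 2) -> float:
    """Rough seconds of all-reduce traffic per token for one GPU.

    Assumes 2 all-reduces per transformer layer (after attention and MLP)
    and a ring all-reduce where each GPU moves ~2*(n-1)/n of the message.
    Latency is ignored, so real overhead at small message sizes is higher.
    """
    msg_bytes = hidden_size * bytes_per_elem            # activation for 1 token
    per_gpu_bytes = msg_bytes * 2 * (tp_degree - 1) / tp_degree
    total_bytes = per_gpu_bytes * 2 * num_layers        # 2 all-reduces per layer
    return total_bytes / (bandwidth_GBs * 1e9)

# Illustrative 70B-class shape (hidden 8192, 80 layers) split across 4 GPUs.
# Bandwidth figures are rough per-GPU numbers, not measured values.
links = {
    "PCIe 4.0 x16 (~32 GB/s)": 32,
    "100 GbE (~12.5 GB/s)": 12.5,
    "NVLink-class (~300 GB/s)": 300,
}
for name, bw in links.items():
    t = allreduce_seconds_per_token(8192, 80, 4, bw)
    print(f"{name}: ~{t * 1e3:.3f} ms of comms per token (bandwidth only)")
```

On paper the bandwidth cost alone looks small even over PCIe or Ethernet; in practice the per-hop latency of a switch fabric or NIC is what eats into tokens/s, which is exactly the part I'd love real-world numbers on.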
LLMs are experimental, alpha-level technology. Nvidia showed investors how fast their cards could run these models. Now investors can just tell the LLM what they want, and it will spit out something that probably looks similar to what they want. But Nvidia is going to sell as many cards as possible before the bubble bursts.
Correct. Pattern recognition + prompts steered toward a positive result, even if the answer isn't entirely true. If it's close enough to the desired pattern, it gets pushed.