It's not entirely without logic. Base 12 is actually better than base 10 for a start: 12 splits cleanly into halves, thirds, quarters, and sixths, while 10 only splits into halves and fifths, so a lot more everyday fractions have clean representations. 12 inches in a foot is fine.
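If you want to sanity check that, here's a quick Python sketch (the function name is just mine, for illustration). It uses the standard fact that 1/n has a finite representation in base b exactly when every prime factor of n also divides b:

```python
def terminates(n, b):
    """True if 1/n has a finite representation in base b."""
    # Strip out every factor n shares with b; if nothing is
    # left over, all of n's prime factors divide b.
    for p in range(2, b + 1):
        if b % p == 0:
            while n % p == 0:
                n //= p
    return n == 1

# Which denominators from 2 to 12 give "clean" fractions?
print([n for n in range(2, 13) if terminates(n, 10)])
# [2, 4, 5, 8, 10]
print([n for n in range(2, 13) if terminates(n, 12)])
# [2, 3, 4, 6, 8, 9, 12]
```

Base 12 picks up thirds, sixths, ninths, and twelfths, which come up constantly when dividing things among people; base 10 trades those for fifths and tenths.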
The next thing is that people seem to think we have all of these strange units with strange conversions, when in reality we have 3 units for short distances and a separate unit for long distances: 12 inches in a foot, 3 feet in a yard, and then nobody cares how long a mile is in terms of feet or yards. Once you realize that the mile is not really part of the same measuring system as inches, feet, and yards, the weird conversion makes sense. We exclusively use miles for long distances above 0.1 miles, and yards below 500 yards. Since 0.1 miles is 176 yards (a mile being 1760 yards), that leaves a 324-yard overlap where either unit works.
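The overlap arithmetic in a few lines of Python, if you want to see it spelled out (the cutoffs are the rough, informal ones from this comment, not any official standard):

```python
YARDS_PER_MILE = 1760  # exact: 1 mile = 1760 yards

mile_cutoff = 0.1 * YARDS_PER_MILE  # start using miles here: 176 yards
yard_cutoff = 500                   # stop using yards around here

print(yard_cutoff - mile_cutoff)  # 324.0 yards where either unit feels natural
```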
And then for the logic, it is entirely based on actual human scale shit. A foot is called a foot because it is roughly the size of your foot. A yard is approximately how long one stride is, so saying something is 100 yards away means it is approximately 100 steps away. Obviously there will be some variance in how accurate that is for any given person (and children will have to base it off an adult), but because it is based on human things it is more useful for measuring human scale things. It was designed to avoid decimals and large numbers because humans don't comprehend those very well.