Maybe this is old news to you guys. If so, excuse my novice ignorance. I was wondering how much latency is acceptable in relatable terms, rather than depending on the number of milliseconds some guy in a lab coat says is "imperceptible by most people." I figured the speed of sound might be a good starting point and did a little basic math, and found out something I never realized before: since sound travels at about 1130 feet per second, it takes roughly 0.885 milliseconds to travel a single foot.

You can manipulate that number as many ways as you want, but we obviously seem to perceive most sounds as happening instantly, because most things we pay attention to are fairly close by. You can be one or two hundred feet away from the source of a sound before the delay becomes noticeable. At most small venues like bars (i.e. gigs), the sound tends to be perceived as matching the event pretty much on the money. I've been at concerts where I was far enough away that I could see a slight lag, but you have to be a fair distance, say 50 feet, before it seems to be an issue. Of course, it might be more noticeable when something right at your feet has even the slightest lag: it would be weird and noticeable for a 50-foot lag to happen when your brain is expecting a 5-foot lag.

The magic "acceptable latency" number I always see is 25 milliseconds. That translates to being about 28 feet away from the source of a sound. I'm not sure I buy that, so I'm going to go outside tomorrow and throw some rocks at the neighbor kids or something and see if I notice a lag when they are only 30 feet away. Their yelps are likely to lag slightly behind the thwack! of the initial impact, so I need to ignore any sounds that come after the impact itself. I might have to try several times to get good at filtering out and ignoring the extraneous sounds I might generate.
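For anyone who wants to check the numbers themselves, here's a quick sketch of the conversion. It assumes the 1130 ft/s figure I used, which is roughly the speed of sound in air at room temperature (it varies a bit with temperature):

```python
SPEED_OF_SOUND_FT_PER_S = 1130  # approximate, at roughly room temperature

# Milliseconds for sound to travel one foot
ms_per_foot = 1000 / SPEED_OF_SOUND_FT_PER_S
print(f"{ms_per_foot:.3f} ms per foot")  # 0.885 ms per foot

def latency_to_feet(latency_ms):
    """Distance (feet) that produces a given acoustic delay (ms)."""
    return latency_ms * SPEED_OF_SOUND_FT_PER_S / 1000

# The often-quoted 25 ms "acceptable latency" threshold
print(f"{latency_to_feet(25):.2f} feet")  # 28.25 feet
```

So the 25 ms figure does work out to a listener standing about 28 feet from the source.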