I'm in the early stages of designing a client/server audio system that can stream audio arbitrarily over a network. One central server pumps out an audio stream, and any number of clients receive the audio data and play it. So far no magic is needed, and I have even gotten this scenario to work out of the box with VLC media player.
However, the tricky part seems to be synchronizing the audio playback so that all clients are in audible sync (some actual latency is acceptable as long as playback is perceived as synchronized by a human listener).
My question is whether there is any known method or algorithm for this type of synchronization problem (video is likely solved the same way). My own initial thoughts center on synchronizing clocks between the physical machines, thereby creating a virtual "main timer", and somehow aligning the audio data packets against it.
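In rough pseudocode I picture something like the sketch below; the fixed playback delay, the sample-format constants, and the assumption of an already NTP-disciplined system clock are all my own guesses, not taken from any existing product:

```python
import time

# Sketch of the "virtual main timer" idea: every client assumes its system
# clock has already been disciplined against the server's (e.g. via NTP),
# so time.time() agrees across machines to within a few milliseconds.

PLAYBACK_DELAY = 0.5      # deliberate buffering in seconds (arbitrary choice)
SAMPLE_RATE = 44100       # stream sample rate
CHANNELS = 2
BYTES_PER_SAMPLE = 2

def announce_start_time() -> float:
    """Server side: schedule sample 0 slightly in the future and tell all clients."""
    return time.time() + PLAYBACK_DELAY

def playback_byte_offset(stream_start_ts: float) -> int:
    """Client side: byte offset into the stream that should be playing right now."""
    elapsed = time.time() - stream_start_ts
    if elapsed < 0:
        return 0  # start time is still in the future; keep buffering
    return int(elapsed * SAMPLE_RATE) * CHANNELS * BYTES_PER_SAMPLE
```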
Some products that already solve the problem (though still not sufficient for my overall use case):
Any pointers are most welcome. Thanks.
PS: This related question seems to have died long ago.
Ryan Barrett wrote up his findings on his blog.
His solution involved using NTP to keep all the clocks in sync:
Seriously, though, there's only one trick to p4sync, and that is how it uses NTP. One host acts as the p4sync server. The other p4sync clients synchronize their system clocks to the server's clock, using SNTP. When the server starts playing a song, it records the time, to the millisecond. The clients then retrieve that timestamp, calculate the difference between the current time and that timestamp, and seek forward that far into the song.
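A minimal sketch of that approach (not Barrett's actual code): the system clock is assumed to already be SNTP-synchronized to the p4sync server, and fetch_song_start_ts and the player object are hypothetical stand-ins for whatever RPC mechanism and audio back end you actually use.

```python
import time

def join_playback(fetch_song_start_ts, player):
    """Start playing mid-song, in sync with a server that published its start time."""
    start_ts = fetch_song_start_ts()   # server's wall-clock start time, ms precision
    offset = time.time() - start_ts    # how far into the song playback already is
    if offset > 0:
        player.seek(offset)            # jump forward so we land in sync
    player.play()
```

This works well enough for whole-song granularity, but note it does nothing about drift after the seek; clocks that wander apart during playback would need periodic correction.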