Noticed this issue when connecting to a server that had been running the same map for a few days:
In snd_dma.c, S_StartSound():
start = cl.frame.servertime * 0.001 * dma.speed + s_beginofs;
These were the values I had:
		cl.frame.servertime	110254500
		dma.speed	48000
		s_beginofs	1578529
		cl.frame.servertime * 0.001 * dma.speed + s_beginofs  5293794529.0000000
That exceeds INT_MAX (2147483647), and the double-to-int conversion (undefined behavior in C when the value doesn't fit; on x86 it typically produces INT_MIN) gives:
		(int)(cl.frame.servertime * 0.001 * dma.speed + s_beginofs)	-2147483648
I tried making start and s_beginofs doubles a while back, but it seemed to reduce performance. What do you think the proper way to fix this is?
Edit: I think the performance issue was just a fluke. Sometimes the framerate varies even when I haven't changed anything, which is really annoying when I'm trying to test optimizations. Using doubles is probably the correct thing to do.