OK, I have some answers after running some more tests, so I am sharing them with anyone who is interested.
I placed a variable to measure the time intervals between ticks inside the play method (the method that actually sends the play message to the AVAudioPlayer object), and as my simple compare-to-external-watch experiment showed, 60 BPM was too slow. I got these time intervals (in seconds):
1.004915
1.009982
1.010014
1.010013
1.010028
1.010105
1.010095
1.010105
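For context, here is a minimal sketch of how such a measurement might look, assuming CACurrentMediaTime() from QuartzCore is used as the clock (the _lastTickTime and _player ivars and the NSLog are illustrative names, not my exact code):

#import <QuartzCore/QuartzCore.h> // for CACurrentMediaTime()

- (void)play
{
    CFTimeInterval now = CACurrentMediaTime();
    if (_lastTickTime > 0)
        NSLog(@"interval: %f", now - _lastTickTime); // the values listed above
    _lastTickTime = now; // time stamp of the previous tick
    [_player play];      // _player is the AVAudioPlayer object
}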
My conclusion was that some overhead time elapses after each 1-second interval is counted, and that extra time (about 10 msec per tick) accumulates to a noticeable amount after a few tens of seconds: at 10 msec per tick, the metronome falls a full second behind after only 100 ticks, which is quite bad for a metronome. So instead of measuring the interval between calls, I decided to measure the total interval from the first call, so that the error would not accumulate. In other words, I replaced this condition:
while (continuePlaying && ((currentTime0 + [duration doubleValue]) >= currentTime1))
with this condition:
while (continuePlaying && ((_currentTime0 + _cnt * [duration doubleValue]) >= currentTime1))
where _currentTime0 and _cnt are now class members (sorry if that is C++ jargon; I am quite new to Obj-C): the former holds the time stamp of the first call to the method, and the latter is an int counting the number of ticks (i.e., function calls). This resulted in the following measured time intervals:
1.003942
0.999754
0.999959
1.000213
0.999974
0.999451
1.000581
0.999470
1.000370
0.999723
1.000244
1.000222
0.999869
and it is evident, even without calculating the average, that these values fluctuate around 1.0 second (and the average is within about a millisecond of 1.0).
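To make the whole scheme concrete, here is a minimal sketch of the corrected loop under the same assumptions as above (everything except _currentTime0, _cnt, continuePlaying and duration is an illustrative name, not my exact code):

- (void)runMetronome
{
    _currentTime0 = CACurrentMediaTime(); // time stamp of the first call
    _cnt = 0;
    while (continuePlaying) {
        _cnt++;
        // Wait until the _cnt-th tick is due, measured from the fixed
        // origin _currentTime0, so per-tick overhead cannot accumulate.
        while (continuePlaying &&
               ((_currentTime0 + _cnt * [duration doubleValue]) >= CACurrentMediaTime())) {
            // spin (a short sleep here would be kinder to the CPU)
        }
        [self play]; // sends play to the AVAudioPlayer object
    }
}

The key point is that each tick's target time is computed from the single fixed origin _currentTime0, so the ~10 msec overhead delays only that individual tick and no longer shifts the long-run tempo.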
I will be happy to hear more insights regarding what causes the extra time to elapse; 10 msec sounds like an eternity for a modern CPU, though I am not familiar with the specs of the iPod's hardware (it's an iPod 4G, and the PowerVR SGX535 @ 200 MHz that Wikipedia lists is its GPU rather than its CPU).