AVAudioPlayerNode

Version of 14 January 2023, 07:10

Links

Swift (Programmiersprache)#Swift - Audio
Swift (Programmiersprache)


Scheduled Audio for timing-accurate playback

You can use AVAudioEngine and AVAudioPlayerNode to schedule buffers for timing-accurate playback. The AVAudioPlayerNode class has a method called scheduleBuffer(_:at:options:completionHandler:) that lets you schedule a buffer to start playing at a specific time in the future. This can be used to achieve timing-accurate playback.

Here is an example of how you can use AVAudioEngine and AVAudioPlayerNode to schedule buffers for timing accurate playback:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: engine.mainMixerNode.outputFormat(forBus: 0))

// Prepare a buffer with your audio data
let buffer = AVAudioPCMBuffer(pcmFormat: player.outputFormat(forBus: 0), frameCapacity: 1024)!
buffer.frameLength = buffer.frameCapacity
// ... fill buffer with your audio data

try engine.start()
player.play()

// Schedule the buffer to start playing at player sample time 44100
let sampleTime = AVAudioFramePosition(44100)
let when = AVAudioTime(sampleTime: sampleTime, atRate: player.outputFormat(forBus: 0).sampleRate)
player.scheduleBuffer(buffer, at: when, options: .interrupts, completionHandler: nil)

In this example, it first creates an instance of AVAudioEngine and AVAudioPlayerNode, attaches the player to the engine, connects the player to the engine's main mixer node, and then creates a buffer with the audio data. Finally, it schedules the buffer for playback at a specific sample time (here 44100 samples, i.e. one second at a 44.1 kHz sample rate) by calling the scheduleBuffer(_:at:options:completionHandler:) method of the player.

The scheduleBuffer(_:at:options:completionHandler:) method allows you to specify the time at which the buffer should start playing. If you pass nil for the at parameter, the buffer starts playing as soon as possible.
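The at: parameter takes an AVAudioTime, which can be built either from a sample time or from a wall-clock host time. A minimal sketch (the 44100 Hz rate and the half-second delay are illustrative assumptions, not values from the examples above):

```swift
import AVFoundation

// Sample time on the player's timeline (assumed rate: 44100 Hz)
let atSample = AVAudioTime(sampleTime: 22050, atRate: 44100)

// Wall-clock host time, e.g. half a second from now
let delayTicks = AVAudioTime.hostTime(forSeconds: 0.5)
let atHost = AVAudioTime(hostTime: mach_absolute_time() + delayTicks)
```

Either value can be passed as the at: argument of scheduleBuffer(_:at:options:completionHandler:); sample times are the natural choice for sample-accurate sequencing, host times for syncing with events outside the engine.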

It's worth noting that there is always some hardware output latency between the scheduled render time and the moment the sound is actually heard; on iOS you can read it from AVAudioSession.sharedInstance().outputLatency and compensate for it when scheduling.


Play in intervals

How can I use this to play a sound in accurate intervals, for example 120 times a minute?

You can use the AVAudioPlayerNode's scheduleBuffer(_:at:options:completionHandler:) method to schedule multiple buffers for playback at accurate intervals, giving you metronome-like functionality.

Here's an example of how you can use this to play a sound 120 times per minute:

import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)
engine.connect(player, to: engine.mainMixerNode, format: engine.mainMixerNode.outputFormat(forBus: 0))

// Prepare a buffer with your audio data (a short click sound)
let buffer = AVAudioPCMBuffer(pcmFormat: player.outputFormat(forBus: 0), frameCapacity: 1024)!
buffer.frameLength = buffer.frameCapacity
// ... fill buffer with your audio data

try engine.start()
player.play()

// 120 clicks per minute at 44100 samples per second -> 22050 samples per click
let sampleRate = 44100
let samplesPerMinute = sampleRate * 60
let samplesPerClick = AVAudioFramePosition(samplesPerMinute / 120)

// Convert the player's current node time to player time and start one click later
let nodeTime = player.lastRenderTime!
var nextClickSampleTime = player.playerTime(forNodeTime: nodeTime)!.sampleTime + samplesPerClick

for _ in 0..<120 {
    let when = AVAudioTime(sampleTime: nextClickSampleTime, atRate: Double(sampleRate))
    player.scheduleBuffer(buffer, at: when, options: [], completionHandler: nil)
    nextClickSampleTime += samplesPerClick
}

In this example, it first creates an instance of AVAudioEngine and AVAudioPlayerNode, attaches the player to the engine, and connects the player to the engine's main mixer node. It then creates a buffer with the audio data that should be played. After that, it calculates the number of samples per minute and the number of samples per click, using the sample rate of the audio format (44100 samples per second) and the desired number of clicks per minute (120), which works out to 22050 samples, or half a second, per click. It sets nextClickSampleTime to the current player sample time plus the number of samples per click, and then schedules 120 buffers for playback, advancing nextClickSampleTime by samplesPerClick for each call to scheduleBuffer(_:at:options:completionHandler:).

The scheduleBuffer(_:at:options:completionHandler:) method allows you to specify the sample time at which each buffer should start playing, so by scheduling each buffer at nextClickSampleTime and advancing that value by samplesPerClick, you get playback at accurate, sample-exact intervals.
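An alternative to pre-scheduling a fixed number of buffers is to let each buffer's completionHandler schedule the next one, so the metronome runs indefinitely. The sketch below assumes a player attached to a running engine and a clickBuffer holding the click sound; the helper name and parameters are hypothetical:

```swift
import AVFoundation

// Hypothetical helper: each completed click schedules the following one.
func scheduleClick(on player: AVAudioPlayerNode,
                   buffer clickBuffer: AVAudioPCMBuffer,
                   at sampleTime: AVAudioFramePosition,
                   every interval: AVAudioFramePosition,
                   rate: Double) {
    let when = AVAudioTime(sampleTime: sampleTime, atRate: rate)
    player.scheduleBuffer(clickBuffer, at: when, options: []) {
        // When this click finishes, queue the next one on the sample clock.
        scheduleClick(on: player, buffer: clickBuffer,
                      at: sampleTime + interval, every: interval, rate: rate)
    }
}
```

Because each click is timed against the player's sample clock rather than a timer, small delays in running the completion handler do not accumulate as drift.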

How do I fill the buffer with audio data?

To fill the buffer with audio data, you can use the AVAudioFile class to open an audio file on disk, and then use AVAudioFile's read(into:) method to copy the audio data from the file into an AVAudioPCMBuffer.

Here's an example of how you can fill an AVAudioPCMBuffer with audio data from an audio file:

import AVFoundation

let fileURL = Bundle.main.url(forResource: "audioFile", withExtension: "caf")!
let file = try AVAudioFile(forReading: fileURL)

let buffer = AVAudioPCMBuffer(pcmFormat: file.processingFormat, frameCapacity: AVAudioFrameCount(file.length))!
try file.read(into: buffer)

In this example, it first constructs a file URL for the audio file, then creates an instance of AVAudioFile from that URL, then creates an AVAudioPCMBuffer with the same processing format as the file and enough frame capacity to hold the entire file, and finally reads the file's data into the buffer using the file's read(into:) method, which also sets the buffer's frameLength.
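Once the read succeeds, the buffer's frameLength and the file's sample rate give the clip duration. The sketch below uses hypothetical values for a one-second clip at 44.1 kHz (AVAudioFrameCount is a UInt32, used raw here to keep the sketch self-contained):

```swift
// Hypothetical values: in real code these come from buffer.frameLength
// and file.processingFormat.sampleRate after read(into:) returns.
let frameLength: UInt32 = 44100
let sampleRate = 44100.0

// Duration in seconds = number of frames / frames per second
let seconds = Double(frameLength) / sampleRate   // 1.0 second
```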

You can also use the AVAudioPCMBuffer's floatChannelData property to access the buffer's samples directly, as one UnsafeMutablePointer<Float> per channel. This can be useful if you want to generate audio data programmatically, for example, to create a sine wave.

import AVFoundation

// A mono float format; `format` must exist before the buffer can be created
let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)!
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 44100)!
buffer.frameLength = buffer.frameCapacity
let data = buffer.floatChannelData![0]

// fill the buffer with one second of a 440 Hz sine wave
for i in 0..<Int(buffer.frameLength) {
    data[i] = sin(Float(i) / 44100 * 2 * Float.pi * 440)
}
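A generated buffer like this can be handed straight to the player. One simple way to audition it is the .loops option of AVAudioPlayerNodeBufferOptions, which repeats the buffer until another buffer is scheduled. A self-contained sketch, combining the engine setup from the first example with the sine-wave buffer:

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

let format = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)!
engine.connect(player, to: engine.mainMixerNode, format: format)

// Generate one second of a 440 Hz sine wave, as above
let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 44100)!
buffer.frameLength = buffer.frameCapacity
let data = buffer.floatChannelData![0]
for i in 0..<Int(buffer.frameLength) {
    data[i] = sin(Float(i) / 44100 * 2 * Float.pi * 440)
}

// .loops repeats the buffer until something else is scheduled
player.scheduleBuffer(buffer, at: nil, options: .loops, completionHandler: nil)
try engine.start()
player.play()
```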