
[iOS] Audio input and output in Swift 4

Posted at 2019-02-15

Introduction

In this article, I'll use Swift 4 to capture audio from the microphone, write it to a file, and then play that file back through the speaker.
Swift updates so quickly that Swift 4 may soon be obsolete, but I'm writing this down partly as a reminder to myself :grinning:

What I used

  • Xcode 10.1
  • Swift 4.2
  • AVFoundation

Implementing the shared setup

Get the shared AVAudioSession instance
let session = AVAudioSession.sharedInstance()
Declare that the session will both play and record, then activate it
try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)
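One thing the snippet above glosses over: on a real device, recording also requires the user's microphone permission (plus an NSMicrophoneUsageDescription entry in Info.plist), and with .playAndRecord the output is routed to the quiet receiver speaker by default. A sketch of a more defensive setup — the permission request and the .defaultToSpeaker option are my additions, not part of the original article:

```swift
import AVFoundation

func setUpSession() {
    let session = AVAudioSession.sharedInstance()
    // Ask for microphone access first; without it, recording silently produces nothing on device
    session.requestRecordPermission { granted in
        guard granted else { return }
        do {
            // .defaultToSpeaker routes playback to the loudspeaker instead of the receiver
            try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
            try session.setActive(true)
        } catch {
            print("Failed to configure audio session: \(error)")
        }
    }
}
```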

Implementing audio input

First, declare the AVAudioRecorder used for recording
var audioRecorder: AVAudioRecorder!
Configure the recording format
let settings = [
    AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
]
Create the AVAudioRecorder instance
audioRecorder = try AVAudioRecorder(url: getAudioFileUrl(), settings: settings)
audioRecorder.delegate = self as? AVAudioRecorderDelegate // nil unless the class conforms to AVAudioRecorderDelegate
func getAudioFileUrl() -> URL {
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let docsDirect = paths[0]
    let audioUrl = docsDirect.appendingPathComponent("recording.m4a")
    
    return audioUrl
}
Start recording when the button is tapped (tap again while recording to stop)
@IBAction func record(_ sender: Any) {
    if !audioRecorder.isRecording {
        audioRecorder.record()
    } else {
        audioRecorder.stop()
    }
}
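Since a delegate is assigned above, you can also hook the moment recording finishes. A minimal sketch, assuming your view controller is named ViewController as in the full listing below:

```swift
extension ViewController: AVAudioRecorderDelegate {
    // Called when the recorder stops and the file has been written
    func audioRecorderDidFinishRecording(_ recorder: AVAudioRecorder, successfully flag: Bool) {
        print(flag ? "Recording saved to \(recorder.url)" : "Recording failed")
    }
}
```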

Implementing audio output

First, declare the AVAudioEngine, AVAudioFile, and AVAudioPlayerNode used for playback
var audioEngine: AVAudioEngine!
var audioFile: AVAudioFile!
var audioPlayerNode: AVAudioPlayerNode!
Create the AVAudioEngine, AVAudioFile, and AVAudioPlayerNode instances
audioEngine = AVAudioEngine()
audioFile = try AVAudioFile(forReading: getAudioFileUrl())
audioPlayerNode = AVAudioPlayerNode()
// Same func as above
func getAudioFileUrl() -> URL {
    let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
    let docsDirect = paths[0]
    let audioUrl = docsDirect.appendingPathComponent("recording.m4a")
    
    return audioUrl
}
Connect the nodes
audioEngine.attach(audioPlayerNode)
audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
Start playback when the button is tapped (tap again while playing to restart from the beginning)
@IBAction func play(_ sender: Any) {
    do {
        // (omitted: instance creation and node connection go here)
        audioPlayerNode.stop()
        audioPlayerNode.scheduleFile(audioFile, at: nil)
            
        try audioEngine.start()
        audioPlayerNode.play()
    } catch let error {
        print(error)
    }
}
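Note that scheduleFile(_:at:) returns immediately, so the engine keeps running after the file ends. If you want to tear things down when playback finishes, scheduleFile also accepts a completion handler. A sketch (my addition; keep in mind the handler is called on a background thread):

```swift
audioPlayerNode.scheduleFile(audioFile, at: nil) { [weak self] in
    // Fires when the scheduled file has played through; hop to the main thread before touching UI or the engine
    DispatchQueue.main.async {
        self?.audioPlayerNode.stop()
        self?.audioEngine.stop()
    }
}
```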

Putting it all together

import UIKit
import AVFoundation

class ViewController: UIViewController {

    var audioRecorder: AVAudioRecorder!
    var audioEngine: AVAudioEngine!
    var audioFile: AVAudioFile!
    var audioPlayerNode: AVAudioPlayerNode!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.

        let session = AVAudioSession.sharedInstance()

        do {
            try session.setCategory(.playAndRecord, mode: .default)
            try session.setActive(true)

            let settings = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 44100,
                AVNumberOfChannelsKey: 2,
                AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
            ]

            audioRecorder = try AVAudioRecorder(url: getAudioFileUrl(), settings: settings)
            audioRecorder.delegate = self as? AVAudioRecorderDelegate // nil unless ViewController conforms to AVAudioRecorderDelegate
        } catch let error {
            print(error)
        }

    }

    @IBAction func record(_ sender: Any) {
        if !audioRecorder.isRecording {
            audioRecorder.record()
        } else {
            audioRecorder.stop()
        }
    }

    
    @IBAction func play(_ sender: Any) {
        audioEngine = AVAudioEngine()
        do {
            audioFile = try AVAudioFile(forReading: getAudioFileUrl())
            
            audioPlayerNode = AVAudioPlayerNode()
            audioEngine.attach(audioPlayerNode)
            audioEngine.connect(audioPlayerNode, to: audioEngine.outputNode, format: audioFile.processingFormat)
            
            audioPlayerNode.stop()
            audioPlayerNode.scheduleFile(audioFile, at: nil)
            
            try audioEngine.start()
            audioPlayerNode.play()
        } catch let error {
            print(error)
        }
    }
    
    func getAudioFileUrl() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        let docsDirect = paths[0]
        let audioUrl = docsDirect.appendingPathComponent("recording.m4a")

        return audioUrl
    }

}

A class like this makes the implementation easier

Code in this grey area (not quite utility code, but likely to be called from many screens) is better gathered in one place like this, so the code doesn't get redundant :ok_woman:
This is just one example, though…

AudioManager.swift
import UIKit
import AVFoundation

final class AudioManager {
    
    var audioRecorder: AVAudioRecorder!
    var audioEngine: AVAudioEngine!
    var audioFile: AVAudioFile!
    var audioPlayerNode: AVAudioPlayerNode!
    var audioUnitTimePitch: AVAudioUnitTimePitch!
    
    init() {}
    
    func setUpAudioRecorder() {
        let session = AVAudioSession.sharedInstance()
        
        do {
            try session.setCategory(.playAndRecord, mode: .default)
            try session.setActive(true)
            
            let settings = [
                AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
                AVSampleRateKey: 44100,
                AVNumberOfChannelsKey: 2,
                AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
            ]
            
            audioRecorder = try AVAudioRecorder(url: getAudioFileUrl(), settings: settings)
            audioRecorder.delegate = self as? AVAudioRecorderDelegate // nil unless the class conforms to AVAudioRecorderDelegate
        } catch let error {
            print(error)
        }
    }
    
    func playSound(speed: Float, pitch: Float, echo: Bool, reverb: Bool) {
        audioEngine = AVAudioEngine()
        
        let url = getAudioFileUrl()
        
        do {
            audioFile = try AVAudioFile(forReading: url)
            
            audioPlayerNode = AVAudioPlayerNode()
            audioEngine.attach(audioPlayerNode)
            
            audioUnitTimePitch = AVAudioUnitTimePitch()
            audioUnitTimePitch.rate = speed
            audioUnitTimePitch.pitch = pitch
            audioEngine.attach(audioUnitTimePitch)
            
            let echoNode = AVAudioUnitDistortion()
            echoNode.loadFactoryPreset(.multiEcho1)
            audioEngine.attach(echoNode)
            
            let reverbNode = AVAudioUnitReverb()
            reverbNode.loadFactoryPreset(.cathedral)
            reverbNode.wetDryMix = 50
            audioEngine.attach(reverbNode)
            
            if echo && reverb {
                connectAudioNodes(audioPlayerNode, audioUnitTimePitch, echoNode, reverbNode, audioEngine.outputNode)
            } else if echo {
                connectAudioNodes(audioPlayerNode, audioUnitTimePitch, echoNode, audioEngine.outputNode)
            } else if reverb {
                connectAudioNodes(audioPlayerNode, audioUnitTimePitch, reverbNode, audioEngine.outputNode)
            } else {
                connectAudioNodes(audioPlayerNode, audioUnitTimePitch, audioEngine.outputNode)
            }
            
            audioPlayerNode.stop()
            audioPlayerNode.scheduleFile(audioFile, at: nil)
            
            try audioEngine.start()
            audioPlayerNode.play()
        } catch let error {
            print(error)
        }
    }
    
    private func connectAudioNodes(_ nodes: AVAudioNode...) {
        for x in 0..<nodes.count - 1 {
            audioEngine.connect(nodes[x], to: nodes[x+1], format: audioFile.processingFormat)
        }
    }
    
    func getAudioFileUrl() -> URL {
        let paths = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)
        let docsDirect = paths[0]
        let audioUrl = docsDirect.appendingPathComponent("recording.m4a")
        
        return audioUrl
    }
}
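For reference, calling the class from a view controller might look like this. This is a hypothetical usage sketch; the class name, property name, and button actions below are my own, not from the original article:

```swift
import UIKit
import AVFoundation

class RecordingViewController: UIViewController {
    let audioManager = AudioManager()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Configure the session and recorder once, up front
        audioManager.setUpAudioRecorder()
    }

    @IBAction func record(_ sender: Any) {
        if audioManager.audioRecorder.isRecording {
            audioManager.audioRecorder.stop()
        } else {
            audioManager.audioRecorder.record()
        }
    }

    @IBAction func playWithEcho(_ sender: Any) {
        // Normal speed and pitch, echo on, reverb off
        audioManager.playSound(speed: 1.0, pitch: 0.0, echo: true, reverb: false)
    }
}
```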

Closing thoughts

Even plain audio input and output apparently involves quite a few steps, so it may be rough going until you get used to it.
Personally, I found the node-connection model interesting: it evokes plugging an instrument into an effects pedal and then into an amp :open_mouth:
