Wed 08 January 2014

Recording Live Audio Streams on iOS

Some people might consider this strange, but I'm still a fan of radio. It's not my 24/7 music source, but I find myself getting annoyed with algorithms trying to predict music I'd like and ultimately failing. Beyond music, you've also got news and sports radio - the variety is fun.

These days tons of radio stations have made their streams listenable online, and recently I found myself wishing I had something I could use to record these streams on a whim. My main device is an iPhone 5; sadly, I didn't find anything particularly pleasing or enjoyable to use in the App Store, so I set out to throw together my own. What follows is a breakdown of a (relatively) easy way to record live audio streams using AVFoundation and Core Audio. It assumes a working knowledge of Objective-C and of building iOS apps in general.

Disclaimer

This code is here as an example and nothing more. Before recording anything from a live stream that might be copyrighted and/or legally protected, you should ensure that you're allowed to do so. Neither this post nor its author holds any responsibility for what you do with this content.

Update 2015

In the comments, spg pointed out a case wherein .m3u8 files probably won't work with this method due to how their playback differs. It would be cool to see if there's a way around this, but sadly I don't have time to investigate at the moment. Get in touch if you figure this out!
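
If you want the recorder to fail fast on these, one cheap heuristic - and it is only a heuristic, since HLS playlists don't always end in .m3u8 - is to check the URL's extension before handing it to the streamer. A quick sketch (the method name here is my own, not part of the gists below):

// Hedged heuristic: HLS playlists commonly (though not always) end in
// .m3u8, and those streams won't record correctly with this approach.
- (BOOL)urlLooksLikeHLSPlaylist:(NSURL *)url
{
    return [[url pathExtension] isEqualToString:@"m3u8"];
}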

Update 2016

I've since removed comments from my site due to their neglect. Feel free to email me with questions about this, though!

Audio Streaming

The basics of app structure/building aside, the first concern is actually streaming the audio. There are myriad factors here that are honestly just a massive chore to deal with - network speed issues, the network dropping and picking back up, the list is endless. A cursory Google search shows this is where people trip themselves up a lot - the general consensus seems to be that AVPlayer can't work for remote data, as there's seemingly no way to get at the AudioBuffer data; it only works with local files. This leads everyone to reinvent the wheel with the (arguably painful) Core Audio APIs.

Here's the thing: I had no desire to deal with any of that. They're certainly interesting problems, but right now they're just in the way. If possible, I'd much rather use AVPlayer and let Apple handle all the intricacies and edge cases of playback.

Enter MTAudioProcessingTap

Now, MTAudioProcessingTap is no secret - it's what you'd use to, say, visualize audio data for local files played through AVPlayer. If you're not familiar with it, this is a pretty good writeup. The general gist: you create a tap, set up an audio mix for a track, and set it on the AVPlayerItem of the audio player. The problem with remote data is that AVPlayer simply works differently there - you don't have access to any AVAssetTracks with which to build an AVMutableAudioMix, presumably because streaming is an entirely different setup behind the scenes.

However, if we look a bit further: once the streaming starts, you can access an AVAssetTrack on the player. Then it's a simple matter of key-value observing the status of the player item, grabbing the track when it's available, and setting up our stream handler. Given that it's MTAudioProcessingTap you could do any number of things here, but personally I just needed to pass the raw audio data through.
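
As an aside, to give a feel for the "any number of things" a tap's process callback can do, here's a hedged sketch of a callback that measures an RMS level with Accelerate's vDSP_rmsqv - useful for the visualization case mentioned above. It assumes the tap hands you non-interleaved Float32 samples (which is typical; check processingFormat in prepare() if unsure), and the callback shape matches the gists at the bottom of this post.

// A sketch, not part of the library below: a process callback that computes
// a per-buffer RMS level, e.g. to drive a level meter. Requires Accelerate
// and MediaToolbox.
void levelMeterProcess(
    MTAudioProcessingTapRef tap,
    CMItemCount numberFrames,
    MTAudioProcessingTapFlags flags,
    AudioBufferList *bufferListInOut,
    CMItemCount *numberFramesOut,
    MTAudioProcessingTapFlags *flagsOut
) {
    OSStatus err = MTAudioProcessingTapGetSourceAudio(
        tap, numberFrames, bufferListInOut, flagsOut, NULL, numberFramesOut
    );
    if(err) return;

    for(UInt32 i = 0; i < bufferListInOut->mNumberBuffers; i++) {
        AudioBuffer buffer = bufferListInOut->mBuffers[i];
        float rms = 0.0f;
        vDSP_rmsqv(
            (float *)buffer.mData, 
            1, 
            &rms, 
            buffer.mDataByteSize / sizeof(float)
        );
        // Hand `rms` off to your UI or visualization layer here.
    }
}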

Unlike other articles on this site, I'm opting to just include three gists at the bottom that act as a full (mini) library, along with example usage. There's a bit of Core Audio involved, but it's nothing too annoying - hopefully this helps anyone who's been wondering how to handle this. This code isn't meant to be drop-in and good to go; with this type of project it's expected that you'll integrate it based on your own needs.

If you have questions, you can always feel free to get in touch, be it email, Twitter, or GitHub. Generally down to help!

#import "GenericClassName.h"

@implementation GenericClassName

- (void)myMethodToStartRecording
{
    // Retain it, yo
    self.streamer = [RMStreamer new];
    [self.streamer recordStreamFromURL:myURL onError:^(NSError *error) {
        NSLog(@"Ah junks some error happened: %@", error);
    }];
}

- (void)myOtherMethodThatStopsStreaming
{
    [self.streamer stopRecording];
    // Do stuff
}

@end

// RMStreamer.h
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
#import <Accelerate/Accelerate.h>
#import <MediaToolbox/MediaToolbox.h>

typedef void (^RMStreamerErrorBlock)(NSError *error);

@interface RMStreamer : NSObject

@property (nonatomic, strong, readonly) NSURL *outputURL;
@property (nonatomic, assign, readonly) ExtAudioFileRef captureFile;
@property (nonatomic, strong, readonly) AVPlayer *audioPlayer;
@property (nonatomic, copy) RMStreamerErrorBlock onError;

- (void)recordStreamFromURL:(NSURL *)url onError:(RMStreamerErrorBlock)error;
- (void)stopRecording;

// Called from the tap's prepare() callback once the stream's format is known.
- (void)createOutputFileForStreamWithFormat:(AudioStreamBasicDescription)clientFormat;

@end
#import "RMStreamer.h"

// Allows us to get access to our RMStreamer instance in the process method below~
void init(MTAudioProcessingTapRef tap, void *clientInfo, void **tapStorageOut) {
    *tapStorageOut = clientInfo;
}

void finalize(MTAudioProcessingTapRef tap)
{
    // Self explanatory - any final operations you need to handle.
    // I didn't need much in the way of anything here.
}

// We defer the creation of our output file until this is called - by doing so,
// we don't need to guess at the format it comes in as.
void prepare(
    MTAudioProcessingTapRef tap, 
    CMItemCount maxFrames, 
    const AudioStreamBasicDescription *processingFormat
) {
    NSLog(@"Preparing the Audio Tap Processor");
    RMStreamer *streamer = (__bridge RMStreamer *) MTAudioProcessingTapGetStorage(tap);
    [streamer createOutputFileForStreamWithFormat:*processingFormat];
}

void unprepare(MTAudioProcessingTapRef tap) {
    // Self explanatory - if you have things you need done here, do them.
}

void process(
    MTAudioProcessingTapRef tap,
    CMItemCount numberFrames,
    MTAudioProcessingTapFlags flags,
    AudioBufferList *bufferListInOut,
    CMItemCount *numberFramesOut,
    MTAudioProcessingTapFlags *flagsOut
) {
    RMStreamer *streamer = (__bridge RMStreamer *) MTAudioProcessingTapGetStorage(tap);

    OSStatus err = MTAudioProcessingTapGetSourceAudio(
        tap, 
        numberFrames, 
        bufferListInOut, 
        flagsOut, 
        NULL, 
        numberFramesOut
    );
    
    if(err) {
        // There was an error getting audio buffers from the stream.
        // React accordingly in your application! Bail before writing,
        // since the buffers aren't valid.
        streamer.onError([NSError 
            errorWithDomain:NSOSStatusErrorDomain 
            code:err 
            userInfo:nil
        ]);
        return;
    }
    
    OSStatus f = ExtAudioFileWrite(
        streamer.captureFile, 
        (UInt32)*numberFramesOut, 
        bufferListInOut
    );
    
    if(f) {
        // Writing to the audio file failed for some reason. Check why and react.
        streamer.onError([NSError 
            errorWithDomain:NSOSStatusErrorDomain 
            code:f 
            userInfo:nil
        ]);
    }
}

@implementation RMStreamer

- (void)recordStreamFromURL:(NSURL *)url onError:(RMStreamerErrorBlock)error
{
    _onError = error;
    
    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];
    _audioPlayer = [AVPlayer playerWithPlayerItem:item];

    // Watch the status property - when this is good to go, we can access the
    // underlying AVAssetTrack we need.
    [item addObserver:self forKeyPath:@"status" options:0 context:nil];
}

- (void)observeValueForKeyPath:(NSString *)keyPath 
                      ofObject:(id)object 
                        change:(NSDictionary *)change 
                       context:(void *)context
{
    if(![keyPath isEqualToString:@"status"])
        return;
    
    AVPlayerItem *item = (AVPlayerItem *)object;
    if(item.status == AVPlayerItemStatusFailed) {
        _onError(item.error);
        return;
    }
    
    if(item.status != AVPlayerItemStatusReadyToPlay)
        return;
        
    AVURLAsset *asset = (AVURLAsset *)item.asset;
    AVAssetTrack *audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if(!audioTrack)
        return;
    
    [self beginRecordingAudioFromTrack:audioTrack];
    [_audioPlayer play];
}

- (void)beginRecordingAudioFromTrack:(AVAssetTrack *)audioTrack
{
    // Configure an MTAudioProcessingTap to handle things.
    MTAudioProcessingTapRef tap;
    MTAudioProcessingTapCallbacks callbacks;
    callbacks.version = kMTAudioProcessingTapCallbacksVersion_0;
    callbacks.clientInfo = (__bridge void *)(self);
    callbacks.init = init;
    callbacks.prepare = prepare;
    callbacks.process = process;
    callbacks.unprepare = unprepare;
    callbacks.finalize = finalize;
    
    OSStatus err = MTAudioProcessingTapCreate(
        kCFAllocatorDefault, 
        &callbacks, 
        kMTAudioProcessingTapCreationFlag_PostEffects, 
        &tap
    );
    
    if(err) {
        NSLog(@"Unable to create the Audio Processing Tap %ld", err);
        _onError([NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }

    // Create an AudioMix and assign it to our currently playing "item", which
    // is just the stream itself.
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    AVMutableAudioMixInputParameters *inputParams = [AVMutableAudioMixInputParameters 
        audioMixInputParametersWithTrack:audioTrack];
        
    inputParams.audioTapProcessor = tap;
    audioMix.inputParameters = @[inputParams];
    _audioPlayer.currentItem.audioMix = audioMix;
}

// This you'll want to customize to your needs - it's pulled from my 
// own project and quickly revamped. Good luck!
- (void)createOutputFileForStreamWithFormat:(AudioStreamBasicDescription)clientFormat
{
    // This is an incredibly generic file path. Customize as need be.
    NSError *nserr = nil;
    NSFileManager *fm = [NSFileManager defaultManager];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(
        NSDocumentDirectory, 
        NSUserDomainMask, 
        YES
    );
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *dir = [documentsDirectory stringByAppendingPathComponent:@"rm_streamer"];
    if(![fm fileExistsAtPath:dir])
        [fm createDirectoryAtPath:dir 
            withIntermediateDirectories:YES 
            attributes:nil 
            error:&nserr];
    
    NSTimeInterval timestamp = [[NSDate date] timeIntervalSince1970];
    NSString *output = [NSString stringWithFormat:@"%@/%f.caf", dir, timestamp];
    unlink([output UTF8String]);
    
    _outputURL = [NSURL fileURLWithPath:output];
    OSStatus err = ExtAudioFileCreateWithURL(
        (__bridge CFURLRef)_outputURL, 
        kAudioFileCAFType, 
        &clientFormat, 
        NULL, 
        kAudioFileFlags_EraseFile, 
        &_captureFile
    );
    
    if(err) {
        // An error occurred with creating the file. Go figure.
        _onError([NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }
    
    // This setting is... annoying. Some devices will randomly crap out 
    // if the hardware audio support is wonky. Change as you need to.
    UInt32 codecManf = kAppleHardwareAudioCodecManufacturer;
    // UInt32 codecManf = kAppleSoftwareAudioCodecManufacturer;
    err = ExtAudioFileSetProperty(
        _captureFile, 
        kExtAudioFileProperty_CodecManufacturer, 
        sizeof(UInt32), 
        &codecManf
    );

    if(err) {
        _onError([NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }
    
    err = ExtAudioFileSetProperty(
        _captureFile, 
        kExtAudioFileProperty_ClientDataFormat, 
        sizeof(clientFormat), 
        &clientFormat
    );
    
    if(err) {
        _onError([NSError errorWithDomain:NSOSStatusErrorDomain code:err userInfo:nil]);
        return;
    }
}

// Getting an MTAudioProcessingTapRef to properly release is kind of annoying in general.
// This should more or less handle it~
- (void)stopRecording
{
    [_audioPlayer pause];

    [_audioPlayer.currentItem removeObserver:self forKeyPath:@"status"];
    AVMutableAudioMixInputParameters *params = (AVMutableAudioMixInputParameters *) _audioPlayer.currentItem.audioMix.inputParameters[0];
    MTAudioProcessingTapRef tap = params.audioTapProcessor;
    _audioPlayer.currentItem.audioMix = nil;
    _audioPlayer = nil;
    CFRelease(tap);

    // Dispose of the capture file only after the tap is torn down, so the
    // process callback can't write into a dead ExtAudioFileRef.
    ExtAudioFileDispose(_captureFile);
    _captureFile = NULL;
}

@end
