
theamazingaudioengine / theamazingaudioengine2


The Amazing Audio Engine is a sophisticated framework for iOS audio applications, built so you don't have to.

Home Page: http://theamazingaudioengine.com/doc2

License: Other

Objective-C 89.75% Swift 3.84% C 4.42% Assembly 0.99% C++ 1.01%

theamazingaudioengine2's Introduction

Important Notice: The Amazing Audio Engine has been retired. See the announcement here

The Amazing Audio Engine

The Amazing Audio Engine is a sophisticated framework for iOS audio applications, built so you don't have to.

It is designed to be very easy to work with, and handles all of the intricacies of iOS audio on your behalf.

Built upon the efficient and low-latency Core Audio Remote IO system, The Amazing Audio Engine lets you get to work on making your app great instead of reinventing the wheel.

See https://youtu.be/OZQT4IGS8mA for an introductory video.

See http://theamazingaudioengine.com for details and http://theamazingaudioengine.com/doc2 for documentation.

TAAE was written by Michael Tyson, developer of Audiobus and Loopy, in consultation with Jonatan Liljedahl, developer of AUM and AudioShare.

The Amazing Audio Engine Library License

Copyright (C) 2012-2016 A Tasty Pixel

This software is provided 'as-is', without any express or implied warranty. In no event will the authors be held liable for any damages arising from the use of this software.

Permission is granted to anyone to use this software for any purpose, including commercial applications, and to alter it and redistribute it freely, subject to the following restrictions:

  1. The origin of this software must not be misrepresented; you must not claim that you wrote the original software. If you use this software in a product, an acknowledgment in the product documentation would be appreciated but is not required.

  2. Altered source versions must be plainly marked as such, and must not be misrepresented as being the original software.

  3. This notice may not be removed or altered from any source distribution.

The Amazing Audio Engine Sample Code (TAAESample) License

Strictly for educational purposes only. No part of TAAESample is to be distributed in any form other than as source code within the TAAE2 repository.


theamazingaudioengine2's Issues

Input support for macOS

Currently, there are issues with receiving audio input on macOS, resulting in CannotDoInCurrentContext errors when rendering. I suspect this is related to sample rate, and that an AudioConverter will be required when the input unit's rate doesn't match the output unit's. This is only a theory, though.

It's not working...

:0: error: failed to import bridging header '/Users/mac/Downloads/TheAmazingAudioEngine2-master/TAAESample/TAAESample-Bridging-Header.h'

play speed

Please tell me: how do I control the playback speed rate? Thanks!

AEAudioController not included in Files?

Hi,

I added the TAAE2 files as suggested by your comment here: #9, but I can't seem to find the AEAudioController files in the project I cloned from GitHub, even though they are included in the Sample project.

I attempted to add the AEAudioController files to the TAAE2.xcodeproj file, but I was still unable to import them. I then added the AEAudioController files to my own project and imported them, but after running, I got errors all across my AmazingAudioEngine.h file saying that none of the files it is trying to import could be found.

Not really sure what to do at this point,

Thanks, Alan

Can the recorder module record only incoming mic audio while AEAudioFilePlayerModule is playing?

AEAudioFileRecorderModule records all of the audio from the output stack.
Is there a way to record only the audio coming from the built-in mic while the built-in speaker is playing (i.e. while other player modules are running)?

My render order:
...

    if ( THIS->_inputEnabled ) { // activated when the user presses the record button
        // Add audio input
        AEModuleProcess(input, context);
        if ( THIS->_playingThroughSpeaker == NO ) {
            // If a headset is plugged in, the user will be monitoring the input
            AERenderContextOutput(context, 1);
            if ( recorder ) {
                AEModuleProcess(recorder, context);
            }
        } else {
            // The user records via the device's built-in mic (player modules are running)
            if ( recorder ) {
                AEModuleProcess(recorder, context);
            }
        }
    }

    AEModuleProcess(THIS.mixer, context);
    AERenderContextOutput(context, 1);
    ...

Is there any idea of how to configure my render process? Or is it possible?

Odd behavior when accessing out-of-bounds index for AEArray

I noticed some odd behavior today when using an AEArray inside a renderer.block (which may actually be intentional). Let me know whether this is the desired behavior; if not, it should be a pretty easy fix that I can throw into a PR.

  1. Create an AEArray of NSNumbers of type AESeconds for demonstration purposes

self.durationArray = [[AEArray alloc] init];
NSMutableArray *intervalArray = [[NSMutableArray alloc] init];
for ( int i = 0; i < 3; i++ ) {
    [intervalArray addObject:@((AESeconds)i)];
}
[self.durationArray updateWithContentsOfArray:intervalArray];

  2. Now, if you use a subscript to access an index out of the array's bounds, I would expect an exception to be thrown. Instead, the thread just locks and execution stops at the line where the out-of-bounds index is accessed. For instance, if I put a breakpoint on the following line, which is out of bounds of the array, the thread does not continue past it:

AESeconds seconds = [self.durationArray[6] doubleValue];

Is this expected or should there be an exception thrown? Maybe I'm just getting odd behavior since I have only tested this inside the renderer.block which is happening on the core audio thread, so you may have some insight into what is going on. Thanks!

AEDSPGenerateOscillator - maybe document that it's just a loose approximation of a sine wave

Currently AEDSPGenerateOscillator's docs say

  • This function produces, sample by sample, a sine-like oscillator signal.

But the current formula is y = (x ** 2 - 1) ** 2 over the range -1..1, which is not a very accurate approximation of a real sine wave. People using this to debug their code might think something was wrong with their audio processing pipeline.

I'd suggest either using an accurate approximation or documenting that the function produces a loose approximation of a sine wave.

AEAudioFilePlayerModule - Pause and Resume

Thanks for making this amazing project. What's the best way to implement pause and resume in an AEAudioFilePlayerModule during playback?

One idea I had was to persist the value of player.currentTime (AESeconds) on receiving a pause action and later use this in the playAtTime: method to resume. However, playAtTime: seems to take an AudioTimeStamp, not AESeconds.

Confused on how to add the library to a project

Hey Michael,

First off thanks for all of the hard work that you put into the original TAAE, this is my first time using it but have been a fan and following the progress over the years. Was really interested in this re-write, so wanted to start testing it.

I had no problem installing the original TAAE via CocoaPods in a project a few minutes ago, but I'm having trouble installing this one. I dragged the files over from the sample, but I'm getting "file not found" errors in the TheAmazingAudioEngine.h file.

Am I understanding correctly that this is a new library, or is it still utilizing the old one? Would love to install this ASAP so I can continue testing.

Thanks.

Thread sanitizer warning data race in AEManagedValueCommitPendingUpdates

Running my app on a 64-bit simulator with the Xcode 8.1 thread sanitizer turned on, I get this error when starting playback.

WARNING: ThreadSanitizer: data race (pid=11791)
Write of size 1 at 0x000104a7aa20 by thread T38 (mutexes: write M18032, read M18717):
#0 AEManagedValueCommitPendingUpdates AEManagedValue.m:257 (PlayMusic+0x0001016b7556)
#1 __38-[AEAudioUnitOutput initWithRenderer:]_block_invoke AEAudioUnitOutput.m:89 (PlayMusic+0x0001016d7238)
#2 AEIOAudioUnitRenderCallback AEIOAudioUnit.m:530 (PlayMusic+0x0001016e7058)
#3 AUConverterBase::RenderBus(unsigned int&, AudioTimeStamp const&, unsigned int, unsigned int) :174 (AudioToolbox+0x00000012cee5)

Previous read of size 1 at 0x000104a7aa20 by thread T32:
#0 -[AEManagedValue setValue:] AEManagedValue.m:204 (PlayMusic+0x0001016b6f01)
#1 -[AEManagedValue setObjectValue:] AEManagedValue.m:183 (PlayMusic+0x0001016b67d8)
#2 -[GPMAudioPlayerTrack createAudioBuffer] GPMAudioPlayerTrack.m:347 (PlayMusic+0x000100827469)
#3 -[GPMAudioPlayerTrack startRequest] GPMAudioPlayerTrack.m:464 (PlayMusic+0x00010082933f)
#4 __35-[GPMAudioPlayerTrack startRequest]_block_invoke.129 GPMAudioPlayerTrack.m:511 (PlayMusic+0x00010082a717)
#5 __60-[StreamFile serviceReadRequestsFromOffset:length:subStore:]_block_invoke.146 StreamFile.m:297 (PlayMusic+0x0001004eae75)
#6 __tsan::invoke_and_release_block(void*) :226 (libclang_rt.tsan_iossim_dynamic.dylib+0x00000005c3fb)
#7 _dispatch_client_callout :159 (libdispatch.dylib+0x00000002c0cc)

Location is global '__atomicUpdateWaitingForCommit' at 0x000104a7aa20 (PlayMusic+0x000101e39a20)

Mutex M18032 (0x7d68000bcc50) created at:
#0 pthread_mutex_init :226 (libclang_rt.tsan_iossim_dynamic.dylib+0x000000024a93)
#1 CAMutex::CAMutex(char const*) :174 (AudioToolbox+0x00000030d066)
#2 -[AEAudioUnitOutput start:] AEAudioUnitOutput.m:149 (PlayMusic+0x0001016d8035)
#3 __49-[GPMAudioPlayer initWithDelegate:delegateQueue:]_block_invoke GPMAudioPlayer.m:134 (PlayMusic+0x000100802e3b)
#4 -[GPMAudioPlayerModule setState:playbackError:] GPMAudioPlayerModule.m:620 (PlayMusic+0x00010081182e)
#5 __66-[GPMAudioPlayerModule playWithCompletionQueue:completionHandler:]_block_invoke GPMAudioPlayerModule.m:499 (PlayMusic+0x00010080daf9)
#6 __tsan::invoke_and_release_block(void*) :226 (libclang_rt.tsan_iossim_dynamic.dylib+0x00000005c3fb)
#7 _dispatch_client_callout :159 (libdispatch.dylib+0x00000002c0cc)

Mutex M18717 (0x000104a710c8) created at:
#0 pthread_rwlock_tryrdlock :226 (libclang_rt.tsan_iossim_dynamic.dylib+0x0000000250be)
#1 AEManagedValueGetValue AEManagedValue.m:284 (PlayMusic+0x0001016b782c)
#2 AEIOAudioUnitRenderCallback AEIOAudioUnit.m:528 (PlayMusic+0x0001016e6fef)
#3 AUConverterBase::RenderBus(unsigned int&, AudioTimeStamp const&, unsigned int, unsigned int) :174 (AudioToolbox+0x00000012cee5)

Thread T38 (tid=947078, running) created by thread T30 at:
#0 pthread_create :226 (libclang_rt.tsan_iossim_dynamic.dylib+0x000000023b60)
#1 CAPThread::Start() :174 (AudioToolbox+0x000000312a5c)
#2 -[AEAudioUnitOutput start:] AEAudioUnitOutput.m:152 (PlayMusic+0x0001016d80ce)
#3 __49-[GPMAudioPlayer initWithDelegate:delegateQueue:]_block_invoke GPMAudioPlayer.m:134 (PlayMusic+0x000100802e3b)
#4 -[GPMAudioPlayerModule setState:playbackError:] GPMAudioPlayerModule.m:620 (PlayMusic+0x00010081182e)
#5 __66-[GPMAudioPlayerModule playWithCompletionQueue:completionHandler:]_block_invoke GPMAudioPlayerModule.m:499 (PlayMusic+0x00010080daf9)
#6 __tsan::invoke_and_release_block(void*) :226 (libclang_rt.tsan_iossim_dynamic.dylib+0x00000005c3fb)
#7 _dispatch_client_callout :159 (libdispatch.dylib+0x00000002c0cc)

Thread T32 (tid=947021, running) created by thread T-1
[failed to restore the stack]

SUMMARY: ThreadSanitizer: data race AEManagedValue.m:257 in AEManagedValueCommitPendingUpdates

ThreadSanitizer report breakpoint hit. Use 'thread info -s' to get extended information about the report.

Port AEExpanderFilter from TAAE -> TAAE2

Hi there!

Was wondering if the implementation for the AEExpanderFilter from TAAE is in scope for the beta or release version of TAAE2? This was a very useful filter and would IMO be a good addition to the filter modules in TAAE2.

If one were to implement this filter, what are your thoughts on how it should be implemented? The TAAE version is available here: https://github.com/TheAmazingAudioEngine/TheAmazingAudioEngine/blob/master/Modules/AEExpanderFilter.m

Thanks!

Accessing the rear mic and doing an FFT

Is it possible to access the rear mic and perform an FFT on the mic data? Let me know. I came across your Remote IO page and I am trying to record audio data, but have not been successful. I am using the aurioTouch source code and I want to record and play audio data from the rear microphone.

TAAE Sample Can't Find Headers

Out of the box, I get some errors right away when building the sample app. The project can't locate headers for some reason.

/Users/slcott/Downloads/TAAE2-master/TAAESample/TAAESample-Bridging-Header.h:9:9: note: in file included from /Users/slcott/Downloads/TAAE2-master/TAAESample/TAAESample-Bridging-Header.h:9:
#import "AEAudioController.h"
        ^
/Users/slcott/Downloads/TAAE2-master/TAAESample/Classes/AEAudioController.h:12:9: error: 'TheAmazingAudioEngine/TheAmazingAudioEngine.h' file not found
#import <TheAmazingAudioEngine/TheAmazingAudioEngine.h>
        ^
<unknown>:0: error: failed to import bridging header '/Users/slcott/Downloads/TAAE2-master/TAAESample/TAAESample-Bridging-Header.h'

Sample code - threading issues accessing balanceSweepRate

AEAudioController has an ivar _balanceSweepRate that's written by the main thread and read by the audio thread.

As far as I can tell, there is currently no guarantee that changes made to _balanceSweepRate by the main thread will be visible to the real-time audio thread. I think it is theoretically possible on a multi-core system for changes to _balanceSweepRate to stay in one core's data cache, and never become visible to the other core.

I'm not sure how to fix this. Maybe add "volatile" and use a memory barrier when writing it.

AEAudioUnitModuleProcess doubles the gain of mono buffers

I was using mono buffers to debug an AEAudioUnitModule, and I noticed that my output buffers had double the amplitude of my input buffers. (e.g. if my input samples were 1.0 peak-to-peak the output samples are 2.0 peak-to-peak.)

I think this is because AEAudioUnitModuleProcess is hard-coded to create a stereo buffer in line 141 "// Get a buffer with 2 channels". This stereo buffer is initialized with two copies of the original data, and later on output stereo buffers are combined back into a single mono buffer. As a result the amplitude of the samples gets doubled.

Should AEAudioFileOutput call AEManagedValueCommitPendingUpdates ?

The docs for AEManagedValueCommitPendingUpdates state:

"If you are not using AEAudioUnitOutput, then you must call the AEManagedValueCommitPendingUpdates function at the beginning of your main render loop."

But reading AEAudioFileOutput, I don't see where AEManagedValueCommitPendingUpdates is called. Should it be called somewhere near line 71 of AEAudioFileOutput.m (the top of the while loop inside |AEAudioFileOutput runForDuration:completionBlock:|)?

Would you consider using NS_UNAVAILABLE and NS_DESIGNATED_INITIALIZER?

Today I made a mistake by initializing an oscillator module in a test program as follows:

[[AEOscillatorModule alloc] init];

That compiled fine, but it didn't work (as I found out after debugging) because the oscillator module doesn't get initialized fully.

I should have written this instead:

[[AEOscillatorModule alloc] initWithRenderer:renderer];

Would you consider adding

- (instancetype)init NS_UNAVAILABLE;

and maybe also

NS_DESIGNATED_INITIALIZER

in all the appropriate places in your public headers?

Set tempo

Hey,
I'm trying to set a tempo for a player playing any audio without affecting the pitch.
I tried AENewTimePitchFilter for setting the rate; the tempo of the track seems fine, but the audio doesn't sound right. The pitch seems unaffected, but the audio loses some of its frequencies.
For example, I used a simple kick sample, and when I increased the rate to, say, 1.5, it removed the attack sound from the kick.
I'm new to using TAAE, so I'm a little lost.
Any kind of help would be greatly appreciated, thanks for your help in advance.

Add hooks for redirecting logging output from AEError, AERateLimit

Clients sometimes want to print log messages in special ways, rather than directly calling NSLog.

It would be nice if AEError and AERateLimit allowed clients to specify a custom error logging callback. The default if no custom error logging callback is specified would be to call NSLog.

Crash when using `AEAudioFilePlayerModule`

Hey,

I get a crash when I use AEAudioFilePlayerModule to play audio stored in a file.

Steps to reproduce:

  1. Start playing content of the audio file using AEAudioFilePlayerModule class.
  2. Set all references to the instance of the AEAudioFilePlayerModule class to nil in the completion block of that class. In most cases, the application keeps only one reference to that object.

Block of the renderer looks in the following way:

__weak typeof(self) weakSelf = self;
_renderer.block = ^(const AERenderContext * _Nonnull context) {
    __strong typeof(weakSelf) self = weakSelf;
    AEModuleProcess(self.audioFilePlayerModule, context);
    AERenderContextOutput(context, 1);
};

Result:

  1. Application crashes.

Expected result:

  1. No crash. 😀

[crash screenshot]

AEModule inputs other than audio?

Hi Michael,

I'm trying to get my head around how it might be structured within TAAE2 to have inputs other than the render context handled inside an AEModule subclass. For example, I might create an LFO subclass of AEModule that doesn't generate audio per se, but rather generates a "control signal" (in the form of an AudioBufferList) that I can access from inside other AEModule subclasses.

Would I essentially create a separate context for the LFO, and then add a property to the subclass to access that context?

Thanks for any guidance!
Ben

TAAESample needs to add NSMicrophoneUsageDescription key

Starting with iOS 10.2, iOS requires an NSMicrophoneUsageDescription key in the app's Info.plist for apps that use the microphone. This is checked both by Apple at submission time and by the runtime when the microphone is opened. Here's the assert that happens if this string is missing:

2017-02-01 14:25:21.169740-0800 TAAESample[384:36946] [access] This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSMicrophoneUsageDescription key with a string value explaining to the user how the app uses this data.

How to cleanup AEMixerModule after stopping AEAudioUnitOutput?

I have an audio controller that has these properties

  self.renderer = [AERenderer new];
  self.output = [[AEAudioUnitOutput alloc] initWithRenderer:self.renderer];
  self.mixer = [[AEMixerModule alloc] initWithRenderer:self.renderer];

Playing audio consists of adding AEAudioFilePlayerModule objects to the mixer, with different starting timestamps.

// For every sample
AEAudioFilePlayerModule *fp = [[AEAudioFilePlayerModule alloc] initWithRenderer:self.renderer URL:sample error:nil];
// Calculate timestamp
[fp playAtTime:timestamp];
__weak AEAudioFilePlayerModule *weakFp = fp;
[fp setCompletionBlock:^{
    [self.mixer removeModule:weakFp];
}];
[self.mixer addModule:fp];

And then starting playback is simply

[self.output start:error];

While audio is playing, more AEAudioFilePlayerModule might be scheduled.

Stopping audio is done like this:

[self.output stop];
[self.mixer setModules:@[]];

With this setup I can play and stop audio fine. But after playing, stopping and then playing again I get these messages in my console:

0x6000000ef280: AEManagedValueServiceReleaseQueue called from outside realtime thread
AEManagedValueCommitPendingUpdates called from outside realtime thread
0x6000000ef200: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000edc80: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000ef800: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000ef280: AEManagedValueServiceReleaseQueue called from outside realtime thread
AEManagedValueCommitPendingUpdates called from outside realtime thread
0x6000000ef200: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000edc80: AEManagedValueServiceReleaseQueue called from outside realtime thread
2016-09-28 23:57:51.516 TCH[55016:17181235] TAAE: Suppressing some messages
0x6000000ef280: AEManagedValueServiceReleaseQueue called from outside realtime thread
AEManagedValueCommitPendingUpdates called from outside realtime thread
0x6000000ef800: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000ef200: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000edc80: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000ef800: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6080000f2000: AEManagedValueServiceReleaseQueue called from outside realtime thread
0x6000000ef280: AEManagedValueServiceReleaseQueue called from outside realtime thread
AEManagedValueCommitPendingUpdates called from outside realtime thread
2016-09-28 23:57:52.596 TCH[55016:17181253] TAAE: Suppressing some messages

Sometimes the app crashes because of this. So the question is what am I doing wrong and what is the proper way to clean up after stopping audio in this case?

Question regarding `stopRecordingAtTime:completion` method of the `AEAudioFileRecorderModule` class

Hey,

Looking at the implementation of the stopRecordingAtTime:completion method, I noticed that it uses an NSTimer. In fact, stopping the recording is asynchronous and happens from within the implementation of the AEAudioFileRecorderModuleProcess function.

In my application I would like to be able to stop recording the audio file even if the input and output audio units are stopped. I stop the audio units in order to implement pause functionality in my recorder. Unfortunately, when these units are stopped, the AEAudioFileRecorderModuleProcess function isn't called, and consequently the recording can't finish.

Because of the reasons described above I have a few questions:

  1. Is the use of NSTimer in the implementation of the stopRecordingAtTime:completion method necessary?
  2. Are there any reasons against a synchronous version of the stopRecordingAtTime:completion method, which would be able to finish recording even if all audio units are stopped?

And one more time - thanks for the great work!

Question about Rendering Process

I'm trying to understand how the render/subrender pipeline works in TAAE2.
I'm trying to add filters to individual player modules. With TAAE1, I think it was possible to add a filter to a single AEAudioFilePlayer using addFilter:toChannel:.

So let's say in the Sample App I need to apply an AEDelayModule exclusively to the drums AEAudioFilePlayerModule. How do I go about doing this?

My thinking was to create a renderer for the player and make use of AERendererRun(). The subrenderer block in the sample app example would then look something like this:

subrenderer.block = ^(const AERenderContext * _Nonnull context) {
    AERendererRun(rendererDrums, context->output, context->frames, context->timestamp);
    AEModuleProcess(mixer, context);
    AERenderContextOutput(context, 1);
};

Here is the new renderer:

AEDelayModule * delayModule = [[AEDelayModule alloc] initWithRenderer:rendererDrums];
delayModule.delayTime = 0.5;
rendererDrums.block = ^(const AERenderContext * _Nonnull context) {
    AEModuleProcess(delayModule, context);
    AEBufferStackMixToBufferList(context->stack, YES, context->output);
    AEBufferStackPop(context->stack, 1);
    AERenderContextOutput(context, 1);
};

This doesn't seem to work obviously :)

An update on TAAE2

Hi Michael,
would you mind giving a quick update on the current state of TAAE2?

I'm guessing you're busy getting AB3 out the door, but as TAAE2 has been in beta for nearly 8 months, and as updates have been a little slow lately, I'd love a quick confirmation that I'm betting on the right framework.

If you wouldn't mind:

  1. Is TAAE2 the preferred choice over TAAE1 at this point?
  2. Can we expect v1.0 in the not too distant future?

Thanks again for all your hard work.

iOS 10 errors

Errors compiling for iOS 10. Please let me know if I did something wrong, or whether you'll update the code. Thanks - Julian

[screenshot]

Rename "Core Types" directory so bazel can build it

It would be helpful if the "Core Types" directory was renamed to not have a space in it. (Maybe to "CoreTypes".)

The reason I'm asking for this change is that I'm trying to add TheAmazingAudioEngine to a project that's built using the bazel build system. Bazel (http://www.bazel.io/) has a limitation that it doesn't support directories (or files) with spaces in their names.

I can work around this issue by renaming my local copy of the "Core Types" directory, but it would be convenient if "Core Types" was renamed in the upstream TheAmazingAudioEngine2 project.

I'd be happy to submit a pull request that renames the directory, if you would like me to.

Bug with headphones

Hey Michael,

The TAAE2 lib seems to have a bug with headphones. You can reproduce the bug easily with your sample:
Just kill the sample app,
plug in your headphones,
launch the sample app and play a beat.
The beat seems to play 2 times faster.

Readme Organization

Hey, your library is really interesting.

The only problem I found was the README.md, which lacks information.
I created this iOS Open Source Readme Template, so you can take a look at how to organize it better.

If you want, I can help you to do it.

What are your thoughts?

Add a sample rate converter module?

Would it be possible to add a AESampleRateConverterModule, for sample rate conversion?

My use case is a streaming music player for iOS. There is no AudioUnit for streaming music, so I currently use the AudioStreamFile APIs to produce PCM audio at the output sample rate.

On some iOS devices the output sample rate changes when headphones are plugged in or unplugged. In that situation, I'd like to avoid having to restart the stream to pick up the new sample rate. Instead, I'd like to use a kAudioUnitSubType_AUConverter unit to convert the PCM data from its "native" sample rate to the current output sample rate.

I'm going to try implementing it myself, by subclassing AEAudioUnitModule. I'm guessing I'll have to do something special with subrenderer.sampleRate, to make it stay constant when renderer.sampleRate changes.

(I apologize if I'm missing something obvious, and TAAE2 already handles this case some other way.)

Overriding part of the audio file which is used as recording output

Hi,

I have a case as follows:
There is an option to record microphone input in my application. I implemented this functionality using AEAudioFileRecorderModule and everything works as expected. Now, I want to give the user the possibility to remove the last x seconds of the recording and start recording again from the new end of the recording.

I looked at the classes available in TAAE2 and the AEAudioFileRecorderModule API, and I don't see how I can achieve that kind of functionality. Can you give me some tips that would allow me to implement the desired behaviour?

I wrote an AEAudioDataOutput class for use in my tests -- would it be useful in general?

I made a new class, based on AEAudioFileOutput, but instead of writing to an audio file it just calls a "sink" block with each rendered buffer.

I wrote this to help me make automated tests that compare the rendered output to expected output.

Is this something you would be interested in putting in TAAE2? If you like, I can clean up my code and send a pull request.

It's fine if you're not interested, or if you'd rather do it your own way. I just wanted to check before going to the trouble of creating a pull request.

AEIOAudioUnit - check AVAudioSessionInterruptionOptionShouldResume

In AEIOAudioUnit the observer for AVAudioSessionInterruptionNotification should check
AVAudioSessionInterruptionOptionShouldResume before trying to restart the audio session.

Something like this:

__weak typeof(self) weakSelf = self;
__block BOOL wasRunning = NO;
[[NSNotificationCenter defaultCenter] addObserverForName:AVAudioSessionInterruptionNotification object:nil queue:nil
                                              usingBlock:^(NSNotification *notification) {
    NSInteger type = [notification.userInfo[AVAudioSessionInterruptionTypeKey] integerValue];
    if ( type == AVAudioSessionInterruptionTypeBegan ) {
        wasRunning = weakSelf.running;
        if ( wasRunning ) {
            [weakSelf stop];
        }
    } else {
        NSUInteger optionFlags =
            [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue];
        if ( optionFlags & AVAudioSessionInterruptionOptionShouldResume ) {
            if ( wasRunning ) {
                [weakSelf start:NULL];
            }
        }
    }
}];

Add DEBUG check that |AEManagedValue setValue| is called on main thread

I know |AEManagedValue setValue| is documented as having to be called on the main thread, but I mistakenly called it on a GCD background thread.

...it turns out that if you do that, the AEManagedValue pollTimer doesn't work, and so old values never get released.

It would be nice for users if |AEManagedValue setValue| checked that it was on the main thread. You could limit the check to when the pollTimer is created, to minimize runtime overhead.

Problem Record Input

I think the library has a major issue: it is impossible to record only the input; doing so causes an EXC_BAD_ACCESS in ExtAudioFileWriteAsync.

AEContext->frameIndex

I would like to see a UInt64 frameIndex added to the render context, which would basically be the equivalent of timestamp->mSampleTime without the pitfalls of AudioTimeStamp and floating point.

Better description under title

[screenshot]

Would be cooler to have something like:

The Amazing Audio Engine is a sophisticated framework for iOS audio applications, built so you don't have to.

Add protocol for output modules?

I've been using TAAE2 in a music player, and so far it's been working well.

For my automated tests I wrote an AudioDataOutput class, that is similar to AEAudioFileOutput, but instead of writing to an audio file it calls a callback method with the current batch of audio data. That way my automated tests can compare the generated audio to the expected audio.

This works, and it's convenient to have the tests run faster than real time.

I wonder if (a) this is something that TAAE2 could have built in, and (b) if there should be some sort of common protocol for output modules, so that it would be easier to write code that works with one of several output modules.

Currently my code that handles which output unit to create looks something like this:

// ... in my class's instance variable declaration block.
// Poor-man's polymorphism: only one of these two outputs will be initialized.
AEAudioUnitOutput *_output;
MYAudioDataOutput *_dataOutput;

if ( testing ) {
    _dataOutput = [[MYAudioDataOutput alloc] initWithRenderer:renderer ...];
    [_dataOutput start:NULL];
} else {
    _output = [[AEAudioUnitOutput alloc] initWithRenderer:renderer];
    [_output start:NULL];
}

If there were a common protocol for output modules, the code could look like:

id<AEOutput> _output;

if ( testing ) {
    _output = [[MYAudioDataOutput alloc] initWithRenderer:renderer ...];
} else {
    _output = [[AEAudioUnitOutput alloc] initWithRenderer:renderer];
}
[_output start:NULL];
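Such a protocol might be declared along these lines. This is a hypothetical sketch, not TAAE2 API; the protocol name and method list are illustrative, modeled on what AEAudioUnitOutput already exposes:

```objc
// Hypothetical protocol; AEOutput is not part of TAAE2.
@protocol AEOutput <NSObject>
- (instancetype)initWithRenderer:(AERenderer *)renderer;
- (BOOL)start:(NSError * _Nullable * _Nullable)error;
- (void)stop;
@end
```

AEAudioUnitOutput, AEAudioFileOutput, and custom outputs like MYAudioDataOutput could then each declare conformance, and calling code would only depend on id<AEOutput>.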

Not recording

Hi,

I've been trying to set up TAAE1/2 in different apps in an attempt to record, essentially, the app's output.
In this case I am using TAAE2 and I have an app that loads 2 sounds locally. I want to be able to start a recording, play and pause the 2 sounds at varying times along with some effects at random times, stop recording, and then save the file.

I have linked up Record and PlayRecording buttons to snippets of code similar to what is available in the TAAE2 sample, but I can't seem to get any actual output recorded. When I check through iExplorer, I always see an AppOutput.m4a file, but it is always 557 bytes in size, so I'm guessing it doesn't actually contain any audio.

Before pasting my code in: does recording not work when playing sounds through AVAudioPlayer? I'm not sure about this. I did try to start playing the Drums sound that comes with the sample app when I tap record, and to stop it when I stop recording, but that sound never plays when I run the app, even though my code does enter the "if audio.drums.playing" section included below.

Anyway, I will paste snippets of my record and play record functions. Any help would be greatly appreciated.

- (IBAction)recordPressed:(id)sender {
	if (_audioController.recording) {
		if (_audioController.drums.playing) {
			[_audioController.drums stop];
		}
		NSLog(@"stop recording");
		[_audioController stopRecordingAtTime:0 completionBlock:^{
			[self.recordButton setSelected:NO];
		}];
	} else {
		@try {
			NSLog(@"Start recording");
			NSError *error;
			[_audioController.drums playAtTime:AETimeStampNone];
			[_audioController beginRecordingAtTime:0 error:&error];
		} @catch (NSException *exception) {
			[_recordButton setEnabled:NO];
		}
	}
}

- (IBAction)playRecorded:(id)sender {
	if (_audioController.playingRecording) {
		NSLog(@"Stop Playing recording");
		[_audioController stopPlayingRecording];
		[_playRecordingButton setSelected:NO];
	} else {
		NSLog(@"Start playing recording");
		[_playRecordingButton setSelected:YES];
		
		[_audioController playRecordingWithCompletionBlock:^{
			[_playRecordingButton setSelected:NO];
		}];
	}
}

Thanks a lot,
Alan

how to install?

When I run pod install, it installs TheAmazingAudioEngine 1.5.5 (version 1, not TAAE2). My Podfile:
use_frameworks!

target 'voice' do

pod 'TheAmazingAudioEngine'

end
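That pod name refers to the original engine; TAAE2 is published as a separate pod. To the best of my knowledge the pod name is TheAmazingAudioEngine2, so a Podfile along these lines should pull version 2 (verify the name against the CocoaPods index before relying on it):

```ruby
use_frameworks!

target 'voice' do
  pod 'TheAmazingAudioEngine2'
end
```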
