
RunACR iOS - Sound Recognition SDK.

The RunACR SDK allows your mobile app to recognize sounds and display interactive content synchronized with what your users are hearing at that moment (similar to Shazam).

As of September 1st, 2018, the SDK supports live streams, so you can use it to recognize TV and radio stations. You can also purchase the SDK source code for iOS, Android, and the Web Server for unlimited use in your projects. Contact us at [email protected] for details.

To test the RunACR SDK, install The Young Pope — Second Screen: https://itunes.apple.com/us/app/id1215589988?mt=8

RunACR screen

Requirements

  • iOS 8+,
  • ARC.

Installation

  1. Drag RunACRSDK.framework into your project. When prompted, select Copy items into destination group's folder.
  2. Go to the General settings tab of your target settings and add the framework to the Embedded Binaries section.
  3. For iOS 10 and later, add an NSMicrophoneUsageDescription entry (explaining why the app needs the microphone) to your Info.plist file.
  4. Import the module with @import RunACRSDK; in the files where you use the library.
  5. Initialize the RunACRSDK in your AppDelegate's application:didFinishLaunchingWithOptions: method:
[[RunACR sharedInstance] initializeWithAPIKey:@"API_KEY"];
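
For context, a minimal AppDelegate might look like the sketch below. This is an illustration only; "API_KEY" is a placeholder for the key issued for your account.

// AppDelegate.m - minimal sketch; replace "API_KEY" with your own key.
@import UIKit;
@import RunACRSDK;

@interface AppDelegate : UIResponder <UIApplicationDelegate>
@end

@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Initialize the SDK once, before any recognition is started.
    [[RunACR sharedInstance] initializeWithAPIKey:@"API_KEY"];
    return YES;
}

@end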

Add the impression database file that you downloaded from runacr.com to your project and pass its path to the updateDatabasePath: method:

[[RunACR sharedInstance] updateDatabasePath:path];
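
If the database file is bundled with the app, its path can be obtained from the main bundle. The file name impressions.db below is an assumed placeholder for the file you downloaded from runacr.com:

// Sketch: locate the bundled impression database (placeholder file name).
NSString *path = [[NSBundle mainBundle] pathForResource:@"impressions" ofType:@"db"];
if (path) {
    [[RunACR sharedInstance] updateDatabasePath:path];
}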

Specify the RunACRDelegate delegate:

[RunACR sharedInstance].delegate = self;

Implement the two methods of the RunACRDelegate protocol: didRecognize: receives the result of a successful recognition, and didNotRecognize is called when recognition fails.

#pragma mark - RunACRDelegate

// Called when a track is recognized. relativeTimeOffset is the playback
// position within the track, in seconds.
- (void)didRecognize:(int)trackId absoluteTimeOffset:(float)absoluteTimeOffset relativeTimeOffset:(float)relativeTimeOffset {

    int seconds = (int)relativeTimeOffset % 60;
    int minutes = ((int)relativeTimeOffset / 60) % 60;
    int hours   = (int)relativeTimeOffset / 3600;

    NSString *str = [NSString stringWithFormat:@"%02d:%02d:%02d", hours, minutes, seconds];

    if (trackId == 6) {
        // David Bowie - The Stars 00:02:43
        _myLabel.text = [NSString stringWithFormat:@"David Bowie - The Stars %@", str];
    } else if (trackId == 7) {
        // The Rolling Stones - Sympathy For The Devil 00:03:26
        _myLabel.text = [NSString stringWithFormat:@"The Rolling Stones - Sympathy For The Devil %@", str];
    }
}


// Called when nothing was recognized. Not found, so try again.
- (void)didNotRecognize {
    [[RunACR sharedInstance] startRecognize];
}

To start recognition processing, call the startRecognize method.

[[RunACR sharedInstance] startRecognize];
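
Putting the pieces together, a hypothetical view controller could wire up the delegate and start listening as sketched below. The class name RecognitionViewController and the myLabel label are illustrative and not part of the SDK.

// RecognitionViewController.m - illustrative sketch only.
@import UIKit;
@import RunACRSDK;

@interface RecognitionViewController : UIViewController <RunACRDelegate>
@property (nonatomic, strong) UILabel *myLabel;
@end

@implementation RecognitionViewController

- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    // Receive recognition callbacks in this controller and start listening.
    [RunACR sharedInstance].delegate = self;
    [[RunACR sharedInstance] startRecognize];
}

// The didRecognize:... and didNotRecognize methods shown above go here.

@end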

Contact

Andrei Solovjev
