How Do You Equip Your App with Audio Playback Capabilities Using Audio Kit?

Unlike text or video, when users consume audio content, they can also do something else while they're listening. This is why users tend to choose audio, rather than text or video, when commuting or doing housework.
This makes audio playback a valuable addition for many apps. For example, fitness and health apps are more engaging when they have the ability to play music or audiobooks, while education apps are more effective when they provide useful audio courses, and ringtone apps need to be able to play a variety of ringtones.
So then, how do you build audio capabilities for your app?
The answer is, by using HUAWEI Audio Kit.
It provides you with a range of capabilities, including audio encoding and decoding at both the hardware and system levels.
Audio Kit provides the following functions:
- Play audio: apps can decode and play high-resolution audio files of up to 384 kHz/24 bit.
- Control playback: users can play, pause, play previous, play next, stop, and drag the progress bar.
- Adjust volume: users can increase or decrease the volume.
- Manage playlists: users can view, save, and delete playlists, as well as add songs to a playlist.
- Manage play modes: apps can provide sequential playback, playlist repeat, single-song repeat, and shuffle.
- Save playback progress: users can save their playback progress and resume from where they left off.
- Cache and encrypt content: apps can cache and encrypt audio content.
You can find the demo source code on GitHub.
Development Practice
1. Integrate the HMS Core Audio SDK
1.1 Configure the Maven Repository Address for the Audio SDK
Step 1 Open the build.gradle file in the root directory of your Android Studio project.
Step 2 Configure the Maven repository address and add the gradle configuration.
- Go to allprojects > repositories and configure the Maven repository address for the Audio SDK.
- Go to buildscript > repositories and configure the Maven repository address for the Audio SDK.
- Go to buildscript > dependencies and add the Gradle plugin configuration.
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.4.2'
        // NOTE: Do not place your app dependencies here; they belong
        // in the individual module build.gradle files.
    }
}

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
1.2 Add Build Dependencies
Step 1 Open the build.gradle file in the app directory.
Step 2 Add build dependencies in the dependencies section.
Code:
dependencies {
    implementation 'com.huawei.hms:audiokit-player:{version}'
}
1.3 Synchronize the Project
Once you have completed the configuration above, click the synchronization icon on the toolbar to synchronize the build.gradle file.
2. Configure Obfuscation Scripts
Before you build the APK, configure the obfuscation file to prevent the HMS Core SDK from being obfuscated.
The obfuscation configuration file is proguard-rules.pro for Android Studio.
Step 1 Open the obfuscation configuration file proguard-rules.pro in the app directory.
Step 2 Remove the HMS Core SDK from obfuscation.
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.huawei.hianalytics.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
Step 3 If you are using AndResGuard, add its trustlist to the obfuscation configuration file.
Code:
"R.string.hms*",
"R.string.connect_server_fail_prompt_toast",
"R.string.getting_message_fail_prompt_toast",
"R.string.no_available_network_prompt_toast",
"R.string.third_app_*",
"R.string.upsdk_*",
"R.layout.hms*",
"R.layout.upsdk_*",
"R.drawable.upsdk*",
"R.color.upsdk*",
"R.dimen.upsdk*",
"R.style.upsdk*",
"R.string.agc*"
3. Add Permissions
The Audio SDK requires permissions to access the network, obtain the network status, operate SD cards, and read data from the Android media library. Declare these permissions in the Manifest file:
Code:
<!-- Permission to access the network. -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Permission to obtain the network status. -->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Permission to write data to the SD card. -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Permission to read data from the SD card. -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<!-- Permission to read data from the Android media library. -->
<uses-permission android:name="android.permission.READ_MEDIA_STORAGE" />
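Note that on Android 6.0 and later, the two storage permissions above are dangerous permissions that must also be requested at runtime. Below is a minimal sketch using the standard Android APIs; the request code is arbitrary:
Code:
import android.Manifest;
import android.content.pm.PackageManager;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

private static final int REQUEST_STORAGE = 1001; // arbitrary request code

private void requestStoragePermissionsIfNeeded() {
    if (ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
            != PackageManager.PERMISSION_GRANTED) {
        ActivityCompat.requestPermissions(this,
                new String[]{Manifest.permission.READ_EXTERNAL_STORAGE,
                        Manifest.permission.WRITE_EXTERNAL_STORAGE},
                REQUEST_STORAGE);
    }
}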
4. Develop Your App
Step 1 Create an HwAudioManager instance for audio management. Use it to obtain HwAudioPlayerManager for playback control, HwAudioQueueManager for queue management, and HwAudioConfigManager for configuration management.
Code:
private HwAudioPlayerManager mHwAudioPlayerManager;
private HwAudioConfigManager mHwAudioConfigManager;
private HwAudioQueueManager mHwAudioQueueManager;

public void createHwAudioManager() {
    // Create a configuration instance, including various playback-related configurations.
    HwAudioPlayerConfig hwAudioPlayerConfig = new HwAudioPlayerConfig(MainActivity.this);
    // Create a management instance.
    HwAudioManagerFactory.createHwAudioManager(hwAudioPlayerConfig, new HwAudioConfigCallBack() {
        // Return the management instance through the callback.
        @Override
        public void onSuccess(HwAudioManager hwAudioManager) {
            try {
                Log.i(TAG, "createHwAudioManager onSuccess");
                // Obtain the playback management instance.
                mHwAudioPlayerManager = hwAudioManager.getPlayerManager();
                // Obtain the configuration management instance.
                mHwAudioConfigManager = hwAudioManager.getConfigManager();
                // Obtain the queue management instance.
                mHwAudioQueueManager = hwAudioManager.getQueueManager();
            } catch (Exception e) {
                Log.i(TAG, "player init fail");
            }
        }

        @Override
        public void onError(int errorCode) {
            Log.w(TAG, "init err:" + errorCode);
        }
    });
}
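Because the manager instances are returned asynchronously, create them early, for example in onCreate() of your activity, so they are ready before the user starts playback. A minimal sketch:
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    // Initialize the Audio Kit managers before any playback request is made.
    createHwAudioManager();
}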
Step 2 Create a playlist and play songs.
Code:
public void play() {
    if (mHwAudioPlayerManager != null) {
        // URL of an online audio file.
        String path = "https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-chengshilvren.mp3";
        // Create an audio object and write audio information into the object.
        HwAudioPlayItem item = new HwAudioPlayItem();
        // Set the audio title.
        item.setAudioTitle("Playing input song");
        // Set the audio ID.
        item.setAudioId(String.valueOf(path.hashCode()));
        // Set whether the audio is online.
        item.setOnline(1);
        // Set the online audio URL.
        item.setOnlinePath(path);
        // Create a playlist and add the audio object to it.
        List<HwAudioPlayItem> playItemList = new ArrayList<>();
        playItemList.add(item);
        // Play the songs, starting from the first item.
        mHwAudioPlayerManager.playList(playItemList, 0, 0);
    }
}
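Once the playlist is playing, HwAudioPlayerManager also exposes the playback controls listed at the beginning of this article. The sketch below shows a few common ones; the exact method names are my assumptions based on the API reference, so verify them against your SDK version:
Code:
public void pauseSong() {
    if (mHwAudioPlayerManager != null) {
        mHwAudioPlayerManager.pause(); // pause the current song
    }
}

public void resumeSong() {
    if (mHwAudioPlayerManager != null) {
        mHwAudioPlayerManager.play(); // resume playback
    }
}

public void seekTo(int positionMs) {
    if (mHwAudioPlayerManager != null) {
        mHwAudioPlayerManager.seekTo(positionMs); // e.g. when the user drags the progress bar
    }
}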
Step 3 Use the instances. The following are examples; for more details, see Audio Kit's management APIs.
- Clear the playback cache.
Code:
public void clearPlayCache() {
    if (mHwAudioConfigManager != null) {
        mHwAudioConfigManager.clearPlayCache();
    }
}
- Check whether the current playback queue is empty.
Code:
public boolean isQueueEmpty() {
    if (mHwAudioQueueManager != null) {
        return mHwAudioQueueManager.isQueueEmpty();
    }
    return false;
}
And that's it! You've equipped your app with audio playback capabilities.

Related

HMS Video Kit — 1
In this article, I will write about the features of Huawei’s Video Kit and we will develop a sample application that allows you to play streaming media from a third-party video address.
Why should we use it?
Video apps are hugely popular nowadays. Due to the popularity of streaming media, many developers have introduced HD movie streaming apps for people who use devices such as tablets and smartphones for everyday purposes. With the Video Kit WisePlayer SDK, you can bring stable HD video experiences to your users.
Service Features
Provides an HD video experience without delay
Responds instantly to playback requests
Offers intuitive controls and content on demand
Selects the most suitable bitrate for your app
Secures your videos with URL anti-leeching, playback authentication, and other security mechanisms
Supports streaming media in 3GP, MP4, or TS format, complying with HTTP/HTTPS, HLS, or DASH
Integration Preparations
First of all, in order to start developing an app with most of the Huawei mobile services and the Video Kit as well, you need to integrate the HUAWEI HMS Core into your application.
Software Requirements
Android Studio 3.X
JDK 1.8 or later
HMS Core (APK) 5.0.0.300 or later
EMUI 3.0 or later
For the detailed HMS Core integration process, you can refer to Preparations for Integrating HUAWEI HMS Core.
After creating the application in AppGallery Connect and completing the other required steps, make sure that you copy the agconnect-services.json file to the app directory of your Android Studio project.
Adding SDK dependencies
Add the AppGallery Connect plug-in and the Maven repository in the project-level build.gradle file.
Code:
buildscript {
    repositories {
        ......
        maven {url 'https://developer.huawei.com/repo/'}
    }
    dependencies {
        ......
        classpath 'com.huawei.agconnect:agcp:1.3.1.300' // HUAWEI agcp plugin
    }
}
allprojects {
    repositories {
        ......
        maven {url 'https://developer.huawei.com/repo/'}
    }
}
2. Open the build.gradle file in the app directory and add the AppGallery Connect plug-in.
Code:
apply plugin: 'com.android.application'
// Add the following line
apply plugin: 'com.huawei.agconnect' // HUAWEI agconnect Gradle plugin

android {
    ......
}
3. Configure the Maven dependency in the app-level build.gradle file.
Code:
dependencies {
    ......
    implementation "com.huawei.hms:videokit-player:1.0.1.300"
}
You can find all the version numbers of this kit in its Version Change History.
4. Configure the NDK in the app-level build.gradle file.
Code:
android {
    defaultConfig {
        ......
        ndk {
            abiFilters "armeabi-v7a", "arm64-v8a"
        }
    }
    ......
}
Here, we have used the abiFilters in order to reduce the .apk size by selecting the desired CPU architectures.
5. Add permissions in the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="com.huawei.permission.SECURITY_DIAGNOSE" />
Note: For Android 6.0 and later, Video Kit dynamically applies for the write permission on external storage.
6. Lastly, add configurations to exclude the HMS Core SDK from obfuscation.
The obfuscation configuration file is proguard-rules.pro for Android Studio.
Open the obfuscation configuration file of your Android Studio project and add the configurations.
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.hianalytics.android.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
With these steps, we have completed the integration part. Now, let's get our hands dirty with some code…
Initializing WisePlayer
To initialize the player, we need to create a class that inherits from Application. The Application class is the base class of an Android app, containing components such as activities and services. Application (or a subclass of it) is instantiated before any activity or other application object in the Android app.
We can add our own initialization logic by extending the Application class. Here, we call the WisePlayer SDK initialization API, WisePlayerFactory.initFactory(), in the onCreate() method.
Java:
public class VideoKitPlayApplication extends Application {
    private static final String TAG = "VideoKitPlayApplication";
    private static WisePlayerFactory wisePlayerFactory = null;

    @Override
    public void onCreate() {
        super.onCreate();
        initPlayer();
    }

    private void initPlayer() {
        // A test deviceId is used in the demo; pass in the real deviceId after encryption.
        Log.d(TAG, "initPlayer: VideoKitPlayApplication");
        WisePlayerFactoryOptions factoryOptions = new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
        WisePlayerFactory.initFactory(this, factoryOptions, initFactoryCallback);
    }

    /**
     * Player initialization callback
     */
    private static InitFactoryCallback initFactoryCallback = new InitFactoryCallback() {
        @Override
        public void onSuccess(WisePlayerFactory wisePlayerFactory) {
            Log.d(TAG, "init player factory success");
            setWisePlayerFactory(wisePlayerFactory);
        }

        @Override
        public void onFailure(int errorCode, String reason) {
            Log.d(TAG, "init player factory fail reason :" + reason + ", errorCode is " + errorCode);
        }
    };

    /**
     * Get WisePlayer Factory
     *
     * @return WisePlayer Factory
     */
    public static WisePlayerFactory getWisePlayerFactory() {
        return wisePlayerFactory;
    }

    private static void setWisePlayerFactory(WisePlayerFactory wisePlayerFactory) {
        VideoKitPlayApplication.wisePlayerFactory = wisePlayerFactory;
    }
}
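One step that is easy to miss: the custom Application class must be registered in the AndroidManifest.xml file via the android:name attribute; otherwise onCreate() will never run and the factory will never be initialized. A minimal sketch (the relative class path assumes the class sits in your app's root package):
Code:
<application
    android:name=".VideoKitPlayApplication"
    ...>
    ...
</application>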
Playing a Video
We need to create a PlayActivity that inherits from AppCompatActivity and implements the Callback and SurfaceTextureListener APIs. Currently, WisePlayer supports SurfaceView and TextureView. Make sure that your app has a valid view for video display; otherwise, the playback will fail. So, in the layout file, we need to add a SurfaceView or TextureView to be displayed in WisePlayer. PlayActivity also implements OnPlayWindowListener and OnWisePlayerListener in order to receive callbacks from WisePlayer.
Java:
import android.view.SurfaceHolder.Callback;
import android.view.TextureView.SurfaceTextureListener;
import com.videokitnative.huawei.contract.OnPlayWindowListener;
import com.videokitnative.huawei.contract.OnWisePlayerListener;

public class PlayActivity extends AppCompatActivity implements Callback, SurfaceTextureListener, OnWisePlayerListener, OnPlayWindowListener {
    ...
}
The WisePlayerFactory instance is returned when initialization completes in the Application class. We then call createWisePlayer() to create a WisePlayer.
Java:
WisePlayer player = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
In order to make the code modular and understandable, I have created a PlayControl.java class, as in the official demo, and created the WisePlayer there. Since we create the PlayControl object in our PlayActivity class through its constructor, the WisePlayer will be created in the onCreate() method of our PlayActivity.
Note: Before calling createWisePlayer() to create WisePlayer, make sure that Application has successfully initialized the WisePlayer SDK.
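A minimal sketch of how PlayControl might create the player (the class shape here is illustrative; the demo's actual class carries more responsibilities):
Java:
public class PlayControl {
    private WisePlayer wisePlayer;

    public void init() {
        // Create the player only after the factory has been initialized in the Application class.
        if (VideoKitPlayApplication.getWisePlayerFactory() != null) {
            wisePlayer = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
        }
    }
}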
Now we need to initialize the WisePlayer layout and add layout listeners. I have created PlayView.java for creating and updating the views, so we can create the PlayView instance in the onCreate() method of our PlayActivity.
Java:
/**
* init the layout
*/
private void initView() {
playView = new PlayView(this, this, this);
setContentView(playView.getContentView());
}
In the PlayView.java class I have created SurfaceView for displaying the video.
Java:
surfaceView = (SurfaceView) findViewById(R.id.surface_view);
SurfaceHolder surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
I will share the demo code that I have created. You can find the activity_play.xml layout and the PlayView.java files over there.
Registering WisePlayer listeners is another important step, because the app reacts based on the listener callbacks. I have done this in the PlayControl.java class with the method below.
Java:
/**
 * Set the play listener
 */
private void setPlayListener() {
    if (wisePlayer != null) {
        wisePlayer.setErrorListener(onWisePlayerListener);
        wisePlayer.setEventListener(onWisePlayerListener);
        wisePlayer.setResolutionUpdatedListener(onWisePlayerListener);
        wisePlayer.setReadyListener(onWisePlayerListener);
        wisePlayer.setLoadingListener(onWisePlayerListener);
        wisePlayer.setPlayEndListener(onWisePlayerListener);
        wisePlayer.setSeekEndListener(onWisePlayerListener);
    }
}
Here, onWisePlayerListener is an interface that extends the required WisePlayer listener interfaces.
Java:
public interface OnWisePlayerListener extends WisePlayer.ErrorListener, WisePlayer.ReadyListener,
        WisePlayer.EventListener, WisePlayer.PlayEndListener, WisePlayer.ResolutionUpdatedListener,
        WisePlayer.SeekEndListener, WisePlayer.LoadingListener, SeekBar.OnSeekBarChangeListener {
}
Now, we need to set the URLs for our videos in our PlayControl.java class as shown below.
Java:
wisePlayer.setPlayUrl("http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4");
Since I have used CardViews in my MainActivity.java class, I have passed the URLs and movie names on the click action through an intent from MainActivity to PlayControl. You can check it out in my source code as well.
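If you take a similar approach, the hand-off can be as simple as the sketch below (the extra keys are illustrative, not from the demo):
Java:
// In MainActivity, inside the CardView click listener:
Intent intent = new Intent(MainActivity.this, PlayActivity.class);
intent.putExtra("video_url", videoUrl);   // hypothetical extra key
intent.putExtra("video_name", videoName); // hypothetical extra key
startActivity(intent);

// In PlayActivity.onCreate(), read the extras back:
String videoUrl = getIntent().getStringExtra("video_url");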
We’ve set a view to display the video with the code below. In my demo application I have used SurfaceView to display the video.
Java:
// SurfaceView listener callback
@Override
public void surfaceCreated(SurfaceHolder holder) {
    wisePlayer.setView(surfaceView);
}
In order to prepare for the playback and start requesting data, we need to call the wisePlayer.ready() method.
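For example, once the play URL and the display view have been set:
Java:
// Prepare the player; the onReady() callback fires when WisePlayer is ready to start.
wisePlayer.ready();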
Lastly, we need to call the wisePlayer.start() method to start the playback upon a successful response in the onReady callback method.
Java:
@Override
public void onReady(WisePlayer wisePlayer) {
    wisePlayer.start();
}
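It is also worth tying the player to the activity lifecycle so that resources are freed when the screen goes away. The method names below are assumptions based on the official demo, so verify them against your SDK version:
Java:
@Override
protected void onPause() {
    super.onPause();
    if (wisePlayer != null) {
        wisePlayer.suspend(); // assumed API: pause playback and release the surface
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    if (wisePlayer != null) {
        wisePlayer.release(); // assumed API: free player resources
    }
}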
We have finished the development. Let's pick a movie from the list and enjoy it!
You can find the source code of the demo app here.
In this article, we developed a sample application using HUAWEI Video Kit. HMS Video Kit offers a lot of features; for the sake of simplicity, we implemented only a few of them. I will share another post to show more features of the Video Kit in the near future.
RESOURCES
Documentation
Video Kit Codelab
what is the minimum resolution video we can play ??
What should I do if the signature fails to be verified on the server side?
shikkerimath said:
what is the minimum resolution video we can play ??
The minimum resolution is 270p, and the maximum is 4K.
Very interesting.

Intermediate: How to Integrate Location Kit into a Hotel Booking Application

Introduction
This article is based on an application that uses multiple HMS services. I have created a hotel booking application using HMS kits; we need a mobile app for reserving hotels when we are traveling from one place to another.
In this article, I am going to implement HMS Location Kit and shared preferences.
Flutter setup
Refer to this URL to set up Flutter.
Software Requirements
1. Android Studio 3.X
2. JDK 1.8 and later
3. SDK Platform 19 and later
4. Gradle 4.6 and later
Steps to integrate service
1. We need to register a developer account in AppGallery Connect.
2. Create an app by referring to Creating a Project and Creating an App in the Project.
3. Set the data storage location based on the current location.
4. Enable the required service: Location Kit.
5. Generate a signing certificate fingerprint.
6. Configure the signing certificate fingerprint.
7. Add your agconnect-services.json file to the app's root directory.
Important: While adding the app, the package name you enter should be the same as your Flutter project's package name.
Note: Before you download the agconnect-services.json file, make sure the required kits are enabled.
Development Process
Create Application in Android Studio.
1. Create Flutter project.
2. Add the app-level Gradle dependencies. Open Android > app > build.gradle inside the project.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Add the root-level Gradle dependencies:
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Add the below permissions in Android Manifest file.
Code:
<manifest xmlns:android...>
...
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
<uses-permission android:name="com.huawei.hms.permission.ACTIVITY_RECOGNITION" />
<application ...
</manifest>
3. Refer to the URL below for the cross-platform plugins and download the required ones.
https://developer.huawei.com/consum...y-V1/flutter-sdk-download-0000001050304074-V1
4. After completing all the steps above, add the required kits' Flutter plugins as dependencies to the pubspec.yaml file. You can find all the plugins on pub.dev with their latest versions.
Code:
dependencies:
  flutter:
    sdk: flutter
  shared_preferences: ^0.5.12+4
  bottom_navy_bar: ^5.6.0
  cupertino_icons: ^1.0.0
  provider: ^4.3.3
  huawei_location:
    path: ../huawei_location/

flutter:
  uses-material-design: true
  assets:
    - assets/images/
5. After adding them, run flutter pub get command. Now all the plugins are ready to use.
6. Open the main.dart file to create the UI and business logic.
Location kit
HUAWEI Location Kit assists developers in enabling their apps to get quick and accurate user locations and expand global positioning capabilities using GPS, Wi-Fi, and base station locations.
Fused location: Provides a set of simple and easy-to-use APIs for you to quickly obtain the device location based on the GPS, Wi-Fi, and base station location data.
Activity identification: Identifies user motion status through the acceleration sensor, cellular network information, and magnetometer, helping you adjust your app based on user behaviour.
Geofence: Allows you to set an interested area through an API so that your app can receive a notification when a specified action (such as leaving, entering, or lingering in the area) occurs.
Integration
Permissions
First of all, we need permissions to access the user's location and physical activity data.
Create a PermissionHandler instance and initialize it in initState().
Code:
final PermissionHandler permissionHandler;

@override
void initState() {
  permissionHandler = PermissionHandler();
  super.initState();
}
Check Permissions
We need to check whether the device has the permission using the hasLocationPermission() method.
Code:
void hasPermission() async {
  try {
    final bool status = await permissionHandler.hasLocationPermission();
    if (status == true) {
      showToast("Has permission: $status");
    } else {
      requestPermission();
    }
  } on PlatformException catch (e) {
    showToast(e.toString());
  }
}
If device don’t have permission,then request for Permission to use requestLocationPermission() method.
Code:
void requestPermission() async {
  try {
    final bool status = await permissionHandler.requestLocationPermission();
    showToast("Is permission granted: $status");
  } on PlatformException catch (e) {
    showToast(e.toString());
  }
}
Fused Location
Create a FusedLocationProviderClient instance in initState() and use the instance to call the location APIs.
Code:
final FusedLocationProviderClient locationService;

@override
void initState() {
  locationService = FusedLocationProviderClient();
  super.initState();
}
getLastLocation()
Code:
void getLastLocation() async {
  try {
    Location location = await locationService.getLastLocation();
    setState(() {
      lastlocation = location.toString();
      print("print: " + lastlocation);
    });
  } catch (e) {
    setState(() {
      print("error: " + e.toString());
    });
  }
}
getLastLocationWithAddress()
Create LocationRequest instance and set required parameters.
Code:
final LocationRequest locationRequest;
locationRequest = LocationRequest()
  ..needAddress = true
  ..interval = 5000;

void _getLastLocationWithAddress() async {
  try {
    HWLocation location =
        await locationService.getLastLocationWithAddress(locationRequest);
    setState(() {
      String street = location.street;
      String city = location.city;
      String countryname = location.countryName;
      currentAddress = '$street, $city, $countryname';
      print("res: $location");
    });
    showToast(currentAddress);
  } on PlatformException catch (e) {
    showToast(e.toString());
  }
}
Location Updates Using a Callback
Create a LocationCallback instance and create the callback functions in initState().
Code:
LocationCallback locationCallback;
int _callbackId;

@override
void initState() {
  locationCallback = LocationCallback(
    onLocationResult: _onCallbackResult,
    onLocationAvailability: _onCallbackResult,
  );
  super.initState();
}

void requestLocationUpdatesCallback() async {
  if (_callbackId == null) {
    try {
      final int callbackId = await locationService.requestLocationUpdatesExCb(
          locationRequest, locationCallback);
      _callbackId = callbackId;
    } on PlatformException catch (e) {
      showToast(e.toString());
    }
  } else {
    showToast("Already requested location updates.");
  }
}

void _onCallbackResult(result) {
  print(result.toString());
  showToast(result.toString());
}
I have created a helper class to store user login information locally using the shared preferences class.
Code:
class StorageUtil {
  static StorageUtil _storageUtil;
  static SharedPreferences _preferences;

  static Future<StorageUtil> getInstance() async {
    if (_storageUtil == null) {
      var secureStorage = StorageUtil._();
      await secureStorage._init();
      _storageUtil = secureStorage;
    }
    return _storageUtil;
  }

  StorageUtil._();

  Future _init() async {
    _preferences = await SharedPreferences.getInstance();
  }

  // get string
  static String getString(String key) {
    if (_preferences == null) return null;
    String result = _preferences.getString(key) ?? null;
    print('result,$result');
    return result;
  }

  // put string
  static Future<void> putString(String key, String value) {
    if (_preferences == null) return null;
    print('result $value');
    return _preferences.setString(key, value);
  }
}
Tips & Tricks
1. Download the latest HMS Flutter plugins.
2. To work with mock locations, add the relevant permission in AndroidManifest.xml.
3. Whenever you update the plugins, run flutter pub get.
Conclusion
In this article, we implemented a simple hotel booking application using Location Kit. We learned how to get the last location, get the location with an address, use the callback method, and store data in shared preferences in Flutter applications.
Thank you for reading. If you have enjoyed this article, I suggest you implement it yourself and share your experience.
Reference
Location Kit URL
Shared Preferences URL
Goodjob
Thank you
I thought huawei doesn't support flutter. I guess it should as it is Android only.
good
Wow
Nice.
I thought its not doable
Interesting.

Integrating HUAWEI Analytics Kit Using Unity

This document describes how to integrate Analytics Kit using the official Unity asset. After the integration, your app can use the services of this Kit on HMS mobile phones.
For details about Analytics Kit, please visit HUAWEI Developers.
1.1 Preparations
1.1.1 Importing Unity Assets
1.1.2 Generating .gradle Files
1. Enable project Gradle.
Go to Edit > Project Settings > Player in Unity, click the Android icon, and go to Publishing Settings > Build. Then:
Enable Custom Base Gradle Template.
Enable Custom Launcher Gradle Template.
Enable Custom Main Gradle Template.
Enable Custom Main Manifest.
2. Signature
You can use an existing keystore file or create a new one to sign your app.
Go to Edit > Project Settings > Player in Unity, click the Android icon, and go to Publishing Settings > Keystore Manager > Keystore... > Create New.
Enter the password when you open Unity. Otherwise, you cannot build the APK.
1.1.3 Configuring .gradle Files and the AndroidManifest.xml File
1. Configure the BaseProjectTemplate.gradle file.
Code:
// Configure the Maven repository address.
buildscript {
    repositories {**ARTIFACTORYREPOSITORY**
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        // If you are changing the Android Gradle Plugin version, make sure it is compatible with the Gradle version preinstalled with Unity.
        // For the Gradle version preinstalled with Unity, please visit https://docs.unity3d.com/Manual/android-gradle-overview.html.
        // For the official Gradle and Android Gradle Plugin compatibility table, please visit https://developer.android.com/studio/releases/gradle-plugin#updating-gradle.
        // To specify a custom Gradle version in Unity, go to Preferences > External Tools, deselect Gradle Installed with Unity (recommended), and specify a path to a custom Gradle version.
        classpath 'com.android.tools.build:gradle:3.4.0'
        classpath 'com.huawei.agconnect:agcp:1.2.1.301'
        **BUILD_SCRIPT_DEPS**
    }
    repositories {**ARTIFACTORYREPOSITORY**
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
        flatDir {
            dirs "${project(':unityLibrary').projectDir}/libs"
        }
    }
}
2. Configure the launcherTemplate.gradle file.
Code:
// Generated by Unity. Remove this comment to prevent overwriting when exporting again.
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'

dependencies {
    implementation project(':unityLibrary')
    implementation 'com.huawei.hms:hianalytics:5.1.0.300'
    implementation 'com.android.support:appcompat-v7:28.0.0'
    implementation 'com.huawei.agconnect:agconnect-core:1.2.0.300'
}
3. Configure the mainTemplate.gradle file.
Code:
apply plugin: 'com.android.library'
apply plugin: 'com.huawei.agconnect'

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'com.huawei.agconnect:agconnect-core:1.2.0.300'
    implementation 'com.huawei.hms:hianalytics:5.0.0.301'
    **DEPS**}
4. Configure the AndroidManifest.xml file.
Code:
<?xml version="1.0" encoding="utf-8"?>
<!-- Generated by Unity. Remove this comment to prevent overwriting when exporting again. -->
<manifest
    xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.unity3d.player"
    xmlns:tools="http://schemas.android.com/tools">
    <application>
        <activity android:name="com.hms.hms_analytic_activity.HmsAnalyticActivity"
            android:theme="@style/UnityThemeSelector">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
            <intent-filter>
                <action android:name="android.intent.action.VIEW" />
                <category android:name="android.intent.category.DEFAULT" />
                <category android:name="android.intent.category.BROWSABLE" />
                <data
                    android:host="unity.cn"
                    android:scheme="https" />
            </intent-filter>
            <meta-data android:name="unityplayer.UnityActivity" android:value="true" />
        </activity>
    </application>
</manifest>
1.1.4 Adding the agconnect-services.json File
1. Create an app by following the instructions in Creating an AppGallery Connect Project and Adding an App to the Project.
Run keytool -list -v -keystore C:\TestApp.keyStore to generate the SHA-256 certificate fingerprint based on the keystore file of the app. Then, configure the fingerprint in AppGallery Connect.
2. Download the agconnect-services.json file and place it in the Assets/Plugins/Android directory of your Unity project.
1.1.5 Enabling HUAWEI Analytics
For details, please refer to the development guide.
1.1.6 Adding the HmsAnalyticActivity.java File
1. Destination directory:
2. File content:
Code:
package com.hms.hms_analytic_activity;

import android.os.Bundle;
import com.huawei.hms.analytics.HiAnalytics;
import com.huawei.hms.analytics.HiAnalyticsTools;
import com.unity3d.player.UnityPlayerActivity;
import com.huawei.agconnect.appmessaging.AGConnectAppMessaging;
import com.huawei.hms.aaid.HmsInstanceId;
import com.hw.unity.Agc.Auth.ThirdPartyLogin.LoginManager;
import android.content.Intent;
import java.lang.Boolean;
import com.unity3d.player.UnityPlayer;
import androidx.core.app.ActivityCompat;

public class HmsAnalyticActivity extends UnityPlayerActivity {
    private AGConnectAppMessaging appMessaging;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        HiAnalyticsTools.enableLog();
        HiAnalytics.getInstance(this);
        appMessaging = AGConnectAppMessaging.getInstance();
        if (appMessaging != null) {
            appMessaging.setFetchMessageEnable(true);
            appMessaging.setDisplayEnable(true);
            appMessaging.setForceFetch();
        }
        LoginManager.getInstance().initialize(this);
        boolean pretendCallMain = false;
        if (pretendCallMain == true) {
            main();
        }
    }

    private static void callCrash() {
        throwCrash();
    }

    private static void throwCrash() {
        throw new NullPointerException();
    }

    public static void main() {
        JavaCrash();
    }

    private static void JavaCrash() {
        new Thread(new Runnable() {
            @Override
            public void run() { // Sub-thread.
                UnityPlayer.currentActivity.runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        callCrash();
                    }
                });
            }
        }).start();
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        LoginManager.getInstance().onActivityResult(requestCode, resultCode, data);
    }
}
1.2 App Development with the Official Asset
1.2.1 Sample Code
Code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using HuaweiHms;

public class AnalyticTest : MonoBehaviour
{
    private HiAnalyticsInstance instance;
    private int level = 0;

    // Start() is called before the first frame update.
    void Start()
    {
    }

    // Update() is called once per frame.
    void Update()
    {
    }

    public AnalyticTest()
    {
        // HiAnalyticsTools.enableLog();
        // instance = HiAnalytics.getInstance(new Context());
    }

    public void AnalyticTestMethod()
    {
        HiAnalyticsTools.enableLog();
        instance = HiAnalytics.getInstance(new Context());
        instance.setAnalyticsEnabled(true);
        Bundle b1 = new Bundle();
        b1.putString("test", "123456");
        instance.onEvent("debug", b1);
    }

    public void SetUserId()
    {
        instance.setUserId("unity test Id");
        // Util.showToast("userId set");
    }

    public void SendProductId()
    {
        Bundle b1 = new Bundle();
        b1.putString(HAParamType.PRODUCTID, "123456");
        instance.onEvent(HAEventType.ADDPRODUCT2CART, b1);
        // Util.showToast("product id set");
    }

    public void SendAnalyticEnable()
    {
        enabled = !enabled;
        instance.setAnalyticsEnabled(enabled);
        // TestTip.Inst.ShowText(enabled ? "ENABLED" : "DISABLED");
    }

    public void CreateClearCache()
    {
        instance.clearCachedData();
        // Util.showToast("Clear Cache");
    }

    public void SetFavoriteSport()
    {
        instance.setUserProfile("favor_sport", "running");
        // Util.showToast("set favorite");
    }

    public void SetPushToken()
    {
        instance.setPushToken("fffff");
        // Util.showToast("set push token as ffff");
    }

    public void setMinActivitySessions()
    {
        instance.setMinActivitySessions(10000);
        // Util.showToast("setMinActivitySessions 10000");
    }

    public void setSessionDuration()
    {
        instance.setSessionDuration(900000);
        // Util.showToast("setSessionDuration 900000");
    }

    public void getUserProfiles()
    {
        getUserProfiles(false);
        getUserProfiles(true);
    }

    public void getUserProfiles(bool preDefined)
    {
        var profiles = instance.getUserProfiles(preDefined);
        var keySet = profiles.keySet();
        var keyArray = keySet.toArray();
        foreach (var key in keyArray)
        {
            // TestTip.Inst.ShowText($"{key}: {profiles.getOrDefault(key, "default")}");
        }
    }

    public void pageStart()
    {
        instance.pageStart("page test", "page test");
        // TestTip.Inst.ShowText("set page start: page test, page test");
    }

    public void pageEnd()
    {
        instance.pageEnd("page test");
        // TestTip.Inst.ShowText("set page end: page test");
    }

    public void enableLog()
    {
        HiAnalyticsTools.enableLog(level + 3);
        // TestTip.Inst.ShowText($"current level {level + 3}");
        level = (level + 1) % 4;
    }
}
1.2.2 Testing the APK
1. Generate the APK.
Go to File > Build Settings > Android, click Switch Platform, and then Build And Run.
2. Enable the debug mode.
3. Go to the real-time overview page of Analytics Kit in AppGallery Connect.
Sign in to AppGallery Connect and click My projects. Select one of your projects and go to HUAWEI Analytics > Overview > Real-time overview.
4. Call AnalyticTestMethod() to display analysis events reported.
Our official website
Demo for Analytics Kit
Our Development Documentation page, to find the documents you need:
Android SDK
Web SDK
Quick APP SDK
If you have any questions about HMS Core, you can post them in the community on the HUAWEI Developers website or submit a ticket online.
We’re looking forward to seeing what you can achieve with HUAWEI Analytics!
More Information
To join in on developer discussion forums
To download the demo app and sample code
For solutions to integration-related issues

Separate Audio Sources in a Tap with Audio Editor Kit

Audio Editor Kit from HMS Core provides the audio source separation function, which allows you to separate human voices, accompaniments, and the sounds of specific musical instruments. For example, you can separate the accompaniment from Dream It Possible.
Let's see how to implement this function.
Step 1 Prepare the File for Audio Source Separation
An MP3 audio file is recommended. If this is not possible, follow the instructions in step 2 to convert your audio file to an MP3 file. What if the accompaniment to be separated is in a video file? No worries. Just extract the video's audio first by referring to the instructions in step 2.
Step 2 Integrate Audio Editor Kit
Development Practice
Preparations
1. Configure the Maven repository address in the project-level build.gradle file.
Code:
buildscript {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven {url 'https://developer.huawei.com/repo/'}
    }
    dependencies {
        ...
        // Add the AppGallery Connect plugin configuration.
        classpath 'com.huawei.agconnect:agcp:1.4.2.300'
    }
}
allprojects {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven {url 'https://developer.huawei.com/repo/'}
    }
}
2. Add the following configuration under the declaration in the file header.
Code:
apply plugin: 'com.huawei.agconnect'
3. Add the build dependency on the Audio Editor SDK in the app-level build.gradle file.
Code:
dependencies {
    implementation 'com.huawei.hms:audio-editor-ui:{version}'
}
4. Apply for the following permissions in the AndroidManifest.xml file:
Code:
<!-- Vibrate -->
<uses-permission android:name="android.permission.VIBRATE" />
<!-- Microphone -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Write into storage -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Read from storage -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<!-- Connect to Internet -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Obtain the network status -->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Obtain the changed network connectivity state -->
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
Code Development
1. Create your app's custom activity and use it for selecting one or more audio files. Return their paths to the Audio Editor SDK in the following way:
Code:
// Return the audio file paths to the audio editing screen.
private void sendAudioToSdk() {
    // Set filePath to the obtained audio file path.
    String filePath = "/sdcard/AudioEdit/audio/music.aac";
    ArrayList<String> audioList = new ArrayList<>();
    audioList.add(filePath);
    // Return the paths to the audio editing screen.
    Intent intent = new Intent();
    // Use HAEConstant.AUDIO_PATH_LIST provided by the Audio Editor SDK.
    intent.putExtra(HAEConstant.AUDIO_PATH_LIST, audioList);
    // Use HAEConstant.RESULT_CODE provided by the Audio Editor SDK as the result code.
    this.setResult(HAEConstant.RESULT_CODE, intent);
    finish();
}
2. Register the activity in the AndroidManifest.xml file as described in the following code. When you choose to import the selected audio files, the SDK will send an intent with the action value com.huawei.hms.audioeditor.chooseaudio to jump to the activity.
Code:
<activity android:name="Activity">
    <intent-filter>
        <action android:name="com.huawei.hms.audioeditor.chooseaudio"/>
        <category android:name="android.intent.category.DEFAULT"/>
    </intent-filter>
</activity>
3. Launch the audio editing screen. When you tap Add audio, the SDK will automatically call the activity defined earlier. Then you will be able to edit the audio and add special effects to the audio. After such operations are complete, the edited audio can be exported.
Code:
HAEUIManager.getInstance().launchEditorActivity(this);
4. (Optional) Convert the format of an audio file that is not MP3 to MP3.
Call transformAudioUseDefaultPath to convert the audio format and save the converted audio to the default directory.
Code:
// Convert the audio format.
HAEAudioExpansion.getInstance().transformAudioUseDefaultPath(context, inAudioPath, audioFormat, new OnTransformCallBack() {
    // Called to receive the progress, which ranges from 0 to 100.
    @Override
    public void onProgress(int progress) {
    }
    // Called when the conversion fails.
    @Override
    public void onFail(int errorCode) {
    }
    // Called when the conversion succeeds.
    @Override
    public void onSuccess(String outPutPath) {
    }
    // Called when the conversion is canceled.
    @Override
    public void onCancel() {
    }
});
// Cancel format conversion.
HAEAudioExpansion.getInstance().cancelTransformAudio();
Call transformAudio to convert the audio format and save the converted audio to a specified directory.
Code:
// Convert the audio format.
HAEAudioExpansion.getInstance().transformAudio(context, inAudioPath, outAudioPath, new OnTransformCallBack() {
    // Called to receive the progress, which ranges from 0 to 100.
    @Override
    public void onProgress(int progress) {
    }
    // Called when the conversion fails.
    @Override
    public void onFail(int errorCode) {
    }
    // Called when the conversion succeeds.
    @Override
    public void onSuccess(String outPutPath) {
    }
    // Called when the conversion is canceled.
    @Override
    public void onCancel() {
    }
});
// Cancel format conversion.
HAEAudioExpansion.getInstance().cancelTransformAudio();
5. Call extractAudio to extract audio from a video, which contains the accompaniment to be separated, to a specified directory. (Optional)
Code:
// outAudioDir (optional): path of the directory for storing the extracted audio.
// outAudioName (optional): name of the extracted audio, which does not contain the file name extension.
HAEAudioExpansion.getInstance().extractAudio(context, inVideoPath, outAudioDir, outAudioName, new AudioExtractCallBack() {
    @Override
    public void onSuccess(String audioPath) {
        Log.d(TAG, "ExtractAudio onSuccess : " + audioPath);
    }
    @Override
    public void onProgress(int progress) {
        Log.d(TAG, "ExtractAudio onProgress : " + progress);
    }
    @Override
    public void onFail(int errCode) {
        Log.i(TAG, "ExtractAudio onFail : " + errCode);
    }
    @Override
    public void onCancel() {
        Log.d(TAG, "ExtractAudio onCancel.");
    }
});
// Cancel audio extraction.
HAEAudioExpansion.getInstance().cancelExtractAudio();
6. Call getInstruments and startSeparationTasks for audio source separation.
Code:
// Obtain the accompaniment ID using getInstruments and pass the ID to startSeparationTasks.
HAEAudioSeparationFile haeAudioSeparationFile = new HAEAudioSeparationFile();
haeAudioSeparationFile.getInstruments(new SeparationCloudCallBack<List<SeparationBean>>() {
    @Override
    public void onFinish(List<SeparationBean> response) {
        // Called to receive the separation data, including the accompaniment ID.
    }
    @Override
    public void onError(int errorCode) {
        // Called when an error occurs during separation.
    }
});
// Set the parameter for separation.
List<String> instruments = new ArrayList<>();
instruments.add("accompaniment ID");
haeAudioSeparationFile.setInstruments(instruments);
// Start separating.
haeAudioSeparationFile.startSeparationTasks(inAudioPath, outAudioDir, outAudioName, new AudioSeparationCallBack() {
    @Override
    public void onResult(SeparationBean separationBean) { }
    @Override
    public void onFinish(List<SeparationBean> separationBeans) { }
    @Override
    public void onFail(int errorCode) { }
    @Override
    public void onCancel() { }
});
// Cancel audio source separation.
haeAudioSeparationFile.cancel();
After completing these steps, you can get the accompaniment you desire. To create something similar to the demo, you can use a video editing program to synthesize the accompaniment with images and lyrics.
References​For more details, you can go to:
Audio Editor Kit official website
Audio Editor Kit Development Documentation page, to find the documents you need
Reddit to join our developer discussion
GitHub to download AudioEditor Kit sample codes
Stack Overflow to solve any integration problems
Thanks for sharing!!

How To Convert Audio from 2D to 3D

Immersive audio is becoming an increasingly important factor for enhancing user experience in the music, gaming, and audio/video editing fields. The spatial audio function is ideal for meetings, sports rehabilitation, and particularly for exhibitions, as it helps deliver a more immersive experience. For users who suffer from visual impairments, the function can serve as a helpful guide.
In this article, I am going to reuse the sample code from this GitHub repo. I will implement the spatial audio function in my Android app to deliver 3D surround sound.
Development Practice
Preparations
Prepare the audio for 2D-to-3D conversion; an MP3 file works best. If your audio is in another format, follow the instructions specified later to convert it to MP3 first. If the audio is part of a video file, just extract the audio first by referring to the instructions described later.
1. Configure the Maven repository address in the project-level build.gradle file.
Code:
buildscript {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven {url 'https://developer.huawei.com/repo/'}
    }
    dependencies {
        ...
        // Add the AppGallery Connect plugin configuration.
        classpath 'com.huawei.agconnect:agcp:1.4.2.300'
    }
}
allprojects {
    repositories {
        google()
        jcenter()
        // Configure the Maven repository address for the HMS Core SDK.
        maven {url 'https://developer.huawei.com/repo/'}
    }
}
Add the following configuration under the declaration in the file header:
Code:
apply plugin: 'com.huawei.agconnect'
2. Add the build dependency on the Audio Editor SDK in the app-level build.gradle file.
Code:
dependencies {
    implementation 'com.huawei.hms:audio-editor-ui:{version}'
}
3. Apply for the following permissions in the AndroidManifest.xml file:
Code:
<!-- Vibrate -->
<uses-permission android:name="android.permission.VIBRATE" />
<!-- Microphone -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Write into storage -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Read from storage -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<!-- Connect to the Internet -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Obtain the network status -->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Obtain the changed network connectivity state -->
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
Code Development
1. Create your app's custom activity for selecting one or more audio files and return their paths to the SDK.
Code:
// Return the audio file paths to the audio editing screen.
private void sendAudioToSdk() {
    // Set filePath to the obtained audio file path.
    String filePath = "/sdcard/AudioEdit/audio/music.aac";
    ArrayList<String> audioList = new ArrayList<>();
    audioList.add(filePath);
    // Return the path to the audio editing screen.
    Intent intent = new Intent();
    // Use HAEConstant.AUDIO_PATH_LIST provided by the SDK.
    intent.putExtra(HAEConstant.AUDIO_PATH_LIST, audioList);
    // Use HAEConstant.RESULT_CODE provided by the SDK as the result code.
    this.setResult(HAEConstant.RESULT_CODE, intent);
    finish();
}
2. Register the activity in the AndroidManifest.xml file as described in the following code. When you choose to import the selected audio files, the SDK will send an intent whose action value is com.huawei.hms.audioeditor.chooseaudio to jump to the activity.
Code:
<activity android:name="Activity">
    <intent-filter>
        <action android:name="com.huawei.hms.audioeditor.chooseaudio"/>
        <category android:name="android.intent.category.DEFAULT"/>
    </intent-filter>
</activity>
Launch the audio editing screen. When you tap Add audio, the SDK will automatically call the activity defined earlier. Then operations like editing and adding special effects can be performed on the audio. After such operations are complete, the edited audio can be exported.
Code:
HAEUIManager.getInstance().launchEditorActivity(this);
3. (Optional) Convert the file format to MP3.
Call transformAudioUseDefaultPath to convert the format and save the converted audio to the default directory.
Code:
// Convert the audio format.
HAEAudioExpansion.getInstance().transformAudioUseDefaultPath(context, inAudioPath, audioFormat, new OnTransformCallBack() {
    // Callback when the progress is received. The value ranges from 0 to 100.
    @Override
    public void onProgress(int progress) {
    }
    // Callback when the conversion fails.
    @Override
    public void onFail(int errorCode) {
    }
    // Callback when the conversion succeeds.
    @Override
    public void onSuccess(String outPutPath) {
    }
    // Callback when the conversion is canceled.
    @Override
    public void onCancel() {
    }
});
// Cancel format conversion.
HAEAudioExpansion.getInstance().cancelTransformAudio();
Call transformAudio to convert audio and save the converted audio to a specified directory.
Code:
// Convert the audio format.
HAEAudioExpansion.getInstance().transformAudio(context, inAudioPath, outAudioPath, new OnTransformCallBack() {
    // Callback when the progress is received. The value ranges from 0 to 100.
    @Override
    public void onProgress(int progress) {
    }
    // Callback when the conversion fails.
    @Override
    public void onFail(int errorCode) {
    }
    // Callback when the conversion succeeds.
    @Override
    public void onSuccess(String outPutPath) {
    }
    // Callback when the conversion is canceled.
    @Override
    public void onCancel() {
    }
});
// Cancel format conversion.
HAEAudioExpansion.getInstance().cancelTransformAudio();
(Optional) Call extractAudio to extract audio from a video to a specified directory.
Code:
// outAudioDir (optional): directory path for storing the extracted audio.
// outAudioName (optional): name of the extracted audio, which does not contain the file name extension.
HAEAudioExpansion.getInstance().extractAudio(context, inVideoPath, outAudioDir, outAudioName, new AudioExtractCallBack() {
    @Override
    public void onSuccess(String audioPath) {
        Log.d(TAG, "ExtractAudio onSuccess : " + audioPath);
    }
    @Override
    public void onProgress(int progress) {
        Log.d(TAG, "ExtractAudio onProgress : " + progress);
    }
    @Override
    public void onFail(int errCode) {
        Log.i(TAG, "ExtractAudio onFail : " + errCode);
    }
    @Override
    public void onCancel() {
        Log.d(TAG, "ExtractAudio onCancel.");
    }
});
// Cancel audio extraction.
HAEAudioExpansion.getInstance().cancelExtractAudio();
Call getInstruments and startSeparationTasks for audio source separation.
Code:
// Obtain the accompaniment ID using getInstruments and pass the ID to startSeparationTasks.
HAEAudioSeparationFile haeAudioSeparationFile = new HAEAudioSeparationFile();
haeAudioSeparationFile.getInstruments(new SeparationCloudCallBack<List<SeparationBean>>() {
    @Override
    public void onFinish(List<SeparationBean> response) {
        // Callback when the separation data is received. The data includes the accompaniment ID.
    }
    @Override
    public void onError(int errorCode) {
        // Callback when the separation fails.
    }
});
// Set the parameter for accompaniment separation.
List<String> instruments = new ArrayList<>();
instruments.add("accompaniment ID");
haeAudioSeparationFile.setInstruments(instruments);
// Start separating.
haeAudioSeparationFile.startSeparationTasks(inAudioPath, outAudioDir, outAudioName, new AudioSeparationCallBack() {
    @Override
    public void onResult(SeparationBean separationBean) { }
    @Override
    public void onFinish(List<SeparationBean> separationBeans) { }
    @Override
    public void onFail(int errorCode) { }
    @Override
    public void onCancel() { }
});
// Cancel separating.
haeAudioSeparationFile.cancel();
Call applyAudioFile to apply spatial audio.
Code:
// Apply spatial audio.
// Fixed position mode.
HAESpaceRenderFile haeSpaceRenderFile = new HAESpaceRenderFile(SpaceRenderMode.POSITION);
haeSpaceRenderFile.setSpacePositionParams(
        new SpaceRenderPositionParams(x, y, z));
// Dynamic rendering mode.
HAESpaceRenderFile haeSpaceRenderFile = new HAESpaceRenderFile(SpaceRenderMode.ROTATION);
haeSpaceRenderFile.setRotationParams(new SpaceRenderRotationParams(
        x, y, z, surroundTime, surroundDirection));
// Extension mode.
HAESpaceRenderFile haeSpaceRenderFile = new HAESpaceRenderFile(SpaceRenderMode.EXTENSION);
haeSpaceRenderFile.setExtensionParams(new SpaceRenderExtensionParams(radiusVal, angledVal));
// Call the API.
haeSpaceRenderFile.applyAudioFile(inAudioPath, outAudioDir, outAudioName, callBack);
// Cancel applying spatial audio.
haeSpaceRenderFile.cancel();
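The callBack argument above receives the rendering result. Below is a sketch of what handling it might look like, assuming the SDK's ChangeSoundCallback interface (verify the type name against your SDK version):
Code:
ChangeSoundCallback callBack = new ChangeSoundCallback() {
    @Override
    public void onSuccess(String outAudioPath) {
        // Path of the spatially rendered audio file.
        Log.d(TAG, "SpaceRender onSuccess: " + outAudioPath);
    }
    @Override
    public void onProgress(int progress) {
        // Progress ranges from 0 to 100.
        Log.d(TAG, "SpaceRender onProgress: " + progress);
    }
    @Override
    public void onFail(int errorCode) {
        Log.e(TAG, "SpaceRender onFail: " + errorCode);
    }
    @Override
    public void onCancel() {
        Log.d(TAG, "SpaceRender onCancel.");
    }
};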
After completing these steps, you can now implement the 2D-to-3D conversion effect for your app.
Utilize the function according to your needs. To find more about it, check out:
Official website of Audio Editor Kit
Development guide to the kit
Can this conversion be done offline?
does it support all audio format?
Which Audio api's we can use for volume management?
muraliameakula said:
Can this conversion will be done in offline?
No, this conversion cannot be done offline. However, some functions can be used offline: AI dubbing, spatial rendering, separation, and functions related to materials.
ProManojKumar said:
does it support all audio format?
It supports MP3, WAV, FLAC, AAC, and other common formats.
vivek_yadav said:
Which Audio api's we can use for volume management?
https://developer.huawei.com/consum...unctions-0000001224604517#section171179354277
