HMS Video Kit — 1
In this article, I will write about the features of Huawei's Video Kit, and we will develop a sample application that allows you to play streaming media from a third-party video address.
Why should we use it?
Video apps are hugely popular nowadays. Due to the popularity of streaming media, many developers have introduced HD movie streaming apps for people who use devices such as tablets and smartphones for everyday purposes. With the Video Kit WisePlayer SDK, you can bring a stable HD video experience to your users.
Service Features
Provides a high-definition video experience without any delay
Responds instantly to playback requests
Offers intuitive controls and content on demand
Selects the most suitable bitrate for your app
Secures your videos with URL anti-leeching, playback authentication, and other security mechanisms
Supports streaming media in 3GP, MP4, or TS format, and complies with HTTP/HTTPS, HLS, or DASH
Integration Preparations
First of all, in order to start developing an app with Video Kit, as with most Huawei Mobile Services, you need to integrate HUAWEI HMS Core into your application.
Software Requirements
Android Studio 3.X
JDK 1.8 or later
HMS Core (APK) 5.0.0.300 or later
EMUI 3.0 or later
The integration flow will be like this:
For a detailed HMS Core integration process, you can refer to Preparations for Integrating HUAWEI HMS Core.
After creating the application on AppGallery Connect and completing the other required steps, please make sure that you have copied the agconnect-services.json file to the app's root directory of your Android Studio project.
Adding SDK dependencies
Add the AppGallery Connect plug-in and the Maven repository in the project-level build.gradle file.
Code:
buildscript {
repositories {
......
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
......
classpath 'com.huawei.agconnect:agcp:1.3.1.300' // HUAWEI agcp plugin
}
}
allprojects {
repositories {
......
maven {url 'https://developer.huawei.com/repo/'}
}
}
2. Open the build.gradle file in the app directory and add the AppGallery Connect plug-in.
Code:
apply plugin: 'com.android.application'
// Add the following line
apply plugin: 'com.huawei.agconnect' // HUAWEI agconnect Gradle plugin
android {
......
}
3. Configure the Maven dependency in the app-level build.gradle file.
Code:
dependencies {
......
implementation "com.huawei.hms:videokit-player:1.0.1.300"
}
You can find all the version numbers of this kit in its Version Change History.
Click to expand...
Click to collapse
4. Configure the NDK in the app-level build.gradle file.
Code:
android {
defaultConfig {
......
ndk {
abiFilters "armeabi-v7a", "arm64-v8a"
}
}
......
}
Here, we have used the abiFilters in order to reduce the .apk size by selecting the desired CPU architectures.
5. Add permissions in the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="com.huawei.permission.SECURITY_DIAGNOSE" />
Note: For Android 6.0 and later, Video Kit dynamically applies for the write permission on external storage.
6. Lastly, add configurations to exclude the HMS Core SDK from obfuscation.
For Android Studio, the obfuscation configuration file is proguard-rules.pro. Open the obfuscation configuration file of your project and add the following configurations.
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.hianalytics.android.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
With these steps, we have completed the integration part. Now, let's get our hands dirty with some code.
Initializing WisePlayer
In order to initialize the player, we need to create a class that inherits from Application. Application is a base class of an Android app that contains components such as activities and services; Application, or a subclass of it, is instantiated before any activity or other application object is created in the app.
We can add our own initialization logic by extending the Application class. We call the initialization API of the WisePlayer SDK, WisePlayerFactory.initFactory(), in the onCreate() method.
Java:
public class VideoKitPlayApplication extends Application {
private static final String TAG = "VideoKitPlayApplication";
private static WisePlayerFactory wisePlayerFactory = null;
@Override
public void onCreate() {
super.onCreate();
initPlayer();
}
private void initPlayer() {
// A test device ID is used in the demo; pass in the real device ID after encryption in production.
Log.d(TAG, "initPlayer: VideoKitPlayApplication");
WisePlayerFactoryOptions factoryOptions = new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
WisePlayerFactory.initFactory(this, factoryOptions, initFactoryCallback);
}
/**
* Player initialization callback
*/
private static InitFactoryCallback initFactoryCallback = new InitFactoryCallback() {
@Override
public void onSuccess(WisePlayerFactory wisePlayerFactory) {
Log.d(TAG, "init player factory success");
setWisePlayerFactory(wisePlayerFactory);
}
@Override
public void onFailure(int errorCode, String reason) {
Log.d(TAG, "init player factory fail reason :" + reason + ", errorCode is " + errorCode);
}
};
/**
* Get WisePlayer Factory
*
* @return WisePlayer Factory
*/
public static WisePlayerFactory getWisePlayerFactory() {
return wisePlayerFactory;
}
private static void setWisePlayerFactory(WisePlayerFactory wisePlayerFactory) {
VideoKitPlayApplication.wisePlayerFactory = wisePlayerFactory;
}
}
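Also remember to register this Application subclass in the AndroidManifest.xml file through the android:name attribute; otherwise Android will instantiate the default Application and the factory will never be initialized. A minimal sketch:
Code:
<application
android:name=".VideoKitPlayApplication"
...>
</application>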
Playing a Video
We need to create a PlayActivity that inherits from AppCompatActivity and implements the Callback and SurfaceTextureListener APIs. Currently, WisePlayer supports SurfaceView and TextureView. Make sure that your app has a valid view for video display; otherwise, the playback will fail. So, in the layout file, we need to add a SurfaceView or TextureView to be displayed in WisePlayer. PlayActivity also implements OnPlayWindowListener and OnWisePlayerListener in order to get callbacks from WisePlayer.
Java:
import android.view.SurfaceHolder.Callback;
import android.view.TextureView.SurfaceTextureListener;
import com.videokitnative.huawei.contract.OnPlayWindowListener;
import com.videokitnative.huawei.contract.OnWisePlayerListener;
public class PlayActivity extends AppCompatActivity implements Callback,SurfaceTextureListener,OnWisePlayerListener,OnPlayWindowListener{
...
}
The WisePlayerFactory instance is returned when the initialization in Application completes. We need to call createWisePlayer() to create a WisePlayer.
Java:
WisePlayer player = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
In order to make the code modular and understandable, I have created a PlayControl.java class, as in the official demo, and created the WisePlayer in that class. Since we create the object in our PlayActivity class through the constructor, wisePlayer will be created in the onCreate() method of our PlayActivity.
Note: Before calling createWisePlayer() to create WisePlayer, make sure that Application has successfully initialized the WisePlayer SDK.
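To give an idea of how this looks, here is a minimal, hypothetical sketch of how PlayControl creates the player once the factory is ready (names follow the official demo; treat the details as assumptions and check the demo source):
Java:
public class PlayControl {
private WisePlayer wisePlayer;

// Called from PlayActivity's onCreate(); assumes the factory was
// initialized in VideoKitPlayApplication as shown earlier.
public void init() {
if (VideoKitPlayApplication.getWisePlayerFactory() != null) {
wisePlayer = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
setPlayListener(); // listener registration, shown later in this article
}
}
}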
Now we need to initialize the WisePlayer layout and add the layout listeners. I have created PlayView.java for creating the views and updating them, so we can create the PlayView instance in the onCreate() method of our PlayActivity.
Java:
/**
* init the layout
*/
private void initView() {
playView = new PlayView(this, this, this);
setContentView(playView.getContentView());
}
In the PlayView.java class, I have created a SurfaceView for displaying the video.
Java:
surfaceView = (SurfaceView) findViewById(R.id.surface_view);
SurfaceHolder surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
I will share the demo code that I have created; you can find the activity_play.xml layout and the PlayView.java file there.
Registering the WisePlayer listeners is another important step, because the app reacts based on these listener callbacks. I have done this in the PlayControl.java class with the method below.
Java:
/**
* Set the play listener
*/
private void setPlayListener() {
if (wisePlayer != null) {
wisePlayer.setErrorListener(onWisePlayerListener);
wisePlayer.setEventListener(onWisePlayerListener);
wisePlayer.setResolutionUpdatedListener(onWisePlayerListener);
wisePlayer.setReadyListener(onWisePlayerListener);
wisePlayer.setLoadingListener(onWisePlayerListener);
wisePlayer.setPlayEndListener(onWisePlayerListener);
wisePlayer.setSeekEndListener(onWisePlayerListener);
}
}
Here, onWisePlayerListener is an interface that extends the required WisePlayer listener interfaces.
Java:
public interface OnWisePlayerListener extends WisePlayer.ErrorListener, WisePlayer.ReadyListener,
WisePlayer.EventListener, WisePlayer.PlayEndListener, WisePlayer.ResolutionUpdatedListener,
WisePlayer.SeekEndListener, WisePlayer.LoadingListener, SeekBar.OnSeekBarChangeListener {
}
Now we need to set the URL for our video in our PlayControl.java class with the code below.
Java:
wisePlayer.setPlayUrl("http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4");
Since I have used CardViews in my MainActivity.java class, I have passed the URLs and movie names through an intent in the click action, from MainActivity to PlayControl. You can check it out in my source code as well.
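For reference, here is a minimal sketch of such a click handler; the extra keys "video_url" and "video_name" are hypothetical and used only for illustration:
Java:
// In MainActivity: pass the selected movie to the play screen via intent extras.
Intent intent = new Intent(MainActivity.this, PlayActivity.class);
intent.putExtra("video_url", "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4");
intent.putExtra("video_name", "Big Buck Bunny");
startActivity(intent);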
We’ve set a view to display the video with the code below. In my demo application I have used SurfaceView to display the video.
Java:
// SurfaceView listener callback
@Override
public void surfaceCreated(SurfaceHolder holder) {
wisePlayer.setView(surfaceView);
}
In order to prepare for the playback and start requesting data, we need to call the wisePlayer.ready() method.
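As a minimal sketch, assuming the wisePlayer and surfaceView fields from the demo above, the preparation step could look like this:
Java:
private void ready() {
if (wisePlayer != null) {
// Bind the display view first, then start requesting data;
// onReady will be called when playback can begin.
wisePlayer.setView(surfaceView);
wisePlayer.ready();
}
}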
Lastly, we need to call the wisePlayer.start() method to start the playback upon a successful response in the onReady callback of this API.
Java:
@Override
public void onReady(WisePlayer wisePlayer) {
wisePlayer.start();
}
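WisePlayer should also follow the activity lifecycle so that playback stops when the screen is no longer visible. A sketch, assuming WisePlayer exposes pause() and release() as in the official demo:
Java:
@Override
protected void onPause() {
super.onPause();
if (wisePlayer != null) {
wisePlayer.pause(); // assumption: pause() suspends playback
}
}

@Override
protected void onDestroy() {
super.onDestroy();
if (wisePlayer != null) {
wisePlayer.release(); // assumption: release() frees the player's resources
}
}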
We have finished the development. Let's pick a movie and enjoy it!
Movie List
You can find the source code of the demo app here.
In this article, we developed a sample application using HUAWEI Video Kit. HMS Video Kit offers a lot of features, for the sake of simplicity we implemented a few of them. I will share another post to show more features of the video kit in the near future.
RESOURCES
Documentation
Video Kit Codelab

What is the minimum resolution video we can play?

What should I do if the signature fails to be verified on the server side?

shikkerimath said:
What is the minimum resolution video we can play?
The minimum resolution is 270p, and the maximum is 4K.

Very interesting.

Related

Hand Keypoint Detection with HMS ML Kit Explained (with a Demo Project)

Hand keypoint detection is the process of finding fingertips, knuckles, and wrists in an image. Hand keypoint detection and hand gesture recognition are still challenging problems in the computer vision domain. It is really tough work to build your own model for hand keypoint detection, as it is hard to collect a large enough hand dataset and it requires expertise in this domain.
Hand keypoint detection can be used in a variety of scenarios. For example, it can be used during artistic creation: users can convert the detected hand keypoints into a 2D model and synchronize it with a character's model to produce a vivid 2D animation. You could create a puppet animation game using that idea. Another example may be creating a rock-paper-scissors game. Or, if you take it further, you could create a sign-language-to-text conversion application. As you see, the possible usage scenarios are abundant, and there is no limit to ideas.
The hand keypoint detection service is a brand-new feature added to the Huawei Machine Learning Kit family. It has recently been released, and it is making developers and computer vision geeks really excited! It detects 21 points of a hand and can detect up to ten hands in an image. It can detect hands in a static image or in a camera stream. Currently, it does not support scenarios where your hand is blocked by more than 50% or where you wear gloves. You don't need an internet connection, as this is an on-device capability, and what is more: it is completely free!
It wouldn’t be a nice practice only to read related documents and forget about it after a few days. So I created a simple demo application that counts fingers and tells us the number we show by hand. I strongly advise you to develop your hand keypoint detection application beside me. I developed the application in Android Studio in Kotlin. Now, I am going to explain you how to build this application step by step. Don’t hesitate to ask questions in the comments if you face any issues.
1. Firstly, let's create our project in Android Studio. I named my project HandKeyPointDetectionDemo; I am sure you can find a better name for your application. We can create our project by selecting the Empty Activity option and then follow the steps described in this post to create and sign our project in AppGallery Connect.
2. In HUAWEI Developer AppGallery Connect, go to Develop > Manage APIs. Make sure ML Kit is activated.
3. Now we have integrated Huawei Mobile Services (HMS) into our project. Let's follow the documentation on developer.huawei.com and find the packages to add to our project. On the website, click Developer > HMS Core > AI > ML Kit. Here you will find introductory information about the services, references, SDKs to download, and more. Under the ML Kit tab, follow Android > Getting Started > Integrating HMS Core SDK > Adding Build Dependencies > Integrating the Hand Keypoint Detection SDK. We can follow the guide there to add the hand detection capability to our project. We also have one meta-data tag to be added to our AndroidManifest.xml file. After the integration, your app-level build.gradle file will look like this.
Code:
apply plugin: 'com.android.application'
apply plugin: 'kotlin-android'
apply plugin: 'kotlin-android-extensions'
apply plugin: 'com.huawei.agconnect'
android {
compileSdkVersion 30
buildToolsVersion "30.0.2"
defaultConfig {
applicationId "com.demo.handkeypointdetection"
minSdkVersion 21
targetSdkVersion 30
versionCode 1
versionName "1.0"
testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
}
buildTypes {
release {
minifyEnabled false
proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
}
}
}
dependencies {
implementation fileTree(dir: "libs", include: ["*.jar"])
implementation "org.jetbrains.kotlin:kotlin-stdlib:$kotlin_version"
implementation 'androidx.core:core-ktx:1.3.1'
implementation 'androidx.appcompat:appcompat:1.2.0'
implementation 'androidx.constraintlayout:constraintlayout:2.0.1'
testImplementation 'junit:junit:4.12'
androidTestImplementation 'androidx.test.ext:junit:1.1.2'
androidTestImplementation 'androidx.test.espresso:espresso-core:3.3.0'
//AppGalleryConnect Core
implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300'
// Import the base SDK.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.0.2.300'
// Import the hand keypoint detection model package.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.0.2.300'
}
Our project-level build.gradle file:
Code:
buildscript {
ext.kotlin_version = "1.4.0"
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath "com.android.tools.build:gradle:4.0.1"
classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
task clean(type: Delete) {
delete rootProject.buildDir
}
And don’t forget to add related meta-data tags into your AndroidManifest.xml.
Code:
<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
package="com.demo.handkeypointdetection">
<uses-permission android:name="android.permission.CAMERA" />
<application
...
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value= "handkeypoint"/>
</application>
</manifest>
4. I created a class named HandKeyPointDetector. This class will be called from our activity or fragment. Its init method has two parameters: a context and a ViewGroup. We will add our views to the rootLayout.
Code:
fun init(context: Context, rootLayout: ViewGroup) {
mContext = context
mRootLayout = rootLayout
addSurfaceViews()
}
5. We are going to detect hand keypoints in a camera stream, so we create one surfaceView for the camera preview and another surfaceView to draw on. The surfaceView that is going to be used as an overlay should be transparent. Then we add our views to the rootLayout passed as a parameter from our activity. Lastly, we add a SurfaceHolder.Callback to our surfaceHolder to know when it is ready.
Code:
private fun addSurfaceViews() {
val surfaceViewCamera = SurfaceView(mContext).also {
it.layoutParams = LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, LinearLayout.LayoutParams.MATCH_PARENT)
mSurfaceHolderCamera = it.holder
}
val surfaceViewOverlay = SurfaceView(mContext).also {
it.layoutParams = LinearLayout.LayoutParams(LinearLayout.LayoutParams.MATCH_PARENT, LinearLayout.LayoutParams.MATCH_PARENT)
mSurfaceHolderOverlay = it.holder
mSurfaceHolderOverlay.setFormat(PixelFormat.TRANSPARENT)
mHandKeyPointTransactor.setOverlay(mSurfaceHolderOverlay)
}
mRootLayout.addView(surfaceViewCamera)
mRootLayout.addView(surfaceViewOverlay)
mSurfaceHolderCamera.addCallback(surfaceHolderCallback)
}
6. Inside our surfaceHolderCallback we override three methods: surfaceCreated, surfaceChanged, and surfaceDestroyed.
Code:
private val surfaceHolderCallback = object : SurfaceHolder.Callback {
override fun surfaceCreated(holder: SurfaceHolder) {
createAnalyzer()
}
override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {
prepareLensEngine(width, height)
mLensEngine.run(holder)
}
override fun surfaceDestroyed(holder: SurfaceHolder) {
mLensEngine.release()
}
}
7. The createAnalyzer method creates an MLHandKeypointAnalyzer with settings; if you want, you can use the default settings instead. The scene type can be keypoints, a rectangle around the hands, or TYPE_ALL for both. The maximum number of hand results can be up to MLHandKeypointAnalyzerSetting.MAX_HANDS_NUM, which is currently 10. As we will count the fingers of two hands, I set it to 2.
Code:
private fun createAnalyzer() {
val settings = MLHandKeypointAnalyzerSetting.Factory()
.setSceneType(MLHandKeypointAnalyzerSetting.TYPE_ALL)
.setMaxHandResults(2)
.create()
mAnalyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer(settings)
mAnalyzer.setTransactor(mHandKeyPointTransactor)
}
8. LensEngine is responsible for handling the camera frames for us. All we need to do is prepare it with the right dimensions according to the orientation, choose the camera we want to work with, apply the fps, and so on.
Code:
private fun prepareLensEngine(width: Int, height: Int) {
val dimen1: Int
val dimen2: Int
if (mContext.resources.configuration.orientation == Configuration.ORIENTATION_LANDSCAPE) {
dimen1 = width
dimen2 = height
} else {
dimen1 = height
dimen2 = width
}
mLensEngine = LensEngine.Creator(mContext, mAnalyzer)
.setLensType(LensEngine.BACK_LENS)
.applyDisplayDimension(dimen1, dimen2)
.applyFps(5F)
.enableAutomaticFocus(true)
.create()
}
9. When you no longer need the analyzer, stop it and release the resources.
Code:
fun stopAnalyzer() {
mAnalyzer.stop()
}
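A side note before moving on: as mentioned earlier, the service also works on static images. Below is a hedged Java sketch of one-shot analysis, assuming the asyncAnalyseFrame pattern used across ML Kit analyzers; verify the exact signature against the current SDK reference:
Code:
// One-shot analysis of a Bitmap instead of a camera stream.
MLFrame frame = MLFrame.fromBitmap(bitmap);
mAnalyzer.asyncAnalyseFrame(frame)
.addOnSuccessListener(handKeypointsList -> {
// Each list entry describes the keypoints of one detected hand.
})
.addOnFailureListener(e -> {
// Analysis failed; inspect the exception.
});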
10. As you can see in step 7, we used mHandKeyPointTransactor. It is a custom class we created, named HandKeyPointTransactor, which implements MLAnalyzer.MLTransactor<MLHandKeypoints>. It has two overridden methods: transactResult and destroy. Detected results arrive in the transactResult method, where we try to find the number.
Code:
override fun transactResult(result: MLAnalyzer.Result<MLHandKeypoints>?) {
if (result == null)
return
val canvas = mOverlay?.lockCanvas() ?: return
//Clear canvas.
canvas.drawColor(0, PorterDuff.Mode.CLEAR)
//Find the number shown by our hands.
val numberString = analyzeHandsAndGetNumber(result)
//Find the middle of the canvas
val centerX = canvas.width / 2F
val centerY = canvas.height / 2F
//Draw a text that writes the number we found.
canvas.drawText(numberString, centerX, centerY, Paint().also {
it.style = Paint.Style.FILL
it.textSize = 100F
it.color = Color.GREEN
})
mOverlay?.unlockCanvasAndPost(canvas)
}
11. We check hand by hand, and then finger by finger, to find which fingers are up and derive the number.
Code:
private fun analyzeHandsAndGetNumber(result: MLAnalyzer.Result<MLHandKeypoints>): String {
val hands = ArrayList<Hand>()
var number = 0
for (key in result.analyseList.keyIterator()) {
hands.add(Hand())
for (value in result.analyseList.valueIterator()) {
number += hands.last().createHand(value.handKeypoints).getNumber()
}
}
return number.toString()
}
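The Hand class belongs to the demo source and is not shown here. As a rough, hypothetical Java sketch of the idea behind it: a finger counts as raised when its tip lies farther from the wrist than its middle joint (the grouping of keypoints into fingers is illustrative only):
Code:
// Hypothetical helper: count raised fingers from keypoint positions.
// Uses android.graphics.PointF and java.util.Map.
int countRaisedFingers(Map<String, PointF> tips, Map<String, PointF> midJoints, PointF wrist) {
int raised = 0;
for (String finger : tips.keySet()) {
PointF tip = tips.get(finger);
PointF joint = midJoints.get(finger);
double tipDist = Math.hypot(tip.x - wrist.x, tip.y - wrist.y);
double jointDist = Math.hypot(joint.x - wrist.x, joint.y - wrist.y);
if (tipDist > jointDist) {
raised++; // tip extended beyond the middle joint -> finger is up
}
}
return raised;
}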
For more information, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0202369245767250343&fid=0101187876626530001

How do You Equip Your App with Audio Playback Capabilities Using Audio Kit?

Unlike text or video, when users consume audio content, they can also do something else while they're listening. This is why users tend to choose audio, rather than text or video, when commuting or doing housework.
This makes audio playback a valuable addition for many apps. For example, fitness and health apps are more engaging when they have the ability to play music or audiobooks, while education apps are more effective when they provide useful audio courses, and ringtone apps need to be able to play a variety of ringtones.
So then, how do you build audio capabilities for your app?
The answer is, by using HUAWEI Audio Kit.
It provides you with a range of capabilities, including audio encoding and decoding at both the hardware level and system bottom layer.
Audio Kit provides the following functions:
- Play audio: apps can decode and play high-resolution audio files of up to 384 kHz/24 bit.
- Control playback: users can play, pause, play previous, play next, stop, and drag the progress bar.
- Adjust volume: users can increase or decrease the volume.
- Manage playlists: gives users the ability to view, save, and delete playlists, as well as add songs to a playlist.
- Manage play modes: apps can provide sequential playback, repeat a playlist, repeat a song, and shuffle songs.
- Save playback progress: users can save their playback progress and start from where they left off.
- Cache and encrypt: apps can cache and encrypt audio content.
Demo:
You can find the demo source code on GitHub.
Development Practice
1. Integrate the HMS Core Audio SDK
1.1 Configure the Maven Repository Address for the Audio SDK
Step 1 Open the build.gradle file in the root directory of your Android Studio project.
Step 2 Configure the Maven repository address and add the gradle configuration.
- Go to allprojects > repositories and configure the Maven repository address for the Audio SDK.
- Go to buildscript > repositories and configure the Maven repository address for the Audio SDK.
- Go to buildscript > dependencies and add the Gradle configuration.
Code:
buildscript {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
classpath 'com.android.tools.build:gradle:3.4.2'
// NOTE: Do not place your app dependencies here; you need to put them
// in the individual module build.gradle files
}
}
allprojects {
repositories {
google()
jcenter()
maven {url 'https://developer.huawei.com/repo/'}
}
}
1.2 Add Build Dependencies
Step 1 Open the build.gradle file in the app directory.
Step 2 Add build dependencies in the dependencies section.
Code:
dependencies {
implementation 'com.huawei.hms:audiokit-player:{version}'
}
1.3 Synchronize the Project
Once you have completed the configuration above, click the synchronization icon on the toolbar to synchronize the build.gradle file.
2 Configure Obfuscation Scripts
Before you build the APK, configure the obfuscation file to prevent the HMS Core SDK from being obfuscated.
The obfuscation configuration file is proguard-rules.pro for Android Studio.
Step 1 Open the obfuscation configuration file proguard-rules.pro in the app directory.
Step 2 Remove the HMS Core SDK from obfuscation.
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.huawei.hianalytics.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
Step 3 If you are using AndResGuard, add its trustlist to the obfuscation configuration file.
Code:
"R.string.hms*",
"R.string.connect_server_fail_prompt_toast",
"R.string.getting_message_fail_prompt_toast",
"R.string.no_available_network_prompt_toast",
"R.string.third_app_*",
"R.string.upsdk_*",
"R.layout.hms*",
"R.layout.upsdk_*",
"R.drawable.upsdk*",
"R.color.upsdk*",
"R.dimen.upsdk*",
"R.style.upsdk*",
"R.string.agc*"
3 Adding Permissions
The Audio SDK requires permissions to access the network, obtain the network status, operate SD cards, and read data from the Android media library. Declare these permissions in the Manifest file:
Code:
// Permission to access the network.
<uses-permission android:name="android.permission.INTERNET" />
// Permission to obtain the network status.
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
//Permission to write data into the SD card.
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
// Permission to read data from the SD card.
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
// Permission to read data from the Android media library.
<uses-permission android:name="android.permission.READ_MEDIA_STORAGE" />
4 Developing Your App
Step 1 Create an audio management instance by calling HwAudioManager; manage audio playback by calling HwAudioPlayerManager; manage audio queues by calling HwAudioQueueManager; manage audio configurations by calling HwAudioConfigManager.
Code:
private HwAudioPlayerManager mHwAudioPlayerManager;
private HwAudioConfigManager mHwAudioConfigManager;
private HwAudioQueueManager mHwAudioQueueManager;
public void createHwAudioManager() {
// Create a configuration instance, including various playback-related configurations.
HwAudioPlayerConfig hwAudioPlayerConfig = new HwAudioPlayerConfig(MainActivity.this);
// Create a control instance.
HwAudioManagerFactory.createHwAudioManager(hwAudioPlayerConfig, new HwAudioConfigCallBack() {
// Return the control instance through callback.
@Override
public void onSuccess(HwAudioManager hwAudioManager) {
try {
Log.i(TAG, "createHwAudioManager onSuccess");
// Obtain the playback control instance.
mHwAudioPlayerManager = hwAudioManager.getPlayerManager();
// Obtain the configuration control instance.
mHwAudioConfigManager = hwAudioManager.getConfigManager();
// Obtain the queue control instance.
mHwAudioQueueManager = hwAudioManager.getQueueManager();
} catch (Exception e) {
Log.i(TAG, "player init fail");
}
}
@Override
public void onError(int errorCode) {
Log.w(TAG, "init err:" + errorCode);
}
});
}
Step 2 Create a playlist and play songs.
Code:
public void play() {
if (mHwAudioPlayerManager != null) {
// Create a playlist.
String path = "https://lfmusicservice.hwcloudtest.cn:18084/HMS/audio/Taoge-chengshilvren.mp3";
// Create an audio object and write audio information into the object.
HwAudioPlayItem item = new HwAudioPlayItem();
// Set the audio title.
item.setAudioTitle("Playing input song");
// Set the audio ID.
item.setAudioId(String.valueOf(path.hashCode()));
// Set whether audio is online.
item.setOnline(1);
// Set the online audio URL.
item.setOnlinePath(path);
List<HwAudioPlayItem> playItemList = new ArrayList<>();
playItemList.add(item);
// Play songs.
mHwAudioPlayerManager.playList(playItemList, 0, 0);
}
}
Step 3 Use instances. The following are examples. For more details, see Audio Kit's Management APIs.
- Clear the playback cache.
Code:
public void clearPlayCache() {
if (mHwAudioConfigManager != null) {
mHwAudioConfigManager.clearPlayCache();
}
}
- Check whether the current playback queue is empty.
Code:
public boolean isQueueEmpty() {
if (mHwAudioQueueManager != null) {
return mHwAudioQueueManager.isQueueEmpty();
}
return false;
}
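- Control playback. A minimal sketch of play/pause control; the exact HwAudioPlayerManager method names used here (play(), pause()) are assumptions to verify against Audio Kit's management API reference.
Code:
public void togglePlayback(boolean shouldPlay) {
if (mHwAudioPlayerManager != null) {
if (shouldPlay) {
// Assumption: play() starts or resumes playback of the current queue.
mHwAudioPlayerManager.play();
} else {
// Assumption: pause() pauses the current playback.
mHwAudioPlayerManager.pause();
}
}
}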
And that's it! You've equipped your app with audio playback capabilities.

Integrating HUAWEI Ads Kit Using Unity

This document describes how to integrate Ads Kit using the official Unity asset. After the integration, your app can use the services of this Kit on HMS mobile phones.
For details about Ads Kit, please visit HUAWEI Developers.
1.1 Restrictions
1.1.1 Ads Supported by the Official Unity Asset
The official Unity asset, version 1.3.4, supports the interstitial ads and rewarded ads of Ads Kit.
Note: To support other types of ads, you can use the Android native integration mode. This document will take the banner ad as an example to describe such integration.
1.1.2 Supported Devices and Unity Versions
Note: If your Unity version is earlier than 2018.4.25, you can import the assets manually. If there are any unknown build errors that are not caused by the AfterBuildToDo file, upgrade your Unity.
1.2 Preparations
1.2.1 Prerequisites
- HMS Core (APK) 4.0.0.300 or later has been installed on the device. Otherwise, the APIs of the HUAWEI Ads SDK, which depend on HMS Core (APK) 4.0.0.300 or later, cannot be used.
- You have registered as a Huawei developer and completed identity verification on HUAWEI Developers. For details, please refer to Registration and Verification.
- You have created a project and added an app to the project in AppGallery Connect by referring to Creating an AppGallery Connect Project and Adding an App to the Project.
For details about applying for a formal ad slot, please visit HUAWEI Developers.
1.2.2 Importing Unity Assets
1. Open Asset Store in Unity.
Go to Window > Asset Store in Unity.
2. Search for the Huawei HMS AGC Services asset. Download and then import it.
3. Import the asset to My Assets, with all services selected.
4. Change the package name.
Go to Edit > Project Settings > Player > Android > Other Settings in Unity, and then set Package Name.
The default package name is com.${Company Name}.${Product Name}. You need to change the package name, and the app will be released to AppGallery with the new name.
1.2.3 Generating .gradle Files
1. Enable project gradle.
Go to Edit > Project Settings > Player in Unity, click the Android icon, and go to Publishing Settings > Build.
Enable Custom Main Manifest.
Enable Custom Main Gradle Template.
Enable Custom Launcher Gradle Template.
Enable Custom Base Gradle Template.
2. Generate a signature.
You can use an existing keystore file or create a new one to sign your app.
Go to Edit > Project Settings > Player in Unity, click the Android icon, and go to Publishing Settings > Keystore Manager.
Then, go to Keystore... > Create New.
Enter the password when you open Unity. Otherwise, you cannot build the APK.
1.2.4 Configuring .gradle Files
1. Configure the BaseProjectTemplate.gradle file.
Configure the Maven repository address.
Code:
buildscript {
repositories {**ARTIFACTORYREPOSITORY**
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
repositories {**ARTIFACTORYREPOSITORY**
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
flatDir {
dirs "${project(':unityLibrary').projectDir}/libs"
}
}
}
2. Configure the launcherTemplate.gradle file.
Add dependencies.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
dependencies {
implementation project(':unityLibrary')
implementation 'com.huawei.hms:ads-lite:13.4.29.303'
}
1.3 App Development with the Official Asset
1.3.1 Rewarded Ads
Rewarded ads are full-screen video ads that users can choose to view in exchange for in-app rewards.
1.3.1.1 Sample Code
Code:
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using HuaweiHms;
namespace Demo
{
public class RewardDemo : MonoBehaviour
{
public void LoadRewardAds()
{
// Create a RewardAd object.
// "testx9dtjwj8hp" is the test ad slot. You can use it to perform a test. Replace the test slot with a formal one for official release.
RewardAd rewardAd = new RewardAd(new Context(), "testx9dtjwj8hp");
AdParam.Builder builder = new AdParam.Builder();
AdParam adParam = builder.build();
// Load the ad.
rewardAd.loadAd(adParam, new MRewardLoadListener(rewardAd));
}
}
// Listen for ad events.
public class MRewardLoadListener:RewardAdLoadListener
{
private RewardAd ad;
public MRewardLoadListener(RewardAd _ad)
{
ad = _ad;
}
public override void onRewardAdFailedToLoad(int arg0)
{
}
public override void onRewardedLoaded()
{
ad.show(new Context(),new RewardAdStatusListener());
}
}
}
1.3.1.2 Testing the APK
Go to File > Build Settings > Android, click Switch Platform, and then click Build And Run.
1.3.2 Interstitial Ads
1.3.2.1 Sample Code
Code:
using UnityEngine;
using HuaweiService;
using HuaweiService.ads;
namespace Demo
{
public class interstitialDemo : MonoBehaviour
{
public void LoadImageAds()
{
// Create an ad object.
// "testb4znbuh3n2" and "teste9ih9j0rc3" are test ad slots. You can use them to perform a test. Replace the test slots with formal ones for official release.
InterstitialAd ad = new InterstitialAd(new Context());
ad.setAdId("teste9ih9j0rc3");
ad.setAdListener(new MAdListener(ad));
AdParam.Builder builder = new AdParam.Builder();
AdParam adParam = builder.build();
// Load the ad.
ad.loadAd(adParam);
}
public void LoadVideoAds()
{
InterstitialAd ad = new InterstitialAd(new Context());
ad.setAdId("testb4znbuh3n2");
ad.setAdListener(new MAdListener(ad));
AdParam.Builder builder = new AdParam.Builder();
ad.loadAd(builder.build());
}
public class MAdListener : AdListener
{
private InterstitialAd ad;
public MAdListener(InterstitialAd _ad) : base()
{
ad = _ad;
}
public override void onAdLoaded()
{
// Display the ad if it is successfully loaded.
ad.show();
}
public override void onAdFailed(int arg0)
{
}
public override void onAdOpened()
{
}
public override void onAdClicked()
{
}
public override void onAdLeave()
{
}
public override void onAdClosed()
{
}
}
}
}
1.4 App Development with Android Studio
1.4.1 Banner Ads
1.4.1.1 Loading Banner Ads on a Page as Required
Code:
AndroidJavaClass javaClass = new AndroidJavaClass("com.unity3d.player.UnityPlayerActivity");
javaClass.CallStatic("loadBannerAds");
1.4.1.2 Exporting Your Project from Unity
Go to File > Build Settings > Android and click Switch Platform. Then, click Export Project, select your project, and click Export.
1.4.1.3 Integrating the Banner Ad Function in Android Studio
Open the exported project in Android Studio.
Add implementation 'com.huawei.hms:ads-lite:13.4.29.303' to build.gradle in the src directory.
1. Add code related to banner ads to UnityPlayerActivity.
a. Define the static variable bannerView.
Code:
private static BannerView bannerView;
b. Add the initialization of bannerView to the onCreate method.
Code:
bannerView = new BannerView(this);
bannerView.setAdId("testw6vs28auh3");
bannerView.setBannerAdSize(BannerAdSize.BANNER_SIZE_360_57);
mUnityPlayer.addView(bannerView);
c. Add the following static method for loading ads in the Android Studio project, and then build and run the project.
Code:
public static void loadBannerAds() {
// "testw6vs28auh3" is a dedicated test ad slot ID. Before releasing your app, replace the test ad slot ID with the formal one.
AdParam adParam = new AdParam.Builder().build();
bannerView.loadAd(adParam);
}
d. If banner ads need to be placed at the bottom of the page, refer to the following code:
Code:
// Set up activity layout.
@Override
protected void onCreate(Bundle savedInstanceState)
{
requestWindowFeature(Window.FEATURE_NO_TITLE);
super.onCreate(savedInstanceState);
String cmdLine = updateUnityCommandLineArguments(getIntent().getStringExtra("unity"));
getIntent().putExtra("unity", cmdLine);
RelativeLayout relativeLayout = new RelativeLayout(this);
mUnityPlayer = new UnityPlayer(this, this);
setContentView(relativeLayout);
relativeLayout.addView(mUnityPlayer);
mUnityPlayer.requestFocus();
bannerView = new BannerView(this);
bannerView.setAdId("testw6vs28auh3");
bannerView.setBannerAdSize(BannerAdSize.BANNER_SIZE_360_57);
RelativeLayout.LayoutParams layoutParams = new RelativeLayout.LayoutParams(ViewGroup.LayoutParams.MATCH_PARENT, ViewGroup.LayoutParams.WRAP_CONTENT);
layoutParams.addRule(RelativeLayout.ALIGN_PARENT_BOTTOM);
relativeLayout.addView(bannerView, layoutParams);
}
public static void LoadBannerAd() {
// "testw6vs28auh3" is a dedicated test ad slot ID. Before releasing your app, replace the test ad slot ID with the formal one.
AdParam adParam = new AdParam.Builder().build();
bannerView.loadAd(adParam);
bannerView.setAdListener(new AdListener()
{
@Override
public void onAdFailed(int errorCode)
{
Log.d("BannerAds" ,"error" + errorCode);
}
});
}
1.5 FAQs
1. If an error indicating an invalid path is reported when you export a project from Unity, change the export path to Downloads or Desktop.
2. Unity of a version earlier than 2018.4.25 does not support asset download from Asset Store. You can download the asset using Unity of 2018.4.25 or a later version, export it, and then import it to Unity of an earlier version.
More Information
To join in on developer discussion forums
To download the demo app and sample code
For solutions to integration-related issues
Check out the thread in the forum.

Extract table data from Image using Huawei Table Recognition by Huawei HiAI in Android

Introduction
The API can retrieve the structure of tables as well as the text inside the cells; merged cells can also be recognized. It supports the recognition of tables with clear and unbroken lines, but not tables with crooked lines or cells divided by a color background. Currently, the API supports recognition from printed materials and snapshots of meeting slides, but it does not work for screenshots or photos of Excel sheets or any other table-editing software.
Here, the image resolution should be higher than 720p (1280×720 px), and the aspect ratio (length-to-width ratio) should be lower than 2:1.
In this article, we will learn how to implement the Huawei HiAI kit's Table Recognition service in an Android application; this service helps us extract table content from images.
Software requirements
1. Any operating system (MacOS, Linux and Windows).
2. Any IDE with Android SDK installed (IntelliJ, Android Studio).
3. HiAI SDK.
4. Minimum API Level 23 is required.
5. Requires an EMUI 9.0.0 or later device.
6. Requires a Kirin 990/985/980/970/825Full/820Full/810Full/720Full/710Full processor.
How to integrate Table recognition.
1. Configure the application on the AGC.
2. Apply for HiAI Engine Library.
3. Client application development process.
Configure application on the AGC
Follow the steps.
Step 1: We need to register a developer account in AppGallery Connect. If you are already a developer, ignore this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project
Step 3: Set the data storage location based on the current location.
Step 4: Generating a Signing Certificate Fingerprint.
Step 5: Configuring the Signing Certificate Fingerprint.
Step 6: Download your agconnect-services.json file, paste it into the app root directory.
Apply for HiAI Engine Library
What is Huawei HiAI?
HUAWEI HiAI is a mobile terminal–oriented artificial intelligence (AI) computing platform that constructs three layers of ecology: service capability openness, application capability openness, and chip capability openness. This three-layer open platform, integrating terminals, chips, and the cloud, brings a more extraordinary experience to users and developers.
How to apply for HiAI Engine?
Follow the steps
Step 1: Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.
Step 2: Click Apply for HUAWEI HiAI kit.
Step 3: Enter required information like Product name and Package name, click Next button.
Step 4: Verify the application details and click Submit button.
Step 5: Click the Download SDK button to open the SDK list.
Step 6: Unzip downloaded SDK and add into your android project under libs folder.
Step 7: Add the JAR/AAR file dependencies to the app build.gradle file.
Code:
implementation fileTree(include: ['*.aar', '*.jar'], dir: 'libs')
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
flatDir {
dirs 'libs'
}
}
Client application development process
Follow the steps.
Step 1: Create an Android application in the Android studio (Any IDE which is your favorite).
Step 2: Add the app-level Gradle dependencies. Choose inside project: Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root level gradle dependencies.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add permission in AndroidManifest.xml
XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
Step 4: Build application.
Java:
import android.Manifest;
import android.content.Intent;
import android.graphics.Bitmap;
import android.provider.MediaStore;
import android.support.annotation.Nullable;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TableLayout;
import android.widget.TextView;
import android.widget.Toast;
import com.huawei.hiai.vision.common.ConnectionCallback;
import com.huawei.hiai.vision.common.VisionBase;
import com.huawei.hiai.vision.common.VisionImage;
import com.huawei.hiai.vision.image.sr.ImageSuperResolution;
import com.huawei.hiai.vision.text.TableDetector;
import com.huawei.hiai.vision.visionkit.text.config.VisionTableConfiguration;
import com.huawei.hiai.vision.visionkit.text.table.Table;
import com.huawei.hiai.vision.visionkit.text.table.TableCell;
import com.huawei.hiai.vision.visionkit.text.table.TableContent;
import java.util.List;
public class TableRecognition extends AppCompatActivity {
private boolean isConnection = false;
private int REQUEST_CODE = 101;
private int REQUEST_PHOTO = 100;
private Bitmap bitmap;
private Button btnImage;
private ImageView originalImage;
private ImageView conversionImage;
private TextView textView;
private TextView tableContentText;
private final String[] permission = {
Manifest.permission.CAMERA,
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE};
private ImageSuperResolution resolution;
private TableLayout tableLayout;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_table_recognition);
requestPermissions(permission, REQUEST_CODE);
initializeVisionBase();
originalImage = findViewById(R.id.super_origin);
conversionImage = findViewById(R.id.super_image);
textView = findViewById(R.id.text);
tableContentText = findViewById(R.id.content_text);
btnImage = findViewById(R.id.btn_album);
tableLayout = findViewById(R.id.tableLayout);
btnImage.setOnClickListener(v -> {
selectImage();
tableLayout.removeAllViews();
});
}
private void initializeVisionBase() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
isConnection = true;
DoesDeviceSupportTableRecognition();
}
@Override
public void onServiceDisconnect() {
}
});
}
private void DoesDeviceSupportTableRecognition() {
resolution = new ImageSuperResolution(this);
int support = resolution.getAvailability();
if (support == 0) {
Toast.makeText(this, "Device supports HiAI Image super resolution service", Toast.LENGTH_SHORT).show();
} else {
Toast.makeText(this, "Device doesn't supports HiAI Image super resolution service", Toast.LENGTH_SHORT).show();
}
}
public void selectImage() {
Intent intent = new Intent(Intent.ACTION_PICK);
intent.setType("image/*");
startActivityForResult(intent, REQUEST_PHOTO);
}
@Override
protected void onActivityResult(int requestCode, int resultCode, @Nullable Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (resultCode == RESULT_OK) {
if (data != null && requestCode == REQUEST_PHOTO) {
try {
bitmap = MediaStore.Images.Media.getBitmap(getContentResolver(), data.getData());
originalImage.setImageBitmap(bitmap);
if (isConnection) {
extractTableFromTheImage();
}
} catch (Exception e) {
e.printStackTrace();
}
}
}
}
private void extractTableFromTheImage() {
tableContentText.setVisibility(View.VISIBLE);
TableDetector mTableDetector = new TableDetector(this);
VisionImage image = VisionImage.fromBitmap(bitmap);
VisionTableConfiguration mTableConfig = new VisionTableConfiguration.Builder()
.setAppType(VisionTableConfiguration.APP_NORMAL)
.setProcessMode(VisionTableConfiguration.MODE_OUT)
.build();
mTableDetector.setVisionConfiguration(mTableConfig);
mTableDetector.prepare();
Table table = new Table();
int mResult_code = mTableDetector.detect(image, table, null);
if (mResult_code == 0) {
int count = table.getTableCount();
List<TableContent> tc = table.getTableContent();
StringBuilder sbTableCell = new StringBuilder();
List<TableCell> tableCell = tc.get(0).getBody();
for (TableCell c : tableCell) {
List<String> words = c.getWord();
StringBuilder sb = new StringBuilder();
for (String s : words) {
sb.append(s).append(",");
}
String cell = c.getStartRow() + ":" + c.getEndRow() + ": " + c.getStartColumn() + ":" +
c.getEndColumn() + "; " + sb.toString();
sbTableCell.append(cell).append("\n");
tableContentText.setText("Count = " + count + "\n\n" + sbTableCell.toString());
}
}
}
}
Result
Tips and Tricks
The recommended image resolution is higher than 720p (1280 x 720 px).
Multiple table recognition currently not supported.
If you are taking an image from the camera or gallery, make sure your app has camera and storage permissions.
Add the downloaded huawei-hiai-vision-ove-10.0.4.307.aar and huawei-hiai-pdk-1.0.0.aar files to the libs folder.
Check that the dependencies are added properly.
The latest HMS Core APK is required.
Min SDK is 21; otherwise you will get a manifest merge issue.
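Given the resolution and aspect-ratio constraints mentioned above, a small hypothetical helper to validate an image before calling detect() might look like this:
Java:
// Pre-check: resolution above 720p (1280 x 720) and aspect ratio below 2:1.
private boolean isBitmapSuitable(Bitmap bitmap) {
int longSide = Math.max(bitmap.getWidth(), bitmap.getHeight());
int shortSide = Math.min(bitmap.getWidth(), bitmap.getHeight());
boolean resolutionOk = longSide >= 1280 && shortSide >= 720;
boolean aspectOk = (float) longSide / shortSide < 2f;
return resolutionOk && aspectOk;
}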
Conclusion
In this article, we have extracted table content from an image, for further analysis with statistics or just for editing. This works for tables with clear and simple structure information. We have learnt the following concepts:
1. Introduction to table recognition
2. How to integrate table recognition using Huawei HiAI
3. How to apply for Huawei HiAI
4. How to build the application
Reference
Table Recognition
Apply for Huawei HiAI
Happy coding
Useful sharing, thanks!

How To Convert Audio from 2D to 3D

Immersive audio is becoming an increasingly important factor for enhancing user experience in the music, gaming, and audio/video editing fields. The spatial audio function is ideal for meetings, sports rehabilitation, and particularly for exhibitions, as it helps deliver a more immersive experience. For users who suffer from visual impairments, the function can serve as a helpful guide.
In this article, I am going to reuse the sample code from this GitHub repo. I will implement the spatial audio function in my Android app to deliver 3D surround sound.
Development Practice
Preparations
Prepare the audio for 2D-to-3D conversion, preferably an MP3 file. If the audio is in another format, follow the instructions specified later to convert it to MP3 first. If the audio is part of a video file, just extract the audio first by referring to the instructions described later.
1. Configure the Maven repository address in the project-level build.gradle file.
Code:
buildscript {
repositories {
google()
jcenter()
// Configure the Maven repository address for the HMS Core SDK.
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
...
// Add the AppGallery Connect plugin configuration.
classpath 'com.huawei.agconnect:agcp:1.4.2.300'
}
}
allprojects {
repositories {
google()
jcenter()
// Configure the Maven repository address for the HMS Core SDK.
maven {url 'https://developer.huawei.com/repo/'}
}
}
Add the following configuration under the declaration in the file header:
Code:
apply plugin: 'com.huawei.agconnect'
2. Add the build dependency on the Audio Editor SDK in the app-level build.gradle file.
Code:
dependencies{
implementation 'com.huawei.hms:audio-editor-ui:{version}'
}
3. Apply for the following permissions in the AndroidManifest.xml file:
Code:
<!-- Vibrate -->
<uses-permission android:name="android.permission.VIBRATE" />
<!-- Microphone -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<!-- Write into storage -->
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<!-- Read from storage -->
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<!-- Connect to the Internet -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Obtain the network status -->
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<!-- Obtain the changed network connectivity state -->
<uses-permission android:name="android.permission.CHANGE_NETWORK_STATE" />
Code Development
1. Create the app's custom activity for selecting one or more audio files and returning their paths to the SDK.
Code:
// Return the audio file paths to the audio editing screen.
private void sendAudioToSdk() {
// Set filePath to the obtained audio file path.
String filePath = "/sdcard/AudioEdit/audio/music.aac";
ArrayList<String> audioList = new ArrayList<>();
audioList.add(filePath);
// Return the path to the audio editing screen.
Intent intent = new Intent();
// Use HAEConstant.AUDIO_PATH_LIST provided by the SDK.
intent.putExtra(HAEConstant.AUDIO_PATH_LIST, audioList);
// Use HAEConstant.RESULT_CODE provided by the SDK as the result code.
this.setResult(HAEConstant.RESULT_CODE, intent);
finish();
}
2. Register the activity in the AndroidManifest.xml file as described in the following code. When you choose to import the selected audio files, the SDK will send an intent whose action value is com.huawei.hms.audioeditor.chooseaudio to jump to the activity.
Code:
<activity android:name="Activity ">
<intent-filter>
<action android:name="com.huawei.hms.audioeditor.chooseaudio"/>
<category android:name="android.intent.category.DEFAULT"/>
</intent-filter>
</activity>
Launch the audio editing screen. When you tap Add audio, the SDK will automatically call the activity defined earlier. Then operations like editing and adding special effects can be performed on the audio. After such operations are complete, the edited audio can be exported.
Code:
HAEUIManager.getInstance().launchEditorActivity(this);
3. (Optional) Convert the file format to MP3.
Call transformAudioUseDefaultPath to convert the format and save the converted audio to the default directory.
Code:
// Convert the audio format.
HAEAudioExpansion.getInstance().transformAudioUseDefaultPath(context,inAudioPath, audioFormat, new OnTransformCallBack() {
// Callback when the progress is received. The value ranges from 0 to 100.
@Override
public void onProgress(int progress) {
}
// Callback when the conversion fails.
@Override
public void onFail(int errorCode) {
}
// Callback when the conversion succeeds.
@Override
public void onSuccess(String outPutPath) {
}
// Callback when the conversion is canceled.
@Override
public void onCancel() {
}
});
// Cancel format conversion.
HAEAudioExpansion.getInstance().cancelTransformAudio();
Call transformAudio to convert audio and save the converted audio to a specified directory.
Code:
// Convert the audio format.
HAEAudioExpansion.getInstance().transformAudio(context,inAudioPath, outAudioPath, new OnTransformCallBack(){
// Callback when the progress is received. The value ranges from 0 to 100.
@Override
public void onProgress(int progress) {
}
// Callback when the conversion fails.
@Override
public void onFail(int errorCode) {
}
// Callback when the conversion succeeds.
@Override
public void onSuccess(String outPutPath) {
}
// Callback when the conversion is canceled.
@Override
public void onCancel() {
}
});
// Cancel format conversion.
HAEAudioExpansion.getInstance().cancelTransformAudio();
(Optional) Call extractAudio to extract audio from a video to a specified directory.
Code:
// outAudioDir (optional): directory path for storing extracted audio.
// outAudioName (optional): name of extracted audio, which does not contain the file name extension.
HAEAudioExpansion.getInstance().extractAudio(context,inVideoPath,outAudioDir, outAudioName,new AudioExtractCallBack() {
@Override
public void onSuccess(String audioPath) {
Log.d(TAG, "ExtractAudio onSuccess : " + audioPath);
}
@Override
public void onProgress(int progress) {
Log.d(TAG, "ExtractAudio onProgress : " + progress);
}
@Override
public void onFail(int errCode) {
Log.i(TAG, "ExtractAudio onFail : " + errCode);
}
@Override
public void onCancel() {
Log.d(TAG, "ExtractAudio onCancel.");
}
});
// Cancel audio extraction.
HAEAudioExpansion.getInstance().cancelExtractAudio();
Call getInstruments and startSeparationTasks for audio source separation.
Code:
// Obtain the accompaniment ID using getInstruments and pass the ID to startSeparationTasks.
HAEAudioSeparationFile haeAudioSeparationFile = new HAEAudioSeparationFile();
haeAudioSeparationFile.getInstruments(new SeparationCloudCallBack<List<SeparationBean>>() {
@Override
public void onFinish(List<SeparationBean> response) {
// Callback when the separation data is received. The data includes the accompaniment ID.
}
@Override
public void onError(int errorCode) {
// Callback when the separation fails.
}
});
// Set the parameter for accompaniment separation.
List instruments = new ArrayList<>();
instruments.add("accompaniment ID");
haeAudioSeparationFile.setInstruments(instruments);
// Start separating.
haeAudioSeparationFile.startSeparationTasks(inAudioPath, outAudioDir, outAudioName, new AudioSeparationCallBack() {
@Override
public void onResult(SeparationBean separationBean) { }
@Override
public void onFinish(List<SeparationBean> separationBeans) {}
@Override
public void onFail(int errorCode) {}
@Override
public void onCancel() {}
});
// Cancel separating.
haeAudioSeparationFile.cancel();
Call applyAudioFile to apply spatial audio.
Code:
// Apply spatial audio.
// Fixed position mode.
HAESpaceRenderFile haeSpaceRenderFile = new HAESpaceRenderFile(SpaceRenderMode.POSITION);
haeSpaceRenderFile.setSpacePositionParams(
new SpaceRenderPositionParams(x, y, z));
// Dynamic rendering mode.
HAESpaceRenderFile haeSpaceRenderFile = new HAESpaceRenderFile(SpaceRenderMode.ROTATION);
haeSpaceRenderFile.setRotationParams( new SpaceRenderRotationParams(
x, y, z, surroundTime, surroundDirection));
// Extension.
HAESpaceRenderFile haeSpaceRenderFile = new HAESpaceRenderFile(SpaceRenderMode.EXTENSION);
haeSpaceRenderFile.setExtensionParams(new SpaceRenderExtensionParams(radiusVal, angledVal));
// Call the API.
haeSpaceRenderFile.applyAudioFile(inAudioPath, outAudioDir, outAudioName, callBack);
// Cancel applying spatial audio.
haeSpaceRenderFile.cancel();
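The callBack argument above is not defined in the snippet. Assuming it follows the same four-method pattern as the other callbacks in this article (and that the interface is ChangeSoundCallback, which should be verified against the Audio Editor Kit docs), a sketch could look like this:
Code:
ChangeSoundCallback callBack = new ChangeSoundCallback() {
@Override
public void onSuccess(String outAudioPath) {
// Spatial rendering finished; outAudioPath is the result file.
}
@Override
public void onProgress(int progress) {
// Progress ranges from 0 to 100.
}
@Override
public void onFail(int errorCode) {
// Rendering failed.
}
@Override
public void onCancel() {
// Rendering was canceled.
}
};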
After completing these steps, you can now implement the 2D-to-3D conversion effect for your app.
Utilize the function according to your needs. To find out more about it, check out:
Official website of Audio Editor Kit
Development guide to the kit
Can this conversion be done offline?
Does it support all audio formats?
Which audio APIs can we use for volume management?
muraliameakula said:
Can this conversion be done offline?
The conversion cannot be done offline. Some functions can be used offline: AI dubbing, spatial rendering, separation, and material-related functions.
ProManojKumar said:
Does it support all audio formats?
It supports MP3, WAV, FLAC, AAC, and more.
vivek_yadav said:
Which audio APIs can we use for volume management?
https://developer.huawei.com/consum...unctions-0000001224604517#section171179354277
