How to Be Language All-rounder with HUAWEI ML Kit? - Huawei Developers

The original article is from HUAWEI Developer Forum
(forum link: https://forums.developer.huawei.com/forumPortal/en/home)
1. Introduction
Optical character recognition (OCR) detects and processes characters from a given alphabet. The HUAWEI ML Kit text recognition service lets you extract text from images, documents, or any printed representation of characters.
The service can run on the cloud or on the device. On-device text recognition supports Simplified Chinese, Japanese, Korean, and Latin-based languages.
The cloud service supports Simplified Chinese, English, Spanish, Portuguese, Italian, German, French, Russian, Japanese, Korean, Polish, Finnish, Norwegian, Swedish, Danish, Turkish, Thai, Arabic, and Hindi.
2. Preparations
To enable ML Kit, follow these steps:
· Log in to HUAWEI Developer: https://developer.huawei.com/consumer/en/console#/serviceCards/
· Access HUAWEI AppGallery Connect.
· Select My Apps and then select the app into which you want to integrate the service.
· Go to the Develop section and click Manage APIs.
· Enable ML Kit.
· Set the storage location.
· Download agconnect-services.json and add it to the app root directory of your Android Studio project.
· Open the build.gradle file in the root directory.
· Add the Maven repository maven { url 'https://developer.huawei.com/repo/' } and the classpath dependency classpath 'com.huawei.agconnect:agcp:1.2.1.301'.
· Add the SDK dependencies to the app-level build.gradle of your project (see the code below).
Code:
dependencies {
    // Import the base SDK.
    implementation 'com.huawei.hms:ml-computer-vision-ocr:1.0.3.300'
    // Import the Latin-based language model package.
    implementation 'com.huawei.hms:ml-computer-vision-ocr-latin-model:1.0.3.315'
    // Import the Japanese and Korean model package.
    implementation 'com.huawei.hms:ml-computer-vision-ocr-jk-model:1.0.3.300'
    // Import the Chinese and English model package.
    implementation 'com.huawei.hms:ml-computer-vision-ocr-cn-model:1.0.3.300'
}
· Apply the AppGallery Connect plugin in the same app-level build.gradle file
Code:
apply plugin: 'com.huawei.agconnect'
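For reference, the Maven repository and classpath from the earlier step go into the root-level build.gradle; a typical layout looks like this (the Android Gradle plugin version shown is an example assumption):
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        // The Android Gradle plugin version is an example; use the one matching your project.
        classpath 'com.android.tools.build:gradle:3.5.4'
        classpath 'com.huawei.agconnect:agcp:1.2.1.301'
    }
}
allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}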
· In the Android Manifest File add the following code to enable your app to automatically update the latest ML models on the device.
Code:
<manifest ...>
    <application ...>
        <meta-data
            android:name="com.huawei.hms.ml.DEPENDENCY"
            android:value="ocr" />
    </application>
</manifest>
3. Development
Text recognition from images on the device
1. Create an MLTextAnalyzer to recognize text in images. You can also use MLLocalTextSetting to specify the languages to recognize; if none are specified, only Latin-based languages are recognized.
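A minimal sketch of this step, assuming the on-device OCR classes from the SDKs added earlier and an input Bitmap named bitmap:
Code:
import android.graphics.Bitmap;
import com.huawei.hmf.tasks.Task;
import com.huawei.hms.mlsdk.MLAnalyzerFactory;
import com.huawei.hms.mlsdk.common.MLFrame;
import com.huawei.hms.mlsdk.text.MLLocalTextSetting;
import com.huawei.hms.mlsdk.text.MLText;
import com.huawei.hms.mlsdk.text.MLTextAnalyzer;

public void recognizeText(Bitmap bitmap) {
    // Specify a language explicitly; without a setting, only Latin-based
    // languages are recognized.
    MLLocalTextSetting setting = new MLLocalTextSetting.Factory()
            .setOCRMode(MLLocalTextSetting.OCR_DETECT_MODE)
            .setLanguage("en")
            .create();
    MLTextAnalyzer analyzer = MLAnalyzerFactory.getInstance().getLocalTextAnalyzer(setting);

    // Wrap the image in an MLFrame and analyze it asynchronously.
    MLFrame frame = MLFrame.fromBitmap(bitmap);
    Task<MLText> task = analyzer.asyncAnalyseFrame(frame);
    task.addOnSuccessListener(text -> {
        // text.getStringValue() holds the recognized text.
    }).addOnFailureListener(e -> {
        // Recognition failed; inspect the exception for details.
    });
}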
This is not the end. For more information about this article, you can visit the HUAWEI Developer Forum.

Related

HUAWEI Drive Kit Development Process (Part 2)

For more information like this, visit the HUAWEI Developer Forum.
SDK Description
To integrate HUAWEI Drive Kit into your app, you will need to integrate both the HMS SDK and the Drive SDK. The HMS SDK provides sign-in and authentication information, while the Drive SDK provides the capability of managing users' files in Drive.
Integrating the Drive SDK and HMS SDK
If you are using Android Studio, you can integrate the Drive SDK and HMS SDK by using the Maven repository. Before you start developing the app, integrate both SDKs into your Android Studio project.
1. Add the AppGallery Connect configuration file of the app to your Android Studio.
a. Sign in to AppGallery Connect and select My projects.
b. Find your project from the project list and click the app on the project card.
c. On the Project Setting page, click agconnect-services.json to download the configuration file.
d. Copy the agconnect-services.json file to the app root directory.
2. Configure the Maven repository address for the HMS SDK.
a. Open the build.gradle file in the root directory of your Android Studio project.
3. Add build dependencies.
a. Open the build.gradle file in the app directory.
4. Define multi-language settings.
By default, your app supports all languages provided by the HMS SDK. If your app uses all of these languages, skip this section.
If your app uses only some of these languages, follow the procedure below to complete the required configuration.
Open the build.gradle file in the app directory of your project.
Go to android > defaultConfig and add the resConfigs configuration. en (English) and zh-rCN (Simplified Chinese) are mandatory languages. If your app only supports English and Simplified Chinese, the configuration is as follows:
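Code:
android {
    defaultConfig {
        ...
        resConfigs "en", "zh-rCN"
    }
}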

Integrating Automatic Speech Recognition without Pickup UI in B4A Platform

Introduction
Automatic speech recognition (ASR) can recognize speech no longer than 60 seconds and convert the input speech into text in real time. This service uses industry-leading deep learning technologies to achieve a recognition accuracy of over 95%. Currently, Mandarin Chinese (including Chinese-English bilingual speech), English, French, German, Spanish, and Italian can be recognized.
Real-time result output
Available options: with and without speech pickup UI
Endpoint detection: Start and end points can be accurately located.
Silence detection: No voice packets are sent for silent parts.
Intelligent conversion to digital formats: For example, the year 2020 can be recognized from voice input.
Follow all the steps mentioned in Basic Setup to start HMS integration on B4A Platform.
Refer to
https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201286424114350051&fid=0101246461018590361
Enable ML Kit in AppGallery Connect.
Creating Wrapper Class
1. Downloading the AAR Packages and JSON File
Sign in to HUAWEI Developer and download the AAR packages required.
2. Open the AAR packages with a RAR tool and rename the class.jar and AndroidManifest.xml files, then save the JAR files inside the libs folder. (It is recommended that the two files be renamed consistently with the AAR package names.)
3. Copy the required permissions into the <manifest> section in the B4A IDE.
4. Copy all configurations into the <application> section.
5. Change ${applicationId} to $PACKAGE$ (see the example after this list).
6. Download the configuration file (agconnect-services.json) from AppGallery Connect and add the JSON file to the assets folder of the AAR file.
B4A will automatically incorporate files in the assets folder of an AAR package to the assets folder of your main project.
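For example, a <provider> entry whose authority in the AAR's AndroidManifest.xml was ${applicationId}.demoprovider (a hypothetical name for illustration; the real entries come from the AAR) would be added in the B4A manifest editor as:
Code:
AddApplicationText(
<provider
    android:name="com.example.DemoProvider"
    android:authorities="$PACKAGE$.demoprovider"
    android:exported="false" />)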
Encapsulating Java Files Using SLC
1. Create Library as the parent folder, then create bin, libs, and src as subfolders in the project directory.
2. Develop the Java project inside the following path:
Choose Library Folder > src > b4x > asr > demo
3. Add the following import statements to each Java file.
Code:
import anywheresoftware.b4a.BA;
import anywheresoftware.b4a.BA.*;
import anywheresoftware.b4a.IOnActivityResult;
4. Add necessary annotations to the ASR Java files.
Code:
@Version(1.0f)
@ShortName("Asr")
@DependsOn(values={"agconnect-core-1.0.1.300.aar",
"tasks-1.3.1.302.aar",
"network-common-4.0.2.300.aar",
"network-grs-4.0.2.300.aar",
"okhttp-3.12.0.jar",
"okio-1.15.0.jar",
"ml-computer-agc-inner-2.0.1.300.aar",
"ml-computer-cloud-base-inner-2.0.1.300.aar",
"ml-computer-commonutils-inner-2.0.1.300.aar",
"ml-computer-ha-inner-2.0.1.300.aar",
"ml-computer-grs-inner-2.0.1.300.aar",
"ml-computer-net-2.0.1.300.aar",
"ml-computer-voice-asr-plugin-2.0.1.300.aar",
"ml-computer-vision-cloud-2.0.1.300.aar",
"ml-computer-voice-asr-2.0.1.300.aar"
})
@Permissions(values={"android.permission.INTERNET",
"android.permission.WRITE_EXTERNAL_STORAGE",
"android.permission.ACCESS_NETWORK_STATE",
"android.permission.RECORD_AUDIO",
"android.permission.READ_EXTERNAL_STORAGE",
"android.permission.CHANGE_WIFI_STATE",
"android.permission.ACCESS_WIFI_STATE",
"android.permission.CHANGE_CONFIGURATION",
"android.permission.WAKE_LOCK"})
For more details, see https://forums.developer.huawei.com/forumPortal/en/topic/0203394430060880031
Very clear. Do you have a Repo with the code?

HMS AGC CloudStorage Kit Integration in B4A Platform

Overview
Cloud Storage allows you to store high volumes of data such as images, audio, and videos generated by your users securely and economically, with direct device access. The service is stable, secure, efficient, and easy to use, and can free you from the development, deployment, O&M, and capacity expansion of storage servers.
In this article, we can learn the following:
Create an app and configure its information in AppGallery Connect.
Use B4A IDE to integrate the HUAWEI HMS Cloud Storage SDK into your app and develop storage-related functions.
Build and demonstrate a demo app.
Manage files on the Cloud Storage page in AppGallery Connect.
For details, refer to B4A (www.b4x.com), a free IDE for B4X native Android development. B4A includes all the features needed to quickly develop any type of Android app and is used by tens of thousands of developers worldwide, including companies such as NASA, HP, and IBM.
Creating Wrapper Class
1. Download the following AAR packages into the addition folder in your project directory.
HMSSDK AGConnect:
https://developer.huawei.com/repo/c...t-auth/1.4.1.300/agconnect-auth-1.4.1.300.aar
https://developer.huawei.com/repo/c...t-core/1.4.0.300/agconnect-core-1.4.0.300.aar
https://developer.huawei.com/repo/c...age/1.3.0.300/agconnect-storage-1.3.0.300.aar
https://developer.huawei.com/repo/c...e-core/1.4.1.300/datastore-core-1.4.1.300.aar
https://developer.huawei.com/repo/c.../1.4.1.300/datastore-annotation-1.4.1.300.jar
https://developer.huawei.com/repo/c...https/1.4.1.300/agconnect-https-1.4.1.300.aar
https://developer.huawei.com/repo/c.../1.4.1.300/agconnect-credential-1.4.1.300.aar
HMSSDK Base:
https://developer.huawei.com/repo/com/huawei/hms/base/5.0.0.301/base-5.0.0.301.aar
HMSSDK Update:
https://developer.huawei.com/repo/c...pdate/5.0.0.301/availableupdate-5.0.0.301.aar
https://developer.huawei.com/repo/com/huawei/hms/update/2.0.6.302/update-2.0.6.302.aar
HMSSDK Device:
https://developer.huawei.com/repo/com/huawei/hms/device/5.0.0.301/device-5.0.0.301.aar
HMSSDK DynamicAPI:
https://developer.huawei.com/repo/com/huawei/hms/dynamic-api/1.0.13.303/dynamic-api-1.0.13.303.aar
HMSSDK Log:
https://developer.huawei.com/repo/com/huawei/hms/log/5.0.0.301/log-5.0.0.301.aar
HMSSDK Network:
https://developer.huawei.com/repo/c...common/4.0.2.300/network-common-4.0.2.300.aar
https://developer.huawei.com/repo/com/huawei/hms/network-grs/4.0.2.300/network-grs-4.0.2.300.aar
HMSSDK Stats:
https://developer.huawei.com/repo/com/huawei/hms/stats/5.0.0.301/stats-5.0.0.301.aar
HMSSDK Tasks:
https://developer.huawei.com/repo/com/huawei/hmf/tasks/1.4.1.300/tasks-1.4.1.300.aar
HMSSDK UI:
https://developer.huawei.com/repo/com/huawei/hms/ui/5.0.0.301/ui-5.0.0.301.aar
Squareup Libraries:
https://repo1.maven.org/maven2/com/squareup/okhttp3/okhttp/3.11.0/okhttp-3.11.0.jar
https://repo1.maven.org/maven2/com/squareup/okio/okio/1.14.0/okio-1.14.0.jar
Note: After downloading the above AAR files, create the addition folder inside the project and keep all the AAR packages in it.
2. Open each AAR package with a RAR tool and rename the class.jar and AndroidManifest.xml files, then save the JAR files inside the libs folder. (It is recommended that the two files be renamed consistently with the AAR package names.)
1. Create Library as the parent folder, then create bin, libs, and src as subfolders in the project directory.
2. Develop the Java project inside the following path:
Choose Library Folder > src > b4x > hms > cloudstorage
a. Import the following two lines of code to each Java file.
Code:
import anywheresoftware.b4a.BA;
import anywheresoftware.b4a.BA.*;
b. Add necessary annotations to the Java files.
Code:
@Version(1.0f)
@ShortName("CloudStorageFuncs")
@DependsOn(values={
"agconnect-auth-1.4.1.300.aar",
"agconnect-core-1.4.0.300.aar",
"agconnect-storage-1.3.0.300.aar",
"availableupdate-5.0.0.301.aar",
"base-5.0.0.301.aar",
"device-5.0.0.301.aar",
"dynamic-api-1.0.13.303.aar",
"log-5.0.0.301.aar",
"network-common-4.0.2.300.aar",
"network-grs-4.0.2.300.aar",
"stats-5.0.0.301.aar",
"tasks-1.4.1.300.aar",
"ui-5.0.0.301.aar",
"update-2.0.6.302.aar",
"okhttp-3.11.0.jar",
"datastore-core-1.4.1.300.aar",
"datastore-annotation-1.4.1.300.jar",
"agconnect-https-1.4.1.300.aar",
"agconnect-credential-1.4.1.300.aar",
"okio-1.14.0.jar"
})
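Inside the wrapper class, a storage operation can then be exposed to B4A. A minimal sketch of an upload method, assuming the AGC Cloud Storage classes from the annotations above (the bucket comes from agconnect-services.json; the event names are placeholders):
Code:
import com.huawei.agconnect.cloud.storage.core.AGCStorageManagement;
import com.huawei.agconnect.cloud.storage.core.StorageReference;
import com.huawei.agconnect.cloud.storage.core.UploadTask;
import java.io.File;

// Inside the wrapper class:
public void UploadFile(final BA ba, String localPath, final String remotePath) {
    AGCStorageManagement storage = AGCStorageManagement.getInstance();
    StorageReference reference = storage.getStorageReference(remotePath);
    UploadTask task = reference.putFile(new File(localPath));
    task.addOnSuccessListener(result ->
            ba.raiseEvent(this, "storage_onsuccess", remotePath))
        .addOnFailureListener(e ->
            ba.raiseEvent(this, "storage_onfailure", e.getMessage()));
}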
For more details, visit https://forums.developer.huawei.com/forumPortal/en/topic/0203411651765670210
Very interesting.

Experience Huawei ML Kit - Text Recognition feat. React Native (Cross platform)

Introduction
Quality improvement has become crucial in this era of digitalization, where all our documents are kept in folders, shared over networks, and read on digital devices.
Imagine the struggle of an elderly person who has no way to read and understand an old prescribed medical document that has blurred and deteriorated.
Can we avoid such issues?
No!
Let's unpack what Huawei ML Kit offers to overcome such challenges of our day-to-day life.
Huawei ML Kit provides the Text Recognition API to improve the quality and visibility of old and blurred text in an image.
The Text Recognition API is especially useful in industries where huge volumes of data need to be extracted from images.
Text Recognition API provides two ways to read and process the data on an image:
Local
Huawei ML Kit's Text Recognition API can detect and read the text in an image using algorithms that run on the device itself, with no interaction with the cloud or the internet required.
Remote
Huawei ML Kit's Text Recognition API can also analyse and read the text in an image using a cloud algorithm API, which requires interaction with the cloud and the internet. A sketch of the cloud variant follows below.
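Under the hood on Android, the remote variant differs from the local one mainly in its setting and factory call. A minimal sketch, assuming the standard ML Kit SDK names and an API key from your AppGallery Connect project:
Code:
import com.huawei.hms.mlsdk.MLAnalyzerFactory;
import com.huawei.hms.mlsdk.common.MLApplication;
import com.huawei.hms.mlsdk.text.MLRemoteTextSetting;
import com.huawei.hms.mlsdk.text.MLTextAnalyzer;

public MLTextAnalyzer createRemoteAnalyzer() {
    // Set the API key once (value comes from AppGallery Connect).
    MLApplication.getInstance().setApiKey("your-api-key");

    // Create the cloud-based analyzer; requires network access.
    MLRemoteTextSetting setting = new MLRemoteTextSetting.Factory()
            .setTextDensityScene(MLRemoteTextSetting.OCR_COMPACT_SCENE)
            .create();
    // The MLFrame/asyncAnalyseFrame flow is then the same as on-device.
    return MLAnalyzerFactory.getInstance().getRemoteTextAnalyzer(setting);
}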
Development Overview
We will focus on simple text recognition from an image using the on-device detection APIs.
Prerequisite
1. Must have a Huawei Developer Account
2. Must have a Huawei phone with HMS 4.0.0.300 or later
3. React Native environment with Android Studio, Node.js and Visual Studio Code.
Major Dependencies
1. React Native CLI : 2.0.1
2. Gradle Version: 6.0.1
3. Gradle Plugin Version: 4.0.1
4. React Native ML Kit SDK : 5.0.0
5. react-native-hms-ml kit gradle dependency
6. AGCP gradle dependency
Software Requirements
1. Java SDK 1.8 or later
2. Android 5.0 or later
Preparation
1. Create a React Native project using the React Native CLI and open the android directory.
2. Download the React Native ML Kit SDK and paste it under the node_modules directory of the React Native project.
3. Create an app and project in Huawei AppGallery Connect.
4. Provide the SHA key and app package name of the project.
5. Enable the ML API, download the agconnect-services.json file, and paste it into the app folder of your android directory.
Integration
Add the following to the build.gradle (project) file, under buildscript/repositories and allprojects/repositories.
Code:
maven { url 'https://developer.huawei.com/repo/' }
Add the following to the build.gradle (app) file, under dependencies, to use the ML Kit SDK in your React Native application.
Code:
dependencies {
    // Import the React Native ML Kit SDK.
    implementation project(':react-native-hms-ml')
    ...
    implementation 'com.huawei.agconnect:agconnect-core:1.2.1.301'
}
Add the following to the settings.gradle file.
Code:
include ':react-native-hms-ml'
project(':react-native-hms-ml').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-hms-ml/android')
Add the following to the MainApplication.java file.
Code:
import com.huawei.hms.rn.ml.HmsMlPackage;

public class MainApplication extends Application implements ReactApplication {
    ...
    @Override
    protected List<ReactPackage> getPackages() {
        @SuppressWarnings("UnnecessaryLocalVariable")
        List<ReactPackage> packages = new PackageList(this).getPackages();
        // Add the following line. Don't forget to add the import statement as well.
        packages.add(new HmsMlPackage());
        return packages;
    }
    ...
}
Use case
Huawei ML Kit's Text Recognition API covers many different use cases; however, our focus is reading a handwritten note and displaying the text in the application using the on-device capabilities of the API.
App.js
The App.js file acts as the entry point and creates the navigation to the required service page.
For more details, check https://forums.developer.huawei.com/forumPortal/en/topic/0204424693794850035
Interesting.
Rebis said:
Interesting.
THANKS
If the text is not properly visible, can we still extract the data using this service?

What Can I Do If My Joint Operations App/Game Is Rejected Due to Incorrect Language Type?

Symptom
After I chose the Chinese mainland as the release country/region of my app, which integrates the HMS Core SDK, and submitted the app for review, it was rejected because the language of the update pop-up was English rather than Chinese.
As you can see, the following pop-up is in English, which is the reason for rejection.
The ideal result would look like this.
Analysis
I use Android Studio for app development, and based on my experience, I assumed that the pop-up message is provided in the HMS Core SDK resource files, so the problem might have been caused by missing files. I checked the multi-language resources: under External Libraries > Gradle: com.huawei.hms:ui-4.0.4.301 > res > values-zh-rCN, I found the values-zh-rCN.xml file and the pop-up message in Chinese, as shown in the following figure.
So why wasn’t it displayed?
Cause
I contacted Huawei technical support (https://developer.huawei.com/consumer/en/support/feedback/#/). Finally, I found a mistake in the multi-language configuration of my project. This is the sample code provided by the official documentation.
And here’s what I wrote.
That’s where the problem is.
I then changed zh to zh-rCN in my code, and used a VIVO phone for testing. I uninstalled HMS Core (APK) from the phone. Then I started my app, and saw the pop-up message displayed in Chinese.
This time, my app was approved.
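In build.gradle terms, the fix described above amounts to the following change under android > defaultConfig (listing "en" alongside is an assumption based on the mandatory languages):
Code:
// Before: "zh" does not match the SDK's values-zh-rCN resources.
resConfigs "en", "zh"
// After: the update pop-up is displayed in Chinese.
resConfigs "en", "zh-rCN"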
Summary
Here's a summary I've written:
If you use Eclipse to integrate the HMS Core SDK, ensure that the related multi-language resource packages are not missing or modified. Generally, the language packages are in the res directory. Remember not to change the package name!
If you use Android Studio to integrate the HMS Core SDK, make sure:
The downloaded multi-language resource packages (in the res directory) are not missing or modified.
If resConfigs has been added to android > defaultConfig, the supported languages are the same as the multi-language packages provided by Huawei, especially for zh-rCN, which is different from the naming on other platforms.
