Beginner: Integration of Text Translation feature in Education apps (Huawei ML Kit-React Native) - Huawei P30 Pro Guides, News, & Discussion

Overview
Translation service can translate text from the source language into the target language. It supports online and offline translation.
In this article, I will show how users can translate text in their apps using the ML Kit plugin.
The text translation service can be widely used in scenarios where translation between different languages is required.
For example, travel apps can integrate this service to translate road signs or menus in other languages into tourists' native languages, providing a considerate service; educational apps can integrate it to eliminate language barriers, make content more accessible, and improve learning efficiency. In addition, the service supports offline translation, allowing users to use the translation service even when no network is available.
Create Project in Huawei Developer Console
Before you start developing an app, configure app information in App Gallery Connect.
Register as a Developer
Before you get started, you must register as a Huawei developer and complete identity verification on HUAWEI Developers. For details, refer to Registration and Verification.
Create an App
Follow the instructions to create an app Creating an App Gallery Connect Project and Adding an App to the Project. Set the data storage location to Germany.
React Native setup
Requirements
Huawei phone with HMS 4.0.0.300 or later.
React Native environment with Android Studio, NodeJs and Visual Studio code.
Dependencies
Gradle Version: 6.3
Gradle Plugin Version: 3.5.2
React-native-hms-ml gradle dependency
React Native CLI: 2.0.1
1. Set up the environment by referring to Setting up the development environment on reactnative.dev.
2. Create project using below command.
Code:
react-native init <project_name>
3. You can install react native command line interface on npm, using the install -g react-native-cli command as shown below.
Code:
npm install -g react-native-cli
Generating a Signing Certificate Fingerprint
A signing certificate fingerprint is required to authenticate your app to Huawei Mobile Services. Make sure the JDK is installed. To create the certificate, open a terminal in the JDK directory's bin folder and execute the following command:
Code:
keytool -genkey -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks -storepass <store_password> -alias <alias> -keypass <key_password> -keysize 2048 -keyalg RSA -validity 36500
This command creates the keystore file in application_project_dir/android/app
Next, obtain the SHA-256 key for the keystore file; it is needed to authenticate your app to Huawei services. To obtain it, run the following command in the terminal:
Code:
keytool -list -v -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks
After you enter the keystore password, the SHA-256 key is displayed as shown below.
Adding SHA256 Key to the Huawei project in App Gallery
Copy the SHA-256 key, go to AppGallery Connect > <your_ML_project> > General Information, and paste it into the SHA-256 certificate fingerprint field.
Enable ML Kit under Manage APIs.
Download the agconnect-services.json file from AppGallery Connect and place it in the android/app directory of your React Native project.
Follow the steps to integrate the ML plugin to your React Native Application.
Integrate the Hms-ML plugin
Code:
npm i @hmscore/react-native-hms-ml
Download the Plugin from the Download Link
Download ReactNative ML Plugin under node_modules/@hmscore of your React Native project, as shown in the directory tree below:
Code:
project-dir
|_ node_modules
|_ ...
|_ @hmscore
|_ ...
|_ react-native-hms-ml
|_ ...
|_ ...
Open the android/app/build.gradle file in your React Native project and follow these steps:
Add the AGC Plugin dependency
Code:
apply plugin: 'com.huawei.agconnect'
Add to dependencies in android/app/build.gradle:
Code:
implementation project(':react-native-hms-ml')
Open the project-level android/build.gradle file in your React Native project and follow these steps:
Add to buildscript/repositories
Code:
maven { url 'https://developer.huawei.com/repo/' }
Add to buildscript/dependencies
Code:
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Navigate to android/settings.gradle and add the following:
Code:
include ':react-native-hms-ml'
project(':react-native-hms-ml').projectDir = new File(rootProject.projectDir, '../node_modules/@hmscore/react-native-hms-ml/android')
Use case:
Huawei ML Kit's HMSTranslate API can be integrated into different applications to translate text between languages.
Set API Key:
Before using HUAWEI ML in your app, set the API key first.
Copy the api_key value in your agconnect-services.json file.
Call setApiKey with the copied value.
Code:
HMSApplication.setApiKey("api_key")
  .then((res) => { console.log(res); })
  .catch((err) => { console.log(err); });
Add the following permissions to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
Translation
Text translation is implemented in either asynchronous or synchronous mode. For details, please refer to HMSTranslate.
JavaScript:
async asyncTranslate(sentence) {
  try {
    if (sentence !== "") {
      var result = await HMSTranslate.asyncTranslate(this.state.isEnabled, true, sentence, this.getTranslateSetting());
      console.log(result);
      if (result.status == HMSApplication.SUCCESS) {
        this.setState({ result: result.result });
      }
      else {
        this.setState({ result: result.message });
        if (result.status == HMSApplication.NO_FOUND) {
          this.setState({ showPreparedModel: true });
          ToastAndroid.showWithGravity("Download Using Prepared Button Below", ToastAndroid.SHORT, ToastAndroid.CENTER);
        }
      }
    }
  } catch (e) {
    console.log(e);
    this.setState({ result: "Error: " + e });
  }
}
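The snippet above calls a getTranslateSetting() helper that the article does not show. Below is a minimal sketch of what it might look like; the sourceLanguageCode and targetLanguageCode keys are assumed from the plugin's sample code, so verify them against the HMSTranslate documentation before use.
JavaScript:
// Hypothetical helper: returns the translation setting used by asyncTranslate and preparedModel.
// Key names are assumptions based on the plugin sample; adjust to the plugin version you use.
getTranslateSetting = () => {
  return { sourceLanguageCode: "en", targetLanguageCode: "de" };
}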
Obtaining Languages
Obtains language codes in on-cloud and on-device translation services. For details, please refer to HMSTranslate.
JavaScript:
async getAllLanguages() {
  try {
    var result = await HMSTranslate.getAllLanguages(this.state.isEnabled);
    console.log(result);
    if (result.status == HMSApplication.SUCCESS) {
      this.setState({ result: result.result.toString() });
    }
    else {
      this.setState({ result: result.message });
    }
  } catch (e) {
    console.log(e);
  }
}
Downloading Prepared Model
A prepared (pre-trained) model is provided for the on-device analyzer to translate text. You can download this on-device model and then translate text offline using it. For details, please refer to HMSTranslate.
JavaScript:
async preparedModel() {
  try {
    var result = await HMSTranslate.preparedModel(this.getStrategyConfiguration(), this.getTranslateSetting());
    console.log(result);
    if (result.status == HMSApplication.SUCCESS) {
      this.setState({ result: "Model download success. Now you can use the local analyzer." });
    }
    else {
      this.setState({ result: result.message });
    }
  } catch (e) {
    console.log(e);
    this.setState({ result: "Error: " + e });
  }
}
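preparedModel() also expects a model download strategy from getStrategyConfiguration(), which is likewise not shown. A possible sketch is below; the field names (needWifi, needCharging, needDeviceIdle) are assumptions based on the plugin's model download sample and should be checked against the plugin documentation.
JavaScript:
// Hypothetical helper: conditions under which the offline translation model may be downloaded.
// Field names are assumptions; verify against the react-native-hms-ml docs.
getStrategyConfiguration = () => {
  return { needWifi: true, needCharging: false, needDeviceIdle: false };
}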
Output:
Tips and Tricks
Download the latest HMS React Native ML plugin.
Copy the api_key value from your agconnect-services.json file and set the API key.
Add the source and target languages in the translator setting.
For project cleaning, navigate to android directory and run the below command.
Code:
gradlew clean
Conclusion:
In this article, we have learned how to integrate ML Kit into a React Native project.
Educational apps can integrate this service to eliminate language barriers, make content more accessible, and improve learning efficiency. In addition, the service supports offline translation, allowing users to easily use the translation service even if the network is not available.
Reference
https://developer.huawei.com/consum...uides-V1/text-translation-0000001051086162-V1


Currently, how many languages does real-time translation support?

Huawei is not accepting new accounts. Is there any other possible solution? Thanks in advance.

Related

How to implement HMS Location Kit With Flutter?

For more information like this, you can visit the HUAWEI Developer Forum.
HMS Location Kit Flutter
Hi everyone. Today I will describe how we can use the HMS Location Kit Flutter plugin. I have also prepared a demo project; you can find the GitHub link at the end of the page.
What is the Location Kit?
HUAWEI Location Kit assists developers in enabling their apps to get quick and accurate user locations and expand global positioning capabilities by using GPS, Wi-Fi, and base station locations.
Fused location: Provides a set of simple and easy-to-use APIs for you to quickly obtain the device location based on the GPS, Wi-Fi, and base station location data.
Activity identification: Identifies user motion status through the acceleration sensor, cellular network information, and magnetometer, helping you adjust your app based on user behavior.
Geofence: Allows you to set an interested area through an API so that your app can receive a notification when a specified action (such as leaving, entering, or lingering in the area) occurs.
If you want more information about Location Kit, see the Location Kit page on developer.huawei.com.
Location Flutter Plugin
The Flutter Location Plugin provides adaption code used for the HUAWEI Location Kit to use in Flutter platform. HUAWEI Location Kit combines the GPS, Wi-Fi, and base station locations to help you quickly obtain precise user locations, build up global positioning capabilities, and reach a wide range of users around the globe.
Configuring Your Flutter Project
Registering as a Developer
Before you get started, you must register as a HUAWEI developer and complete identity verification on the HUAWEI Developers website. For details, please refer to Registration and Verification.
Creating an AppGallery Connect Project
Sign in to AppGallery Connect and click My apps.
Click the desired app name.
3. Click the Develop tab.
4. In the App information area, click agconnect-services.json to download the configuration file.
If you have made any changes, such as setting the data storage location and enabling or managing APIs, you need to download the latest agconnect-services.json file and use it to replace the existing file in the app directory.
5. Create a Flutter project if you have not created one.
6. Run the following command and make sure that no errors are reported.
[project_path]> flutter doctor
7. Copy the agconnect-services.json file to the android/app directory of your Flutter project.
8. Copy the generated signature file to the android/app directory of your Flutter project.
9. Verify that the agconnect-services.json file and signature file are successfully added to the android/app directory of your Flutter project.
10. Open the build.gradle file in the android directory of your Flutter project.
a. Go to buildscript, and configure the Maven repository address and AppGallery Connect plugin for the HMS Core SDK.
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        /*
         * <Other dependencies>
         */
        classpath 'com.huawei.agconnect:agcp:1.2.1.301'
    }
}
b. Go to allprojects, and configure the Maven repository address for the HMS Core SDK.
Code:
allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
11. Open the build.gradle file in the android/app directory of your Flutter project.
a. Add build dependencies.
Code:
dependencies {
    implementation 'com.huawei.hms:location:4.0.3.301'
    /*
     * <Other dependencies>
     */
}
b. Add the apply plugin: ‘com.huawei.agconnect’ line after the apply plugin: ‘com.android.application’ line.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
apply from: "$flutterRoot/packages/flutter_tools/gradle/flutter.gradle"
c. Set minSdkVersion to 19 or higher in android > defaultConfig.
Code:
defaultConfig {
    applicationId "<package_name>"
    minSdkVersion 19
    /*
     * <Other configurations>
     */
}
The value of applicationId must match with that of package_name in the agconnect-services.json file.
d. Configure the signature in android based on the signature file information.
Code:
android {
    /*
     * <Other configurations>
     */
    signingConfigs {
        release {
            storeFile file('<keystore_file>')
            storePassword '<keystore_password>'
            keyAlias '<key_alias>'
            keyPassword '<key_password>'
        }
    }
    buildTypes {
        debug {
            signingConfig signingConfigs.release
        }
        release {
            signingConfig signingConfigs.release
        }
    }
}
Replace <keystore_file>, <keystore_password>, <key_alias> and <key_password> with matching entries in your signature file. For details about the app signing procedure in Flutter, please refer to App Signing in Flutter.
Integrating the Plugin
There are two ways to integrate the plugin into your project: download the Location Kit Flutter plugin manually, or use the huawei_location package from pub.dev.
Download the Location Kit Flutter Plugin
1. Download the Location Kit Flutter Plugin and decompress it.
2. Open the pubspec.yaml file in your Flutter project and add the plugin dependency to the dependencies section.
Code:
dependencies:
  huawei_location:
    path: {library path}
Replace {library path} with the actual library path of the HUAWEI Location Kit Flutter plugin. For details, please refer to Using Packages.
3. Run following command to update the package information:
Code:
[project_path]> flutter pub get
4. Run the following command or click the run icon on the toolbar to start the app:
Code:
[project_path]> flutter run
Using pub.dev HMS location_plugin
Add this to your package’s pubspec.yaml file:
Code:
dependencies:
  huawei_location: ^4.0.4+300
You can install packages from the command line:
with Flutter:
Code:
$ flutter pub get
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Permissions
First of all we need permissions to access location and physical activity data.
Add the location permissions to the manifest file. Define these permissions in android/app/src/main/AndroidManifest.xml as shown in the sketch below.
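A typical permission set for Location Kit looks like the snippet below. This is a sketch based on the standard Android and HMS permissions used by the kit (location access plus activity recognition); adjust it to the features you actually use.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
<!-- Needed only if you use activity identification -->
<uses-permission android:name="com.huawei.hms.permission.ACTIVITY_RECOGNITION" />
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />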
Create a PermissionHandler instance.
Code:
PermissionHandler permissionHandler;
Initialize it in initState():
Code:
permissionHandler = PermissionHandler();
What does the service provide?
Check Permissions (see the check sketch after the request example below)
Request Permissions
Code:
void requestPermission() async {
  try {
    bool status = await permissionHandler.requestLocationPermission();
    setState(() {
      infoText = "Is permission granted $status";
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}
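For the Check Permissions capability, a minimal sketch is shown below. It assumes PermissionHandler exposes a hasLocationPermission() method, as in the plugin sample; verify the method name against the plugin documentation.
Code:
// Sketch: check whether the location permission is already granted.
void checkPermission() async {
  try {
    bool hasPermission = await permissionHandler.hasLocationPermission();
    setState(() {
      infoText = "Has location permission: $hasPermission";
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}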
Fused Location
Create a FusedLocationProviderClient instance using the init() method and use the instance to call location-related APIs.
Code:
FusedLocationProviderClient locationService;
Initialize it in initState():
Code:
locationService = FusedLocationProviderClient();
What does the service provide?
Last Location
Last Location With Address (see the sketch after the getLastLocation example below)
Mock Location
Location Updates
Reference: Fused Location developer.huawei.com
To use mock location
To use the mock location function, go to Settings > System & updates > Developer options > Select mock location app and select the app for using the mock location function.
(If Developer options is unavailable, go to Settings > About phone and tap Build number for seven consecutive times. Then, Developer options will be displayed on System & updates.)
To use mock location feature first configure the mock location permission in the android/app/src/main/AndroidManifest.xml file.
Code:
<uses-permission
android:name="android.permission.ACCESS_MOCK_LOCATION"
tools:ignore="MockLocation,ProtectedPermissions" />
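Once the permission is in place, the mock location flow is roughly as sketched below. The setMockMode and setMockLocation calls and the Location constructor arguments are assumptions based on the plugin's API; confirm them in the plugin documentation before relying on this.
Code:
// Sketch: enable mock mode and push a mock location (API names assumed from the plugin docs).
void useMockLocation() async {
  try {
    await locationService.setMockMode(true);
    Location mockLocation = Location(latitude: 48.8566, longitude: 2.3522);
    await locationService.setMockLocation(mockLocation);
  } catch (e) {
    print(e.toString());
  }
}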
Listen Location Update Event
Call the onLocationData method, which listens for location update events.
StreamSubscription<Location> streamSubs;
Add initState()
Code:
streamSubs = locationService.onLocationData.listen((location) {
print(location.toString());
});
Example method : getLastLocation()
Code:
void getLastLocation() async {
  setState(() {
    infoText = "";
  });
  try {
    Location location = await locationService.getLastLocation();
    setState(() {
      infoText = location.toString();
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}
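For the Last Location With Address capability mentioned earlier, a hedged sketch is below. The getLastLocationWithAddress method, the LocationRequest parameter, and the HWLocation fields are taken from the plugin documentation as I recall it; treat them as assumptions and verify before use.
Code:
// Sketch: obtain the last known location together with reverse-geocoded address details.
void getLastLocationWithAddress() async {
  try {
    LocationRequest request = LocationRequest();
    HWLocation hwLocation = await locationService.getLastLocationWithAddress(request);
    setState(() {
      infoText = "${hwLocation.street}, ${hwLocation.city}, ${hwLocation.countryName}";
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}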
Activity Identification
Creating an Activity Identification Service Client
ActivityIdentificationService activityIdentificationService;
Initialize it in initState():
activityIdentificationService = ActivityIdentificationService();
What does the service provide?
Activity Conversion Updates
Activity Conversion Request
Activity Identification Updates
Activity Identification Request
Listen Activity Identification Event
You can use the onActivityIdentification method to listen to and receive data from activity identification events.
Add the following in initState():
ActivityIdentificationService activityIdentificationService = ActivityIdentificationService();
Code:
void onActivityIdentificationResponse(ActivityIdentificationResponse response) {
  for (ActivityIdentificationData data in response.activityIdentificationDatas) {
    setChange(data.identificationActivity, data.possibility);
    // data.identificationActivity contains the activity type, such as vehicle or bike.
    // data.possibility is the confidence that the user is performing the activity.
    // The confidence ranges from 0 to 100. A larger value indicates more reliable activity authenticity.
  }
}

void streamListen() {
  streamSubscription = activityIdentificationService.onActivityIdentification
      .listen(onActivityIdentificationResponse);
}
For full content, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201314069177450189&fid=0101187876626530001

Intermediate: Integration of landmark recognition feature in tourism apps (ML Kit-React Native)

Overview
Have you ever gone through your vacation photos and asked yourself: What is the name of this place I visited in India? Who created this monument I saw in France? Landmark recognition can help! This technology can predict landmark labels directly from image pixels, helping people better understand and organize their photo collections.
Landmark recognition can be used in tourism scenarios. The landmark recognition service returns the landmark name, the landmark longitude and latitude, and a confidence value for the input image. A higher confidence indicates that the landmark in the input image is more likely to be recognized correctly. Based on the recognized information, you can create a more personalized app experience for users.
In this article, I will show how users can get landmark information using the ML Kit plugin.
Integrate this service into a travel app so that images taken by users are detected by the ML plugin, which returns the landmark name and address; the app can then provide a brief introduction and tour suggestions based on the returned information.
Create Project in Huawei Developer Console
Before you start developing an app, configure app information in App Gallery Connect.
Register as a Developer
Before you get started, you must register as a Huawei developer and complete identity verification on HUAWEI Developers. For details, refer to Registration and Verification.
Create an App
Follow the instructions to create an app Creating an App Gallery Connect Project and Adding an App to the Project. Set the data storage location to Germany
React Native setup
Requirements
Huawei phone with HMS 4.0.0.300 or later.
React Native environment with Android Studio, NodeJs and Visual Studio code.
Dependencies
Gradle Version: 6.3
Gradle Plugin Version: 3.5.2
React-native-hms-ml gradle dependency
React Native CLI: 2.0.1
1. Set up the environment by referring to Setting up the development environment · React Native (reactnative.dev).
2. Create project by using this command.
Code:
react-native init <project_name>
3. You can install react native command line interface on npm, using the install -g react-native-cli command as shown below.
Code:
npm install -g react-native-cli
Generating a Signing Certificate Fingerprint
A signing certificate fingerprint is required to authenticate your app to Huawei Mobile Services. Make sure the JDK is installed. To create the certificate, open a terminal in the JDK directory's bin folder and execute the following command:
Code:
keytool -genkey -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks -storepass <store_password> -alias <alias> -keypass <key_password> -keysize 2048 -keyalg RSA -validity 36500
This command creates the keystore file in application_project_dir/android/app
Next, obtain the SHA-256 key for the keystore file; it is needed to authenticate your app to Huawei services. To obtain it, run the following command in the terminal:
Code:
keytool -list -v -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks
After you enter the keystore password, the SHA-256 key is displayed.
Adding SHA256 Key to the Huawei project in App Gallery
Copy the SHA-256 key, go to AppGallery Connect > <your_ML_project> > General Information, and paste it into the SHA-256 certificate fingerprint field.
Enable ML Kit under Manage APIs.
Download the agconnect-services.json file from AppGallery Connect and place it in the android/app directory of your React Native project.
Follow the steps to integrate the ML plugin to your React Native Application.
Integrate the HMS-ML plugin
Code:
npm i @hmscore/react-native-hms-ml
Download the Plugin from the Download Link
Download ReactNative ML Plugin under node_modules/@hmscore of your React Native project, as shown in the directory tree below:
Code:
project-dir
|_ node_modules
|_ ...
|_ @hmscore
|_ ...
|_ react-native-hms-ml
|_ ...
|_ ...
Open the android/app/build.gradle file in your React Native project and follow these steps:
Add the AGC Plugin dependency.
Code:
apply plugin: 'com.huawei.agconnect'
Add to dependencies in android/app/build.gradle:
Code:
implementation project(':react-native-hms-ml')
Open the project-level android/build.gradle file in your React Native project and follow these steps:
Add to buildscript/repositories
Code:
maven { url 'https://developer.huawei.com/repo/' }
Add to buildscript/dependencies
Code:
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Navigate to android/settings.gradle and add the following:
Code:
include ':react-native-hms-ml'
project(':react-native-hms-ml').projectDir = new File(rootProject.projectDir, '../node_modules/@hmscore/react-native-hms-ml/android')
Use case
Huawei ML Kit's HMSLandmarkRecognition API can be integrated into different applications to return the landmark name and address, so the app can provide a brief introduction and tour suggestions based on the returned information.
Add the following to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<application
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value="dsc"/>
</ application>
Set API Key:
Before using HUAWEI ML in your app, set the API key first.
Copy the api_key value in your agconnect-services.json file.
Call setApiKey with the copied value.
JavaScript:
HMSApplication.setApiKey("api_key")
  .then((res) => { console.log(res); })
  .catch((err) => { console.log(err); });
Analyze Frame
Use HMSLandmarkRecognition.asyncAnalyzeFrame() to recognize landmarks in images asynchronously.
JavaScript:
async asyncAnalyzeFrame() {
  try {
    var result = await HMSLandmarkRecognition.asyncAnalyzeFrame(true, this.getFrameConfiguration(), this.getLandmarkAnalyzerSetting());
    console.log(result);
    if (result.status == HMSApplication.SUCCESS) {
      result.result.forEach(element => {
        this.state.landmark.push(element.landMark);
        this.state.possibility.push(element.possibility);
        this.state.url.push('https://en.wikipedia.org/wiki/' + element.landMark);
        let long = [];
        let lat = [];
        element.coordinates.forEach(ll => {
          long.push(ll.longitude);
          lat.push(ll.latitude);
        });
        this.state.coordinates.push(lat, long);
      });
      this.setState({
        landMark: this.state.landmark,
        possibility: this.state.possibility,
        coordinates: this.state.coordinates,
        url: this.state.url,
      });
    }
    else {
      ToastAndroid.showWithGravity(result.message, ToastAndroid.SHORT, ToastAndroid.CENTER);
    }
  } catch (e) {
    console.error(e);
  }
}
Final Code:
JavaScript:
import React from 'react';
import {
  Text,
  View,
  TextInput,
  ScrollView,
  TouchableOpacity,
  Image,
  ToastAndroid,
  SafeAreaView
} from 'react-native';
import { styles } from '@hmscore/react-native-hms-ml/example/src/Styles';
import { HMSLandmarkRecognition, HMSApplication } from '@hmscore/react-native-hms-ml';
import { showImagePicker } from '@hmscore/react-native-hms-ml/example/src/HmsOtherServices/Helper';
import { WebView } from 'react-native-webview';

export default class App extends React.Component {
  componentDidMount() { }
  componentWillUnmount() { }

  constructor(props) {
    super(props);
    this.state = {
      imageUri: '',
      landmark: [],
      coordinates: [],
      possibility: [],
      url: []
    };
  }

  getLandmarkAnalyzerSetting = () => {
    return { largestNumOfReturns: 10, patternType: HMSLandmarkRecognition.STEADY_PATTERN };
  }

  getFrameConfiguration = () => {
    return { filePath: this.state.imageUri };
  }

  async asyncAnalyzeFrame() {
    try {
      var result = await HMSLandmarkRecognition.asyncAnalyzeFrame(true, this.getFrameConfiguration(), this.getLandmarkAnalyzerSetting());
      console.log(result);
      if (result.status == HMSApplication.SUCCESS) {
        result.result.forEach(element => {
          this.state.landmark.push(element.landMark);
          this.state.possibility.push(element.possibility);
          this.state.url.push('https://en.wikipedia.org/wiki/' + element.landMark);
          let long = [];
          let lat = [];
          element.coordinates.forEach(ll => {
            long.push(ll.longitude);
            lat.push(ll.latitude);
          });
          this.state.coordinates.push(lat, long);
        });
        this.setState({
          landMark: this.state.landmark,
          possibility: this.state.possibility,
          coordinates: this.state.coordinates,
          url: this.state.url,
        });
      }
      else {
        ToastAndroid.showWithGravity(result.message, ToastAndroid.SHORT, ToastAndroid.CENTER);
      }
    } catch (e) {
      console.error(e);
    }
  }

  startAnalyze() {
    this.setState({
      landmark: [],
      possibility: [],
      coordinates: [],
      url: [],
    });
    this.asyncAnalyzeFrame();
  }

  render() {
    console.log(this.state.url.toString());
    return (
      <ScrollView style={styles.bg}>
        <View style={styles.containerCenter}>
          <TouchableOpacity onPress={() => { showImagePicker().then((result) => this.setState({ imageUri: result })) }}>
            <Image style={styles.imageSelectView} source={this.state.imageUri == '' ? require('@hmscore/react-native-hms-ml/example/assets/image.png') : { uri: this.state.imageUri }} />
          </TouchableOpacity>
        </View>
        <Text style={styles.h1}>pick the image and explore the information about place</Text>
        <View style={styles.basicButton}>
          <TouchableOpacity
            style={styles.startButton}
            onPress={this.startAnalyze.bind(this)}
            disabled={this.state.imageUri == '' ? true : false} >
            <Text style={styles.startButtonLabel}> Check Place </Text>
          </TouchableOpacity>
        </View>
        <Text style={{ fontSize: 20 }}> {this.state.landmark.toString()} </Text>
        <View style={{ flex: 1 }}>
          <WebView
            source={{ uri: this.state.url.toString() }}
            style={{ marginTop: 20, height: 1500 }}
            javaScriptEnabled={true}
            domStorageEnabled={true}
            startInLoadingState={true}
            scalesPageToFit={true}
          />
        </View>
      </ScrollView>
    );
  }
}
Run the application (Generating the Signed Apk):
1. Open project directory path in Command prompt.
2. Navigate to android directory and run the below command for signing the Apk.
Code:
gradlew assembleRelease
Output:
Tips and Tricks:
Download the latest HMS React Native ML plugin.
Copy the api_key value from your agconnect-services.json file and set the API key.
Images in PNG, JPG, JPEG, and BMP formats are supported. GIF images are not supported.
For project cleaning, navigate to android directory and run the below command.
Code:
gradlew clean
Conclusion:
In this article, we have learnt to integrate ML kit in React native project.
Integrating this service into a travel app means that images taken by users are detected by the ML plugin, which returns the landmark information, so the app can provide a brief introduction and tour suggestions to the user.
Reference
https://developer.huawei.com/consum...-Guides/landmark-recognition-0000001050726194

Beginner: Huawei App Messaging Unity Game Development

Introduction
You can use App Messaging of AppGallery Connect to send relevant messages to target users actively using your app and encourage them to use key app functions. For example, you can send in-app messages to encourage users to subscribe to certain products, provide tips on passing a game level, or recommend activities of a restaurant.
App Messaging also allows you to customize how your messages look and the way they are sent, and to define events for triggering message sending to your users at the right moment.
AG Connect supports three types of messages as follows:
1. Pop-up message
2. Image message
3. Banner message
Development Overview
You need to have Unity installed, and I assume that you have prior knowledge of Unity and C#.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK installation package.
Unity software installed.
Visual Studio/Code installed.
HMS Core (APK) 4.X or later.
Follow the steps below.
1. Create a Unity project.
Open Unity Hub.
Click NEW, select 3D, and enter the Project Name and Location.
Click CREATE, as follows:
2. Click Asset Store, search Huawei HMS Core App Services and click Import, as follows.
3. Once import is successful, verify directory in Assets> Huawei HMS Core App Services path, as follows.
4. Choose Edit > Project Settings > Player and edit the required options in Publishing Settings, as follows.
5. Generate a SHA-256 certificate fingerprint.
To generate the SHA-256 certificate fingerprint, use the command below.
Code:
keytool -list -v -keystore D:\Unity\projects_unity\file_name.keystore -alias alias_name
6. Download agconnect-services.json and copy and paste to Assets > Plugins > Android, as follows.
7. Choose Project Settings > Player and update package name.
8. Open LauncherTemplate.gradle and add the lines below.
Code:
apply plugin: 'com.huawei.agconnect'
implementation 'com.huawei.hms:hianalytics:5.1.0.301'
implementation 'com.huawei.agconnect:agconnect-core:1.4.2.301'
9. Open AndroidManifest file and add below permissions.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
10. Open "baseProjectTemplate.gradle" and add lines, as follows.
Code:
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
maven {url 'https://developer.huawei.com/repo/'}
11. Open "mainTemplate.gradle" and add lines, as follows.
Code:
implementation 'com.huawei.agconnect:agconnect-core:1.4.2.301'
implementation 'com.huawei.hms:hianalytics:5.1.0.301'
implementation 'com.android.support:appcompat-v7:28.0.0'
implementation 'com.huawei.agconnect:agconnect-appmessaging:1.4.1.300'
12. Enable debug mode using the command below [optional].
Code:
adb shell setprop debug.huawei.hms.analytics.app package_name
13. Configuring project in AGC
14. Create Scripts folder and create a class.
APPMessaging.cs
C#:
using System;
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using HuaweiService;
using HuaweiService.appmessage;
using HuaweiService.analytic;
using HuaweiService.push;
using Exception = HuaweiService.Exception;

public class APPMessaging : MonoBehaviour
{
    private HiAnalyticsInstance instance;
    private AGConnectAppMessaging appMessaging;

    private void Start()
    {
        instance = HiAnalytics.getInstance(new Context());
        appMessaging = AGConnectAppMessaging.getInstance();
        appMessaging.setFetchMessageEnable(true);
        appMessaging.setDisplayEnable(true);
        instance.setAnalyticsEnabled(true);
        getAAID();
    }

    private void getAAID()
    {
        // Task result = instance.getAAID();
        Task id = HmsInstanceId.getInstance(new Context()).getAAID();
        id.addOnSuccessListener(new HmsSuccessListener<AAIDResult>((aaidResult) =>
        {
            string aaId = aaidResult.getId();
            Debug.Log("AAID==>> " + aaId);
        })).addOnFailureListener(new HmsFailureListener((e) =>
        {
            Debug.Log("AAID==>> Failed");
        }));
    }

    public delegate void SuccessCallBack<T>(T o);

    public class HmsSuccessListener<T> : OnSuccessListener
    {
        public SuccessCallBack<T> CallBack;

        public HmsSuccessListener(SuccessCallBack<T> c)
        {
            CallBack = c;
        }

        public void onSuccess(T arg0)
        {
            if (CallBack != null)
            {
                CallBack.Invoke(arg0);
            }
        }

        public override void onSuccess(AndroidJavaObject arg0)
        {
            if (CallBack != null)
            {
                Type type = typeof(T);
                IHmsBase ret = (IHmsBase)Activator.CreateInstance(type);
                ret.obj = arg0;
                CallBack.Invoke((T)ret);
            }
        }
    }

    // Note: the parameter is named obj because "object" is a reserved keyword in C#.
    public delegate void SuccessCallBack(AndroidJavaObject obj);
    public delegate void FailureCallBack(Exception e);

    public class HmsFailureListener : OnFailureListener
    {
        public FailureCallBack CallBack;

        public HmsFailureListener(FailureCallBack c)
        {
            CallBack = c;
        }

        public override void onFailure(Exception arg0)
        {
            if (CallBack != null)
            {
                CallBack.Invoke(arg0);
            }
        }
    }
}
AndroidManifest.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<!-- GENERATED BY UNITY. REMOVE THIS COMMENT TO PREVENT OVERWRITING WHEN EXPORTING AGAIN-->
<manifest
xmlns:android="http://schemas.android.com/apk/res/android"
package="com.unity3d.player"
xmlns:tools="http://schemas.android.com/tools">
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<application>
<activity android:name="com.unity3d.player.UnityPlayerActivity"
android:theme="@style/UnityThemeSelector">
<intent-filter>
<action android:name="android.intent.action.MAIN" />
<category android:name="android.intent.category.LAUNCHER" />
</intent-filter>
<meta-data android:name="unityplayer.UnityActivity" android:value="true" />
</activity>
</application>
</manifest>
15. Follow the steps, as shown in image:
a. Create empty GameObject.
b. Assign script APPMessaging(as per you script name) to empty GameObject.
16. To build the APK and run it on a device, choose File > Build Settings, then click Build to generate an APK or Build and Run to run it on a connected device.
Result
Sign in to AppGallery Connect and click My projects.
Find and click your project.
Navigate to Grow > App Messaging and click New.
Select Name and Description.
Click Test in Operation drop-down.
Enter text in Test user AAID and click Save.
Tips and Tricks
Always use the latest version of the library.
Add agconnect-services.json file without fail.
Add SHA-256 fingerprint without fail.
Make sure dependencies added in build files.
Make sure you have enabled debug mode.
Make sure the image url is correct and aspect ratio for Portrait is 3:2 (300x200) and Landscape aspect ratio is 1:1 or 3:2 (100 x100 or 300x200).
Conclusion
In this article, we have learned how to integrate the Huawei App Messaging service into Unity game development. App Messaging lets you send promotional messages and updates, and also target specific users to promote certain products.
Thanks for reading this article. Please like, comment, and share your queries or suggestions.
References
App Messaging - click here
Unity App Messaging Integration Manual - click here
Original source

How a Programmer Developed a Perfect Flower Recognition App

Spring is a great season for hiking, especially when flowers are in full bloom. One weekend, Jenny, John's girlfriend, a teacher, took her class for an outing in a park. John accompanied them to lend Jenny a hand.
John had prepared for a carefree outdoor outing, like those in his childhood when he would run around on the grass, but it took a different turn. His outing turned into something like a Q&A session that was all about flowers: the students were amazed at John's ability to recognize flowers and repeatedly asked him what kinds of flowers they encountered. Faced with their sincere questioning and adoring expressions, John, despite not being a flower expert, felt obliged to give the right answer, even if he had to sneak off and search for it on the Internet.
It occurred to John that there could be an easier way to answer these questions — using a handy app.
As a programmer with a knack for the market, he soon developed a flower recognition app that's capable of turning ordinary users into expert "botanists": to find out the name of a flower, all you need to do is use the app to take a picture of that flower, and it will swiftly provide you with the correct answer.
Demo
How to Implement
The flower recognition function can be created by using the image classification service in HUAWEI ML Kit. It classifies elements within images into intuitive categories to define image themes and usage scenarios. The service supports both on-device and on-cloud recognition modes, with the former recognizing over 400 categories of items, and the latter, 12,000 categories. It also allows for creating custom image classification models.
Preparations
1. Create an app in AppGallery Connect and configure the signing certificate fingerprint.
2. Configure the Huawei Maven repository address, and add the build dependency on the image classification service.
Code:
<p style="line-height: 1.5em;">dependencies{
// Import the basic SDK.
implementation 'com.huawei.hms:ml-computer-vision-classification:2.0.1.300'
// Import the image classification model package.
implementation 'com.huawei.hms:ml-computer-vision-image-classification-model:2.0.1.300'
}</p>
3. Automatically update the machine learning model.
Add the following statements to the AndroidManifest.xml file. After a user installs your app from HUAWEI AppGallery, the machine learning model will be automatically updated to the user's device.
Code:
<p style="line-height: 1.5em;"><manifest
...
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value= "label"/>
...
</manifest></p>
4. Configure obfuscation scripts.
For details, please refer to the ML Kit Development Guide on HUAWEI Developers.
5. Declare permissions in the AndroidManifest.xml file.
To obtain images through the camera or album, you'll need to apply for relevant permissions in the file.
Code:
<p style="line-height: 1.5em;"> <uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" /></p>
Development Process
1. Create and configure an on-cloud image classification analyzer.
Create a class for the image classification analyzer.
Code:
<p style="line-height: 1.5em;">public class RemoteImageClassificationTransactor extends BaseTransactor<List<MLImageClassification>>
</p>
In the class, use the custom class (MLRemoteClassificationAnalyzerSetting) to create an analyzer, set relevant parameters, and configure the handler.
Code:
<p style="line-height: 1.5em;">private final MLImageClassificationAnalyzer detector;
private Handler handler;MLRemoteClassificationAnalyzerSetting options = new MLRemoteClassificationAnalyzerSetting.Factory().setMinAcceptablePossibility(0f).create();
this.detector = MLAnalyzerFactory.getInstance().getRemoteImageClassificationAnalyzer(options);this.handler = handler;
</p>
2. Call asyncAnalyseFrame to process the image.
Asynchronously classify the input MLFrame object.
Code:
<p style="line-height: 1.5em;">@Override
protected Task<List<MLImageClassification>> detectInImage(MLFrame image) {
return this.detector.asyncAnalyseFrame(image);
}
</p>
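detectInImage() receives an MLFrame, and the article does not show how that frame is built. A common way to construct one from a Bitmap, per the ML Kit SDK, is sketched below (the imagePath variable is hypothetical).
Code:
// Sketch: build an MLFrame from a Bitmap loaded from the camera or album.
Bitmap bitmap = BitmapFactory.decodeFile(imagePath);
MLFrame frame = MLFrame.fromBitmap(bitmap);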
3. Obtain the result of a successful classification.
Override the onSuccess method in RemoteImageClassificationTransactor to display the name of the recognized object in the image.
Code:
<p style="line-height: 1.5em;">@Override
protected void onSuccess(
Bitmap originalCameraImage,
List<MLImageClassification> classifications,
FrameMetadata frameMetadata,
GraphicOverlay graphicOverlay) {
graphicOverlay.clear();
this.handler.sendEmptyMessage(Constant.GET_DATA_SUCCESS);
List<String> classificationList = new ArrayList<>();
for (int i = 0; i < classifications.size(); ++i) {
MLImageClassification classification = classifications.get(i);
if (classification.getName() != null) {
classificationList.add(classification.getName());
}
}
RemoteImageClassificationGraphic remoteImageClassificationGraphic =
new RemoteImageClassificationGraphic(graphicOverlay, this.mContext, classificationList);
graphicOverlay.addGraphic(remoteImageClassificationGraphic);
graphicOverlay.postInvalidate();
}
</p>
If recognition fails, handle the error and check the failure reason in the log.
Code:
<p style="line-height: 1.5em;">@Override
protected void onFailure(Exception e) {
this.handler.sendEmptyMessage(Constant.GET_DATA_FAILED);
Log.e(RemoteImageClassificationTransactor.TAG, "Remote image classification detection failed: " + e.getMessage());
}
</p>
4. Release resources when recognition ends.
When recognition ends, stop the analyzer, release detection resources, and override the stop() method in RemoteImageClassificationTransactor.
Code:
<p style="line-height: 1.5em;">@Override
public void stop() {
super.stop();
try {
this.detector.stop();
} catch (IOException e) {
Log.e(RemoteImageClassificationTransactor.TAG,
"Exception thrown while trying to close remote image classification transactor" + e.getMessage());
}
}
</p>
For more details, you can go to:
- Our official website
- Our Development Documentation page, to find the documents you need
- Reddit to join our developer discussion
- GitHub to download demos and sample codes
- Stack Overflow to solve any integration problems
Original Source

Intermediate: Text Recognition, Language detection and Language translation using Huawei ML Kit in Flutter (Cross platform)

Introduction
In this article, we will be learning how to integrate Huawei ML kit in Flutter application. Flutter ML plugin allows your apps to easily leverage Huawei’s long-term proven expertise in machine learning to support diverse artificial intelligence (AI) applications. ML plugin provides diversified leading machine learning capabilities that are easy to use, helping you develop various AI apps.
List of API’s ML plugin provides
Text-related services
Language-related services
Image-related services
Face/body-related services
Natural language processing
Custom model
In this article, we will be integrating some of the specific API’s related to Text-related services and Language-related service in flutter application.
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1. Create flutter project.
Step 2. Add the App level gradle dependencies, choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300'
Add root level gradle dependencies.
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add the below permissions in Android Manifest file.
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
Step 4: Add plugin path in pubspec.yaml file under dependencies.
Step 5: Create a project in AppGallery Connect, find here.
pubspec.yaml
Code:
name: flutterdrivedemo123
description: A new Flutter project.
# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev
# https://developer.apple.com/library/archive/documentation/General/Reference/InfoPlistKeyReference/Articles/CoreFoundationKeys.html
version: 1.0.0+1

environment:
  sdk: ">=2.12.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter
  huawei_account:
    path: ../huawei_account
  huawei_drive:
    path: ../huawei_drive
  huawei_ml:
    path: ../huawei_ml
  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^1.0.2
  image_picker: ^0.8.0

dev_dependencies:
  flutter_test:
    sdk: flutter

# For information on the generic Dart part of this file, see the
# following page: https://dart.dev/tools/pub/pubspec
# The following section is specific to Flutter.
flutter:
Initialize MLApplication
Code:
MLApplication app = new MLApplication();
app.setApiKey(apiKey:"API_KEY");<strong> </strong>
Check required permissions
Code:
Future<void> checkPerms() async {
  final bool isCameraPermissionGranted =
      await MLPermissionClient().hasCameraPermission();
  if (!isCameraPermissionGranted) {
    final bool res = await MLPermissionClient()
        .requestPermission([MLPermission.camera, MLPermission.storage]);
  }
}
Select image and capture text from image
Code:
Future getImage() async {
  final pickedFile = await picker.getImage(source: ImageSource.gallery);
  //final pickedFile = await picker.getImage(source: ImageSource.camera);
  setState(() {
    if (pickedFile != null) {
      File _image = File(pickedFile.path);
      print('Path :' + pickedFile.path);
      capturetext(pickedFile.path);
    } else {
      print('No image selected.');
    }
  });
}

Future<void> capturetext(String path) async {
  // Create an MLTextAnalyzer object.
  MLTextAnalyzer analyzer = new MLTextAnalyzer();
  // Create an MLTextAnalyzerSetting object to configure the recognition.
  MLTextAnalyzerSetting setting = new MLTextAnalyzerSetting();
  // Set the image to be recognized and other desired options.
  setting.path = path;
  setting.isRemote = true;
  setting.language = "en";
  // Call asyncAnalyzeFrame to recognize text asynchronously.
  MLText text = await analyzer.asyncAnalyzeFrame(setting);
  print(text.stringValue);
  setState(() {
    msg = text.stringValue;
  });
}
How to detect Language using ML kit?
Code:
Future<void> onClickDetect() async {
  // Create an MLLangDetector object.
  MLLangDetector detector = new MLLangDetector();
  // Create MLLangDetectorSetting to configure detection.
  MLLangDetectorSetting setting = new MLLangDetectorSetting();
  // Set source text and detection mode.
  setting.sourceText = text;
  setting.isRemote = true;
  // Get detection result with the highest confidence.
  String result = await detector.firstBestDetect(setting: setting);
  setState(() {
    text = setting.sourceText + ": " + result;
  });
}
How to translate Language using ML kit?
Code:
Future<void> onClickTranslate() async {
  // Create an MLLocalTranslator object.
  MLLocalTranslator translator = new MLLocalTranslator();
  // Create an MLTranslateSetting object to configure translation.
  MLTranslateSetting setting = new MLTranslateSetting();
  // Set the languages for model download.
  setting.sourceLangCode = "en";
  setting.targetLangCode = "hi";
  // Prepare the model and implement the translation.
  final isPrepared = await translator.prepareModel(setting: setting);
  if (isPrepared) {
    // Asynchronous translation.
    String result = await translator.asyncTranslate(sourceText: text);
    setState(() {
      text = result.toString();
    });
  }
  // Stop translator after the translation ends.
  bool result = await translator.stopTranslate();
}
Result
Tricks and Tips
Make sure that you have downloaded the latest plugin.
Make sure that the plugin path is updated in pubspec.yaml.
Make sure that the plugin is unzipped in the parent directory of the project.
Make sure that the agconnect-services.json file is added.
Make sure dependencies are added in the build files.
Run flutter pub get after adding dependencies.
Generate the SHA-256 certificate fingerprint in Android Studio and configure it in AppGallery Connect.
Conclusion
In this article, we have learned how to integrate the capabilities of Huawei ML Kit in a Flutter application. In a similar way, you can use other Huawei ML Kit capabilities in your application as required.
Thank you so much for reading. I hope this article helps you understand the Huawei ML Kit capabilities in Flutter.
Reference
MLkit
Flutter plugin
Original Source
useful sharing
