A Novice Journey of Integrating Site Kit into Ionic Application - Huawei Developers

Introduction
HMS Site Kit allows us to search for places and returns a variety of information about them, such as the address, location, and phone number. We can search for places either by proximity or by a text string. Our app can thus provide users with convenient and secure access to diverse, place-related services, and therefore acquire more users.
HMS Site Kit Services
1. Place Search
2. Nearby Place Search
3. Place Details
4. Place Search Suggestion
5. Widget Search
In this article we will focus on Place Search, Nearby Place Search, Place Search Suggestion and Widget Search provided by HMS Site Kit.
Ionic Framework
Ionic Framework is an open source UI toolkit for building performant, high-quality mobile and desktop apps using web technologies such as HTML, CSS, and JavaScript with integrations for popular frameworks like Angular and React.
Think of Ionic as the front-end UI framework that handles all of the look and feel and UI interactions your app needs in order to be compelling. Unlike a responsive framework, Ionic comes with very native-styled mobile UI elements and layouts that you’d get with a native SDK on Android or iOS but didn’t really exist before on the web.
Since Ionic is an HTML5 framework, it needs a native wrapper like Cordova or Capacitor in order to run as a native app.
Here we will use Ionic framework with Angular and Capacitor as native wrapper.
Demo
Prerequisite
1) Must have a Huawei Developer Account.
2) Must have a Huawei phone with HMS 4.0.0.300 or later.
3) Must have a laptop or desktop with Android Studio, JDK 1.8, SDK Platform 26, and Gradle 4.6 installed.
4) Must have Node.js installed on the system.
5) Must install the Ionic CLI on the system using the command below:
Code:
npm install -g @ionic/cli
Things that need to be done
1) Generate a signing certificate fingerprint. For generating the SHA key, refer to this article.
2) Create an app in Huawei AppGallery Connect and enable Site Kit in the Manage APIs section.
3) Provide the SHA key in the App Information section.
4) Provide the data storage location.
5) Download the agconnect-services.json file and store it somewhere on your computer.
6) Create a blank Ionic project using the command below:
Code:
ionic start Your_Application_Name blank --type=angular
7) Download the Cordova Site Kit Plugin. Run the following command in the root directory of your Ionic project to install it through npm.
Code:
npm install <CORDOVA_SITE_KIT_PLUGIN_PATH>
8) If you want full Ionic support with code completion etc., install @ionic-native/core in your project.
Code:
npm install @ionic-native/core --save-dev
9) Run the following command to copy the "ionic/dist/hms-site" folder from the plugin library to the "node_modules/@ionic-native" folder under your Ionic project.
Code:
cp node_modules/@hmscore/cordova-plugin-hms-site/ionic/dist/hms-site node_modules/@ionic-native/ -r
10) Run the following command to compile the project:
Code:
ionic build
npx cap init [appName] [appId]
Where appName is the name of your app, and appId is package_name in your agconnect-services.json file (example: com.example.app).
11) Run the following command to add android platform to your project:
Code:
ionic capacitor add android
12) Make sure your project has a build.gradle file with a maven repository address and agconnect service dependencies as shown below:
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        classpath 'com.android.tools.build:gradle:3.6.1'
        classpath 'com.google.gms:google-services:4.3.3'
        classpath 'com.huawei.agconnect:agcp:1.3.1.300'
        // NOTE: Do not place your application dependencies here; they belong
        // in the individual module build.gradle files
    }
}

apply from: "variables.gradle"

allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
13) Add the signing certificate configuration to the build.gradle file in the app directory as shown below:
Code:
android {
    compileSdkVersion rootProject.ext.compileSdkVersion
    defaultConfig {
        applicationId "YOUR_PACKAGE_NAME"
        minSdkVersion rootProject.ext.minSdkVersion
        targetSdkVersion rootProject.ext.targetSdkVersion
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }
    signingConfigs {
        release {
            storeFile file("mykeystore.jks") // Signing certificate.
            storePassword "XXXXXX" // KeyStore password.
            keyAlias "XXXXXX" // Alias.
            keyPassword "XXXXXX" // Key password.
            v1SigningEnabled true
            v2SigningEnabled true
        }
    }
    buildTypes {
        release {
            signingConfig signingConfigs.release
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
        }
        debug {
            signingConfig signingConfigs.release
            debuggable true
        }
    }
}
14) Add the plugins to the build.gradle file in the app directory as shown below:
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
15) Add the Site Kit implementation to the dependencies section of the build.gradle file in the app directory as shown below:
Code:
dependencies {
    …
    implementation 'com.huawei.hms:site:5.0.0.300'
}
16) Add the agconnect-services.json file and the signing certificate .jks file to the app directory of your Android project as shown below:
17) To update dependencies and copy any web assets to your project, run the following command:
Code:
npx capacitor sync
Let's Code
Avoid No Provider Error
To avoid the No Provider error, we need to add HmsSite to the providers array in app.module.ts and also make sure to import it, as shown below:
Code:
import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { RouteReuseStrategy } from '@angular/router';
import { IonicModule, IonicRouteStrategy } from '@ionic/angular';
import { SplashScreen } from '@ionic-native/splash-screen/ngx';
import { StatusBar } from '@ionic-native/status-bar/ngx';
import { AppComponent } from './app.component';
import { AppRoutingModule } from './app-routing.module';
import { HmsSite } from '@ionic-native/hms-site/ngx';

@NgModule({
  declarations: [AppComponent],
  entryComponents: [],
  imports: [BrowserModule, IonicModule.forRoot(), AppRoutingModule],
  providers: [
    StatusBar,
    SplashScreen,
    HmsSite,
    { provide: RouteReuseStrategy, useClass: IonicRouteStrategy }
  ],
  bootstrap: [AppComponent]
})
export class AppModule {}
Create page in Ionic
Run the following command to create a page in the Ionic project:
Code:
ionic generate page page-name
Import
Import the following in your home.page.ts:
Code:
import { Component, IterableDiffers } from '@angular/core';
import { HmsSite } from '@ionic-native/hms-site/ngx';
Initializing the service
First, we need to initialize the service. To do that, we call the initializeService method to initialize the HMS Site Kit service using our project API key. The API key can be found in our agconnect-services.json file.
Code:
constructor(private hmssite: HmsSite) {
  var config = {
    apiKey: "YOUR_API_KEY"
  };
  hmssite.initializeService(config);
}
Place Search
It basically helps us extract information about a particular place, such as a tourist attraction, enterprise, or school, by specifying keywords, coordinates, and other information.
Code:
public async onTextSearch() {
  let textSearchReq = {
    query: this.textSearchPlaces,
    location: {
      lat: 20.5937,
      lng: 78.9629,
    },
    radius: 5000,
    poiType: this.hmssite.LocationType.ADDRESS,
    countryCode: "IN",
    language: "en",
    pageIndex: 1,
    pageSize: 5
  };
  try {
    const res = JSON.parse(JSON.stringify(await this.hmssite.textSearch(textSearchReq)));
    const sites = res.sites;
    this.data = JSON.stringify(sites);
    this.result.length = 0;
    for (let i = 0; i < sites.length; i++) {
      const text: any = {
        "NAME": JSON.stringify(sites[i].name),
        "ADDRESS": JSON.stringify(sites[i].address),
        "DISTANCE": JSON.stringify(sites[i].distance),
        "LOCATION": JSON.stringify(sites[i].location),
        "POIDETAILS": JSON.stringify(sites[i].poi),
        "RATING": JSON.stringify(sites[i].rating),
        "SITEID": JSON.stringify(sites[i].siteId)
      };
      this.result.push(text);
    }
  }
  catch (ex) {
    alert(JSON.stringify(ex));
  }
}
For more details, you can visit https://forums.developer.huawei.com/forumPortal/en/topic/0203394385930860028&channelname=HuoDong59&ha_source=xda

Well explained. Can we restrict the area when we are searching data?

Thank you very much

Related

How to implement HMS Location Kit With Flutter?

For more information like this, you can visit the HUAWEI Developer Forum.
HMS Location Kit Flutter
Hi everyone! Today I will describe how we can use the HMS Location Kit Flutter plugin, and I have also prepared a demo project.
You can find it at the end of the page via the GitHub link.
What is the Location Kit?
HUAWEI Location Kit assists developers in enabling their apps to get quick and accurate user locations and expand global positioning capabilities by using GPS, Wi-Fi, and base station locations.
Fused location: Provides a set of simple and easy-to-use APIs for you to quickly obtain the device location based on the GPS, Wi-Fi, and base station location data.
Activity identification: Identifies user motion status through the acceleration sensor, cellular network information, and magnetometer, helping you adjust your app based on user behavior.
Geofence: Allows you to set an interested area through an API so that your app can receive a notification when a specified action (such as leaving, entering, or lingering in the area) occurs.
If you want more information about Location Kit:
Location Kit (developer.huawei.com)
Location Flutter Plugin
The Flutter Location Plugin provides adaption code used for the HUAWEI Location Kit to use in Flutter platform. HUAWEI Location Kit combines the GPS, Wi-Fi, and base station locations to help you quickly obtain precise user locations, build up global positioning capabilities, and reach a wide range of users around the globe.
Configuring Your Flutter Project
Registering as a Developer
Before you get started, you must register as a HUAWEI developer and complete identity verification on the HUAWEI Developers website. For details, please refer to Registration and Verification.
Creating an AppGallery Connect Project
Sign in to AppGallery Connect and click My apps.
Click the desired app name.
3. Click the Develop tab.
4. In the App information area, click agconnect-services.json to download the configuration file.
If you have made any changes, such as setting the data storage location and enabling or managing APIs, you need to download the latest agconnect-services.json file and use it to replace the existing file in the app directory.
5. Create a Flutter project if you have not created one.
6. Run the following command and make sure that no errors are reported.
[project_path]> flutter doctor
7. Copy the agconnect-services.json file to the android/app directory of your Flutter project.
8. Copy the generated signature file to the android/app directory of your Flutter project.
9. Verify that the agconnect-services.json file and signature file are successfully added to the android/app directory of your Flutter project.
10. Open the build.gradle file in the android directory of your Flutter project.
a. Go to buildscript, and configure the Maven repository address and AppGallery Connect plugin for the HMS Core SDK.
Code:
buildscript {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
    dependencies {
        /*
         * <Other dependencies>
         */
        classpath 'com.huawei.agconnect:agcp:1.2.1.301'
    }
}
b. Go to allprojects, and configure the Maven repository address for the HMS Core SDK.
Code:
allprojects {
    repositories {
        google()
        jcenter()
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
11. Open the build.gradle file in the android/app directory of your Flutter project.
a. Add build dependencies.
Code:
dependencies {
    implementation 'com.huawei.hms:location:4.0.3.301'
    /*
     * <Other dependencies>
     */
}
b. Add the apply plugin: ‘com.huawei.agconnect’ line after the apply plugin: ‘com.android.application’ line.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
apply from: "$flutterRoot/packages/flutter_tools/gradle/flutter.gradle"
c. Set minSdkVersion to 19 or higher in android > defaultConfig.
Code:
defaultConfig {
    applicationId "<package_name>"
    minSdkVersion 19
    /*
     * <Other configurations>
     */
}
The value of applicationId must match that of package_name in the agconnect-services.json file.
d. Configure the signature in android based on the signature file information.
Code:
android {
    /*
     * <Other configurations>
     */
    signingConfigs {
        release {
            storeFile file('<keystore_file>')
            storePassword '<keystore_password>'
            keyAlias '<key_alias>'
            keyPassword '<key_password>'
        }
    }
    buildTypes {
        debug {
            signingConfig signingConfigs.release
        }
        release {
            signingConfig signingConfigs.release
        }
    }
}
Replace <keystore_file>, <keystore_password>, <key_alias> and <key_password> with matching entries in your signature file. For details about the app signing procedure in Flutter, please refer to App Signing in Flutter.
Integrating the Plugin
There are two ways to integrate the plugin into your project: downloading the Location Kit Flutter plugin, or using the huawei_location plugin from pub.dev.
Download the Location Kit Flutter Plugin
Download the Location Kit Flutter Plugin and decompress it.
Open the pubspec.yaml file in your Flutter project and add the plugin dependency to the dependencies section.
Code:
dependencies:
  huawei_location:
    path: {library path}
Replace {library path} with the actual library path of the HUAWEI Location Kit Flutter plugin. For details, please refer to Using Packages.
3. Run the following command to update the package information:
Code:
[project_path]> flutter pub get
4. Run the following command or click the run icon on the toolbar to start the app:
Code:
[project_path]> flutter run
Using pub.dev HMS location_plugin
Add this to your package’s pubspec.yaml file:
Code:
dependencies:
  huawei_location: ^4.0.4+300
You can install packages from the command line with Flutter:
Code:
$ flutter pub get
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Permissions
First of all, we need permissions to access location and physical activity data.
Add the location permissions (for example, android.permission.ACCESS_FINE_LOCATION and android.permission.ACCESS_COARSE_LOCATION) to the android/app/src/main/AndroidManifest.xml file.
Create a PermissionHandler instance.
Code:
PermissionHandler permissionHandler;
Initialize it in initState():
Code:
permissionHandler = PermissionHandler();
What does the service provide?
Check Permissions (a sketch is shown after the request example below)
Request Permissions
Code:
void requestPermission() async {
  try {
    bool status = await permissionHandler.requestLocationPermission();
    setState(() {
      infoText = "Is permission granted $status";
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}
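For completeness, checking whether the location permission is already granted looks very similar. The sketch below assumes that PermissionHandler also exposes a hasLocationPermission() method, as described in the plugin documentation; verify the exact method name against your plugin version.
Code:
// Hedged sketch: check the location permission before requesting it.
// Assumption: PermissionHandler.hasLocationPermission() exists in this plugin version.
void checkPermission() async {
  try {
    bool hasPermission = await permissionHandler.hasLocationPermission();
    setState(() {
      infoText = "Is permission already granted $hasPermission";
    });
    if (!hasPermission) {
      // Fall back to the request flow shown above.
      requestPermission();
    }
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}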
Fused Location
Create a FusedLocationProviderClient instance using the init() method and use the instance to call location-related APIs.
Code:
FusedLocationProviderClient locationService;
Initialize it in initState():
Code:
locationService = FusedLocationProviderClient();
What does the service provide?
Last Location
Last Location With Address
Mock Location
Location Updates
Reference: Fused Location developer.huawei.com
To use mock location
To use the mock location function, go to Settings > System & updates > Developer options > Select mock location app and select the app for using the mock location function.
(If Developer options is unavailable, go to Settings > About phone and tap Build number for seven consecutive times. Then, Developer options will be displayed on System & updates.)
To use the mock location feature, first configure the mock location permission in the android/app/src/main/AndroidManifest.xml file.
Code:
<uses-permission
android:name="android.permission.ACCESS_MOCK_LOCATION"
tools:ignore="MockLocation,ProtectedPermissions" />
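Once the permission is configured and the app is selected as the mock location app, enabling mock mode and pushing a mock location from Dart could look like the sketch below. This is only a sketch: it assumes the Flutter plugin mirrors the native setMockMode and setMockLocation APIs of FusedLocationProviderClient and that Location can be constructed with named latitude/longitude parameters; check the plugin reference for the exact signatures.
Code:
// Hedged sketch: enable mock mode and set a mock location.
// Assumptions: setMockMode(bool) and setMockLocation(Location) are exposed by the plugin,
// and Location can be constructed with named latitude/longitude parameters.
void setMockedLocation() async {
  try {
    await locationService.setMockMode(true); // Enable mock mode first.
    Location mockLocation = Location(latitude: 48.8583, longitude: 2.2945);
    await locationService.setMockLocation(mockLocation);
    setState(() {
      infoText = "Mock location set";
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}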
Listen for Location Update Events
Listen to the onLocationData stream, which emits location update events.
StreamSubscription<Location> streamSubs;
Add this to initState():
Code:
streamSubs = locationService.onLocationData.listen((location) {
  print(location.toString());
});
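Listening to onLocationData alone does not start location delivery; updates also have to be requested (the Location Updates capability listed above). The sketch below assumes the plugin exposes requestLocationUpdates(LocationRequest) returning a request code, removeLocationUpdates(requestCode), and a LocationRequest class with interval and priority fields; verify these names against your plugin version.
Code:
// Hedged sketch: request periodic location updates that are then delivered
// through the onLocationData stream subscribed to above.
// Assumptions: requestLocationUpdates/removeLocationUpdates and
// LocationRequest.PRIORITY_HIGH_ACCURACY exist in this plugin version.
int requestCode;

void startLocationUpdates() async {
  try {
    LocationRequest request = LocationRequest()
      ..interval = 5000 // Update interval in milliseconds.
      ..priority = LocationRequest.PRIORITY_HIGH_ACCURACY;
    requestCode = await locationService.requestLocationUpdates(request);
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}

void stopLocationUpdates() async {
  // Stop updates using the request code returned above.
  await locationService.removeLocationUpdates(requestCode);
}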
Example method : getLastLocation()
Code:
void getLastLocation() async {
  setState(() {
    infoText = "";
  });
  try {
    Location location = await locationService.getLastLocation();
    setState(() {
      infoText = location.toString();
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}
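The Last Location With Address capability listed above can be used in much the same way. The sketch below assumes the plugin exposes getLastLocationWithAddress(LocationRequest) returning an HWLocation and that LocationRequest has a needAddress flag; verify these names against your plugin version.
Code:
// Hedged sketch: fetch the last known location together with reverse-geocoded address data.
// Assumptions: getLastLocationWithAddress(LocationRequest), HWLocation, and
// LocationRequest.needAddress exist in this plugin version.
void getLastLocationWithAddress() async {
  try {
    LocationRequest request = LocationRequest()
      ..needAddress = true; // Ask for address information in the result.
    HWLocation hwLocation =
        await locationService.getLastLocationWithAddress(request);
    setState(() {
      infoText = hwLocation.toString();
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}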
Activity Identification
Creating an Activity Identification Service Client
ActivityIdentificationService activityIdentificationService;
Initialize it in initState():
activityIdentificationService = ActivityIdentificationService();
What does the service provide?
Activity Conversion Updates
Activity Conversion Request
Activity Identification Updates
Activity Identification Request
Listen Activity Identification Event
You can use the onActivityIdentification stream to listen to and receive data from activity identification events, using the activityIdentificationService instance initialized above.
Code:
void onActivityIdentificationResponse(ActivityIdentificationResponse response) {
  for (ActivityIdentificationData data in response.activityIdentificationDatas) {
    setChange(data.identificationActivity, data.possibility);
    // data.identificationActivity contains the activity type, such as vehicle or bike.
    // data.possibility is the confidence that the user is performing the activity.
    // The confidence ranges from 0 to 100. A larger value indicates more reliable activity authenticity.
  }
}

void streamListen() {
  streamSubscription =
      activityIdentificationService.onActivityIdentification.listen(onActivityIdentificationResponse);
}
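Listening alone is not enough; identification updates also have to be requested (the Activity Identification Updates capability listed above). The sketch below assumes the plugin exposes createActivityIdentificationUpdates(detectionIntervalMillis) returning a request code and deleteActivityIdentificationUpdates(requestCode), mirroring the native API; check the plugin reference for the exact signatures.
Code:
// Hedged sketch: start activity identification updates so that the
// onActivityIdentification stream above starts receiving responses.
// Assumptions: createActivityIdentificationUpdates(int) and
// deleteActivityIdentificationUpdates(int) exist in this plugin version.
int activityRequestCode;

void startActivityUpdates() async {
  try {
    // Request identification results roughly every 5 seconds.
    activityRequestCode = await activityIdentificationService
        .createActivityIdentificationUpdates(5000);
  } catch (e) {
    print(e.toString());
  }
}

void stopActivityUpdates() async {
  await activityIdentificationService
      .deleteActivityIdentificationUpdates(activityRequestCode);
}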
For full content, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201314069177450189&fid=0101187876626530001

Intermediate: Integration of landmark recognition feature in tourism apps (ML Kit-React Native)

Overview
Have you ever gone through your vacation photos and asked yourself: What is the name of this place I visited in India? Who created this monument I saw in France? Landmark recognition can help! This technology can predict landmark labels directly from image pixels, to help people better understand and organize their photo collections.
Landmark recognition can be used in tourism scenarios. The landmark recognition service enables you to obtain the landmark name, landmark longitude and latitude, and the confidence of the input image. A higher confidence indicates that the landmark in the input image is more likely to be recognized. Based on the recognized information, you can create a more personalized app experience for users.
In this article, I will show how a user can get landmark information using the ML Kit plugin.
We will integrate this service into a travel app so that images taken by users are analyzed by the ML plugin to return the landmark name and address, and the app can provide a brief introduction and tour suggestions based on the returned information.
Create Project in Huawei Developer Console
Before you start developing an app, configure app information in App Gallery Connect.
Register as a Developer
Before you get started, you must register as a Huawei developer and complete identity verification on HUAWEI Developers. For details, refer to Registration and Verification.
Create an App
Follow the instructions in Creating an AppGallery Connect Project and Adding an App to the Project to create an app. Set the data storage location to Germany.
React Native setup
Requirements
Huawei phone with HMS 4.0.0.300 or later.
React Native environment with Android Studio, NodeJs and Visual Studio code.
Dependencies
Gradle Version: 6.3
Gradle Plugin Version: 3.5.2
React-native-hms-ml gradle dependency
React Native CLI: 2.0.1
1. For environment setup, refer to the link below.
Setting up the development environment · React Native
This page will help you install and build your first React Native app.
reactnative.dev
2. Create a project by using this command.
Code:
react-native init <project_name>
3. You can install the React Native command line interface from npm, using the install -g react-native-cli command as shown below.
Code:
npm install -g react-native-cli
Generating a Signing Certificate Fingerprint
Signing certificate fingerprint is required to authenticate your app to Huawei Mobile Services. Make sure JDK is installed. To create one, navigate to JDK directory’s bin folder and open a terminal in this directory. Execute the following command:
Code:
keytool -genkey -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks -storepass <store_password> -alias <alias> -keypass <key_password> -keysize 2048 -keyalg RSA -validity 36500
This command creates the keystore file in application_project_dir/android/app
The next step is to obtain the SHA-256 key of the keystore file, which is needed for authenticating your app to Huawei services. To obtain it, enter the following command in the terminal:
Code:
keytool -list -v -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks
After authentication, the SHA-256 key will be displayed as shown below.
Adding the SHA-256 Key to the Huawei project in AppGallery
Copy the SHA-256 key and visit AppGallery Connect > <your_ML_project> > General Information. Paste it into the SHA-256 certificate fingerprint field.
Enable ML Kit from Manage APIs.
Download the agconnect-services.json file from AppGallery Connect and place it in the android/app directory of your React Native project.
Follow the steps to integrate the ML plugin to your React Native Application.
Integrate the HMS-ML plugin
Code:
npm i @hmscore/react-native-hms-ml
Download the Plugin from the Download Link
Download ReactNative ML Plugin under node_modules/@hmscore of your React Native project, as shown in the directory tree below:
Code:
project-dir
|_ node_modules
    |_ ...
    |_ @hmscore
        |_ ...
        |_ react-native-hms-ml
        |_ ...
    |_ ...
Navigate to the android/app/build.gradle file in your React Native project and follow these steps:
Add the AGC plugin dependency.
Code:
apply plugin: 'com.huawei.agconnect'
Add to dependencies in android/app/build.gradle:
Code:
implementation project(':react-native-hms-ml')
Navigate to the project-level android/build.gradle file in your React Native project and follow these steps:
Add to buildscript/repositories
Code:
maven { url 'https://developer.huawei.com/repo/' }
Add to buildscript/dependencies
Code:
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Navigate to android/settings.gradle and add the following:
Code:
include ':react-native-hms-ml'
project(':react-native-hms-ml').projectDir = new File(rootProject.projectDir, '../node_modules/@hmscore/react-native-hms-ml/android')
Use case
Huawei ML Kit's HMSLandmarkRecognition API can be integrated into different applications to return the landmark name and address, so that the app can provide a brief introduction and tour suggestions based on the returned information.
Add the following to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<application
<meta-data
android:name="com.huawei.hms.ml.DEPENDENCY"
android:value="dsc"/>
</ application>
Set API Key:
Before using HUAWEI ML in your app, set the API key first.
Copy the api_key value from your agconnect-services.json file.
Call setApiKey with the copied value.
JavaScript:
HMSApplication.setApiKey("api_key")
  .then((res) => { console.log(res); })
  .catch((err) => { console.log(err); });
Analyze Frame
HMSLandmarkRecognition.asyncAnalyzeFrame() recognizes landmarks in images asynchronously.
JavaScript:
async asyncAnalyzeFrame() {
  try {
    var result = await HMSLandmarkRecognition.asyncAnalyzeFrame(true, this.getFrameConfiguration(), this.getLandmarkAnalyzerSetting());
    console.log(result);
    if (result.status == HMSApplication.SUCCESS) {
      result.result.forEach(element => {
        this.state.landmark.push(element.landMark);
        this.state.possibility.push(element.possibility);
        this.state.url.push('https://en.wikipedia.org/wiki/' + element.landMark);
        let long = [];
        let lat = [];
        element.coordinates.forEach(ll => {
          long.push(ll.longitude);
          lat.push(ll.latitude);
        });
        this.state.coordinates.push(lat, long);
      });
      this.setState({
        landMark: this.state.landmark,
        possibility: this.state.possibility,
        coordinates: this.state.coordinates,
        url: this.state.url,
      });
    }
    else {
      ToastAndroid.showWithGravity(result.message, ToastAndroid.SHORT, ToastAndroid.CENTER);
    }
  } catch (e) {
    console.error(e);
  }
}
Final Code:
JavaScript:
import React from 'react';
import {
Text,
View,
TextInput,
ScrollView,
TouchableOpacity,
Image,
ToastAndroid,
SafeAreaView
} from 'react-native';
import { styles } from '@hmscore/react-native-hms-ml/example/src/Styles';
import { HMSLandmarkRecognition, HMSApplication } from '@hmscore/react-native-hms-ml';
import { showImagePicker } from '@hmscore/react-native-hms-ml/example/src/HmsOtherServices/Helper';
import { WebView } from 'react-native-webview';
export default class App extends React.Component {
componentDidMount() { }
componentWillUnmount() { }
constructor(props) {
super(props);
this.state = {
imageUri: '',
landmark: [],
coordinates: [],
possibility: [],
url:[]
};
}
getLandmarkAnalyzerSetting = () => {
return { largestNumOfReturns: 10, patternType: HMSLandmarkRecognition.STEADY_PATTERN };
}
getFrameConfiguration = () => {
return { filePath: this.state.imageUri };
}
async asyncAnalyzeFrame() {
try {
var result = await HMSLandmarkRecognition.asyncAnalyzeFrame(true, this.getFrameConfiguration(), this.getLandmarkAnalyzerSetting());
console.log(result);
if (result.status == HMSApplication.SUCCESS) {
result.result.forEach(element => {
this.state.landmark.push(element.landMark);
this.state.possibility.push(element.possibility);
this.state.url.push('https://en.wikipedia.org/wiki/'+element.landMark)
long = [];
lat = [];
element.coordinates.forEach(ll => {
long.push(ll.longitude);
lat.push(ll.latitude);
})
this.state.coordinates.push(lat, long);
});
this.setState({
landMark: this.state.landmark,
possibility: this.state.possibility,
coordinates: this.state.coordinates,
url:this.state.url,
});
}
else {
ToastAndroid.showWithGravity(result.message, ToastAndroid.SHORT, ToastAndroid.CENTER);
}
} catch (e) {
console.error(e);
}
}
startAnalyze() {
this.setState({
landmark: [],
possibility: [],
coordinates: [],
url:[],
})
this.asyncAnalyzeFrame();
}
render() {
console.log(this.state.url.toString());
return (
<ScrollView style={styles.bg}>
<View style={styles.containerCenter}>
<TouchableOpacity onPress={() => { showImagePicker().then((result) => this.setState({ imageUri: result })) }}>
<Image style={styles.imageSelectView} source={this.state.imageUri == '' ? require('@hmscore/react-native-hms-ml/example/assets/image.png') : { uri: this.state.imageUri }} />
</TouchableOpacity>
</View>
<Text style={styles.h1}>pick the image and explore the information about place</Text>
<View style={styles.basicButton}>
<TouchableOpacity
style={styles.startButton}
onPress={this.startAnalyze.bind(this)}
disabled={this.state.imageUri == '' ? true : false} >
<Text style={styles.startButtonLabel}> Check Place </Text>
</TouchableOpacity>
</View>
<Text style={{fontSize: 20}}> {this.state.landmark.toString()} </Text>
<View style={{flex: 1}}>
<WebView
source={{uri: this.state.url.toString()}}
style={{marginTop: 20,height:1500}}
javaScriptEnabled={true}
domStorageEnabled={true}
startInLoadingState={true}
scalesPageToFit={true}
/>
</View>
</ScrollView>
);
}
}
Run the application (Generating the Signed Apk):
1. Open the project directory path in the command prompt.
2. Navigate to the android directory and run the below command to generate the signed APK.
Code:
gradlew assembleRelease
Output:
Tips and Tricks:
Download the latest HMS React Native ML plugin.
Copy the api_key value from your agconnect-services.json file and set the API key.
Images in PNG, JPG, JPEG, and BMP formats are supported. GIF images are not supported.
For project cleaning, navigate to the android directory and run the below command.
Code:
gradlew clean
Conclusion:
In this article, we have learnt how to integrate ML Kit in a React Native project.
We integrated this service into a travel app, so that images taken by users are analyzed by the ML plugin to return landmark information, and the app can provide a brief introduction and tour suggestions to the user.
Reference
https://developer.huawei.com/consum...-Guides/landmark-recognition-0000001050726194

Intermediate: Site kit Search Capabilities in Food DeliveryApp in Flutter – Part 2

Introduction
In this article, we will be integrating other search features of Site Kit (you can find the previous article here). Huawei Site Kit provides core capabilities that help developers quickly build apps with which users can explore the world around them seamlessly. Site Kit provides the following search capabilities, as shown below.
Keyword search: returns a place list based on the keywords entered by the user.
Nearby place search: searches for nearby places based on the current location of the user's device.
Place detail search: searches for details about a place.
Place search suggestion: returns a list of suggested places.
Autocomplete: returns an autocompleted place and a list of suggested places based on the entered keyword.
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1. Create flutter project
Step 2. Add the App level gradle dependencies, choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Add root level gradle dependencies
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add the below permissions in Android Manifest file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="android.permission.ACCESS_COARES_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
Step 4: Add plugin path in pubspec.yaml file under dependencies.
Step 5: Create a project in AppGallery Connect, find here.
pubspec.yaml
Code:
name: sample_one
description: A new Flutter application.

# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev

version: 1.0.0+1

environment:
  sdk: ">=2.7.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter
  huawei_map:
    path: ../huawei_map/
  huawei_location:
    path: ../huawei_location/
  huawei_safetydetect:
    path: ../huawei_safetydetect
  huawei_site:
    path: ../huawei_site
  http: ^0.12.2
  rflutter_alert: ^2.0.2
  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^1.0.2
  # add this line to your dependencies
  toast: ^0.1.5

dev_dependencies:
  flutter_test:
    sdk: flutter

# For information on the generic Dart part of this file, see the
# following page: https://dart.dev/tools/pub/pubspec

# The following section is specific to Flutter.
flutter:
Declare and instantiate service object
Code:
Future<void> initmySearchService() async {
  searchService = await SearchService.create(Uri.encodeComponent(API_KEY));
}
How do I call the QueryAutocompleteRequest API?
Code:
void autocomplete(String value) async {
  // The SearchService object was declared and instantiated in initmySearchService() above.
  // Create a QueryAutocompleteRequest and its body.
  QueryAutocompleteRequest request = QueryAutocompleteRequest(query: value);
  // Call queryAutocomplete() and assign the result to a QueryAutocompleteResponse object.
  QueryAutocompleteResponse response =
      await searchService.queryAutocomplete(request);
  if (response != null) {
    Map<String, dynamic> data = json.decode(response.toJson());
    List<dynamic> data2;
    locations.clear();
    entries.clear();
    for (String key in data.keys) {
      if (key == 'sites') {
        data2 = data[key];
        for (var element in data2) {
          setState(() {
            entries.add(element['name'] + "\n" + element['formatAddress']);
            locations.add(new LatLng(
                element['location']['lat'], element['location']['lng']));
          });
        }
      }
    }
  }
}
How do I call the QuerySuggestionRequest API?
Code:
void querySuggestionSearch(String value) async {
  // The SearchService object was declared and instantiated in initmySearchService() above.
  QuerySuggestionRequest request = QuerySuggestionRequest();
  request.query = value;
  request.location = Coordinate(lat: 12.893478, lng: 77.334595);
  request.language = "en";
  request.countryCode = "IN";
  request.radius = 5000;
  // Call querySuggestion() and assign the result to a QuerySuggestionResponse object.
  QuerySuggestionResponse response =
      await searchService.querySuggestion(request);
  if (response != null) {
    Map<String, dynamic> data = json.decode(response.toJson());
    List<dynamic> data2;
    entries.clear();
    for (String key in data.keys) {
      if (key == 'sites') {
        data2 = data[key];
        for (var element in data2) {
          setState(() {
            entries.add(element['name'] + "\n" + element['formatAddress']);
            locations.add(new LatLng(
                element['location']['lat'], element['location']['lng']));
          });
        }
      }
    }
  }
}
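Keyword search, the first capability listed above, follows the same pattern using a TextSearchRequest. The sketch below assumes the huawei_site plugin exposes TextSearchRequest, searchService.textSearch(), and TextSearchResponse; verify these names against your plugin version.
Code:
// Hedged sketch: keyword (text) search for places.
// Assumptions: TextSearchRequest / textSearch() / TextSearchResponse exist in this plugin version.
void keywordSearch(String value) async {
  TextSearchRequest request = TextSearchRequest();
  request.query = value;
  request.location = Coordinate(lat: 12.893478, lng: 77.334595);
  request.language = "en";
  request.countryCode = "IN";
  request.radius = 5000;
  TextSearchResponse response = await searchService.textSearch(request);
  if (response != null) {
    Map<String, dynamic> data = json.decode(response.toJson());
    entries.clear();
    for (String key in data.keys) {
      if (key == 'sites') {
        for (var element in data[key]) {
          setState(() {
            entries.add(element['name'] + "\n" + element['formatAddress']);
          });
        }
      }
    }
  }
}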
How do I call the DetailSearchRequest API?
Code:
void placeDetailSearch(String siteId) async {
  // The SearchService object was declared and instantiated in initmySearchService() above.
  DetailSearchRequest request = DetailSearchRequest();
  request.siteId = siteId;
  request.language = "en";
  // Call detailSearch() and assign the result to a DetailSearchResponse object.
  DetailSearchResponse response = await searchService.detailSearch(request);
  if (response != null) {
    Map<String, dynamic> data = json.decode(response.toJson());
    setState(() {
      result = data['site'].toString();
    });
  } else {
    print("Response is NULL");
  }
}
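Nearby place search, also listed in the capabilities above, is almost identical. The sketch below assumes the huawei_site plugin exposes NearbySearchRequest with fields similar to QuerySuggestionRequest and a searchService.nearbySearch() method returning a NearbySearchResponse; verify these names against your plugin version.
Code:
// Hedged sketch: nearby place search around a fixed coordinate.
// Assumptions: NearbySearchRequest / nearbySearch() / NearbySearchResponse exist in this plugin version.
void nearbySearch(String value) async {
  NearbySearchRequest request = NearbySearchRequest();
  request.query = value;
  request.location = Coordinate(lat: 12.893478, lng: 77.334595);
  request.language = "en";
  request.radius = 5000;
  NearbySearchResponse response = await searchService.nearbySearch(request);
  if (response != null) {
    Map<String, dynamic> data = json.decode(response.toJson());
    entries.clear();
    for (String key in data.keys) {
      if (key == 'sites') {
        for (var element in data[key]) {
          setState(() {
            entries.add(element['name'] + "\n" + element['formatAddress']);
          });
        }
      }
    }
  }
}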
Result
Note: Place detail search takes a site ID as input and returns the site information as the result.
Tricks and Tips
Make sure that you have downloaded the latest plugin.
Make sure that the plugin path is updated in the pubspec.yaml file.
Make sure that the plugin is unzipped in the parent directory of the project.
Make sure that the agconnect-services.json file is added.
Make sure dependencies are added in the build file.
Run flutter pub get after adding dependencies.
Generate the SHA-256 certificate fingerprint in Android Studio and configure it in AppGallery Connect.
Conclusion
In this article, we have learnt how to integrate the Huawei Site Kit search capabilities into a delivery app in Flutter, where the user can search for a specific hotel or restaurant in the search box and click on a result to find the list of orders. In a similar way, you can use Huawei Site Kit as per your requirements in your application.
Thank you so much for reading. I hope this article helps you understand the Huawei Site Kit search capabilities in Flutter.
Reference
Site kit
Site kit plugin
Original Source

Intermediate: Text Recognition, Language detection and Language translation using Huawei ML Kit in Flutter (Cross platform)

Introduction
In this article, we will be learning how to integrate Huawei ML Kit in a Flutter application. The Flutter ML plugin allows your apps to easily leverage Huawei's long-term proven expertise in machine learning to support diverse artificial intelligence (AI) applications. The ML plugin provides diversified leading machine learning capabilities that are easy to use, helping you develop various AI apps.
List of APIs the ML plugin provides
Text-related services
Language-related services
Image-related services
Face/body-related services
Natural language processing
Custom model
In this article, we will be integrating some of the specific APIs related to text-related services and language-related services in a Flutter application.
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1. Create flutter project.
Step 2. Add the App level gradle dependencies, choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300'
Add root level gradle dependencies.
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add the below permissions in Android Manifest file.
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
Step 4: Add plugin path in pubspec.yaml file under dependencies.
Step 5: Create a project in AppGallery Connect, find here.
pubspec.yaml
Code:
name: flutterdrivedemo123
description: A new Flutter project.

# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev

# https://developer.apple.com/library/archive/documentation/General/Reference/InfoPlistKeyReference/Articles/CoreFoundationKeys.html
version: 1.0.0+1

environment:
  sdk: ">=2.12.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter
  huawei_account:
    path: ../huawei_account
  huawei_drive:
    path: ../huawei_drive
  huawei_ml:
    path: ../huawei_ml
  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^1.0.2
  image_picker: ^0.8.0

dev_dependencies:
  flutter_test:
    sdk: flutter

# For information on the generic Dart part of this file, see the
# following page: https://dart.dev/tools/pub/pubspec

# The following section is specific to Flutter.
flutter:
Initialize MLApplication
Code:
MLApplication app = new MLApplication();
app.setApiKey(apiKey: "API_KEY");
Check required permissions
Code:
Future<void> checkPerms() async {
  final bool isCameraPermissionGranted =
      await MLPermissionClient().hasCameraPermission();
  if (!isCameraPermissionGranted) {
    final bool res = await MLPermissionClient()
        .requestPermission([MLPermission.camera, MLPermission.storage]);
  }
}
Select image and capture text from image
Code:
Future getImage() async {
  final pickedFile = await picker.getImage(source: ImageSource.gallery);
  // final pickedFile = await picker.getImage(source: ImageSource.camera);
  setState(() {
    if (pickedFile != null) {
      File _image = File(pickedFile.path);
      print('Path :' + pickedFile.path);
      capturetext(pickedFile.path);
    } else {
      print('No image selected.');
    }
  });
}

Future<void> capturetext(String path) async {
  // Create an MLTextAnalyzer object.
  MLTextAnalyzer analyzer = new MLTextAnalyzer();
  // Create an MLTextAnalyzerSetting object to configure the recognition.
  MLTextAnalyzerSetting setting = new MLTextAnalyzerSetting();
  // Set the image to be recognized and other desired options.
  setting.path = path;
  setting.isRemote = true;
  setting.language = "en";
  // Call asyncAnalyzeFrame to recognize text asynchronously.
  MLText text = await analyzer.asyncAnalyzeFrame(setting);
  print(text.stringValue);
  setState(() {
    msg = text.stringValue;
  });
}
How to detect Language using ML kit?
Code:
Future<void> onClickDetect() async {
  // Create an MLLangDetector object.
  MLLangDetector detector = new MLLangDetector();
  // Create MLLangDetectorSetting to configure detection.
  MLLangDetectorSetting setting = new MLLangDetectorSetting();
  // Set source text and detection mode.
  setting.sourceText = text;
  setting.isRemote = true;
  // Get the detection result with the highest confidence.
  String result = await detector.firstBestDetect(setting: setting);
  setState(() {
    text = setting.sourceText + ": " + result;
  });
}
How to translate Language using ML kit?
Code:
Future<void> onClickTranslate() async {
  // Create an MLLocalTranslator object.
  MLLocalTranslator translator = new MLLocalTranslator();
  // Create an MLTranslateSetting object to configure translation.
  MLTranslateSetting setting = new MLTranslateSetting();
  // Set the languages for model download.
  setting.sourceLangCode = "en";
  setting.targetLangCode = "hi";
  // Prepare the model and implement the translation.
  final isPrepared = await translator.prepareModel(setting: setting);
  if (isPrepared) {
    // Asynchronous translation.
    String result = await translator.asyncTranslate(sourceText: text);
    setState(() {
      text = result.toString();
    });
  }
  // Stop the translator after the translation ends.
  bool stopped = await translator.stopTranslate();
}
Result
Tricks and Tips
Make sure that you have downloaded the latest plugin.
Make sure that the plugin path is updated in the pubspec.yaml file.
Make sure that the plugin is unzipped in the parent directory of the project.
Make sure that the agconnect-services.json file is added.
Make sure dependencies are added in the build file.
Run flutter pub get after adding dependencies.
Generate the SHA-256 certificate fingerprint in Android Studio and configure it in AppGallery Connect.
Conclusion
In this article, we have learnt how to integrate the capabilities of Huawei ML Kit in a Flutter application. In a similar way, you can use Huawei ML Kit as per your requirements in your application.
Thank you so much for reading. I hope this article helps you understand the Huawei ML Kit capabilities in Flutter.
Reference
MLkit
Flutter plugin
Original Source
useful sharing

Intermediate: Integration of Huawei Game Services in Flutter (Cross platform)

Introduction
In this article, we will be integrating the Huawei Game Service kit in a Flutter application. You will have access to a range of development capabilities. You can promote your game quickly and more efficiently to Huawei's vast user base, as Huawei Game Service allows users to log in to the game using their Huawei IDs. You can also use the service to quickly implement achievements, game events, and game addiction prevention functions, and perform in-depth game operations based on user and content localization.
Huawei Game Services Capabilities
Game Login
Achievements
Floating window*
Game Addiction prevention*
Events
Leaderboards
Save Games*
Player statistics*
Access to Basic Game Information*
Note: Restricted to regions (*)
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable) running HMS Core (APK) 4.x or later, which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1. Create flutter project.
Step 2. Add the App level gradle dependencies, choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Add root level gradle dependencies.
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.2.301'
Step 3: Add the below permissions in Android Manifest file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
Step 4: Add plugin path in pubspec.yaml file under dependencies.
Step 5: Create a project in AppGallery Connect, find here.
pubspec.yaml
Code:
name: gameservice234demo
description: A new Flutter project.

# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev

# https://developer.apple.com/library/archive/documentation/General/Reference/InfoPlistKeyReference/Articles/CoreFoundationKeys.html
version: 1.0.0+1

environment:
  sdk: ">=2.12.0 <3.0.0"

dependencies:
  flutter:
    sdk: flutter
  huawei_account:
    path: ../huawei_account
  huawei_gameservice:
    path: ../huawei_gameservice
  # The following adds the Cupertino Icons font to your application.
  # Use with the CupertinoIcons class for iOS style icons.
  cupertino_icons: ^1.0.2

dev_dependencies:
  flutter_test:
    sdk: flutter

# For information on the generic Dart part of this file, see the
# following page: https://dart.dev/tools/pub/pubspec

# The following section is specific to Flutter.
flutter:
  # The following line ensures that the Material Icons font is
  # included with your application, so that you can use the icons in
  # the material Icons class.
  uses-material-design: true
How do I launch or initialize the game?
Code:
void init() async {
  await JosAppsClient.init();
}
Use Huawei ID for login
Code:
Future<void> login() async {
  helper = new HmsAuthParamHelper()
    ..setIdToken()
    ..setAccessToken()
    ..setAuthorizationCode()
    ..setEmail()
    ..setProfile();
  huaweiId = await HmsAuthService.signIn(authParamHelper: helper);
  if (huaweiId != null) {
    setState(() {
      isLoggedIn = true;
      msg = huaweiId!.displayName;
      loginLabel = "Logged in as";
      print(msg);
    });
    getPlayer();
  } else {
    setState(() {
      msg = " Inside else ";
    });
  }
}
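The login flow above calls a getPlayer() helper that is not shown here. A minimal sketch of what it might look like is given below, assuming the huawei_gameservice plugin exposes a PlayersClient with a getCurrentPlayer() method returning a Player (mirroring the native PlayersClient API); verify the exact class and method names against your plugin version.
Code:
// Hedged sketch: fetch the current player's information after login.
// Assumption: PlayersClient.getCurrentPlayer() returning a Player exists in this plugin version.
Future<void> getPlayer() async {
  try {
    Player player = await PlayersClient.getCurrentPlayer();
    setState(() {
      msg = "Player: " + player.displayName;
    });
  } on PlatformException catch (e) {
    print("Error on getCurrentPlayer API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
  }
}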
How do I get Achievements list?
Code:
Future<void> getAchievements() async {
  try {
    List<Achievement> result =
        await AchievementClient.getAchievementList(true);
    print("Achievement:" + result.toString());
  } on PlatformException catch (e) {
    print("Error on getAchievementList API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
  }
}
How do I display the Achievements list page of HUAWEI AppAssistant using an intent?
Code:
void showAchievementsIntent() {
  try {
    AchievementClient.showAchievementListIntent();
  } on PlatformException catch (e) {
    print("Error on showAchievementListIntent API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
  }
}
How do I call Floating window?
Code:
try {
  await BuoyClient.showFloatWindow();
} on PlatformException catch (e) {
  print("Error on showFloatWindow API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
}
How do I get All Events?
Code:
Future<void> getEvents() async {
  try {
    List<GameEvent> result = await EventsClient.getEventList(true);
    print("Events: " + result.toString());
  } on PlatformException catch (e) {
    print("Error on getEventList API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
  }
}
How do I submit an Event?
Code:
Future<void> sendEvent(String eventId) async {
  try {
    await EventsClient.grow(eventId, 1);
    print("************** Event sent **************");
  } on PlatformException catch (e) {
    print("Error on grow API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
  }
}
How do I get All Leaderboard data?
Code:
Future<void> getLeaderboardList() async {
  // Check the leaderboard status.
  int result = await RankingClient.getRankingSwitchStatus();
  // Set the leaderboard status.
  int result2 = await RankingClient.setRankingSwitchStatus(1);
  List<Ranking> rankings = await RankingClient.getAllRankingSummaries(true);
  print(rankings);
}
How do I submit the ranking score?
Code:
try {
  int score = 102;
  RankingClient.submitRankingScores(rankingId, score);
} on PlatformException catch (e) {
  print("Error on submitRankingScores API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
}
Or
try {
  int score = 125;
  ScoreSubmissionInfo result = await RankingClient.submitScoreWithResult(rankingId, score);
} on PlatformException catch (e) {
  print("Error on submitScoreWithResult API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
}
How do I display the Leaderboard list page of HUAWEI AppAssistant using an intent?
Code:
void showLeaderboardIntent() {
  try {
    RankingClient.showTotalRankingsIntent();
  } on PlatformException catch (e) {
    print("Error on showLeaderboardListIntent API, Error: ${e.code}, Error Description: ${GameServiceResultCodes.getStatusCodeMessage(e.code)}");
  }
}
Result
Tricks and Tips
Make sure that you have downloaded the latest plugin.
Make sure that the plugin path is updated in the pubspec.yaml file.
Make sure that the plugin is unzipped in the parent directory of the project.
Make sure that the agconnect-services.json file is added.
Make sure dependencies are added in the build file.
Run flutter pub get after adding dependencies.
Generate the SHA-256 certificate fingerprint in Android Studio and configure it in AppGallery Connect.
Conclusion
In this article, we have learnt how to integrate the capabilities of the Huawei Game Service kit in a Flutter application. You can promote your game quickly and more efficiently to Huawei's vast user base, as Huawei Game Service allows users to log in to the game using their Huawei IDs, and you can implement achievements and the other capabilities above in your application. In a similar way, you can use Huawei Game Service as per your requirements in your application.
Thank you so much for reading. I hope this article helps you understand the Huawei Game Service capabilities in Flutter.
Reference
GameServices Kit
Flutter Plugin
Original Source
Useful sharing, thanks
