Migrate from Google Map to HMS Map Kit - Huawei Developers

1 Overview
This tutorial helps you migrate from the Google Map SDK for Android to the HMS Core Map SDK for Android in your Android app. It starts with basic environment settings and covers some basic use cases, such as creating a map and adding a marker. At the bottom of the page is a summary with links to an automated tool that helps you complete the migration.
2 Preparations
Before integrating HMS Core Map Kit, complete the following preparations:
Create an app in AppGallery Connect.
Create an Android Studio project.
Generate a signing certificate and a signing certificate fingerprint.
Configure the signing certificate fingerprint in AppGallery Connect.
Integrate the HMS SDK.
Configure a project signature.
For details, please refer to Preparations for Integrating HUAWEI HMS Core.
3 Creating a Map
This section describes how to create a map using Google Map SDK and HMS Core Map Kit.
3.1 Google Map SDK
1. Add build dependencies to the build.gradle file of your app.
Code:
implementation 'com.google.android.gms:play-services-maps:15.0.0'
2. Add a fragment to the activity layout file.
Code:
<fragment
android:id="@+id/mapFragment"
android:name="com.google.android.gms.maps.MapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"/>
3. Add a Google Android API key in the application section of the AndroidManifest.xml file.
Code:
<meta-data
android:name="com.google.android.geo.API_KEY"
android:value="YOUR_GOOGLE_KEY_GOES_HERE"/>
3.2 Map Kit
1. Add build dependencies to the build.gradle file of your app.
Code:
implementation 'com.huawei.hms:maps:4.0.1.301'
2. Add a fragment to the activity layout file.
Code:
<fragment
xmlns:map="http://schemas.android.com/apk/res-auto"
android:id="@+id/mapfragment_mapfragmentdemo"
class="com.huawei.hms.maps.MapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
map:cameraTargetLat="48"
map:cameraTargetLng="118"/>
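Note that Map Kit does not require a manifest meta-data key the way the Google Map SDK does: it normally reads its credentials from the agconnect-services.json file generated in AppGallery Connect. As a sketch, assuming an API key created in AppGallery Connect, the key can also be set in code before the map loads:
Code:
// Only needed if agconnect-services.json is not bundled with the app.
// "YOUR_HMS_API_KEY" is a placeholder for the key from AppGallery Connect.
MapsInitializer.setApiKey("YOUR_HMS_API_KEY");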
4 Adding a Marker
This section describes how to add a marker using Google Map SDK and HMS Core Map Kit.
4.1 Google Map SDK
1. Define a GoogleMap object and a MapFragment object.
Code:
private GoogleMap gMap;
private MapFragment mapFragment;
2. Implement the OnMapReadyCallback API and override the onMapReady method.
Code:
@Override
public void onMapReady(GoogleMap googleMap) {
gMap = googleMap;
LatLng amsterdam = new LatLng(52.37, 4.90);
gMap.addMarker(new MarkerOptions().position(amsterdam).title("Amsterdam"));
gMap.animateCamera(CameraUpdateFactory.newLatLngZoom(amsterdam, 8));
}
3. Load MapFragment in the onCreate method of the activity and call getMapAsync().
Code:
mapFragment = (MapFragment)getFragmentManager().findFragmentById(R.id.mapFragment);
mapFragment.getMapAsync(this);
4.2 Map Kit
1. Define a HuaweiMap object and a MapFragment object.
Code:
private HuaweiMap hMap;
private MapFragment mMapFragment;
2. Implement the OnMapReadyCallback API and override the onMapReady method.
Code:
@Override
public void onMapReady(HuaweiMap map) {
// After getMapAsync() completes, the HuaweiMap instance is returned in this callback.
hMap = map;
hMap.addMarker(new MarkerOptions().position(new LatLng(31.2304, 121.4737)).icon(BitmapDescriptorFactory.fromResource(R.drawable.ic_launcher_background)));
hMap.moveCamera(CameraUpdateFactory.newLatLngZoom(new LatLng(31.2304, 121.4737), 10));
}
3. Load MapFragment in the onCreate method of the activity and call getMapAsync().
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_map_fragment_demo);
mMapFragment = (MapFragment) getFragmentManager().findFragmentById(R.id.mapfragment_mapfragmentdemo);
mMapFragment.getMapAsync(this);
}
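For parity with the Google example, the Map Kit MarkerOptions builder also supports a title; a minimal sketch, reusing the coordinates from the example above:
Code:
hMap.addMarker(new MarkerOptions()
        .position(new LatLng(31.2304, 121.4737))
        .title("Shanghai"));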
5 Summary
This tutorial has covered the differences between the Google Map SDK and HMS Core Map Kit. HMS ToolKit, an automated tool, is also provided to help you migrate from the Google Map SDK to HMS Core Map Kit or add HMS Core Map Kit to your app efficiently. For details, visit the following links:
HMS ToolKit - Overview
Map Kit Manual Conversion Guide

Does it support Snap to Road?

Interesting post! Thanks

Can we show directions on the map?

Does Map Kit support custom markers?

Related

How to implement HMS Location Kit With Flutter?

For more information like this, you can visit the HUAWEI Developer Forum.
HMS Location Kit Flutter
Hi everyone! Today I will describe how we can use the HMS Location Kit Flutter plugin; I have also prepared a demo project.
You can find it at the end of the page via the GitHub link.
What is the Location Kit?
HUAWEI Location Kit assists developers in enabling their apps to get quick and accurate user locations and expand global positioning capabilities by using GPS, Wi-Fi, and base station locations.
Fused location: Provides a set of simple and easy-to-use APIs for you to quickly obtain the device location based on the GPS, Wi-Fi, and base station location data.
Activity identification: Identifies user motion status through the acceleration sensor, cellular network information, and magnetometer, helping you adjust your app based on user behavior.
Geofence: Allows you to set an area of interest through an API so that your app can receive a notification when a specified action (such as leaving, entering, or lingering in the area) occurs.
If you want more information about Location Kit:
Location Kit
Location Flutter Plugin
The Flutter Location plugin provides adaptation code for using HUAWEI Location Kit on the Flutter platform. HUAWEI Location Kit combines GPS, Wi-Fi, and base station locations to help you quickly obtain precise user locations, build up global positioning capabilities, and reach a wide range of users around the globe.
Configuring Your Flutter Project
Registering as a Developer
Before you get started, you must register as a HUAWEI developer and complete identity verification on the HUAWEI Developers website. For details, please refer to Registration and Verification.
Creating an AppGallery Connect Project
1. Sign in to AppGallery Connect and click My apps.
2. Click the desired app name.
3. Click the Develop tab.
4. In the App information area, click agconnect-services.json to download the configuration file.
If you have made any changes, such as setting the data storage location and enabling or managing APIs, you need to download the latest agconnect-services.json file and use it to replace the existing file in the app directory.
5. Create a Flutter project if you have not created one.
6. Run the following command and make sure that no errors are reported.
[project_path]> flutter doctor
7. Copy the agconnect-services.json file to the android/app directory of your Flutter project.
8. Copy the generated signature file to the android/app directory of your Flutter project.
9. Verify that the agconnect-services.json file and signature file are successfully added to the android/app directory of your Flutter project.
10. Open the build.gradle file in the android directory of your Flutter project.
a. Go to buildscript, and configure the Maven repository address and AppGallery Connect plugin for the HMS Core SDK.
Code:
buildscript {
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
dependencies {
/*
* <Other dependencies>
*/
classpath 'com.huawei.agconnect:agcp:1.2.1.301'
}
}
b. Go to allprojects, and configure the Maven repository address for the HMS Core SDK.
Code:
allprojects {
repositories {
google()
jcenter()
maven { url 'https://developer.huawei.com/repo/' }
}
}
11. Open the build.gradle file in the android/app directory of your Flutter project.
a. Add build dependencies.
Code:
dependencies {
implementation 'com.huawei.hms:location:4.0.3.301'
/*
* <Other dependencies>
*/
}
b. Add the apply plugin: ‘com.huawei.agconnect’ line after the apply plugin: ‘com.android.application’ line.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
apply from: "$flutterRoot/packages/flutter_tools/gradle/flutter.gradle"
c. Set minSdkVersion to 19 or higher in android > defaultConfig.
Code:
defaultConfig {
applicationId "<package_name>"
minSdkVersion 19
/*
* <Other configurations>
*/
}
The value of applicationId must match that of package_name in the agconnect-services.json file.
d. Configure the signature in android based on the signature file information.
Code:
android {
/*
* <Other configurations>
*/
signingConfigs {
release {
storeFile file('<keystore_file>')
storePassword '<keystore_password>'
keyAlias '<key_alias>'
keyPassword '<key_password>'
}
}
buildTypes {
debug {
signingConfig signingConfigs.release
}
release {
signingConfig signingConfigs.release
}
}
}
Replace <keystore_file>, <keystore_password>, <key_alias> and <key_password> with matching entries in your signature file. For details about the app signing procedure in Flutter, please refer to App Signing in Flutter.
Integrating the Plugin
There are two ways to integrate the plugin into your project: downloading the Location Kit Flutter plugin manually, or using the huawei_location package on pub.dev.
Download the Location Kit Flutter Plugin
1. Download the Location Kit Flutter plugin and decompress it.
2. Open the pubspec.yaml file in your Flutter project and add the plugin dependency to the dependencies section.
Code:
dependencies:
huawei_location:
path: {library path}
Replace {library path} with the actual library path of the HUAWEI Location Kit Flutter plugin. For details, please refer to Using Packages.
3. Run the following command to update the package information:
Code:
[project_path]> flutter pub get
4. Run the following command or click the run icon on the toolbar to start the app:
Code:
[project_path]> flutter run
Using pub.dev HMS location_plugin
Add this to your package’s pubspec.yaml file:
Code:
dependencies:
huawei_location: ^4.0.4+300
You can install packages from the command line with Flutter:
Code:
$ flutter pub get
Alternatively, your editor might support flutter pub get. Check the docs for your editor to learn more.
Permissions
First of all, we need permissions to access location and physical activity data.
Add the location permissions to the manifest file. Define these permissions in android/app/src/main/AndroidManifest.xml as follows:
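A minimal sketch, assuming the standard Android location permissions plus the activity recognition permissions that Location Kit's services use:
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
<!-- For activity identification on Android 10 (API 29) and later. -->
<uses-permission android:name="android.permission.ACTIVITY_RECOGNITION" />
<!-- HMS-specific counterpart, assumed from the Location Kit documentation. -->
<uses-permission android:name="com.huawei.hms.permission.ACTIVITY_RECOGNITION" />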
Then create a PermissionHandler instance.
Code:
PermissionHandler permissionHandler;
Initialize it in initState().
Code:
permissionHandler = PermissionHandler();
What does the service provide?
Check Permissions
Request Permissions
Code:
void requestPermission() async {
try {
bool status = await permissionHandler.requestLocationPermission();
setState(() {
infoText = "Is permission granted $status";
});
} catch (e) {
setState(() {
infoText = e.toString();
});
}
}
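You can also check whether the permission is already granted. A minimal sketch, assuming the plugin's hasLocationPermission() method:
Code:
void checkPermission() async {
  try {
    // Returns true if the location permission has already been granted.
    bool status = await permissionHandler.hasLocationPermission();
    setState(() {
      infoText = "Has location permission: $status";
    });
  } catch (e) {
    setState(() {
      infoText = e.toString();
    });
  }
}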
Fused Location
Create a FusedLocationProviderClient instance and use it to call location-related APIs.
Code:
FusedLocationProviderClient locationService;
Initialize it in initState().
Code:
locationService = FusedLocationProviderClient();
What does the service provide?
Last Location
Last Location With Address
Mock Location
Location Updates
Reference: Fused Location developer.huawei.com
To use mock location
To use the mock location function, go to Settings > System & updates > Developer options > Select mock location app and select the app for using the mock location function.
(If Developer options is unavailable, go to Settings > About phone and tap Build number seven consecutive times. Developer options will then be displayed under System & updates.)
To use the mock location feature, first configure the mock location permission in the android/app/src/main/AndroidManifest.xml file.
Code:
<uses-permission
android:name="android.permission.ACCESS_MOCK_LOCATION"
tools:ignore="MockLocation,ProtectedPermissions" />
Listen Location Update Event
Listen to the onLocationData stream, which emits location update events.
Code:
StreamSubscription<Location> streamSubs;
Add to initState():
Code:
streamSubs = locationService.onLocationData.listen((location) {
print(location.toString());
});
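The stream only emits once updates have been requested. A minimal sketch, assuming the plugin's LocationRequest class and requestLocationUpdates() method:
Code:
void startLocationUpdates() async {
  // Request periodic updates; onLocationData then starts emitting them.
  LocationRequest locationRequest = LocationRequest();
  locationRequest.interval = 5000; // milliseconds between updates
  try {
    int requestCode = await locationService.requestLocationUpdates(locationRequest);
    print("Location updates requested, request code: $requestCode");
  } catch (e) {
    print(e.toString());
  }
}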
Example method: getLastLocation()
Code:
void getLastLocation() async {
setState(() {
infoText = "";
});
try {
Location location = await locationService.getLastLocation();
setState(() {
infoText = location.toString();
});
} catch (e) {
setState(() {
infoText = e.toString();
});
}
}
Activity Identification
Creating an Activity Identification Service Client
Code:
ActivityIdentificationService activityIdentificationService;
Initialize it in initState().
Code:
activityIdentificationService = ActivityIdentificationService();
What does the service provide?
Activity Conversion Updates
Activity Conversion Request
Activity Identification Updates
Activity Identification Request
Listen Activity Identification Event
You can use the onActivityIdentification method to listen to and receive data from activity identification events.
Code:
void onActivityIdentificationResponse(ActivityIdentificationResponse response) {
for (ActivityIdentificationData data
in response.activityIdentificationDatas) {
setChange(data.identificationActivity, data.possibility);
// data.identificationActivity contains the activity type (vehicle, bike, etc.).
// data.possibility is the confidence that the user is performing the activity.
// The confidence ranges from 0 to 100; a larger value indicates more reliable activity authenticity.
}
}
void streamListen() {
streamSubscription =
activityIdentificationService.onActivityIdentification.listen(onActivityIdentificationResponse);
}
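Identification events only arrive after updates have been requested. A minimal sketch, assuming the plugin's createActivityIdentificationUpdates() method:
Code:
void requestIdentificationUpdates() async {
  try {
    // Ask for activity identification results roughly every 5 seconds.
    int requestCode = await activityIdentificationService
        .createActivityIdentificationUpdates(5000);
    print("Activity identification requested, request code: $requestCode");
  } catch (e) {
    print(e.toString());
  }
}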
For full content, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201314069177450189&fid=0101187876626530001

HMS Video Kit — 1

HMS Video Kit — 1
In this article, I will write about the features of Huawei’s Video Kit and we will develop a sample application that allows you to play streaming media from a third-party video address.
Why should we use it?
Nowadays, video apps are so popular. Due to the popularity of streaming media, many developers have introduced HD movie streaming apps for people who use devices such as tablets and smartphones for everyday purposes. With Video Kit WisePlayer SDK you can bring stable HD video experiences to your users.
Service Features
It provides a high definition video experience without any delay
Responds instantly to playback requests
Has intuitive controls and offers content on demand
It selects the most suitable bitrate for your app
Provides URL anti-leeching, playback authentication, and other security mechanisms to keep your videos secure
It supports streaming media in 3GP, MP4, or TS format and complies with HTTP/HTTPS, HLS, or DASH.
Integration Preparations
First of all, in order to start developing an app with most of the Huawei mobile services and the Video Kit as well, you need to integrate the HUAWEI HMS Core into your application.
Software Requirements
Android Studio 3.X
JDK 1.8 or later
HMS Core (APK) 5.0.0.300 or later
EMUI 3.0 or later
For a detailed HMS Core integration process, you can refer to Preparations for Integrating HUAWEI HMS Core.
After creating the application in AppGallery Connect and completing the other required steps, make sure that you have copied the agconnect-services.json file to the app's root directory of your Android Studio project.
Adding SDK dependencies
Add the AppGallery Connect plug-in and the Maven repository in the project-level build.gradle file.
Code:
buildscript {
repositories {
......
maven {url 'https://developer.huawei.com/repo/'}
}
dependencies {
......
classpath 'com.huawei.agconnect:agcp:1.3.1.300' // HUAWEI agcp plugin
}
}
allprojects {
repositories {
......
maven {url 'https://developer.huawei.com/repo/'}
}
}
2. Open the build.gradle file in the app directory and add the AppGallery Connect plug-in.
Code:
apply plugin: 'com.android.application'
// Add the following line
apply plugin: 'com.huawei.agconnect' // HUAWEI agconnect Gradle plugin
android {
......
}
3. Configure the Maven dependency in the app-level build.gradle file.
Code:
dependencies {
......
implementation "com.huawei.hms:videokit-player:1.0.1.300"
}
You can find all the version numbers of this kit in its Version Change History.
4. Configure the NDK in the app-level build.gradle file.
Code:
android {
defaultConfig {
......
ndk {
abiFilters "armeabi-v7a", "arm64-v8a"
}
}
......
}
Here, we have used the abiFilters in order to reduce the .apk size by selecting the desired CPU architectures.
5. Add permissions in the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="com.huawei.permission.SECURITY_DIAGNOSE" />
Note: For Android 6.0 and later, Video Kit dynamically applies for the write permission on external storage.
6. Lastly, add configurations to exclude the HMS Core SDK from obfuscation.
For Android Studio, the obfuscation configuration file is proguard-rules.pro.
Open the obfuscation configuration file of your Android Studio project and add the configurations.
Code:
-ignorewarnings
-keepattributes *Annotation*
-keepattributes Exceptions
-keepattributes InnerClasses
-keepattributes Signature
-keepattributes SourceFile,LineNumberTable
-keep class com.hianalytics.android.**{*;}
-keep class com.huawei.updatesdk.**{*;}
-keep class com.huawei.hms.**{*;}
With these steps, we have completed the integration. Now, let's get our hands dirty with some code…
Initializing WisePlayer
In order to initialize the player, we need to create a class that inherits from Application. The Application class is a base class of an Android app that contains components such as activities and services. Application or its subclasses are instantiated before any activities or other application objects in the Android app.
We can add our own initialization logic by extending the Application class. We call the initialization API WisePlayerFactory.initFactory() of the WisePlayer SDK in the onCreate() method.
Java:
public class VideoKitPlayApplication extends Application {
private static final String TAG = "VideoKitPlayApplication";
private static WisePlayerFactory wisePlayerFactory = null;
@Override
public void onCreate() {
super.onCreate();
initPlayer();
}
private void initPlayer() {
// A test device ID is used in this demo; pass in the encrypted device ID in production.
Log.d(TAG, "initPlayer: VideoKitPlayApplication");
WisePlayerFactoryOptions factoryOptions = new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
WisePlayerFactory.initFactory(this, factoryOptions, initFactoryCallback);
}
/**
* Player initialization callback
*/
private static InitFactoryCallback initFactoryCallback = new InitFactoryCallback() {
@Override
public void onSuccess(WisePlayerFactory wisePlayerFactory) {
Log.d(TAG, "init player factory success");
setWisePlayerFactory(wisePlayerFactory);
}
@Override
public void onFailure(int errorCode, String reason) {
Log.d(TAG, "init player factory fail reason :" + reason + ", errorCode is " + errorCode);
}
};
/**
* Get WisePlayer Factory
*
* @return WisePlayer Factory
*/
public static WisePlayerFactory getWisePlayerFactory() {
return wisePlayerFactory;
}
private static void setWisePlayerFactory(WisePlayerFactory wisePlayerFactory) {
VideoKitPlayApplication.wisePlayerFactory = wisePlayerFactory;
}
}
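For the factory to be initialized, the custom Application class must also be registered in the AndroidManifest.xml file. A minimal sketch, with the package name assumed from the demo's imports:
Code:
<application
    android:name="com.videokitnative.huawei.VideoKitPlayApplication">
    <!-- activities and other components go here -->
</application>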
Playing a Video
We need to create a PlayActivity that inherits from AppCompatActivity and implements the Callback and SurfaceTextureListener APIs. Currently, WisePlayer supports SurfaceView and TextureView. Make sure that your app has a valid view for video display; otherwise, the playback will fail. So, in the layout file, we need to add a SurfaceView or TextureView to be displayed in WisePlayer. PlayActivity also implements OnPlayWindowListener and OnWisePlayerListener in order to get callbacks from WisePlayer.
Java:
import android.view.SurfaceHolder.Callback;
import android.view.TextureView.SurfaceTextureListener;
import com.videokitnative.huawei.contract.OnPlayWindowListener;
import com.videokitnative.huawei.contract.OnWisePlayerListener;
public class PlayActivity extends AppCompatActivity implements Callback, SurfaceTextureListener, OnWisePlayerListener, OnPlayWindowListener {
...
}
The WisePlayerFactory instance is returned when initialization completes in the Application class. We need to call createWisePlayer() to create a WisePlayer.
Java:
WisePlayer player = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
In order to make the code modular and understandable, I have created the PlayControl.java class as in the official demo and created the WisePlayer in that class. Since we create the object in our PlayActivity class through the constructor, wisePlayer will be created in the onCreate() method of our PlayActivity.
Note: Before calling createWisePlayer() to create WisePlayer, make sure that Application has successfully initialized the WisePlayer SDK.
Now, we need to initialize the WisePlayer layout and add layout listeners. I have created PlayView.java for creating the views and updating them, so we can create the PlayView instance in the onCreate() method of our PlayActivity.
Java:
/**
* init the layout
*/
private void initView() {
playView = new PlayView(this, this, this);
setContentView(playView.getContentView());
}
In the PlayView.java class I have created SurfaceView for displaying the video.
Java:
surfaceView = (SurfaceView) findViewById(R.id.surface_view);
SurfaceHolder surfaceHolder = surfaceView.getHolder();
surfaceHolder.addCallback(this);
surfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
I will share the demo code that I have created. You can find the activity_play.xml layout and the PlayView.java files over there.
Registering WisePlayer listeners is another important step, because the app reacts based on listener callbacks. I have done this in the PlayControl.java class with the method below.
Java:
/**
* Set the play listener
*/
private void setPlayListener() {
if (wisePlayer != null) {
wisePlayer.setErrorListener(onWisePlayerListener);
wisePlayer.setEventListener(onWisePlayerListener);
wisePlayer.setResolutionUpdatedListener(onWisePlayerListener);
wisePlayer.setReadyListener(onWisePlayerListener);
wisePlayer.setLoadingListener(onWisePlayerListener);
wisePlayer.setPlayEndListener(onWisePlayerListener);
wisePlayer.setSeekEndListener(onWisePlayerListener);
}
}
Here, onWisePlayerListener is an interface that extends the required WisePlayer listener interfaces.
Java:
public interface OnWisePlayerListener extends WisePlayer.ErrorListener, WisePlayer.ReadyListener,
WisePlayer.EventListener, WisePlayer.PlayEndListener, WisePlayer.ResolutionUpdatedListener,
WisePlayer.SeekEndListener, WisePlayer.LoadingListener, SeekBar.OnSeekBarChangeListener {
}
Now, we need to set URLs for our videos in our PlayControl.java class with the method below.
Java:
wisePlayer.setPlayUrl("http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4");
Since I have used CardViews in my MainActivity.java class, I pass the URLs and movie names on the click action through an intent from MainActivity to PlayControl. You can check it out in my source code as well.
We’ve set a view to display the video with the code below. In my demo application I have used SurfaceView to display the video.
Java:
// SurfaceView listener callback
@Override
public void surfaceCreated(SurfaceHolder holder) {
    wisePlayer.setView(surfaceView);
}
In order to prepare for playback and start requesting data, we need to call the wisePlayer.ready() method.
Lastly, we need to call the wisePlayer.start() method to start playback upon a successful response in the onReady callback.
Java:
@Override
public void onReady(WisePlayer wisePlayer)
{
wisePlayer.start();
}
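When the activity is torn down, the player resources should be released as well. A minimal sketch, assuming WisePlayer's stop() and release() lifecycle methods:
Java:
@Override
protected void onDestroy() {
    super.onDestroy();
    if (wisePlayer != null) {
        // Stop playback and free the player's native resources.
        wisePlayer.stop();
        wisePlayer.release();
        wisePlayer = null;
    }
}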
We have finished the development; let's pick a movie and enjoy it!
Movie List
You can find the source code of the demo app here.
In this article, we developed a sample application using HUAWEI Video Kit. HMS Video Kit offers a lot of features, for the sake of simplicity we implemented a few of them. I will share another post to show more features of the video kit in the near future.
RESOURCES
Documentation
Video Kit Codelab
what is the minimum resolution video we can play ??
What should I do if the signature fails to be verified on the server side?
shikkerimath said:
what is the minimum resolution video we can play ??
The minimum resolution is 270p, and the maximum is 4K.
Very interesting.

Beginner: Integration of Text Translation feature in Education apps (Huawei ML Kit-React Native)

Overview
Translation service can translate text from the source language into the target language. It supports online and offline translation.
In this article, I will show how users can understand text in other languages using the ML Kit plugin.
The text translation service can be widely used in scenarios where translation between different languages is required.
For example, travel apps can integrate this service to translate road signs or menus into tourists' native languages, providing a considerate user experience; educational apps can integrate it to eliminate language barriers, make content more accessible, and improve learning efficiency. In addition, the service supports offline translation, allowing users to easily use the translation service even when the network is unavailable.
Create Project in Huawei Developer Console
Before you start developing an app, configure the app information in AppGallery Connect.
Register as a Developer
Before you get started, you must register as a Huawei developer and complete identity verification on HUAWEI Developers. For details, refer to Registration and Verification.
Create an App
Follow the instructions in Creating an AppGallery Connect Project and Adding an App to the Project to create an app. Set the data storage location to Germany.
React Native setup
Requirements
Huawei phone with HMS 4.0.0.300 or later.
React Native environment with Android Studio, Node.js, and Visual Studio Code.
Dependencies
Gradle Version: 6.3
Gradle Plugin Version: 3.5.2
React-native-hms-ml gradle dependency
React Native CLI: 2.0.1
1. Set up the environment by referring to the link below.
2. Create a project using the command below.
Code:
react-native init <project_name>
3. You can install the React Native command line interface from npm, using the install -g react-native-cli command as shown below.
Code:
npm install -g react-native-cli
Generating a Signing Certificate Fingerprint
A signing certificate fingerprint is required to authenticate your app to Huawei Mobile Services. Make sure the JDK is installed. To create one, navigate to the JDK directory's bin folder and open a terminal in this directory. Execute the following command:
Code:
keytool -genkey -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks -storepass <store_password> -alias <alias> -keypass <key_password> -keysize 2048 -keyalg RSA -validity 36500
This command creates the keystore file in application_project_dir/android/app.
The next step is to obtain the SHA-256 key, which is needed for authenticating your app to Huawei services, from the keystore file. To obtain it, enter the following command in the terminal:
Code:
keytool -list -v -keystore <application_project_dir>\android\app\<signing_certificate_fingerprint_filename>.jks
After authentication, the SHA-256 key will be displayed.
Adding the SHA-256 Key to the Huawei Project in AppGallery
Copy the SHA-256 key and visit AppGallery Connect > <your_ML_project> > General Information. Paste it into the SHA-256 certificate fingerprint field.
Enable ML Kit from Manage APIs.
Download the agconnect-services.json file from AppGallery Connect and place it in the android/app directory of your React Native project.
Follow the steps below to integrate the ML plugin into your React Native application.
Integrate the Hms-ML plugin
Code:
npm i @hmscore/react-native-hms-ml
Download the Plugin from the Download Link
The ReactNative ML plugin is placed under node_modules/@hmscore of your React Native project, as shown in the directory tree below:
Code:
project-dir
|_ node_modules
|_ ...
|_ @hmscore
|_ ...
|_ react-native-hms-ml
|_ ...
|_ ...
Navigate to the android/app/build.gradle file in your React Native project and follow these steps:
Add the AGC Plugin dependency
Code:
apply plugin: 'com.huawei.agconnect'
Add to dependencies in android/app/build.gradle:
Code:
implementation project(':react-native-hms-ml')
Navigate to the project-level android/build.gradle file in your React Native project and follow these steps:
Add to buildscript/repositories
Code:
maven {url 'https://developer.huawei.com/repo/'}
Add to buildscript/dependencies
Code:
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
Navigate to android/settings.gradle and add the following:
Code:
include ':react-native-hms-ml'
project(':react-native-hms-ml').projectDir = new File(rootProject.projectDir, '../node_modules/@hmscore/react-native-hms-ml/android')
Use case:
Huawei ML Kit's HMSTranslate API can be integrated into different applications for translation between different languages.
Set API Key:
Before using HUAWEI ML in your app, set the API key first.
Copy the api_key value from your agconnect-services.json file.
Call setApiKey with the copied value.
Code:
HMSApplication.setApiKey("api_key")
  .then((res) => { console.log(res); })
  .catch((err) => { console.log(err); });
Add the below permissions to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
Translation
Text translation is implemented in either asynchronous or synchronous mode. For details, please refer to HMSTranslate.
JavaScript:
async asyncTranslate(sentence) {
try {
if (sentence !== "") {
var result = await HMSTranslate.asyncTranslate(this.state.isEnabled, true, sentence, this.getTranslateSetting());
console.log(result);
if (result.status == HMSApplication.SUCCESS) {
this.setState({ result: result.result });
}
else {
this.setState({ result: result.message });
if (result.status == HMSApplication.NO_FOUND) {
this.setState({ showPreparedModel: true });
ToastAndroid.showWithGravity("Download Using Prepared Button Below", ToastAndroid.SHORT, ToastAndroid.CENTER);
}
}
}
} catch (e) {
console.log(e);
this.setState({ result: "This is an " + e });
}
}
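The synchronous variant can be sketched the same way, assuming the plugin's syncTranslate mirrors asyncTranslate's signature:
JavaScript:
async syncTranslateExample(sentence) {
  try {
    // Same arguments as asyncTranslate: remote flag, stop flag, text, setting.
    var result = await HMSTranslate.syncTranslate(this.state.isEnabled, true, sentence, this.getTranslateSetting());
    if (result.status == HMSApplication.SUCCESS) {
      this.setState({ result: result.result });
    }
  } catch (e) {
    console.log(e);
  }
}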
Obtaining Languages
Obtains language codes in on-cloud and on-device translation services. For details, please refer to HMSTranslate.
JavaScript:
async getAllLanguages() {
try {
var result = await HMSTranslate.getAllLanguages(this.state.isEnabled);
console.log(result);
if (result.status == HMSApplication.SUCCESS) {
this.setState({ result: result.result.toString() });
}
else {
this.setState({ result: result.message });
}
} catch (e) {
console.log(e);
}
}
Downloading Prepared Model
A prepared model is provided for the on-device analyzer to translate text. You can download the on-device analyzer model and then translate text offline using the downloaded model. For details, please refer to HMSTranslate.
JavaScript:
async preparedModel() {
try {
var result = await HMSTranslate.preparedModel(this.getStrategyConfiguration(), this.getTranslateSetting());
console.log(result);
if (result.status == HMSApplication.SUCCESS) {
this.setState({ result: "Model download Success. Now you can use local analyze" });
}
else {
this.setState({ result: result.message });
}
} catch (e) {
console.log(e);
this.setState({ result: "This is an " + e });
}
}
Read the full article to continue.
Output:
Tips and Tricks
Download the latest HMS ReactNativeML plugin.
Copy the api_key value from your agconnect-services.json file and set the API key.
Add the languages to translate in the translator setting.
For project cleaning, navigate to the android directory and run the below command.
Code:
gradlew clean
Conclusion:
In this article, we have learnt how to integrate ML Kit in a React Native project.
Educational apps can integrate this service to eliminate language barriers, make content more accessible, and improve learning efficiency. In addition, the service supports offline translation, allowing users to easily use the translation service even when the network is unavailable.
Reference
https://developer.huawei.com/consum...uides-V1/text-translation-0000001051086162-V1
Read the full article at the link above.
Currently, how many languages does real-time translation support?
Huawei is not accepting new accounts. Any other possible solution? Thanks in advance.

How a Programmer Developed a Perfect Flower Recognition App

Spring is a great season for hiking, especially when flowers are in full bloom. One weekend, Jenny, John's girlfriend, a teacher, took her class for an outing in a park. John accompanied them to lend Jenny a hand.
John had prepared for a carefree outdoor outing like those in his childhood, when he would run around on the grass, but it took a different turn. His outing turned out to be something like a Q&A session that was all about flowers: the students were amazed at John's ability to recognize flowers, and repeatedly asked him what kind of flowers they encountered. Faced with their sincere questioning and adoring expressions, John, despite not being a flower expert, felt obliged to give the right answer, even if he had to sneak off to search for it on the Internet.
It occurred to John that there could be an easier way to answer these questions — using a handy app.
As a programmer with a knack for the market, he soon developed a flower recognition app that is capable of turning ordinary users into expert "botanists": to find out the name of a flower, all you need to do is use the app to take a picture of that flower, and it will swiftly provide you with the correct answer.
Demo
How to Implement
The flower recognition function can be created by using the image classification service in HUAWEI ML Kit. It classifies elements within images into intuitive categories to define image themes and usage scenarios. The service supports both on-device and on-cloud recognition modes, with the former recognizing over 400 categories of items, and the latter, 12,000 categories. It also allows for creating custom image classification models.
Preparations
1. Create an app in AppGallery Connect and configure the signing certificate fingerprint.
2. Configure the Huawei Maven repository address, and add the build dependency on the image classification service.
Code:
dependencies{
    // Import the basic SDK.
    implementation 'com.huawei.hms:ml-computer-vision-classification:2.0.1.300'
    // Import the image classification model package.
    implementation 'com.huawei.hms:ml-computer-vision-image-classification-model:2.0.1.300'
}
3. Automatically update the machine learning model.
Add the following statements to the AndroidManifest.xml file. After a user installs your app from HUAWEI AppGallery, the machine learning model will be automatically updated to the user's device.
Code:
<manifest
    ...
    <meta-data
        android:name="com.huawei.hms.ml.DEPENDENCY"
        android:value="label"/>
    ...
</manifest>
4. Configure obfuscation scripts.
For details, please refer to the ML Kit Development Guide on HUAWEI Developers.
5. Declare permissions in the AndroidManifest.xml file.
To obtain images through the camera or album, you'll need to apply for relevant permissions in the file.
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
Development Process
1. Create and configure an on-cloud image classification analyzer.
Create a class for the image classification analyzer.
Code:
public class RemoteImageClassificationTransactor extends BaseTransactor<List<MLImageClassification>>
In the class, use the custom class (MLRemoteClassificationAnalyzerSetting) to create an analyzer, set relevant parameters, and configure the handler.
Code:
private final MLImageClassificationAnalyzer detector;
private Handler handler;

MLRemoteClassificationAnalyzerSetting options = new MLRemoteClassificationAnalyzerSetting.Factory()
    .setMinAcceptablePossibility(0f)
    .create();
this.detector = MLAnalyzerFactory.getInstance().getRemoteImageClassificationAnalyzer(options);
this.handler = handler;
2. Call asyncAnalyseFrame to process the image.
Asynchronously classify the input MLFrame object.
Code:
@Override
protected Task<List<MLImageClassification>> detectInImage(MLFrame image) {
    return this.detector.asyncAnalyseFrame(image);
}
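For reference, the MLFrame passed to detectInImage can be created from a bitmap; a minimal sketch (the drawable name is hypothetical):
Code:
// Build an MLFrame from a Bitmap before handing it to the analyzer.
Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.flower);
MLFrame frame = MLFrame.fromBitmap(bitmap);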
3. Obtain the result of a successful classification.
Override the onSuccess method in RemoteImageClassificationTransactor to display the name of the recognized object in the image.
Code:
@Override
protected void onSuccess(
        Bitmap originalCameraImage,
        List<MLImageClassification> classifications,
        FrameMetadata frameMetadata,
        GraphicOverlay graphicOverlay) {
    graphicOverlay.clear();
    this.handler.sendEmptyMessage(Constant.GET_DATA_SUCCESS);
    List<String> classificationList = new ArrayList<>();
    for (int i = 0; i < classifications.size(); ++i) {
        MLImageClassification classification = classifications.get(i);
        if (classification.getName() != null) {
            classificationList.add(classification.getName());
        }
    }
    RemoteImageClassificationGraphic remoteImageClassificationGraphic =
        new RemoteImageClassificationGraphic(graphicOverlay, this.mContext, classificationList);
    graphicOverlay.addGraphic(remoteImageClassificationGraphic);
    graphicOverlay.postInvalidate();
}
If recognition fails, handle the error and check the failure reason in the log.
Code:
@Override
protected void onFailure(Exception e) {
    this.handler.sendEmptyMessage(Constant.GET_DATA_FAILED);
    Log.e(RemoteImageClassificationTransactor.TAG, "Remote image classification detection failed: " + e.getMessage());
}
4. Release resources when recognition ends.
When recognition ends, stop the analyzer, release detection resources, and override the stop() method in RemoteImageClassificationTransactor.
Code:
@Override
public void stop() {
    super.stop();
    try {
        this.detector.stop();
    } catch (IOException e) {
        Log.e(RemoteImageClassificationTransactor.TAG,
            "Exception thrown while trying to close remote image classification transactor: " + e.getMessage());
    }
}
For more details, you can go to:
Our official website
Our Development Documentation page, to find the documents you need
Reddit to join our developer discussion
GitHub to download demos and sample code
Stack Overflow to solve any integration problems
Original Source

Intermediate: Text Recognition, Language detection and Language translation using Huawei ML Kit in Flutter (Cross platform)

Introduction
In this article, we will be learning how to integrate Huawei ML Kit in a Flutter application. The Flutter ML plugin allows your apps to easily leverage Huawei's long-term proven expertise in machine learning to support diverse artificial intelligence (AI) applications. The ML plugin provides diversified, leading machine learning capabilities that are easy to use, helping you develop various AI apps.
List of APIs the ML plugin provides
Text-related services
Language-related services
Image-related services
Face/body-related services
Natural language processing
Custom model
In this article, we will be integrating some of the specific APIs related to text-related services and language-related services in a Flutter application.
Development Overview
You need to install the Flutter and Dart plugins in your IDE, and I assume that you have prior knowledge of Flutter and Dart.
Hardware Requirements
A computer (desktop or laptop) running Windows 10.
A Huawei phone (with the USB cable), which is used for debugging.
Software Requirements
Java JDK 1.7 or later.
Android Studio or Visual Studio Code installed.
HMS Core (APK) 4.X or later.
Integration process
Step 1. Create flutter project.
Step 2. Add the App level gradle dependencies, choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
implementation 'com.huawei.agconnect:agconnect-core:1.3.1.300'
Add root level gradle dependencies.
Code:
maven {url 'https://developer.huawei.com/repo/'}
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add the below permissions in Android Manifest file.
Code:
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
Step 4: Add plugin path in pubspec.yaml file under dependencies.
Step 5: Create a project in AppGallery Connect, find here.
pubspec.yaml
Code:
name: flutterdrivedemo123
description: A new Flutter project.
# The following line prevents the package from being accidentally published to
# pub.dev using `pub publish`. This is preferred for private packages.
publish_to: 'none' # Remove this line if you wish to publish to pub.dev
# https://developer.apple.com/library/archive/documentation/General/Reference/InfoPlistKeyReference/Articles/CoreFoundationKeys.html
version: 1.0.0+1
environment:
sdk: ">=2.12.0 <3.0.0"
dependencies:
flutter:
sdk: flutter
huawei_account:
path: ../huawei_account
huawei_drive:
path: ../huawei_drive
huawei_ml:
path: ../huawei_ml
# The following adds the Cupertino Icons font to your application.
# Use with the CupertinoIcons class for iOS style icons.
cupertino_icons: ^1.0.2
image_picker: ^0.8.0
dev_dependencies:
flutter_test:
sdk: flutter
# For information on the generic Dart part of this file, see the
# following page: https://dart.dev/tools/pub/pubspec
# The following section is specific to Flutter.
flutter:
Initialize MLApplication
Code:
MLApplication app = new MLApplication();
app.setApiKey(apiKey:"API_KEY");
Check required permissions
Code:
Future<void> checkPerms() async {
final bool isCameraPermissionGranted =
await MLPermissionClient().hasCameraPermission();
if (!isCameraPermissionGranted) {
final bool res = await MLPermissionClient()
.requestPermission([MLPermission.camera, MLPermission.storage]);
}
}
Select image and capture text from image
Code:
Future getImage() async {
final pickedFile = await picker.getImage(source: ImageSource.gallery);
//final pickedFile = await picker.getImage(source: ImageSource.camera);
setState(() {
if (pickedFile != null) {
File _image = File(pickedFile.path);
print('Path :' + pickedFile.path);
capturetext(pickedFile.path);
} else {
print('No image selected.');
}
});
}
Future<void> capturetext(String path) async {
// Create an MLTextAnalyzer object.
MLTextAnalyzer analyzer = new MLTextAnalyzer();
// Create an MLTextAnalyzerSetting object to configure the recognition.
MLTextAnalyzerSetting setting = new MLTextAnalyzerSetting();
// Set the image to be recognized and other desired options.
setting.path = path;
setting.isRemote = true;
setting.language = "en";
// Call asyncAnalyzeFrame to recognize text asynchronously.
MLText text = await analyzer.asyncAnalyzeFrame(setting);
print(text.stringValue);
setState(() {
msg = text.stringValue;
});
}
How to detect language using ML Kit?
Code:
Future<void> onClickDetect() async {
// Create an MLLangDetector object.
MLLangDetector detector = new MLLangDetector();
// Create MLLangDetectorSetting to configure detection.
MLLangDetectorSetting setting = new MLLangDetectorSetting();
// Set source text and detection mode.
setting.sourceText = text;
setting.isRemote = true;
// Get detection result with the highest confidence.
String result = await detector.firstBestDetect(setting: setting);
setState(() {
text = setting.sourceText + ": " + result;
});
}
How to translate language using ML Kit?
Code:
Future<void> onClickTranslate() async {
// Create an MLLocalTranslator object.
MLLocalTranslator translator = new MLLocalTranslator();
// Create an MLTranslateSetting object to configure translation.
MLTranslateSetting setting = new MLTranslateSetting();
// Set the languages for model download.
setting.sourceLangCode = "en";
setting.targetLangCode = "hi";
// Prepare the model and implement the translation.
final isPrepared = await translator.prepareModel(setting: setting);
if (isPrepared) {
// Asynchronous translation.
String result = await translator.asyncTranslate(sourceText: text);
setState(() {
text = result.toString();
});
}
// Stop translator after the translation ends.
bool result = await translator.stopTranslate();
}
Result
Tricks and Tips
Make sure that you have downloaded the latest plugin.
Make sure that the plugin path is updated in the yaml file.
Make sure that the plugin is unzipped in the parent directory of the project.
Make sure that the agconnect-services.json file is added.
Make sure dependencies are added in the build file.
Run flutter pub get after adding dependencies.
Generate the SHA-256 certificate fingerprint in Android Studio and configure it in AppGallery Connect.
Conclusion
In this article, we have learnt how to integrate the capabilities of Huawei ML Kit in a Flutter application. In a similar way, you can use Huawei ML Kit as per user requirements in your application.
Thank you so much for reading; I hope this article helps you understand the Huawei ML Kit capabilities in Flutter.
Reference
MLkit
Flutter plugin
Original Source
useful sharing
