HiAi Image Super Resolution - Improve Low Resolution Images - Huawei Developers

This article is originally from HUAWEI Developer Forum
Forum link: https://forums.developer.huawei.com/forumPortal/en/home
HiAi Image Super Resolution
Upscales an image or reduces image noise and improves image details without changing the resolution.
Based on AI deep learning for computer vision (CV)
Utilizes the Huawei NPU (Neural Processing Unit), up to 50x faster than a CPU
1x and 3x super-resolution produce clearer images and reduce JPEG compression noise
You can check the official documentation about HiAI image super resolution.
Huawei invests continuously in NPU technology.
Huawei phones that support HiAI:
Software: Huawei EMUI 9.0 or later
Hardware: Kirin 970, 810, 820, 985, or 990 chipset
Codelab
https://developer.huawei.com/consumer/en/codelab/HiAIImageSuperresolution/index.html#0
You can also follow the codelab to implement HiAI image super resolution with the help of the DevEco IDE plugin in Android Studio.
Project: (HiAi Image Super Resolution)
In this article, we are going to build a project that implements HiAI Image Super Resolution to improve the quality of low-resolution images, which most applications use as thumbnails.
1. Implementation:
Download the vision-oversea-release.aar package in the Huawei AI Engine SDKs from the Huawei developer community.
Copy the downloaded vision-oversea-release.aar package to the app/libs directory of the project.
Add the following code to build.gradle in the app directory of the project to add vision-oversea-release.aar to the project. The Gson dependency is required because vision-oversea-release.aar internally converts parameters and results between JavaScript Object Notation (JSON) and Java classes using the Gson library.
Code:
repositories {
    flatDir {
        dirs 'libs'
    }
}
dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation(name: 'vision-oversea-release', ext: 'aar')
    implementation 'com.google.code.gson:gson:2.8.6'
}
2. Assets:
In this section we add some low-resolution images to the "assets/material/image_super_resolution" directory, so that they can later be fetched from the local assets for optimization.
3. Design ListView:
In this section we design a ListView in our layouts to show the original images and the optimized images side by side.
activity_main.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
tools:context=".MainActivity">
<LinearLayout
android:id="@+id/linearLayout"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_marginTop="1pt"
android:layout_marginBottom="5pt"
android:orientation="horizontal"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintTop_toTopOf="parent">
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_weight="1"
android:gravity="center"
android:text="Original Image"
android:textSize="24sp" />
<TextView
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_weight="1"
android:gravity="center"
android:text="Improved Image"
android:textSize="24sp" />
</LinearLayout>
<ListView
android:id="@+id/item_listView"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:dividerHeight="3pt"
/>
</LinearLayout>
items.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:orientation="horizontal"
android:layout_width="match_parent"
android:layout_height="match_parent"
>
<ImageView
android:id="@+id/imgOriginal"
android:layout_width="100dp"
android:layout_height="100dp"
app:srcCompat="@drawable/noimage"
android:layout_gravity="start"
android:layout_weight="1"
/>
<TextView
android:id="@+id/imgTitle"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text=" "
android:layout_gravity="center_horizontal"
android:layout_weight="0"
android:textAlignment="center"
/>
<ImageView
android:id="@+id/imgConverted"
android:layout_width="100dp"
android:layout_height="100dp"
app:srcCompat="@drawable/noimage"
android:layout_gravity="end"
android:layout_weight="1"
app:layout_constraintDimensionRatio="h,4:3"
/>
</LinearLayout>
4. Coding: (Adapter, HiAi Image Super Resolution )
Util Class:
  We create a util class, AssetsFileUtil, to fetch all images from the local assets directory and to load a single Bitmap image.
Code:
public class AssetsFileUtil {

    public static Bitmap getBitmapByFilePath(Context context, String filePath) {
        try {
            AssetManager assetManager = context.getAssets();
            InputStream is = assetManager.open(filePath);
            Bitmap bitmap = BitmapFactory.decodeStream(is);
            return bitmap;
        } catch (Exception e) {
            e.printStackTrace();
            return null;
        }
    }

    public static List<Bitmap> getBitmapListByDirPath(Context context, String dirPath) {
        List<Bitmap> list = new ArrayList<Bitmap>();
        try {
            AssetManager assetManager = context.getResources().getAssets();
            String[] photos = assetManager.list(dirPath);
            for (String photo : photos) {
                if (isFile(photo)) {
                    Bitmap bitmap = getBitmapByFilePath(context, dirPath + "/" + photo);
                    list.add(bitmap);
                } else {
                    List<Bitmap> childList = getBitmapListByDirPath(context, dirPath + "/" + photo);
                    list.addAll(childList);
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return list;
    }

    public static List<String> getFileNameListByDirPath(Context context, String dirPath) {
        List<String> list = new ArrayList<String>();
        try {
            AssetManager assetManager = context.getResources().getAssets();
            String[] photos = assetManager.list(dirPath);
            for (String photo : photos) {
                if (isFile(photo)) {
                    list.add(dirPath + "/" + photo);
                } else {
                    List<String> childList = getFileNameListByDirPath(context, dirPath + "/" + photo);
                    list.addAll(childList);
                }
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
        return list;
    }

    public static boolean isFile(String fileName) {
        // Treat any entry containing a dot as a file, everything else as a directory.
        return fileName.contains(".");
    }
}
MainActivity Class:
  In this class we fetch the local images and attach the image list to our adapter.
Code:
public class MainActivity extends AppCompatActivity {

    private String mDirPath;
    private ArrayList<Item> itemList;
    private List<String> imageList;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);
        itemList = new ArrayList<Item>();
        getLocalImages();
        // Set up the adapter and the ListView
        ItemAdapter itemAdapter = new ItemAdapter(getApplicationContext(), R.layout.items, itemList);
        ListView listView = findViewById(R.id.item_listView);
        listView.setAdapter(itemAdapter);
    }

    public void getLocalImages() {
        mDirPath = "material/image_super_resolution";
        imageList = AssetsFileUtil.getFileNameListByDirPath(this, mDirPath);
        for (int i = 0; i < imageList.size(); i++) {
            itemList.add(new Item(imageList.get(i), " ", imageList.get(i)));
        }
    }
}
Item Class:
  Prepare item data class.
Code:
public class Item {

    private String imgOriginal;
    private String imgTitle;
    private String imgConverted;

    public Item(String imgOriginal, String imgTitle, String imgConverted) {
        this.imgOriginal = imgOriginal;
        this.imgTitle = imgTitle;
        this.imgConverted = imgConverted;
    }

    public String getImgOriginal() {
        return imgOriginal;
    }

    public void setImgOriginal(String imgOriginal) {
        this.imgOriginal = imgOriginal;
    }

    public String getImgTitle() {
        return imgTitle;
    }

    public void setImgTitle(String imgTitle) {
        this.imgTitle = imgTitle;
    }

    public String getImgConverted() {
        return imgConverted;
    }

    public void setImgConverted(String imgConverted) {
        this.imgConverted = imgConverted;
    }
}
ItemAdapter Class:
In this class we bind our image list to the layout's ImageViews and invoke HiAI Image Super Resolution to optimize the image resolution.
ItemAdapter extends ArrayAdapter with the Item class as its type.
Code:
public class ItemAdapter extends ArrayAdapter<Item>
Define the adapter's fields
Code:
private ArrayList<Item> itemList;
private final static int SUPERRESOLUTION_RESULT = 110;
private Bitmap bitmapOriginal;
private Bitmap bitmapConverted;
ImageView imgOriginal;
ImageView imgConverted;
private String TAG = "ItemAdapter";
Prepare the constructor for the adapter class
Code:
public ItemAdapter(@NonNull Context context, int resource, @NonNull ArrayList<Item> itemList) {
    super(context, resource, itemList);
    this.itemList = itemList;
}
Define the initHiAI function to check whether the service is connected or disconnected
Code:
/**
 * Init HiAI interface
 */
private void initHiAI() {
    /** Initialize with the VisionBase static class and asynchronously get the connection to the service */
    VisionBase.init(getContext(), new ConnectionCallback() {
        @Override
        public void onServiceConnect() {
            /** Called when the service connection succeeds; initialize the detector class, mark the service connection status, and so on */
        }

        @Override
        public void onServiceDisconnect() {
            /** Called when the service is disconnected; you can reconnect the service here, or handle the exception */
        }
    });
}
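Because the detector can only be used after onServiceConnect fires, it is worth tracking the connection state before calling setHiAi (defined next). A minimal sketch of the idea; the serviceConnected flag is our own addition, not an SDK field:
Code:
/** Our own flag (not part of the HiAI SDK) so callers can check the connection before running detection. */
private volatile boolean serviceConnected = false;

private void initHiAI() {
    VisionBase.init(getContext(), new ConnectionCallback() {
        @Override
        public void onServiceConnect() {
            serviceConnected = true;   // detectors may be created and used from now on
        }

        @Override
        public void onServiceDisconnect() {
            serviceConnected = false;  // optionally call initHiAI() again to reconnect
        }
    });
}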
Define the setHiAi function to perform the HiAI super-resolution operation on the original image and generate the optimized bitmap.
Code:
/**
 * Capability interface: runs super-resolution on bitmapOriginal.
 */
private void setHiAi() {
    /** Define the detector; the context of this project is the input parameter */
    ImageSuperResolution superResolution = new ImageSuperResolution(getContext());
    /** Define the frame and put the bitmap that needs processing into the frame */
    Frame frame = new Frame();
    /** BitmapFactory.decodeFile takes a resource file path as input */
    // Bitmap bitmap = BitmapFactory.decodeFile(null);
    frame.setBitmap(bitmapOriginal);
    /** Define and set super-resolution parameters */
    SuperResolutionConfiguration paras = new SuperResolutionConfiguration(
            SuperResolutionConfiguration.SISR_SCALE_3X,
            SuperResolutionConfiguration.SISR_QUALITY_HIGH);
    superResolution.setSuperResolutionConfiguration(paras);
    /** Run super-resolution and get the result of processing */
    ImageResult result = superResolution.doSuperResolution(frame, null);
    /** Get the bitmap from the processed result */
    Bitmap bmp = result.getBitmap();
    /** Note: the result and the bitmap it contains must NOT be null, and the returned error code should be 0 (0 means no error) */
    this.bitmapConverted = bmp;
    handler.sendEmptyMessage(SUPERRESOLUTION_RESULT);
}
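Per the note above, the returned result should be validated before use. A defensive version of the last few lines might look like this (a sketch; the exact error-code accessor on ImageResult varies by SDK version, so only null checks are shown):
Code:
ImageResult result = superResolution.doSuperResolution(frame, null);
// Guard against a null result or a null bitmap; fall back to the original image on failure.
if (result == null || result.getBitmap() == null) {
    Log.e(TAG, "super-resolution failed, keeping the original bitmap");
    this.bitmapConverted = null;       // the handler below will then show bitmapOriginal
} else {
    this.bitmapConverted = result.getBitmap();
}
handler.sendEmptyMessage(SUPERRESOLUTION_RESULT);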
Define a Handler; when the optimization completes, attach the optimized bitmap to the image view.
Code:
private Handler handler = new Handler(Looper.getMainLooper()) { // deliver results on the main thread
    @Override
    public void handleMessage(Message msg) {
        super.handleMessage(msg);
        switch (msg.what) {
            case SUPERRESOLUTION_RESULT:
                if (bitmapConverted != null) {
                    imgConverted.setImageBitmap(bitmapConverted);
                } else { // fall back to the original image
                    imgConverted.setImageBitmap(bitmapOriginal);
                }
                break;
        }
    }
};
Override the getView method to attach the original image to the image view, then process it and attach the optimized image.
Code:
@NonNull
@Override
public View getView(int position, @Nullable View convertView, @NonNull ViewGroup parent) {
    initHiAI();
    int itemIndex = position;
    if (convertView == null) {
        convertView = LayoutInflater.from(getContext()).inflate(R.layout.items, parent, false);
    }
    imgOriginal = convertView.findViewById(R.id.imgOriginal);
    TextView imgTitle = convertView.findViewById(R.id.imgTitle);
    imgConverted = convertView.findViewById(R.id.imgConverted);
    // Load the original image, and show it in both views as a placeholder until optimization finishes.
    bitmapOriginal = AssetsFileUtil.getBitmapByFilePath(imgOriginal.getContext(), itemList.get(itemIndex).getImgOriginal());
    imgOriginal.setImageBitmap(bitmapOriginal);
    bitmapConverted = AssetsFileUtil.getBitmapByFilePath(imgConverted.getContext(), itemList.get(itemIndex).getImgConverted());
    imgConverted.setImageBitmap(bitmapConverted);
    int height = bitmapOriginal.getHeight();
    int width = bitmapOriginal.getWidth();
    Log.e(TAG, "width:" + width + ";height:" + height);
    if (width <= 800 && height <= 600) {
        new Thread() {
            @Override
            public void run() {
                setHiAi();
            }
        }.start();
    } else {
        toast("Width and height of the image cannot exceed 800*600");
    }
    imgTitle.setText(itemList.get(itemIndex).getImgTitle());
    return convertView;
}
public void toast(String text) {
    Toast.makeText(getContext(), text, Toast.LENGTH_SHORT).show();
}
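One thing to watch: getView spawns a worker thread per row while bitmapOriginal, bitmapConverted, and imgConverted are single shared fields, so a result can land on the wrong row when several rows are processed at once. A minimal sketch of a safer pattern, binding the result to the row's own ImageView (runSuperResolution is a hypothetical wrapper around the setHiAi logic that returns the processed bitmap):
Code:
// Sketch only: per-row processing without shared mutable fields.
final ImageView targetView = convertView.findViewById(R.id.imgConverted);
final Bitmap source = bitmapOriginal;
new Thread(() -> {
    Bitmap improved = runSuperResolution(source); // hypothetical helper wrapping the setHiAi() logic
    targetView.post(() -> targetView.setImageBitmap(improved != null ? improved : source));
}).start();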
The coding section is complete. Now run your project and check the output of the image optimization performed by HiAI Image Super Resolution.
5. Result


Huawei Share Kit Facilitates to Acquire Skill in Enforcement - Part 2

For more articles like this, you can visit HUAWEI Developer Forum and Medium.
https://forums.developer.huawei.com/forumPortal/en/home
In this article, we will implement the Huawei Share Kit SDK and complete our demo application.
In the previous article, we learned about the Share Kit introduction and created the project. So let's start our implementation.
I will demonstrate the functionality of Share Kit in a simple way with a working application and give a demo.
Before starting development, we must meet the following requirements.
Hardware Requirements
1. A computer (desktop or laptop) that runs Windows 7 or Windows 10
2. A Huawei phone (with the USB cable), which is used for debugging
3. A third-party Android device, which is used for debugging
Software Requirements
1. JDK 1.8 or later
2. Android API (level 26 or higher)
3. EMUI 10.0 or later
Let’s start the development:
1. Add the Share Kit SDK to the project:
2. We need to add the Maven code repository to the root-level build.gradle of the project.
Code:
maven {
    url 'http://developer.huawei.com/repo/'
}
3. We need to add the following dependencies to our app-level build.gradle.
Code:
dependencies {
    implementation files('libs/sharekit-1.0.1.300.aar')
    implementation 'com.android.support:support-annotations:28.0.0'
    implementation 'com.android.support:localbroadcastmanager:28.0.0'
    implementation 'com.android.support:support-compat:28.0.0'
    implementation 'com.google.guava:guava:24.1-android'
}
Note: You need to raise a ticket to get the Share Kit SDK "sharekit-1.0.1.300.aar" file.
Click on the below link and raise your ticket.
https://developer.huawei.com/consumer/en/support/feedback/#/
4. I have created the following packages and resource files:
5. I have declared all activities in the manifest file:
Code:
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.hms.myshare">
    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:roundIcon="@mipmap/ic_launcher_round"
        android:supportsRtl="true"
        android:theme="@style/AppTheme">
        <activity
            android:name=".SplashScreen"
            android:label="@string/app_name">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
        <activity
            android:name=".SearchingActivity"
            android:configChanges="orientation|keyboardHidden|screenSize"/>
        <activity
            android:name=".ReceiveActivity"
            android:configChanges="orientation|keyboardHidden|screenSize"/>
    </application>
</manifest>
Let’s create an awesome User Interface:
1. I have created a wave ripple effect, which helps visualize device discovery in the UI.
I have created a SearchingView.java class:
Code:
public class SearchingView extends RelativeLayout {
private static final int DEFAULT_RIPPLE_COUNT=6;
private static final int DEFAULT_DURATION_TIME=3000;
private static final float DEFAULT_SCALE=6.0f;
private static final int DEFAULT_FILL_TYPE=0;
private int rippleColor;
private float rippleStrokeWidth;
private float rippleRadius;
private int rippleDurationTime;
private int rippleAmount;
private int rippleDelay;
private float rippleScale;
private int rippleType;
private Paint paint;
private boolean animationRunning=false;
private AnimatorSet animatorSet;
private ArrayList<Animator> animatorList;
private LayoutParams rippleParams;
private ArrayList<RippleView> rippleViewList=new ArrayList<RippleView>();
public SearchingView(Context context) {
super(context);
}
public SearchingView(Context context, AttributeSet attrs) {
super(context, attrs);
init(context, attrs);
}
public SearchingView(Context context, AttributeSet attrs, int defStyleAttr) {
super(context, attrs, defStyleAttr);
init(context, attrs);
}
private void init(final Context context, final AttributeSet attrs) {
if (isInEditMode())
return;
if (null == attrs) {
throw new IllegalArgumentException("Attributes should be provided to this view,");
}
final TypedArray typedArray = context.obtainStyledAttributes(attrs, R.styleable.RippleBackground);
rippleColor=typedArray.getColor(R.styleable.RippleBackground_rb_color, getResources().getColor(R.color.rippelColor));
rippleStrokeWidth=typedArray.getDimension(R.styleable.RippleBackground_rb_strokeWidth, getResources().getDimension(R.dimen.rippleStrokeWidth));
rippleRadius=typedArray.getDimension(R.styleable.RippleBackground_rb_radius,getResources().getDimension(R.dimen.rippleRadius));
rippleDurationTime=typedArray.getInt(R.styleable.RippleBackground_rb_duration,DEFAULT_DURATION_TIME);
rippleAmount=typedArray.getInt(R.styleable.RippleBackground_rb_rippleAmount,DEFAULT_RIPPLE_COUNT);
rippleScale=typedArray.getFloat(R.styleable.RippleBackground_rb_scale,DEFAULT_SCALE);
rippleType=typedArray.getInt(R.styleable.RippleBackground_rb_type,DEFAULT_FILL_TYPE);
typedArray.recycle();
rippleDelay=rippleDurationTime/rippleAmount;
paint = new Paint();
paint.setAntiAlias(true);
if(rippleType==DEFAULT_FILL_TYPE){
rippleStrokeWidth=0;
paint.setStyle(Paint.Style.FILL);
}else
paint.setStyle(Paint.Style.STROKE);
paint.setColor(rippleColor);
rippleParams=new LayoutParams((int)(2*(rippleRadius+rippleStrokeWidth)),(int)(2*(rippleRadius+rippleStrokeWidth)));
rippleParams.addRule(CENTER_IN_PARENT, TRUE);
animatorSet = new AnimatorSet();
animatorSet.setInterpolator(new AccelerateDecelerateInterpolator());
animatorList=new ArrayList<Animator>();
for(int i=0;i<rippleAmount;i++){
RippleView rippleView=new RippleView(getContext());
addView(rippleView,rippleParams);
rippleViewList.add(rippleView);
final ObjectAnimator scaleXAnimator = ObjectAnimator.ofFloat(rippleView, "ScaleX", 1.0f, rippleScale);
scaleXAnimator.setRepeatCount(ObjectAnimator.INFINITE);
scaleXAnimator.setRepeatMode(ObjectAnimator.RESTART);
scaleXAnimator.setStartDelay(i * rippleDelay);
scaleXAnimator.setDuration(rippleDurationTime);
animatorList.add(scaleXAnimator);
final ObjectAnimator scaleYAnimator = ObjectAnimator.ofFloat(rippleView, "ScaleY", 1.0f, rippleScale);
scaleYAnimator.setRepeatCount(ObjectAnimator.INFINITE);
scaleYAnimator.setRepeatMode(ObjectAnimator.RESTART);
scaleYAnimator.setStartDelay(i * rippleDelay);
scaleYAnimator.setDuration(rippleDurationTime);
animatorList.add(scaleYAnimator);
final ObjectAnimator alphaAnimator = ObjectAnimator.ofFloat(rippleView, "Alpha", 1.0f, 0f);
alphaAnimator.setRepeatCount(ObjectAnimator.INFINITE);
alphaAnimator.setRepeatMode(ObjectAnimator.RESTART);
alphaAnimator.setStartDelay(i * rippleDelay);
alphaAnimator.setDuration(rippleDurationTime);
animatorList.add(alphaAnimator);
}
animatorSet.playTogether(animatorList);
}
private class RippleView extends View {
public RippleView(Context context) {
super(context);
this.setVisibility(View.INVISIBLE);
}
@Override
protected void onDraw(Canvas canvas) {
int radius=(Math.min(getWidth(),getHeight()))/2;
canvas.drawCircle(radius,radius,radius-rippleStrokeWidth,paint);
}
}
public void startRippleAnimation(){
if(!isRippleAnimationRunning()){
for(RippleView rippleView:rippleViewList){
rippleView.setVisibility(VISIBLE);
}
animatorSet.start();
animationRunning=true;
}
}
public void stopRippleAnimation(){
if(isRippleAnimationRunning()){
animatorSet.end();
animationRunning=false;
}
}
    public boolean isRippleAnimationRunning() {
        return animationRunning;
    }
}
Let’s see the implementation of this custom view inside xml layout:
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:orientation="vertical"
android:gravity="center_horizontal"
android:layout_width="match_parent"
android:background="@drawable/background"
android:layout_height="match_parent">
<RelativeLayout
android:layout_width="wrap_content"
android:layout_height="0dp"
android:layout_weight="1"
android:gravity="center">
<com.hms.myshare.view.SearchingView
android:id="@+id/searching"
android:layout_width="match_parent"
android:layout_height="match_parent"
app:rb_color="@android:color/white"
app:rb_duration="3000"
app:rb_radius="40dp"
app:rb_rippleAmount="6"
app:rb_scale="5">
<ImageView
android:id="@+id/img_logo"
android:layout_width="200dp"
android:layout_height="200dp"
android:layout_centerInParent="true"
android:src="@drawable/log" />
</com.hms.myshare.view.SearchingView>
</RelativeLayout>
<androidx.appcompat.widget.AppCompatTextView
android:layout_width="wrap_content"
android:layout_gravity="bottom|center_horizontal"
android:textColor="@android:color/white"
android:textSize="28sp"
android:gravity="center"
android:layout_height="wrap_content"
android:layout_margin="10dp"
android:text="Huawei Share Kit"
android:id="@+id/appCompatTextView2" />
</LinearLayout>
Let's see the output of this view:
Let’s implement Search device and Send Data:
· We have implemented this functionality inside SearchingActivity class.
We need to perform the following operations to send data to a discovered device.
1. Instantiate the SDK manager class, ShareKitManager, with the current activity context inside the onCreate method.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    binding = DataBindingUtil.setContentView(this, R.layout.searching_activity);
    shareKitManager = new ShareKitManager(this);
2. Add the IShareKitInitCallback callback and initialize ShareKitManager.
Code:
IShareKitInitCallback initCallback = isSuccess -> {
    Log.i(TAG, "share kit init result:" + isSuccess);
    if (isSuccess) {
        binding.txtError.setText(getString(R.string.sharekit_init_finish));
    } else {
        binding.txtError.setText(getString(R.string.sharekit_init_failed));
    }
};
shareKitManager.init(initCallback);
3. Register the ShareKitManager with IWidgetCallback:
Code:
private IWidgetCallback callback = new IWidgetCallback.Stub() {
@Override
public synchronized void onDeviceFound(NearByDeviceEx nearByDeviceEx) {
String deviceId = nearByDeviceEx.getCommonDeviceId();
if (deviceId == null) {
Log.e(TAG, "onDeviceFound: deviceId is null");
return;
}
Log.i(TAG, "onDeviceFound: " + deviceId + ", btName: " + nearByDeviceEx.getBtName());
synchronized (lock) {
deviceMap.put(deviceId, nearByDeviceEx);
foundTimeMap.put(deviceId, format.format(new Date()));
updateDeviceList();
}
}
@Override
public void onDeviceDisappeared(NearByDeviceEx nearByDeviceEx) {
String deviceId = nearByDeviceEx.getCommonDeviceId();
if (deviceId == null) {
Log.e(TAG, "onDeviceDisappeared: deviceId is null");
return;
}
Log.i(TAG, "onDeviceDisappeared: " + deviceId + ", btName: " + nearByDeviceEx.getBtName());
synchronized (lock) {
deviceMap.remove(deviceId);
foundTimeMap.remove(deviceId);
updateDeviceList();
}
}
@Override
public void onTransStateChange(NearByDeviceEx nearByDeviceEx, int state, int stateValue) {
Log.i(TAG, "trans state:" + state + " value:" + stateValue);
String stateDesc = "";
switch (state) {
case STATE_PROGRESS:
stateDesc = getString(R.string.sharekit_send_progress, stateValue);
break;
case STATE_SUCCESS:
stateDesc = getString(R.string.sharekit_send_finish);
break;
case STATE_STATUS:
stateDesc = getString(R.string.sharekit_state_chg, translateStateValue(stateValue));
break;
case STATE_ERROR:
stateDesc = getString(R.string.sharekit_send_error, translateErrorValue(stateValue));
showError(getString(R.string.sharekit_send_error, translateErrorValue(stateValue)));
break;
default:
break;
}
// showToast(stateDesc);
}
@Override
public void onEnableStatusChanged() {
int status = shareKitManager.getShareStatus();
Log.i(TAG, "sharekit ability current status:" + status);
}
};
We need to pass this callback to the register API.
Code:
shareKitManager.registerCallback(callback);
4. Start searching for devices using the discovery API.
Code:
shareKitManager.startDiscovery();
5. Once a device is found, we create a ShareBean with the content and send it.
Code:
private void doSendText() {
    String text = binding.sharetext.getText().toString();
    ShareBean shareBean = new ShareBean(text);
    doSend(destDevice, shareBean);
}
Followed by doSend() method:
Code:
private void doSend(String deviceName, ShareBean shareBean) {
    List<NearByDeviceEx> processingDevices = shareKitManager.getDeviceList();
    for (NearByDeviceEx device : processingDevices) {
        // Skip sending if a transfer to this device is already in progress.
        if (deviceName.equals(device.getBtName())) {
            return;
        }
    }
    synchronized (lock) {
        for (NearByDeviceEx device : deviceMap.values()) {
            if (deviceName.equals(device.getBtName())) {
                shareKitManager.doSend(device, shareBean);
            }
        }
    }
}
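When the activity is destroyed, discovery should be stopped and the callback released so the service is not leaked. A minimal sketch, assuming stopDiscovery() and unregisterCallback() counterparts exist in ShareKitManager (verify the names in the SDK version you received):
Code:
@Override
protected void onDestroy() {
    super.onDestroy();
    // Assumed counterparts of startDiscovery()/registerCallback(); names may differ in your SDK version.
    shareKitManager.stopDiscovery();
    shareKitManager.unregisterCallback(callback);
}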
Let's implement the receive data functionality:
· We have implemented this functionality inside ReceiveActivity.
· We need to enable Wi-Fi on the Huawei device, which receives the socket connection request from the sender device.
· So we initialize ShareKitManager inside this activity's onCreate method.
Code:
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    binding = DataBindingUtil.setContentView(this, R.layout.receiver_activity);
    binding.searching.startRippleAnimation();
    shareKitManager = new ShareKitManager(this);
    IShareKitInitCallback initCallback = isSuccess -> {
        Log.i(TAG, "share kit init result:" + isSuccess);
    };
    shareKitManager.init(initCallback);
    shareKitManager.enable();
}
Android device (Sender):
Huawei device (Receiver):
If you have any doubts or queries. Please leave your valuable comment or post your doubts in HUAWEI Developer Forum.

HMS Video Kit For Movie Promotion Application

For more information like this, you can visit HUAWEI Developer Forum.
Introduction:
HUAWEI Video Kit provides an excellent playback experience with video streaming from a third-party cloud platform. It supports streaming media in 3GP, MP4, or TS format that complies with HTTP/HTTPS, HLS, or DASH.
Advantage of Video Kit:
Provides an excellent video experience with no lag, no delay, and high definition.
Provides complete and rich playback control interfaces.
Provides rich video operation experience.
Prerequisites:
Android Studio 3.X
JDK 1.8 or later
HMS Core (APK) 5.0.0.300 or later
EMUI 3.0 or later
Integration:
1. Create a project in Android Studio and in Huawei AGC.
2. Provide the SHA-256 key in the App Information section.
3. Download the agconnect-services.json file from AGC and save it into the app directory.
4. In root build.gradle
Navigate to allprojects > repositories and buildscript > repositories and add the given line.
Code:
maven { url 'http://developer.huawei.com/repo/' }
In dependency add class path.
Code:
classpath 'com.huawei.agconnect:agcp:1.3.1.300'
5. In app build.gradle
Configure the Maven dependency
Code:
implementation "com.huawei.hms:videokit-player:1.0.1.300"
Configure the NDK
Code:
android {
    defaultConfig {
        ......
        ndk {
            abiFilters "armeabi-v7a", "arm64-v8a"
        }
    }
    ......
}
Apply plugin
Code:
apply plugin: 'com.huawei.agconnect'
6. Permissions in Manifest
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="com.huawei.permission.SECURITY_DIAGNOSE" />
Code Implementation:
A movie promo application has been created to demonstrate HMS Video Kit. The application uses the RecyclerView, CardView, and Picasso libraries apart from the HMS Video Kit library. Let us go into the details of the Video Kit code integration.
1. Initializing WisePlayer
We have to implement a class that inherits Application, and its onCreate() method has to call the initialization API WisePlayerFactory.initFactory().
Code:
public class VideoKitPlayApplication extends Application {
private static final String TAG = VideoKitPlayApplication.class.getSimpleName();
private static WisePlayerFactory wisePlayerFactory = null;
@Override
public void onCreate() {
super.onCreate();
initPlayer();
}
private void initPlayer() {
// DeviceId test is used in the demo.
WisePlayerFactoryOptions factoryOptions = new WisePlayerFactoryOptions.Builder().setDeviceId("xxx").build();
WisePlayerFactory.initFactory(this, factoryOptions, initFactoryCallback);
}
/**
* Player initialization callback
*/
private static InitFactoryCallback initFactoryCallback = new InitFactoryCallback() {
@Override
public void onSuccess(WisePlayerFactory wisePlayerFactory) {
LogUtil.i(TAG, "init player factory success");
setWisePlayerFactory(wisePlayerFactory);
}
@Override
public void onFailure(int errorCode, String reason) {
LogUtil.w(TAG, "init player factory failed :" + reason + ", errorCode is " + errorCode);
}
};
/**
* Get WisePlayer Factory
*
* @return WisePlayer Factory
*/
public static WisePlayerFactory getWisePlayerFactory() {
return wisePlayerFactory;
}
private static void setWisePlayerFactory(WisePlayerFactory wisePlayerFactory) {
VideoKitPlayApplication.wisePlayerFactory = wisePlayerFactory;
}
}
2. Creating an instance of WisePlayer
Code:
wisePlayer = VideoKitPlayApplication.getWisePlayerFactory().createWisePlayer();
3. Initialize the WisePlayer layout and add layout listeners
Code:
private void initView(View view) {
    if (view != null) {
        surfaceView = (SurfaceView) view.findViewById(R.id.surface_view);
        textureView = (TextureView) view.findViewById(R.id.texture_view);
        if (PlayControlUtil.isSurfaceView()) {
            // SurfaceView display interface
            SurfaceHolder surfaceHolder = surfaceView.getHolder();
            surfaceHolder.addCallback(this);
            textureView.setVisibility(View.GONE);
            surfaceView.setVisibility(View.VISIBLE);
        } else {
            // TextureView display interface
            textureView.setSurfaceTextureListener(this);
            textureView.setVisibility(View.VISIBLE);
            surfaceView.setVisibility(View.GONE);
        }
    }
}
4. Register WisePlayer listeners
Code:
private void setPlayListener() {
if (wisePlayer != null) {
wisePlayer.setErrorListener(onWisePlayerListener);
wisePlayer.setEventListener(onWisePlayerListener);
wisePlayer.setResolutionUpdatedListener(onWisePlayerListener);
wisePlayer.setReadyListener(onWisePlayerListener);
wisePlayer.setLoadingListener(onWisePlayerListener);
wisePlayer.setPlayEndListener(onWisePlayerListener);
wisePlayer.setSeekEndListener(onWisePlayerListener);
}
}
5. Set playback parameters
Code:
wisePlayer.setVideoType(PlayMode.PLAY_MODE_NORMAL); // normal on-demand playback
wisePlayer.setBookmark(10000);                      // start position in milliseconds
wisePlayer.setCycleMode(CycleMode.MODE_CYCLE);      // loop playback
6. Set URL for video
Code:
wisePlayer.setPlayUrl(new String[] {currentPlayData.getUrl()});
7. Set a view to display the video.
Code:
// SurfaceView listener callback
@Override
public void surfaceCreated(SurfaceHolder holder) {
wisePlayer.setView(surfaceView);
}
// TextureView listener callback
@Override
public void onSurfaceTextureAvailable(SurfaceTexture surface, int width, int height) {
wisePlayer.setView(textureView);
// Call the resume API to bring WisePlayer to the foreground.
wisePlayer.resume(ResumeType.KEEP);
}
8. Prepare for the playback and start requesting data.
Code:
wisePlayer.ready();
9. Start the playback upon a success response of the onReady callback method
Code:
@Override
public void onReady(WisePlayer wisePlayer) {
    wisePlayer.start();
}
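WisePlayer should also be suspended and released along with the activity lifecycle; otherwise playback continues in the background and resources leak. A minimal sketch under the assumption that suspend() and release() are available (resume(ResumeType.KEEP) was already shown in step 7):
Code:
@Override
protected void onPause() {
    super.onPause();
    if (wisePlayer != null) {
        wisePlayer.suspend();   // move playback to the background
    }
}

@Override
protected void onDestroy() {
    super.onDestroy();
    if (wisePlayer != null) {
        wisePlayer.release();   // free codec and network resources
        wisePlayer = null;
    }
}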
10. select_play_movie.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical"
android:tag="cards main container">
<androidx.cardview.widget.CardView
android:id="@+id/card_view"
xmlns:card_view="http://schemas.android.com/apk/res-auto"
android:layout_width="match_parent"
android:layout_height="wrap_content"
card_view:cardBackgroundColor="@color/colorbg"
card_view:cardCornerRadius="10dp"
card_view:cardElevation="5dp"
card_view:cardUseCompatPadding="true">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="horizontal"
android:gravity="center"
>
<ImageView
android:id="@+id/videoIcon"
android:tag="image_tag"
android:layout_width="0dp"
android:layout_height="100dp"
android:layout_margin="5dp"
android:layout_weight="1"
android:src="@drawable/j1"/>
<LinearLayout
android:layout_width="0dp"
android:layout_height="wrap_content"
android:layout_marginTop="12dp"
android:layout_weight="2"
android:orientation="vertical"
>
<TextView
android:id="@+id/play_name"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal"
android:layout_marginTop="10dp"
android:text="Jurassic Park"
android:textColor="@color/colorTitle"
android:textAppearance="?android:attr/textAppearanceLarge"/>
<TextView
android:id="@+id/releasedYear"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal"
android:layout_marginTop="10dp"
android:textColor="@color/black"
android:textAppearance="?android:attr/textAppearanceMedium"/>
<TextView
android:id="@+id/briefStory"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_gravity="center_horizontal"
android:layout_margin="10dp"
android:textColor="@color/green"
android:textAppearance="?android:attr/textAppearanceSmall"/>
<TextView
android:id="@+id/play_type"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="0"
android:textColor="@color/select_play_text_color"
android:textSize="20sp"
android:visibility="gone"/>
<TextView
android:id="@+id/play_url"
android:layout_width="fill_parent"
android:layout_height="wrap_content"
android:ellipsize="end"
android:marqueeRepeatLimit="marquee_forever"
android:maxLines="2"
android:paddingTop="5dip"
android:singleLine="false"
android:textColor="@color/select_play_text_color"
android:textSize="14sp"
android:visibility="gone"/>
</LinearLayout>
</LinearLayout>
</androidx.cardview.widget.CardView>
</LinearLayout>
11. SelectMoviePlayAdapter.java
Code:
/**
* Play recyclerView adapter
*/
public class SelectMoviePlayAdapter extends RecyclerView.Adapter<SelectMoviePlayAdapter.PlayViewHolder> {
private static final String TAG = "SelectMoviePlayAdapter ";
// Data sources list
private List<MovieEntity> playList;
// Context
private Context context;
// Click item listener
private OnItemClickListener onItemClickListener;
/**
* Constructor
*
* @param context Context
* @param onItemClickListener Listener
*/
public SelectMoviePlayAdapter(Context context, OnItemClickListener onItemClickListener) {
this.context = context;
playList = new ArrayList<>();
this.onItemClickListener = onItemClickListener;
}
/**
* Set list data
*
* @param playList Play data
*/
public void setSelectPlayList(List<MovieEntity> playList) {
if (this.playList.size() > 0) {
this.playList.clear();
}
this.playList.addAll(playList);
notifyDataSetChanged();
}
@NonNull
@Override
public PlayViewHolder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
View view = LayoutInflater.from(context).inflate(R.layout.select_play_movie, parent, false);
return new PlayViewHolder(view);
}
@Override
public void onBindViewHolder(PlayViewHolder holder, final int position) {
if (playList.size() > position && holder != null) {
MovieEntity movieEntity = playList.get(position);
if (movieEntity == null) {
LogUtil.i(TAG, "current item data is empty.");
return;
}
holder.playName.setText(movieEntity.getName());
holder.releasedYear.setText(movieEntity.getYear());
holder.briefStory.setText(movieEntity.getStory());
holder.playUrl.setText(movieEntity.getUrl());
holder.playType.setText(String.valueOf(movieEntity.getUrlType()));
holder.itemView.setOnClickListener(new OnClickListener() {
@Override
public void onClick(View v) {
onItemClickListener.onItemClick(position);
}
});
Picasso.with(context).load(movieEntity.getIcon()).into(holder.videoIcon);
}
}
@Override
public int getItemCount() {
return playList.size();
}
/**
* Show view holder
*/
static class PlayViewHolder extends RecyclerView.ViewHolder {
private TextView playName;
private TextView releasedYear,briefStory;
private TextView playType;
private TextView playUrl;
private ImageView videoIcon;
/**
* Constructor
*
* @param itemView Item view
*/
public PlayViewHolder(View itemView) {
super(itemView);
if (itemView != null) {
playName = itemView.findViewById(R.id.play_name);
releasedYear = itemView.findViewById(R.id.releasedYear);
briefStory = itemView.findViewById(R.id.briefStory);
playType = itemView.findViewById(R.id.play_type);
playUrl = itemView.findViewById(R.id.play_url);
videoIcon = itemView.findViewById(R.id.videoIcon);
}
}
}
}
ScreenShots:
Conclusion:
Video Kit provides an excellent experience in video playback. In the future it will support video editing and video hosting, so that users can easily and quickly enjoy an end-to-end video solution for all scenarios.
Reference:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides-V5/introduction-0000001050439577-V5

Create and Monitor Geofences with HuaweiMap in Xamarin.Android Application

For more information like this, you can visit HUAWEI Developer Forum.
A geofence is a virtual perimeter set on a real geographic area. Combining a user's position with a geofence perimeter makes it possible to know whether the user is inside the geofence, or is entering or exiting the area.
In this article, we will discuss how to use the geofence to notify the user when the device enters/exits an area using the HMS Location Kit in a Xamarin.Android application. We will also add and customize HuaweiMap, which includes drawing circles, adding pointers, and using nearby searches in search places. We are going to learn how to use the below features together:
Geofence
Reverse Geocode
HuaweiMap
Nearby Search
Project Setup
First of all, you need to be a registered Huawei Mobile Developer and create an application in the Huawei App Console in order to use the HMS Map, Location, and Site Kits. You can follow these steps to complete the configuration required for development:
Configuring App Information in AppGallery Connect
Creating Xamarin Android Binding Libraries
Integrating the HMS Map Kit Libraries for Xamarin
Integrating the HMS Location Kit Libraries for Xamarin
Integrating the HMS Site Kit Libraries for Xamarin
Integrating the HMS Core SDK
Setting Package in Xamarin
When we create our Xamarin.Android application in the above steps, we need to make sure that the package name is the same as the one we entered in the console. Also, don't forget to enable the kits in the console.
Manifest & Permissions
We have to update the application’s manifest file by declaring permissions that we need as shown below.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_BACKGROUND_LOCATION" />
Also, add a meta-data element to embed your app ID in the application tag; it is required for the app to authenticate with Huawei's cloud server. You can find this ID in the agconnect-services.json file.
Code:
<meta-data android:name="com.huawei.hms.client.appid" android:value="appid=YOUR_APP_ID" />
Request location permission
Request runtime permissions in our app in order to use Location and Map Services. The following code checks whether the user has granted the required location permissions in Main Activity.
Code:
private void RequestPermissions()
{
if (ContextCompat.CheckSelfPermission(this, Manifest.Permission.AccessCoarseLocation) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.AccessFineLocation) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.WriteExternalStorage) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.ReadExternalStorage) != (int)Permission.Granted ||
ContextCompat.CheckSelfPermission(this, Manifest.Permission.Internet) != (int)Permission.Granted)
{
ActivityCompat.RequestPermissions(this,
new System.String[]
{
Manifest.Permission.AccessCoarseLocation,
Manifest.Permission.AccessFineLocation,
Manifest.Permission.WriteExternalStorage,
Manifest.Permission.ReadExternalStorage,
Manifest.Permission.Internet
},
100);
}
else
GetCurrentPosition();
}
public override void OnRequestPermissionsResult(int requestCode, string[] permissions, [GeneratedEnum] Android.Content.PM.Permission[] grantResults)
{
if (requestCode == 100)
{
foreach (var item in permissions)
{
if (ContextCompat.CheckSelfPermission(this, item) == Permission.Denied)
{
if (ActivityCompat.ShouldShowRequestPermissionRationale(this, permissions[0]) || ActivityCompat.ShouldShowRequestPermissionRationale(this, permissions[1]))
Snackbar.Make(FindViewById<RelativeLayout>(Resource.Id.mainLayout), "You need to grant permission to use location services.", Snackbar.LengthLong).SetAction("Ask again", v => RequestPermissions()).Show();
else
Toast.MakeText(this, "You need to grant location permissions in settings.", ToastLength.Long).Show();
}
else
GetCurrentPosition();
}
}
else
{
base.OnRequestPermissionsResult(requestCode, permissions, grantResults);
}
}
Add a Map
Within our UI, a map will be represented by either a MapFragment or MapView object. We will use the MapFragment object in this sample.
Add a <fragment> element to your activity’s layout file, activity_main.xml. This element defines a MapFragment to act as a container for the map and to provide access to the HuaweiMap object.
Also, let's add the other controls used throughout this sample: two Buttons and a SeekBar. One button clears the map and the other searches nearby locations. The SeekBar lets us set the radius of the geofence.
Code:
<RelativeLayout
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:map="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:id="@+id/mainLayout"
android:layout_width="match_parent"
android:layout_height="match_parent">
<fragment
android:id="@+id/mapfragment"
class="com.huawei.hms.maps.MapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"/>
<LinearLayout
android:orientation="vertical"
android:layout_width="wrap_content"
android:layout_height="match_parent">
<Button
android:text="Get Geofence List"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_margin="5dp"
android:padding="5dp"
android:background="@drawable/abc_btn_colored_material"
android:textColor="@android:color/white"
android:id="@+id/btnGetGeofenceList" />
<Button
android:text="Clear Map"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_margin="5dp"
android:background="@drawable/abc_btn_colored_material"
android:textColor="@android:color/white"
android:id="@+id/btnClearMap" />
</LinearLayout>
<SeekBar
android:visibility="invisible"
android:min="30"
android:layout_alignParentBottom="true"
android:layout_marginBottom="20dp"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:id="@+id/radiusBar" />
</RelativeLayout>
In our activity’s OnCreate method, set the layout file as the content view, load AGConnectService, set button’s click events, and initialize FusedLocationProviderClient. Get a handle to the map fragment by calling FragmentManager.FindFragmentById. Then use GetMapAsync to register for the map callback.
Also, implement the IOnMapReadyCallback interface to MainActivity and override OnMapReady method which is triggered when the map is ready to use.
Code:
public class MainActivity : AppCompatActivity, IOnMapReadyCallback
{
MapFragment mapFragment;
HuaweiMap hMap;
Marker marker;
Circle circle;
SeekBar radiusBar;
FusedLocationProviderClient fusedLocationProviderClient;
GeofenceModel selectedCoordinates;
List<Marker> searchMarkers;
private View search_view;
private AlertDialog alert;
public static LatLng CurrentPosition { get; set; }
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
Xamarin.Essentials.Platform.Init(this, savedInstanceState);
SetContentView(Resource.Layout.activity_main);
AGConnectServicesConfig config = AGConnectServicesConfig.FromContext(ApplicationContext);
fusedLocationProviderClient = LocationServices.GetFusedLocationProviderClient(this);
mapFragment = (MapFragment)FragmentManager.FindFragmentById(Resource.Id.mapfragment);
mapFragment.GetMapAsync(this);
FindViewById<Button>(Resource.Id.btnGetGeofenceList).Click += btnGetGeofenceList_Click;
FindViewById<Button>(Resource.Id.btnClearMap).Click += btnClearMap_Click;
radiusBar = FindViewById<SeekBar>(Resource.Id.radiusBar);
radiusBar.ProgressChanged += radiusBar_ProgressChanged;
RequestPermissions();
}
public void OnMapReady(HuaweiMap map)
{
hMap = map;
hMap.UiSettings.MyLocationButtonEnabled = true;
hMap.UiSettings.CompassEnabled = true;
hMap.UiSettings.ZoomControlsEnabled = true;
hMap.UiSettings.ZoomGesturesEnabled = true;
hMap.MyLocationEnabled = true;
hMap.MapClick += HMap_MapClick;
if (selectedCoordinates == null)
selectedCoordinates = new GeofenceModel { LatLng = CurrentPosition, Radius = 30 };
}
}
As you can see above, with the UiSettings property of the HuaweiMap object we enable the my-location button, the compass, and so on. The other available properties are listed below:
Code:
public bool CompassEnabled { get; set; }
public bool IndoorLevelPickerEnabled { get; set; }
public bool MapToolbarEnabled { get; set; }
public bool MyLocationButtonEnabled { get; set; }
public bool RotateGesturesEnabled { get; set; }
public bool ScrollGesturesEnabled { get; set; }
public bool ScrollGesturesEnabledDuringRotateOrZoom { get; set; }
public bool TiltGesturesEnabled { get; set; }
public bool ZoomControlsEnabled { get; set; }
public bool ZoomGesturesEnabled { get; set; }
Now, when the app launches, we directly get the current location and move the camera to it. To do that, we use the FusedLocationProviderClient we instantiated and call the LastLocation API.
The LastLocation API returns a Task object whose result we can check by implementing the relevant success and failure listeners. In the success listener we move the map's camera position to the last known position.
Code:
private void GetCurrentPosition()
{
var locationTask = fusedLocationProviderClient.LastLocation;
locationTask.AddOnSuccessListener(new LastLocationSuccess(this));
locationTask.AddOnFailureListener(new LastLocationFail(this));
}
...
public class LastLocationSuccess : Java.Lang.Object, IOnSuccessListener
{
private MainActivity mainActivity;
public LastLocationSuccess(MainActivity mainActivity)
{
this.mainActivity = mainActivity;
}
public void OnSuccess(Java.Lang.Object location)
{
Toast.MakeText(mainActivity, "LastLocation request successful", ToastLength.Long).Show();
if (location != null)
{
MainActivity.CurrentPosition = new LatLng((location as Location).Latitude, (location as Location).Longitude);
mainActivity.RepositionMapCamera((location as Location).Latitude, (location as Location).Longitude);
}
}
}
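The failure listener referenced above (LastLocationFail) is not shown in the excerpt; a minimal counterpart might look like this, as a sketch of our own following the same pattern as the success listener:
Code:
public class LastLocationFail : Java.Lang.Object, IOnFailureListener
{
    private MainActivity mainActivity;

    public LastLocationFail(MainActivity mainActivity)
    {
        this.mainActivity = mainActivity;
    }

    public void OnFailure(Java.Lang.Exception e)
    {
        // Surface the failure so it is clear why the camera did not move.
        Toast.MakeText(mainActivity, "LastLocation request failed: " + e.Message, ToastLength.Long).Show();
    }
}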
To change the position of the camera, we must specify where we want to move the camera, using a CameraUpdate. The Map Kit allows us to create many different types of CameraUpdate using CameraUpdateFactory.
Code:
public static CameraUpdate NewCameraPosition(CameraPosition p0);
public static CameraUpdate NewLatLng(LatLng p0);
public static CameraUpdate NewLatLngBounds(LatLngBounds p0, int p1);
public static CameraUpdate NewLatLngBounds(LatLngBounds p0, int p1, int p2, int p3);
public static CameraUpdate NewLatLngZoom(LatLng p0, float p1);
public static CameraUpdate ScrollBy(float p0, float p1);
public static CameraUpdate ZoomBy(float p0);
public static CameraUpdate ZoomBy(float p0, Point p1);
public static CameraUpdate ZoomIn();
public static CameraUpdate ZoomOut();
public static CameraUpdate ZoomTo(float p0);
There are some methods for the camera position changes as we see above. Simply these are:
1. NewLatLng: Change camera’s latitude and longitude, while keeping other properties
2. NewLatLngZoom: Changes the camera’s latitude, longitude, and zoom, while keeping other properties
3. NewCameraPosition: Full flexibility in changing the camera position
We are going to use NewCameraPosition. A CameraPosition can be obtained with a CameraPosition.Builder. And then we can set target, bearing, tilt and zoom properties.
Code:
public void RepositionMapCamera(double lat, double lng)
{
var cameraPosition = new CameraPosition.Builder();
cameraPosition.Target(new LatLng(lat, lng));
cameraPosition.Zoom(10); // zoom levels typically range from about 3 (world) to 20 (buildings)
cameraPosition.Bearing(45);
cameraPosition.Tilt(20);
CameraUpdate cameraUpdate = CameraUpdateFactory.NewCameraPosition(cameraPosition.Build());
hMap.MoveCamera(cameraUpdate);
}
Creating Geofence
Now that we've created the map, we can start creating geofences with it. In this article, we will choose the geofence location in two different ways. The first is to select the location by clicking on the map; the second is to search for nearby places by keyword and select one after they are placed on the map with markers.
Set the geofence location by clicking on the map
It is always easier to select a location by seeing it. After this section, we will be able to set a geofence around a clicked point on the map. We attached the Click event to our map in the OnMapReady method. In this Click event, we will add a marker at the clicked point and draw a circle around it.
After clicking the map, we will add a circle, a marker, and a custom info window to that point like this:
Also, we will use the SeekBar at the bottom of the page to adjust the circle radius.
We set the selectedCoordinates variable when adding the marker. Let's create the following methods to handle the map click and add the marker:
Code:
private void HMap_MapClick(object sender, HuaweiMap.MapClickEventArgs e)
{
selectedCoordinates.LatLng = e.P0;
if (circle != null)
{
circle.Remove();
circle = null;
}
AddMarkerOnMap();
}
void AddMarkerOnMap()
{
if (marker != null) marker.Remove();
var markerOption = new MarkerOptions()
.InvokeTitle("You are here now")
.InvokePosition(selectedCoordinates.LatLng);
hMap.SetInfoWindowAdapter(new MapInfoWindowAdapter(this));
marker = hMap.AddMarker(markerOption);
bool isInfoWindowShown = marker.IsInfoWindowShown;
if (isInfoWindowShown)
marker.HideInfoWindow();
else
marker.ShowInfoWindow();
}
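The click handler above removes the old circle; the perimeter itself is drawn by DrawCircleOnMap, which is also called whenever the radius changes. A minimal sketch of that method, assuming the Xamarin binding exposes CircleOptions with the same Invoke-style setters that MarkerOptions uses above:
Code:
// Minimal sketch; CircleOptions setter names are assumed to follow the binding's Invoke* convention.
public void DrawCircleOnMap(GeofenceModel geoModel)
{
    if (circle != null)
    {
        circle.Remove();   // remove the previous perimeter before drawing a new one
        circle = null;
    }
    var circleOptions = new CircleOptions()
        .InvokeCenter(geoModel.LatLng)
        .InvokeRadius(geoModel.Radius)                                // meters, driven by the SeekBar
        .InvokeFillColor(Android.Graphics.Color.Argb(60, 0, 14, 84))  // translucent fill
        .InvokeStrokeWidth(5);
    circle = hMap.AddCircle(circleOptions);
}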
With MarkerOptions we can set the title and position properties. To create a custom info window, there is the SetInfoWindowAdapter method. Add a MapInfoWindowAdapter class to the project to render the custom info window, and implement the HuaweiMap.IInfoWindowAdapter interface on it.
This interface provides a custom info window view for a marker and contains two methods:
Code:
View GetInfoContents(Marker marker);
View GetInfoWindow(Marker marker);
When an info window needs to be displayed for a marker, the adapter's methods are called: GetInfoWindow first, and if it returns null, GetInfoContents.
Now let's create a custom info window layout and name it map_info_view.xml:
Code:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:orientation="vertical"
android:layout_width="match_parent"
android:layout_height="match_parent">
<Button
android:text="Add geofence"
android:width="100dp"
style="@style/Widget.AppCompat.Button.Colored"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:id="@+id/btnInfoWindow" />
</LinearLayout>
And return it after customizing it in GetInfoWindow() method. The full code of the adapter is below:
Code:
internal class MapInfoWindowAdapter : Java.Lang.Object, HuaweiMap.IInfoWindowAdapter
{
private MainActivity activity;
private GeofenceModel selectedCoordinates;
private View addressLayout;
public MapInfoWindowAdapter(MainActivity currentActivity)
{
activity = currentActivity;
}
public View GetInfoContents(Marker marker)
{
return null;
}
public View GetInfoWindow(Marker marker)
{
if (marker == null)
return null;
//update every time; DrawCircleOnMap needs the latest coordinates
selectedCoordinates = new GeofenceModel { LatLng = new LatLng(marker.Position.Latitude, marker.Position.Longitude) };
View mapInfoView = activity.LayoutInflater.Inflate(Resource.Layout.map_info_view, null);
var radiusBar = activity.FindViewById<SeekBar>(Resource.Id.radiusBar);
if (radiusBar.Visibility == Android.Views.ViewStates.Invisible)
{
radiusBar.Visibility = Android.Views.ViewStates.Visible;
radiusBar.SetProgress(30, true);
}
activity.FindViewById<SeekBar>(Resource.Id.radiusBar)?.SetProgress(30, true);
activity.DrawCircleOnMap(selectedCoordinates);
Button button = mapInfoView.FindViewById<Button>(Resource.Id.btnInfoWindow);
button.Click += btnInfoWindow_ClickAsync;
return mapInfoView;
}
}
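The btnInfoWindow_ClickAsync handler is where the geofence itself would be registered with the Location Kit. A minimal sketch of building and adding a geofence, assuming the Xamarin binding mirrors the Java API (GeofenceBuilder, GeofenceRequest.Builder, GeofenceService); verify the exact member names against your binding libraries:
Code:
// Minimal sketch; class and method names assume the Xamarin binding mirrors the Java API.
private async void AddGeofence(GeofenceModel model)
{
    var geofence = new GeofenceBuilder()
        .SetUniqueId(DateTime.Now.Ticks.ToString())   // any unique ID
        .SetValidContinueTime(10000)                   // geofence lifetime in ms
        .SetRoundArea(model.LatLng.Latitude, model.LatLng.Longitude, (float)model.Radius)
        .SetConversions(Geofence.EnterGeofenceConversion | Geofence.ExitGeofenceConversion)
        .Build();

    var request = new GeofenceRequest.Builder()
        .CreateGeofence(geofence)
        .Build();

    GeofenceService geofenceService = LocationServices.GetGeofenceService(this);
    // CreatePendingIntent() is a hypothetical helper returning a PendingIntent for your broadcast receiver.
    await geofenceService.CreateGeofenceListAsync(request, CreatePendingIntent());
}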
This is not the end. For full content, you can visit https://forums.developer.huawei.com/forumPortal/en/topicview?tid=0201357111605920240&fid=0101187876626530001

Expert: Integration of Huawei ML Kit for Scene Detection in Xamarin(Android)

Overview
In this article, I will create a demo app demonstrating the integration of ML Kit Scene Detection, based on the cross-platform technology Xamarin. The service classifies image sets by scenario so that intelligent album sets can be generated, and users can select camera parameters based on the photographing scene in the app to take better-looking photos.
Scene Detection Service Introduction
The ML Kit scene detection service can classify the scenario content of images and add labels, such as outdoor scenery, indoor places, and buildings, helping you understand the image content. Based on the detected information, you can create a more personalized app experience for users. Currently, on-device detection supports 102 scenarios.
Prerequisite
Xamarin Framework
Huawei phone
Visual Studio 2019
App Gallery Integration process
Sign in and create or choose a project on the AppGallery Connect portal.
Navigate to Project settings and download the configuration file.
Navigate to General Information, and then provide the Data Storage location.
Navigate to Manage APIs and enable ML Kit.
Installing the Huawei ML NuGet package
Navigate to Solution Explorer > Project > Right Click > Manage NuGet Packages.
Install Huawei.Hms.MlComputerVisionScenedetection in reference.
Install Huawei.Hms.MlComputerVisionScenedetectionInner in reference.
Install Huawei.Hms.MlComputerVisionScenedetectionModel in reference.
Xamarin App Development
Open Visual Studio 2019 and Create A New Project.
Configure the Manifest file and add the following permissions and features.
Code:
<uses-feature android:name="android.hardware.camera" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
</manifest>
3. Create the Activity class with XML UI.
GraphicOverlay.cs
This class performs scaling and mirroring of the graphics relative to the camera's preview properties.
Code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Android.App;
using Android.Content;
using Android.Graphics;
using Android.OS;
using Android.Runtime;
using Android.Util;
using Android.Views;
using Android.Widget;
using Huawei.Hms.Mlsdk.Common;
namespace SceneDetectionDemo
{
public class GraphicOverlay : View
{
private readonly object mLock = new object();
public int mPreviewWidth;
public float mWidthScaleFactor = 1.0f;
public int mPreviewHeight;
public float mHeightScaleFactor = 1.0f;
public int mFacing = LensEngine.BackLens;
private HashSet<Graphic> mGraphics = new HashSet<Graphic>();
public GraphicOverlay(Context context, IAttributeSet attrs) : base(context,attrs)
{
}
/// <summary>
/// Removes all graphics from the overlay.
/// </summary>
public void Clear()
{
lock(mLock) {
mGraphics.Clear();
}
PostInvalidate();
}
/// <summary>
/// Adds a graphic to the overlay.
/// </summary>
public void Add(Graphic graphic)
{
lock(mLock) {
mGraphics.Add(graphic);
}
PostInvalidate();
}
/// <summary>
/// Removes a graphic from the overlay.
/// </summary>
public void Remove(Graphic graphic)
{
lock(mLock)
{
mGraphics.Remove(graphic);
}
PostInvalidate();
}
/// <summary>
/// Sets the camera attributes for size and facing direction, which informs how to transform image coordinates later.
/// </summary>
public void SetCameraInfo(int previewWidth, int previewHeight, int facing)
{
lock(mLock) {
mPreviewWidth = previewWidth;
mPreviewHeight = previewHeight;
mFacing = facing;
}
PostInvalidate();
}
/// <summary>
/// Draws the overlay with its associated graphic objects.
/// </summary>
protected override void OnDraw(Canvas canvas)
{
base.OnDraw(canvas);
lock (mLock)
{
if ((mPreviewWidth != 0) && (mPreviewHeight != 0))
{
mWidthScaleFactor = (float)canvas.Width / (float)mPreviewWidth;
mHeightScaleFactor = (float)canvas.Height / (float)mPreviewHeight;
}
foreach (Graphic graphic in mGraphics)
{
graphic.Draw(canvas);
}
}
}
}
/// <summary>
/// Base class for a custom graphics object to be rendered within the graphic overlay. Subclass
/// this and implement the {Graphic#Draw(Canvas)} method to define the
/// graphics element. Add instances to the overlay using {GraphicOverlay#Add(Graphic)}.
/// </summary>
public abstract class Graphic
{
private GraphicOverlay mOverlay;
public Graphic(GraphicOverlay overlay)
{
mOverlay = overlay;
}
/// <summary>
/// Draw the graphic on the supplied canvas. Drawing should use the following methods to
/// convert to view coordinates for the graphics that are drawn:
/// <ol>
/// <li>{Graphic#ScaleX(float)} and {Graphic#ScaleY(float)} adjust the size of
/// the supplied value from the preview scale to the view scale.</li>
/// <li>{Graphic#TranslateX(float)} and {Graphic#TranslateY(float)} adjust the
/// coordinate from the preview's coordinate system to the view coordinate system.</li>
/// </ol>
/// </summary>
/// <param name="canvas">The drawing canvas.</param>
public abstract void Draw(Canvas canvas);
/// <summary>
/// Adjusts a horizontal value of the supplied value from the preview scale to the view
/// scale.
/// </summary>
public float ScaleX(float horizontal)
{
return horizontal * mOverlay.mWidthScaleFactor;
}
public float UnScaleX(float horizontal)
{
return horizontal / mOverlay.mWidthScaleFactor;
}
/// <summary>
/// Adjusts a vertical value of the supplied value from the preview scale to the view scale.
/// </summary>
public float ScaleY(float vertical)
{
return vertical * mOverlay.mHeightScaleFactor;
}
public float UnScaleY(float vertical) { return vertical / mOverlay.mHeightScaleFactor; }
/// <summary>
/// Adjusts the x coordinate from the preview's coordinate system to the view coordinate system.
/// </summary>
public float TranslateX(float x)
{
if (mOverlay.mFacing == LensEngine.FrontLens)
{
return mOverlay.Width - ScaleX(x);
}
else
{
return ScaleX(x);
}
}
/// <summary>
/// Adjusts the y coordinate from the preview's coordinate system to the view coordinate system.
/// </summary>
public float TranslateY(float y)
{
return ScaleY(y);
}
public void PostInvalidate()
{
this.mOverlay.PostInvalidate();
}
}
}
LensEnginePreview.cs
This class hosts the camera lens preview on a SurfaceView and starts the LensEngine, which displays and analyzes the live camera frames.
Code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Android.App;
using Android.Content;
using Android.Graphics;
using Android.OS;
using Android.Runtime;
using Android.Util;
using Android.Views;
using Android.Widget;
using Huawei.Hms.Mlsdk.Common;
namespace HmsXamarinMLDemo.Camera
{
public class LensEnginePreview :ViewGroup
{
private const string Tag = "LensEnginePreview";
private Context mContext;
protected SurfaceView mSurfaceView;
private bool mStartRequested;
private bool mSurfaceAvailable;
private LensEngine mLensEngine;
private GraphicOverlay mOverlay;
public LensEnginePreview(Context context, IAttributeSet attrs) : base(context,attrs)
{
this.mContext = context;
this.mStartRequested = false;
this.mSurfaceAvailable = false;
this.mSurfaceView = new SurfaceView(context);
this.mSurfaceView.Holder.AddCallback(new SurfaceCallback(this));
this.AddView(this.mSurfaceView);
}
public void start(LensEngine lensEngine)
{
if (lensEngine == null)
{
this.stop();
}
this.mLensEngine = lensEngine;
if (this.mLensEngine != null)
{
this.mStartRequested = true;
this.startIfReady();
}
}
public void start(LensEngine lensEngine, GraphicOverlay overlay)
{
this.mOverlay = overlay;
this.start(lensEngine);
}
public void stop()
{
if (this.mLensEngine != null)
{
this.mLensEngine.Close();
}
}
public void release()
{
if (this.mLensEngine != null)
{
this.mLensEngine.Release();
this.mLensEngine = null;
}
}
private void startIfReady()
{
if (this.mStartRequested && this.mSurfaceAvailable) {
this.mLensEngine.Run(this.mSurfaceView.Holder);
if (this.mOverlay != null)
{
Huawei.Hms.Common.Size.Size size = this.mLensEngine.DisplayDimension;
// Use the engine's actual preview size instead of hard-coded values.
int min = Math.Min(size.Width, size.Height);
int max = Math.Max(size.Width, size.Height);
if (this.isPortraitMode())
{
// Swap width and height sizes when in portrait, since it will be rotated by 90 degrees.
this.mOverlay.SetCameraInfo(min, max, this.mLensEngine.LensType);
}
else
{
this.mOverlay.SetCameraInfo(max, min, this.mLensEngine.LensType);
}
this.mOverlay.Clear();
}
this.mStartRequested = false;
}
}
private class SurfaceCallback : Java.Lang.Object, ISurfaceHolderCallback
{
private LensEnginePreview lensEnginePreview;
public SurfaceCallback(LensEnginePreview LensEnginePreview)
{
this.lensEnginePreview = LensEnginePreview;
}
public void SurfaceChanged(ISurfaceHolder holder, [GeneratedEnum] Format format, int width, int height)
{
}
public void SurfaceCreated(ISurfaceHolder holder)
{
this.lensEnginePreview.mSurfaceAvailable = true;
try
{
this.lensEnginePreview.startIfReady();
}
catch (Exception e)
{
Log.Info(LensEnginePreview.Tag, "Could not start camera source.", e);
}
}
public void SurfaceDestroyed(ISurfaceHolder holder)
{
this.lensEnginePreview.mSurfaceAvailable = false;
}
}
protected override void OnLayout(bool changed, int l, int t, int r, int b)
{
int previewWidth = 480;
int previewHeight = 360;
if (this.mLensEngine != null)
{
Huawei.Hms.Common.Size.Size size = this.mLensEngine.DisplayDimension;
if (size != null)
{
// Use the engine's actual preview size when available.
previewWidth = size.Width;
previewHeight = size.Height;
}
}
}
// Swap width and height sizes when in portrait, since it will be rotated 90 degrees
if (this.isPortraitMode())
{
int tmp = previewWidth;
previewWidth = previewHeight;
previewHeight = tmp;
}
int viewWidth = r - l;
int viewHeight = b - t;
int childWidth;
int childHeight;
int childXOffset = 0;
int childYOffset = 0;
float widthRatio = (float)viewWidth / (float)previewWidth;
float heightRatio = (float)viewHeight / (float)previewHeight;
// To fill the view with the camera preview, while also preserving the correct aspect ratio,
// it is usually necessary to slightly oversize the child and to crop off portions along one
// of the dimensions. We scale up based on the dimension requiring the most correction, and
// compute a crop offset for the other dimension.
if (widthRatio > heightRatio)
{
childWidth = viewWidth;
childHeight = (int)((float)previewHeight * widthRatio);
childYOffset = (childHeight - viewHeight) / 2;
}
else
{
childWidth = (int)((float)previewWidth * heightRatio);
childHeight = viewHeight;
childXOffset = (childWidth - viewWidth) / 2;
}
for (int i = 0; i < this.ChildCount; ++i)
{
// One dimension will be cropped. We shift child over or up by this offset and adjust
// the size to maintain the proper aspect ratio.
this.GetChildAt(i).Layout(-1 * childXOffset, -1 * childYOffset, childWidth - childXOffset,
childHeight - childYOffset);
}
try
{
this.startIfReady();
}
catch (Exception e)
{
Log.Info(LensEnginePreview.Tag, "Could not start camera source.", e);
}
}
private bool isPortraitMode()
{
var orientation = this.mContext.Resources.Configuration.Orientation;
return orientation == Android.Content.Res.Orientation.Portrait;
}
}
}
activity_scene_detection.xml
Code:
<?xml version="1.0" encoding="utf-8"?>
<RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#000"
android:fitsSystemWindows="true"
android:keepScreenOn="true"
android:orientation="vertical">
<ToggleButton
android:id="@+id/facingSwitch"
android:layout_width="65dp"
android:layout_height="65dp"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginBottom="5dp"
android:background="@drawable/facingswitch_stroke"
android:textOff=""
android:textOn="" />
<com.huawei.mlkit.sample.camera.LensEnginePreview
android:id="@+id/preview"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:layout_alignParentStart="true"
android:layout_alignParentTop="true">
<com.huawei.mlkit.sample.views.overlay.GraphicOverlay
android:id="@+id/overlay"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="20dp"
android:layout_marginEnd="20dp" />
</com.huawei.mlkit.sample.camera.LensEnginePreview>
<RelativeLayout
android:id="@+id/rl_select_album_result"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:background="#000"
android:visibility="gone">
<ImageView
android:id="@+id/iv_result"
android:layout_width="match_parent"
android:layout_height="match_parent"
android:layout_alignParentRight="true" />
<TextView
android:id="@+id/result"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_centerHorizontal="true"
android:layout_marginBottom="100dp"
android:textColor="@color/upsdk_white" />
</RelativeLayout>
<ImageView
android:id="@+id/iv_select_album"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentRight="true"
android:layout_marginTop="20dp"
android:layout_marginEnd="20dp"
android:src="@drawable/select_album" />
<ImageView
android:id="@+id/iv_return_back"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_marginStart="20dp"
android:layout_marginTop="20dp"
android:src="@drawable/return_back" />
<ImageView
android:id="@+id/iv_left_top"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@id/iv_return_back"
android:layout_marginStart="20dp"
android:layout_marginTop="20dp"
android:src="@drawable/left_top_arrow" />
<ImageView
android:id="@+id/iv_right_top"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_below="@id/iv_select_album"
android:layout_alignParentRight="true"
android:layout_marginTop="23dp"
android:layout_marginEnd="20dp"
android:src="@drawable/right_top_arrow" />
<ImageView
android:id="@+id/iv_left_bottom"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentBottom="true"
android:layout_marginStart="20dp"
android:layout_marginBottom="70dp"
android:src="@drawable/left_bottom_arrow" />
<ImageView
android:id="@+id/iv_right_bottom"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:layout_alignParentRight="true"
android:layout_alignParentBottom="true"
android:layout_marginEnd="20dp"
android:layout_marginBottom="70dp"
android:src="@drawable/right_bottom_arrow" />
</RelativeLayout>
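Note: the fully qualified view names in this layout (com.huawei.mlkit.sample.camera.LensEnginePreview and com.huawei.mlkit.sample.views.overlay.GraphicOverlay) are carried over from the original Java sample. In your own Xamarin project they must match the registered Java names of your LensEnginePreview and GraphicOverlay classes, otherwise layout inflation will fail at runtime.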
SceneDetectionActivity.cs
This activity performs all the operations for live scene detection.
Code:
using Android.App;
using Android.Content;
using Android.OS;
using Android.Runtime;
using Android.Support.V7.App;
using Android.Views;
using Android.Widget;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Huawei.Hms.Mlsdk.Common;
using Huawei.Hms.Mlsdk.Scd;
using HmsXamarinMLDemo.Camera;
using Android.Support.V4.App;
using Android;
using Android.Util;
using Android.Content.PM;
namespace SceneDetectionDemo
{
[Activity(Label = "SceneDetectionActivity")]
public class SceneDetectionActivity : AppCompatActivity, View.IOnClickListener, MLAnalyzer.IMLTransactor
{
private const string Tag = "SceneDetectionLiveAnalyseActivity";
private const int CameraPermissionCode = 0;
private MLSceneDetectionAnalyzer analyzer;
private LensEngine mLensEngine;
private LensEnginePreview mPreview;
private GraphicOverlay mOverlay;
private int lensType = LensEngine.FrontLens;
private bool isFront = true;
protected override void OnCreate(Bundle savedInstanceState)
{
base.OnCreate(savedInstanceState);
this.SetContentView(Resource.Layout.activity_scene_detection);
this.mPreview = (LensEnginePreview)this.FindViewById(Resource.Id.preview);
this.mOverlay = (GraphicOverlay)this.FindViewById(Resource.Id.overlay);
this.FindViewById(Resource.Id.facingSwitch).SetOnClickListener(this);
if (savedInstanceState != null)
{
this.lensType = savedInstanceState.GetInt("lensType");
}
this.CreateSegmentAnalyzer();
// Checking Camera Permissions
if (ActivityCompat.CheckSelfPermission(this, Manifest.Permission.Camera) == Android.Content.PM.Permission.Granted)
{
this.CreateLensEngine();
}
else
{
this.RequestCameraPermission();
}
}
private void CreateLensEngine()
{
Context context = this.ApplicationContext;
// Create LensEngine.
this.mLensEngine = new LensEngine.Creator(context, this.analyzer).SetLensType(this.lensType)
.ApplyDisplayDimension(960, 720)
.ApplyFps(25.0f)
.EnableAutomaticFocus(true)
.Create();
}
public override void OnRequestPermissionsResult(int requestCode, string[] permissions, [GeneratedEnum] Permission[] grantResults)
{
if (requestCode != CameraPermissionCode)
{
base.OnRequestPermissionsResult(requestCode, permissions, grantResults);
return;
}
if (grantResults.Length != 0 && grantResults[0] == Permission.Granted)
{
this.CreateLensEngine();
return;
}
}
protected override void OnSaveInstanceState(Bundle outState)
{
outState.PutInt("lensType", this.lensType);
base.OnSaveInstanceState(outState);
}
protected override void OnResume()
{
base.OnResume();
if (ActivityCompat.CheckSelfPermission(this, Manifest.Permission.Camera) == Permission.Granted)
{
this.CreateLensEngine();
this.StartLensEngine();
}
else
{
this.RequestCameraPermission();
}
}
public void OnClick(View v)
{
this.isFront = !this.isFront;
if (this.isFront)
{
this.lensType = LensEngine.FrontLens;
}
else
{
this.lensType = LensEngine.BackLens;
}
if (this.mLensEngine != null)
{
this.mLensEngine.Close();
}
this.CreateLensEngine();
this.StartLensEngine();
}
private void StartLensEngine()
{
if (this.mLensEngine != null)
{
try
{
this.mPreview.start(this.mLensEngine, this.mOverlay);
}
catch (Exception e)
{
Log.Error(Tag, "Failed to start lens engine.", e);
this.mLensEngine.Release();
this.mLensEngine = null;
}
}
}
private void CreateSegmentAnalyzer()
{
this.analyzer = MLSceneDetectionAnalyzerFactory.Instance.SceneDetectionAnalyzer;
this.analyzer.SetTransactor(this);
}
protected override void OnPause()
{
base.OnPause();
this.mPreview.stop();
}
protected override void OnDestroy()
{
base.OnDestroy();
if (this.mLensEngine != null)
{
this.mLensEngine.Release();
}
if (this.analyzer != null)
{
this.analyzer.Stop();
}
}
//Request permission
private void RequestCameraPermission()
{
string[] permissions = new string[] { Manifest.Permission.Camera };
// Request the permission directly; a production app may first show a
// rationale when ShouldShowRequestPermissionRationale returns true.
ActivityCompat.RequestPermissions(this, permissions, CameraPermissionCode);
}
/// <summary>
/// Implemented from MLAnalyzer.IMLTransactor interface
/// </summary>
public void Destroy()
{
// Nothing to release here; the analyzer is stopped in OnDestroy().
}
/// <summary>
/// Implemented from MLAnalyzer.IMLTransactor interface.
/// Process the results returned by the analyzer.
/// </summary>
public void TransactResult(MLAnalyzer.Result result)
{
mOverlay.Clear();
SparseArray sceneResults = result.AnalyseList;
IList<MLSceneDetection> list = new List<MLSceneDetection>();
for (int i = 0; i < sceneResults.Size(); i++)
{
list.Add((MLSceneDetection)sceneResults.ValueAt(i));
}
MLSceneDetectionGraphic sceneDetectionGraphic = new MLSceneDetectionGraphic(mOverlay, list);
mOverlay.Add(sceneDetectionGraphic);
mOverlay.PostInvalidate();
}
}
}
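TransactResult() above renders the detections through an MLSceneDetectionGraphic class that is not shown in this article. A minimal sketch of such a class, assuming the Xamarin binding of MLSceneDetection exposes Result and Confidence properties (names may differ slightly in your SDK version):
Code:
using System.Collections.Generic;
using Android.Graphics;
using Huawei.Hms.Mlsdk.Scd;
namespace SceneDetectionDemo
{
internal class MLSceneDetectionGraphic : Graphic
{
private readonly IList<MLSceneDetection> results;
private readonly Paint textPaint;
public MLSceneDetectionGraphic(GraphicOverlay overlay, IList<MLSceneDetection> results) : base(overlay)
{
this.results = results;
// Simple white label paint; adjust size and color to match your UI.
textPaint = new Paint { Color = Color.White, TextSize = 48f };
}
public override void Draw(Canvas canvas)
{
// Stack each detected scene and its confidence from the top-left corner.
float y = 80f;
foreach (MLSceneDetection scene in results)
{
canvas.DrawText($"{scene.Result} : {scene.Confidence:0.00}", 40f, y, textPaint);
y += 60f;
}
}
}
}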
Xamarin App Build Result
Navigate to Build > Build Solution.
Navigate to Solution Explorer > Project > Right Click > Archive/View Archive to generate SHA-256 for the release build, and click Distribute.
Choose Archive > Distribute.
Choose Distribution Channel > Ad Hoc to sign apk.
Choose Demo keystore to release apk.
After the build succeeds, click Save.
Result.
Tips and Tricks
The minimum resolution is 224 x 224 and the maximum resolution is 4096 x 4960.
Each scene detection result carries a confidence value. Call the synchronous or asynchronous scene detection APIs to obtain the result set, then filter out results whose confidence is lower than your chosen threshold, as in the sketch below.
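For example, the analyzer can be created with a minimum confidence so the SDK filters out low-confidence scenes for you. A hedged sketch, assuming the usual Xamarin binding names for the Java MLSceneDetectionAnalyzerSetting API (which exposes setConfidence()):
Code:
// Keep only scenes detected with confidence >= 0.5.
MLSceneDetectionAnalyzerSetting setting = new MLSceneDetectionAnalyzerSetting.Factory()
.SetConfidence(0.5f)
.Create();
MLSceneDetectionAnalyzer analyzer = MLSceneDetectionAnalyzerFactory.Instance.GetSceneDetectionAnalyzer(setting);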
Conclusion
In this article, we have learned how to integrate ML Kit Scene Detection in a Xamarin-based Android application. Users can detect indoor and outdoor places and things live with the help of the Scene Detection API in the application.
Thanks for reading this article. Be sure to like and comment on this article if you found it helpful. It means a lot to me.
References
HMS Core ML Scene Detection Docs: https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides-V5/scene-detection-0000001055162807-V5
Original Source

Book Reader Application Using General Text Recognition by Huawei HiAI in Android

Introduction
In this article, we will learn how to integrate Huawei General Text Recognition using Huawei HiAI. We will build the Book reader application.
About application:
Usually, users get bored reading books. This application lets them listen to a book being read aloud instead of reading it manually. All they need to do is capture a photo of the book, or select an image from the gallery, and listen to it like music whenever they are travelling or have free time.
Huawei general text recognition works on OCR technology.
First let us understand about OCR.
What is optical character recognition (OCR)?
Optical Character Recognition (OCR) technology is a business solution for automating data extraction from printed or written text from a scanned document or image file and then converting the text into a machine-readable form to be used for data processing like editing or searching.
Now let us understand about General Text Recognition (GTR).
At the core of the GTR is Optical Character Recognition (OCR) technology, which extracts text in screenshots and photos taken by the phone camera. For photos taken by the camera, this API can correct for tilts, camera angles, reflections, and messy backgrounds up to a certain degree. It can also be used for document and streetscape photography, as well as a wide range of usage scenarios, and it features strong anti-interference capability. This API works on device side processing and service connection.
Features
For photos: Provides text area detection and text recognition for Chinese, English, Japanese, Korean, Russian, Italian, Spanish, Portuguese, German, and French texts in multiple printing fonts. A wide range of scenarios is supported, and high recognition accuracy can be achieved even under complex lighting conditions, busy backgrounds, and more.
For screenshots: Optimizes text extraction algorithms based on the characteristics of screenshots captured on mobile phones. Currently, this function is available in the Chinese mainland, supporting Chinese and English text.
OCR features
Lightweight: This API greatly reduces the computing time and ROM space the algorithm model takes up, making your app more lightweight.
Customized hierarchical result return: You can choose to return the coordinates of text blocks, text lines, and text characters in the screenshot based on app requirements.
How to integrate General Text Recognition
1. Configure the application on the AGC.
2. Apply for HiAI Engine Library
3. Client application development process.
Configure application on the AGC
Follow the steps
Step 1: Register a developer account in AppGallery Connect. If you already have a developer account, ignore this step.
Step 2: Create an app by referring to Creating a Project and Creating an App in the Project
Step 3: Set the data storage location based on the current location.
Step 4: Generating a Signing Certificate Fingerprint.
Step 5: Configuring the Signing Certificate Fingerprint.
Step 6: Download your agconnect-services.json file, paste it into the app root directory.
Apply for HiAI Engine Library
What is Huawei HiAI?
HUAWEI HiAI is a mobile terminal–oriented artificial intelligence (AI) computing platform that constructs three layers of ecology: service capability openness, application capability openness, and chip capability openness. This three-layer open platform, integrating terminals, chips, and the cloud, brings a more extraordinary experience to users and developers.
How to apply for HiAI Engine?
Follow the steps
Step 1: Navigate to this URL, choose App Service > Development and click HUAWEI HiAI.
Step 2: Click Apply for HUAWEI HiAI kit.
Step 3: Enter required information like Product name and Package name, click Next button.
Step 4: Verify the application details and click Submit button.
Step 5: Click the Download SDK button to open the SDK list.
Step 6: Unzip the downloaded SDK and add it to your Android project under the libs folder.
Step 7: Add the AAR/JAR file dependencies to the app-level build.gradle file.
Code:
implementation fileTree(include: ['*.aar', '*.jar'], dir: 'libs')
implementation 'com.google.code.gson:gson:2.8.6'
repositories {
flatDir {
dirs 'libs'
}
}
Client application development process
Follow the steps
Step 1: Create an Android application in Android Studio (or any IDE you prefer).
Step 2: Add the App level Gradle dependencies. Choose inside project Android > app > build.gradle.
Code:
apply plugin: 'com.android.application'
apply plugin: 'com.huawei.agconnect'
Root level gradle dependencies.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
Step 3: Add permission in AndroidManifest.xml
XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_INTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
Step 4: Build application.
Initialize all view.
Java:
private void initializeView() {
mPlayAudio = findViewById(R.id.playAudio);
mTxtViewResult = findViewById(R.id.result);
mImageView = findViewById(R.id.imgViewPicture);
}
Request the runtime permission
Java:
private void requestPermissions() {
try {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
int permission1 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE);
int permission2 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.CAMERA);
if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length <= 0
|| grantResults[0] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
}
}
Initialize vision base
Java:
private void initVision() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
Log.e(TAG, " onServiceConnect");
}
@Override
public void onServiceDisconnect() {
Log.e(TAG, " onServiceDisconnect");
}
});
}
Initialize text to speech
Java:
private void initializeTextToSpeech() {
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
textToSpeech.setLanguage(Locale.UK);
}
}
});
}
Create TextDetector instance.
Java:
mTextDetector = new TextDetector(this);
Define Vision image.
Java:
VisionImage image = VisionImage.fromBitmap(mBitmap);
Create instance of Text class.
Java:
final Text result = new Text();
Create and set VisionTextConfiguration
Java:
VisionTextConfiguration config = new VisionTextConfiguration.Builder()
.setAppType(VisionTextConfiguration.APP_NORMAL)
.setProcessMode(VisionTextConfiguration.MODE_IN)
.setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
.setLanguage(TextConfiguration.AUTO).build();
//Set vision configuration
mTextDetector.setVisionConfiguration(config);
Call detect method to get the result
Java:
int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
@Override
public void onResult(Text text) {
dismissDialog();
Message message = Message.obtain();
message.what = TYPE_SHOW_RESULT;
message.obj = text;
mHandler.sendMessage(message);
}
@Override
public void onError(int i) {
Log.d(TAG, "Callback: onError " + i);
mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
}
@Override
public void onProcessing(float v) {
Log.d(TAG, "Callback: onProcessing:" + v);
}
});
Create Handler
Java:
private final Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
int status = msg.what;
Log.d(TAG, "handleMessage status = " + status);
switch (status) {
case TYPE_CHOOSE_PHOTO: {
if (mBitmap == null) {
Log.e(TAG, "bitmap is null");
return;
}
mImageView.setImageBitmap(mBitmap);
mTxtViewResult.setText("");
showDialog();
detectTex();
break;
}
case TYPE_SHOW_RESULT: {
Text result = (Text) msg.obj;
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (result == null) {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
String textValue = result.getValue();
Log.d(TAG, "text value: " + textValue);
StringBuffer textResult = new StringBuffer();
List<TextLine> textLines = result.getBlocks().get(0).getTextLines();
for (TextLine line : textLines) {
textResult.append(line.getValue() + " ");
}
Log.d(TAG, "OCR Detection succeeded.");
mTxtViewResult.setText(textResult.toString());
textToSpeechString = textResult.toString();
break;
}
case TYPE_TEXT_ERROR: {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
default:
break;
}
}
};
Complete code as follows
Java:
import android.Manifest;
import android.app.Activity;
import android.app.ProgressDialog;
import android.content.Intent;
import android.content.pm.PackageManager;
import android.database.Cursor;
import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.net.Uri;
import android.os.Build;
import android.os.Handler;
import android.os.Message;
import android.provider.MediaStore;
import android.speech.tts.TextToSpeech;
import android.support.v4.app.ActivityCompat;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.View;
import android.widget.Button;
import android.widget.ImageView;
import android.widget.TextView;
import android.widget.Toast;
import com.huawei.hiai.vision.common.ConnectionCallback;
import com.huawei.hiai.vision.common.VisionBase;
import com.huawei.hiai.vision.common.VisionCallback;
import com.huawei.hiai.vision.common.VisionImage;
import com.huawei.hiai.vision.text.TextDetector;
import com.huawei.hiai.vision.visionkit.text.Text;
import com.huawei.hiai.vision.visionkit.text.TextDetectType;
import com.huawei.hiai.vision.visionkit.text.TextLine;
import com.huawei.hiai.vision.visionkit.text.config.TextConfiguration;
import com.huawei.hiai.vision.visionkit.text.config.VisionTextConfiguration;
import java.util.List;
import java.util.Locale;
public class MainActivity extends AppCompatActivity {
private static final String TAG = MainActivity.class.getSimpleName();
private static final int REQUEST_CHOOSE_PHOTO_CODE = 2;
private Bitmap mBitmap;
private ImageView mPlayAudio;
private ImageView mImageView;
private TextView mTxtViewResult;
protected ProgressDialog dialog;
private TextDetector mTextDetector;
Text imageText = null;
TextToSpeech textToSpeech;
String textToSpeechString = "";
private static final int TYPE_CHOOSE_PHOTO = 1;
private static final int TYPE_SHOW_RESULT = 2;
private static final int TYPE_TEXT_ERROR = 3;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
initializeView();
requestPermissions();
initVision();
initializeTextToSpeech();
}
private void initializeView() {
mPlayAudio = findViewById(R.id.playAudio);
mTxtViewResult = findViewById(R.id.result);
mImageView = findViewById(R.id.imgViewPicture);
}
private void initVision() {
VisionBase.init(this, new ConnectionCallback() {
@Override
public void onServiceConnect() {
Log.e(TAG, " onServiceConnect");
}
@Override
public void onServiceDisconnect() {
Log.e(TAG, " onServiceDisconnect");
}
});
}
private void initializeTextToSpeech() {
textToSpeech = new TextToSpeech(getApplicationContext(), new TextToSpeech.OnInitListener() {
@Override
public void onInit(int status) {
if (status != TextToSpeech.ERROR) {
textToSpeech.setLanguage(Locale.UK);
}
}
});
}
public void onChildClick(View view) {
switch (view.getId()) {
case R.id.btnSelect: {
Log.d(TAG, "Select an image");
Intent intent = new Intent(Intent.ACTION_PICK);
intent.setType("image/*");
startActivityForResult(intent, REQUEST_CHOOSE_PHOTO_CODE);
break;
}
case R.id.playAudio: {
if (textToSpeechString != null && !textToSpeechString.isEmpty())
textToSpeech.speak(textToSpeechString, TextToSpeech.QUEUE_FLUSH, null);
break;
}
}
}
private void detectTex() {
/* create a TextDetector instance firstly */
mTextDetector = new TextDetector(this);
/*Define VisionImage and transfer the Bitmap image to be detected*/
VisionImage image = VisionImage.fromBitmap(mBitmap);
/*Define the Text class.*/
final Text result = new Text();
/*Use VisionTextConfiguration to select the type of the image to be called. */
VisionTextConfiguration config = new VisionTextConfiguration.Builder()
.setAppType(VisionTextConfiguration.APP_NORMAL)
.setProcessMode(VisionTextConfiguration.MODE_IN)
.setDetectType(TextDetectType.TYPE_TEXT_DETECT_FOCUS_SHOOT)
.setLanguage(TextConfiguration.AUTO).build();
//Set vision configuration
mTextDetector.setVisionConfiguration(config);
/*Call the detect method of TextDetector to obtain the result*/
int result_code = mTextDetector.detect(image, result, new VisionCallback<Text>() {
@Override
public void onResult(Text text) {
dismissDialog();
Message message = Message.obtain();
message.what = TYPE_SHOW_RESULT;
message.obj = text;
mHandler.sendMessage(message);
}
@Override
public void onError(int i) {
Log.d(TAG, "Callback: onError " + i);
mHandler.sendEmptyMessage(TYPE_TEXT_ERROR);
}
@Override
public void onProcessing(float v) {
Log.d(TAG, "Callback: onProcessing:" + v);
}
});
}
private void showDialog() {
if (dialog == null) {
dialog = new ProgressDialog(MainActivity.this);
dialog.setTitle("Detecting text...");
dialog.setMessage("Please wait...");
dialog.setIndeterminate(true);
dialog.setCancelable(false);
}
dialog.show();
}
private final Handler mHandler = new Handler() {
@Override
public void handleMessage(Message msg) {
super.handleMessage(msg);
int status = msg.what;
Log.d(TAG, "handleMessage status = " + status);
switch (status) {
case TYPE_CHOOSE_PHOTO: {
if (mBitmap == null) {
Log.e(TAG, "bitmap is null");
return;
}
mImageView.setImageBitmap(mBitmap);
mTxtViewResult.setText("");
showDialog();
detectTex();
break;
}
case TYPE_SHOW_RESULT: {
Text result = (Text) msg.obj;
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
if (result == null) {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
String textValue = result.getValue();
Log.d(TAG, "text value: " + textValue);
StringBuffer textResult = new StringBuffer();
List<TextLine> textLines = result.getBlocks().get(0).getTextLines();
for (TextLine line : textLines) {
textResult.append(line.getValue() + " ");
}
Log.d(TAG, "OCR Detection succeeded.");
mTxtViewResult.setText(textResult.toString());
textToSpeechString = textResult.toString();
break;
}
case TYPE_TEXT_ERROR: {
mTxtViewResult.setText("Failed to detect text lines, result is null.");
break;
}
default:
break;
}
}
};
@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
super.onActivityResult(requestCode, resultCode, data);
if (requestCode == REQUEST_CHOOSE_PHOTO_CODE && resultCode == Activity.RESULT_OK) {
if (data == null) {
return;
}
Uri selectedImage = data.getData();
getBitmap(selectedImage);
}
}
private void requestPermissions() {
try {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M) {
int permission1 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.WRITE_EXTERNAL_STORAGE);
int permission2 = ActivityCompat.checkSelfPermission(this,
Manifest.permission.CAMERA);
if (permission1 != PackageManager.PERMISSION_GRANTED || permission2 != PackageManager
.PERMISSION_GRANTED) {
ActivityCompat.requestPermissions(this, new String[]{Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.CAMERA}, 0x0010);
}
}
} catch (Exception e) {
e.printStackTrace();
}
}
private void getBitmap(Uri imageUri) {
String[] pathColumn = {MediaStore.Images.Media.DATA};
Cursor cursor = getContentResolver().query(imageUri, pathColumn, null, null, null);
if (cursor == null) return;
cursor.moveToFirst();
int columnIndex = cursor.getColumnIndex(pathColumn[0]);
/* get image path */
String picturePath = cursor.getString(columnIndex);
cursor.close();
mBitmap = BitmapFactory.decodeFile(picturePath);
if (mBitmap == null) {
return;
}
//You can set image here
//mImageView.setImageBitmap(mBitmap);
// You can pass it handler as well
mHandler.sendEmptyMessage(TYPE_CHOOSE_PHOTO);
mTxtViewResult.setText("");
mPlayAudio.setEnabled(true);
}
private void dismissDialog() {
if (dialog != null && dialog.isShowing()) {
dialog.dismiss();
}
}
@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
super.onRequestPermissionsResult(requestCode, permissions, grantResults);
if (grantResults.length <= 0
|| grantResults[0] != PackageManager.PERMISSION_GRANTED) {
Toast.makeText(this, "Permission denied", Toast.LENGTH_SHORT).show();
}
}
@Override
protected void onDestroy() {
super.onDestroy();
/* release ocr instance and free the npu resources*/
if (mTextDetector != null) {
mTextDetector.release();
}
dismissDialog();
if (mBitmap != null) {
mBitmap.recycle();
}
}
}
activity_main.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
android:layout_width="match_parent"
android:layout_height="match_parent"
xmlns:app="http://schemas.android.com/apk/res-auto"
android:fitsSystemWindows="true"
android:orientation="vertical"
android:background="@android:color/darker_gray">
<android.support.v7.widget.Toolbar
android:layout_width="match_parent"
android:layout_height="50dp"
android:background="#ff0000"
android:elevation="10dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="horizontal"
<TextView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:text="Book Reader"
android:layout_gravity="center"
android:gravity="center|start"
android:layout_weight="1"
android:textColor="@android:color/white"
android:textStyle="bold"
android:textSize="20sp"/>
<ImageView
android:layout_width="40dp"
android:layout_height="40dp"
android:src="@drawable/ic_baseline_play_circle_outline_24"
android:layout_gravity="center|end"
android:layout_marginEnd="10dp"
android:id="@+id/playAudio"
android:padding="5dp"/>
</LinearLayout>
</android.support.v7.widget.Toolbar>
<ScrollView
android:layout_width="match_parent"
android:layout_height="match_parent"
android:fitsSystemWindows="true">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="match_parent"
android:orientation="vertical"
android:background="@android:color/darker_gray"
>
<android.support.v7.widget.CardView
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="20dp"
android:layout_gravity="center">
<ImageView
android:id="@+id/imgViewPicture"
android:layout_width="300dp"
android:layout_height="300dp"
android:layout_margin="8dp"
android:layout_gravity="center_horizontal"
android:scaleType="fitXY" />
</android.support.v7.widget.CardView>
<android.support.v7.widget.CardView
android:layout_width="match_parent"
android:layout_height="wrap_content"
app:cardCornerRadius="5dp"
app:cardElevation="10dp"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginTop="10dp"
android:layout_gravity="center"
android:layout_marginBottom="20dp">
<LinearLayout
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:orientation="vertical"
>
<TextView
android:layout_margin="5dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/black"
android:text="Text on the image"
android:textStyle="normal"
/>
<TextView
android:id="@+id/result"
android:layout_margin="5dp"
android:layout_marginBottom="20dp"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textSize="18sp"
android:textColor="#ff0000"/>
</LinearLayout>
</android.support.v7.widget.CardView>
<Button
android:id="@+id/btnSelect"
android:layout_width="match_parent"
android:layout_height="wrap_content"
android:onClick="onChildClick"
android:layout_marginStart="10dp"
android:layout_marginEnd="10dp"
android:layout_marginBottom="10dp"
android:text="[USER=936943]@string[/USER]/select_picture"
android:background="@drawable/round_button_bg"
android:textColor="@android:color/white"
android:textAllCaps="false"/>
</LinearLayout>
</ScrollView>
</LinearLayout>
Result
Tips and Tricks
Maximum width and height: 1440 px and 15210 px (If the image is larger than this, you will receive error code 200).
Recommended photo properties for optimal recognition accuracy: resolution higher than 720p and aspect ratio lower than 2:1.
If you are capturing an image from the camera or gallery, make sure your app has camera and storage permissions.
Add the downloaded huawei-hiai-vision-ove-10.0.4.307.aar, huawei-hiai-pdk-1.0.0.aar file to libs folder.
Check that the dependencies are added properly.
Latest HMS Core APK is required.
Min SDK version is 21; otherwise, you will get a manifest merge issue.
Conclusion
In this article, we have learnt the following concepts:
What is OCR?
What is General Text Recognition (GTR)?
Features of GTR
Features of OCR
How to integrate General Text Recognition using Huawei HiAI
How to apply for Huawei HiAI
How to build the application
Reference
General Text Recognition
Apply for Huawei HiAI
Happy coding
