Expert: Develop weather application for HarmonyOS consuming REST APIs - Huawei Developers

Introduction
In this article, I have explained how to develop a weather application for HarmonyOS using Huawei DevEco Studio, HML, JavaScript, and open REST APIs. The user can search for a city name and fetch its weather information. The application shows the current weather and the forecast for the next five days. The UI is developed with flexible, rich HML and JavaScript, and the network calls are made with Java's HttpURLConnection.
Huawei Mobile Device
Requirements
1) DevEco IDE
2) Huawei phone running HarmonyOS (you can also use the cloud emulator)
New Project (Phone)
After installing DevEco Studio, create a new project.
Select Phone as the device and Empty Feature Ability (JS) as the template.
After the project is created, its directory structure is as shown in the image.
hml files describe the page layout.
css files describe the page style.
js files process the interactions between pages and users.
The app.js file manages global JavaScript logics and application lifecycle.
The pages directory stores all component pages.
The java directory stores Java files related to the project.
Development process
Design the UI
We are designing a simple UI with just a single page, which will display the current weather and the forecast for the next five days. We need three UI sections on this page:
Search box for searching the city name
UI Section showing today’s weather
UI Section with carousel to display next five days
Step 1: Create the search box in the hml file.
As the first step, we create an input component that holds the search field used for searching the city.
index.hml
Code:
<div class="container">
<div>
<div class="title">
<input class="comment" value="{{searchValue}}" placeholder ="Enter the city " onchange="updateSearchValue()"></input>
</div>
<image class="searchart" src="/common/search1.png" onclick="searchCity() "></image>
</div>
</div>
index.css
Code:
.container {
flex-direction: column;
background-color: #e8f6fe;
}
.comment {
width: 550px;
height: 100px;
background-color: lightgrey;
}
.searchart {
margin-top:40px;
width:70px;
height:68px;
margin-right:40px;
margin-left:-40px;
}
index.js
Code:
updateSearchValue(e) {
this.searchValue = e.text;
},
searchCity() {
this.inProgress = true;
this.fetchWeather(this.searchValue).then()
},
Result
Step 2: Add UI section to display today’s weather.
Create text fields for the city name, today's date, temperature, precipitation, and wind speed. Then we have a section describing today's weather in a few words, such as sunny or clear sky. Finally, we have an image that depicts the type of weather.
index.hml
Code:
<div class="widget">
<div class="details">
<text class="city">{{currentCityInfo}}</text>
<text class="today">Today {{toDay}}</text>
<text class="temperature">{{currentTemperature}}</text>
<text class="precipitation">Precipitation: {{currentHumidity}}</text>
<text class="wind">Wind: {{currentWind}} km/hr</text>
<div>
<div class="summary">
<text class="summaryText">{{currentDesc}}</text>
</div>
<image class="weatherart" src="{{artImage}}"></image>
</div>
</div>
</div>
index.css
Code:
.temperature {
color: white;
font-weight: 300;
font-size: 150px;
}
.today {
color: white;
font-weight: 300;
font-size: 32px;
margin-top: 20px;
width: 420px;
padding-top: 10px;
border-top: 2px solid #9cd0ff;
}
.city {
color: white;
font-size: 70px;
margin-top: 0px;
}
.summary {
width: 660px;
margin-top: 16px;
padding-bottom: 16px;
border-top: 2px solid #9cd0ff;
}
.summaryText {
color: #d2e9fa;
font-size: 70px;
font-weight: 300;
margin: 0;
margin-left: 40px;
margin-top: 40px;
}
.precipitation, .wind {
color: #d2e9fa;
font-size: 32px;
font-weight: 300;
margin-left: 8px;
}
.precipitation {
margin-top: 16px;
}
Result
Step 3: Add UI section for the next five days' weather.
Now we have the search box and today's weather UI section. Below those, add a carousel UI using the swiper component. Each item in the swiper will have text fields for the max and min temperature and an icon for weather indication.
index.hml
Code:
<div class="daystitle"><text class="name">Next 4 days</text></div><swiper id="swiperImage" class="swiper-style">
<div class="daydetailscard" for="{{day in days}}">
<text class="daydetitle">{{day.dayName}}</text>
<div class="daydetailssubcard">
<text class="detailstemp">Hi : {{day.maxTemp}}°C</text>
<text class=" detailstemp ">Low : {{day.minTemp}}°C</text>
<image class="weatherarticon" src="{{day.artImageIcon}}"></image>
</div>
<text class="daydetails" >{{day.desc}}</text>
</div>
</swiper>
index.css
Code:
.daydetitle{
color: #626262;
text-align: center;
font-size: 40px;
padding-bottom: 10px;
border-bottom: 2px solid #626262;
font-family: Roboto, sans-serif;
display: flex;
flex-direction: column;
margin-top: 40px;
margin-bottom: 40px;
}
.daydetails{
color: white;
text-align: center;
font-size: 40px;
font-family: Roboto, sans-serif;
display: flex;
flex-direction: column;
margin-top: 40px;
margin-bottom: 40px;
}
.daydetailscard{
border-radius: 28px;
height: 300px;
width: 630px;
background: linear-gradient(to bottom right, #ffb20f 20%, #ecdebc);
font-family: Roboto, sans-serif;
display: flex;
flex-direction: column;
margin-top: 10px;
margin-left: 40px;
}
.daydetailssubcard{
height: 50px;
width: 630px;
font-family: Roboto, sans-serif;
display: flex;
flex-direction: row;
}
.detailstemp {
color: white;
font-size: 32px;
font-weight: 300;
margin-left: 20px;
margin-top: 16px;
}
Result
Step 4: Add UI Screen for Loading.
We will use the "inProgress" flag to control the loading state of the network calls. When the user taps the search icon, the loading screen is shown until the network data is received.
index.hml
Code:
<div if="{{inProgress}}" class="circleAnimation"></div>
<div if="{{inProgress}}" class="circleAnimation"></div>
<div if="{{inProgress}}" class="circleAnimation"></div>
<div if="{{inProgress}}" class="circleAnimation"></div>
<div if="{{inProgress}}" class="circleAnimation"></div>
<div if="{{inProgress}}" class="circleAnimation"></div>
</div>
index.css
Code:
.circleAnimation {
height: 20px;
width: 20px;
margin-left: 20px;
margin-top: 20px;
border-radius: 10px;
background-color: red;
animation-name: spin;
animation-duration: 1.5s;
animation-timing-function: ease-out;
animation-delay: 0;
animation-iteration-count: infinite;
animation-fill-mode: none;
animation-play-state: running;
}
@keyframes spin {
0% { transform: rotate(0deg); }
100% { transform: rotate(360deg); }
}
Result
Consume REST APIs of openweathermap.org
We will use two APIs from openweathermap.org: one to get the current weather and the other to get the forecast for the next five days. Before using these APIs, create an account and obtain an API key.
Current weather data
Access current weather data for any location on Earth, including over 200,000 cities. Weather data is collected and processed from different sources such as global and local weather models, satellites, radars, and a vast network of weather stations. Data is available in JSON, XML, or HTML format.
By city name
You can call by city name, or by city name together with the state code and country code.
api.openweathermap.org/data/2.5/weather?q={city name}&appid={API key}
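Putting the pieces together, the final request URL is just the base URI plus the query parameters. Here is a minimal Java sketch (the class name is illustrative, and the constants mirror those used later in WeatherServiceAbility; replace the key placeholder with your own):
Code:
public class WeatherUrlBuilder {
private static final String BASE_URI = "https://api.openweathermap.org/data/2.5/";
// Builds e.g. https://api.openweathermap.org/data/2.5/weather?q=London&units=metric&appid=<your key>
public static String currentWeatherUrl(String city, String apiKey) {
return BASE_URI + "weather?q=" + city + "&units=metric&appid=" + apiKey;
}
}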
Step 5: Create Model classes for Weather response.
CurrentWeatherResponse.java
Code:
public class CurrentWeatherResponse {
@SerializedName("dt")
private int dt;
@SerializedName("coord")
private Coord coord;
@SerializedName("weather")
private List<WeatherItem> weather;
@SerializedName("name")
private String name;
@SerializedName("cod")
private int cod;
@SerializedName("main")
private Main main;
@SerializedName("clouds")
private Clouds clouds;
@SerializedName("id")
private int id;
@SerializedName("sys")
private Sys sys;
@SerializedName("base")
private String base;
@SerializedName("wind")
private Wind wind;
public int getDt() {
return dt;
}
public void setDt(int dt) {
this.dt = dt;
}
public Coord getCoord() {
return coord;
}
public void setCoord(Coord coord) {
this.coord = coord;
}
public List<WeatherItem> getWeather() {
return weather;
}
public void setWeather(List<WeatherItem> weather) {
this.weather = weather;
}
public String getName() {
return name;
}
public void setName(String name) {
this.name = name;
}
public int getCod() {
return cod;
}
public void setCod(int cod) {
this.cod = cod;
}
public Main getMain() {
return main;
}
public void setMain(Main main) {
this.main = main;
}
public Clouds getClouds() {
return clouds;
}
public void setClouds(Clouds clouds) {
this.clouds = clouds;
}
public int getId() {
return id;
}
public void setId(int id) {
this.id = id;
}
public Sys getSys() {
return sys;
}
public void setSys(Sys sys) {
this.sys = sys;
}
public String getBase() {
return base;
}
public void setBase(String base) {
this.base = base;
}
public Wind getWind() {
return wind;
}
public void setWind(Wind wind) {
this.wind = wind;
}
}
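Because every field is annotated with @SerializedName, the raw JSON body returned by the endpoint maps onto this class in a single Gson call. A minimal sketch, assuming json holds the response body fetched from the /weather endpoint:
Code:
import com.google.gson.Gson;
public class WeatherParser {
// Maps a raw /weather response body onto the model class.
public static CurrentWeatherResponse parse(String json) {
return new Gson().fromJson(json, CurrentWeatherResponse.class);
}
}
The same pattern appears later in WeatherServiceAbility, where the parsed object's getters (such as getMain() and getWeather()) feed the UI bundle.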
Daily Forecast 5 Days
A daily forecast for the next 5 days is available for any location or city. The forecast includes daily weather data, and the response is available in JSON or XML format.
By city name
You can request a 5-day weather forecast with daily average parameters by city name.
api.openweathermap.org/data/2.5/forecast/daily?q={city name}&cnt={cnt}&appid={API key}
Step 6: Create Model classes for the forecast weather response.
MultipleDaysWeatherResponse.java
Code:
public class MultipleDaysWeatherResponse {
@SerializedName("city")
private City city;
@SerializedName("cnt")
private int cnt;
@SerializedName("cod")
private String cod;
@SerializedName("message")
private double message;
@SerializedName("list")
private List<ListItem> list;
public City getCity() {
return city;
}
public void setCity(City city) {
this.city = city;
}
public int getCnt() {
return cnt;
}
public void setCnt(int cnt) {
this.cnt = cnt;
}
public String getCod() {
return cod;
}
public void setCod(String cod) {
this.cod = cod;
}
public double getMessage() {
return message;
}
public void setMessage(double message) {
this.message = message;
}
public List<ListItem> getList() {
return list;
}
public void setList(List<ListItem> list) {
this.list = list;
}
}
Step 7: Fetching network data.
We will use the plain Java HttpURLConnection for fetching data from the REST APIs. We will place these network operations in a Service Ability.
WeatherServiceAbility.java
Code:
public class WeatherServiceAbility extends Ability {
private MyRemote remote = new MyRemote();
private static final String CODE = "CODE";
private static final String TEMP = "TEMP";
private static final String HUMIDITY = "HUMIDITY";
private static final String DESCRIPTION = "DESCRIPTION";
private static final String WIND = "WIND";
private static final String CITY_INFO = "CITY_INFO";
private static final String WEATHER_CODE = "WEATHER_CODE";
private static final String MAX_TEMP = "MAX_TEMP";
private static final String MIN_TEMP = "MIN_TEMP";
private static final String FORECAST_URL = "forecast/daily";
private static final String WEATHER_URL = "weather";
private static final String BASE_URI = "https://api.openweathermap.org/data/2.5/";
private static final String API_KEY ="appid={{Add your Key}}";
private static final String UNITS ="units=metric";
@Override
protected IRemoteObject onConnect(Intent intent) {
super.onConnect(intent);
return remote.asObject();
}
class MyRemote extends RemoteObject implements IRemoteBroker {
private static final int SUCCESS = 0;
private static final int CURRENT = 1001;
private static final int FORECAST = 1002;
MyRemote() {
super("MyService_MyRemote");
}
@Override
public boolean onRemoteRequest(int code, MessageParcel data, MessageParcel reply, MessageOption option) {
RequestParam param = getParamFromMessageParcel(data);
switch (code) {
case CURRENT: {
String output = startNetworkCall(BASE_URI + WEATHER_URL, new String[]{"q=" + param.getCity(), UNITS, API_KEY});
CurrentWeatherResponse countryObj = new Gson().fromJson(String.valueOf(output), CurrentWeatherResponse.class);
reply.writeString(bundleSuccessResult(countryObj));
}
break;
case FORECAST: {
String output = startNetworkCall(BASE_URI + FORECAST_URL, new String[]{"q=" + param.getCity(), "cnt=5", UNITS, API_KEY});
MultipleDaysWeatherResponse fiveHistoryObj = new Gson().fromJson(String.valueOf(output), MultipleDaysWeatherResponse.class);
reply.writeString(bundleforPredictedWeather(fiveHistoryObj));
}
break;
default: {
reply.writeString("service not defined");
return false;
}
}
return true;
}
@Override
public IRemoteObject asObject() {
return this;
}
}
private RequestParam getParamFromMessageParcel(MessageParcel message) {
String zsonStr = message.readString();
try {
return new Gson().fromJson(zsonStr, RequestParam.class);
} catch (RuntimeException e) {
// Malformed JSON: fall through and return null.
}
return null;
}
private String bundleSuccessResult (CurrentWeatherResponse response) {
Map<String, Object> zsonResult = new HashMap<String, Object>();
zsonResult.put(CODE, MyRemote.SUCCESS);
zsonResult.put(TEMP , String.format(Locale.getDefault(), "%.0f°C", response.getMain().getTemp()) );
zsonResult.put(HUMIDITY , response.getMain().getHumidity());
zsonResult.put(WIND , response.getWind().getSpeed());
zsonResult.put(DESCRIPTION , response.getWeather().get(0).getDescription());
zsonResult.put(CITY_INFO , response.getName()+", "+response.getSys().getCountry());
zsonResult.put(WEATHER_CODE , response.getWeather().get(0).getId());
return ZSONObject.toZSONString(zsonResult);
}
private String bundleforPredictedWeather (MultipleDaysWeatherResponse response) {
List<ListItem> list = response.getList();
ZSONArray array = new ZSONArray();
for (ListItem item : list) {
Map<String, Object> zsonResult = new HashMap<String, Object>();
zsonResult.put(MAX_TEMP , item.getTemp().getMax());
zsonResult.put(MIN_TEMP ,item.getTemp().getMin());
zsonResult.put(WEATHER_CODE , (item.getWeather().get(0).getId()));
array.add(zsonResult);
}
return ZSONObject.toZSONString(array);
}
public String startNetworkCall(String link, String params[]) {
NetManager netManager = NetManager.getInstance(null);
if (!netManager.hasDefaultNet()) {
return null;
}
NetHandle netHandle = netManager.getDefaultNet();
// Listen to network state changes.
NetStatusCallback callback = new NetStatusCallback() {
// Override the callback for network state changes.
};
netManager.addDefaultNetStatusCallback(callback);
// Obtain a URLConnection using the openConnection method.
HttpURLConnection connection = null;
try {
StringBuilder urlFinal = new StringBuilder();
urlFinal.append(link);
urlFinal.append("?");
for (int i = 0; i < params.length; i++) {
urlFinal.append("&");
urlFinal.append(params[i]);
}
java.net.URL url = new URL(urlFinal.toString());
URLConnection urlConnection = netHandle.openConnection(url,
java.net.Proxy.NO_PROXY);
if (urlConnection instanceof HttpURLConnection) {
connection = (HttpURLConnection) urlConnection;
} else {
// Not an HTTP connection, so there is nothing to fetch.
return null;
}
connection.setRequestMethod("GET");
connection.connect();
int responseCode = connection.getResponseCode();
if (responseCode == HttpURLConnection.HTTP_OK) {
BufferedReader br = new BufferedReader(new InputStreamReader(connection.getInputStream()));
StringBuilder sb = new StringBuilder();
String line;
while ((line = br.readLine()) != null) {
sb.append(line);
}
br.close();
return sb.toString();
}
return null;
} catch (IOException e) {
e.printStackTrace();
return "IOException";
} finally {
if (connection != null) {
connection.disconnect();
}
}
}
}
Step 8: Display fetched data in the UI
Once the user taps the search icon, the city name is passed as a parameter to an async call from JavaScript. The fetchWeather method sends the Feature Ability call to the Java layer.
Code:
fetchWeather: async function(city) {
var actionData = {};
actionData.city = ""+city;
var action = {};
action.bundleName = 'com.huawei.phonesample';
action.abilityName = 'com.huawei.phonesample.WeatherServiceAbility';
action.messageCode = ACTION_MESSAGE_CODE_CURRENT;
action.data = actionData;
action.abilityType = ABILITY_TYPE_EXTERNAL;
action.syncOption = ACTION_SYNC;
var result = await FeatureAbility.callAbility(action);
var ret = JSON.parse(result);
this.inProgress = false;
this.currentTemperature = ret.TEMP;
this.currentDesc = ret.DESCRIPTION;
this.currentWind = ret.WIND;
this.currentHumidity = ret.HUMIDITY + "%"
this.currentCityInfo = ret.CITY_INFO
this.searchValue = ""
this.updateWeatherArt(ret.WEATHER_CODE)
this.toDay = new Date().getDate() + "-" + month_names[new Date().getMonth()];
if (ret.CODE == 0) {
console.info('plus result is:' + JSON.stringify(ret));
this.currentTemperature = ret.TEMP + "°";
this.currentDesc = ret.DESCRIPTION;
this.currentWind = ret.WIND;
this.currentHumidity = ret.HUMIDITY + "%"
} else {
console.error('plus error code:' + JSON.stringify(ret.CODE));
}
},
Once we have the data from the ability call, we parse the result into a JSON object and retrieve the data required to display in the UI.
We also set the inProgress flag to false so the loading screen is replaced by the data.
Then we update the weather icon by checking the weather code.
Code:
updateWeatherArt(weatherCode) {
// Use Math.floor: JavaScript division is floating-point, so 201 / 100 would be 2.01, not 2.
var range = Math.floor(weatherCode / 100);
if (range == 2) {
this.artImage = "/common/art_storm.png";
} else if (range == 3) {
this.artImage = "/common/art_rain.png";
} else if (range == 5) {
this.artImage = "/common/art_light_rain.png";
} else if (range == 6) {
this.artImage = "/common/art_snow.png";
} else if (range == 7) {
this.artImage = "/common/art_clear.png";
} else if (weatherCode == 800) {
this.artImage = "/common/art_clear.png";
} else if (weatherCode == 801) {
this.artImage = "/common/art_light_clouds.png";
} else if (weatherCode == 803) {
this.artImage = "/common/art_light_clouds.png";
} else if (range == 8) {
this.artImage = "/common/art_clouds.png";
}
}
Similarly, update the five-day data for the second UI section.
Tips and Tricks
You can use the cloud emulator for development. I have explained updating the UI for the current weather; similarly, you can update the carousel UI with the array object from the second API response. There are a few more options, such as fetching the current location and using it to get the weather, which would be a good added feature.
Conclusion
In this article, we have learnt how to create a weather application using HarmonyOS UI components and a Service Ability, and we have explored the Service Ability for fetching data from an open REST API.
References
JS API References
Weather API
Original Source

Related

Map makes you feel easy in a strange city (Part 2)

This article is originally from the HUAWEI Developer Forum.
Forum link: https://forums.developer.huawei.com/forumPortal/en/home
Before we start with today's topic, I strongly recommend you go through my previous article, HMS Site Map (Part 1). It will help you get a clear picture.
Let’s Begin
In the previous article, we successfully got the details of the place we searched for using Site Kit. In this article, we are going to see how to show a map using Map Kit after fetching the latitude and longitude from those details. We will also see how to use the Site APIs and Map APIs with POSTMAN in our Part 3 article.
One Step at a time
First we need to add the Map Kit dependency in the app gradle file and sync the app.
implementation 'com.huawei.hms:maps:4.0.1.300'
After adding the dependency, we need to declare permissions in the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE"/>
<uses-permission android:name="com.huawei.appmarket.service.commondata.permission.GET_COMMON_DATA"/>
Let’s Code
Main Activity class
Code:
private void showDetails(String item) {
String pattern = Pattern.quote("\\" + "n");
String[] lines = item.split("\\n+");
autoCompleteTextView.setText(lines[0]);
mLat = lines[2]; // This is latitude
mLon = lines[3]; // This is longitude
title = lines[0]; // This is title or place name
String details = "<font color='red'>PLACE NAME : </font>" + lines[0] + "<br>"
+ "<font color='#CD5C5C'>COUNTRY : </font>" + lines[1] + "<br>"
+ "<font color='#8E44AD'>ADDRESS : </font>" + lines[4] + "<br>"
+ "<font color='#008000'>PHONE : </font>" + lines[5];
txtDetails.setText(Html.fromHtml(details, Html.FROM_HTML_MODE_COMPACT));
}
private void showMap(){
Intent intent = new Intent(MainActivity.this, MapActivity.class);
intent.putExtra("lat",mLat); // Here we are passing Latitude and Longitude
intent.putExtra("lon",mLon); // and titile from MainActivity class to
intent.putExtra("title",title);// MapActivity class…
startActivity(intent);
}v
Main Code
1) First we need to decide whether we are showing the map in a view or in a fragment, because there are two ways we can show our map.
a) Fragment way
In the fragment way, we add a MapFragment to the layout file of an activity.
Code:
<fragment xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:map="http://schemas.android.com/apk/res-auto"
android:id="@+id/mapfragment_mapfragmentdemo"
class="com.huawei.hms.maps.MapFragment"
android:layout_width="match_parent"
android:layout_height="match_parent"
map:cameraTargetLat="48.893478"
map:cameraTargetLng="2.334595"
map:cameraZoom="10" />
b) MapView way
Here we add MapView in the layout file of an activity.
Code:
<com.huawei.hms.maps.MapView
xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:map="http://schemas.android.com/apk/res-auto"
android:id="@+id/mapView"
android:layout_width="match_parent"
android:layout_height="match_parent"
map:mapType="normal"
map:uiCompass="true"
map:uiZoomControls="true"
map:cameraTargetLat="51"
map:cameraTargetLng="10"
map:cameraZoom="8.5"/>
2) Here we are going with MapView.
3) For both the fragment and the view, we need to implement the OnMapReadyCallback API in our MapActivity to use a map. After implementing this API, the IDE will ask us to implement the onMapReady method.
Code:
public void onMapReady(HuaweiMap map) {
Log.d(TAG, "onMapReady: ");
hMap = map;
}
4) The only difference between MapFragment and MapView is how the map is instantiated.
a) MapFragment
Code:
private MapFragment mMapFragment;
mMapFragment = (MapFragment) getFragmentManager()
.findFragmentById(R.id.mapfragment_mapfragmentdemo);
mMapFragment.getMapAsync(this);
b) MapView
Code:
private MapView mMapView;
mMapView = findViewById(R.id.mapview_mapviewdemo);
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle("MapViewBundleKey");
}
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(this);
5) Permissions we need to check
Code:
// Declare this with the other fields of the activity ...
private static final String[] RUNTIME_PERMISSIONS = {
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE,
Manifest.permission.ACCESS_COARSE_LOCATION,
Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.INTERNET
};
// This is placed in the onCreate() method ...
if (!hasPermissions(this, RUNTIME_PERMISSIONS)) {
ActivityCompat.requestPermissions(this, RUNTIME_PERMISSIONS, REQUEST_CODE);
}
// Use this method to check Permission …
private static boolean hasPermissions(Context context, String... permissions) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && permissions != null) {
for (String permission : permissions) {
if (ActivityCompat.checkSelfPermission(context, permission)
!= PackageManager.PERMISSION_GRANTED) {
return false;
}
}
}
return true;
}
MapActivity Class
Code:
public class MapActivity extends AppCompatActivity implements OnMapReadyCallback {
private static final String TAG = "MapActivity";
private MapView mMapView;
private HuaweiMap hmap;
private Marker mMarker;
private static final String[] RUNTIME_PERMISSIONS = {
Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE,
Manifest.permission.ACCESS_COARSE_LOCATION,
Manifest.permission.ACCESS_FINE_LOCATION,
Manifest.permission.INTERNET
};
private static final String MAPVIEW_BUNDLE_KEY = "MapViewBundleKey";
private static final int REQUEST_CODE = 100;
private String mLatitude, mLongitude,mTitle;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_map);
mLatitude = getIntent().getExtras().getString("lat");
mLongitude = getIntent().getExtras().getString("lon");
mTitle = getIntent().getExtras().getString("title");
if (!hasPermissions(this, RUNTIME_PERMISSIONS)) {
ActivityCompat.requestPermissions(this, RUNTIME_PERMISSIONS, REQUEST_CODE);
}
mMapView = findViewById(R.id.mapView);
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle(MAPVIEW_BUNDLE_KEY);
}
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(this);
}
@Override
protected void onStart() {
super.onStart();
mMapView.onStart();
}
@Override
protected void onStop() {
super.onStop();
mMapView.onStop();
}
@Override
protected void onDestroy() {
super.onDestroy();
mMapView.onDestroy();
}
@Override
protected void onPause() {
mMapView.onPause();
super.onPause();
}
@Override
protected void onResume() {
super.onResume();
mMapView.onResume();
}
@Override
public void onLowMemory() {
super.onLowMemory();
mMapView.onLowMemory();
}
@Override
public void onMapReady(HuaweiMap huaweiMap) {
Log.d(TAG, "onMapReady: ");
hmap = huaweiMap;
hmap.setMyLocationEnabled(true);
hmap.setMapType(HuaweiMap.MAP_TYPE_NORMAL);
hmap.setMaxZoomPreference(15);
hmap.setMinZoomPreference(5);
CameraPosition build = new CameraPosition.Builder()
.target(new LatLng(Double.parseDouble(mLatitude), Double.parseDouble(mLongitude)))
.build();
CameraUpdate cameraUpdate = CameraUpdateFactory
.newCameraPosition(build);
hmap.animateCamera(cameraUpdate);
MarkerOptions options = new MarkerOptions()
.position(new LatLng(Double.parseDouble(mLatitude),
Double.parseDouble(mLongitude)))
.title(mTitle);
mMarker = hmap.addMarker(options);
mMarker.showInfoWindow();
hmap.setOnMarkerClickListener(new HuaweiMap.OnMarkerClickListener() {
@Override
public boolean onMarkerClick(Marker marker) {
Toast.makeText(getApplicationContext(), "onMarkerClick:" +
marker.getTitle(), Toast.LENGTH_SHORT).show();
return false;
}
});
}
private static boolean hasPermissions(Context context, String... permissions) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && permissions != null) {
for (String permission : permissions) {
if (ActivityCompat.checkSelfPermission(context, permission)
!= PackageManager.PERMISSION_GRANTED) {
return false;
}
}
}
return true;
}
}
Core Functionality of Map
1) Types of Map
There are five map types:
· HuaweiMap.MAP_TYPE_NORMAL
· HuaweiMap.MAP_TYPE_NONE
· HuaweiMap.MAP_TYPE_SATELLITE
· HuaweiMap.MAP_TYPE_HYBRID
· HuaweiMap.MAP_TYPE_TERRAIN
But we can currently use only MAP_TYPE_NORMAL and MAP_TYPE_NONE. The normal type is a standard map, which shows roads, artificial structures, and natural features such as rivers. The none type is an empty map without any data.
The rest of the map types are still in the development phase.
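For example, switching between the two usable types is a single call on the HuaweiMap instance obtained in onMapReady (hmap here refers to that instance):
Code:
// Standard map with roads, structures, and natural features.
hmap.setMapType(HuaweiMap.MAP_TYPE_NORMAL);
// Or an empty map without any data:
// hmap.setMapType(HuaweiMap.MAP_TYPE_NONE);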
2) Camera Movement
Huawei maps are moved by simulating camera movement. You can control the visible region of a map by changing the camera's position. To change the camera's position, create different types of CameraUpdate objects using the CameraUpdateFactory class, and use these objects to move the camera.
Code:
CameraPosition build = new CameraPosition.Builder().target(new
LatLng(Double.parseDouble(mLatitude),
Double.parseDouble(mLongitude))).build();
CameraUpdate cameraUpdate = CameraUpdateFactory
.newCameraPosition(build);
hmap.animateCamera(cameraUpdate);
In the above code we are moving the map camera in animation mode. When moving the camera in animation mode, you can set the animation duration and a callback to be invoked when the animation stops. By default, the animation duration is 250 ms.
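As a minimal sketch, Map Kit also offers an animateCamera overload that takes a duration and a callback; the 2000 ms duration and log messages below are only illustrative, so verify the overload against the current Map Kit reference before relying on it:
Code:
CameraUpdate cameraUpdate = CameraUpdateFactory.newCameraPosition(build);
// Animate over 2 seconds and get notified when the animation ends.
hmap.animateCamera(cameraUpdate, 2000, new HuaweiMap.CancelableCallback() {
@Override
public void onFinish() {
Log.d(TAG, "Camera animation finished");
}
@Override
public void onCancel() {
Log.d(TAG, "Camera animation cancelled");
}
});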
3) My Location in Map
We can show our location on the map by simply enabling the my-location layer. We can also display the my-location button on the map.
Code:
hmap.setMyLocationEnabled(true);
hmap.getUiSettings().setMyLocationButtonEnabled(true);
4) Show Marker in Map
We can add markers to a map to identify locations such as stores and buildings, and provide additional information with information windows.
Code:
MarkerOptions options = new MarkerOptions()
.position(new LatLng(Double.parseDouble(mLatitude),
Double.parseDouble(mLongitude)))
.title(mTitle); // Adding the title here …
mMarker = hmap.addMarker(options);
mMarker.showInfoWindow();
We can customize our marker according to our needs using a BitmapDescriptor object.
Code:
Bitmap bitmap = ResourceBitmapDescriptor.drawableToBitmap(this,
ContextCompat.getDrawable(this, R.drawable.badge_ph));
BitmapDescriptor bitmapDescriptor = BitmapDescriptorFactory.fromBitmap(bitmap);
mMarker.setIcon(bitmapDescriptor);
We can add a title to the marker, as shown in the above code. We can also make the marker clickable, as shown below.
Code:
hmap.setOnMarkerClickListener(new HuaweiMap.OnMarkerClickListener() {
@Override
public boolean onMarkerClick(Marker marker) {
Toast.makeText(getApplicationContext(), "onMarkerClick:" +
marker.getTitle(), Toast.LENGTH_SHORT).show();
return false;
}
});
5) Map shapes
a) Polyline
b) Polygon
c) Circle
We can use a polyline if we need to show routes from one place to another. We can combine the Directions API with a polyline to show routes for walking, bicycling, and driving, and also to calculate route distance.
If we need to show a radius, such as the locations within 500 meters of a point, we use the circle shape, as sketched below.
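As a quick illustration (the coordinates are arbitrary placeholders, and hmap is the HuaweiMap from onMapReady), a polyline and a 500-meter circle can be added like this:
Code:
// Draw a simple route line between two points.
hmap.addPolyline(new PolylineOptions()
.add(new LatLng(12.9716, 77.5946), new LatLng(12.2958, 76.6394))
.color(Color.BLUE)
.width(6f));
// Highlight everything within a 500 m radius of a location.
hmap.addCircle(new CircleOptions()
.center(new LatLng(12.9716, 77.5946))
.radius(500)
.strokeColor(Color.GREEN)
.strokeWidth(4f));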
The Result
If you have any questions about this process, you can look for answers on the HUAWEI Developer Forum.

Setup Map on an Android Application in a smart way

For more articles like this, visit the HUAWEI Developer Forum.
Introduction
Nowadays, most applications integrate maps. In this article I will walk through the process of doing that. Millions of users look up directions, plan their commutes, and catch rides, and maps offer many features that enhance the user experience in mobile apps.
Let’s Start how to Integrate Map:
Step 1: Create a new project in Android Studio.
Step 2: Configure your app in AGC.
Step 3: Enable the required API and add the SHA-256 fingerprint.
Step 4: Download the agconnect-services.json file from AGC and paste it into the app directory.
Step 5: Add the below dependency in the app.gradle file.
Code:
implementation 'com.huawei.hms:maps:4.0.0.301'
Step 6: Add the below repository in the root.gradle file.
Code:
maven { url 'http://developer.huawei.com/repo/' }
Step 7: Add the appId and permissions in the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<meta-data
android:name="com.huawei.hms.client.appid"
android:value="appid=*******" />
Step 8: Sync your project.
Let's discuss the functionality:
1. OnMapReady()
2. OnMapClick()
3. OnMarkerClick()
4. Create Circle
5. Create Custom Marker
1. OnMapReady: This callback is invoked when the map is ready to be used.
Code:
@Override
public void onMapReady(HuaweiMap map) {
mHuaweiMap = map;
enableUiSettings();
mHuaweiMap.setMaxZoomPreference(15);
mHuaweiMap.setMinZoomPreference(2);
}
2. OnMapClick: This callback is invoked when the user taps the map.
Code:
@Override
public void onMapClick(LatLng latLng) {
try {
createMarker(latLng);
} catch (IOException e) {
e.printStackTrace();
}
}
3. OnMarkerClick: This callback is invoked when a marker is clicked.
Code:
@Override
public boolean onMarkerClick(Marker marker) {
marker.showInfoWindow();
return true;
}
4. How to create a circle on the map:
Code:
private void addCircleToCurrentLocation() {
mHuaweiMap.addCircle(new CircleOptions()
.center(new LatLng(12.9716, 77.5946))
.radius(1000)
.strokeWidth(10)
.strokeColor(Color.GREEN)
.fillColor(Color.argb(128, 255, 0, 0))
.clickable(true));
}
5. How to create a marker:
Code:
private void createMarker(LatLng latLng) throws IOException {
MarkerOptions markerOptions = new MarkerOptions()
.position(latLng)
.snippet("Address : " + featchAddress(latLng))
.icon(BitmapDescriptorFactory.fromResource(R.drawable.ic_location));
mHuaweiMap.addMarker(markerOptions);
CameraPosition cameraPosition = new CameraPosition.Builder()
.target(latLng) // Sets the center of the map to the tapped location
.zoom(20) // Sets the zoom level
.bearing(90) // Sets the orientation of the camera to east
.tilt(40) // Sets the tilt of the camera to 40 degrees
.build(); // Creates a CameraPosition from the builder
mHuaweiMap.animateCamera(CameraUpdateFactory.newCameraPosition(cameraPosition));
mHuaweiMap.setOnMarkerClickListener(this);
}
In this article I covered a few basic callbacks. Below is the final code:
Code:
public class MainActivity extends AppCompatActivity implements OnMapReadyCallback, HuaweiMap.OnMapClickListener, HuaweiMap.OnMarkerClickListener {
private static final String MAPVIEW_BUNDLE_KEY = "MapViewBundleKey";
private static final int REQUEST_CODE = 100;
private static final LatLng LAT_LNG = new LatLng(12.9716, 77.5946);
private HuaweiMap mHuaweiMap;
private MapView mMapView;
private Button btnCustom;
private static final String[] RUNTIME_PERMISSIONS = {Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.ACCESS_COARSE_LOCATION,
Manifest.permission.ACCESS_FINE_LOCATION, Manifest.permission.INTERNET};
private Marker marker;
@Override
protected void onCreate(Bundle savedInstanceState) {
super.onCreate(savedInstanceState);
setContentView(R.layout.activity_main);
btnCustom = findViewById(R.id.btn_custom);
if (!hasPermissions(this, RUNTIME_PERMISSIONS)) {
ActivityCompat.requestPermissions(this, RUNTIME_PERMISSIONS, REQUEST_CODE);
}
mMapView = findViewById(R.id.mapView);
Bundle mapViewBundle = null;
if (savedInstanceState != null) {
mapViewBundle = savedInstanceState.getBundle(MAPVIEW_BUNDLE_KEY);
}
init(mapViewBundle);
btnCustom.setOnClickListener(new View.OnClickListener() {
@Override
public void onClick(View view) {
mHuaweiMap.setOnMapClickListener(MainActivity.this);
}
});
}
private void init(Bundle mapViewBundle) {
mMapView.onCreate(mapViewBundle);
mMapView.getMapAsync(this);
}
@Override
public void onMapReady(HuaweiMap map) {
mHuaweiMap = map;
enableUiSettings();
mHuaweiMap.setMaxZoomPreference(15);
mHuaweiMap.setMinZoomPreference(2);
addCircleToCurrentLocation();
}
/*
Enable Ui Settings
*/
private void enableUiSettings() {
mHuaweiMap.setMyLocationEnabled(true);
mHuaweiMap.getUiSettings().setMyLocationButtonEnabled(true);
mHuaweiMap.getUiSettings().setCompassEnabled(true);
mHuaweiMap.getUiSettings().setZoomControlsEnabled(true);
mHuaweiMap.getUiSettings().setMyLocationButtonEnabled(true);
}
/*
Create Circle to current location
*/
private void addCircleToCurrentLocation() {
mHuaweiMap.addCircle(new CircleOptions()
.center(new LatLng(12.9716, 77.5946))
.radius(1000)
.strokeWidth(10)
.strokeColor(Color.GREEN)
.fillColor(Color.argb(128, 255, 0, 0))
.clickable(true));
}
/*
Create Marker when you click on map
*/
private void createMarker(LatLng latLng) throws IOException {
MarkerOptions markerOptions = new MarkerOptions()
.position(latLng)
.snippet("Address : " + featchAddress(latLng))
.icon(BitmapDescriptorFactory.fromResource(R.drawable.ic_location));
mHuaweiMap.addMarker(markerOptions);
CameraPosition cameraPosition = new CameraPosition.Builder()
.target(latLng) // Sets the center of the map to the tapped location
.zoom(20) // Sets the zoom level
.bearing(90) // Sets the orientation of the camera to east
.tilt(40) // Sets the tilt of the camera to 40 degrees
.build(); // Creates a CameraPosition from the builder
mHuaweiMap.animateCamera(CameraUpdateFactory.newCameraPosition(cameraPosition));
mHuaweiMap.setOnMarkerClickListener(this);
}
/*
Convert from latlong to Address
*/
private String fetchAddress(LatLng latLng) throws IOException {
Geocoder geocoder = new Geocoder(this, Locale.ENGLISH);
List<Address> addresses = geocoder.getFromLocation(latLng.latitude, latLng.longitude, 1);
Toast.makeText(this, addresses.get(0).getLocality() + ", "
+ addresses.get(0).getAdminArea() + ", "
+ addresses.get(0).getCountryName(), Toast.LENGTH_SHORT).show();
return addresses.get(0).getLocality() + ", "
+ addresses.get(0).getAdminArea() + ", "
+ addresses.get(0).getCountryName();
}
private static boolean hasPermissions(Context context, String... permissions) {
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.M && permissions != null) {
for (String permission : permissions) {
if (ActivityCompat.checkSelfPermission(context, permission) != PackageManager.PERMISSION_GRANTED) {
return false;
}
}
}
return true;
}
@Override
public void onMapClick(LatLng latLng) {
try {
createMarker(latLng);
} catch (IOException e) {
e.printStackTrace();
}
}
@Override
public boolean onMarkerClick(Marker marker) {
marker.showInfoWindow();
return true;
}
}
Output:
Conclusion:
In this article you've learned how to create custom markers, how the callbacks work, and new ways for users to interact with the map.
Reference:
https://developer.huawei.com/consumer/en/doc/development/HMS-References/hms-map-cameraupdate

Huawei Smart Watch – Quran Audio Player Application Development using JS/JAVA on HUAWEI DevEco Studio (HarmonyOS)

Article Introduction
In this article we will develop an application for the Huawei Smart Watch device using Huawei DevEco Studio (HarmonyOS). We will cover how to develop an audio player application for the Huawei Smart Watch using the JS and Java languages.
Huawei Smart Watch
· Video
File Data:
· Data Storage
· File Storage
Network Access:
· Uploading and Downloading
· Data Request
System Capabilities:
· Notification Message
· Network State
· Application Management
· Media Query
· Vibration
· Sensor
· Geographic Location
· Device Information
· Screen Brightness
· Battery Level
1. Create New Project
Let's create a Smart Watch project by choosing the ability template: Wearable and List Feature Ability (JS).
Define the project name, package name, and the directory where you want to save your project.
2. Run Application on Smart Watch Emulator
Let’s first configure the Smart Watch Emulator to test the application.
Click Tools -> HVD Manager to configure Huawei Virtual Devices in the IDE. This process opens a browser where the developer authenticates with a Huawei ID.
Next we choose a Wearable device and click the play button to start the HVD in our IDE.
After choosing the wearable device, we can see a brand new Huawei Smart Watch emulator in our IDE.
Now click the Run Project button, choose the HVD reserved for testing, and click OK to start the emulator.
After the process completes, our first application is launched on the Huawei Smart Watch, and you can see the list view on the watch.
3. Splash Screen Development
In the splash screen development we will cover animation, a glowing effect, and a timer.
Let’s start development without wasting more time.
Common images:
We can place some common images under the common folder, so we can use them throughout the project.
index.hml:
Code:
<stack class="container">
<div id="loading" class="bg"></div>
<div class="logo-glow"></div>
<image class="logo" src="/common/quran.png"></image>
</stack>
index.css:
Code:
.container {
justify-content: center;
align-items: center;
}
.logo{
top: 0px;
left: 0px;
width: 180px;
height: 180px;
}
.logo-glow{
top: 0px;
left: 0px;
width: 200px;
height: 200px;
border-radius: 200px;
background-color: rgba(255, 255, 255, .75);
animation-name: glow;
animation-duration: 1s;
animation-timing-function: ease;
animation-iteration-count: infinite;
}
@keyframes glow {
from {
width: 190px;
height: 190px;
}
to {
width: 180px;
height: 180px;
}
}
.bg{
top: 0px;
left: 0px;
width: 100%;
height: 100%;
background-image: url('/common/bg.jpg');
background-position: center center;
background-size: 100% 100%;
}
#loading {
animation-name: rotation;
animation-duration: 10s;
animation-timing-function: linear;
animation-iteration-count: infinite;
}
@keyframes rotation {
from {
transform: rotate(0deg);
}
to {
transform: rotate(359deg);
}
}
index.js:
Code:
import router from '@system.router';
import brightness from '@system.brightness';
var counter = 10;
export default {
data: {
timer: null
},
onInit(){
this.timer = setInterval(this.run,500);
},
onReady() {
this.setBrightnessKeepScreenOn();
},
run(){
counter = counter - 1;
if(counter == 0){
clearInterval(this.timer);
this.timer = null;
router.replace({
uri: 'pages/playerList/playerList'
});
}
},
onDestroy(){
clearInterval(this.timer);
this.timer = null;
},
// Setting the screen to be steady on
setBrightnessKeepScreenOn: function () {
brightness.setKeepScreenOn({
keepScreenOn: true,
success: function () {
console.log("handling set keep screen on success")
},
fail: function (data, code) {
console.log("handling set keep screen on fail, code:" + code);
}
});
},
}
Splash Screen in Action:
Splash Screen Notes:
We are using setInterval with a 500 ms period and repeat the tick 10 times. Once the counter completes, we redirect the user to the PlayerList screen.
4. Audio Player List Screen Development
In this section we make a Java-based player service (AVPlayService) and a player handler service (PlayQuranService) to manage the audio play function on the smart watch. The rest we manage in the JS UI code: we show the list of Quran chapters, and on click of a list item we show a dialog screen and play the audio with a timer.
Let’s start the development without wasting more time.
Utils:
Let's first add packages under the java folder to better manage our code, then make a utils package and add two Java classes: LogUtil.java and RequestParam.java.
LogUtil Class:
This class is responsible for managing logs from the Java code while implementing the player.
Code:
import ohos.hiviewdfx.HiLog;
import ohos.hiviewdfx.HiLogLabel;
/**
* Log utils
*
* @since 2021-01-20
*/
public class LogUtil {
private static final String TAG_LOG = "AVPlayer";
private static final int DOMAIN_ID = 0xD000F00;
private static final HiLogLabel LABEL_LOG = new HiLogLabel(3, DOMAIN_ID, LogUtil.TAG_LOG);
private static final String LOG_FORMAT = "%{public}s: %{public}s";
private LogUtil() {
}
/**
* Print debug log
*
* @param tag log tag
* @param msg log message
*/
public static void debug(String tag, String msg) {
HiLog.debug(LABEL_LOG, LOG_FORMAT, tag, msg);
}
/**
* Print info log
*
* @param tag log tag
* @param msg log message
*/
public static void info(String tag, String msg) {
HiLog.info(LABEL_LOG, LOG_FORMAT, tag, msg);
}
/**
* Print warn log
*
* @param tag log tag
* @param msg log message
*/
public static void warn(String tag, String msg) {
HiLog.warn(LABEL_LOG, LOG_FORMAT, tag, msg);
}
/**
* Print error log
*
* @param tag log tag
* @param msg log message
*/
public static void error(String tag, String msg) {
HiLog.error(LABEL_LOG, LOG_FORMAT, tag, msg);
}
}
RequestParam Class:
The RequestParam class holds the getter and setter for the URI that we pass from the JS code to the Java code to stream the audio file.
Code:
public class RequestParam {
private String uriQuran;
public String getUriQuran() {
return uriQuran;
}
public void setUriQuran(String uriQuran) {
this.uriQuran = uriQuran;
}
}
Model: (AVElementManager)
This class is used to prepare audio files for applications.
Code:
import com.android.wearable.lite.ksa.salman.utils.LogUtil;
import ohos.aafwk.ability.DataAbilityHelper;
import ohos.aafwk.ability.DataAbilityRemoteException;
import ohos.app.Context;
import ohos.data.resultset.ResultSet;
import ohos.media.common.AVDescription;
import ohos.media.common.AVMetadata;
import ohos.media.common.sessioncore.AVElement;
import ohos.media.photokit.metadata.AVStorage;
import ohos.utils.PacMap;
import ohos.utils.net.Uri;
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
/**
* This class is used to prepare audio files for applications.
*
* @since 2021-01-11
*/
public class AVElementManager {
private static final String TAG = AVElementManager.class.getSimpleName();
private List<AVElement> avElements = new ArrayList<>();
private AVElement current;
/**
* The construction method of this class
*
* @param context Context
*/
public AVElementManager(Context context) {
loadFromMediaLibrary(context);
}
private void loadFromMediaLibrary(Context context) {
Uri remoteUri = AVStorage.Audio.Media.EXTERNAL_DATA_ABILITY_URI;
DataAbilityHelper helper = DataAbilityHelper.creator(context, remoteUri, false);
try {
ResultSet resultSet = helper.query(remoteUri, null, null);
LogUtil.info(TAG, "The result size: " + resultSet.getRowCount());
processResult(resultSet);
resultSet.close();
} catch (DataAbilityRemoteException e) {
LogUtil.error(TAG, "Query system media failed.");
} finally {
helper.release();
}
}
private void processResult(ResultSet resultSet) {
while (resultSet.goToNextRow()) {
String path = resultSet.getString(resultSet.getColumnIndexForName(AVStorage.AVBaseColumns.DATA));
String title = resultSet.getString(resultSet.getColumnIndexForName(AVStorage.AVBaseColumns.TITLE));
long duration = resultSet.getInt(resultSet.getColumnIndexForName(AVStorage.AVBaseColumns.DURATION));
LogUtil.info(TAG, "Add new video file: " + path);
PacMap pacMap = new PacMap();
pacMap.putLongValue(AVMetadata.AVLongKey.DURATION, duration);
AVDescription bean = new AVDescription.Builder().setTitle(title)
.setIMediaUri(Uri.parse(path))
.setMediaId(path)
.setExtras(pacMap)
.build();
avElements.add(new AVElement(bean, AVElement.AVELEMENT_FLAG_PLAYABLE));
}
setDefaultAVElement();
}
private void setDefaultAVElement() {
if (avElements.size() > 0) {
current = avElements.get(0);
}
}
/**
* get the list of avElements
*
* @return avElements the list of avElements
*/
public List<AVElement> getAvQueueElements() {
return avElements;
}
/**
* set the current AVElement by uri
*
* @param uri uri of item
* @return true if set success, else false
*/
public boolean setCurrentAVElement(Uri uri) {
for (AVElement element : avElements) {
if (element.getAVDescription().getMediaUri().toString().equals(uri.toString())) {
current = element;
return true;
}
}
setDefaultAVElement();
return false;
}
/**
* get the current AVElement
*
* @return AVElement the current AVElement
*/
public AVElement getCurrentAVElement() {
return current;
}
/**
* get the next AVElement
*
* @return AVElement the next AVElement
*/
public Optional<AVElement> getNextAVElement() {
for (int i = 0; i < avElements.size(); i++) {
if (avElements.get(i).equals(current)) {
int index = i + 1;
current = avElements.get(index < avElements.size() ? index : 0);
return Optional.of(current);
}
}
setDefaultAVElement();
return Optional.of(current);
}
/**
* get the previous AVElement
*
* @return AVElement the previous AVElement
*/
public Optional<AVElement> getPreviousAVElement() {
for (int i = 0; i < avElements.size(); i++) {
if (avElements.get(i).equals(current)) {
int index = i - 1;
current = avElements.get(index >= 0 ? index : avElements.size() - 1);
return Optional.of(current);
}
}
setDefaultAVElement();
return Optional.of(current);
}
}
Services: (AVPlayService and PlayQuranService)
First we need to make one package named services and define two services: AVPlayService and PlayQuranService.
AVPlayService:
This is the service of the player, based on AVBrowserService. This service runs in the background.
Code:
import com.android.wearable.lite.ksa.salman.model.AVElementManager;
import com.android.wearable.lite.ksa.salman.utils.LogUtil;
import ohos.aafwk.content.Intent;
import ohos.media.common.AVDescription;
import ohos.media.common.AVMetadata;
import ohos.media.common.Source;
import ohos.media.common.sessioncore.AVBrowserResult;
import ohos.media.common.sessioncore.AVBrowserRoot;
import ohos.media.common.sessioncore.AVPlaybackState;
import ohos.media.common.sessioncore.AVSessionCallback;
import ohos.media.player.Player;
import ohos.media.sessioncore.AVBrowserService;
import ohos.media.sessioncore.AVSession;
import ohos.utils.PacMap;
import ohos.utils.net.Uri;
import java.util.Timer;
import java.util.TimerTask;
/**
* The service of the player, that base on AVBrowserService.
*
* @since 2021-01-20
*/
public class AVPlayService extends AVBrowserService {
private static final String TAG = AVPlayService.class.getSimpleName();
/**
* parent media id 1
*/
public static final String PARENT_MEDIA_ID_1 = "PARENT_MEDIA_ID_1";
/**
* parent media id 2
*/
public static final String PARENT_MEDIA_ID_2 = "PARENT_MEDIA_ID_2";
private static final int TIME_DELAY = 500;
private static final int TIME_LOOP = 1000;
private AVElementManager avElementManager;
private AVSession avSession;
private Player player;
private Timer timer = new Timer();
private ProgressTimerTask progressTimerTask;
private boolean isFirstConnectService = true;
@Override
public void onStart(Intent intent) {
if (!isFirstConnectService) {
return;
}
super.onStart(intent);
isFirstConnectService = false;
LogUtil.info(TAG, "onStart");
avElementManager = new AVElementManager(this);
AVPlaybackState avPlaybackState = new AVPlaybackState.Builder().setAVPlaybackState(
AVPlaybackState.PLAYBACK_STATE_NONE, 0, 1.0f).build();
avSession = new AVSession(this, AVPlayService.class.getName());
avSession.setAVSessionCallback(avSessionCallback);
avSession.setAVPlaybackState(avPlaybackState);
setAVToken(avSession.getAVToken());
player = new Player(this);
}
@Override
public void onStop() {
super.onStop();
LogUtil.info(TAG, "onDestroy");
if (player != null) {
player.release();
player = null;
}
if (avSession != null) {
avSession.release();
avSession = null;
}
if (progressTimerTask != null) {
progressTimerTask.cancel();
progressTimerTask = null;
}
}
@Override
public AVBrowserRoot onGetRoot(String clientPackageName, int clientUid, PacMap rootHints) {
LogUtil.info(TAG, "onGetRoot");
return new AVBrowserRoot(PARENT_MEDIA_ID_1, null);
}
@Override
public void onLoadAVElementList(String parentId, AVBrowserResult result) {
LogUtil.info(TAG, "onLoadAVElementList");
result.detachForRetrieveAsync();
switch (parentId) {
case PARENT_MEDIA_ID_1: {
result.sendAVElementList(avElementManager.getAvQueueElements());
break;
}
case PARENT_MEDIA_ID_2:
default:
break;
}
}
@Override
public void onLoadAVElementList(String parentId, AVBrowserResult avBrowserResult, PacMap pacMap) {
LogUtil.info(TAG, "onLoadAVElementList-2");
}
@Override
public void onLoadAVElement(String parentId, AVBrowserResult avBrowserResult) {
LogUtil.info(TAG, "onLoadAVElement");
}
private AVSessionCallback avSessionCallback = new AVSessionCallback() {
@Override
public void onPlay() {
super.onPlay();
LogUtil.info(TAG + "-AVSessionCallback", "onPlay");
if (avSession.getAVController().getAVPlaybackState().getAVPlaybackState()
== AVPlaybackState.PLAYBACK_STATE_PAUSED) {
player.play();
AVPlaybackState avPlaybackState = new AVPlaybackState.Builder().setAVPlaybackState(
AVPlaybackState.PLAYBACK_STATE_PLAYING, player.getCurrentTime(), player.getPlaybackSpeed()).build();
avSession.setAVPlaybackState(avPlaybackState);
startProgressTaskTimer();
}
}
@Override
public void onPause() {
super.onPause();
LogUtil.info(TAG + "-AVSessionCallback", "onPause");
if (avSession.getAVController().getAVPlaybackState().getAVPlaybackState()
== AVPlaybackState.PLAYBACK_STATE_PLAYING) {
player.pause();
AVPlaybackState avPlaybackState = new AVPlaybackState.Builder().setAVPlaybackState(
AVPlaybackState.PLAYBACK_STATE_PAUSED, player.getCurrentTime(), player.getPlaybackSpeed()).build();
avSession.setAVPlaybackState(avPlaybackState);
}
}
@Override
public void onPlayNext() {
super.onPlayNext();
LogUtil.info(TAG + "-AVSessionCallback", "onPlayNext");
AVDescription next = avElementManager.getNextAVElement().get().getAVDescription();
play(next, 0);
}
@Override
public void onPlayPrevious() {
super.onPlayPrevious();
LogUtil.info(TAG + "-AVSessionCallback", "onPlayPrevious");
AVDescription previous = avElementManager.getPreviousAVElement().get().getAVDescription();
play(previous, 0);
}
private void play(AVDescription description, int position) {
player.reset();
player.setSource(new Source(description.getMediaUri().toString()));
player.prepare();
player.rewindTo(position);
player.play();
AVPlaybackState avPlaybackState = new AVPlaybackState.Builder().setAVPlaybackState(
AVPlaybackState.PLAYBACK_STATE_PLAYING, player.getCurrentTime(), player.getPlaybackSpeed()).build();
avSession.setAVPlaybackState(avPlaybackState);
avSession.setAVMetadata(getAVMetadata(description));
startProgressTaskTimer();
}
private AVMetadata getAVMetadata(AVDescription description) {
PacMap extrasPacMap = description.getExtras();
return new AVMetadata.Builder().setString(AVMetadata.AVTextKey.TITLE, description.getTitle().toString())
.setLong(AVMetadata.AVLongKey.DURATION, extrasPacMap.getLongValue(AVMetadata.AVLongKey.DURATION))
.setString(AVMetadata.AVTextKey.META_URI, description.getMediaUri().toString())
.build();
}
private void startProgressTaskTimer() {
if (progressTimerTask != null) {
progressTimerTask.cancel();
}
progressTimerTask = new ProgressTimerTask();
timer.schedule(progressTimerTask, TIME_DELAY, TIME_LOOP);
}
@Override
public void onPlayByUri(Uri uri, PacMap extras) {
LogUtil.info(TAG + "-AVSessionCallback", "onPlayByUri");
switch (avSession.getAVController().getAVPlaybackState().getAVPlaybackState()) {
case AVPlaybackState.PLAYBACK_STATE_PAUSED:
case AVPlaybackState.PLAYBACK_STATE_NONE: {
avElementManager.setCurrentAVElement(uri);
AVDescription current = avElementManager.getCurrentAVElement().getAVDescription();
play(current, 0);
break;
}
default:
break;
}
}
@Override
public void onPlayBySearch(String query, PacMap extras) {
LogUtil.info(TAG + "-AVSessionCallback", "onPlayBySearch");
}
@Override
public void onSetAVPlaybackCustomAction(String action, PacMap extras) {
super.onSetAVPlaybackCustomAction(action, extras);
LogUtil.info(TAG + "-AVSessionCallback", "onSetAVPlaybackCustomAction");
}
};
// used to get the playing status in period
class ProgressTimerTask extends TimerTask {
@Override
public void run() {
if (avSession.getAVController().getAVPlaybackState().getAVPlaybackState()
== AVPlaybackState.PLAYBACK_STATE_PLAYING) {
AVPlaybackState avPlaybackState = new AVPlaybackState.Builder().setAVPlaybackState(
AVPlaybackState.PLAYBACK_STATE_PLAYING, player.getCurrentTime(), player.getPlaybackSpeed()).build();
avSession.setAVPlaybackState(avPlaybackState);
}
}
}
}
PlayQuranService:
This service is responsible for bridging the JS code and the Java code, receiving input and sending responses in sync and async ways.
Code:
import com.android.wearable.lite.ksa.salman.utils.LogUtil;
import com.android.wearable.lite.ksa.salman.utils.RequestParam;
import ohos.aafwk.ability.Ability;
import ohos.aafwk.content.Intent;
import ohos.agp.window.dialog.ToastDialog;
import ohos.media.common.Source;
import ohos.media.player.Player;
import ohos.rpc.*;
import ohos.utils.zson.ZSONObject;
import java.util.HashMap;
import java.util.Map;
public class PlayQuranService extends Ability {
private static final String TAG = "PlayQuranService";
private MyRemote remote = new MyRemote();
private Player player;
// The FA calls Ability.connectAbility to connect to a PA. After the connection is successful, a remote object is returned in onConnect for the FA to send messages to the PA.
@Override
protected IRemoteObject onConnect(Intent intent) {
super.onConnect(intent);
return remote.asObject();
}
class MyRemote extends RemoteObject implements IRemoteBroker {
private static final int ERROR = -1;
private static final int SUCCESS = 0;
private static final int PLAY = 1001;
MyRemote() {
super("MyService_MyRemote");
}
@Override
public boolean onRemoteRequest(int code, MessageParcel data, MessageParcel reply, MessageOption option) {
switch (code) {
case PLAY: {
String zsonStr = data.readString();
RequestParam param = ZSONObject.stringToClass(zsonStr, RequestParam.class);
String uriQuran = param.getUriQuran();
this.play(uriQuran);
// The return value can only be a serializable object.
Map<String, Object> zsonResult = new HashMap<String, Object>();
zsonResult.put("code", SUCCESS);
zsonResult.put("abilityResult", "running");
reply.writeString(ZSONObject.toZSONString(zsonResult));
break;
}
default: {
reply.writeString("service not defined");
return false;
}
}
return true;
}
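// Toggles playback: if nothing is currently playing, starts playback of the given URI; otherwise stops the current playback.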
private void play(String uri) {
if (player == null || !player.isNowPlaying()) {
if (uri.isEmpty()) {
LogUtil.warn(TAG, "input uri is empty.");
return;
}
if (player != null) {
player.release();
}
player = new Player(getApplicationContext());
Source source = new Source(uri);
if (!player.setSource(source)) {
LogUtil.warn(TAG, "uri is invalid");
return;
}
if (!player.prepare()) {
LogUtil.warn(TAG, "prepare failed");
return;
}
if (!player.play()) {
LogUtil.warn(TAG, "play failed");
return;
}
} else {
stopPlay();
}
}
private void stopPlay() {
if(player != null) {
player.stop();
player.release();
}
}
private void showToast(String msg) {
new ToastDialog(getAbilityPackageContext()).setText(msg).setDuration(1000).show();
}
@Override
public IRemoteObject asObject() {
return this;
}
}
}
Data Model: (quranChaptersList.js)
We need a data model to hold the list of Quran chapters and their online audio file URLs. The data could also be fetched from a server, but in this article we use a pre-generated JSON data model for simplicity.
Define the quranChaptersList.js file in common folder.
Code:
const quranChaptersList = [
    {id: 1, name: "1. Surat Al-Fatihah", time: "00:43", url: "https://download.quranicaudio.com/quran/abdurrahmaan_as-sudays/001.mp3"},
    {id: 2, name: "2. Surat Al-Baqarah", time: "01:39:32", url: "https://download.quranicaudio.com/quran/abdurrahmaan_as-sudays/002.mp3"},
    {id: 3, name: "3. Surat Ali 'Imran", time: "52:16", url: "https://download.quranicaudio.com/quran/abdurrahmaan_as-sudays/003.mp3"},
    {id: 4, name: "4. Surat An-Nisa", time: "01:02:26", url: "https://download.quranicaudio.com/quran/abdurrahmaan_as-sudays/004.mp3"}
];
export default quranChaptersList;
playerList.hml:
This page displays the list of Quran chapters and a dialog that hosts the audio player with a timer and play/stop functionality.
Code:
<div class="container" onswipe="touchMove">
<dialog id="playerDialog" class="dialog-main">
<div class="dialog-bg">
<div class="dialog-div">
<div class="inner-txt">
<marquee class="chapter-name">{{currentChapterName}}</marquee>
<image onclick="resumeQuranAudio()" if="{{!isPlaying}}" style="width: 56px; height: 56px;"
src="/common/play.png"></image>
<image onclick="resumeQuranAudio()" if="{{isPlaying}}" style="width: 56px; height: 56px;"
src="/common/stop.png"></image>
<text class="txt">{{remainTime}} / {{currentChapterTime}}
</text>
</div>
<div class="inner-btn">
<button type="capsule" value="close" onclick="cancelPlayer" class="btn-txt"></button>
</div>
</div>
</div>
</dialog>
<list class="wearable">
<list-item for="{{quranChapterData}}" tid="id" type="listItem" class="list-box"
onclick="openQuranAudio({{$item.id}}, {{$item.url}}, true)">
<div id="play" class="img-box play">
<image src="../../common/play.png"></image>
</div>
<div class="text-box">
<text class="title-text">
{{$item.name}}
</text>
</div>
</list-item>
</list>
</div>
playerList.css:
Code:
.container {
flex-direction: column;
justify-content: center;
align-items: center;
background-image: url('/common/list-bg.jpg');
background-position: center center;
background-size: 100% 100%;
}
.wearable {
width: 240px;
height: 233px;
flex-direction: column;
justify-content: flex-start;
align-items: center;
/* background-color: rgba(255, 255, 255, .75);*/
border-radius: 116.5px;
}
.title {
color: #ffffff;
font-size: 20px;
text-align: center;
margin-top: 20px;
margin-bottom: 35px;
}
.list-box {
flex-direction: row;
justify-content: flex-start;
align-items: center;
height: 56px;
border-radius: 30px;
background-color: rgba(255, 255, 255, .35);
margin-bottom: 5px;
}
.img-box {
width: 46px;
height: 46px;
margin-left: 14px;
margin-top: 0px;
}
.text-box {
flex-direction: column;
justify-content: center;
width: 180px;
margin-left: 0px;
margin-top: 0px;
}
.title-text {
color: #ffffff;
font-size: 18px;
text-align: left;
padding-left: 5px;
}
.subtitle-text {
color: #808080;
font-size: 19.5px;
text-align: center;
margin-top: 2px;
}
.icon-box {
width: 25px;
height: 25px;
margin-top: 24px;
margin-left: 20px;
}
.play{
opacity: 1;
}
.pause{
display: none;
}
.visible{
opacity: 1;
}
.dialog-main {
width: 400px;
height: 400px;
}
.dialog-bg {
display: flex;
background-image: url('/common/list-bg.jpg');
background-position: center center;
background-size: 100% 100%;
}
.dialog-div {
width: 400px;
height: 400px;
flex-direction: column;
align-items: center;
border-radius: 400px;
background-color: rgba(255, 255, 255, .15);
}
.inner-txt {
width: 400px;
height: 160px;
flex-direction: column;
align-items: center;
justify-content: space-around;
background-color: transparent;
}
.inner-btn {
width: 400px;
height: 80px;
justify-content: space-around;
align-items: center;
}
.chapter-name{
font-size: 24px;
margin-top: 40px;
}
.btn-txt{
background-color: rgba(240,126,138, .35);
text-color: whitesmoke;
}
playerList.js:
In this file we manage all the logic of the audio player.
Structural - Code:
Code:
import prompt from '@system.prompt';
import brightness from '@system.brightness';
import app from '@system.app';
import quranChaptersList from '../../common/quranChaptersList';
const globalRef = Object.getPrototypeOf(global) || global;
globalRef.regeneratorRuntime = require('@babel/runtime/regenerator');
// Set abilityType to 0 (ability) or 1 (internal ability).
const ABILITY_TYPE_EXTERNAL = 0;
const ABILITY_TYPE_INTERNAL = 1;
// Set syncOption to 0 (synchronous, default value) or 1 (asynchronous). This parameter is optional.
const ACTION_SYNC = 0;
const ACTION_ASYNC = 1;
const ACTION_MESSAGE_CODE_PLAY = 1001;
export default {}
Data:
Code:
data: {
quranChapterData: quranChaptersList,
currentChapterName: null,
currentChapterTime: null,
currentPlayingObj: null,
isPlaying: false,
remainTime:'',
countDownTimer: null,
showHours: false,
},
Common - Code:
Code:
onReady() {
this.setBrightnessKeepScreenOn();
},// Setting the screen to be steady on
setBrightnessKeepScreenOn: function () {
brightness.setKeepScreenOn({
keepScreenOn: true,
success: function () {
console.log("handling set keep screen on success")
},
fail: function (data, code) {
console.log("handling set keep screen on fail, code:" + code);
}
});
},
touchMove(e){ // Handle the swipe event.
if(e.direction == "right") // Swipe right to exit.
{
this.appExit();
}
},
appExit(){ // Exit the application.
app.terminate();
}
Audio Player - Code:
Code:
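// Looks up the selected chapter, shows the player dialog (when mode is true), and calls the Java PA to start or stop playback.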
openQuranAudio: async function(id, uriQuran, mode) {
var _this = this;
this.clearTimer();
var currentPlayingObj = quranChaptersList.filter((current)=> current.id == id);
_this.currentPlayingObj = currentPlayingObj[0];
_this.currentChapterName = _this.currentPlayingObj.name;
if(mode){
_this.$element('playerDialog').show();
_this.currentChapterTime = _this.currentPlayingObj.time;
_this.remainTime = _this.currentChapterTime;
}
var actionData = {};
actionData.firstNum = 1024;
actionData.secondNum = 2048;
actionData.uriQuran = uriQuran;
var action = {};
action.bundleName = 'com.android.wearable.lite.ksa.salman';
action.abilityName = 'com.android.wearable.lite.ksa.salman.services.PlayQuranService';
action.messageCode = ACTION_MESSAGE_CODE_PLAY;
action.data = actionData;
action.abilityType = ABILITY_TYPE_EXTERNAL;
action.syncOption = ACTION_SYNC;
var result = await FeatureAbility.callAbility(action);
var ret = JSON.parse(result);
if (ret.code == 0) {
console.info('player result is:' + JSON.stringify(ret.abilityResult));
_this.isPlaying = !_this.isPlaying;
_this.manageTimer();
} else {
console.error('player error code:' + JSON.stringify(ret.code));
prompt.showToast({
message: 'player error code:' + JSON.stringify(ret.code)
})
}
},
manageTimer(){
if(this.isPlaying){
this.setTimeInfo(this.remainTime);
}
},
resumeQuranAudio(){
this.clearTimer();
let id = this.currentPlayingObj.id;
let url = this.currentPlayingObj.url;
this.openQuranAudio(id, url, false);
},
cancelPlayer(e) {
if(this.isPlaying){
this.resumeQuranAudio();
}
this.$element('playerDialog').close();
this.clearTimer();
},
clearTimer(){
clearInterval(this.countDownTimer);
this.countDownTimer = null;
},
onDestroy(){
this.clearTimer();
},
Timer - Code:
Code:
getCalculatedTime(playerTimeInput){
let _this = this;
let playerTime = {hours: 0, minutes: 0, seconds: 0};
let timeSplit = playerTimeInput.split(':');
console.info("timeSplit: "+ timeSplit.length);
if(timeSplit.length === 3){
playerTime = {hours: parseInt(timeSplit[0]), minutes: parseInt(timeSplit[1]), seconds: parseInt(timeSplit[2])}
_this.showHours = true;
} else if(timeSplit.length === 2){
playerTime = {hours: 0, minutes: parseInt(timeSplit[0]), seconds: parseInt(timeSplit[1])}
_this.showHours = false;
} else {
playerTime = {hours: 0, minutes: 0, seconds: parseInt(timeSplit[0])}
_this.showHours = false;
}
let dateTime = new Date();
dateTime.setHours(dateTime.getHours() + playerTime.hours);
dateTime.setMinutes(dateTime.getMinutes() + playerTime.minutes);
dateTime.setSeconds(dateTime.getSeconds() + playerTime.seconds);
let calculatedTime = dateTime.getTime();
console.info("calculatedTime: "+ calculatedTime);
return calculatedTime;
},
caculateTime(timeObj) {
let myDate = new Date();
let currentTime = myDate.getTime();
var targetTime = parseInt(timeObj);
var remainTime = parseInt(targetTime - currentTime);
if (remainTime > 0 ) {
this.isShowTargetTime = true;
this.setRemainTime(remainTime);
this.setTargetTime(targetTime);
} else {
this.isPlaying = false;
}
},
setRemainTime(remainTime) {
let days = this.addZero(Math.floor(remainTime / (24 * 3600 * 1000))); // Calculate the number of days.
let leavel = remainTime % (24 * 3600 * 1000); // Time remaining after calculating the number of days
let hours = this.addZero(Math.floor(leavel / (3600 * 1000))); // Calculate the number of hours remaining
let leavel2 = leavel % (3600 * 1000); // Number of milliseconds remaining after calculating the remaining hours
let minutes = this.addZero(Math.floor(leavel2 / (60 * 1000))); // Calculate the remaining minutes
// Calculate the difference in seconds.
let leavel3 = leavel2 % (60 * 1000); // Number of milliseconds remaining after calculating the number of minutes
let seconds = this.addZero(Math.round(leavel3 / 1000));
if(this.showHours){
this.remainTime = hours + ':' + minutes + ':' + seconds;
} else {
this.remainTime = minutes + ':' + seconds;
}
},
setTargetTime(targetTime) {
var times = new Date(targetTime);
let date = times.toLocaleDateString(); //Obtains the current date.
var tempSetHours = times.getHours(); //Obtains the current number of hours.(0-23)
let hours = this.addZero(tempSetHours)
var tempSetMinutes = times.getMinutes(); //Obtains the current number of minutes.(0-59)
let minutes = this.addZero(tempSetMinutes)
var tempSetSeconds = times.getSeconds(); //Obtains the current number of seconds.(0-59)
let seconds = this.addZero(tempSetSeconds)
this.targetTime = `${hours}:${minutes}:${seconds}`;
},
addZero: function(i){
return i < 10 ? "0" + i: i + "";
},
References:
HarmonyOS JS API Official Documentation: developer.harmonyos.com
HarmonyOS Java API Official Documentation: developer.harmonyos.com
Conclusion:
Developers can build applications for the Huawei Smart Watch using DevEco Studio. With HarmonyOS, developers can use JS, Java, and C/C++ to develop elegant and smart applications for smart wearables, cars, TVs, Smart Vision devices, phones, and tablets.

Beginners: Explaining Database Storage in Huawei Harmony using SQLite

Introduction
In this article, we will create an app demonstrating the following storage features:
1. Create database and create table
2. Insert data
3. Update data
4. Delete data
5. Fetch data
Requirements
1. DevEco IDE
2. Wearable watch (a simulator can also be used)
HarmonyOS supports various storage options:
1. Key-value storage (similar to shared preferences)
2. File storage
3. SQLite database
In this article, we will test the SQLite database.
UI Design
ability_main.xml
XML:
<?xml version="1.0" encoding="utf-8"?>
<DirectionalLayout
xmlns:ohos="http://schemas.huawei.com/res/ohos"
ohos:height="match_parent"
ohos:width="match_parent"
ohos:orientation="vertical"
ohos:background_element="#8c7373"
ohos:padding="32">
<Text
ohos:multiple_lines="true"
ohos:id="$+id:text"
ohos:height="match_content"
ohos:width="200"
ohos:layout_alignment="horizontal_center"
ohos:text="Text"
ohos:text_size="10fp"/>
<Button
ohos:id="$+id:button"
ohos:height="match_content"
ohos:width="match_content"
ohos:background_element="$graphic:background_button"
ohos:layout_alignment="horizontal_center"
ohos:text="$string:save"
ohos:text_size="30"
ohos:top_margin="5"/>
<Button
ohos:id="$+id:button_get"
ohos:height="match_content"
ohos:width="match_content"
ohos:background_element="$graphic:background_button"
ohos:layout_alignment="horizontal_center"
ohos:padding="5"
ohos:text="$string:read"
ohos:text_size="30"
ohos:top_margin="5"/>
<Button
ohos:id="$+id:button_update"
ohos:height="match_content"
ohos:width="match_content"
ohos:background_element="$graphic:background_button"
ohos:layout_alignment="horizontal_center"
ohos:padding="5"
ohos:text="$string:update"
ohos:text_size="30"
ohos:top_margin="5"/>
<Button
ohos:id="$+id:button_delete"
ohos:height="match_content"
ohos:width="match_content"
ohos:background_element="$graphic:background_button"
ohos:layout_alignment="horizontal_center"
ohos:padding="5"
ohos:text="$string:delete"
ohos:text_size="30"
ohos:top_margin="5"/>
</DirectionalLayout>
MainAbilitySlice.java
Java:
package com.example.testwearableemptyfeaturejava.slice;
import com.example.testwearableemptyfeaturejava.ResourceTable;
import ohos.aafwk.ability.AbilitySlice;
import ohos.aafwk.content.Intent;
import ohos.agp.colors.RgbColor;
import ohos.agp.components.Button;
import ohos.agp.components.Component;
import ohos.agp.components.Text;
import ohos.agp.components.element.ShapeElement;
import ohos.agp.window.dialog.ToastDialog;
import ohos.app.Context;
import ohos.data.DatabaseHelper;
import ohos.data.rdb.*;
import ohos.data.resultset.ResultSet;
import ohos.hiviewdfx.HiLog;
import ohos.hiviewdfx.HiLogLabel;
public class MainAbilitySlice extends AbilitySlice {
static final HiLogLabel LABEL = new HiLogLabel(HiLog.LOG_APP, 0x00201, "MY_TAG");
RdbStore mStore;
Text mText;
@Override
public void onStart(Intent intent) {
super.onStart(intent);
super.setUIContent(ResourceTable.Layout_ability_main);
initDb(getApplicationContext());
mText = (Text) findComponentById(ResourceTable.Id_text);
Button button = (Button) findComponentById(ResourceTable.Id_button);
if (button != null) {
button.setClickedListener(new Component.ClickedListener() {
@Override
// Register a listener for observing click events of the button.
public void onClick(Component component) {
HiLog.warn(LABEL, "inside %{public}s", "MainAbilitySliceButtonClick");
// Add the operation to perform when the button is clicked.
insertData();
}
});
}
Button buttonGet = (Button) findComponentById(ResourceTable.Id_button_get);
if(buttonGet != null){
buttonGet.setClickedListener(new Component.ClickedListener() {
@Override
public void onClick(Component component) {
HiLog.warn(LABEL, "inside %{public}s", "get data");
readData();
}
});
}
Button buttonDelete = (Button) findComponentById(ResourceTable.Id_button_delete);
if(buttonDelete != null){
buttonDelete.setClickedListener(new Component.ClickedListener() {
@Override
public void onClick(Component component) {
HiLog.warn(LABEL, "inside %{public}s", "deleteData");
deleteData();
}
});
}
Button buttonUpdate = (Button) findComponentById(ResourceTable.Id_button_update);
if(buttonUpdate != null){
buttonUpdate.setClickedListener(new Component.ClickedListener() {
@Override
public void onClick(Component component) {
HiLog.warn(LABEL, "inside %{public}s", "updateData");
updateData();
}
});
}
}
private void initDb(Context context){
StoreConfig config = StoreConfig.newDefaultConfig("RdbStoreTest.db");
final RdbOpenCallback callback = new RdbOpenCallback() {
@Override
public void onCreate(RdbStore store) {
store.executeSql("CREATE TABLE IF NOT EXISTS test (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT NOT NULL, age INTEGER, salary REAL, blobType BLOB)");
}
@Override
public void onUpgrade(RdbStore store, int oldVersion, int newVersion) {
}
};
DatabaseHelper helper = new DatabaseHelper(context);
mStore = helper.getRdbStore(config, 1, callback, null);
}
private void insertData(){
ValuesBucket values = new ValuesBucket();
//values.putInteger("id", 2);
values.putString("name", "kamal");
values.putInteger("age", 18);
values.putDouble("salary", 100.5);
values.putByteArray("blobType", new byte[] {1, 2, 3});
long id = mStore.insert("test", values);
HiLog.warn(LABEL, "insert completed %{public}s", "id is"+id);
showToastMessage("data inserted successfully");
}
private void readData(){
try {
String[] columns = new String[] {"id", "name", "age", "salary"};
RdbPredicates rdbPredicates = new RdbPredicates("test").orderByAsc("salary");
ResultSet resultSet = mStore.query(rdbPredicates, columns);
if(resultSet == null || resultSet.getRowCount() <=0){
showToastMessage("no data in table");
return;
}
String data = "";
while(resultSet.goToNextRow()){
String name = resultSet.getString(resultSet.getColumnIndexForName("name"));
String age = resultSet.getString(resultSet.getColumnIndexForName("age"));
String salary = resultSet.getString(resultSet.getColumnIndexForName("salary"));
HiLog.warn(LABEL, "inside %{public}s", "read data"+name);
data = data + "[" + name + "][" + age + "][" + salary + "]\n";
}
mText.setText(data);
HiLog.warn(LABEL, "read completedqq %{public}s", "");
showToastMessage("data read successfully");
}catch (Exception e){
e.printStackTrace();
}
}
private void updateData(){
try {
ValuesBucket values = new ValuesBucket();
values.putString("name", "updated kamal");
values.putInteger("age", 28);
values.putDouble("salary", 200.5);
values.putByteArray("blobType", new byte[] {1, 2, 3});
AbsRdbPredicates rdbPredicates = new RdbPredicates("test").equalTo("age", 18);
int index = mStore.update(values, rdbPredicates);
HiLog.warn(LABEL, "update completed %{public}s", ""+index);
showToastMessage("data updated successfully");
}catch (Exception e){
e.printStackTrace();
}
}
private void deleteData(){
try {
RdbPredicates rdbPredicates = new RdbPredicates("test").equalTo("age", 18);
int index = mStore.delete(rdbPredicates);
HiLog.warn(LABEL, "delete completed %{public}s", ""+index);
showToastMessage("data deleted successfully");
}catch (Exception e){
e.printStackTrace();
}
}
private void showToastMessage(String string){
new ToastDialog(getApplicationContext()).setText(string).setAlignment(1).setSize(300,50).show();
}
@Override
public void onActive() {
super.onActive();
}
@Override
public void onForeground(Intent intent) {
super.onForeground(intent);
}
}
MainAbility.java
Java:
package com.example.testwearableemptyfeaturejava;
import com.example.testwearableemptyfeaturejava.slice.MainAbilitySlice;
import ohos.aafwk.ability.Ability;
import ohos.aafwk.content.Intent;
public class MainAbility extends Ability {
@Override
public void onStart(Intent intent) {
super.onStart(intent);
super.setMainRoute(MainAbilitySlice.class.getName());
}
}
Code Explanation
Create the database in “MainAbilitySlice.java” or any separate class.
Java:
private void initDb(Context context){
StoreConfig config = StoreConfig.newDefaultConfig("RdbStoreTest.db");
final RdbOpenCallback callback = new RdbOpenCallback() {
@Override
public void onCreate(RdbStore store) {
store.executeSql("CREATE TABLE IF NOT EXISTS test (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT NOT NULL, age INTEGER, salary REAL, blobType BLOB)");
}
@Override
public void onUpgrade(RdbStore store, int oldVersion, int newVersion) {
}
};
DatabaseHelper helper = new DatabaseHelper(context);
mStore = helper.getRdbStore(config, 1, callback, null);
}
If the database does not exist, it will be created, and the onCreate callback creates the test table. The empty onUpgrade callback is the place for schema migrations when the database version increases.
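As an illustration, here is a minimal sketch of how onUpgrade could add a column on a version bump; the department column and the version numbers are assumptions for illustration, not part of the original sample.
Java:
@Override
public void onUpgrade(RdbStore store, int oldVersion, int newVersion) {
    // Hypothetical migration: add a "department" column when upgrading from version 1 to 2.
    if (oldVersion < 2) {
        store.executeSql("ALTER TABLE test ADD COLUMN department TEXT");
    }
}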
Insert data in “MainAbilitySlice.java” or any new class.
Java:
private void insertData(){
ValuesBucket values = new ValuesBucket();
values.putString("name", "kamal");
values.putInteger("age", 18);
values.putDouble("salary", 100.5);
values.putByteArray("blobType", new byte[] {1, 2, 3});
long id = mStore.insert("test", values);
HiLog.warn(LABEL, "insert completed %{public}s", "id is"+id);
}
A new row is inserted into the test table, and the generated row ID is logged. Reading the data back is shown below.
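Read data in “MainAbilitySlice.java” or any class. This is the readData method from the full listing above: it queries the test table with RdbPredicates, walks the ResultSet row by row, and displays the result in the Text component.
Java:
private void readData(){
    try {
        String[] columns = new String[] {"id", "name", "age", "salary"};
        RdbPredicates rdbPredicates = new RdbPredicates("test").orderByAsc("salary");
        ResultSet resultSet = mStore.query(rdbPredicates, columns);
        if(resultSet == null || resultSet.getRowCount() <= 0){
            showToastMessage("no data in table");
            return;
        }
        String data = "";
        while(resultSet.goToNextRow()){
            String name = resultSet.getString(resultSet.getColumnIndexForName("name"));
            String age = resultSet.getString(resultSet.getColumnIndexForName("age"));
            String salary = resultSet.getString(resultSet.getColumnIndexForName("salary"));
            data = data + "[" + name + "][" + age + "][" + salary + "]\n";
        }
        mText.setText(data);
        showToastMessage("data read successfully");
    }catch (Exception e){
        e.printStackTrace();
    }
}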
Update a row in “MainAbilitySlice.java” or any class.
Java:
private void updateData(){
try {
ValuesBucket values = new ValuesBucket();
values.putString("name", "updated kamal");
values.putInteger("age", 28);
values.putDouble("salary", 200.5);
values.putByteArray("blobType", new byte[] {1, 2, 3});
AbsRdbPredicates rdbPredicates = new RdbPredicates("test").equalTo("age", 18);
int index = mStore.update(values, rdbPredicates);
HiLog.warn(LABEL, "update completed %{public}s", ""+index);
showToastMessage("data updated successfully");
}catch (Exception e){
e.printStackTrace();
}
}
Delete data in “MainAbilitySlice.java” or any class.
Java:
private void deleteData(){
try {
RdbPredicates rdbPredicates = new RdbPredicates("test").equalTo("age", 18);
int index = mStore.delete(rdbPredicates);
HiLog.warn(LABEL, "delete completed %{public}s", ""+index);
showToastMessage("data deleted successfully");
}catch (Exception e){
e.printStackTrace();
}
}
Tips and Tricks
1. All the file operations are asynchronous.
2. Relational mapping is possible.
3. RDB can use a maximum of four connection pools to manage read and write operations.
4. To ensure data accuracy, the RDB supports only one write operation at a time.
5. RdbPredicates: You do not need to write complex SQL statements. Instead, you can combine SQL clauses simply by calling methods in this class, such as equalTo, notEqualTo, groupBy, orderByAsc, and beginsWith; see the sketch after this list.
6. RawRdbPredicates: You can set whereClause and whereArgs, but cannot call methods such as equalTo.
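For example, here is a minimal sketch of chaining predicate methods on the test table from above (notEqualTo and orderByAsc are the methods named in the tip):
Java:
// Build a query without writing raw SQL: rows where age != 18, ordered by salary ascending.
RdbPredicates predicates = new RdbPredicates("test")
        .notEqualTo("age", 18)
        .orderByAsc("salary");
ResultSet resultSet = mStore.query(predicates, new String[] {"id", "name", "age", "salary"});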
Conclusion
In this article, we have learned to save, update, delete, and retrieve data using the SQLite database in HarmonyOS, along with the UI components.
Reference
Harmony Official document
DevEco Studio User guide
JS API Reference

Find hand points using Hand Gesture Recognition feature by Huawei ML Kit in Android (Kotlin)

Introduction
In this article, we will learn how to find hand key points using the Hand Gesture Recognition feature of Huawei ML Kit. This service provides two capabilities: hand keypoint detection and hand gesture recognition. The hand keypoint detection capability can detect 21 hand keypoints (including fingertips, knuckles, and wrists) and return their positions. The hand gesture recognition capability can detect and return the positions of all rectangular hand areas in images and videos, as well as the type and confidence of a gesture. It can recognize 14 gestures, including thumbs-up/down, the OK sign, fist, finger heart, and number gestures from 1 to 9. Both capabilities support detection from static images and real-time camera streams.
Use Cases
Hand keypoint detection is widely used in daily life. For example, after integrating this capability, users can convert the detected hand keypoints into a 2D model, and synchronize the model to the character model, to produce a vivid 2D animation. In addition, when shooting a short video, special effects can be generated based on dynamic hand trajectories. This allows users to play finger games, thereby making the video shooting process more creative and interactive. Hand gesture recognition enables your app to call various commands by recognizing users' gestures. Users can control their smart home appliances without touching them. In this way, this capability makes the human-machine interaction more efficient.
Requirements
1. Any operating system (MacOS, Linux and Windows).
2. Must have a Huawei phone with HMS 4.0.0.300 or later.
3. Must have a laptop or desktop with Android Studio, JDK 1.8, SDK platform 26, and Gradle 4.6 or above installed.
4. Minimum API Level 21 is required.
5. EMUI 9.0.0 or later devices are required.
How to integrate HMS Dependencies
1. First, register as a Huawei developer and complete identity verification on the Huawei Developers website; refer to Register a Huawei ID.
2. Create a project in Android Studio; refer to Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, click Gradle in the upper-right corner of the Android project, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the user created name.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, then copy and paste it into the app directory of the Android project, as follows.
7. Enter SHA-256 certificate fingerprint and click Save button, as follows.
Note: The above steps from Step 1 to 7 are common for all Huawei Kits.
8. Click Manage APIs tab and enable ML Kit.
9. Add the below maven URL in build.gradle(Project) file under the repositories of buildscript, dependencies and allprojects, refer Add Configuration.
Code:
maven { url 'http://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.4.1.300'
10. Add the below plugin and dependencies in build.gradle(Module) file.
Code:
apply plugin: 'com.huawei.agconnect'
// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.6.5.300'
// ML Kit Hand Gesture
// Import the base SDK
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint:2.1.0.300'
// Import the hand keypoint detection model package.
implementation 'com.huawei.hms:ml-computer-vision-handkeypoint-model:2.1.0.300'
11. Now sync the Gradle files.
12. Add the required permission to the AndroidManifest.xml file.
XML:
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.INTERNET" />
Let us move to development
I have created a project in Android Studio with an empty activity. Let us start coding.
In the MainActivity.kt we can find the business logic for buttons.
Kotlin:
class MainActivity : AppCompatActivity() {
private var staticButton: Button? = null
private var liveButton: Button? = null
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
staticButton = findViewById(R.id.btn_static)
liveButton = findViewById(R.id.btn_live)
staticButton!!.setOnClickListener {
val intent = Intent(this@MainActivity, StaticHandKeyPointAnalyse::class.java)
startActivity(intent)
}
liveButton!!.setOnClickListener {
val intent = Intent(this@MainActivity, LiveHandKeyPointAnalyse::class.java)
startActivity(intent)
}
}
}
In the LiveHandKeyPointAnalyse.kt we can find the business logic for live analysis.
Kotlin:
class LiveHandKeyPointAnalyse : AppCompatActivity(), View.OnClickListener {
private val TAG: String = LiveHandKeyPointAnalyse::class.java.getSimpleName()
private var mPreview: LensEnginePreview? = null
private var mOverlay: GraphicOverlay? = null
private var mFacingSwitch: Button? = null
private var mAnalyzer: MLHandKeypointAnalyzer? = null
private var mLensEngine: LensEngine? = null
private val lensType = LensEngine.BACK_LENS
private var mLensType = 0
private var isFront = false
private var isPermissionRequested = false
private val CAMERA_PERMISSION_CODE = 0
private val ALL_PERMISSION = arrayOf(Manifest.permission.CAMERA)
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_live_hand_key_point_analyse)
if (savedInstanceState != null) {
mLensType = savedInstanceState.getInt("lensType")
}
initView()
createHandAnalyzer()
if (Camera.getNumberOfCameras() == 1) {
mFacingSwitch!!.visibility = View.GONE
}
// Checking Camera Permissions
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
createLensEngine()
} else {
checkPermission()
}
}
private fun initView() {
mPreview = findViewById(R.id.hand_preview)
mOverlay = findViewById(R.id.hand_overlay)
mFacingSwitch = findViewById(R.id.handswitch)
mFacingSwitch!!.setOnClickListener(this)
}
private fun createHandAnalyzer() {
// Create an analyzer. You can create an analyzer using the provided customized hand keypoint detection parameter: MLHandKeypointAnalyzerSetting
val setting = MLHandKeypointAnalyzerSetting.Factory()
.setMaxHandResults(2)
.setSceneType(MLHandKeypointAnalyzerSetting.TYPE_ALL)
.create()
mAnalyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer(setting)
mAnalyzer!!.setTransactor(HandAnalyzerTransactor(this, mOverlay!!) )
}
// Check the permissions required by the SDK.
private fun checkPermission() {
if (Build.VERSION.SDK_INT >= 23 && !isPermissionRequested) {
isPermissionRequested = true
val permissionsList = ArrayList<String>()
for (perm in getAllPermission()!!) {
if (PackageManager.PERMISSION_GRANTED != checkSelfPermission(perm.toString())) {
permissionsList.add(perm.toString())
}
}
if (!permissionsList.isEmpty()) {
requestPermissions(permissionsList.toTypedArray(), 0)
}
}
}
private fun getAllPermission(): MutableList<Array<String>> {
return Collections.unmodifiableList(listOf(ALL_PERMISSION))
}
private fun createLensEngine() {
val context = this.applicationContext
// Create LensEngine.
mLensEngine = LensEngine.Creator(context, mAnalyzer)
.setLensType(mLensType)
.applyDisplayDimension(640, 480)
.applyFps(25.0f)
.enableAutomaticFocus(true)
.create()
}
private fun startLensEngine() {
if (mLensEngine != null) {
try {
mPreview!!.start(mLensEngine, mOverlay)
} catch (e: IOException) {
Log.e(TAG, "Failed to start lens engine.", e)
mLensEngine!!.release()
mLensEngine = null
}
}
}
// Permission application callback.
override fun onRequestPermissionsResult(requestCode: Int, permissions: Array<String?>, grantResults: IntArray) {
var hasAllGranted = true
if (requestCode == CAMERA_PERMISSION_CODE) {
if (grantResults[0] == PackageManager.PERMISSION_GRANTED) {
createLensEngine()
} else if (grantResults[0] == PackageManager.PERMISSION_DENIED) {
hasAllGranted = false
if (!ActivityCompat.shouldShowRequestPermissionRationale(this, permissions[0]!!)) {
showWarningDialog()
} else {
Toast.makeText(this, R.string.toast, Toast.LENGTH_SHORT).show()
finish()
}
}
return
}
super.onRequestPermissionsResult(requestCode, permissions, grantResults)
}
override fun onSaveInstanceState(outState: Bundle) {
outState.putInt("lensType", mLensType)
super.onSaveInstanceState(outState)
}
private class HandAnalyzerTransactor internal constructor(mainActivity: LiveHandKeyPointAnalyse?,
private val mGraphicOverlay: GraphicOverlay) : MLTransactor<MLHandKeypoints?> {
// Process the results returned by the analyzer.
override fun transactResult(result: MLAnalyzer.Result<MLHandKeypoints?>) {
mGraphicOverlay.clear()
val handKeypointsSparseArray = result.analyseList
val list: MutableList<MLHandKeypoints?> = ArrayList()
for (i in 0 until handKeypointsSparseArray.size()) {
list.add(handKeypointsSparseArray.valueAt(i))
}
val graphic = HandKeypointGraphic(mGraphicOverlay, list)
mGraphicOverlay.add(graphic)
}
override fun destroy() {
mGraphicOverlay.clear()
}
}
override fun onClick(v: View?) {
when (v!!.id) {
R.id.handswitch -> switchCamera()
else -> {}
}
}
private fun switchCamera() {
isFront = !isFront
mLensType = if (isFront) {
LensEngine.FRONT_LENS
} else {
LensEngine.BACK_LENS
}
if (mLensEngine != null) {
mLensEngine!!.close()
}
createLensEngine()
startLensEngine()
}
private fun showWarningDialog() {
val dialog = AlertDialog.Builder(this)
dialog.setMessage(R.string.Information_permission)
.setPositiveButton(R.string.go_authorization,
DialogInterface.OnClickListener { dialog, which ->
val intent = Intent(Settings.ACTION_APPLICATION_DETAILS_SETTINGS)
val uri = Uri.fromParts("package", applicationContext.packageName, null)
intent.data = uri
startActivity(intent)
})
.setNegativeButton("Cancel", DialogInterface.OnClickListener { dialog, which -> finish() })
.setOnCancelListener(dialogInterface)
dialog.setCancelable(false)
dialog.show()
}
var dialogInterface = DialogInterface.OnCancelListener { }
override fun onResume() {
super.onResume()
if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) == PackageManager.PERMISSION_GRANTED) {
createLensEngine()
startLensEngine()
} else {
checkPermission()
}
}
override fun onPause() {
super.onPause()
mPreview!!.stop()
}
override fun onDestroy() {
super.onDestroy()
if (mLensEngine != null) {
mLensEngine!!.release()
}
if (mAnalyzer != null) {
mAnalyzer!!.stop()
}
}
}
Create LensEnginePreview.kt class to find the business logic for lens engine view.
Kotlin:
class LensEnginePreview(private val mContext: Context, attrs: AttributeSet?) : ViewGroup(mContext, attrs) {
private val mSurfaceView: SurfaceView
private var mStartRequested = false
private var mSurfaceAvailable = false
private var mLensEngine: LensEngine? = null
private var mOverlay: GraphicOverlay? = null
@Throws(IOException::class)
fun start(lensEngine: LensEngine?) {
if (lensEngine == null) {
stop()
}
mLensEngine = lensEngine
if (mLensEngine != null) {
mStartRequested = true
startIfReady()
}
}
@Throws(IOException::class)
fun start(lensEngine: LensEngine?, overlay: GraphicOverlay?) {
mOverlay = overlay
this.start(lensEngine)
}
fun stop() {
if (mLensEngine != null) {
mLensEngine!!.close()
}
}
@Throws(IOException::class)
private fun startIfReady() {
if (mStartRequested && mSurfaceAvailable) {
mLensEngine!!.run(mSurfaceView.holder)
if (mOverlay != null) {
val size = mLensEngine!!.displayDimension
val min = Math.min(size.width, size.height)
val max = Math.max(size.width, size.height)
if (isPortraitMode) {
// Swap width and height sizes when in portrait, since it will be rotated by 90 degrees.
mOverlay!!.setCameraInfo(min, max, mLensEngine!!.lensType)
} else {
mOverlay!!.setCameraInfo(max, min, mLensEngine!!.lensType)
}
mOverlay!!.clear()
}
mStartRequested = false
}
}
private inner class SurfaceCallback : SurfaceHolder.Callback {
override fun surfaceCreated(surface: SurfaceHolder) {
mSurfaceAvailable = true
try {
startIfReady()
} catch (e: IOException) {
Log.e(TAG, "Could not start camera source.", e)
}
}
override fun surfaceDestroyed(surface: SurfaceHolder) {
mSurfaceAvailable = false
}
override fun surfaceChanged(holder: SurfaceHolder, format: Int, width: Int, height: Int) {}
}
override fun onLayout(changed: Boolean, left: Int, top: Int, right: Int, bottom: Int) {
var previewWidth = 480
var previewHeight = 360
if (mLensEngine != null) {
val size = mLensEngine!!.displayDimension
if (size != null) {
previewWidth = size.width
previewHeight = size.height
}
}
// Swap width and height sizes when in portrait, since it will be rotated 90 degrees
if (isPortraitMode) {
val tmp = previewWidth
previewWidth = previewHeight
previewHeight = tmp
}
val viewWidth = right - left
val viewHeight = bottom - top
val childWidth: Int
val childHeight: Int
var childXOffset = 0
var childYOffset = 0
val widthRatio = viewWidth.toFloat() / previewWidth.toFloat()
val heightRatio = viewHeight.toFloat() / previewHeight.toFloat()
// To fill the view with the camera preview, while also preserving the correct aspect ratio,
// it is usually necessary to slightly oversize the child and to crop off portions along one
// of the dimensions. We scale up based on the dimension requiring the most correction, and
// compute a crop offset for the other dimension.
if (widthRatio > heightRatio) {
childWidth = viewWidth
childHeight = (previewHeight.toFloat() * widthRatio).toInt()
childYOffset = (childHeight - viewHeight) / 2
} else {
childWidth = (previewWidth.toFloat() * heightRatio).toInt()
childHeight = viewHeight
childXOffset = (childWidth - viewWidth) / 2
}
for (i in 0 until this.childCount) {
// One dimension will be cropped. We shift child over or up by this offset and adjust
// the size to maintain the proper aspect ratio.
getChildAt(i).layout(-1 * childXOffset, -1 * childYOffset,
childWidth - childXOffset,childHeight - childYOffset )
}
try {
startIfReady()
} catch (e: IOException) {
Log.e(TAG, "Could not start camera source.", e)
}
}
private val isPortraitMode: Boolean
get() {
val orientation = mContext.resources.configuration.orientation
if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
return false
}
if (orientation == Configuration.ORIENTATION_PORTRAIT) {
return true
}
Log.d(TAG, "isPortraitMode returning false by default")
return false
}
companion object {
private val TAG = LensEnginePreview::class.java.simpleName
}
init {
mSurfaceView = SurfaceView(mContext)
mSurfaceView.holder.addCallback(SurfaceCallback())
this.addView(mSurfaceView)
}
}
Create HandKeypointGraphic.kt class to find the business logic for hand key point.
Kotlin:
class HandKeypointGraphic(overlay: GraphicOverlay?, private val handKeypoints: MutableList<MLHandKeypoints?>) : GraphicOverlay.Graphic(overlay!!) {
private val rectPaint: Paint
private val idPaintnew: Paint
companion object {
private const val BOX_STROKE_WIDTH = 5.0f
}
private fun translateRect(rect: Rect): Rect {
var left: Float = translateX(rect.left)
var right: Float = translateX(rect.right)
var bottom: Float = translateY(rect.bottom)
var top: Float = translateY(rect.top)
if (left > right) {
val size = left
left = right
right = size
}
if (bottom < top) {
val size = bottom
bottom = top
top = size
}
return Rect(left.toInt(), top.toInt(), right.toInt(), bottom.toInt())
}
init {
val selectedColor = Color.WHITE
idPaintnew = Paint()
idPaintnew.color = Color.GREEN
idPaintnew.textSize = 32f
rectPaint = Paint()
rectPaint.color = selectedColor
rectPaint.style = Paint.Style.STROKE
rectPaint.strokeWidth = BOX_STROKE_WIDTH
}
override fun draw(canvas: Canvas?) {
for (i in handKeypoints.indices) {
val mHandKeypoints = handKeypoints[i]
if (mHandKeypoints!!.getHandKeypoints() == null) {
continue
}
val rect = translateRect(handKeypoints[i]!!.getRect())
canvas!!.drawRect(rect, rectPaint)
for (handKeypoint in mHandKeypoints.getHandKeypoints()) {
if (!(Math.abs(handKeypoint.getPointX() - 0f) == 0f && Math.abs(handKeypoint.getPointY() - 0f) == 0f)) {
canvas!!.drawCircle(translateX(handKeypoint.getPointX().toInt()),
translateY(handKeypoint.getPointY().toInt()), 24f, idPaintnew)
}
}
}
}
}
Create GraphicOverlay.kt class to find the business logic for graphic overlay.
Kotlin:
class GraphicOverlay(context: Context?, attrs: AttributeSet?) : View(context, attrs) {
private val mLock = Any()
private var mPreviewWidth = 0
private var mWidthScaleFactor = 1.0f
private var mPreviewHeight = 0
private var mHeightScaleFactor = 1.0f
private var mFacing = LensEngine.BACK_LENS
private val mGraphics: MutableSet<Graphic> = HashSet()
// Base class for a custom graphics object to be rendered within the graphic overlay. Subclass
// this and implement the [Graphic.draw] method to define the graphics element. Add instances to the overlay using [GraphicOverlay.add].
abstract class Graphic(private val mOverlay: GraphicOverlay) {
// Draw the graphic on the supplied canvas. Drawing should use the following methods to
// convert to view coordinates for the graphics that are drawn:
// 1. [Graphic.scaleX] and [Graphic.scaleY] adjust the size of the supplied value from the preview scale to the view scale.
// 2. [Graphic.translateX] and [Graphic.translateY] adjust the coordinate from the preview's coordinate system to the view coordinate system.
// @param canvas drawing canvas
abstract fun draw(canvas: Canvas?)
// Adjusts a horizontal value of the supplied value from the preview scale to the view scale.
fun scaleX(horizontal: Float): Float {
return horizontal * mOverlay.mWidthScaleFactor
}
// Adjusts a vertical value of the supplied value from the preview scale to the view scale.
fun scaleY(vertical: Float): Float {
return vertical * mOverlay.mHeightScaleFactor
}
// Adjusts the x coordinate from the preview's coordinate system to the view coordinate system.
fun translateX(x: Int): Float {
return if (mOverlay.mFacing == LensEngine.FRONT_LENS) {
mOverlay.width - scaleX(x.toFloat())
} else {
scaleX(x.toFloat())
}
}
// Adjusts the y coordinate from the preview's coordinate system to the view coordinate system.
fun translateY(y: Int): Float {
return scaleY(y.toFloat())
}
}
// Removes all graphics from the overlay.
fun clear() {
synchronized(mLock) { mGraphics.clear() }
postInvalidate()
}
// Adds a graphic to the overlay.
fun add(graphic: Graphic) {
synchronized(mLock) { mGraphics.add(graphic) }
postInvalidate()
}
// Sets the camera attributes for size and facing direction, which informs how to transform image coordinates later.
fun setCameraInfo(previewWidth: Int, previewHeight: Int, facing: Int) {
synchronized(mLock) {
mPreviewWidth = previewWidth
mPreviewHeight = previewHeight
mFacing = facing
}
postInvalidate()
}
// Draws the overlay with its associated graphic objects.
override fun onDraw(canvas: Canvas) {
super.onDraw(canvas)
synchronized(mLock) {
if (mPreviewWidth != 0 && mPreviewHeight != 0) {
mWidthScaleFactor = canvas.width.toFloat() / mPreviewWidth.toFloat()
mHeightScaleFactor = canvas.height.toFloat() / mPreviewHeight.toFloat()
}
for (graphic in mGraphics) {
graphic.draw(canvas)
}
}
}
}
In the StaticHandKeyPointAnalyse.kt we can find the business logic static hand key point analyses.
Kotlin:
class StaticHandKeyPointAnalyse : AppCompatActivity() {
var analyzer: MLHandKeypointAnalyzer? = null
var bitmap: Bitmap? = null
var mutableBitmap: Bitmap? = null
var mlFrame: MLFrame? = null
var imageSelected: ImageView? = null
var picUri: Uri? = null
var pickButton: Button? = null
var analyzeButton:Button? = null
var permissions = arrayOf(Manifest.permission.READ_EXTERNAL_STORAGE, Manifest.permission.WRITE_EXTERNAL_STORAGE,
Manifest.permission.CAMERA)
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_static_hand_key_point_analyse)
pickButton = findViewById(R.id.pick_img)
analyzeButton = findViewById(R.id.analyse_img)
imageSelected = findViewById(R.id.selected_img)
initialiseSettings()
pickButton!!.setOnClickListener(View.OnClickListener {
pickRequiredImage()
})
analyzeButton!!.setOnClickListener(View.OnClickListener {
asynchronouslyStaticHandkey()
})
checkRequiredPermission()
}
private fun checkRequiredPermission() {
if (PackageManager.PERMISSION_GRANTED != ContextCompat.checkSelfPermission(this, Manifest.permission.WRITE_EXTERNAL_STORAGE)
|| PackageManager.PERMISSION_GRANTED != ContextCompat.checkSelfPermission(this, Manifest.permission.READ_EXTERNAL_STORAGE)
|| PackageManager.PERMISSION_GRANTED != ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)) {
ActivityCompat.requestPermissions(this, permissions, 111)
}
}
private fun initialiseSettings() {
val setting = MLHandKeypointAnalyzerSetting.Factory() // MLHandKeypointAnalyzerSetting.TYPE_ALL indicates that all results are returned.
// MLHandKeypointAnalyzerSetting.TYPE_KEYPOINT_ONLY indicates that only hand keypoint information is returned.
// MLHandKeypointAnalyzerSetting.TYPE_RECT_ONLY indicates that only palm information is returned.
.setSceneType(MLHandKeypointAnalyzerSetting.TYPE_ALL) // Set the maximum number of hand regions that can be detected in an image. By default, a maximum of 10 hand regions can be detected.
.setMaxHandResults(1)
.create()
analyzer = MLHandKeypointAnalyzerFactory.getInstance().getHandKeypointAnalyzer(setting)
}
private fun asynchronouslyStaticHandkey() {
val task = analyzer!!.asyncAnalyseFrame(mlFrame)
task.addOnSuccessListener { results ->
val canvas = Canvas(mutableBitmap!!)
val paint = Paint()
paint.color = Color.GREEN
paint.style = Paint.Style.FILL
val mlHandKeypoints = results[0]
for (mlHandKeypoint in mlHandKeypoints.getHandKeypoints()) {
canvas.drawCircle(mlHandKeypoint.pointX, mlHandKeypoint.pointY, 48f, paint)
}
imageSelected!!.setImageBitmap(mutableBitmap)
checkAnalyserForStop()
}.addOnFailureListener { // Detection failure.
checkAnalyserForStop()
}
}
private fun checkAnalyserForStop() {
if (analyzer != null) {
analyzer!!.stop()
}
}
private fun pickRequiredImage() {
val intent = Intent()
intent.type = "image/*"
intent.action = Intent.ACTION_PICK
startActivityForResult(Intent.createChooser(intent, "Select Picture"), 20)
}
override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
super.onActivityResult(requestCode, resultCode, data)
if (requestCode == 20 && resultCode == RESULT_OK && null != data) {
picUri = data.data
val filePathColumn = arrayOf(MediaStore.Images.Media.DATA)
val cursor = contentResolver.query( picUri!!, filePathColumn, null, null, null)
cursor!!.moveToFirst()
cursor.close()
imageSelected!!.setImageURI(picUri)
imageSelected!!.invalidate()
val drawable = imageSelected!!.drawable as BitmapDrawable
bitmap = drawable.bitmap
mutableBitmap = bitmap!!.copy(Bitmap.Config.ARGB_8888, true)
mlFrame = null
mlFrame = MLFrame.fromBitmap(bitmap)
}
}
}
In the activity_main.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".MainActivity">
<Button
android:id="@+id/btn_static"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Static Detection"
android:textAllCaps="false"
android:textSize="18sp"
app:layout_constraintBottom_toTopOf="@+id/btn_live"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintRight_toRightOf="parent"
android:textColor="@color/black"
style="@style/Widget.MaterialComponents.Button.MyTextButton"
app:layout_constraintTop_toTopOf="parent" />
<Button
android:id="@+id/btn_live"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:text="Live Detection"
android:textAllCaps="false"
android:textSize="18sp"
android:layout_marginBottom="150dp"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintLeft_toLeftOf="parent"
android:textColor="@color/black"
style="@style/Widget.MaterialComponents.Button.MyTextButton"
app:layout_constraintRight_toRightOf="parent" />
</androidx.constraintlayout.widget.ConstraintLayout>
In the activity_live_hand_key_point_analyse.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".LiveHandKeyPointAnalyse">
<com.example.mlhandgesturesample.LensEnginePreview
android:id="@+id/hand_preview"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:ignore="MissingClass">
<com.example.mlhandgesturesample.GraphicOverlay
android:id="@+id/hand_overlay"
android:layout_width="match_parent"
android:layout_height="match_parent" />
</com.example.mlhandgesturesample.LensEnginePreview>
<Button
android:id="@+id/handswitch"
android:layout_width="35dp"
android:layout_height="35dp"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintStart_toStartOf="parent"
app:layout_constraintEnd_toEndOf="parent"
android:layout_marginBottom="35dp"
android:background="@drawable/front_back_switch"
android:textOff=""
android:textOn=""
tools:ignore="MissingConstraints" />
</androidx.constraintlayout.widget.ConstraintLayout>
In the activity_static_hand_key_point_analyse.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
xmlns:app="http://schemas.android.com/apk/res-auto"
xmlns:tools="http://schemas.android.com/tools"
android:layout_width="match_parent"
android:layout_height="match_parent"
tools:context=".StaticHandKeyPointAnalyse">
<com.google.android.material.button.MaterialButton
android:id="@+id/pick_img"
android:text="Pick Image"
android:textSize="18sp"
android:textColor="@android:color/black"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textAllCaps="false"
app:layout_constraintTop_toTopOf="parent"
app:layout_constraintBottom_toTopOf="@id/selected_img"
app:layout_constraintLeft_toLeftOf="@id/selected_img"
app:layout_constraintRight_toRightOf="@id/selected_img"
style="@style/Widget.MaterialComponents.Button.MyTextButton"/>
<ImageView
android:visibility="visible"
android:id="@+id/selected_img"
android:layout_width="350dp"
android:layout_height="350dp"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintLeft_toLeftOf="parent"
app:layout_constraintRight_toRightOf="parent"
app:layout_constraintTop_toTopOf="parent" />
<com.google.android.material.button.MaterialButton
android:id="@+id/analyse_img"
android:layout_width="wrap_content"
android:layout_height="wrap_content"
android:textColor="@android:color/black"
android:text="Analyse"
android:textSize="18sp"
android:textAllCaps="false"
app:layout_constraintBottom_toBottomOf="parent"
app:layout_constraintTop_toBottomOf="@id/selected_img"
app:layout_constraintLeft_toLeftOf="@id/selected_img"
app:layout_constraintRight_toRightOf="@id/selected_img"
style="@style/Widget.MaterialComponents.Button.MyTextButton"/>
</androidx.constraintlayout.widget.ConstraintLayout>
Demo
Tips and Tricks
1. Make sure you are already registered as Huawei developer.
2. Set the minSDK version to 21 or later; otherwise you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learned how to find hand key points using the Hand Gesture Recognition feature of Huawei ML Kit. The service provides hand keypoint detection (21 keypoints, including fingertips, knuckles, and wrists) and hand gesture recognition (14 gestures), and both capabilities work on static images and real-time camera streams.
I hope you found this article helpful. If so, please leave likes and comments.
Reference
ML Kit – Hand Gesture Recognition
ML Kit – Training Video
