Practice on Pushing Messages to Devices of Different Manufacturers

With the proliferation of the mobile Internet, push messaging has become a highly effective way for mobile apps to achieve business success. It improves user engagement and stickiness by letting developers reach users in a wide range of scenarios: taking the subway or bus, having a meal in a restaurant, chatting with friends... you name it. Whatever the scenario, a push message lets you "talk" to your users directly and keep them informed.
These benefits, however, can be dampened by a challenge: the variety of mobile phone manufacturers. Each manufacturer usually runs its own push messaging channel, which makes it difficult to deliver an app's push messages uniformly to phones from different manufacturers. There is, of course, an easy way out: send push messages only to phones of a single manufacturer. But this limits your user base and prevents you from obtaining the messaging results you want.
This is why we developers usually need a solution that lets an app push messages to devices of different brands.
I don't know about you, but the solution I found for my app is HMS Core Push Kit. In this article, I will demonstrate how I integrated this kit and used its ability to aggregate third-party push messaging channels to implement push messaging on phones made by different manufacturers, expecting greater user engagement and stickiness. Let's move on to the implementation.
Preparations
Before integrating the SDK, make the following preparations:
1. Sign in to the push messaging platform of a specific manufacturer, create a project and app on the platform, and save the JSON key file of the project. (The requirements may vary depending on the manufacturer, so refer to the specific manufacturer's documentation to learn about their requirements.)
2. Configure the app information in AppGallery Connect, but use the following build dependency instead when configuring the build dependencies:
Code:
dependencies {
    implementation 'com.huawei.hms:push-fcm:6.3.0.304'
}
3. On the platform mentioned in the previous step, click My projects, find the app in the project, and go to Grow > Push Kit > Settings. On the page displayed, click Enable next to Configure other Android-based push, and then copy the key in the saved JSON key file and paste it in the Authentication parameters text box.
Development Procedure
Now, let's go through the development procedure.
1. Disable the automatic initialization of the SDK.
To do so, open the AndroidManifest.xml file and add a <meta-data> element to the <application> element. In this element, the name parameter has a fixed value of push_kit_auto_init_enabled; set the value parameter to false to disable automatic initialization.
Code:
<manifest ...>
    ...
    <application ...>
        <meta-data
            android:name="push_kit_auto_init_enabled"
            android:value="false"/>
        ...
    </application>
    ...
</manifest>
2. Initialize the push capability in either of the following ways:
Method 1: In the <meta-data> element, set the value corresponding to push_kit_proxy_init_enabled to true.
Code:
<application>
    <meta-data
        android:name="push_kit_proxy_init_enabled"
        android:value="true" />
</application>
Method 2: Explicitly call FcmPushProxy.init in the onCreate method of the Application class, as sketched below.
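A minimal sketch of the second method, assuming the FcmPushProxy class exposed by the push-fcm plugin integrated earlier (MyApplication is a hypothetical class name; remember to register it in the manifest's <application> element via android:name):
Code:
import android.app.Application;
import com.huawei.hms.push.plugin.fcm.FcmPushProxy; // package name per the plugin's documentation

public class MyApplication extends Application {
    @Override
    public void onCreate() {
        super.onCreate();
        // Explicitly initialize the FCM proxy instead of relying on
        // the push_kit_proxy_init_enabled meta-data switch.
        FcmPushProxy.init(this);
    }
}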
3. Call the getToken method to apply for a token.
Code:
private void getToken() {
    // Create a thread.
    new Thread() {
        @Override
        public void run() {
            try {
                // Obtain the app ID from the agconnect-services.json file.
                String appId = "your APP_ID";
                // Set tokenScope to HCM.
                String tokenScope = "HCM";
                String token = HmsInstanceId.getInstance(MainActivity.this).getToken(appId, tokenScope);
                Log.i(TAG, "get token: " + token);
                // Check whether the token is empty.
                if (!TextUtils.isEmpty(token)) {
                    sendRegTokenToServer(token);
                }
            } catch (ApiException e) {
                Log.e(TAG, "get token failed, " + e);
            }
        }
    }.start();
}

private void sendRegTokenToServer(String token) {
    Log.i(TAG, "sending token to server. token:" + token);
}
4. Override the onNewToken method.
After the SDK is integrated and initialized, the getToken method may not return a token directly. Instead, the token is delivered through the onNewToken callback.
Code:
@Override
public void onNewToken(String token, Bundle bundle) {
    Log.i(TAG, "onNewToken called, token:" + token);
}
5. Override the onTokenError method.
This method will be called if the token fails to be obtained.
Code:
@Override
public void onTokenError(Exception e, Bundle bundle) {
    int errCode = ((BaseException) e).getErrorCode();
    String errInfo = e.getMessage();
    Log.i(TAG, "onTokenError called, errCode:" + errCode + ", errInfo=" + errInfo);
}
6. Override the onMessageReceived method to receive data messages.
Code:
@Override
public void onMessageReceived(RemoteMessage message) {
    Log.i(TAG, "onMessageReceived is called");
    // Check whether the message is empty.
    if (message == null) {
        Log.e(TAG, "Received message entity is null!");
        return;
    }
    // Obtain the message content.
    Log.i(TAG, "get Data: " + message.getData()
            + "\n getFrom: " + message.getFrom()
            + "\n getTo: " + message.getTo()
            + "\n getMessageId: " + message.getMessageId()
            + "\n getSentTime: " + message.getSentTime()
            + "\n getDataMap: " + message.getDataOfMap()
            + "\n getMessageType: " + message.getMessageType()
            + "\n getTtl: " + message.getTtl()
            + "\n getToken: " + message.getToken());
    boolean judgeWhetherIn10s = false;
    // If the message cannot be processed within 10 seconds, create a job for processing.
    if (judgeWhetherIn10s) {
        startWorkManagerJob(message);
    } else {
        // Process the message within 10 seconds.
        processWithin10s(message);
    }
}

private void startWorkManagerJob(RemoteMessage message) {
    Log.d(TAG, "Start new job processing.");
}

private void processWithin10s(RemoteMessage message) {
    Log.d(TAG, "Processing now.");
}
7. Send downlink messages.
Currently, you can only use REST APIs on the server to send downlink messages through a third-party manufacturer's push messaging channel.
The following is the URL for calling the API using HTTPS POST:
Code:
POST https://push-api.cloud.huawei.com/v1/[appId]/messages:send
The request header looks like the following:
Code:
Content-Type: application/json; charset=UTF-8
Authorization: Bearer CF3Xl2XV6jMKZgqYSZFws9IPlgDvxqOfFSmrlmtkTRupbU2VklvhX9kC9JCnKVSDX2VrDgAPuzvNm3WccUIaDg==
An example of the notification message body is as follows:
Code:
{
    "validate_only": false,
    "message": {
        "android": {
            "notification": {
                "title": "test title",
                "body": "test body",
                "click_action": {
                    "type": 3
                }
            }
        },
        "token": ["pushtoken1"]
    }
}
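For illustration, here is a minimal server-side sketch in Java that sends the message body above. It assumes you have already obtained an OAuth 2.0 access token for your app (the Bearer value in the request header); both the app ID and token values below are placeholders:
Code:
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class PushSender {
    public static void main(String[] args) throws Exception {
        String appId = "your APP_ID";             // app ID from AppGallery Connect
        String accessToken = "your access token"; // OAuth 2.0 access token used as the Bearer value
        String body = "{\"validate_only\":false,\"message\":{\"android\":{\"notification\":"
                + "{\"title\":\"test title\",\"body\":\"test body\",\"click_action\":{\"type\":3}}},"
                + "\"token\":[\"pushtoken1\"]}}";
        URL url = new URL("https://push-api.cloud.huawei.com/v1/" + appId + "/messages:send");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/json; charset=UTF-8");
        conn.setRequestProperty("Authorization", "Bearer " + accessToken);
        conn.setDoOutput(true);
        // Write the JSON body and fire the request.
        try (OutputStream os = conn.getOutputStream()) {
            os.write(body.getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("Response code: " + conn.getResponseCode());
    }
}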
And just like that, my app gained the ability to send push messages to phones of different manufacturers, without any further configuration. Easy-peasy, right?
Conclusion
Today's highly developed mobile Internet has made push messaging an important and effective way for mobile apps to improve user engagement and stickiness. A major obstacle preventing push messaging from playing this role effectively is the highly diversified mobile phone market, crowded with various manufacturers.
In this article, I demonstrated my solution for aggregating the push channels of different manufacturers, which allows my app to push messages in a unified way to devices made by those manufacturers. The whole implementation process is both straightforward and cost-effective, and it improves messaging reach by ensuring that push messages arrive on devices from a wide range of manufacturers.

Related

SMS Retriever API for HMS

Article Introduction
Validating identity is incredibly important in mobile development, and ensuring that new user signups come from real human beings is critical. Developers need reliable ways to confirm the identities of their users to prevent security issues. Verification systems based on phone numbers are implemented in two main ways: either by calling the user or by sending an SMS containing a code that the user must input. In this article, we'll implement the SMS Retriever API with support for both GMS and HMS, so our app can read the verification (OTP) SMS and verify the user automatically.
Automatic and one-tap SMS verification
Verify your users by SMS without making them deal with verification codes. If your app requires a user to enter a mobile number and verifies the user identity using an SMS verification code, you can integrate the ReadSmsManager service so that your app can automatically read the SMS verification code without applying for the SMS reading permission. After the integration, SMS verification codes are automatically filled in for verification, greatly improving user experience.
The process is as follows:
1. HMS Core (APK) sends the SMS message that meets the rules to your app through a directed broadcast.
2. Your app receives the directed broadcast, parses it to obtain the SMS verification code, and displays the code in your app.
3. The user checks whether the verification code is correct and, if so, sends a verification request.
4. Your app sends the user's verification code to your app server for verification.
5. Your app server checks whether the verification code is correct and, if so, returns the verification result to your app.
Pre-Requisites
ReadSmsManager supports the following:
Devices: phones and tablets
Operating system: EMUI 3.0 or later
Android: Android 4.4 or later
SMS message rules
SMS template:
prefix_flag short message verification code is XXXXXX hash_value
In this template:
prefix_flag indicates the prefix of the SMS message, which can be <#>, [#], or \u200b\u200b (\u200b\u200b represents invisible Unicode characters).
short message verification code is indicates the SMS message content, which you can define as needed.
XXXXXX indicates the verification code.
hash_value indicates the hash value generated by the HMS Core SDK based on the app's package name; it uniquely identifies the app.
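For example, with the <#> prefix and a hypothetical 11-character hash value, a conforming message would look like this:
Code:
<#> short message verification code is 123456 A1b2C3d4e5F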
Step 1: Get Hash Value:
You get your hash value by implementing the following class:
Java:
public class hashcodeHMS extends ContextWrapper {
    public static final String TAG = hashcodeHMS.class.getSimpleName();

    public hashcodeHMS(Context context) {
        super(context);
    }

    public MessageDigest getMessageDigest() {
        MessageDigest messageDigest = null;
        try {
            messageDigest = MessageDigest.getInstance("SHA-256");
        } catch (NoSuchAlgorithmException e) {
            Log.e(TAG, "No Such Algorithm.", e);
        }
        return messageDigest;
    }

    public String getSignature(Context context, String packageName) {
        PackageManager packageManager = context.getPackageManager();
        Signature[] signatureArrs;
        try {
            signatureArrs = packageManager.getPackageInfo(packageName, PackageManager.GET_SIGNATURES).signatures;
        } catch (PackageManager.NameNotFoundException e) {
            Log.e(TAG, "Package name inexistent.");
            return "";
        }
        if (null == signatureArrs || 0 == signatureArrs.length) {
            Log.e(TAG, "signature is null.");
            return "";
        }
        Log.e("hashhms =>", signatureArrs[0].toCharsString());
        return signatureArrs[0].toCharsString();
    }

    public String getHashCode(String packageName, MessageDigest messageDigest, String signature) {
        String appInfo = packageName + " " + signature;
        messageDigest.update(appInfo.getBytes(StandardCharsets.UTF_8));
        byte[] hashSignature = messageDigest.digest();
        // Use the first 9 bytes of the digest and Base64-encode them into an 11-character hash.
        hashSignature = Arrays.copyOfRange(hashSignature, 0, 9);
        String base64Hash = Base64.encodeToString(hashSignature, Base64.NO_PADDING | Base64.NO_WRAP);
        base64Hash = base64Hash.substring(0, 11);
        return base64Hash;
    }
}
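Putting the three methods together, a minimal usage sketch (run it once during development to print the hash value that you then append to your SMS template; context is any available Context) might look like this:
Code:
hashcodeHMS helper = new hashcodeHMS(context);
String packageName = context.getPackageName();
String signature = helper.getSignature(context, packageName);
MessageDigest digest = helper.getMessageDigest();
String hashValue = helper.getHashCode(packageName, digest, signature);
Log.i(hashcodeHMS.TAG, "app hash value: " + hashValue);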
Step 2: Start the SMS consent service via ReadSmsManager:
Code:
// Pass the calling Activity (here, otp_read).
val task = ReadSmsManager.startConsent(this@otp_read, null)
task.addOnCompleteListener { it ->
    if (it.isSuccessful) {
        // The service is enabled successfully. Perform other operations as needed.
        tv_title.text = "Waiting for the OTP"
        Toast.makeText(this, "SMS Retriever starts", Toast.LENGTH_LONG).show()
    } else {
        tv_title.text = "ReadSmsManager did not work"
        Toast.makeText(this, "SMS Retriever did not start", Toast.LENGTH_LONG).show()
    }
}
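Since the goal is to support both GMS and HMS, note that the GMS counterpart of this step is the Play Services SmsRetriever client. A minimal Java sketch, assuming the com.google.android.gms:play-services-auth dependency is added:
Code:
import com.google.android.gms.auth.api.phone.SmsRetriever;
import com.google.android.gms.auth.api.phone.SmsRetrieverClient;
import com.google.android.gms.tasks.Task;

SmsRetrieverClient client = SmsRetriever.getClient(context);
Task<Void> task = client.startSmsRetriever();
task.addOnSuccessListener(aVoid -> Log.i("SmsRetriever", "GMS SMS Retriever started"));
task.addOnFailureListener(e -> Log.e("SmsRetriever", "GMS SMS Retriever failed to start", e));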
Step 3: Prepare your Broadcast Receiver:
Java:
class MySMSBrodcastReceiverHms : BroadcastReceiver() {
    private var otpReceiver: OTPReceiveListenerHMS? = null

    fun initOTPListener(receiver: OTPReceiveListenerHMS) {
        this.otpReceiver = receiver
        Log.e("Firas", "initOTPListener: Done")
    }

    override fun onReceive(context: Context, intent: Intent) {
        intent?.let { it ->
            val bundle = it.extras
            bundle?.let { itBundle ->
                if (ReadSmsConstant.READ_SMS_BROADCAST_ACTION == it.action) {
                    val status: Status? = itBundle.getParcelable(ReadSmsConstant.EXTRA_STATUS)
                    if (status?.statusCode == CommonStatusCodes.TIMEOUT) {
                        // The service has timed out and no SMS message that meets the requirements is read. The service process ends.
                        Log.i("Firas", "onReceive: TIMEOUT")
                        otpReceiver!!.onOTPTimeOutHMS()
                    } else if (status?.statusCode == CommonStatusCodes.SUCCESS) {
                        if (bundle.containsKey(ReadSmsConstant.EXTRA_SMS_MESSAGE)) {
                            // An SMS message that meets the requirement is read. The service process ends.
                            var otp: String = bundle.getString(ReadSmsConstant.EXTRA_SMS_MESSAGE) as String
                            Log.i("Firas", "onReceive: ${otp}")
                            otp = otp.replace("[#] short message verification code is ", "").split(":".toRegex()).dropLastWhile { it.isEmpty() }.toTypedArray()[0]
                            otp = otp.split(" ").dropLastWhile { it.isEmpty() }.toTypedArray()[0]
                            Log.i("Firas", "onReceive: ${otp}")
                            otpReceiver!!.onOTPReceivedHMS(otp)
                        }
                    }
                }
            }
        }
    }

    interface OTPReceiveListenerHMS {
        fun onOTPReceivedHMS(otp: String)
        fun onOTPTimeOutHMS()
    }
}
Add the broadcast receiver to the manifest:
XML:
<receiver
    android:name=".MySMSBrodcastReceiverHms"
    android:exported="true">
    <intent-filter>
        <action android:name="com.huawei.hms.support.sms.common.ReadSmsConstant.READ_SMS_BROADCAST_ACTION" />
    </intent-filter>
</receiver>
Step 4: Implement the OTP listener in your Activity:
Implement the OTPReceiveListenerHMS interface from our MySMSBrodcastReceiverHms:
Code:
class otp_read : AppCompatActivity(), MySMSBrodcastReceiverHms.OTPReceiveListenerHMS {}
Then create the receiver and initialize the OTP listener:
Code:
var smsBroadcast = MySMSBrodcastReceiverHms()
smsBroadcast.initOTPListener(this)
Implement the functions of the interface:
Code:
override fun onOTPReceivedHMS(otp: String) {
    Toast.makeText(this, " onOTPReceived", Toast.LENGTH_SHORT).show()
    if (smsBroadcast != null) {
        // Unregister with the same mechanism used for registration (see Step 5).
        applicationContext.unregisterReceiver(smsBroadcast)
    }
    Toast.makeText(this, otp, Toast.LENGTH_SHORT).show()
    tv_title.text = otp
    otp_view.setText(otp)
    Log.e("OTP Received", otp)
}

override fun onOTPTimeOutHMS() {
    tv_title.text = "Timeout"
    Toast.makeText(this, " SMS retriever API Timeout", Toast.LENGTH_SHORT).show()
}
Step 5: Register your receiver:
Code:
val intentFilter = IntentFilter()
intentFilter.addAction(ReadSmsConstant.READ_SMS_BROADCAST_ACTION)
applicationContext.registerReceiver(smsBroadcast, intentFilter)
Step 6: Run the application
Conclusion
This feature helps users get verified faster than with the regular flow and prevents typos when entering the OTP. Keep in mind that it works only if a SIM card is present in the phone; otherwise, the broadcast receiver will not be triggered. It also improves privacy: until now, any app with the READ SMS permission could access personal data such as messages. Usually, when we grant an Android app SMS access to auto-fill an OTP, it is better to revoke that permission once the process is completed (otherwise the app keeps access to each and every message on the phone), but how many users actually do that? With the SMS Retriever API, apps won't need to ask for the READ SMS permission to auto-fill an OTP at all.
Tips & Tricks
Remove AppSignatureHelper class from your project before going to production.
Debug and release APKs might have different hash codes; make sure you use the hash code from the release APK.
References
Automatically Reading an SMS Verification Code Without User Authorization:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/readsmsmanager-0000001050050861
Automatically Reading an SMS Verification Code After User Authorization:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/authotize-to-read-sms-0000001061481826#EN-US_TOPIC_0000001126286229__section1186673334918
ReadSmsManager Reference:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides/readsmsmanager-0000001050050861
Obtaining the Hash Value Reference:
https://developer.huawei.com/consumer/en/doc/development/HMSCore-Guides-V5/obtaining-hash-value-0000001050194405-V5

Audio File Transcription, for Super-Efficient Recording

Introduction
Converting audio into text has a wide range of applications: generating video subtitles, taking meeting minutes, and writing interview transcripts. HUAWEI ML Kit's service makes doing so easier than ever before, converting audio files into meticulously accurate text, with correct punctuation as well!
Actual Effects
Build and run an app with audio file transcription integrated. Then, select a local audio file and convert it into text.
Development Preparations
For details about configuring the Huawei Maven repository and integrating the audio file transcription SDK, please refer to the Development Guide of ML Kit on HUAWEI Developers.
Declaring Permissions in the AndroidManifest.xml File
Open the AndroidManifest.xml file in the main folder. Add the network connection, network status access, and storage read permissions before the <application> element.
Please note that these permissions also need to be requested dynamically at runtime. Otherwise, a Permission Denied error will be reported.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Development Procedure
Creating and Initializing an Audio File Transcription Engine
Override onCreate in MainActivity to create an audio transcription engine.
Code:
private MLRemoteAftEngine mAnalyzer;

mAnalyzer = MLRemoteAftEngine.getInstance();
mAnalyzer.init(getApplicationContext());
mAnalyzer.setAftListener(mAsrListener);
Use MLRemoteAftSetting to configure the engine. The service currently supports Mandarin Chinese and English, that is, the options of mLanguage are zh and en.
Code:
MLRemoteAftSetting setting = new MLRemoteAftSetting.Factory()
        .setLanguageCode(mLanguage)
        .enablePunctuation(true)
        .enableWordTimeOffset(true)
        .enableSentenceTimeOffset(true)
        .create();
enablePunctuation indicates whether to automatically punctuate the converted text. The default value is false; if this parameter is set to true, the converted text is automatically punctuated.
enableWordTimeOffset indicates whether to generate the text transcription result of each audio segment with the corresponding offset. The default value is false. You need to set this parameter only when the audio duration is less than 1 minute.
If this parameter is set to true, the offset information is returned along with the text transcription result. This applies to the transcription of short audio files with a duration of 1 minute or shorter.
If this parameter is set to false, only the text transcription result of the audio file will be returned.
enableSentenceTimeOffset indicates whether to output the offset of each sentence in the audio file. The default value is false.
If this parameter is set to true, the offset information is returned along with the text transcription result.
If this parameter is set to false, only the text transcription result of the audio file will be returned.
Creating a Listener Callback to Process the Transcription Result
Code:
private MLRemoteAftListener mAsrListener = new MLRemoteAftListener()
After the listener is initialized, call startTask in AftListener to start the transcription.
Code:
@Override
public void onInitComplete(String taskId, Object ext) {
    Log.i(TAG, "MLRemoteAftListener onInitComplete" + taskId);
    mAnalyzer.startTask(taskId);
}
Override onUploadProgress, onEvent, and onResult in MLRemoteAftListener.
Code:
@Override
public void onUploadProgress(String taskId, double progress, Object ext) {
    Log.i(TAG, " MLRemoteAftListener onUploadProgress is " + taskId + " " + progress);
}

@Override
public void onEvent(String taskId, int eventId, Object ext) {
    Log.e(TAG, "MLAsrCallBack onEvent" + eventId);
    if (MLAftEvents.UPLOADED_EVENT == eventId) { // The file is uploaded successfully.
        showConvertingDialog();
        startQueryResult(); // Obtain the transcription result.
    }
}

@Override
public void onResult(String taskId, MLRemoteAftResult result, Object ext) {
    Log.i(TAG, "onResult get " + taskId);
    if (result != null) {
        Log.i(TAG, "onResult isComplete " + result.isComplete());
        if (!result.isComplete()) {
            return;
        }
        if (null != mTimerTask) {
            mTimerTask.cancel();
        }
        if (result.getText() != null) {
            Log.e(TAG, result.getText());
            dismissTransferringDialog();
            showCovertResult(result.getText());
        }
        List<MLRemoteAftResult.Segment> segmentList = result.getSegments();
        if (segmentList != null && segmentList.size() != 0) {
            for (MLRemoteAftResult.Segment segment : segmentList) {
                Log.e(TAG, "MLAsrCallBack segment text is : " + segment.getText() + ", startTime is : " + segment.getStartTime() + ". endTime is : " + segment.getEndTime());
            }
        }
        List<MLRemoteAftResult.Segment> words = result.getWords();
        if (words != null && words.size() != 0) {
            for (MLRemoteAftResult.Segment word : words) {
                Log.e(TAG, "MLAsrCallBack word text is : " + word.getText() + ", startTime is : " + word.getStartTime() + ". endTime is : " + word.getEndTime());
            }
        }
        List<MLRemoteAftResult.Segment> sentences = result.getSentences();
        if (sentences != null && sentences.size() != 0) {
            for (MLRemoteAftResult.Segment sentence : sentences) {
                Log.e(TAG, "MLAsrCallBack sentence text is : " + sentence.getText() + ", startTime is : " + sentence.getStartTime() + ". endTime is : " + sentence.getEndTime());
            }
        }
    }
}
Processing the Transcription Result in Polling Mode
After the transcription is completed, call getLongAftResult to obtain the transcription result. Process the obtained result every 10 seconds.
Code:
private void startQueryResult() {
    Timer mTimer = new Timer();
    mTimerTask = new TimerTask() {
        @Override
        public void run() {
            getResult();
        }
    };
    // Process the obtained long speech transcription result every 10s.
    mTimer.schedule(mTimerTask, 5000, 10000);
}

private void getResult() {
    Log.e(TAG, "getResult");
    mAnalyzer.setAftListener(mAsrListener);
    mAnalyzer.getLongAftResult(mLongTaskId);
}
References:
To learn more, please visit:
HUAWEI Developers official website
Development Guide
Reddit to join developer discussions
GitHub or Gitee to download the demo and sample code
Stack Overflow to solve integration problems
Follow our official account for the latest HMS Core-related news and updates.
Original Source

How to Convert Audio File into Text With Machine Learning

Introduction
Converting audio into text has a wide range of applications: generating video subtitles, taking meeting minutes, and writing interview transcripts. Machine learning makes doing so easier than ever before, converting audio files into meticulously accurate text, with correct punctuation as well!
Actual Effects
Build and run an app with audio file transcription integrated. Then, select a local audio file and convert it into text.
Development Preparations
For details about configuring the Huawei Maven repository and integrating the audio file transcription SDK, please refer to the Development Guide of ML Kit on HUAWEI Developers.
Declaring Permissions in the AndroidManifest.xml File
Open the AndroidManifest.xml file in the main folder. Add the network connection, network status access, and storage read permissions before the <application> element.
Please note that these permissions also need to be requested dynamically at runtime. Otherwise, a Permission Denied error will be reported.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
Development Procedure
Creating and Initializing an Audio File Transcription Engine
Override onCreate in MainActivity to create an audio transcription engine.
Code:
private MLRemoteAftEngine mAnalyzer;
mAnalyzer = MLRemoteAftEngine.getInstance();
mAnalyzer.init(getApplicationContext());
mAnalyzer.setAftListener(mAsrListener);
Use MLRemoteAftSetting to configure the engine. The service currently supports Mandarin Chinese and English, that is, the options of mLanguage are zh and en.
Code:
MLRemoteAftSetting setting = new MLRemoteAftSetting.Factory()
        .setLanguageCode(mLanguage)
        .enablePunctuation(true)
        .enableWordTimeOffset(true)
        .enableSentenceTimeOffset(true)
        .create();
enablePunctuation indicates whether to automatically punctuate the converted text. The default value is false; if this parameter is set to true, the converted text is automatically punctuated.
enableWordTimeOffset indicates whether to generate the text transcription result of each audio segment with the corresponding offset. The default value is false. You need to set this parameter only when the audio duration is less than 1 minute.
If this parameter is set to true, the offset information is returned along with the text transcription result. This applies to the transcription of short audio files with a duration of 1 minute or shorter. If this parameter is set to false, only the text transcription result of the audio file will be returned.
enableSentenceTimeOffset indicates whether to output the offset of each sentence in the audio file. The default value is false.
If this parameter is set to true, the offset information is returned along with the text transcription result. If this parameter is set to false, only the text transcription result of the audio file will be returned.
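One step the snippets here do not show is submitting the audio file itself. A minimal sketch, assuming a hypothetical local file path and the engine's longRecognize method (per the ML Kit guide, it returns the task ID, stored here in mLongTaskId for the polling step later):
Code:
// Submit a local audio file for transcription (hypothetical path).
Uri uri = Uri.fromFile(new File("/sdcard/Music/sample.wav"));
// longRecognize returns the task ID that the listener callbacks receive.
mLongTaskId = mAnalyzer.longRecognize(uri, setting);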
Creating a Listener Callback to Process the Transcription Result
Code:
private MLRemoteAftListener mAsrListener = new MLRemoteAftListener()
After the listener is initialized, call startTask in AftListener to start the transcription.
Code:
@Override
public void onInitComplete(String taskId, Object ext) {
    Log.i(TAG, "MLRemoteAftListener onInitComplete" + taskId);
    mAnalyzer.startTask(taskId);
}
Override onUploadProgress, onEvent, and onResult in MLRemoteAftListener.
Code:
@Override
public void onUploadProgress(String taskId, double progress, Object ext) {
    Log.i(TAG, " MLRemoteAftListener onUploadProgress is " + taskId + " " + progress);
}

@Override
public void onEvent(String taskId, int eventId, Object ext) {
    Log.e(TAG, "MLAsrCallBack onEvent" + eventId);
    if (MLAftEvents.UPLOADED_EVENT == eventId) { // The file is uploaded successfully.
        showConvertingDialog();
        startQueryResult(); // Obtain the transcription result.
    }
}

@Override
public void onResult(String taskId, MLRemoteAftResult result, Object ext) {
    Log.i(TAG, "onResult get " + taskId);
    if (result != null) {
        Log.i(TAG, "onResult isComplete " + result.isComplete());
        if (!result.isComplete()) {
            return;
        }
        if (null != mTimerTask) {
            mTimerTask.cancel();
        }
        if (result.getText() != null) {
            Log.e(TAG, result.getText());
            dismissTransferringDialog();
            showCovertResult(result.getText());
        }
        List<MLRemoteAftResult.Segment> segmentList = result.getSegments();
        if (segmentList != null && segmentList.size() != 0) {
            for (MLRemoteAftResult.Segment segment : segmentList) {
                Log.e(TAG, "MLAsrCallBack segment text is : " + segment.getText() + ", startTime is : " + segment.getStartTime() + ". endTime is : " + segment.getEndTime());
            }
        }
        List<MLRemoteAftResult.Segment> words = result.getWords();
        if (words != null && words.size() != 0) {
            for (MLRemoteAftResult.Segment word : words) {
                Log.e(TAG, "MLAsrCallBack word text is : " + word.getText() + ", startTime is : " + word.getStartTime() + ". endTime is : " + word.getEndTime());
            }
        }
        List<MLRemoteAftResult.Segment> sentences = result.getSentences();
        if (sentences != null && sentences.size() != 0) {
            for (MLRemoteAftResult.Segment sentence : sentences) {
                Log.e(TAG, "MLAsrCallBack sentence text is : " + sentence.getText() + ", startTime is : " + sentence.getStartTime() + ". endTime is : " + sentence.getEndTime());
            }
        }
    }
}
Processing the Transcription Result in Polling Mode
After the transcription is completed, call getLongAftResult to obtain the transcription result. Process the obtained result every 10 seconds.
Code:
private void startQueryResult() {
    Timer mTimer = new Timer();
    mTimerTask = new TimerTask() {
        @Override
        public void run() {
            getResult();
        }
    };
    // Process the obtained long speech transcription result every 10s.
    mTimer.schedule(mTimerTask, 5000, 10000);
}

private void getResult() {
    Log.e(TAG, "getResult");
    mAnalyzer.setAftListener(mAsrListener);
    mAnalyzer.getLongAftResult(mLongTaskId);
}
References:
For more details, you can go to:
ML Kit official website
ML Kit Development Documentation page, to find the documents you need
Reddit to join our developer discussion
GitHub to download ML Kit sample codes
Stack Overflow to solve any integration problems

Integration of Huawei Push Kit in Book Reading Android app (Kotlin) - Part 3

Introduction
In this article, we will learn how to integrate Huawei Push Kit into a Book Reading app to send push message notifications to users' phones from AppGallery Connect. Push notifications offer a great way to increase your application's user engagement and boost your retention rates by sending meaningful messages or by informing users about your application. These messages can be sent at any time, even if your app is not running at the time. I will provide a series of articles on this Book Reading app; in upcoming articles, I will integrate other Huawei kits.
Push Kit
Huawei Push Kit is a messaging service developed by Huawei that enables developers to send messages to apps on users' devices in real time. Push Kit supports two types of messages: notification messages and data messages. You can send notification and data messages to your users from your server using the Push Kit APIs or directly from the AppGallery Push Kit console.
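For reference, while this article sends notification messages from the console, a data message sent through the Push Kit REST API simply carries a data string instead of a notification block. A hedged example body (the token and data fields are placeholders):
Code:
{
    "validate_only": false,
    "message": {
        "data": "{'title':'New chapter available','bookId':'102'}",
        "token": ["pushtoken1"]
    }
}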
AppGallery Connect
Find the Push Kit message service in AppGallery connect dashboard.
Choose My Projects > Grow > Push Kit, and click Enable now.
Follow the steps to send the notification message to device from AppGallery Connect, Sending a Notification Message.
Requirements
1. Any operating system (MacOS, Linux and Windows).
2. Must have a Huawei phone with HMS 4.0.0.300 or later.
3. Must have a laptop or desktop with Android Studio, JDK 1.8, SDK Platform 26, and Gradle 4.6 or later installed.
4. Minimum API Level 24 is required.
5. Required EMUI 9.0.0 and later version devices.
How to integrate HMS Dependencies
1. First register as Huawei developer and complete identity verification in Huawei developers website, refer to register a Huawei ID.
2. Create a project in android studio, refer Creating an Android Studio Project.
3. Generate a SHA-256 certificate fingerprint.
4. To generate the SHA-256 certificate fingerprint, in the upper right corner of the Android Studio project, click Gradle, choose Project Name > Tasks > android, and then click signingReport, as follows.
Note: Project Name depends on the user created name.
5. Create an App in AppGallery Connect.
6. Download the agconnect-services.json file from App information, then copy and paste it into the Android project's app directory, as follows.
7. Enter SHA-256 certificate fingerprint and click Save button, as follows.
Note: Above steps from Step 1 to 7 is common for all Huawei Kits.
8. Click Manage APIs tab and enable Push Kit.
9. Add the below maven URL in build.gradle(Project) file under the repositories of buildscript, dependencies and allprojects, refer Add Configuration.
Code:
maven { url 'https://developer.huawei.com/repo/' }
classpath 'com.huawei.agconnect:agcp:1.6.0.300'
10. Add the below plugin and dependencies in build.gradle(Module) file.
Code:
apply plugin: 'com.huawei.agconnect'

// Huawei AGC
implementation 'com.huawei.agconnect:agconnect-core:1.6.0.300'
// Huawei Push Kit
implementation 'com.huawei.hms:push:6.3.0.302'
// PDF Viewer
implementation 'com.github.barteksc:android-pdf-viewer:2.8.2'
11. Now Sync the gradle.
12. Add the required permission to the AndroidManifest.xml file.
XML:
<!-- Push Kit -->
<uses-permission android:name="android.permission.INTERNET" />

<service
    android:name=".PushService"
    android:exported="false">
    <intent-filter>
        <action android:name="com.huawei.push.action.MESSAGING_EVENT" />
    </intent-filter>
</service>
Let us move to development
I have created a project on Android studio with empty activity let us start coding.
In WebViewActivity.kt, we load the PDF document in a web view.
Code:
class WebViewActivity : AppCompatActivity() {
    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_web_view)
        webView.webViewClient = WebViewClient()
        webView.settings.setSupportZoom(true)
        webView.settings.javaScriptEnabled = true
        val url = getPdfUrl()
        webView.loadUrl("https://docs.google.com/gview?embedded=true&url=$url")
    }

    companion object {
        fun getPdfUrl(): String {
            return "https://mindorks.s3.ap-south-1.amazonaws.com/courses/MindOrks_Android_Online_Professional_Course-Syllabus.pdf"
        }
    }
}
Create PushService.kt class to send the push notification to device.
Code:
class PushService : HmsMessageService() {
// When an app calls the getToken method to apply for a token from the server,
// if the server does not return the token during current method calling, the server can return the token through this method later.
// This method callback must be completed in 10 seconds. Otherwise, you need to start a new Job for callback processing.
// @param token token
override fun onNewToken(token: String?) {
Log.i(TAG, "received refresh token:$token")
// send the token to your app server.
if (!token.isNullOrEmpty()) {
// This method callback must be completed in 10 seconds. Otherwise, you need to start a new Job for callback processing.
refreshedTokenToServer(token)
}
val intent = Intent()
intent.action = CODELABS_ACTION
intent.putExtra("method", "onNewToken")
intent.putExtra("msg", "onNewToken called, token: $token")
sendBroadcast(intent)
}
private fun refreshedTokenToServer(token: String) {
Log.i(TAG, "sending token to server. token:$token")
}
// This method is used to receive downstream data messages.
// This method callback must be completed in 10 seconds. Otherwise, you need to start a new Job for callback processing.
// @param message RemoteMessage
override fun onMessageReceived(message: RemoteMessage?) {
Log.i(TAG, "onMessageReceived is called")
if (message == null) {
Log.e(TAG, "Received message entity is null!")
return
}
// getCollapseKey() Obtains the classification identifier (collapse key) of a message.
// getData() Obtains valid content data of a message.
// getMessageId() Obtains the ID of a message.
// getMessageType() Obtains the type of a message.
// getNotification() Obtains the notification data instance from a message.
// getOriginalUrgency() Obtains the original priority of a message.
// getSentTime() Obtains the time when a message is sent from the server.
// getTo() Obtains the recipient of a message.
Log.i(TAG, """getCollapseKey: ${message.collapseKey}
getData: ${message.data}
getFrom: ${message.from}
getTo: ${message.to}
getMessageId: ${message.messageId}
getMessageType: ${message.messageType}
getSendTime: ${message.sentTime}
getTtl: ${message.ttl}
getSendMode: ${message.sendMode}
getReceiptMode: ${message.receiptMode}
getOriginalUrgency: ${message.originalUrgency}
getUrgency: ${message.urgency}
getToken: ${message.token}""".trimIndent())
// getBody() Obtains the displayed content of a message
// getTitle() Obtains the title of a message
// getTitleLocalizationKey() Obtains the key of the displayed title of a notification message
// getTitleLocalizationArgs() Obtains variable parameters of the displayed title of a message
// getBodyLocalizationKey() Obtains the key of the displayed content of a message
// getBodyLocalizationArgs() Obtains variable parameters of the displayed content of a message
// getIcon() Obtains icons from a message
// getSound() Obtains the sound from a message
// getTag() Obtains the tag from a message for message overwriting
// getColor() Obtains the colors of icons in a message
// getClickAction() Obtains actions triggered by message tapping
// getChannelId() Obtains IDs of channels that support the display of messages
// getImageUrl() Obtains the image URL from a message
// getLink() Obtains the URL to be accessed from a message
// getNotifyId() Obtains the unique ID of a message
val notification = message.notification
if (notification != null) {
Log.i(TAG, """
getTitle: ${notification.title}
getTitleLocalizationKey: ${notification.titleLocalizationKey}
getTitleLocalizationArgs: ${Arrays.toString(notification.titleLocalizationArgs)}
getBody: ${notification.body}
getBodyLocalizationKey: ${notification.bodyLocalizationKey}
getBodyLocalizationArgs: ${Arrays.toString(notification.bodyLocalizationArgs)}
getIcon: ${notification.icon}
getImageUrl: ${notification.imageUrl}
getSound: ${notification.sound}
getTag: ${notification.tag}
getColor: ${notification.color}
getClickAction: ${notification.clickAction}
getIntentUri: ${notification.intentUri}
getChannelId: ${notification.channelId}
getLink: ${notification.link}
getNotifyId: ${notification.notifyId}
isDefaultLight: ${notification.isDefaultLight}
isDefaultSound: ${notification.isDefaultSound}
isDefaultVibrate: ${notification.isDefaultVibrate}
getWhen: ${notification.`when`}
getLightSettings: ${Arrays.toString(notification.lightSettings)}
isLocalOnly: ${notification.isLocalOnly}
getBadgeNumber: ${notification.badgeNumber}
isAutoCancel: ${notification.isAutoCancel}
getImportance: ${notification.importance}
getTicker: ${notification.ticker}
getVibrateConfig: ${notification.vibrateConfig}
getVisibility: ${notification.visibility}""".trimIndent())
showNotification(notification.title,notification.body)
}
val intent = Intent()
intent.action = CODELABS_ACTION
intent.putExtra("method", "onMessageReceived")
intent.putExtra("msg", "onMessageReceived called, message id:" + message.messageId + ", payload data:" + message.data)
sendBroadcast(intent)
val judgeWhetherIn10s = false
// If the messages are not processed in 10 seconds, the app needs to use WorkManager for processing.
if (judgeWhetherIn10s) {
startWorkManagerJob(message)
} else {
// Process message within 10s
processWithin10s(message)
}
}
private fun showNotification(title: String?, body: String?) {
val intent = Intent(this, WebViewActivity::class.java)
intent.putExtra("URL", "Provide link here")
intent.addFlags(Intent.FLAG_ACTIVITY_CLEAR_TOP)
// FLAG_IMMUTABLE is required on Android 12 (API level 31) and later.
val pendingIntent = PendingIntent.getActivity(this, 0, intent,
    PendingIntent.FLAG_ONE_SHOT or PendingIntent.FLAG_IMMUTABLE)
val soundUri = RingtoneManager.getDefaultUri(RingtoneManager.TYPE_NOTIFICATION)
val notificationBuilder = NotificationCompat.Builder(this)
.setSmallIcon(R.drawable.sym_def_app_icon)
.setContentTitle(title)
.setContentText(body)
.setAutoCancel(true)
.setSound(soundUri)
.setContentIntent(pendingIntent)
val notificationManager = getSystemService(Context.NOTIFICATION_SERVICE) as NotificationManager
notificationManager.notify(0, notificationBuilder.build())
}
private fun startWorkManagerJob(message: RemoteMessage?) {
Log.d(TAG, "Start new Job processing.")
}
private fun processWithin10s(message: RemoteMessage?) {
Log.d(TAG, "Processing now.")
}
override fun onMessageSent(msgId: String?) {
Log.i(TAG, "onMessageSent called, Message id:$msgId")
val intent = Intent()
intent.action = CODELABS_ACTION
intent.putExtra("method", "onMessageSent")
intent.putExtra("msg", "onMessageSent called, Message id:$msgId")
sendBroadcast(intent)
}
override fun onSendError(msgId: String?, exception: Exception?) {
Log.i(TAG, "onSendError called, message id:$msgId, ErrCode:${(exception as SendException).errorCode}, " +
"description:${exception.message}")
val intent = Intent()
intent.action = CODELABS_ACTION
intent.putExtra("method", "onSendError")
intent.putExtra("msg", "onSendError called, message id:$msgId, ErrCode:${exception.errorCode}, " +
"description:${exception.message}")
sendBroadcast(intent)
}
override fun onTokenError(e: Exception) {
super.onTokenError(e)
}
private fun getToken() {
showLog("getToken:begin")
object : Thread() {
override fun run() {
try {
// read from agconnect-services.json
val appId = "Your app id"
val token = HmsInstanceId.getInstance(this@PushService).getToken(appId, "HCM")
Log.i(TAG, "get token:$token")
if (!TextUtils.isEmpty(token)) {
sendRegTokenToServer(token)
}
showLog("get token:$token")
} catch (e: ApiException) {
Log.e(TAG, "get token failed, $e")
showLog("get token failed, $e")
}
}
}.start()
}
fun showLog(log: String?) {
    // This service has no UI of its own; in the codelab this method updates
    // a TextView in the Activity. Here we simply write the message to Logcat.
    Log.i(TAG, log ?: "")
}
private fun sendRegTokenToServer(token: String?) {
Log.i(TAG, "sending token to server. token:$token")
}
companion object {
private const val TAG: String = "PushDemoLog"
private const val CODELABS_ACTION: String = "com.huawei.codelabpush.action"
}
}
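One caveat about showNotification above: starting from Android 8.0 (API level 26), a notification is not displayed unless it is posted to a notification channel, and the channel ID must be passed to NotificationCompat.Builder. A minimal Java sketch, where "book_channel" is a hypothetical channel ID:
Code:
// Create the channel once (for example, in onCreate) before posting notifications.
if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    NotificationChannel channel = new NotificationChannel(
            "book_channel", "Book updates", NotificationManager.IMPORTANCE_DEFAULT);
    NotificationManager manager =
            (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);
    manager.createNotificationChannel(channel);
}
// Then build the notification with the same channel ID:
// new NotificationCompat.Builder(this, "book_channel") ...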
In the activity_web_view.xml we can create the UI screen.
XML:
<?xml version="1.0" encoding="utf-8"?>
<androidx.constraintlayout.widget.ConstraintLayout xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:app="http://schemas.android.com/apk/res-auto"
    xmlns:tools="http://schemas.android.com/tools"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    tools:context=".WebViewActivity">

    <WebView
        android:id="@+id/webView"
        android:layout_width="match_parent"
        android:layout_height="match_parent" />

</androidx.constraintlayout.widget.ConstraintLayout>
In the log_layout.xml we can create the UI screen.
Code:
<?xml version="1.0" encoding="utf-8"?>
<ScrollView android:id="@+id/sv_log"
    android:overScrollMode="never"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    xmlns:android="http://schemas.android.com/apk/res/android">

    <TextView
        android:id="@+id/tv_log"
        android:layout_width="match_parent"
        android:layout_height="wrap_content" />

</ScrollView>
Demo
Tips and Tricks
1. Make sure you are already registered as a Huawei developer.
2. Set the minSDK version to 24 or later; otherwise, you will get an AndroidManifest merge issue.
3. Make sure you have added the agconnect-services.json file to app folder.
4. Make sure you have added SHA-256 fingerprint without fail.
5. Make sure all the dependencies are added properly.
Conclusion
In this article, we have learned how to integrate Huawei Push Kit into a Book Reading app to send push message notifications to users' phones from AppGallery Connect. Push notifications offer a great way to increase your application's user engagement and boost your retention rates by sending meaningful messages or by informing users about your application. These messages can be sent at any time, even if your app is not running at the time.
I hope you have found this article helpful. If so, please leave likes and comments.
Reference
Push Kit – Document
Push Kit – Training Video

How to Use Geofences for Precise Audience Messaging

Precise messaging is an important way for mobile apps to retain users and is usually achieved by segmenting users into different groups according to their preferences and then adopting different messaging policies for each user segment. However, if you want to push messages to users based on their precise locations, in-depth customization is usually required since most available third-party messaging services cannot narrow the target audience down to a specific business area or a small area. With geofences, this issue can be effectively resolved. A geofence is a set of virtual boundaries that define a given area on a map. When a user's device enters or leaves the geofence, or stays in the geofence for a specific amount of time, messages and notifications can be automatically sent to an app on the user's device. Geofence and messaging capabilities can work together to precisely send messages to target audiences in a specified area.
For example, suppose that a travel app wants to promote its ticket booking service in Paris. To do so, the app can create geofences for popular scenic spots in Paris. When a target user arrives at a scenic spot during a specified time range, the app will send a promotion message such as "You have received a coupon for the XXX. Tap here to claim the coupon." to the user, increasing their willingness to buy a ticket.
Implementation
You can carry out precise messaging to specified target audiences by using the geofence capability of HMS Core Location Kit in conjunction with the message pushing capability of HMS Core Push Kit. By creating a geofence for a specified area, the app can detect the user's status, for example, when they enter, leave, or stay in this area. Once the messaging condition is met, the app on the user's device will receive a push message in real time. The push message can be sent to and displayed on the user's device even when the app is not running in the background, achieving a delivery rate as high as 99%.
Demo
1. Install the demo app on the test device.
2. Start the demo app, tap Add Geofence on the GeoFence screen, and set relevant parameters to create a geofence.
3. Wait for the geofence to be triggered.
4. Check the received message.
Development Procedure
1. Configure the Maven repository address for the SDK.
(The procedure for configuring the Maven repository address in Android Studio is different for Gradle plugin versions earlier than 7.0, Gradle plugin 7.0, and Gradle plugin 7.1 or later versions. Here, the procedure for Gradle plugin 7.1 is used as an example.)
a) Go to buildscript > dependencies and add AppGallery Connect plugin configurations.
Code:
buildscript {
    dependencies {
        ...
        // Add the AppGallery Connect plugin configuration. You are advised to use the latest plugin version.
        classpath 'com.huawei.agconnect:agcp:1.6.0.300'
    }
}
b) Open the project-level settings.gradle file and configure the Maven repository address for the SDK.
Code:
pluginManagement {
    repositories {
        gradlePluginPortal()
        google()
        mavenCentral()
        // Configure the Maven repository address for the SDK.
        maven { url 'https://developer.huawei.com/repo/' }
    }
}

dependencyResolutionManagement {
    ...
    repositories {
        google()
        mavenCentral()
        // Configure the Maven repository address for the SDK.
        maven { url 'https://developer.huawei.com/repo/' }
    }
}
2. Add build dependencies in the dependencies block.
Code:
// Configure the app-level build.gradle file.
dependencies {
    implementation 'com.huawei.hms:location:6.4.0.300'
    implementation 'com.huawei.hms:push:6.3.0.304'
}
3. Declare system permissions in the AndroidManifest.xml file.
The Location SDK incorporates the GNSS, Wi-Fi, and base station location functions into an app to build up precise global positioning capabilities. Therefore, it requires the network, precise location, and coarse location permissions to function correctly. If the app needs to continuously obtain user locations when running in the background, the ACCESS_BACKGROUND_LOCATION permission also needs to be declared in the AndroidManifest.xml file.
Code:
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-permission android:name="android.permission.ACCESS_WIFI_STATE" />
<uses-permission android:name="android.permission.ACCESS_COARES_LOCATION" />
Note: The ACCESS_FINE_LOCATION, WRITE_EXTERNAL_STORAGE, and READ_EXTERNAL_STORAGE permissions are dangerous system permissions, so they must be dynamically applied for. If the app does not have the permissions, Location Kit will be unable to provide services for the app.
Key Code
Code file: com.huawei.hmssample2.geofence\GeoFenceActivity.java
If you want to integrate the geofence service and implement message pushing in your app, you only need to add relevant code in GeoFenceActivity.java to your app project.
1. Configure geofences.
a) Create geofences and geofence groups as needed, and set relevant parameters, such as the geofence radius and triggering time.
Code:
if (!checkStyle(geofences, data.uniqueId)) {
    LocationLog.d("GeoFenceActivity", "not unique ID!");
    LocationLog.i("GeoFenceActivity", "addGeofence failed!");
    return;
}
geoBuild.setRoundArea(data.latitude, data.longitude, data.radius);
geoBuild.setUniqueId(data.uniqueId);
geoBuild.setConversions(data.conversions);
geoBuild.setValidContinueTime(data.validContinueTime);
geoBuild.setDwellDelayTime(data.dwellDelayTime);
geoBuild.setNotificationInterval(data.notificationInterval);
geofences.add(geoBuild.build());
LocationLog.i("GeoFenceActivity", "addGeofence success!");
b) Register a broadcast using the intent.
Code:
GeofenceRequest.Builder geofenceRequest = new GeofenceRequest.Builder();
geofenceRequest.createGeofenceList(GeoFenceData.returnList());
if (trigger.getText() != null) {
    int trigGer = Integer.parseInt(trigger.getText().toString());
    geofenceRequest.setInitConversions(trigGer);
    LocationLog.d(TAG, "trigger is " + trigGer);
} else {
    geofenceRequest.setInitConversions(5);
    LocationLog.d(TAG, "default trigger is 5");
}
final PendingIntent pendingIntent = getPendingIntent();
try {
    geofenceService.createGeofenceList(geofenceRequest.build(), pendingIntent)
        .addOnCompleteListener(new OnCompleteListener<Void>() {
            @Override
            public void onComplete(Task<Void> task) {
                if (task.isSuccessful()) {
                    LocationLog.i(TAG, "add geofence success! ");
                    setList(pendingIntent, GeoFenceData.getRequestCode(), GeoFenceData.returnList());
                    GeoFenceData.createNewList();
                } else {
                    // Get the status code for the error and log it using a user-friendly message.
                    LocationLog.w(TAG, "add geofence failed : " + task.getException().getMessage());
                }
            }
        });
} catch (Exception e) {
    LocationLog.i(TAG, "add geofence error:" + e.getMessage());
}

private PendingIntent getPendingIntent() {
    Intent intent = new Intent(this, GeoFenceBroadcastReceiver.class);
    intent.setAction(GeoFenceBroadcastReceiver.ACTION_PROCESS_LOCATION);
    Log.d(TAG, "new request");
    GeoFenceData.newRequest();
    return PendingIntent.getBroadcast(this, GeoFenceData.getRequestCode(), intent,
        PendingIntent.FLAG_UPDATE_CURRENT);
}
2. Trigger message pushing. Send a push message when onReceive of GeoFenceBroadcastReceiver detects that the geofence is triggered successfully. The message will be displayed in the notification panel on the device.
Code:
GeofenceData geofenceData = GeofenceData.getDataFromIntent(intent);
if (geofenceData != null) {
    int errorCode = geofenceData.getErrorCode();
    int conversion = geofenceData.getConversion();
    ArrayList<Geofence> list = (ArrayList<Geofence>) geofenceData.getConvertingGeofenceList();
    Location myLocation = geofenceData.getConvertingLocation();
    boolean status = geofenceData.isSuccess();
    sb.append("errorcode: " + errorCode + next);
    sb.append("conversion: " + conversion + next);
    if (list != null) {
        for (int i = 0; i < list.size(); i++) {
            sb.append("geoFence id :" + list.get(i).getUniqueId() + next);
        }
    }
    if (myLocation != null) {
        sb.append("location is :" + myLocation.getLongitude() + " " + myLocation.getLatitude() + next);
    }
    sb.append("is successful :" + status);
    LocationLog.i(TAG, sb.toString());
    Toast.makeText(context, "" + sb.toString(), Toast.LENGTH_LONG).show();
    // Send the triggering details to the app server, which pushes the message.
    new PushSendUtils().netSendMsg(sb.toString());
}
Note: The geofence created using the sample code will trigger two callbacks for conversion types 1 and 4. One is triggered when a user enters the geofence and the other when a user stays in the geofence. If Trigger is set to 7 in the code, callbacks will be configured for all scenarios, including entering, staying, and leaving the geofence.
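The conversion value is a bit mask: 1 means entering the geofence, 2 means leaving it, and 4 means dwelling in it, so 7 (= 1 | 2 | 4) covers all three. A hedged sketch, assuming the conversion constants that Location Kit defines on the Geofence class:
Code:
// Equivalent to geofenceRequest.setInitConversions(7).
int trigger = Geofence.ENTER_GEOFENCE_CONVERSION
        | Geofence.EXIT_GEOFENCE_CONVERSION
        | Geofence.DWELL_GEOFENCE_CONVERSION;
geofenceRequest.setInitConversions(trigger);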
Once you have performed the preceding steps, you will have enabled geofence-based message pushing for your app and can send messages to target audiences in specific areas, achieving precise marketing.
References
Location Kit official website
Location Kit development documentation
