[APP] [IDEA] Targeting system - Wear OS Software and Hacking General

Here's an idea:
A targeting system/crosshair app for Google Glass. It could even use a camera attached to a scope; I'm sure there are electronic scopes that could be connected as well.
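To make the idea a bit more concrete, the overlay half of such an app could be as simple as a custom Android View drawn on top of a camera preview. This is only a rough sketch; the class name and styling are placeholders, not a finished targeting system:

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.view.View;

// Illustrative sketch: a transparent overlay that draws a simple crosshair.
// Place it above a camera preview (e.g. in a FrameLayout) to get the reticle effect.
public class CrosshairOverlay extends View {

    private final Paint paint = new Paint(Paint.ANTI_ALIAS_FLAG);

    public CrosshairOverlay(Context context) {
        super(context);
        paint.setColor(Color.RED);
        paint.setStyle(Paint.Style.STROKE);
        paint.setStrokeWidth(3f);
    }

    @Override
    protected void onDraw(Canvas canvas) {
        super.onDraw(canvas);
        float cx = getWidth() / 2f;
        float cy = getHeight() / 2f;
        float arm = Math.min(getWidth(), getHeight()) * 0.1f; // length of each crosshair arm

        // Horizontal and vertical lines through the center, plus a ring.
        canvas.drawLine(cx - arm, cy, cx + arm, cy, paint);
        canvas.drawLine(cx, cy - arm, cx, cy + arm, paint);
        canvas.drawCircle(cx, cy, arm / 2f, paint);
    }
}
```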

Related

[Q] How to do an Augmented Reality app

Hi! I want to know whether there is any kind of program that can generate Augmented Reality code and let me put that code into an Android application. Something like a generator.
I want to create an app like Nokia City Lens. Something like this:
http://www.youtube.com/watch?v=N2Zd3qHrOoc
Thanks in advance!
Hello there! While developing a polished AR app, you have to keep these things in mind:
1. Unique design
2. Necessary 3D models
3. Pictures
4. Text files
5. Powerful effects
6. Scientific visualization and 3D modeling
7. 3D image recognition and tracking
I am an AR app developer at Game App Studio, where we provide Augmented Reality app development solutions using AR development tools such as:
Vuforia
ARToolKit
Google ARCore
Apple ARKit
Maxst
Wikitude
As an AR app developer, I recommend Google ARCore for your Android AR app (a minimal setup sketch follows the steps below). As AR app developers, we follow this process:
We figure out a great idea to enhance your business. Competitor analysis is a good way to handle this.
Explore the market of existing SDKs and platforms for augmented reality development. We pick the one that fits your idea, budget, and expertise. For example, platforms like Vuforia, Wikitude, and ARToolKit require a profound knowledge of C++, Java, or C#. If programming languages are not your cup of tea, opt for an easier solution – tools like Blippar or Aurasma are an excellent choice for beginners.
You already know that AR usually needs 3D objects. You can download existing models or learn how to make them yourself.
Create a 2D tracker, which is a specific picture that gets placed on a surface and scanned by AR-aided devices.
Create a unique design and prepare all the necessary 3D models, pictures, text files, and data.
Put all the elements together on a platform of your choice.
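Since ARCore is the recommendation above for Android, here is a minimal sketch of what session setup typically looks like, assuming the com.google.ar.core dependency is already added; rendering, camera permission handling, and anchor placement are left out:

```java
import android.app.Activity;

import com.google.ar.core.ArCoreApk;
import com.google.ar.core.Config;
import com.google.ar.core.Session;

// Minimal ARCore bootstrap sketch (no rendering): checks availability,
// creates a Session, and enables horizontal plane detection.
public class ArDemoActivity extends Activity {

    private Session session;

    @Override
    protected void onResume() {
        super.onResume();
        try {
            if (session == null) {
                // Ask ARCore to install/update itself if needed.
                if (ArCoreApk.getInstance().requestInstall(this, true)
                        == ArCoreApk.InstallStatus.INSTALLED) {
                    session = new Session(this);
                    Config config = new Config(session);
                    config.setPlaneFindingMode(Config.PlaneFindingMode.HORIZONTAL);
                    session.configure(config);
                }
            }
            if (session != null) {
                session.resume(); // camera permission must already be granted
            }
        } catch (Exception e) {
            // Covers the Unavailable* exceptions and CameraNotAvailableException.
            e.printStackTrace();
        }
    }

    @Override
    protected void onPause() {
        super.onPause();
        if (session != null) {
            session.pause();
        }
    }
}
```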

Is such an app possible for Android? (plz read)

Range Finder App
An app that accurately measures distance from your smartphone/tablet. Something like this already exists (Smartmeasure), but it's clunky, troublesome, and obscure. I'm aiming for an app that can be used naturally by anybody who constantly needs to measure distances.
Goals:
Create a work-around for the focused laser beam traditionally used in range finders; since a phone can't provide one, you would otherwise be unable to get any sort of meaningful reading. Perhaps an algorithm that compares the scale of items it finds in the camera frame to a stored reference set and gives approximations (see the sketch after this list).
Use the phone’s sound capabilities to measure distance
Use the phone’s map/compass capabilities to measure distance
Use the phone's accelerometer to track the phone's movement, and take a series of images with the same camera as you point the phone towards a target to measure distance
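For the "compare the scale of known items" goal above, the underlying math is the pinhole camera model: distance ≈ real height × focal length (in pixels) / apparent height (in pixels). A rough sketch, where the reference heights and the detected pixel height are assumed inputs rather than anything the phone provides out of the box:

```java
// Sketch of the pinhole-camera distance estimate mentioned above.
// Assumes some detector has already measured the object's height in pixels
// and that we know its real-world height from a stored reference set.
public class RangeEstimator {

    // Hypothetical reference set: real-world heights of recognizable objects, in metres.
    private static double realHeightMetres(String label) {
        switch (label) {
            case "door":   return 2.00;
            case "car":    return 1.45;
            case "person": return 1.70;
            default:       return -1; // unknown object
        }
    }

    /**
     * distance ≈ realHeight * focalLengthPx / apparentHeightPx
     *
     * @param label            what the detector thinks it is seeing
     * @param apparentHeightPx object height measured in the camera image, in pixels
     * @param focalLengthPx    camera focal length expressed in pixels
     * @return estimated distance in metres, or -1 if the object is unknown
     */
    public static double estimateDistance(String label, double apparentHeightPx, double focalLengthPx) {
        double realHeight = realHeightMetres(label);
        if (realHeight < 0 || apparentHeightPx <= 0) {
            return -1;
        }
        return realHeight * focalLengthPx / apparentHeightPx;
    }

    public static void main(String[] args) {
        // Example: a door that appears 400 px tall with a 1500 px focal length
        // is roughly 2.0 * 1500 / 400 = 7.5 m away.
        System.out.println(estimateDistance("door", 400, 1500));
    }
}
```

On a real device the focal length in pixels would be derived from the camera's reported focal length and sensor size, which is itself an approximation, so the result can only ever be a rough estimate.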
Hey Zabrak999, this is really a wonderful idea, but I haven't seen such an app to date. You could contact some Android developers to get it built for you.

Finding an Android IDE which hides the baggage of the official one

Hi there. I have created a few apps using the Android version of the Processing IDE. I was amazed at how easily I could create applications for my device compared to using the official IDE, which was far too complicated for an amateur like myself. I have now reached a point, however, where the limitations of the Processing IDE are showing themselves; it was developed as a means to create visual 'sketches', after all, and not for creating complex applications. The main problem I have is the organisation of my code, which becomes harder and harder as the application gets bigger.

Which brings me to my question: are there any Android programming environments which are more 'app oriented' but still provide that layer of abstraction which hides all the unnecessary baggage of straight-up Android app development? It would be great if there were some GUI features too. Does this exist, or would I be better off sticking with Processing? Many thanks.

How to automate gestures (swiping, tapping) in Android? (UI Automation)

Apologies if this is in the wrong section, mods please move if need be.
I am using a public app from the Play Store. I would like to perform operations similar to those one might perform when unit testing UI elements.
The app consists mainly of a list view of rectangular cards arranged vertically. It allows me to accept batches (orders) for which I perform deliveries for payment. Other users use this same app, so competition is fierce: if I do not accept a lucrative batch as fast as possible, someone else will, and even then it's possible, due to latency or simply bad luck, that I do not get the batch. It is also bad because I tend to stare at my screen a lot while driving.
I had an idea. I'm seeking a solution similar to the Robot class in Java, except that it should be able to analyze the contents of the ListView (which consists of ViewGroups composed of TextViews).
I was able to partially emulate what I want using UiAutomator, but it is a cumbersome solution because it requires ADB to run every time. Not only that, the swipe function on the UiDevice object in UiAutomator does not work on this particular app.
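Roughly, the UiAutomator side of it looked like the sketch below (the target package name and card layout are placeholders, and the swipe at the end is the call that misbehaves in this app):

```java
import androidx.test.platform.app.InstrumentationRegistry;
import androidx.test.uiautomator.By;
import androidx.test.uiautomator.UiDevice;
import androidx.test.uiautomator.UiObject2;
import androidx.test.uiautomator.Until;

import org.junit.Test;

// Minimal UiAutomator sketch: wait for the target app, tap the first card's text,
// then try to swipe the list. "com.example.deliveryapp" is a placeholder package.
public class CardTapTest {

    private static final String TARGET_PACKAGE = "com.example.deliveryapp";

    @Test
    public void tapFirstCard() {
        UiDevice device = UiDevice.getInstance(
                InstrumentationRegistry.getInstrumentation());

        // Wait until something from the target app is on screen.
        device.wait(Until.hasObject(By.pkg(TARGET_PACKAGE)), 5_000);

        // The cards are ViewGroups containing TextViews; grab the first TextView we can find.
        UiObject2 firstCard = device.findObject(
                By.pkg(TARGET_PACKAGE).clazz("android.widget.TextView"));
        if (firstCard != null) {
            firstCard.click();
        }

        // Swipe up roughly through the middle of the screen (the problematic call).
        int w = device.getDisplayWidth();
        int h = device.getDisplayHeight();
        device.swipe(w / 2, (int) (h * 0.8), w / 2, (int) (h * 0.2), 20);
    }
}
```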
I have heard there are better utilities that can accomplish this. I have root on my phone.
Any advice?

HiChain 1.0 Distributed Trusted Device Connection

In the era of the Internet of Things, our lives have been made more convenient by continuously optimized apps across distributed scenarios such as the smart home, smart office, and smart travel. Redefined product forms and enhanced cross-device collaboration have also contributed significantly to removing limitations on device connection. In light of this, highly reliable and accurate trusted connections between smart devices become crucial, not only for improving the smart life experience, but also for protecting consumers' privacy and the security of their property.
Born for a converged intelligent life experience
The key factors for optimizing the smart life experience are improving the utilization rate of smart devices, removing the restrictions on device usage scenarios, and leveraging multiple types of devices. Born to address these issues, HiChain 1.0 for distributed and trusted device connections builds a software system across multiple devices to comprehensively enable trusted connections of devices using distributed technology, facilitating secure, convenient, and efficient device interconnections.
12 core functions enabling trusted connections in distributed scenarios
HiChain 1.0 securely connects core devices (such as mobile phones, tablets, and PCs), hub devices (such as tablets, TVs, and speakers), and accessories (such as watches and headsets), and performs binding, authorization, and transmission/control operations. In addition, HiChain 1.0 improves the establishment and transmission of the trust relationship, as well as data transmission and control based on the trust relationship. With the 12 core functions of HiChain 1.0 in distributed scenarios, trusted connections between the same account, across accounts, and independent of accounts become a reality, and various intelligent devices can collaborate with each other to assist people in daily life and work situations.
Remove cross-device limitations and build an interconnected life
In today's modern world where smart device forms including smart speakers, watches, TVs, and even vehicles have sprung up, consumers' demands on cross-device and multi-screen collaboration have also been soaring. HiChain 1.0 utilizes its proprietary technologies to unblock connection channels to allow different smart devices to interconnect with each other, as well as integrate device advantages for an optimized cross-device collaboration experience, forming a "1+1 > 2" synergy effect across a wide range of scenarios such as travel, mobile office, and social communication.
Intelligent travel: making journeys smarter
The mission of vehicles has evolved from "making travel easier" to "making travel smarter". Closely related to this development is HiChain 1.0's enabling of secure binding between mobile phones and vehicles, end-to-end two-way authentication, and encrypted transmission to prevent attacks. This means that user privacy will not be disclosed even if data is intercepted. In this way, a seamless HiCar service scenario is implemented whereby users can project their phone screen to the vehicle screen, and seamlessly control phone apps (such as navigation, music, phone, and SMS apps) projected to the vehicle screen.
Smart office: making work more efficient
Good tools are crucial to getting a job done properly. As such, HiChain 1.0 leverages its technical advantages to deliver a secure screen projection and control experience: the mobile phone obtains the PC's unique identifier by scanning a QR code or tapping the PC via NFC, and then projects its screen to the PC desktop. This is especially handy for working from home efficiently and collaborating with colleagues.
One-tap TV projection: making connections more flexible
HiChain 1.0 also utilizes protocols such as Miracast to establish a connection between the phone and the TV (using the Miracast security mechanism), obtain the TV's unique identifier through the Miracast channel, and project video and audio from the phone to the TV and speaker. It can also return the video data collected by the TV's camera to the phone, facilitating a seamless and immersive entertainment experience between phones and TVs.
HiChain 1.0 will continue to leverage its technical strengths to enable trusted connections between devices using distributed technologies. Various types of smart devices will be able to collaborate with each other in a secure and trusted manner, overcome restrictions across different usage scenarios, and give consumers the assurance of a seamless, intelligent life experience with multi-screen collaboration.
