Video mirroring/sharing between 2 tablets - Android Q&A, Help & Troubleshooting

I have 2 Samsung Galaxy Tab 2 7” tablets which I would like to use as an in-car movie player with both tablets mounted to headrests and playing the same movie simultaneously. Both have the Group Share app which could in theory do this, but the version available for the Tab 2 doesn’t support video.
I’ve tried:
• Using hotspot wifi with a variety of DLNA/UPnP players/servers. I nearly achieved it with Bubble UPnP, which works on the hotspot mini-network, but it won't play on the server device at the same time as on the client device.
• Using hotspot wifi with a separate server device (smartphone) playing to 2 client devices (the 2 tablets). Again this nearly works with Bubble UPnP but it won’t play to 2 client devices at the same time – presumably a limitation of the technology.
• The Screenshare app by Spring Design. It lets a video be played on one tablet controlled by the other tablet, but not simultaneously on both.
• TeamViewer – it looks like it could work, but I couldn't get the app working. I suspect it won't work on a hotspot or Wi-Fi Direct network.
Are there any other apps or workarounds which might achieve this? I feel that it must be possible...
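One low-tech fallback worth trying, assuming both tablets' video players can open a plain HTTP URL: run a tiny HTTP server on the hotspot device (or a laptop on the same network) and point both tablets at the same file, pressing play on each by hand. A Python sketch – the movie directory, port, and the file name "movie.mp4" are all placeholder assumptions:

```python
# Minimal HTTP file server sketch: both tablets stream the same movie file
# from one host over the hotspot network. Playback sync is still manual --
# you press play on each tablet yourself.
# MOVIE_DIR, PORT, and the file name "movie.mp4" are placeholder assumptions.
import http.server
import socketserver

MOVIE_DIR = "."   # directory holding the movie file
PORT = 8000

def make_server(directory=MOVIE_DIR, port=PORT):
    """Build a TCP server that serves files from `directory`."""
    def handler(*args, **kwargs):
        return http.server.SimpleHTTPRequestHandler(
            *args, directory=directory, **kwargs)
    return socketserver.TCPServer(("", port), handler)

def serve():
    # Point both tablets' players at http://<hotspot-ip>:8000/movie.mp4
    with make_server() as srv:
        srv.serve_forever()
```

`SimpleHTTPRequestHandler`'s `directory` argument needs Python 3.7+. DLNA clients that insist on UPnP discovery won't see this server, but any player that accepts a network URL (e.g. MX Player, VLC) should.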

Related

LiveTV on the Nexus Q

I'm very interested to see how Google navigates the 'Live TV' area with the Nexus Q home entertainment machine. Google TV hasn't exactly taken off as they had expected, for a number of reasons – namely the high price and a buggy, hard-to-understand implementation. I'm hoping they've learned from that mistake with the Nexus Q.
Right now my home TV setup is comprised of a live TV server (Windows Media Center on Windows 7) and an Xbox 360 running as a media center extender. This brings me live, HD TV with a great UI and full DVR functionality. It's decent, but a bit of a pain to launch the MCE app on the Xbox when you want to watch TV.
Google bought SageTV almost exactly 2 years ago. SageTV consisted of a media server running on a home PC which provided all the DVR functionality, plus the SageTV 'Placeshifter', which let you watch TV, including premium cable content with a CableCARD, on any of their supported platforms.
SageTV was Java-based, which means it is wholly possible that Google could be writing it into the Android platform, and the Nexus Q would be a perfect 'extender' device. I'm hoping Google might be working on this as a large secret project so they can dominate the home entertainment ecosystem.
To me this would be the 'holy grail' of home entertainment. A box that supports both on-demand content (YouTube, Netflix, Music) as well as Live TV.
Does anyone think this is possible? Would you use such a setup?
I think you're looking more for a Google TV than the Nexus Q. It's strange to have what amount to competing boxes, but the Nexus Q seems to be just for streaming content and easily sharing from your phone/tablet to your entire house, depending on how many you have.
Why they just didn't add some of these functions to the Google TV product, I don't know.
But a Nexus Q as a front end working with, say, an HDHomeRun tuner and feeding streams out to Google TV... god I want this. I love my HTPC, but I want something like Android, for the popularity.
Sent from my A500 using Tapatalk 2

More About How Miracast Works on Android

http://ausdroid.net/2012/11/17/lg-australia-nexus-4-optimus-g-and-miracast/
Yes, the article isn't about the N10 per se, but it has relevant info about Miracast that would translate to the N10 – whenever Google can deliver on its claim that Miracast is a 4.2 feature rather than a phone-specific one. Salient points from the piece:
"Both the Nexus 4 and Optimus G feature Miracast. On the Nexus 4, it’s supported through Android 4.2’s Secondary Displays...On the Optimus G, it’s part of the standard OS and can be enabled with a tap on a dedicated toggle in the notification shade’s Quick Settings area.
"Josh’s demonstration included browsing a photo gallery in full-screen mode on the TV, playback of HD video (an MKV file, no less), web browsing, and a game of Angry Birds. The phone can send output to the Miracast display from an application – for example, a video – and continue to use the phone normally. Josh demoed this by playing a video on the TV while playing Angry Birds on the phone.
"Another quirk is that Miracast uses your Wifi antenna, so you can’t maintain a connection to your home network while transmitting and will instead be relying on mobile data. This is a definite drawback compared to competing systems like AirPlay, but it’s something that could be added or changed as Miracast evolves.
"Notably, the demo was performed on the Optimus G. There seem to be issues with the implementation on the Nexus 4 which should be sorted out with a software update. This seems OK, as no-one has Miracast-capable hardware at the moment."
One of the things I wondered about Miracast was how it can maintain two wifi connections with a single radio. If the above is true, then it can't, so you can't do something like stream Netflix from the internet, through your device, to the TV. This would put a massive damper on Miracast's appeal if you can't access the net (via wifi) while using Mira... Hmm, maybe that's why Mira isn't available on the N10. It only has wifi.
Also interesting that Mira implementations are different on OptiG and N4.
ummm...
Well, I can't wait for Android to actually allow wifi streaming apps like 'MirrorOp Sender' (there are plenty more on the market) to access the 'screen image' without needing root.
That is the only issue I am having. I have successfully used the Nexus 7 as my PC monitor with 'MirrorOp Receiver', and even controlled the PC (Windows 7) from the Nexus with the same app. However, as soon as I try to connect my Nexus 7 to the Qumi projector, it informs me that root access is required, and to be quite honest I am not at all interested in rooting the Nexus. I believe Google should include these things in the OS, as they announced they would in Jelly Bean 4.2... a bit disappointing it still isn't out!

[Q] Android as a desktop operating system

I was thinking of the coolness factor of having just one device, a phone, to which you could connect an external display and have an extended desktop. I am not finding any reference to this on Android (only the MS Surface). From what I have been reading, and remember/understand (I may be confused), Jelly Bean brought the ability for windowed apps. However, apps have to be coded for the capability, unless you root your phone and install an app that provides windowing for all apps. Also, I have not heard of the possibility of an extended desktop in Android.
I would like to ask WHY? Why not have windowing and the ability for an extended desktop on an external display? A Bluetooth keyboard and mouse follow naturally. Does Google have to play nice with the manufacturers that stand to lose from people only needing one device? Is there a reason I'm not thinking of? Most phones are fast enough for this these days.
At the turn of the century, I was running GPS software Deluo Routis on a Sony Vaio 505 Pentium 200Mhz laptop running Win98. The 2-D graphics were smooth even while playing mp3's through the car speakers. The mapping software showed the map clearly, and effectively gave me navigation. People have lost sight of how much you can do if you give up the bloat and bling.
Also, I am pretty confused by the merging of Android and Chrome. I never liked Java to begin with; my experience with it is in MS Windows, where it runs slow as molasses. I believe my phone would run much faster if they had not chosen Java. I understand this to be because you have an operating system running on top of another operating system. It just makes more sense to me to have fewer layers and run apps natively, for better performance. I thought maybe they chose Java for its level of security. Is the screening process for Google Play not foolproof enough?
I like the philosophy of Google better than Microsoft's**, so if one of them is going to win, I hope it's Google. I'm hoping Google won't end up with a convoluted Android/Chrome operating system because lawyers forced them to (the idea I get from the latest news). I don't understand: do they want to keep their OS architecture simple, but are being forced to make it complex for other reasons?
**Apple doesn't even want to compete. They have never wanted to dominate, just make huge profits. Unless they break up the marriage of hardware and software, they won't win. Then again, if Samsung keeps dominating, there may not be much hardware diversity?
Oh, and my main question was: "Why not have windowing and the ability for an extended desktop?". Wouldn't that be a big deciding factor for anyone that wanted to simplify and just have one device?
Anybody? Tell me I'm crazy at least. There has to be a strategic reason, that Google does not introduce full windowing and extended desktop support.
It's coming eventually, though you could do it right now. Motorola tried something like this with their Atrix lapdocks.
Sent from my Samsung i437p using Tapatalk and CM 10.2
E_Phather said:
It's coming eventually, though you could do it right now. Motorola tried something like this with their Atrix lapdocks.
Can you do it right now with any android device having a video port?
Well, let's look at how we could achieve this with today's technology.
Input:
Bluetooth Mouse & keyboard.
Output:
Wireless display with support for older displays using something like Chromecast.
Graphical User Interface:
A secondary launcher/application (which could potentially see companies like MS and Canonical developing their own UIs and charging for them if required).
Home & Office use with one device:
Home would be the default UI, but when your device has used NFC to log into the office, it would automatically enable your Office profile/UI for a certain length of time (requiring you to log back in after a set time, or to manually log out via another NFC tap).
This would be very useful as it would enable you to take your "desktop" environment anywhere with you and connect to any HDTV with Wireless display/Chromecast support.
Applications:
So if, like me, you are finding your phone becoming an ever better solution for your digital needs, and you only need a desktop for apps which work better on larger displays (videos and certain games), you will find this very useful.
Games:
Now games could become even better, as they could be controlled using standardised inputs (game controllers could use standardised input methods, allowing you to select any compatible controller to best suit your needs). A driving game, for example, could be shown on an HDTV yet be controlled with the accelerometer for steering, with the right half of the device's touch display as the accelerator and the left half as the brakes.
More Business Solutions:
If you could wirelessly connect to the office display and then show a PowerPoint-style presentation, that would be great, because the very device which stores the file would also be your controller for moving to the next/previous slide.
Media:
Music could possibly be stored in the cloud, so when you're on the move you can listen to your music as many of us do now, but when connected to a large display it could use the big screen and speakers to show a music video too!
Photos could be viewed on the large screen, with the next one to be displayed selected on the device (allowing the user to avoid showing pictures they don't want others to see – e.g. pictures of you and your friends while your parents/grandparents are in the room...).
The TV Guide:
The TV guide would become a very interactive thing, letting you see what is available on other channels without everyone else in the room being limited to watching their content in a small box in the corner of the display...
These are just some ideas of what is possible, but I know you could do so much more with this, and with 64-bit technology coming to many mobile devices soon, it will be much easier for devices to process all of this data at once without any serious lag!
I would love to see a group of developers on XDA team up on an open desktop (secondary) launcher to run alongside the user's primary (phone) launcher. If there were a project like this with an open framework to develop apps for, I'd be happy to start developing apps or separate UIs for it to run alongside my current (phone/Android) app UIs.
Edit:
Also remember that this could be used in other ways too, e.g. connecting your device to your car: your device could deliver navigation and music to your vehicle's display while pulling important traffic/weather news over your device's network connection!
Isn't this exactly what the Ubuntu phone intends to do or have I got the wrong idea?
Sent from my Galaxy Nexus using XDA Premium 4 mobile app
Yes, but with Android already having a large ecosystem it would make a lot of sense to build upon that.
Chromecast is not "open" to third party apps. http://www.minyanville.com/sectors/...eeds-to-Tread-Lightly-With/8/28/2013/id/51502
Do they have a displayport version of Chromecast? *cough*
quote from: http://www.tested.com/tech/set-top-boxes/457036-testing-google-chromecast/
"Chromecast is also not a particularly good desktop mirroring option, either. It actually can't do full desktop mirroring, and instead works solely with the Chrome browser. In beta right now is Chrome tab streaming, which sends to Chromecast everything that can be rendered in a single Chrome tab, including web pages, flash embeds, and even full-screen MKV video files if you have VLC installed. I like that Chrome tab streaming works independently of what's showing on your laptop or desktop's screen--like with YouTube and Netflix, you can multi-task and switch to other tabs or windows while one tab is being streamed. The only thing that matters is the window size and screen resolution. Chromecast will automatically scale the aspect ratio of your window to fill up your TV screen, adding black bars on the sides to avoid stretching. A full-screen resolution of 1440x900 looked good on a large 1080p TV, but streaming from a 2560x1600 monitor at full-screen made the text unreadable on my 70" TV."
Wow... I thought only DisplayPort was capable of 2560x1600 (edit: HDMI 1.3 brought this). Even if I hook it up to my 2560x1600 monitor, it won't really display anything but entertainment. Chromecast doesn't seem to be a way to get a monitor – to use your Android phone as a PC replacement.
AllCast !!!
http://www.geek.com/android/chromecast-reject-becomes-allcast-public-beta-now-available-1578674/
However, I would still need to add some kind of wifi-enabled device to my 30" LCD monitor (as with Chromecast). Really, I wouldn't mind a cable connection from my phone to my monitor, if that were an option. If Google continues to be closed like this, I would go for the Ubuntu phone.
Displayport:
http://en.wikipedia.org/wiki/MyDP#SlimPort
Do any phones have this besides the Google Nexus 4? Actually, I'm not getting a new phone until I know what the hell will happen with Android / Chrome OS.
Quote from: http://www.tested.com/tech/android/457205-mhl-vs-slimport/
"SlimPort's support for the DisplayPort standard--specifically Mobility DisplayPort--means it can output video at the same 4K resolution as MHL, though not via HDMI (yet, anyway). And here SlimPort hasn't really made good on its potential, yet; though it's based on the flexible DisplayPort standard, the only SlimPort adapters currently available are for VGA and HDMI connectors. The upshot is that you won't be plugging a Nexus 7 into a 1440p DisplayPort computer monitor anytime soon." http://www.slimportconnect.com/
Chromecast May Get Screen Mirroring With Android 4.4.1
Evidence in Android 4.4.1 indicates that screen mirroring is coming to Chromecast.
http://www.tomshardware.com/news/chromecast-google-screen-mirroring-kitkat-android,25345.html
It could start with mirroring the primary display, but gradually evolve into mirroring something that the GPU has rendered for a secondary display.
A dock for Samsung Galaxy phones. It has USB ports, HDMI, and audio.
http://www.samsung.com/us/mobile/cell-phones-accessories/EDD-S20JWEGSTA
mraeryceos said:
A dock for Samsung Galaxy phones. It has USB ports, HDMI, and audio.
http://www.samsung.com/us/mobile/cell-phones-accessories/EDD-S20JWEGSTA
I tried that myself with my previous Galaxy S4 (i9500). It was a great dock, and when I connected my wireless KB & mouse USB dongle and hooked the HDMI up to my PC monitor, it was a good experience for things like playing GTA3 on the bigger screen (it was better than the Windows version in some ways).
But the device just needed a separate home-screen UI to output to the PC screen to look perfect and to work better with the KB & mouse input.
It shouldn't be too difficult to make a UI that simply shrinks some buttons, letting more widgets fit on the home screen. If apps could be forced to run either windowed or full-screen, that would enable better multi-tasking; then browsers would just need a small update to detect whether the device is running in desktop mode and, if so, zoom out of the page a little to emulate the desktop browser experience.
Just a few ideas... If Google's Android team are reading this, I would recommend that you get that dock to experiment with for future Android builds.
Especially now that OSes like Ubuntu Phone are looking at going down this road of one device fitting all computational needs.
Rather than creating a new thread, I thought it would be appropriate to bring this topic back up after the recent announcements several OEMs have made that they will be releasing desktops with Android as their primary/secondary OS.
I hope that this pushes Google into creating a dedicated desktop UI in the future.

Screen mirroring with amazon fire stick?

I read that the N9 isn't Miracast compatible. Does that mean I won't be able to screen mirror with the Amazon Fire Stick?
Didn't realize the Fire Stick allowed for Miracast. And no, it won't.
Wow, so I can't even get screen casting to work with a Chromecast, even though it detects the device. I am, however, able to cast with apps like YouTube.
Anyone else experiencing this?
Works fine for me
Sent from my Nexus 9 using Tapatalk
y2whisper said:
Works fine for me
What could I be doing wrong?
My Note 4 is able to screen cast to the Chromecast no problem...
Shouldn't need to do anything fancy to make it work. Hmmm
Sent from my Nexus 9 using Tapatalk
There is a difference between screen casting and (app) casting. The latter simply requires an app that can Chromecast AND that the device is on the same wifi network.
Screen casting requires either 'specific' devices that are capable of screen mirroring to chromecast OR miracast capability (I think Sony devices may offer this). For the former, here are the supported devices.
nycebo said:
There is a difference between screen casting and (app) casting. The latter simply requires an app that can Chromecast AND that the device is on the same wifi network.
Screen casting requires either 'specific' devices that are capable of screen mirroring to Chromecast OR Miracast capability (I think Sony devices may offer this). For the former, here are the supported devices.
That's the thing.
My N9 can't screen cast, but it can app cast.
My note 4 can do both.
My N9 can do both.
Resurrecting this to clarify some things. Apps that play video on one device from another, when you have the app installed on two devices on the same subnet, use DIAL (Discovery and Launch); they work on all devices because the entire mechanism is baked into the app. YouTube, Netflix, Sony, and Samsung created the DIAL protocol. With DIAL, the displaying device gets the media from the internet and is just being controlled by the other device. This is a per-app thing, completely device independent, and not at all what this thread is about.
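As a concrete illustration of the discovery half of DIAL: clients find receivers by multicasting an SSDP M-SEARCH for the DIAL service type and reading the LOCATION header from any replies. A hedged Python sketch – the search-target string comes from the published DIAL spec, but the rest is an illustrative, incomplete client, not a full implementation:

```python
# Sketch of DIAL device discovery: DIAL clients find receivers by sending
# an SSDP M-SEARCH for the DIAL service type over UDP multicast, then
# collecting LOCATION headers (device-description URLs) from replies.
import socket

SSDP_ADDR = ("239.255.255.250", 1900)
DIAL_ST = "urn:dial-multiscreen-org:service:dial:1"

def build_msearch(st=DIAL_ST, mx=2):
    """Build the SSDP M-SEARCH request used for DIAL discovery."""
    return (
        "M-SEARCH * HTTP/1.1\r\n"
        f"HOST: {SSDP_ADDR[0]}:{SSDP_ADDR[1]}\r\n"
        'MAN: "ssdp:discover"\r\n'
        f"MX: {mx}\r\n"
        f"ST: {st}\r\n\r\n"
    )

def discover(timeout=3.0):
    """Broadcast the search and collect LOCATION headers from replies."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    sock.sendto(build_msearch().encode(), SSDP_ADDR)
    locations = []
    try:
        while True:
            data, _ = sock.recvfrom(4096)
            for line in data.decode(errors="replace").splitlines():
                if line.lower().startswith("location:"):
                    locations.append(line.split(":", 1)[1].strip())
    except socket.timeout:
        pass
    finally:
        sock.close()
    return locations
```

On a network with a DIAL receiver (a Chromecast, many smart TVs), `discover()` should return one or more device-description URLs; launching an app then goes through the REST service those URLs point at.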
Miracast is actual screen streaming. Support is built into Android 4.2 and greater, and Intel's WiDi initiative has basically folded into Miracast. Google removed Miracast support from all Nexus and Pixel devices; all other Android 4.2+ devices support Miracast because it is part of Android. This is a super bull**** move by Google, and Miracast can be turned back on with a single-line edit to the build.prop if you root your phone.
Chromecast was Google combining an OS-level DIAL implementation with Miracast. Chromecast has since been changed from using straight-up DIAL for media casting to the more proprietary mDNS-based discovery, but it's basically the same. Amazon products do not support Chromecast, which is a bit less of a bull**** move than Google turning off Miracast on Nexus and Pixel devices, since Chromecast is more proprietary.
There are 3 ways to get around the problem, all of which have been done, though only 2 are currently available:
Add an app that supports Chromecast protocols to the Fire Stick. A few have come and gone; they all get in trouble with Google because the Chromecast stack is proprietary. Currently there is AirScreen on the Amazon App Store. It probably steals your data and sends it to China and North Korea, but it works.
Add a miracast app to the Nexus device. There have been a few, but I can't find one at the moment.
Root the nexus device and edit the build.prop file to turn Miracast support back on. Definitely the cleanest, but you permanently trip the security check on the phone and can't use it for payment apps like Google Wallet anymore.
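For what it's worth, the property usually quoted on XDA for that build.prop edit is `persist.debug.wfd.enable=1` – treat that as folklore to verify against your own ROM, not gospel. A sketch that appends it to a pulled copy of the file:

```python
# Sketch: re-enable the Wi-Fi display (Miracast) setting by appending the
# commonly cited property to a *local copy* of build.prop. The property
# name persist.debug.wfd.enable is the one usually quoted on XDA; verify
# it against your ROM before pushing the file back with adb.
def enable_wfd(build_prop_path):
    line = "persist.debug.wfd.enable=1"
    with open(build_prop_path, "r+") as f:
        contents = f.read()
        if line not in contents:
            # keep the existing properties, append ours at the end
            if contents and not contents.endswith("\n"):
                f.write("\n")
            f.write(line + "\n")
    return line

# Workflow (requires root):
#   adb pull /system/build.prop .
#   enable_wfd("build.prop")
#   adb push build.prop /system/build.prop   (restore 644 perms, then reboot)
```

The function is idempotent, so running it twice won't duplicate the line.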
CapinWinky said:
Root the nexus device and edit the build.prop file to turn Miracast support back on. Definitely the cleanest, but you permanently trip the security check on the phone and can't use it for payment apps like Google Wallet anymore.
Can you verify this works, and specify the build.prop line to add? I've used this method on my Nexus 6 and 9 in the past, but it stopped working around Android 6 or 7. The option to cast via Miracast would return, but it would never make a connection after a certain OS version.

Miracast (WiFi screen casting) RECEIVER app for Android

Hi,
I have a Galaxy S7 which comes with the "Smart View" app in Android 8.0, which makes it easy to cast the phone's screen to another Miracast device (e.g. a TV or PC). Is there any way to do the reverse, i.e. use the phone as a Miracast receiver to display the screen of, say, another phone or a PC? In particular, I want to use the Wireless Display feature in Windows 10 to turn the phone into an external PC display. Windows can do both (cast to a device, and host/display a cast from a device), but Smart View can only cast, not receive.
I don't want to use a media-streaming app over the local network, or god forbid a remote-desktop solution routing everything through the internet. I want Miracast specifically because it's fast, convenient, built into many devices and, most importantly, uses direct peer-to-peer WiFi instead of going through the rest of the network, so it's perfect for short-distance casting, which is exactly my use case. Unfortunately, I can't find any app that allows the phone to receive Miracast streams. Can anyone help?
