What would it be like if you could manipulate the objects on a screen with your mobile device? Sounds interesting, right? I can see that you are still mapping this to a pretty old idea, where people control VLC or storage with Android/iOS apps. But what if I told you that you require NO APP INSTALLATION? Now I can be fairly sure that you will read the rest of the post too 🙂
The idea was born when Freshworks called for entries to their annual hackathon, Save the Hacker. My friends and I first decided to build something around Augmented Reality using ARToolkit, but limited time and unavailable devices pushed us elsewhere. Then we hit upon the idea of developing a “Universal Hand-held Kinect”, a framework that turns any hand-held device into a Kinect. Since the hackathon was only a two-day event, we decided to build something specific rather than the fully generic framework.
We picked a problem from today’s education system: teachers still use a wireless mouse/keyboard to manipulate objects on a smart board, and people still press keyboard keys to move through slides. Now we have this problem solved :O
- Are you curious to know how the sections of your organs look? You don’t have to explore a 3D model with a traditional mouse/keyboard; a single jerk of your phone can teach you more.
- Have you lost the receiver of your wireless mouse? No worries. Swipe on your mobile.
- Finding it difficult to navigate through boring presentations? We present an easy way to sweep through the slides.
- Want to play the next song from Bahubali’s jukebox? Here is a relaxed way of using your mobile as the controller.
Trust me, I’m not exaggerating in any of the points above. We actually developed a product called “AirTeach”. The basic setup you need is a PC/Mac, a modern mobile or tablet with a gyroscope, and an active internet connection. The teacher simply projects the PC onto the smart board and then uses a mobile phone to literally control the virtual objects inside it. The phone and the PC must be connected to the same local network/Wi-Fi for the transmission to work. The teacher then opens a local URL on the phone to load the controller interface, and can rotate, translate, or shake the phone to manipulate objects on the screen.
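To give you a feel for why there is zero installation: the controller page the teacher opens is just a web page, so the phone’s gyroscope and touch input can be read straight from the browser. Here is a minimal sketch of that phone-side capture; the event names (`orient`, `pan`) and the throttling interval are my own illustrative choices, not necessarily what AirTeach uses.

```js
// Runs in the phone's browser on the controller page – no app install needed.
// Assumes the Socket.IO client and HammerJS scripts were served with the page.
const socket = io(); // connects back to the host that served this page

// Stream gyroscope/orientation readings, throttled to ~20 updates per second
let last = 0;
window.addEventListener('deviceorientation', (e) => {
  const now = Date.now();
  if (now - last < 50) return;
  last = now;
  socket.emit('orient', { alpha: e.alpha, beta: e.beta, gamma: e.gamma });
});

// Stream touch pans as relative deltas (HammerJS reports deltas from pan start)
const hammer = new Hammer(document.body);
let px = 0, py = 0;
hammer.on('panstart', () => { px = 0; py = 0; });
hammer.on('panmove', (e) => {
  socket.emit('pan', { dx: e.deltaX - px, dy: e.deltaY - py });
  px = e.deltaX; py = e.deltaY;
});
```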
Here is the feature set of the product:
- Air Board – The teacher can import any OBJ/ThreeJS model onto the board and instantly switch to the controller interface to manipulate the 3D object on the screen. Imagine importing a 3D model of the human heart: you rotate the phone as if you were holding the heart in your hand, then give it a shake to peel off the layers one by one, exploring the inner veins and arteries (see the first sketch after this list). Seems cool, right?
- Air Presentations – The teacher can swipe through slides with shakes alone: a single shake moves forward, a double shake moves backward (sketched on the desktop side after this list).
- Air Mouse – People have built Android/iOS apps to control the desktop mouse before. With Air Mouse, you can highlight content in presentations, open and close applications, play shooter games, and more, without installing a thing. The possibilities are endless!
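For the Air Board piece, here is roughly what the receiving side could look like: the page that renders the model listens for the relayed orientation events and maps them onto a ThreeJS object. A minimal sketch, assuming an existing ThreeJS `scene` and a hypothetical `models/heart.obj` file:

```js
// Browser-side Air Board sketch: rotate a loaded model with the phone's gyro.
// Assumes a ThreeJS scene/renderer is already set up and the server relays
// the phone's "orient" events to this page.
let model = null;
new THREE.OBJLoader().load('models/heart.obj', (obj) => {
  model = obj;
  scene.add(model);
});

const degToRad = (d) => (d * Math.PI) / 180;
socket.on('orient', ({ alpha, beta, gamma }) => {
  if (!model) return;
  // Map the gyro angles onto Euler rotations; a production build would use
  // quaternions instead, to avoid gimbal lock.
  model.rotation.set(degToRad(beta), degToRad(alpha), degToRad(-gamma));
});
```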
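On the desktop side, Air Presentations and Air Mouse boil down to replaying the phone’s gestures as native input. Here is a sketch of how a Node agent could do that with RobotJS; again, the event names and payloads are illustrative, not the exact AirTeach protocol:

```js
// Node-side desktop agent: turn phone gestures into native input with RobotJS.
const io = require('socket.io')(3000); // same port the phone connects to
const robot = require('robotjs');

io.on('connection', (socket) => {
  // Air Presentations: single shake = next slide, double shake = previous
  socket.on('shake', ({ count }) => {
    robot.keyTap(count === 2 ? 'left' : 'right');
  });

  // Air Mouse: apply relative pan deltas to the current cursor position
  socket.on('pan', ({ dx, dy }) => {
    const pos = robot.getMousePos();
    robot.moveMouse(pos.x + dx, pos.y + dy);
  });

  socket.on('tap', () => robot.mouseClick('left'));
});
```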
You might still wonder about the technologies used, since I said “No installations” earlier. Here is the stack: Flask, NodeJS, HammerJS, RobotJS, Bootstrap, Redis, ThreeJS, and SocketIO.
Never ever underestimate the power of JS!
- Flask – to set up the basic MVC model
- NodeJS – to write the basic async routines for the mouse controllers
- HammerJS – to capture the touch coordinates on the smartphone
- RobotJS – to get native bindings for mouse movements on Mac/Windows
- Bootstrap – basic UI and other styling
- Redis – to establish a pub-sub channel and handle synchronization
- ThreeJS – for 3D model rendering and layering
- SocketIO – to establish socket connections between the various nodes in the network
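Since the stack mixes a Python (Flask) web app with a Node agent, Redis is the glue between them: one side publishes controller events onto a channel, the other subscribes. Here is a sketch of the Node subscriber using the classic node_redis callback API; the channel name `airteach:events` is my own placeholder:

```js
// Node subscriber: picks up controller events published (e.g. by Flask) to Redis
const redis = require('redis');
const sub = redis.createClient();

sub.on('message', (channel, message) => {
  const event = JSON.parse(message); // e.g. { type: 'shake', count: 1 }
  // ...hand the event off to the RobotJS routines shown earlier
});
sub.subscribe('airteach:events');
```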
No project is ever complete, and ours has its share of TODOs too. If you find this project interesting, let me tell you about our future goals. We plan to turn this into a framework so that anyone can build really cool applications on top of it. Imagine playing an FPS game with your mobile phone as the rifle! The possibilities are endless. Our goal was to give users a hand-held console with ZERO setup and minimal hardware requirements. Today I’m making this project open source for people to contribute!
TODOs
- Make the APIs generic and refactor the code to cater to any number of mobile controllers. Currently it supports only one!
- Add cool abilities like a panorama browser, where the user navigates a panorama image just with handset movements.
- Add a native interface for Unity-based applications, so mobiles can serve as a console for native games too.
- Build an interface to draw letters/shapes just by recognizing the gestures of the mobile phone.
- I leave the rest to you folks! Go wild with your creativity 🙂
Contribute here!