The smartphones we use in our day-to-day lives are filled with a colony of sensor chips, much as the human body is filled with organs. Just as each organ plays a vital role in keeping us alive and functional, the sensors in today's smartphones keep a streamlined flow of data running to the processor and to apps, so that each piece of functionality stays up and running.
Whenever we hold a phone in our hands, we rarely realize that this marvel of technology can provide us with information that our own minds continuously miss or fail to interpret. As I mentioned in my earlier post on the "Future of Mobile Computing", mobile devices will largely define the major innovations to come in consumer computing. In this post I will describe how we can start by providing personalized, processed information to an individual, and then gradually move on to the socially and globally indicative aspects of the data processed by these "Smart Devices".
Take the recently launched Samsung Galaxy S4 as an example: this handheld computing device is capable of giving us 14 types of sensory data, listed below:
- Time
- Location
- Altitude of the device
- Gravity
- Orientation
- Atmospheric Pressure
- Proximity
- Magnetic Field
- Linear Acceleration
- Relative Humidity
- Rotation Vector
- Gyroscope based Directional Sense
- Temperature
- Light: Indoor/Outdoor (RGB & Brightness Sensor data)
The irony is that the phone does not ship with a single application that accumulates data from all of these sensors as a combined input in order to deliver truly personalized, useful, analysed information.
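As a minimal sketch of what a "combined input" from these sensors might look like, here is a hypothetical Python record bundling a few of the channels listed above, with a rough indoor/outdoor guess driven by the light channel alone. The type, field names, and the 1000-lux threshold are all illustrative assumptions on my part, not any real phone API.

```python
from dataclasses import dataclass

# Hypothetical snapshot of a few of the sensor channels listed above.
@dataclass
class SensorSnapshot:
    timestamp: float               # seconds since epoch
    lux: float                     # ambient light level
    pressure_hpa: float            # atmospheric pressure
    linear_accel: tuple            # (x, y, z) in m/s^2, gravity removed

def classify_environment(snap: SensorSnapshot) -> str:
    """Very rough indoor/outdoor guess from the light sensor alone."""
    # Typical indoor lighting is a few hundred lux; daylight is thousands.
    return "outdoor" if snap.lux > 1000 else "indoor"

snap = SensorSnapshot(timestamp=0.0, lux=12000.0,
                      pressure_hpa=1013.25, linear_accel=(0.0, 0.0, 0.1))
print(classify_environment(snap))  # bright daylight reading -> "outdoor"
```

A real application would fuse many more channels (pressure for altitude changes, magnetic field for heading, and so on), but even a single combined record like this is more than the stock apps currently expose.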
Think of the possibilities if we could gather readings from all these sensors across everything we do in just a 24-hour window. That data could tell us about our vehicle's fuel consumption and mileage, our health, our emotional state, our location-based risks, our social patterns, and our most productive hours of the day; it could predict near-future events, raise alerts in emergencies, and deliver the right information at the right time, among countless other informational feeds.
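To make one of those ideas concrete, finding a person's "most productive hour" could start as simply as bucketing an activity score (say, derived from linear-acceleration magnitude) by hour of day and taking the peak. The data and scoring here are invented for illustration:

```python
from collections import defaultdict

# Hypothetical 24-hour log of (hour_of_day, activity_score) samples,
# where the score might be derived from linear-acceleration magnitude.
readings = [(9, 0.8), (9, 0.9), (14, 0.3), (14, 0.2), (20, 0.6)]

def busiest_hour(log):
    """Return the hour with the highest mean activity score."""
    buckets = defaultdict(list)
    for hour, score in log:
        buckets[hour].append(score)
    return max(buckets, key=lambda h: sum(buckets[h]) / len(buckets[h]))

print(busiest_hour(readings))  # -> 9 for the sample data above
```

The same bucket-and-aggregate pattern extends naturally to the other feeds: swap the activity score for pressure deltas, light levels, or location dwell times.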
Moving beyond the impact on a single individual, suppose we could process the data from a group, say a class of 15 quantum mechanics students. The teacher could quickly identify which points caused students to lose interest and which made them more enthusiastic. More broadly, combining the sensory data of a group of individuals could yield information that helps prevent epidemics or drives an ad campaign.
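The classroom case can be sketched as a group-level aggregation: average a per-student "engagement" score minute by minute and flag the minutes where the class-wide mean dips. The scores and the 0.5 threshold are hypothetical placeholders for whatever signal the motion and orientation sensors actually yield:

```python
# Hypothetical per-student engagement scores sampled each minute of a
# lecture (rows = students, columns = minutes), e.g. derived from
# motion and orientation sensor data on each student's phone.
engagement = [
    [0.9, 0.8, 0.3, 0.7],   # student 1
    [0.8, 0.9, 0.2, 0.6],   # student 2
    [0.7, 0.7, 0.4, 0.8],   # student 3
]

def low_interest_minutes(scores, threshold=0.5):
    """Minutes where the class-wide mean engagement dips below threshold."""
    n = len(scores)
    return [m for m in range(len(scores[0]))
            if sum(row[m] for row in scores) / n < threshold]

print(low_interest_minutes(engagement))  # -> [2] for the sample data
```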
And if we combine data across states, we gain global demographic data and real-time traffic statistics that could guide the construction of new highways. From the micro to the macro, the benefits this technological innovation can bring to our lives are endless.
I have begun working on a computing algorithm for this kind of sensory-data utilization in ubiquitous computing. Interested, ignited minds are welcome to contribute.
Finally, I would like you to think about studying the effects of the various informational feeds we take in through TV, mobile devices, PCs and so on, and to consider delivering personalized educational content to people in short, digestible amounts so that they can inch closer to a full grasp of a topic. Such an application would help define a ubiquitous knowledge-sharing mechanism built on users' patterns of social and informational feeds across channels.
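A first step toward "shorter digestible amounts" is simply splitting a lesson into fixed-size chunks that can be scheduled one at a time; everything here (the function name, the chunk size of 3) is an assumed starting point rather than a worked-out design:

```python
def chunk_content(sentences, chunk_size=3):
    """Split a lesson into short, digestible chunks of a few sentences."""
    return [sentences[i:i + chunk_size]
            for i in range(0, len(sentences), chunk_size)]

lesson = [f"point {i}" for i in range(7)]
print(len(chunk_content(lesson)))  # 7 points -> 3 chunks (3 + 3 + 1)
```

A fuller system would then pick *when* to deliver each chunk using the user's sensed attention patterns, tying this closing idea back to the sensor data discussed above.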