Zen and the Art of Assistant Robot Maintenance
SensoSmart is not a physical Sensor.
SensoSmart is not a single feature or function.
SensoSmart is about Universal Accessibility.
SensoSmart is about facilitating Accessibility through Mainstream resources.
SensoSmart is about detecting the gradual changes in people's senses -- for example hearing -- and predicting the level of assistance needed for a given context: for example, listening to audio books on a mobile device.
SensoSmart is about providing personalized features for everybody.
SensoSmart is about figuring out which features will follow you to all the various devices you want to use -- anywhere, anytime -- and making sure the network provides that personalization.
SensoSmart is about making the world safe for people and their Assistant Robots and Devices.
Can you imagine having surgery to implant an expensive device in your body or in your brain to control a prosthesis, and then, a few months later, a new and improved version comes out on the market? The ultimate in Robot Envy. SensoSmart is about helping the Robot become Smarter, so there is less dependence upon humans to control every aspect of the Robot.
SensoSmart exploits connectivity, leverages the community of users, employs sensors that are already in the environment, accesses available databases, and optimizes network analytics.
SensoSmart is about tapping into all that data on the cloud.
SensoSmart virtual sensors capture data from the environment -- for example, from microphones or cameras on smartphones, and from the many sensors already deployed or being invented as we speak. Network analysis improves understanding of the context. For example, people label the sounds, perhaps add a picture, and the network uses machine learning to build a resource of labeled acoustic samples and develop knowledge of the environment. Add in global positioning services, locations of businesses, databases of objects, history, and the previous experience of others, and SensoSmart will be able to recommend a setting for your hearing aid, tuned to your personal audiogram, so you can enjoy the binge-watch, the movie, or the concert.
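The recommendation idea above can be sketched in a few lines. This is a hypothetical illustration, not an actual SensoSmart API: the preset registry, the `recommend_preset` function, and the gain rule are all invented assumptions standing in for community data and network analytics.

```python
# Hypothetical sketch: recommend a hearing-aid preset from a community-labeled
# acoustic context plus the user's own audiogram. All names and presets here
# are illustrative assumptions, not a real SensoSmart interface.

from dataclasses import dataclass

# Community-labeled acoustic contexts, indexed by label. In a real system
# these would be learned from crowd-sourced, labeled samples on the network.
COMMUNITY_PRESETS = {
    "crowded restaurant": {"noise_reduction": "high",   "directionality": "front"},
    "concert hall":       {"noise_reduction": "low",    "directionality": "omni"},
    "classroom":          {"noise_reduction": "medium", "directionality": "front"},
}

@dataclass
class Audiogram:
    # Hearing loss in dB at a few standard frequencies (Hz),
    # e.g. {500: 20, 1000: 30, 4000: 55}.
    loss_db: dict

def recommend_preset(context_label: str, audiogram: Audiogram) -> dict:
    """Combine a community preset with personal gain derived from the audiogram."""
    preset = dict(COMMUNITY_PRESETS.get(
        context_label,
        {"noise_reduction": "medium", "directionality": "omni"},  # fallback
    ))
    # Simple illustrative rule (an assumption, not audiology): boost gain at
    # each frequency by half the measured loss.
    preset["gain_db"] = {freq: round(loss / 2) for freq, loss in audiogram.loss_db.items()}
    return preset

setting = recommend_preset("crowded restaurant", Audiogram({500: 20, 1000: 30, 4000: 55}))
print(setting)
```

The point of the sketch is the division of labor: the community supplies the context knowledge, the network supplies the lookup, and only the audiogram is personal.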
Your friend can hear you even in that crowded restaurant with all the music in the background.
Or the student can hear the teacher even in the back of the room. Even next to the person who is twitching a lot in that squeaky chair.
SensoSmart will provide the solution to you in one click so you don't have to become a programmer to get it.
What's missing from the picture above?
The Sound of the Emergency Vehicle!
SensoSmart will use available sensors to learn about and recognize important sounds, then display them in a Personalized way -- for example, through modified audio, a visual display, or a haptic display. Further, the sensor data and analysis will be repurposed for more applications: for example, protecting the hearing of workers in dangerous situations, or improving human-robot interaction for job-related tasks.
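The recognize-then-display idea above amounts to a small dispatch: detect a sound that matters, then route it to whatever modality the user prefers. A minimal sketch, with an assumed label set and invented presentation strings:

```python
# Illustrative sketch (not an actual SensoSmart API): recognize an important
# sound and route it to the user's preferred display modality --
# modified audio, visual, or haptic.

IMPORTANT_SOUNDS = {"siren", "smoke alarm", "car horn"}  # assumed label set

def alert(sound_label: str, preferred_modality: str) -> str:
    """Return a description of how the alert would be presented."""
    if sound_label not in IMPORTANT_SOUNDS:
        return "no alert"
    presentations = {
        "audio":  f"amplify '{sound_label}' into the user's hearing range",
        "visual": f"flash on-screen warning: {sound_label} detected",
        "haptic": f"vibrate wristband pattern for '{sound_label}'",
    }
    # Fall back to a visual alert if the preference is unknown.
    return presentations.get(preferred_modality, presentations["visual"])

print(alert("siren", "haptic"))
```

A driver who cannot hear the emergency vehicle gets the wristband buzz; a worker in a loud environment gets the flashing warning, from the same detection pipeline.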
SensoSmart connects people and devices to the network to sense their context, learn, recommend, and share resources.
SensoSmart makes things Smarter and more autonomous.
SensoSmart taps into learning from communities of users to help share and update solutions for Assistant Robots and Devices.
In today's world, people buy an expensive hearing aid and have a fixed set of features. Then in a few years, they can buy a new hearing aid and perhaps get new features. To connect with other media, they have to buy proprietary connectors, for example, to listen to television or connect to a telephone.
These solutions are more expensive and prevent community development. They polarize people into two camps -- those who use the proprietary solution and the rest of the world.
SensoSmart means people can use mainstream network connectivity and resources to find accommodation that meets their needs. And community members can make contributions to the solutions in a model like Google's Play Store.
SensoSmart means people will no longer have to wait to buy a new hearing aid in order to gain new features and solutions. Standalone expensive and proprietary devices will be a thing of the past.
Hearing aid manufacturing meets approximately 10% of the world's need for hearing aids. Many people are not being served. SensoSmart aims to provide a new means to reach people who need personalized hearing, vision, motor, cognitive, and health assistants.
Filling a need.
The Sky is the Limit. Or rather, the Creativity of the Community and the Resources of the Network.
SensoSmart embraces a future of Robots. Assistant Robots. More Autonomous Robots. Hearing aids are a special type of robot. They help people to achieve things they cannot achieve without assistance.
People are living longer, and we predict there will be a greater need for assistant robots and devices. We predict there will be many more things people will want to do with their devices, and it will become much more important to have ways to teach people and robots how to collaborate. That's why SensoSmart focuses on human-robot interaction, on methods for learning how to use robots and teaching them new skills, and on sharing resources on the network.
There are many wonderful robots. They can be expensive and limited in features and functions. Robots can be very useful for short periods of time. Then the cost -- in money and time -- of upgrading the robot or adding features can be very high.
Rehabilitation Robots, Active Prostheses, and Exoskeletons: Market Shares, Strategies, and Forecasts, Worldwide
SensoSmart envisions a slimmer robot architecture, such as the Noonee Chairless Chair, with more flexible models for updating control programs -- for example, a model where members of the community can share control programs. In the case of a leg exoskeleton like the Chairless Chair, programs might be useful for job assistance, for rehabilitation, or for needs identified by a consumer group.
SensoSmart applies the same principles to share learning of one robot with any other robot on the network.
For example, one robot learns about the grasp aperture and weight in order to pick up a consumer item, and then shares that information on the network indexed by a product code. Now any other robot can access that information.
A QR or product code is just one quick way to identify what the human and robot are collaborating on. Other, more sophisticated recognition programs exist, but SensoSmart's effort is to simplify and to utilize resources from the cloud to make devices Smarter and more autonomous.
If a person's assistant robot knows how much a consumer item weighs, and the robot payload is known, then the user can be protected from attempting to lift an object that is too heavy for the robot. Much safer situation for everyone.
If a person's assistant robot knows the grasp aperture, then the human has one less instruction to give the robot in order to manipulate the object. More robot autonomy. Simplified human-robot collaboration.
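The share-and-reuse pattern described above -- one robot publishes grasp aperture and weight indexed by product code, another robot looks them up and checks its payload before lifting -- can be sketched like this. The registry, function names, and field names are assumptions for illustration, not a defined SensoSmart protocol:

```python
# Hedged sketch: one robot shares learned manipulation parameters on the
# network, indexed by a product code; another robot reuses them and guards
# against lifting an object heavier than its payload.

NETWORK_REGISTRY = {}  # product code -> learned manipulation parameters

def share_grasp(product_code: str, grasp_aperture_mm: float, weight_kg: float):
    """First robot publishes what it learned about the item."""
    NETWORK_REGISTRY[product_code] = {
        "grasp_aperture_mm": grasp_aperture_mm,
        "weight_kg": weight_kg,
    }

def plan_pickup(product_code: str, robot_payload_kg: float) -> str:
    """Second robot reuses shared knowledge and checks its own limits."""
    params = NETWORK_REGISTRY.get(product_code)
    if params is None:
        return "unknown item: ask the human for guidance"
    if params["weight_kg"] > robot_payload_kg:
        return "refuse: item exceeds robot payload"  # safer for everyone
    return f"grasp with aperture {params['grasp_aperture_mm']} mm"

share_grasp("0123456789012", grasp_aperture_mm=85.0, weight_kg=1.2)
print(plan_pickup("0123456789012", robot_payload_kg=2.0))
print(plan_pickup("0123456789012", robot_payload_kg=0.5))
```

Note that the safety check costs the human nothing: the weight was learned once, by some other robot, and the refusal happens before any lift is attempted.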
Or a robot owner creates a new program for his or her robot to dance along with a popular song. A machine-readable code in the YouTube video can provide a one-click link to the new control program, and another robot now knows the dance.
How cool is that!
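The one-click dance could look like the sketch below. The `senso://` URL scheme, the registry, and the tiny move-list program format are all invented for illustration; in practice the code in the video would resolve to wherever the community hosts shared control programs.

```python
# Hypothetical sketch: a machine-readable code embedded in a video resolves to
# a community-shared control program, which another robot downloads and runs.
# The URL scheme and program format are assumptions, not a real spec.

SHARED_PROGRAMS = {
    # code payload -> a tiny "dance" program: list of (move, beats)
    "senso://programs/dance-42": [("step_left", 2), ("step_right", 2), ("spin", 4)],
}

def follow_code(code_payload: str):
    """Resolve a scanned code to the community-shared control program."""
    program = SHARED_PROGRAMS.get(code_payload)
    if program is None:
        raise LookupError("program not found on the network")
    return program

def perform(program):
    """Turn the move list into executable steps (here, just descriptions)."""
    return [f"{move} for {beats} beats" for move, beats in program]

dance = follow_code("senso://programs/dance-42")
print(perform(dance))
```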