If you were to ask someone to name a new technology that emerged from MIT in the 21st century, there's a good chance they would name the robotic cheetah. Developed by the MIT Department of Mechanical Engineering's Biomimetic Robotics Lab under the direction of Associate Professor Sangbae Kim, the quadruped MIT Cheetah has made headlines for its dynamic legged gait, speed, jumping ability, and biomimetic design.
The best storytellers react to their audience. They look for smiles, signs of awe, or boredom; they simultaneously and skillfully read both the story and their sitters. Kevin Brooks, a seasoned storyteller working for Motorola’s Human Interface Labs, explains, “As the storyteller begins, they must tune in to… the audience’s energy. Based on this energy, the storyteller will adjust their timing, their posture, their characterizations, and sometimes even the events of the story. There is a dialog between audience and storyteller.”
Shortly after I read the script to Melita, the latest virtual reality experience from Madrid-based immersive storytelling company Future Lighthouse, CEO Nicolas Alcalá explained to me that the piece is an example of “reactive content,” a concept he’s been working on since his days at Singularity University.
For the first time in history, we have access to technology that can merge the reactive and affective elements of oral storytelling with the affordances of digital media, weaving stunning visuals, rich soundtracks, and complex meta-narratives in a story arena that has the capability to know you more intimately than any conventional storyteller could.
It’s no overstatement to say that the storytelling potential here is phenomenal.
In short, we can refer to content as reactive if it reads and reacts to users based on their body rhythms, emotions, preferences, and data points. Artificial intelligence is used to analyze users’ behavior or preferences to sculpt unique storylines and narratives, essentially allowing for a story that changes in real time based on who you are and how you feel.
The development of reactive content will allow those working in the industry to go one step further than simply translating the essence of oral storytelling into VR. Rather than having a narrative experience with a digital storyteller who can read you, reactive content has the potential to create an experience with a storyteller who knows you.
This means being able to subtly insert minor personal details that have a specific meaning to the viewer. When we talk to our friends we often use experiences we’ve shared in the past or knowledge of our audience to give our story as much resonance as possible. Targeting personal memories and aspects of our lives is a highly effective way to elicit emotions and aid in visualizing narratives. When you can do this with the addition of visuals, music, and characters—all lifted from someone’s past—you have the potential for overwhelmingly engaging and emotionally-charged content.
Future Lighthouse informs me that, for now, reactive content will rely primarily on biometric feedback technology such as breathing, heartbeat, and eye-tracking sensors. A simple example would be a story in which parts of the environment or soundscape change in sync with the user’s heartbeat and breathing, or characters who call you out for not paying attention.
The next step would be characters and situations that react to the user’s emotions, wherein algorithms analyze biometric information to make inferences about states of emotional arousal (“why are you so nervous?” etc.). Another example would be implementing the use of “arousal parameters,” where the audience can choose what level of “fear” they want from a VR horror story before algorithms modulate the experience using information from biometric feedback devices.
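As a toy illustration of how such an “arousal parameter” might work, the sketch below nudges a scene’s intensity up or down so that the viewer’s measured arousal converges on the fear level they chose up front. Everything here is hypothetical: the function names, the resting heart rate, and the scaling factors are invented for illustration, and no real biometric API is involved.

```python
def modulate_intensity(target_fear: float, heart_rate_bpm: float,
                       resting_bpm: float = 65.0) -> float:
    """Return a scene-intensity value in [0, 1].

    target_fear is the arousal level the viewer chose up front (0..1).
    Heart rate above resting is read as a rough proxy for arousal; if the
    viewer is already more aroused than requested, intensity is dialed
    back, and vice versa. The scaling constants are illustrative only.
    """
    # Map heart rate to a rough 0..1 arousal estimate.
    arousal = max(0.0, min(1.0, (heart_rate_bpm - resting_bpm) / 60.0))
    # Nudge intensity to close the gap between requested and measured arousal.
    return max(0.0, min(1.0, target_fear + 0.5 * (target_fear - arousal)))
```

With a target of 0.8, a calm viewer at 70 bpm would see the intensity pushed up toward its maximum, while a frightened viewer at 130 bpm would see it eased off.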
The company’s long-term goal is to gather research on storytelling conventions and produce a catalogue of story “wireframes.” This entails distilling the basic formulas of different genres so they can then be fleshed out with visuals, character traits, and soundtracks tailored to individual users based on their deep data, preferences, and biometric information.
The development of reactive content will go hand in hand with a renewed exploration of diverging, dynamic storylines, and multi-narratives, a concept that hasn’t had much impact in the movie world thus far. In theory, the idea of having a story that changes and mutates is captivating largely because of our love affair with serendipity and unpredictability, a cultural condition theorist Arthur Kroker refers to as the “hypertextual imagination.” This feeling of stepping into the unknown with the possibility of deviation from the habitual translates as a comforting reminder that our own lives can take exciting and unexpected turns at any moment.
The concept entered mainstream culture with the classic Choose Your Own Adventure book series, launched in the late 70s, which enjoyed great success in its literary form. Filmic takes on the theme, however, have made somewhat less of an impression. DVDs like I’m Your Man (1998) and Switching (2003) both use scene-selection tools to determine the direction of the storyline.
A more recent example comes from Kino Industries, who claim to have developed the technology to allow filmmakers to produce interactive films in which viewers can use smartphones to quickly vote on which direction the narrative takes at numerous decision points throughout the film.
The main problem with diverging narrative films has been the stop-start nature of the interactive element: when I’m immersed in a story I don’t want to have to pick up a controller or remote to select what happens next. Every time the audience is given the option to take a new path (“press this button”, “vote on X, Y, Z”), the narrative, and the viewer’s immersion in it, is temporarily halted, and it takes the mind a while to return to that immersed state.
Reactive content has the potential to resolve these issues by enabling passive interactivity—that is, input and output without having to pause and actively make decisions or engage with the hardware. This will result in diverging, dynamic narratives that will unfold seamlessly while being dependent on and unique to the specific user and their emotions. Passive interactivity will also remove the game feel that can often be a symptom of interactive experiences and put a viewer somewhere in the middle: still firmly ensconced in an interactive dynamic narrative, but in a much subtler way.
While reading the Melita script I was particularly struck by a scene in which the characters start to engage with the user and there’s a synchronicity between the user’s heartbeat and objects in the virtual world. As the narrative unwinds and the words of Melita’s character get more profound, parts of the landscape, which seemed to be flashing and pulsating at random, come together and start to mimic the user’s heartbeat.
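The Melita effect described above, objects pulsing at random that gradually lock onto the user’s heartbeat, can be sketched as a simple cross-fade between a free-running pulse and a heartbeat-locked pulse, with a sync parameter ramping from 0 to 1 as the narrative progresses. This is purely an illustration of the idea; the function and parameter names are invented, not Future Lighthouse’s implementation.

```python
import math

def pulse_brightness(t: float, heart_period: float,
                     random_period: float, sync: float) -> float:
    """Brightness in [0, 1] for a pulsating object at time t (seconds).

    sync = 0 -> the object pulses on its own arbitrary period;
    sync = 1 -> the object pulses in lockstep with the user's heartbeat
    (heart_period would come from a heart-rate sensor in practice).
    """
    random_pulse = 0.5 * (1 + math.sin(2 * math.pi * t / random_period))
    heart_pulse = 0.5 * (1 + math.sin(2 * math.pi * t / heart_period))
    # Linear blend: as sync rises, the heartbeat component dominates.
    return (1 - sync) * random_pulse + sync * heart_pulse
```

Driving many scene objects with different random periods but a shared heart period would reproduce the script’s effect of scattered flashing that slowly converges on a single rhythm.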
In 2013, Jane Aspell of Anglia Ruskin University (UK) and Lukas Heydrich of the Swiss Federal Institute of Technology showed that a user’s sense of presence and identification with a virtual avatar could be dramatically increased by syncing the on-screen character with the user’s heartbeat. The relationship between bio-digital synchronicity, immersion, and emotional engagement surely holds revolutionary potential for narrative and storytelling.
Image Credit: Tithi Luadthong / Shutterstock.com
As Dorothy famously said in The Wizard of Oz, there’s no place like home. Home is where we go to rest and recharge. It’s familiar, comfortable, and our own. We take care of our homes by cleaning and maintaining them, and fixing things that break or go wrong.
What if our homes, on top of giving us shelter, could also take care of us in return?
According to Chris Arkenberg, this could be the case in the not-so-distant future. As part of Singularity University’s Experts On Air series, Arkenberg gave a talk called “How the Intelligent Home of The Future Will Care For You.”
Arkenberg is a research and strategy lead at Orange Silicon Valley, and was previously a research fellow at the Deloitte Center for the Edge and a visiting researcher at the Institute for the Future.
Arkenberg told the audience that there’s an evolution going on: homes are going from being smart to being connected, and will ultimately become intelligent.
Intelligent home technologies are just now budding, but broader trends point to huge potential for their growth. We as consumers already expect continuous connectivity wherever we go—what do you mean my phone won’t get reception in the middle of Yosemite? What do you mean the smart TV is down and I can’t stream Game of Thrones?
As connectivity has evolved from a privilege to a basic expectation, Arkenberg said, we’re also starting to have a better sense of what it means to give up our data in exchange for services and conveniences. It’s so easy to click a few buttons on Amazon and have stuff show up at your front door a few days later—never mind that data about your purchases gets recorded and aggregated.
“Right now we have single devices that are connected,” Arkenberg said. “Companies are still trying to show what the true value is and how durable it is beyond the hype.”
Connectivity is the basis of an intelligent home. To take a dumb object and make it smart, you get it online. Belkin’s Wemo, for example, lets users control lights and appliances wirelessly and remotely, and can be paired with Amazon Echo or Google Home for voice-activated control.
Speaking of voice-activated control, Arkenberg pointed out that physical interfaces are evolving, too, to the point that we’re actually getting rid of interfaces entirely, or transitioning to ‘soft’ interfaces like voice or gesture.
Drivers of Change
Consumers are open to smart home tech and companies are working to provide it. But what are the drivers making this tech practical and affordable? Arkenberg said there are three big ones:
Computation: Computers have gotten exponentially more powerful over the past few decades. If it wasn’t for processors that could handle massive quantities of information, nothing resembling an Echo or Alexa would even be possible. Artificial intelligence and machine learning are powering these devices, and they hinge on computing power too.
Sensors: “There are more things connected now than there are people on the planet,” Arkenberg said. Market research firm Gartner estimates there are 8.4 billion connected things currently in use. Wherever software can replace dedicated hardware, it is doing so. Cheaper sensors mean we can connect more things, which can then connect to each other.
Data: “Data is the new oil,” Arkenberg said. “The top companies on the planet are all data-driven giants. If data is your business, though, then you need to keep finding new ways to get more and more data.” Home assistants are essentially data collection systems that sit in your living room and collect data about your life. That data in turn sets up the potential of machine learning.
Colonizing the Living Room
Alexa and Echo can turn lights on and off, and Nest can help you be energy-efficient. But beyond these, what does an intelligent home really look like?
Arkenberg’s vision of an intelligent home uses sensing, data, connectivity, and modeling to manage resource efficiency, security, productivity, and wellness.
Autonomous vehicles provide an interesting comparison: they’re surrounded by sensors that constantly map the world, building dynamic models that let the vehicle understand and predict the changes around it. Might we want this to become a model for our homes, too? By making homes smart and connecting them, Arkenberg said, they’d become “more biological.”
There are already several products on the market that fit this description. RainMachine uses weather forecasts to adjust home landscape watering schedules. Neurio monitors energy usage, identifies areas where waste is happening, and makes recommendations for improvement.
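A weather-aware watering schedule of the kind RainMachine offers can be sketched in a few lines. To be clear, this is an illustrative toy, not RainMachine’s actual algorithm: the function name and the assumption that each millimetre of forecast rain offsets 10% of the baseline watering need are both invented.

```python
def adjust_watering(base_minutes: float, forecast_rain_mm: float) -> float:
    """Scale a zone's watering time down by expected rainfall.

    Assumes, purely for illustration, that each millimetre of forecast
    rain offsets 10% of the baseline watering need; heavy rain cancels
    the cycle entirely.
    """
    factor = max(0.0, 1.0 - 0.1 * forecast_rain_mm)
    return round(base_minutes * factor, 1)
```

Under those assumptions, a 30-minute cycle shrinks to 15 minutes when 5 mm of rain is forecast, and is skipped altogether for a downpour.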
These are small steps in connecting our homes with knowledge systems and giving them the ability to understand and act on that knowledge.
He sees the homes of the future being equipped with digital ears (in the form of home assistants, sensors, and monitoring devices) and digital eyes (in the form of facial recognition technology and machine vision to recognize who’s in the home). “These systems are increasingly able to interrogate emotions and understand how people are feeling,” he said. “When you push more of this active intelligence into things, the need for us to directly interface with them becomes less relevant.”
Could our homes use these same tools to benefit our health and wellness? FREDsense uses bacteria to create electrochemical sensors that can be applied to home water systems to detect contaminants. If that’s not personal enough for you, get a load of this: ClinicAI can be installed in your toilet bowl to monitor and evaluate your biowaste. What’s the point, you ask? Early detection of colon cancer and other diseases.
What if one day your toilet’s biowaste analysis system could link up with your fridge, so that when you opened it, it would tell you what to eat, how much, and at what time of day?
Roadblocks to Intelligence
“The connected and intelligent home is still a young category trying to establish value, but the technological requirements are now in place,” Arkenberg said. We’re already used to living in a world of ubiquitous computation and connectivity, and we have entrained expectations about things being connected. For the intelligent home to become a widespread reality, its value needs to be established and its challenges overcome.
One of the biggest challenges will be getting used to the idea of continuous surveillance. “We’ll get convenience and functionality if we give up our data, but how far are we willing to go? Establishing security and trust is going to be a big challenge moving forward,” Arkenberg said.
There’s also cost and reliability, interoperability and fragmentation of devices, or conversely, what Arkenberg called ‘platform lock-on,’ where you’d end up relying on only one provider’s system and be unable to integrate devices from other brands.
Ultimately, Arkenberg sees homes being able to learn about us, manage our scheduling and transit, watch our moods and our preferences, and optimize our resource footprint while predicting and anticipating change.
“This is the really fascinating provocation of the intelligent home,” Arkenberg said. “And I think we’re going to start to see this play out over the next few years.”
Sounds like a home Dorothy wouldn’t recognize, in Kansas or anywhere else.
Stock Media provided by adam121 / Pond5
Singapore International Robo Expo debuts as the robotics sector is poised for accelerated growth
In partnership with Experia Events, the Singapore Industrial Automation Association sets its sights on boosting the robotics solutions industry with this strategic global platform for innovation and technology
Singapore, 18 October 2016 – The first Singapore International Robo Expo (SIRE), organised by Experia Events and co-organised by the Singapore Industrial Automation Association (SIAA), will be held from 1 to 2 November 2016, at Sands Expo and Convention Centre, Marina Bay Sands.
Themed Forging the Future of Robotics Solutions, SIRE will comprise an exhibition, product demonstrations, networking sessions and conferences. SIRE aims to be the global platform for governments, the private sector and academia to engage in dialogue, share industry best practices, network, forge partnerships, and explore funding opportunities for the adoption of robotics solutions.
“SIRE debuts at a time when robotics has been gaining traction worldwide due to the need for automation and better productivity. The latest World Robotics Report by the International Federation of Robotics has also identified Singapore as a market with one of the highest robot densities in manufacturing – giving us more opportunities for further development in this field, as well as its extension into the services sectors.
“With the S$450 million pledged by the Singapore government to the National Robotics Programme to develop the industry over the next three years, SIRE is aligned with these goals to cultivate the adoption of robotics and support the growing industry. As an association, we are constantly looking for ways to promote robotics adoption, foster collaboration among partners, and provide funding support for our members. SIRE is precisely the strategic platform for this,” said Mr Oliver Tian, President, SIAA.
SIRE has attracted strong interest from institutes of higher learning (IHLs), research institutes, and local and international enterprises, with innovations and technologies applicable to a wide range of industries from manufacturing to healthcare.
ST Kinetics, the Title Sponsor for the inaugural edition of the event, is one of the key exhibitors, together with other leading industry players such as ABB, Murata, Panasonic, SICK Pte Ltd, and Tech Avenue amongst others. Emerging SMEs such as H3 Dynamics, Design Tech Technologies and SMP Robotics Singapore will also showcase their innovations at the exhibition. Participating research institute, A*STAR’s SIMTech, and other IHLs supporting the event include Ngee Ann Polytechnic, Republic Polytechnic and the Institute of Technical Education (ITE).
Visitors will also be able to view “live” demonstrations at the Demo Zone and get up close with the latest innovations and technologies. Key highlights at the zone include the world’s only fully autonomous outdoor security robot, developed by SMP Robotics Singapore, as well as ABB’s YuMi (IRB 14000), a collaborative robot designed to work safely in close collaboration and proximity with humans. Dynamic Stabilization Systems, SIMTech and Design Tech will also be demonstrating the capabilities of their robotic innovations at the zone.
At the Singapore International Robo Convention, key speakers representing regulators, industry leaders and academia will come together to exchange insights and engage in discourse on various aspects of robotics and automation technology, industry trends, and case studies of robotics solutions. There will also be a session on the Singapore National Robotics Programme led by Mr Haryanto Tan, Head, Precision Engineering Cluster Group, EDB Singapore.
SIRE will also host the France-Singapore Innovation Days in collaboration with Business France, the national agency supporting the international development of the French economy. The organisation will lead a delegation of 20 key French companies to explore business and networking opportunities with Singapore firms, and conduct specialized workshops.
To foster a deeper appreciation of robotics and inspire the next generation of robotics and automation experts, the event will also host students from institutes of higher learning on Education Day on 2 November. Students will be able to immerse themselves in the exciting developments of the robotics industry and get a sense of how robotics can be applied in real-world settings by visiting the exhibits and interacting with representatives from participating companies.
Mr Leck Chet Lam, Managing Director, Experia Events, says, “SIRE will be a game changer for the industry. We are expecting the industry’s best and new-to-market players to showcase their innovations, which could potentially add value to operations across a wide spectrum of industry sectors, from manufacturing to retail, service, and healthcare. We also hope to inspire the robotics and automation experts of tomorrow with our Education Day programme.
“Experia Events prides itself as a company that organises strategic events for the global stage, featuring thought leaders and working with the industries’ best. It is an honour for us to be partnering with SIAA, a recognised body and key player in the robotics industry. We are privileged to be able to help elevate Singapore’s robotics industry through SIRE and are pulling out all the stops to ensure that the event will be a resounding success.”
SIRE is supported by Strategic Partner, IE Singapore as well as agencies including EDB Singapore, GovTech Singapore, InfoComm Media Development Authority, A*STAR’s SIMTech, and Spring Singapore.
For further enquiries, please contact:
Marilyn Ho
Experia Events Pte Ltd
Director, Communications
Tel: +65 6595 6130
Email: firstname.lastname@example.org
Genevieve Yeo
Experia Events Pte Ltd
Assistant Manager, Communications
Tel: +65 6595 6131
Email: email@example.com
The post Singapore International Robotics Expo appeared first on Roboticmagazine.