Tag Archives: dr

#431315 Better Than Smart Speakers? Japan Is ...

While American internet giants are developing smart speakers, Japanese companies are working on robots and holograms. They all share a common goal: to create the future platform for the Internet of Things (IoT) and smart homes.
Names like Bocco, EMIEW3, Xperia Agent, and Gatebox may not ring a bell for most people outside of Japan, but Sony, Hitachi, Sharp, and Softbank most certainly do. Those companies, along with Japanese start-ups, are the ones behind that short list of names, having developed robots, robot concepts, and even holograms.
While there are distinct differences between the various systems, they share the potential to act as a remote control for IoT devices and smart homes. It is a very different direction than that taken by companies like Google, Amazon, and Apple, who have so far focused on building IoT speaker systems.
Bocco robot. Image Credit: Yukai Engineering
“Technology companies are pursuing the platform—or smartphone if you will—for IoT. My impression is that Japanese companies—and Japanese consumers—prefer that such a platform should not just be an object, but a companion,” says Kosuke Tatsumi, designer at Yukai Engineering, a startup that has developed the Bocco robot system.
At Hitachi, a spokesperson said that the company’s human symbiotic service robot, EMIEW3, is currently in the field, doing proof-of-value tests at customer sites to investigate needs and potential solutions. This could include working as an interactive control system for the Internet of Things:
“EMIEW3 is able to communicate with humans, thus receive instructions, and as it is connected to a robotics IT platform, it is very much capable of interacting with IoT-based systems,” the spokesperson said.
The power of speech is getting feet
Gartner analysis predicts that there will be 8.4 billion internet-connected devices—collectively making up the Internet of Things—by the end of 2017. 5.2 billion of those devices are in the consumer category. By the end of 2020, the number of IoT devices will rise to 12.8 billion—and that is just in the consumer category.
As a child of the 80s, I can vividly remember how fun it was to have separate remote controls for TV, video, and stereo. I can imagine a similar situation where my internet-connected refrigerator, thermostat, television, and toaster all try to work out who I’m talking to and what I want them to do.
Consensus seems to be that speech will be the way to interact with many, if not most, IoT devices, and that some form of virtual assistant will function as the IoT platform—or remote control. Almost everything else is still an open question, despite an early surge for speaker-based systems like those from Amazon, Google, and Apple.
Why robots could rule
Famous android creator and robot scientist Dr. Hiroshi Ishiguro sees the interaction between humans and the AI embedded in speakers or robots as central to both approaches. From there, the approaches differ greatly.
Image Credit: Hiroshi Ishiguro Laboratories
“It is about more than the difference of form. Speaking to an Amazon Echo is not a natural kind of interaction for humans. That is part of what we in Japan are creating in many human-like robot systems,” he says. “The human brain is constructed to recognize and interact with humans. This is part of why it makes sense to focus on developing the body for the AI mind as well as the AI mind itself. In a way, you can describe it as the difference between developing an assistant, which could be said to be what many American companies are currently doing, and a companion, which is more the focus here in Japan.”
Another advantage is that robots are more kawaii—a multifaceted Japanese word that can be translated as “cute”—than speakers are. This makes it easy for people to relate to them and forgive them.
“People are more willing to forgive children when they make mistakes, and the same is true with a robot like Bocco, which is designed to look kawaii and childlike,” Kosuke Tatsumi explains.
Japanese robots and holograms with IoT-control capabilities
So, what exactly do these robot and hologram companions look like, what can they do, and who’s making them? Here are seven examples of Japanese companies working to go a step beyond smart speakers with personable robots and holograms.
1. In 2016 Sony’s mobile division demonstrated the Xperia Agent concept robot that recognizes individual users, is voice controlled, and can do things like control your television and receive calls from services like Skype.

2. Sharp launched its Home Assistant at CES 2016, a robot-like, voice-controlled assistant that can control, among other things, air conditioning units and televisions. Sharp has also launched a robotic phone called RoBoHon.
3. Gatebox has created a holographic virtual assistant. Cynics will say that it is primarily the expression of an otaku (Japanese for nerd) dream of living with a manga heroine. Gatebox is, however, able to control things like lights, TVs, and other systems through API integration. It also provides its owner with weather-related advice like “remember your umbrella, it looks like it will rain later.” Gatebox can be controlled by voice, gesture, or via an app.
4. Hitachi’s EMIEW3 robot is designed to assist people in businesses and public spaces. It is connected via the cloud to a robotics IT platform that acts as a “remote brain.” Hitachi is currently investigating business use cases for EMIEW3, which could include acting as a control platform for IoT devices.

5. Softbank’s Pepper robot has been used by Avatarion as a platform for controlling medical IoT devices such as smart thermometers. The company has also developed various in-house systems that enable Pepper to control IoT devices like a coffee machine: a user simply asks Pepper to brew a cup of coffee, and it starts the coffee machine (see the sketch after this list).
6. Yukai Engineering’s Bocco registers when a person (e.g., young child) comes home and acts as a communication center between that person and other members of the household (e.g., parent still at work). The company is working on integrating voice recognition, voice control, and having Bocco control things like the lights and other connected IoT devices.
7. Last year Toyota launched the Kirobo Mini, a companion robot which aims to, among other things, help its owner by suggesting “places to visit, routes for travel, and music to listen to” during the drive.
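Under the hood, most of these “robot as remote control” scenarios reduce to the same pattern: a speech front end recognises an intent, and the robot then sends a command to the matching connected device over a home-automation protocol or vendor API. The toy sketch below illustrates that routing step in Python; the device names, intents, and the publish() stub are invented for illustration and are not details of any product listed above.

# Toy intent-to-command router, illustrating the "robot as IoT remote control"
# idea described above. Device names, intents and the publish() stub are
# invented for illustration; a real system would talk to MQTT, a vendor cloud
# API, or a local hub instead of printing.

INTENT_TO_COMMAND = {
    "brew coffee":    ("kitchen/coffee_machine", "start_brew"),
    "lights on":      ("living_room/lights",     "on"),
    "lights off":     ("living_room/lights",     "off"),
    "set heat to 21": ("hallway/thermostat",     "target_temp:21"),
}

def publish(topic: str, payload: str) -> None:
    """Stand-in for a real message bus or device API call."""
    print(f"publish -> topic={topic!r} payload={payload!r}")

def handle_utterance(recognised_text: str) -> None:
    """Route a recognised voice command to the matching device, if any."""
    command = INTENT_TO_COMMAND.get(recognised_text.lower().strip())
    if command is None:
        print(f"No device mapping for: {recognised_text!r}")
        return
    topic, payload = command
    publish(topic, payload)

# "A user simply asks Pepper to brew a cup of coffee..."
handle_utterance("Brew coffee")      # -> publish to kitchen/coffee_machine
handle_utterance("Dim the lights")   # -> no mapping; the robot would ask back

The interesting design question, and the one the companies above answer differently, is what sits in front of this routing step: a disembodied speaker, a kawaii robot, or a hologram.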

Today, Japan. Tomorrow…?
One of the key questions is whether this emerging phenomenon is a purely Japanese thing, and whether the country’s love of robots makes it fundamentally different. Japan is, after all, a country where new units of Softbank’s Pepper robot routinely sell out in minutes and the RoBoHon robot-phone has its own cafe nights in Tokyo.
It is a country where TV introduces you to friendly, helpful robots like Doraemon and Astro Boy. I, on the other hand, first met robots in the shape of Arnold Schwarzenegger’s Terminator and struggled to work out why robots seemed intent on permanently borrowing things like clothes and motorcycles, not to mention why they hated people called Sarah.
However, research suggests that a big part of the reason why Japanese people seem to like robots is a combination of exposure and positive experiences, which leads to greater acceptance. As robots spread to more and more industries—and into our homes—our acceptance of them will grow.
The argument is also backed by a project by Avatarion, which used Softbank’s Nao-robot as a classroom representative for children who were in the hospital.
“What we found was that the other children quickly adapted to interacting with the robot and treating it as the physical representation of the child who was in hospital. They accepted it very quickly,” Thierry Perronnet, General Manager of Avatarion, explains.
His company has also developed solutions where Softbank’s Pepper robot is used as an in-home nurse and controls various medical IoT devices.
If robots end up becoming our preferred method for controlling IoT devices, it is by no means certain that said robots will be coming from Japan.
“I think that the goal for both Japanese and American companies—including the likes of Google, Amazon, Microsoft, and Apple—is to create human-like interaction. For this to happen, technology needs to evolve and adapt to us and how we are used to interacting with others, in other words, have a more human form. Humans’ speed of evolution cannot keep up with technology’s, so it must be the technology that changes,” Dr. Ishiguro says.
Image Credit: Sony Mobile Communications

Posted in Human Robots

#431171 SceneScan: Real-Time 3D Depth Sensing ...

Nerian Introduces a High-Performance Successor for the Proven SP1 System
Stereo vision, which is the three-dimensional perception of our environment with two sensors like our eyes, is a well-known technology. As a passive method – there is no need to emit light in the visible or invisible spectral range – this technology can open up new possibilities for three-dimensional perception, even under difficult conditions.
But, as so often, the devil is in the details: for most applications, a software implementation on standard PCs, or even on graphics processors, is too slow. Another complicating factor is that these hardware platforms are expensive and not energy-efficient. The solution is to instead use specialized hardware for image processing. A programmable logic device – a so-called FPGA – can greatly accelerate the image processing.
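For readers who want a feel for the underlying math: stereo processing matches each pixel between the left and right images and converts the resulting disparity into depth. A minimal sketch of that conversion is shown below; the focal length and baseline values are illustrative assumptions, not Karmin or SceneScan specifications.

# Minimal depth-from-disparity sketch (illustrative parameters, not Nerian specs).
# With focal length f (in pixels), stereo baseline b (in meters) and disparity d
# (in pixels), depth follows as Z = f * b / d.

def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 1400.0,   # assumed focal length
                         baseline_m: float = 0.25) -> float: # assumed baseline
    """Return the depth in meters corresponding to one pixel's disparity."""
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive; zero would mean no match.")
    return focal_length_px * baseline_m / disparity_px

# Example: a 35-pixel disparity corresponds to 10 m with these assumed parameters.
print(depth_from_disparity(35.0))  # -> 10.0

The matching search that produces each disparity value has to run for every pixel of every frame. At the rates quoted below for the new models – up to 100 frames per second at resolutions of up to 3 megapixels – that means hundreds of millions of correspondence decisions per second, exactly the kind of regular, highly parallel workload an FPGA handles far more efficiently than a general-purpose processor.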
As a technology leader, Nerian Vision Technologies has been following this path successfully for the past two years with the SP1 stereo vision system, which has enabled completely new applications in the fields of robotics, automation technology, medical technology, autonomous driving and other domains. Now the company introduces two successors:
SceneScan and SceneScan Pro. Real eye-catchers in a double sense: stereo vision in an elegant design! But more important are, of course, the significantly improved inner workings of the two new models in comparison to their predecessor. The new hardware allows processing rates of up to 100 frames per second at resolutions of up to 3 megapixels, which leaves the SP1 far behind:
Photo Credit: Nerian Vision Technologies – www.nerian.com

The table illustrates the difference: while SceneScan Pro has the highest possible computing power and is designed for the most demanding applications, SceneScan has been cost-reduced for applications with lower requirements. The customer can thus optimize his embedded vision solution both in terms of costs and technology.
The new duo is completed by Nerian’s proven Karmin stereo cameras. Of course, industrial USB3 Vision cameras by other manufacturers are also supported. This combination not only supports the above-mentioned applications even better, but also facilitates completely new and innovative ones. If required, customer-specific adaptations are also possible.
Contact
Nerian Vision Technologies
Owner: Dr. Konstantin Schauwecker
Gotenstr. 9
70771 Leinfelden-Echterdingen
Germany
Phone: +49 711 / 2195 9414
Email: service@nerian.com
Website: http://nerian.com
Press Release Authored By: Nerian Vision Technologies
Photo Credit: Nerian Vision Technologies – www.nerian.com
The post SceneScan: Real-Time 3D Depth Sensing Through Stereo Vision appeared first on Roboticmagazine.

Posted in Human Robots

#430830 Biocomputers Made From Cells Can Now ...

When it comes to biomolecules, RNA doesn’t get a lot of love.
Maybe you haven’t even heard of the silent workhorse. RNA is the cell’s de facto translator: like a game of telephone, RNA carries DNA’s genetic code to cellular factories called ribosomes. There, the cell makes proteins based on RNA’s message.
But RNA isn’t just a middleman. It controls which proteins are formed. Because proteins whizz around the cell completing all sorts of important processes, you could say that RNA is the gatekeeper: no RNA message, no proteins, no life.
In a new study published in Nature, RNA finally took center stage. By adding bits of genetic material to E. coli bacteria, a team of biohackers at the Wyss Institute hijacked the organism’s RNA messengers so that they only spring into action following certain inputs.
The result? A bacterial biocomputer capable of performing 12-input logic operations—AND, OR, and NOT—following specific inputs. Rather than outputting 0s and 1s, these biocircuits produce results based on the presence or absence of proteins and other molecules.
“It’s the greatest number of inputs in a circuit that a cell has been able to process,” says study author Dr. Alexander Green at Arizona State University. “To be able to analyze those signals and make a decision is the big advance here.”
When given a specific set of inputs, the bacteria spit out a protein that made them glow neon green under fluorescent light.
But synthetic biology promises far more than just a party trick—by tinkering with a cell’s RNA repertoire, scientists may one day coax them to photosynthesize, produce expensive drugs on the fly, or diagnose and hunt down rogue tumor cells.
Illustration of an RNA-based ‘ribocomputing’ device that makes logic-based decisions in living cells. The long gate RNA (blue) detects the binding of an input RNA (red). The ribosome (purple/mauve) reads the gate RNA to produce an output protein. Image Credit: Alexander Green / Arizona State University
The software of life
This isn’t the first time that scientists have hijacked life’s algorithms to reprogram cells into nanocomputing systems. Previous work has already given the world yeast cells that can make anti-malaria drugs from sugar and mammalian cells that can perform Boolean logic.
Yet circuits with multiple inputs and outputs remain hard to program. The reason is this: synthetic biologists have traditionally focused on snipping, fusing, or otherwise arranging a cell’s DNA to produce the outcomes they want.
But DNA is two steps removed from proteins, and tinkering with life’s code often leads to unexpected consequences. For one, the cell may not even accept and produce the extra bits of DNA code. For another, the added code, when transformed into proteins, may not act accordingly in the crowded and ever-changing environment of the cell.
What’s more, tinkering with one gene is often not enough to program an entirely new circuit. Scientists often need to amp up or shut down the activity of multiple genes, or multiple biological “modules” each made up of tens or hundreds of genes.
It’s like trying to fit new Lego pieces in a specific order into a room full of Lego constructions. Each new piece has the potential to wander off track and click onto something it’s not supposed to touch.
Getting every moving component to work in sync—as you might have guessed—is a giant headache.
The RNA way
With “ribocomputing,” Green and colleagues set off to tackle a main problem in synthetic biology: predictability.
Named after the “R (ribo)” in “RNA,” the method grew out of an idea that first struck Green back in 2012.
“The synthetic biological circuits to date have relied heavily on protein-based regulators that are difficult to scale up,” Green wrote at the time. We only have a limited handful of “designable parts” that work well, and these circuits require significant resources to encode and operate, he explains.
RNA, in comparison, is a lot more predictable. Like its more famous sibling DNA, RNA is composed of units that come in four different flavors: A, G, C, and U. Although RNA is only single-stranded, rather than the double helix for which DNA is known, it can bind short complementary sequences in a very predictable manner: Gs always pair with Cs, and As always with Us.
Because of this predictability, it’s possible to design RNA components that bind together perfectly. In other words, it reduces the chance that added RNA bits might go rogue in an unsuspecting cell.
Normally, once RNA is produced it immediately rushes to the ribosome—the cell’s protein-building factory. Think of it as a constantly “on” system.
However, Green and his team found a clever mechanism to slow them down. Dubbed the “toehold switch,” it works like this: the artificial RNA component is first incorporated into a chain of A, G, C, and U folded into a paperclip-like structure.
This blocks the RNA from accessing the ribosome. Because one RNA strand generally maps to one protein, the switch prevents that protein from ever getting made.
In this way, the switch is set to “off” by default—a “NOT” gate, in Boolean logic.
To activate the switch, the cell needs another component: a “trigger RNA,” which binds to the RNA toehold switch. This flips it on: the RNA grabs onto the ribosome, and bam—proteins.
BioLogic gates
String a few RNA switches together, with the activity of each one relying on the one before, and it forms an “AND” gate. Alternatively, if the activity of each switch is independent, that’s an “OR” gate.
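To make the logic concrete, here is a toy software model of how gate RNAs map trigger inputs to a protein output. It is purely illustrative: the trigger names and the example circuit are invented, and real ribocomputing gates are built from RNA sequences, not Python functions.

# Toy model of ribocomputing logic. A gate "expresses" its output protein only
# when its Boolean condition over the trigger RNAs present in the cell is met.
# Trigger names (t1..t4) and the example circuit are invented for illustration.

def and_gate(present: set, required: list) -> bool:
    """Switches chained in series: every required trigger must be present."""
    return all(t in present for t in required)

def or_gate(present: set, options: list) -> bool:
    """Independent switches on one gate RNA: any single trigger suffices."""
    return any(t in present for t in options)

def not_gate(present: set, trigger: str) -> bool:
    """Inverted logic: output is produced only while the trigger is absent."""
    return trigger not in present

# A small circuit: produce GFP if (t1 AND t2) OR t3, but only while t4 is absent.
def gfp_output(present: set) -> bool:
    return (and_gate(present, ["t1", "t2"]) or or_gate(present, ["t3"])) \
        and not_gate(present, "t4")

print(gfp_output({"t1", "t2"}))  # True  -> the cell glows green
print(gfp_output({"t3", "t4"}))  # False -> the NOT branch blocks the output

The circuits in the study compose these same three primitives, with green fluorescent protein playing the role of the print statements.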
“Basically, the toehold switches performed so well that we wanted to find a way to best exploit them for cellular applications,” says Green. They’re “kind of the equivalent of your first transistors,” he adds.
Once the team optimized the designs for different logic gates, they carefully condensed the switches into “gate RNA” molecules. These gate RNAs contain both codes for proteins and the logic operations needed to kickstart the process—a molecular logic circuit, so to speak.
If you’ve ever played around with an Arduino-controlled electrical circuit, you probably know the easiest way to test its function is with a light bulb.
That’s what the team did here, though with a biological bulb: green fluorescent protein, a light-sensing protein not normally present in bacteria that—when turned on—makes the microbugs glow neon green.
In a series of experiments, Green and his team genetically inserted gate RNAs into bacteria. Then, depending on the type of logical function, they added different combinations of trigger RNAs—the inputs.
When the input RNA matched up with its corresponding gate RNA, it flipped on the switch, causing the cell to light up.

Their most complex circuit contained five AND gates, five OR gates, and two NOTs—a 12-input ribocomputer that functioned exactly as designed.
That’s quite the achievement. “Everything is interacting with everything else and there are a million ways those interactions could flip the switch on accident,” says RNA researcher Dr. Julius Lucks at Northwestern University.
The specificity is thanks to RNA, the authors explain. Because RNAs bind to others so predictably, we can now design massive libraries of gate and trigger units to mix-and-match into all types of nano-biocomputers.
RNA BioNanobots
Although the technology doesn’t have any immediate applications, the team has high hopes.
For the first time, it’s now possible to massively scale up the process of programming new circuits into living cells. We’ve expanded the library of available biocomponents that can be used to reprogram life’s basic code, the authors say.
What’s more, when freeze-dried onto a piece of tissue paper, RNA keeps very well. We could potentially print RNA toehold switches onto paper that respond to viruses or to tumor cells, the authors say, essentially transforming the technology into highly accurate diagnostic platforms.
But Green’s hopes are even wilder for his RNA-based circuits.
“Because we’re using RNA, a universal molecule of life, we know these interactions can also work in other cells, so our method provides a general strategy that could be ported to other organisms,” he says.
Ultimately, the hope is to program neural network-like capabilities into the body’s other cells.
Imagine cells endowed with circuits capable of performing the kinds of computation the brain does, the authors say.
Perhaps one day, synthetic biology will transform our own cells into fully programmable entities, turning us all into biological cyborgs from the inside. How wild would that be?
Image Credit: Wyss Institute at Harvard University

Posted in Human Robots

#430579 What These Lifelike Androids Can Teach ...

For Dr. Hiroshi Ishiguro, one of the most interesting things about androids is the changing questions they pose us, their creators, as they evolve. Does it, for example, do something to the concept of being human if a human-made creation starts telling you about what kind of boys ‘she’ likes?
If you want to know the answer to the boys question, you need to ask ERICA, one of Dr. Ishiguro’s advanced androids. Beneath her plastic skull and silicone skin, wires connect to AI software systems that bring her to life. Her ability to respond goes far beyond standard inquiries. Spend a little time with her, and the feeling of a distinct personality starts to emerge. From time to time, she works as a receptionist at Dr. Ishiguro and his team’s Osaka University labs. One of her android sisters is an actor who has starred in plays and a film.

ERICA’s ‘brother’ is an android version of Dr. Ishiguro himself, which has represented its creator at various events while the biological Ishiguro can remain in his offices in Japan. Microphones and cameras capture Ishiguro’s voice and face movements, which are relayed to the android. Apart from mimicking its creator, the Geminoid™ android is also capable of lifelike blinking, fidgeting, and breathing movements.
Say hello to relaxation
As technological development continues to accelerate, so do the possibilities for androids. From a position as receptionist, ERICA may well branch out into many other professions in the coming years. Companion for the elderly, comic book storyteller (an ancient profession in Japan), pop star, conversational foreign language partner, and newscaster are some of the roles and responsibilities Dr. Ishiguro sees androids taking on in the near future.
“Androids are not uncanny anymore. Most people adapt to interacting with Erica very quickly. Actually, I think that in interacting with androids, which are still different from us, we get a better appreciation of interacting with other cultures. In both cases, we are talking with someone who is different from us and learn to overcome those differences,” he says.
A lot has been written about how robots will take our jobs. Dr. Ishiguro believes these fears are blown somewhat out of proportion.
“Robots and androids will take over many simple jobs. Initially there might be some job-related issues, but new schemes, like for example a robot tax similar to the one described by Bill Gates, should help,” he says.
“Androids will make it possible for humans to relax and keep evolving. If we compare the time we spend studying now compared to 100 years ago, it has grown a lot. I think it needs to keep growing if we are to keep expanding our scientific and technological knowledge. In the future, we might end up spending 20 percent of our lifetime on work and 80 percent of the time on education and growing our skills.”
Android asks who you are
For Dr. Ishiguro, another aspect of robotics in general, and androids in particular, is how they question what it means to be human.
“Identity is a very difficult concept for humans sometimes. For example, I think clothes are part of our identity, in a way that is similar to our faces and bodies. We don’t change those from one day to the next, and that is why I have ten matching black outfits,” he says.
This link between physical appearance and perceived identity is one of the aspects Dr. Ishiguro is exploring. Another closely linked concept is the connection between body and the feeling of self. The Ishiguro avatar was once giving a presentation in Austria. Its creator recalls how he felt distinctly like he was in Austria, even feeling the sensation of touch on his own body when people laid their hands on the android. If he was distracted, he felt almost ‘sucked’ back into his body in Japan.
“I am constantly thinking about my life in this way, and I believe that androids are a unique mirror that helps us formulate questions about why we are here and why we have been so successful. I do not necessarily think I have found the answers to these questions, so if you have, please share,” he says with a laugh.
His work and these questions, while extremely interesting on their own, become extra poignant when considering the predicted melding of mind and machine in the near future.
The ability to be present in several locations through avatars—virtual or robotic—raises many questions of both philosophical and practical nature. Then add the hypotheticals, like why send a human out onto the hostile surface of Mars if you could send a remote-controlled android, capable of relaying everything it sees, hears and feels?
The two ways of robotics will meet
Dr. Ishiguro sees the world of AI-human interaction as currently roughly split into two. One is the chat-bot approach that companies like Amazon, Microsoft, Google, and recently Apple, employ using stationary objects like speakers. Androids like ERICA represent another approach.
“It is about more than the form factor. I think that the android approach is generally more story-based. We are integrating new conversation features based on assumptions about the situation and running different scenarios that expand the android’s vocabulary and interactions. Another aspect we are working on is giving androids desire and intention. Like with people, androids should have desires and intentions in order for you to want to interact with them over time,” Dr. Ishiguro explains.
This could be said to be part of a wider trend for Japan, where many companies are developing human-like robots that often have some Internet of Things capabilities, making them able to handle some of the same tasks as an Amazon Echo. The difference in approach could be summed up in the words ‘assistant’ (Apple, Amazon, etc.) and ‘companion’ (Japan).
Dr. Ishiguro sees this as partly linked to how Japanese as a language—and market—is somewhat limited, which has a direct impact on the viability and practicality of ‘pure’ voice recognition systems. At the same time, Japanese people have had greater exposure to positive images of robots and have a different cultural and religious view of objects having a ‘soul’. However, it may also mean that Japanese companies and android scientists are a lap ahead of their western counterparts.
“If you speak to an Amazon Echo, that is not a natural way to interact for humans. This is part of why we are making human-like robot systems. The human brain is set up to recognize and interact with humans. So, it makes sense to focus on developing the body for the AI mind, as well as the AI. I believe that the final goal for both Japanese and other companies and scientists is to create human-like interaction. Technology has to adapt to us, because we cannot adapt fast enough to it, as it develops so quickly,” he says.
Banner image courtesy of Hiroshi Ishiguro Laboratories, ATR all rights reserved.
Dr. Ishiguro’s team has collaborated with partners and developed a number of android systems:
Geminoid™ HI-2 has been developed by Hiroshi Ishiguro Laboratories and Advanced Telecommunications Research Institute International (ATR).
Geminoid™ F has been developed by Osaka University and Hiroshi Ishiguro Laboratories, Advanced Telecommunications Research Institute International (ATR).
ERICA has been developed by the ERATO ISHIGURO Symbiotic Human-Robot Interaction Project.

Posted in Human Robots

#428357 UV Disinfection robot

Tech-Link Healthcare Systems partners with Blue Ocean Robotics, introducing the UV-Disinfection Robot
Singapore, 1 November 2016 – The rise of robots has steered Tech-Link Healthcare Systems, a designer and integrator of healthcare automation systems, to offer solutions beyond automated storage and material handling systems. With a vision of providing holistic solutions for healthcare organisations, Tech-Link extends its capabilities by offering UV disinfection robot solutions via a strategic partnership with Danish robotics company Blue Ocean Robotics to battle Hospital Acquired Infections (HAIs).
Singapore’s labour-intensive healthcare environment and the unknown impact of HAIs in the developed city-state prompted Tech-Link Healthcare Systems to offer solutions in the area of disinfection. We recognised the rising demand for robots that collaborate with humans and have identified this need among our customers. Introducing robotic technologies as part of our suite of solutions is part of the company’s mission to innovate the way healthcare organisations work and to enhance their customers’ experience.
Tech-Link’s partnership with Blue Ocean Robotics affirms both companies’ efforts to reach new markets with technology and solutions that ease the manpower crunch, deliver greater value and improve the quality of healthcare services. As an official sales partner, we bring Blue Ocean Robotics’ expertise in automating disinfection procedures to our market, promoting a safer, more efficient and more productive work environment.
“Tech-Link looks forward to developing reliable healthcare solutions with hardware and the latest technologies from Blue Ocean Robotics for our customers in Singapore and abroad,” said Tan Hock Seng, Director of Tech-Link Healthcare Systems. “Our similar beliefs in the Blue Ocean strategy synergise the collaboration to improve the quality of healthcare services through robotics,” he added.
“We are very excited about our new sales partner Tech-Link Healthcare Systems, since it is of great importance for Blue Ocean Robotics to expand our sales of new technologies beyond Denmark’s borders. Blue Ocean Robotics focuses on creating new markets for robots. This includes both the development of new technologies and the creation of new markets for revolutionary robot solutions. We welcome Tech-Link Healthcare Systems with open arms and look forward to a fruitful collaboration in the years ahead,” said Claus Risager, Rune K. Larsen & John Erland Østergaard, Partners and Co-CEOs, Blue Ocean Robotics.
UV-Disinfection Robot
The UV-Disinfection Robot – also called UV-DR – is an autonomous disinfection robot for hospitals, production lines and pharmaceutical companies. The robot is used primarily in, but not limited to, the cleaning cycle, with the aim of reducing the spread of HAIs, infectious diseases, viruses, bacteria and other types of harmful organic materials.
UV-DR is a mobile robot that can drive autonomously while emitting concentrated UV-C light onto pre-defined infectious hotspots in patient rooms and other hospital environments, disinfecting all exposed surfaces by killing bacteria and viruses. An exposure time of ten minutes is estimated to kill up to 99% of bacteria such as Clostridium difficile.
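For context on how exposure time relates to germicidal effect, UV disinfection is commonly reasoned about in terms of dose (irradiance multiplied by time) and log reduction. The sketch below walks through that arithmetic with entirely hypothetical irradiance and D90 values, chosen only so that ten minutes lines up with a roughly 99% (2-log) kill; it does not reflect UV-DR’s actual output.

# Back-of-the-envelope UV dose calculation. The irradiance and D90 values are
# hypothetical, picked so that ten minutes of exposure gives about a 2-log
# (99%) reduction; they are not UV-DR specifications.

IRRADIANCE_W_PER_M2 = 0.2   # assumed UV-C irradiance at the target surface
D90_J_PER_M2 = 60.0         # assumed dose for a 90% (1-log) reduction

def log_reduction(exposure_s: float) -> float:
    """Log10 reduction after exposure_s seconds, assuming first-order kinetics."""
    dose_j_per_m2 = IRRADIANCE_W_PER_M2 * exposure_s  # J/m^2 = W/m^2 * s
    return dose_j_per_m2 / D90_J_PER_M2

def fraction_killed(exposure_s: float) -> float:
    return 1.0 - 10 ** (-log_reduction(exposure_s))

ten_minutes = 10 * 60
print(f"{log_reduction(ten_minutes):.1f}-log reduction")    # -> 2.0-log reduction
print(f"{fraction_killed(ten_minutes) * 100:.0f}% killed")  # -> 99% killed

In practice the delivered dose falls off with distance and with shadowing from furniture and equipment, which is part of the rationale for a mobile robot that repositions itself around pre-defined hotspots rather than a fixed lamp.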

About Tech-Link Healthcare Systems Pte Ltd
Tech-Link Healthcare Systems is a subsidiary of Tech-Link Storage Engineering, established in Singapore in 2015. The company designs and provides innovative solutions for the healthcare sector, focusing on advanced and emerging solutions to support healthcare organisations in optimising available resources and services. Tech-Link Healthcare Systems designs and implements automated material handling systems to enhance secure material transport and logistics storage management in hospitals and other healthcare facilities. As a complete solution provider, the company also provides consultancy in systems design to streamline and automate processes, as well as integrated video solutions within healthcare facilities.
About Tech-Link Storage Engineering Pte Ltd
Tech-Link Storage Engineering is a group of companies established in Singapore with more than 25 years of principal activities in the procurement, manufacturing and marketing of storage, distribution and materials handling products and systems. From its domain expertise in storage and racking systems, Tech-Link is also involved in R&D, system design, supply and implementation of logistics supply chain automation systems. The business has expanded its global capabilities in the areas of planning and consultancy to provide solutions for built-to-suit industrial developments and healthcare logistics systems.
Tech-Link is an ISO 9001:2008 and OHSAS 18001:2007 certified company for its Quality Management System and Occupational Health and Safety System.
Visit www.techlinkstorageengineering.com
About Blue Ocean Robotics
Blue Ocean Robotics is an international company group with a presence across the globe, including America, Europe, Asia and Australia. The robotics company has its headquarters in the city of Odense (www.odenserobotics.dk) in Denmark. Blue Ocean Robotics applies robot technology to create solutions and innovation for end-users and new businesses in partnerships.
Visit www.blue-ocean-robotics.com
Here is a video showing the robot in action:

The post UV Disinfection robot appeared first on Roboticmagazine.

Posted in Human Robots