Stranger Things fans will be familiar with this scene: Eleven, a girl with telekinetic powers, stares intently at a Coke can. Without physically touching the can, she completely crushes it using her mind alone.
Changing objects with the mind has long been a trope in science fiction. Now, thanks to metasurfaces, two studies suggest it may actually be possible.
Metamaterials are artificial composites with bizarre optical properties. Built from repeating structural units, they can interact with electromagnetic waves, including visible light, in ways that are impossible for natural materials. This gives them a superpower: they can readily adapt their properties—for example, bending light in different ways—rather than relying on the properties of the materials they’re made of.
Why care? Our brains generate electromagnetic waves as they process information. Depending on the brain’s state—for example, if it’s “relaxed” versus “concentrating”—different frequencies of brain waves take over. So why not use the brain as a source to trigger changes in metamaterials?
In the first study, published in eLight, the team used a brainwave extraction module that allowed volunteers to control a metasurface—a 2D version of metamaterials—with their minds alone. The whole system is wireless and relies on Bluetooth. They extracted brainwaves from the volunteer as she relaxed or concentrated, and through a controller, changed how the linked metasurface scattered light. Not as dramatic as bending a Coke can, sure—but a futuristic demonstration of using the mind to control physical material.
A second study took the idea a smidge further. Different metasurfaces can “talk” to each other based on electromagnetic properties. Here, the team hooked up two people to metasurfaces to text with their minds. One volunteer was the transmitter, the other the receiver. By concentrating, the transmitter’s brain waves changed the metasurface’s properties to encode different binary messages. Upon decoding, the receiver got the text—all without lifting a single finger.
For now, the futuristic tech is still in its infancy. But scientists imagine they’ll one day be able to use metamaterials for a myriad of purposes: monitoring the attention status of a driver, for example, or incorporating them into non-invasive brain-machine interfaces.
“Combined with intelligent algorithms such as machine learning, the presented two works may further open up a new direction to advanced bio-intelligent metasurface systems,” said Dr. Xiangang Luo at the Institute of Optics and Electronics, Chinese Academy of Sciences, who was not involved in either of the studies.
The Weirdness of Metasurfaces
Metasurfaces are like a fever dream. Normally we expect our materials to behave consistently: glass bottles shatter under pressure; wood cracks; cotton is soft. Metamaterials change this paradigm. Often made up of an amalgamation of materials—piezoelectric materials are a favorite—they readily change their structural and light-bending properties under the effect of electromagnetic fields.
This has led to preliminary invisibility cloaks, dynamic camouflaging, superlenses, and 3D-printed millibots that could one day roam your body to intelligently deliver drugs when needed.
Metasurfaces are metamaterials’ 2D cousin. Here, the repeating structures in metamaterials weave into a sheet-like structure, maintaining their ability to control “nearly all the characteristics of electromagnetic waves,” said Dr. Shaobo Qu at Air Force Engineering University in China, who led the telekinesis trial. Programmable metasurfaces (PMs) are a step up, in that their functions can be controlled in a predictable manner by outside influences to switch operating modes—like a bathroom “smart” mirror with several light settings depending on your mood.
Normally, electromagnetic waves come from a generator. But our brains burst with different frequencies of these waves, which collectively represent electrical signals across large regions. Beta waves, for example, cycle roughly 15 to 40 times a second, and are associated with an engaged mind. Theta waves, in contrast, correlate with daydreaming—a sort of mental relaxation. Scientists have found that it is possible to control your brain waves and actively shift them from one state to another through neurofeedback.
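For reference, the frequency bands mentioned above can be captured in a small lookup. The boundaries below are commonly cited assumptions and vary across the literature (beta, for instance, is often given as 12 to 30 Hz rather than the 15 to 40 Hz used here):

```python
# Rough EEG frequency bands in Hz. Boundaries are illustrative and vary
# across the literature.
BANDS = {
    "delta": (0.5, 4),   # deep sleep
    "theta": (4, 8),     # daydreaming, mental relaxation
    "alpha": (8, 12),    # calm wakefulness
    "beta": (15, 40),    # engaged, concentrating mind
}

def classify(freq_hz: float) -> str:
    """Return the band name for a frequency, or 'unclassified' if it
    falls in a gap between the (non-contiguous) bands above."""
    for name, (low, high) in BANDS.items():
        if low <= freq_hz < high:
            return name
    return "unclassified"

print(classify(20))  # beta
print(classify(6))   # theta
```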
Brain waves can be readily picked up by a cap of embedded electrodes. This led the team to wonder: can we use these signals to control metasurfaces?
In one study, Qu proposed a simple design using a brainwave extraction module. It’s got three parts: the sensor, controller, and actuator. The sensor collects brainwaves through electrodes placed on the scalp. Here, the team used a commercially available module, ThinkGear AM, an affordable chip popular with the DIY EEG brainwave-hacking community.
Recorded data is then transmitted to the controller through Bluetooth. The controller is also made from a low-cost component, with an Arduino at its heart. Brain wave signals are converted into a measure of attention and fed into the actuator. Depending on the person’s level of attention, the actuator bins the data into four groups and outputs different voltages.
“The four threshold intervals correspond to distracted, neutral, concentrated, and extremely concentrated attention intensity, respectively,” the team explained.
The high or low voltage corresponds to a 1 or 0 coding sequence. These sequences then map to different material properties for the metasurface, which in turn controls how it scatters light.
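The binning step described above can be sketched in a few lines. The specific threshold intervals, voltages, and two-bit codes below are illustrative assumptions, not values reported in the study:

```python
# Hypothetical sketch of the controller's binning step: an attention score
# (0-100, the scale reported by modules like ThinkGear) is mapped to one of
# four intervals, each paired with an output voltage and a binary code.
# All threshold, voltage, and code values here are illustrative.
ATTENTION_BINS = [
    (0,  25,  "distracted",             0.0, "00"),
    (25, 50,  "neutral",                1.1, "01"),
    (50, 75,  "concentrated",           2.2, "10"),
    (75, 101, "extremely concentrated", 3.3, "11"),
]

def bin_attention(score: int) -> tuple:
    """Return (label, voltage, code) for an attention score in [0, 100]."""
    for low, high, label, voltage, code in ATTENTION_BINS:
        if low <= score < high:
            return (label, voltage, code)
    raise ValueError(f"attention score out of range: {score}")

print(bin_attention(80))  # ('extremely concentrated', 3.3, '11')
```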
The end result? In a proof of concept, a volunteer sat in an anechoic chamber—a room designed to block out surrounding sound or electromagnetic waves. With dry electrodes on her head, she closed her eyes as she cycled through different concentration states. By measuring the light-scattering properties of the metasurface, the team found a strong correspondence between her attention intensity and the material properties.
The study doesn’t show that it’s possible to physically move materials with your mind. But it does show that it’s possible to remotely control a material based on thought alone. For now, the technology is mostly a cool proof of concept that paves the way for mind-controlled materials for health monitoring or smart sensors. A major roadblock is how to deal with outside electromagnetic noise, which could drown out neural control signals.
Telekinesis already blows my mind. But what about telepathy?
A separate study used metasurfaces as a telephone of sorts to help two people text simple messages, all without lifting a finger.
Direct brain-to-brain communication isn’t new. Previous studies using non-invasive setups had participants playing 20 questions with their brain waves. Another study built a BrainNet for three volunteers, allowing them to play a Tetris-like game using brainwaves alone. The conduit for those mindmelds relied on cables and the internet. One new study asked if metasurfaces could do the same.
Led by Dr. Tie Jun Cui at the Institute of Electromagnetic Space, Southeast University in China, the study linked a well-known brainwave signal, P300, to the properties of a metasurface. Their setup, electromagnetic brain-computer-metasurface (EBCM), used brainwaves to control a particular type of metasurface known as an information metasurface, which can code 0s and 1s like an electronic circuit board.
The experiment had two volunteers: a transmitter and a receiver. The transmitter had his brain waves monitored with EEG, with a specific focus on the P300 signal. The signals were then decoded into binary code, which was then used to control the transmitter’s metasurface properties. These changes wirelessly changed the receiver’s metasurface, which was then decoded and translated back into text information for the receiver to read.
The setup successfully transmitted four text sequences: “hello world,” “Hi, Sue,” “Hi, Scut” and “BCI metasurface.” It’s a slow process, averaging roughly five seconds for each character, but could be improved with some “quick-spelling paradigms,” the team said.
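Under the hood, the round trip from text to metasurface states and back amounts to a binary codec. A minimal sketch, assuming plain 8-bit ASCII per character (the study’s actual coding scheme may differ):

```python
# Illustrative sketch, not the paper's actual scheme: each character is
# sent as its 8-bit ASCII code, one metasurface state (0 or 1) per bit.

def encode(text: str) -> str:
    """Text -> bit string, 8 bits per character."""
    return "".join(f"{ord(c):08b}" for c in text)

def decode(bits: str) -> str:
    """Bit string -> text, consuming 8 bits per character."""
    return "".join(chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8))

msg = encode("hello world")
print(len(msg))       # 88 bits for 11 characters
print(decode(msg))    # hello world
```

At roughly five seconds per character, "hello world" would take close to a minute to transmit, which is why the team points to quick-spelling paradigms as a path to faster rates.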
We are still far from tech-based telekinesis and telepathy. But those superpowers may not be as far-fetched as once thought. For now, the teams are eager to adopt their setups for bettering health.
“Our work may further open up a new direction to explore the deep integration of metasurface, human brain intelligence, and artificial intelligence, so as to build up new generations of bio-intelligent metasurface systems,” said Cui.
Image Credit: Gerd Altmann / Pixabay
This is a sponsored article brought to you by SICK Inc.
From advanced manufacturing to automated vehicles, engineers are using LiDAR to change the world as we know it. For the second year, students from across the country submitted projects to SICK's annual TiM$10K Challenge.
The first place team of the 2020 TiM$10K Challenge hails from Worcester Polytechnic Institute (WPI) in Worcester, Mass. The team comprised undergraduate seniors Daniel Pelaez and Noah Budris and undergraduate junior Noah Parker.
With the help of their academic advisor, Dr. Alexander Wyglinski, Professor of Electrical Engineering and Robotics Engineering at WPI, the team took first place in the 2020 TiM$10K Challenge with their project titled ROADGNAR, a mobile and autonomous pavement quality data collection system.
So what is the TiM$10K Challenge? In this challenge, SICK reached out to universities across the nation, looking to support innovation and student achievement in automation and technology. Participating teams were supplied with a SICK 270° LiDAR sensor (the TiM) and accessories. They were challenged to solve a problem, create a solution, and bring a new application that utilizes the SICK scanner in any industry.
Around the United States, many of the nation's roadways are in poor condition, most often from potholes and cracks in the pavement, which can make driving difficult. Many local governments agree that infrastructure is in need of repair, but with a lack of high-quality data, inconsistencies in damage reporting, and an overall lack of adequate prioritization, this is a difficult problem to solve.
Pelaez, Parker, and Budris first came up with the idea of ROADGNAR before they had even learned of the TiM$10K Challenge. They noticed that the roads in their New England area were in poor condition, and wanted to see if there was a way to help solve the way road maintenance is performed.
In their research, they learned that many local governments use outdated and manual processes. Many send out workers to check for poor road conditions, who then log the information in notebooks.
The team began working on a solution to help solve this problem. It was at a career fair that Pelaez met a SICK representative, who encouraged him to apply to the TiM$10K Challenge.
Win $10K and a Trip to Germany!
SICK is excited to announce the 2022-2023 edition of the SICK TiM$10K Challenge. Twenty teams will be selected to participate in the challenge, and the chosen teams will be supplied with a 270° SICK LiDAR sensor (TiM) and accessories. The teams will be challenged to solve a problem, create a solution, and bring a new application that utilizes the SICK LiDAR in any industry. This can be part of the curriculum of a senior design or capstone project for students.
The three winning teams will receive cash awards:
• 1st Place – $10K
• 2nd Place – $5K
• 3rd Place – $3K
In addition to bragging rights and the cash prize, the 1st place winning team, along with the advising professor, will be offered an all-expenses-paid trip to SICK Germany to visit the SICK headquarters and manufacturing facility!
Registration is now open for the academic year 2022-2023!
Using SICK's LiDAR technology, the ROADGNAR takes a 3D scan of the road and the data is then used to determine the exact level of repair needed.
ROADGNAR collects detailed data on the surface of any roadway, while still allowing for easy integration onto any vehicle. With this automated system, road maintenance can become a faster, more reliable, and more efficient process for towns and cities around the country.
ROADGNAR solves this problem through two avenues: hardware and software. The team designed two mounting brackets to connect the system to a vehicle. The first, located in the back of the vehicle, supports a LiDAR scanner. The second is fixed in line with the vehicle's axle and supports a wheel encoder, which is wired to the fuse box.
“It definitely took us a while to figure out a way to power ROADGNAR so we wouldn't have to worry about it shutting off while the car was in motion,” said Parker.
Also wired to the fuse box is a GPS module within the vehicle itself. Data transfer wires are attached to these three systems and connected to a central processing unit within the vehicle.
Using LiDAR to collect road data

When the car is started, all connected devices turn on. The LiDAR scanner collects road surface data, the wheel encoder tracks an accurate measurement of the distance travelled by the vehicle, and the GPS generates geo-tags on a constant basis. All this data is stored in the onboard database, where a monitor presents it all to the user. The data is then stored on a hard drive.
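Conceptually, each stored record pairs one LiDAR scan with the wheel-encoder distance and a GPS geo-tag. A minimal sketch of such a record; the field names and structure here are assumptions for illustration, since ROADGNAR itself is not public:

```python
# Hypothetical sketch of ROADGNAR's logging step: each LiDAR scan is stored
# alongside the cumulative encoder distance and a GPS geo-tag. Field names
# are illustrative, not taken from the actual system.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RoadRecord:
    distance_m: float        # cumulative distance from the wheel encoder
    lat: float               # GPS geo-tag latitude
    lon: float               # GPS geo-tag longitude
    scan: List[float]        # one 270-degree LiDAR range profile, in meters

@dataclass
class RoadDatabase:
    records: List[RoadRecord] = field(default_factory=list)

    def log_scan(self, distance_m, lat, lon, scan):
        """Append one synchronized (distance, GPS, scan) record."""
        self.records.append(RoadRecord(distance_m, lat, lon, scan))

db = RoadDatabase()
db.log_scan(12.5, 42.274, -71.808, [1.90, 1.91, 1.88])  # WPI-area coords
print(len(db.records))  # 1
```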
Much like the roads in their Massachusetts town, the creation process of ROADGNAR was not without its challenges. The biggest problem took the form of the COVID-19 pandemic, which hit the ROADGNAR team in the middle of development. Once WPI closed to encourage its students and faculty to practice social distancing, the team was without a base of operations.
“When the coronavirus closed our school, we were lucky enough to live pretty close to each other,” said Pelaez. “We took precautions, but were able to come together to test and power through to finish our project.”
Integrating LiDAR into the car was also a challenge. Occasionally, the LiDAR would shut off when the car began moving. The team had to take several measures to keep the sensor online, often contacting SICK's help center for instruction.
“One of the major challenges was making sure we were getting enough data on a given road surface,” said Budris. “At first we were worried that we wouldn't get enough data from the sensor to make ROADGNAR feasible, but we figured that if we drove at a slow and constant rate, we'd be able to get accurate scans.”
With the challenge complete, Pelaez, Budris, and Parker are looking to turn ROADGNAR into a genuine product. They have already contacted an experienced business partner to help them determine their next steps.
They are now interviewing with representatives from various Departments of Public Works throughout Massachusetts and Connecticut. Thirteen municipalities have indicated that they would be extremely interested in utilizing ROADGNAR, as it would drastically reduce the time needed to assess all the roads in the area. The trio is excited to see how different LiDAR sensors can help refine ROADGNAR into a viable product.
“We'd like to keep the connection going,” explained Pelaez. “If we can keep the door open for a potential partnership between us and SICK, that'd be great.”
SICK is now accepting entries for the TiM$10K Challenge for the 2022-2023 school year!
Student teams are encouraged to use their creativity and technical knowledge to incorporate the SICK LiDAR for any industry in any application. Advisors/professors are allowed to guide the student teams as required.
A year and a half ago Netflix released The Social Dilemma, a docu-drama that dug into the harmful consequences of social media. Think political polarization, the spread of misinformation, and upticks in anxiety and depression across multiple demographics. Tristan Harris, a former Google design ethicist and co-founder of the Center for Humane Technology, is a central figure in the film. In a session at South By Southwest this week, Harris spoke about the steps we should be taking to get this technology and our relationship with it to a healthy place, or as he put it, the wisdom we need to steer technology and our future.
Harris opened with a quote from biologist Edward O. Wilson, who said, “The real problem of humanity is the following: We have Paleolithic emotions, medieval institutions, and godlike technology. And it is terrifically dangerous, and it is now approaching a point of crisis overall.”
In other words, technology is advancing far too fast for our brains to keep up and know how to healthily interact with it, or for our institutions to understand it and wisely regulate it.
Wilson spoke these words at a debate at the Harvard Museum of Natural History in 2009; that is, before the widespread adoption of platforms like Instagram and TikTok, or of tech like deepfakes, text generators, CRISPR, and other innovations that have the potential to transform humanity (for better or for worse).
Tristan Harris at SXSW 2022
We now have algorithms that can generate realistic images based on text, of anything from mountain sunsets to bombed-out buildings in Ukraine. We have GPT-3, which could write a convincing paper arguing mRNA vaccines aren’t safe, citing real facts that are simply presented out of context. “This is like a neutron bomb for trust on the internet,” Harris said. “And the complexity of the world is increasing every day.” Our ability to respond, however, isn’t matching up.
Issues that would have been considered separate from one another in the past (or that didn’t exist in the past) are now closely linked; consider the impact that misinformation and synthetic media could have on nuclear escalation (and the impact they’ve already had on elections and democracy), or the connection between artificial intelligence and global financial risk.
Our previous thinking around how to manage technology isn’t good enough in the face of this new complexity; how do we handle issues like privacy or freedom of speech when multiple actors are involved, there’s low accountability, and everyone’s definition of what’s “right” is different? “Technology has been undermining humanity’s capacity for wisdom,” Harris said. “Not just individually, but our collective ability to operate with the wisdom that we need.”
Wisdom, he said, means knowing the limits of how we actually work, having the self awareness and humility to be inquiring, and being able to think in terms of systems and root causes. Harris referenced the book Thinking in Systems by environmental scientist Donella Meadows, in which she details 12 leverage points for intervening in a system—that is, changing the way a system works from its current state to something else. In Harris’s opinion, the most relevant of Meadows’ points to the tech conversation is the power to transcend paradigms.
Each of the paradigms of thinking in the tech industry that got us where we are should be overhauled by a human-centered focus. Rather than shrugging off the harms of technology by asserting that there are always costs and benefits, we should focus on minimizing harmful externalities. Rather than giving users what they want, we must respect human weaknesses and vulnerabilities (for example, the way social media platforms exploit the brain’s dopamine response). Rather than maximizing personalization to give users a satisfying experience (also known as creating our own unique little echo chambers), we should strive to create shared understanding.
The question is, how do we get more people to go from being typical users of social media and other tech to being what Harris calls humane technologists?
It starts with raising awareness and educating ourselves. Harris and his team at the Center for Humane Technology created an online course called Foundations of Humane Technology, which takes registrants through six values-centered tenets that, if we prioritize them when designing new tech (or changing the design of existing tech), can improve our experience both individually and as an interconnected community.
“We would like to have 100,000 humane technologists who are trained in this new paradigm,” Harris said. “It’s hard to think about these things when you feel like you’re the only one asking these questions.”
We’re at an inflection point where it’s crucial for those working on technology to help create shared understanding; the world isn’t about to get less complex or volatile. On the contrary, Harris predicts we’re heading into a period of increasing global catastrophes fueled by climate change, inequality, and unstable political regimes, among other factors.
It’s a lot to take on, even a lot to contemplate. But, Harris said, he has hope because he’s seen the system change much faster in the last few years than ever before. People from within the tech industry have spoken out about the risks and harms of the products they helped create, from former YouTube engineer Guillaume Chaslot to Facebook co-founder Chris Hughes to former Facebook data scientist Frances Haugen, and many more. “Technologists are actually waking up and saying, ‘I don’t want to participate in the toxic part of the industry, I want to help build a better part,’” Harris said.
Going back to Wilson’s quote, Harris proposed the following: we need to embrace our Paleolithic emotions, upgrade our medieval institutions, and have the wisdom to wield our godlike technology. We need to be able to make sense of the world and have people from different sides come together and agree on the actions we should take—then take them. There should be no place for business models that are dependent on dividing people. “We need everyone working on helping us close that gap,” Harris said.
Image Credit: Rodion Kutsaev on Unsplash
A short journey through the magical world of humanoid robots.
Mesmer Entertainment Robotics demonstrates some of its humanoid animatronics, as well as its humanoid robot, Owen.