Tag Archives: IoT
6G Will Be 100 Times Faster Than ...
Though 5G—a next-generation speed upgrade to wireless networks—is scarcely up and running (and still nonexistent in many places), researchers are already working on what comes next. It lacks an official name, but they’re calling it 6G for the sake of simplicity (and hey, it’s tradition). 6G promises to be up to 100 times faster than 5G—fast enough to download 142 hours of Netflix in a second—but researchers are still trying to figure out exactly how to make such ultra-speedy connections happen.
A new chip, described in a paper in Nature Photonics by a team from Osaka University and Nanyang Technological University in Singapore, may give us a glimpse of our 6G future. The team was able to transmit data at a rate of 11 gigabits per second, topping 5G’s theoretical maximum speed of 10 gigabits per second and fast enough to stream 4K high-def video in real time. They believe the technology has room to grow, and with more development, might hit those blistering 6G speeds.
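Those headline figures are easy to sanity-check with back-of-the-envelope arithmetic. In the sketch below, the per-stream bitrates (roughly 2 megabits per second for a standard-definition stream and about 25 megabits per second for 4K) are illustrative assumptions, not numbers from the paper:

```python
# Rough plausibility check of the headline claims. The per-stream bitrates are
# assumptions for illustration only, not figures from the Nature Photonics paper.

five_g_peak_bps = 10e9                     # 5G theoretical maximum cited in the article
six_g_peak_bps = 100 * five_g_peak_bps     # "100 times faster" -> roughly 1 Tbps

sd_stream_bps = 2e6                        # assumed standard-definition Netflix stream
uhd_stream_bps = 25e6                      # assumed 4K stream

hours_per_second = six_g_peak_bps / sd_stream_bps / 3600
print(f"Video downloadable per second at ~1 Tbps: ~{hours_per_second:.0f} hours")
# ~139 hours, in the same ballpark as the article's "142 hours of Netflix in a second"

chip_bps = 11e9                            # the rate demonstrated by the Osaka/NTU chip
print(f"Simultaneous 4K streams at 11 Gbps: ~{chip_bps / uhd_stream_bps:.0f}")
# ~440 streams, so real-time 4K streaming is comfortably within reach
```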
NTU final year PhD student Abhishek Kumar, Assoc Prof Ranjan Singh and postdoc Dr Yihao Yang. Dr Singh is holding the photonic topological insulator chip made from silicon, which can transmit terahertz waves at ultrahigh speeds. Credit: NTU Singapore
But first, some details about 5G and its predecessors so we can differentiate them from 6G.
Electromagnetic waves are characterized by a wavelength and a frequency; the wavelength is the distance a cycle of the wave covers (peak to peak or trough to trough, for example), and the frequency is the number of waves that pass a given point in one second. Cellphones use miniature radios to pick up electromagnetic signals and convert those signals into the sights and sounds on your phone.
4G wireless networks run on radio waves in the low- and mid-band spectrum, defined as frequencies a little below (low-band) and a little above (mid-band) one gigahertz (one billion cycles per second). 5G kicked that up several notches by adding much higher-frequency millimeter waves of up to 300 gigahertz, or 300 billion cycles per second. Those higher frequencies can carry far more data per second, which is why they tend to be used for information-dense traffic like video.
The 6G chip kicks 5G up several more notches. It can transmit waves at more than three times the frequency of 5G: one terahertz, or a trillion cycles per second. The team says this yields a data rate of 11 gigabits per second. While that’s faster than the fastest 5G will get, it’s only the beginning for 6G. One wireless communications expert even estimates 6G networks could handle rates up to 8,000 gigabits per second; they’ll also have much lower latency and higher bandwidth than 5G.
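Those frequencies translate directly into wavelengths via wavelength = c / frequency, which is where names like “millimeter wave” come from. A quick conversion for the bands mentioned above (standard physics, not a result from the paper):

```python
# Convert the frequencies discussed above into wavelengths: wavelength = c / f.
C = 3.0e8  # speed of light in meters per second (approximate)

for label, freq_hz in [("~1 GHz (4G low/mid band)", 1e9),
                       ("300 GHz (top of 5G's millimeter-wave range)", 300e9),
                       ("1 THz (the new chip)", 1e12)]:
    wavelength_mm = C / freq_hz * 1000
    print(f"{label}: ~{wavelength_mm:g} mm")

# ~1 GHz  -> ~300 mm  (tens of centimeters: ordinary radio waves)
# 300 GHz -> ~1 mm    (hence "millimeter waves")
# 1 THz   -> ~0.3 mm  (sub-millimeter, between microwaves and infrared)
```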
Terahertz waves fall between infrared waves and microwaves on the electromagnetic spectrum. Generating and transmitting them is difficult and expensive, requiring special lasers, and even then the usable frequency range is limited. To carry terahertz waves, the team turned to a new class of material called photonic topological insulators (PTIs). PTIs conduct light waves along their surfaces and edges rather than through the bulk of the material, and they allow light to be redirected around corners without disturbing its flow.
The chip is made completely of silicon and has rows of triangular holes. The team’s research showed the chip was able to transmit terahertz waves error-free.
Nanyang Technological University associate professor Ranjan Singh, who led the project, said, “Terahertz technology […] can potentially boost intra-chip and inter-chip communication to support artificial intelligence and cloud-based technologies, such as interconnected self-driving cars, which will need to transmit data quickly to other nearby cars and infrastructure to navigate better and also to avoid accidents.”
Besides being used for AI and self-driving cars (and, of course, downloading hundreds of hours of video in seconds), 6G would also make a big difference for data centers, IoT devices, and long-range communications, among other applications.
Given that 5G networks are still in the process of being rolled out, though, 6G won’t be coming on the scene anytime soon. A recent white paper on 6G from the Japanese telecom company NTT DoCoMo estimates we’ll see it in 2030, pointing out that wireless generations have so far been spaced about 10 years apart: 3G arrived in the early 2000s, 4G around 2010, and 5G in 2020.
In the meantime, as 6G continues to develop, we’re still looking forward to the widespread adoption of 5G.
Image Credit: Hans Braxmeier from Pixabay
The Internet Is Coming to the Rest of ...
People surf it. Spiders crawl it. Gophers navigate it.
Now, a leading group of cognitive biologists and computer scientists want to make the tools of the Internet accessible to the rest of the animal kingdom.
Dubbed the Interspecies Internet, the project aims to provide intelligent animals such as elephants, dolphins, magpies, and great apes with a means to communicate with one another and with people online.
And through artificial intelligence, virtual reality, and other digital technologies, researchers hope to crack the code of all the chirps, yips, growls, and whistles that underpin animal communication.
Oh, and musician Peter Gabriel is involved.
“We can use data analysis and technology tools to give non-humans a lot more choice and control,” the former Genesis frontman, dressed in his signature Nehru-style collar shirt and loose, open waistcoat, told IEEE Spectrum at the inaugural Interspecies Internet Workshop, held Monday in Cambridge, Mass. “This will be integral to changing our relationship with the natural world.”
The workshop was a long time in the making.
Eighteen years ago, Gabriel visited a primate research center in Atlanta, Georgia, where he jammed with two bonobos, a male named Kanzi and his half-sister Panbanisha. It was the first time either bonobo had sat at a piano before, and both displayed an exquisite sense of musical timing and melody.
Gabriel seemed to be speaking to the great apes through his synthesizer. It was a shock to the man who once sang “Shock the Monkey.”
“It blew me away,” he says.
Add in the bonobos’ ability to communicate by pointing to abstract symbols, Gabriel notes, and “you’d have to be deaf, dumb, and very blind not to notice language being used.”
Gabriel eventually teamed up with Internet protocol co-inventor Vint Cerf, cognitive psychologist Diana Reiss, and IoT pioneer Neil Gershenfeld to propose building an Interspecies Internet. Presented in a 2013 TED Talk as an “idea in progress,” the concept proved to be ahead of the technology.
“It wasn’t ready,” says Gershenfeld, director of MIT’s Center for Bits and Atoms. “It needed to incubate.”
So, over the past six years, the architects of the Dolittlesque initiative have pursued two small pilot projects, one for dolphins and one for chimpanzees.
At her Hunter College lab in New York City, Reiss developed what she calls the D-Pad—a touchpad for dolphins.
Reiss had been trying for years to create an underwater touchscreen with which to probe the cognition and communication skills of bottlenose dolphins. But “it was a nightmare coming up with something that was dolphin-safe and would work,” she says.
Her first attempt emitted too much heat. A Wii-like system of gesture recognition proved too difficult to install in the dolphin tanks.
Eventually, she joined forces with Rockefeller University biophysicist Marcelo Magnasco and invented an optical system in which images are projected through an underwater viewing window onto a glass panel while infrared sensors detect where the dolphins touch, allowing the animals to play specially designed apps, including one dubbed Whack-a-Fish.
Meanwhile, in the United Kingdom, Gabriel worked with Alison Cronin, director of the ape rescue center Monkey World, to test the feasibility of using FaceTime with chimpanzees.
The chimps engaged with the technology, Cronin reported at this week’s workshop. However, our hominid cousins proved as adept at videotelephonic discourse as my three-year-old son is at video chatting with his grandparents—which is to say, there was a lot of pass-the-banana-through-the-screen and other silly games, and not much meaningful conversation.
“We can use data analysis and technology tools to give non-humans a lot more choice and control.”
—Peter Gabriel
The buggy, rudimentary attempt at interspecies online communication—what Cronin calls her “Max Headroom experiment”—shows that building the Interspecies Internet will not be as simple as giving out Skype-enabled tablets to smart animals.
“There are all sorts of problems with creating a human-centered experience for another animal,” says Gabriel Miller, director of research and development at the San Diego Zoo.
Miller has been working on animal-focused sensory tools such as an “Elephone” (for elephants) and a “Joybranch” (for birds), but it’s not easy to design efficient interactive systems for other creatures—and for the Interspecies Internet to be successful, Miller points out, “that will be super-foundational.”
Researchers are making progress on natural language processing of animal tongues. Through a non-profit organization called the Earth Species Project, former Firefox designer Aza Raskin and early Twitter engineer Britt Selvitelle are applying deep learning algorithms developed for unsupervised machine translation of human languages to fashion a Rosetta Stone–like tool capable of interpreting the vocalizations of whales, primates, and other animals.
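The article doesn’t spell out the Earth Species Project’s pipeline, but one standard building block of unsupervised machine translation is learning a map that aligns two embedding spaces. The sketch below is a minimal, hypothetical illustration of that idea using an orthogonal (Procrustes) alignment on synthetic vectors; a real system would learn its embeddings from recordings and text and would not start from conveniently paired examples:

```python
import numpy as np

# Minimal sketch of embedding-space alignment, one ingredient of unsupervised
# translation. The "whale" and "English" vectors are random stand-ins; nothing
# here reflects the Earth Species Project's actual models or data.

rng = np.random.default_rng(0)
dim, n_pairs = 64, 200

english_vecs = rng.normal(size=(n_pairs, dim))                 # hypothetical word embeddings
true_rotation = np.linalg.qr(rng.normal(size=(dim, dim)))[0]   # hidden mapping to recover
whale_vecs = english_vecs @ true_rotation + 0.01 * rng.normal(size=(n_pairs, dim))

# Orthogonal Procrustes: find orthogonal W minimizing ||whale_vecs @ W - english_vecs||.
u, _, vt = np.linalg.svd(whale_vecs.T @ english_vecs)
W = u @ vt

aligned = whale_vecs @ W
print("mean alignment error:", np.linalg.norm(aligned - english_vecs) / n_pairs)
```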
Inspired by the scientists who first documented the complex sonic arrangements of humpback whales in the 1960s—a discovery that ushered in the modern marine conservation movement—Selvitelle hopes that an AI-powered animal translator can have a similar effect on environmentalism today.
“A lot of shifts happen when someone who doesn’t have a voice gains a voice,” he says.
A challenge with this sort of AI software remains verification and validation. Normally, machine-learning algorithms are benchmarked against a human expert, but who is to say if a cybernetic translation of a sperm whale’s clicks is accurate or not?
One could back-translate an English expression into sperm whale-ese and then into English again to see how much of the meaning survives the round trip (a rough sketch of that idea follows below). But with the great apes, there might be a better option.
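As a purely hypothetical sketch of that round-trip check: the two translate_* functions below are placeholders for models that don’t yet exist, and the word-overlap score is a deliberately crude stand-in for a real measure of preserved meaning:

```python
# Hypothetical back-translation check. Neither translator exists today; both are
# placeholders, and word overlap is only a toy proxy for semantic similarity.

def round_trip_score(sentence, translate_to_whale, translate_to_english):
    """Translate English -> whale -> English and score how much survives."""
    whale_version = translate_to_whale(sentence)
    back = translate_to_english(whale_version)
    original = set(sentence.lower().split())
    recovered = set(back.lower().split())
    return len(original & recovered) / len(original)

# Example with identity stand-ins (an imaginary, perfect translator pair):
print(round_trip_score("the pod is heading north",
                       translate_to_whale=lambda s: s,
                       translate_to_english=lambda s: s))  # -> 1.0
```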
According to primatologist Sue Savage-Rumbaugh, expertly trained bonobos could serve as bilingual interpreters, translating the argot of apes into the parlance of people, and vice versa.
Not just any trained ape will do, though. They have to grow up in a mixed Pan/Homo environment, as Kanzi and Panbanisha did.
“If I can have a chat with a cow, maybe I can have more compassion for it.”
—Jeremy Coller
Those bonobos were raised effectively from birth both by Savage-Rumbaugh, who taught the animals to understand spoken English and to communicate via hundreds of different pictographic “lexigrams,” and by a bonobo mother named Matata, who had lived for six years in the Congolese rainforests before her capture.
Unlike all other research primates—which are brought into captivity as infants, reared by human caretakers, and have limited exposure to their natural cultures or languages—those apes grew up fluent in both bonobo and human.
Panbanisha died in 2012, but Kanzi, aged 38, is still going strong, living at an ape sanctuary in Des Moines, Iowa. Researchers continue to study his cognitive abilities—Francine Dolins, a primatologist at the University of Michigan-Dearborn, is running one study in which Kanzi and other apes hunt rabbits and forage for fruit through avatars on a touchscreen. Kanzi could, in theory, be recruited to check the accuracy of any Google Translate–like app for bonobo hoots, barks, grunts, and cries.
Alternatively, Kanzi could simply provide Internet-based interpreting services for our two species. He’s already proficient at video chatting with humans, notes Emily Walco, a PhD student at Harvard University who has personally Skyped with Kanzi. “He was super into it,” Walco says.
And if wild bonobos in Central Africa can be coaxed to gather around a computer screen, Savage-Rumbaugh is confident Kanzi could communicate with them that way. “It can all be put together,” she says. “We can have an Interspecies Internet.”
“Both the technology and the knowledge had to advance,” Savage-Rumbaugh notes. However, now, “the techniques that we learned could really be extended to a cow or a pig.”
That’s music to the ears of Jeremy Coller, a private equity specialist whose foundation partially funded the Interspecies Internet Workshop. Coller is passionate about animal welfare and has devoted much of his philanthropic efforts toward the goal of ending factory farming.
At the workshop, his foundation announced the creation of the Coller Doolittle Prize, a US $100,000 award to help fund further research related to the Interspecies Internet. (A working group also formed to synthesize plans for the emerging field, to facilitate future event planning, and to guide testing of shared technology platforms.)
Why would a multi-millionaire with no background in digital communication systems or cognitive psychology research want to back the initiative? For Coller, the motivation boils down to interspecies empathy.
“If I can have a chat with a cow,” he says, “maybe I can have more compassion for it.”
An abridged version of this post appears in the September 2019 print issue as “Elephants, Dolphins, and Chimps Need the Internet, Too.”