Tag Archives: Digit

#439693 Agility Robotics’ Digit is Getting ...

Agility Robotics' Digit humanoid has been taking a bit of a break from work during the pandemic. Most of what we've seen from Agility and Digit over the past year and a half has been decidedly research-y. Don't get me wrong, Digit's been busy making humans look bad and not falling over when it really should have done, but remember that Agility's goal is to make Digit into a useful, practical robot. It's not a research platform—as Agility puts it, Digit is intended to “accelerate business productivity and people's pursuit of a more fulfilling life.” As far as I can make out, this is a fancier way of saying that Digit should really be spending its time doing dull repetitive tasks so that humans don't have to, and in a new video posted today, the robot shows how it can help out with boring warehouse tote shuffling.

The highlights here for me are really in the combination of legged mobility and object manipulation. Right at the beginning of the video, you see Digit squatting all the way down, grasping a tote bin, shuffling backwards to get the bin out from under the counter, and then standing again. There's an unfortunate cut there, but the sequence is shown again at 0:44, and you can see how Digit pulls the tote towards itself and then regrasps it before lifting. Clever. And at 1:20, the robot gives a tote that it just placed on a shelf a little nudge with one arm to make sure it's in the right spot.

These are all very small things, but I think of them as highlights because the big things seem to be more or less solved in this scenario. Digit has no problem lifting things, walking around, and not mowing over the occasional human. Once that stuff is sorted, whether the robot can work effectively in an environment like this comes down largely to these little human-obvious details, which often make the difference between success and failure.
The clear question, though, is why Digit (or, more broadly, any bipedal robot) is the right robot to be doing this kind of job. There are other robots out there already doing tasks like these in warehouses, and they generally have wheeled bases and manipulation systems specifically designed to move totes and do nothing else. If you were to use one of those robots instead of Digit, my guess is that you'd pay less for it, it would be somewhat safer, and it would likely do the job more efficiently. Fundamentally, Digit can't out box-move a box-moving robot. But the critical thing to consider here is that as soon as you run out of boxes to move, Digit can do all kinds of other things thanks to its versatile humanoid design, while your box-moving robot can only sit in the corner and be sad until more boxes show up.
“We did not set out to build a humanoid robot. We set out to solve mobility.”
—Agility CTO Jonathan Hurst
“Digit is very, very flexible automation,” Agility CTO Jonathan Hurst told us when we asked him about this. “The value of what we're doing is in generality, and having a robot that's going to be able to work carrying totes for three or four hours, then go unload boxes from trailers for three or four hours, keep up with you if you change your workflow entirely. Many of these spaces are designed specifically around the human form factor, and it's possible for a robot like Digit to do all of these different boring, repetitive jobs. And then when things get complicated, humans are still doing it.”
The value of having a human-like robot in a human environment comes into play as soon as you start thinking about typical warehouse situations that would be trivial for a human to solve but that are impossible for wheeled robots. For example, Hurst says that Digit is capable of using a stool to reach objects on high shelves. You could, of course, design a wheeled robot with an extension system to allow it to reach high shelves, but you're now adding more cost and complexity, and the whole point of a generalist humanoid robot is that in human environments, you just don't have to worry about environmental challenges. Or that's the idea, anyway, but as Hurst explains, the fact that Digit ended up with a mostly humanoid form factor was more like a side effect of designing with specific capabilities in mind:
We did not set out to build a humanoid robot. We set out to solve mobility, and we've been on a methodical path towards understanding physical interaction in the world. Agility started with our robot Cassie, and one of the big problems with Cassie was that we didn't have enough inertia in the robot's body to counteract the leg swinging forward, which is why Digit has an upright torso. We wanted to give ourselves more control authority in the yaw direction with Cassie, so we experimented with putting a tail on the robot, and it turns out that the best tail is a pair of bilaterally symmetrical tails, one on either side.
Our goal was to design a machine that can go where people go while manipulating things in the world, and we ended up with this kind of form factor. It's a very different path for us to have gotten here than the vast majority of humanoid robots, and there's an awful lot of subtlety that is in our machine that is absent in most other machines.

IEEE Spectrum: So are you saying that Digit's arms sort of started out as tails to help Cassie with yaw control?
Jonathan Hurst: There are many examples like this—we've been going down this path where we find a solution to a problem like yaw control, and it happens to look like it does with animals, but it's also a solution that's optimal in several different ways, like physical interaction and being able to catch the robot when it falls. It's not like it's a compromise between one thing and another thing, it's straight up the right solution for these three different performance design goals.
Looking back, we started by asking, should we put a reaction wheel or a gyro on Cassie for yaw control? Well, that's just wasted mass. We could use a tail, and there are a lot of nice robots with tails, but usually they're for controlling pitch. It's the same with animals; if you look at lizards, they use their tails for mid-air reorienting to land on their feet after they jump. Cassie doesn't need a tail for that, but we only have a couple of small feet on the ground to work with. And if you look at other bipedal animals, every one of them has some other way of getting that yaw authority. If you watch an ostrich run, when it turns, it sticks its wing out to get the control that it needs.
And so all of these things just fall into place, and a bilaterally symmetrical pair of tails is the best way to control yaw in a biped. When you see Digit walking and its arms are swinging, that's not something that we added to make the motion look right. It looks right because it literally is right—it's the physics of mobility. And that's a good sign for us that we're on the right path to getting the performance that we want.
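
If you want to see the yaw bookkeeping Hurst is describing in the simplest possible terms, here's a rough sketch: a swinging leg carries angular momentum about the vertical axis, and a pair of counter-swinging arms can supply an equal and opposite amount so the torso doesn't twist. The point-mass model and every number in it are placeholders made up for illustration; this is not Agility's controller or Digit's real parameters.

```python
# Minimal illustration of the yaw-cancellation idea described above.
# A swinging leg carries angular momentum about the body's vertical (yaw)
# axis; a pair of counter-swinging arms can supply an equal and opposite
# amount so the torso does not twist. Point-mass models and all numbers
# below are made-up placeholders, not Digit's real parameters.

def leg_yaw_momentum(m_leg, r_leg, omega_leg):
    """Yaw angular momentum of one leg, modeled as a point mass m_leg [kg]
    at horizontal radius r_leg [m] swinging at omega_leg [rad/s] about the yaw axis."""
    return m_leg * r_leg**2 * omega_leg

def arm_rate_to_cancel(L_leg, m_arm, r_arm, n_arms=2):
    """Angular rate the arms must swing at (about the same yaw axis) so that
    n_arms point masses m_arm at radius r_arm cancel leg momentum L_leg."""
    return -L_leg / (n_arms * m_arm * r_arm**2)

if __name__ == "__main__":
    L = leg_yaw_momentum(m_leg=8.0, r_leg=0.15, omega_leg=2.0)   # kg*m^2/s
    w_arm = arm_rate_to_cancel(L, m_arm=3.0, r_arm=0.25)
    print(f"leg yaw momentum: {L:.3f} kg*m^2/s")
    print(f"required arm swing rate: {w_arm:.3f} rad/s (opposite sense)")
```

Swing the arms at that rate in the opposite sense and the net yaw momentum stays near zero, which is the "pair of bilaterally symmetrical tails" effect Hurst describes.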
“We're going for general purpose, but starting with some of the easiest use cases.”
—Agility CTO Jonathan Hurst
Spectrum: We've seen Digit demonstrating very impressive mobility skills. Why are we seeing a demo in a semi-constrained warehouse environment instead of somewhere that would more directly leverage Digit's unique advantages?
Jonathan Hurst: It's about finding the earliest, most appropriate, and most valuable use cases. There's a lot to this robot, and we're not going to be just a tote packing robot. We're not building a specialized robot for this one application, but we have a couple of pretty big logistics partners who are interested in the flexibility and the manipulation capabilities of this machine. And yeah, what you're seeing now is the robot on a flattish floor, but it's also not going to be tripped up by a curb, or a step, or a wire cover, or other things on the ground. You don't have to worry about anything like that. So it's an easy transition next to unloading trailers, where it's going to have to be stepping over gaps and up and down things and around boxes on the floor and stuff like that. We're going for general purpose, but starting with some of the easiest use cases.
Damion Shelton, CEO: We're trying to prune down the industry space, to get to something where there's a clear value proposition with a partner and deploying there. We can respect the difficulty of the general purpose use case and work to deploy early and profitably, as opposed to continuing to push for the outdoor applications. The blessing and the curse of the Ford opportunity is that it's super interesting, but also super hard. And so it's very motivating, and it's clear to us that that's where one of the ultimate opportunities is, but it's also far enough away from a deployment timeline that it just doesn't map on to a viable business model.
This is a point that every robotics company runs into sooner or later, where aspirations have to succumb to the reality of selling robots in a long-term sustainable way. It's definitely not a bad thing, it just means that we may have to adjust our expectations accordingly. No matter what kind of flashy cutting-edge capabilities your robot has, if it can't cost-effectively do dull or dirty or dangerous stuff, nobody's going to pay you money for it. And cost-effective usefulness is, arguably, one of the biggest challenges in bipedal robotics right now. In the past, I've been impressed by Digit's weightlifting skills, or its ability to climb steep and muddy hills. I'll be just as impressed when it starts making money for Agility by doing boring repetitive tasks in warehouses, because that means that Agility will be able to keep working towards those more complex, more exciting things. “It's not general manipulation, and we're not solving the grand challenges of robotics,” says Hurst. “Yet. But we're on our way.”


#439100 Video Friday: Robotic Eyeball Camera

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
RoboCup 2021 – June 22-28, 2021 – [Online Event]
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today's videos.

What if seeing devices looked like us? Eyecam is a prototype exploring the potential future design of sensing devices. Eyecam is a webcam shaped like a human eye that can see, blink, look around and observe us.

And it's open source, so you can build your own!

[ Eyecam ]

Looks like Festo will be turning some of its bionic robots into educational kits, which is a pretty cool idea.

[ Bionics4Education ]

Underwater soft robots are challenging to model and control because of their high degrees of freedom and their intricate coupling with water. In this paper, we present a method that leverages the recent development in differentiable simulation coupled with a differentiable, analytical hydrodynamic model to assist with the modeling and control of an underwater soft robot. We apply this method to Starfish, a customized soft robot design that is easy to fabricate and intuitive to manipulate.

[ MIT CSAIL ]
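
The core mechanism in that abstract is getting gradients by differentiating through the simulator itself. As a rough illustration of that idea only (nothing like the paper's soft-body or hydrodynamic model), here's a toy 1-D rollout with a hand-written adjoint pass and plain gradient descent on the control sequence; the dynamics, constants, and target are all invented.

```python
# Toy illustration of control through a differentiable simulator: roll out
# simple 1-D dynamics with drag, backpropagate the terminal-position error
# through the rollout (adjoint pass), and do gradient descent on the controls.
# The dynamics and all constants are invented; the actual paper differentiates
# a soft-body simulator coupled with an analytical hydrodynamic model.

DT, DRAG, STEPS, TARGET = 0.05, 0.4, 40, 1.0

def rollout(u):
    """Simulate x_{t+1} = x_t + DT * (u_t - DRAG * x_t); return all states."""
    xs = [0.0]
    for t in range(STEPS):
        xs.append(xs[-1] + DT * (u[t] - DRAG * xs[-1]))
    return xs

def grad_wrt_controls(u):
    """Adjoint pass: gradient of (x_T - TARGET)^2 with respect to each u_t."""
    xs = rollout(u)
    lam = 2.0 * (xs[-1] - TARGET)       # dL/dx_T
    grad = [0.0] * STEPS
    for t in reversed(range(STEPS)):
        grad[t] = DT * lam              # dx_{t+1}/du_t = DT
        lam *= (1.0 - DT * DRAG)        # dx_{t+1}/dx_t = 1 - DT*DRAG
    return grad

u = [0.0] * STEPS
for _ in range(200):                    # plain gradient descent on the controls
    g = grad_wrt_controls(u)
    u = [ui - 0.5 * gi for ui, gi in zip(u, g)]
print(f"final position: {rollout(u)[-1]:.4f} (target {TARGET})")
```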

Rainbow Robotics, the company who made HUBO, has a new collaborative robot arm.

[ Rainbow Robotics ]

Thanks Fan!

We develop an integrated robotic platform for advanced collaborative robots and demonstrate an application of multiple robots collaboratively transporting an object to different positions in a factory environment. The proposed platform integrates a drone, a mobile manipulator robot, and a dual-arm robot to work autonomously, while also collaborating with a human worker. The platform also demonstrates the potential of a novel manufacturing process, which incorporates adaptive and collaborative intelligence to improve the efficiency of mass customization for the factory of the future.

[ Paper ]

Thanks Poramate!

At Sevastopol State University, a team from the Laboratory of Underwater Robotics and Control Systems, together with the Research and Production Association “Android Technika,” tested an underwater anthropomorphic manipulator robot.

[ Sevastopol State ]

Thanks Fan!

Taiwanese company TCI Gene created a COVID test system based on their fully automated and enclosed gene testing machine QVS-96S. The system includes two ABB robots and carries out 1800 tests per day, operating 24/7. Every hour, 96 virus sample tests are completed with an accuracy of 99.99%.

[ ABB ]

A short video showing how a Halodi Robotics robot can be used in a commercial guarding application.

[ Halodi ]

During the past five years, under the NASA Early Space Innovations program, we have been developing new design optimization methods for underactuated robot hands, aiming to achieve versatile manipulation in highly constrained environments. We have prototyped hands for NASA’s Astrobee robot, an in-orbit assistive free flyer for the International Space Station.

[ ROAM Lab ]

The new, improved OTTO 1500 is a workhorse AMR designed to move heavy payloads through demanding environments faster than any other AMR on the market, with zero compromise to safety.

[ OTTO Motors ]

Very, very high performance sensing and actuation to pull this off.

[ Ishikawa Group ]

We introduce a conversational social robot designed for long-term in-home use to help with loneliness. We present a novel robot behavior design to have simple self-reflection conversations with people to improve wellness, while still being feasible, deployable, and safe.

[ HCI Lab ]

We are one of the 5 winners of the Start-up Challenge. This video illustrates what we achieved during the Swisscom 5G exploration week. Our proof-of-concept tele-excavation system is composed of a Menzi Muck M545 walking excavator, automated and customized by the Robotic Systems Lab, and an IBEX motion platform as the operator station. The operator and remote machine are connected for the first time via a 5G network infrastructure, which was brought to our test field by Swisscom.

[ RSL ]

This video shows LOLA balancing on different terrain when being pushed in different directions. The robot is technically blind, not using any camera-based or prior information on the terrain (hard ground is assumed).

[ TUM ]

Autonomous driving when you cannot see the road at all because it's buried in snow is some serious autonomous driving.

[ Norlab ]

A hierarchical and robust framework for learning bipedal locomotion is presented and successfully implemented on the 3D biped robot Digit. The feasibility of the method is demonstrated by successfully transferring the learned policy in simulation to the Digit robot hardware, realizing sustained walking gaits under external force disturbances and challenging terrains not included during the training process.

[ OSU ]
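
The abstract doesn't spell out the transfer recipe, but the usual pattern behind this kind of result is to train in simulation while randomizing the physics so the policy doesn't overfit to one simulator. Here's a generic toy version of that idea: a 1-D balance task, random-search "training" of controller gains across randomized dynamics, then a check on dynamics the training never sampled. Everything in it is an invented stand-in, not the OSU framework or Digit's dynamics.

```python
# A generic sketch of the sim-to-real recipe referenced above: tune a
# controller in simulation while randomizing the physics, then check that it
# still works on dynamics it never saw (a stand-in for hardware). The toy
# balance model, randomization ranges, and random-search "training" are all
# invented for illustration, not the OSU framework's actual components.
import math
import random

DT, STEPS = 0.02, 300

def rollout_cost(gains, gravity, push):
    """Simulate a toy inverted pendulum stabilized by PD gains; return cost."""
    kp, kd = gains
    theta, omega, cost = 0.0, push, 0.0
    for _ in range(STEPS):
        torque = -kp * theta - kd * omega
        omega += DT * (gravity * math.sin(theta) + torque)
        theta += DT * omega
        cost += theta * theta * DT
        if abs(theta) > math.pi / 2:        # fell over
            return cost + 100.0
    return cost

def randomized_world(rng):
    """Domain randomization: sample unknown dynamics and a disturbance."""
    return rng.uniform(5.0, 15.0), rng.uniform(-1.0, 1.0)   # gravity, push

def train(iterations=300, episodes=8, seed=0):
    """Random search over gains, scored across many randomized worlds."""
    rng = random.Random(seed)
    best_gains, best_score = (1.0, 1.0), float("inf")
    for _ in range(iterations):
        gains = (rng.uniform(0.0, 40.0), rng.uniform(0.0, 10.0))
        score = sum(rollout_cost(gains, *randomized_world(rng))
                    for _ in range(episodes)) / episodes
        if score < best_score:
            best_gains, best_score = gains, score
    return best_gains

gains = train()
# "Hardware" test: dynamics and a push the training loop never sampled exactly.
print("trained gains:", gains)
print("held-out cost:", rollout_cost(gains, gravity=12.3, push=0.9))
```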

This is a video summary of the Center for Robot-Assisted Search and Rescue's deployments under the direction of emergency response agencies to more than 30 disasters in five countries from 2001 (9/11 World Trade Center) to 2018 (Hurricane Michael). It includes the first use of ground robots for a disaster (WTC, 2001), the first use of small unmanned aerial systems (Hurricane Katrina 2005), and the first use of water surface vehicles (Hurricane Wilma, 2005).

[ CRASAR ]

In March, a team from the Oxford Robotics Institute collected a week of epic off-road driving data, as part of the Sense-Assess-eXplain (SAX) project.

[ Oxford Robotics ]

As a part of the AAAI 2021 Spring Symposium Series, HEBI Robotics was invited to present an Industry Talk on the symposium's topic: Machine Learning for Mobile Robot Navigation in the Wild. Included in this presentation was a short case study on one of our upcoming mobile robots that is being designed to successfully navigate unstructured environments where today's robots struggle.

[ HEBI Robotics ]

Thanks Hardik!

This Lockheed Martin Robotics Seminar is from Chad Jenkins at the University of Michigan, on “Semantic Robot Programming… and Maybe Making the World a Better Place.”

I will present our efforts towards accessible and general methods of robot programming from the demonstrations of human users. Our recent work has focused on Semantic Robot Programming (SRP), a declarative paradigm for robot programming by demonstration that builds on semantic mapping. In contrast to procedural methods for motion imitation in configuration space, SRP is suited to generalize user demonstrations of goal scenes in workspace, such as for manipulation in cluttered environments. SRP extends our efforts to crowdsource robot learning from demonstration at scale through messaging protocols suited to web/cloud robotics. With such scaling of robotics in mind, prospects for cultivating both equal opportunity and technological excellence will be discussed in the context of broadening and strengthening Title IX and Title VI.

[ UMD ]


#439036 Video Friday: Shadow Plays Jenga, and ...

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here’s what we have so far (send us your events!):

RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
DARPA SubT Finals – September 21-23, 2021 – Louisville, KY, USA
WeRobot 2021 – September 23-25, 2021 – Coral Gables, FL, USA
Let us know if you have suggestions for next week, and enjoy today’s videos.

The Shadow Robot team couldn't resist! Our Operator, Joanna, is using the Shadow Teleoperation System which, fun and games aside, can help those in difficult, dangerous and distant jobs.

Shadow could challenge this MIT Jenga-playing robot, but I bet they wouldn't win:

[ Shadow Robot ]

Digit is gradually stomping the Agility Robotics logo into a big grassy field fully autonomously.

[ Agility Robotics ]

This is a pretty great and very short robotic magic show.

[ Mario the Magician ]

A research team at the Georgia Institute of Technology has developed a modular solution for drone delivery of larger packages without the need for a complex fleet of drones of varying sizes. By allowing teams of small drones to collaboratively lift objects using an adaptive control algorithm, the strategy could allow a wide range of packages to be delivered using a combination of several standard-sized vehicles.

[ GA Tech ]
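
The announcement mentions an adaptive control algorithm but doesn't say how it works, so here's one common flavor of the idea, adaptive gravity compensation in a 1-D vertical toy model: each drone lifts an equal share of an estimated payload weight, and the estimate is updated from the tracking error until the team hovers where it should. The masses, gains, and adaptation law are assumptions for illustration, not the Georgia Tech controller.

```python
# Toy illustration of adaptive load sharing for N drones lifting a payload of
# unknown mass (vertical motion only). Each drone applies an equal share of an
# adapted mass estimate plus PD feedback; the estimate is updated from the
# tracking error. All numbers and the adaptation law are assumptions for
# illustration, not the Georgia Tech team's published controller.
G, DT, N_DRONES = 9.81, 0.01, 4
TRUE_MASS = 6.0            # kg, unknown to the controller
KP, KD, GAMMA = 8.0, 6.0, 0.5

def simulate(steps=3000, z_ref=2.0):
    z, vz, mass_hat = 0.0, 0.0, 1.0       # initial guess: 1 kg payload
    for _ in range(steps):
        err = z_ref - z
        # Each drone contributes its share of estimated weight plus feedback.
        thrust_each = (mass_hat * G + KP * err - KD * vz) / N_DRONES
        total_thrust = N_DRONES * thrust_each
        # Adaptation: a persistent error means the weight estimate is off.
        mass_hat += GAMMA * err * DT
        # Payload dynamics.
        az = (total_thrust - TRUE_MASS * G) / TRUE_MASS
        vz += az * DT
        z += vz * DT
    return z, mass_hat

z_final, mass_final = simulate()
print(f"altitude: {z_final:.3f} m (target 2.0), mass estimate: {mass_final:.2f} kg")
```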

I've seen this done using vision before, but Flexiv's Rizon 4s can keep a ball moving along a specific trajectory using only force sensing and control.

[ Flexiv ]

Thanks Yunfan!
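
The neat part of the Flexiv demo is that a ball resting on a plate gives away its position through the wrench it exerts on a wrist force/torque sensor: for a ball pressing down with force fz at (x, y), the moments about the plate center are roughly tau_x = -fz*y and tau_y = fz*x, so a single wrench reading is enough to localize it. Here's a minimal sketch of that estimate plus a PD tilt command; the gains, frames, and sensor interface are assumptions for illustration, not Flexiv's controller.

```python
# Sketch of the force-only trick: recover the ball's position on the plate
# from one force/torque reading, then compute a PD tilt setpoint to push the
# ball toward a target point. Frames, gains, and numbers are illustrative
# assumptions, not Flexiv's actual implementation.

def ball_position_from_wrench(fz, tau_x, tau_y):
    """For a ball pressing down with force fz at (x, y), the moments about
    the plate center are tau_x = -fz*y and tau_y = fz*x; invert that."""
    if abs(fz) < 1e-6:
        raise ValueError("no contact force measured")
    return tau_y / fz, -tau_x / fz

def tilt_command(pos, vel, target, kp=0.8, kd=0.3):
    """PD law on the estimated ball state; returns a plate tilt setpoint.
    (How tilt maps to ball acceleration depends on the frame convention.)"""
    ex, ey = target[0] - pos[0], target[1] - pos[1]
    return kp * ex - kd * vel[0], kp * ey - kd * vel[1]

# Fabricated reading: a 0.2 kg ball (fz = 1.96 N) at roughly (0.05, -0.03) m.
pos = ball_position_from_wrench(fz=1.96, tau_x=0.0588, tau_y=0.098)
print(pos)                                    # -> approximately (0.05, -0.03)
print(tilt_command(pos, vel=(0.0, 0.0), target=(0.0, 0.0)))
```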

This combination of a 3D aerial projection system and a sensing interface can be used as an interactive and intuitive control system for things like robot arms, but in this case, it's being used to make simulated pottery. Much less messy than the traditional way of doing it.

More details on Takafumi Matsumaru's work at the Bio-Robotics & Human-Mechatronics Laboratory at Waseda University are at the link below.

[ BLHM ]

U.S. Vice President Kamala Harris called astronauts Shannon Walker and Kate Rubins on the ISS, and they brought up Astrobee, at which point Shannon reaches over and rips Honey right off of her charging dock to get her on camera.

[ NASA ]

Here's a quick three minute update on Perseverance and Ingenuity from JPL.

[ Mars 2020 ]

Rigid grippers used in existing aerial manipulators require precise positioning to achieve successful grasps and transmit large contact forces that may destabilize the drone. This limits the speed during grasping and prevents “dynamic grasping,” where the drone attempts to grasp an object while moving. On the other hand, biological systems (e.g. birds) rely on compliant and soft parts to dampen contact forces and compensate for grasping inaccuracy, enabling impressive feats. This paper presents the first prototype of a soft drone—a quadrotor where traditional (i.e. rigid) landing gears are replaced with a soft tendon-actuated gripper to enable aggressive grasping.

[ MIT ]

In this video we present results from a field deployment inside the Løkken Mine underground pyrite mine in Norway. The Løkken mine was operative from 1654 to 1987 and contains narrow but long corridors, alongside vast rooms and challenging vertical stopes. In this field study we evaluated selected autonomous exploration and visual search capabilities of a subset of the aerial robots of Team CERBERUS towards the goal of complete subterranean autonomy.

[ Team CERBERUS ]

What you can do with a 1,000 FPS projector with a high speed tracking system.

[ Ishikawa Group ]

ANYbotics’ collaboration with BASF, one of the largest global chemical manufacturers, displays the efficiency, quality, and scalability of robotic inspection and data-collection capabilities in complex industrial environments.

[ ANYbotics ]

Does your robot arm need a stylish jacket?

[ Fraunhofer ]

Trossen Robotics unboxes a Unitree A1, and it's actually an unboxing where they have to figure out everything from scratch.

[ Trossen ]

Robots have learned to drive cars, assist in surgeries―and vacuum our floors. But can they navigate the unwritten rules of a busy sidewalk? Until they can, robotics experts Leila Takayama and Chris Nicholson believe, robots won’t be able to fulfill their immense potential. In this conversation, Chris and Leila explore the future of robotics and the role open source will play in it.

[ Red Hat ]

Christoph Bartneck's keynote at the 6th Joint UAE Symposium on Social Robotics, focusing on what roles robots can play during the Covid crisis and why so many social robots fail in the market.

[ HIT Lab ]

Decision-making based on arbitrary criteria is legal in some contexts, such as employment, and not in others, such as criminal sentencing. As algorithms replace human deciders, HAI-EIS fellow Kathleen Creel argues arbitrariness at scale is morally and legally problematic. In this HAI seminar, she explains how the heart of this moral issue relates to domination and a lack of sufficient opportunity for autonomy. It relates in interesting ways to the moral wrong of discrimination. She proposes technically informed solutions that can lessen the impact of algorithms at scale and so mitigate or avoid the moral harm identified.

[ Stanford HAI ]

Sawyer B. Fuller speaks on Autonomous Insect-Sized Robots at the UC Berkeley EECS Colloquium series.

Sub-gram (insect-sized) robots have enormous potential that is largely untapped. From a research perspective, their extreme size, weight, and power (SWaP) constraints also force us to reimagine everything from how they compute their control laws to how they are fabricated. These questions are the focus of the Autonomous Insect Robotics Laboratory at the University of Washington. I will discuss potential applications for insect robots and recent advances from our group. These include the first wireless flights of a sub-gram flapping-wing robot that weighs barely more than a toothpick. I will describe efforts to expand its capabilities, including the first multimodal ground-flight locomotion, the first demonstration of steering control, and how to find chemical plume sources by integrating the smelling apparatus of a live moth. I will also describe a backpack for live beetles with a steerable camera and a conceptual design of robots that could scale all the way down to the “gnat robots” first envisioned by Flynn & Brooks in the ‘80s.

[ UC Berkeley ]

Thanks Fan!

Joshua Vander Hook, Computer Scientist, NIAC Fellow, and Technical Group Supervisor at NASA JPL, presents an overview of the AI Group(s) at JPL, and recent work on single and multi-agent autonomous systems supporting space exploration, Earth science, NASA technology development, and national defense programs.

[ UMD ]


#438980 Ford partners with U-M on robotics ...

Digit marches on two legs across the floor of the University of Michigan's Ford Motor Co. Robotics Building, while Mini-Cheetah—staccato-like—does the same on four, and the yellow-legged Cassie steps deliberately side to side.


#438785 Video Friday: A Blimp For Your Cat

Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We’ll also be posting a weekly calendar of upcoming robotics events for the next few months; here's what we have so far (send us your events!):

HRI 2021 – March 8-11, 2021 – [Online Conference]
RoboSoft 2021 – April 12-16, 2021 – [Online Conference]
ICRA 2021 – May 30-June 5, 2021 – Xi'an, China
Let us know if you have suggestions for next week, and enjoy today's videos.

Shiny robotic cat toy blimp!

I am pretty sure this is Google Translate getting things wrong, but the About page mentions that the blimp will “take you to your destination after appearing in the death of God.”

[ NTT DoCoMo ] via [ RobotStart ]

If you have yet to see this real-time video of Perseverance landing on Mars, drop everything and watch it.

During the press conference, someone commented that this is the first time anyone on the team who designed and built this system has ever seen it in operation, since it could only be tested at the component scale on Earth. This landing system has blown my mind since Curiosity.

Here's a better look at where Percy ended up:

[ NASA ]

The fact that Digit can just walk up and down wet, slippery, muddy hills without breaking a sweat is (still) astonishing.

[ Agility Robotics ]

SkyMul wants drones to take over the task of tying rebar, which looks like just the sort of thing we'd rather robots be doing so that we don't have to:

The tech certainly looks promising, and SkyMul says that they're looking for some additional support to bring things to the pilot stage.

[ SkyMul ]

Thanks Eohan!

Flatcat is a pet-like, playful robot that reacts to touch. Flatcat feels everything exactly: Cuddle with it, romp around with it, or just watch it do weird things of its own accord. We are sure that flatcat will amaze you, like us, and caress your soul.

I don't totally understand it, but I want it anyway.

[ Flatcat ]

Thanks Oswald!

This is how I would have a romantic dinner date if I couldn't get together in person. Herman the UR3 and an OptiTrack system let me remotely make a romantic meal!

[ Dave's Armoury ]

Here, we propose a novel design of deformable propellers inspired by dragonfly wings. The structure of these propellers includes a flexible segment similar to the nodus on a dragonfly wing. This flexible segment can bend, twist and even fold upon collision, absorbing force upon impact and protecting the propeller from damage.

[ Paper ]

Thanks Van!

In the 1970s, the CIA created the world's first miniaturized unmanned aerial vehicle, or UAV, which was intended to be a clandestine listening device. The Insectothopter was never deployed operationally, but was still revolutionary for its time.

It may never have been deployed (not that they'd admit it, anyway), but it was definitely operational and could fly controllably.

[ CIA ]

Research labs are starting to get Digits, which means we're going to get a much better idea of what its limitations are.

[ Ohio State ]

This video shows the latest achievements for LOLA walking on undetected uneven terrain. The robot is technically blind, not using any camera-based or prior information on the terrain.

[ TUM ]

We define “robotic contact juggling” to be the purposeful control of the motion of a three-dimensional smooth object as it rolls freely on a motion-controlled robot manipulator, or “hand.” While specific examples of robotic contact juggling have been studied before, in this paper we provide the first general formulation and solution method for the case of an arbitrary smooth object in single-point rolling contact on an arbitrary smooth hand.

[ Paper ]

Thanks Fan!

A couple of new cobots from ABB, designed to work safely around humans.

[ ABB ]

Thanks Fan!

It's worth watching at least a little bit of Adam Savage testing Spot's new arm, because we get to see Spot try, fail, and eventually succeed at an autonomous door-opening behavior at the 10 minute mark.

[ Tested ]

SVR discusses diversity with guest speakers Dr. Michelle Johnson from the GRASP Lab at UPenn; Dr. Ariel Anders from Women in Robotics and first technical hire at Robust.ai; Alka Roy from The Responsible Innovation Project; and Kenechukwu C. Mbanesi and Kenya Andrews from Black in Robotics. The discussion here is moderated by Dr. Ken Goldberg—artist, roboticist and Director of the CITRIS People and Robots Lab—and Andra Keay from Silicon Valley Robotics.

[ SVR ]

RAS presents a Soft Robotics Debate on Bioinspired vs. Biohybrid Design.

In this debate, we will bring together experts in Bioinspiration and Biohybrid design to discuss the necessary steps to make more competent soft robots. We will try to answer whether bioinspired research should focus more on developing new bioinspired material and structures or on the integration of living and artificial structures in biohybrid designs.

[ RAS SoRo ]

IFRR presents a Colloquium on Human Robot Interaction.

Across many application domains, robots are expected to work in human environments, side by side with people. The users will vary substantially in background, training, physical and cognitive abilities, and readiness to adopt technology. Robotic products are expected to not only be intuitive, easy to use, and responsive to the needs and states of their users, but they must also be designed with these differences in mind, making human-robot interaction (HRI) a key area of research.

[ IFRR ]

Vijay Kumar, Nemirovsky Family Dean and Professor at Penn Engineering, gives an introduction to ENIAC day and David Patterson, Pardee Professor of Computer Science, Emeritus at the University of California at Berkeley, speaks about the legacy of the ENIAC and its impact on computer architecture today. This video is comprised of lectures one and two of nine total lectures in the ENIAC Day series.

There are more interesting ENIAC videos at the link below, but we'll highlight this particular one, about the women of the ENIAC, also known as the First Programmers.

[ ENIAC Day ]
