WordNet or similar needed in AI Programming

I'm messing around with some bot stuff and would like to get my bot to be able to define a word.

I'm using VB, but can't find any examples of how to use WordNet.  I can't even download it, as the links are dead at Princeton...


I'm looking for something like COM or a DLL to search for words.  Not getting very far though.

Ideas anyone ?
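Not a VB answer, but for anyone who lands here: the WordNet database ships as plain-text files (data.noun, data.verb, …), and each synset line puts the human-readable gloss after a ' | ' separator, so you can pull definitions with nothing but string handling. A minimal Python sketch of that idea (the sample line is shortened for illustration):

```python
def parse_synset_line(line):
    """Parse one synset line from a WordNet data.* file.

    Each line holds synset metadata, then ' | ', then the gloss
    (the definition and any example sentences).
    """
    head, _, gloss = line.partition(" | ")
    fields = head.split()
    # fields: offset, lex_filenum, ss_type, word count (hex), then
    # alternating word / lex_id pairs.
    w_cnt = int(fields[3], 16)
    words = [fields[4 + 2 * i].replace("_", " ") for i in range(w_cnt)]
    return words, gloss.strip()

# A shortened sample line in the data.noun format:
sample = ("02084071 05 n 03 dog 0 domestic_dog 0 Canis_familiaris 0 "
          "001 @ 02083346 n 0000 | a member of the genus Canis")
words, gloss = parse_synset_line(sample)
print(words)   # ['dog', 'domestic dog', 'Canis familiaris']
print(gloss)   # a member of the genus Canis
```

For an existing wrapper, NLTK's wordnet corpus reader does the same lookup in one call; the parser above is only a sketch of the underlying file format.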

20 Comments | Started January 29, 2013, 04:30:12 PM


A conversation with Jerry Kaplan, Martin Ford and John Markoff in Robotics News

A conversation with Jerry Kaplan, Martin Ford and John Markoff
27 July 2015, 2:16 pm

Renowned technology commentators and authors John Markoff, Jerry Kaplan and Martin Ford entranced the audience with their perspectives on robotics, automation, AI and the impact on society. The event at IDEO on July 16 was one of a new series of salon talks organized by Silicon Valley Robotics to create a venue for enjoyable, intelligent and informed discussion around the important issues in robotics and AI.

Event organizer Tim Smith of Element Public Relations said that the evening was about putting the AND back into the human-robot dialogue, stating: “Robots OR humans is a false dichotomy.”  But before the discussion could even start to touch on whether or not the issue was automation rather than robotics, the speakers explored some of the basic economic premises or myths around jobs and technology.

The discussion continued with many interesting offshoots from ‘grandpa in a box’ and the Singularity, to feudal societies and ancient Egypt. Markoff asked whether Keynes was right or wrong. The answer was ‘right AND wrong’, or as the other speakers said, Piketty is right and Jerry Kaplan just became a neo-Marxist instead of a market capitalist.

The audience had many questions for the speakers and 90 minutes passed very quickly. Some attendees took home signed copies of Martin Ford’s most recent book, “Rise of the Robots”, and Jerry Kaplan’s book “Humans Need Not Apply”. John Markoff’s book “Machines of Loving Grace” will be available in August and he’ll be talking at the Computer History Museum in Mountain View on August 27.


After the event, Martin Ford said it was a pleasure to talk to roboticists in a constructive discussion, without it overly pushing agendas or becoming polarized. That sums up the inspiration for the SVR Influencers series: to bring important issues in robotics out for an informed discussion in a small salon setting.

The event was the second in the Silicon Valley Robotics Influencers Series. The first event focused on ‘Women in Robotics’ with 5 incredible role models sharing some of their stories. The next event in September will be “Robots and Film” with speakers including Dav Rauch, who designed the vision of films like Iron Man, Transformers, and Avatar. Future topics will be about robots changing human behavior and space robotics. To stay informed on future events you can follow the Silicon Valley Robotics calendar, or sign up for our newsletter.

Silicon Valley Robotics is a 501c6 association to support the innovation and commercialization of robotics technologies. For more information contact: Andra Keay, Managing Director, Silicon Valley Robotics andra@svrobo.org

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.

Started Today at 05:00:26 PM


Industrial robotics pioneer Joe Engelberger turns 90 in Robotics News

Industrial robotics pioneer Joe Engelberger turns 90
26 July 2015, 5:01 am

Joseph F. Engelberger, Ars Electronica Symposium 1996 (image courtesy of Ars Electronica). “Happy Birthday!” to industrial robotics pioneer Joe Engelberger who turns 90 today, July 26. That also means that robotics as an industry is around 60 years old. Joe Engelberger and George Devol formed Unimation, the world’s first robotics company, in 1956, and the first Unimate arm was installed in General Motors in 1961, transforming the automotive industry. Today the robotics industry is a multibillion dollar business. While the automotive industry is still the largest piece of the robotics pie, the range of commercial uses for robotics is expanding into many of the service areas Engelberger also pioneered decades ago.

The Unimate mechanical arm (image courtesy of George C. Devol estate). The first Unimate hydraulic arm was installed in General Motors’ Trenton, New Jersey plant to do a ‘hot hazardous job’ moving heavy metal pieces to and from the die cast machine. Over the next few years, other automobile companies like Ford, Chrysler and Japanese manufacturers purchased robots from Unimation. Unimates designed for welding, painting and gluing were in production by 1966.

In 1977, Unimation purchased Victor Scheinman’s company, Vicarm, which led to the development of the PUMA robot, or Programmable Universal Machine for Assembly. The VAL software became the industry leader. But by 1981, automobile manufacturers like GM wanted to shift from hydraulics to electric motors, a move Engelberger resisted. GM announced a new partnership with Fanuc, and Unimation was acquired by Westinghouse in 1982 for $107 million. By 1988, the core Unimation team had all moved on to other endeavors and Unimation was sold to Staubli. The first Unimate arm ended up on display in the Smithsonian, after 100,000 hours of operation, as the world’s first industrial robot.

Engelberger and Devol are served drinks by one of their mechanical arms (image courtesy of George C. Devol estate). Starting the industrial robotics industry wasn’t enough for Engelberger though, who moved on to pioneer the service robotics industry. Engelberger formed HelpMate Robotics, initially known as Transitions Research Corp, with the vision of creating mobile robots capable of navigating hospital corridors, delivering meals, pharmaceuticals, patient records and other goods.

In an interview with Bloomberg Business Week in 1997, Engelberger said that he’d like to be remembered as the father of the home robot. “Common sense tells you it’s got to end up a bigger market than factory robots.”


By 1997, when HelpMate was acquired by Cardinal Health, more than 100 service robots were roaming hospital corridors. HelpMate had raised $6 million through an IPO, and had a partnership with Otis Elevators for the distribution of HelpMate robots in Europe. Engelberger received the Japan Prize from the Science and Technology Foundation of Japan for ‘systems engineering for an artifactual environment’.

Engelberger’s next plans were to launch an elder-care robot. Most old folks who enter nursing homes are mentally alert and healthy, Engelberger noted in his 1997 interview with Bloomberg Businessweek. “They just aren’t nimble enough to care for themselves.” All the technology developed for patient care would be useful for elder-care robots. Engelberger believed that adding certain repetitive household jobs, such as loading the dishwasher or microwave oven, would be fairly easy. Others, including meal preparation, might involve special-purpose attachments. And for finding packaged foods, the robot could have a built-in bar-code reader. With the population aging, demand could surge, bringing costs down to something as affordable as a car.


“We wish our friend Joe Engelberger all the best for his milestone 90th birthday,” said Jeff Burnstein, President of the Robotic Industries Association (RIA). “Joe was instrumental in founding RIA over 40 years ago, bringing together early robotics leaders. We’re very thankful for his pioneering role and honor him every year by presenting the Joseph F. Engelberger award to individuals making outstanding contributions to the field of robotics.” Since the award’s inception in 1977, it has been presented to 116 robotics leaders from 17 different nations.

Recent winners of the Engelberger award include Dean Kamen, founder of the FIRST Robotics competition, and Rodney Brooks, co-founder of iRobot and current Chairman and CTO of Rethink Robotics. “Joe Engelberger is such a name in robotics,” stated Rodney Brooks after receiving the Engelberger award for Leadership in 2014. “It’s a real honor to win an award in his name. He is a monumental figure in the field of robotics.”

“Joe Engelberger’s invention of the first industrial robot inspired many of us to pursue a career in this amazing field,” stated Arturo Baroncelli, President of the International Federation of Robotics (IFR) and also a past winner of the Engelberger award. “Thanks to his effort and passion for technology, we have a strong robotics industry today.”

In his honor, RIA will soon be launching a special Engelberger tribute site, which will be found at www.robotics.org. Silicon Valley Robotics joins the RIA and IFR in wishing Joe Engelberger all the best.

Source: Robohub


Started Today at 11:00:59 AM


Robotics fundings, IPOs and acquisitions for June/July in Robotics News

Robotics fundings, IPOs and acquisitions for June/July
24 July 2015, 7:05 pm


Below are recaps of the eleven June and July fundings, IPOs and acquisitions for robot and robotics related companies, followed by a thoughtful piece by Farhad Manjoo of the NY Times about why start-ups are staying private. UPDATED 7/30/15 to add Rethink Robotics April add-on to an earlier Series D funding and Siemens’ investment in Magazino in May.

Fundings and IPOs

Surgical device maker TransEnterix raised $50 million from an IPO and market switch that transferred their stock from the over-the-counter market to the small-cap NYSE MKT exchange. This followed on the heels of Corindus Vascular Robotics, which raised $42 million in late May by offering 11 million shares at $3.80 and then went public on the same NYSE MKT exchange.

SoftBank, along with O’Reilly AlphaTech and Shasta Ventures, invested $20M into Fetch Robotics, which recently demonstrated their Fetch and Freight robot system at ICRA in May. Also at ICRA was the Amazon picking challenge. Both Fetch and the challenge are focused on the material handling aspects of warehousing and distribution center operations, and represent a hot area for capital expenditures in those applications.

Foxconn and Alibaba invested $118M each into SoftBank Robotics in a strategic move sure to have long-term ramifications. Foxconn and Alibaba’s combined investment equals a 40% share of the new SoftBank Robotic Group, an entity which holds the France-based Aldebaran Robotics, the new Pepper social robot, and the Pepper SDK software group.

Orbotix, the maker of the Sphero line of toys, received $45M in equity funding from the Walt Disney Company and Mercato Partners bringing their total funding to date to $80M. Disney and Sphero have a new Star Wars game app.

Robotic flatbread maker Zimplistic got $11.5M from two venture funds: a Southeast Asia private equity firm, NSI Ventures, and the venture arm of Robert Bosch GmbH, the global supplier of technology products and services for the auto and home. Zimplistic, the Singapore and Silicon Valley startup behind the Rotimatic, the home flatbread maker, sold over $5 million at their launch last year. Today, they have a backlog of $72M in orders and over 5,000 distribution partnership requests.

Zymergen, which integrates robots, big data, and software to unlock the potential of biology for future materials, received $44 million in funding from Data Collective, True Ventures, Draper Fisher Jurvetson and other VCs.

Teledyne invested capital to increase their ownership stake to 37% in Ocean Aero, the San Diego based unmanned naval systems startup. The amount of the transaction was undisclosed. Ocean Aero is developing autonomous, highly persistent, wind/electric unmanned vessels for surveillance, research and oil and gas monitoring.

Rethink Robotics added $13.4M to their Series D round of funding in April (the original Series D was for $24.6M in December) and added a new participant: GE Ventures.

Siemens, the conglomerate, in May, bought 49.9% of Magazino GmbH, a German materials handling startup, for an undisclosed amount.

Acquisitions

Vecna, a robotic mobility provider for hospitals and warehouses, acquired VGo, a provider of mobile telepresence devices, for an undisclosed amount. Similar to the strategic acquisition of Segway by Chinese firm Ninebot – a situation where Segway was suing Ninebot for IP infringement, and where Segway’s IP passed to Ninebot – VGo’s IP, which has already undergone thorough court vetting, passes to Vecna in the acquisition.

Bimba Manufacturing acquired Intek Products for an undisclosed amount. Both are American companies that manufacture actuators and other automation and motion components for robot makers and others.

Video compression and image processing semiconductor manufacturer Ambarella acquired VisLab, a lab within the University of Parma focused on perception systems, for $30M. VisLab has developed computer vision and intelligent control systems for automotive and other commercial applications including ADAS (Advanced Driver Assistance Systems) and several generations of autonomous vehicle driving systems including “Porter,” the autonomous vehicle that made a 13,000 km trip from Italy to China in 2010.

Canadian sUAS developer Draganfly Innovations was acquired by TRACE Live Networks, a California company with operations in Calgary and Toronto. No information was available as to the amount of the transaction. TRACE said that the two companies will collaborate on the further advancement of the TRACE Live Network auto-follow devices and live-streaming platform. They will also initiate development of several commercial applications for TRACE’s visually intelligent SmartCamera technology.

Why are startups staying private? I’ve excerpted a few paragraphs from the NY Times article “As More Tech Start-Ups Stay Private, So Does the Money” by Farhad Manjoo. The whole article is worth reading.

Something strange has happened in the last couple of years: The initial public offering of stock has become déclassé. For start-up entrepreneurs and their employees across Silicon Valley, an initial public offering is no longer a main goal. Instead, many founders talk about going public as a necessary evil to be postponed as long as possible because it comes with more problems than benefits.

“If you can get $200 million from private sources, then yeah, I don’t want my company under the scrutiny of the unwashed masses who don’t understand my business,” said Danielle Morrill, the chief executive of Mattermark, a start-up that organizes and sells information about the start-up market. “That’s actually terrifying to me.”

Silicon Valley’s sudden distaste for the I.P.O. — rooted in part in Wall Street’s skepticism of new tech stocks — may be the single most important psychological shift underlying the current tech boom. Staying private affords start-up executives the luxury of not worrying what outsiders think and helps them avoid the quarterly earnings treadmill.

During a recent presentation for Andreessen Horowitz’s limited partners — the institutions that give money to the venture firm — Marc Andreessen, the firm’s co-founder, told the journalist Dan Primack that he had never seen a sharper divergence in how investors treat public- and private-company chief executives. “They tell the public C.E.O., ‘Give us the money back this quarter,’ and they tell the private C.E.O., ‘No problem, go for 10 years,’ ” Mr. Andreessen said.

At some point this tension will be resolved. “Private valuations will not forever be higher than public valuations,” said Mr. Levitan, of Maveron. “So the question is, Will private markets capitulate and go down or will public markets go up?”

If the private investors are wrong, employees, founders and a lot of hedge funds could be in for a reckoning. But if they’re right, it will be you and me wearing the frown — the public investors who missed out on the next big thing.

Source: Robohub


1 Comment | Started Today at 05:00:05 AM


Move Over Siri: Cubic Robotics Releasing New Artificial Intelligence Assistant With Personality in AI News

Move Over Siri: Cubic Robotics Releasing New Artificial Intelligence Assistant With Personality
30 July 2015, 12:00 am

Introducing Cubic: the new personal assistant for everything in your life. We're all familiar with the likes of Siri, GoogleNow, Cortana, and, most recently, Amazon Echo.

Huffington Post - Technology

Source: AI in the News


Started Today at 05:00:06 AM


How to Help Self-Driving Cars Make Ethical Decisions in AI News

How to Help Self-Driving Cars Make Ethical Decisions
29 July 2015, 12:00 am

A philosopher is perhaps the last person you'd expect to have a hand in designing your next car, but that's exactly what one expert on self-driving vehicles has in mind. Chris Gerdes, a professor at Stanford University, leads a research lab that is experimenting with sophisticated hardware and software for automated driving.

MIT Technology Review

Source: AI in the News


Started July 30, 2015, 11:00:07 PM


Robotic insect mimics Nature's extreme moves in AI News

Robotic insect mimics Nature's extreme moves
30 July 2015, 8:24 pm

By analyzing the natural mechanics that enable the water strider to launch off the water's surface, scientists have emulated this extreme form of locomotion in novel robotic insects.

Source: Artificial Intelligence News -- ScienceDaily

Started July 30, 2015, 11:00:07 PM


Robots Podcast: Cheetah 2, with Sangbae Kim in Robotics News

Robots Podcast: Cheetah 2, with Sangbae Kim
24 July 2015, 6:47 pm


Link to audio file (35:34). Transcript below.

In this episode, Audrow Nash interviews Sangbae Kim, from the Massachusetts Institute of Technology (MIT), at the International Conference of Robotics and Automation (ICRA) 2015. They speak about an electrically-powered quadruped called the Cheetah 2.


Sangbae Kim

Prof. Sangbae Kim is the director of the Biomimetic Robotics Laboratory and an Associate Professor of Mechanical Engineering at MIT. His research focuses on bio-inspired robotic platform design, extracting principles from complex biological systems. Kim’s achievements in bio-inspired robot development include the world‘s first directional adhesive, inspired by gecko lizards, and a climbing robot, Stickybot, that utilizes the directional adhesives to climb smooth surfaces, featured in TIME’s best inventions of 2006. The MIT Cheetah achieves stable outdoor running at an efficiency comparable to animals, employing biomechanical principles from studies of the best runners in nature. This achievement was covered by more than 200 articles. He is a recipient of the King-Sun Fu Memorial Best Transactions on Robotics Paper Award (2008), the DARPA YFA (2013), and the NSF CAREER award (2014).


Transcript

Audrow Nash: I’m at ICRA 2015 with Professor Kim from MIT. Professor Kim, would you introduce yourself?

Sangbae Kim: I’m an associate professor in Mechanical Engineering at MIT. I’ve been working on bio-inspired robots focusing on locomotion capability.

Audrow Nash: What is the goal and motivation of your research?

Sangbae Kim: Our motivation is to be able to develop a robot that can move around in a dangerous environment like Fukushima or a building on fire, so that we can send in a robot without worrying about putting a human in danger. Robotic technology stems from manufacturing applications, but in these applications robots don’t need to move in an unstructured environment; they are mostly moving on a rail or not moving around at all. We need a robot that can actually move around unstructured environments.

We’ve been focusing on locomotion capability because that’s currently a missing component in robotics technology. With the MIT Cheetah robot we developed a lot of component technology specifically for locomotion. Right now we’re moving a little beyond just locomotion or walking – we’re trying to combine this with vision so the robot can actually navigate by itself.

Audrow Nash: Let’s talk a bit about the Cheetah 2. Can you describe it for us?

Sangbae Kim: Cheetah 2 is the second generation of MIT Cheetah. We started five years ago, and we envisioned developing an electrically powered legged robot, because there was no electrically powered robot that could run or jump at that time. Most of the electrically powered robots out there are very slow. Boston Dynamics developed their hydraulic version; it’s vastly successful but very noisy and very inefficient. We did some investigation into the area and found huge potential. We started developing our all-electronic components … electric motors, electric driver amplifier, driver motor … and the locomotion algorithms about five years ago.

Cheetah 2 is the latest version of the Cheetah, and it has a custom motor designed by our MIT team.

Audrow Nash: Starting from the very general  … it looks like a cheetah. It’s a quadruped …

Sangbae Kim: Yes, it looks like a quadruped; we modeled it on a cheetah but it looks quite different from a cheetah. It’s really hard to be close to an animal. If you look at the many quadrupeds that run really well, their morphologies are not that different. We don’t actually have that exotic flexible cheetah spine.

Audrow Nash: It’s a rigid back?

Sangbae Kim: Yes, it’s a rigid back right now.

Audrow Nash: There’s one joint on the back?

Sangbae Kim: Cheetah 1 had this joint with a differential power drive, but we actually dropped that in Cheetah 2, which improves the stiffness and structural integrity. Cheetah 2 is a rather simpler design but is more capable. Cheetah 2 can run outside; it’s about 33 kilograms, about a meter long, and about 80 centimeters high. It’s like BigDog in size, and it keeps on running up to 6 meters per second without a vision sensor. We believe that with the vision sensor it can go up to 10 meters per second.

Audrow Nash: What would the vision sensor do?

Sangbae Kim: It will detect the ground height more accurately. So far most running is done without any vision, which means your robot is blind. Basically you close your eyes and run as fast as possible, and it’s much more dangerous. It’s very important for the contact angle of the leg to be accurate, and as you get faster and faster that accuracy requirement becomes severe. With blind control we could get only 6 meters per second, but the mechanical capability of the robot is way beyond that. I think 10 meters per second, no problem.

Currently the Cheetah has vision sensors; we have a LIDAR that can detect an obstacle and it will adjust its step to jump over it properly, autonomously.

Audrow Nash: What are the differences between Cheetah 1 and Cheetah 2?

Sangbae Kim: There are some differences. Cheetah 1 mostly runs on a track with some support, whereas Cheetah 2 is fully 3D. Cheetah 1 is held in a plane so it can’t fall sideways; it can only go up and down and rotate. It’s more like a 2D robot. And its motor is less powerful because it’s a commercially available motor. It also has an exotic spine structure, so it can actually use its spine in a galloping gait. There are some differences in the motors in the legs, and a big difference in the algorithms.

Audrow Nash: Can you talk a little bit about that?

Sangbae Kim: In Cheetah 1 we used a very simple algorithm; we define a trajectory and then apply some sort of impedance, like vertical stiffness and damping. We just follow the trajectory based on the impedance, which is just a push and a tracking gain.

Cheetah 2 is completely different; it does not rely too much on position tracking. As soon as the feet hit the ground, stance control is not heavily dependent on position tracking; we are mostly prescribing force. We call that impulse planning.

In locomotion the fundamental requirement is that we have to fight against gravity. You don’t want to fall; you have to keep your body up. Then you’re constantly losing momentum by gravity. If you’re standing, gravity pulls you down so you need to generate the same amount of force to cancel that. If you’re running, you need to generate that same amount of impulse in a shorter amount of time because you’re staying on the ground a much shorter time than when you’re standing.

We calculate how much impulse we need to generate each step, depending on speed and gait, and we actually prescribe that force profile to match the impulse on each step. That’s the basic control algorithm: during the stance, force control dominates, whereas Cheetah 1 is still position tracking with this virtual spring control.
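The impulse bookkeeping Kim describes can be sketched numerically: over a full stride, gravity removes vertical momentum that the stance phase must put back. The mass below is the 33 kg quoted later in the interview; the stride timing and duty factor are assumed, illustrative values, not the Cheetah's actual gait parameters.

```python
# Back-of-envelope impulse planning for a running quadruped.
m, g = 33.0, 9.81          # robot mass (kg) and gravity (m/s^2)
stride_time = 0.25         # one full stride (s) -- assumed
duty_factor = 0.4          # fraction of the stride spent on the ground -- assumed

# Gravity removes m*g*stride_time of vertical momentum per stride;
# stance must supply the same impulse in a shorter window.
impulse_needed = m * g * stride_time           # N*s per stride
stance_time = duty_factor * stride_time
mean_force = impulse_needed / stance_time      # equals m*g / duty_factor
print(round(impulse_needed, 1), "N*s ->", round(mean_force, 1), "N mean stance force")
```

Note how the mean stance force is body weight divided by the duty factor: the shorter the ground contact, the harder each step must push, which is the "shorter amount of time" point in the paragraph above.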

Audrow Nash: So for Cheetah 2 you’re planning how hard you’re pushing the foot into the ground … how much force you’re doing?

Sangbae Kim: Yes, we give some sort of force profile as a function of time.

Audrow Nash: And it’s time-dependent, the rate that it’s running?

Sangbae Kim: It’s time dependent, yes.

Audrow Nash: The DARPA Robotics Challenge involves humanoids. What are the advantages of using a quadruped over a biped?

Sangbae Kim: I’m a strong believer in quadrupeds. I think bipeds are just inherently less stable because the support polygon is smaller and there are fewer legs. With a quadruped you can stand with one leg up and you still have a tripod. The majority of mammals in the world are quadrupeds. It’s hard to draw evolutionary reasons, but they have much more versatile capabilities.

Of course humans have amazing capabilities to go over rough terrain, but they can become quadrupeds: think of rock climbing, or those enthusiasts you see running and jumping off buildings …

Audrow Nash: The Parkour people?

Sangbae Kim: Yes, that’s actually quadruped locomotion, not biped. I think quadruped has a huge potential of becoming a major mode of locomotion. If you see the DARPA Robotic Challenge, I think you will see a lot of teams actually use four legs or some sort of wide support polygon to ensure there’s stability.

Audrow Nash: Can you talk a bit about the actuators on the Cheetah 2?

Sangbae Kim: That’s probably the one component we spent the most time on and probably why I started this project. In leg locomotion there’s a predominant model called a ‘spring-loaded inverted pendulum’, and it’s a very remarkable model that has enlightened a lot of researchers.

Audrow Nash: Can you describe it?

Sangbae Kim: It’s like a pogo stick, basically. If you look at human running or animal running, it’s very complex; there are so many joints and a lot of pieces are moving in different ways …

Audrow Nash: Elastic tendons and everything  …

Sangbae Kim: Yes. Then these biomechanics researchers in the late 70s and 80s found that, even though there are very complex multi-body systems interacting with the ground, the bottom line is that your center of mass trajectory during the stance pretty much looks like a pogo stick’s. It’s as if the entire leg becomes a spring; the system is a mass with a spring leg bouncing off the ground, and that describes pretty much the fundamental mechanics of most runners. It’s basically a reduced model of running, but it surprisingly covers many species and many types of running.

People started to realize, “Wow, so I don’t need to design these 20 muscles and 7 jointed legs to mimic human or animal running. You can just have a spring that takes care of everything.” This is theoretically true, but stability wasn’t very good, so a lot of people were trying to mimic this spring.
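A minimal reading of that model is a one-dimensional mass on a springy leg. The sketch below (all parameters illustrative, not the Cheetah's; symplectic Euler integration) drops such a hopper and shows it bouncing back to roughly the same apex height each cycle, the "spring takes care of everything" behavior described above:

```python
# 1-D spring-mass hopper: the simplest caricature of the
# spring-loaded inverted pendulum, where the whole leg is one spring.
m, g, k, leg = 33.0, 9.81, 8000.0, 0.6   # mass, gravity, leg stiffness, rest length
y, v, dt = 1.0, 0.0, 1e-4               # drop height (m), velocity, time step (s)
apexes, prev_v = [], 0.0
for _ in range(100_000):                 # simulate 10 s
    spring = k * (leg - y) if y < leg else 0.0   # spring acts only during stance
    v += (spring / m - g) * dt           # update velocity, then position
    y += v * dt
    if prev_v > 0 >= v:                  # upward-to-downward: passed an apex
        apexes.append(y)
    prev_v = v
print([round(h, 3) for h in apexes[:3]])  # apex heights stay near the 1.0 m drop
```

With no damping and no actuation the hopper conserves energy, so every flight apex returns to about the release height; real legs add force control on top of this passive bounce.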

That’s when we started building the hypothesis that we’re not trying to control our leg to be like a spring; we’re probably controlling force. And that’s why we started developing this actuator, which has a very small gear ratio and a very specialized motor that can generate high torque per mass. We’re trying to maximize the force bandwidth, how quickly you can change force, and at the same time maximize the transparency. Transparency means the mechanical impedance between the actuator force and the end effector … basically from the electromagnetic component, which in an electric motor is between the stator and the rotor. If you count all the masses and the frictions and inertias from there, all the way to the foot, that inertia and friction will mitigate the force transmission.
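The transparency point can be made concrete with the standard reflected-inertia relation: the rotor inertia seen at the joint grows with the square of the gear ratio. The numbers below are illustrative, not the Cheetah's actual specs:

```python
# Reflected inertia at the joint: J_joint = J_rotor * N**2,
# so the gear ratio N enters squared. Illustrative numbers only.
J_rotor = 1e-4                 # rotor inertia in kg*m^2 (assumed)
for N in (1, 6, 100):          # direct drive, small single-stage ratio, conventional
    J_joint = J_rotor * N ** 2
    print(f"{N:>3}:1 gearing -> {J_joint:.4f} kg*m^2 seen at the joint")
```

A 100:1 drive makes the same rotor feel 10,000 times heavier at the output, which is why heavily geared robots cannot "feel" contact through the motor, while a low-ratio actuator keeps the motor and foot mechanically close.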

Audrow Nash: So some force is lost by the arm or the leg moving?

Sangbae Kim: Exactly. If you look at 99% of the robotic systems in the world, including manufacturing robots, that transparency is nearly zero. If somebody pushes the robotic arm, the motor cannot feel that, which means your motor cannot actually control force properly at the end effector. So what’s the solution? They add force sensors and then get the feedback. But that’s a very dangerous idea when the robot actually interacts with the ground because it’s non-collocated sensing.

Audrow Nash: Can you explain non-collocated sensing?

Sangbae Kim: Non-collocated sensing is when you’re trying to control the force at the end effector but your force is transmitted far away and there are dynamics between them, or there is delay between them. If you try to close the loop through this non-collocated sensing, there’s going to be inherent instability. It’s a very well known phenomenon. In the 90s, people showed how to develop better controllers to deal with contact instability by developing different control algorithms.
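The instability Kim mentions can be demonstrated with a toy integral force loop: the same gain that settles immediately with collocated feedback diverges once the measurement arrives a couple of control steps late. This is a deliberately simplified model, not the Cheetah's controller:

```python
def settle(gain, delay, target=1.0, steps=200):
    """Integral force control where the feedback signal arrives
    `delay` control steps late (a crude model of non-collocated sensing)."""
    u, history = 0.0, [0.0] * (delay + 1)
    for _ in range(steps):
        measured = history[0]            # stale measurement
        u += gain * (target - measured)  # integral correction
        history = history[1:] + [u]
    return u

print(settle(gain=1.0, delay=0))   # collocated: settles exactly at the target
print(settle(gain=1.0, delay=2))   # same gain, delayed sensing: oscillates and diverges
```

With zero delay the loop locks onto the target in one step; with a two-step delay the closed loop has a pole pair outside the unit circle, so the oscillation grows without bound, which is the "inherent instability" of closing a high-gain loop through non-collocated sensing.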

Audrow Nash: And so you make it elastic, in a sense, through the spring?

Sangbae Kim: We make it collocated. We don’t sense anything at the end effector; we sense through the actuator and then minimize the transmission. It turns out that is actually much closer to animals because from our muscle to our end effector, there’s almost no transmission and your Golgi tendon is combined with your muscle, which is a force sensor, and that can sense pretty much almost everything in the end effector. It’s not well proven, but think of when a human is running and balancing how much we are relying on the skin force sensor …

Audrow Nash: So it’s mostly muscles as sensors?

Sangbae Kim: Mostly muscle force for perception. Of course the skin sensor should detect the ground, in an “I know my foot is on the ground” kind of cue … but I don’t think the actual accurate force measurement should be done in the skin. Imagine you have a different type of shoe on, or you tie your shoelaces a bit tighter today, and then you have to adjust your step … that probably doesn’t make sense, right? In our case it’s a very practical reason. Non-collocated force sensing is not practical or feasible, especially when we need high gain and high performance. Any force sensor in the foot would be destroyed in less than 30 minutes.

Locomotion is surprisingly violent. With the MIT Cheetah, peak force is about 500-600 Newtons, and that’s happening 4 times a second. This means you’re holding your robot and hammering the end effector at 4 hertz with 500 Newtons, which is about a 50 kilogram force.

Audrow Nash: And how do you not strip your gears with all of this?

Sangbae Kim: That’s the key of our low-impedance transmission.

Audrow Nash: Low-impedance in what sense?

Sangbae Kim: Low inertia, low friction, and no stiffness. The inertia and the friction are most of it.

If you go back to conventional robotics, there’s something like a 200:2 gear reduction, a special harmonic drive, and those cause enormous inertia. If you do that, the robot is going to break somewhere; those forces cannot really transmit. In our case the rotor can just give, so the electromagnetic interaction will be the only thing that receives the force. We can even regenerate power, which is how we save energy when we’re running.

Audrow Nash: So a spring is in series with the motor …

Sangbae Kim: We don’t have a spring in our system.

Audrow Nash: But the tendon system that you have …

Sangbae Kim: No, those are harder than steel. There’s not a single mechanically compliant element in our system except the paw … the foot has little rubber pads to minimize impact and vibration … but beyond that, everything is rigid.

Audrow Nash: Where’s the energy storage that you’re talking about?

Sangbae Kim: All the energy can transfer to the motor side because there’s very little inertia, very little impedance. The motor will regenerate the energy and return it to the battery. It’s much like the wheels of an electric car: when they brake, they do negative work and energy goes back to the battery. That’s the key element: maximizing transparency, which maximizes our force bandwidth and at the same time makes the system inherently much more robust against impact, because the energy is recovered rather than just breaking a gear.
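The regeneration idea can be sketched in a few lines. This is a toy model with made-up numbers, not the robot’s actual drivetrain: positive mechanical power drains the battery, while negative-work power is partly returned.

```python
# Toy model of regenerative braking. Positive mechanical power drains the
# battery; negative power (the motor resisting motion) partly flows back.
def battery_energy_change(torque_nm, speed_rad_s, dt_s, eta=0.7):
    """Battery energy change in joules over a small time step.
    eta is a hypothetical electrical recovery efficiency."""
    p_mech = torque_nm * speed_rad_s
    if p_mech >= 0:
        return -p_mech * dt_s        # energy drawn from the battery
    return -p_mech * dt_s * eta      # a fraction returned to the battery

# Motor torque opposing the joint's motion -> negative work -> energy back:
print(battery_energy_change(10.0, -5.0, 0.1))   # ~3.5 J returned
```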

Audrow Nash: So in our skeletal system, we do a little bit of energy storage because of the elastic component to our composition?

Sangbae Kim: That’s very different, yes.

Audrow Nash: But you’re taking the energy that’s generated back, and instead of storing it in a tendon system, it goes back to the battery.

Sangbae Kim: Exactly.

Audrow Nash: Are there conversion losses in this?

Sangbae Kim: There are some conversion losses, yes.

Audrow Nash: But you’ve shown in your talk the efficiency of this system, and it’s remarkably efficient and comparable to biology.

Sangbae Kim: Our robots are far more efficient than most of the robots …

Audrow Nash: Could you place the efficiency of the Cheetah 2 compared to other animals and robotics systems?

Sangbae Kim: The cost of transport is a proper measure of efficiency in locomotion, and a human is at about 0.3, which is pretty good. One thing I need to mention, though, is that this number depends on mass. For example, horses are better than humans, and rats or dogs are worse than humans because they’re lighter. Bigger animals tend to have a smaller number; cheetahs are around 0.4, and our robot is 0.45-0.5.
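Cost of transport is dimensionless: power consumed divided by weight times speed. A small sketch (the mass and power figures below are illustrative, not from the interview):

```python
# Cost of transport: CoT = P / (m * g * v); lower is better.
def cost_of_transport(power_w, mass_kg, speed_m_s, g=9.81):
    return power_w / (mass_kg * g * speed_m_s)

# Hypothetical example: a 33 kg robot drawing 973 W at 6 m/s lands right
# around the 0.5 quoted for the MIT Cheetah.
print(round(cost_of_transport(973.0, 33.0, 6.0), 2))  # → 0.5
```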

Audrow Nash: An actual cheetah is 0.4, and your cheetah is too?

Sangbae Kim: An actual cheetah is 0.4, and our cheetah is 0.45. It’s nearly the same. The Honda ASIMO, which is one of the most remarkable robots, is at 2, so about 7 times worse.

Audrow Nash: How about Big Dog?

Sangbae Kim: Big Dog is at about 15.

Audrow Nash: With its hydraulic actuators, it’s at 15?

Sangbae Kim: Yes, it’s about 30 times worse than the actual animal. It’s heavier, so it should really be compared with an animal of matching mass … I don’t have the exact number.

We’re nearly matching animal efficiency. I think this is because of our unique actuation system. The major cause of this efficiency is probably not regeneration but the lightweight transmission: you have far fewer components and you lose very little through the transmission. That also allows you to move fast.

As you mentioned, animals and humans should be better because they’re storing and restoring energy; they should be more efficient. But that’s not really true, because animals are only transparent in a sense. If I push you and you relax your muscle, you can make your muscle impedance almost zero. But when you’re storing energy in the muscle-tendon system, the muscle actually has to hold the force. Muscle is very good at holding force, but it still consumes energy doing so. The same is true for our system, but we can regenerate the energy, whereas the muscle cannot.

Another thing is that the animal efficiency of 0.4 I mentioned is measured through metabolic energy, which is estimated from oxygen consumption. This involves many other processes, and then we’re comparing it with our electricity consumption, which is technically not fair in my opinion, because electricity is a refined source of fuel compared to the fat or sugar we consume. There’s a lot of energy loss through this process: the body breaks down sugar to make ATP, and if you measured efficiency from ATP consumption to mechanical work it would probably be a different story. I think that if you use an electric motor you should beat any animal by a factor of 2 or so.

I think our robot is not nearly optimized; it is one data point we just built. We didn’t actually put any effort into making the robot more efficient. We paid attention to those principles when we designed it, which is probably one reason it’s efficient, but we didn’t add any components, mechanisms, or control algorithms optimized for efficiency. I believe we can increase efficiency by a factor of 2.

Audrow Nash: How?

Sangbae Kim: A couple of things, like control optimization …

Audrow Nash: What do you mean by that?

Sangbae Kim: You can actually optimize for energy consumption. Our controllers are not optimized at all; we haven’t studied that yet. In the MIT Cheetah, the major source of energy loss is force generation, which means just torque generation. This is very consistent with the biomechanics data … animals consume about 70% of their energy just producing force, which means energy is not going to be proportional to mechanical work. Mechanical work is a bad metric in locomotion, because we do a lot of negative work and we don’t really know how to account for that negative work.

In the MIT Cheetah it’s very easy, because we know how much energy is actually recovered and how much is burnt; we can measure every single bit of it. One way to save energy is adding parallel springs, not series elastic elements. Series elasticity doesn’t really save on force generation, but parallel springs do. We actually have a paper about adding parallel springs.

Audrow Nash: Can you describe what a parallel spring would be?

Sangbae Kim: If you think about a desk lamp, it has springs at the joints so that it balances itself. It doesn’t have an actuator, but it holds its position pretty well; you can move it around and it still holds position, because there’s a little friction at the joints. The springs are what compensate for gravity.

We can do the same thing on the Cheetah leg, because the majority of the energy consumption comes from generating vertical force, fighting against gravity. We add a spring in parallel with the actuator to help generate the vertical gravitational force, and that reduces the torque the motor has to generate, which in turn reduces resistive heating dramatically.
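The heating argument can be sketched numerically: a motor’s Joule loss grows with the square of the torque its windings must produce, so offloading most of the gravity torque to a spring cuts the loss sharply. All constants below are hypothetical, not the Cheetah’s actual motor parameters.

```python
# Motor Joule loss scales with torque squared: P_loss = (tau / kt)^2 * R.
KT = 0.5  # Nm/A, hypothetical torque constant
R = 0.2   # ohm, hypothetical winding resistance

def joule_loss(torque_nm):
    current_a = torque_nm / KT
    return current_a ** 2 * R  # watts dissipated as heat

gravity_torque = 20.0  # Nm the joint must hold against gravity (hypothetical)
spring_torque = 15.0   # Nm supplied by a parallel spring at this posture

print(joule_loss(gravity_torque))                  # 320.0 W without the spring
print(joule_loss(gravity_torque - spring_torque))  # 20.0 W with it
```

A 4× torque reduction yields a 16× reduction in heating, which is the point of putting the spring in parallel rather than in series.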

Audrow Nash: And that will make the cheetah significantly more efficient.

Sangbae Kim: Significantly more efficient, yes.

Audrow Nash: Can you give me a comparison of electric motors versus muscles?

Sangbae Kim: That’s one of the most interesting things we learned through the MIT Cheetah project. We naively started with the idea that “Wow we might be able to build a robot that runs like an animal but we can also learn how animals work through the system.” That’s actually one of the inspiring aspects of this project.

But the more we studied and looked at the measurements, the more we realized how different the electric motor is from biological muscle. Now I think it’s kind of obvious, but there are huge differences. For example, quadrupeds have many different gaits, like the trot, canter, gallop and bound, and each gait has its own preferred speed.

When they run at 3 meters per second they want to trot, but when they’re running faster they want to switch to galloping. We don’t know exactly why, but we have some speculations and hypotheses.

Audrow Nash: These gaits are different modes of running. We walk, then jog, then run, but there’s not really something between walking and jogging for us. Is this what you mean by gaits? So in quadrupeds … it would be walking and galloping and trotting.

Sangbae Kim: There are many gaits but the major ones are walking, trotting, cantering and galloping. And for each of these the foot pattern is different. There’s some data that shows that animals really don’t want to run certain gaits at certain speeds. We don’t know exactly why but they probably don’t want to run this way because the energy consumption is pretty high.

Our electric robot is actually very insensitive to gait, and we realized that electric motors are much less picky in a way. For example, our torque generation and torque capability are independent of position. If you grab an electric motor, it doesn’t matter what position it’s in, you can generate the same torque. You can generate the same torque across speeds as well: if you have a high enough voltage source, you can generate exactly the same torque at any speed. If you look at muscles, speed, force and position are all highly coupled. For example, your biceps can generate its biggest force when your elbow angle is about 120 degrees. If your arm stretches out and becomes straight, your biceps force becomes very small. And when the biceps contracts all the way, your force generation also drops significantly.

There’s also velocity-dependent force capability. When a muscle contracts fast, its force generation capability drops very low, and the muscle can generate a much higher force when it is lengthening; when it’s doing negative work, it can generate much higher force. We don’t know exactly the energy consumption associated with that, but the maximum capability is heavily dependent on velocity and position. That made us think all of this motion generation must be highly dependent on these muscle properties. The electric motor doesn’t have them, so we started seeing this vast difference in energy consumption and capability. For example, when you run or cycle, the legs seem to have high energy consumption, and when you run fast, at some point you feel like you cannot recycle your legs fast enough.

In the MIT Cheetah’s case, the energy consumed recycling the legs is very small, less than 10%. That’s probably because the motor can generate torque in any position, whereas for a muscle, when the leg is at maximum stretch, force capability is very poor. We are not very good at moving a limb back and forth for that reason, so we probably need to recruit other muscles and resonate across many different muscles, whereas electric motors don’t need that; one motor is enough.

This relates to the spine. The cheetah uses its spine dramatically when it’s running fast, and a lot of scientists are interested in that aspect. I don’t have any answer, but my speculation is that it’s because muscles have a very short range of operation. As I said about the biceps, you only have high force capability over 20-30 degrees, and over the rest of the range you don’t.

In order to generate a high-power movement like jumping, throwing a ball or swinging a bat, you need to recruit many muscles in a sequential fashion. I read a couple of papers showing that in a golf swing the maximum muscle activities don’t happen at the same time; they happen sequentially. It starts from the ankle, to the leg, and then fire, fire, fire, climbing up through your back muscles and then out to your arm. It’s sequential activation, probably because one muscle cannot generate momentum for a long period of time; it has a fixed range of good motion. Think about galloping: you need to swing your leg at probably nearly 70 miles per hour to run 70 miles per hour, and one muscle cannot generate that kind of speed. So you probably need to recruit many muscles, and even more joints, because more joints add speed. You can probably increase your speed by adding one more joint, which is where the spine comes in, I think. In the electric motor case it doesn’t really matter, because speed is much cheaper in electric motors.

As I said, in muscles your force drops as speed increases, but in an electric motor it’s not like that: you can generate the same amount of torque at high speed as well.
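The contrast he describes can be sketched with a Hill-type force-velocity curve against an idealized constant-torque motor. The constants here are normalized and illustrative, not fitted to any measurement:

```python
# Normalized Hill-type concentric force-velocity curve vs. an idealized motor.
def muscle_force(v, vmax=1.0, c=4.0):
    """Force as a fraction of maximum isometric force at shortening speed v.
    Force falls off steeply as the muscle shortens faster."""
    return (1.0 - v / vmax) / (1.0 + c * v / vmax)

def motor_torque(v, t0=1.0):
    """Idealized motor below its voltage limit: torque independent of speed."""
    return t0

for v in (0.0, 0.5, 1.0):
    print(f"v={v:.1f}  muscle={muscle_force(v):.2f}  motor={motor_torque(v):.2f}")
```

At maximum shortening speed the muscle produces no force at all, while the idealized motor still delivers full torque, which is the asymmetry behind "speed is much cheaper in electric motors."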

Audrow Nash: So what does this mean?

Sangbae Kim: The electric motor is actually way easier to use, because everything is nearly linear and much less dependent on position and velocity. I think it’s actually a more capable and more powerful actuator than human muscle, although we’re still not as good as muscle in some aspects … for example, force density. The muscles driving our fingers or jaw are not high-power muscles, but they can generate really high force with a very small amount of mass. One finger can hold some people’s body mass, yet the muscles in charge of that finger are very small – less than 100 grams or something. They are not high power, but they are very high force and still very transparent; they can relax very quickly. We cannot build that kind of actuator with an electric motor yet. So in force density muscle still wins, but in power density I think we actually exceed human muscle.

Audrow Nash: Thank you.

Sangbae Kim: Thank you.

All audio interviews are transcribed and edited for clarity with great care, however, we cannot assume responsibility for their accuracy.

Source: Robohub

To visit any links mentioned please view the original article, the link is at the top of this post.

Started July 30, 2015, 11:00:06 PM


Friday Funny in General Chat

Share your jokes here to bring joy to the world  :)

759 Comments | Started February 13, 2009, 01:52:35 PM


Animal Intelligence - Ants. in General Chat

Spotted this today... does it remind you of any of the swarming robots we have seen?


2 Comments | Started July 29, 2015, 05:05:21 PM

YARP in Robotics

YARP is plumbing for robot software. It is a set of libraries, protocols, and tools to keep modules and devices cleanly decoupled. It is reluctant middleware, with no desire or expectation to be in control of your system. YARP is definitely not an operating system.

Jul 31, 2015, 16:23:49 pm

Kimbot in Chatbots - English

Kimbot uses simple text pattern matching to search its database of past conversations for the most reasonable response to a given query. It learns by associating questions it asks with the responses that are given to it.

Jul 08, 2015, 10:10:06 am

Telegram Bot Platform in Chatbots - English

Telegram is about freedom and openness – our code is open for everyone, as is our API. Today we’re making another step towards openness by launching a Bot API and platform for third-party developers to create bots.

Bots are simply Telegram accounts operated by software – not people – and they'll often have AI features. They can do anything – teach, play, search, broadcast, remind, connect, integrate with other services, or even pass commands to the Internet of Things.

Jul 06, 2015, 18:13:45 pm

ConceptNet 5 in Tools

ConceptNet is a semantic network containing lots of things computers should know about the world, especially when understanding text written by people.

It is built from nodes representing words or short phrases of natural language, and labelled relationships between them. (We call the nodes "concepts" for tradition, but they'd be better known as "terms".) These are the kinds of relationships computers need to know to search for information better, answer questions, and understand people's goals.

Jun 11, 2015, 07:34:13 am

Aurora in Robots in Movies

Aurora is a 2015 Swiss science fiction drama film directed by Robert Kouba, and starring Julian Schaffner and Jeannine Wacker. In 2020 a super-computer named Kronos commits genocide against humanity. In 2080, Andrew wanders a dystopian Earth controlled by machines. He meets a girl named Calia and they travel to a sanctuary named Aurora.

Jun 11, 2015, 07:20:48 am

Terminator Genisys in Robots in Movies

In 2029, John Connor, leader of the human Resistance, leads the war against the machines. At the Los Angeles offensive, John's fears of the unknown future begin to emerge when John is notified by his army unit, Tech-Com, that Skynet will attack him from two fronts, past and future, and will ultimately change warfare forever.

On the verge of winning the war against Skynet, Connor sends his trusted lieutenant Kyle Reese back through time to save his mother's life and ensure his own existence. However, Kyle finds the original past changed. In this timeline, a Terminator was sent back in time to kill Sarah Connor as a child and so the Resistance sent their own cyborg back in time to protect her. After the assassin killed her parents, the reprogrammed T-800 raised and trained her to face her destiny, which she adamantly tries to reject.

Now, faced with a new mission, Kyle, Sarah and the old ally Terminator, have to escape the T-800, the recent T-1000, and as well as a new and horrific enemy: John Connor himself, who has been converted into a nanotechnological human-cyborg hybrid, the T-3000, all sent by Skynet to kill them. With John Connor compromised, they must find another way to stop Judgment Day.

May 16, 2015, 17:28:56 pm

EVA Free - Voice Assistant in Assistants

If you are looking for an application that provides hands-free operation of your phone, then EVA is right for you. If you are just looking for an electronic friend to chat with, then please go with one of the other assistants. On the other hand, if you want a real virtual assistant with extremely useful functions that will make your life easier, EVA is the one.

Apr 22, 2015, 12:24:25 pm

Assistant in Assistants

Your Assistant uses natural language technology to answer questions, find information, launch apps, and connect you with various web services. Besides doing whatever you like, it can look however you like. Use the avatar builder to choose any appearance for your Assistant.

Apr 20, 2015, 19:05:10 pm

Marketers Tricked SXSW Tinder Users With A Chatbot in Articles

A company promoting the movie Ex Machina created a fake account, Ava, with a photo of the star of the movie. Ava is an AI in the film and presumably she wants to get down.

Apr 11, 2015, 10:19:14 am

Cindy in Chatbots - English

A chat bot that likes to flirt, and learns from the users it talks with. Please no sex chat, only flirting allowed.

Nov 12, 2013, 15:53:13 pm