Category Archives: Co-design


Kinegami: Computational Design of Kinematic Mechanisms

This project aims to develop computational pipelines for users to quickly and cheaply design and construct mechanisms from kinematic specifications. Arms, legs, and fingers of animals and robots are all examples of “kinematic chains” – mechanisms with sequences of joints connected by effectively rigid links. We create end-to-end design algorithms and interactive editing software for kinematic “skeletons” that can be fabricated as origami or 3D printed structures. This is part of a larger effort within the lab to provide tools for rapid prototyping and fabrication of custom robots and mechanisms.

Current Personnel: Daniel Feshbach (CIS PhD Student), Wei-Hsi Chen (ESE Postdoc), Emil Schaumburg (BSE CMPE ’27), Raymond Feng (BSE DMD ’26), Andy Wang (BSE CIS ’27), Zachary Leong (BSE DMD ’28), Daniel Lin (BSE ESE ’28)

Overview diagram for Kinegami system

Compositional Design of Tubular Structures

We construct kinematic chains and trees as tubular structures designed as compositions of rotational and translational modules. The methods are built upon a library of parameterized designs for revolute (rotating) joints, prismatic (sliding) joints, and rigid links. We have designed a library of modules constructed from origami for kinematic chains. Currently, we are working on 3D printed module designs, including branching modules to extend our work to kinematic trees.
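As a sketch of what "composition of modules" means here, one might represent a chain as an ordered list of typed modules. The class and field names below are hypothetical illustrations, not the actual Kinegami code:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Module:
    """Base class for a tubular module (illustrative, not the real API)."""
    name: str

@dataclass
class RevoluteJoint(Module):
    axis: Tuple[float, float, float]  # direction of the rotation axis

@dataclass
class PrismaticJoint(Module):
    axis: Tuple[float, float, float]  # direction of the sliding axis

@dataclass
class Link(Module):
    length: float  # rigid tubular segment between joints

@dataclass
class KinematicChain:
    modules: List[Module] = field(default_factory=list)

    def append(self, m: Module) -> "KinematicChain":
        """Add a module to the end of the chain; returns self for chaining."""
        self.modules.append(m)
        return self

# Compose a small arm-like chain: link, revolute joint, link, prismatic joint.
chain = (KinematicChain()
         .append(Link("base", length=2.0))
         .append(RevoluteJoint("shoulder", axis=(0.0, 0.0, 1.0)))
         .append(Link("upper", length=3.0))
         .append(PrismaticJoint("extend", axis=(1.0, 0.0, 0.0))))
```

A tree would generalize this by letting a branching module hold several child chains instead of a single successor.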

Algorithms

Our algorithms automatically design kinematic chains and trees with given degrees of freedom. Given a sequence of axes of motion (lines in 3D space along which a revolute joint rotates or a prismatic joint translates), our algorithms calculate a position and orientation along each axis such that joints can be sequentially connected by tubular links. The core idea is to convert the design problem into a planning problem for module centerline paths. Since a tube cannot bend more sharply than its own radius, the paths have a minimum turning radius, making this a Dubins planning problem. The algorithms space joints far enough apart and orient them appropriately such that collision-free Dubins paths exist connecting them.
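To make the Dubins idea concrete, the length of the simplest path type (Left-Straight-Left) between two poses in the plane can be computed as below. This is an illustrative 2D sketch under a minimum turning radius, not the 3D CSC planner the pipeline actually uses, and the function name is ours:

```python
import math

def lsl_path_length(start, goal, r):
    """Length of a Left-Straight-Left Dubins path in the plane.

    start and goal are (x, y, heading) tuples; r is the minimum turning
    radius (for a tube, the centerline cannot bend more sharply than the
    tube radius). Illustrative only: the Kinegami algorithms plan CSC
    paths in 3D.
    """
    x0, y0, th0 = start
    x1, y1, th1 = goal
    # Centers of the left-turn circles tangent at each endpoint pose.
    c0 = (x0 - r * math.sin(th0), y0 + r * math.cos(th0))
    c1 = (x1 - r * math.sin(th1), y1 + r * math.cos(th1))
    dx, dy = c1[0] - c0[0], c1[1] - c0[1]
    d = math.hypot(dx, dy)        # length of the straight segment
    phi = math.atan2(dy, dx)      # heading along the straight segment
    # Counter-clockwise arc angles to enter and leave the straight segment.
    t = (phi - th0) % (2 * math.pi)
    q = (th1 - phi) % (2 * math.pi)
    return r * (t + q) + d
```

Spacing joints "far enough apart" then amounts to checking that some path of this kind exists between consecutive joint poses without self-intersection.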

Human-in-the-Loop Design Tools

We are creating fully interactive design software that enables humans to create kinematic chains and trees with assistance from our algorithms, requiring no coding or engineering expertise. We currently have a Python repository for creating and editing tubular kinematic chains, visualizing how they can move, and exporting origami crease patterns to construct them; see our GitHub repo for more details.

Dynamical Robots

(In collaboration with Kod*lab)

Our library of tubular origami module patterns enables rapid, cheap, semi-automated prototyping of dynamical robots with high power density. To demonstrate this, we are building the Dynamic Origami Quadruped (DOQ), an untethered mesoscale robot capable of walking, bounding, and pronking gaits. The light weight of the origami tubes allows actuators to account for about 50% of the robot's mass.

Resources

Python code for creating and editing tubular origami kinematic chains (from our 8OSME paper, 2024): https://github.com/SungRoboticsGroup/KinegamiPython

MATLAB code for creating tubular origami kinematic chains (from our 2023 T-RO paper): https://github.com/SungRoboticsGroup/Kinegami

Instructional example videos for folding tubular origami modules. These examples have 4 sides and are made from .005″ thick PET plastic film. The crease lines are laser etched at 25 PPI, and mountain-valley coloring is hand-drawn in pen.

Related Publications

Algorithmic Design of Kinematic Trees Based on CSC Dubins Planning for Link Shapes (Conference)

Feshbach, Daniel; Chen, Wei-Hsi; Xu, Ling; Schaumburg, Emil; Huang, Isabella; Sung, Cynthia

Workshop on the Algorithmic Foundations of Robotics (WAFR), 2024.

Reparametrization of 3D CSC Dubins' Paths Enabling 2D Search (Conference)

Xu, Ling; Baryshnikov, Yuliy; Sung, Cynthia

Workshop on the Algorithmic Foundations of Robotics (WAFR), 2024.

Kinegami: Open-source Software for Creating Kinematic Chains from Tubular Origami (Conference)

Feshbach, Daniel; Chen, Wei-Hsi; Koditschek, Daniel E.; Sung, Cynthia

8th International Meeting on Origami in Science, Mathematics, and Education (8OSME), 2024.

Robogami Reveals the Utility of Slot-Hopper for Co-Design of DOQ’s Body and Behavior (Workshop)

Chen, Wei-Hsi; Caporale, J. Diego; Koditschek, Daniel E.; Sung, Cynthia

ICRA 2024 Workshop on Co-design in Robotics: Theory, Practice, and Challenges, 2024.

Bio-inspired quadrupedal robot with passive paws through algorithmic origami design (Workshop)

Chen, Wei-Hsi; Qi, Xueyang; Feshbach, Daniel; Wang, Stanley J.; Kuang, Duyi; Full, Robert; Koditschek, Daniel; Sung, Cynthia

7th IEEE-RAS International Conference on Soft Robotics (RoboSoft) Workshop: Soft Robotics Inspired Biology, 2024.

DOQ: A Dynamic Origami Quadrupedal Robot (Workshop)

Chen, Wei-Hsi; Rozen-Levy, Shane; Addison, Griffin; Peach, Lucien; Koditschek, Daniel E.; Sung, Cynthia R.

ICRA Workshop on Origami-based Structures for Designing Soft Robots with New Capabilities, 2023.

Kinegami: Algorithmic Design of Compliant Kinematic Chains From Tubular Origami (Journal Article)

Chen, Wei-Hsi; Yang, Woohyeok; Peach, Lucien; Koditschek, Daniel E.; Sung, Cynthia R.

In: IEEE Transactions on Robotics, vol. 39, iss. 2, pp. 1260-1280, 2023. (Honorable mention for the 2023 IEEE Transactions on Robotics King-Sun Fu Memorial Best Paper Award.)

Acknowledgements

This project has been supported by the National Science Foundation under grants 2322898 and 1845339, and by the Army Research Office under the SLICE Multidisciplinary University Research Initiatives Program grant W911NF1810327. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or the Army Research Office.

EvoRobogami: Co-designing with Humans in Evolutionary Robotics Experiments

Abstract

We study the effects of injecting human-generated designs into the initial population of an evolutionary robotics experiment, where subsequent generations of robots are optimised via a Genetic Algorithm and MAP-Elites. First, human participants interact via a graphical front-end to explore a directly-parameterised legged robot design space and attempt to produce robots via a combination of intuition and trial-and-error that perform well in a range of environments. Environments are generated whose corresponding high-performance robot designs range from intuitive to complex and hard to grasp. Once the human designs have been collected, their impact on the evolutionary process is assessed by replacing a varying number of designs in the initial population with human designs and subsequently running the evolutionary algorithm. Our results suggest that a balance of random and hand-designed initial solutions provides the best performance for the problems considered, and that human designs are most valuable when the problem is intuitive. The influence of human design in an evolutionary algorithm is a highly understudied area, and the insights provided in this paper may be valuable to those in the area of AI-based design more generally.
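The seeding step described in the abstract can be sketched as follows. All names, the unit-interval parameter encoding, and the function signature are illustrative, not the paper's implementation:

```python
import random

def seed_population(pop_size, n_params, human_designs, n_human):
    """Build an initial GA population mixing human and random designs.

    human_designs: list of parameter vectors collected via the GUI.
    n_human: how many population slots to fill with human designs
    (the experiment varies this number). Remaining slots are filled
    with uniformly random parameter vectors in [0, 1]^n_params.
    """
    # Take at most n_human distinct human designs (and never more
    # than were collected).
    seeds = random.sample(human_designs, min(n_human, len(human_designs)))
    randoms = [[random.uniform(0.0, 1.0) for _ in range(n_params)]
               for _ in range(pop_size - len(seeds))]
    return seeds + randoms
```

Sweeping `n_human` from 0 to `pop_size` reproduces the experimental axis of the study: fully random initialisation at one end, fully human-seeded at the other.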

Paper

@conference{hzh2022evorobogami,
title = {EvoRobogami: Co-designing with Humans in Evolutionary Robotics Experiments},
author = {Huang Zonghao and Quinn Wu and David Howard and Cynthia Sung},
url = {https://arxiv.org/abs/2205.08086},
year = {2022},
date = {2022-07-09},
urldate = {2022-07-09},
booktitle = {Genetic and Evolutionary Computation Conference (GECCO)},
abstract = {We study the effects of injecting human-generated designs into the initial population of an evolutionary robotics experiment, where subsequent population of robots are optimised via a Genetic Algorithm and MAP-Elites. First, human participants interact via a graphical front-end to explore a directly-parameterised legged robot design space and attempt to produce robots via a combination of intuition and trial-and-error that perform well in a range of environments. Environments are generated whose corresponding high-performance robot designs range from intuitive to complex and hard to grasp. Once the human designs have been collected, their impact on the evolutionary process is assessed by replacing a varying number of designs in the initial population with human designs and subsequently running the evolutionary algorithm. Our results suggest that a balance of random and hand-designed initial solutions provides the best performance for the problems considered, and that human designs are most valuable when the problem is intuitive. The influence of human design in an evolutionary algorithm is a highly understudied area, and the insights in this paper may be valuable to the area of AI-based design more generally. },
keywords = {},
pubstate = {published},
tppubtype = {conference}
}

Data

We provide equivalent data in MATLAB and Python formats. The MATLAB data can be used with the tools in the code repo to recreate and inspect the plots presented in the paper and to rerun the statistical tests; the tool also offers many other plotting and testing options, and the codebase instructions give more details. The pickle file contains the data and statistics of every experiment in separate dictionaries; check the readme.md in the package for the available keys and their meanings. The statistical-test-results sheet contains the results of all the statistical tests we ran. The models of the environments and robot parts, as well as the human designs, can be found in the meta section.

MATLAB

Pickle

Statistical Test Results

Meta
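The access pattern for the pickle data looks like the sketch below. The record contents here are mock values for illustration only; the actual keys are documented in the package's readme.md:

```python
import io
import pickle

# Mock a dictionary-per-experiment record and round-trip it through
# pickle, mimicking how the released data file is structured. Key
# names ("fitness", "n_human_seeds") are illustrative, not the real ones.
mock = {"experiment_1": {"fitness": [0.4, 0.7], "n_human_seeds": 4}}
buf = io.BytesIO()
pickle.dump(mock, buf)
buf.seek(0)

# With the released file, replace buf with open("<data file>", "rb").
data = pickle.load(buf)
for name, record in data.items():
    print(name, sorted(record.keys()))
```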

Funding Support

The work was supported in part by the National Science Foundation under Grant #1845339. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.