Joseph Crandall (Schunk Arm)

Biological Robotic Imaging Experimentation Framework (BRIEF)


Overview, Key Words, and Subtopic Name: The topic is Electronic Hardware, Robotics and Wireless Technologies (EW), and the subtopic is Robotics and Human Assistive Technologies (RH): RH2. Robotic Applications. Joseph Crandall is conducting Schunk robotic manipulation and computer vision research in order to develop a robotic system that can image and manipulate a plant. This point-cloud-based imaging research has the potential to generate savings for agricultural products that require human labor and dexterity to harvest. The system will also provide point-cloud-over-time data for botanists working to quantitatively measure plant development. The following resources are being used for the robotic component of the project and can be treated as key words: the Schunk ROS package (http://wiki.ros.org/schunk_canopen_driver), the PEAK-System PCAN-USB Linux driver (http://www.peak-system.com/PCAN-USB.199.0.html?L=1), ROS Kinetic Kame (http://wiki.ros.org/kinetic), ROS Control (http://wiki.ros.org/ros_control), ROS Gazebo integration (http://gazebosim.org/tutorials?tut=ros_overview), and the Yale OpenHand Project (https://www.eng.yale.edu/grablab/openhand/).

 

Intellectual Merit: The robotics component of the research encompasses three main technical challenges: (1) gaining a working understanding of a prebuilt ROS package for the Schunk LWA 4P in order to drive the arm, (2) gaining a working understanding of the Gazebo robotic simulator in order to run a simulation of the Schunk LWA 4P, and (3) using point cloud data from a Microsoft Kinect sensor inside a simulated world in order to give the robot arm sensory information. Once functional, the arm will be able to perceive complex geometric structures through the point cloud and interact with them via point-cloud-to-mesh conversion software. This project will require the developer to become familiar with all of the packages mentioned in the overview section in order to integrate them into the Schunk robotic system. Many of the basic features needed for this project have already been implemented and refined by the Robot Operating System (ROS) open-source community, so to use one's development time wisely it is beneficial to become well versed in what has previously been done and avoid duplicating code.
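The point cloud ingestion step described above can be sketched in a few lines. This is a hypothetical illustration, not the project's actual code: it unpacks XYZ points from a packed byte buffer laid out like a typical Kinect ROS PointCloud2 message, assuming the common layout of little-endian float32 x, y, z fields at offsets 0/4/8 within a 16-byte point step.

```python
import struct

# Assumed layout: x, y, z as little-endian float32, then 4 bytes of padding,
# for a 16-byte stride per point (a common Kinect PointCloud2 configuration).
POINT_STEP = 16

def unpack_xyz(data, point_step=POINT_STEP):
    """Yield (x, y, z) tuples from a packed point cloud buffer."""
    for offset in range(0, len(data) - point_step + 1, point_step):
        yield struct.unpack_from("<fff", data, offset)

# Example: pack two points with 4 trailing pad bytes each, then decode them.
buf = b"".join(struct.pack("<fff4x", *p) for p in [(0.1, 0.2, 0.3), (1.0, 2.0, 3.0)])
points = list(unpack_xyz(buf))
```

In a live ROS node the buffer would come from the `data` field of a subscribed `sensor_msgs/PointCloud2` message, with the field offsets and point step read from the message header rather than assumed.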
This research should be viewed as both a learning experience and a development process. The student will gain a better understanding of ROS and its community in order to drive the robotic system, and will also implement software that interprets the point cloud data to make it usable by the robot.

 

Broader/Commercial Impact: This project will make robotic tools for interacting with biological processes more useful to biological researchers and more commercially viable across many agricultural sectors. Although automation in machining and manufacturing is well developed, the principles that work in those controlled, closed environments do not always transfer to dynamic ones. Many dynamic biological environments, however, could be automated with this technology. One example is fruit harvesting in indoor/vertical farming: the orientation of the plant sites can be controlled, and a point-cloud-to-mesh interpretation of the plant would allow the robotic arm to grasp the biological structure and harvest it.
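As a minimal sketch of how a grasp target might be derived from segmented plant points, the fragment below computes the centroid (an approach point) and the axis-aligned extents (a rough gripper-opening estimate) of a point set. The function name and the choice of centroid/extents are illustrative assumptions, not the project's actual grasp-planning method.

```python
def grasp_target(points):
    """Return (centroid, extents) for a non-empty list of (x, y, z) tuples.

    centroid: mean position of the points, usable as a grasp approach point.
    extents:  per-axis (max - min) span, a crude bound on object size.
    """
    n = len(points)
    centroid = tuple(sum(p[i] for p in points) / n for i in range(3))
    extents = tuple(max(p[i] for p in points) - min(p[i] for p in points)
                    for i in range(3))
    return centroid, extents

# Example: three points spanning 2 units along x and 1 unit along y and z.
center, size = grasp_target([(0, 0, 0), (2, 0, 0), (1, 1, 1)])
```

A real system would first segment the plant from the background and fit a mesh or primitive shape before choosing grasp contacts, but centroid-plus-extents conveys the basic idea of reducing a cloud to an actionable target.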

 

Bio: Last summer I was fortunate to be selected as a member of the GW Nanotechnology Fellowship program, where we were trained on different nanotechnology tools and manufacturing techniques. During the second half of the summer I worked at Oak Ridge National Lab, continuing the additive manufacturing research I had started with Dr. Leblanc in the fellowship program. I presented my nanotechnology research at both PhysCon in San Francisco in the fall and the SEAS R&D showcase in the spring. This summer I will be working on the ITER project in Cadarache, France, and next year I will begin my master's degree in Electrical Engineering at the George Washington University, with a focus on photonic computing.