LCSR Seminar – Thomas Bewley: Development and coordination of practical balloon swarms for persistent in situ real-time measurement of hurricane development @ B17 Hackerman Hall
Feb 15 @ 12:00 pm – 1:00 pm

This talk proposes a low-cost balloon observation system for sustained (week-long), broadly distributed, in-situ observation of hurricane development. The high-quality, high-density (in both space and time) measurements made available by such a system should significantly improve our ability to forecast such extreme and dangerous atmospheric events. Scientific challenges in this overarching problem, which is of acute societal relevance, include:

(a) the design and engineering of small, robust, sensor-laden, buoyancy-controlled balloons that don’t accumulate ice, and are deployable from existing NOAA aircraft,
(b) the ultra-low-power operation of the environmental sensors, GPS, logic, and both satellite and balloon-to-balloon communication electronics on the balloons, leveraging cellphone and IoT technologies,
(c) the implementation of a self-reconfiguring Mobile Ad hoc Network (MANET) amongst the (mobile) balloons to maintain low-power balloon-to-balloon VHF or UHF communications, typically over 10 to 30 km distances, and
(d) the development of hierarchical systems-level control algorithms for autonomously coordinating the motion of the balloons in the swarm to simultaneously achieve, on average, both good coverage and good connectivity while minimizing the control energy used, including the tight integration of:
- a (centralized) model-predictive control (MPC) strategy for large-scale coordination, incorporating the cutting-edge WRF hurricane data assimilation and forecasting code, and
- a novel (decentralized) three-level-control (TLC) strategy for smaller-scale disturbance rejection, designed to correct, only occasionally and as necessary, for the random walk of the balloons due to unresolved smaller-scale flowfield fluctuations.
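The coverage-versus-connectivity tradeoff in (d) can be illustrated with a minimal decentralized sketch. Everything below (the spring-like control law, the gains, and the target spacing) is an illustrative assumption, not the MPC/TLC algorithms of the talk: each balloon simply nudges itself toward a target separation from its neighbors, spreading out for coverage while staying well inside the 10 to 30 km communication range mentioned in (c).

```python
import math

COMM_RANGE = 30.0   # km, assumed maximum balloon-to-balloon link distance (from (c))
TARGET_SEP = 20.0   # km, assumed desired spacing for coverage, safely inside COMM_RANGE

def control_step(positions, gain=0.05):
    """One decentralized correction step: each balloon moves toward
    TARGET_SEP from every other balloon it considers a neighbor --
    pushing apart when too close (coverage), pulling back when too
    far (connectivity)."""
    new = []
    for i, (xi, yi) in enumerate(positions):
        ux = uy = 0.0
        for j, (xj, yj) in enumerate(positions):
            if i == j:
                continue
            dx, dy = xi - xj, yi - yj
            d = math.hypot(dx, dy) or 1e-9
            # positive error -> too close (push apart); negative -> too far (pull in)
            err = TARGET_SEP - d
            ux += gain * err * dx / d
            uy += gain * err * dy / d
        new.append((xi + ux, yi + uy))
    return new
```

With two balloons the separation contracts geometrically toward TARGET_SEP regardless of whether they start too close or too far, which is the qualitative behavior (coverage plus maintained links) the hierarchical controllers aim for at scale.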


Thomas R Bewley (BS/MS, Caltech, 1989; diploma, von Karman Institute for Fluid Dynamics, 1990; PhD, Stanford, 1998) directs the UCSD Flow Control and Coordinated Robotics Labs, which collaborate closely on interdisciplinary projects. The Flow Control Lab investigates questions ranging from theoretical to applied, including the development of advanced analysis tools and numerical methods to better understand, optimize, estimate, forecast, and control fluid systems. The Coordinated Robotics Lab investigates the mobility and coordination of small multi-modal robotic vehicles, leveraging dynamic models and feedback control, with prototypes built using cellphone-grade electronics, custom PCBs, and 3D printing; the team has also worked with a number of commercial partners to design and bring successful consumer- and education-focused robotics products to market.

Pop-Up Workshop on Linux-based Robotics in Education
Feb 15 @ 3:00 pm – 5:00 pm

To Attend: email Rose Chase

Location: 107 Malone Hall

Louis Whitcomb (JHU) and Thomas Bewley (UCSD)

This is an informal workshop and discussion session on the topic of
Linux-based robotics in education, with a focus on robots that are
easy and relatively inexpensive to deploy to students. UCSD's
Professor Thomas Bewley will lead the discussion with a description of
the EduMIP, a ~$150 self-balancing two-wheeled robot that runs Linux,
and how he employs these robots in his classes. JHU's Louis Whitcomb
will discuss his nascent use of the EduMIP with the Robot Operating
System (ROS) for his graduate course in robot systems programming.
There will be demonstrations.

Participants are invited to present a brief informal discussion of how
they use Linux-based robots in their curriculum.

Some links:
Homepage for Tom Bewley’s MAE144 – Embedded Control & Robotics at UCSD

Homepage for Louis Whitcomb’s 530.707 Robot Systems Programming at JHU


Thomas Bewley
Professor, Mechanical and Aerospace Engineering
University of California, San Diego
LCSR Seminar – Samuel Kadoury: Optical shape sensing for device tracking in MR-guided interventions @ B17 Hackerman Hall
Feb 22 @ 12:00 pm – 1:00 pm


Image-guided interventional systems rely on accurate device-tracking technologies, such as optical or electromagnetic (EM) systems, to navigate tools with high-resolution diagnostic imaging modalities such as CT/MRI/PET and to facilitate the targeting of specific tissue within the body. However, for MR-guided procedures such as Magnetic Resonance Navigation (MRN), which exploits the high magnetic field of an MRI scanner to steer magnetic nanoparticles embedded in drug-eluting beads (DEB), traditional tracking methods are not suitable for visualizing catheters inside the patient’s vascular network. This talk will focus on the development of optical shape-sensing devices, which overcome the limitations of these past approaches and can be integrated into sub-millimeter-sized tools. We present two MR-compatible solutions, based on distributed fiber Bragg grating (FBG) sensors and on ultraviolet curing for optical frequency domain reflectometry (OFDR), which measure the strain applied to a fiber triplet inserted in a tool and reconstruct its 3D shape during navigation. Recent phantom and ex-vivo experiments compare the accuracy to EM tracking and demonstrate insensitivity to external magnetic fields, illustrating the potential of these approaches for image guidance.
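For intuition on how a fiber triplet yields a 3D shape, here is a rough sketch (the fiber radius, the planar simplification, and the function names are illustrative assumptions, not the speaker's implementation): three fibers spaced 120° around the tool axis each measure a strain proportional to the local bending curvature projected onto their angular position, so the curvature vector can be solved at each sensing point and then integrated along the fiber into a centerline.

```python
import math

FIBER_RADIUS = 3.5e-5  # m, assumed offset of each fiber from the tool axis
THETAS = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]  # fiber triplet at 120 degrees

def curvature_from_strains(strains):
    """Recover bending curvature (1/m) and bend direction (rad) at one
    sensing point from the three fiber strains. For fibers at equally
    spaced angles, strain_i = kappa * r * cos(phi - theta_i), and any
    common axial strain cancels out of the weighted sums below."""
    kx = (2.0 / (3.0 * FIBER_RADIUS)) * sum(e * math.cos(t) for e, t in zip(strains, THETAS))
    ky = (2.0 / (3.0 * FIBER_RADIUS)) * sum(e * math.sin(t) for e, t in zip(strains, THETAS))
    return math.hypot(kx, ky), math.atan2(ky, kx)

def integrate_planar(kappas, ds):
    """Integrate curvature samples into a planar (x, y) centerline by
    accumulating heading; a full 3D version also tracks the evolving
    bend direction/torsion along the fiber."""
    x = y = heading = 0.0
    pts = [(0.0, 0.0)]
    for k in kappas:
        heading += k * ds
        x += math.cos(heading) * ds
        y += math.sin(heading) * ds
        pts.append((x, y))
    return pts
```

The key property this sketch shows is why a triplet (rather than a single fiber) is needed: one strain reading cannot separate bend magnitude from bend direction, but three readings at known angles determine both uniquely.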


Samuel Kadoury is an associate professor in the Computer and Software Engineering Department at Polytechnique Montreal, a member of the Biomedical Engineering Institute at the University of Montreal, and a researcher at the CHUM Research Center. He currently holds the Canada Research Chair in Medical Imaging and Assisted Interventions at Polytechnique Montreal. He obtained his Master’s in Electrical Engineering from McGill University in 2005. After a one-year period at Siemens Corporate Research in Princeton, NJ, he returned to Montreal to complete his Ph.D. in biomedical engineering, focusing on orthopaedic imaging. He completed a post-doctoral fellowship at Ecole Centrale de Paris and worked as a clinical research scientist for Philips Research North America at the National Institutes of Health in Bethesda, MD from 2010 to 2012, developing image-guided systems for liver and prostate cancer. Prof. Kadoury has published over 100 peer-reviewed papers in leading journals and conferences in fields such as biomedical imaging, computer vision, radiology, and neuroimaging. He holds 5 US patents in the field of image-guided interventions, has participated in the technological transfer of multiple research projects to commercial products, and was awarded the NIH merit award for his work on prostate cancer, as well as the Cum Laude Award from the RSNA for his work in artificial intelligence for liver cancer detection.

LCSR Seminar: TBA @ B17 Hackerman Hall
Mar 1 @ 12:00 pm – 1:00 pm
LCSR Seminar: Saeed Abdullah “Circadian Computing: Sensing and Stabilizing Biological Rhythms” @ B17 Hackerman Hall
Mar 8 @ 12:00 pm – 1:00 pm


Rhythms guide our lives. Almost every biological process reflects a roughly 24-hour periodicity known as a circadian rhythm. Living against these body clocks can have severe consequences for physical and mental well-being, with increased risk for cardiovascular disease, cancer, obesity, and mental illness. However, circadian disruptions are becoming increasingly widespread in our modern world. As such, there is an urgent need for novel technological solutions to address these issues. In this talk, I will introduce the notion of “Circadian Computing” – technologies that support our innate biological rhythms. Specifically, I will describe a number of my recent projects in this area. First, I will present novel sensing and data-driven methods that can be used to assess sleep and related circadian disruptions. Next, I will explain how we can model and predict alertness, a key circadian process for cognitive performance. Third, I will describe a smartphone-based tool for maintaining circadian stability in patients with bipolar disorder. To conclude, I will discuss a vision for how Circadian Computing can radically transform healthcare, including by augmenting performance, enabling preemptive care for mental health patients, and complementing current precision medicine initiatives.
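For intuition on what "modeling and predicting alertness" involves, a toy two-process-style estimate can be sketched (the functional forms, time constants, and acrophase below are illustrative assumptions in the spirit of the classic two-process model of sleep regulation, not the talk's actual models): alertness is driven up by a circadian oscillation that peaks in the early evening and driven down by homeostatic sleep pressure that accumulates with time awake.

```python
import math

def predicted_alertness(hours_awake, time_of_day, acrophase=18.0):
    """Toy alertness estimate in [0, 1]. The circadian term C peaks at
    `acrophase` (hour of day, assumed 18:00); the homeostatic term S
    grows toward saturation with hours awake (assumed 18 h time
    constant). Alertness is a clipped combination C - S."""
    c = math.cos(2 * math.pi * (time_of_day - acrophase) / 24.0)  # circadian drive, in [-1, 1]
    s = 1.0 - math.exp(-hours_awake / 18.0)                       # sleep pressure, in [0, 1)
    return max(0.0, min(1.0, 0.5 * (c + 1.0) - 0.5 * s + 0.25))
```

Even this crude form captures the qualitative prediction the talk builds on: someone recently awake at their circadian peak scores far higher than someone 20 hours into wakefulness in the pre-dawn trough.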



Saeed Abdullah is a Ph.D. candidate in Information Science at Cornell University, advised by Tanzeem Choudhury. Abdullah works on developing novel data-driven technologies to improve health and well-being. His research is inherently interdisciplinary, and he has collaborated with psychologists, psychiatrists, and behavioral scientists. His work has introduced assessment and intervention tools across a number of health-related domains including sleep, cognitive performance, bipolar disorder, and schizophrenia. Saeed’s research has been recognized through several accolades, including winning the $100,000 Heritage Open mHealth Challenge, a best paper award, and an Agile Research Project award from the Robert Wood Johnson Foundation.

LCSR Seminar: Bernhard Fuerst – Recent advancements and the future of computer aided medical procedures @ B17 Hackerman Hall
Mar 15 @ 12:00 pm – 1:00 pm
JHU Robotics Industry Day 2017 @ Johns Hopkins University Homewood Campus
Mar 22 all-day


The Laboratory for Computational Sensing and Robotics will highlight its elite robotics students and showcase cutting-edge research projects in areas that include Medical Robotics, Extreme Environments Robotics, Human-Machine Systems for Manufacturing, BioRobotics and more. JHU Robotics Industry Day will take place from 8 a.m. to 3 p.m. in Hackerman Hall on the Homewood Campus at Johns Hopkins University.

Robotics Industry Day will provide top companies and organizations in the private and public sectors with access to the LCSR’s forward-thinking, solution-driven students. The event will also serve as an informal opportunity to explore university-industry partnerships.

You will experience dynamic presentations and discussions, observe live demonstrations, and participate in speed networking sessions that afford you the opportunity to meet Johns Hopkins’ most talented robotics students before they graduate.

Please contact Rose Chase if you have any questions.

Download our 2017 Industry Day booklet

Schedule of Events

(times are subject to change)

               LEVERING GREAT HALL

8:00    Registration and Continental Breakfast

8:30    Welcome: Larry Nagahara, Associate Dean for Research, JHU

8:35     Introduction to LCSR: Director Russell H. Taylor

8:55    Research and Commercialization Highlights

9:00    Louis Whitcomb, LCSR

9:10    Noah Cowan & Erin Sutton, LCSR

9:20    Marin Kobilarov, LCSR

9:30    Philipp Stolka, Clear Guide Medical

9:40    Mehran Armand, APL and LCSR

9:50    Stephen L. Hoffman, Sanaria, Inc.


10:10  Bernhard Fuerst, LCSR

10:20  Bruce Lichorowic, Galen Robotics

10:30  David Narrow, Sonavex, Inc.

10:40  Kelleher Guerin & Benjamin Gibbs, READY Robotics

10:50  Promit Roy, Max and Haley LLC

11:00  John Krakauer, Malone Center for Engineering in Healthcare Update, JHU

11:10   New Faculty Talks

11:10 – Muyinatu Bell

11:30 – Jeremy D. Brown

               HACKERMAN HALL B17 LOBBY

12:00-1:15     LUNCH


12:00-1:15     Poster + Demo Sessions

               HACKERMAN HALL B17

1:15-3:00       Student and Industry Speed Networking

Download PDF of Campus Map & Schedule of Events


Spring Break – No Seminar @ B17 Hackerman Hall
Mar 22 @ 12:00 pm – 1:00 pm
LCSR Seminar: Ashley Llorens @ B17 Hackerman Hall
Mar 29 @ 12:00 pm – 1:00 pm
Matthew Johnson-Roberson: Beyond the dataset: addressing training, deployability, and scaling in deep learning-based perception for field robots @ B17 Hackerman Hall
Apr 5 @ 12:00 pm – 1:00 pm


Robotic platforms now deliver vast amounts of sensor data from large unstructured environments. In attempting to process and interpret this data, there are many unique challenges in bridging the gap between prerecorded datasets and the field. This talk will present recent work addressing the application of deep learning techniques to robotic perception. Deep learning has driven success in many computer vision tasks through the use of standardized datasets. We focus on solutions to several novel problems that arise when attempting to deploy such techniques on fielded robotic systems. The themes of the talk are twofold: 1) How can we integrate such learning techniques into the traditional probabilistic tools that are well known in robotics? and 2) Are there ways of avoiding the labor-intensive human labeling required for supervised learning? These questions give rise to several lines of research based around dimensionality reduction, adversarial learning, and simulation. We will show this work applied to three domains: self-driving cars, acoustic localization, and optical underwater reconstruction. The talk will present results on field data from the monitoring of Australia’s coral reefs, the archeological mapping of a 5,000-year-old submerged city, and the operation of a level-4 self-driving car in urban environments.


Matthew Johnson-Roberson is Assistant Professor of Engineering in the Department of Naval Architecture & Marine Engineering and the Department of Electrical Engineering and Computer Science at the University of Michigan. He received a PhD from the University of Sydney in 2010. There he worked on Autonomous Underwater Vehicles for long-term environment monitoring. Upon joining the University of Michigan faculty in 2013, he created the DROP (Deep Robot Optical Perception) Lab, which researches a wide variety of perception problems in robotics including SLAM, 3D reconstruction, scene understanding, data mining, and visualization. He has held prior postdoctoral appointments with the Centre for Autonomous Systems – CAS at KTH Royal Institute of Technology in Stockholm and the Australian Centre for Field Robotics at the University of Sydney. He is a recipient of the NSF CAREER award (2015).