
Artificial Intelligence and Robotics Lab

Director: Angelique Taylor, Ph.D.

Andrew H. and Ann R. Tisch Assistant Professor

Department of Information Science, Cornell University

Department of Emergency Medicine, Weill Cornell Medicine

VISION

We build intelligent technologies to support users in real-world settings

OUR MISSION

Our mission is to design intelligent systems (e.g., robots) that work alongside human teams in real-world, safety-critical environments (e.g., healthcare). Our research lies at the intersection of robotics, computer vision, and artificial intelligence.

 

Through community engagement and human-centered design, we conduct groundbreaking research that advances both theory and real-world applications in safety-critical environments. We develop systems that improve teamwork and collaboration among multiple users through mobile robot systems, robot vision, AI, and AR/VR.


TEACHING

ML & Human-Robot Interaction

This course provides hands-on experience developing and deploying foundational machine learning algorithms on real-world datasets for practical applications (e.g., predicting housing prices, document retrieval, and product recommendation). Students learn the machine learning pipeline end to end, including dataset creation, pre- and post-processing, annotation, annotation validation, preparation for machine learning, model training and testing, and evaluation. Students focus on real-world challenges at each stage of the ML pipeline, including handling bias in models and datasets. Lastly, students analyze the strengths and weaknesses of regression, classification, clustering, deep learning, and agentic AI algorithms.

[Spring 2026, Spring 2025, Spring 2024, Spring 2023]

Robots are making their way into our everyday lives, working across many applications including people's homes, healthcare, and retail settings, to name a few. As these systems become more integrated into our lives, it is important that they are designed to be useful, functional, and socially acceptable; this remains a key challenge for the field of human-robot interaction (HRI). This course covers core computational, engineering, and social challenges, as well as approaches for effective HRI in human-centered environments. Topics include research methods; robot design and anthropomorphization; perception of people, groups, and teams; spatial interaction; emotion and intent design in HRI; social signal processing (recognition and synthesis); and augmented and virtual reality (AR/VR) for HRI. Students should expect to learn about seminal research in HRI, gain hands-on experience with physical mobile robots, and implement systems for real-time interaction with users.

[Fall 2025, Fall 2024, Fall 2022]

WHY AIRLAB

A Different Approach, Using Human-Centered Design

A guiding principle of our research agenda is to engage stakeholders in collaborative design processes that motivate technical innovation before and throughout the iterative development of novel platforms and algorithms driving robot and AR decision-making during team collaborations. Our work has been published in top venues in the field, including HRI, THRI, AAAI, IROS, CSCW, RA-L, UIST, RCAR, and Issues in Science and Technology.


OUR PLATFORMS

Robots & Augmented Reality


Clifford Robot (Crash cart v1)

The Clifford robot is a prototype designed to be flexible and modular, serving as a boundary object for communication with different healthcare stakeholders. Its core functionality centers on mobility. Each function was accomplished by combining off-the-shelf modules on a standard workshop shelf. Clifford resembles a crash cart in its red color, but its drawers do not open the way a medical crash cart's do. Nevertheless, these prototypes gave us an opportunity to rapidly add mobility to cart-based objects and explore how robots could navigate in ERs [PDF].


Cart2D2 (Crash cart v2)

Cart2D2 is a medical crash cart mounted on a mobile platform driven by a hoverboard circuit. Cart2D2 is implemented solely as a teleoperated platform and does not yet have autonomous capabilities. It is equipped with an RGB-D camera that records image data of interactions and a speaker for talking to users.


D(rawer)Bot (Crash cart v3)

The vision for this prototype is to enable the robot to approach healthcare workers during medical procedures and open relevant drawers with equipment, preventing them from shuffling through cart drawers in search of supplies. We developed a modular mobile platform based on a hoverboard, integrating linear actuators for the shelves and developing printed circuit boards and Robot Operating System (ROS) control logic to control a robot that resembles the drawers of a traditional crash cart, with less focus on the cart's appearance [PDF].


BigRedMobile Robot
(Crash cart v4)

The BigRedMobile robot offers unique benefits through its multimodal feedback, using speech, drawer lights, and alerts to indicate the location of relevant supplies and provide task reminders. The design requirements for BigRedMobile included building the robot rapidly and with limited complexity, so that others with little technical knowledge could build it from low-cost, publicly available hardware [PDF].

TURTLEBOT4

9 kg

Payload Capacity

The HRI course uses the TurtleBot 4 platform, which is built on the iRobot® Create 3 educational robot: a sturdy mobile base that provides an array of intelligent sensors for accurate localization and positioning at speeds up to 0.3 m/s. The Create 3 has a standalone Robot Operating System 2 (ROS 2) interface and, unlike previous TurtleBots, includes integrated batteries and a charging dock.
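As a small illustration (not course code), driving a ROS 2 base like the Create 3 typically means publishing velocity commands, which should respect the 0.3 m/s limit noted above. The sketch below shows a hypothetical plain-Python helper that clamps a requested forward speed before it would be published; the `clamp_speed` name and the commented publishing step are our own assumptions, not part of the TurtleBot 4 software.

```python
# Hypothetical sketch: keep a commanded forward speed within the Create 3's
# documented 0.3 m/s limit before it would be published as a ROS 2 command.
# clamp_speed is an illustrative helper, not part of the TurtleBot 4 stack.

MAX_SPEED_M_S = 0.3  # Create 3 maximum forward speed (from the spec above)

def clamp_speed(requested: float, limit: float = MAX_SPEED_M_S) -> float:
    """Clamp a commanded forward speed (m/s) to the range [-limit, +limit]."""
    return max(-limit, min(limit, requested))

# In an actual ROS 2 node, the clamped value would go into a Twist message:
#   twist.linear.x = clamp_speed(0.5)
#   cmd_vel_publisher.publish(twist)

if __name__ == "__main__":
    print(clamp_speed(0.5))   # 0.3
    print(clamp_speed(-1.0))  # -0.3
    print(clamp_speed(0.2))   # 0.2
```

In student projects this kind of guard keeps teleoperation and autonomy code from commanding speeds the base will refuse or execute unsafely.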

Introduction to Human-Robot Interaction Final Projects


Rat Roulette

Pong Soccer

Dr. Robot

Cat and Mouse

TIAGO MOBILE BASE

An assistive robot used as an adjustable platform in healthcare settings to socially engage with and support healthcare workers, patients, and staff in the emergency room, long-term care facilities, and sleep clinics.

RELIA THE BEARER, Craft@Large

This robot prototype was built for long-term care facilities to interact with patients and healthcare workers during art therapy activities. It stores art supplies, engages participants with gestures, waving via servo motors and actuated cardboard hands, and lets healthcare workers use a laptop computer to record relevant information about participant activities.


DREAME, Craft@Large

The sleep clinic robot prototype illuminates lights inside its embodiment for nighttime interactions, lets healthcare workers place a laptop on the platform for data entry during clinician-patient interactions, and responds to patient questions before sleep studies.


ROAMIN, Craft@Large

This robot prototype was built for the emergency room to assist healthcare workers during medical procedures. The robot dispenses medical toolkits to emergency medicine workers during procedures, tracks toolkit orders, and navigates from supply rooms to patient rooms to deliver supplies.


PARTNERS

Our Collaborators

SPONSORS

WE THANK OUR SPONSORS

Recent News
