With the introduction of Industry 5.0, there is a growing focus on human-robot collaboration and the empowerment of human workers through the use of robotic technologies. Collaborative robots, or cobots, are well suited to meeting these needs in industry. Cobots are designed with safety and collaboration as priorities, giving them the unique ability to work in close proximity with people. This has the potential to increase task productivity and efficiency while reducing ergonomic strain on human workers, as cobots can collaborate on tasks as teammates and support their human collaborators.
However, effectively deploying and using cobots requires multidisciplinary knowledge spanning fields such as human factors and ergonomics, economics, and human-robot interaction. This knowledge barrier represents a growing challenge in industry, as workers lack the skills necessary to effectively leverage and realize the potential of cobots within their applications, resulting in cobots often being used non-collaboratively as a form of cheap automation. This presents several research opportunities for new cobot systems that support users in creating cobot interactions.
The goal of this dissertation is to explore the use of abstraction and scaffolding supports within cobot systems to assist users in building human-robot collaborations. Specifically, this research (1) presents updates to the design of systems for planning and programming collaborative tasks, and (2) evaluates each system to understand how it can support user creation of cobot interactions. First, I present the CoFrame cobot programming system, a tool built on prior work, and illustrate how it supports user creation and understanding of cobot programs. Then, I present evaluations of the system with domain experts, novices, and a real-world deployment to understand the ways in which CoFrame does and does not successfully support users. Next, I describe the Allocobot system for allocating work and planning collaborative interactions, explaining how it encodes multiple models of domain knowledge within its representation. Finally, I evaluate the Allocobot system in two real-world scenarios to understand how it produces and optimizes viable interaction plans.