Past Activities

SAIBA and the Behavior Markup Language

The generation of natural multimodal output for virtual humans requires a time-critical production process with high flexibility. To scaffold this production process and to encourage sharing and collaboration, a working group of researchers has introduced the SAIBA framework (Situation, Agent, Intention, Behavior, Animation). SAIBA's Behavior Markup Language (BML) is the de facto standard for specifying synchronized behavior (speech, gestures, facial expressions, etc.) for virtual humans. The SAIBA GitHub organization provides tools for parsing BML, testing BML Realizers, and visualizing BML.
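To give a flavor of the language, the sketch below assembles a small BML request in which a beat gesture and a head nod are synchronized to a point inside the speech. The element names and sync-point syntax follow the published BML 1.0 standard (the namespace URI is the one commonly used for BML 1.0); the surrounding Python is only illustrative scaffolding that checks well-formedness and lists the specified behaviors.

```python
# Illustrative BML request: element names and the behavior:syncpoint
# reference syntax follow BML 1.0; the Python code merely parses it.
import xml.etree.ElementTree as ET

BML_NS = "http://www.bml-initiative.org/bml/bml-1.0"

bml_request = f"""
<bml xmlns="{BML_NS}" id="bml1">
  <!-- Speech with a named sync point inside the text -->
  <speech id="speech1">
    <text>Welcome to the <sync id="tm1"/> lab.</text>
  </speech>
  <!-- The gesture stroke is aligned with sync point tm1 in speech1 -->
  <gesture id="gesture1" lexeme="BEAT" stroke="speech1:tm1"/>
  <!-- A head nod that starts when the gesture ends -->
  <head id="head1" lexeme="NOD" start="gesture1:end"/>
</bml>
"""

root = ET.fromstring(bml_request)
for behavior in root:
    # Strip the namespace prefix and print each behavior with its id
    print(behavior.tag.replace(f"{{{BML_NS}}}", ""), behavior.get("id"))
```

A BML Realizer receiving such a block is responsible for scheduling the behaviors so that the named constraints (here, the gesture stroke on "lab" and the nod after the gesture) are satisfied.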

The Articulated Social Agents Platform

The Articulated Social Agents Platform (ASAP) provides a collection of software modules for social robots and virtual humans, jointly developed by the Sociable Agents group in Bielefeld and the Human Media Interaction group in Twente. Beyond the individual modules, ASAP also provides the means (through middleware, architecture concepts, and shared build and deployment strategies) to compose virtual human or robot applications in which these modules are embedded. Our current work focuses on AsapRealizer, a BML Realizer (behavior generator) for incremental, fluent, multimodal interaction with a virtual human or robot, and on IPAACA, a middleware that implements an incremental processing architecture in a distributed fashion.
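To illustrate the incremental processing style that IPAACA supports, the sketch below models incremental units (IUs) that a producer publishes early, revises, and finally commits, while consumers react to each event. All class and method names here are hypothetical illustrations, not IPAACA's actual API, and a real deployment would exchange IUs over the middleware between distributed components rather than in-process.

```python
# Conceptual sketch of the incremental-unit (IU) pattern; names are
# hypothetical stand-ins, not the IPAACA API.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class IncrementalUnit:
    category: str             # e.g. "asr-word", "bml-feedback"
    payload: Dict[str, str]   # incrementally growing content
    committed: bool = False   # True once the producer will not revise it


class Buffer:
    """Minimal in-process stand-in for a shared IU buffer."""

    def __init__(self) -> None:
        self._listeners: List[Callable[[str, IncrementalUnit], None]] = []

    def register(self, listener: Callable[[str, IncrementalUnit], None]) -> None:
        self._listeners.append(listener)

    def _notify(self, event: str, iu: IncrementalUnit) -> None:
        for listener in self._listeners:
            listener(event, iu)

    def add(self, iu: IncrementalUnit) -> None:
        self._notify("ADDED", iu)

    def update(self, iu: IncrementalUnit, **changes: str) -> None:
        iu.payload.update(changes)
        self._notify("UPDATED", iu)

    def commit(self, iu: IncrementalUnit) -> None:
        iu.committed = True
        self._notify("COMMITTED", iu)


# A consumer (e.g. a behavior planner) reacts to IU events as they arrive,
# so output generation can start before the producer's input is complete.
def on_iu_event(event: str, iu: IncrementalUnit) -> None:
    print(event, iu.category, iu.payload)


buffer = Buffer()
buffer.register(on_iu_event)

word = IncrementalUnit(category="asr-word", payload={"text": "hel"})
buffer.add(word)                   # partial hypothesis published early
buffer.update(word, text="hello")  # later revised in place
buffer.commit(word)                # finally marked as stable
```

The point of the pattern is that downstream components, such as a behavior realizer, can begin acting on partial input and gracefully revise their output when an IU is updated or committed.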

The Intelligent Interaction Knowledge Base

The it’s OWL Knowledge Base for Human-Machine Interaction, hosted and maintained by Bielefeld University, provides an overview of state-of-the-art methods, references to available software and hardware, and evaluation resources for realizing innovative HMI applications. It addresses researchers and engineers working in the fields of Human-Machine/Human-Computer Interaction, Usability Engineering, and Human Factors. The HMI knowledge base is intended to guide them through all phases of HMI system development and to provide references for most types of current human-machine interfaces, ranging from touch-based to dialog-based interaction.