Behavior computation: Internet of Behaviors (IoB) and human AI
January 15, 2021
Here I will briefly introduce some technological trends that point to a growing need for a systematic framework to represent and code human behaviors of every kind, simple as well as complex, mental and bodily, and their components. The need to manage massive behavioral data differs between the computer game industry, the gene sciences, the IoT world, and robotics. In AI this need is becoming a reality, and it is the focus here.
My take is to suggest IoB as a candidate approach, not only in these fields but in general, for modeling and coding human behaviors for computational and then service purposes. This will be a vast task if it is launched. My guess is that technological evolution will first lead to context- and device-specific modeling of behaviors, but once data portability becomes an issue, the benefits of a common framework for behavior data will become apparent, whatever the means of generating behaviors (in games, robots) or following them (genes, IoT, AI) might be. This development will take time.
Dance behavior codes are here already
Decades ago, I happened to know the then young Finnish dancer Mikko Nissinen, who later became a famous ballet star and choreographer. He introduced me to a Dutch system for coding choreographies, and we discussed the difficulty of capturing the quality and essence of such expressive movements. Dance notations have a long history, and there are dozens of different systems, each specific to a style and not easily generalized to others. Their main aim is to document dances and help preserve them. Indeed, dance and theatre are perhaps the most ambitious, realistic fields for generating and coding any human behaviors, and AI could use these for its future learning needs.
The dancing Asimo robots from 2007, and now the Boston Dynamics robots, although they look like machines, demonstrate the vivid, human-like potential of robots. The secret behind their marvelous performance is that their behavior imitates human dance – as a cultural and expressive phenomenon – in a credible manner. They act much like characters in a movie, theatre play or game animation.
Dance behaviors – scripts – have been programmed and stored in these robots' control systems, and even without knowing the details of the software and algorithms, there must be an integrated set of inter-linked, natural-like behaviors and their elements that can be output in a coordinated and adaptive manner. In other words, dance behaviors can be coded. What about the millions of other human behaviors and their components? There is no general solution to this challenge yet.
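As a rough illustration of what "coded" dance behavior could look like, here is a minimal sketch in Python: reusable motion primitives composed into a timed script. The primitive names, joint labels and timings are all hypothetical, not taken from any real robot platform.

```python
from dataclasses import dataclass, field

@dataclass
class MotionPrimitive:
    """One reusable behavior element, e.g. a step or an arm sweep."""
    name: str
    duration_s: float          # nominal duration in seconds
    joints: dict               # target joint angles in radians (illustrative)

@dataclass
class BehaviorScript:
    """An ordered, timed composition of primitives -- a coded dance fragment."""
    name: str
    sequence: list = field(default_factory=list)

    def total_duration(self) -> float:
        return sum(p.duration_s for p in self.sequence)

# A two-element fragment of a coded dance (invented values)
plie = MotionPrimitive("plie", 1.5, {"knee_l": 0.6, "knee_r": 0.6})
rise = MotionPrimitive("rise", 0.5, {"knee_l": 0.0, "knee_r": 0.0})
dance = BehaviorScript("warm_up", [plie, rise])
print(dance.total_duration())  # 2.0
```

The point of the sketch is only that once behaviors are represented as named, inter-linked elements, they can be stored, composed and varied computationally.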
The behaving AI of the future is expected to master and imitate human behaviors – emotional, intentional and motor alike – and to beat us in any rational task. The functioning of AI is totally different from the processes going on in the human mind, and the representation of behavioral elements in robots is not the same as in the human mind-body system. The artificial neural networks in deep learning and related ML systems are, indeed, artificial, perhaps biologically inspired.
AI must learn and be taught to behave in a manner that we humans can observe, interpret, accept, relate to and understand. With an ever more versatile AI, millions of human behavior patterns, features, expressions, and episodes become difficult to manage systematically. Even more difficult is building a system that covers, for example, a large set of different behaviors sharing the same style. Think about a robot that behaves nicely here in Finland. If we then take it to another country, with a totally different culture from ours, how should the behavior program be altered or adapted so that the robot behaves well there? Computational, adaptive and mass-storage solutions, among others, can be imagined.
Dance notations have been genre-specific. A good example is the same type of dance performed by different individuals: from the outset the dance behavior follows a certain discipline, but variations occur – that is the essence of art. The problem does not get easier when we consider the (expressed) motivations and cultural drivers of each behavior or behavior set. A systematics for the representation and coding of behaviors is needed, and it should allow computational manipulation.
Human behaviors and IoT
Many IoT devices and systems will be in close proximity to, or in direct contact with, behaving people at work, at home and in free time. For example, following how people adhere to certain protocols in critical environments can be based on IoT devices, which allow seamless recording and monitoring of the behaviors of individuals recognized by identity. Other application areas for IoT are energy-, health-, and child-related behaviors and, of course, wearable sensors and many others. Common to all of these is the need to know what behaviors are associated with the IoT-addressed systems in use and how to classify those behaviors. In the case of complex and variable behaviors, this is not an easy task to manage.
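To make the protocol-following idea concrete, here is a minimal sketch: an IoT-recorded stream of behavior events is checked for the required protocol steps in order, allowing other events in between. The protocol and event names are entirely made up for illustration.

```python
# Hypothetical protocol: the required steps, in order.
HAND_WASH_PROTOCOL = ["wet", "soap", "scrub", "rinse", "dry"]

def follows_protocol(observed, protocol):
    """True if the protocol steps occur in order within the observed
    event stream; unrelated events may be interleaved."""
    it = iter(observed)
    # 'step in it' consumes the iterator up to the match, so order is enforced
    return all(step in it for step in protocol)

print(follows_protocol(
    ["enter", "wet", "soap", "scrub", "rinse", "dry"], HAND_WASH_PROTOCOL))  # True
print(follows_protocol(
    ["wet", "scrub", "soap", "rinse", "dry"], HAND_WASH_PROTOCOL))           # False
```

Even this toy check presupposes what the article argues for: a shared vocabulary for naming the behaviors the devices observe.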
Synthetic emotional face expressions
In animation, emotional face expressions have been an essential part of character behavior. To put it simply, the question is how to link an assumed mental state, an emotion, with a set of relevant face expressions – that is, visible behaviors. To accomplish this, various methods are used, from GANs to emotion-theoretical classification and other component-based systematics. I was lucky to serve as the opponent for the doctoral thesis by Meeri Mäkäräinen, Aalto University, “Blending and Exaggeration of Animated Facial Expressions of Emotion”, and was inspired to look at this work from a general perspective, too: how to represent and manage data on any complex mental states and their related behaviors so that it serves the needs of the situation, be it a drama, a real-life communication episode, or communicating with or guiding a behaving robot. The field is developing fast and serves as a good model for dealing with any mental human behaviors. There is no general model for representing behaviors, although specific toolsets and approaches flourish.
Non-player character behavior in games
Computer games generate various behaviors for non-player characters, and in the present game scene, massive behavior data must be managed. The field develops fast. Typically, dynamic scripts are used to make the characters show realistic – or any other wanted – behavior, which can have very delicate variations. Adaptive, learning and human-like principles can be used for generating relevant behaviors, which can be of any human and cultural genre and, of course, imaginary.
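One established form of this is dynamic scripting (Spronck et al.), in which each behavior rule carries a weight, a character's script is drawn by weighted sampling, and the weights adapt to how an episode went. A toy sketch, with hypothetical rule names:

```python
import random

# Rulebase: each candidate NPC behavior carries an adaptable weight.
rules = {"attack": 1.0, "defend": 1.0, "flee": 1.0, "taunt": 1.0}

def draw_script(n=2, rng=random):
    """Sample n distinct rules; higher-weight rules are more likely."""
    pool = dict(rules)
    script = []
    for _ in range(n):
        names, weights = zip(*pool.items())
        pick = rng.choices(names, weights=weights)[0]
        script.append(pick)
        pool.pop(pick)          # no duplicates within one script
    return script

def reinforce(script, reward, lr=0.2):
    """Shift weight toward rules used in a successful episode
    (negative reward weakens them; a floor keeps rules alive)."""
    for name in script:
        rules[name] = max(0.1, rules[name] + lr * reward)

script = draw_script()
reinforce(script, reward=+1)    # episode went well: strengthen used rules
```

The delicate variations mentioned above come for free: sampling keeps behavior non-deterministic, while the weights bias it toward what has worked.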
Social robot behavior
Social robotics aims at building robots for which natural-like behavior becomes possible. This is accomplished, for example, by combining cognitive architecture, adaptive behavior and emotional expressions, which can be used in natural-like human-robot interaction, UIs and collaboration. Applications will occur in numerous contexts, from hospitals to schools and industrial settings, even entertainment.
Representation and storage of behavior models is thus an essential part of a dynamically behaving robot. At the time of writing, there seems to be no generally accepted model for representing all human behaviors, but the need for a system that covers them is evident. Social robotics evolves, and some of the behavioral-emotional models are based on human cognitive theories and biological processes. See e.g. Nocentini et al. (2019), “A Survey of Behavioral Models for Social Robots”.
Genes and behavior
Gene sciences have managed to characterize the occurrence of important gene expressions in human tissues and in different individuals – and various approaches are emerging to understand how human behaviors are guided by genetic processes. Currently, there are numerous studies in which personality, intelligence and various pathologies are correlated with genetic factors. Typically, however, no direct, expressive and systematic behavior model is used in such studies.
In summary, these fields share the need for a general framework for representing and coding all behaviors, one that would allow applying the model to the processing of behavior data, mental and physical alike – and to the transfer of behavior data. Genetics, learning AI, IoT, computer games and robotics are moving in a direction where this need becomes pressing.
Digital representation of behaviors in AI
AI systems receive learning/teaching/guiding data. When AI performs a task imitating a simple, natural human behavior – for example, recognizing human faces or other objects – each input pattern vector represents an object, or components of it, which we humans can recognize too, even when they have only a mathematical formulation and do not necessarily appear as natural elements of objects. In the case of face images, the relevant vector spaces can be defined so that they cover and differentiate any natural face images. Teaching and supervising an AI system can then use these representations. The approach works well in any sensory domain where the task is simple: to recognize objects, or at least to classify them.
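The vector-space idea can be sketched in a few lines: once faces are mapped to embedding vectors, recognition reduces to finding the nearest known identity. The "embeddings" below are made-up numbers standing in for the output of a face-embedding network; the names and threshold are illustrative.

```python
import numpy as np

# Toy gallery of known identities and their (invented) embedding vectors.
gallery = {
    "alice": np.array([0.9, 0.1, 0.0]),
    "bob":   np.array([0.1, 0.9, 0.2]),
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def recognize(query, threshold=0.8):
    """Return the closest known identity, or None if nothing is similar
    enough -- i.e. the query falls outside the covered region."""
    name, score = max(((n, cosine(query, v)) for n, v in gallery.items()),
                      key=lambda t: t[1])
    return name if score >= threshold else None

print(recognize(np.array([0.85, 0.15, 0.05])))  # alice
```

The sketch also shows the limit the article points to: the vector space differentiates what it was defined to cover, and everything outside it is simply "unknown".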
What about more complex or abstract human behaviors, like solving a mathematical problem, playing basketball, dancing or buying a new home? Playing an instrument, praying, writing a book, dreaming of a new job, composing a piece, thinking of life together with a loved one … the list of human behaviors is endless and will continue growing. Clearly, it is a formidable, almost impossible task to list and code all human behaviors and their components. We know how actors manage to move us with their behaviors (which do not originate from real life). Some robots aim at evoking strong emotional responses from us, especially in children's play, therapy, elderly care and sex. More is to come.
We will see exponential growth in digital services and applications that record, and want to know and serve, relevant human behaviors and situations. Typical examples are physiological trackers for various purposes, and map-based, educational, health-care and music services, among many others. They do not have a common data model for representing behavior, and their ways of classifying behaviors vary according to the use context. The coded behaviors hide in silos. Only a few of them offer means for representing human mental states/behaviors. There is a babel of behavior data representations. The Data Transfer Project, run by Apple, Facebook, Google and other giants, aims at making data portable from one environment to another. No doubt, one of its challenges is the general management of high-quality behavior data.
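What a common, portable behavior record might look like can be sketched as a small serializable schema. The field names below are purely illustrative assumptions, not any existing standard:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class BehaviorRecord:
    """A hypothetical common format for one observed behavior."""
    actor_id: str    # pseudonymous identity
    behavior: str    # entry from a shared behavior taxonomy
    context: str     # situation: "home", "work", "transit", ...
    timestamp: str   # ISO 8601
    source: str      # originating service or device
    attributes: dict # service-specific details, carried along untouched

rec = BehaviorRecord("u-42", "walking", "transit",
                     "2021-01-15T09:30:00Z", "fitness-tracker",
                     {"steps": 1200})
portable = json.dumps(asdict(rec))            # leaves the silo as plain JSON
restored = BehaviorRecord(**json.loads(portable))  # any service can rebuild it
```

The hard part, of course, is not the serialization but agreeing on the shared taxonomy behind the `behavior` field – which is precisely the gap the babel of current representations leaves open.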
Behavior Computation for future AI systems
An efficient framework for behavior representations includes a system for coding different behaviors and behavior classes, which can then be used as well-defined input to AI as teaching material. It can also serve as a means for the ‘humanification’ of AI-system user interfaces (UIs) and for robots to generate certain human-like, IoB-coded behaviors and, especially, their situational variations. When the IoB system has matured enough, it can be used for any AI application having human-like behaviors.
Until this happens, it is a real possibility that AI will learn to browse our behaviors, using any sources from movies to literature, to health and other history data, and use this knowledge for imitating human behavior and guiding people. AI can generate hypotheses about us in order to predict our behavior and our reactions to targeted messaging and interventions, for example. Already now, AI can generate reasonably rational texts and stories, analyze and summarize genuine scientific articles, and even suggest new hypotheses and solutions to complex problems. Deep-fake GAN imaging and audio imitate human expressions and styles. To the best of my knowledge, systematics of behavior modeling are only emerging; no general system exists so far.
When AI is made to learn high-level human behaviors, reactions and experiences – typical of visual arts and music, creative sports, written culture and simple imagination – it needs systematic guidance information, behavior data, that directs its development. How far can AI reach in the human and cultural realms, and what can we do to build the best possible human AI?
I have here emphasized the need for a behavior-model systematics to support the digital use of behavior data. This will be important not only for generating various behaviors but for building a genuine behavior computation framework that allows the coding, analysis, computation, transformation and learning of any human behavior. Such a system does not exist yet. More about it later.