Human Computer Interaction



Prerequisites: Introductory Course
Human and Interactive Systems: Human memory, reasoning and problem solving, emotion and psychology, effects of affect, measuring user affect, human information processing and perceptual-motor behavior, attention in information processing, human based design of interactive systems, models of interaction, ergonomics, HCI in the software process. 
Cognitive and Interaction Models for HCI: Cognitive neuroscience, mental models, Cognitive architectures, The Model Human Processor (MHP), GOMS, Cognitive Complexity Theory, Task loading and stress in Human Computer Interaction, Relationship between stress and cognitive workload, mitigation of stress, Human error Identification in HCI, Interactions models, Status- event analysis, sensor-based interaction. 
Technology, Design and Evaluation Techniques for HCI: Input Technologies and Techniques, Modalities of Interaction, Sensor and Recognition-based input for interaction: sensors and signal processing, Haptic Interface, Non-speech sound in HCI, Wearable computers, Interactive design and prototyping, User Interface Management Systems, Universal design principles, user support and help systems, evaluation through expert analysis and user participation, choosing an evaluation method. 
Formal Methods in HCI & Design Issues in Critical Systems: Failure Modes and Effect Analysis (FMEA), Human Factors Process FMEA, Cognition-Adaptive Multimodal Interface (CAMI), consequences of human errors, catastrophic effects, state transition diagram, PIE model.
 


Human Memory

Human memory is often illustrated by the children's memory game, in which each player in turn repeats and extends a growing list of objects. There are many variations, but the objects are all loosely related: 'I went to the market and bought a lemon, some oranges, some bacon ...' or 'I went to the zoo and saw monkeys, and lions, and tigers ...' and so on. As the list grows, objects are missed out or recalled in the wrong order, and so players are eliminated from the game.

Reasoning

Reasoning is the process by which we use the knowledge we have to draw conclusions or infer something new about the domain of interest. There are a number of different types of reasoning: deductive, inductive and abductive. We use each of these types of reasoning in everyday life, but they differ in significant ways.

Deductive reasoning

Deductive reasoning derives the logically necessary conclusion from the given premises. For example:

If it is Friday then she will go to work.
It is Friday.
Therefore she will go to work.

It is important to note that this is the logical conclusion from the premises; it does not necessarily have to correspond to our notion of truth. So, for example:

If it is raining then the ground is dry.
It is raining.
Therefore the ground is dry.

is a perfectly valid deduction, even though the conclusion conflicts with our knowledge of the world.

Inductive reasoning

Induction is generalizing from cases we have seen to infer information about cases we have not seen. For example, if every elephant we have ever seen has a trunk, we infer that all elephants have trunks.

Abductive reasoning

The third type of reasoning is abduction. Abduction reasons from a fact to the action or state that caused it. This is the method we use to derive explanations for the events we observe. For example, suppose we know that Sam always drives too fast when she has been drinking. If we see Sam driving too fast, we may infer that she has been drinking. Of course, this is unreliable: Sam may simply be late.
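The three kinds of inference can be sketched as simple rule manipulation. In the illustrative Python sketch below (the rule, the facts and all the names are made up for this example), deduction applies an "if A then B" rule forwards, abduction runs it backwards from an observed consequence, and induction proposes a generalization from observed cases:

```python
# Toy sketch of deduction, abduction and induction over one
# "if antecedent then consequent" rule. All facts are hypothetical.

rule = ("sam_has_been_drinking", "sam_drives_too_fast")  # if A then B

def deduce(rule, facts):
    """Deduction: from 'if A then B' and A, conclude B (logically sound)."""
    antecedent, consequent = rule
    return consequent if antecedent in facts else None

def abduce(rule, observation):
    """Abduction: from 'if A then B' and B, guess A (an explanation, not a proof)."""
    antecedent, consequent = rule
    return antecedent if observation == consequent else None

def induce(cases):
    """Induction: generalize a property shared by every observed case."""
    return set.intersection(*(set(props) for props in cases.values()))

print(deduce(rule, {"sam_has_been_drinking"}))  # sam_drives_too_fast
print(abduce(rule, "sam_drives_too_fast"))      # sam_has_been_drinking (unreliable!)
print(induce({"elephant1": ["trunk", "big"], "elephant2": ["trunk"]}))  # {'trunk'}
```

Note how only `deduce` is guaranteed correct; `abduce` and `induce` return plausible but defeasible conclusions, which mirrors why people draw wrong inferences in everyday reasoning.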

Problem solving

If reasoning is a means of inferring new information from what is already known, problem solving is the process of finding a solution to an unfamiliar task, using the knowledge we have. Human problem solving is characterized by the ability to adapt the information we have to deal with new situations.

Gestalt theory

Gestalt psychologists were answering the claim, made by behaviorists, that problem solving is a matter of reproducing known responses or trial and error. This explanation was considered by the Gestalt school to be insufficient to account for human problem-solving behavior. Instead, they claimed, problem solving is both productive and reproductive.

Emotions and Psychology

So far in this chapter we have concentrated on human perceptual and cognitive abilities. But human experience is far more complex than this. Our emotional response to situations affects how we perform. For example, positive emotions enable us to think more creatively, to solve complex problems, whereas negative emotion pushes us into narrow, focussed thinking. A problem that may be easy to solve when we are relaxed, will become difficult if we are frustrated or afraid.
Psychologists have studied emotional response for decades and there are many theories as to what is happening when we feel an emotion and why such a response occurs. More than a century ago, William James proposed what has become known as the James–Lange theory (Lange was a contemporary of James whose theories were similar): that emotion was the interpretation of a physiological response, rather than the other way around.

Effects of affect

Attention

One of the most important effects of emotion lies in its ability to capture attention. Emotions have a way of being completely absorbing. Functionally, they direct and focus our attention on those objects and situations that have been appraised as important to our needs and goals so that we can deal with them appropriately. Emotion-relevant thoughts then tend to dominate conscious processing: the more important the situation, the higher the arousal, and the more forceful the focus. In an HCI context, this attention-getting function can be used advantageously, as when a sudden beep is used to alert the user, or can be distracting, as when a struggling user is frustrated and can only think about his or her inability.

Emotion can further influence attention through a secondary process of emotion regulation. Once an emotion is triggered, higher cognitive processes may determine that the emotion is undesirable. In such cases, attention is often directed away from the emotion-eliciting stimulus for the purpose of distraction. For example, becoming angry with an onscreen agent may be seen as ineffectual (i.e., because it doesn't recognize your anger) or simply unreasonable. An angered user may then actively try to ignore the agent, focusing instead on other onscreen or off-screen stimuli, or even take the next step and completely remove the agent from the interaction (which could mean leaving an application or website entirely).

Performance

Mood has also been found to affect cognitive style and performance. The most striking finding is that even mildly positive affective states profoundly affect the flexibility and efficiency of thinking and problem solving.

Assessment 

Mood has also been shown to influence judgment and decision making. As mentioned earlier, mood tends to bias thoughts in a mood-consistent direction, while also lowering the thresholds of mood-consistent emotions.
Measuring User Affect, Human Information Processing and Perceptual-Motor Behaviour

The measurement of affect in HCI research is a challenging and complex issue. Although a number of techniques for measuring affect have been developed, a systematic discussion of their effectiveness and applicability in different contexts remains lacking, especially in social contexts with multiple users.
Human-system interaction is fundamentally an information-processing task. The human information-processing approach is based on the idea that human performance, from displayed information to a response, is a function of several distinct processes. The nature of these processes, how they are arranged, and the factors that influence how quickly and accurately a particular process operates, can be discovered through appropriate research methods. Because information-processing analyses are used in HCI in several ways, it is beneficial to be familiar with basics of the approach and specific applications to HCI.

Perceptual-Motor Behaviour

Perceptual-motor behaviour research has implications for the design of robotics, technology, and novel rehabilitation programs. Findings from experiments in this area increase our understanding of the underlying neural processes for motor control and learning, and test new and innovative approaches to the assessment and treatment of new or challenging motor skills. The advancements made will contribute to developing interventions that are cost effective and track changes in motor performance accurately. 

Currently there are three main areas of research: 

  1. Multisensory-motor integration in the typically developing population 
  2. Sensorimotor integration in individuals with an Autism Spectrum Disorder 
  3. Assessment and treatment of individuals with neurological disorders.

Ergonomics

Ergonomics (from the Greek words ergon, meaning work, and nomoi, meaning natural laws) is the science of refining the design of products to optimize them for human use. Human characteristics, such as height, weight, and proportions are considered, as well as information about human hearing, sight, temperature preferences, and so on. Ergonomics is sometimes known as human factors engineering. Computers and related products, such as computer desks and chairs, are frequently the focus of ergonomic design. A great number of people use these products for extended periods of time, such as the typical work day. If these products are poorly designed or improperly adjusted for human use, the person using them may suffer unnecessary fatigue, stress, and even injury.

The term "ergonomics" can simply be defined as the study of work. It is the science of fitting jobs to the people who work in them. Adapting the job to fit the worker can help reduce ergonomic stress and eliminate many potential ergonomic disorders (e.g., carpal tunnel syndrome, trigger finger, tendonitis). Ergonomics focuses on the work environment and items such as the design and function of workstations, controls, displays, safety devices, tools and lighting to fit the employee's physical requirements, capabilities and limitations to ensure his or her health and well-being.

HCI in the Software Process

  1. Software engineering provides a means of understanding the structure of the design process, and that process can be assessed for its effectiveness in interactive system design.
  2. Usability engineering promotes the use of explicit criteria to judge the success of a product in terms of its usability.
  3. Iterative design practices work to incorporate crucial customer feedback early in the design process to inform critical decisions which affect usability. 
  4. Design involves making many decisions among numerous alternatives. Design rationale provides an explicit means of recording those design decisions and the context in which the decisions were made. 
The software engineering life cycle aims to structure design in order to increase the reliability of the design process. For interactive system design, this would equate to a reliable and reproducible means of designing predictably usable systems. Because of the special needs of interactive systems, it is essential to augment the standard life cycle in order to address issues of HCI. Usability engineering encourages incorporating explicit usability goals within the design process, providing a means by which the product’s usability can be judged. Iterative design practices admit that principled design of interactive systems alone cannot maximize product usability, so the designer must be able to evaluate early prototypes and rapidly correct features of the prototype which detract from the product usability. The design process is composed of a series of decisions, which pare down the vast set of potential systems to the one that is actually delivered to the customer. Design rationale, in its many forms, is aimed at allowing the designer to manage the information about the decision-making process, in terms of when and why design decisions were made and what consequences those decisions had for the user in accomplishing his work.

Human-Based Design of Interactive Systems

So far we have looked briefly at the way in which humans receive, process and store information, solve problems and acquire skill. But how can we apply what we have learned to designing interactive systems? Sometimes, straightforward conclusions can be drawn. For example, we can deduce that recognition is easier than recall and allow users to select commands from a set (such as a menu) rather than input them directly. However, in the majority of cases, application is not so obvious or simple. In fact, it may be dangerous, leading us to make generalizations which are not valid. In order to apply a psychological principle or result properly in design, we need to understand its context, both in terms of where it fits in the wider field of psychology and in terms of the details of the actual experiments, the measures used and the subjects involved, for example. This may appear daunting, particularly to the novice designer who wants to acknowledge the relevance of cognitive psychology but does not have the background to derive appropriate conclusions. Fortunately, principles and results from research in psychology have been distilled into guidelines for design, models to support design and techniques for evaluating design.

Models of Interaction

Interaction involves at least two participants: the user and the system. Both are complex, as we have seen, and are very different from each other in the way that they communicate and view the domain and the task. The interface must therefore effectively translate between them to allow the interaction to be successful. This translation can fail at a number of points and for a number of reasons. The use of models of interaction can help us to understand exactly what is going on in the interaction and identify the likely root of difficulties. They also provide us with a framework to compare different interaction styles and to consider interaction problems.

We begin by considering the most influential model of interaction, Norman’s execution–evaluation cycle; then we look at another model which extends the ideas of Norman’s cycle. Both of these models describe the interaction in terms of the goals and actions of the user. We will therefore briefly discuss the terminology used and the assumptions inherent in the models, before describing the models themselves.

COGNITIVE MODEL AND HUMAN INTERACTION FOR HCI

Cognitive modeling

Cognitive modelling is an area of computer science that deals with simulating human problem solving and mental processing in a computerised model. Such a model can be used to simulate or predict human behaviour or performance on tasks similar to the ones modelled, and to improve human-computer interaction.

Uses of cognitive model

Cognitive modelling is used in numerous artificial intelligence (AI) applications, such as expert systems, natural language processing and neural networks, and in robotics and virtual reality applications. Cognitive models are also used to improve products in areas such as human factors engineering, computer games and user interface design.

Cognitive neuroscience

Cognitive neuroscience is the field of study focusing on the neural substrates of mental processes. It is at the intersection of psychology and neuroscience, but also overlaps with physiological psychology, cognitive psychology and neuropsychology. It combines the theories of cognitive psychology and computational modelling with experimental data about the brain. 
      
“It is an interdisciplinary area of research that combines measurement of brain activity with a simultaneous performance of cognitive tasks by human subject.”

Mental model

The concept of mental models comes from the Scottish psychologist Kenneth Craik's book "The Nature of Explanation." He said that the mind constructs "small-scale models of reality" to anticipate and explain events.
Mental models are one of the most important concepts in human-computer interaction (HCI). Hopefully, users' thinking is closely related to reality, because they base their predictions about the system on their mental models and plan their future actions based on how that model predicts the appropriate course.
Mental models are an artefact of belief. They are the beliefs that a user holds about any given system or interaction. In most instances, the belief will – to a certain extent – resemble the real life model. This is important because users will plan and predict future actions within a system based on their mental models.

Architecture of mental model

Given that we all have mental models of interaction, it is a good rule of thumb to assume that, wherever possible, users will form their mental models based on interactions with existing applications and web sites. In short, they expect functionality to be consistent with these previous experiences, and wherever a standard UI pattern exists, it should be emulated in your designs.

Cognitive Architecture

A cognitive architecture can refer to a theory about the structure of the human mind. One of the main goals of a cognitive architecture is to summarize the various results of cognitive psychology in a comprehensive computer model.
Hilary Putnam's multiple realizability formulation of functionalism, proposed in the late 1960s, suggests the kinds of mechanism and structure that may underlie cognition:
  1. Processors that manipulate data
  2. Memories that hold knowledge
  3. Interfaces that interact with an environment
                          
Cognitive architectures in HCI: present work and future directions

The symposium is made up of this introduction and six other papers on cognitive architectures in HCI. As many readers may not be familiar with cognitive architectures, a description of what cognitive architectures are is presented first. In an effort to be accessible to a wide audience, this description is fairly abstract.

The Model Human Processor (MHP)

The human processor model, or MHP (Model Human Processor), is a cognitive modelling method developed by Stuart K. Card, Thomas P. Moran and Allen Newell (1983), used to calculate how long it takes to perform a certain task. Other cognitive modelling methods include parallel design, GOMS, and the keystroke-level model (KLM).
The standard definition of the MHP is that it draws an analogy between the processing and storage areas of a computer and the perceptual, motor, cognitive and memory areas of the computer user.

The model

The human processor model uses the cognitive, perceptual, and motor processors along with the visual image store, working memory, and long-term memory. Each processor has a cycle time and each memory has a decay time. By following the connections between these components, along with the associated cycle or decay times, the time it takes a user to perform a certain task can be calculated. 
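As a rough sketch of how such a calculation works, the time for a simple reaction task can be estimated by chaining the nominal ("middleman") processor cycle times given by Card, Moran and Newell: roughly 100 ms for a perceptual cycle, 70 ms for a cognitive cycle and 70 ms for a motor cycle (each value also has a faster and slower bound, omitted here for simplicity):

```python
# MHP back-of-envelope estimate using the nominal cycle times from
# Card, Moran & Newell (1983): perceptual ~100 ms, cognitive ~70 ms,
# motor ~70 ms. Real analyses also use the fast/slow ranges.

TAU_P = 100  # perceptual processor cycle, ms
TAU_C = 70   # cognitive processor cycle, ms
TAU_M = 70   # motor processor cycle, ms

def task_time(perceptual=1, cognitive=1, motor=1):
    """Estimate task time (ms) as a chain of processor cycles."""
    return perceptual * TAU_P + cognitive * TAU_C + motor * TAU_M

# Simple reaction: perceive a stimulus, decide, press a key.
print(task_time())             # 240 ms
# A task needing an extra decision adds another cognitive cycle:
print(task_time(cognitive=2))  # 310 ms
```

The point of the model is exactly this kind of additive prediction: more processing stages between stimulus and response mean proportionally longer response times.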

GOMS (Goals, Operators, Methods, Selection Rules)

GOMS is a specialized human information processor model for human-computer interaction that describes a user's cognitive structure in terms of four components, defined in the book The Psychology of Human-Computer Interaction as "a set of Goals, a set of Operators, a set of Methods for achieving the goals, and a set of Selection rules for choosing among competing methods for goals." GOMS is widely used by usability specialists and computer system designers because it produces quantitative and qualitative predictions of how people will use a proposed system.

  1. Goals are symbolic structures that define a state of affairs to be achieved and determinate a set of possible methods by which it may be accomplished
  2. Operators are elementary perceptual, motor or cognitive acts, whose execution is necessary to change any aspect of the user's mental state or to affect the task environment
  3. Methods describe a procedure for accomplishing a goal
  4. Selection rules form the control structure: when a goal is attempted, there may be more than one method available to the user to accomplish it, and selection rules choose among them.
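At the keystroke level, a GOMS method can be turned into a quantitative time prediction by summing standard operator times. The sketch below uses commonly cited KLM operator averages (K ≈ 0.28 s per keystroke for an average skilled typist, P ≈ 1.1 s to point with a mouse, H ≈ 0.4 s to home the hands, M ≈ 1.35 s of mental preparation); the task encoding itself is a made-up example, not a published analysis:

```python
# Keystroke-Level Model sketch: predict execution time for a method by
# summing per-operator times. Operator values are the commonly cited
# KLM averages; the task sequence is a hypothetical illustration.

KLM_TIMES = {
    "K": 0.28,  # press a key (average skilled typist), seconds
    "P": 1.10,  # point with the mouse at a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(operators):
    """Sum the standard time for each operator in the sequence."""
    return sum(KLM_TIMES[op] for op in operators)

# Hypothetical method for "delete a word using the mouse":
# M (decide), P (point at word), K (click), H (home on keyboard),
# M (prepare), K (press Delete).
sequence = ["M", "P", "K", "H", "M", "K"]
print(round(predict_time(sequence), 2))  # 4.76 s
```

Comparing the predicted times of two candidate methods in this way is how a GOMS analyst chooses between competing designs before anything is implemented.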

Advantages of GOMS

  1. The GOMS approach to user modeling has strengths and weaknesses. While it is not necessarily the most accurate method to measure human-computer interface interaction, it does allow visibility of all procedural knowledge. 
  2. With GOMS, an analyst can easily estimate a particular interaction and calculate it quickly and easily. This is only possible if the average Methods-Time Measurement data for each specific task has previously been measured experimentally to a high degree of accuracy.

Disadvantages of GOMS

  1. GOMS only applies to skilled users. It does not work for beginners or intermediate users, since errors may occur which can alter the data.
  2. The model doesn't apply to learning the system, or to a user using the system again after a long time of not using it.
  3. Mental workload is not addressed in the model, making this an unpredictable variable.
  4. GOMS only addresses the usability of a task on a system, it does not address its functionality.

Cognitive Complexity Theory

Cognitive complexity is a psychological characteristic or variable that indicates how complex or simple a person's frame of reference and perceptual skill are. A person who measures high on cognitive complexity tends to perceive nuances and subtle differences which a person with a lower measure, indicating a less complex cognitive structure for the task or activity, does not.
It is used as part of one of the several variations of the viable non-empirical evaluation model GOMS (goals, operators, methods, and selection rules); in particular the GOMS/CCT methodology.
Cognitive complexity can have various meanings:
  1. The number of mental structures we use, how abstract they are, and how elaborately they interact to shape our perceptions.
  2. "An individual-difference variable associated with a broad range of communication skills and related abilities ... [which] indexes the degree of differentiation, articulation, and integration within a cognitive system".
Task Loading and Stress in Human Computer Interaction

Stress is defined as an organism's total response to environmental conditions or stimuli. It describes a condition that has an impact on the individual's mental and physical health; crossing a stress threshold is responsible for these changes. Stress plays an important role in human computer interaction (HCI) and in how a human interacts with any problem domain, including stand-alone systems and devices on which the user's final outcome is fixed and well defined.
Human stress affects problem solution, solution time and system usability. Under stress, humans generate various paths towards a solution to the problem statement.
                       
Fig: Stress level

Relationship between stress and cognitive workload

The integration of very high-tech equipment into standard operations is a radical change in the challenges faced by the infantry soldier. In addition, the battlefield, in all its forms, remains a place of extreme stress. Coupled with this stress comes the new burden of cognitive workload associated with the operation and management of new technological systems. The Land Warrior System, as currently conceived, is but one version of a potential family of advanced systems-each of which may generate its own combination of stresses. In this chapter we examine these stress-inducing factors, to identify sources of potential problems and to recommend avenues to solve such problems, either through existing capabilities or by proposing additional research.

Mitigation of stress

The diathesis-stress model is a psychological theory that attempts to explain a disorder as the result of an interaction between a predispositional vulnerability and a stress caused by life experiences. The term diathesis derives from the Greek term (διάθεσις) for a predisposition, or sensibility. A diathesis can take the form of genetic, psychological, biological, or situational factors. A large range of differences exists among individuals' vulnerabilities to the development of a disorder. 

Human error identification in HCI

In any complex system, most errors and failures in the system can be traced to a human source. Incomplete specifications, design defects, and implementation errors such as software bugs and manufacturing defects, are all caused by human beings making mistakes. However, when looking at human errors in the context of embedded systems, we tend to focus on operator errors and errors caused by a poor human-computer interface (HCI).
 "Error will be taken as a generic term to encompass all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when these failures cannot be attributed to the intervention of some chance agency."

Factors leading to human errors

The aim of this chapter is to examine factors that have an effect on human errors. The analysis is based on Rasmussen's SRK (Skill – Rule – Knowledge) model:
  1. Skill-based behaviour represents sensorimotor performance carried out automatically, without conscious control. Work performance is based on subroutines which are subject to higher-level control.
  2. Rule-based behaviour happens in a familiar work situation, where a consciously controlled stored rule is applied. Performance is goal-oriented, but structured by feed-forward control through a stored rule.
  3. Knowledge-based behaviour happens in unfamiliar situations, where a goal is explicitly formulated, based on an analysis of the environment and the overall aims of the person. The means must be found and selected according to the requirements of the situation.   
                                           

Human Error Interaction model

A framework is described for conceptualizing the interactions between people and computers which, it is hoped, will provide the basis of a theoretical model of human-computer interaction (HCI) sufficient to stimulate and guide research in the field.
 A distinction is drawn between interface and interaction, and the purposes of an interaction in this context are identified and discussed. In particular, the important purpose of morphogenesis is further elaborated. 

Definitions of some Terms of Interaction:

Domain: expertise and knowledge in some real-world activity. In the GUI domain, concepts such as geometric shapes, colours and symbols are involved. 
Task: an operation to manipulate the concepts of a domain. 
Goal: the desired output from a performed task (e.g. in a GUI: a button). 
Intention: a specific action required to meet the goal.
Task analysis: the study of the problem space.

In HCI, interaction models describe the translations between user and system.

There are different Interaction Models mentioned in HCI 

  • Donald Norman’s Interaction Model
  • Abowd & Beale’s model
  • A generalised Interaction Model (from Dix et al.) has four components: 
  (i) System, (ii) User, (iii) Input and (iv) Output.

There are different Interaction Styles (nature of the dialogue)
And there are different Interaction Contexts (Social, Organizational, Educational, Commercial etc)

Norman’s Model of Interaction

Donald Norman's interaction model concentrates on the user's thought processes and accompanying actions.
Norman proposed that actions are performed by the user in a cycle:
  1. Establishing a goal 
  2. Executing the action
  3. Evaluating the results 
Given a need, a user sets about achieving the goal of fulfilling it. A series of actions is performed, one leading to another, until the expected result is obtained.
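The cycle of establishing a goal, executing actions and evaluating the results can be sketched as a loop. In this toy Python sketch the "system" is an invented brightness control, used purely to illustrate the repeated execute-then-evaluate structure:

```python
# Sketch of Norman's execution-evaluation cycle: given a goal, repeatedly
# plan an action, execute it through the interface, perceive the new
# system state, and evaluate it against the goal until the goal is met.
# The brightness "system" here is a toy invented for illustration.

def interact(goal, system_state, plan):
    trace = []
    while system_state != goal:              # evaluate: goal met yet?
        action = plan(system_state, goal)    # form intention / plan action
        system_state = action(system_state)  # execute via the interface
        trace.append(system_state)           # perceive the new state
    return trace

# Toy world: the user wants screen brightness at level 3.
brighter = lambda s: s + 1
dimmer = lambda s: s - 1
choose = lambda state, goal: brighter if state < goal else dimmer

print(interact(goal=3, system_state=0, plan=choose))  # [1, 2, 3]
```

The gulfs discussed below correspond to the two points where this loop can fail: the planning step (execution) and the perception/evaluation step.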

Norman's model of interaction consists of seven stages, as follows.

The seven stage model

  1. Establishing the goal
  2. Forming the intention
  3. Specifying the action sequence
  4. Executing the action
  5. Perceiving the system state
  6. Interpreting the system state
  7. Evaluating the system state with respect to the goals and intentions

Norman applies the model to explain why some interfaces cause problems for users:

He uses the terms 'gulf of execution' and 'gulf of evaluation'. Norman's model (also sometimes called the gulf model) is useful in understanding the reasons for interface failures from the user's point of view. The seven stages of action model is an elaboration of the gulf model.
                           
Gulf of execution: the difference between the user's formulation of the actions to reach their goal and the actions allowed by the system.
   User's formulation of action ≠ actions allowed by the system.
Gulf of evaluation: the difference between the physical presentation of the system state and the expectations of the user. 
   User's expectation ≠ system's presentation.

Status-Event Analysis

A recurring theme in this area has been the analysis and specification of interfaces including both status and event phenomena. The word 'event' is self-evident: events happen at a particular time. Status, in contrast, refers to phenomena which have some continuity, that is, anything which for a period of time can be sampled or observed.
Status-event analysis is a semi-formal, easy to apply technique that:
  • Classifies phenomena as event or status
  • Embodies naïve psychology
  • Highlights feedback problems in interfaces.
Example

Events
  • clock: alarm rings
  • interface: beep or buzz
Status:
  • clock: the hands on the clock face
  • interface: the contents of the screen
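The distinction can also be shown in code: an event arrives as a discrete notification at a particular moment, while a status is continuously available and must be sampled. The alarm-clock model below is a toy sketch written for this example:

```python
# Sketch of the status/event distinction using the alarm-clock example:
# the clock's time is a status (always observable, must be sampled),
# while the alarm ringing is an event (fired at one particular moment).

class AlarmClock:
    def __init__(self, alarm_at):
        self.time = 0            # status: the hands on the clock face
        self.alarm_at = alarm_at
        self.listeners = []      # event subscribers (e.g. the beep)

    def sample_status(self):
        """Status: can be observed at any moment."""
        return self.time

    def tick(self):
        """Advance time; fire an event when the alarm moment occurs."""
        self.time += 1
        if self.time == self.alarm_at:
            for listener in self.listeners:
                listener("alarm rings")  # event: happens at one instant

events = []
clock = AlarmClock(alarm_at=3)
clock.listeners.append(lambda e: events.append(e))
for _ in range(5):
    clock.tick()

print(clock.sample_status())  # 5  (status: whatever the time is now)
print(events)                 # ['alarm rings']  (event: fired once, at t=3)
```

Feedback problems in interfaces often come from treating one kind of phenomenon as the other, for example reporting a continuously changing status only through a one-off event.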

 Fig: An alarm clock as an example of status-event analysis

Sensor based interaction

Sensors are an important source of information input in any real world context and several previous research contributions look into this topic. In our research, we combine sensor-generated context information received both from the phone itself and information retrieved from cloud-based servers. All data is integrated to create a context-aware mobile device, where we implemented a new customized home screen application for Android enabled devices. 





