How To Create A Simple NPC

NPC – non-player character

There are several components required to create an NPC: an avatar, a surface on which to navigate, a set of animations, and rules that dictate when to play which animations.

Making the Avatar:

  1. Import the package into your project. It should appear under the Assets folder in the Project window.
  2. There should be a model of the animation’s avatar in the folder; drag and drop it into the Hierarchy window to create a game object.
  3. If you click on the game object, there should be an Animator component in the Inspector. Click Add Component > NavMeshAgent. This tells Unity that this game object is a moving character that will navigate the scene.
  4. Click Add Component > Rigidbody. Any object with a Rigidbody is simulated by Unity’s physics engine. This ensures that the NavMeshAgent will be able to interact with the ground (stand on it without falling through). A script-based version of this setup is sketched after this list.
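
The same setup can also be done from code rather than the Inspector. Below is a minimal sketch, not part of the animation package: the object name "NPC" is a placeholder for whatever your avatar is called in the Hierarchy, and each component is only added if it is missing.

    using UnityEngine;
    using UnityEngine.AI;

    // Minimal sketch (assumes an avatar named "NPC" already exists in the scene):
    // adds the same NavMeshAgent and Rigidbody components described above.
    public class NpcSetup : MonoBehaviour
    {
        void Awake()
        {
            GameObject npc = GameObject.Find("NPC"); // placeholder name

            // NavMeshAgent lets the character navigate the baked NavMesh.
            if (npc.GetComponent<NavMeshAgent>() == null)
                npc.AddComponent<NavMeshAgent>();

            // Rigidbody makes the object subject to physics (gravity, collisions).
            if (npc.GetComponent<Rigidbody>() == null)
                npc.AddComponent<Rigidbody>();
        }
    }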

Making the Surface:

  1. Drag and drop a walkable surface from the Project folder into the Hierarchy.
  2. From the Navigation tab, go to Bake > Advanced > Select “Height Mesh” > Bake. This ensures that the NavMeshAgent will actually walk on the surface; otherwise, it may appear to float instead. 
    1. It may take a few moments to bake; you can track its progress by the blue progress bar in the bottom right.
  3. Position the NavMeshAgent on or above the surface. If it’s working properly, the game object should fall and make contact with the surface without falling through. If it’s not, make sure that there’s a Rigidbody attached to the game object and that a NavMesh has been baked.

Making the Animations:

  1. Next to the Scene and Asset Store tabs, there should be an Animator tab. Right-click in the Project window and click Create > Animator Controller. Three animation states should show up in the Animator window: Any State, Entry, and Exit. Attach the animator controller to your avatar by clicking on it in the Hierarchy and locating the “Animator” component; drag and drop your animator controller into its “Controller” field.
    1. To move around in the Animator window: Alt + click and drag. Scroll to zoom.
  2. In your animation pack, there should be a set of animation states available for your animator. When your animation begins, there should be an initial state; designate this as your first animation. Drag and drop that animation into the Animator window; it should appear in orange, with an arrow leading to it from Entry. These arrows represent transitions between states.
    1. To make a transition, right-click on the starting state and select “Make Transition”. An arrow will appear; connect it to the end state.
    2. To learn more: Unity’s State Machine Transitions

Making the Rules:

  1. On the left side of the Animator window, there is a column designated for layers and parameters. Parameters are variables used to control transitions between states. Create a parameter by clicking on the “+” sign to the right, selecting the type, and setting a default value. These parameter values can be accessed and altered through scripts.
  2. To add a parameter value as a condition for entering a transition, first click on the desired transition. In the Inspector window, locate the “Conditions” section and add the necessary parameter values.
  3. Parameter values can be changed via scripts: declare an Animator variable, assign it with GetComponent, and set values using SetBool or SetFloat, for instance (see the sketch after this list).
    1. To learn more: Unity’s Animation Parameters
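
As an illustration of step 3, here is a minimal sketch of a script attached to the avatar. The parameter names "IsWalking" and "Speed" are placeholders and must match whatever parameters you created in the Animator window.

    using UnityEngine;

    // Minimal sketch: driving Animator parameters from a script.
    // "IsWalking" and "Speed" are placeholder parameter names.
    public class NpcAnimationController : MonoBehaviour
    {
        private Animator animator;

        void Start()
        {
            // Reference the Animator component on this game object.
            animator = GetComponent<Animator>();
        }

        void Update()
        {
            // Set the values that your transition conditions check.
            animator.SetBool("IsWalking", true);
            animator.SetFloat("Speed", 1.5f);
        }
    }

Any transition whose conditions reference these parameters will fire once the values are set.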

Setting a Destination:

  1. To set the NavMeshAgent’s destination: Unity’s NavMeshAgent.destination (a combined sketch appears after this list).
    1. NavMeshAgents can “slide” across a surface; to fix this, you can add to your script: [NavMeshAgent variable name].velocity = Vector3.zero;
    2. To rotate the NavMeshAgent to face the player: Unity’s Transform.LookAt
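
Putting this section together, here is a minimal sketch of a movement script for the NPC. The "target" and "player" fields are placeholders you would assign in the Inspector.

    using UnityEngine;
    using UnityEngine.AI;

    // Minimal sketch: walk the NavMeshAgent to a destination, stop the residual
    // "slide", and turn to face the player. "target" and "player" are placeholders.
    public class NpcMovement : MonoBehaviour
    {
        public Transform target;   // where the NPC should walk to
        public Transform player;   // who the NPC should face on arrival

        private NavMeshAgent agent;

        void Start()
        {
            agent = GetComponent<NavMeshAgent>();
        }

        void Update()
        {
            // The agent paths across the baked NavMesh toward the destination.
            agent.destination = target.position;

            // Once the agent arrives, zero its velocity to stop sliding,
            // then rotate it to face the player.
            if (!agent.pathPending && agent.remainingDistance <= agent.stoppingDistance)
            {
                agent.velocity = Vector3.zero;
                transform.LookAt(player);
            }
        }
    }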

Is Pain Catastrophizing Hindering Your Recovery? An Interview with Dr. Beth Darnall

If you’re a person with chronic pain, ask yourself how often you experience thoughts like the following: “I feel like I can’t stand it anymore,” “I want this pain to go away so badly,” “I can barely think of anything else and it overwhelms me,” “I’m afraid this pain will only get worse.”


If these thoughts are familiar to you, you may be suffering from what psychologists call pain catastrophizing. Pain catastrophizing is one of the strongest predictors of suffering in chronic pain patients. The good news is that catastrophizing is highly treatable with empirically supported therapy techniques intended to bring awareness to, and restructure, harmful thoughts.


Beth Darnall, Ph.D. is a clinical psychologist specializing in pain and an associate professor in the Division of Pain Management at Stanford University. She has been working with people who suffer from chronic pain for over 15 years, and more recently has specialized in readying patients for surgery ahead of their procedures. Karuna Labs recently caught up with Dr. Darnall and asked her about her insights into the psychology of pain.


Karuna: What do you most want people to understand about your work?

Dr. Beth Darnall: Most people have misinformation about the role of psychology in the treatment of pain. Historically, the focus of pain treatment was medication, procedures, and surgeries. It was only when medical treatments were ineffective that anyone raised the question of behavioral interventions or pain psychology, which led to the perception that pain psychology is a palliative option, or the option of last resort when nothing else works. But it turns out that this is simply not the case.


My work focuses on changing a cultural mindset about the role of psychology in the experience and treatment of pain, such that we elevate psychology to the status of primary treatment. Collectively, research findings tell us that pain psychology deserves top billing.


Karuna: Why do you believe that pain psychology should be the primary treatment for chronic pain?

Dr. Beth Darnall: Psychology is built into the definition of pain. Pain is not just a noxious sensory experience, but an emotional experience too. The fact that we don’t integrate it equally into our treatment is the main reason that people have suboptimal results. Often, the cognitive and emotional factors and behaviors that maintain or even amplify pain are ignored. At this point, we have a lot of rich data demonstrating that if psychological factors aren’t addressed at the front end, they will serve to undermine patients’ response to the interventions we try. If we integrate pain psychology early and equip patients with the skills they need to self-regulate, it optimizes their response to surgeries, medications, or whatever procedures may also be used to treat their pain.


It’s not either medicine or psychology – people need both.


Karuna: What is your approach to pain psychology?

Dr. Beth Darnall: The psychological piece needs to be addressed first in order to optimize patients’ response to everything else. To ready people as quickly as possible, I have developed a brief treatment to address pain-related distress, common factors, and pain catastrophizing. It is a single-session treatment, a two-hour class. Usually, pain cognitive behavioral therapy (CBT) lasts eight sessions, so this is much more efficient and inexpensive for patients.


Karuna: What would a patient learn in the 2 hour class?

Dr. Beth Darnall: Pain education and skills. People have their own experience of pain, so we all think we know what that is, but the class actually unpacks why psychology is integral when it comes to chronic pain. The class includes pain science and basic CBT skills. We distill 8 CBT sessions into one focal compressed module. We also know that when CBT is effective, it’s often because catastrophizing has been reduced, which may mean catastrophizing is the most effective therapeutic target.


Karuna: What can you tell us about pain catastrophizing?

Dr. Beth Darnall: The research shows that pain catastrophizing is a specific psychological experience that is highly predictive of treatment outcome. It often comes with rumination, magnification, and feelings of helplessness. Rumination has the most predictive value. When our minds are connected to pain so strongly, it makes it hard to reduce pain because it’s so front and center. This is confusing to people with chronic pain; they’ll say, “Of course I’m focused on my pain, it’s severe and I can’t focus on anything else.” My approach is similar to the broader field of cognitive behavioral therapy: it requires unpacking how a person’s attention relates to their pain and what specific skills they can use to disengage that focus. Any skills one can use to disengage negative focus from the pain tend to engender a sense of how a person can help themselves. By learning ways to stop ruminating, we reduce feelings of helplessness.


Karuna: What about mindfulness?

Dr. Beth Darnall: There is a therapy approach called Acceptance and Commitment Therapy (ACT) that combines CBT with mindfulness. I don’t have formal training as an ACT therapist, but I use some of the techniques. ACT has many of the same components as CBT but also focally integrates mindfulness, with more emphasis on setting active goals. ACT uses different types of cognitive strategies; in CBT the emphasis is often on restructuring and reframing, whereas ACT uses techniques meant to generate more cognitive flexibility. The idea is that much of our thinking is rooted in absolutes, or binary thinking. In the end, there are more than 100 techniques someone could use to defuse a persistent thought that’s essentially unhelpful. One could be repeating a thought again and again until it loses its value, which allows the patient to begin witnessing those thoughts without reaction or emotional attachment to them. There are many useful ways to approach your pain more adaptively.


Karuna: Thank you, Dr. Darnall, for your insights!

Can VR Change Your Sense Of Self?

With the popularity of immersive virtual reality growing, it’s time we looked at the power behind this technology and began to understand how it affects our sense of self.

Predictive coding is one of the leading theories explaining what our brain does. This theory claims our brain is a prediction machine (Friston, 2006). In order to make good predictions, the brain combines things it has already learnt with new information coming from the senses. The theory holds that some layers of the brain predict whether a movement is self-generated or stemming from an external source (Ishida, Suzuki, & Grandi, 2015; Seth, 2014). This requires the notion of a so-called minimal-self, existing even in primitive life, allowing for differentiation between the organism and the environment (Apps & Tsakiris, 2013; Limanowski & Blankenburg, 2013). This minimal-self is also a combination of previous knowledge with new incoming information, and it has been shown to be flexible and prone to mistakes. A great example is the Rubber Hand Illusion (Botvinick & Cohen, 1998): the sight of a rubber hand being touched at the same time as a sensation of touch on a person’s actual hand results in the brain predicting that the rubber hand is part of the body and the minimal-self.

Immersive virtual reality allows for manipulation of the minimal-self by controlling the relationship between visual information and proprioceptive information. In VR we can create experiences where what a person sees and feels fits the usual prediction created by the minimal-self model, but we can also create experiences that contradict our normal predictions. Recent research shows that virtual reality has indeed been used to manipulate the minimal-self: for instance, to investigate phenomena such as the rubber hand illusion by creating full-body illusions (Slater, Spanlang, Sanchez-Vives, & Blanke, 2010), to treat body image disorders like anorexia (Keizer, van Elburg, Helms, & Dijkerman, 2016), to investigate mirror box therapy for amputees (Wittkopf & Johnson, 2016), and to help heal spinal cord injuries (Donati et al., 2016).

When the brain experiences something that does not fit its usual predictions, it can do one of several things. It can re-sample the information from the senses: for instance, if what a person sees does not fit what they feel in their proprioceptive sense, the brain can down-weight one of the senses. This sheds light on a possible reason mirror box therapy is effective for chronic pain treatment (Wittkopf & Johnson, 2016): patients seeing a different body than the one they are used to might reduce this surprise by decreasing the sampling from their pain receptors. Another option the brain can opt for is updating its predictions, or its model, creating a learning effect.

As we have seen, VR can indeed change the sense of self. Karuna Labs uses this ability in its Virtual Embodiment Training to produce specific immersive VR experiences that change the sense of self, causing the brain to reduce chronic pain sensation and induce rapid learning.



Neuromatrix Model of Pain

Cognitive-related brain areas – how we assign meaning to an experience; the same stimulus can carry a different meaning.

Emotion-related brain areas – the limbic system and associated homeostatic/stress mechanisms.

Sensory-discriminative brain areas – nociception, the sensory input; where is it?

All three components are needed for an experience to be assessed as pain. Pain is an output, it is NOT an input. Nociception is only one third of what travels to the brain; the perception must also be assigned meaning (cognitive-evaluative).

In the clinic…

By this point, the experience has already been scrutinized and assigned a meaning of threat or harm. The pain narrative is a pain behavior: a proxy of the true experience, often distilled down to a number from 0 to 10.

This is Your Brain in Virtual Space

We break up the world into concepts and labels in our everyday functioning. When we walk down the street, we label the car as a car. We pass someone on the street and label them as a woman, along with her associations: attractive, Caucasian, tall, etc.

When we do this labeling and conceptualization, we subtly create a duality between self and other. Patients who have lesions in the back part of the brain, in an area called the association cortex, are not able to distinguish between self and other. One patient with such a lesion was not able to point to any items outside his own body.

There are different types of space: physical, psychological, and mental space. The sense of space can also go away, which results in a substrate space. There is also non-dual space, where the subject and object collapse.

Previc (http://www.ncbi.nlm.nih.gov/pubmed/16439158) describes four levels of physical space:

  1. Peripersonal – space within 6.5 ft of our body. The posterior inferior parietal region.
  2. Extrapersonal focal – representation of objects above the horizontal plane and recognizes faces at distances beyond 6.5 ft. Inferotemporal region.
  3. Extrapersonal action – system that allows us to navigate. Superior and medial temporal regions.
  4. Extrapersonal ambient – orients us in space. Parietooccipital region.

The occipital to temporal pathway controls the WHAT. The occipital to parietal pathway controls the WHERE. The posterior parietal cortex and the posterior superior temporal sulcus are usually engaged in exercises where you have to point out objects in space. Allocentric space is the space of the objects we represent outside ourselves. This is contrasted with an egocentric frame of reference which is how we represent objects in the world with us in the center.

Very clever experiments have been conducted in the virtual reality space. Space gets distorted in the virtual world; we can still identify with our arm even if it is lengthened to twice its length.


Self-Regulating the Brain and Neurofeedback


With neurofeedback now going mainstream [1], it’s important to take a closer look at its underlying mechanics. Neurofeedback works by reading the electrical activity at the surface of the scalp with EEG (electroencephalography). This data is in turn fed to a computer and represented visually or aurally. The brain learns what the reward signals are and over time works to modulate itself according to the feedback given to it.

Neuroplasticity

The brain is able to physically reorganize itself to form new neural connections throughout its lifetime. Through neuroplasticity the brain is constantly being shaped as we experience, adapt, and learn.

“Neurons that fire together, wire together.”


Complexity

Self-organization and self-regulation are a fundamental part of brain operation. Complex systems like the brain self-organize, are open, and constantly exchange information across boundaries. The brain doesn’t only process information; it also generates information. Complex systems are defined by their nonlinearity, and their behavior cannot be predicted solely from the interactions of their lower-level components. The behavior of the brain as a complex system cannot be predicted by the sum of the local interactions of its neurons. Brain waves organize in a way that shows self-similarity over time. The fluidity of brain processes depends on evolving complexity. Neurofeedback seeks to tune brain oscillations to achieve a balance between network flexibility and stability [2].

Phase Resets

One study showed that network resets in the medial prefrontal cortex of rats correlated with letting go of prior beliefs in favor of exploratory behavior [3]. There was an updating of the belief system in these rats; it reflected a network switch to a state of instability, which then diminished as new stable representations were formed.

Brain Structures 

The precuneus, anterior cingulate, and angular gyri are thought to control self-referential thinking. PET scans in one Danish study [4] showed increased activation in the left prefrontal cortex, medial precuneus, and posterior cingulate regions.

References

1. http://www.newsweek.com/neurofeedback-brain-regulation-neuroscience-457492
2. http://journal.frontiersin.org/article/10.3389/fncom.2016.00020/full
3. http://science.sciencemag.org/content/338/6103/135
4. H. Lou, B. Luber, M. Crupain, et al. Parietal cortex and representation of the mental self. Proceedings of the National Academy of Sciences 2004; 101:6827-6832. The effective stimulation had a latency of 160 ms.

Paper Review: Source-space EEG neurofeedback links subjective experience with brain activity during effortless awareness meditation

Judson Brewer ran a study to test whether source-localized EEG neurofeedback could follow effortless awareness in novice and expert meditators [1]. The EEG neurofeedback was done in the gamma band (40-57 Hz), measuring posterior cingulate cortex (PCC) activity. Verbal probes were used to assess the correlation with the subjective experience of effortless awareness. The results suggest that brain activity within the PCC is strongly correlated with the subjective experience of effortless awareness.

References

1. van Lutterveld, R., et al., Source-space EEG neurofeedback links subjective experience with brain activity during effortless awareness meditation, NeuroImage (2016), http://dx.doi.org/10.1016/j.neuroimage.2016.02.047