
Department of Psychology

Ruth M. Hurst, Ph.D., BCBA



Study Guide: Chapter 15 (pp. 259-275)
This chapter describes avoidance. One problem with analyzing avoidance is that it can be hard to identify the contingency that makes the response more likely: no concrete reinforcer seems to be delivered.
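
To make this concrete, here is a minimal Python sketch of a free-operant avoidance arrangement, in which each response postpones the next scheduled aversive event. It is my own illustration, not a procedure from the textbook, and the function name, interval values, and response times are all assumptions chosen for the example.

    # A minimal free-operant avoidance sketch: each response postpones the next
    # scheduled aversive event, so responding prevents the event even though
    # nothing tangible is ever delivered. All values are illustrative assumptions.

    def simulate_avoidance(session_seconds, response_times, shock_interval=10, postpone_by=15):
        """Return the times at which the aversive event occurs during the session.

        session_seconds : total session length in seconds
        response_times  : times (in seconds) at which the subject responds
        shock_interval  : seconds between aversive events if no response occurs
        postpone_by     : seconds each response postpones the next event
        """
        events = []
        next_event = shock_interval
        upcoming = iter(sorted(response_times))
        r = next(upcoming, None)
        for t in range(session_seconds + 1):
            if r is not None and r <= t:
                next_event = r + postpone_by   # the response postpones the event
                r = next(upcoming, None)
            if t >= next_event:
                events.append(t)               # aversive event occurs
                next_event = t + shock_interval
        return events

    # Responding every 5 s avoids every event; never responding contacts one
    # event every 10 s.
    print(len(simulate_avoidance(60, range(0, 60, 5))))  # -> 0
    print(len(simulate_avoidance(60, [])))                # -> 6

The point of the sketch: steady responding keeps the count of aversive events at zero, while not responding lets the events occur right on schedule, even though no tangible reinforcer is ever delivered for responding.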

1. Describe the procedure that was used to correct Sid's posture. What was the effect of this procedure on the likelihood of the response?




2. What is an avoidance contingency? Explain how the example with Sid (#1) fits the definition of an avoidance contingency. Try to fill in the contingency boxes.




3. Make up your own everyday example of an avoidance contingency. Diagram it and indicate the effect on behavior.




4. Why is it appropriate to say that the avoidance contingency is a reinforcement contingency?




5. Describe the procedure that Eve and Joe used to increase the likelihood of Jimmy's making eye contact. (a) Explain how the treatment fits the definition of avoidance. (b) Diagram the contingency, and indicate the effect on the likelihood of the response.





6. What is an avoidance-of-loss contingency? Give an example of your own and diagram it. What is the effect on the likelihood of the response? (What are the main differences between an avoidance-of-loss contingency and a penalty contingency?)





7. What are the main similarities and differences between avoidance and escape contingencies? (It is sometimes said that the response in an escape contingency removes a current aversive condition, whereas the response in an avoidance contingency prevents a future aversive event. Explain what this statement means in terms of the contingency diagrams.)





8. What is a warning stimulus (pp. 266, 270-271)? The complicated diagram on p. 266 shows that there is both an avoidance contingency and an escape contingency for lever-pressing. What is the aversive stimulus for the escape contingency?




9. What procedure did Hefferline, Keenan, and Harford (1958) use to study learning without awareness in humans? What was the result of their study that convinced Malott that "contingencies can control behavior, even when we are unaware of those contingencies, the behavior, or that any controlling is going on"? (pp. 273-275)



Study Guide: Chapter 16 (pp. 276-284)
This chapter describes a fairly complex type of contingency--one that, nevertheless, occurs often in daily life. For that reason it is important to be able to interpret this kind of contingency correctly and understand the behavioral effects. One source of confusion is that people sometimes think they can simplify the contingency. But in doing so, they are forced to treat non-behavior as behavior. You can avoid this confusion by remembering the "Dead man's test" (if a dead man can do it, it isn't behavior) as a way to decide what is and what isn't behavior. This use of the "Dead man's test" comes up frequently in this chapter.

1. Without peeking, diagram the contingency that resulted in a decrease in the frequency of Sid's eye-batting. What kind of contingency is this?



2. Why would it not be appropriate to describe the contingency that reduced the frequency of Sid's eye-batting as a simple escape contingency of the following form? (Hint: Dead man's test)





3. Complete the two diagrams on page 276 that have empty boxes.




4. Without peeking, give the definition of the punishment-by-prevention-of-removal contingency.



5. Be able to reproduce the contingency diagram at the lower-right of page 277. Be able to explain why each of the following diagrams is unsatisfactory for representing the contingency that resulted in the decreased frequency of Jimmy's head-banging.




6. What kind of contingency was responsible for the decrease in Jimmy's head-banging?



7. Be able to give the definition of the punishment-by-prevention-of-a-reinforcer contingency.



8. What is similar and what is different between a penalty contingency and a punishment-by-prevention-of-a-reinforcer contingency? Why can both be regarded as types of punishment contingency?



9. Make up your own everyday example of a punishment-by-prevention-of-a-reinforcer contingency. Diagram it and indicate the effect on behavior.



10. On pages 280-281 the authors describe a procedure used to reduce the frequency of Todd's disruptive behavior at the dentist. What is unsatisfactory about viewing this procedure as an escape contingency (page 281)? What is unsatisfactory about viewing this procedure as a reinforcement contingency (page 282)? What does Malott regard as the appropriate interpretation of the contingencies?



11. What is a DRO contingency? Why does Malott recommend not describing contingencies in terms of DRO?


Study Guide: Chapters 17, 18, 19 (parts of each)
These chapters discuss schedules of reinforcement. The study questions address only a few points in each of the three chapters.
1. What does the term intermittent reinforcement mean?


2. What is a fixed-ratio (FR) schedule? Describe the pattern of responding produced by a fixed-ratio schedule. Provide an everyday example.


3. What is the postreinforcement pause? What are the conditions that produce the postreinforcement pause? Why might the postreinforcement pause seem maladaptive?


4. What is similar and what is different between a fixed-ratio schedule and a variable-ratio (VR) schedule of reinforcement? In what way are the response patterns different? Provide an everyday example of a variable-ratio schedule.


5. What is similar and what is different between a fixed-interval (FI) schedule and a variable-interval (VI) schedule? In what way are the response patterns different? Provide an everyday example of a variable-interval and a fixed-interval schedule.
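
For questions 2-5, the rough Python sketch below may help you keep the ratio/interval and fixed/variable distinctions straight. It is my own illustration; the class names and schedule values (FR 10, VR 10, FI 30 s, VI 30 s) are arbitrary choices, not values from the text.

    import random

    # Rough sketches of the four basic schedules of reinforcement.

    class FixedRatio:
        """FR n: reinforce every nth response."""
        def __init__(self, n):
            self.n, self.count = n, 0
        def respond(self, now):
            self.count += 1
            if self.count >= self.n:
                self.count = 0
                return True          # reinforcer delivered
            return False

    class VariableRatio:
        """VR n: reinforce after an unpredictable number of responses averaging n."""
        def __init__(self, n):
            self.n, self.count = n, 0
            self.required = random.randint(1, 2 * n - 1)
        def respond(self, now):
            self.count += 1
            if self.count >= self.required:
                self.count = 0
                self.required = random.randint(1, 2 * self.n - 1)
                return True
            return False

    class FixedInterval:
        """FI t: reinforce the first response after t seconds have elapsed."""
        def __init__(self, t):
            self.t, self.available_at = t, t
        def respond(self, now):
            if now >= self.available_at:
                self.available_at = now + self.t
                return True
            return False

    class VariableInterval:
        """VI t: reinforce the first response after an unpredictable interval averaging t."""
        def __init__(self, t):
            self.t, self.available_at = t, random.uniform(0, 2 * t)
        def respond(self, now):
            if now >= self.available_at:
                self.available_at = now + random.uniform(0, 2 * self.t)
                return True
            return False

    # A subject responding once per second for two minutes:
    for schedule in (FixedRatio(10), VariableRatio(10), FixedInterval(30), VariableInterval(30)):
        reinforcers = sum(schedule.respond(second) for second in range(120))
        print(type(schedule).__name__, reinforcers)

Note that the sketch only captures the rule for when a reinforcer is delivered. The characteristic response patterns the questions ask about (the postreinforcement pause, the fixed-interval scallop, the steady rates on variable schedules) are effects these rules have on behavior, not part of the rules themselves.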



6. Do Malott et al. think the deadline for a term paper is like a fixed-interval schedule? Why or why not?




7. What is the procedure that can produce superstition in the pigeon? What is the evidence that a superstition was developed? Is the procedure a fixed-interval schedule?



8. What are concurrent contingencies? What is the relevance of concurrent contingencies to our everyday concept of choice?




9. In class we will discuss a particular type of concurrent contingency that can result in "impulsive" rather than "optimal" choices. The contingencies involve different amounts and delays of reinforcers (or aversive conditions). You should be able to sketch these kinds of concurrent (or competing) contingencies and discuss their relevance to "impulsive versus optimal" behavior.
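
As a starting point, here is a small Python sketch of two such concurrent contingencies. The amounts and delays are made-up values, and the "impulsive" and "optimal" selection rules are deliberately oversimplified stand-ins for what we will develop in class.

    # Two concurrent (competing) contingencies with made-up amounts and delays.
    choices = {
        "smaller-sooner": {"amount": 1, "delay_s": 5},      # e.g., one piece of candy almost immediately
        "larger-later": {"amount": 10, "delay_s": 3600},    # e.g., ten pieces an hour from now
    }

    # An "impulsive" chooser is controlled mainly by delay: it takes whichever
    # outcome comes sooner, regardless of amount.
    impulsive_choice = min(choices, key=lambda option: choices[option]["delay_s"])

    # An "optimal" chooser (in the sense discussed in class) is controlled by
    # amount: it takes the larger outcome even though it is delayed.
    optimal_choice = max(choices, key=lambda option: choices[option]["amount"])

    print("Impulsive choice:", impulsive_choice)  # smaller-sooner
    print("Optimal choice:", optimal_choice)      # larger-later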



Study Guide: Chapter 20 (pp. 331-347)

1. Malott suggests that stimulus-response chains are common in our everyday lives. Define stimulus-response chain.



2. On page 333, Malott gives an example of a stimulus-response chain operating when you eat potatoes. Think of a different example of a sequence of responses that might be interpreted as a stimulus-response chain, and diagram at least three segments (links). Each stimulus in a chain is said to have dual (two) functions. What are the two functions of each stimulus?
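
For reference, here is one way such a chain can be laid out, sketched in Python. The tooth-brushing activity and its wording are my own illustration, not the potato example from page 333.

    # A three-link stimulus-response chain, using a tooth-brushing example.
    chain = [
        # (stimulus that starts the link, response it occasions)
        ("toothbrush and toothpaste in view", "pick up the brush and apply paste"),
        ("paste on the brush", "bring the brush to your mouth and brush"),
        ("foam in the mouth", "rinse and spit"),
    ]

    # Dual functions: each stimulus after the first serves as a discriminative
    # stimulus for the next response and as a learned (conditioned) reinforcer
    # for the response that produced it.
    for link_number, (stimulus, response) in enumerate(chain, start=1):
        print(f"Link {link_number}: {stimulus} -> {response}")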




3. The book emphasizes the use of forward chaining for establishing a stimulus-response chain. Why would backward chaining often be a better method of establishing a stimulus-response chain? (Hint: Think of nonverbal subjects.)



4. Describe a situation in which you would use backward chaining to establish a stimulus-response chain. Diagram the stimulus-response chain and describe the procedure you would use to establish it. Which link in the chain would you establish first? (What role does shaping play in backward chaining?)




5. What is a differential-reinforcement-of-low-rate (DRL) contingency? What effect does it have on responding?




6. In the section entitled "Ways to reduce undesirable behavior" (pp. 342-348) the authors summarize 9 different procedures that one could use to decrease the frequency of some response. These procedures are summarized in a table on page 348. Go over this table with your interteach group.

Imagine that you are on the staff of a school or other institution (say, one for the education of children with various kinds of developmental disabilities). One of the children (Tim) throws temper tantrums excessively. As a result, Tim is unable to benefit from the educational program in the school--at least not to the degree that he needs to. Your job is to consider each of the procedures listed in the summary on p. 348 and indicate how each one might be used to correct the child's problem. Specifically, (a) define and diagram the procedure, and (b) indicate the pros and cons of using the procedure for reducing the excessive frequency of Tim's tantrums. It would be good to pretend with your interteach partner that you are both on the staff of the school; you can then work together to develop the best possible treatment program for Tim.





Study Guide: Chapter 21 (pp. 353-376)
This chapter is on the topic of what is called respondent, Pavlovian, or classical conditioning (these terms mean the same thing).


1. Pavlov demonstrated that a previously neutral stimulus could acquire the ability to substitute for an effective eliciting stimulus. Describe the kind of learning experience that caused the neutral stimulus to acquire this ability. That is, what were the essential conditions that the dog had to experience? (And what does elicit mean?)
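
If it helps to see the pieces laid out before answering, here is a small Python sketch of Pavlov's classic arrangement with the bell, the food, and salivation; the number of pairings shown is an arbitrary illustrative value.

    # Pavlov's classic arrangement, laid out as data.
    US, UR = "food in the mouth", "salivation"   # unconditioned stimulus and response
    NS = "bell"                                  # neutral stimulus before conditioning

    # Essential learning experience: the neutral stimulus is presented together
    # with (typically just before) the unconditioned stimulus, repeatedly.
    pairings = [(NS, US)] * 10

    # After such pairings the formerly neutral stimulus is a conditioned stimulus
    # (CS): presented alone, it elicits a conditioned response (CR) resembling the UR.
    CS, CR = NS, "salivation elicited by the bell alone"

    print(f"Before conditioning: {NS} alone elicits nothing; {US} elicits {UR}.")
    print(f"After {len(pairings)} pairings: {CS} alone elicits the CR ({CR}).")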




2. Define and give an example of each of the following: neutral stimulus (NS); unconditioned stimulus (US); conditioned stimulus (CS); unconditioned response (UR); conditioned response (CR). What is the difference between the UR and the CR?



3. Interpret the following as an example of respondent conditioning. Identify the NS, US, CS, UR, and CR.




Until last Saturday, little Billy showed no signs of autonomic nervous system arousal to the sound of the buzzing bee. But then Saturday, Billy heard the sound of the buzz and was stung. Since then Billy has shown signs of autonomic nervous system arousal upon hearing the buzz.





4. Suppose we notice that Billy also calls to mommy for her to hold him when he hears the buzz. Is the response of calling out likely to be the result of respondent conditioning? What kind of behavior is the calling-out? And what kind of contingency probably caused the likelihood of calling out under such circumstances to increase?


5. What is the extinction procedure for respondent conditioning? What is the result of the extinction procedure?


6. What is higher-order conditioning? Provide an illustrative example.



7. What is systematic desensitization? What is its relevance to respondent conditioning?

