Review sheet 2


B. F. SKINNER, OPERANT/INSTRUMENTAL CONDITIONING, REINFORCEMENT, AND CONTINGENCY MANAGEMENT

1. Why is Skinner sometimes referred to as a "radical behaviorist?"

2. The term "operant" means that an organism _____________ on ______________.

3. The term "instrumental" means that a _________ is "instrumental" in ______________________. The term "operant," which is widely used for the same kind of learning process, means that the organism _______________ on the _____________ in some way.

4. What can a time-and-event-grid add to a behavioral analysis?

5. What is a "cumulative record" of behavior? Make up an imaginary behavior and draw a graph of it in the form of a cumulative record.
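As a study aid for the question above, here is a minimal sketch of how a cumulative record could be generated and printed as an ASCII graph. The response times are invented data, and the function name is illustrative: the point is that the count on the y-axis only rises or stays flat, so a steeper slope means a higher rate of responding.

```python
# Imaginary data: the seconds at which an organism emitted a response.
response_times = [2, 3, 5, 6, 7, 11, 12, 13, 14, 20]

def cumulative_record(times, duration):
    """Return (t, count) pairs: total responses emitted by second t."""
    record = []
    for t in range(duration + 1):
        count = sum(1 for r in times if r <= t)
        record.append((t, count))
    return record

# Print the record sideways: each '*' column height is the running total,
# so flat stretches are pauses and steep stretches are bursts of responding.
for t, count in cumulative_record(response_times, 20):
    print(f"{t:3d}s | " + "*" * count)
```

Note that the curve can never go down: once a response is counted, it stays counted, which is what makes the slope (not the height) the measure of response rate.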

6. What is a major criticism that behavioral practitioners raise against traditional clinical psychology?
What does a behavioral approach do instead?

7. What principal criticism of behavioral approaches is raised by depth psychologies such as psychoanalysis, Jungian analysis, and Gestalt therapy? What do many behavioral psychologists say about depth approaches? Is there a resolution to this point of disagreement, and if so, what is it?

REINFORCEMENT

8. It has sometimes been said that with all we know about psychology, "____________ is still the most reliable horse in the stable."

9. What is a positive reinforcer, or "goodie?"

10. What is another word for a negative reinforcer, or "baddie?"
How do the two forms of "negative reinforcement," that is, escape conditioning and avoidance conditioning, differ from the application of a negative reinforcer?

11. What advantages does an intrinsic reinforcer have over an extrinsic reinforcer, when one is available?

12. What effect do large and small rewards have on whether it becomes intrinsically reinforcing to act in a given way?

13. What does the symbol SR stand for? (The R should be raised and in smaller type, but this web-authoring tool does not support that.)

14. The important Premack Principle, discovered by David Premack, states that:

15. How can you determine what is an effective reinforcer for a particular person?

16. What kind of reinforcement schedule is most effective for initially establishing a behavior?

17. Name several other common reinforcement schedules.
Is reinforcement needed during the "maintenance" phase of a behavior, or only during the "acquisition" phase? Why?
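The schedules asked about above can be made concrete with a small simulation. This is a sketch under invented names (`fixed_ratio`, `variable_ratio`): each call to a schedule represents one response, and the return value says whether that response was reinforced.

```python
import random

def fixed_ratio(n):
    """FR-n: reinforce exactly every n-th response."""
    count = 0
    def schedule():
        nonlocal count
        count += 1
        if count == n:
            count = 0
            return True
        return False
    return schedule

def variable_ratio(n, rng=random.Random(0)):
    """VR-n: reinforce each response with probability 1/n, so on
    average every n-th response is reinforced, unpredictably."""
    def schedule():
        return rng.random() < 1.0 / n
    return schedule

fr5 = fixed_ratio(5)
rewards = [fr5() for _ in range(10)]
# On FR-5, exactly responses 5 and 10 are reinforced.
```

The unpredictability of the variable-ratio schedule, not its average payoff, is what the question about resistance to extinction later in this sheet turns on.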

18. How do past histories of reinforcement lead to "crazy" behavior in the present?
What is a "generalized reinforcer"? Give an example.

19. What is a "token" in a "token economy?"

20. What is a "menu?"

21. Secondary reinforcers come into being when: ___________________________________.

22. In everyday language, a "reinforcement schedule" refers to how much __________ elapses between reinforcements, or how ___________ reinforcements must occur to get a reward.

23. In the language of learning theory, what is "superstitious behavior?"

24. Why would someone bother to use a reversal design?

QUALITIES OF RESPONSES

25. What is the difference between an "emitted" and an "elicited" response? Which one is usually involved in operant learning?

26. "Establish functional behavior." What does this mean? What's another common way of saying the same thing?

27. Successive approximation, or "shaping": what is this and why is it important?

28. What roles do deprivation and satiation play in learning? In the performance of a learned response?

THE ANALYSIS OF CONTROLLING STIMULI

29. What is a "discriminative stimulus?" What is the difference between an S-D and an S-delta?

30. What does the term "stimulus control" mean? How can controlling cues be strengthened?

31. What is "narrowing?"

32. What is "fading?" How is it different from "shaping?"

33. What is a "chain of behaviors," or "stimulus-response chain"? Of what elements is it composed?

34. Where in a chain is it most efficient to begin in order to teach a new behavior, according to behavioral orthodoxy? What's an example of this? What's an example of when this principle does not hold?

35. Where in a chain is it most efficient to intervene in order to eliminate an unwanted behavior?

36. What conditions result in "experimental neurosis?"

37. How is experimental neurosis related to "learned helplessness?" To "self-efficacy" or a "sense of effectiveness?" To ILC and ELC?

CONTINGENCY MANAGEMENT

38. What is the value in the principle "think small?"

39. Why do we bother to take a baseline?

40. What is the "self-monitoring" effect?

41. "Start where the behavior is at." Why is this principle important, and how is it related to the principle of successive approximation? Name at least two circumstances under which it is often ignored.

42. How does "behavioral contracting" make change a reciprocal relationship, and what does a behavioral contract customarily involve?

43. "Be consistent." Why is this principle important?

44. Should a good behavioral program be designed perfectly at the beginning, like a good experiment? If so, why? If not, why not?

REDUCING THE FREQUENCY OR INTENSITY OF BEHAVIOR

45. Extinction can be defined as:

46. What kind of reinforcement schedule is most effective in producing resistance to extinction?

47. When a behavior is being established, if reinforcement is delayed too long, then:

48. What is "response cost?"

49. What is the value of reinforcement of a competing behavior that does not _____________ the opportunity to engage simultaneously in an undesired behavior?

50. What is a DRO schedule?

51. How does extinction differ from forgetting?

52. What often happens at the beginning of an extinction procedure, and how does the course of extinction tend to proceed?

53. What are two great advantages of extinction as a way of ending unwanted behavior?
a.
b.

54. Why is extinction a trickier business after a history of intermittent (partial) reinforcement than after a history of continuous reinforcement?

55. How can extinction be combined with another behavioral change technique?

56. What is an example of the use of punishment by subtraction of a positive reinforcer ("losing a goodie")?

57. What are at least three major problems with the use of punishment?

58. How is punishment sometimes used in conjunction with positive reinforcement?

59. How does the technique of "time-outs" minimize unwanted side effects?

60. What is your instructor's interesting variation on the use of time-outs, when it is possible?

61. How does lack of consistency make punishment potentially highly destructive?

 
