By now you have a little experience with the clicker and how it works, so here’s a bit of history and theory.
Marker-based training was first observed and recorded in the ‘Operant Conditioning’ experiments of B.F. Skinner in the 1930s. Skinner discovered that animals can learn to ‘operate’ on their environment. Every animal can become aware of, and make, behavior choices that affect its environment: a rat accidentally steps on a lever in his box and food drops in. By repeating this behavior, the rat learns that he can have food whenever he wants it. The rat connects his behavior (pressing the lever) with a rewarding outcome (food).
The neat thing about this is that the principle of Operant Conditioning (OC) applies across the animal kingdom. All animals notice which behaviors get rewarded and repeat those behaviors. A crayfish learns to tug a string, a fish swims through a hoop, a bird learns to roll over, a bunny does agility, a dog shuts the door, a cat comes when called, a horse puts clothes in the washing machine, a dolphin jumps high in the air in perfect time with her five podmates, an elephant learns to carry her human passengers with care, and a pilot learns to fly a plane. All of these behaviors are taught by rewarding the animal for the desired behavior. There is no thought of being the boss or dominant over the learner, just working with the animal to teach him what behavior you are looking for.
Every good teacher knows that all trained behaviors are voluntary and done by choice. The learner can, at any time, choose not to be operant, or may not be in a mental or emotional state that allows him to be operant, for a variety of reasons. This is useful to know: if we don’t like a behavior a horse offers or exhibits, we can look for underlying reasons why the horse is not doing what was asked and is doing something else instead. It also opens up the possibility of retraining the behavior, or training a different behavior in its place if the old one wasn’t working for the teacher.