#1: Driscoll, M. (2005). Psychology of Learning for Instruction (3rd ed.) (pp. 77-91). Boston, MA: Allyn and Bacon. Driscoll (2005) discusses sensory memory and working memory in this portion of the chapter. The discussion begins with sensory memory as a short-term store, owing to the decay of visually presented information. This decay occurs not because the information cannot register in memory, but because it cannot be processed quickly enough to be retained. One's ability to focus, or pay attention to, the material presented affects the ability to process information into memory. Selective attention (p. 79) is the ability to pick out and process certain information while ignoring other surrounding information. Task complexity also influences one's ability to process new information. Rote tasks (those done repeatedly and stored in long-term memory) can be performed easily while taking in new information, while unfamiliar tasks cannot compete with the introduction of complex information. Stimuli can also affect how quickly one stores new information. Pattern recognition (p. 82) suggests that cues from material already stored in memory can aid in making connections and thereby affect processing speed. Working memory houses information selected for further processing. Even so, there are limitations to the amount of information that can be stored here as well. Driscoll discusses a classic study of 7±2 recall. In that study, seven items emerged as the typical memory span, and the discussion goes on to ponder whether there is a magical quality to the number seven, given "the seven wonders of the world," "the seven deadly sins," and so on. Though this might be a bit far-fetched, seven has proven to be the memory span for a variety of information. Rehearsal and recall, as already stated, benefit the processing of items into memory, whether chunked by seven or not.

#2: Baddeley, A.D. (1992). Working memory. Science, 255, 556-559.
In this paper, Baddeley discusses the study of working memory and some of the history that separated it from short-term memory. Working memory is the part of the brain system that allows for processing, storage, and retrieval of information; it supports complex cognitive tasks such as language comprehension, learning, and reasoning (p. 556). Because processing and storage occur simultaneously, working memory is divided into three subcomponents: the central executive, the visuospatial sketch pad, and the phonological loop. The latter two are considered slave systems. In research on working memory, individuals with short-term memory deficits were a priority. Since such individuals are not numerous (or were not at the time), a variety of tasks, both complex and repetitive, were chosen for individuals to perform in order to test theories of memory chunking, processing, and storage. Researchers used a digit-span procedure, dependent on the short-term store, to determine when interference with storage would occur. They noted that as the digit load increased, the capacity to hold information decreased and interference with processing increased. The researchers found no correlation between short-term memory disturbance and long-term memory recall, which further supported separating working memory from the short-term memory system. Ultimately, in studies with the young, the elderly, and patients with disorders such as Alzheimer's disease, participants showed a range of abilities in remembering chunks of information while completing tasks. Findings for future research show that the central executive is concerned with attentional control of behavior; these effects are seen in patients with Alzheimer's disease, while the slave systems are shown to be a definite part of active memory, specifically in speech perception and production.

#3: Miller, G. A. (1956). The magical number seven, plus or minus two: Some limits on our capacity for processing information. Psychological Review, 63(2), 81. In this paper, Miller (1956) begins by stating that the "amount of information" is equivalent to the concept of "variance." The author explains that the reason to discuss amount of information rather than variance is that the former is a dimensionless quantity. This allows the comparison of results without fixed units of measurement (such as inches), which is more meaningful across experiments with a variety of result types. The author's research ultimately concerns cognitive ability, that is, the capacity to process information, and its limit, the channel capacity. The channel capacity is the maximum amount of information an individual can reliably convey in responses to presented stimuli. The author describes a Venn diagram (of sorts) in which the left ring is the amount of input information (stimuli), the right ring is the amount of output information (responses), and the overlap in the middle is the amount of transmitted information (the stimulus-response correlation). The individual is expected to make more errors as the input information increases. One bit of information is what one needs to decide between two equally likely alternatives; given more alternatives, more bits are needed. This idea can also be expressed as a rate per unit of time, in which case more alternatives increase the time needed to make a decision. Miller uses the first formulation, bits of information without a time limit, for his absolute judgment experiment. In the experiment, individuals are given as much time as needed, but the number of alternative stimuli is increased to determine where "confusion" begins. Miller states that confusion appears near channel capacity, where the individual makes more and more errors while attempting to process the information.
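Miller's point that one bit decides between two equally likely alternatives corresponds to the standard formula bits = log2(N). A minimal sketch of that arithmetic (the function name is mine, not Miller's):

```python
import math

def bits_needed(alternatives):
    """Information required to choose among equally likely alternatives."""
    return math.log2(alternatives)

# Doubling the number of alternatives adds exactly one bit.
for n in (2, 4, 8, 16):
    print(n, "alternatives ->", bits_needed(n), "bits")
```

Read the other way, a channel capacity of c bits corresponds to distinguishing about 2^c alternatives, which is why confusion appears once the stimulus set grows past that point.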
#4: Kalyuga, S. (2010). Schema acquisition and sources of cognitive load. In J.L. Plass, R. Moreno, & R. Brünken (Eds.), Cognitive Load Theory (pp. 48-64). New York: Cambridge. Kalyuga begins with a segue from the previous chapter, which outlined the human cognitive architecture (HCA) (p. 48). Based on the HCA, the critical factor influencing long-term memory and the way we learn new information is the schematic knowledge base. Noted in the discussion is the fact that individuals must have relevant experiences stored in long-term memory for a particular situation, lest they resort to a random search process in an attempt to find related scenarios. The author presents, in this chapter, cognitive load theory (CLT). CLT suggests principles, formulated from the cognitive load framework discussed above, for efficient instruction. The principles focus on acquisition of an organized knowledge base and include the following: direct instruction, expertise, and small step-size of change. As part of this process, schemas are developed. Schemas relate new bits of information to what an individual currently has stored in long-term memory. Instructional methods need to be designed and tuned to the learner's current knowledge base. This allows learners to build upon current concept knowledge, relate it to expanded knowledge, and process information more quickly. The direct instruction principle suggests providing learners with just enough information that their cognitive capacity is not overloaded and they can still process the material. Coinciding with this principle, but extending to all learners, the expertise principle suggests reducing extraneous cognitive load by trimming extra material from the instruction (for example, a learning module). Small step-size of change suggests introducing information in small increments, such as a learning module broken into even smaller knowledge sections.
These principles serve both the novice and the expert learner when used appropriately and when learner knowledge is assessed before instruction.

#5: Paas, F., Renkl, A., & Sweller, J. (2003). Cognitive load theory and instructional design: Recent developments. Educational Psychologist, 38(1), 1-4. I selected this article because I wanted to learn more about cognitive load theory, working memory, and the principles generated from this research. I am interested specifically in the small step-size principle presented by Kalyuga (2010), but I also want to review other research on chunking information for learners and instruction.
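The chunking idea that runs through these entries, from Driscoll's 7±2 discussion to the interest in chunked instruction noted above, can be illustrated with a short sketch. This is a hypothetical example of my own, not drawn from any of the sources:

```python
def chunk(items, size):
    """Group a flat sequence into chunks of at most `size` items."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# A 10-digit string exceeds the ~7-item span as single digits,
# but regrouped into three chunks it fits comfortably.
digits = "5551234567"
chunks = chunk(digits, 4)  # ['5551', '2345', '67']
print(len(digits), "items as digits ->", len(chunks), "items as chunks")
```

Grouping ten digits into chunks of four reduces the number of items held in working memory from ten to three, well within the 7±2 span; the same regrouping is what a learning module broken into small knowledge sections does for instructional content.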