Study Guide

What is this?

This document is intended for everybody who is considering studying cognitive science, for current students as a reference, and for your aunt who still asks what exactly you study at every family gathering. It is a translation and explanation of the study regulations, enriched with the experience of the student mentors, the Fachschaft and dozens of students - and it may contain explicit and/or nerdy humour.

This guide is primarily targeted at Bachelor students; graduate students may also find it helpful - just skip the “Basics” chapter and read “Mastering Cognitive Science” instead.

Lesser Epic Preface

In this study guide, we will have the pleasure of trying to elucidate the entangled and hardly fathomable amalgam of ideas, thoughts and disciplines that constitutes cognitive science. We'd like to take you on a wild cross-country ride through the worlds of syntax and synapses, of compilers and consciousness; we're going to explain what Napoleon has to do with the dystopian future's killer robots, and why Germany has been the soccer world champion all along (without anybody noticing).

But before going into detail, let us give you something of a grand perspective. Depending on where you ask, you'll get varying answers to the question of what cognitive science is. Most agree that it has something to do with cognition - hence the name - that is, perception, thinking, acting, maybe language. The idea to study all that dates back to the 1970s - back then it was mainly psychology, philosophy of mind and artificial intelligence (see Epic Preface). One thought, however, unified these seemingly disparate research areas: that of different levels of explanation. This means that for a tremendously complex phenomenon such as cognition we will not find just a single cause, an isolated consciousness factory in some obscure brain lobe. If we want to understand cognition - perception, thinking and so on - we have to understand its causes and effects on all the different levels of organisation, all the way up from molecules to society and back down again.

This implies that any approach to understanding cognition is inevitably and thoroughly interdisciplinary. Here in Osnabrück, we group our lectures and courses into eight disciplines. The message here is the following: we need all those different subjects and methods - neurobiology, neuroinformatics, artificial intelligence, linguistics, cognitive psychology, philosophy of mind, and so on - to allow for a gapless chain of arguments from one end of the scale to the other. Now we can argue about how to label this scale, whether to start with molecules or genes and whether to include sociology at the top, and of course these different areas are not entirely distinct at all - we also have research groups for, for example, cognitive modelling and neurolinguistics. Here's an important thing to remember: some people may claim that psychology is just applied biology, physics is just applied mathematics, sociology is just applied quantum mechanics. Well, it's not. Throw a stone into the water and observe how the waves propagate and interfere with other waves. You will never, ever be able to explain how these waves behave by studying the structure of single water molecules. Without the molecular structure of H2O being the way it is, however, we might just be walking on a rocky planet and not seeing any waves at all.

Epic Preface

It is the year 1807. The country is in flames. Having lost the War of the Fourth Coalition against Napoleon, Prussia saw its infrastructure devastated, and reparation payments to France drove the state to the verge of bankruptcy. Here, at the oddest of places, two men prepared the ground for the advent of cognitive science: they invented bureaucracy.

What may now seem more a curse than a blessing became a hugely successful model in the nineteenth century. In the place of arbitrary rule and authoritarian corruption stepped a formalised, ordered system of running the state. With the reforms introduced by Karl Freiherr vom Stein and Karl August von Hardenberg, one department established a census and reported to another that collected the taxes, which in turn reported to the accounting department. No longer were duties bound to men, but men to duties. It did not matter who ran a department or how it was run, as long as its interactions with the other organs of the state were clearly defined and duly met. But the true value of bureaucracy was not that it transformed an incapable state into a dreadfully efficient ruling apparatus; it was that it established a mindset which brought about neuroscience. Or rather, a mindset that allowed for questions that could be answered neuroscientifically. No longer did we have to consider the brain a homogeneous lump of tissue; it now became evident to everybody that the brain, like the state, was divided into different departments - distinct areas with unique features, each serving a different function in the complex machinery that rules our thinking and behaviour. The many wars fought in the nineteenth and early twentieth centuries provided researchers with plenty of evidence for this young theory: one soldier caught a shell fragment in one area of his brain, leaving him unable to form grammatical sentences; another took a bullet in a different area, causing him to talk gibberish that was nevertheless syntactically correct.

It will, however, take another game-changing invention to realise what this preface has to do with how to study cognitive science. By the end of the fragile period of peace following World War I, the brain seemed to be fully charted; each and every lobe and sulcus had a name. The pace at which their functions were understood, however, seemed to slow down - and this was mostly due to the questions being asked. Scientists were too concerned with what the brain is and what all these different brain areas do, and could not care less about how they do it.

The breakthrough was necessitated by another war. Building on the brilliant theoretical groundwork of the young Polish mathematician Marian Rejewski in deciphering the German Enigma, it was none other than Alan Turing who built the first electromechanical device for automatically deciphering encrypted messages at Bletchley Park. Although still lacking a modern distinction between hardware and software, this work paved the way for the widespread use of computers (of course the history of computers goes a long way back, but never before had computers been used to do something actually important on such a large scale!). Computers have a very distinctive advantage over governments: you can watch them work. You can ask not only what they do, but how exactly they do it. This led to a paradigm shift in the early fifties, and an idea was born: maybe the brain, too, works like a computer! Different brain areas receive input from other areas, perform some sort of computation and send their results to, let's say, our muscles. And very quickly another wild, if not to say heretic, idea emerged: if the brain is like a computer… then a computer, in turn, could be like a brain. Sure, it's neurons, not relays. But what matters is the software, the how, not the what. This was the birth of artificial intelligence. At the same time as the mathematicians, two other branches of science took an interest in this new school of thought. One of them was psychology: in the late fifties, Herbert Simon and Allen Newell conjectured that within twenty years, all psychological theories would be given in the form of computer programs. Needless to say, the other pack that picked up the ideas were the philosophers.

This unlikely triumvirate of mathematicians, psychologists and philosophers governed the discussion deriving from the “computer metaphor” for almost two decades, until it became evident that these predictions were slightly optimistic, to say the least, and funding for what is now called “good old-fashioned AI” was cut short. But the fact that by the late seventies cognitively advanced humanoid robots had not exactly become a ubiquitous household commodity could only mean one thing: that it will take a little more than just a clever arrangement of transistors and diodes to understand the human mind. That it will take another, holistic approach to unravel the mysteries of perception and thinking. That it will take the methods and contributions of multifarious disciplines, joined only by the grand questions they are asking. That it will take cognitive science.

The most basal and yet most idiosyncratic insight driving cognitive science is that there are many different levels of explanation, and each and every level - from neurons to networks, from functions to behaviour, from individual consciousness to social interaction - rightfully deserves its own approaches, methods and terminology. And this is why so many different disciplines are involved in our cognitive science programme.