Science – Did you learn anything?
https://www.didyoulearnanything.net
An archived blog about education, language, peace, and other fine things

Thoughts about: knowledge, science, culture, and reality
https://www.didyoulearnanything.net/blog/2012/04/28/thoughts-about-knowledge-science-culture-and-reality/
Sat, 28 Apr 2012
[Image: First contents page of A Guide to the Scientif...]

Human beings are obsessed with knowledge. We instinctively believe there are facts about the world which are true, which can be known, and which explain our experience of reality. But real knowledge – thoughts about reality which are true – is incredibly elusive. Human beings aren’t very good at dealing with this.

This post is quite a long and mainly philosophical one, and part of a thought in progress.

I Intuitive knowledge

The earliest forms of knowledge, it’s safe to assume, were the result of intuitive mental “theory-building” combined with instinct. We experience things and intuitively create “theories” to explain them. When we experience something that clearly contradicts our theory so far, we either amend the theory to fit the new information, or (more often) ignore what we experienced and try to forget it as soon as we can.

We do all of this automatically, without any conscious decisions taking place.[1]

This intuitive knowledge is generally very useful for dealing with things that don’t change too quickly, but it is extremely rigid and doesn’t easily adapt to new situations.

It’s also strongly influenced by other forms of knowledge, which I imagine developed later in human (pre-)history.

II Introspective knowledge

The next form of knowledge might have been the result of conscious introspection: you experience something new, and it clashes so terrifically with what you know about the world that you find yourself trying to consciously figure out how to explain it.

This gives you more flexibility: you can employ the human knack for metaphor (“this thing is kind of like that other thing”) and create thoughts that are more complex and abstract. Unlike intuitive knowledge, it involves our conscious thought and we can make conscious decisions about what to believe.

Unlike intuitive knowledge, introspective knowledge is free from the restrictions of direct experience. This enables us to reach better, more general explanations, but also means a lot of our “theories” can be completely wrong and still maintain a hold on us; the more abstract a theory, the more difficult it is to encounter direct evidence against it.

III Cultural knowledge

I imagine that introspective knowledge gave rise to what I consider the most powerful form of knowledge in human history: cultural knowledge.

Cultural knowledge includes everything people “know” because it’s what everyone around them “knows”. One example is prejudice against homosexual activity – which became common in the West only a few centuries ago but many people seem to consider “natural”. Primitive religions are another good example, but there are many, many others.

Imagine you’re part of an ancient tribe. The oldest woman in the tribe, known and respected for her experience and wisdom in all practical things, tells you and your family “a very important secret”: there is a deity – a being with powers unlike your own – which controls the growing of the wheat that sustains the tribe.

Like people, sometimes the deity is happy, and sometimes it is angry. But when the sun is shining and the deity is happy, the tribe’s crops bloom; and when the weather is bad and the deity is angry, the wheat fails, and the weak ones starve.

Because you don’t have any better knowledge to work with, and because your life truly depends on understanding how your source of nourishment works, you probably believe this story.

What happens next is that you tell your children about this deity, and they tell their children more or less the same story, and very quickly, your whole community comes to “know” that this is how the world works. (As the Dothraki often tell a baffled Daenerys Targaryen in A Song of Ice and Fire, “it is known.”)

IV The power of conformity

Cultural knowledge arises from introspective thought and we gain it through conscious social experience, especially in our childhood, though we usually forget where it came from within a few years.

Although it spreads in a conscious way, passed from one person to another using language, it becomes anchored in far deeper unconscious processes.

If you refuse to accept the social group’s beliefs, you may find yourself quite alone. Most people will not be open to talking about such things when you refuse to see “the obvious truth”.

You may have a feeling that what you’ve been told is nonsense, but that doesn’t mean you have a better theory to offer spontaneously. The easy way out is to accept what everyone around you knows to be true, or if you don’t accept it, to keep quiet; this saves you and others the torment of being a skeptic amongst conformists.

Because of the immense power of cultural knowledge, I’m convinced this is the most common form of knowledge. It doesn’t require any effort from any individual, only the repetition of information, and conformity. This means almost everything we know is probably cultural knowledge; it might not all be quite as false as the belief about the wheat god, but it could carry all kinds of falsities that we never notice or question because we’re busy with other things.

V Scientific knowledge

I think of intuitive, introspective, and cultural knowledge as the basic forms, but there are certainly others. One I’d especially like to consider: scientific knowledge.

In the terms I used above, scientific knowledge is a hybrid form. It mostly rejects intuitive knowledge; it is based on introspective knowledge, but makes use of the tool that gives cultural knowledge its power – social sharing of information.

Scientific knowledge is, to me, a form of cultural knowledge which embraces and encourages the skeptics. The magnitude of that innovation can hardly be overstated: it may well be one of the biggest steps taken by humankind since the evolution of complex (syntactic) language.

VI The meaning of science

Doing science, being involved in the creation of scientific knowledge, means making the effort to introspectively consider and challenge all intuitive and cultural beliefs. But it also means sharing your skeptical thoughts, in an effort to socially form a theory that is more useful than any other theory so far.

A lot of people think that scientific knowledge is even more like ordinary cultural knowledge than it is: that what scientists think about their area of investigation is merely an opinion; that scientific knowledge is just dogma; that any experienced scientist in some field can tell you the truth about anything in that field.

People who think these things are mostly wrong.

While scientific communities are susceptible to dogma and orthodoxy, the scientific method and the scientific culture are brilliantly designed to encourage serious skepticism regarding established knowledge. As a result, scientific orthodoxies never last more than a few decades in fields where the scientific method is functional and research is active.

Individual scientists do have opinions, more than knowledge. But scientific opinion is based on relatively serious investigation of relatively well-defined issues – unlike everyday opinions, which are based on cultural knowledge and relate to vague intuitive questions about reality.

But scientists do not know reality, and discovering reality is not the goal of scientific theory. The goal is to provide a better explanation than any other explanation so far, which means that an even better explanation should be just around the corner (that is, probably less than 20 years away).

Because of this, it’s probably safe to assume we’re slowly getting closer to true knowledge of reality, but there’s no reason to believe we’ll ever reach it.

A scientifically literate individual informed about the current state of some field can tell you about the most realistic understanding available for that field’s set of phenomena. But if that individual is intellectually honest, they will not call this "the truth". Just an approximation, the closest match so far to the information available.

VII Convergence and confusion

Over the past century or two, humanity has been experiencing a very odd transformation. We are, as always, obsessed with knowledge; but we have developed scientific knowledge enough that some results – gravity, atoms and molecules, bacteria and viruses, et cetera – are understood so well that we can really make use of them in our day-to-day life. At the very least, we can use technology which is based on them, and we do, every day.

What this leads to is the integration of scientific knowledge into our cultural knowledge. When we culturally know something, we think of it as "true reality" – even if the original source is skeptical, provisional, scientific.

Once a belief becomes cultural knowledge, it doesn’t change nearly as easily as actual scientific knowledge.

Unlike intuitive and cultural knowledge, scientific knowledge is not created to help us deal with the world around us; science is about understanding for the sake of understanding.

So we end up with cultural knowledge that we have a hard time shaking off, but which also isn’t very useful in our day-to-day lives. Even worse, it was never meant to be "the truth", and it’s probably a decade old – or five, or ten[2] – and we intuitively think of it as "the truth" because nobody in our day-to-day interactions dares question it. Trying to find out what current theory says doesn’t help, because current theory is less mature and useful, and still needs a lot of testing before it can be relied on.

It’s important to understand that this applies to every single human being. It also applies to all scientists, for everything outside of the fields they’ve seriously studied.

Sometimes, cultural biases and orthodoxy can even creep in on a scientist’s home turf and “contaminate” research – but the scientific method will clean it up within a few years if that research becomes well-known.

VIII Practical questions

What should we do about all of this? I really don’t know. It’s not my field, and I doubt there’s a definitive answer – or a close approximation – in any field, anyway. (Philosophy is even worse than science at reaching definitive answers.)

What I personally try to do is simply to identify my beliefs about the world, and try to identify where they come from and how important it is to me to hang on to them.

I try to keep in mind that “knowledge” is an illusion; that the most reliable kind of “knowledge” is the intuitive kind, which is, however, also the least accurate and the least flexible.

I try to keep in mind that anything I think I “know” could be a cultural bias. I try to remember that this applies to scientists, too, and that even the best theory is just a reliable approximation so far, and could change deeply within my lifetime.

Most of all, I try to recognize my biases, and often have to decide whether I can live with them or have to reconsider what I “know”.

What I try to never do is defend a bias simply because I happen to hold it. If I realize I don’t know why I think some thought, that thought is suspicious and I keep it to myself until I’ve looked into it again more seriously.

I can’t say any of this is fun or easy – or even that I go through with it as much as I’d like to – but considering my (introspective) beliefs about knowledge, I’d (intuitively) be uncomfortable with doing it any other way.

If you’ve made it this far in this unusually long post, maybe you’d like to share some of your thoughts/beliefs/"knowledge". Did I miss any important form of knowledge? Is there any better way to deal with the weaknesses of the forms I discussed? Is there any point you’d like me to say more about?

 

References

The subject of this piece is one I’ve enjoyed thinking about for years, and I’m not sure where I stole my ideas from. A few sources that were probably influential, in no particular order, are:

  • Daniel Greenberg, Worlds in Creation. Sudbury Valley School Press
  • Paul Graham, What You Can’t Say. Essay on paulgraham.com
  • Noam Chomsky, New Horizons in the Study of Language and Mind. Cambridge University Press
  • David W. Lightfoot’s introduction in Noam Chomsky, Syntactic Structures. Mouton De Gruyter
  • James A. Michener, The Source: A Novel. Random House (see my review on this blog)

Footnotes

  1. What exactly “conscious” and “decision” are would be the topic for another long post, or maybe a whole new blog.
  2. For example, I heard that what is considered “normal body temperature” is the result of findings from one small study over a hundred years ago, and not considered accurate anymore.
Read my thesis online
https://www.didyoulearnanything.net/blog/2012/04/05/read-my-thesis-online/
Thu, 05 Apr 2012

Just a heads-up for the grammar theorists and curious laypeople in the crowd: my BA thesis is online and freely available (in a lightly edited form). You can get it on LingBuzz (1492).

See also the post where I try to make my thesis comprehensible to non-theorists using Wikipedia links.

New blog discovered: the “because” charade
https://www.didyoulearnanything.net/blog/2012/03/27/new-blog-discovered-the-because-charade/
Tue, 27 Mar 2012

I was recently delighted to discover that Daniel Harbour, one of the linguistic theorists I’ve most enjoyed reading, has a blog – about language and also other interesting topics. It’s called the “because” charade, and here’s how he explains that curious name:

My blog is called the “because” charade because what follows the word because (in a lot of discussion of science, ethics, politics, religion, …) is rarely a reason, or reasonable, or rational. And I believe that we’d all be better off if reason(ableness) played a bigger part in public life.

Recent topics have included the Pirahã controversy – an important linguistic debate, which he explains in terms a layman can understand – and the theory of evolution. A pleasure to read!

Semi-electives: a university paradox
https://www.didyoulearnanything.net/blog/2012/01/10/semi-electives-a-university-paradox/
Mon, 09 Jan 2012

For the BA degree in linguistics, my classmates and I are required to choose some courses from outside of the core linguistics curriculum. This is, in theory, a good thing – it gives undergraduate students a chance to see what’s going on in other departments, and particularly gets us acquainted with some fields related to our own. However, these semi-electives are simply the introductory modules that students in other programs take in their first semesters; this can cause a lot of frustration.

Over the past few days, I have spent several frustrating hours doing homework in such a course. I remember seeing what must have been the same frustration in students from outside of linguistics in the introductory courses I’ve taken and the one in which I tutored. I think this frustration is an indirect result of the Bologna Process, which creates a basis on which courses from different departments, universities, and countries across Europe are evaluated for accreditation. The problem, I think, is that it’s very hard to evaluate a course, and the effort that goes into it, outside of context.

Understandably, when designing courses, faculty is focussed mostly on training the next generation of scholars in their field. A certain number of students are accepted for each course from outside the field (let’s call them “outsiders”), but they are almost always evaluated in the same way as students from within the field (“insiders”) and, as a result, are supposed to do the same coursework. A part of preparing a future generation of scholars – at least as the Institute for Linguistics and some others seem to view things – is to present beginners with a large amount of hard work so that they can either quickly jump in, or figure out that they chose the wrong field and switch (or leave altogether). However, the motivations, abilities, and interests of outsiders are very different from those of insiders.

In my first semesters, I was in the process of falling in love with linguistics, and this meant I was eager to understand course material and to acquire any new skills helpful for coursework, even when this was difficult. As such, it didn’t terribly bother me that the linguistics modules were tough, or that they required a lot of homework and self-study. I was trying to enter this new world of thoughts, terminology, and ideas, so I wasn’t irked by the fact that I was required to do so. The module that’s frustrating me right now is supposedly a very small one, composed of just one course, in a field I’ve always had some familiarity with and which I find interesting, but which I’ve never been deeply into, nor have any intention of making my professional home. The homework is gruelling, even though I only have to do it every other week, and every single time I find myself kind of furious about it. Yes, I chose this module, but out of a rather narrow set of alternatives, and I have to complete it in order to earn my degree. It may be cast as a choice, but it’s really a requirement.

As I hinted above, I think the problem is a mismatch between the goals and motivations involved in creating the course and those of (some) individuals taking them. When I take an introductory module in linguistics, I am doing so as part of a bigger commitment I’ve made to the field as a whole. I know that if I find the field isn’t right for me after all, I can start an entirely different degree, but I’m willing to accept some parts along the way that I’m not crazy about, since I’m committed to the whole. It also helps that I’m surrounded by a group of people in the same situation. Now, when I’m taking an introductory module outside my field, I naturally approach it in a very different way. The little part is the whole. I’m probably interested in some aspects of the material, but I’m there basically because I need the ECTS points. I’m looking for the interesting things, but the nature of introductory courses dictates that much of what you learn is merely scaffolding for later courses, where the real fun comes. That scaffolding, which could be exciting if I planned to build on it, becomes a terrible chore when I have no reason to expect to ever use it again.[1] As a result, the whole experience becomes one of jumping through hoops, often taking shortcuts, if for no other reason than because there are so many other things I am more interested in doing with my time. And to make things worse, I have no idea which of the many people taking the module are in the same situation, and who is there for the long haul.

All of this would be okay if, say, I merely had to attend the course, with the option of doing homework and taking the exam if I want to get feedback. But module credits are awarded for completing tests, usually written exams.[2] And in this case, the lecturer only lets you take the exam if you got at least 50% of the points for homework assignments throughout the semester. But the course is not designed for us outsiders – it’s designed for the insiders, who have made a long-term commitment to the field, have a reason to try hard to get good at it, and have a peer group to help them out. The difficulty of assignments and exams is calibrated for them, not for us. As a result, the semi-elective often becomes the most taxing and frustrating module of the semester, even though you “merely” have to pass.

I’m not really sure what could be done about this. I don’t think it would make sense to ask lecturers to go well out of their way to accommodate the small group of outsiders. I do think it’s good that undergraduates get a peek into other disciplines, but I’m not sure that it should be a degree requirement. And as long as it’s a degree requirement, it is understandable that the university wants to make sure people actually take the courses, hence the exams etc. It’s not clear that there’s any real way out of this situation.[3]

If anyone has any perspective to add, please leave a comment.

Footnotes

  1. This expectation may be wrong – you never know where things could come in handy – but it seems, at the very least, highly unlikely that I’ll ever need it again; I can’t help but see it as a chore, rather than a means to an end.
  2. I’ve written before about why exams are bad.
  3. This kind of problem is, of course, only a problem in institutions which do not fundamentally trust students to take responsibility for their own education. I believe, as with school-level education, that this is not a good design feature for an educational institution. But it would be a mistake to think that universities are mainly educational institutions. Their primary social function is rather accreditation – giving people a stamp of approval so others will allow them into some prestigious jobs and social functions. They educate only as much as they can get away with, unfortunately. And so we are left with the clash of the wish to create some inter-disciplinary cross-pollination, the need to rigorously introduce newbies into your field, and the need of the system not to give away accreditation too easily.
Slides: my BA thesis analysis
https://www.didyoulearnanything.net/blog/2011/12/10/slides-my-ba-thesis-analysis/
Sat, 10 Dec 2011

As you may have noticed (at least one reader did!) I haven’t been posting lately. The main reason for this is that I’ve been exceedingly busy writing my BA thesis. Yesterday, I presented my analysis to the grammar theory colloquium at the Institute for Linguistics. The curious amongst you can take a look at the presentation slides (PDF), which I’ve even edited slightly to correct mistakes I noticed during the presentation.

Be warned: without being familiar with modern linguistic theory you probably won’t find most of this stuff interesting, or even intelligible. I might find the time some time soon to write a post that explains what I’m working on without requiring prior knowledge, but since that technically doesn’t count as working on my thesis, it may have to wait until after the deadline (December 20th).

If you can’t wait, you can use the Internet to educate yourself on some of the background. I’ve collected a few Wikipedia links that may be helpful, but you may well have to conduct some independent research as well:

Needless to say, this rabbit-hole goes very, very deep. Have fun!
Hobby and Career, Academia and Activism
https://www.didyoulearnanything.net/blog/2010/11/29/hobby-and-career-academia-and-activism/
Mon, 29 Nov 2010

For a while now I have been very conflicted about what I want to do after my BA. The two main options on my mind have been on the one hand to (somehow) become a full-time activist for democratic education (or perhaps for human rights), possibly along with some translation and writing to make ends meet; on the other hand, I could continue with my studies and move towards an academic career in linguistics.

For a very long time I’ve wanted to be an academic, but when I decided to start studying it was important for me not to think too far ahead and take things one at a time. I wanted to stay open to other options, some of which, I knew, could not have even occurred to me at the time. As the degree gets closer and closer I know I have to at least decide what the next step will be. There have been times when it was clear to me that a BA was not enough, that I’d need at least an MA to satisfy my curiosity. At other times (in particular when I get annoyed at the university’s structure) I’ve wished to just be done with it as soon as possible and go do something else.

What makes the whole thing more difficult is that I find both fields absolutely fascinating, and both engage me in a way that makes use of my skills. Activism stands out to me as a particularly worthy way of spending one’s time, because activism means working for the greater good (or one’s vision thereof) and would have a clear goal. The goal in linguistics is less clear to me, and I know that the best one can do is create, or help improve, a model that is useful for understanding the phenomena of language — hoping to achieve total understanding would only be a recipe for disappointment. On the other hand, I’ve been thinking and speaking about democratic education since I was thirteen, and I don’t think it’s much good to advocate it as a graduate who hasn’t spent much of their adult life outside the movement.

In the last few days I’ve been thinking a lot about one way of seeing things, a way that had occurred to me when I started to study but which I had somehow forgotten about in the meantime. The idea is essentially to make a hobby into a career, and to work on something you believe in in your free time. In my case, the hobby-career would be linguistics — a pursuit that is valuable to me simply because it’s fascinating and fun. I could be an activist in my free time, as time allows.

I’m far from done figuring this out, but this approach seems like a good one. Going into a career without any lofty expectations would allow me to spend time on something challenging and enjoyable, while pursuing more lofty goals in my free time would let me continue being part of something I consider really important, something that seems to make a real difference in people’s lives (which, outside of academia, linguistics rarely does).

I’m writing this just because it is on my mind and I feel like writing. I should actually be doing my computer science homework. I’d appreciate thoughts on all this, especially if they come quickly enough to distract me from my homework!

A rant about degree requirements
https://www.didyoulearnanything.net/blog/2010/11/23/a-rant-about-degree-requirements/
Tue, 23 Nov 2010
[Image: University of Leipzig, in 2009 partly occupied... (via Wikipedia)]

Lately I’ve been having a very hard time accepting the structure of the university program I am in. It’s hard to put my finger on it, but I certainly have not been happy with the requirements this semester.

Over the four semesters I have completed so far, I mostly took courses in linguistics. They were not all exactly my cup of tea (only about half of them were), and I don’t have to repeat what I think about being tested at the end of every semester, but for the most part I was happy to jump through the hoops, knowing it was advancing my understanding of the discipline and domain of research that I had chosen. My fascination with linguistics and language grew over time as I learned more, understood more, and appreciated new ways of approaching the subject matter. I could accept the expectation that all of us learn a little of all of it, even the approaches we are not interested in pursuing.

However, the way this BA program is designed is a little strange. After the 4th semester, there are no linguistics courses anymore. For the last year of our studies — the year in which we are expected to write our BA thesis in linguistics, mind you — the plan is to take courses from the three different lists of more-or-less elective courses. In total, the program requires 180 ECTS credits throughout the six semesters of study, corresponding to the unrealistic total of 900 hours of class and self-study time per semester. (Hardly anyone at the university, student or instructor, takes this requirement seriously. It seems like something the Bologna process dictates and the universities do their best to fulfill, mostly on paper.)

90 credits — half of the program — are to be obtained in linguistics courses, including 10 credits for the thesis. The other half is composed of:

  • 30 credits: courses you get by lottery (from your first few choices) from other departments, university-wide, usually limited to introductory offerings
  • 30 credits worth of courses from an “obligatory electives” list, which lets you choose from exactly 70 credits worth of courses from other departments — introductory computer science (20), inter-cultural communication for Russian (10), philosophy of language (10), the languages of Africa (10), the system and history of German (10), or basic Hausa (10)
  • 30 credits worth of “key qualifications” courses, being a strange mixed bag of courses offered by different parts of the university on a basis which is not quite interdisciplinary as much as it is simply unrelated to any of the disciplines of those who might take the courses. Luckily, 10 of these credits have to be taken in a language course and the other 20 can be semi-officially replaced by language courses.
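To make the arithmetic behind those numbers concrete, here is a minimal sketch (my own illustration, not taken from any official program document), assuming the usual ECTS convention of roughly 25–30 hours of workload per credit point:

```python
# Illustrative only: reproduces the credit arithmetic described above,
# assuming ~30 hours of workload per ECTS credit (the convention allows 25-30).
CREDITS_TOTAL = 180
SEMESTERS = 6
HOURS_PER_CREDIT = 30

# The four blocks described in the post.
linguistics = 90            # includes 10 credits for the thesis
lottery_courses = 30        # assigned by lottery from other departments
obligatory_electives = 30   # chosen from a 70-credit list
key_qualifications = 30     # mostly replaceable by language courses

assert linguistics + lottery_courses + obligatory_electives + key_qualifications == CREDITS_TOTAL

credits_per_semester = CREDITS_TOTAL / SEMESTERS
hours_per_semester = credits_per_semester * HOURS_PER_CREDIT
print(f"{credits_per_semester:.0f} credits per semester")                   # 30
print(f"{hours_per_semester:.0f} hours of nominal workload per semester")   # 900
```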

I have a feeling this is a case of good intentions gone amiss. There is apparently a social norm of going straight from high school into university if you were in the academic "gymnasium" high-school system — which you are selected for at the age of 10. As a result, most beginning students have no clue what they’re getting into. So it’s probably doing many students a favor to force them to get a taste of other disciplines before giving them a degree, and indeed the majority changes to another program, or quits, by the second year of studies. But perhaps it’s just cruel, seeing as those of us without wealthy parents have two semesters of grace in which to switch majors, after which financial aid is no longer available.

But I digress. The point is that the structure of this program — not the content — is crushing my interest and desire to complete it. I can’t emphasize enough how this is not a matter of content. I feel like those 80 credits worth of linguistics courses both gave me an excellent, broad understanding of the discipline (and sub-disciplines) of linguistics and gave me a chance to develop real interest in research.

The problem is that the structure of the program makes it entirely impractical to continue pursuing that interest. It’s not just that I have to take some other courses. It’s that a student like me, who is engaged in extra-curricular activity and dependent on financial support, can’t realistically do much besides the required work.

Right now I feel trapped. I am working as a tutor in the introduction to linguistics, and as a research assistant in a language documentation project. I decided to take these jobs both in order to stay involved in linguistics and to work towards more financial independence. I’m very glad I made that choice, and I think it is entirely in line with what engaged and serious students are supposed to do (faculty seems to agree entirely). Yet with all of the time and effort my work requires, I’m struggling to keep up with the computer science coursework, and just desperate to devote more time to reading linguistics literature and perhaps work on some research of my own. (With theoretical grammar as my primary interest, research is thankfully something I can do without any special equipment.)

It makes me furious that in order to receive my BA in linguistics, I am expected to now more or less put my interest in linguistics aside and focus on hoop-jumping.

A note

As some of you may know, as of last Thursday I’m taking time off from my work on EUDEC Council, until the end of 2010. This makes some much-needed room in my schedule for dealing with these requirements. Hopefully it will also give me more time to blog.

I do not expect to make a habit of personal, emotional posts like this one, lacking a clear and general point. It’s just something I had to write about today. At any rate, comments are open and I’d love to hear some of your thoughts on all of this.

NYT on Study Habits: three comments
https://www.didyoulearnanything.net/blog/2010/09/08/nyt-on-study-habits-three-comments/
Wed, 08 Sep 2010

This New York Times article came to my attention via Facebook (Thanks, H.B.!):

Forget What You Know About Good Study Habits

By Benedict Carey

Full article on NYTimes.com

As you can imagine, I read the whole thing on the spot. I fully recommend the entire article, and it’s not long.

I’d like to comment on a few things in the article. I’ll quote them in the order they appear:

Science and the school system

“We have known these principles for some time, and it’s intriguing that schools don’t pick them up, or that people don’t learn them by trial and error,” said Robert A. Bjork, a psychologist at the University of California, Los Angeles. “Instead, we walk around with all sorts of unexamined beliefs about what works that are mistaken.”

Take the notion that children have specific learning styles, […] In a recent review of the relevant research, published in the journal Psychological Science in the Public Interest, a team of psychologists found almost zero support for such ideas. “The contrast between the enormous popularity of the learning-styles approach within education and the lack of credible evidence for its utility is, in our opinion, striking and disturbing,” the researchers concluded.

[…]

[…] many study skills courses insist that students find a specific place, a study room or a quiet corner of the library, to take their work. The research finds just the opposite. In one classic 1978 experiment, psychologists found that college students who studied a list of 40 vocabulary words in two different rooms — one windowless and cluttered, the other modern, with a view on a courtyard — did far better on a test than students who studied the words twice, in the same room. Later studies have confirmed the finding, for a variety of topics.

These paragraphs (emphasis mine) show a recurring theme of the article: the school system has not been learning from science. This is, indeed, “striking and disturbing”. But I can’t say I’m surprised. In my encounters with “education sciences” in Germany, I have to say I did not get the impression that they are very scientific. As my friend Sören Kirchner of tologo often remarks, they seem to be more in the business of reinforcing a philosophy than that of empirical science. Because “education sciences” are wedded to the traditional school system (in Germany at least, the educational sciences faculties are where accredited teacher training takes place) they typically seem rather unmotivated to produce true criticism of the beliefs that drive the traditional system. The truly critical — and I am glad to know a few such people in the faculties of a few German universities — are the exception, not the rule.

The traditional school system has stuck to the same basic paradigm since it was conceived during the Industrial Revolution. Society is deeply invested in that paradigm: the vast majority of us have been through that system, and rejecting the validity of its assumptions about learning means rejecting the validity of how we spent many (unpleasant) hours in childhood. Making that kind of concession is not easy. At this point, improving education is a matter of revolution, not evolution.

Context, relevance, context!

“What we think is happening here is that, when the outside context is varied, the information is enriched, and this slows down forgetting,” said Dr. Bjork, the senior author of the two-room experiment.

Varying the type of material studied in a single sitting — alternating, for example, among vocabulary, reading and speaking in a new language — seems to leave a deeper impression on the brain than does concentrating on just one skill at a time.

I will have to remember this next time someone tells me that students in a democratic school won’t learn anything properly because they aren’t forced to stick to a topic for 45 minutes in a static context. This research strongly suggests that the constantly changing, dynamic atmosphere of democratic schools is a terrific boon for learning. This seems right in line with the thought that having a relevant context is crucial for learning: when you learn something because it is interesting and relevant to you at that moment, you learn it better. Classrooms have a hard time providing that kind of context. A school where students explore things freely allows that relevance to happen all the time.

Exams, revisited?

[…] cognitive scientists see testing itself — or practice tests and quizzes — as a powerful tool of learning, rather than merely assessment. The process of retrieving an idea is not like pulling a book from a shelf; it seems to fundamentally alter the way the information is subsequently stored, making it far more accessible in the future.

Tests are a learning tool? I guess you really do learn something new every day! I wrote a little about testing in July, and indeed as most people do, I treated testing as mere assessment. I stand happily corrected.

In one of his own experiments, Dr. Roediger and Jeffrey Karpicke, also of Washington University, had college students study science passages from a reading comprehension test, in short study periods. When students studied the same material twice, in back-to-back sessions, they did very well on a test given immediately afterward, then began to forget the material.

But if they studied the passage just once and did a practice test in the second session, they did very well on one test two days later, and another given a week later.

It’s good to know that testing can actually help you learn things, but if this is its usefulness, that is not reflected in the way tests are treated in schools and universities. I can only emphasize what I’ve said before: the importance of exam grades must be abolished. Then perhaps tests can be useful. Making test grades important only encourages the kind of learning that gets forgotten.


A Tirade Against Exams
https://www.didyoulearnanything.net/blog/2010/07/11/a-tirade-against-exams/
Sun, 11 Jul 2010

[Image: I don't like exams]

Summer break has just begun. I managed to get away without any exams this semester, for the first time. Over the past weeks, as at the end of every semester, I have found myself thinking what an awful, ridiculous system these exams really are, especially in university. I’d like to try and articulate why.

I can imagine a university where exams are hardly even relevant because people only study things they find interesting, and only so long as they are interested. Such places exist (take Tokyo Shure for example).

However, most officially-recognized undergrad programs are still based on instructors providing students with pre-packaged chunks of information, and then judging whether each student has properly digested the information. This post will be about exams in that context; my point of reference will be the linguistics BA program at the University of Leipzig. As far as I know, it’s as good an example as any of a normal undergrad program in science.

Exams are bad experiments

So why are exams a bad idea when you want to check whether a bunch of science undergrads understood what you taught them? Well, one part of the problem should be obvious to anyone with even a rudimentary understanding of science: exams are not very good experiments. There is no way to control for interference of irrelevant, extraneous factors. When scientists conduct a study, in any field and with any methodology, they seek to control for irrelevant interferences. For example, when psychologists test hand-eye coordination, they’ll do something like only taking right-handed people with healthy hands and eyes, in order to make sure that the results aren’t skewed by irrelevant differences between individuals.

You can’t do anything like that in exams. For example, one of my exams once took place at a time when I was infatuated with someone. I spent about a quarter of the exam staring into blank space and thinking about things quite unrelated to linguistics. As you might expect, my grades for that semester were not spectacular. This was not a reflection of how well I understood the material in question, but rather a reflection of how capable I was of concentration at the time of the exam.
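To make the "bad experiment" point concrete, here is a purely illustrative simulation sketch (the numbers are invented, not real exam data): if a score is understanding plus whatever irrelevant factors hit you on exam day, a single exam only loosely tracks what a student actually knows.

```python
# Purely illustrative: treat an exam score as true understanding plus
# day-of-exam noise (stress, infatuation, a bad night's sleep) and see
# how loosely a single score tracks actual understanding.
# All numbers are made up for the sake of the example.
import random
import statistics

random.seed(0)
n_students = 1000

understanding = [random.gauss(70, 10) for _ in range(n_students)]  # what they actually know
exam_day_noise = [random.gauss(0, 15) for _ in range(n_students)]  # irrelevant factors
scores = [u + e for u, e in zip(understanding, exam_day_noise)]

r = statistics.correlation(understanding, scores)  # requires Python 3.10+
print(f"correlation between understanding and a single exam score: r = {r:.2f}")

# With noise this large, plenty of students who understand the material
# better than average still end up with a below-average score.
mean_u = statistics.mean(understanding)
mean_s = statistics.mean(scores)
misranked = sum(1 for u, s in zip(understanding, scores) if u > mean_u and s < mean_s)
print(f"above-average understanding but below-average score: {misranked} of {n_students}")
```

The exact figures are meaningless; the point is only that an uncontrolled factor as large as exam-day stress can easily swamp the very thing the exam is supposed to measure.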

Exam stress: an antidote for learning

Not only do exams fail to control for interference; they actively create a strongly interfering, totally irrelevant factor: stress.

Exams cause those who take them to get stressed out, usually for weeks in advance.

Google the words “stress” and “learning” together. The first result I got (of some 25 million) was this site, which says “Stress can disrupt learning and memory development”. Huh. That sounds like a great way to lower people’s performance on a test.

One obvious remedy is to train people so they’re used to taking tests and don’t get so stressed out. This is what traditional schools do, and perhaps why they do it.

For some reason, that really doesn’t work for most people. I’m guessing that the way schools make a big deal out of exams rather trains people to think exams are a big deal and worry about whether they’ll pass. What also doesn’t help much is that the resulting grades are relevant to one’s progress in a degree program as well as one’s chances of getting accepted for further studies or a job.

Exams are bad science

But even if we accept that it’s schools’ job to prepare students for the stress of university exams, what are those exams preparing them for? Surely, it can’t be their future work as scientists. Exams are good preparation for bad science.

A scientist’s job is essentially the opposite of exam-taking.

Exam-taking is swallowing a more experienced person’s presentation of information (course material), then regurgitating small bits of it as closely as possible to the original (“the right answers”). Science is carefully considering information (raw data) and other people’s presentations of information (prior work), carefully deciding whether or not to swallow it, then, optionally, producing a novel presentation of the information (research), which is considered useless if it’s in small bits that are exactly like they were when you got them.

The whole idea of one person telling the beginners how it is and expecting them to accept it is bad science. Obviously, my instructors are far more experienced and knowledgeable than me in their respective fields. Still, it would not be very good if I accepted everything they taught me unquestioningly.

If I take my role as a budding scientist seriously, I should critically examine everything I am taught and decide for myself whether I agree or disagree (and why). Exams tell me the opposite, and it takes real effort to continue thinking critically while I am expected to soon be able to reproduce the instructor’s view.

Worse still, in some introductory courses, the theory being taught is not perfect: instructors use simplified or "toy" versions of the theories, or perhaps just a rather recent theory which is more a work in progress than the final word on anything. Either way, attentive students might notice inconsistencies or incoherences. This is good for undergrads; they can be inspired and take the theories further. That value is diminished by needing to swallow theories whole for a test.

Some suggestions

I could probably think of another point or two against exams, but instead I will dedicate the end of this post to pointing out a few things that might make the situation better:

Abolish the importance of exam grades.

This is the most important thing, but also likely the most difficult. Exam grades should not be available to anyone but the student and instructor. It might make sense to indicate on a degree whether the holder’s grades were consistently above average — this might potentially be an indication of extraordinary ability. But knowing that even the best exams are inaccurate and susceptible to extraneous variables, it does not make sense to prefer B students to C students.

Make feedback the goal of all exams.

Finding out I got a C on an exam doesn’t help me improve. Telling me what the weak and strong points of my exam were, could. I learned a lot on the few occasions where I’ve asked an instructor to go over the exam and tell me what my mistakes were. This value as a learning tool is wasted by not presenting all exam takers with feedback. (Some instructors do this, but all should.)

Make some or all exams optional.

If the goal of exams is to give feedback, then save it for those who want it. Mandatory exams create unnecessary stress. There are plenty of other ways to run a system like the modular European Credit Transfer and Accumulation System, in which it is essential to judge whether a student really took part in their courses.

Replace some exams with real work.

Writing a term paper takes more effort than writing an exam, but you learn new things from it and experience something akin to actual academic work. Some disciplines have other “simulations” of real work which could be graded as tests. Sure, this requires more effort per test from the staff, but grading something other than an exam may be a welcome change. And perhaps a system could be created where more advanced students grade beginners’ work and get graded for their grading work (being real academic work practice itself).

Filter students in conversation, not testing.

I get the impression that one of the main reasons I had to take so many exams in the first year of my studies was to filter out students who are not really interested in the program they chose. (I’ve mentioned before that some people choose their major at random here, and if that’s as common as I think, filtering is a good idea.)

I imagine a ten-minute conversation with each student after their first semester could replace some or all of that testing. If the courses didn’t do the trick, simply asking the students if they want to continue with this major, and if yes then why, will get them thinking about those questions themselves. With all of the second chances people are given after failing, it’s their choice anyway; a short conversation could save a lot of exam creation, administration and grading. And of course, this could be done by advanced students as well as by faculty.

 

Clearly, all change in the university system is slow. Certainly, there are many different changes that can be made. I hope I have provided a few good points of critique and a few good ideas on how to improve the system. Further ideas, comments, and criticism are most welcome in comments.

 

Conversation and happiness
https://www.didyoulearnanything.net/blog/2010/03/20/conversation-and-happiness/
Sat, 20 Mar 2010

Language Log recently had an interesting post about a study that found that happy people tend to have more substantive conversations. I was reminded of the kind of conversations we had in Sudbury Jerusalem. I’ve written about it before (for example, in The Secret Weapon) and I thought I’d bring up the connection here. As the Language Log post explains, the study doesn’t say whether it’s substantive conversation that increases happiness, or being a happy person that increases substantive conversation.

But it’s interesting to think about how this relates to democratic schools. My experience is that a democratic school is a good place for substantive conversation. In Sudbury Jerusalem, I was a student who rarely had any classes and spent much of my time socializing, talking. It was a place where the general atmosphere was happy rather than depressed. I noticed that in school, we talked a lot, and had a lot of really good conversation, and I never considered whether it’s just because people are often in a good mood (I, by the way, often was not in a good mood, despite a lot of conversation — maybe that’s why). I also never considered that this might be why people are often in a good mood.

I don’t know, but I had other things in mind. The school democracy itself, it seems to me, encourages a culture of talking things through. It’s what we would try to do in School Meeting and in committees, and it’s often what we would do to solve conflicts before resorting to a Judicial Committee complaint form. And the other side of things, the personal freedom, simply gives people more time to talk. There’s always conversation going on all over the place, and it makes sense that when you get to talk with people a lot, you eventually get to deeper, “substantive” conversations.

But maybe the large amount of conversation in democratic schools is caused by something else. Maybe it’s the nature of the school as a community in which people operate freely in the same spaces, together; the school is a very social environment. This is also something that probably contributes to the general happiness of the population. Actually, considering that a more social environment is probably a cause of both happiness and conversation, maybe this is the causal link behind the study’s findings. People with more social contact are happier and have more substantive conversation (as compared to people with less social contact). It definitely makes sense to me.
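To illustrate that last possibility, here is a tiny simulation sketch of my own (the numbers are invented, not the study’s data): when a common cause like social contact drives both happiness and substantive conversation, the two end up correlated even though neither causes the other.

```python
# Illustrative only (invented numbers, not the study's data):
# a shared cause ("social contact") drives both happiness and the amount
# of substantive conversation, so the two correlate even though neither
# one influences the other directly.
import random
import statistics

random.seed(1)
n = 1000

social_contact = [random.gauss(0, 1) for _ in range(n)]
happiness = [c + random.gauss(0, 1) for c in social_contact]
substantive_talk = [c + random.gauss(0, 1) for c in social_contact]

r = statistics.correlation(happiness, substantive_talk)  # requires Python 3.10+
print(f"happiness vs. substantive conversation: r = {r:.2f}")  # clearly positive

# Nothing in this toy model lets happiness affect conversation or vice versa;
# the correlation comes entirely from the shared cause.
```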
