Monday, October 22, 2012

History Part 2: Bricks and Mortar

In the last post, I talked about three different foundations of online education: the political changes that shifted the role of the university, early attempts at using technology for education, and CS's gradual shift toward more intuitive programming. These foundations are important for understanding the current state of CS education and the ideas surrounding its move online.

To continue on the topic of the history of the American university:
After World War 2, the US turned its attention to the Soviet Union. While the government was already funding some university research, Sputnik's surprise launch into space shocked the country into rethinking its approach to science and technology research. According to this article, as early as the 1950s Vannevar Bush had decided that private/commercial R&D alone wouldn't produce the developments needed to keep the US ahead. By funneling huge amounts of money to select institutions (13 universities received 85% of the funding), the government encouraged all sorts of science and technology research and education.

This trend worked well until the so-called Permanent Tax Revolt, triggered by rising property prices, in which tax cuts emerged as consistent political fodder. Since the 1980s, politicians on both sides have felt obliged to win favor by cutting taxes, which meant cutting programs such as education funding and forcing universities to restructure (the link shows how spending on education is now less than on maintaining prisons). We see this a lot in the current election. This restructuring placed universities under an economic spotlight: to justify higher education's existence economically, universities came to be judged by how well paid they could get their alumni to be. The effects of this second change, more politically charged and strenuous, are at odds with the original vision of higher education.

In a manner of speaking, this is where we stand with the university system. So why is this history, only tangentially related to CS education, so important? For me, it serves to show that the institutions we have now, our model of learning, are largely a result of politics, money, and tradition--a system that's now being forced to handle many more people with higher standards, less money, and more scrutiny. On top of this governmental and societal change, the technological revolution has also found itself at odds with the no-cheating, individualist learning environment of the university.

The history also helps us understand the general approach towards this new age of education.
The proponents of the digital revolution believe that the internet will bring universal, egalitarian, high-quality learning. Whether it be a videotaped traditional course or a MOOC, the possibility of giving anyone, anywhere, access to top professionals' classes seems to promise a system driven by learning for learning's sake, away from the dirty world of money and politics. Even as companies commercialize learning more intensely, the thinking goes, this money will only push competing classes to outdo one another, raising the quality of learning through the free market.

This university system is not easily changed. While the teaching of age-old subjects has been handed down through a long line of teachers building on a long history of classes, CS has no real prior population of teachers or curricula to lean on. Many students self-teach, which does little to improve this trend. In fact, this has led to a decrease in CS education outside the university: the few teachers who know how to teach CS either stay in universities or are quickly hired away for a great deal more money than a school could afford.


On Cold War finances and policy: http://cshe.berkeley.edu/publications/docs/PP.JD.Sputnik_Tech.2.99.pdf