Wednesday, September 5, 2012

A Curriculum for 1992

(Update: The faculty met to discuss this on Friday afternoon, 9/7, and the committee said that they did feel this was a hole in the curriculum and they needed more time with the proposal to fix it. I will keep my fingers crossed. My next post will basically elaborate on my proposal and the staffing requirements for it.)

Trinity has been working on revising the curriculum for nearly a year now, and today a formal draft of that curriculum was sent out to the faculty. As you can tell from the subject of this post, I am not impressed. I'm writing this post not only to express my ideas, and inevitably some frustration, but hopefully to motivate alumni to take a little action regarding it. Keep reading for details.

Computers Are Just a Fad
At least that is the impression I get from the curriculum document. The charge for the revisions was to create a curriculum for the 21st century. However, the only mention of technology in the entire document comes in the FAQ at the end. Here is what they say:

2. If we are trying to educate students for the 21st century, why isn’t technological and information literacy part of the capacities?
Answer for technological literacy: 
Our committee agrees that the ability to use, understand, and criticize technology is of tremendous importance. Technological advances flow into the classroom as they become relevant to educational content and delivery, and we are confident that Trinity faculty bring these technologies (as well as a thoughtful discussion about their strengths and limitations) into their courses.
Answer for information literacy:
Information literacy is a hallmark of a Trinity education through the university’s commitment to the QEP. It was felt that most, if not all, of our classes support and reinforce information literacy.
At least they see this as a weakness of their proposal, and they acknowledge the importance of technology. However, they seem to think that faculty will somehow magically start incorporating this into the classroom and that students are certain to take courses that use the needed technology. The reality is that college faculty are, to a large extent, some of the least technologically savvy people on the planet. What is more, I frequently see students who work to avoid technology in the same way that they avoid math and science. This is a bad decision on their part, and most will realize it later in life. Part of why students pay to go to college is so that other people can give them direction and help them avoid making those bad decisions. In my opinion, as this curriculum currently stands, it fails miserably in this area.

Globalization: The Game Changer of the Last Several Decades
So what does this curriculum do instead? There are changes in it. One of the big ones is a push to address globalization. This includes a "capacity" with courses on "Global Awareness", "Understanding Diversity", and "Foreign Language". This is on top of the standard elements where you have to be able to read, write, and speak as well as a smattering of courses from humanities, social sciences, natural sciences, and math. The new part is dealing with being an "engaged citizen", which seems to be largely motivated by a desire to have Trinity students prepared for globalization.

In my opinion, globalization is yesterday's news. I made the subject of this refer to 1992 because honestly, a really forward-looking curriculum would have included globalization back then. Now this is just a knee-jerk reaction to a boat that was missed two decades ago. Globalization was perhaps the biggest influence on our economy and the general evolution of the world over the few decades leading up to 2010. However, it isn't what is going to shape the future. Yes, global exchange of information is going to continue to be important, but production of goods is on the verge of heading back the other way. New approaches to digital manufacturing, including 3-D printing and increased automation, are making it possible to put the production of goods back close to the point of consumption. After all, why should we ship materials to China, have them assembled there, and ship them back if they can be assembled here? For the past few decades the answer was that assembling them here cost too much. However, today even Chinese companies like Foxconn are planning to replace their human workers with robots. Those robots aren't any cheaper to run in China than they are here, and energy costs for transportation are only going up. So at a certain point (which I expect is less than 10 years from now) it becomes worthwhile to put the robots close to the end consumer and make things as close as possible to where they will be used in the end.

In addition, technology is significantly minimizing the need to actually speak a foreign language to have a dialog with someone who doesn't speak your language. Today Google Translate can allow me to have a reasonably fluid conversation with someone who speaks a different language, and the quality of translation and speech understanding is improving by leaps and bounds. If you have seen comparisons between Google Now and Siri, you can see what one extra year of development means in this space. I fully expect that by 2022 I will be able to speak to someone in almost any language in a manner that is very close to natural without knowing that language. This isn't to say that there aren't cognitive benefits to learning a foreign, natural language. It is just to say that interpersonal communication is going to cease to be one of those benefits.

If Not Globalization, Then What?
So what do I think is the game changer of the coming decades? What should our new curriculum aim for? The paragraphs above should make this fairly clear. Globalization is going to take a back seat to the oncoming surge of digital technologies enabled by machine-learning-based AIs and automation. It is impossible to predict exactly what will be relevant, but based on what is already out there you can feel pretty confident that in 2022 most students will have cars that drive themselves and many of them will have robots at home that cook and clean. (Sound like Sci-Fi? Then you need to follow me on Google+ or at least go search for videos of those things on YouTube, because they are feasible today and will be cheap enough to be widespread in a decade.)

There are other things that are becoming increasingly significant as well. The buzzword of "big data" is everywhere for a reason. In addition, the rollout of IPv6 wasn't much hyped, but if you listen in the right places there are rumblings of the beginning of the internet of things. When your shirt has an IP address and is constantly sending information about your temperature and heart rate into the cloud for analysis, then you will begin to understand what these things are. They are primed to change the way we live in dramatic ways.

What does this mean for the curriculum? My take is that if a graduate of 2022 looks at a computer and sees a magic black box with pretty pictures on it, that graduate has already lost at life. They are a powerless consumer with no ability to produce in the markets that will define their time. If we let them become that, we have failed the trust that they put in us when they enroll in our school.

My Proposal
So what do we do about this? What do I think the curriculum document should have included? First, let me tell you what it should not have included. It should not require that every student take a CS course specifically aimed at teaching students to program. That would be a nightmare for me on many different levels. In addition, it wouldn't really benefit the students. Some students need to know how to really code, and those students can learn programming language fundamentals without an associated application context. The vast majority of students, though, need to learn how to use computers to solve problems with at least slightly more competence than just using pre-written software.

Increasingly, data is what drives the world. Humans are horrible at manipulating even reasonable amounts of data. Computers are great at it. The graduate of 2022 should have worked with data in courses across a number of different departments, and they should have had to do something beyond just plugging it into existing software in order to dig for meaning or answer questions based on that data. They need to have some experience using a computer and associated technologies to solve problems. That is what really matters. They need the skills to turn the computer into a tool that they can use to solve problems that are beyond what they can do alone.

I believe the best way to do this is to require that students take a few courses that require them to do computer-based problem solving. The ideal situation would be that courses that normally count for 3 hours in departments all across the University add an extra hour of credit and a computational problem-solving component. For example, a course in Political Science could ask students to analyze census data or data from the last presidential election. Have the students answer questions that can't be handled with simple queries in Excel. That way they might learn how to write a little VBA and use some logic to solve the problems. Or maybe the questions you want to answer are well suited to some reasonable SQL queries. Sometimes the right approach might be writing scripts in Python, Perl, or Scala. The details don't matter to me. The details should be chosen to fit the data set and the questions being asked about it. What matters is that students learn how to make technology do what they want it to instead of acting as passive consumers of software that someone else has written.
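
To make that concrete, here is a rough sketch of the kind of exercise I have in mind, done in Python. The file name and column names are made up for illustration, and the question (in each state, which county had the highest turnout, and did it lean toward candidate A more or less than the state as a whole?) is just one example of something a canned report or a simple query won't hand you.

# A sketch of a computational exercise for a Political Science course.
# Assumes a hypothetical file "county_results.csv" with made-up columns:
#   state, county, registered_voters, ballots_cast, votes_candidate_a
# Question: in each state, which county had the highest turnout, and did
# that county favor candidate A more or less than the state as a whole?

import csv
from collections import defaultdict

by_state = defaultdict(list)

with open("county_results.csv") as f:
    for row in csv.DictReader(f):
        registered = int(row["registered_voters"])
        cast = int(row["ballots_cast"])
        for_a = int(row["votes_candidate_a"])
        by_state[row["state"]].append({
            "county": row["county"],
            "turnout": cast / float(registered),
            "share_a": for_a / float(cast),
            "cast": cast,
            "for_a": for_a,
        })

for state, counties in sorted(by_state.items()):
    # State-wide share for candidate A, weighted by ballots cast.
    total_cast = sum(c["cast"] for c in counties)
    state_share_a = sum(c["for_a"] for c in counties) / float(total_cast)

    # County with the highest turnout in this state.
    top = max(counties, key=lambda c: c["turnout"])
    direction = "more" if top["share_a"] > state_share_a else "less"

    print("{0}: highest turnout in {1} ({2:.1%}), candidate A share {3:.1%} "
          "({4} than the state's {5:.1%})".format(
              state, top["county"], top["turnout"],
              top["share_a"], direction, state_share_a))

Nothing in that sketch is deep computer science; it is a loop, a couple of aggregations, and a comparison. But a student who can write something like it can ask their own questions of the data, which is exactly the difference between controlling the technology and passively consuming it.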

I've always liked the expression that if your only tool is a hammer, every problem looks like a nail. All too often, I see people do things the hard way because they don't know there is an easy way. Even worse, I see people who can't even conceive of certain questions because the tools they know don't allow them to answer those questions. If our graduates fall into either of those categories, we have failed them. I don't want to see that happen. A major curricular review is the time to make sure we do things right. Unfortunately, I don't think the current proposal is doing that.

Call to Action
So I want to close with a little call to action for any Trinity alumni out there who are reading this. If having the ability to control technology and make it do what you want to solve problems has benefited you in life, let your old faculty know. Take a minute to tell them how skills like programming have benefited you in your life and what things you wouldn't be able to do without those skills. You might even just pass on a link to this if you don't want to write much yourself. In addition, forward this to other alumni so that they too might help the faculty at Trinity to see that computers are not a fad that is going away and that being able to bend and manipulate technology to your benefit to solve problems really is a valuable skill that everyone needs to have.

3 comments:

  1. +1.

    ~Dr. Forrest Stonedahl (Assistant Professor of Computer Science & Mathematics at Centre College)

  2. Our students are well versed in technology because all classrooms use technology, you say? Well they must also be experts in architecture since classes are held inside buildings!

  3. I wish there were some type of "Like" or "+1" in the comments on Blogger.
