Archive for May, 2010
I watched College, Inc. the other day, a documentary about the rise of for-profit colleges. Everyone with an internet connection has no doubt seen their advertisements all over the place: UoP, DeVry, etc. Even some medical schools that have opened recently are for-profit, and for-profit pharmacy schools have existed for a while.
Education as byproduct
Being for-profit doesn't necessarily make a school "bad". I was a little annoyed with the PBS interviewer who kept asking whether education should be a business. Education is and always will be a business. Just ask any student who's taken out student loans to pay for school. Ironically, I suspect that for-profit institutions probably have incentives that are more closely aligned with the majority of students' motivations than their non-profit counterparts. Broadly speaking, students seem to fall into two basic categories: those who are there to learn stuff, and those who are there for the "college experience", which may or may not include learning something. By and large, the students that for-profit institutions attract are those who fall into the "want to learn stuff" category, because for-profits don't have a lot of amenities that contribute to a traditional college experience.
As a result, colleges and universities engage in quite a bit of activity that has very little educational merit. Building $40MM gymnasiums adds little value to a student's education, but it does add to an institution's "sex appeal". That means it's fluff. It's not just athletic complexes and fancy dorms, though. Look at the job postings at these institutions:
How many of these positions directly contribute to a student's education the way that excellent classroom instruction and strong ties to the public and private sector would? Not many. Instead, most of these job postings piggyback on education itself, increasing overhead and contributing very little to the success of their customers. In this respect, institutions are more interested in the furtherance of their own legacy and the building of their brand than they are in educating students. Now, I'm not suggesting that these positions are worthless to students; there will always be some overhead in any organization, but the sheer vastness of this overhead in higher education is what's staggering. Institutions that are directly funded through taxes* have less of this, just as high schools never have this kind of overhead, simply because the budget doesn't allow for it.
This inefficiency supports a lot of salaries. Luckily for them, demand for education at these institutions is relatively inelastic, so there's very little incentive to change organizational behavior.
Overbuilding and under-using
The traditional two-semester school year is broken, too. 3.5 months of school twice a year, for four years? That's dumb. We don't live in an agrarian society any longer. Change to a trimester or quarter system and go to school year-round. You could even build in a mandatory co-op program so students can make money in their industry while still being students. One quarter or trimester would be co-op, and the rest of the year would be traditional didactic education. Even with co-op, students could finish in 3 or 3.5 years instead of 4, and be better rounded for it. Perhaps even less if some other changes are made. (Read on.)
Classrooms go unused for 5 months out of the year, limited summer class offerings notwithstanding. Schools' capital isn't being utilized efficiently. On top of this, there's quite a lot of IT infrastructure on your traditional college campus that doesn't need to exist anymore. General purpose computer labs aren't necessary for most majors, because computing is now commoditized. Most students have laptops and/or desktops. (Engineering is probably the main exception here, as licensing for engineering software packages is prohibitively expensive for all but the richest students.)
For everyone else, academic discounts exist. Steeply discounted versions of Office Ultimate ($60) and Windows 7 ($65) are available, so the argument that students can't afford Office doesn't really hold much water. Switch to an open-access wireless network, and you've suddenly eliminated quite a lot of physical overhead.
Wealth and ideas are created when smart, motivated people interact with one another. Universities are havens for this kind of interaction. In computer science, for example, getting rid of the computers doesn't mean you get rid of the student interaction. A school could continue to foster it by making space available that only CS students have access to: take the computers out of the computer lab, but leave the tables and chairs. Besides, when you break your own stuff, you have to fix it, which is itself a learning experience…
For everyone else, the same principle works, too: get rid of the computers but leave the tables and chairs.
Education vs instruction
Education isn't the same as instruction. It's dumb to think that making an engineering student take 10 liberal arts classes makes him "well-rounded". This is the difference between education and instruction. Instruction is what happens when a student sits in a classroom. Education is the gradual process of acquiring and assimilating knowledge.
Conflating the two is dangerous and ignorant.
For this reason, I think that changing the US model of higher ed to be more like the British model makes a lot of sense. Let college students study what they want within their field. Don't make them take a bunch of classes that they care nothing about. They're not going to learn anything in them, and they waste time, money, and attention. Ensure that they can write — and if they can't, fail them — and set them loose on their CS classes, EE classes, or History classes. The rest is unimportant.
Educated people will learn the other stuff just by being attentive, observant participants in life. And those that won't learn this information on their own certainly won't retain it as a result of sitting through some class they hated.
Oh, and let students test out of any class without needing to take an AP exam. If you think one test isn't enough, then there's something wrong with your testing methodology, not the student.
Teacher quality transparency
I don't think a professor's degree matters. Whether they have a Master's or a PhD is irrelevant. The only things that matter are that they:
- Understand the material
- Can effectively communicate the material to students
In this respect, I think sites like Rate My Professors are absolutely brilliant. Throughout my time as an undergrad, whenever possible, I checked the site to scope out who I would try to take, and who I'd avoid. I used to build small dossiers on potential professors based on RMP comments and ratings, and build my curriculum from there, inasmuch as this was possible.
Unfortunately, a lot of professors (and institutions) dislike RMP. You see, RMP brings transparency to an otherwise opaque — and unimportant from the school's perspective — part of the educational process: teacher quality. RMP isn't perfect; there's a lot of crap on there written by idiots for idiots (business opportunity!), but there's also a lot of quality there if you look closely enough. If RMP built in a meta-review tool like Amazon has, it would suddenly become a lot more useful.
The last semester I was in school, one of my professors told students considering graduate school to check Rate My Professors first, and avoid any program where the professors got consistently bad ratings. I thought this was wonderfully enlightened of him, and of course he was one of the best profs I ever had.
Y Combinator exists to mass produce successful startup companies. I don't see why higher education can't be rebuilt to mass produce effective people. I think the for-profit education sector has a lot to teach the non-profit sector with respect to leveraging the Internet and using capital efficiently. Most students don't go to school with the goal of being an academic. They go to school because society expects it of them (degree inflation) and/or they want to learn something so they can have a cool job and make money.
In this respect, I'd like to put forward some modest reform proposals:
- Have classes year 'round
- Let students test out of any class in the curriculum
- Get rid of mandatory, off-topic courses
- Get rid of unnecessary computer labs
- Offer all courses that can reasonably be offered online, online
- Make co-op mandatory, and based on ability as measured by progress through the program (see #2)
- Reward teaching excellence rather than research excellence
This might mean a talented CS student finishes his degree in a year. This might mean an English student never sets foot in a classroom. These things are okay. They should be embraced.
* All institutions are funded through taxes, even for-profit schools, albeit indirectly. Most student loan programs are government funded or subsidized, which means they're paid for by taxes.
As we move into the middle portion of 2010, we're hitting the fattest part of the ARRA stimulus allocations. While there was a lot of pre-passage wrangling on both sides of the aisle about efficiency and multipliers, there hasn't been much talk about the stimulus money's effect on unemployment in recent months, not since the unemployment rate began trending in a positive direction. That means it's a great time to examine stimulus.
Recovery.gov lists 682,779 jobs as "created" or "saved" as a result of the stimulus package. ARRA started doling out money on Feb 17, 2009, and through March 31, 2010, $205.3bn has been laid out. This means that $205.3bn has purchased 682,779 jobs, at a cost of roughly $300,686 per job.
That's pretty horrible. Apparently it's not quite as straightforward as this, though, thanks to some fancy hand-waving that magically brings this number down to ~$160,000 through methods that I've yet to see explained. Regardless of what you believe, it's really expensive for Uncle Sam to create jobs using good old fashioned Keynesian stimulus.
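The back-of-envelope arithmetic above is easy to check, using the Recovery.gov figures quoted earlier (tiny differences from the figure above just reflect rounding of the $205.3bn outlay):

```python
# Recovery.gov figures cited above: outlays Feb 17, 2009 through
# March 31, 2010, and jobs reported as "created or saved".
outlays = 205.3e9   # dollars laid out
jobs = 682_779      # jobs created or saved

cost_per_job = outlays / jobs
print(f"${cost_per_job:,.0f} per job")  # roughly $300,700 per job
```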
The neoclassical production function is y = f(K, L); that is, output is a function of capital (K) and labor (L). Modern macroeconomics throws a lot more into this function to account for other factors, but using the older capital-labor tradeoff is good enough to illustrate my point. In general, there is a tradeoff between capital and labor. You could hire 1,000 men to shovel the streets in the winter, or you could hire one guy with a plow, and the guy with the plow will do more in less time. This is why nations tend towards industrialization.
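As a sketch, take the textbook Cobb-Douglas special case of this production function. The numbers below are purely illustrative (not calibrated to anything), but they show the tradeoff: a capital-heavy mix with better technology can out-produce a labor-heavy mix using a fraction of the workers.

```python
def output(K, L, A=1.0, alpha=0.3):
    """Cobb-Douglas production: y = A * K**alpha * L**(1 - alpha).

    alpha is capital's share of output; A is total factor
    productivity (how much technology amplifies the inputs).
    """
    return A * K**alpha * L**(1 - alpha)

# Illustrative mixes of capital and labor:
labor_heavy   = output(K=2, L=1000)        # 1,000 men with shovels
capital_heavy = output(K=500, L=8, A=6.0)  # a small plow crew; higher A
                                           # reflects technology embodied
                                           # in the machinery
# capital_heavy exceeds labor_heavy despite using 8 workers, not 1,000
```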
The United States is a post-industrial economy. Watch a documentary on the Hoover Dam and look closely at the sheer number of men at work. In today's economy, a large percentage of these men would be replaced with machinery designed to do a specific job.
That means that money is being spent on capital instead of labor because stimulus money is allocated for specific tasks — not to employ workers, which is merely a fringe benefit — and then these tasks are completed by either the public or private sector (or both) using whatever mix of capital and labor is appropriate for the job.
Capital has to come from somewhere. In terms of traditional stimulus spending on infrastructure, you're mostly dealing with heavy equipment manufacturers, the majority of which are located overseas. In some cases they're headquartered elsewhere, and in some cases, their manufacturing is located elsewhere. Sometimes both. When one or both of these conditions is met, profit, jobs, or both are shipped across borders into other nations. In this respect, stimulus money is being used to create or sustain jobs in other countries.
It is possible to mandate that firms using government money buy from American corporations — the Fly America Act is an example — but this isn't possible or desirable in all cases. American heavy equipment manufacturers don't make all of the machinery necessary to complete some of the larger infrastructure projects that are on the table.
Job destruction and obsolescence
Another problem with stimulus spending is that quite a bit of the funding is to improve efficiency, particularly in sectors like health care. Unfortunately, "improving efficiency" usually means shifting away from labor. In a sector like health care, capital takes the form of computers and software. Primarily the latter. Automating billing, cutting back on administrative personnel, decreasing overhead, getting rid of physical records. These are the things that software is great at; it is a substitute for labor.
Software is a unique example, too. Not only does it destroy jobs in the short run, but it doesn't create new jobs very quickly, except in the software-based industries themselves, because it's a virtual good: all you need is a computer, an Internet server, and the intellectual capital required to make it. This breaks the normal supply-demand models in some interesting ways, because supply is functionally unlimited, but price doesn't go to zero, even after a firm's fixed costs have been covered. This leads to some very high profit margins.
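A toy model, with made-up numbers, shows why margins climb so high once a software firm's fixed costs are covered: each extra unit costs essentially nothing to produce, so nearly all additional revenue is profit.

```python
def margin(units, price=50.0, fixed_cost=2_000_000, marginal_cost=0.0):
    """Profit margin for a good with near-zero marginal cost (software).

    All parameters are hypothetical round numbers for illustration.
    """
    revenue = units * price
    profit = revenue - fixed_cost - units * marginal_cost
    return profit / revenue

# Margin climbs toward 100% as volume grows past break-even:
for units in (50_000, 200_000, 1_000_000):
    print(f"{units:>9,} units sold -> {margin(units):.0%} margin")
# 50,000 units -> 20%; 200,000 -> 80%; 1,000,000 -> 96%
```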
History is littered with examples of capital displacing entire workforces. Textile manufacture during the Industrial Revolution is the most prominent example in modern history, with huge numbers of people losing their jobs thanks to the power loom. The upside is that because of this creative destruction, whole new industries are born, and more jobs are created than were destroyed.
This doesn't lessen the pain in the short term, however. While the labor market can and does adjust for this creative destruction, this obsolescence is occurring so quickly that many of these displaced workers are unable to acquire the skills necessary to find new kinds of employment in the new economy that's being created. In times past, the technological increase was on the order of 3% per generation (pre-Industrial Revolution), but now stands at about 3% per year. The rapidity of this kind of change is difficult for the labor market to absorb. This is one of the causes of a jobless recovery: technology increases wealth (GDP) without using more workers to get the job done thanks to software advances and other capital-based technologies.
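The gap between 3% per generation and 3% per year is easy to understate. Compounding the figures above over an assumed 25-year generation (the generation length is my assumption, not a precise number) makes the difference vivid:

```python
years_per_generation = 25  # assumed generation length

# Pre-industrial: technology improves ~3% over an entire generation.
pre_industrial_growth = 1.03

# Modern: ~3% per year, compounded over the same 25-year span.
modern_growth = 1.03 ** years_per_generation

# Pre-industrial capability grows ~3% per generation;
# modern capability roughly doubles over the same span.
print(f"{pre_industrial_growth:.2f}x vs {modern_growth:.2f}x per generation")
```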
If the goal is mass job creation as quickly as possible, then the government should employ fiscal stimulus as inefficiently as it can. Employ workers where the private sector would use machinery. Employ dozens of people with slide rules instead of one physicist with a computer. Build roads by hand instead of using machinery. In short, pretend we're a pre-industrial civilization instead of a post-industrial one. This will get you the biggest bang for your buck in terms of rapid, low-cost employment. The work isn't desirable, and people will jump ship back to the private sector when it starts creating jobs as a result of people spending their money.
In this case, inefficiency props up the labor market in the short term.
The downside to this is that it doesn't create jobs in the long-term growth sectors like technology, health care, and education, which are physically and intellectually capital intensive. It does, however, get a lot of people employed very quickly.