Friday, April 29, 2011
I called it in 2005. From my (admittedly callow) piece then:
Picture a tenure-based, unionized, public college. Imagine, for the sake of argument, that it has chronic budget issues. (Hard to imagine, I know, but bear with me.) Imagine that a non-trivial faction of its tenured faculty are low performing, highly-paid pains in the neck. (Again, an obvious counterfactual, but let’s go with it.) Imagine that there are lots of good young Ph.D.’s out there looking for work. (Inconceivable, yes, but let’s try.) Imagine that the faculty contract is expiring, and the round of negotiations for the next contract is starting.
The administration makes the union an offer it can’t accept. (“Mandatory calisthenics at 7:00 a.m. every day, no pay raise, and every office computer will play the alma mater when you log on.”) The union tells the administration to perform an anatomical impossibility. The administration shows good faith by compromising a little (“Okay, 7:30. Sheesh, there’s no pleasing you people!”). The union goes on strike.
Here’s the good part.
With a rock-solid right-wing Supreme Court, and with the PATCO precedent on its side, the administration hires “permanent replacements” for all striking faculty. Presto, change-o, budget problems solved, low performers purged, faculty renewed with new blood. The tenured faculty are suddenly unemployed, the young freeway flyers jump at the chance for full-time employment, and budgetary equilibrium has been restored. With Republican appointees throughout the judiciary, and a solid conservative majority on the Supreme Court, the union stands about as much chance of winning as Harold Stassen.
This could actually happen (not the Stassen part).
Admittedly, it would take a pretty confident administration to try it, with a Board of Trustees willing to endure some pretty serious flak, but it could work. And if it works once, in just one place, the legal precedent would be set.
I’d write it differently now, but the basic idea is clearly there.
As a legal issue, it’s fascinating. I assume that the Board of Mt. Hood is basically bluffing, and that it expects to settle. But what if that doesn’t happen?
It seems that strikes by public employees are illegal in Oregon, so the college could claim that an illegal strike vacated any claims to tenure. At that point, barring any state laws to the contrary that I don’t know about -- I don’t live in Oregon -- the college could conceivably hire permanent replacements.
I’m not sure who would actually hire them, though. Even the smartest administrator only has subject matter knowledge in a discipline or two; staffing across fields requires other people with content expertise. If they’re gone, I’d expect a lot of hiring with fingers crossed.
The replacements themselves would be taking a hell of a risk. What if the college restores the status quo ante? What if it rehires some of the replaced folk -- no tensions there! I wouldn’t be shocked to see the replacements effectively blacklisted, at least until some union organizers decided to recruit them, too.
It’s effectively impossible to use a nuclear option like this without doing serious collateral damage. I’d expect to see enrollments drop hard in the short term, though they could bounce back quickly. Some terrific people would be lost. (And yes, some less terrific ones would, too.) Faculty-staff relations could get dicey. I’d expect lawsuits to bloom like mushrooms in springtime. The new hires would be nervous constantly, not knowing when they’d be cast out by a legal or political challenge.
On the other hand, if the college won a legal challenge, I could easily see colleges elsewhere adding this move to their playbooks. As with any nuclear weapon, it isn’t so much what happens when you actually use it, since you hope not to; it’s what happens when you have a sufficiently credible threat to scare the crap out of the other side. Win a legal challenge, and the threat becomes a lot more credible.
It may only have merited a passing mention in IHE, but I consider this one of the more fascinating stories to come along in a while. Predicting the news since 2005...
Thursday, April 28, 2011
The article calls it “seduction by great expectations,” and it’s what happens when someone is hired into a role with such high expectations that some level of failure is simply inevitable. I’d amend it to read “disappointment by conflicted expectations.” The real issue isn’t ambition; it’s confusion. When a college doesn’t know what it wants, anything its leadership does will disappoint someone. I’ve seen this one up close a couple of times now.
Make tough decisions (but don’t upset anybody). Be fiscally conservative (except for things people want). Follow the rules (except when they involve saying ‘no’). Be authentic (but never inappropriate). Keep the accreditors happy (but don’t annoy the faculty with outcomes assessment). Give everyone a fresh start (but don’t be surprised when they interpret you in light of a predecessor from an earlier decade). Bring the budget in line with reality (but beef up staffing and services in every area). Innovate (but be consistent with past practice). Consult everyone on everything (but keep meetings to a minimum). Be transparent (but don’t bring up touchy subjects).
The common denominator to that list of desiderata is that they include their own opposites.
Given such a contradictory set of charges, some level of failure is simply built into the system.
In relatively flush times, it’s sometimes possible to throw enough money around to have a given situation both ways. (In academia, the preferred euphemism is a “both/and solution.”) Two tenured professors in the same discipline can’t stand the sight of each other? Create a new department, and move one of them into it! More worthy new proposals than the budget can support? Just raise tuition some more and go for it! In the moment, it’s called “statesmanship.” Over time, it’s called “bloat.”
But with funding drying up, there’s no more papering over the conflicts with money.
This isn’t unique to higher ed, of course. Americans are very good at, say, wanting both increased public services and lower taxes. We despise government spending, except for the things government actually spends money on, which we want more of. This is a culture that venerates both thinness and fast food.
When expectations and desires are so deeply conflicted, it shouldn’t be surprising that leaders disappoint. It’s hard to give the people what they want when they don’t know what they want. It’s even harder when they don’t see the contradictions in themselves. Rather than engaging in the difficult work of self-critique, it’s easier just to blame whomever’s around at the time.
In times of acute crisis, the difference is noticeable. When the threat is clear, severe, and immediate, even mediocre leaders often seem to get smarter. They don’t, really; it’s just that the usual contradictions are subsumed under a single, obvious imperative. In those special cases, the people actually know what they want. “Achieve a sustainable peace” is difficult, and the path to it is neither obvious nor clear. “Bomb the bastards” is easy, both to explain and to do.
When the mission of an organization is inherently diffuse, such as in public higher ed, leadership will have a serious challenge in the best of times. Add economic crisis to a conflicted mission and culture, and it’s surprising that anything gets done at all.
Admittedly, this isn’t as “seductive” as great expectations, but it comes a lot closer to capturing the on-the-ground reality I see. Wise and worldly readers, does it describe your world, too?
Wednesday, April 27, 2011
Much of my readership won’t like this, but some faculty do this to administrators, too. They seem to think they’re acting on some sort of principle. They provoke and provoke until they get a recognizably human reaction, at which point they wield the reaction like a pelt. It’s a surprisingly common cause of people leaving administrative roles; eventually, they just get sick of “gotcha!”
I’ll share a fundamental truth. We are all human. We are all imperfect, sometimes tired, sometimes frustrated, sometimes impatient. That is true of radicals and conservatives, students and professors, administrators and trustees. “Exposing” a human moment is exposing nothing. At its base, the move reflects a failure to understand the difference between a role and the person occupying it.
In my teaching days, I recall being simultaneously annoyed and proud when a pain-in-the-ass student got an ‘A’ in my class. I was annoyed that the jerk was rewarded, but proud of myself for rising above my own human response and fulfilling the professional obligations of my role. (The same also held in reverse: although it pained me to give bad grades to likable students, their work was what it was.) In unguarded moments at home, I would occasionally vent about some particularly grating student. Was I “exposing my true agenda?”
No, I was venting. My true agenda was to teach as best I could, which necessarily involved putting aside my personal preferences at key moments. As long as I fulfilled the role in an aboveboard way, my private emotional reactions -- though real -- were of no import.
In administration, the same dynamic becomes even more important. Students come and go; even the most annoying student is usually gone in a year or two, and often less than that. Faculty are forever. Abiding a self-important jerk is not the exercise of months; it’s the burden of years, and even of careers.
Over time, maintaining that professional equanimity can be difficult, especially when repeatedly baited.
If we had a more mature political culture -- since this is really about politics -- people would understand the difference between the actor and the role. The occasional expression of frustration would be seen as merely that, rather than as the accidental revelation of a “real” agenda.
But that’s not our culture. In a “gotcha!” culture, the under-the-breath human moment is taken as far more “real” than anything actually done. Which means that you have to refuse the bait.
Of course, the smartest passive-aggressors characterize that as “stonewalling,” or “shutting down dialogue,” or whatever euphemism they prefer for ignoring them. They call for “open dialogue,” pretending not to notice that they’re allowed to vent publicly and you’re not, or that they have life tenure and you don’t.
Knowing the difference between an honest, if difficult, question and “bait” is tricky. It’s necessarily situational, and any given question can be a little of each.
Most of us have identified people in our lives who are simply never satisfied. They’re happiest when angry. Once you’ve figured out that this accurately describes someone, there’s no point in trying to meet them halfway. They’ll just move the goalposts. For that type, disputes aren’t about finding the best solutions; they’re about drama. For the soapbox junkie, compromise is self-defeating; therefore, reasoned dialogue is pointless.
Others have issues with authority of any stripe, other than their own. (I’ve noticed that some of the most anti-authority faculty are the most authoritarian in their own classes.) Again, there’s no point in engaging this group in serious discussion. I’ve been burned multiple times by this type. I spend months delicately forging some sort of peace, only to have them burn it down in public a few days later just for the hell of it. Burn me enough times, and I’m done with you.
The delicate balance is in walling off the truly malignant without also walling off difficult, but necessary, discussions. Moving past the culture of “gotcha!” would make that a hell of a lot easier. But until we move away from politics-as-competitive-victimization, selective deafness may be the best that can be done. Taking the bait won’t serve the goal of transparency, or civility, or real debate; it will only feed trolls. Those who don’t wish to be mistaken for trolls shouldn’t engage in the politics of “gotcha!” It’s easier to engage in dialogue-as-problem-solving if we don’t assume that identifying the problem, and even expressing frustration over it, is out of bounds.
Tuesday, April 26, 2011
Ask the Administrator: Incentives for Course Evaluations
My college is on a big push to do course evals... we do them at midterm and the end of the course. So about every 7 weeks. Understandably students get burned out on all that evaluating and drag their feet about completing them, so we receive several "tips" a week about how to get students to do so, such as this (it's all cookie cutter stuff from onlinecourseevaluations.com, no problem sharing them):
Getting Response Rates Tip 4:
Have you thought about offering incentives? Some faculty do not like reminding students because they feel like they are nagging their students. Group incentives are a great alternative. This allows the students to push their classmates to complete their evaluations. You may use something similar to the following:
'If this class gets an 80% response rate by the end of the evaluation, I will allow one 3x5 index card of notes to be used during your final.'
'If this class gets a 70% I will remove the lowest quiz grade.'
Also this from a youtube video we were encouraged to watch:
"I decided to tell students that if they filled out their course evaluations they would get a 1 to 2 bonus on their final score".
Is it just me or is all of that pretty unethical? I find the last part beyond the pale... to suggest we give points on the final grade for something that really has nothing to do with learning or coursework? What if that was the difference between failing and passing? That would mean we were handing out college credit for busy work, basically.
The real question is what should I do? My instinct is to just suck it up, ignore the bad advice and move on. On the other hand I feel like they need to be told this is not OK. They bought into this eval website and are just repeating their bad advice... maybe someone just needs to call them on it. Suggestions from your wise and worldly readers?
It’s a fine line between rewarding participation and rewarding a positive response. I wouldn’t be surprised to find that some students simply assume the latter.
I’ll share that I am not -- at all -- a fan of “extra credit.” It introduces too much noise into grading -- a noisy enough process already -- and often rewards the wrong things. “Collective” extra credit is that much worse, since it puts pressure on students to put pressure on other students. It goes well beyond the standard issues involved with “group work,” since it involves the entire class. (Of course, from my side of the desk, the most offensive extra credit is the kind that’s offered ad hoc to one or a few students and not to the rest. If some of the un-offered students are in protected classes, the legal issues are staggering.)
As with any kind of poll, the people most strongly motivated to give feedback are the ones with the most strongly held, and therefore usually the most extreme, opinions. In some ways, those can be the least valuable. Nearly any teacher worth her salt has had a student who thought the sun rose and set on her, and another who thought she smelled vaguely of sulfur. What I want to know is what the overall tendency is, and for what reasons.
I’ve never heard of midterm evaluations that were shared with administration. I’ve heard of professors doing their own midterm evaluations just to find out where the students were struggling, and then using that feedback to make adjustments. That seems reasonable to me, though I rarely did it myself. But the standard number-two-pencil evaluations that get tabulated? Not at midterm...
I wonder if the issue isn’t so much student attitudes per se, but rather a ridiculous situation that would naturally give rise to student cynicism. Even after all these years in administration -- in the words of the old Westerberg lyric, you can count the rings around my eyes -- I can’t imagine a productive administrative use of midterm evaluations. It seems like that would be the conversation to have. End of semester, I understand, as do most students. But midterm just seems like overkill. After a few years with multiple classes per term, I could easily imagine students getting a little tired of it. I would.
But maybe that’s me. Contexts differ, and some people out there have probably found ways to make this work (and/or to make it useful). Wise and worldly readers, what do you think? Is there an effective, yet ethical, way to get students to indulge all those evals? Or is the entire enterprise misguided? Or, both?
Good luck. Sorry I don’t have the magic bullet on this one.
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Monday, April 25, 2011
For-Profits and Satellite Radio
At some point, I read a piece online -- maybe at Slate? -- arguing that satellite radio was a ‘transitional’ technology. It would last, the piece argued, only until mobile internet access got to the point where people could reasonably stream audio in the car; once that happened, satellite would lose its reason to exist.
I was skeptical. How on earth could I get internet in the car? There was no way the cable would reach that far!
Since then, of course, wireless mobile internet has become commonplace. I can pick up Pandora on my phone and play it through the ‘aux’ input in my car radio without bothering with, say, a monthly fee. When XM misunderstood its own business model and did away with “beyond jazz” while keeping a couple dozen lite-hits stations, I was done. I don’t miss it.
I’m starting to wonder if the for-profits are the satellite radio of higher ed.
Like satellite radio, the for-profits started by attending to parts of the market that traditional colleges mostly ignored. They grew quickly by exploiting some of the inefficiencies of the competition. (The competition made that easy by mostly digging in its heels.) Most of traditional higher ed was so affronted by the upstarts that it failed to notice all the ways the for-profits actually resembled them.
For all the obvious differences -- the calendar, the terms of employment, the unapologetic focus on post-graduation employment -- the basic structure was remarkably similar. Students pay tuition, mostly using financial aid, for courses denominated in credit hours and taught over semesters. Some of the larger and more successful for-profits offer the same degrees with the same regional accreditations as their non-profit competitors; Phoenix and DeVry, for example, are both accredited by the Higher Learning Commission (North Central), which is the same body that accredits the University of Chicago and Northwestern. The for-profits generally accept transfer credit, and in some cases the acceptance is even reciprocated.
The family similarities were by design; they allowed easier access to federal financial aid money, and they made marketing to prospective students easier because the for-profits could offer a credential that the wider culture already recognized.
For a while, the for-profits did remarkably well. Their marketing left much of their competition in the dust, some of their programs were actually pretty good at what they claimed to do, and there was no shortage of qualified would-be faculty to hire. (I’ve long thought that the overproduction of Ph.D.’s and the resulting shortage of full-time faculty positions in traditional colleges was part of what made the for-profit boom possible. They could get qualified people on the cheap.) And the increasingly desperate economic straits of the nonprofits led to behaviors -- such as wholesale adjunctification -- that undermined any argument from the moral high ground.
The for-profits grew wildly when financial aid was easy to get and the lower-tier nonprofits were consigned to a more or less permanent austerity regime. The secret strength of the for-profits -- and I used to work at one, so I know whereof I write -- is that they charge more than the cost of the service they provide. That means that growth not only pays for itself, but it actually benefits the financial health of the organization. Most public colleges teach at a loss. In some cases, such as California, the losses become so great that the only way the colleges can survive is to turn students away. When those students are turned away, the for-profits stand at the ready, happy to accept them.
The Achilles heel of the for-profits, though, is that they only make sense when they’re growing. Public colleges derive their budgets from a combination of tuition/fees and public subsidy; when enrollment drops, they at least have the subsidy to cushion the blow. The for-profits exist entirely on tuition/fees; when those drop, there’s no cushion. There’s nothing to fall back on.
Worse, the for-profits never really solved the cost issue that bedevils traditional higher ed. They still charge by seat time and require a set number of credit hours for a degree.
The failure to address the underlying cost disease didn’t matter much when financial aid flowed like manna from heaven; just keep raising tuition and all is well. But when the financial aid spigot starts to sputter, the underlying inefficiencies of the service model stand exposed. That’s happening now.
I’m not sure what the for-profits are a transition to. Satellite radio bridged the gap to mobile internet, which offered a longer tail at a lower price. My guess -- and that’s all it is -- is that the next big thing will be a transition away from the “many paths to one goal” model. Right now, there are elevendy-million different combinations of majors and colleges that fall under the umbrella of a bachelor’s degree. I suspect the umbrella will be the next thing to go.
We’ve seen hints of this over the last decade or so. Terms like “Cisco certification” exist to signify alternative (or often supplemental) goals, as opposed to a bachelor’s degree. I suspect that customized certificate programs will start to flourish. Unlike bachelor’s degrees, they aren’t tied to a particular amount of seat time, or even to credit hours at all. The better-known ones are validated by some sort of exam; how you get the knowledge to pass the exam is up to you. It isn’t difficult to imagine the outcomes assessment movement greasing the skids for something like this to happen; if degrees are really just lists of outcomes, then certificates can be shorter lists of outcomes. As costs continue to escalate, it’s getting harder to explain why every single person needs 120 credits, many of which will be in courses they don’t care about.
When I gave up satellite radio, I didn’t go back to my local FM stations. I switched to Pandora and similar competitors. Having grown accustomed to listening to music that fit my idiosyncratic tastes and that didn’t have five minutes of commercials per song, I just couldn’t go back. Slacker, Pandora, and their ilk didn’t rebuke the advances of satellite; they saw them and raised them.
Similarly, the sudden vulnerability of the for-profits doesn’t suggest to me that we’re on the verge of a social democratic renaissance. It suggests that something even more disruptive is readying to be born. It’s one thing to say “we can offer a more convenient bachelor’s degree in an employable field, and you can get financial aid for it.” It’s quite another to say “we can offer an alternative to a bachelor’s degree, and you can afford it.”
Admittedly, this is a big topic. And it’s hard to make predictions, especially about the future. Wise and worldly readers, what do you think?
Friday, April 22, 2011
Ask the Administrator: Where Do You See Yourself in Five Years?
Right now I am a Community College Adjunct who is applying for some full time positions for Biology Faculty. I just had a phone interview which I think I tanked on badly but I want to use it as a learning experience so that I perform better next time. (It didn't help that they called me without any warning for the phone interview but at least now I am becoming someone who is in an ever ready state of phone interview preparedness!).
One of the questions that I think I did not do well on is "where do you see yourself in 5 years?". What is a realistic progression for a new faculty member like me over the course of 5 years?
My first thought is that a ‘surprise’ phone interview is staggeringly unprofessional. Phone interviews are fine in and of themselves, but they’re scheduled in advance. A ‘pop’ interview suggests either cluelessness or a fundamental lack of seriousness. Don’t beat yourself up over your performance; you shouldn’t have been put in that position in the first place.
That said, the “five years” question can lead in any of several directions.
For a tenure-track faculty position, the absolutely wrong answer is “I hope to have published my way out of here by then.” Whatever else your answer involves, it should involve still being at the place you’re interviewing.
The better answers to that question suggest a desire to grow within the job. “I hope to have developed a new curricular option in...” or “I hope to have a student club in x-and-such up and running” or “I hope to be a key player in improving student success in biology...” would all be good answers.
From a hiring perspective, it can be hard to distinguish the folks who will just do the job from the folks who will see the bigger picture and make themselves useful in other ways. The second group is far more valuable, so any indication you can give that you’re in that camp should help.
Just be careful not to overshoot. Depending on local culture and circumstance, an answer like “I hope to be chairing the department” could be either very good or very, very bad. But something like “I hope to take a key role in growing the program, such as by...” will almost always be good.
For administrative roles, questions like that are more ambiguous. I once interviewed for a vp position (at an undisclosed location) in which I answered the “five years” question in the usual way, only to discover that the answer they wanted was “President of the college.” They wound up hiring someone about twenty years older than I am, who could step into the Presidency at a moment’s notice. In that case, they saw the position as a sort of on-deck circle. That’s not uncommon for certain kinds of administrative roles, but it’s rare for junior faculty.
Wise and worldly readers, is there a more elegant and/or effective way to answer this question?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, April 21, 2011
Ask the Administrator: The Library as Ladder?
I am the library director at a small liberal arts institution. I have worked closely with the administration and also served on many standing and ad hoc faculty committees. Looking 10 years or so into the future, I feel very drawn to, well, jobs like yours. Community colleges particularly interest me because I started out in public libraries and still feel drawn to that broader public mission.
I have my library master's and a subject master's, but no doctorate. I can't afford to go into debt to complete another degree. Is there any hope for me?
Thanks for any help or advice you can give!
I’ll open by admitting that libraries are a field of their own, and it is not mine. Folks with backgrounds in the library world are invited to shed light as appropriate.
As I understand it, the question is about using the background as library director to leap into another administrative position with broader authority.
I’ve only seen it done once. In that case, the college had recently experienced convulsive turnover, and the director of the library was recognized and respected as an even-tempered grownup. He more or less inherited a deanship, due primarily to his personality and ability to work well with the existing faculty. At that point, they were so starved for peace that nobody got upset about subject matter.
At last report, he did quite well, which makes some level of sense. Librarians are to academia what catchers are to baseball. Catchers have to be able to talk to both position players (which they are themselves) and pitchers. As a result, catchers tend to make successful managers. Librarians are staffers who have to work well with faculty, and who have to have a pretty serious academic background. As such, they’re well-positioned, at least in theory, to manage across gaps.
If you want to leap from the library and associated services to the faculty/academic side, I doubt that a doctorate would be the critical variable. Instead, I suspect that your reputation for level-headedness and competence would be the driver. (I’m assuming that you have that.) Put differently, I suspect that it would be most likely to happen, if at all, where you already work.
The key would be in putting yourself in a position in which people will see you as a campus leader, independent of title. Ways to do that could include leading an accreditation self-study, taking a conspicuous and active role in a whole-campus initiative, and being the voice of reason during campus disputes. In other words, if you want to be seen as a leader, start leading.
Wise and worldly readers -- especially those from the library world -- what do you think?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Wednesday, April 20, 2011
Helpful Hints for Hidden Hoops
The first hoop, which I've seen before, was the recent kerfuffle over unpaid internships. The law stipulates that unpaid internships should be for the benefit of the student, not of the 'employer,' and that the intern can't do things that people are normally paid to do. Yet it seems to be increasingly the case that students who have internships in their background have a leg up in the job market over students who don't.
This is a class trap. On a really basic level, who can afford to work unpaid? Generally, that would be “those who don't need the money.” Second, community college students who do internships frequently find that the academic credits for the internships don't transfer; the four-year schools generally prefer to offer those themselves. So an ambitious and serious student finds herself paying twice for the privilege of working unpaid.
Then there's the educational value, or not, of the internship itself. I admit that my own summer internship in college was useful; it dissuaded me from going to law school. I learned that I don't want to do what lawyers do. That's good information to have at age 20. But too many internships seem to amount to free photocopying.
The second hoop was a confessed bias against the unemployed. I had a conversation with a local business owner who sometimes employs our grads. He was congenial, and supportive of the college's efforts to prepare students for the work world. In discussion, though, he noted without hesitation that part of the reason he likes young graduates is that they haven't crapped out elsewhere. By his reasoning, companies don't lay off their top performers, and top performers at dying companies bail before the collapse; therefore, if someone over twenty-five is unemployed, it's because there's something wrong with them. I was too shocked to ask about older returning students.
As regular readers know, I'm profoundly opposed to characterizing the job market as any sort of meritocracy. It gravely underestimates the importance of luck and timing, and it leads to ungracious attitudes among those who caught breaks. But this version surprised me. If enough people held this position, it would effectively rule second chances out of bounds. To equate “loss of job” with “loss of virtue” strikes me as a category mistake of the highest order, and as likely to do real harm.
It's also based on a factual mistake. Some of the best hires we've made have been people who were out of jobs. In this market, some very good people are without work. Ruling them out, sight unseen, is both inaccurate and inhumane.
Finally, in a presentation by a big muckety-muck from the Federal government, in which he discussed hiring needs in the area of national defense and homeland security – a phrase that still sounds faintly German to me – he mentioned that most of the more lucrative jobs require a security clearance, and a security clearance requires a good credit rating.
To which I thought, huh?
I imagined an argument having to do with susceptibility to bribery: someone in financial trouble is presumably easier to sway with money than someone who isn't. Whether it actually plays out that way, I don't know, but there's a surface plausibility to it. But his argument wasn't about vulnerability; it was about “integrity.” He equated a low credit rating with low personal integrity.
Though he didn't spell it out, one major contributor to low credit ratings is trouble paying back student loans.
Putting the three together, I came away with a roadmap for success.
1. Try to pick parents wealthy enough to support you doing unpaid internships.
2. Try to pick parents wealthy enough that you don't have to borrow much for college.
3. Never lose a job, or go without one, after you graduate. One way to do that is to pick parents who own their own lucrative business, so you always have a job to fall back on.
I hope that helps. Meanwhile, for those who chose the wrong parents...
Tuesday, April 19, 2011
Diversity Begins at Home
The Boy has a friend who is a Jehovah’s Witness, which occasioned the following discussion in the car on the way home from Mass. (I wasn’t there; this is according to TW.)
TB: He doesn’t believe the same things I believe, but it’s still okay to be friends with him, right?
TW: Of course! Good people believe all kinds of different things, and they’re still good people. Like I’m Catholic, and so are you, and so is TG, but Daddy is...
The Girl: Daddy!
Monday, April 18, 2011
Seasonal Academic Disorder
It’s an awful time. With the academic year in the final stretch, nearly everybody is fried. The faculty are tired and cranky, and not at their best; the students are tired and scared; the administrators are overscheduled to within inches of our lives.
The killer stretches always occur at the same times of year: the Thanksgiving-to-Christmas rush and the mid-April to mid-May slog. I’ve been through them enough times now to know that they come and go, but they’re still exhausting. The trick is to remember that stretches like this end, and that this is not the time to do anything regrettable.
I’ve never been a fan of the quasi-agrarian calendar for colleges. Yes, it’s important to have down time, but giving everybody the same down time also means giving everybody the same crunch time. (It also means having students compete for jobs at the same time as everybody else.) There may well be something to be said for some sort of two-out-of-three seasonal cycle. But we’re not there yet.
This is when the various end-of-year celebrations and performances occur in rapid-fire sequence. Each is wonderful individually, and it’s a pleasure to see students and professors enjoying their successes. But the accumulation is powerful. It’s also when the conversations at meetings tend to be the most strained. Deadlines loom everywhere, and even the most level-headed people are getting a little frazzled. I’ve seen some normally sane people act wildly out of character of late; it’s just how this time of year goes.
At least with the Thanksgiving-to-Christmas rush, the rest of the world shares the sense of panic. But this time of year, academics are significantly out of step with the rest of the world. This one is just us.
Wise and worldly readers, have you found a reasonably effective way to deal with seasonal academic disorder?
Friday, April 15, 2011
Ask the Administrator: Speaking Truth to Power
I am a doctoral student at State University, a university founded on the principle that college and continuing education should be available to everyone. We began as a college for night students, students of color, and students returning to school in hopes of a better job; as a result, we are still known for being a teaching college and many of our students are public school kids or returning students. I have been a TA or instructor of record for my entire graduate career, and I enjoy teaching quite a bit.
However, my university is trying to advance in the rankings and lately we've been hiring research-driven faculty. All of them are very nice and collegial folks, and are powerhouse researchers. This is great for us graduate students, who now get to learn from rising stars in our field. But many of the new folks do not seem to be very good at, or even interested in, teaching the undergraduates. They often expect the undergrads to turn in Harvard quality work and when the students struggle, the professors dismiss the students as lazy. In a conversation today, one of my professors was stunned when I pointed out that most of our undergrads work part-time jobs, as though such a thing had never occurred to him. As an adjunct professor myself, I've had some students complaining in my class about tenure-track professors in my department and the way the professors speak down to the students as though they are stupid.
In the short term, should I say something to my tenure-track colleagues about the way they are perceived by their students, even in a laid-back and sensitive way? And in the long term, must teaching always fall by the wayside if a university is trying to amp up its research faculty? I realize that many of the new faculty are young, and may not have much teaching experience, but the lack of interest in undergraduate teaching troubles me.
Until just now, I had never made the connection between faculty attitudes towards administrators and the way that grad students are treated by senior faculty. When your socialization into a profession is basically abusive, you start to imagine that that’s just how academic hierarchy works. In some settings, it probably does, but many of the most petty tyrants never manage to rise terribly high, precisely because they’re petty tyrants.
Anyway, back to your question. One of the blessings of working at a community college is that we’re banned, by state law, from “raising our academic profile” by offering higher degrees. That means that we’re blessedly free from the identity crises that occur when an institution that was one thing historically is in the process of trying to become something else. When a college or university that was historically focused on teaching decides to try to move up the ladder to the big leagues of research, it can’t just jump all at once. Tensions between earlier hires and newer hires are understandable, since the two groups were recruited for different reasons and understood their employer in different ways.
In my doctoral program at Flagship State, the faculty were bracingly blunt about how they viewed undergraduate teaching. While they agreed that, all else being equal, it’s better to teach well than to teach badly, they also agreed -- and told us directly -- that spending any more than the bare minimum amount of time and effort would be self-defeating. Good reputations for teaching would result in higher enrollments -- meaning more time spent grading -- and higher student expectations, meaning more time spent advising. Since there are only so many hours in the day, “excess” time spent on teaching represented a deadweight loss from research. We were told, in both word and deed, that teaching was something to minimize; be just good enough to not get fired, and spend the rest of your waking hours publishing.
I took offense at that, of course, but I also knew that nothing was to be gained by trying to argue the point in any serious way. The graduate faculty were notorious bearers of grudges, and their treatment of grad students could be mercurial. I took the measure of the place, did what I had to do to get my degree, and forged a career path on my own terms.
It’s possible, of course, that your new graduate faculty are uncommonly enlightened beings with self-awareness and strong ethical compasses. But I doubt it, based on experience, and the downside risk for you in taking them on is significant.
My advice would be to be diplomatic, get your degree, and not try to be a hero. If that’s simply untenable, then either decamp for more congenial pastures or be prepared to. Once you have the credential and can make your own way, you can choose whichever path makes the most sense to you. But based on personal observation and experience, I would not encourage a graduate student to “take on” the graduate faculty. Grad school is not about being right. It’s about being disciplined, in every sense of the word.
I suspect that my wise and worldly readers will have a range of opinions on this one, which is good; my own grad program was famously contentious, even by graduate school standards. So, wise and worldly readers, what would you advise?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, April 14, 2011
Scenes from the Science Fair
- Last year was the first year that this school did a science fair. The turnout then was modest, and many of the projects were fairly weak. This year, though, you could see that the kids got the hang of it. The cafeteria was filled, and there were so many parents and kids that it was hard to walk through. It did my heart good to see the parents and kids step up for science.
- The obligatory baking soda volcanoes were out in force, as were the butterfly displays. Other than TB’s and TG’s, the highlight had to be one on “Glucosamine and Logan.” Logan is a dog. He’s a family pet, and apparently pretty old. The vet told the family that they should switch Logan’s treats from Milk Bones to Healthy Hips with Glucosamine. They did, and for five weeks they tracked Logan’s ability to climb the stairs without help. Happily, Logan’s progress was good, and they’ve decided to keep Logan on the new treats. The display board featured a photo of Logan looking resigned. It was hard to resist saying “awww...”
- Overheard while walking between displays: “Never play basketball on a trampoline.” Okay then.
- Thanks to her petition drive, TW has become a local celebrity. All of the teachers know her at this point, and so do a lot of the Moms. (I’m known, if at all, as “TW’s husband.”) She already has over 700 signatures on her petition, and she has an appointment to hand-deliver it to our state senator next week. Earlier this week she spoke at a school board meeting and stole the show. (She showed far better political instincts than the Board member who addressed us, her assembled constituents, as "you people.") It’s fun to see her on such a roll.
- We learned of yet another girl who has a crush on TB. Bless him, he doesn’t have a clue. Take your time, kid...
- After the fair, we stopped by a local ice cream place to celebrate. It was too cold to eat outside, so we got it to go. As she prepared to climb into the back seat, holding her ice cream, I reminded TG to be careful not to drop it. She replied brightly, “You can hopefully count on me!” I’m considering framing that over my desk.
Wednesday, April 13, 2011
Ask the Administrator: One Prediction Leads to Another
Your latest post on a coming personnel movement is right on, I think, and it brings me to this question: Do you see any benefit to junior faculty who stick it out? That is, will the realignment in higher ed open paths to administration or other promotions that might not have existed before? If so, will there be greater freedom to do those jobs in new ways (i.e., less structural resistance to change due to sheer desperation)? That could be an exciting prospect for some of us.
The first two numbers of my salary have not changed since I began my current job, leading me to seriously regret not negotiating harder at the point of hire, as you recommend. The salaries at my SLAC were already near the bottom of our conference, and the administration is absolutely unapologetic about this. On the bad days--those days when I'm sweating over a course, or dealing with departmental dysfunction, or defending a decision to kick a hostile student out of class--I think to myself: I could be getting paid so much more to do something way easier. (I spent years in the field before and during graduate school, so there is some basis for this line of thought.) However, I love the intellectual work of higher ed and I'm looking for reasons to stay. Might one of those reasons be future opportunities?
I don’t mean this to be evasive, but it depends on what you mean by “stick it out.” (And of course, every local context has its own quirks; any given college could run counter to a larger national trend.)
That said, I can say with confidence that I’ve found it far, far easier to fill tenure-track faculty positions than to fill dean’s positions. There is no shortage of intelligent, qualified, engaging people who are eager to step into tenure-track roles, even for colleges like mine, at the lower end of the prestige hierarchy. The same simply cannot be said of deanships. Those are proving devilishly hard to fill with people who fit even the minimum qualifications.
I don’t see the latter trend changing anytime soon; if anything, as I mentioned yesterday, it’s likely to intensify. With the last huge generation of hires aging out of the profession, the thin faculty bench stands exposed. With budgetary pressures getting worse, many intelligent people see administrative roles -- largely correctly -- as no-win, so they stay away. And if your college is late to the party when the thaw finally comes, I’d expect to see people with options start to bail quickly. The resulting vacuum could create a powerful updraft. The key will be positioning yourself to take advantage of that updraft.
That’s where I hesitate with your mentions of “sticking it out” and “doing something easier.” The one thing that almost certainly will not work is standing pat. You’ll need to be willing to step outside the traditional faculty role in order to gain the experience to be a viable candidate. Whether doing administration is easier or harder than teaching is a matter of taste, I guess. Judging solely by the number of applicants, it must be harder. Overall, I’d describe it as the difference between distance running and sprinting. I’m not sure which is harder; they’re just different.
The issue of whether the jobs will change in time to keep good people, I think, depends on whether academia is willing to change to survive. Based on what I’ve seen so far, I’m pessimistic on that one. It seems likelier to me that change will be done to us, rather than by us. I’d like to be wrong on that, though.
If the updraft happens before the structural collapse, you should be in good shape. I’d just advise getting some experience now, to make yourself a good candidate.
Wise and worldly readers, what do you think? How would you advise playing the situation?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Tuesday, April 12, 2011
A Fearless Prediction
I’m basing this on two observations. The first is the increasing median age of full-timers in higher ed, especially on the administrative side. The second is the extinction of pay raises.
There have been recessions before. In previous recessions, I’m told, the drill was as follows: low or no pay raises for a few years, followed by “make them up to you” raises when the money came back. The idea was to reward loyalty among those who stuck around during the bad years.
I don’t foresee the “make them up to you” raises this time around. The political direction of the country being what it is, I just don’t see public employees winning significant gains in the foreseeable future.
That means that for many people, the only way to effect a meaningful increase in salary will be to change jobs. And those who do will be well-advised to negotiate hard at the point of hire, since they won’t be able to count on meaningful raises over time. If you’re fairly confident that any increases over the next half-decade or more will be below the rate of inflation, then you’d best get every penny you can upfront. It’s all downhill from there.
Incumbent employees will cry ‘foul’ over salary compression or inversion, and they’ll have a point. But when the rules favor one course of action over another, you have to assume that some people will act accordingly. Rule out raises for staying in place, and you will get movement. It’s as simple as that.
Some level of movement can be a good thing. When a campus or program becomes too inbred and static, it starts settling for things it shouldn’t. The occasional fresh pair of eyes can keep people honest and bring new perspectives to bear on existing problems. (I’ve seen departments that have gone literally decades between hires; at a certain point, “stasis” becomes “stagnation.”) But there can be too much of a good thing.
At this point, I suspect, we’re in the lag period between a fundamental change on the ground and people fully comprehending the change. We still have plenty of people playing by the old rules, just waiting for the “make it up to you” part of the cycle. But lags don’t last forever. Sooner or later, and I’m guessing sooner, people are going to start adapting to the new reality. When you pay for movement but you don’t pay for stability, sooner or later, you’ll get what you pay for.
All predictions guaranteed or your money back...
Monday, April 11, 2011
Community Colleges as Debtbusters
If you’re able to access it, this piece discusses people consciously choosing to spend the first two years of a much longer college education at a community college as a cost-cutting measure. The idea is to build up a slew of transferable credits at low cost, so that when you get to the four-year school, you’re only paying their rate for two years.
Yes, I’m biased, but I consider this a great idea for many people. If your four-year option is Princeton, then by all means, go directly. But if your four-year option is Midtier State U, the community college transfer route has real virtues.
The most obvious one, and the one that the article picked up on immediately, is cost. CC tuition is typically lower than anyone else’s, and living at home is usually cheaper than living in a dorm. (If living at home is simply not an option, cheap apartments usually compare fairly well to dorms, too.) The rule of thumb I’ve heard lately is that you don’t want your total educational debt to exceed your expected first year’s salary. Let’s say that each year at the four-year college requires $10,000 worth of loans. If you go directly, you graduate with $40,000 of debt. If you start at the cc and transfer (figuring roughly $2,500 per year in loans for the cc years), you graduate with about $25,000 in debt. In this economy, the difference is nothing to sneeze at.
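For the numerically inclined, the comparison works out like this. The $10,000-a-year figure is the post's; the $2,500-a-year cc figure is my own back-of-the-envelope assumption, reverse-engineered from the $25,000 total, and actual cc costs vary widely.

```python
# Rough four-year-degree debt comparison.
# FOUR_YEAR_ANNUAL_LOANS comes from the post; CC_ANNUAL_LOANS is an
# illustrative assumption, not a quoted figure.
FOUR_YEAR_ANNUAL_LOANS = 10_000
CC_ANNUAL_LOANS = 2_500  # assumption for illustration

# Route 1: four years at the four-year school.
direct_route = 4 * FOUR_YEAR_ANNUAL_LOANS

# Route 2: two years at the cc, then two years at the four-year school.
transfer_route = 2 * CC_ANNUAL_LOANS + 2 * FOUR_YEAR_ANNUAL_LOANS

print(direct_route)                    # 40000
print(transfer_route)                  # 25000
print(direct_route - transfer_route)   # 15000 saved
```

The exact savings depend entirely on local tuition and living costs; the point is the shape of the comparison, not the specific numbers.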
But there are other advantages, too. One of the dirty little secrets of American higher education is that many universities run the intro classes as cash cows. When I was in grad school at Flagship State, the undergrad intro course in my discipline was taught in an auditorium to 300 students at a pop. Sitting near the back, as I sometimes did, I can attest that student attention was spotty at best. (And that was before the plethora of electronics that students have now.)
The better community colleges tend to run all classes relatively small. Yes, the adjunct percentage is higher than it ought to be, but candidly, you may be better off educationally with a long-time adjunct than with a brand-new graduate student who’s teaching the first class of her life. On the full-time side, since cc’s hire faculty to teach, rather than to do research, you tend to have fewer Inscrutable Geniuses and more solid teachers.
If you’re susceptible to the seamier side of dorm life, a cc might be a good bet. Nobody likes to talk about it, but every January we get a non-trivial number of “reverse transfers” from four-year colleges. Many of them are students who got a little carried away with the bacchanalia of dorm life, and who lost their way academically as a result. A more distraction-free environment can be just the thing to get the potentially-dissolute back on track.
All of that said, this solution isn’t for everybody.
Most cc’s don’t offer the entire traditional college experience. If you have your heart set on football Saturdays and late-night dorm bull sessions, you may feel like you’re missing out. Depending on its location and profile, the options for Honors programs may be limited, and outside of those programs, you may not get the same average level of academic preparation among your peers as you would at most four-year schools. (‘Tis the hazard of open admissions.) Truthfully, some cc’s are very good at transfer, and some really aren’t. The ones that are represent a colossal bargain for many students; the ones that aren’t, don’t.
One way to tell the difference is to call the Student Affairs office and ask to speak to the Transfer Counselor (or Transfer Coordinator). If they don’t have one, you know what you need to know. If they do have one, ask for numbers of students sent to the schools to which you hope to transfer. Savvy students (and savvy parents) know to ask about “articulation agreements,” which are contracts between colleges spelling out which credits will transfer for which programs. If you already have a target college or university in mind, you want to choose your cc courses with an eye towards what that school will accept for full credit in your intended major. (Beware of “free elective” credit, which is where credits go to die.) It can also be a good idea to contact the destination college and ask them which community colleges send them the most transfer students. Past performance is no guarantee, but it beats random guessing.
Every year my cc sends an impressive number of students to a number of strong four year colleges, including some that high-achieving high schoolers compete hard to get into. A gold-plated diploma at what amounts to half price is one of the best bargains in American higher education, and it sets those students up to have a wide range of enviable choices. I’m proud to be a part of that, even if only indirectly.
Some cc’s are establishing bachelor’s degree completion programs on their own campuses, in which they contract with four-year colleges to run courses from the third and fourth years on the cc’s campus. These programs are usually targeted at returning adult students, though I’ve never seen them formally restricted to them. The idea behind programs like these is to improve both the cost and the convenience of the degree, allowing students to graduate with even less debt than they otherwise would have. For the right student, this is a major win; savvy students would be well-advised to ask about these when they go college-shopping.
This approach may not be for everybody, but it’s nice to see some media acknowledgement that community colleges do more than just remediation and job training. Even though job training gets the most political attention, some cc’s actually transfer more students than they graduate from terminal programs. If you’re in the right place, this can be a very appealing option.
Wise and worldly readers, did any of you go this route? If you did, did you learn anything about the process that you wish you had known when you started?
Friday, April 08, 2011
Community colleges catch a lot of flak for teaching so many sections of remedial (the preferred term now is “developmental”) math and English. (For present purposes, I’ll sidestep the politically loaded question of whether ESL should be considered developmental.) In a perfect world, every student who gets here would have been prepared well in high school, and would arrive ready to tackle college-level work.
This is not a perfect world. And given the realities of the K-12 system, especially in low-income areas, I will not hold my breath for that.
Many four-year colleges and universities simply screen the issue out through selective admissions. Swarthmore doesn’t worry itself overly much about developmental math; if you need a lot of help, you just don’t get in. But community colleges are open-admissions by mission; we don’t have the option to outsource the problem. We’re where the problem gets outsourced.
I was surprised, when I entered the cc world, to discover that course levels and pass rates are positively correlated; the ‘higher’ the course content, the higher the pass rate. Basic arithmetic -- the lowest level developmental math we teach -- has a lower pass rate than calculus. The same holds in English, if to a lesser degree.
At the League for Innovation conference a few weeks ago, some folks from the Community College Research Center presented some pretty compelling research that suggested several things. First, it found zero predictive validity in the placement tests that sentence students to developmental classes. Students who simply disregarded the placement and went directly into college-level courses did just as well as students who did as they were told. We’ve found something similar on my own campus. Last year, in an attempt to see if our “cut scores” were right, I asked the IR office and a math professor to see if there was a natural cliff in the placement test scores that would suggest the right levels for placing students into the various levels of developmental math. I had assumed that higher scores on the test would correlate with higher pass rates, and that the gently-slanting line would turn vertical at some discrete point. We could put the cutoff at that point, and thereby maximize the effectiveness of our program.
It didn’t work. Not only was there no discrete dropoff; there was no correlation at all between test scores and course performance. None. Zero. The placement test offered precisely zero predictive power.
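For the curious, the check the IR office ran amounts to this: correlate placement scores with later course performance and look for any relationship, let alone a cliff. A minimal sketch, with entirely made-up numbers standing in for real student data:

```python
# Sketch of the cut-score check: is there any correlation between
# placement-test scores and course grades? The data below are
# invented for illustration; only the method is the point.

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical students: placement score -> final course grade.
scores = [32, 45, 51, 58, 63, 70, 77, 85, 90, 96]
grades = [71, 55, 80, 62, 90, 58, 74, 66, 88, 61]

r = pearson(scores, grades)
# An |r| near zero means the test tells you essentially nothing
# about who will pass -- the finding described above.
print(round(r, 2))
```

With real data you would also plot pass rates against score bands to look for the hoped-for "cliff"; the finding reported above is that neither check turned up anything.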
Second, the CCRC found that the single strongest predictor of student success that’s actually under the college’s control -- so I’m ignoring gender and income of student, since we take all comers -- is length of sequence. The shorter the sequence, the better they do. The worst thing you can do, from a student success perspective, is to address perceived student deficits by adding more layers of remediation. If anything, you need to prune levels. Each new level provides a new ‘exit point’ -- the goal should be to minimize the exit points.
I’m excited about these findings, since they explain a few things and suggest an actual path for action.
Proprietary U did almost no remediation, despite recruiting a student body broadly comparable to a typical community college. At the time, I recall regarding that policy decision pretty cynically, especially since I had to teach some of those first semester students. Yet despite bringing in students who were palpably unprepared, it managed a graduation rate far higher than the nearby community colleges.
I’m beginning to think they were onto something.
This week I saw a webinar by Complete College America that made many of the same points, but that suggested a “co-requisite” strategy for developmental. In other words, it suggested having students take developmental English alongside English 101, and using the developmental class to address issues in 101 as they arise. It would require reconceiving the developmental classes as something closer to self-paced troubleshooting, but that may not be a bad thing. At least that way students will perceive a need for the material as they encounter it. It’s much easier to get student buy-in when the problem to solve is immediate. In a sense, it’s a variation on the ‘immersion’ approach to learning a language. You don’t learn a language by studying it in small chunks for a few hours a week. You learn a language by swimming in it. If the students need to learn math, let them swim in it; when they have what they need, let them get out of the pool.
I’ve had too many conversations with students who’ve told me earnestly that they don’t want to spend money and time on courses that “don’t count.” If they go in with a bad attitude, uninspired performance shouldn’t be surprising. Yes, extraordinary teacherly charisma can help, but I can’t scale that. Curricular change can scale.
This may seem pretty inside-baseball, but from the perspective of someone who’s tired of beating his head against the wall trying to improve student success rates without lowering standards, these findings offer real hope. It may be that the issue isn’t that we’re doing developmental wrong; the issue is that we’re doing it at all.
There’s real risk in moving away from an established pattern of doing things. As Galbraith noted fifty years ago, if you fail with the conventional approach, nobody holds it against you; if you fail with something novel, you’re considered an idiot. The “add-yet-another-level” model of developmental ed is well-established, with a legible logic of its own. But the failures of the existing model are just inexcusable. Assuming three levels of remediation with fifty percent pass rates at each -- which is pretty close to what we have -- only about 13 percent of the students who start at the lowest level will ever even reach the 101 level. An 87 percent dropout rate suggests that the argument for trying something different is pretty strong.
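The attrition arithmetic in that last paragraph is worth seeing spelled out: with three levels and a fifty percent pass rate at each, the fraction reaching the 101 level is simply 0.5 cubed.

```python
# The remediation funnel from the paragraph above: three developmental
# levels, each passed by about half the students who attempt it.
# (This simple model also ignores students who pass a level but never
# enroll in the next -- the "exit points" -- so it's an upper bound.)
levels = 3
pass_rate = 0.5

reach_101 = pass_rate ** levels  # fraction who clear all three levels

print(f"{reach_101:.1%} reach the 101 level")      # 12.5% -- "about 13 percent"
print(f"{1 - reach_101:.1%} never get there")      # 87.5%
```

Add a fourth level to the sequence and the share reaching 101 drops toward six percent, which is the CCRC's point about minimizing exit points in miniature.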
Wise and worldly readers, have you had experience with compressing or eliminating developmental levels? If so, did it work?
Thursday, April 07, 2011
What’s In a Name?
My state's community college system just got state money to offer job training. I assume this entails new classes, but it also influences the whole culture. My bosses are now asking me why my syllabus simply says "If you miss X number of classes you fail" instead of "In a workplace you are fired for not showing up... my classroom is the same, etc.". I don't like to play pretend that my classroom is a workplace, though it's clear that that's what my bosses would like. The power differentials are totally different between student and teacher vs. employee and employer. Money is going the wrong direction for one thing. And to blithely pretend apples are oranges just seems fruity to me.
That reminds me of a discussion I had years ago with a biologist. He made the argument -- true or not, I don’t know -- that many projects are funded in the name of a given disease, but have applicability across a wide range of diseases. Accordingly, savvy scientists know how to repackage what they’re doing according to the flavor of the month. It seemed to me somewhere between cynical and brilliant.
When I was at Proprietary U, nearly everything had to be presented in pre-employment terms. PU sold itself as education for employment, and it wanted a consistent message. It was a little annoying at first, but I figured out ways to make it work. Since my classes were in the gen ed side, which students generally enrolled there specifically to avoid, I’d get the “why do I have to take this class?” question almost every semester. The argument I found that usually worked was that the technical skills they were getting in their “major” classes would get them their first job, but the analysis and communication skills they developed with me would get them promoted. Management requires people who can see, and convey, more than just “how to.” If they ever wanted to get promoted above the help desk, they’d have to show that they could compete with all the liberal arts majors from Flagship State.
I’ve long held a pet peeve about people in academia generalizing about “industry” as if it were all the same. Different workplaces have different cultures. Some prize “face time,” and others allow you to work any sixty hours a week you want. Some define “communication skills” as the ability to simplify complexity; others define it as the ability to strike up a conversation with anyone, anywhere.
The common denominators tend not to be the kinds of things that folks in higher education like to think we teach. Yes, it’s important to show up when you’re supposed to, sober and clean, ready to work. But you shouldn’t need college to know that. You need to be able to navigate organizational politics well enough at least to stay out of trouble; higher ed can be helpful with that, but it’s neither necessary nor sufficient. You need a work ethic, but again, you shouldn’t need college for that. (If anything, I sometimes wonder if the party culture at some traditional colleges and universities is antithetical to developing a work ethic.)
Making matters worse, many of the skills that effective higher education helps to inculcate only show up indirectly and over time. It’s hard to show a one-to-one link between, say, a course on Elizabethan England and the ability to make sense out of ambiguity. I’m convinced that the link is there, based in part on the behavior of the wealthiest and most powerful in our society, but it’s difficult to prove. (Part of the reason that Academically Adrift got the reception it did, I think, is that it confirmed a suspicion among many of us that some students never quite get it. Interestingly, the liberal arts majors are the likeliest to get it.)
The question I’d love to pose to some of the folks who routinely bash the liberal arts in favor of more vocational majors would be why the colleges and universities that have the most elite clientele are so clearly focused on the liberal arts. Are the wealthy really that stupid? Or do they know something?
I’ve argued before that our largest ‘vocational’ major here is the liberal arts transfer major, and I believe that. Underfunded public colleges may have a hard time keeping up with the cutting edge of technology, but we can still do a damn good job of teaching literature, math, public speaking, history, and the rest of the fundamentals. Giving students that kind of grounding, the kind that gives them the skills to succeed in whatever comes next, is a form of job preparation in itself.
Wise and worldly readers, what do you think? Are the classic disciplines useful for developing job-relevant skills, or am I just chasing the disease of the week?
Wednesday, April 06, 2011
I’ve been struck by the disconnect between urgent messages of “we need more full-timers right now!” and the languid “the committee will meet when it gets around to it.” The cynical part of me thinks that if the first message were true, the second wouldn’t happen.
Faculty searches are designed to be inclusive to a fault, which is part of the issue. After a department gets its request approved, it puts together a search committee that includes faculty from within the department, faculty from some other part of the college, and a full-time staff member. (The committee is also chosen to avoid too much homogeneity, whether by gender, race, or age.) The committee meets with the affirmative action officer to go over process and the various legal do’s and don’ts. The position is posted, with a certain amount of time for candidates to submit applications. After that deadline passes*, the committee meets to winnow down the pile to ten or so for first-round interviews. The committee selects three or four finalists that it puts forward for second-round interviews, which are conducted by the chair of the original committee, the dean of the division, the vpaa, and the affirmative action officer.
The idea behind the process is to ensure that the first round interviews are conducted entirely by people in the trenches, both within and outside the discipline. (The outsiders help prevent too much inbreeding.) The chair is included in the second round to ensure that the face the candidate presents in the second round isn’t hugely different from the one presented in the first.
The upside of such an inclusive and deliberate process is that it ensures plenty of pairs of eyes on each candidate, and it tends to result in strong hires. Everyone has her own blind spots, but the idea is that with enough people involved, any one person’s blind spots should be cancelled out. And it usually works.
The downside is that getting all those schedules to mesh for a series of meetings is remarkably difficult. Faculty are only around during the weeks they have classes, but during those weeks, they have classes. (That sentence, by itself, should elicit knowing groans among my administrative colleagues.) As a result, searches routinely bog down at that stage, since the committee simply has a hard time getting together. We also have a rule that every member has to be present for every interview -- in the interests of fairness and consistency -- but getting a half-dozen people’s schedules to mesh a dozen times within the space of a few weeks is no small challenge.
The second round is typically quicker, since it involves fewer committee members and fewer candidates. But there, too, you have to allow at least a couple of weeks. It adds up.
The upshot is that, for all practical purposes, it takes two semesters to do a search right. In layman’s terms, it takes a year.
By itself, I guess that’s fine, but it stands in an odd tension with the urgency with which departments claim they need people. It seems to me that a four-month semester should be ample time, if they really mean it. But it’s incredibly hard to be both inclusive and fast.
At PU, the process was fast, but often not inclusive. Here it’s inclusive, but not fast.
Has your campus found a reasonably consistent and defensible way to be inclusive without blowing a year?
*Not every college honors its own deadlines; I’ve seen, and heard of, committees starting to read applications before the application deadline has passed. It strikes me as awful practice and potentially actionable, but it happens.
Tuesday, April 05, 2011
Ask the Administrator: Do CC Alums Have an Edge?
I guess no one warned me earlier, but I really had no idea how dire it is out there getting academic jobs nowadays. I'm thinking I'll have the phd within 3 years from now, at which point I'll have to figure out what I'm going to do to make a living. One option that I've been considering more and more is teaching at a community college.
What I want to ask you is whether being an alumnus of a cc will give me somewhat of an edge in getting a good tenure track position in a few years. I can really speak passionately about how meaningful an experience community college was for me when I was 21. I barely graduated high school, only went to the cc down the street from me to please my parents, and failed out my first year, only to come back one class at a time (non-matriculated at first) to graduate with honors before going off to a 4 year school, a prestigious masters program, and a phd ("low-ranked" as it is). I've also been teaching undergrads for 4 years now (by the time I'm done, it will be 6 or 7 years) and I have excellent observation reports from faculty and students alike. In other words, I have an inspiring story to tell cc students on day 1 (and to tell the people interviewing me for the job).
And for the record, it isn't a bullshit story. I have to land a job...I really think that cc was an incredible opportunity for someone like me and I am truly grateful for the guidance I received from my teachers there. I actually still have a letter from a faculty member who wrote about how revolutionary my change was while I was a student there.
So while I'm surrounded by big shots trying to elbow each other out of the way to publish in the most prestigious journals and to work with other big shots (and reading Foucault all the while), I am content to quietly finish my phd by writing a dissertation I'm proud of and move off to an unassuming community college where I can inspire people and not have to worry about the hustle of academia.
So I know that I don't have the edge over my competitors when it comes to teaching at R1's, but I think I ironically might have an edge over them in the cc job market, where my more modest background might make me a better candidate. Do I have this edge? What are cc's looking for in candidates? Would I have even more of an edge if I went back to the cc I went to? And finally, what is a salary for someone teaching poli sci in a tenure track position at a cc in the northeast? Is thinking that I could make 70k a year (eventually) crazy of me?
First, good luck on your search. The market is brutal out there in the evergreen disciplines, and even more so if you aren’t coming out of a brand-name doctoral program.
That said, I’ve noticed that while a community college degree is pretty much the opposite of a brand name, it can actually help in applying for a full-time cc teaching position. That’s because the great fear at this level, in hiring Ph.D.’s, is that they’re ‘settling.’ In my neck of the woods, we have a substantial number of Ph.D.’s on our faculty, as do most of our nearby counterparts. Many of them are wonderful, but there are some who just can’t let go of the dream of teaching at Pastoral Liberal Arts College, and who never pass up the opportunity to complain about teaching loads, students, salaries, travel funding, clouds in the sky, fish in the sea, or whatever else happens to enter their field of vision that day. Although small in number, these people are toxic, and search committees are well-advised to avoid them.
Having a cc degree in your own background can help immunize you against suspicions of covert snobbery. If you can make a convincing argument to the effect that teaching at this level is your first choice, and you say that knowing the realities of the setting, you may come across as the best of both worlds: the amply-credentialed candidate who actually wants to be here.
That said, I’d strongly encourage avoiding language like “not have to worry about the hustle of academia.” Teaching well at a cc is hard work. You don’t want to convey the impression that you’re looking to coast or take it easy. And it’s rarely a good idea to say you want location A to avoid location B. Talk about why you want location A.
Salaries vary by region and institution, but generally speaking, a full-time position in a liberal arts discipline will usually start somewhere in the 40’s. (California and New York City are exceptions.) Depending on locale, the most senior faculty will earn anywhere from the 80’s to the low 100’s. And certain specialized fields, like Nursing, sometimes have higher scales. Salaries aren’t creeping up much these days; since the recession hit, freezes and furloughs have become the order of the day. At some point, though, I expect that things will thaw a bit. If they don’t, I’d expect an accelerating exodus over the next few years.
Wise and worldly readers -- any tips for a grad student considering a cc gig?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Sunday, April 03, 2011
The commentary on the piece last week was instructive. (At least, once you got past the usual culture war posturing.) Some of the commenters had an “about time!” response, cheering on an action that they considered long overdue. Others told stories of bad behavior that had gone unpunished. Still others took the side of the shunned, arguing that the danger of informal sanctions based entirely on hearsay is that the accused will never get a chance to defend himself. After all, in court you have the option of testifying on your own behalf; in the court of whispered opinion, you really don’t.
Both sides are right, as far as they go, but both take for granted that official sanctions are ineffective. The pro-shunning side concludes from that failure that frontier justice is necessary. The anti-shunning side concedes the failure, but holds that such is life, and frontier justice is worse.
As an administrator -- one of the folks whose job it is to tend to those official channels -- I found the common assumption disturbing. Do official channels have to be so terribly ineffective?
It seems to me that official channels that actually work would offer the best of both worlds. The miscreants would be answerable for their sins, and the falsely accused could be exonerated. I don’t imagine perfection, given the inevitable gaps between “what actually happened” and “what can be proved,” but I can imagine getting close enough that calls for shunning campaigns would seem strange.
Academic culture is unique in many ways. One of its distinctive calling cards is a temperamental allergy to people with positional power actually using it. In most other lines of work, a taboo like that would be considered either bizarre or simply unthinkable; in academe, it’s a norm. It’s so normal that in a debate among highly educated people about sometimes-criminal conduct, the one shared assumption is that you don’t want administrators to get involved. In any other line of work, you’d want to bust the perps, and you’d call in the folks empowered to do that.
The taboo seems to be based on fear of abuse. But the idea that incapacitating those with positional power will somehow create a power-free zone is delusional. Instead, it empowers local bullies. When legitimate authority has been rendered irrelevant, then by default, the only option left against petty tyrants is some sort of frontier justice, whether overt or covert.
The choice is not between hierarchy and freedom. It’s between legible, accountable power, and illegible, unaccountable power. Both are problematic, but I’d take the former over the latter any day of the week. At least the former offers rules of combat.
To the extent that those legible, accountable institutions need to be reformed to be effective, by all means, bring it on. But to take their failure for granted -- to make their failure the assumed starting point of a discussion -- leaves nothing but unappealing options. Laws without police aren’t laws. The question of shunning doesn’t have a good answer, because it’s the wrong question. The right question is how to make legitimate, legible authority more effective.
Friday, April 01, 2011
The Dark Side of Choice
I actually agree with this.
Many public colleges and universities have embraced “comprehensiveness” as part of their “access” mission. The idea is that part of “access” is access to whatever program the student wants. Whether that program is liberal arts transfer, culinary arts, or auto repair, the college is presumed to be on the hook to provide it. I’m increasingly skeptical, though not for the traditional reason.
In olden times, I’m told, there existed in the land a strange breed called “professional students.” They could be identified by their distinctive markings -- tie-dye, mostly -- and vague smell of weed. They stayed in college forever, eking out meager livings and never confronting the real world. They accomplished this by changing majors a half-dozen times or more, thereby forestalling graduation. Some have also suggested that forestalling draft eligibility may have had something to do with it.
Several decades of tuition inflation and erosion in the minimum wage have threatened the natural habitat of the professional student, driving them nearly to extinction. So that downside of comprehensiveness has become largely moot. Now, when students stick around for a long time, it’s usually because they’re attending part-time and working close to full-time. In my own dealings with students, I can attest that I hear much more of “how can I graduate faster?” than “how can I stick around longer?” They’re much more interested in getting jobs than in staying on campus forever.
No, my objection is based on quality control, cost, and what for lack of a better term I’ll call student cluelessness.
I consider quality control and cost closely related. The more you have to water down a program, the more risk you’re taking with the quality of delivery. When a program is a little bit better than it has to be -- what I call “excellence,” or skeptics might call “waste” -- then in good times it can take risks, and in bad times it can make some sacrifices and still do right by its students. But when a program is already running dangerously lean, any cut of any magnitude will do real harm. (Even in good times, it won’t have the resources to experiment, and thereby to improve.) Too much efficiency at what you do now can actually prevent improvement, since there’s no room for the mistakes that are part of the learning curve.
There’s also an issue of blind spots. We all have them. A program with only one, or even two, full-time faculty in it is likely to have significant blind spots in its discipline. I’d be shocked if it didn’t.
All else being equal, a college of, say, 5000 students can probably do a better job of supporting twenty programs than it could of supporting fifty. Absent unique program-based funding, the larger number of programs means that each program gets fewer faculty and fewer resources. Each one runs leaner. That means each one has more blind spots than it should, less room to experiment than it should, and more adjuncts than it should.
Student cluelessness is the other objection. As programs multiply and the distinctions become finer-grained, students are less able to make intelligent decisions among them. Since each program has its own unique requirements and chains of prereqs, guessing wrong can put a student out of sequence and make completion more difficult. Yes, good individual advisement can reduce the incidence of that, but complexity inevitably breeds confusion.
At the two-year level, there’s also a basic issue of the degree of specialization that should really be expected. Outside of a few very prescriptive vocational programs, like Nursing, most of the degree programs tend to be heavy on the gen ed. Everybody has so many credits of humanities, social science, math, and the like to cover, so there’s only so much room for specialized coursework. Slicing that remainder of credits ever thinner seems likely to lead to diminishing returns.
Obviously, I’d rather have enough resources to be able to do everything well. That would be nifty. But in the absence of that, I’m increasingly convinced that it’s better to do fewer programs and do them well than to try to keep doing everything with less.
What do you think?