Tuesday, September 30, 2008
Ask the Administrator: Breaking Into Bio
I just graduated with my master's in biology and started to look at jobs in the tech industry, teaching adjunct on the side. I loved teaching when I TA'ed in grad school, a fact that surprised me. Well, I never found a tech job, but I took as many classes as I could teach at the local CC. I love it: I love teaching, interacting with the students, and helping them to understand things. I learn each day how to help both adult learners and new HS grads. I have decided to pursue teaching full time as a career choice.
Now my question. How hard is it to break into a CC FT position without any FT experience, only adjunct? What makes me more marketable? Is it worth getting a PhD or another master's in chemistry to make me more marketable? I tutor high school kids on the side as well, so my devotion to teaching seems clear. What are your suggestions?
First, congratulations on figuring out what you want to do! I'm always heartened when I hear of people seeing through the “what you're supposed to do” party line and finding what works for them.
The good news is that biology is a tough field for hiring, at least at the cc level, so you're probably in fairly decent shape if you're willing to be geographically flexible. (Nursing and other allied health programs are the main drivers of bio enrollments in the cc's I've seen, and they're hot these days.) For that first position, you should be able to find something pretty decent with a Master's and some adjunct experience. Anything else you can document that suggests a genuine love of teaching – tutoring, for example – could also help.
If you decide to do more graduate work, for a cc you'd probably be better off getting a Master's in a nearby field, like chemistry, than a doctorate in your primary field. The reason for that is that 'credential creep' hasn't hit the sciences yet with the same force that it has hit the humanities, and some smaller schools like to hire people who can cover classes in two disciplines. If you can cover both bio and chem, that puts you a step ahead of the folks who can only do one or the other. At the cc level, breadth is sometimes valued more than depth. Whether that's good or bad is a matter of taste.
If that's a bit more ambitious than you want to be right now – and I couldn't blame you – it might still be worth doing some focused reading in the scholarship of teaching and learning, especially as it's practiced in the STEM disciplines (Science, Technology, Engineering, Math). If there's a local or relatively local conference of 'best practices' in higher ed, go. You'll probably glean some things you can use in your current positions, which is great, but you'll also pick up some fluency in the kinds of things that get you noticed by cc search committees.
Successful teaching at this level is the antithesis of the “weed 'em out” approach. Here it's about finding ways to empower people with shaky academic pasts, diagnosed and undiagnosed learning disabilities, and the like, to succeed academically. Doing that while still maintaining high standards is a real challenge, and plenty of people either can't do it or burn out trying. If you can develop ways to do that, both individually and collaboratively, you'll be a hot commodity.
(The point about collaboration is worth highlighting. Grad programs often encourage a war of each against all, as everybody tries to be The Smartest Person In The Room. This is deeply dysfunctional, and to be avoided. In a cc setting, it's not about being the hotshot; it's about being a committed teacher who can make her department better by working with, instead of against, her colleagues. If you can show that you've teamed up with colleagues in some form and tried something innovative to help struggling students succeed, that should work like catnip with the committee.)
All of that said, I'll admit that my background isn't in the lab sciences. So I'll ask my wise and worldly readers, especially in the lab sciences, to fill in the gaps. Science-y types – what say you?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Monday, September 29, 2008
In Which I Admit to Being Really Confused
And a whole bunch of banks, quasi-banks, sorta-banks, and insurance companies are in trouble because, at the root of it all, too many people aren’t able to pay back what they’ve borrowed. (The “no credit, no problem” approach to lending only works when house values are climbing, since anyone who can’t make payments anymore can sell at a profit. With house prices falling, now folks who can’t make payments often owe more than the house is worth, so they can’t get out from under.) The banks used to sell the IOU’s from mortgages on the open market to get new money to lend, but now the buyers of those IOU’s aren’t buying anymore, since they’re afraid of getting stuck with unpaid loans. And the banks don’t want to hold the loans themselves, since they know just how shaky the stuff actually is.
Since the banks aren’t getting infusions of money to lend from selling the IOU’s anymore, and the loans they can’t sell are often going unpaid, they have a lot less to lend (and much worse jitters about lending it). This means that people who want to buy houses or cars often can’t, since they can’t get loans. When houses and cars go unsold, their value drops, and the people who make and/or sell houses and/or cars get laid off. Then the people who are laid off have trouble making their mortgage payments, and it gets uglier.
(On the bright side, the relatively few people who make stuff for export are actually doing better, since the declining dollar makes our stuff more competitive.)
Left to its own devices, the market would jack up interest rates to compensate for perceived increased risk. But the Fed doesn’t want that to happen, for fear of accelerating the downward spiral and turning a popped-bubble recession into a devil-take-the-hindmost depression. So the Fed is pouring money into the financial markets to make sure that banks can still lend. And now the government is considering borrowing another 700 billion or so to buy up the stinky IOU’s, to free the banks (and the people the banks sell the IOU’s to) from the consequences of their mistakes, so they can make more.
Here’s where I get confused.
Obviously, there's the 'moral hazard' problem. If the lenders are freed from the consequences of their actions, won’t they revert to form and keep doing what they were doing before? If they get to keep windfall profits in good times, but get bailed out in bad times, why not bet the house (literally) every time?
But more basically than that, where the hell are we getting 700 billion or so to buy up the stinky IOU’s? Aren’t we already running a huge national debt? The only way I can figure it is that either we snooker the Chinese real good and convince them that we’re much better risks than we actually are – which seems unlikely to work – or we just print more money. But the ‘print more money’ approach doesn’t do much to build trust with our lenders, who find that the substantial IOU’s they hold from us are suddenly worth less, since dollars are suddenly worth less. Won’t they compensate by either jacking up interest rates or just refusing to lend?
And if they do that, aren’t we just re-creating the same problem, but on a much larger scale? If the government bails out the banks, who bails out the government?
This has a ‘banana republic’ feel to it. If the cost of borrowing from foreigners becomes prohibitive, the way around that is to either stop borrowing and get our stuff together – probably a wrenching process in the best of times – or to just roll the presses even faster. That’s what a lot of third world countries do. It seems to work for a little while, but usually leads to hyperinflation and economic collapse. You can’t keep flooding the market with money and not expect prices to go up, since the supply of money would increase so much faster than the supply of stuff.
Wouldn’t it make a lot more sense to stop borrowing money for, say, wars of choice, and instead direct infusions of cash to, say, the people who can’t make their mortgage payments? If they can make their payments, then the IOU’s suddenly become non-stinky, and the financial institutions will have the money to start lending again. If people can actually get loans, they’ll buy stuff – we Americans are good at that – which leads to jobs making stuff, so then those folks buy more stuff, and so on. A bailout directed at the IOU holders doesn’t address the root of the problem – just because old IOU’s got paid off doesn’t make new ones any more solvent. The root of the problem is that people can’t make their payments. Address that through, say, rediscovering the principle of progressive taxation, and taking a serious look at the distribution of income. A prosperous middle class will consume more than enough to keep everything humming along nicely. We’re good at consuming. It’s what we do. We just need the money to do it.
In other words, wouldn’t it make a lot more sense to avoid the ‘borrow and print’ strategy, and instead to tax the uber-wealthy to help the middle class make good on its mortgages? Yes, they’d bitch, but they always bitch. That’s what they do. And as annoying as, say, the income tax can be, depressions are much more so. Instead of borrowing money from abroad and hastening a catastrophic collapse, we could tap the huge private reserves of money already here and actually right the ship. Among other benefits, this would demonstrate to our creditors that we’re a good risk, which would keep the overall system nice and stable. It would also have a nice moral cast to it, since many of those private reserves of wealth are essentially ill-gotten. Make the folks who profited from irresponsibility pay for it. Fair is fair.
Instead, some very experienced, very smart people are saying that we have no choice but to borrow hundreds of billions right now, not bother connecting the dots, and trust that the same very smart people who caused the mess will fix it if we just leave them alone with all that money. They’re good for it, right?
We're supposed to believe all that. And that's why I'm confused.
Friday, September 26, 2008
Raising Future Writers
The Wife: TG, tell Daddy what you said at school today.
The Girl (earnestly): Daddy, 'tushie' is more appwopwiate than “heinie.”
So now we know.
The Boy: During recess, Dylan got hit in the you-know-whats.
TB: You know, the nuts.
Got it, thanks.
There's something humbling, and a little frightening, about seeing your own quirks reflected back to you in your kids. They're already impressively precise in their language, and attuned to how they're heard. This means I get away with nothing. It also means I know they're in for a bumpy ride in adolescence. But I'll admit to some parental pride in hearing my four-year-old tell me what's appwopwiate.
Thursday, September 25, 2008
Bumper Stickers? Really?
Other issues, by contrast, are so obvious that any sentient being should be able to dispose of them immediately. This is one of those.
According to IHE, the University of Illinois promulgated some ethics guidelines that, among other things, ban partisan bumper stickers on employees' cars. Non-partisan bumper stickers are fine. So “Just Do Me” is recognized as free speech, but “Obama '08” is over the line.*
The theory of constitutional interpretation underlying this could be described as 'obscure.' Even Justice Scalia has argued that political speech is entitled to the highest level of First Amendment protection. If we assume that employees pay for their own cars, and drive them both on campus and off, then I'd be at a loss to explain why a public employer should have the authority to monitor employees' bumpers for political content, and not for any other kind.
(It doesn't take much to establish a really slippery slope from there. If my car somehow represents the college, and therefore must be neutral wherever it goes, then one could easily argue that my front yard represents the college, too. After all, my neighbors know where I work. No yard signs for me! And what if a neighbor sees me going to vote? Best not to do that, either.)
The article quotes a spokesman for the university saying of the bumper sticker ban that “officially, it does apply.” As a card-carrying bureaucrat, I can attest that the word “officially” speaks volumes. Simply put, that's the standard signal for “even I don't really believe this, but I'm obligated to say it.” Officially, jaywalking is illegal. Officially, employees aren't supposed to take or make personal calls while at work. Officially, students are required to attend every meeting of every class.
The ethics regulations also go into less obvious areas, like wearing campaign buttons in class. I don't think I've ever seen a professor wear a campaign button in class, even going back to my own student days, but it's at least theoretically possible. My sense is that it would be bad practice, since it would only serve to distract the students from the task at hand, but I hardly see it rising to the level of actionable misconduct. And faculty office doors – also addressed in the policy – are famously rich with all manner of cartoons, stickers, opinions, and jokes, as well they should be. I wouldn't object to a policy saying that anyone who affixes stickers to their door is responsible for removing them when they move out, but that's about cleanliness, rather than content.
If the America in which I believe has any reality, this policy will be relegated to the dustbin of history posthaste. You don't give up your freedom of speech when you work for a public university. I know we've lost our bearings over the last several years, but honestly.
* There's also the commonly-exploited distinction between 'partisan' and 'political.' Technically, both the NRA and MoveOn.org are 'nonpartisan,' since neither is officially a part of a political party, but they aren't fooling anybody, and aren't particularly trying. Perversely enough, most negative messages are non-partisan, since they rely on attacking rather than endorsing. “Vote Obama” is partisan; “Stop McCain” isn't, since one could presumably stop McCain by organizing a write-in campaign for, say, Gumby. Expunging positive messages, while allowing negative ones to flourish, strikes me as unhelpful.
Wednesday, September 24, 2008
Some of the stats cited in the report are worth checking out. Among them:
Nationally, only 11 percent of community college faculty are both tenured or tenure-track and under the age of 45.
In 1986, 42 percent of college Presidents were younger than 50; in 2006, only 8 percent were.
The younger generation of full-time faculty is more diverse than its predecessors, but also dramatically smaller. The shift to adjunct instruction has disproportionately affected Gen X and later. Somewhat surprisingly, the report also notes that an increasing number of community college faculty start their teaching careers above the age of 40; apparently, it's becoming more common as a second career. (That fits my local observation; I just didn't know it was a national trend.)
The report recommends rethinking the traditional upward ladder of faculty – chair – dean – vp – president to allow younger faculty to rise more rapidly, whether by skipping steps or by spending less time at each step. Although it doesn't spell out the alternative, anyone who has been paying attention knows that the alternative is an acceleration of the trend of college leadership coming from outside the academic ranks. That may or may not be a good thing, but it's getting predictable.
This looks to be one of those trends that sneaks up on you, since the current generation of Presidents still hails largely from the boom years, and it's still a big group. One of the most common sources of college Presidents is other college Presidencies; there's a certain musical-chairs quality to many Presidential searches now. But I've seen plenty of dean-level searches fail, and I've seen a few vp-level searches fail, too. The pool is getting thin, which is pretty much what happens when you don't hire new people at the entry level for twenty years.
It's possible to interpret this trend in many ways, but for now, I'll highlight the positive. Youngish full-time faculty who want administrative opportunities may soon find them easier to get than they've been in quite some time. And unlike the folks before them, this cohort will have come up through the system in its current, brutal form, so they're likelier to have a clearer sense of it. It's possible – I'm being hopeful here – that this cohort will have a greater sense of the fragility of the academic enterprise, and will therefore be more thoughtful in its stewardship of it.
Or, it could be a bunch of callow, careerist douchebags. History will decide.
My own sense of the value of experience is a bit more nuanced than what the report suggests. Skipping more than one step strikes me as genuinely dangerous, since each level is meaningfully different than the one before it. (I'd be worried about a new President whose previous experience didn't go higher than chairing a department.) But I also suspect – and the literature I've seen on teaching suggests that there's something to this – that the payoff to experience at any given level plateaus after a certain point, and eventually can even turn negative. Five years of deaning is better than two, but I'm not convinced that fifteen is better than ten. So my recommendation for administrative hiring committees would be a bit different. Be wary of step-skipping, but don't make the mistake of valuing experience linearly, either. That kind of CYA behavior by hiring committees tends to perpetuate the musical-chairs hiring of the same faces over and over again, since the Old Boys have the most experience, by definition.
I'm not naïve enough to think that a looming shortage of deans will provoke a reversal of the trend towards adjuncting-out the faculty. That's a function of much larger forces. But it might make sense for faculty hiring committees to look for certain kinds of skills in new hires, and to encourage level-headed, ethical young colleagues to see past the “crossing over to the dark side” taboo and consider stepping up. Silly stereotypes may have been affordable in an age of abundance, but that's behind us. It's time to recognize that younger faculty with administrative potential should be encouraged, rather than dumped on or ostracized. The alternative is to import managers straight from the business world, with all that entails.
Tuesday, September 23, 2008
That said, it had one article that actually brought up a worthwhile issue, if indirectly.
Student evaluations of professors are usually coded statistically, and that works pretty well for professors who are generally considered great, average, or awful. If someone scores multiple standard deviations below the mean in every category, there’s a pretty good chance that something is going badly wrong. While it would be a mistake to base a personnel decision on a single data point, there’s nothing wrong with using data to highlight where a second, closer look is warranted.
But the statistical system doesn’t do as well with professors who aren’t “generally considered” any one thing. Some professors evoke polarized responses among students: simply put, they’re either loved or hated. In the numbers, loved and hated might average out to something like “nothing special,” but that’s misleading. These are where the dreaded judgment calls have to be made.
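To make the averaging problem concrete, here's a toy sketch (with hypothetical 1-to-5 ratings I've made up for illustration) of how a "loved or hated" professor and a uniformly so-so professor can post the identical mean score, with only the spread giving the game away:

```python
from statistics import mean, stdev

# Hypothetical 1-5 student ratings, invented for illustration.
steady = [3, 3, 3, 3, 3, 3, 3, 3]        # "nothing special" across the board
polarizing = [5, 5, 5, 5, 1, 1, 1, 1]    # loved by half, hated by half

# Both sets of ratings average out to exactly the same score...
assert mean(steady) == mean(polarizing) == 3

# ...but the spread tells a very different story.
print(f"steady:     mean={mean(steady):.1f}, stdev={stdev(steady):.2f}")
print(f"polarizing: mean={mean(polarizing):.1f}, stdev={stdev(polarizing):.2f}")
```

Which is just to say: anyone screening evaluations on means alone will file both professors under "average," and only a look at the distribution (or at the written comments) surfaces the polarized case.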
Any number of things can lead to polarized responses. I’ll start with the hot-button one of a ‘political agenda.’ Yes, some professors are ham-fisted (or seem that way, depending on your background) in how they present material, and students object to that. But to me, ‘political opinions’ and ‘teacherly craft’ are separable. I’ve seen professors with strong beliefs structure their classes to foster a vigorous exchange of ideas, and win the respect (if not the agreement) of those who disagree. The key variable here isn’t so much a strong set of opinions, but the ability to run a class in a professional and ethical way.
(Annoyingly, some students simply won’t tolerate any challenge to their pre-existing beliefs. A student once accused me of advocating cannibalism when I assigned Swift’s “A Modest Proposal” as a reading. It took some time to get him to acknowledge that there can be value in reading things with which you disagree.)
More commonly, though, sarcasm tends to be polarizing. Those who feel secure in their grasp of what’s going on often find it refreshing; those who are struggling often find it arrogant. This tends to hold true regardless of politics. At PU, one of my math faculty was widely despised by students, despite what I observed to be a clear and accessible teaching style. Over time I figured out that it was his sarcasm – gentle by my standards, but there it was – that set them off. Since they were struggling in the class, they heard the occasional aside – rightly or wrongly – as insulting.
Personality conflicts can also play a role. I had a class at SLAC with a professor whom just about everybody liked; I couldn’t stand him. Everybody else found him charismatic; I found him smug beyond belief. Strong personalities will probably elicit more of these responses than others; whether that’s right or wrong I’ll leave to the ethicists. And those frustrating issues of cultural fit can fall under this category, too, with varying degrees of fairness.
Offbeat subject matter or teaching methods can also elicit unpredictable responses. In my experience, in-class role play exercises tend to either soar or flop, with little in-between. I’ve found that some level of honesty when things flop can actually go a long way with students. They read honesty, correctly, as respect, and respond accordingly.
I’ll admit to getting a little nervous when a polarizing professor blames negative responses on the students. There’s often some truth in it, but there’s also the basic fact that the students are who they are, and teaching the students we actually have is the job. You’ll never reach everybody, and that’s to be expected, but I’m much more sympathetic to the professor who says “I’m having trouble with x group, so I’m trying this new strategy” than with the one who says “x group just isn’t fit to be here.” There’s also the inconvenient fact that other faculty have the exact same students, and somehow seem to get through. If 8 out of 9 members of a department seem to be doing fine, and the 9th isn’t, and the 9th’s explanation is that her students suck, I tend to be skeptical. Anything’s possible, but the burden of proof in that case is pretty heavy.
Wise and worldly readers – what do you think makes a professor polarizing? Have you been one?
Monday, September 22, 2008
1. Midyear budget cuts. These happen when tax revenues fall below projections, and the states (or localities) can't (or won't) borrow or dip into reserves to make up the difference. Midyear budget cuts are uniquely brutal, since there's typically very little wiggle room in a budget after the annual contracts have already gone out. If someone is on a July-to-June contract, an October or December layoff may require a payout equivalent to what she would have made through June, thereby negating any savings. (Once you factor in the inevitable discrimination/wrongful termination/procedural irregularity lawsuits, it's a net money loser.) Between permanent fixed costs – facilities, tenured faculty – and costs that are fixed at the start of the fiscal year – most other staff – there just isn't much left over, especially if it has to do twelve months of fiscal duty in only eight or nine months. You can cut travel, and some ceremonial/fluffy stuff, but you're dealing in tenths of a percent of an overall budget. It's necessary, but not even close to sufficient. You can reduce your Spring adjuncts, but it's incredibly hard to save money by squeezing the lowest-paid employees, especially when they have a direct connection to your major revenue source. Philanthropy almost certainly won't save you, either, since that money is almost always barred from being used in 'operating' budgets. (Of course, with the financial services crowd running scared, there may not be as much philanthropy sloshing around anyway.)
2. Increased spending on financial aid for students, since economic downturns tend to mean fewer people capable of paying full fare. Combining increased financial aid spending with midyear budget cuts is all kinds of fun. (Cc's may be spared some of this, to the extent that the poorest of the poor elect residential colleges instead, the better to have room and board subsidized.)
3. Tuition increases. This one follows pretty directly from 1 and 2. Admittedly, combining 2 and 3 leads to a chasing-your-own-tail exercise, at some level.
4. Local strategic planning consigned to (increased?) irrelevance. Strategic planning isn't my favorite exercise in the world anyway, but ideally, it's supposed to represent the best hopes of an institution, combined with its best guesses as to future circumstances. If those circumstances change for the worse, abruptly, then those plans quickly become untenable. Yes, we'd love to beef up the Underwater Poetry program, but right now we aren't sure we can cover the cost of heating the buildings. First things first.
5. More of the 'replace retired full-timers with adjuncts and hope for the best' approach to hiring. In the very short term, the cost savings from adjuncting-out the classes that had been taught by a high-seniority full-timer are significant. The damage to the institution is (usually) somewhat abstract at first, and hard to quantify, but the savings are concrete and easy to quantify. When the bulkhead is broken and water is rushing in, the quick fix holds real appeal. If last week's IHE article was right, and the pace of full-timers retiring will slow as the returns on their pensions slip into negative territory, then the pressure to adjunct-out the few who actually do retire will be all the greater. Expect an already unforgiving job market in the evergreen disciplines to get that much worse, at least until the dust settles.
6. More 'votes of no confidence' on individual campuses. Since the faculty can't fire the governor, the voters, or Ben Bernanke, they direct their fire where they can. If I were a betting man, I'd expect to see a spike in these over the next year or two.
7. Little to no help from either Presidential candidate or political party. I'm open to being proved wrong on this one in a few months.
Astute readers will note that most of these represent continuations of existing trends. Attribute that to a lack of imagination, if you want, but I think it's accurate. When 'politically chosen' austerity becomes 'forced' austerity, it's still austerity.
I don't often want to be wrong, but as far as this post goes, I want to be largely wrong. Wise and worldly readers – make me feel better by pointing out how I'm wrong. Seriously.
Friday, September 19, 2008
Perversely Enough, We're Upscaling
I knew that recessions (or economic slowdowns, since I'm really not interested in splitting semantic hairs here) generally bring increased enrollments at cc's. The reasons are straightforward enough: the opportunity cost of education is lower when jobs are scarce, the need for a degree is higher when jobs are scarce, and our low tuition becomes much more attractive when things get precarious. This is old news. People who otherwise might have gone somewhere more expensive will take a second look at the local cc when money is an issue.
Unless – and this was the part I didn't realize – they're so ridiculously broke that the logic circles around. According to my source, who's in a position to know, some of our increasing number of homeless (or intermittently homeless) students are actually transferring to four-year schools earlier than they would prefer. The draw is financial aid for dorm rooms and meal plans.
Financial aid at the cc only covers tuition, fees, and a (low) estimate for books.* It doesn't do anything for living expenses, which aren't getting any cheaper. But financial aid at the nearby residential four-year colleges includes room and board. If you're intermittently homeless, the prospect of aid covering a place to live and a meal plan is nothing to sneeze at.
So the perverse impact of the economic downturn is that we get more people from the upper end of the economic scale, since they're playing it safer by choosing the low-cost option, and fewer people from the lower end, since we don't offer subsidized room and board. Perversely enough, we're upscaling.
Community colleges live a Keynesian existence. Our enrollments are aggressively counter-cyclical, even though our operating subsidies generally aren't. (Our philanthropic income generally isn't, either, so there's little relief on that front.) Increased enrollments bring increased tuition revenue, which helps offset the decreased public aid, but which puts additional strains on services and facilities. (Parking has been a limited exception this time around, since gas got so expensive. We're seeing much fuller buses than in the past, so the increased stress on the parking lots is less than would have been expected.)
Now we're experiencing a sort of counter-cyclical sociological shift, with more middle-class students and fewer really poor ones just when the economy is going South. The cars in the student parking lots are, on average, a little nicer than they were a few years ago.
I'll admit to feeling a little unsettled at the thought of some of our students applying to transfer just to have a place to live, but I can't argue with their logic.
Wise and worldly readers at other cc's – have you seen this on your campus?
*I'll admit to being intrigued at the prospect of a textbook rental program, at a predictable flat rate that financial aid could cover, but we aren't there yet. Perversely enough, bookstore profits are part of the college's operating budget, so we couldn't afford to cannibalize that too much.
Thursday, September 18, 2008
Foot-Dragging and Network Externalities
In this context, I don't mean that in the substantive sense, though that's certainly there. I mean it in the procedural sense. How do you inform every potential stakeholder of, say, an event happening on campus in two days, or a grant application deadline in two months?
In olden times, back when 'full-time faculty' would have been considered a redundancy, you could just put pieces of paper in mailboxes. It wasn't the fastest method, but everybody had a mailbox, and nearly everybody was around a lot, so you could be reasonably sure that almost everybody at least had a chance to read the letter. (Yes, people were often quick to dispose of papers unread, but that's another issue.)
Some people still insist on that method, and it's not without certain virtues in very small, close-knit settings. But it doesn't work well for units larger than a single department, and it doesn't work at all with workforces that are substantially part-time. And with the imminent change in postal regs – starting November 1, any pre-sorted first class mail that gets returned to sender for an incorrect address will carry a fifty-cent penalty per piece for the sender – snail mail is too financially risky.
Email was supposed to save us, as was the campus website. They both have real virtues – they're quick, they save on paper and printing costs, and they're accessible from off-campus. In some ways, they're almost too quick and easy -- I sometimes wonder if attaching at least some cost to them might cut down on the amount of meaningless cc'ed emails I have to slog through every single day. But there's a substantial portion of the population that simply refuses to read emails or websites, and another substantial portion that reads them too sporadically to rely on them.
The dilemma here is that the tools only work when everyone uses them. I think the scholarly term for it is 'network externalities': the value of a communication tool increases for each person as more people use it. Phones weren't terribly useful when only Bell and Watson had them; as phones became more common, they became more useful, since there were more people to call. Email is similar. Back in the day, I can remember when email access was strange and exotic, and very few people had it. The only people I could email were other students in computer classes, so I didn't rely on it for anything. Now it's far more useful precisely because so many more people have it.
The problem with anything that relies on network externalities is that stubborn non-adopters can drag down its value for everyone else. Posting a message on the campus website becomes ineffective by itself, since some people don't look at the website (or don't look very often), so we have to rely on other media, too. I've actually seen emails asking people to look at a message on the website. Worse, some folks insist on paper backups as well, thereby defeating any cost or time savings. A tool that could have saved us all a lot of trouble winds up causing new problems, specifically because some people simply refuse to use it.
Wise and worldly readers, I'm hoping to steal some good ideas here. Have you seen effective ways to coax the technophobes into at least the 1990's? They're throwing sand in the gears for the rest of us, whether they're aware of it or not, and I'd love to be able to re-route resources from self-inflicted problems to real ones.
Wednesday, September 17, 2008
In response to the obvious question of why they'd care what faculty did with their own money, the provost, one Arnett Mace, responded:
“Some people argue that there is no cost to the university...but the person is not here conducting scholarly work or teaching.”
Apparently, it's impossible to conduct scholarly work outside the office, or to teach online. And exactly who they'll pay to conduct bed checks on the weekends is left for the reader to ponder.
(I can see the scenario now. A dean confronts a professor:
Dean: My sources tell me you weren't home on Saturday night. You weren't at a conference, were you?
Prof: No, I went straight from the rave to the orgy.
Dean: Okay, then. Just don't go to any conferences.)
The move starts to make sense later in the article, when the university's President is quoted as saying that “what we do in this climate gets looked at, and perceptions matter.”
Now we're getting somewhere.
Although it's a delicate subject, and feelings on both sides run high when it's mentioned, the fact of the matter is that public higher education exists at the pleasure of a public that really doesn't understand how it works. Some policies and practices that strike seasoned practitioners as silly have their roots in efforts to prevent inflammatory headlines.
Office hours are an easy example. I've seen (cough) colleges require faculty to be physically on campus four days a week, even if they only teach for three. The idea is that if the press gets wind of a non-trivial number of professors only showing up for work three days a week for full pay, it'll whip up a scandal, and the college will pay the price in reduced political support. Which would happen.
The press periodically gets its knickers in a twist over released time, or travel, or summers, or empty classrooms on Fridays (unless it's to save gas money), or tenure, or esoteric research, or controversial research, or tuition. Usually the stories are grounded in anecdotes about a few outliers – some of which are genuinely objectionable, some of which really aren't – but which imply a much more generalized pattern of abuse. These stories do a great deal of political damage, both individually and over time.
One of the jobs of administration is to find that sweet spot where 'allowing the place to function' and 'keeping the public reasonably happy' overlap. (In some places, that sweet spot is pronounced “football.”) When times are good, that overlap is large, and there's still plenty of room to move. When things get ugly, that overlap shrinks, and some cherished prerogatives get some cold, hard looks. That's when a certain amount of defensive Dilbert-ism creeps in.
I'm not defending Georgia's action, which strikes me as memorably stupid. But while it's certainly absurd, it's not completely random. It's a misfire, but I can see where they were aiming. The public sees 'travel' and thinks 'pleasure junket on my tax dollars.' One florid anecdote about a particularly overentitled professor in a tropical clime, and the university could lose millions. I get that.
The trick is in both acknowledging the reality of the danger, and in addressing it in ways that don't do violence to the underlying mission. UGA gets full credit for the former, but a zero for the latter.
Tuesday, September 16, 2008
Ask the Administrator: Presidential Spouses
The small, regional university at which I teach in the Midwest has a new president. He has what I believe to be a unique approach to administration: his wife is present with him at every school function, and is always introduced as "First Lady of (insert school name)." This has recently progressed to her appearing in newsletters and promotional materials as "First Lady of..."
I have never encountered this at any of the schools I attended or at which I taught before taking this position. My question for your readers: does this sort of honorific exist for the president's spouse at your school (and, if the spouse is male, what is his honorary title)?...
I realize this is probably a minor issue, but it just seems a little...well...weird. While it may simply be an eccentricity, it causes me to slightly worry regarding future issues like budget decisions. "Yes, we need to begin repairs on the wall that crumbled to the ground on the north side of the Humanities Building, but first we must erect a statue at the university entrance honoring the Czarina of..."
Sometimes I wonder what year this is. From the recent financial sector news, I'm thinking 1929. From this letter, I'm thinking 1950.
As near as I can tell, there are exactly two reasonable ways of dealing with Presidential spouses. One is to treat them like any other spouses. The other – which makes sense only in certain contexts – is to recognize the job of Presidential Spouse as a de facto social director, draw up a job description for the social director part, and pay a salary. In the context of some large institutions, where the President basically reports to the Development office, the President is largely the fundraiser-in-chief, with the spouse serving as a sort of chief of domestic staff. To the extent that this description applies, a salary or stipend of some sort is probably in order. I find that weird, but it captures the reality of some places.
What's happening in this example, I think, is a pretty highly developed case of Presidential narcissism.
Presidents are very visible figures, and to some degree they're never entirely off camera. This strikes me as a dangerous development in many ways. One of the ways I stay sane – and mine is a much lower-profile position – is by having a home life that's pretty cleanly separated from my work life. My kids have no idea what I do all day, and TW only gets vague outlines. That's not to protect anything nefarious, but just so the politics and drama at work don't contaminate home, too. Switching between 'work brain' and 'home brain' keeps me from burning out on either one. (The unfortunate side effect is that any requests to take care of something “on the way home” or “during lunch” frequently get forgotten, since I'm in 'work brain' mode.) Blogging is somewhere in between, since I do it at home and it's mostly about higher-ed issues, but I find it therapeutic.
The people who can't separate the two, and who have very high-profile positions, strike me as high-risk. Either they eventually just forget that the camera is on (like the Iowa cc President I mentioned last week), or, more commonly, they cannibalize their home lives until their marriages fall apart. (Anecdotally, I've noticed a high divorce rate among college Presidents.)
Involving the spouse as First Mate might seem to help, but it gives me the creeps. It's obviously subservient, introducing workplace hierarchy where it ought not to be. It puts the First Mate in an awkward position in terms of internal college politics, which could easily make things worse. It puts the other employees in some awkward situations. And what happens if the First Couple separates or divorces, and the First Mate doesn't want to give up the job? (Let's say for pension or health insurance reasons.) Yucka yucka yuck. No, thanks. That has 'ugly' written all over it.
It's easy for Presidents – not a bunch of shrinking violets anyway – to forget where the boundaries are, and to conflate their personal convenience with the good of the college. All the more reason for colleges not to let them, I say.
I've been lucky enough that every place I've worked has treated Presidential spouses just like any other spouses. That seems sane to me. Quasi-royalist trappings don't make sense in 2008.
Wise and worldly readers – what have you seen? What do you think?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Monday, September 15, 2008
It's Not 'Career or Transfer'
Classically, cc programs are grouped into either 'career' or 'transfer' tracks. The idea is that students who matriculate typically have one of two desired outcomes when they graduate: either step right into a full-time job, or transfer to a four-year school. Perkins funding assumes a relatively clear distinction, as does much of the public discourse around cc's. (Even this satirical piece in last week's IHE seems to assume that the distinction is both natural and self-evident.) Politicians love the 'career' aspect of cc's, and are curiously silent on the transfer aspect. (They're similarly reticent on the subject of remediation.) Faculty, supposedly, favor the transfer side of things, though in practice I've found the stereotype to be pretty unreliable.
What we're discovering, though, is that the world is blurring the distinction between career and transfer, and we have to, too.
Fields that once hired people with two-year degrees are increasingly insisting on four-year degrees. (Locally, at least, that's proving to be the case with social work, engineering, and, surprisingly, nursing.) As the fields professionalize, they expect a higher level of academic qualification. In the late 90's, the way to handle that was to get a two-year degree at a cc, get an entry-level job in the field, and have the employer pay for the completion of the bachelor's degree, usually at night. But employer-paid tuition is much less common than it once was. In the late 90's at PU, most of the evening students were employer-paid. In the early 00's, that funding source dried up, and the program shrank dramatically. Since then, it hasn't bounced back much; employers seem increasingly to take the position that finishing your degree is your problem, to be solved on your dime.
That's not to say, though, that the two-year programs have lost their relevance; it's just that they aren't enough by themselves anymore. A student who really wants to break into those fields can (and probably should) complete our program and then transfer. So is it a transfer program or a career program?
The wisdom of the various fields 'professionalizing' is, of course, open to debate. But whether the shift is good or bad, it's out of our hands.
One strange side effect of the shift, at least in my observation, has been that the generic 'transfer' major – all gen eds, all the time – is thriving. Between the sticker shock of four-year college tuition and the increased relevance of transfer for career fields, the enrollments in the classic liberal arts courses are as healthy as they've been in decades. The liberal arts fields may be increasingly marginalized at many colleges and universities, but they're healthy – if not thriving – at cc's. I'm not entirely sure what to make of that – on the one hand, I'm happy to see the classic academic fields well-enrolled, but on the other, I'm a little wary of them becoming too closely identified with the least prestigious tier of higher education. Since prestige and funding tend to go together, this isn't just a theoretical point.
Although the facts on the ground are shifting, our ways of talking about them haven't. We still talk about career and transfer as if it were as clear as 'HVAC technician' and 'philosophy major.' And there are still enough clear cases that the discourse can survive. But that area in between is where the real growth is, and we haven't quite figured out how to handle that yet.
Friday, September 12, 2008
Thoughts on San Antonio
Apparently, San Antonio College has a policy stating that anybody who teaches 12 or more credits in a given semester is entitled to full-time status, including benefits (read: health insurance). This is a fairly common policy at cc's, and one with which I'm quite familiar. San Antonio's innovation has been to require adjuncts who hit 12 credits to sign a waiver relinquishing their right to full-time status and benefits as a condition of getting that last course.
So they have a policy, but they have a practice of requiring anyone to whom the policy would apply to waive it.
The story is worth reading in its entirety, if only to follow the tortured explanations offered by the college spokesperson. (My fave is the 85 days vs. 90 days. Very cute.) Her stated argument boils down to “it's either that or we don't run the class.” Her unstated argument is that they can't afford to pay what it would cost to make everyone they need full-time, so they're doing what they can to make ends meet.
This is the kind of stuff that gives administrators a bad name.
I can't imagine that this would hold up in court. If it did, I'd expect to see retailers and restaurants start requiring employees to waive their right to the minimum wage, contractors requiring employees to waive their right to OSHA standards, and the like. Down that road lies madness.
(I've always wanted to write “down that road lies madness.”)
Libertarians like to argue that freedom of contract is one of the most fundamental human rights. What they fail to recognize is that contracts have ripple effects far beyond their signatories. (In other words, there is a 'market' that is larger than individual actors, just as there is a 'society' beyond individual adults, and in both cases, the whole is importantly different from the sum of its parts. Some libertarians actually go so far as to resort to patent absurdity – “there is no society” – or veiled threat – “there is no alternative” – to avoid dealing with this.) When individuals were allowed to use poor judgment, they created a housing bubble – and crash – that first priced more prudent sorts out of the market altogether, and is now dragging down the entire economy. Your poor judgment stole my home equity – the idea that I shouldn't have anything to say about that is absurd. Similarly, if individuals are allowed to use poor judgment to get around minimum compensation laws, they'll drag down the prevailing market for everyone else. The idea that everybody else should just sit there and take it is somewhere between 'naïve' and 'offensive.' This is a fundamental failure of libertarian thought, built into its structure.
That's why I don't have much patience for the “nobody held a gun to their heads” line of argument. It can be used to justify all manner of exploitation, and often is.
People entrusted with decision-making positions ought to hold themselves – okay, ourselves – to a higher standard than “nobody held a gun to their heads.”
As a profession, I think we owe it to prospective grad students to warn them away, and we owe it to some chronically-underemployed freeway fliers to be candid about their prospects. In both cases, the point isn't to make people feel lousy about themselves, but to help them avoid or escape a situation in which success is unlikely.
But at the end of the day, 'ought' won't always cut it. As long as this kind of exploitation is relatively easy to do, some places will do it. That's why we need rules of the game, and why waiving those rules – even by mutual 'consent' – can't be tolerated. Recognizing each other's basic humanity involves recognizing that, at some important level, we're all in this together. We all have moments of weakness, and those moments affect other people. That's why we have to recognize the fact of collective consequences for individual actions, and therefore the potential legitimacy of collective restraints on those actions. If we don't ban abuse of the weak, the weak will be abused, whether that weakness is physical, economic, or anything else. Saying “they asked for it” doesn't make it right.
I honestly hope for a housecleaning in San Antonio. Some things are just too basic to let slide.
Thursday, September 11, 2008
This is the time of the semester when late-adding students show up to the second (or even third) meeting of a class, asking to be caught up and held harmless. In lecture classes, it's not that big a deal; you just tell the student he's responsible for whatever he has missed so far, and that's that. But in classes that do group work, or hands-on work, or anything intensely interactive, it's a real imposition.
It wreaks havoc with attendance policies, among other things. If a student wasn't enrolled yet, was he absent? My position is yes, because 'absent' means 'not present,' and he was not present. If a student happens to burn through his allotment of legal 'drops' in the first week, so be it.
The logic of grading, among other things, forces this position. Ideally, students are graded based on their demonstrated mastery of the designated learning outcomes for a given class. Also ideally, the number and depth of those learning outcomes is determined in part by reference to what's achievable in a given semester (or trimester, or quarter, or whatever the local system is). A student doesn't get a free pass on a learning outcome based on late registration. And the course grade doesn't come with an asterisk.
In a perfect world, we'd do away with the add/drop period altogether. But I know that's just not realistic. I recall dropping an Art History class after a single meeting in college when it became abundantly clear that my 'nap' reflex was entirely too strong to endure that long in a dark classroom. (I have the same reaction to Al Gore. TW laughed long and hard when she caught me snoring about a half-hour into the An Inconvenient Truth DVD. I still don't know how it ends.) And a class dropped usually entails another class added, since 'full-time' status usually requires a set number of credits, and the loss of full-time status often entails the loss of health insurance.
Although different schools handle the add/drop period differently, I've never seen a method that didn't bring headaches. I've heard tell of schools that let individual faculty decide who to let in and who not to, which strikes me as a 'due process' lawsuit waiting to happen. Some schools end the add period before ending the drop period, leading to the inevitable angry “now what am I supposed to do?” questions from students. And I worked at one college where the ERP system was so fouled up that a student could duck a prerequisite by dropping it during add/drop week; the computer wouldn't go back and re-check, so disturbing numbers of students were able to dodge math until remarkably late in their studies. What purpose they thought that served eludes me, but they did it in droves.
Some faculty get around the add/drop issue by making the first day of class non-substantive – just pass out the syllabus and some basic contact info and send them on their way. That way, late arrivals don't have to be caught up, since they really haven't missed anything. This strikes me as a form of consumer fraud, though, and it's profoundly disrespectful to the students who commute to school. (I've had angry students in my office saying “I drove a half hour for this?”) Yes, making the first day substantive can be a challenge, since it's pretty much a given that nobody has done any reading yet, but it strikes me as the kind of challenge that a professional ought to be able to handle.
Wise and worldly readers – has your college found a reasonable way to handle the conflicting demands of add/drop?
Wednesday, September 10, 2008
Ask the Administrator: Starting Over
I have a question about college teaching as a second career. Most of the advice I've seen is aimed at twenty-somethings finishing grad school. I am a few months shy of my 50th birthday and live in the DC metro area. I started graduate school straight out of college and was on my way to finishing my PhD in American History when I was side-tracked: I took a "real" job in an area unrelated to my degree. I kept plugging away at it though, and finally finished my doctorate in 1999, twelve years after leaving school. In 2002 I decided to use my degree and launched myself back into the world of academic history. I spent my first three years working in a local museum as the in-house historian. However, my job was based on grants and when my project was finished, so was the grant money.
Last autumn I was hired by a small local university as an adjunct. I've taught two classes every semester since, both survey classes and some other, more specialized classes. I get along well with my department chair and get great student reviews. My chair asked me to cover some classes for her when she was called out of town on an emergency, and I was interviewed for a full-time position by the department even though it was outside of my specialty (I did not make the final cut). This semester I am back teaching two courses: a survey and one other, covering for a professor who left the department unexpectedly.
My CV is starting to bulk up a little bit. My first book is coming out in early 2009, published by a good, small university press. My first article was published in a decent journal last year and I have another in the editing process now (the first reviewers liked it). I have given at least one conference paper each year for the past five years, and next year I will be presenting at the AHA.
My wife is OK with me being an adjunct, but I'm not satisfied with doing that forever. The pay is dismal and it's straining our family budget. A good buddy from my undergraduate days, who is now tenured, tells me not to give up, and that I'd have a job if I could move to a different area. My wife has a good career here, though, so we're not going anywhere for at least another 12 years. I've sent my CV and letters to the local university history departments, but haven't gotten any replies. I love teaching and I don't want to give up; besides, I can't go back to my old job (it's gone).
My question is simple: despite some progress, I am frustrated and wondering if I have taken on an impossible task. Did I blow my opportunity when I left the academic track all those years ago?
Honestly, I think age is less of a barrier here than location. At my cc, we've hired people in their fifties repeatedly; in fact, over the last several years, most of my hires have been in their forties and fifties. But the catch for you is that the market for tenure-track faculty positions is truly national, and DC is a popular area. If you're ensconced there for the next dozen years, your options are relatively few.
Certainly I'd advocate being open-minded about the 'level' of institution at which you'd be willing to work. Universities are great, and if you can find the job of your dreams there, more power to you. But teaching-oriented four-year colleges and community colleges (which DC itself lacks, for reasons that passeth understanding) can also offer many of the satisfactions of academic life, and even some virtues of their own.
I'd be surprised if the paper equivalent of cold-calling led to a full-time position. In the age of open searches, standard operating procedure is for colleges with tenure-track openings to advertise those in a few generally accepted venues, and for applicants to answer those. Yes, the folks who address job hunting in the corporate world love to talk about networking and elevator pitches and the rest, but academia is still its own world in some ways. Network at will, but actually respond to published ads.
(My favorite job-hunting disconnect is when I read that corporate HR departments will only spend ten seconds or so scanning an application for a few keywords. Can you imagine? Academic search committees tend to veer to the other extreme, which I think reflects a combination of habit, a scarcity of positions, and a surfeit of applicants. When a department only gets to hire once every five or ten years, it isn't going to resort to skimming.)
If you haven't let the cat out of the bag yet, I would go to some length to hide your place-bound status. If your current college thinks that it can keep you on the cheap since you have nowhere else to go, it probably will. That's a cold truth, but it's a truth nonetheless. And certainly don't bring it up elsewhere.
My general advice to folks mulling an academic career in an evergreen discipline is: don't. But since you've already cast your lot, good luck.
Wise and worldly readers – your thoughts?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Tuesday, September 09, 2008
Administrivia is the barrage of detail-y, annoying, meaningless crap that constitutes a sometimes-disheartening amount of the job. The English department needs more blue books, but it’s overdrawn on its instructional supplies account – can we transfer money from office supplies? We got an ‘unauthorized purchase’ notice from Purchasing because the shipping charge on the network printer exceeded the quote we used for the purchase requisition by $1.95; how do you want to handle it? Does a former cousin’s (by a defunct marriage) death merit ‘bereavement’ leave, or is that personal time? (Be careful with this one, since it sets a grievable precedent.) Someone’s journal subscription runs from April to April, so most of it falls into next year’s budget, but the chair didn’t realize that – how should we compensate?
I’ve actually had all of those, and so very, very many more.
They’re frustrating in many ways: they’re trivial, no-win, and often precedent-setting in bizarre ways. You find yourself spending inordinate amounts of time tracking down fractions of fractions, and trying to satisfy the demands of bookkeeping, local policy, the quirks of your ERP system, and statewide policy that may or may not fit your local reality.
Some deans fail by deciding that the whole realm of administrivia is simply beneath them. It both is and isn’t. It is, in the sense that nobody gets a doctorate to decide how to compensate for postage hikes. But it isn’t, in the sense that those annoying little detail-y things don’t go away when you ignore them. If anything, they get worse. The deans who decide that they’re “big picture people” usually either move up really fast – before the damage becomes visible – or crash and burn over time, as the problems they ignore snowball. These are the folks who disparage “management,” instead favoring “leadership.” Inspiration is all well and good, but if the bills don’t get paid or you wind up spending half your time fighting (and losing) grievances over “I can’t be bothered with this” impulsive decisions, then even the inspiration will fall flat.
The other extreme is the micromanaging control freak, the legal secretary writ large. These folks get noticed at lower levels for being good with budgets and for being conspicuously conscientious, both of which are positive. But they’re toxic as leaders, since their first impulse when confronted with anything new is to kill it. (Anything that doesn’t fit neatly into a pre-existing category is perceived as a threat.) They compensate for an inability to understand the big picture by elevating administrivia to Holy Writ. I’m still surprised, after all these years, at some of the issues they decide are worth fighting over. The easiest way to scare these people is to ask ‘why’ questions in sequence. After the first answer or two, they get lost, and often visibly nervous.
The deans I’ve seen succeed usually lean one way or the other, but have enough strength on the weak side to at least stay out of trouble. (As my regular readers can probably guess, I can do the philosophical stuff until I’m blue, but have to force myself to do the detail-y stuff.) The main survival skill the smarter ones have is enough self-awareness to surround themselves with people who are strong where they’re weak. At PU, my Associate Dean was an accountant by training, and we were a great match – he thrived on the stuff I hated, and vice versa. Here, we don’t have Associate Deans, so I’ve had to cobble together what I can. But you have to make a conscious choice to reject the “Mini Me” model of staffing. Otherwise you’ll spend so much time echoing each other that you’ll lose sight of those annoying realities that just won’t go away.
Obviously, the perfect dean would hit both parts of the job out of the park, but I haven’t seen that person yet.
Allowing for personal limitations and some degree of leaning, though, it’s tough to coach people to pay productive attention to the weak side. I’m working on that here, and have to admit mixed results.
Wise and worldly readers – have you found successful ways to get “leaders” to pay (at least some) attention to details, or to get the ‘legal secretaries’ of the world to get a grip?
Monday, September 08, 2008
Ask the Administrator: Academic Blacklists?
Currently I am applying for jobs for 2009-2010. My specialization is Central Asian history, but there are few openings specifically advertising for scholars of this region. Instead I have noticed a lot of positions for history of the Islamic world. Often these spots require teaching courses in Middle Eastern History – that is, the Arab states, Turkey, and Iran. I am currently teaching such a course at the overseas institution where I work. But my research and publication in the area is limited. Most of my publications deal specifically with the Soviet Union, especially Central Asia. However, I have one publication comparing Soviet nationality policies regarding the Crimean Tatars and Meskhetian Turks with Israeli policies towards the Palestinians. My personal feeling is that this piece shows my broader capabilities – that it demonstrates that I can teach and write about areas of the Islamic world outside of the former USSR. But a friend of mine has repeatedly insisted that I not send it as a writing sample. He claims that it will automatically get me blacklisted at many places because it is critical of Israeli policies. My feeling is that he is being overly paranoid. Besides, it is listed on my CV. But since he is so insistent, I am wondering how much truth there is to his claim. In particular I am looking at state schools in the Mountain states, Midwest, and South.
I'll have to start by acknowledging that my administrative and faculty experience have been exclusively at teaching-oriented institutions, as opposed to research universities, so that's all I can speak to from direct experience. Folks at research universities are especially invited to comment.
The usual meaning of 'blacklist,' as I understand it, is a list of names of people to be shunned. I've never seen such a list, nor have I ever heard of one, anyplace I've worked. It's true that some research topics get more interest/respect/funding than others, but to call that 'blacklisting' strikes me as melodramatic. 'Fashion' strikes me as closer. You may be considered out of step, but that's not the same as being blacklisted per se. (For example, it's entirely possible that prior to seeing your application, they've never heard of you.) Of course, from the perspective of an unsuccessful job applicant, that may amount to a distinction without a difference.
At the teaching institutions at which I've worked, the concern wouldn't be about whether you were pro- or anti-Israel, but about whether you were really interested in teaching the courses we'd be hiring you to teach. If we got the impression that you were really all about the Soviet Union, we would find you less appealing for a role teaching Middle Eastern History. That's not because of any particular political point of view, but because folks playing out of position tend to get crankier than do people who get to teach their first love. (I suspect that this inclination is why rhet/comp grads get jobs, and literature people struggle. If 'freshman composition' is your first love, we can be pretty sure you'll be happy here. If it's something you endure grudgingly in order to teach the occasional lit class, we can be pretty sure you'll be miserable here. That's not an absolute, by any means, but the rhet/comp folk start off with the benefit of the doubt. Similarly, people who have taught at cc's before have a leg up over folks who've never ventured far from Research U territory. That's not 'blacklisting' in any sense of the term I'd recognize, but if you're a struggling literature grad, that may not be much comfort.)
I sense the undercurrent to the question is the popular sense that higher ed is a bastion of liberal groupthink, and that positions counter to the alleged orthodoxy are inherently risky. Again, I've only worked at teaching institutions, but I can attest that this view simply doesn't describe the realities I've seen on the ground. It may at other places, and may have at other times, but I haven't seen it. There are certainly blowhards to be found here and there, and some of those blowhards use politics as a battering ram to make people listen to them, but in those cases the politics are merely an excuse. (I had the awful personal experience, many years ago, of working for a blowhard who used his own much-trumpeted altruism as a way of calling attention to himself. It was horrible.) The David Horowitzes of the world rage against the bogeyman of higher education to justify their own existence; those of us in the trenches wonder what the hell they're talking about, shrug, and go about our business.
On a more basic level, I wouldn't let fears of a theoretical monolith stop you from going where your research leads you. If you do, the monolith effectively becomes real. I say, call its bluff.
Wise and worldly readers – especially at research universities – what do you think?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Friday, September 05, 2008
The Return of Friday Fragments
1. My Mom sent The Wife some old pictures of me, from high school and college. Actual conversation:
TW: You look so skinny! Look, you barely have shoulders! You're like a rail!
The Girl (reassuringly): Now you're nice and big, Daddy.
Uh, thanks, honey.
2. Conversation from earlier this week:
The Boy: Dad, do I have to go to grad school?
DD: Noooooo. Noooo, you don't. Nope.
TB: Whew! I'd be tired of school by then!
DD: Yeah, probably.
3. I'll admit to a perverse fascination with the unfolding of the whole “Presidents Gone Wild” thing in Iowa. The President of Iowa Central Community College, Robert Paxton, got fired after a pretty unpresidential photograph of him ran in the local paper. (Judge the photo for yourself here.) His punishment included settling for a $400,000 severance payment and several years' worth of health insurance. Given that the young woman in the photo wasn't a student, and was of age, I'm not completely sure what he actually did wrong. That said, if a $400,000 check and several years' worth of health insurance is a punishment, I'm sure I could dig up a skeleton of my own somewhere.
4. I'll also admit to wondering about the possibilities of maintaining a squeaky-clean image in the age of cell phone cameras. It's much easier now to not know when your picture is being taken. I try to maintain a professional demeanor when I know I'm representing the college, but when I'm schlepping the kids on errands, sometimes I just look like the harried Dad I am. (And I don't even want to think about what a picture of me at the end of a workout would look like!) In an age of ubiquitous information and images, maybe we have to change the way we look at isolated moments.
5. And yes, we're all flawed. I had to confront a flaw of my own this week as I watched Sarah Palin. Sure, she's ideologically offensive, ethically tainted, and comically unqualified for her role; as a friend of mine wrote me in an email, her speech sounded like she was running for class president. And yet, even granting all of that, there's the basic, irreducible fact of her hotness. She's a poutier Tina Fey, and, well, I'm not made of stone, people. I'm not about to go do something stupid like vote for her, but I'll admit I don't mind seeing her picture in the paper.
6. The Boy started a new school this week, which involved riding a bus for the first time. As fate would have it, our stop was the first stop on the route. Not knowing any better, he sat right in the front row, and actually greeted each kid at each subsequent stop upon boarding. I'm torn between pride in raising such an earnest and sweet kid, and terror at how the other kids will read that. Unless things have changed pretty radically from my bus days, bus culture has rules, and “don't sit in the front seat and act as some sort of &*)(_&^ greeter” is a basic one. Poor guy.
Thursday, September 04, 2008
In Which I Realize That We're Doing It Wrong
In the last few weeks, two of the biggest, most respected, and most sought-after employers in our service area told me, independently and without prompting, that they desperately want bilingual employees. In the fields the employers represent, the ability to communicate with the population that actually exists is hugely important, and they’ve had a terrible time finding bilingual workers with the skills they want.
We teach a substantial (and growing) bilingual population, of course, but I realized just now that we’re doing it wrong.
Our entire ESL/Bilingual framework is built on the assumption that ESL status is an obstacle or a handicap. The unstated goal has been to ‘catch up’ the ESL population to the rest of the students. Accordingly, we have all manner of ‘bridge’ programs, tutoring, ‘outreach,’ and the rest.
These are all good, as far as they go. To the extent that they help students from struggling high schools to develop the skills to succeed in college and eventual careers, I’m all for them. And you’ll never catch me saying bad things about tutoring, whether for this group or anybody else.
But the attitudes we convey, and messages we send, by treating ESL status as a handicap are backwards. In this market, fluency in two languages (English and Spanish, really) is a huge plus. It’s an asset. Given two similarly qualified candidates, one bilingual and the other not, both employers made it abundantly clear to me that they’d hire the bilingual one in a heartbeat. The ability to communicate with Spanish-speaking clients (or, more importantly, potential clients) is a major business advantage, and one for which they’re willing to pay. It’s worth something to them.
But the messages we send to the local high schools with large Hispanic populations don’t mention that. They’re all about ‘access’ and ‘support’ and ‘respect,’ rather than ‘parlaying your advantage.’ They put the focus on possibility, rather than motivation. And anyone who has taught can tell you that motivation matters tremendously.
I’ll concede that utilitarian arguments for education don’t ‘sing’ the way that phrases like ‘the love of learning’ do, and that too exclusive a focus on perceived career payoff can be limiting. But utilitarian arguments can get folks in the door, and once they’re in, all kinds of things become possible. (And, in practice, the choices available aren’t usually “major in business or major in literature.” They’re closer to “go to college or go to work right away.”) And saying to a potential student that “you’re already ahead of the game” seems likelier to result in positive attitudes than “we’ll help you slog through all that remediation.”
Even better, since the asset in question is a skill rather than a trait, anybody who wants to can pick it up. We teach Spanish, and anybody who wants to enroll is welcome. In other words, while an emphasis on Spanish-as-asset would particularly benefit the Hispanic population, it ultimately isn’t racial. It’s about recognizing the market value of a teachable skill. Some people just happen to have been taught it at home.
(Of course, ‘level’ is key. It’s one thing to be able to translate “blue car,” and quite another to be able to translate “anterior cruciate ligament” or “collateralized debt obligation.” Fluency in everyday language doesn’t necessarily mean fluency in a specialized vocabulary. All the more reason for higher ed to lead the charge, I say.)
Wise and worldly readers – have you seen community outreach programs that emphasize the pragmatic advantage of bilingualism? Am I just reinventing an old wheel here?
Wednesday, September 03, 2008
Ask the Administrator: Gifts from Students
One of my students from last year came in today and gave me a t-shirt that he had gotten for me on his vacation. I do not currently have any grading responsibility for this student, and I don't expect to, but it's not impossible. I have written him a letter of recommendation, and I would happily do so again, if asked. I'm sure that this student meant it entirely innocently. There was no ulterior motive. He just likes me and wanted to say "thanks." So, is it appropriate for me to have accepted this gift? If not, what do I do now? I don't want to insult the student by returning it. Should I just quietly donate it to Goodwill?
As always, you are welcome to post this on your blog, and solicit advice from your W&W readers.
Ironically enough, it's the well-intentioned, innocent gifts that cause the most heartache.
Most colleges have some sort of ethics policy, or conflict of interest policy, about gifts. Those policies are usually fairly specific, often identifying a threshold beneath which the gift is considered too trivial to worry about. (The ones I've seen usually set a cutoff around 25 dollars, give or take.) HR departments usually have the specifics at the ready. If a ten-dollar t-shirt falls below the 'de minimis' threshold, you're fine.
The fact that you aren't grading the student is certainly helpful, too. The standard of “it's not impossible” strikes me as far too rigorous, since it could conceivably apply to just about anybody. (For those of us who work at open-admissions places, that's literally true.) Some colleges have 'fraternization' policies (that is, sex) that draw the line at grading responsibilities, and your case would meet that standard. I say if you could sleep with him, you can accept a t-shirt from him.
That said, there's often a world of difference between what's legal and what's right. And sometimes being a little extra careful can actually result in a teachable moment, if you do it right.
I worked briefly for a VP who gave conspicuously expensive gifts to all and sundry, and who encouraged others to do the same. Over time, a culture had developed in which elaborate gift-giving, both up and down the chain of command, was both expected and inappropriately binding. I wanted nothing to do with it, and my involuntarily chilly response communicated that clearly enough to offend him. Since it became clear that I couldn't simply opt out of the system, I did the next best thing, which was to communicate (and act upon the idea) that any gifts should be trivial. After a while, we got down to a few chocolates at Christmas, which struck me as about right.
Depending on the gift, sometimes there's also the option of sharing it with the department. (That works pretty well for flowers and candy and the like.) For t-shirts, though, that's probably not the best route.
The fear of giving offense is real. Back at PU, a perfectly lovely professor gave me a congratulatory card and a small check when TB was born. I returned the check, sheepishly explaining that since I did her review, I couldn't accept it. I don't know which of us was the more embarrassed, but it struck me as a reasonable price to pay for keeping things above reproach. (After I left PU and TG was born, she sent another card with a bigger check. I laughed out loud when it arrived.)
In my teaching days, I occasionally had students bring in editorial cartoons that they said reminded them of me or of my class. Some of them wound up on my bulletin board, and I'll admit that I didn't feel much ethical conflict over that. If anything, I was gratified that the class was making enough of an impression to spill over into random moments. Your case strikes me as similar to that.
Or maybe I'm just becoming jaded. Vox blogosphere, vox dei, so I'll turn it over to my readers. Wise and worldly readers – what say you?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Tuesday, September 02, 2008
Dropoffs at cc's aren't the same as dropoffs at many four-year schools, since most cc's don't have dorms. (This is apparently starting to change in New York.) More commonly, it isn't a dropoff at all, but the student living at home and driving himself to and from school. But psychologically, it's still very much a milestone.
In a way, I envy the dorm-bearing schools their clarity. When Taylor is dropped off at the dorm, and the car has one less person in it heading home, it's hard not to notice the change. At commuter colleges, the change is easier to ignore. Last year she spent her days at high school; this year, at college. What's the difference?
It's a big one. And it's easy to forget after all these years.
In the K-12 system, as a student, your course is mostly charted for you. Yes, there are a few elective slots in high school, but the basic trajectory is pretty much given. This is especially true if you live in one of those districts where college is the default assumption for high school grads. With a relatively clear path, there's an institutional emphasis on keeping you on the path. And you don't have to think too much about the path, since it's fairly obvious.
In college, for most people, that clarity is gone. (Some very clearly-defined tracks exist in selected majors, but they're exceptions.) And the abrupt shift from clarity to confusion is even tougher when combined with suddenly living away from parents, if that happens. (Every January we get a non-trivial number of 18 year olds who 'went away' to college in September, only to fall victim to the uncertainties and temptations of dorm life. They come to us for a fresh start under their parents' watchful eyes.) Suddenly, not thinking about the path isn't really an option; there's no 'default' setting. You have to make choices.
At cc's, it's possible to postpone some of that. At my campus, the single largest major is the generic 'transfer major,' which consists of the staples of “general education” for most majors at most four-year colleges. Between living at home and taking the transfer major, it's possible to buy time and still make meaningful progress. I suspect that more students could benefit from this, if they knew it existed.
Still, even if you live at home and take the transfer major, there's a basic assumption by the college that you're responsible for your own fate. You make your own choices, for better or worse, and nobody will save you from yourself. Help is available if you ask for it, but you have to ask.
Some parents seem to take a while to grasp the rule change. I get calls from parents asking why they weren't told that Johnny was failing a class. The short answer is FERPA, but the long answer is that it's Johnny's problem and not theirs. That's true even if they're paying the tuition. It seems cold and self-serving, but I'm increasingly convinced that it serves a real educational purpose.
Colleges teach in many ways; the classroom is only one of them. Looking back, part of what I learned in college was how to function when overwhelmed, how to produce when outgunned, and how to stay sane in absurd situations. I learned – imperfectly, of course – to cope. Nobody coped for me. Some of that involved falling on my face and making mistakes that, looking back, were genuinely stupid. But that's part of the experience. As a parent, it's painful to watch your kid stumble. I get that. But stumbling is part of the learning process.
(In my own case, part of what I learned was that I needed to get over myself. In high school I was The Guy in certain subjects, and could coast on what I had. Having learned bad habits, I spent the first semester of college getting my ass handed to me over and over again, and I have the GPA to prove it. It was painful and awkward and frustrating, but frankly, I needed that.)
Even if the college is local and the student is at home, it's a change. The kind of coasting that high school makes possible becomes impossible, and parents need to respect that. It's part of the value of college. Let the kid stumble. Let her get angst-y and confused and frustrated. Let her learn to deal with all of that. (Yes, there are limits, and there are times when it's appropriate to ask for help, but don't be too quick with that.) If she doesn't get the chance to learn those things, you're shortchanging her.
Good luck, parents. I feel your pain. And I'll feel my own sooner than I care to admit.