Friday, October 22, 2010
Ask the Administrator: Tracking Down Letters of Recommendation
I recently graduated from a third tier SUNY school with a degree in English. I want to be a high school English teacher, which means I want to go to grad school for a masters in adolescent education/teaching. I transferred into said SUNY from a cc my junior year. Graduated from SUNY with a 2.84 (3.0's and above for 3 out of my 4 semesters; got 3 C's one semester...brought down my GPA some, but I've always been a good student in college). I took some time these past few months, and it was then I decided to apply to grad school. I found some schools in my area that accommodated a GPA under 3.0, and I plan on taking the GRE to offset that. (Most of these schools, btw, require ACADEMIC recs, not professional ones.)
My problem arises from the recommendation portion of my applications....I can't get one! Figuring it never hurts to ask, I've contacted a few professors (through e-mail) from classes that I've gotten B's and above in. I've written very kind and thought-out e-mails explaining my future academic plans, asking if they would write me a recommendation, but understanding and thanking them anyway if they couldn't. Some professors turned me down (nicely), and some, after a few follow-up e-mails, never got back to me at all.
I don't want to settle on a future w/o a graduate education because of the unwillingness of a few professors. I also don't really have the hundreds of dollars per credit to take 2 or 3 more English courses at a local college, get an awesome grade, and then HOPE that I'll get a recommendation out of it.
And yes, I have found 1 program near me that doesn't require any recs, but I'm not going to put all my eggs in one basket.
I’ve never been a fan of letters of recommendation. Especially at this level, they seem to reward extroversion more than quality, and those who by luck of the draw happened to get full-time, as opposed to adjunct, professors.
That said, though -- and please don’t take this the wrong way -- it sounds like your transcript could use some sort of boost. A GPA below 3 can make admission to many graduate programs an uphill battle.
I’m also curious about the professors who actually had email exchanges with you, but then declined to write letters. Since you actually had exchanges with them, this wasn’t a matter of not being able to track down people who had left. In my experience, professors decline to write letters when they either don’t believe that the student was especially worthy, or they really don’t know the student well enough to say. (It’s conceivable that they could also plead workload, though I wouldn’t expect that to come after exchanging multiple emails.)
So it’s possible that the GPA cutoff and the palpable lack of enthusiasm on the part of your former professors are effectively conspiring to tell you something.
Or not; you could just be a victim of circumstance. But I wouldn’t be surprised if the folks on graduate admissions committees wondered the same thing. If you’re up against other applicants with GPAs well above 3 and solid letters, I don’t like your chances.
One way around that is precisely what you described: take some other classes, knock them out of the park, and show that you’re stronger than your record thus far would suggest.
Another would be to start with private high schools. They can set their own hiring requirements, and they usually don’t require teaching certifications. If you find the right setting, you may be able to find out fairly quickly whether teaching high school is really for you. Even there, though, you may be up against plenty of people with Master’s degrees, and the pay is typically pretty low.
Since my familiarity with teacher education programs is limited, I’ll have to defer to my wise and worldly readers who know that world better than I do. Wise and worldly readers -- is there a better option?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Thursday, October 21, 2010
Every So Often...
This week I was able to squeeze out a little time between meetings to catch part of a student concert, and then to drop by the club that was doing a table to support the LGBT community in the wake of the recent suicides and sign their banner of support.
In administration, I usually see students only when they’re bouncing off the ceiling -- the highest achievers doing amazing things -- or when they’re in some kind of trouble. I don’t get the day-to-day interaction with most students that faculty get. My exposure to average students is remarkably small, other than passing in the hallways and hearing snippets of conversation. And since it seems that every student was issued a smartphone, many of those snippets have been replaced by inaudible typing, so even that is partly lost.
The concert was fun. Students were dressed as students are, and skill levels ranged from impressive to earnest. But they were so supportive of each other, and the joy in the room was so palpable, that I couldn’t help but get caught up in it. It felt like sitting in on a rehearsal, and I mean that as a compliment. Even watching their faculty advisors in something like their natural habitat was a refreshing change; I’m so accustomed to seeing faculty in their “I’m Talking to The Administration” mode that I’d forgotten how charming and dedicated some of them are when they’re working with their own students.
But the support group really warmed my heart.
Without undue bragging, I can honestly report that my cc is more gay-friendly than most. The LGBT community here is visible, including within the administration itself. But even within a comparatively welcoming setting, there can still be tension and real fear. The rash of hate-driven suicides nationally obviously struck a chord here.
It did this fortysomething heart good to see some students who looked like they would have been comfortable in a frat mingling comfortably with other students who looked like they stepped out of a Judith Butler seminar, and the entire group catching zero flak from passersby. I don’t recall that happening in my undergrad days. They were all happy to be there, obviously having a good time, supporting each other. The goal they shared was nothing more, or less, than making the college a safe place for everybody.
In a time of economic decline and political absurdity -- Christine O’Donnell? Really? -- there’s something restorative about seeing students being gentle and generous with each other. Between all those meetings, with their dry-as-toast discussions of “metrics” and “benchmarks” and policy quagmires, it’s worthwhile to slip out for a bit and be reminded that there’s a point to it all.
Wednesday, October 20, 2010
Ask the Administrator: The Generic Degree
Some jobs out there are advertised as requiring a college degree, but the employers don't care what was actually studied. So these employers are in effect using college as a four-year, hundred-thousand-dollar screening test, with perhaps a bit of intellectual calisthenics for good measure.
I had a chance to discuss this with a supervisor at one of the management consulting companies, and he confirmed this is in fact their policy. I suggested that since they don't care about any specific knowledge -- only smarts and the willingness to work hard -- they should be open to hiring people right out of high school. Some high-school students can point to significant intellectual accomplishments, after all. But no, this is Just Not Done.
A four-year degree seems like a very expensive way of doing intellectual quality control. Could we do better?
I hate to admit it, but there’s some truth to this.
I saw this quite a bit at PU, where some older students were already well into their careers and doing well, but needed their hands stamped in order to move up to the next level. They didn’t care much about the actual content of the degree; the point was to become eligible for the management ranks. I took it as a personal victory when one of those students actually found value in a class I taught.
At an individual level, this can be kind of silly. Certainly it’s possible to be brilliant (or better, wise) without a degree, and to be bovine with one. And it’s also true that some jobs that stipulate college degrees don’t really draw on the skills that a degree is supposed to confer, whatever the major. Degree factories exist for that very reason.
That said, though, I like to think that a bachelor’s degree from a real college -- as opposed to a degree factory -- carries some meaning.
At one level, it shows the ability and willingness to stick to a program. Given the prevalence of college dropouts, those who actually finish have at least shown the ability to get their stuff together sufficiently to fulfill a multiyear commitment. (Along similar lines, students who transfer from cc’s with associate’s degrees tend to finish bachelor’s degrees at far higher rates than those who transfer with scattered credits. The graduates are those who finish what they start.) It shows the ability to navigate a bureaucracy, which is an essential workplace skill for most of the higher-paying jobs.
If the college is at least halfway serious, a degree should indicate some ability to handle complexity, to communicate at least functionally, and to keep one’s balance when dealing with numbers. One of my personal indices for wisdom is the ability to handle ambiguity. Clueless people can be trained to do almost anything routine; the real test comes when the routine has to change. Some of that is temperamental, but some has to do with the ability to discern the bigger picture.
The actual content of the degree is another issue. I don’t often draw on my study of Restoration England, but I do draw on some of the skills developed through it. My social science training enabled me to stay awake and attempt to wring meaning out of long, boring, poorly-written texts; on the job, I use that skill every single day.
There’s certainly a case to be made that if you just need a generic degree, you should go to public institutions. Two years at a cc, followed by two years at a state college, will give you a far lower student loan burden than four years at someplace private. (Financial aid discrepancies may muddy that picture, admittedly.) If it’s just a matter of getting your hand stamped, why pay more?
Does it make sense for employers to use college degrees as hand stamps? In any individual case, probably not, but in the aggregate, probably. (For purposes of discussion, I’m not looking at fields like engineering, where the actual content is obviously relevant.) Given dozens or hundreds of applicants, a degree requirement makes a reasonable, legally-defensible first-level screen. It’s not perfect, but it’s clean, and it’s within reason.
The U.S. actually has a history of using college for purposes other than education. In the 1960s, it was a source of draft deferments. In the nineties and oughts, it was a way to get health insurance. If the DREAM Act passes, it will be a path to citizenship. Degrees as proxies aren’t new, and aren’t unique to the job market.
Wise and worldly readers, what do you think? Do generic degrees make sense as employment screens, or is higher ed on borrowed time economically?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Tuesday, October 19, 2010
The Girl, Social Observer
The costume involves a dress that’s too long for her, so we were trying to figure out how to shorten it so she wouldn’t trip over it, while not making it look awful. Actual conversation fragment:
DD: Well, you could roll it up a little and sort of pin it on the sides...
TW: No, that would look terrible.
TG: Daddy, you have to think like a girl. What do girls like?
DD: Well, sometimes that’s hard to do.
TG: I know what boys like.
TW: What do boys like?
TG: Well, little boys like Spiderman and legos and stuff like that.
TW: And what do big boys like?
TG: Teenage girls.
I laughed out loud.
I am sooooo not ready for the teen years...
Monday, October 18, 2010
SUNY Albany is tossing bottles, and catching holy hell for it. Apparently, it’s suspending admission to programs in French, Italian, Russian, and classics. Stanley Fish has opined that this marks the official collapse of the humanities.
As regular readers know, I stand in awe of Stanley Fish’s ability to maintain a prominent and successful career without ever, even by accident, getting anything right. It’s an astonishing record, really. He’s the David Hasselhoff of academia; nobody can really explain why he’s still there, yet he’s still there.
His win-by-losing streak remains alive. In defense of distribution requirements that used to be the lifeblood of certain departments, Fish writes -- and I am not making this up --
I have always had trouble believing in the high-minded case for a core curriculum — that it preserves and transmits the best that has been thought and said — but I believe fully in the core curriculum as a device of employment for me and my fellow humanists.
Wow. Just, wow. So the reason we should support the humanities is to provide jobs for humanists.
It’s an interesting bit of logic. Just substitute anything at all for “humanities” and the sentence still works. The reason we should support buggy-whip making is to retain the employment of buggy-whip makers. The reason we should support the continued use of typewriters is to retain the employment of people in typewriter factories. The reason we should continue the production of whalebone corsets is to support the employment of whalebone corset makers. Whee!
Lest my reading seem reductive, check Fish’s piece. He explicitly rejects the usual “pieties” on behalf of humanistic studies, as well as any kind of economic-usefulness argument. Revealingly, he also rejects internal cross-subsidies:
And it won’t do, in the age of entrepreneurial academics, zero-based budgeting and “every tub on its own bottom,” to ask computer science or biology or the medical school to fork over some of their funds so that the revenue-poor classics department can be sustained. That was the idea a while back, but today it won’t fly.
Fish doesn’t even try to answer that, though he should. Because his alternative -- administrators will save us all! (seriously! that’s what he wrote!) -- assumes that the state legislature will be easier to persuade than the biology department. Even though the biology department reports to the exact same campus presidents who are supposed to persuade the legislature. Even though Fish assures us that the presidents don’t believe it themselves.
Politically, this doesn’t even rise to the level of ‘failure.’
In a rational world, Fish’s piece would have been drowned out by derisive laughter. Instead, it was quoted approvingly by the AFT FACE folk and all manner of adjunct activists, even though it undercuts their departments’ reason to exist.
If SUNY Albany had, instead, simply adjuncted-out more courses across the humanities, achieving the same level of savings, would that have been better or worse? Suppose that, instead of tossing some bottles, it had just watered more of them down. How might the world have reacted?
Judging by the last forty years’ worth of trendlines, I’ll go out on a limb and say that nearly nobody outside of Albany would have noticed. Business would have gone on as usual.
Instead, Albany did what few of us have the courage to do: it chose to toss some bottles and wake everyone with the noise of their shattering. It chose to face the truth, and in so doing, to maintain higher levels of resources for the programs that survived. (Fish’s analysis is silent on that point.) It made the decision -- perfectly defensible, in my estimation -- that it would rather do fewer things and continue to do them well than continue to do everything just a little bit worse.
Whether the specific bottles it chose to toss are the right ones, I don’t know; that’s context-specific, and I don’t know the context well enough to say. Whether the decision will withstand the inevitable legal challenges and votes of no confidence, I don’t know.
But it did something that the usual default move doesn’t do: it got public attention. It actually made some noise, and may have started a worthwhile debate. Should colleges continue to act as if all those budget cuts by states make no material difference, or should they put the cost on public display?
I’ll admit being less-than-optimistic that the voters of, say, New York will storm the barricades demanding more graduate programs in French Lit. Academics who care about that are invited to enter the political sphere and try to sway the electorate, though as someone who has done that for years, I’ll advise plenty of patience and a stiff sense of irony. Generally speaking, the more you can frame the discussion in terms of “great education” and “benefits to the state as a whole,” as opposed to “jobs for humanists,” the better a chance you stand.
You can’t water down the bottles forever without fundamentally changing what’s inside them. Albany decided to toss some bottles to save the rest. It’s a debatable choice, but certainly a defensible one, and it offers at least the appeal of abandoning a strategy that has failed for forty years. It also offers the appeal of maintaining quality control -- and yes, jobs -- in the departments that remain.
Albany may or may not be right, but it does not deserve the opprobrium it has received. And Fish’s win-by-losing streak remains intact.
Friday, October 15, 2010
Au Revoir, Bitch
In her day, Bitch was the undisputed champ of the academic blogosphere. She combined attitude, insight, attitude, humor, autobiography, attitude, sex, and attitude in unpredictable but irresistible ways. She could single-handedly change the entire conversation -- yes, kids, there was a time when there was a single conversation -- just by calling attention to it. And I have to admit that her stories about Pseudonymous Kid struck a chord, since he isn’t much older than TB.
Her links were mighty. The day she linked to my Elephants piece, I got more traffic than I ever had before, and probably since. In one of her early rounds of guest blogging, she even let me pinch-hit for her for a few posts. It didn’t go especially well -- I decided to try to fit her voice, only to discover that she was an original -- but I was flattered and thrilled to have the chance to try. (Her commenters were a lot tougher than mine!) Though we wrote from wildly different situations, I saw a kindred spirit there, and at one point even tried, unsuccessfully, to recruit her to my college.
In many ways, she was a sort of successor to the Invisible Adjunct. For a few years there, the sheer novelty of an open, public sphere in which people could write across ranks and tell their respective truths in narrative form drew a spate of people whose pent-up frustration gave rise to some prickly, but damned interesting, writing. Some of the topics had never been discussed candidly in public before, and while the efforts often wore the awkwardness of first drafts, they also carried the energy of discovery. Those early flame wars got as heated as they did because they felt like they mattered.
Since then, the blogosphere has evolved and fragmented. Facebook and Twitter came along and sent conversations in different directions, and after a few years, many of the “I’ve always wanted to write about this” topics had been written about. That’s not to deny that nifty writers still come along, but the sense of a ‘scene’ has faded. I have to admit reading far fewer blogs than I used to, and being surprised somewhat less often than I once was.
I recall a piece Bitch did several years ago about bloggers as living, breathing “author-functions.” (They’re sort of like personas.) Part of the appeal of following blogs is watching the author-functions evolve, and become more like themselves. Some, like Dr. Crazy, actually mark the transitions with formal changes of address. Others, like Aunt B., become the published authors they always should have been. My own space became a cross between a blog and a column, which was how I had always thought of it in the first place. Bitch herself moved away from academia, and the blog did, too.
Heaven knows we need people who speak the truth about academia, in their own voices, across ranks. People who can do that in interesting ways are rare. Those who can add humor, and attitude, and the unmistakable stamp of a person underneath it all are even rarer.
Au revoir, Bitch. May you and PK bring your characteristic grace and vinegar with you, wherever your next adventures take you.
Thursday, October 14, 2010
What’s a Sophomore?
My counterparts moved to a discussion of the second year of college. They mentioned that the sophomore year is when students need to declare a major, and that students who can’t commit to anything at that point are at much higher risk of walking away.
It sounded right to me, until I realized that they had used “second year” and “sophomore” interchangeably. To be fair, in their specific contexts, that was probably perfectly valid. But it doesn’t describe the community college world well at all.
The traditional sequence has students starting as freshmen, then becoming sophomores, and eventually juniors and seniors. Community colleges do the first half of the sequence, with our students graduating at the end of their sophomore years.
Except that the ‘freshman’ and ‘sophomore’ designations don’t work terribly well here. And that has implications for the kinds of interventions that can help at-risk students.
The IPEDS data we report is designed on the traditional model, which means that the success of many of our interventions is measured by the impact they have on the first-time, full-time, degree-seeking cohort (usually abbreviated FTFTDS). That describes the overwhelming majority of entering students at Dartmouth, but a minority here.
Most of the students who start here start with developmental and/or ESL classes, often for more than one semester. They move from full-time to part-time and back again, as economic and family needs dictate. By the time they get to the traditional first-year classes -- Composition 1, College Algebra, etc. -- they may have already been here for two or three semesters.
Since this is a commuter campus, we can’t rely on dorms to clearly mark the first-year cohort. And since students come in with wildly varying levels of academic preparation, we can’t clearly find the first-year cohort in any given class. (Sadly, even the first-level developmental classes aren’t always clean indicators of first-year status.) There’s no clean mechanism to catch them outside of class, and no class we can rely on to catch them. They’re everywhere and nowhere.
To make things more complex, we don’t have the isolated clean start of September (or the slightly less clean starts of September and January) that the more exclusive places have. Students here start and finish at all different points of the year. September is the most popular time, but we get new enrollments in the Spring, Summer, and Fall. We even get them for the January intersession. Cohorts blow apart quickly, especially in the early stages.
Is a second-year student taking 100-level classes a sophomore? I’d say ‘no,’ since I’d define ‘sophomore’ according to progress towards graduation. But that means making a distinction between second-year students and sophomores.
Here, students declare a major upon initial enrollment. I’m told that it’s a financial aid requirement, though I’ll admit not knowing why four-year colleges don’t seem to have that same requirement. Among students who reach my version of ‘sophomore’ status, the subsequent attrition rates are remarkably low; in essence, if they make it to thirty credits, they make it to sixty. The trick is in getting them to thirty, even if it takes more than a year to do it. We just don’t have a word for a second-year freshman.
Wednesday, October 13, 2010
We’ve had similar discussions here, though the reasons aren’t all the same. I’d love to hear from wise and worldly readers who have lived through a similar transition, and who know from experience what the concrete issues are.
Like most colleges on a traditional calendar, we devote the last several days of the semester to a final exam period. Exams are scheduled in two-hour blocs. The blocs don’t always coincide with class times, since class times are usually either fifty or seventy-five minutes. There’s a makeup day at the end, which comes in handy for schedule conflicts and/or snow days. Grades are due shortly after the end of the exam period, with a hard goal of finishing everything before Christmas.
The system works pretty well for courses that lend themselves to traditional exams that can be graded quickly. But it’s awkward, at best, for courses that tend to put more weight on papers, projects, or performances.
Over the years, even some of the folks who give something like traditional final exams have started creeping those forward into the final week of classes. Although the official college policy states that exam week is for finals, we’ve had students complain truthfully that they’ve had multiple finals on the same day, in the week before any are supposed to be given. Of course, others simply have final papers or projects due, usually the week before finals.
From a workload standpoint, this is sticky.
In a collective bargaining environment like this one, everyone in the same unit is supposed to have the same rules. That means that the semester is supposed to start at the same time, and end at the same time.
Some of the people who have moved their exams up have done so out of a sense that their colleagues had a longer vacation than they did. Indeed, if you walk the faculty office corridors during exam week -- I’ve been known to do that -- they’re a lot more vacant than the official exam schedule would lead you to believe. We’ve backed into a situation in which the official norm has become the de facto exception, and students are often caught in between. Frankly, it’s also a grievance waiting to happen.
In addition to the educationally-based ideas outlined in the Globe piece, we’re considering just dropping final exam week and extending classes to the end to address workload equity. If some people can start their vacations earlier simply by declaring that they use project exams, then we have a pretty basic fairness issue. But if classes meet to the end, they meet to the end. People who want to give traditional finals still can; they just have to break them into two parts.
This move would also greatly reduce the number of schedule conflicts, since nobody’s schedule would change for that week. Adjuncts who work at multiple campuses wouldn’t suddenly have to juggle when their obligations shift; they wouldn’t shift.
I’ve heard from a few areas -- music, notably -- that such a change would be a challenge for the final juried performances, but it seems clear that we can work around that with a little ingenuity. It would also make common finals across multiple sections more challenging; presumably, the students who have math on Tuesday could find out from the students who had it on Monday what the questions were. But this, too, seems fixable with a little ingenuity. It’s already standard practice to mix up questions on different versions of the same exam, and/or to change the numbers in a given equation. It can be done.
Wise and worldly readers, it can’t be as simple as this. (Nothing ever is!) Do you see a serious downside to doing away with a final exam period? If you’ve lived through a transition like this, did you see some unanticipated-but-serious issues arise?
Tuesday, October 12, 2010
Deans and Free Speech
I file this one under “I know it sucks, but imagine the alternative.” Frustratingly, most of the comments to the IHE story missed the point.
For a college of any meaningful size to accomplish much, it has to have the entire administration rowing in the same direction. This is a difficult, but real, fact of life. If some deans start grandstanding and styling themselves champions of whoever is out of favor at the time, the college will be convulsed by internal politics and unable to function.
Those of us who make our living doing this know that implicitly.
That’s not to say that administrators have to be automatons. The key is knowing when and where you can voice disagreement, and when you have to get in line.
Admittedly, the precise boundaries fluctuate a bit from one context to another. But the rule of thumb is, the smaller and more private the setting, the more candid you can be. Don’t make the amateur mistake of assuming that decisions are made in open meetings.
In an individual meeting, you can say just about anything. That’s the time for really frank conversations.
In a small group meeting, depending on the group, you can still sometimes raise serious objections. The more formal the group, the less this is true.
In public meetings, or in public generally, you have to be on point. That means toeing the party line, or at worst not subverting it. For those of us in these roles, that can be a HUGE point of stress. Stick around long enough, and sooner or later you’ll be put in the position of having to defend a position with which you disagree. It comes with the job.
Nearly everyone has lines they won’t cross. (Admittedly, some people are simply soulless, but I prefer not to dwell on them.) When you have to cross that line too frequently, it’s time to decamp for greener pastures. The folks who complain about administrative turnover seldom mention this variable, but it’s real. It was one of the reasons I left PU. I saw the direction it was going, and I wanted no part of it.
To be clear, nothing I saw or took issue with rose to the level of criminal conduct or fraud. It didn’t rise to the level of whistleblowing. Instead, it was a series of judgment calls that I considered poorly made. The organization had the right to make those calls; I just thought they were misguided. When the accumulation of misguided decisions reached a certain point, I left.
For those of us whose administrative positions don’t also come with tenured faculty positions, the choice is not simply ‘stick around or go back to faculty.’ It’s ‘stick around or lose your job.’ I have had to hold my tongue many times because I didn’t have a safety net. It sucks, but the alternative is hard to picture.
A college can’t have a dozen chief executives. It can only have one. At the end of the day, someone has to be responsible for setting the overall direction, and the people who report to her need to be reliable. They can argue decisions before they’re announced -- and that certainly happens -- but once the play is called, it’s called. At that point, the job is to execute the play.
Admittedly, this level of responsibility for one’s own words can lead to some frustrating exchanges with faculty, who enjoy the freedom to speak to just about anything, whether it falls within their expertise or not. I’ve been in situations in which my own sense of a given policy was somewhat grudging, but I’ve had to embody it in discussions with some faculty who’ve resorted to all manner of mudslinging. Simply put, it’s an unfair fight. But it comes with the job.
So I don’t like it, but I have to agree with UConn. At some point, the President calls the play, and we execute or we don’t. The time for debate is finite, even if we think we’re right.
Administration is not for the faint of heart.
Monday, October 11, 2010
Ask the Administrator: Payment Upon Completion
We, the faculty, have just been notified by our president that the Texas Legislature (which can only be relied upon to do whatever is worst) is poised to institute a completion-based funding system for higher ed throughout the state. Instead of funding us based on enrollments as of the 12th class day, as it does now, it will use a step system based on completion rates: so much money for students with 15 credit hours, so much more for 30 hours, so much more for graduates, etc. They will probably begin by making 10% of our funding dependent on this measure, but with the expectation of phasing it in gradually until it reaches 100%.
Our administrators claim that we’ll be able to meet the Lege’s goals (which have not yet been defined in detail) without compromising academic standards, but I don’t see how that can be possible. Won’t such a policy inevitably lead regents to push administrators, and administrators to push faculty, to pass more students through? How can grade inflation not result? Or pressure to push less prepared students toward shorter (i.e., technical) degree programs? Or penalties for faculty who teach in fields like math and A&P that students perceive as harder and are more likely to drop or fail?
And isn’t there a basic inequity and hardship in paying colleges nothing until students have completed 15 credits? I haven’t studied the numbers, but I know anecdotally that many students drop out or flunk out in their first semester: why should the college not be compensated for the worker-hours invested in those students before they accrue 15 credit hours — given especially that the reasons for their disappearance often (usually) have nothing to do with us?
Are you familiar with such developments, and how much trouble do you think we’re in? Is this something so unwise that only Texas could dream it up, or an emerging national trend?
I have to admit, I enjoy playing with ideas like these.
No, it’s not unique to Texas, though Texas tends to be in the forefront of ideas like this one. A few years ago a thoughtful senior professor at my college suggested a variation on this that he framed as a “graduation deposit.” Upon enrollment, students would pay a few hundred bucks that would be put into an interest-bearing account. If they drop out, they lose it. If they graduate, they get it back with interest. The goal was to give students some skin in the game. The idea didn’t go anywhere, though I have to admit I rather liked it.
From the legislature’s perspective, the scheme you’ve outlined could be appealing in a couple of ways. Obviously, it could be a backdoor way to cut funding. On a less sinister level, it creates institutional incentives for colleges to start paying attention to graduation, as well as access. At that point, the real question becomes how the colleges respond to it.
You’re certainly right that a shortsighted or panicky administration could respond simply by pressuring the faculty in myriad ways to go easy on the students. I wouldn’t expect it to say so explicitly, of course, but it’s possible to exert pressure in subtler ways. The usual playbook involves getting rid of adjuncts with high fail rates, wreaking havoc on the schedules of full-timers with high fail rates, playing games with placement tests, and/or abusing the bully pulpit.
Alternately, the college could respond the way that Texas high schools did when they fell under a high-stakes testing regime, and simply try to screen out the weaker students upfront. After all, if the weaker kids don’t get in in the first place, they won’t drag down your completion rates. This is easier than you might think. Just scale back on student support services, don’t schedule classes at certain times, and appoint Cruella de Vil to run your Students with Disabilities office. Done and done. It violates the mission of the community college, but if your choices are going upscale or shutting down, I could understand the move.
A smarter administration that actually cared would understand that the way to improve completion rates is to look at them as an institutional, rather than individual, issue. What institutional obstacles exist to throw students off course? For example, are required classes largely scheduled when students can’t take them? Are the numbers of sections of required classes adequate, or are students forced to wait? How efficient is the financial aid office? How efficient is the bookstore? Is there a major parking problem? Is your academic advising system effective, or are students left to flounder?
Common sense aside, the major problem with defining attrition as a faculty issue is that the solutions it suggests aren’t scalable. Let’s say that I have good data showing that Professor Jones’ students consistently fare far better than do Professor Smith’s students. What, exactly, can I do with that? I can’t clone Professor Jones. If I want to make a large scale, sustainable difference, I have to do it at the system level.
Done wisely, a completion-based approach would look at faculty as valuable sources of information on the obstacles that students face. What do your students tell you? What gets in their way? Sometimes it can be something as basic as “they never offer this class at night, and I can’t take it during the day.” A college can fix that if it chooses to.
There’s also the basic issue of student goals. Although students have to declare a major upon enrollment if they want to get financial aid, the truth is that many students have no intention of sticking around until graduation. (A common version of that is the kid who only intends to spend a year before transferring; he just has to show his parents that he’s capable.) One way to address that is to come up with ‘certificate’ programs with lower numbers of credits. A 30 credit gen ed certificate, for example, turns that one-year-and-out student from a dropout to a graduate in one fell swoop.
Given the realities of Texas politics as I understand them, you’re probably right to assume the worst. But it’s at least possible for a college, and a state, to take graduation seriously without devolving into a race to the bottom.
Wise and worldly readers, what do you think? Is there a way to reward colleges for improving graduation rates without inadvertently rewarding stupidity?
Have a question? Ask the Administrator at deandad (at) gmail (dot) com.
Friday, October 08, 2010
Admitting my own bias upfront, it's still fair to say that these are two good-looking kids. TB is the class heartthrob, tall with a winning smile, and TG is a brunette cherub with a mischievous gleam in her eye.
But you wouldn't know it from the pictures.
In these pictures, our beautiful, wonderful, sweet children look, well, kinda goony.
It's not their fault. They often photograph well, and they smiled obligingly. Yet these shots are worthy of a driver's license in their awkwardness.
Why does that always happen with school pictures?
I have to admit, many of my school pictures as a kid were horrible. The fifth grade pic was uniquely awful, with a sneer that made me, improbably enough, the class bad boy the day it came out. (It didn't last.) None of them looked the faintest bit natural or flattering, and several of them were just mean.
Photography, as an art form and as a technology, has come a long way since then. Yet school pictures are still goony. You'd think they would have improved by now.
I can isolate a couple of factors. The backgrounds are always awful – either 'sky' or 'rumpled monochrome.' And the poses are ridiculous. “Lean over. Crane your neck. No, like that. Lean farther. Now, try to have fun!” Sheesh.
Who becomes a school photographer? That seems like some sort of 'community service' punishment for a photographer who did something awful. Train them in Ansel Adams, then loose them on fourth graders. It's just not right.
My Dad used to consider himself a serious photographer, all evidence to the contrary notwithstanding. He'd spend minutes composing a shot while my brother and I quietly seethed. (The pics usually reflected the seething.) To this day, the one piece of advice I'd give any photographer is “shut up and shoot.” In the age of digital photography, when you don’t have to worry about wasting film and a single card can hold hundreds of shots, I say shoot first and ask questions later. The one thing you should never do, especially with children, is ask them to hold a pose. When a camera could only hold a couple dozen shots and each one was a real cost, there was some excuse for perfectionism. Now, not.
Nowadays, at the end of the school year, teachers present slideshows of candid shots they’ve taken in the class over the course of the year, always accompanied by the Green Day song that goes “there’s something unpredictable...” (Why it’s always that song, I don’t know.) The candids are always far better than the official portraits. They show kids looking like themselves, wearing clothes they’d actually wear, doing things they actually do. They look like life. Honestly, I’d rather have copies of those than of the Official School Pictures. But the official pics live on.
Is there a trick to making school pictures suck less? And just who, exactly, winds up taking them?
Thursday, October 07, 2010
I’m increasingly convinced that we need to do something like that. We could define ‘laptop’ pretty broadly to include not just netbooks but also iPads and maybe even smartphones -- anything that gives students wireless internet.
Part of the draw is the sheer cost (in both money and space) of open computer labs. As it stands, in many of our open labs there’s a 30-minute limit per station when every station is taken, and that’s most of the time. The labs are staffed as best we can, but work-study students aren’t 100 percent reliable, and the money just isn’t there for full-timers. If we were able to convert some of those labs to teaching spaces, and redirect some of those resources to faculty, I can’t help but think we’d accomplish more, educationally.
But there’s also the issue of paper.
Every semester, we print a course schedule for general distribution. We have to get class schedules done unreasonably early to allow time for layout and printing. The schedule is obsolete from the minute it’s out, since changes are ongoing. But every time we talk about getting rid of it and driving the course scheduling online -- where the information is up-to-the-minute -- we run smack into the issue of access. Paper is portable, and cheap, and everyone who wants it can get it. (The schedule is available to students free of charge.)
When professors meet with students in their offices for academic advisement, scheduling is often a part of that. (The conflation of ‘advising’ with ‘scheduling’ is another issue altogether.) Working with two paper bulletins is sometimes easier than working with one screen. Worse, some faculty are still -- amazingly -- allergic to anything electronic.
If we could get to the point where every student had their own little screen, and could go on the system wherever and whenever they wanted, many of these issues would go away. We could stop spending thousands of dollars on paper bulletins that convey bad information. We could convert scarce space from the 1990’s model open computer lab to more pressing needs.
With free wi-fi becoming more common off campus -- they have it at McDonald’s now -- and ubiquitous on campus, the objection from cost of monthly service is looking less compelling than it once did. If laptops or something similar were required, they could presumably be covered by financial aid just like textbooks are, so between subsidized equipment and free wi-fi, the financial barrier is looking smaller. From the institution’s perspective, it would allow us finally to capture some of the efficiency gains from technology that until now have remained unrealized due to too many digital holdouts (or castouts).
Wise and worldly readers at campuses that have actually made this leap -- how did it work? Any advice you’d give a campus that’s thinking it over?
Wednesday, October 06, 2010
$35 Million in Perspective
Obviously, for any one private entity to give $35 million is very generous. But it’s worth putting in perspective.
It’s less than my college’s operating budget for one year. My college is one of 1100 community colleges across the country.
Depending on the size and ambition of the buildings, it would pay for one to two new classroom buildings. If they’re renovations, you might get three.
If it were put in an interest-bearing account at today’s interest rates -- call it one percent, to keep the math easy -- it would generate $350,000 a year in income. Divided evenly among 1100 community colleges, that’s a little over $300 per college.
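Just to show the arithmetic, here’s that back-of-the-envelope calculation as a quick Python sketch (the one percent rate and the 1100-college figure are the post’s own round numbers, not official statistics):

```python
# Back-of-the-envelope check of the figures above.
# Assumptions (from the post): a $35 million gift, a 1% annual
# interest rate, and roughly 1100 community colleges nationwide.
gift = 35_000_000
rate = 0.01
colleges = 1100

annual_income = gift * rate             # $350,000 per year
per_college = annual_income / colleges  # a little over $300 each

print(f"Annual income: ${annual_income:,.0f}")
print(f"Per college:   ${per_college:,.2f}")
```

At roughly $318 per campus per year, the point stands: the gift is generous, but it is not a funding model.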
Private philanthropy is lovely, but if we think that it’s a serious substitute for sustained public funding, we’re kidding ourselves.
I don’t mean this as a criticism of Gates. They could have kept their money, or given it to some other cause. No argument there.
But as far as bold leadership gestures go, Obama whiffed completely. This isn’t even close. If you want to double the number of college grads by 2020, this doesn’t even resemble a gesture.
Meanwhile, the policy debate is shaping up to be over whether to extend tax cuts for dead rich people.
When I voted for “change,” I didn’t realize it meant “spare.”
Tuesday, October 05, 2010
Oblecs, or, Statements That Don’t Mean What they Mean
Recent polls have shown a terrifying amount of militant know-nothingism. The example that leaps to mind is the shockingly high percentage of people who claim to believe that Barack Obama is a "secret Muslim" or some such crap. Despite this, it's hard to escape the hunch that most people who claim this don't really believe it; they're simply registering a complaint through stupid means. They want to express their discontent through every means at their disposal, and labeling him a Muslim serves as a fatuous shorthand.
Yet it's also hard to escape the hunch that the more such a claim is questioned or mocked, the more it will come to be believed, eventually ossifying into an article of faith. In short, when they first say it, they don't really mean it, but as soon as it's challenged, they believe it fully and without reservation. Only much later, when any and all pressure is removed, might it soften and be allowed to be doubted.
High school physics teaches fluid dynamics in part through the medium of "non-Newtonian fluids." Create a slurry of two parts cornstarch to one part water, and you have a classic dilatant. If you lower a spoon into the liquid, it sinks in. If you smack the spoon against the surface of the liquid, it instantly hardens, and the spoon bounces off. The viscosity of the fluid depends on the shearing forces it faces.
There is a subset of canard that behaves similarly. When a person first creates such a canard ("Obama is not an American citizen!"), the speaker knows it is factually false. How that canard is met is key. If approached gently with questions ("Did all of the investigating bodies in the campaign miss it? Are the 1961 Honolulu newspapers with the birth announcement time-traveler-manufactured frauds?"), it can dissolve into a slurry. If attacked ("What? Are you high?"), it ossifies into an article of true faith. This certainly does not apply to all canards or all people, but I've seen it happen.
Thus, I propose a new word for the English language. The non-Newtonian fluid used in high school physics demonstrations is commonly called "oobleck." Ideas that are deliberate falsifications that harden into articles of faith only under pressure are now "oblecs." For example, Rush Limbaugh spouts many oblecs in the course of his day. In low-pressure situations, he'll admit that a particular heap of lies is the product of him "being funny" or stretching the truth to make a point. Under any sort of pressure at all, he'll declare that every word he speaks is the pure truth. It's a sub-variety of doublethink.
To take it out of politics, kids spout out oblecs quite a bit. Wee Johnny says an obvious lie, gets called on it, and suddenly believes it to be a truth upon which he'd stake his very life. That's an everyday oblec.
First, I think his observation is spot-on; I’ve seen people hold to absurdities as a sort of badge of group identity, and then later admit the absurdity when it felt safe.
I’ve seen statements like these referred to as “dog whistles,” in that they hold meaning only to those primed to hear them. They’re a kind of secret verbal handshake, certifying members of the club. To those of us outside the club, they just sound like blowing air.
I’m not sure about the neologism itself, but the phenomenon is real and hard to capture.
Is there a more elegant phrase to describe oblecs? And has anyone out there found a reasonably effective way of dealing with them?
Monday, October 04, 2010
Opposition to Affirmative Action
(Disclosure: though my politics aren’t conservative, my appearance is. Since I rarely talk politics at work, some people incorrectly assume from my appearance that I’m a conservative, and they tell me things that they might not otherwise. It’s an odd position to be in, but there it is.)
In two different searches over the past year, I’ve heard incumbent full-time faculty voice strong resistance to the college’s affirmative action procedures. When I asked why, the answers were the same in both cases:
“This is really [favorite long-time adjunct]’s job. I’d hate to see her/him lose out just because s/he’s white.”
If a college is in a relatively white part of the country -- unthinkable, I know, but bear with me -- then its adjunct pool will likely reflect that. If diversifying the faculty is an institutional priority, that will often require bringing people in from the outside. In this market, that means passing over some long-serving adjuncts.
Given a scarcity of jobs -- which is itself a function of a scarcity of dollars -- I don’t see how to get around this. A job that goes to candidate A does not go to candidate B. If we were to give the next few batches of jobs entirely to incumbent adjuncts, we’d actually lose ground on racial diversity.
My response to those disarmingly candid statements was to object to the idea that any search is a foregone conclusion. Nobody is owed a job. The premise that Jennifer stands to get cheated is false, if only because there’s no guarantee that Jennifer would win anyway. If you don’t own something, it can’t be stolen from you. If she’s truly the standout that you say she is, she’ll win a fair fight anyway. If she isn’t, then I don’t see the argument.
Admittedly, that’s a bit evasive -- affirmative action won’t reverse a blowout, but it could tip a squeaker -- but it’s also true.
I had expected the opposition to be based on the usual arguments from meritocracy -- let the best candidate win, period. But it wasn’t. It was based on knowing someone who would stand to lose. I’m unpersuaded by that position, but it’s not really reducible to straight-up racism or naivete. And it’s not entirely false -- any given search is zero-sum, and any preference for one candidate is by definition a strike against another.
To me, the decisive point is that employment is not primarily for the benefit of the employee. It’s primarily for the benefit of the employer. If the employer needs to diversify its staff, then that’s what it needs to do. It hires to solve problems it has identified. One could take issue with the usefulness or desirability of diversity, but that’s not the argument I’ve heard.
Wise and worldly readers, have you seen an elegant way to address this particular objection on your campus?