Thursday, November 30, 2017

Online Testing


This one is a bit of an evergreen, but ‘tis the season for chopping down evergreens.  I’d like to chop this one down once and for all.

We have online classes for which all of the graded work is done online.  But we also have online classes in which students are required to be physically present, either on campus or at a designated (and sometimes expensive) testing center, to take tests.  

Every single semester, we wind up with disgruntled students arguing for refunds for online classes when they discover, upon getting the syllabus, that they have to be physically present at a given place and time to take a test.  As the students explain repeatedly, part of the reason they chose an online class was precisely so they wouldn’t have to show up.  Sometimes it’s a medical or physical issue and sometimes it’s a transportation issue, but either way, the “online class” label feels like false advertising.

They have a point.

When I ask the faculty whose courses require onsite testing why they require it, the answer is nearly always a concern about cheating.  In a classroom or a proctored testing center, they argue, most cheating can be either deterred or caught; online, though, students can get away with a lot.  Academic integrity matters, so they just can’t bring themselves to go fully online.

They also have a point.

Which puts me in a tough position.  Academic integrity absolutely matters, and I have no illusions that all students are as pure as the driven snow.  But I also have to agree that requiring students to come in for a class advertised as online feels deceptive.

We’ve adopted a “lockdown browser” that prevents a student on a given device from looking at anything else on that device except the exam.  We even have a system that uses the student’s webcam to take still photos at unannounced intervals during the exam, making it very unlikely that a student could consult a second device undetected.  But even with our own mini panopticon, some faculty remain unconvinced.

Philosophically, I’m uncomfortable with the idea of just issuing some sort of diktat about what they can and can’t grade.  That gets to a level of interference in a class that I would resent deeply if it were imposed on me.  But the issue of “false advertising” is real.

So I’m looking at two ideas, and hoping my wise and worldly readers have better ones.

The first is to list any section with required onsite testing as “hybrid,” rather than “online.”  It comes closer to the actual truth of the matter.  Save the “online” label for sections that are purely online.  It strikes me as a way to preserve academic freedom while finally putting to rest any claims of misleading advertising.  If you want to require onsite exams in your online class, that’s fine, but you have to label it a hybrid.  Fair is fair.

The second is to help skittish online faculty come up with better ways to assess student work, so the idea of proctoring becomes irrelevant.  Many classes have long involved papers that are written outside the view of a proctor, so it can be done.  

So I’ll throw it open to my wise and worldly readers.  Have you found, or seen, innovative ways to assess student learning online to get around the dilemma of the proctored test?

Wednesday, November 29, 2017

The Girl is Published!


I’ll get back to my regularly scheduled meditations on higher education, but today calls for some unfiltered parental cheerleading.

The Girl is 13, and in the 8th grade.  At the behest of the advisor to the Publishing Club (!) at her school, she wrote an entry for the New York Times’ “Best Books of 2017” feature.  Her post, as it appears there:

I’ve read a variety of books throughout 2017, but “Carry On” by Rainbow Rowell stands out not only as my favorite from this year, but of all time. The book is everything I want to be as a person; clever, funny, sweet, and interesting. The characters worm their way into your heart and soul, and yet manage to be believable. Each one flips a character stereotype on its head in a beautiful way. The “chosen one” of the story is horrible at magic, the “evil British vampire guy” is sweet and in love, the “smart female best friend” actually has flaws (wow!) and the “beautiful girlfriend of the main character” is, in my opinion, vile. Despite my opinion, she is extremely realistic.
Reading it in one seven and a half hour sitting during the summer was not my best decision. If I had known that it was a work of pure genius, I would have savored the book, tasted it like a five course meal, and finished it with a content sigh. Instead, I stayed up until 1:26 am and wasn’t able to think about the book without internally shrieking for another week, but there are worse things, I suppose. There are worse things than falling in love with a book.
Looking into 2018, I’m looking forward to reading any new book by Rainbow Rowell, having read all of her past ones and loving each one of them. And while a part of my heart will forever remain tucked into the off-white pages of “Carry On,” I will keep my soul open, ready and waiting for a different book to sweep me off into a newer, brighter world.


I’ll admit parental bias, but I’m insanely proud of her.  She’s a voracious reader -- I’ve written before of our trip to Comic Con to see Rainbow Rowell speak -- and even at thirteen, you can see a writerly voice starting to emerge.  It sounds like her.  “Clever, funny, sweet, and interesting.”  Yup.

“There are worse things than falling in love with a book.”  Yes, there are.  Yay, TG!!!!!

Tuesday, November 28, 2017

Letters of Recommendation? Still?


The U of Venus bloggers did a good exchange at IHE on the value of letters of recommendation in academia.  I’ll throw in a perspective from my corner of the world.

As an open-admissions college, we don't require letters for students to get in, so we don't ask for them.  For faculty, staff, and administrative hires, we'll ask for names and contact information for references, but we only contact the references of people on the very short list.  Nobody wants to wade through 50 to 100 sets of letters for a single position, especially given how unrevealing they tend to be.  For the references we actually check, HR reaches out by phone.

In the past, I’ve been the one to reach out by phone.  I’m not asking for more work, but you can learn things in a live exchange that you might not pick up from a carefully sanitized letter.

I had one candidate for a teaching position who supplied three names to call.  When I called one of them and explained that I was calling to follow up on a reference for the candidate, he asked “who?”  In another case, when I asked about any reservations the person might have about recommending the candidate -- typically, a gimme -- I got a long pause followed by a tremulous “I’m not comfortable answering that.”  Coming from someone the candidate himself chose, that was striking.

The silences were often louder than the words.

The reason this version of reference checking works, I think, is that we’re asking it to fulfill a different function than many places do with faculty searches.  We don’t use references to winnow down the applicant pile.  We winnow down the pile based on our own criteria, followed by performance at the first round interview (which, for faculty, includes a teaching demonstration) and the second round interview.  Reference checking in this system isn’t about seeing who had the biggest name advisor; it’s about making sure that the person who wowed us at two rounds of interviews doesn’t have some Terrible Secret we should know.  

In other words, good references wouldn’t get you a job, but bad ones could lose you a job.  They’re about verification, rather than distinction.

Good reference calls are really quick.  Less-good ones usually take longer, as they should.  I’ve seen that when I was the one giving a reference, too.  I’ve had the good luck to have worked with some terrific people over the years.  Every so often, one of the real stars applies for something and asks me to be a reference.  Last year I got a call for a former colleague whom I consider a rock star; I don’t think the conversation hit the two-minute mark.  “I’m jealous that you get to hire her and I don’t” doesn’t take long to say.

Oddly enough, the one place I’ve been where we used letters in the first round was DeVry.  I remember not knowing how much weight to put on most of them.  Does a relatively brief letter indicate a lukewarm endorsement, a pithy writer, or a different culture?  Later I saw reports of studies suggesting that gender and racial bias creep into letters, which wasn’t really surprising.  About ten years ago a favorite colleague -- a high-energy woman -- asked me to write a letter for her application to a doctoral program.  It took me a few drafts to find language that conveyed “high energy” in a positive way that didn’t set off stupid gendered trip wires.  It worked -- she got in -- but the fact that it took conscious effort to avoid those trip wires was revealing in itself.

I know academia isn’t quick to change, but I wouldn’t mind at all seeing the old tradition of letters for everybody go the way of the typewriter.  It’s a vestige of an earlier time, rife with bias and light on useful information.  A few live conversations do much more good.  Let the candidates shine, or not, on their own.  Just be sure to listen carefully for the silences when you call to verify.

Monday, November 27, 2017

Towards a Focus Index


What if we could quantify students’ ability to focus on their work, and make institutional decisions based on it?

A new study by the National Bureau of Economic Research found remarkable gains in completion and subsequent transfer by students at two-year colleges who had been “nudged” towards relatively modest student loans, as opposed to students to whom loans weren’t mentioned.  As Madeline Trimble pointed out on Twitter, “students induced to borrow via random assignment to a nonzero student loan borrowed $4k on average, had a 0.6 point higher GPA, had 3.7 more credits, [and] were ten percentage points more likely to transfer to four-year schools.”

That’s an enormous effect for a relatively small loan.  It got me thinking.  Why would such a smallish amount of money make such a difference?

My guess is that it boils down to focus.  

A student with $4k in the bank can make rent, and can work fewer hours a week for pay.  That means that she’s likelier to have the time, and the mental bandwidth, to engage with her classes in a sustained way.  She isn’t paying the survival tax that students whose basic material needs are going unmet have to pay.  She can have time both to study and to sleep.  

That’s basic stuff, but it’s easy to forget in the focus on macro policy issues like student loan burdens.

I remember raising an eyebrow at the data showing that students who only ever borrowed less than $5,000 are likelier to default than students who borrowed more than $25,000.  The major difference is that the latter group is mostly graduates, and the former is largely dropouts.  What looks like a debt problem is really a dropout problem.  If some debt enables completion and graduation, rather than dropping out, then it’s probably a net positive.  It was for me, and it probably was for many, if not most, of my wise and worldly readers.

Of course, it would be even better if tuition were low enough, part-time jobs paid well enough, and housing were cheap enough that students could put themselves through college without undue strain.  But that requires a time machine or a trip across the border.  Right now, particularly in high-cost states like mine, it requires direct infusions of money.

Focus is so basic that it’s easy to overlook.  It’s part of the reason that at both Holyoke and Brookdale, course completion rates for the January intersession have consistently run over 90 percent; the courses are so brief that life doesn’t have a chance to get in the way.  I suspect it’s also part of the reason that Odessa College, a Hispanic-Serving Institution in Texas, saw its completion rates increase when it went to shorter classes; taking fewer classes at a time allows more focus on each one, increasing the odds of success.  The best class I ever taught was a six-week summer session of American Government, and the intensity of it was part of what made it great.  The students couldn’t escape, so they didn’t.  They thrived.  They didn’t have time not to.

We don’t typically use ‘focus’ as a metric, but I think we could.  It’s straightforward enough to quantify the number of hours per week students work for pay, for instance.  Compiling a few measures into a focus metric might be revealing.  Compare students Carly and Sam:

Carly: four classes for fifteen weeks, working 35 hours for pay, dicey home situation

Sam: two classes for seven weeks, working 20 hours for pay, small student loan, stable home situation

Any given student can defy odds, but I’d bet that a lot more Sams will graduate than Carlys.  Ten years later, even with student loan payments, I’d bet that the Sams will be much better off economically than the Carlys, too.  
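Purely as a sketch, here’s what a crude version of that index might look like.  The factors, weights, and cutoffs below are invented for illustration; a real index would have to be built and validated against actual completion data.

```python
# A minimal, hypothetical sketch of a "focus index." The factors and weights
# are invented for illustration, not a validated instrument.

def focus_index(classes, term_weeks, work_hours_per_week,
                stable_housing, has_loan_cushion):
    """Return a rough 0-100 score; higher means fewer competing demands."""
    score = 100
    score -= classes * 8                              # more concurrent classes, less focus on each
    score -= max(0, work_hours_per_week - 15) * 1.5   # hours worked beyond a modest part-time job
    score -= (term_weeks - 7) * 1                     # longer terms give life more chances to intervene
    if not stable_housing:
        score -= 20                                   # the "survival tax"
    if has_loan_cushion:
        score += 10                                   # a small loan can buy time and sleep
    return max(0, min(100, round(score)))

# The two students from the post, with assumed values for the unspecified details.
carly = focus_index(classes=4, term_weeks=15, work_hours_per_week=35,
                    stable_housing=False, has_loan_cushion=False)
sam = focus_index(classes=2, term_weeks=7, work_hours_per_week=20,
                  stable_housing=True, has_loan_cushion=True)
print(f"Carly: {carly}, Sam: {sam}")  # Sam scores well above Carly, as expected
```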

An institutional “student focus” metric could include big things, like work-study jobs on campus and short classes, but it could also include smaller ones, like dedicated quiet study areas in the library.  Widespread use of Open Educational Resources would help, to the extent that they reduce the need to pay for textbooks (or the stress of trying to work without them).  Now we have research suggesting that ‘nudges’ towards student loans should count, too.  

Focus doesn’t preclude involvement in student clubs or teams.  Those can actually help.  The key is getting the unproductive distractions down to a manageable level.  That means having free food available on campus for students who can’t afford lunch, finding reasonable public transportation options for students whose cars or rides aren’t reliable, and even having strong and reliable wifi on campus so they can get their work done.  

Wise and worldly readers, what would you include in a focus index?

Sunday, November 26, 2017

Why I’m Not a Fan of Laptop Bans


Susan Dynarski ignited quite a battle online this week when she wrote in the New York Times about why she bans student laptops from her classroom.  The short version of the argument is that laptops tend to prove more distracting than useful, both to the students who bring them and to the people who can see their screens.

I can believe that easily enough.  Multitasking is harder than single-tasking, and the temptation to check social media during what seem like lulls can be powerful.  And it’s certainly distracting if the student in front of me is watching something visually interesting on Netflix.  There’s evidence for the “secondhand smoke” argument, too: students sitting near laptop users perform worse academically than students who aren’t.  That “harm to others” point makes an easy libertarian answer harder.

That said, the whole idea of banning them doesn’t quite sit right with me.

To be fair, Dynarski stipulates that the ban is her own, and not her university’s.  That helps.  And she makes an exception for disability-related accommodations.  I’m not sure how that avoids shining a light on a few students, but for the sake of argument, I’m willing to accept that certain kinds of exceptions can be granted without compromising confidentiality.  And I’ll assume that laptops and other tech would be permitted in classes in which their use is part of the subject matter.  A tech-free computer science class just doesn’t seem plausible.

But I’m still uneasy, and not only because my handwriting is terrible.  Which it is.

At one level, it’s of a piece with professors who grade students’ notes.  I’ve never been comfortable with that, either, because it assumes that there’s one correct way to take notes.  Notes, to me, are means to an end.  They’re study aids.  I have no issue with teaching note-taking methods that work for most people, but what works for some will not work for others, and students need to be able to develop the styles that work best for them.  As a student, although I wouldn’t have used this term at the time, I saw my style of note-taking as a sort of academic freedom.  If my notes don’t make sense to you, well, take your own.

By the same argument, as they prepare to move into workplaces laden with technology, students need to develop styles of interaction with technology that help them be effective.  That will necessarily involve trial and error, just as the development of writing skills does.  Early on, they may be better at handwritten notes than at technologically enhanced ones.  One reading of that is to consign tech to the trash; another reading is that it’s all the more reason they need the practice.

That typically happens when a new form of technologically mediated communication emerges.  The visual grammar of movies had to evolve from filmed plays.  Many early tv shows were basically staged versions of radio shows.  My kids tell me that I show my age when I use punctuation in texts.  New forms of fluency take time to develop.  I’d hesitate to declare laptop note-taking an exception to the rule.

Then there’s the question of technology altogether.  Laptops, tablets, and even phones increasingly resemble each other.  If you allow phones on the grounds that people need to be reachable for family emergencies, then you might as well allow laptops, too; both can be distracting, and social networks work on both.  If you ban phones and something happens, I don’t like where that leads.  Worse, if you ban phones, you make them forbidden fruit.  At some point, some student will make secret recordings on general principle.

I have no issue at all with professors advising students upfront about the benefits of handwriting their notes, or with banning irrelevant use.  I don’t even have an issue with professors relegating laptop users to the back row, in order to get around the “secondhand smoke” argument.  (A screen I can’t see isn’t terribly distracting.)  But there’s a meaningful difference between “I don’t recommend using laptops” and “laptops are banned.”  

We don’t get to veto the future.  Tech is part of the world for which we need to prepare students.  Good or bad, it simply is.  Rather than trying to reverse time’s arrow, I’d rather we devote our attention to finding more effective ways to use tech.  We need to stop filming plays, and start making movies.

Tuesday, November 21, 2017

Early Friday Fragments


As one awful story of sexual harassment or assault after another goes public, I’m just gonna return to this post from 2011.  The Boy was unimaginably younger then, but the piece has aged fairly well.

Meanwhile, kudos to IHE’s alum Libby Nelson for her timely reminder that every new story isn’t just about men’s disappointing behavior; it’s also about courage in coming forward.  Amidst the flurry of new names, it’s helpful to remember that as gruesome as the process is, it may actually achieve something.

--

I’ve been trying to decide which policy idea, currently floating around Congress, is worse: turning Pell grants into loans for students who don’t finish a course of study in x years, or taxing graduate student tuition waivers as income.  It’s a tough call.

Turning Pell grants into loans after the fact is an awful idea.  Pell grants are means-tested pretty strictly; by definition, anybody who receives one isn’t exactly rolling in money.  And among the most common reasons for dropping out is desperate financial need.  Turning it into a loan effectively taxes being poor, which adds insult to injury.  As word spreads and the first folks get “past due” notices, I’d expect a devastating effect on enrollments at community colleges and other colleges with access missions.  That, in turn, will force program and campus closures (along with mass layoffs), thereby reducing the available options even for students who don’t receive Pell.  

(Question for folks who know tax law better than I do: there’s really no such thing as a retroactive loan, so wouldn’t Pell be considered a loan forgiven upon graduation?  And if it is, does the loan forgiveness upon graduation become taxable income?)

As one would expect, the impact would be disparate along racial lines, too.  I’d expect those affected by the change to figure that out pretty quickly...

On the other hand, the #gradstudenttax is terrible in its own right.  As others have noted, it amounts to taxing a coupon.  Graduate student stipends, as a rule, tend to be quite modest; I had roommates all through grad school and drove some impressively busted beaters to prove it.  (I can still do a passable imitation of the sound the muffler made on my powder blue 1989 Toyota Tercel hatchback…) The dollar figure of the tuition waiver was higher than my income.  Taxing the waiver as income would have priced me right out of grad school.  

Now I’ll admit there’s some public policy justification for bringing the population of graduate students into closer alignment with what our higher ed ecosystem can support, especially in liberal arts fields, but this isn’t strategic or thoughtful rightsizing; it’s a financial massacre.  At least folks who haven’t started could avoid the whole thing; students halfway through long programs would be caught abruptly between the dog and the fire hydrant.  And I don’t imagine that many programs could simply double or triple their stipends to make up the difference.

To the extent that the United States wishes to remain economically vibrant, it needs highly educated people.  Yes, I know that one party considers higher education the domain of the other party, and it’s engaging in a sort of tribal warfare, but the damage would go far beyond annoying one party’s voters.  It would be a direct hit on one of our national strengths, for what amounts to a trivial amount of money.

--

I’m sending good wishes to my colleagues at Brookdale’s sister college, Essex County College.  It was just put on probation by Middle States, our regional accreditor, partially for issues outside of its control.  

Accreditation matters, but it’s a blunt instrument.  For all of the stages -- warnings, probation, “show cause” -- at its core, it’s a binary variable: yea or nay.  Sending good vibes to the folks at Essex who are just trying to do the right thing, even when circumstances make it difficult.  They’ve been through a lot over the last few years, and this won’t help.

--

Happy Thanksgiving to my wise and worldly readers.  There’s something civilized about a holiday built around reflection and gratitude.  Enjoy.
 

Monday, November 20, 2017

Road-Testing a New Paper


A new study suggests that colleges that want to improve their graduation and completion rates would do better to spend more on instruction than on financial aid.  

I’m thinking about that.  Let’s say that the operating budget for my division were increased by a million dollars on an ongoing basis.  What could that cover?  Conservatively, I could add:

3 new English professors
2 new math professors
4 new professors wherever they make the most sense
1 new librarian
And increased tutoring at the non-Lincroft locations

Meanwhile, we have enrollment of a little over 9,000 FTE.  If the million were divided evenly among the FTE’s, that would come out to about $111 per FTE.  An annual FTE is 30 credits, so that would come out to slightly under $4 per credit.  Our tuition and general fees right now come to $168.75 per credit for in-county residents.  
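Just to show the arithmetic, the sketch below simply redoes the division from the figures above; nothing here goes beyond the numbers already stated.

```python
# Back-of-the-envelope arithmetic: spreading $1M of new operating money
# across enrollment, versus the current per-credit charge.
new_money = 1_000_000          # hypothetical ongoing operating increase
fte = 9_000                    # approximate enrollment, per the post
credits_per_fte = 30           # one annual FTE
current_per_credit = 168.75    # in-county tuition and general fees

per_fte = new_money / fte                  # about $111 per FTE
per_credit = per_fte / credits_per_fte     # slightly under $4 per credit
print(f"${per_fte:.2f} per FTE, ${per_credit:.2f} per credit")
print(f"Per-credit charge would drop from ${current_per_credit} "
      f"to ${current_per_credit - per_credit:.2f}")
```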

Would ten new faculty, plus increased tutoring, make a bigger difference than reducing tuition and fees from $168.75 to $165?

Probably.  It’s certainly worth testing…hint, hint...

The key is that there’s funding, and then there’s funding.  Although nearly all dollars are welcome, not all dollars are the same.

For example, “capital” money -- such as New Jersey’s Chapter 12 program -- can only be used for facilities; it can’t be used to pay the people who would work in those facilities.  Federal and state grants are for specific purposes, and can’t be repurposed.  (For example, the Title III grant Brookdale just received is earmarked for the purposes specified in the application.  It’ll help, but the rule that grant money has to “supplement, not supplant,” means that I can’t use it to hire, say, English professors.)  Philanthropic giving is often tied to specific uses.  Crossing the streams can lead to very bad outcomes.

The money that would enable hiring is “operating” money.  That’s also the hardest kind to find, by a longshot.  It’s the kind that mostly comes either from direct subsidies, which are increasingly out of fashion, or tuition and fees.  It’s the money that pays salaries.  

Right now, many colleges bridge the gap between flat aid and increased costs by splitting the difference: charging more each year while also cutting costs.  The new paper suggests that maybe we shouldn’t, and that instead, we should charge what we need to charge to keep the quality up.

In the very short term, of course, that would make my life infinitely easier.  But ethically, I’m uncomfortable with jacking up costs for students who are already struggling.  To the extent that a measure like “graduation rates” fails to distinguish among students, it may wind up greasing the skids towards even greater polarization.  We may get more bang for the buck by helping those who least need it, but that’s not why we’re here.  I wouldn’t want to lose sight of the mission in the pursuit of efficiency.

As a followup, it might be worthwhile to isolate the effects of the spending vs. cutting debate on low-income students specifically.  For any interested funders, I’d be happy to volunteer my campus for the “spending” side of the study...

Sunday, November 19, 2017

If a Textbook Falls in the Forest...


I haven’t seen anybody else write on this, exactly, but I’ve seen it repeatedly on my own campus.  This is a quick attempt to see if my campus is somehow fluky, or if there’s something going on that isn’t getting enough attention.

One of the oft-heard objections to Open Educational Resources (OER) is that students retain more information from a printed page than from a screen.  Therefore, the argument goes, we shouldn’t sacrifice a superior learning tool to save money.

I’ve taken issue with that for a while, simply because it’s often possible to print OER materials.  In our all-OER developmental math class, for instance, students have the option of either using electronic material for free, or buying a printed copy from the bookstore for twelve dollars.  (The cost is simply to defray paper and printing; we don’t make a profit on it.)  When the institution takes it upon itself to make preprinted copies available at no more than the cost of printing, and gives students the option, I think it has gone a long way towards answering the “screens” objection.  Equating “paid” with “paper” and “screen” with OER is simplistic.

Last week, though, I heard another benefit of OER that I hadn’t heard before.  

Three different professors in three different disciplines independently mentioned that they’ve noticed that in the sections where they use OER, the students do more of the reading, and more often.  One of them even has the reading quiz scores to prove it.  All three mentioned -- unsurprisingly, to anyone who has taught -- that the class discussions are better when students have actually done the reading.  For whatever reason, the students in the OER sections are doing more of the reading than the students elsewhere.  The classes are correspondingly better, with higher grades and happier students and faculty.

To me, that puts a different spin on the “cost vs. quality” debate.  To the extent that commercial materials are better -- a case-by-case issue, but still -- a direct comparison of quality assumes equal likelihood of students actually reading.  But if they’re noticeably likelier to read OER than a commercial source, then the “quality” argument becomes trickier.  Would you rather they pay for a better book that they may not use, or not pay for a slightly less impressive book (if that’s true) that they’ll actually read?

If a textbook falls in the forest and nobody reads it…

None of them really had a theory as to why the students are reading more in the OER sections.  It could be a fluke: three professors mentioned it, but I haven’t done a full-scale survey.  It may be the equivalent of a coin coming up “heads” three times in a row.

Alternately, the screens may actually be the key factor.  Students don’t always have their textbooks with them, but they do always (or almost always) have their phones.  If they can steal a few minutes somewhere to glance over material on the phone, that’s better than not looking at all.  That’s especially true for students with jobs -- the vast majority -- for whom long blocks of uninterrupted time are scarce.  If OER-on-the-phone lends itself better to reading on the fly than textbooks do, and students are rushed, then I could see where using OER might lead to more reading.  It’s less than ideal, but the ideal is often out of the question.

Partisans of theories of “skin in the game” are probably howling at this point.  They would argue that anything given for free will be undervalued, and anything bought will be taken more seriously.  And there are times when that’s true.  But it may be that convenience, accessibility, and affordability combine to outweigh the pull of sunk cost in this case.  I know I have books I’ve bought and haven’t read, even as I continue to read free stuff online, so I don’t need to wander far to see this behavior.  The coffee table in the living room is groaning under the weight of unread books.  My long-suffering bride can attest to this.

So this one is really an empirical question for my wise and worldly readers.  For those of you who’ve taught (or taken) classes with OER, did you notice students doing more of the reading?  If so, do you have any thoughts as to why?
 

Wednesday, November 15, 2017

A Conference for Economically Vulnerable Institutions


Josh Kim wrote this week about the possibilities offered by a conference focused on, and starring, economically vulnerable colleges and universities.  

He’s at Dartmouth and I’m at a community college, so we’re looking at the idea from different angles.  That said, a few thoughts.

First, several such conferences already exist, but they don’t bill themselves that way.  The AACC, the League for Innovation, and even Achieving the Dream are all focused on community colleges, nearly all of which are economically vulnerable.  Each regularly features presentations on and/or by folks at colleges that have worked wonders on a shoestring.

But economic vulnerability in that context is a background condition, rather than a defining trait.  

Second, for a conference focused explicitly on economically vulnerable colleges, I’d expect two basic barriers to participation.  The first is cost; travel funding is usually among the first things to go when budgets get tight.  The second is reputation.  Higher education is a reputational business in many ways.  Students prefer colleges that they expect will still be around years from now; donors prefer the same thing.  “Coming out” as economically struggling beyond the norm could become a self-fulfilling prophecy.  Yes, Sweet Briar was able to shake the alumni tree for money when it threatened to close, but that was the exception; the more common story is of donors abandoning a college when the cause seems lost.

And that’s a shame, in many ways.  Leaders of struggling institutions have to be able to pivot quickly between stories of success that attract donors and stories of need that motivate savings.  The two aren’t mutually exclusive, but it takes real finesse to do both well.  In a more perfect world, there’d be room for such nuance.

Neither objection is necessarily dispositive, though.  Let’s say that the conference were regional and/or virtual, to reduce cost, and it centered on “fiscal responsibility,” rather than struggle.  What might that entail?

Certainly, some serious discussion of ways to control health insurance costs.  That’s the dinosaur-killing meteor of many college budgets.  

Affordable tech.  I remain perplexed that so many colleges still replace office PCs, rather than going with Chromebooks or dumb terminals.  Across a decent-sized institution, that change alone could save hundreds of thousands of dollars per year, and that’s before counting reduced maintenance, disaster recovery, and virus-fighting costs.  This is seriously low-hanging fruit.

Affordable back-office tech: the ERP, LMS, and the like.  

Furniture consortia.  I don’t know if you’ve priced classroom furniture recently, but it’s pretty alarming.  We can’t just buy any old thing, given accessibility needs.  I know that colleges compete on academic programs, atmosphere, and quality of the student experience, but I don’t see a barrier to collaborating on desks and chairs.

OER, OER, OER.  

Professional development at scale, inexpensively.

Practical, affordable ways to implement renewable energy.  

Best practices in cheaply whittling down deferred maintenance.  If ever there were a useful area for higher ed research, this might be it.  

This is very much a top-of-the-head list, and hardly exhaustive.

Colleagues who work in the tuition-driven sector, what would you like to see at a conference that focused on the needs of colleges like yours?

Tuesday, November 14, 2017

Multi-Factor Placement with a Small Admissions Staff


At a meeting this week, I saw two articles of faith crash into each other.  I’m trying to sort out the pieces.  They were:

  1. High school GPA and course selection are better predictors of success in college than a single score on a placement test.  
  2. Hiring more staff (“administrative bloat”) is bad.

The two conflict, because collecting, interpreting, and applying high school transcripts and other forms of information is much more labor-intensive than simply getting a test score from a machine.  Selective universities and colleges have relatively large admissions staffs in order to sort through and compare these things.  We don’t.  We’ve never had to.  

If we want to improve placement, we need to hire staff.  The cost comes before the benefit, making it a hard sell.

Alternately, of course, we could go “full California” and simply use student self-reported GPA.  John Hetts did a presentation a couple of years ago showing excellent results from using self-reported GPA for placement.  But I can’t imagine that model gaining acceptance here without some sort of mandate.  It’s a bit too radical for most.

If every high school used the same grading system, it would be relatively straightforward.  But they don’t.  Some use 1-4, some use 1-5 (weighting for honors classes), some use 1-100, and some use A-B-C-D-F.  Each one calculates GPA slightly differently.  
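To make the data problem concrete, here’s a minimal sketch of what mapping those scales onto a common 4.0 scale might look like.  The conversion rules are placeholders; real transcripts would need per-district rules, honors weighting, and plenty of human judgment.

```python
# A minimal sketch of normalizing high school GPAs from different grading
# scales onto a common 0.0-4.0 scale. The conversions are simplistic
# placeholders, not a real placement policy.

LETTER_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

def normalize_gpa(value, scale):
    """Map a reported GPA onto a 0.0-4.0 scale."""
    if scale == "4.0":
        return float(value)
    if scale == "5.0":                     # weighted scale; crude linear rescale
        return float(value) * 4.0 / 5.0
    if scale == "100":                     # percentage grades; 60 treated as passing floor
        return max(0.0, (float(value) - 60) / 40 * 4.0)
    if scale == "letter":
        return LETTER_POINTS[value.upper()]
    raise ValueError(f"Unknown grading scale: {scale}")

# Example: three applicants reporting GPAs on different scales.
print(normalize_gpa(3.4, "4.0"))     # 3.4
print(normalize_gpa(88, "100"))      # 2.8
print(normalize_gpa("B", "letter"))  # 3.0
```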

The Accuplacer survives as a placement instrument not because it lives up to its name, but because it’s easy and cheap to deliver quickly at scale.  Getting to something more accurate would require spending more money and person-hours upfront, for an improvement that would be difficult to quantify for some time.  That makes the payback hard to measure against other possible uses of limited resources.

The core of the issue is that improvement sometimes requires investment.  Or, to put it more bluntly, money.  

Bailey, Jaggars, and Jenkins’ book Redesigning America’s Community Colleges notes that many of the most effective interventions reduce the cost per graduate, but raise the cost per student.  If you’re funded per FTE, and funding is tight, that creates a cruel dilemma.  We know several changes that would make significant differences, but each of them has a non-trivial cost that comes before the prospective (and unquantified) benefit.  

When you have several possible interventions that carry similar upfront costs, and very little money with which to work, it would be helpful to be able to compare the expected payoff of each.  But we’re just not there yet.

So, this one is a little more “inside baseball” than I usually go, but hope springs eternal.  Has anyone found a reasonably effective way to do multi-factor placement at scale when you can’t hire a bunch of new staff people to evaluate transcripts?

Monday, November 13, 2017

The Limits of Collaboration


Apparently, California is mulling creating a statewide online college, and it’s looking at three different models with which to do it.  Model 1, which could work, involves designating one community college to be its home.  Model 3, which could work, involves creating an entirely new organization.  Model 2 involves a consortium.

California, I know we’ve had our differences.  I don’t care for quinoa, and the ocean is on the wrong side.  But for the love of all that is holy and good, don’t choose option 2.  It won’t work.

I speak from experience.  Massachusetts tried a version of option 2 about ten years ago, called Mass Colleges Online.  It relied on existing campuses to provide seats in online courses to students from other colleges; the idea was to accumulate courses from across the state so a student could take classes that her particular campus might not offer.

It made sense on paper, but it limped along for years until finally sputtering out.  

The technology itself wasn’t really the problem, which means that saying “but the technology is better now!” doesn’t really address the issue.  The problem wasn’t the technology or the pedagogy.

It was incentives.

Each individual college is likely to offer classes that it thinks will attract enough students to run.  Those tend to be the introductory gen eds, and the staples of high-enrollment programs.  Intro to Psych will run reliably, as will Intro to Business.  You don’t need a consortium for those.  Colleges won’t run classes that they don’t think will fill.  Those are precisely the classes for which a consortium matters.

Those small classes become tricky when it comes time to make the go/no-go decision on running the class.  Let’s say you have a minimum section size of 12.  You have only 8 students registered, 4 of whom are from other colleges.  By cancelling the class, you’ve just annoyed your counterparts at up to four other colleges.  If you defer to the consortium and run the class, now you have to explain to your annoyed local faculty why students from other colleges count for more than students at your own.  Good luck with that.

And that’s before getting to the finances.  If each college keeps the tuition from its own students -- and I’ll admit not really understanding how California does its finances -- then I’m inclined to cancel any section that doesn’t have at least 12 of my own students in it.  I still have to pay the professor and cover overhead; if only six of the students in there are from my campus, I’m losing money.  When budgets are tight -- generally, on days ending in “y” -- that’s a non-starter.  When everybody else in the system runs the same numbers and reaches the same conclusion, the consortium sputters.
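A toy version of that go/no-go calculation, with invented tuition and cost figures, shows why every campus runs the same numbers and lands in the same place.

```python
# A toy model of the section go/no-go decision. The tuition, cost, and minimum
# figures are invented for illustration; the point is the incentive structure.
def run_section(local_students, consortium_students,
                tuition_per_student=500, section_cost=6000, minimum=12):
    total = local_students + consortium_students
    revenue = local_students * tuition_per_student   # each college keeps only its own students' tuition
    if total < minimum:
        return "cancel (below minimum)"
    return "run" if revenue >= section_cost else "cancel (loses money locally)"

print(run_section(local_students=4, consortium_students=4))   # the post's example: below the minimum
print(run_section(local_students=6, consortium_students=6))   # meets the minimum, but local tuition alone loses money
print(run_section(local_students=12, consortium_students=0))  # runs, but needed no consortium at all
```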

Let’s say a student from Bakersfield takes an online class from a college in Fremont.  (Substitute whatever cities make more sense.)  The professor accuses the student of plagiarism.  Whose regulations and processes are in effect?  If it’s the college in Fremont, and it mandates a hearing, how does the student from Bakersfield manage that?  

Worse, neither curricula nor calendars are uniform across most states.  That means different add/drop dates, refund policies, bookstores, and the rest.  That doesn’t matter much when any given student only attends one college at a time, but when she attends more than one, the issues become real quickly.  Is my college obligated to provide tutoring for a student from another college?  If so, will we get compensated for that?  And don’t get me started on financial aid, which allows a student to receive aid only at one college at a time.

The appeal of the consortium model from the outside is that it promises something for (almost) nothing.  But that’s exactly why it doesn’t work.  Unless the budgets for individual institutions are adjusted to account for the new costs of inter-institutional collaboration (and small sections), the consortium will die of accumulated indifference.  

Think twice, California.  A consortium may sound appealing from the outside, but if you don’t align the incentives internally, you’ll just be repeating a ten-year-old mistake from a small state where the ocean is on the right side.

Sunday, November 12, 2017

Across the Class Divide


This weekend I took The Boy to Boston, for the first two of what will eventually be several college visits.  It was a terrific chance to spend some time with him, even if much of it was necessarily “windshield time,” and we even got to see some old friends while we were there.  As a recent New Englander, I knew the territory well enough to stop at Frank Pepe’s pizza in New Haven on the way.  (The slightly charred pepperoni is _amazing_.)  But having spent the last decade and a half at community colleges, I couldn’t help but notice the stark class differences.

TB has declared forcefully that he wants to get out of New Jersey.  I get that; when I was his age, I ruled out anything in Western New York for much the same reason.  In his mind now, and in mine then, part of the point of college was getting some physical distance from your parents.  I had a good relationship with my Mom, and he has a good relationship with us, but there comes a point when it’s time to stretch out.  I was the same way, so I don’t hold it against him.  Sorry, Brookdale, but you’re just too close to home.  That may be culturally specific, but so is he.

He’s thinking of becoming a surgeon, so he’s looking at pre-med programs.  His criteria are different than mine were, but just as idiosyncratic.  I wanted a small liberal arts college setting; he wants big and urban (and cold).  So off to Northeastern and BU we went.  It was certainly cold.

(At one point, we ducked into a used record store, seeking warmth.  I tried to explain some classic album covers to him, mostly in vain.  By the time we got to Emerson, Lake, and Palmer’s “Tarkus,” I just muttered “it was the 70’s” and moved on.)

College visits are more structured now than I remember them.  It used to be that you got there on a weekday during regular business hours, and students would lead periodic tours.  Now you have to pre-register and check in on arrival.  I’m told that they use registered visits as signs of “demonstrated interest,” which count in your favor at admissions time.  (Add that to the list of ways that low-income and first-generation students fall behind.)  Both schools started with group presentations featuring an admissions rep and a high-achieving student before turning us loose on tours.  (I had to smile when they did a show of hands to see which states people were from; New Jersey was a clear majority in both cases.  The public schools here were closed on Thursday and Friday of last week, so it was a popular college visit time.  TB saw a friend from his school at Northeastern.)  

To be fair, residential universities with research profiles will be different in some predictable ways from community colleges.  But the sheer monetary difference was staggering.  Yes, the tuition -- a full order of magnitude higher than ours -- but that’s only a part of it.  They offer far more options than most community colleges ever could.  The facilities are vast, modern, and impressive.  The cars parked along Bay State Road, at BU, included multiple Porsches.  The student presenter at Northeastern mentioned the discount that students get for Boston Symphony Orchestra concerts, and that she and her friends dress up and go monthly.  One Dad asked at Northeastern what their adjunct percentage is; the guide replied “zero.”  (The guide at BU didn’t understand the question.)  The racial composition of the student body at both schools was visibly different from that of most community colleges.

I heard words like “co-op,” “internship,” and “abroad” a lot; words like “diversity,” “basic needs,” and “preparation” not at all.  I didn’t pick up on any economic anxiety among the students; the only anxiety in the air was about getting in.  In the community college world, that’s reversed.  And it probably goes without saying that remediation didn’t come up.

I don’t begrudge those universities what they have.  But it’s hard not to notice that the divide, already glaring, is getting bigger every year.  In America, we seem to have decided that elite education is worthy of tremendous support, but that mass education is a deadweight cost.  That may sound hackneyed or ideological, but if you go physically from one campus to the other, the difference hits you in the face.  It would take active effort not to see it.  

In my preferred world, we’d be closing the gap, rather than expanding it.  We’d look to offer the same sorts of opportunities to the students for whom leaving home isn’t an option, or even for whom “home” remains an elusive ideal.  We’re not there, which is one thing; as a society, we’ve nearly stopped trying, which is much worse.

In the meantime, my job is to help TB do what he wants to do.  To his credit, he has a much better sense of some of these disparities than I did at his age.  And he knows what he wants much more clearly than I did.  I don’t know where he’ll go, but I know the school will be lucky to have him.