Monday, June 6, 2011

Higher Ed, Moving Technology and Trying not to pick fights with Lauren

Before we begin, two things:

1)  Go read Lauren's post. 
2)  I agree with a very large chunk of what Lauren says and/or brings up from the article she's citing.

I worded something funny in my response to Lauren's post via Twitter, and I am afraid she took it as me saying "this will never happen" or "you're wrong," neither of which was my intention.  I love what Lauren has to say, and unlike what you do see occasionally (e.g., "how do we turn all these college classes into video games?", which I've actually been asked before), the ideas make sense. It's mixing the technological with the sociological in a way that understands the dynamics of a bad situation and proposes plausible solutions.

Now, what started the great Twitter debate of 2011 was that I made a comment about cost and culture as barriers. What was I on about?

Some background:  Of my 13 years since graduating, I've spent more than ten working in higher education (and a year in the employ of higher ed prior to graduation).  In that time, I have always been employed in offices which have been responsible for rolling out new technology to universities, faculty, researchers and students (going back to when pages were written in this stuff called HTML), several years getting courses online in all shapes and forms, and now working in Digital Libraries (or, as I wish to re-configure it:  Research Networks).  So let me share a few things.

If there's risk here of crossing swords with Lauren, it's in noting that a lot of what she pitches isn't news on most college campuses.  Rather, there is very little incentive for anyone to change the current model.

Yes, this is a depressing, depressing thought.  But what sort of blog would this be if I didn't go into excruciating detail to talk about the situation?

Changing the Game and What a Prof Wants

Mostly, though, when we start talking educational policy, theory and the integration of technology, it's important to know that universities are shelling out what they can afford to make change, and trying to work with instructors to make improvements.  You're going to be hard-pressed to find a major university that doesn't have offices like UT Austin's Center for Teaching and Learning, or offices within colleges such as the Faculty Innovation Center at the Cockrell School of Engineering.  Universities are littered with folks with degrees in Instructional Design/Technology, technologists bringing new ideas to campuses, offices of assessment that analyze more data on the university than you can shake a stick at, etc... In short, just because you haven't seen the changes happen in every classroom doesn't mean there aren't people working on this stuff.

These offices are the places that spend all of their time considering trends in higher education, dissecting all sorts of educational theory, plotting technology-in-the-classroom integration, et al.  They're out there talking about collaborative, assisted education, outcome- and project-based curricula, what-have-you.  Unlike the TED-talkers, these guys aren't cranking out a few ideas and waiting for applause from true believers; they're on the ground trying to enable change.

However (and you knew this was coming), generally those offices wind up helping only a portion of the faculty.  I'm not going to talk about what should be here, I'm going to talk about what is, because these are two very separate things.*
  • Within a university, colleges, departments and faculty are not rewarded for innovation in instruction, but they will do it to remain competitive with their peers at top-tier schools who generally have financial and staff support to improve their courses, and far more latitude to experiment. 
  • Most disciplines ask tenure track faculty to focus on research and publishing.  I have seen very little evidence that faculty are able to count points toward tenure based upon innovation in instruction.
  • Once tenure is had, you will see some faculty really go to town on becoming better teachers.  Many, many others buy out their teaching time or teach the same two classes year after year.
  • Faculty are experts in a field or science; they are not trained educators (you can expect this to be untrue in the College of Education, I suppose).
  • Many faculty at research institutions have an expectation that part of higher education is the "sink or swim" model, and that their role is to provide information and assessment.
  • The overriding sentiment is that college was competitive and difficult in their day, and it's a matter of character that those who show up to get credit best be ready to deal with what's handed to them.
If that sounds overly cynical or rough, know that I'm speaking from having worked in higher ed only in Engineering, with occasional work alongside scientists, and from my view as a student in Communications and Liberal Arts (and, briefly, as a Masters candidate in Instructional Technology until I realized I really didn't want to write curricula or study learning theory until I croaked).  But it isn't like you don't see trends.

It's likely worth noting that most suggestions for the implementation of technology are very learner-focused.  They match a best-case scenario for students.  This doesn't jibe very well with how faculty have traditionally taught, which gets back to the points above.  Unlike K-12 education, once you get past entry-level courses in higher ed, each course and the person delivering it is considered part of the value (your learner's mileage will vary on that sentiment).  Under this model, it's simply cheaper to have the person delivering the course in the room talking at students than to spend time and money creating complex and expensive systems which can't be re-used should your faculty members swap out.

Now, what's interesting is that if you look at University of Phoenix (the for-profit online and face-to-face university), there's a model there that is actually not entirely dissimilar to what a lot of educational theorists like.  Courses are entirely online, there's a project-based, standard curriculum that is mediated more than taught, and few online students ever meet their colleagues, yet they work and learn through collaborative courses, staying (in theory) with a cohort from class day 1 through graduation.

This model hasn't been widely adopted, and hasn't won UofP many fans outside of the state of Arizona, meaning many employers still do not recognize a UofP degree the way they might one from Backwater State U.  I cannot, with any confidence, state that those opinions are changing. 

Asking a tenure-track junior faculty to spend significant time redesigning a course and curricula that will pace to an individual learner's needs?  When that same faculty/ researcher is also in the lab or writing papers or researching or trying to get published, all of which takes up a huge amount of time?

By the way, if I can point to one thing suggested with great frequency, and which the fellow Lauren quotes pitches: do not play the MTV Gambit.  It's my experience that you do not try to tell faculty that their courses need to be video-game-like or ready for the MTV generation.  (a)  The MTV generation is in its mid-30's, and (b) very, very few faculty see their job as entertainment.  Bear in mind the point above about an education being something people earn, and note that what I'm counting as 13 years of being told people will quit coming to college if we don't turn the class into Zelda... hasn't happened.  I'm not saying kids wouldn't love it, but from the other side of the coin...  it's hard to incentivize the instructor.  To those who treasure their mortarboards and gowns, it's a bit like saying "but can't we spice up how Congress works for the MTV generation?"

I firmly believe in gaming and simulation as part of learning, but I can admit that the pay-off doesn't scale very well for any course with a seat count below the 400-seat History 101 classes.  And you also have to wonder how much you want a standard curriculum at school after school (hint: most universities do not want it, but we'll see how things look if the recession continues).

If no faculty were willing, I wouldn't have had a job at a couple of my different employers, so let's be clear.  But their numbers are not in the majority, and it's always a sales job (and half the effort is in keeping faculty on the rails and not doing something awful and weird).  Faculty need to see that the students will be happier, and that there are positive results (i.e., grades go up, students go to the dean less often complaining about the abusive treatment received, etc...).  Whatever you bring to the instructor has to be extremely lightweight for them to use, and it has to work on the first try.  Your technology must have repeatable results, be something they can brag on when they show it to their colleagues, and be something that they can manage.  Oh, and, yeah, it can never, ever go down or fail.  Failure means that it's always easier to just go back to how things were before these crazy kids showed up with their gizmos.

Then, of course, you have to be ready for the students who are going to misunderstand and crater once they aren't in a traditional classroom, where they've previously thrived.

To be truthful, most of what Lauren described has long been talked about, but now that technology seems like it's catching up to The Diamond Age, things are going to get interesting.  Likely one of the toughest hurdles to clear will be exactly how faculty remain within the structure (or, as we'd say in higher ed, "paradigm") to ensure that content is the latest and greatest, rather than expecting robots and pre-created content to churn out the perfect pupil.

Keeping the content updated, reliable, working, etc... brings us to our second point...


The issue with eLearning and technology solutions is that, as powerful as they can be, introducing any new technology into the classroom has, thus far, had an extraordinarily high cost and extraordinarily short shelf life in comparison to whiteboards and markers.  Your award-winning effort of two years ago can be made moot or ridiculous by the next emerging technology and/or educational practice du jour.  You may have also spent literally tens of thousands of somebody's money getting there, only to see Slippery Pete's EduWare release a free piece of software that's ten times slicker than your tool, just about the time you're ready to present on it. 

For this reason, higher ed loves Open Source.  There's little upfront cost, and higher ed often collaborates, so it's easier to know what's going on and to contribute.  But open source is (as we say) the Free Puppy Model.  Yes, the puppy is free, but you need to feed it, take it to the vet, etc...  Nothing is ever actually free, but in universities it's usually better not to put a huge vendor cost down in front of the purse holder.  In general, my observation is that the costs wind up evening out, as Open Source generally means that the solution is not, as we said back in 2003, turn-key, and work must be done to manage and maintain the systems rather than picking up the phone and yelling at your sales rep to fix the damned thing.**

During the past decade, public support for education has become incredibly low in states like Texas and Arizona (California is going to be an interesting case, as public-referendum-mandated state law binds them to all sorts of higher-ed services, but the tax dollars aren't there).  During high times, legislators decided to cap tuition (which they did in Texas), expecting others would pick up the tab.***  During low times, in order to balance budgets, the constant slashing of the state budget for higher ed (as they've done in Texas, California, etc.) has meant a reduction in all sorts of services at your local college.  The two together have put higher ed in a very odd place, new since WWII, in that no matter the state of the economy, it's time to slash and burn.

With the poor economy, the fund-raising (or, as it's referred to in higher-ed speak: development) side of the pie has also been greatly reduced, as wealthy alums aren't so wealthy anymore, or are investing in their own portfolios to remain afloat.  And, yes, corporate grants have followed suit.****

Unless a specific grant is landed to attempt a particular new model for education, funding can be withheld in favor of projects that are considered rock solid, re-usable, long lasting and which will affect and support the most students.

Can the University Mandate Changes?


Well, let me back up.  Yes, they can.  If a Provost or President mandates changes - and is willing to go to the mat with their faculty to enforce that change - then one supposes that anything (properly funded) could occur.  And, despite what I said above, it could happen were funding and support readily available.

This is a fundamental revision of educational delivery.  But...  in the next few years, someone is going to deliver platforms that can handle this as easily as Learning Management Systems.  LMS integration at major schools took about 7 or 8 years to reach the point where the challenge isn't for the instructors, but for the IT folks to keep it running (and, in the early days, for certain staff to guide faculty).  But an LMS is also just a handy insta-class-website tool, not a change in the educational paradigm.

University culture is notoriously independent to the point of abso-ludi-crous-ness.  Until about two years ago, departments at UT were running email servers under their desks rather than give in and just admit it makes more sense for everyone to be on the same email system.  At this late date, all those servers are still running, and we're just waiting for the right people to quit paying attention before they get shut down.  But it's never been the business of the university to do more than assess and try to help faculty who admit they need help (or, I suppose, those who get such bad evaluations that somebody feels the need to intervene).

And so, in conclusion:

I don't want to paint the change of education as a hopeless pipe dream, but it's going to take time, it's going to have to be easier for faculty, and it's going to need to be cheap.  Take heart, you dreamers of a better world: your advocates are out there, and they are trying.

To see change, faculty have to believe that they aren't doing something correctly, and higher ed is a pretty big echo chamber reinforcing some pretty bad habits (it's SOP in undergraduate engineering to believe that if all students earn 50% on exams and the instructor just curves the grade, then things are working - somehow instructors believe that doesn't mean students missed half of the material taught to them).

It's going to happen.  Slowly, painfully, and a lot of people will need to die or retire before you see a sea change in how education occurs.  But it'll happen.

*In the world of what should be, I am captain of a crack commando squad bringing our own special brand of justice to the streets; in the world of what is, I am sitting on my couch in need of a shower.
**One thing nobody really talks about is that we insist education, K-12 and higher ed, be "wired" without considering what that's looked like on the ledger.  Universities spent a lot of money ramping up IT in the 1990's and have absorbed a terrific cost in hiring systems folks, programmers, security people, networking people, etc...  none of which was part of HR consideration two decades ago, when the height of classroom technology was an overhead projector and university IT systems = telephone exchanges.  This cost has been passed along to the consumer, to an extent, and staffing has gradually been reduced as efficiencies have been found, but, of course, then you have this whole expensive IT staff you've got to pay somewhat competitively.  Very fortunately, the cost of IT infrastructure itself seems to drop quite a bit on a continual basis, and universities opening themselves up to the use of commodity IT resources has helped to lessen the blow.
***By the way, part of the skyrocketing cost of higher ed is not bloated university salaries or gold-plated chairs for faculty.  It's the cost of benefits (i.e., health insurance) for staff and faculty.  Staff salaries are usually below those at private companies, but the understanding has been that the benefits are good.  This is increasingly untrue, and good people are pulling up stakes and moving on.  We lost a quarter of our own staff in the past few months.
****I've seen the other side of this during boom times.  Having a corporate patron trying to think of ways to do new, cool things is a very strange experience in a place where ordering a white board marker requires multiple signatures.


Fantomenos said...

Interesting observations. I'm teaching my first online course as we speak, at a very small university where I have total freedom and no budget.

What am I using? iPhone, Screencast-O-Matic, YouTube and Moodle. Oh, and Excel for HW assignments (it's a business statistics course). Had I not had 3 years of experience as a software engineer, I don't know what I would do; I feel like I'm woefully behind the curve as is.

But, I think it's going ok, although I don't know how it would scale.

Your observations about research universities are pretty spot-on, this won't be a lightbulb moment when everyone sees the value in this, we just need the baby-boomers to hurry up and retire already so we can make some progress.

horus kemwer said...

I was puzzled by the linked article - it started with the problem of attrition of women and minorities in CS, then it moved to a call for online / whimsical education. As you (rightly) point out, whimsy / entertainment value isn't (and, importantly, shouldn't be) the issue.

Somewhat more important, however, I think is the original question about how to improve the gender ratio in an area like computer science (I say this coming from the single worst area of humanities for gender ratios, much worse than most sciences, though not quite as bad as CS). Online education is only going to help with this if it is features of the current educational experience (face to face contact) which are contributing negatively to the gender ratio. Maybe, but that's pinning all the blame on instructor bias and poor social skills of one's peers (mentioned in the article) and avoiding, say, prior socializing experience (e.g. in high school, also mentioned in the article). I'm not sure that this is the right analysis.

Speaking as someone who's both taught online (at an all online HS for gifted kids) and face to face at the undergraduate level, face to face education is unequivocally better. The benefit of online education in the HS I taught at is that it was able to bring together students from different countries who shared interests and skill level. This outweighed the difficulties in maintaining discipline, attention, etc. in the class room. But I've been on both sides of the online educational experience. Moving to computer education is definitely not unequivocally a good thing. Furthermore, don't imagine that all the face to face problems one might have (a boring instructor, etc.) can be fixed in any easy or obvious way in the online environment.

So, I'd say, yes: a powerful tool, something we'll eventually see more of in future (and we should). But it will never be a replacement for (nor ever be as effective as) face-to-face instruction. Finally, the problems with gender ratios run deeper than anything that can be fixed by switching to an online environment - that's simply a red herring; these are totally disjoint issues.

The League said...

Well, the solutions proposed are technological in nature. With the plausibility of those solutions being created and implemented, I wanted to discuss why the technological solution can often be frustrated by outside forces.

I agree with much of what Lauren says about societal bias, etc... To me, there's not a lot of room for debate. While I did not work in the sciences, I did work in engineering, and I can count heads and see something outside the classroom is happening when the numbers are so uneven (gender-wise) in these programs beginning prior to Engineering 101. Those enrollments become even more lopsided for advanced degrees, which is kooky when you look at the numbers beginning to skew heavily toward more women than men entering college.

While day-to-day instruction in these fields is pretty strictly gender-neutral (and certainly culturally neutral, if the number of international students thriving is any indication), bias in education is generally considered by those studying pedagogy to be something that can be managed. What CAN we do to cross that barrier and begin home-growing young women and other under-represented populations into the engineering or computer science classroom? I agree with Lauren that many of the tools that she lists would make inroads.

Unfortunately, it also set off a mental time bomb. I've sat through too many webcasts of TED-talks, etc... where a "guru" proposes some solution to their audience of disciples describing how if educators were smart (implication: "like me"), they would implement this or that technology, and it wouldn't just make education more efficient, it would address social inequity.

Pie-in-the-sky is where you start, but it's seldom acknowledged that universities aren't short on ideas or attempts. As you know, they're a complex ecosystem, and I was attempting to inform.

So, if we're going to discuss how universities can be better at education, let's discuss what's actually currently happening and why it's slow to change. It's something I have some experience with and passion for, and I'm happy to discuss.

I'm out of elearning now, but I'm dealing with similar issues from a parallel track in working with research university libraries.

horus kemwer said...

I understand that most of your points were orthogonal to the question of how to neutralize gender imbalance, and I'm sympathetic to your realist take on the implementation of any radical changes in teaching technology.

My basic point was that online education is not better education (even given current best technology which, as you point out, has all kinds of practical barriers to receiving widespread implementation). It's not clear to me how moving to a poorer education format is going to improve gender imbalances in any field. And if it does, it's not clear to me that the cost is worth it. As far as I could tell, every single one of the suggestions in the other blog post was predicated on the idea that moving to an online environment would improve the learning experience. But if it's harder to learn CS well in an online environment (and maybe CS is radically different in this respect than other subjects with which I am more familiar—if so, I'd like to hear an argument to this effect), then it's not going to help anyone learn, no matter what their gender.

The League said...

Ah. No, going online is not inherently better. Of course not! It's full of all sorts of hurdles and challenges that you'd never have in the classroom experience.

I do think you need to turn what teaching is a bit on its side in order to start looking at how successful online programs are deployed.

Much of the work is self-paced and self-managed, and replaces lecture with readings and activities, projects, and assessment. In this model, the person listed as "instructor" acts more or less as a moderator and overseer for public forum questions and private questions from students.

Pedagogically, it fits very neatly in line with project- and rubric-based educational theory (which may be out of fashion now; it's been a few years). The stand-up lecture is now considered sooooo 20th Century in EdTech circles. And in that model, the need for an instructor to stand up and talk for 3 hours a week really doesn't exist. Classroom time is used differently, for student collaboration, discussion, etc... (which can be replicated online - and, in this case, a bit more anonymously).

What I was addressing with my many stabs at the cost issue is that elearning hasn't exactly been proven to reduce cost, create less work for instructors, etc... In fact, it can create a lot of time-shifted work. And upgrading or re-creating course content has proven to be extremely expensive in comparison to 15 weeks of lectures and scantron exams.

But elearning does remove some of what are considered the intangibles reported in higher ed, such as women not feeling welcome or comfortable in face-to-face technology programs, and removes issues like being shouted down in class, in favor of a username and a near-anonymous presence online.

To be honest, I'm making a leap, because I haven't seen any statistics in years, and I'm making suppositions based upon what the reported challenges were in engineering when I worked with UT's Women in Engineering Program, and what eLearning can bring to the table (as well as best practice from my Educational Technology days). But I don't think I'm leaping too far.

horus kemwer said...

Hmmm, well I agree with a lot of this. Anecdotally, I saw students at the online high school collaborate in teams without an array of prejudices (on the basis of age, appearance, size, etc.) which quite likely would have operated in a physical environment. On the other hand, even with today's networking tools, I don't think the collaborative environment available online is anything like as rich as that possible in a face-to-face environment.

Again, anecdotally, I have to completely agree with the "time-shifted work" point. For me, however, among the biggest drawbacks of teaching online (and I at least had regular realtime sessions with students, which are not part of your basic online course proposal in a lot of situations) were the enormous difficulties in communication and in maintaining and controlling attention.

re: communication - the fact of the matter is, a good teacher learns a lot from the expressions on his students' faces. He can tell if they are following him and modify his teaching accordingly. Even in moderate sized classes. Of course, this assumes some level of care / attention on the part of the instructor, but there's a lot of positive feedback there that's unavoidable and automatic, I think, even for mediocre instructors. All of this is gone w/ the current technology for online teaching.

re: attention, this is becoming more and more an issue in physical classrooms as students check email on laptops and text each other. But attention issues are much worse in a completely online environment. Uniformly, I found that retention was significantly worse for students watching online lectures than it would have been in a physical classroom. Even during real time discussions, the distractions available to students on their computers outweighed those for a student w/ a laptop in a physical classroom. (And let's not even get started on the potential for cheating, etc.)

I guess bottom line is: I think online education will one day raise the baseline lowest common denominator education available for a large segment of the world's population (and, as you point out, that day will not be soon), but I don't think it will ever replace top-level education.

And, just to return to the original point about gender imbalance, this gets worse and worse as education levels increase. The Ph.D. stats are downright dismal . . .

The League said...

From my observations, it depends on audience, instructor experience, design of course and a whole bunch of factors that aren't as easy to manage as "instructor walks into class, class begins".

Full disclosure: I was responsible for about 1000 hours of broadcast online lecture per semester, managing around 30 faculty and around 350 students per. It was a factory and I was the grizzled foreman.

The programs we delivered were all Masters of Engineering, and between the competence of practicing engineers and their drive to earn a masters in a way that kept them on the job, they were very driven. Students also usually were being reimbursed by their company for tuition and fees, but only if they got an A or B, so... motivation.

We compensated faculty for each student, and my office ran on fees associated with online enrollment. So we were highly motivated to keep quality high.

The faculty made special time for online students, creating online-only office hours to compensate for the difference in communication. Truthfully, I didn't understand how some courses managed the labs and homework.

But that's a terrible model, and not one I feel is superior. I've also worked on lecture courses where discussion boards and forums were the primary means of communication. I've worked on courses that were project based, with timed conference calls. All sorts of stuff.

I guess this stuff is more complicated than I let on, but you CAN design a course broken down into modules that let you know where your students are, and that will let you know if the flaw is in what you're delivering. But it's a @#$%-ton of work, and requires a hell of a lot of monitoring.

These folks might be worth looking at some time:

Again, it's cheapest and fastest to work off those cues that don't cost anything: students' faces, hands raised, seeing who wants to continue the discussion outside of class, etc...

Moving online removes the non-verbal cues, and it's certainly work (and often money) to figure out what methods work best for you and your learners based on the type of material.