every conversation with faculty members about copyright goes something like this…
long, rambling post alert. it’s been a while since I’ve posted, so lots of things have been stewing. bear with me.
It’s fashionable to hate the LMS. It’s the poster child for Enterprise Thinking and lazy (online) pedagogy, so it is easy to rail against the LMS as The Cause of All Educational Evil. The LMS is put into the stocks, and we are expected to stand in the town square and throw rotten fruit at it.
We’re pushed into a false binary position – either you’re on the side of the evil LMS, working to destroy all that is beautiful and good, or you’re on the side of openness, love, and awesomeness. Choose. There is no possible way to teach (or learn) effectively in an LMS! It is EVIL and must be rooted out before it sinks its rotting tendrils into the unsuspecting students who are completely and utterly defenseless against its unnatural power!
I feel like I’m cast in the role of an LMS apologist, because I have a more nuanced approach.
I have been an advocate, proponent, supporter, and contributor to open source communities, open content licensing, and generally sharing stuff because why not? I have also played a key role in the recent adoption of a new LMS by my university. But. How on earth can I reconcile these two diametrically opposed world views? Gasp.
It’s almost as if different tools are used for different purposes.
When I think about the LMS, and its role in the enterprise, this is what makes many peoples’ hair stand on end. THE ENTERPRISE HAS NO BUSINESS IN THE CLASSROOM! etc. Except that’s largely bullshit. Of course classrooms are an Enterprise issue – whether physical (buildings and facilities are expensive to build and maintain, and need to be managed properly etc…) or online.
But, the argument goes, online means there are no rules, no boundaries, no constraints. People should be free to do whatever they want.
That’s great – I think it is truly awesome that people can craft their own online environments, to support whatever online activities they want to do. And that instructors, staff, and even students (gasp!) can do this stuff on their own, with no interference or meddling from The Enterprise.
But. We can’t just abdicate the responsibility of the institution to provide the facilities that are needed to support the activities of the instructors and students. That doesn’t mean just “hey – there’s the internet. go to it.” It means providing ways for students to register in courses. For their enrolment to be automatically processed to provision access to resources (physical classrooms, online environments, libraries, etc…). For students’ grades and records to be automatically pushed back into the Registrar’s database so they can get credit for completing the course. For integration with library systems, to grant access to online reserve reading materials and other resources needed as part of the course.
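At its core, that enrolment-driven provisioning is a reconciliation problem: compare the registrar’s records against what the LMS currently has, and compute who to add and who to drop. Here’s a minimal sketch of the idea. To be clear, `Enrolment` and `sync_enrolments` are names I made up for illustration; real PeopleSoft/LMS integrations are far messier than this.

```python
# Illustrative sketch only. Real registrar/LMS integrations involve batch
# feeds, APIs, roles, sections, etc. -- this just shows the core reconciliation.
from dataclasses import dataclass

@dataclass(frozen=True)
class Enrolment:
    student_id: str
    course_id: str

def sync_enrolments(registrar: set, lms: set):
    """Compute the provisioning actions needed to bring the LMS
    in line with the registrar's records."""
    to_add = registrar - lms    # newly registered students to provision
    to_drop = lms - registrar   # students who dropped, to deprovision
    return to_add, to_drop

registrar = {Enrolment("s1", "CHEM201"), Enrolment("s2", "CHEM201")}
lms = {Enrolment("s2", "CHEM201"), Enrolment("s3", "CHEM201")}
adds, drops = sync_enrolments(registrar, lms)
# adds == {Enrolment("s1", "CHEM201")}; drops == {Enrolment("s3", "CHEM201")}
```

Multiply that by tens of thousands of students, a few thousand instructors, and daily (or hourly) registration changes, and it’s obvious why this can’t be done by hand.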
Anyone who pushes back on this hasn’t had to deal with 31,000 students, and a few thousand instructors. This stuff needs to be automated at this scale. Actually – “scale” is another divisive issue. Why worry about scale? SCALE? WILL IT SCALE? As if scale is irrelevant. If a university needs to deal with tens of thousands of students, I assure you that scale is absolutely relevant. Anyone who thinks we shouldn’t spend time worrying about providing a common and consistent platform as a starting point needs to spend a week helping out at a campus helpdesk, answering questions from instructors and students.
OK. So the LMS is primarily used by institutions to make sure that there is a common starting platform for online courses. That courses are automatically created before a semester. That students, instructors, TAs, etc… are given access with appropriate privileges. That archives and backups are maintained. That records of activities and grades are kept. This is the boring stuff that is supposed to be invisible. But, it’s necessary if we are to responsibly teach online.
If instructors and/or students want or need to, they can of course do anything else they feel like doing online. Providing an LMS doesn’t mean “YOU SHALL NOT USE ANY OTHER TOOL” – there is no mandate to say “ONLY THE LMS SHALL BE USED”. It’s a starting point. And for some (many? most?) courses, it’s sufficient.
GASP! THE LMS IS SUFFICIENT? HOW CAN HE SAY THAT? BURN THE HERETIC!
Calm down. Take a step back, and think about some of the courses at a university. How about, say “Introduction to Chemistry” – yup. An LMS is entirely sufficient for that kind of course. Provide course info, share documents, maybe do some formative or summative assessment, and store some grades. LMS? check.
How about, say, “Calculus III”. Same pattern. LMS? check.
“Introduction to Shakespeare”? Students might want to blog about passages in Othello. Or link to performances of Macbeth. Maybe post photos of a campus production of King Lear. Great! Throw in a blog. Use the LMS for the basics, and do other things where needed. The LMS course becomes a source of links to other resources, and takes care of the boring administrative stuff.
But – why wouldn’t the instructor for the Shakespeare course want to be completely free of the shackles imposed by the LMS? THE SHACKLES! They might. Or, they might want to have a private starting point, before moving out into The Wide Open.
Even if the instructor decides to completely ignore the course shell that’s automatically created in the LMS, and go out on their own – say, using a WordPress mother blog site – they still need to take care of the boring administrative stuff. They’ll need to come up with a system for adding students to the mother blog site (and removing students when they drop the course). They’ll need to come up with a way to store grades (unless they’ve been able to convince administration and students that grades aren’t necessary – I haven’t met anyone who’s had luck there). They need to keep adding features to their custom website, until it starts accumulating lots of bits to handle the boring administrative nonsense.
Eventually, you come up against Norman’s Law of eLearning Tool Convergence:
Any eLearning tool, no matter how openly designed, will eventually become indistinguishable from a Learning Management System once a threshold of supported use-cases has been reached.
The custom platform starts to need care and feeding, maintenance, hacks to import and export data. It starts to smell like an LMS. So now, instead of a single LMS that can be supported by a university, we have an untold number of custom hacks that must all be self-supporting.
And here is where the pushback from the Open camp is strongest – BUT WE DON’T NEED OR WANT SUPPORT. JUST LET US DO OUR THING!
Which is great. Do your thing. But, what about the instructors (and students) who don’t have the time/energy/experience/resources to build and manage their own custom eLearning platform? Do we just tell them “hey – I did it, and it wasn’t that hard. I can’t see any reason why you can’t do it too.”? That starts to smell awfully familiar.
Which brings me back to my personal position on this. There is room for both. Who knew? The LMS is great at providing the common platform, even if it’s just a starting point. And the rest of the internet is awesome at doing those things that internets do.
“GREAT? NO WAY! THE LMS MAKES PEOPLE TEACH POORLY!”
No. It might make it easy for lazy people to just upload a syllabus and post a Powerpoint and think they’re teaching online. But that’s no different than physical classrooms being used by lazy people to show endless Powerpoint slides punctuated by more slides. Lazy teachers will teach poorly, no matter what tools they have access to. Just like awesome teachers will teach well, no matter what tools they have access to. The LMS is not the problem.
“But – why waste taxpayer dollars on an LMS at all? Just cancel the contracts and use the money for other stuff!” Um. It doesn’t work that way. We have a responsibility to provide a high quality environment to every single instructor and student, and the LMS is still the best way to do that.
And, although the costs have risen rather dramatically in the last decade, and seem ungodly high in comparison to, well, free… universities spend an order of magnitude more on the software that runs the financial systems – stuff that doesn’t have any direct impact on the learning experience. Hell, there are universities that pay their football coaches more than what they spend on the LMS for all students to use (thankfully, my campus doesn’t do that). For universities with $1B operational budgets, this kind of investment in online facilities is almost lost as a rounding error.
Anyway. Whew. I’ll try to write some more on this. 1600 words of rambling is a sign that I need to work on this some more…
Looks like the Connected Courses open course thing is shaping up to be kind of awesome. This is a placeholder post to let it sniff out the feed for the #connectedcourses tag here on the old blogstead. Here’s hoping my copious free time will be put to good use.
Fall 2014 Block Week kicked off today, meaning we just pushed into the 2014-2015 academic year. Holy. The last one is basically just a blur. But, we did a surprisingly epic number of major things as a team1:
- Migrating from Blackboard to D2L in about 8 months, including:
- building and testing the integration with PeopleSoft & Elluminate
- designing and conducting workshops to support a couple thousand instructors
- working to help get the 31,000 FTE student body through the move
- building online resources to help, at the UofC’s elearn website
- Doing an emergency migration from Elluminate to Adobe Connect, in response to the Javapocalypse of January 2014
- Probably a bajillion other things that got forgotten in the blur. what a year.
To get the campus community through the whole thing, I’d been using a diagram to outline the flow and timeline:
The 2 stars indicate (left) when we got access to our D2L server, and (right) when we had to turn off access to the Blackboard servers. Everything was driven by those dates, and mapped out over the academic year with semesters defining the major stages. The surprising/amazing/relieving thing is that we actually stuck to the schedule. I didn’t have to revise that document once, after using it last summer to outline the process. Wow.
On top of that, the shiny new Technology Integration Group in the Taylor Institute for Teaching and Learning’s Educational Development Unit had a bunch of other stuff to do:
- providing instructor training and support for D2L and Adobe Connect (working closely with the Instructional Design team)
- launching the new Teaching Community website
- rebuilding the “team formation tool”, from an old java-based codebase to a modern application implemented using the D2L Valence API
- producing a pretty awesome student orientation video
- building a new intranet website to manage data within the EDU
- preparing a new website for the new EDU (to be launched later this month)
- building a mobile app for D2L, using the Campus Life framework
- supporting the campus blogging and wiki platforms
- investigating additional tools within D2L to support learning, such as ePortfolios, badging, repositories, etc…
- exploring other learning technologies, including beacons, and a long list of other things we didn’t have nearly enough time to play with…
So, while 2013-2014 was a year of pretty epic and overwhelming changes, I’m looking forward to the big pieces stabilizing this fall, so we can start pushing at the edges a bit more. We’ve got lots of ideas for things we can do, once the major changes are done for a bit. That roadmap will be sorted out later this month, but it’s going to be a really fun year!
- this was a truly multi-department interdisciplinary team, with folks from the Taylor Institute EDU and Information Technologies working flat out together to get stuff done [↩]
John sent a link to our loose group of cycling buddies, and I’ve read the article 3 times now. Each time, it feels like it hits closer to home.
I’ve been riding my bike as the primary way of getting around, and have been commuting by bike almost exclusively since 2006. I’ve always ridden, but never really considered myself a cyclist until then. I was never athletic, never good at sports. But I was happy on a bike. Over the years, I actually got pretty good on a bike. I could make it go fast. I could climb hills. I could ride far. It was awesome.
And then it started feeling less awesome. Most recently, with my bad knee. Late last year, I somehow managed to get a stress fracture at the top of my tibia. I didn’t even know it had happened, and only wound up at the doctor because I thought I was dealing with progressive arthritis or something. Nope.
We couldn’t find any specific incident that might have caused it, but the doctors thought it may have been related to repetitive stress and strain while riding ~5,000km/year. Which meant it was self-inflicted. I’d been pushing myself for the last few years to try to keep up that pace. And, while limping around like a 70-year-old, I realized that I hadn’t been doing myself any favours. One knee is already pretty much shot, the other is likely not far behind it. And pushing to hit 5,000km/year wasn’t helping things. I’m largely recovered now – the knee is still sore, and feels weaker than it should, but it works. Physio has helped, but it’s obvious I need to pay attention to it before it gets worse.
I’ve been tracking personal metrics since 2006 – with detailed GPS logs since 2010, thanks to my use of Cyclemeter. Recently, I’ve added Strava to the mix. I really notice that I push myself more when I know a ride will be posted to Strava – either I need to let go of that, or I need to stop posting rides1.
I’m not really sure why I was pushing myself to keep hitting 5,000km/year. I think it was the feeling of accomplishment, of achieving a goal that not many people do. Some kind of macho “I’m not getting old! look what I can do!” thing. Whatever. I’m letting that go. I’m still going to ride as much as I can, but I’m not going to push it. I’m going to slow down, again. And have fun.
I’m registered in the Banff Gran Fondo this weekend. 155km, from Banff to Lake Louise and back2. I had been stressing out, because I lost 6 months of riding – of TRAINING! – and there was no way I’d be able to keep up a competitive pace. But that’s OK. I’m going to go for a nice ride. Stop at the rest stops. Enjoy the mountains. And I’ll finish when I finish.
Trying out the new Blogo app for writing stuff on my WordPress blog site. Looks really promising!
C.G.P. Grey posted this fantastic video on the inevitability of automation, and what it might mean for society at large.
We think of technological change as the fancy new expensive stuff, but the real change comes from last decade’s stuff getting cheaper and faster. That’s what’s happening to robots now. And because their mechanical minds are capable of decision making they are out-competing humans for jobs in a way no pure mechanical muscle ever could.
You may think even the world’s smartest automation engineer could never make a bot to do your job — and you may be right — but the cutting edge of programming isn’t super-smart programmers writing bots, it’s super-smart programmers writing bots that teach themselves how to do things the programmer could never teach them to do.
via a post by Jason Kottke
For an extra-sobering good time, tie this in with Audrey Watters’ writing on robots in education.
The point of a lecture isn’t to teach. It’s to reify, rehearse, assemble and celebrate.
via Stephen’s Web.
Stephen ended his post linking to Tony’s blog post with what appears to be a throwaway line. It’s not. This is where the tension is centred when it comes to teaching. Lectures aren’t teaching, but have been used as a proxy for teaching because how else are you going to make sure 300 students get the appropriate number of contact hours? Butts-in-seats isn’t a requirement anymore. We can do more interesting things. And we can then use lectures for what they are good at. To reify, rehearse, assemble and celebrate.
It’s one of those things that sound unbelievably geeky – it’s like geocaching (a geeky repurposing of multibillion dollar GPS satellites to play hide and seek) combined with capture the flag, combined with realtime strategy games, bundled up as a mobile game app (kind of geeky as well), with a backstory of a particle collider inadvertently leading to the discovery of a new form of matter and energy (particle physics? a little geeky). It’s the kind of thing where peoples’ faces glaze over on the first description of portals and XM points, and resonators and links and fields.
One thing that’s been stuck in the back of my head as I worked my way up to Level 5 Nerd of the Resistance in the game, is the lack of an apparent business model. It’s a global-scale game, with thousands? millions? of users checking in from all around the world. There don’t appear to be ads in the game – I’ve never seen any – and there appears to be an unwritten rule that portals should be publicly accessible. That unwritten rule largely negates a business model that would have businesses pay for placement in the game in order to draw customers into their stores etc…
Niantic started the game in 2013, and launched it under the “release it free so we build a user base, then sell the company” business plan. It worked, as Google bought the company and ramped the game up. It’s now available for both Android and iOS platforms, free of charge, with no advertising or premium subscriptions or in-game purchases.
So, what is Google getting out of it? I think their largest draw is likely in crowdsourced geolocation of networks. They have every Ingress user actively (collectively) wandering the globe, reporting every wireless SSID and cell tower they come across, along with GPS coordinates. The game gently pushes players to stay at the location of a portal, confirming the geolocation and refining precision over time. It’s kind of a genius plan – it is constantly updating Google’s network geolocation database, which can then be used to more accurately track and target all users of the internet for advertising etc…. They’ve turned a bunch of nerd’s nerds into a crowdsourced network geolocation reporting system. And, at Google’s scale, it costs them a pittance to have this system running.
We may collect device-specific information (such as your hardware model, operating system version, unique device identifiers, and mobile network information including phone number). Google may associate your device identifiers or phone number with your Google Account.
When you use a location-enabled Google service, we may collect and process information about your actual location, like GPS signals sent by a mobile device. We may also use various technologies to determine location, such as sensor data from your device that may, for example, provide information on nearby Wi-Fi access points and cell towers.
Common TOS for all Google services, but especially relevant in a geolocation-based game that is actively pushing users to wander their neighbourhoods to gather this data and send it back to Google.
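The precision-refinement part is easy to picture as a running average: every time a player lingers at a portal, their GPS report gets folded into the estimate, and the estimate gets a little better. This is purely my guess at the general shape of the idea – `PortalEstimate` is a made-up name, and whatever Google actually runs is certainly far more sophisticated (outlier rejection, accuracy weighting, and so on):

```python
# Toy illustration of crowdsourced location refinement: fold each player's
# GPS report into a running mean, so precision improves as reports accumulate.
class PortalEstimate:
    def __init__(self):
        self.n = 0       # number of reports received so far
        self.lat = 0.0   # running mean latitude
        self.lon = 0.0   # running mean longitude

    def report(self, lat, lon):
        """Fold one GPS report into the running mean position."""
        self.n += 1
        self.lat += (lat - self.lat) / self.n
        self.lon += (lon - self.lon) / self.n

p = PortalEstimate()
for lat, lon in [(51.078, -114.133), (51.080, -114.131), (51.079, -114.132)]:
    p.report(lat, lon)
# p.lat is now approximately 51.079, p.lon approximately -114.132
```

Each individual report is noisy, but with thousands of players hammering the same portals every day, the noise averages out – for free, from Google’s perspective.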
If they’d released the app as a “report network locations to improve google’s ad targeting” tool, it would have gotten huge pushback, and not many people would have downloaded it. But, by hiding that function and wrapping an insanely addictive game over top of it, it’s gone viral.
brb. I need to go recharge the portal at the playground down the street…
via Tyler Hellard
Looks like CAMP Festival would be pretty interesting.