Birdcage liners – Joel on Software

Algorithms, tuned not to help readers but to help advertisers. Intermittent reinforcement tuned to maximize engagement/addiction. This is some scary shit, but it’s the web in 2018. We can do better.

But whereas Twitter sort of stumbled upon addictiveness through the weird 140-character limit, Facebook mixed a new, super-potent active ingredient into their feed called Machine Learning. They basically said, “look, we are not going to show everybody every post,” and they used the new Midas-style power of machine learning and set it in the direction of getting people even more hyper-addicted to the feed. The only thing the ML algorithm was told to care about was addiction, or, as they called it, engagement. They had a big ol’ growth team that was trying different experiments and a raw algorithm that was deciding what to show everybody and the only thing it cared about was getting you to come back constantly.

Source: Birdcage liners – Joel on Software
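The dynamic Joel describes, a ranker whose only objective is predicted engagement, can be sketched in a few lines of Python. Everything here (the class, the scores, the author names) is invented for illustration; this is not Facebook's actual system:

```python
# Illustrative sketch of an engagement-only feed ranker. All names and
# numbers below are made up; nothing here asks whether a post is true,
# useful, or healthy -- the single optimization target is return visits.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    predicted_engagement: float  # a model's guess at clicks/comments/returns

def rank_feed(posts):
    """Sort candidate posts by predicted engagement alone."""
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = rank_feed([
    Post("friend", 0.12),
    Post("outrage-bait page", 0.87),
    Post("local news", 0.35),
])
print([p.author for p in feed])  # the outrage-bait page ranks first
```

The point of the toy example is that nothing in the objective function penalizes the outrage-bait; as long as it scores highest on predicted engagement, it wins the top slot.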

And now, Facebook has seen the light! Its former executives are lining up to denounce the horrible things that Facebook has done. Zuckerberg is now pivoting away from algorithmic news (because hey that’s evil) toward algorithmic posts-from-friends.

Good news! Except that won’t help. It will only tighten the feedback loop and prop up the bubble. If you’re more likely to see things from your friends, you’re less likely to see things serendipitously. You only see what you agree with. Therefore everyone agrees with you. Bubble intensifies.

Certificates (and badges) in university teaching and learning

This is a program we launched in Fall 2017 to coordinate programming offered by the Taylor Institute for Teaching and Learning for graduate students who are interested in developing expertise in university teaching and learning.

It runs on the platform built by my team (go, team!), as well as on D2L courses for online content and discussion. As grad students work through the program, they earn badges for completing a set of workshops or sessions in an area of focus:

(My team works with our Learning and Instructional Design team to offer sessions in the Learning Spaces & Digital Pedagogies badge.)

If a grad student works through all of the badges over a year or two, they earn the full certificate, which is a recognized credential. It’s a great, low-stakes way to scaffold grad students as they build expertise in teaching during their time as students at UCalgary.

The narrative of teaching development in higher education is often “nobody ever thinks of grad students. ever!”. Here’s an example of what happens when a university values teaching, and an entire Institute mobilizes to develop robust and sustained programming for graduate students to develop into great teachers.

Next, instructors and faculty members…


How do we Indigenize post-secondary curriculum? | UToday | University of Calgary

We’ve been learning more about Indigenizing the university, and how we might approach that as an Institute. This article by Gabrielle Lindstrom is a great overview.

Indigenous pedagogy, which refers to a way of teaching using Indigenous educational principles, is grounded in creating, fostering and sustaining good relationships between student and teacher. Teaching moments are found in the human-to-human interactions which are reciprocal — my students understand that I have certain knowledge and experience they can learn from and I understand that I, too, can learn from my students.


Rather than compromising excellence, Indigenous epistemology, therefore, offers students the opportunity to strive for their full potential without compromising their human dignity or those of other cultures.

Fantastic. Indigenization is as much about shifting the power structure as it is about learning the history.

Source: How do we Indigenize post-secondary curriculum? | UToday | University of Calgary

adjusting my social media diet

For once, I’m not deleting anything. But I’ve been struck by

a) how bad algorithmic news feeds are at actually getting what I want and need, and

b) how horribly distracting and time-sucking they are.

Companies – and we’re well past the Rubicon of DIY internet hippie utopia; it’s companies all the way down now – have no reason to make their algorithms work better for me (or other humans). Their algorithms weren’t designed for that – their only reason for existing is to generate advertising revenue for the company, and to maximize that at all costs.

Cool. But I don’t have to use their crap. So, I’ve logged out of Twitter on every device I use. It’s no longer in my pocket, or on my desk, or anywhere else convenient. I won’t be deleting my account, but the only way I’ll be posting to Twitter will be through my blog. And I won’t be able to follow along, or check out the awesome hashtags or trending tweeters or whatever.

But! It’s 2018! How will you function? How will you stay part of communities?

I’m not going anywhere. RSS is still a thing – compare the noisy flashing algorithmic stream pushed at us by social media companies, with this:

A place where algorithms (if they’re in there at all) work to help me, not to pad anyone’s B2B enterprise advertising Ponzi endeavour.

And. It’s a place where I can be done, close it, and move on with my day.
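The plain, chronological nature of RSS can be sketched in a few lines: the feed is just XML, read in the order the publisher wrote it, with no ranking model in between. The feed below is made up for illustration:

```python
# Minimal sketch of what an RSS reader does under the hood: parse the XML,
# walk the items in document order, done. No engagement model in sight.
# The feed itself is an invented example, not a real blog.
import xml.etree.ElementTree as ET

RSS = """<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Post three</title><pubDate>Wed, 10 Jan 2018 09:00:00 GMT</pubDate></item>
  <item><title>Post two</title><pubDate>Tue, 09 Jan 2018 09:00:00 GMT</pubDate></item>
  <item><title>Post one</title><pubDate>Mon, 08 Jan 2018 09:00:00 GMT</pubDate></item>
</channel></rss>"""

channel = ET.fromstring(RSS).find("channel")
for item in channel.findall("item"):
    print(item.findtext("title"), "|", item.findtext("pubDate"))
```

That’s the whole transaction: the publisher decides the order, the reader shows it, and when you reach the last item you’re finished.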

Your smartphone is making you stupid, antisocial and unhealthy. So why can’t you put it down? – The Globe and Mail

The article isn’t as hyperbolic as I was braced for, and it ties into the recent spate of Facebook billionaires lamenting that they’ve just discovered Facebook may not be the best thing for people or society (but thanks for the $billions).

I’m not about to say that having supercomputers in our pockets, wirelessly connected to the sum of published human knowledge and to every other pocket-supercomputer, is anything but an incredible boon for humanity. But the way capitalism and advertising revenue combine with algorithmic distribution to maximize “engagement”, feeding a loop that boosts ad revenue, which funds algorithm tweaks, which boost ad revenue again, ad nauseam? Yeah. That might need a little work.

To ensure that our eyes remain firmly glued to our screens, the internet giants behind our smartphones – and the digital worlds they connect us to – have become little virtuosos of persuasion, cajoling us into checking them again and again – and for longer than we intend. Average users look at their phones about 150 times a day, according to some estimates, and about twice as often as they think they do, according to a 2015 study by British psychologists.

Add it all up and North American users spend somewhere between three and five hours a day looking at their smartphones. As the New York University marketing professor Adam Alter points out, that means over the course of an average lifetime, most of us will spend about seven years immersed in our portable computers.

Source: Your smartphone is making you stupid, antisocial and unhealthy. So why can’t you put it down? – The Globe and Mail
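A rough back-of-the-envelope check of Alter’s “seven years” figure, using my own assumed numbers (about 3 hours a day over a 60-year adult span), not the article’s:

```python
# Sanity check on the "seven years" claim. The inputs are my assumptions:
# 3 hours/day of phone use, sustained over roughly 60 adult years.
hours_per_day = 3
adult_years = 60
total_hours = hours_per_day * 365 * adult_years   # 65,700 hours
years_glued = total_hours / (24 * 365)            # convert back to whole years
print(round(years_glued, 1))  # 7.5
```

At 3 hours a day the arithmetic lands around 7.5 years, so the article’s figure is in the right neighbourhood even at the low end of its three-to-five-hour range.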

The Looming Digital Meltdown – The New York Times

Zeynep Tufekci, in the NYTimes:

Modern computing security is like a flimsy house that needs to be fundamentally rebuilt. In recent years, we have suffered small collapses here and there, and made superficial fixes in response. There has been no real accountability for the companies at fault, even when the failures were a foreseeable result of underinvestment in security or substandard practices rather than an outdated trade-off of performance for security.

Source: The Looming Digital Meltdown – The New York Times

Her butler metaphor is great, too.