Managing Complexity & Tech Implementation
“A Complex Adaptive System?” (via flickr: ewar woowar)
Justin Baeder and I have been trading comments on his blog relating to the nature of complexity. A lot of what I know (and what I’ve forgotten) about complexity theory (and the study of Complex Adaptive Systems – CAS) comes from Dr. Reuben McDaniel, one of my professors and dissertation committee members. As an aside, he is one of the smartest, most generous, and most comprehensive people I’ve ever met. He laughs a lot, but often it’s because he has you (and only you) squirming in the hot seat with a difficult question.
Justin mentions another reading, putting to me the idea that maybe:
what we call complexity is actually just an excuse for not taking action – when we say “It’s complicated” we’re really just trying to let ourselves off the hook for making changes that would in fact make a difference.
On Justin’s first point, my response is, “Yeah, that’s not complexity.” From a leadership perspective, I think we agree that inaction is a recipe for disaster, but maybe for different reasons. Complexity actually necessitates MORE action (and more discussion).
In part, this is because:
- Everyone’s got a different read on the environment around us. In education, this kind of thinking is at the heart of Spillane’s work on sensemaking and policy implementation. In my own work, I’ve described how this also plays out around technology implementation. There’s no such thing as a “no brainer” or obviously useful technology. Different people could all be thinking different things (and thus taking different actions), even though it’s all the same technology or policy.
- Throw into that mix the fact that as people take action, they’re working off of feedback from the environment even as they are changing it. Because the landscape of demands and challenges is constantly shifting, and because these shifts are not always predictable, new information is needed to act appropriately. Eisenhardt has described how top executive teams prototype decisions, constantly reevaluating how ideas fly [here’s my blog post with a great activity to teach prototyping]. Weick’s story about the Mann Gulch forest firefighting disaster is an absolute classic about what happens when the “shit hits the fan” and people fail to process information organizationally. In education (and literally the stuff I’m writing about this afternoon), I’ve found that central office leaders are often surprised that projects begun months ago haven’t gotten off the ground as planned. The problem, however, isn’t that they had the wrong plans, but that they weren’t staying current on those plans or adjusting them. For example, I’ve also seen leaders express surprise (months after a training) that their training didn’t result in action.
- Related to this, no one is omniscient. That’s just a fact of life.
- But it’s also important to note that complexity isn’t just about “lots of stuff going on that I don’t comprehend.” It’s also about how you can’t understand the whole by looking only at the parts. The properties of just hydrogen and just oxygen are nothing like the properties of water. Think of student learning — how it matters which kids are in the room, who is participating, and who is having a bad day. The same goes for teams and faculties. What gets done is emergent. I’m also writing today about how technology vendors’ “failures” were also linked to issues within the central office (habits around job roles; job vacancies; habits around “talking to the boss”; asking the provider the wrong questions; etc.). Outcomes were the joint product of district and vendor, not one party in isolation.
- It might also be nonlinear: small events can have large implications, and vice versa. For example, a technology for sharing grades or lesson plans might make some people feel watched. Like celebrities, they might become more conscious of their followings, catering to the crowd rather than to student needs. Others might abandon use, fearing that their bosses are watching. Yet others might use this information to wonder how their school might collaborate better. At the policy level, a chance conversation between a principal and a lawmaker’s staffer could plant the seeds for a chain reaction of events, resulting in a new law that benefits all schools. But the law could also backfire or produce no improvement. A few reasons: technological advancement, personal discretion, local control, failure to predict a loophole, an amendment/rider with unexpected consequences, whatever.
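This kind of nonlinearity is easy to see in a toy model. Here’s a minimal sketch (my own illustration, not anything from the research above) using the logistic map, a classic example from complexity science: two starting points that differ by a hair end up in wildly different places.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x), a standard chaotic system."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Two starting points that differ by just 0.0001
a = logistic_trajectory(0.2000)
b = logistic_trajectory(0.2001)

# The tiny initial difference is amplified into a large one
gap = max(abs(x - y) for x, y in zip(a, b))
print(gap)
```

The point isn’t the math; it’s the intuition. If a system this simple can turn a one-in-ten-thousand difference into a completely different trajectory, imagine what a district full of people, habits, and politics can do with a chance conversation.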
A good introductory way to see complexity and complex systems is to think about fish swirling in a school. How does the school take shape? What are the consequences of unexpected events?
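The flocking image can even be sketched in a few lines of code. This is my own toy sketch (assuming a bare-bones heading-alignment rule, not any particular flocking model): each fish repeatedly averages its heading with its two nearest neighbors, and a shared direction emerges with no fish in charge of the school.

```python
import random

def align_step(headings):
    """Each fish averages its heading with its two ring neighbors (local rule only)."""
    n = len(headings)
    return [(headings[i - 1] + headings[i] + headings[(i + 1) % n]) / 3
            for i in range(n)]

random.seed(42)
headings = [random.uniform(0, 360) for _ in range(20)]  # 20 fish, random directions
before = max(headings) - min(headings)

for _ in range(200):
    headings = align_step(headings)

after = max(headings) - min(headings)
print(before, after)  # the spread collapses: order emerges from local interactions
```

No fish ever sees the whole school, yet the school takes shape anyway. That’s the heart of it: the pattern lives at the level of the system, not in any single part.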