Monday, April 21, 2014

Are there any drawbacks to agile?

I was taken aback by my response to a perfectly reasonable question. Are there any drawbacks to agile?


A binary question, to which I responded, appropriately: “Well, yes, there are.” Then I paused. The audience was not satisfied, and my head was swelling with the conversations I have on a daily basis about the strategic challenges of agile. So, I went for a divergent and long-winded (for me) answer:


“The software development community is divided on the ultimate efficacy of agile. There has been research that shows that agile development projects improve speed and quality. But, there has not been enough research over a long period of time to convince everyone.


“The popularity of Scrum, which has become a cottage industry for charismatic, book-writing consultants, has inculcated some misconceptions which have undermined, ironically, the case for agile. People focus so much on Scrum, which is merely a set of rules for the planning game, that they tend to ignore the enabling engineering practices that ultimately fuel effective development teams.


“The Agile Manifesto and current thought leadership are notably silent about the role of the manager and leadership, so important to any transformational effort. This again depowers any transformative oomph that such an initiative might have. I have seen leaders at all levels underestimate the cultural component of such an effort, and they tend to think that agile is a training thing: ‘Get the people trained on agile, and we will be agile.’


“Top-down governance, a necessary evil in big and publicly-traded companies, is usually highly disruptive to agile teams, unless all of the supporting and operational functions are also adopting an agile mindset. The same can be said of the support functions, which would include the lawyers, the bean counters, and yes, the people people.”


Other than that, it’s just great.


The person who asked the question had more questions for me after the session. “What are you saying? Are you a champion of agile or a critic?”


Sigh.


I believe in agile. I think the elements of the Manifesto, if followed as a value system, can lead to work output that is more focused on outcomes. I love that so many smart people, like the Original Signatories and others with whom I work, are so invested in making agile work, because they think it’s the right thing to do. I also see potential for agile ways of thinking to transform work and the world. There are a myriad of people who are making rain with agile. I believe agile can accelerate our contribution as human performance technologists.



Wednesday, April 16, 2014

What Does Test-Driven Development Mean to Education?

I work at CA Technologies, a large software development company that is seeking to maintain a competitive edge through “innovation, execution, and speed.” As a company we subscribe to agile as a principle-based approach, and we play the planning game with Scrum. But the enablers of higher quality software within faster release cycles are the engineering practices beyond just the agile planning methodology.


One of the most powerful practices used in software development is Test-Driven Development (TDD), where the test is written and run (and fails) before the code is written. The goal of writing the code is to pass the test (and nothing more). In an agile environment, this is done in increasingly small increments. Ultimately, tests are run every time a piece of source code is checked back into its repository home. With small increments, bugs are isolated and fixed immediately. The full fruit of this strategy is Continuous Integration, where the software always works and always improves. (Massive disclaimer: The preceding paragraph was written by a corporate education person living in a software development world. Stay with me!)
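To make the red-green rhythm concrete for fellow non-engineers, here is a minimal sketch of the TDD cycle in Python. The function name and behavior are hypothetical, invented purely for illustration, not taken from any real project:

```python
# Step 1 - write the test first. If you ran only this, it would fail,
# because slugify() does not exist yet (the "red" step).
def test_slugify():
    assert slugify("Hello, World!") == "hello-world"

# Step 2 - write just enough code to make the test pass, and no more
# (the "green" step).
import re

def slugify(title):
    """Lowercase the title and join its words with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

# Step 3 - run the test again; it now passes. Then repeat the cycle
# in small increments, one new failing test at a time.
test_slugify()
```

The discipline is the point: the test defines "done" before any solution exists, which is exactly the move I want to borrow for education work below.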


So what does this have to do with anything in the realm of human performance improvement? How often are we presented with requests to create learning assets with no reference to the business problem being solved? (Answer: For me, several times in the past week.) How often do we create things without an evaluation component, and then someone later asks if it is having an impact? (Answer: Too often.) If we can start with the outcome, if we can write the test that will be satisfied before we design the intervention, then we significantly improve the odds that our solution solves the business problem. Or, at a minimum, we know whether or not the solution answers the test question.


From a performance consulting standpoint, this is related to our practice of front-end analysis. When encountering an “order-taking” situation, there are simple questions we ask to reframe the request, such as: What is the desired outcome? What will change as a result of us implementing something? What will happen if we do nothing? What is the root cause of the performance problem?


Do not be surprised when a frustrating, circular, rhetorical conversation ensues with your client, where everyone needs to acknowledge that only rarely is the answer to these questions related to training. After all, if it were a pure training need, as the original request assumed, we would be able to, using training in isolation, enable the development of employee capabilities that are aligned with business objectives. No surprises here, since in the vast majority of cases (around 85% of the time – people have studied this), training is not the solution to human performance problems. This state of affairs will often be perceived as unfortunate for our client requestors, because they were hoping that we could help them tick the box by developing and rolling out a training “solution.”


Now, as we continue along this line of discussion (often root cause analysis) we have to be the bearer of bad news: You have a problem, but it’s not related to training. Maybe people don’t understand the expectations? Maybe the process is broken? Maybe the interface on your application is not intuitive? Maybe people are not motivated or incented to do what you want them to do? In these cases, training is not the answer. In fact, if you look at the possible causes of issues that inspire training requests, it is really difficult to imagine a case where training by itself is the answer. (That’s a topic for another blog post…for more information, check out Thomas F. Gilbert’s Behavior Engineering Model.)


Back in the real world: Do not let the client walk away; we can still help them! Yes, even the Education people can help them with their business problem.


If we responded to every training request with diligent and dutiful fulfillment, as I did very well for many years, we would be very busy creating things that add no real business value. On the other hand, if we restricted our efforts to projects that met some sort of academic definition of training/education, we would be doing almost nothing. Both of those situations are bad places to be. The sweet spot is doing things to help our clients succeed, regardless of whether or not the instructional designer in us thinks it meets our definition of training/education.


Almost any work we do in response to this type of analysis is OK as long as it’s adding value. Adding value means being able to demonstrate that you’ve added value after the thing is implemented. And your chances of adding value expand exponentially if you can come to an agreement with your client, before the design and development starts, on what adding value means. That is, if what we do works, what test will be satisfied?

Now that we are in the business of adding value, rather than the business of creating useless training, we have to think differently about how we create tests. Many of us are familiar with how to create training tests in the realm of the classic Kirkpatrick model, such as knowledge reviews and behavior observations. But by applying TDD to Education (ETDD?!?), we will need to invent different types of tests that answer different types of questions: How do you measure awareness? Buy-in? Teamwork? Innovation? Strategic thinking? These questions might be out of our comfort zone, and indeed, out of scope for what we think of when we think of Education, but that doesn’t mean we shouldn’t attempt to work with our clients to write tests for them. It will take persistence, creativity, and innovation, because every business problem is unique. Certainly our clients are often out of their comfort zone with these questions as well. That is why they called us in the first place.