
Approaches to Carry Out Your Key Activities

Many organizations and networks that have developed digital products or services for the humanitarian or development sectors have done so using agile and/or lean methodologies. Yet when they try to collaborate with more traditional humanitarian and development partners, they are confronted with waterfall and/or blueprint approaches to the work.

This difference in approaches can have varying effects on business model sustainability. If your digital solution and its value proposition are product led, or if the customer relationship with the buyer and user is automated and distant, and the product works well for their needs, issues are unlikely to arise from how the partner carries out its activities vis-à-vis how your organization carries out yours. There are, however, a number of other instances in which this is likely to be an issue.

Tensions: When Are They Most Likely to Appear?

If your digital solution/Value Proposition has one or more of the following aspects, then there are likely to be areas of potential friction:

  1. Your solution is a service.
  2. One of your revenue streams is to provide complementary services.
  3. Your solution requires or offers assisted modularization or hackability.
  4. You want to carry out a pilot program with potential buyers and users (channels).
  5. You are co-developing your solution/Value Proposition with humanitarian, development, or government agencies.

If you are an agile or lean organization trying to collaborate or partner with a traditional development or humanitarian organization, then you are likely to encounter issues. The comparison below shows some of the key differences between non-agile/traditional and agile/lean approaches that might lead to tensions. Breaking down traditional and agile/adaptive processes into specific key components is useful for highlighting the tensions between the two.[1]

Non-Agile/Traditional vs. Agile/Lean

Development Cycles

Non-agile/traditional:
  • Work is planned months or years in advance, with monthly/quarterly upward reporting and substantive reviews annually.
  • Nothing is launched to users until the full product is ready and tested, making it difficult to adapt goals and requirements based on changes in context or feedback from users.

Agile/lean:
  • Work is conducted in short iterations of one to four weeks, with retrospectives and reprioritization built into each iteration, enabling course corrections.
  • A minimum viable product (MVP) is launched to users as early as practical, then new features are regularly added.

Changing Requirements

Non-agile/traditional:
  • Attempts to set requirements early in the design process often lead to change control requests, messy budgeting, and contracting disagreements.
  • Requirements may not match the needs of the users, or the situation may have evolved.

Agile/lean:
  • Planning at the start of each iteration enables goals/requirements to be reviewed regularly.
  • A broad direction of travel is set, but detailed requirements are defined as needed.
  • Changes and course corrections are actively encouraged throughout, based on real data.
  • The method accommodates major changes (pivots) to goals, business models, etc.

Relationship Between Customer and Development Team

Non-agile/traditional:
  • Since requirements are designed upfront, there is little engagement between customers and the developer until the product is ready for testing.
  • This can lead to a situation where no one on the buyer side really understands the details of what is needed and what is being developed.
  • It often leads to the development of features that are not needed and rarely, if ever, used.

Agile/lean:
  • Requirements are constantly defined and refined, making it critical for the customer and the development team to be tightly integrated and work together on a day-to-day basis, ensuring a deeper understanding between the two.
  • This is achieved by hiring a product owner (PO) to own the product and user needs. The PO must understand and represent all stakeholder requirements (i.e., BUTI segments). It is a full-time job that is only now being recognized and hired for (see, for example, Viamo's product owner job ads).

Learning From User Feedback

Non-agile/traditional:
  • Users are typically consulted at the start and may do user testing before launch, when it is too late to change requirements.
  • Monitoring, evaluation, and learning (MEL) ends up relegated to upward accountability throughout, and a baseline/endline evaluation comes too late to allow for improvements to the product or program.

Agile/lean:
  • Short cycles and MVPs enable users to actively engage with the product early and continuously, ensuring their needs and feedback can be built into regular prioritization processes.
  • Developers can learn from what users do/use, which is often distinct from what they say in surveys and workshops.
  • MEL feedback is incorporated in a meaningful learning cycle that actually influences the work.

Autonomy of Development Teams

Non-agile/traditional:
  • Developers are sometimes seen as junior implementers with limited decision-making authority, notably if development is outsourced.
  • This leads to potential disconnects between the goals/needs and the actual development work, and inhibits the ability of the developers to learn and adapt based on user feedback or changing circumstances.

Agile/lean:
  • Delegating control to self-managing teams (including a PO) gives teams a holistic view of the original requirements, the changing needs of users, and the development process, enabling rapid course corrections (or pivot suggestions) when the situation demands it.
  • This model helps improve cross-learning between different teams/sectors and improves morale and engagement of key technical staff.
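
To make the contrast in development cycles concrete, here is a minimal, illustrative sketch (not drawn from any tool in this guide) that counts how many opportunities each approach gives a team to act on user feedback over a one-year project. The project length and iteration length are assumptions chosen for the example.

```python
# Illustrative sketch only: it counts how many times per year each approach
# creates an opportunity to act on user feedback and reprioritize.
# The project length and iteration length are assumptions for the example.

PROJECT_WEEKS = 52          # assumed one-year engagement
AGILE_ITERATION_WEEKS = 2   # assumed sprint length (the text suggests 1-4 weeks)

def feedback_opportunities(iteration_weeks: int, project_weeks: int = PROJECT_WEEKS) -> int:
    """Points at which users see working software and requirements can change."""
    return project_weeks // iteration_weeks

# Waterfall: the whole project is effectively one long "iteration".
waterfall = feedback_opportunities(iteration_weeks=PROJECT_WEEKS)
agile = feedback_opportunities(iteration_weeks=AGILE_ITERATION_WEEKS)

print(f"Waterfall: {waterfall} chance to adapt per year")   # -> 1
print(f"Agile/lean: {agile} chances to adapt per year")     # -> 26
```

Even this toy arithmetic shows why the comparison above emphasizes short cycles: each additional feedback point is a chance to correct course before more budget is committed.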

Humanitarian and Development Organizations: Why Are They Using Traditional Methods?

While significant evidence exists that flexible, iterative approaches to management—particularly when it comes to technology—are more effective, major aid organizations are still caught between this movement and a number of conflicting pressures, some of which are changeable and some less so. They include:

  • Requirements for full accountability and transparency by donors and regulators
  • Political pressure and election/budgeting cycles
  • Poorly informed public opinion of how aid money is (or should be) spent
  • Drivers for cost-efficiency
  • Lingering post-colonial North-South “we know best” attitudes
  • Ingrained processes and training focused on linear plan-execute-evaluate thinking, and an outdated understanding of risk, uncertainty, and complexity
  • Traditional grant and contract methodologies

These pressures combine to push organizations to make decisions ahead of time and stick to them, in a way that directly counters the goals of an iterative, agile approach.

Realistically, despite some progress in adopting agile approaches within non-agile, top-down environments, the sector still faces many obstacles. The picture is gradually changing as adaptive processes gain ground, but it remains largely driven by an aging model of North-South assistance, especially once the fast-moving world of tech and digital is added to the mix.[2]

Case Study: Speed Evidence - Mind Your Language

Speed Evidence, an early collective intelligence system for context analysis for humanitarian action, was developed as a collaboration between World Vision, Ushahidi, FrontlineSMS, and SMAP consulting.

A design workshop with the partners almost descended into chaos because of assumptions and language. The scrum master had presumed that the frontline humanitarian workers, who took part in the workshop as users, worked in a non-agile/traditional manner, and so used tech-sector language to describe the agile processes being considered.

In fact, the issue was about language, not agility. The users in the room already worked in agile ways; they just did not use the language of Silicon Valley. Once the assumptions were clarified and facilitation was handed to someone who understood both sectors, the design workshop ran smoothly.

The product ultimately failed when it was brought to the attention of more senior figures, who were still using waterfall planning for their systems architecture and could not envision investing in something that was not already on an existing, agreed-upon technology roadmap.

Therefore, even within the same organization, there can be those who are used to working in agile ways and those who are still rooted in traditional methodologies.

For more lessons from this case study, see Lessons from the frontline of Humanitarian and Technology Company Partnerships.

These frictions will inevitably lead to confusion, misunderstanding, and problems, but you can find tactics to help navigate these issues in the Agile Gap Analysis Tool.

Interactive Tool: Agile Gap Analysis Tool

It is vital to understand where there may be tension between an organization and a customer/client for whom the digital solution is being created. The customer/client can be within the organization, or they can be from another organization or government department. Use our interactive tool to explore how these methodologies work in your context.

Go to the tool
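
As a rough illustration of the kind of comparison such a gap analysis involves, the sketch below scores two organizations against the five dimensions from the comparison above and flags the largest gaps. The scoring scale, threshold, and example scores are assumptions made for this illustration; they are not the methodology of the actual interactive tool.

```python
# Hypothetical sketch of a gap analysis, not the actual interactive tool.
# Each dimension is scored from 1 (fully traditional/waterfall) to
# 5 (fully agile/lean); the dimensions mirror the comparison above, and the
# flagging threshold is an assumption chosen for this example.

DIMENSIONS = [
    "Development cycles",
    "Changing requirements",
    "Customer / development team relationship",
    "Learning from user feedback",
    "Autonomy of development teams",
]

def gap_analysis(ours: dict, partners: dict, threshold: int = 2) -> list:
    """Return the dimensions where the two ways of working differ enough
    to warrant an explicit mitigation strategy."""
    flagged = []
    for dim in DIMENSIONS:
        gap = abs(ours[dim] - partners[dim])
        if gap >= threshold:
            flagged.append(f"{dim}: gap of {gap} - agree a mitigation approach")
    return flagged

# Example scores (illustrative only).
our_org = {dim: 5 for dim in DIMENSIONS}   # e.g., a lean digital start-up
partner = {
    "Development cycles": 2,
    "Changing requirements": 1,
    "Customer / development team relationship": 3,
    "Learning from user feedback": 2,
    "Autonomy of development teams": 4,
}

for finding in gap_analysis(our_org, partner):
    print(finding)
```

Where a dimension is flagged, a mitigation strategy could be agreed for that specific area of friction, and, as the Key Takeaways below note, walking away remains an option if the gaps are too wide.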

Key Takeaways

  1. The aid sector has not yet mainstreamed the adoption of iterative and agile approaches to project planning and implementation.

  2. Bureaucratic systems and processes in the aid sector can create friction and tensions for digital solution developers adopting the iterative approaches suited to digital development.

  3. Mapping out the areas of potential friction and developing mitigation strategies will help.

  4. Factor in the opportunity cost where there is likely to be significant friction. Don’t be afraid of walking away from a potential customer/partnership if the friction looks like it will impact your team/organization too much.

  5. Remember: “Some problems are just hard, some people are just difficult, these methods are not salvation.” (Larman, 2004)

Complete the following in your Business Model Sustainability Canvas:
  • Use the Agile Gap Analysis Tool to understand where there may be tensions between your organization and the client/customer, and to check that these do not create issues with the key activities you have identified.
  • Add or remove key activities based upon the outcomes of the Agile Gap Analysis.

  1. Flahiff, J. (2014). Being Agile in a Waterfall World: A Practical Guide for Complex Organizations, Seattle, WA.
  2. https://www.weforum.org/agenda/2015/01/how-to-make-development-organisations-agile-and-effective/