Reconsider the system of IDPs: A technical analysis

I have been involved in municipal planning since 2007, the year I first assisted a municipality in compiling an Integrated Development Plan (IDP). Since then I have performed more than 30 similar consulting assignments. The depressing conclusion I have drawn from my experience is that integrated development planning has added little value. Over the roughly seventeen (17) years since the introduction of IDPs (through the Municipal Systems Act, 2000), the turmoil in local government – both in terms of dissatisfaction with service delivery and in terms of administrative proficiency – has only increased.

The system of integrated development planning is too complicated, and too dependent on sophisticated data collection and integration methodologies, to be useful in a country at South Africa's stage of development. Officials, councillors and (least of all) communities do not properly understand it, and supervisory spheres of government (both provincial and national) use it to pile on bureaucratic requirements that have turned IDPs into volumes of unimplementable paper exercises.

Planning Systems

There is no doubt that all countries need structured planning systems, for two reasons:

1. Accountability: Councillors, communities and funding spheres of government must be informed about the (measurable) planned performance objectives of municipalities; and

2. Performance measurement: In an accountability-centred system (such as a democratic government) there must be a guiding plan that focuses the efforts of the executors and enables performance reporting.

Nevertheless, perhaps the time has come to reconsider the wisdom of integrated development planning as a legislatively prescribed system for development planning in municipalities. The concept of integrated development per se is not the problem, but rather the manner in which it is prescribed and applied in South Africa.

The idea of a strategic plan that also serves as a policy statement for a municipality's development agenda over a five-year period is not, in itself, a bad one. It is, in fact, in line with other public sector planning and programming systems in the United States, Australia, New Zealand and various other countries, introduced in the late 1990s as part of the sweeping changes to traditional government systems and processes.

The roots of the idea lie in the budget analogy: just as a budget is a forward-looking financial planning document and then, after the fact (the financial year), a point of reference for enforcing accountability, an IDP plays a similar role as a yardstick for service delivery performance accountability. And just as a budget gives rise to financial transactions and ultimately financial statements, an IDP is cascaded into an annual operational plan and then into a performance management system.

However, unlike a budget (and the financial transactions that follow from it), an IDP cascaded down into a performance management system is not suited to measurement against quantitative criteria. The accepted modern paradigm holds that service delivery targets can be translated into the same technically detailed, measurable performance data as financial transactions – but they cannot.

The result is audit standards for performance management and performance management systems that really say very little about actual service delivery, but much more about officials' ability to formulate technically compliant performance measures, indicators, targets and objectives. Good audit outcomes on performance management indicate technical excellence, and very little about actual service delivery. Every year auditors add another compulsory comma to the format in which performance indicators or targets must be presented.

Integrated Development Plans have become extremely complicated technical documents – exercises in compliance rather than working plans that guide development in municipalities. The outcome of IDPs, since their inception in terms of the Municipal Systems Act, 2000, has not been improved service delivery, but a dramatic increase in the number of service delivery protests.

Every time a new turnaround plan for the transformation of local government is launched, IDPs must be rewritten to reflect the revised focus areas. The irony is that there are no new focus areas. At the end of the day, volumes of paper are produced that very few really understand: not councillors, not management and especially not the community.

We need a simpler planning system for local government – and a simpler system to measure performance – for all municipalities.

Integration

I have a serious problem with the so-called integration that must be part of an IDP. The theory is that the planning priorities of the different spheres of government must be integrated into the IDP of each and every municipality. This implies that the IDP must be aligned with –

-- the literally hundreds of national planning and performance objectives in the National Development Plan;

-- the multitude of objectives in the country’s Medium Term Strategic Framework;

-- the extensive sets of sector-specific planning documents (such as the National Spatial Development Framework, the national Local Economic Framework and several others);

-- Provincial Growth and Development Strategies; and

-- District Growth and Development Strategies.

With the assistance of consultants, municipalities can achieve this – and produce a paper exercise that will be placed somewhere in a drawer, where it will be forgotten until next year, when the entire ridiculous exercise will be repeated.

Information Requirements

But that is not the end of it. The IDP must then contain hundreds of statistics, ranging from the number and percentage of households with access to water, sanitation, electricity, refuse removal and housing, to the number of businesses, employment rates and the social profile of the municipality. Smaller municipalities in particular simply do not have the resources to attend to some of these statistical requirements, such as social cohesion, statistics about business ownership and many more. It has become a wasteful exercise aimed at compliance, with little to no practical value.

There is nowhere that municipalities can obtain reliable statistical information; data provided by Statistics South Africa are only updated and current for one year (the one immediately following the comprehensive National Census, conducted once a decade). Other providers are extremely expensive, and their statistics are also mostly based on outdated data.

Sector Plans

Municipalities must produce a Spatial Development Framework, a financial strategy, a disaster management framework, a land use management system and framework, a waste management plan, a water service development plan, a water resources plan, an integrated transport plan, a housing plan, a local economic development strategy, an energy master plan, an infrastructure investment programme, an anti-corruption strategy, and (for municipalities at the coast) a coastal zone management plan. All these plans are highly technical in nature, and must mostly be prepared by consultants – at enormous cost. Not only that: each and every one of these plans is supposed to be reviewed annually.

I do not think all (or even most) of these so-called sector plans are necessary. I can understand the need for a Spatial Development Framework in a metropolitan municipality (and even some secondary cities), where land use management and spatial design are complicated considerations that must be guided by structured plans informed by knowledgeable planners and practitioners. For most municipalities in South Africa, however, these plans are a waste of money; they are seldom really implemented (partly because most of these municipalities do not have qualified town planners). A much better idea would be to formulate province-wide Spatial Development Frameworks and allow for local discretion (exercised by councils) within certain non-negotiable design principles and patterns.

A financial strategy, a disaster management framework, a land use management system and framework, a waste management plan, a water service development plan, a water resources plan, an integrated transport plan, a housing plan, a local economic development strategy, an energy master plan, an infrastructure investment programme, an anti-corruption strategy, and a coastal zone management plan are operational plans that must be prepared and used at the operational organisational levels of municipalities (and should not be required for IDP credibility).

Councils must be allowed to approve their own guidelines, even if these do not necessarily comply with all the hundreds of technical details currently prescribed for the compilation of these plans.

Yes, I do understand the argument that the compilation of these plans requires specialised skills to ensure environmental protection, the structured design of towns and cities, organised spatial planning, financial viability, compliance with nationally acceptable accounting systems and practices, and so forth – I really do. However, the reality on the ground is that very few municipalities have updated versions of these plans, and they are mostly guided by council and management policies and decisions on these matters. It is better to formalise this arrangement than to keep demanding sophisticated plans that gather dust and are used to humiliate municipalities during IDP credibility assessment sessions. A more productive approach would be to allow for localised decision-making on these matters, with districts, provincial and sector departments and institutions providing active assistance and support.

Medium-sized and small municipalities simply do not have the capacity to comply fully and translate the IDP into a living, meaningful management document.

Performance Indicators and Targets

In 2010, the Department of Cooperative Governance and Traditional Affairs introduced a simplified guideline for the IDPs of smaller municipalities. However, very little really changed, and the requirements still boil down to a bureaucratic document loaded with useless information of little to no practical value.

In theory, the integrated planning and performance evaluation system is sound: IDPs set long-term performance objectives, which are then translated into annual performance plans (Service Delivery and Budget Implementation Plans, or SDBIPs) and cascaded down into the Performance Management System. After quarterly, mid-year and annual performance reviews, an Annual Report is issued, in which IDP objectives are listed as they appear in the SDBIP, together with information about actual performance achieved during the year.

However, there are literally hundreds to thousands of individual key performance indicators and targets defined in IDPs (objectives translated into measurable terms). The problem is that the practice has become to “copy and paste” SDBIPs into IDPs. Why then distinguish between these two documents? Furthermore, performance objectives and targets in IDPs are so operational that they say very little to nothing about the strategic performance of the municipality. IDPs are supposed to be strategic plans and SDBIPs operational ones. There ought to be major differences between them, both in scope and in time-frames. There are not.

The ‘systemically enforced’ development of municipalities would be much better served if a few high-level objectives (with associated indicators and targets) were set for municipalities to achieve annually.

IDPs are compiled in terms of five prescribed Key Performance Areas (which originated with the launch of the first of the municipal turnaround attempts, the 5-Year Local Government Agenda; the five years have long since lapsed, but the performance areas remain):

1. Basic Service Delivery

2. Local Economic Development

3. Institutional Transformation and Organisational Development

4. Municipal Financial Viability and Management

5. Good Governance and Public Participation

There are actually three broad strategic themes captured in these Key Performance Areas that drive the integrated IDP and PMS system in municipalities, namely:

1. To promote basic service delivery

2. To promote local economic development

3. To strengthen institutional capacity (through good governance, public participation, institutional transformation and development and financial viability and management).

There is no need to divide these goals into five key performance areas, because that only promotes the identification of hundreds of individual performance indicators and targets that do not actually belong in a strategic plan (they belong in operational plans).

A much better idea would be to prescribe three broad goals (or Key Performance Areas) on which municipalities base the strategic structure of their IDPs, and to relate these goals to the triple bottom-line fundamentals of sustainable development, namely:

1. To promote social development.

2. To promote economic development.

3. To promote environmental sustainability.

A restricted number of priority objectives must then be identified in terms of these goals and linked to prescribed key performance indicators (but not targets, because those are too municipality-specific). These prescribed objectives must be based on municipalities’ core functions.

However, there is a problem. There is not a single municipality in this country that performs all the prescribed core functions of local government listed in Schedules 4 and 5 of the Constitution, 1996; it is simply impossible for them to do so, because of capacity constraints. Yet what you will find are not only objectives related to these core functions, but also a multitude of targets related to education, health and policing – all functions that are constitutionally competencies of national and provincial government. Municipalities are simply not funded to perform these functions, yet they are often assessed on whether these functions are included in their IDPs.

The practical reality is that municipalities’ capacity limitations imply that only a limited number of performance objectives can be identified, namely those of the highest priority for the sustainable development of local communities, considering the constitutionally prescribed municipal core functions. This is not ideal, but it is better than creating unrealistic expectations and setting municipalities up for failure through the contents of their planning frameworks. I would suggest:

Social development: Water, electricity, sanitation, refuse removal and provision of infrastructure for housing schemes.

Economic development: Local tourism, co-operatives and job creation schemes.

Environmental sustainability: A generic supplementary (and support) function attached to all new projects and initiatives impacting on the physical environment.

This is not to say that municipalities cannot have supplementary objectives of their own, based on their unique circumstances (municipalities located in forestry, mining or fishery areas, for instance, must obviously include objectives that address those industries – piers, etc.). However, one (or at most two) additional goals will be more than adequate to address these needs.

It is also important to draw a clear distinction between the different categories of municipalities, based on a thorough capacity assessment. Nevertheless, although the scope of delivery obviously differs widely between metropolitan councils, secondary cities, and Category B3 and B4 municipalities, their core functions remain more or less the same.

You will note that none of the goals I have suggested above relate to institutional issues. What then about institution building, which is without doubt vitally important for achieving sustainable development?

Outcomes-based Performance Management

Let me remind the reader of government’s own outcomes-based planning model. The sequence of value-adding activities and results in that model is as follows:

1. Inputs (resources) are utilised to perform –

2. Institutional activities, which produce –

3. Outputs (services); which produce –

4. Outcomes (value added over the short to medium term); which then result in –

5. Impacts (long-term goal achievement and development; that is, sustainable development).

Institutional transformation, organisational development, financial viability, good governance and public participation are all institutional activities, and do not belong in outcomes-based (and impact-focused) strategic plans (like IDPs). They belong in institutional operational plans.

All these activities (and their sub-activities, eventually culminating into jobs) are performed to produce outputs (services), which must then satisfy goals to produce outcomes and impacts.

Far too much time is spent on considering and auditing institutional activities, and almost none on considering (and auditing) whether outcomes have been achieved in a cost-effective manner. The current system promotes an inward-looking, bureaucratic institutionalisation of planning and performance assessment processes.

As long as a municipality’s key performance indicators and targets comply with the highly technical audit requirements for usefulness and reliability, the audit outcomes for that municipality will be good. This creates the impression of excellence while, in actual fact, saying very little about the meaningful performance of the municipality. On the other hand, some municipalities that perform well in developing local communities in those constitutional areas listed as core functions receive poor performance audit outcomes because their key performance indicators and targets were not bureaucratically SMART.

I simply cannot see that the key performance indicators currently prescribed in Regulation 10 of the Municipal Planning and Performance Management Regulations effectively measure performance in terms of the core functions in the Constitution, or balanced institutional capacity.

I also have a bone to pick with the institutions responsible for supporting municipalities with development planning, specifically regarding the lists used by national and provincial oversight bodies to assess the credibility of IDPs. These credibility assessment questionnaires are totally unrealistic given the capacity of medium-sized to smaller municipalities; they are highly technical, often place a great deal of focus on non-municipal functions, and say very little about the projected performance of a municipality. It is a bureaucratic exercise in document compliance.

Although there are some minor differences between the credibility questionnaires used for the different categories of municipalities, basically the same criteria are used. Given the enormous capacity differences between metropolitan municipalities, secondary cities, smaller towns and village-based municipalities, a much more meaningful and contextually appropriate distinction is required.

It is ridiculous; the time to revise the system has arrived.

I am certainly not opposed to the concept of predetermined performance criteria (objectives), progress towards which the municipality must assess at regular intervals. This is in line with international best practice, and makes sense. However, there is an urgent need to reduce the over-regulation of the process, which results in expensive and wasteful bureaucracy.




Frans Minnaar © Copyright 2025-2026. All rights reserved.
