Categories
Blog

Responding to House of Commons Departmental Select Committees

In the third part of their trilogy examining sessional return data, Stephen Holden Bates (University of Birmingham), Mark Goodwin (Coventry University), Steve McKay (University of Lincoln) and Wang Leung Ting (LSE) explore government responses to departmental select committees.

The Palace of Westminster. Image: Steve Sea [CC BY 3.0], via Wikimedia Commons

The overall aim of departmental select committees, as determined by the Liaison Committee, is ‘to hold Ministers and Departments to account for their policy and decision-making and to support the House in its control of the supply of public money and scrutiny of legislation’. This aim can be divided into two parts: holding departments to account; and supporting the House in scrutiny. At least five of the ten core tasks of departmental select committees are directly related to the first of these parts, relating directly to the Departments they shadow. One of the key points of interaction between select committees and government departments is the government response to select committee reports following inquiries. As part of the process of select committees holding governments to account, they inquire into matters relating to the policy, administration and expenditure of departments, including taking evidence and calling witnesses, in order to produce a report which is then presented for the government to consider and respond to. Under ‘normal’ circumstances, the relevant government department and/or other public body is expected to respond publicly and in writing to a select committee report within 60 days.

The procedure concerning responses to select committee work may lead us to question how normal ‘normal’ is. Does government actually respond to select committees in this way? How well institutionalised is this system of inquiry-report-response? And how effective, therefore, are departmental select committees in performing their accountability function with regard to their corresponding department?

One way – though not the only way – to begin measuring the effectiveness of this central aspect of departmental select committee work is to count the number of responses, whether timely or otherwise, that select committees receive for reports for which responses are expected.

We used sessional returns from 1985-86, which is when this information started to become available, until 2013-14[i] to count the number of full reports, the number of full reports for which responses were expected[ii], and the number of reports which received at least one response, whether from a government department or other public body.

Responses expected and received

Over the period, departmental select committees published 3,142 full reports, for 2,723 of which a response was expected. Of these 2,723 reports, 2,493 (91.6%) received a response of some kind at some point in time.
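The headline percentage follows directly from these counts. As a minimal sketch (the figures are taken from the text; the variable names are ours):

```python
# Headline figures quoted above (sessional returns, 1985-86 to 2013-14).
full_reports = 3142        # full reports published by departmental select committees
responses_expected = 2723  # reports for which a response was expected
responses_received = 2493  # reports receiving at least one response

# Overall response rate: responses received as a share of those expected.
response_rate = responses_received / responses_expected
print(f"{response_rate:.1%}")  # 91.6%
```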

As can be seen from Figure 1, the proportion of reports for which a response is expected and which then receive a response has risen steadily over the period (with dips for the last parliamentary sessions before a general election). Indeed, since 1995-96, it is only for the last parliamentary sessions before a general election that response rates fall below 90%. However, there is only one parliamentary session – 2005-06 – for which the response rate from government is 100%, which really ought always to be the expected figure.

As can also be seen from Figure 1, the proportion of reports for which a response is expected has declined over the period, the key session appearing to be 2002-03 – the session when select committee core tasks were first introduced. After this point, the proportion of reports for which a response is expected never rises above 90%; before this point, the proportion is above 90% for 13 out of the 17 sessions.

Figure 1: Proportion of Full Reports for which responses are expected & which receive a response, 1985-2014 (with trend lines)

Table 1 shows the proportion of reports for which a response is expected and which then receive a response for each departmental select committee (and their forerunners) over the whole period. As can be seen, with one major exception, all departmental select committees receive a response for a very high proportion of their reports. The exception, which does indeed stick out like a sore thumb, is the Scottish Affairs Select Committee, which only received a response for 66% of reports for which a response was expected[iii].

Figure 2 shows both that the Scottish Affairs Select Committee did not publish a large number of reports for which a response is expected until towards the end of the period and that the response rates for reports published are comparable to other committees until the 2010 general election. After this point, the response rate dropped dramatically at a time when the committee started to publish a much higher number of reports.

What does all this mean for select committee effectiveness?

On the basis of what we present here, it is clear that, over time, departmental select committees are tending to receive responses to their reports more often. We cannot know simply from Sessional Returns whether the apparent deepening institutionalisation of this process is because select committees have become better at writing reports that demand a response and/or because Governments (and other public bodies) have become more willing to respond. Moreover, the data we analyse here tells us nothing, of course, about the content of responses. Governments may be willing to respond more often but the content may now be more cursory and/or dismissive than before. Indeed, those involved in select committee work frequently complain about the perfunctory nature of government responses. However, in terms of the evidence we do have, there is a prima facie case that, for whatever reason, departmental select committees have become more effective in holding the Government to account on this measure.

The seeming exception to this trend is the Scottish Affairs Select Committee. It is not clear what is going on here, or whether this trend continues post-2013-14 (which is another reason why the Sessional Returns should continue to collate data about responses, etc.[iv]). If it is indeed the case that the Scottish Affairs Select Committee receives fewer responses than other committees when responses to reports are deserved, then this needs to be explored further and rectified.

Of course, even ignoring the Scottish Affairs Select Committee for the time being, the response rates for the other select committees are still below 100%, which should be the expected figure under normal circumstances. Indeed, despite the rising trend over time, it remains the case that more than one in twenty full reports published since 2010 which require a response do not receive even the most cursory and/or late response from government. Although this response rate is high compared with earlier periods, it perhaps remains a little disconcerting given that full reports are the direct outputs of select committee inquiries – one of the principal means of scrutiny by Parliament over the work of government.

Finally, the evidence here suggests that departmental select committees are now tending to produce fewer reports for which a response is expected. This, in turn, suggests that departmental select committees are focusing to a greater extent on other work beyond direct departmental scrutiny. For example, the Treasury Select Committee now produces a lot of reports on (re)appointments to bodies such as the Bank of England and the Monetary Policy Committee for which responses are not required. It may be a coincidence but this trend appears to have begun shortly after the first iteration of the core tasks was introduced. It may well be the case, then, that this formal codification of select committee tasks helped to alter select committee practice and broaden out the focus of their work.

Stephen Holden Bates is a Senior Lecturer in Political Science at the University of Birmingham, UK. Follow him on Twitter: @Stephen_R_Bates 

Mark Goodwin is a Lecturer in Politics at Coventry University. Follow him on Twitter: @MarkRGoodwin

Steve McKay is Distinguished Professor in Social Research at University of Lincoln. Follow him on Twitter: @SocialPolicy 

Wang Leung Ting is a Fellow in the Department of Government at the London School of Economics and Political Science. Follow him on Twitter: @kiwiting

Our project, the Select Committee Data Archive (1979-Present), from which this blog emanates, was part funded by the British Academy (SQ140007).

Read Part One of Stephen Holden Bates, Mark Goodwin, Steve McKay and Wang Leung Ting’s analysis of select committee sessional return data: Debating the Effectiveness of House of Commons Departmental Select Committees in Informing the House and Part Two: Consensus and Division(s) in Departmental Select Committees

Footnotes

[i] We had to stop at this parliamentary session because the information provided in the Sessional Returns was drastically curtailed from 2015-16 onwards with readers pointed towards the committee’s webpage. Unfortunately, it is often both very difficult and very time-consuming to ascertain both whether or not a response is expected by the committee and whether or not the committee has received a response from the details (or lack of them) provided on the webpages and in the committee’s minutes and reports. This is another example of why recent changes to the Sessional Returns are a Bad Development and should, in our opinion, be reversed quick smart.

[ii] Not all select committee full reports require a government response. For example, some detail the work of select committees over a parliament, or are concerned with pre-appointment hearings about which the committee does not believe a response from government is required.

[iii] This result was so surprising and out of kilter with the results for the other departmental select committees that we triple-checked the Sessional Returns and Scottish Affairs Select Committee’s webpage. It may be the case that the sessional returns misreport the number of responses received, or that responses were received very late and post-2015 when it becomes much more difficult and time-consuming to determine whether a response has been received. However, on the basis of the evidence we have and to the best of our knowledge, these findings are accurate.

[iv] We would apologise for harping on about this but we are not sorry at all.