This month I have been having an interesting discussion with colleagues at the University of the West of Scotland about submitting popular music to the REF. (For those readers who don’t work in a university environment, the Research Excellence Framework [REF] is the system that rates the quality of research carried out in HE institutions in the UK. Important funding decisions are made based on the outcomes, which means that researchers are under pressure to consider how the research they are producing will be received and rated during this exercise, which happens every 6-7 years.) In considering how best to publish and disseminate my work (for example the Pervasive Punishment EP I described in my previous post), the question arises as to how the quality of the outputs will be ascertained and how this relates to ‘peer review’.
Peer review plays a central role in systems of academic publishing and is seen as a principal quality assurance mechanism for written research. Several independent experts (peers) critically evaluate the research before it is published, and authors must respond to their evaluations either with changes or with a justification for not making them. Academic publishers and editors all follow a well-trodden and well-understood path of peer reviewing all materials. Submissions to the REF need to have an assured level of quality as pieces of research, and peer review is one of the benchmarks of that quality.
Since this is the first time I will be submitting practice-based work to the REF, my colleagues have been asking this week – when it comes to practice-as-research in pop music, what is the equivalent of peer review in publishing? An artist could, for example, have an artefact accepted for viewing at an academic conference/festival, or accepted by a curator to be exhibited in an institutional gallery. For popular music there is no equivalent.
In terms of industry procedure, there is no standard practice in my field for music publishers or record labels to send music to a panel of experts to judge its quality before it is disseminated. Certainly, record label owners/managers/A&R people are experts in the field, and they bring that expertise to bear on their decisions about whether to release a record. But is this equivalent? There are aspects of a recording, and certainly of a performance of music, that are fixed or fleeting, and cannot be altered in response to comments from such reviewers. There may be a process of demoing songs, which are then reviewed by A&R workers before being altered in the final process of recording or mixing. But this is by no means a ubiquitous process.
If we accept that those in record labels who make the decisions as to whether or not to release music (who may be single individuals rather than a panel) are experts or ‘peers’ in the field, then how do we rate the quality of those expert reviewers without institutional backing? Would the ‘peer reviewer’ record label need to be a major, like Sony? Would a major indie such as Domino Records do? What about a true independent like my own record label, Olive Grove?
Or is the appropriate peer review in this instance what the Research Information Network (RIN) calls “the evaluation of publications once they have been published, through reviews and review articles”? In this case, does practice-as-research require excellent media reviews and plays on national radio as an assurance of its quality? The practice output from my AHRC-funded research project ‘Fields of Green’ – the Wrack Lines EP – received airplay from BBC Radio 6, for example. Would this be evidence of quality in lieu of ‘peer reviewed publication’? If so, how is this evidenced within the 300-word contextualisation that those submitting artefacts to the REF are allowed? The 2014 REF guidance asks that this is not used to “volunteer opinions about the quality of an output” but states that it can be used to detail “means of dissemination” and “factual information about the significance of the output… (for example, where the output has gained external recognition…)”.
Another option for releasing pop-as-research would be to do so primarily via ‘academic channels’ rather than within the industry, and demonstrate assured quality for the REF that way. Although the field of practice-as-research in popular music is growing, to my knowledge there has only been one publication so far to which such music could have been submitted and which would meet the criteria of peer review – IASPM Journal’s Special Edition on Practice-Led and Practice-Based Popular Music Studies (unfortunately this was released while I was on maternity leave). Two questions arise here, however. Firstly, is an academic context really the best context for deciding quality in the field of practice in pop, given that, as its very name suggests, some would argue pop music has its roots in its commercial nature? Of course, such music could also be released outside this context once its quality as ‘research’ was assured via this means. But the question remains regarding the number of opportunities to disseminate popular music research in this way.
This leads to another related question: what qualifies the practice as ‘research’ as well as ‘practice’, and therefore who has the best expertise to review its quality as such? The REF 2014 criteria state that the material needs to be “the product of a process of investigation leading to new insights, effectively shared” and that the 300-word contextualisation should demonstrate how the practice counts as research, for example by detailing its contribution in terms of what they call new “interpretations or insights”, the “assembling of information in an innovative way”, or “innovative methodologies and/or new forms of expression”. From the given guidance, it seems that as long as the 300-word submission sets out how the practice counts as research, quality is judged purely by the REF panel engaging with the piece itself. For example, the panel can judge a piece as “of international significance” even where there has been no “international exposure in terms of publication or reception, or any necessary research content in terms of topic or approach”. Having said this, although the assessment criteria don’t explicitly require peer review around publication/dissemination, experience and general insight from colleagues at many different institutions suggest that this is still an essential requirement.
Secondly, where what is being researched and evidenced through the practice is rooted in its ability to create dialogue and challenge assumptions within communities other than academic ones, does disseminating the music in an academic context (‘peer reviewed’ or not) allow the music to fulfil this function…? For example, my practice-led work on the ESRC/AHRC-funded project ‘Distant Voices: Coming Home’ involves devising arts-based research practices and co-writing songs in a research team comprising people with experience of the criminal justice system, and then releasing these songs back into the communities involved as a way of creating dialogue, generating empathy, challenging stigma and exploring/enacting (re)integration. In this circumstance, releasing these works to the audience of an academic journal is not appropriate and does not allow the work to display its full multi-purpose function as research.
There are so many questions here, aren’t there? There are many uncertainties about how the REF criteria are applied and judged in terms of practice-based research (not least because they change each time a new REF exercise comes around, and because panel members aren’t allowed to discuss their deliberations with others), and I’m hoping to get as much guidance as I can from practice-led researchers across disciplines who have already submitted practice to the REF. The Royal Musicological Society, for example, is setting up a working group to help think through these things. Perhaps thinking about how compositions in other types of music have been judged in the past will be helpful.
If I gain any further insights, I will update my thoughts here with another post.