Researchers and New Technology

This chapter will consider the first of Boyer's scholarly functions, termed discovery, by examining the use of new technologies and practices by researchers.

The current state

There have been a number of recent studies examining researchers’ use of new technologies, and the conclusion one can draw from these is of cautious experimentation. Perhaps more than any of the other scholarly functions, the use of new technology in research is the most conservative, maybe because research is the practice still most highly valued. This chapter will review some of the current evaluation research and then consider some of the potential uses.

If technology uptake is examined first of all, most studies indicate that researchers tend to use a variety of tools, some of which are provided by their institution and others they have selected themselves (Kroll and Forsman 2010). In terms of Web 2.0 technologies, there is tentative take-up; for example, Proctor, Williams and Stewart (2010) in the United Kingdom found that

a majority of researchers are making at least occasional use of one or more web 2.0 tools or services for purposes related to their research: for communicating their work; for developing and sustaining networks and collaborations; or for finding out about what others are doing. But frequent or intensive use is rare, and some researchers regard blogs, wikis and other novel forms of communication as a waste of time or even dangerous.

As we saw in Chapter 2, there is little evidence to suggest that age is a factor in the use of new technologies, as Carpenter, Wetheridge and Smith (2010) claim:

[T]here are no marked differences between Generation Y doctoral students and those in older age groups. Nor are there marked differences in these behaviours between doctoral students of any age in different years of their study. The most significant differences revealed in the data are between subject disciplines of study irrespective of age or year of study.

There is a general suspicion around using social networks to share findings, although many researchers use them for personal and professional networking (James et al. 2009; Carpenter et al. 2010). Carpenter et al. describe researchers as ‘risk averse’ and ‘behind the curve in using digital technology’. Similarly Harley et al. (2010) state that ‘we found no evidence to suggest that “tech-savvy” young graduate students, postdoctoral scholars, or assistant professors are bucking traditional publishing practices’.

The relationship with publishing is a tense one (which we will look at in more detail in Chapter 12). While many researchers expressed support for open access, for instance, with James et al. (2009) reporting 77 per cent agreement with the principle of open access publishing, there were also reservations about quality or, more significantly, perceptions by others of quality. Similarly Proctor et al. (2010) found that print journals were rated as more important than online ones.

What this indicates is the strong relationship between academic journals and recognition. It is through publishing in well-renowned journals that researchers are likely to gain tenure or promotion and also to be recognised in their own institution. There is thus a disincentive inherent in scholarly practice to explore new forms of publication, even when the majority of researchers themselves may support them. This is also related to reputation and identity. If other forms of output are perceived as frivolous then early stage researchers in particular will be discouraged from engaging with them. The academic with tenure, however, is often more willing to experiment with new technologies and forms of dissemination, as their reputation is already established. For instance, in the US context at least, Kroll and Forsman (2010) claim that ‘the issue of open access publishing elicited strong support with faculty who want to share their publications freely. However, faculty express a strong preference for their graduate students to publish in traditional high-impact journal’.

Harley et al. (2010) put it even more bluntly:

Established scholars seem to exercise significantly more freedom in the choice of publication outlet than their untenured colleagues, …

The advice given to pre-tenure scholars was consistent across all fields: focus on publishing in the right venues and avoid spending too much time on public engagement, committee work, writing op-ed pieces, developing websites, blogging, and other non-traditional forms of electronic dissemination.

Academic research is then in a strange position where new entrants are encouraged to be conservative while the reinterpretation of practice and exploration is left to established practitioners. This seems to be the inverse of most other industries, where ‘new blood’ is seen as a means of re-energising an organisation and introducing challenging ideas. This should be an area of concern for academia if its established practice is reducing the effectiveness of one of its most valuable inputs, namely the new researcher.

As touched upon in Chapter 4, one area that is seeing significant change is the open access approach to data. There is a driver in this area from research funders, who are implementing policies which treat data sets as a public good, with frameworks and services for discovery, access and reuse. In the United Kingdom, five of the seven research councils now have such policies (Swan and Brown 2008). There is variation across disciplines: many already have an established practice of sharing data, while for others this is not the norm.

The use of social networks to form research teams is still rather tentative, with well-established practices still prevalent. Kroll and Forsman (2010) stress the importance researchers place on personal contacts:

Almost all researchers have created a strong network of friends and colleagues and they draw together the same team repeatedly for new projects …

Everyone emphasizes the paramount importance of interpersonal contact as the vital basis for agreeing to enter into joint work. Personal introductions, conversations at meetings or hearing someone present a paper were cited as key in choosing collaborators.

This perhaps indicates something of a closed shop – successful researchers have established personal networks which have been built up from years of attending conferences and previous collaboration. As financial pressures begin to bite in research funding, competition for grants becomes more intense, with success rate decreasing from 31 per cent in 2000 to 20 per cent in 2009. The average age of first-time principal investigators has increased over the same period (Kroll and Forsman 2010). Both of these factors may suggest that having previously successful teams will become more significant, thus creating a research funding spiral, where a greater percentage of the smaller funds goes to a decreasing set of researchers.

The picture we have then of research is one where scholars are exploring the use of a number of different technologies to perform certain functions individually, but the overall uptake and attitudes vary enormously. This is partly because ‘research’ is such a catch-all term, encompassing differences in disciplines, widely varying research methodologies and, of course, many different personalities and attitudes. Engagement with, and uptake of, new technologies is less than might be expected or found in other communities. As Wu and Neylon (2008) put it,

The potential of online tools to revolutionize scientific communication and their ability to open up the details of the scientific enterprise so that a wider range of people can participate is clear. In practice, however, the reality has fallen far behind the potential.

Given the potential benefits of new technologies (which I'll address below), why might this be so? The environment within which research operates can be seen as contributing to a lack of engagement. For example, in the United Kingdom, there was a Research Assessment Exercise, now superseded by the Research Excellence Framework (REF), which assesses the quality of research in UK universities and then allocates funds on this basis. Similar schemes have been implemented in Australia, the Netherlands and New Zealand. The current proposals for the REF have an aim to ‘support and encourage innovative and curiosity-driven research, including new approaches, new fields and interdisciplinary work’. However, the types of outputs mentioned focus on journal articles, and the exploration of metrics is restricted to a few commercial publishers’ databases. There is no explicit encouragement to engage with new forms of outputs or to foreground an open access approach. As with all such exercises they significantly shape behaviour, and do not simply measure it, so the message researchers may have gained from their institution that the exploration of new approaches is discouraged becomes reinforced at a national level.

Where researchers are using new tools they are doing so in conjunction with existing ones, finding appropriate uses for the tools to make their work more effective. Proctor et al. (2010) summarise it thus:

[T]here is little evidence at present to suggest that web 2.0 will prompt in the short or medium term the kinds of radical changes in scholarly communications advocated by the open research community. Web 2.0 services are currently being used as supplements to established channels, rather than a replacement for them.

This may be an entirely reasonable approach, since research is at the core of what it means to be a scholar, and issues around quality and reliability are essential in maintaining the status and reputation of universities. A cautious approach is therefore not surprising as researchers seek to understand where the potential of these new tools can enhance their practice, while simultaneously maintaining the key characteristics of quality research. I would argue that it is this integrity of research which should frame discussions and experimentation with new technologies, and not the negative influence of promotion criteria and funding frameworks, since a genuine concern for the nature of research is just as likely to lead to the acceptance of new methods that improve its efficacy as to the rejection of those that threaten its reputation.

The research context, in particular funding and publication models, may work against the adoption of new approaches, but that may not be the only reason. There may be intrinsic conflicts with the ingrained practices of the discipline itself. For example, examining ‘Science 2.0’ in Nature, Waldrop (2008) found that while wikis were being used regularly as collaborative research tools, blogging was less popular. The reasons for this may not be simply a reluctance to embrace new technology but rather that the form of communication runs against the training and values scientists have developed over many years:

‘It's so antithetical to the way scientists are trained,’ Duke University geneticist Huntington F. Willard said at the April 2007 North Carolina Science Blogging Conference, one of the first national gatherings devoted to this topic. The whole point of blogging is spontaneity – getting your ideas out there quickly, even at the risk of being wrong or incomplete. ‘But to a scientist, that's a tough jump to make,’ says Willard, head of Duke's Institute for Genome Sciences & Policy. ‘When we publish things, by and large, we've gone through a very long process of drafting a paper and getting it peer reviewed’.

There may be a dilemma with science in particular and the informal lightweight technologies: scientists are engaged in the business of predicting the future. Given certain variables, these outcomes will ensue with a certain probability (or these outcomes are a result of these input variables). But as we have seen already, the benefits of many ‘Web 2.0’ ways of working are wrapped up in unpredictability. Authors won't know which blog posts will be popular; they can share ideas on Twitter but can't predict who will take them up; they can release research data but won't know what the uses for it will be. It might be the case then that scientists in particular want predictable benefits and outcomes from engaging in this type of activity, and at least at this stage these benefits are less than predictable.

A networked research cycle

There are many proposed approaches to conducting research through a ‘research cycle’ (e.g. McKenzie 1996; Hevner and March 2003). This section will adopt a basic cycle of plan, collect data, analyse and reflect to demonstrate how an open, digital, networked approach to the process might be realised, using a variety of tools. This is intended to be indicative of how new approaches could be used in the research process and not a claim that all research can or should be performed in this manner:

  1. Planning – researchers establish their research question through iterative exposure, using social networks and blogs. They seek feedback and ask for relevant experience. Using online information sources such as Delicious feeds and Google Scholar they gather relevant information to inform their research proposal. They set up a series of Google alerts around a number of subjects to gather daily information. A plan is created that incorporates regular release and small-scale outputs. They hold an informal online meeting with some interested parties and establish a project blog or wiki.

  2. Collect data – researchers continue to use online information sources for their literature review. They create an online database and seek user contributions, seeded by requested contributions from peers in their network. An online survey is created in SurveyMonkey.

  3. Analyse – researchers use Google Analytics to examine traffic data and SurveyMonkey analytics to analyse responses. They use data visualisation tools such as ManyEyes to draw out key themes in responses.

  4. Reflect – reflection occurs throughout the process by means of a series of blog posts and video interviews.
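To make the analysis step concrete, here is a minimal Python sketch of the sort of tallying a survey tool automates, applied to a small CSV export of the kind SurveyMonkey produces. The column names and responses are invented for illustration:

```python
import csv
import io
from collections import Counter

# A stand-in for a survey tool's CSV export (columns are hypothetical).
raw_export = """respondent,discipline,uses_web2
r1,history,yes
r2,genetics,no
r3,physics,yes
r4,history,yes
"""

def tally(column, export=raw_export):
    """Count the responses appearing in one column of the export."""
    reader = csv.DictReader(io.StringIO(export))
    return Counter(row[column] for row in reader)

print(tally("uses_web2"))   # Counter({'yes': 3, 'no': 1})
print(tally("discipline"))  # Counter({'history': 2, 'genetics': 1, 'physics': 1})
```

The point is not the code itself but that each stage of the cycle can be carried out with freely available, lightweight tools rather than specialist infrastructure.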

This would constitute a valid approach to research which would be comparable with current approaches. New methods then could be, and frequently are, deployed within a conventional structure. It is the development of new approaches and interpretations of what constitutes research that I think is more interesting, and more challenging to our notions of scholarship, and it is these that will be explored in the next section.


Having set out the overall view of the landscape as it pertains to research and open, digital, networked approaches, I now want to look at a number of themes which I believe will have increasing relevance, whether it is because they become accepted practice or because the research community reacts against them.


Granularity

Changes in granularity are one of the unpredicted and profound consequences of digitisation. As we saw in Chapter 3 the music industry has seen a shift to the track becoming the standard unit, rather than the album. A similar process has happened with newspapers, where the impact of search and social networks has seen individual articles being circulated, linked to and discovered, whereas with the physical artefact it was usually at the level of the whole paper that sharing occurred. As suggested in Chapter 3 a similar breakdown in granularity for books has yet to materialise.

Books and journals will undoubtedly continue to exist, but they will not hold the monopoly on being the conduit for ideas. Just like some albums, some books have an integrity that justifies the format: the whole product is worth the investment. But just as many albums used to consist of a handful of good tracks padded out with what we might generously term ‘album tracks’, so many books seem to be a good idea stretched over 100,000 words. This isn't necessarily the author's fault, but rather a result of the book being the dominant route for transmitting ideas in society.

But this need no longer be the case – an online essay, a blog, a podcast, a collection of video clips – all these are perfectly viable means for disseminating ideas. As well as the book losing its monopoly, so does text – audio and video can be used effectively. Text became the dominant form largely because it was transportable when ideas were tied in with physical objects. And if ideas become the equivalent of tracks, then perhaps users create the equivalent of a playlist by pulling these together around a subject of their choice.

As a blogger, I have found this range in granularity one of the most appealing aspects. A post can vary from a link to an essay; it can be a commentary on someone else's work, a piece of parody, a research finding, a suggestion, an appeal for contributions and so on. Having alternative outlets creates a backward reaction, in that it then influences the type of research people perform, which is my next category.

Pushback from outlets

In The Adventures of Augie March, Saul Bellow (1953) famously observes that ‘there is no fineness or accuracy of suppression; if you hold down one thing, you hold down the adjoining’. The reverse would seem to be true also, let's call it Bellow's law: There is no targeting of liberation; if you release one thing, you also release the adjoining. It is this knock-on effect that creates the era of uncertainty we are now in. We are seeing the liberation of a number of activities that are facilitated by the open, digital network. These include the removal of filters, freedom to publish and broadcast, the ability to share easily, establishing peer networks without the need for travel, creating communities of interest around diverse subjects and so on.

For each of these there are consequent effects. So if we take the change in granularity in outputs, combined with the removal of filters such as publishers, then we see an example of Bellow's law in action. What constitutes research itself begins to change, since what we regard as research has been partly determined by the process of communicating its outputs. The general approach is to conduct research and disseminate at the end of the project, with maybe a conference presentation on work in progress about halfway through. This can be seen as a necessary approach when conducting large-scale research and also as a way of ensuring what is communicated is reliable and backed up by evidence. But it might also be influenced by the nature of outputs – if you are required to write a 5,000 word paper (or 10,000 word report), then it needs to be based on something substantial. There is an analogy with software production – the traditional model was to expend years in development and then release a finished product. The open source software approach saw a reversal of this, with developers using the community to find bugs and help fix them, which Raymond (1999) has termed a ‘release early, release often’ approach. He compares the two approaches thus, talking about Linux developer Linus Torvalds:

If the overriding objective was for users to see as few bugs as possible, why then you'd only release a version every six months (or less often), and work like a dog on debugging between releases.

Linus's innovation wasn't so much in doing quick-turnaround releases incorporating lots of user feedback … , but in scaling it up to a level of intensity that matched the complexity of what he was developing. In those early times (around 1991) it wasn't unknown for him to release a new kernel more than once a day! Because he cultivated his base of co-developers and leveraged the Internet for collaboration harder than anyone else, this worked. Raymond (1999)

A similar approach may suit some elements of research (I would not suggest it suits all projects or disciplines). A researcher releases, or communicates, ideas, progress, mock-ups, prototypes, draft results and so on throughout their project, gathering feedback as they go.

Perhaps more interesting is that the granularity of what we consider to be research may then alter. The UK REF uses the following definition of research: ‘a process of investigation leading to new insights effectively shared’.

The REF represents a fairly traditional, conservative view of research, concerned with promoting research which is universally recognised as excellent, so its definition is not one we can assume is directed at revolutionising research practice. Yet there is nothing in this definition that specifies the length of a project or the size of the outputs it produces.

Let us take the example of my OU colleague Tony Hirst, who blogs at OUseful.Info. He typically explores new technologies and data visualisation in particular. A random sampling of recent posts includes the following:

  1. an analysis of Twitter connections between UK politicians,

  2. a representation of online communities who use the same hashtag,

  3. an interrogation of the Mendeley software to show users by institution,

  4. sharing his own promotion case, and

  5. a presentation on ‘data-driven journalism’.

Each of these is intended to promote discussion and suggests implications, for example, for how higher education can make effective use of data. None of them arise from a specific research project, and each of them is fairly small in terms of time and resource. The existence of his blog, though, allows Hirst to engage in this ongoing experimentation, as it has an outlet, but it simultaneously encourages it also, since discussions will arise on the blog (or in other places such as Twitter). Taken as a whole then, the blog itself represents the research process, and in this context it is difficult to say that it is not demonstrating ‘a process of investigation leading to new insights effectively shared’.

What this may indicate is a shift away from specific outputs towards ongoing activity, engagement and reputation, which would be more difficult to measure and reward. Most people know what a good publication record looks like, but could we recognise a good blog track record?


Crowdsourcing

Again building on the open source model, researchers are beginning to realise the potential of a distributed model gathering user input. This can be in the form of grid computing, which utilises the computing power of individual computers to crack complex tasks. An example of this was Oxford University's screensaver project, which sought to find a cancer cure by using the distributed computational power of 3.5 million individual computers to screen molecules. Other approaches include user contributions, such as the iSpot project, where users upload photographs of wildlife to be identified by experts. This can be used to develop an overall picture of the distribution of species, and in one case revealed a moth never seen before in the United Kingdom.
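The distribution at the heart of such grid projects can be sketched in miniature. The following Python fragment farms a screening task out across worker processes; the scoring function here is an invented stand-in for the real molecular computation, and the threshold is arbitrary:

```python
from multiprocessing import Pool

def score_molecule(molecule_id: int) -> tuple[int, float]:
    """Hypothetical stand-in for the docking computation each
    volunteer machine would actually run."""
    return molecule_id, (molecule_id * 2654435761 % 1000) / 1000.0

def screen(molecule_ids, workers=4, threshold=0.99):
    """Farm candidate molecules out to worker processes and keep
    those whose score clears the threshold."""
    with Pool(workers) as pool:
        results = pool.map(score_molecule, molecule_ids)
    return [(m, s) for m, s in results if s >= threshold]

if __name__ == "__main__":
    hits = screen(range(10_000))
    print(f"{len(hits)} candidate hits")
```

Real grid projects add layers this sketch omits, notably shipping work units to untrusted volunteer machines and verifying the returned results, but the divide-score-filter shape is the same.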

Similarly the Reading Experience Database seeks to gather examples ‘of the reading experiences of British subjects and overseas visitors to Britain from 1450–1945, whoever they were, and pretty much whatever they were reading’. This type of extensive record can only be achieved by opening it up to a wider audience, who not only will have access to different experiences but who may also have a different perspective on what constitutes reading matter than if the database were solely populated by academics, who might have a literary bias.

Whereas many such projects seek to encourage input from everyone, others are adding in a layer of filter and publication. For example, the Stanford Encyclopedia of Philosophy has a Wikipedia-type approach, but with an additional layer of editing, so that ‘all entries and substantive updates are refereed by the members of a distinguished Editorial Board before they are made public’. In this way they hope to combine the power of user-generated content with the reliability of a scholarly reference work.

The demonstrable advantage of such open approaches to data gathering for specific projects is leading to this being an increasingly popular methodology. The problem for such projects is in gaining sufficient contributions, and knowing how to promote this and generate appropriate levels of interest will become a relevant research skill.

Light connections and nodes

As the reviews above highlighted, collaboration and teams are still formed through personal contacts which are established through conferences, previous research, other professionals and so on. This is one area where I suspect we will witness a gradual alteration. As academics establish networks of peers online through blogs, Twitter, Facebook, Friendfeed, LinkedIn and other tools, and these become a more established part of the working pattern, the peers who constitute them come to be seen as another source of contacts. This will arise through the development of an online reputation, which will lead to collaboration. For example, if researchers are constructing a research proposal and realise they need a partner with experience in a particular subject, they will approach someone in their online network who has blogged or tweeted knowledgeably about the subject. Or they may even put out a direct request, asking for partners with appropriate expertise.

In this respect online social networks can be seen as a complement to existing ones. What may be more interesting is whether networks allow different forms of collaboration, just as open databases allow different forms of user contributions. Maintaining personal networks is hard work, since they operate on a one-to-one basis. They are therefore relatively small by nature. Maintaining online networks is less arduous, since an individual is effectively broadcasting to all those in their network. A Facebook status will be read by (potentially) all of your friends. One can view online relationships much more like activity networks; at different times certain nodes or clusters will be ‘activated’ or more intense. It is therefore possible to maintain a diverse and large network of peers through a series of light connections, just as content can be shared in a frictionless manner (more on this in Chapter 7).
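This notion of light connections can be illustrated with a toy model: peers as nodes tagged with the topics they have blogged or tweeted about, with only the relevant nodes ‘activated’ for a given query. All names and topics here are invented:

```python
# A toy model of 'light connections': each peer in the network is a
# node tagged with the topics they have posted about knowledgeably.
network = {
    "ana":    {"data visualisation", "open access"},
    "ben":    {"crowdsourcing", "genetics"},
    "chiara": {"data visualisation", "learning analytics"},
}

def find_collaborators(topic, peers=network):
    """Activate only the nodes relevant to one topic, leaving
    the rest of the network dormant."""
    return sorted(name for name, topics in peers.items() if topic in topics)

print(find_collaborators("data visualisation"))  # ['ana', 'chiara']
print(find_collaborators("genetics"))            # ['ben']
```

Crude as it is, the model captures why such networks scale where personal ones do not: the cost of maintaining a dormant connection is close to zero, so the network can be large and diverse, with only the nodes relevant to the task at hand being activated.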

If the definition of research becomes altered (or expanded) to include the smaller granularity outputs mentioned above, then it follows that the type of collaboration needed to realise these may vary from the large-scale, management-intensive project teams we currently operate. Collaboration may be to ask a number of peers within a network to contribute, or to come together for an online event or to engage in a distributed debate across blogs. All of these would constitute research but would not require face-to-face meetings or large investment.

Rapid innovation

In a presentation for TED (Technology, Entertainment, Design), its curator Chris Anderson (2010) explores the idea of rapid innovation being driven by the sharing of video on a global scale. He gives the example of dancers sharing moves via YouTube, which they then learn, innovate upon and then share back. Anderson suggests that they have seen a similar effect with TED talks, where each speaker was effectively being challenged by the quality of previous ones which they had viewed online. He refers to it as ‘crowd accelerated innovation’, which requires three elements to flourish: a crowd, where people will occupy a number of roles; light, which can be interpreted as the ability to be able to see what people, particularly the innovators, are doing; and desire, which is the motivation to spend the required time in attempting innovation and is driven often by competition and the potential to be seen by a large audience.

This rapid innovation can be seen in skills such as dance, guitar playing, skateboarding and so on, which both lend themselves to the visual medium and also appeal to a younger audience. But it is interesting to reflect whether a similar phenomenon will arise in research. Will the early sharing of research, with a global audience, drive innovation and reduce time lags between cycles of research? A small example I have witnessed is with the improvement of presentations. Once individuals start sharing presentations on tools such as Slideshare, they both encounter good presentations and also realise that their slides are available to a potentially much wider audience. One way to reach that audience is to move away from the ‘death by bulleted list’ approach and make slides more visually engaging and the message clearer. The same may well happen with research in general, if we see a move to sharing smaller granularity outputs earlier in the research cycle. If a research project takes two years to complete and there is an 18-month delay in the publication of a paper, then a four-year cycle between rounds of research can be expected. But this could be reduced dramatically by the adoption of digital, networked, open approaches.


Conclusion

In this chapter the current attitude of researchers to new technologies and communication forms was reviewed. There are islands of innovation, but in general the attitude of the research community is one of caution and even occasional hostility. This can be partly attributed to the context within which research occurs and is recognised and rewarded, which acts to discourage use of different approaches by focusing heavily on the traditional peer-reviewed journal. It is also a product of inherent values and attitudes within research and disciplines which are at odds with many of the affordances of lightweight, informal communication channels. Researchers, in this respect, are engaging with new technologies when they complement existing practice and offer a more efficient means of realising their goals.

Some emerging themes and their implications were then drawn out, including changes in granularity, changes in the nature of research as a result, the use of crowdsourcing techniques and the development of light connections and online networks. These emerging themes sit less comfortably alongside existing practices and can be seen as a more radical shift in research practice. A combination of the two is undoubtedly the best way to proceed, but the danger exists of a schism opening up between those who embrace new approaches and those who reject them, with a resultant entrenchment to extremes on both sides. This can be avoided in part by the acknowledgement and reward of new forms of scholarship, a subject we will return to in Chapter 11.

Boyer's function of integration is the focus of Chapter 6 and in particular interdisciplinary work.
