About the Author(s)


Chris W. Callaghan
School of Economic and Business Sciences, Faculty of Commerce, Law and Management, University of the Witwatersrand, Johannesburg, South Africa

Citation


Callaghan, C. W. (2019). Rothwell’s augmented generations of innovation theory: Novel theoretical insights and a proposed research agenda. South African Journal of Business Management, 50(1), a217. https://doi.org/10.4102/sajbm.v50i1.217

Original Research

Rothwell’s augmented generations of innovation theory: Novel theoretical insights and a proposed research agenda

Chris W. Callaghan

Received: 05 May 2018; Accepted: 23 Apr. 2019; Published: 20 June 2019

Copyright: © 2019. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Background: Technological advances necessitate the reconceptualisation of certain seminal theory. Novel developments in information and communication technologies have disrupted certain markets, industries and processes.

Objectives: Rothwell’s seminal generations of innovation typology provides a categorisation of advances in innovation theory. In so doing, it provides an overarching logic that relates these advances to a production function, categorising the way each wave of innovation theory has driven the time/cost curve closer to the origin. The objective of this article is to augment this typology to include theory that relates to novel technological advances, with implications for the innovation time/cost curve.

Method: As a conceptual article, this work seeks to offer a synthesis of theory, extending Rothwell’s schema to incorporate theory related to novel technological developments.

Results: Rothwell’s conceptualisation of five generations of innovation does not consider recent technological advances, which necessitate a reformulation of this schema to take into account the new scale relationships these advances make possible. This article seeks to extend Rothwell’s theoretical framework to incorporate these new potentialities.

Conclusion: In augmenting Rothwell’s theoretical schema, certain implications for society and industries are predicted, and suggestions are made for a proposed research agenda.

Keywords: Generations of innovation; research; development; R&D; probabilistic innovation theory; knowledge management; sixth generation innovation theory.

Introduction

What dynamic links the most dramatic advances in human history? For Nielsen (2012), it is innovation in the research, or discovery, process itself that accounts for most of the dramatic, or revolutionary, advances in human history. Rothwell’s (1994) generations of innovation theory predicts that each ‘generation’ of innovation theory has driven the time/cost curve of production closer to its origin. Considering knowledge production according to Rothwell’s logic offers useful insights into how to improve the effectiveness and efficiency of the research process itself.

However, Rothwell’s fifth (and final) generation of innovation essentially relates to the electronification of the production process. A sixth-generation framework is lacking, one that incorporates the developments and novel opportunities offered by new technologies, as well as a new era of radically increased connectivity, digitisation and big data analysis capabilities. What Rothwell’s framework also lacks, in its application to the management context of innovation theory, is an integration of the literature on scale relationships, or an account of how concepts of economies of scale can be integrated into his model.

Given the need to augment literature relating to Rothwell’s schema, the objective of this article is to extend Rothwell’s (1994) categorisation of generations of innovation to incorporate recent technological capabilities. In so doing, an emerging theoretical framework is developed, together with a proposed research agenda that derives from this framework.

This work, therefore, seeks to build on the literature that suggests that digitisation creates the potential for marginal costs (the cost of producing one additional unit) to be driven close to zero, with important societal implications (Loebbecke & Picot, 2015). Increasing returns to scale, or scale economies, in the research (or research and development [R&D]) process itself (Callaghan, 2017) arguably have the potential to reshape society in certain important ways.
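
To make this scale logic concrete, consider a stylised and purely hypothetical cost function, not drawn from Loebbecke and Picot (2015) or Callaghan (2017): with a fixed cost and a near-zero marginal cost per additional digitised unit, the average cost per unit approaches the marginal cost as output grows. The following minimal sketch uses arbitrary illustrative numbers.

```python
# Illustrative sketch only: hypothetical numbers, not estimates from the cited literature.
# Average cost per unit = (fixed cost + marginal cost * units) / units,
# which approaches the (near-zero) marginal cost as output grows.

def average_cost(fixed_cost: float, marginal_cost: float, units: int) -> float:
    """Average cost per unit for a simple linear cost function."""
    return (fixed_cost + marginal_cost * units) / units

FIXED_COST = 1_000_000.0   # hypothetical up-front R&D/digitisation cost
MARGINAL_COST = 0.01       # hypothetical near-zero cost of one additional digital unit

for units in (1_000, 100_000, 10_000_000):
    print(f"{units:>10} units -> average cost per unit: "
          f"{average_cost(FIXED_COST, MARGINAL_COST, units):.4f}")
```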

The article, therefore, seeks to make the following contributions to the literature. Firstly, it locates technology and its influence on society in relation to a seminal innovation schema that provides a clear rationale for linking time and costs to knowledge production. This contribution is perhaps timely, in light of the technologically enabled scale economies that now exist in both data collection and analysis. A theoretical framework is, however, needed to guide further research and practice at the nexus of research practice and emergent technologies.

This work contributes to an important stream of management literature that relates to the management of the research process itself. According to Nielsen (2012:19), the ‘reinvention of discovery is one of the great changes of our time’, whereby for ‘historians looking back a hundred years from now, there will be two eras of science: pre-network science, and networked science’, and we are currently ‘experiencing a time of transition to the second era of science’. The discussions here will also draw on Nielsen’s (2012) theory of networked science to identify certain channels, or mechanisms, through which the research, or R&D, process can be reconceptualised in terms of its configurations to take account of novel technological opportunities.

Secondly, this article seeks to make a contribution to the recent debates about how technology is enabling a fourth industrial revolution (Schwab, 2017), a third industrial revolution (Rifkin, 2011), a fourth paradigm of scientific research (Gray, 2009) and a second machine age (Brynjolfsson & McAfee, 2014). Seemingly lacking from these newly emerged bodies of literature is a core organising rationale, or conceptual underpinning logic, of the kind offered by Rothwell’s (1994) schema. This work seeks to synthesise this literature according to the logic of time and cost offered by Rothwell’s model, linking these logics to the need for responsiveness to societally important research problems.

Thirdly, this article seeks to make a contribution to the technological innovation literature in terms of its application of management theory to disaster management research. A specific characteristic of conditions of disaster has particular relevance for the augmentation of Rothwell’s (1994) schema, namely that under conditions of disaster, the data required for research problem solving are typically only available after the onset of the disaster. This extension contributes to theory and practice in business contexts in which the time and cost dimensions of research are constrained. For Schwab (2017:20), all ‘new developments and technologies have one key feature in common: they leverage the pervasive power of digitisation and information technology’. The discussions undertaken here seek to offer certain novel conceptual insights into how further theory development and research can be organised to build on Rothwell’s principles, in order to study how to leverage the power of technology to ‘crash’ the time dimension of the discovery process itself.

Having outlined the objective of the article, the discussion proceeds as follows. Firstly, theory and literature are reviewed that relate to Rothwell’s descriptive ‘generations of innovation’ schema. Theory related to a further generation of innovation is then introduced, with specific reference to certain opportunities suggested by recent novel innovation theory, and to how economies of scale can be achieved in the R&D process. The rise of networked science is then discussed, in relation to Nielsen’s notion of the reinvention of discovery, as well as to Gray’s predictions of scientific convergence associated with the ‘fourth paradigm’ of scientific research. Discussions are extended to include Kitchin’s suggestion that big data science now offers useful insights into how theory development itself is changing to capture new opportunities for big data analysis. Testable propositions are then derived. The implications of the augmented theory for society are then discussed in terms of positive (utopian), negative (dystopian) and more realistic expectations. Theory and literature related to Rothwell’s typology of innovation theories are now reviewed.

Theory and literature

Given the persistent failure of contemporary research systems to solve certain societally important problems (Nielsen, 2012; Wallace & Ràfols, 2018), not least of which is the problem of climate change, theory development needs also to focus on how to improve the research process itself. Which theoretical insights are then most helpful in the quest to improve the scientific discovery process, and what would the implications of such an improvement be for society? Using these questions as an ordering framework, certain theoretical perspectives are now considered, in order to develop an embryonic theoretical framework that is able to make predictions about the impact on society of further theory development guided by an augmented form of Rothwell’s (1994) schema. In doing so, this study builds on Rothwell’s (1994) innovation theory to take into account more recent literature, particularly that which relates to technological change.

Rothwell’s generations of innovation schema

What links the different generations of innovation in Rothwell’s schema? Rothwell (1994) uses the logics of development cost and development time as axes to plot the way different generations of innovation have driven the time/cost curves associated with each generation closer to the origin. This plot is shown in Figure 1.

FIGURE 1: Rothwell’s generations of innovation.
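
Rothwell (1994) specifies no functional form for these curves; the following minimal sketch (with arbitrary hyperbolic frontiers and hypothetical scale parameters) merely illustrates the idea of successive generations shifting the time/cost frontier towards the origin, and is not a reconstruction of Figure 1.

```python
# Stylised illustration only: Rothwell (1994) provides no functional form,
# so each generation is drawn here as an arbitrary hyperbolic time/cost frontier
# whose scale parameter shrinks with each successive generation.
import numpy as np
import matplotlib.pyplot as plt

time = np.linspace(0.5, 10, 200)                                  # development time (arbitrary units)
scale_by_generation = {1: 10, 2: 8, 3: 6, 4: 4, 5: 2.5, 6: 1.5}   # hypothetical scale parameters

for generation, scale in scale_by_generation.items():
    cost = scale / time                                           # frontier: cost falls as allowed time rises
    plt.plot(time, cost, label=f"Generation {generation}")

plt.xlabel("Development time")
plt.ylabel("Development cost")
plt.title("Stylised time/cost frontiers shifting towards the origin")
plt.legend()
plt.show()
```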

Rothwell’s (1994) first generation innovation process relates primarily to a period in which technological progress drove innovation, or an era associated with innovation that resulted from technological push.

The second generation of innovation, primarily associated with the period from about 1965 to the early 1970s, relates to a shift in the focus of innovation towards strategic marketing issues, or market pull innovation. Increasing attention was therefore paid to competition for market share, and demand side factors, over this period (Rothwell, 1994).

Rothwell’s (1994) third generation of innovation relates to the coupling model, whereby market and technology aspects of the innovation process were related sequentially, and feedback loops were integrated into a model that incorporated aspects of the first and second generations of innovation. This model was derived from insights gleaned from the period from about 1970 through to the mid-1980s, and the successes of United States (US) firms. Rothwell’s third generation of innovation is illustrated, together with the fourth and fifth, in Figure 1.

The fourth generation of innovation was primarily derived from analysis of the successes of Japanese companies, from the 1980s onwards until the early 1990s (Rothwell, 1994). This generation of innovation relates to the use of integrated systems, functional overlap, and parallel development, combined with ‘design for manufacturing’ principles, whereby Japanese firms were able to ascend to global dominance in manufacturing.

Rothwell’s fifth generation of innovation describes how the focus of innovation turned towards the application of technology to harness extensive horizontal and vertical alliances, to increase the flexibility of responses to markets, and which enabled the comprehensive ‘electronification’ of the innovation process itself (Rothwell, 1994:25).

Rothwell’s schema, however, is yet to be revisited to incorporate the influence of technologies that have emerged since it was first developed. Augmenting this framework is therefore important, so as to incorporate novel theory which predicts that the cost/time curve of knowledge production can be driven closer to the origin.

Indeed, certain technological developments have made accelerated learning possible in the R&D process itself.

The next generation of innovation theory

Technological change has enabled economies of scope and scale in the research process itself. Probabilistic innovation theory (PIT) (Callaghan, 2017) suggests that the mechanics that underlie crowdsourced R&D, or crowdsourcing processes and principles applied to the research process, can allow for high-volume data collection as well as high-volume problem solving inputs, which can result in a probabilistic relationship between problem solving inputs and outputs. An example of these economies of scale and scope is InnoCentive (InnoCentive, 2018), a site that allows firms to put scientific problems up as open calls to be solved on the Internet. The success of this process demonstrates that certain complex scientific problems can be solved more quickly and cost-effectively than they could be by in-house company R&D departments.

Novel opportunities that characterise the development of new innovation theories

Whereas in the past large organisational units were needed to combine the resources necessary for effective scientific R&D, technological advances have shrunk the optimum size of a productive unit (Reynolds, 2006:3).

Whereas economies of scope and scale used to give larger organisations an advantage, technology has neutralised this advantage in many areas (Reynolds, 2006). Rothwell’s theory relates to what is perhaps a first generation of R&D, in that economies of scale and scope were maximised under the auspices of the large organisations that were needed to obtain them. However, given the advances in information and communication technologies (ICTs) that now allow radically increased economies of scale and scope, it is necessary to update Rothwell’s framework to include these new potentialities. Given the importance of these developments, this new generation of innovation might be considered a sixth generation of innovation (6G). The new R&D capabilities enabled by new technologies warrant recognition as another wave of theory with the potential to push Rothwell’s cost/time curve closer to the origin. On the basis of these criteria, the addition of this category to the Rothwell model is considered to be justified, as it rectifies a deficiency in this specific stream of literature.

Economies of scale in the research process itself: How?

The extension of Rothwell’s model is important, because there is much to be learned about how the open collaboration techniques of crowdsourced R&D can be used to capture scale effects. According to Nielsen (2012:55), open source collaboration can capture scale effects, an important example of which is the success of Linux, whereby open source software developers are able to develop thousands of lines of code per day. Open source collaborative R&D models like those used by Linux have come to challenge the dominance of conventional R&D models, such as that applied by Microsoft.

According to Nielsen (2012:55), open source is ‘a general design methodology that can be applied to any project involving digital information’. It is argued here that this design methodology complements existing crowdsourced R&D theory. A sixth generation of innovation perspective might be useful in that it describes certain regularities that underlie the emergence of recent discourse around the ‘fourth industrial revolution’ (Schwab & Samans, 2016) and the ‘fourth paradigm of scientific research’ (Gray, 2009), and incorporates them into Rothwell’s (1994) theory, whose logics offer a clear conceptual rationale to aid further theory building. The present and coming productivity increases of the fourth industrial revolution, which relate to developments in artificial intelligence, biotechnology, genetics, robotics, 3D printing and nanotechnology, as well as the impact of these technologies on jobs and work systems (Schwab & Samans, 2016), are usefully incorporated into Rothwell’s model in terms of their cost and time productivity characteristics. If these technologies are ultimately able to push the cost and time dimensions of production much closer to the origin, then 6G would be expected to contribute to much more cost-effective innovation. If important societal research becomes almost costless and can also be conducted more quickly (without compromising rigour), then this could be considered to represent another generation of R&D, if these predictions are indeed reflected in developments to come. These capabilities could be particularly important for research of societal importance, which has typically been neglected by industrially oriented innovation theory (Zoo, De Vries, & Lee, 2017).

The reinvention of discovery: The rise of networked science

Nielsen (2012) categorises these new open collaborations and their technologically enabled potentialities as ‘networked science’. According to Nielsen (2012:19), the ‘reinvention of discovery is one of the great changes of our time’. For ‘historians looking back a hundred years from now, there will be two eras of science: pre-network science, and networked science’ (p. 19). He further suggests that we are currently ‘experiencing a time of transition to the second era of science’, notwithstanding the ‘possibility [that this transition] will fail or fall short of its potential’. Indeed, without the synthesis of different literatures and their incorporation into pre-existing theoretical frameworks, it will be difficult to ensure that the full potential of these ideas is realised. It is, therefore, important to bring together these different literatures and the different terminologies therein, to reduce redundancies in discourse and theory development relating to technological change and its enablement of the R&D process itself.

Importantly, others also support the notion that because of new developments in technology we are at a crossroads in the discovery process. According to Gray (2009:xv), ‘almost everything about science is changing because of the impact of information technology’, as experimental, theoretical and computational sciences ‘are all being affected by the data deluge, and a fourth, “data intensive” science paradigm is emerging’.

The point of convergence

According to Gray (2009), paradigms of science have followed a progression of change. Science developed from being primarily empirical, describing natural phenomena, to a paradigm of theoretical science based on models and generalisations, and in recent decades computational science has emerged, offering the potential to simulate complex phenomena. Today, data exploration, or eScience, seeks to unify theory, as data are either captured by instruments or simulated, and then processed by software (Gray, 2009). Unlike in previous times, Internet connectivity is now able to ‘unify all the scientific data with all the literature to create a world in which the data and the literature interoperate with each other’ (p. xv). Such a capability ‘will increase the “information velocity” of the sciences and will improve the scientific productivity of researchers’ (Gray, 2009:xv). Given the rise of the technologies described by Schwab and Samans (2016) that underlie the enablement of eScience and Internet connectivity, a radical shift in Rothwell’s cost/time innovation curve is predicted on account of these changes, with important societal implications.

The point of convergence, at which scientific data are seamlessly integrated with the literature (Gray, 2009), is taken to be an important landmark, where interconnectivity in scientific research completes ‘the circle’ of knowledge flows between the discovery process itself and all pre-existing literature. These changes echo arguments that the potentialities of big data analytics can also offer useful insights into potential innovations in the theory development process itself. Big data allows for more complete, and ultimately comprehensive, knowledge of phenomena and their interlinkages. For Kitchin (2014:4), big data analysis can capture an entire domain, offering comprehensive knowledge of the interrelationships between phenomena and heralding the emergence of a new mode of science itself. There is, thus, ultimately the possibility of ‘full resolution data’, or data which can inductively link all phenomena through universal coverage. It is this convergence principle that characterises the 6G innovation generation in the augmented Rothwell model.

Convergence and the need for new modes of theorising

Some have suggested that full data coverage will eclipse deductive approaches, a condition described by some as the ‘end of theory’, resulting in an exclusively inductive paradigm (Kitchin, 2014:5–6). Kitchin attempts to reconcile inductive and deductive methods of inquiry, offering the notion of ‘data-driven science’, which maintains the principles of the scientific method ‘but is more open to using a hybrid combination of abductive, inductive and deductive approaches to advance the understanding of a phenomenon’ (p. 6). This approach, termed holistic theoretical assemblage (Kitchin, 2014), suggests that applying these three modes of theory development to a context of increasingly accessible big data will allow knowledge gaps that exist between different branches of science to be filled. Thus, the sixth generation also calls for new forms of theory development to suit new modes of data analysis. At this nexus, discussions of theory now turn to the derivation of theoretical propositions.

Ethical consideration

This was a conceptual article and thus no ethical approval was necessary.

Derivation of propositions

Rothwell’s theory has to date incorporated neither the potential opportunities offered by big data analytics nor the implications for further theory development that derive from this incorporation. PIT suggests that what is ‘not known’ can be considered to represent a problem landscape, or a problem space (Callaghan, 2017), and that historical R&D systems have failed to solve certain important societal problems because this problem space has not to date been ‘populated’ with the necessary volumes of problem solvers. From this literature, the following proposition is derived:

Proposition 1: Cost and time efficiencies in R&D predicted by 6G innovation theory are ultimately a probabilistic function of the extent to which problem spaces are populated by problem solvers.
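
Callaghan (2017) does not formalise PIT in this way; the following is a minimal illustrative sketch of the probabilistic logic behind Proposition 1, assuming (hypothetically) that each of N independent problem solvers has the same small per-period probability p of solving a given problem, so that the probability of at least one solution is 1 - (1 - p)^N and rises rapidly as the problem space is populated.

```python
# Minimal illustrative model of Proposition 1 (hypothetical assumptions, not PIT's formal specification):
# with N independent solvers, each with per-period success probability p,
# P(at least one solution) = 1 - (1 - p)^N, which rises quickly as the problem
# space is populated with more solvers.

def probability_of_solution(n_solvers: int, p_individual: float) -> float:
    """Probability that at least one of n_solvers solves the problem in a given period."""
    return 1.0 - (1.0 - p_individual) ** n_solvers

P_INDIVIDUAL = 0.001  # hypothetical per-solver success probability

for n_solvers in (10, 100, 1_000, 10_000):
    print(f"{n_solvers:>6} solvers -> P(solution) = "
          f"{probability_of_solution(n_solvers, P_INDIVIDUAL):.3f}")
```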

This proposition is taken to derive from the augmented Rothwell framework, and it also echoes PIT’s prediction that the uncertainty of the relationships between R&D inputs and outputs will in time reduce dramatically. Certain important implications derive from this proposition, not least of which is the prediction that risk in the R&D process can be radically reduced over time. If inequality in the outcomes of scientific discovery is primarily a consequence of its high costs (which prioritise the needs of wealthier populations), then a dramatic reduction in risk is expected to accompany decreases in the cost of required investments. The following proposition is therefore proposed:

Proposition 2: Cost and time efficiencies in R&D predicted by 6G innovation theory will result in more equitable outcomes in the discovery process.

These proposed effects flow from Rothwell’s augmented theory as a logical function of the dynamics of the time/cost curve. It must be acknowledged that these discussions are conducted at a certain level of abstraction, and a full discussion of the micro-level mechanisms and processes that operationalise this augmented theory is beyond the scope of this article. Nevertheless, it is important to identify further literature that supports the underlying assumption of these propositions, namely that powerful cost and time efficiencies can be attained through the application of technology to the research process itself. The value of these propositions lies in their ability to focus further theory development and research on these time/cost dynamics.

In terms of changes in scientific research processes, a movement that has come to light is that of citizen science. Arising, interestingly enough, from research methods applied to ornithology, citizen science extends participation in scientific research to members of the population, or citizens (see Bonney et al., 2009, 2014 for a useful summary of this). This body of literature has also emerged since Rothwell’s original conceptualisation, providing an important supplementary theoretical framework in that it provides insights into how open collaborations can be extended across populations to achieve scale economies in data collection as well as analysis. Importantly, this body of literature suggests ways in which problem spaces can be ‘populated’ with large numbers of people. These conceptions, therefore, offer a useful complement to Rothwell’s (1994) framework, helping to bring it up to date with recent developments in such a way as to clarify how (and where) further research can build on it.

Those across different fields can benefit from theory that shows how to shift the time/cost curve by harnessing the potential of big data combined with the connectivity advantages of linking large numbers of experts. According to Nielsen (2012:13), scientists across fields are increasingly collaborating online, as they are, ‘piece by piece, assembling all the world’s knowledge into a single giant edifice’, thereby accelerating the rate of scientific advancement. Nielsen’s (2012) vision of change in the discovery process itself echoes that of Kitchin (2014), whereby more complete data coverage guided by theory development can result in more complete knowledge, which can radically reduce the problem space, even that related to societally important problems. Innovations in the method of scientific discovery itself are typically unlike others, in that they can have a considerable impact across scientific fields. Indeed, according to Nielsen (2012:12), ‘the process of science – how discoveries are made – will change more in the next twenty years than it has in the past 300 years’ on account of the dynamics reflected in certain practical examples, one of which is the Polymath project. Such examples have a definitional role, in that they may usefully illustrate, in real-life terms, the emerging phenomenon of 6G innovation. On the basis of this literature and the predictions of networked science, the following proposition is derived:

Proposition 3: Cost and time efficiencies in R&D predicted by 6G innovation theory now offer the potential for near real time research productivity.

At this point in the discussion, Rothwell’s augmented theory still lacks a complementary framework that relates to the assumptions underlying the dynamics of collaboration. With regard to human population growth, Hardin (1968) invokes the notion of the tragedy of the commons, using the example of a herdsman for whom the utility of adding an animal to a herd is much greater than the negative utility associated with overgrazing (which is shared by all of the herdsmen).

Whereas the tragedy of the commons can be averted by the allocation of private property rights, morality is system-specific, according to Hardin. If failure to share data, or knowledge in the research process (up to publication, or even after) benefits the individual, and the negative utility, or costs of not sharing (or not collaborating) are lower because they are shared by all of society, then the assumption that researchers will typically share knowledge or collaborate requires closer scrutiny.
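
Hardin’s (1968) argument can be restated with simple payoff arithmetic; the numbers and variable names in the sketch below are hypothetical illustrations applied to data sharing, not drawn from Hardin or from the data-sharing literature. The private gain from withholding data accrues entirely to one researcher, while the cost of non-sharing is spread across the whole community, so the individually rational choice diverges from the collectively optimal one.

```python
# Hypothetical payoff arithmetic in the spirit of Hardin (1968), applied to data sharing.
# The private gain from withholding data accrues to one researcher; the cost of
# non-sharing is spread across the whole community, so withholding looks rational
# individually even when it is collectively harmful.

PRIVATE_GAIN_FROM_WITHHOLDING = 1.0     # e.g. priority/publication advantage (hypothetical units)
COMMUNITY_COST_OF_WITHHOLDING = 50.0    # total cost to science of the withheld data (hypothetical)
COMMUNITY_SIZE = 1_000                  # number of researchers sharing that cost

individual_net = PRIVATE_GAIN_FROM_WITHHOLDING - COMMUNITY_COST_OF_WITHHOLDING / COMMUNITY_SIZE
collective_net = PRIVATE_GAIN_FROM_WITHHOLDING - COMMUNITY_COST_OF_WITHHOLDING

print(f"Individual net payoff from withholding: {individual_net:+.2f}")   # positive: withholding 'pays'
print(f"Collective net payoff from withholding: {collective_net:+.2f}")   # negative: society loses
```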

The problem with sharing data is that it is not typically rewarded or incentivised. The example of the Bermuda Agreement in 1996 offers useful insights into how the agreement to share all human genetic data stood to benefit science as a whole considerably, and how grant agencies supported this agreement, resulting in an important turning point for scientific discovery (Nielsen, 2012). Therefore, according to Nielsen, the incentives that underlie progress, as evidenced by the success of the Bermuda Agreement, are key to achieving changes in the discovery process itself. Proposition 4 is, therefore, derived:

Proposition 4: Incentive structures that support research collaboration are a necessary condition for the change in the discovery process itself predicted by 6G innovation theory.

Under conditions of increasing technological change and uncertainty, R&D theorists and practitioners need to rely on theory to understand the patterns, or regularities, underlying these changes, and to be able to forecast their impact. Without such forecasting, it is difficult to take advantage of these changes to improve R&D practice.

Discussions to this point have sought to identify deficiencies in Rothwell’s descriptive framework and to draw together literature to address these weaknesses. The augmented form of Rothwell’s schema, as developed here, might offer useful insights for theory and practice, as it is now possible to draw out implications for society. This schema now forms the basis of a nascent theoretical frame, as it links time and cost logics to new opportunities associated with novel technologies.

Given that the second objective of this article is to differentiate between certain societal impacts predicted by Rothwell’s augmented theory, it is necessary to first differentiate between three different scenarios, namely more favourable, more problematic, and more probable societal outcomes. Four core propositions were derived in the earlier sections; these propositions are now used as core tenets of Rothwell’s augmented schema to order the importance of societal implications, and as a heuristic to prioritise key challenges facing the successful application of the framework (Figure 2). These implications are now considered.

FIGURE 2: Rothwell’s augmented theory: Key societal challenges.

Utopian societal outcomes

At the extreme, a sixth generation of innovation, associated with scalable research outcomes, or radical increases in the cost and time efficiencies of R&D, may result in research projects with the characteristics of the examples highlighted by those such as Nielsen (2012). Projects that demonstrate these characteristics already exist, and reverse engineering of their micro processes reveals the mechanisms through which they have achieved economies of scope and scale. These processes of open science take advantage of open innovation principles. The human genome and HapMap projects provide evidence of the importance of the change in science towards open science, and, as noted earlier, Nielsen (2012:55) regards open source as ‘a general design methodology that can be applied to any project involving digital information’.

PIT predicts that it is only a matter of time until technology provides the capabilities necessary to populate problem spaces sufficiently to attain near real time research productivity. As Nielsen (2012) suggests, this developing body of theory predicts a world in which cancer can be cured, or other societally important problems solved, in weeks instead of decades or centuries. Large open source projects are able to focus the efforts of large numbers of expert and non-expert problem solvers on societally important problems, also providing a more effective and quicker response to disasters. Such improved responsiveness to societally important problems may herald a more ethical approach to biomedical disasters (Fenton, Chillag, & Michael, 2015), in that the current system may be unethical in its slow response to these problems.

The example of Linux offers a perspective on certain principles that other similar projects build on (Nielsen, 2012), whereby open source collaboration can capture large scale effects. According to a utopian perspective on the societal implications of Rothwell’s augmented theory, there are few negative societal implications of the proliferation of research methodologies that can attain economies of scale. A utopian perspective on these changes suggests a world in which thousands of experts are efficiently linked in real time to solve important societal problems. The successful application of these methods to biomedical research, for example, might be taken to achieve in days and weeks what could previously only be achieved in years, decades or even centuries. More affordable and effective R&D processes could radically reshape human health outcomes, and reduce the burden of health costs, as well as power inequalities in the distribution of R&D outcomes, as the information explosion of past decades is supplanted by a knowledge explosion enabled by the quest for economies of scale across R&D contexts.

In short, a utopian interpretation of the implications of this body of theory predicts radical improvements in societal problem solving, and a new era of improved healthcare and a better quality of life for everybody. Having briefly considered the utopian perspective, its dystopian alternative is now also considered.

Dystopian societal outcomes

As with all important societal changes brought by technological change, it is important to consider the dystopian potential of this body of theory. There are certain scenarios in which societies might not end up better off with technological change (see Callaghan, 2018 for a more comprehensive discussion of these).

Firstly, it is possible for this theory to be put to use by groups that do not reflect the broader society. If elites were able to harness the potential of these methods and keep the benefits to themselves, without allowing the majority of society to benefit, then this would increase power asymmetries in society. Similarly, if national boundaries differentiate access to these methods, then inequality between nation states might be exacerbated.

Secondly, these methods might be applied to create R&D outcomes that can be destructive. It is not inconceivable that certain nation states might seek to weaponise outcomes using these methods. However, the use of large groups of people in the open collaborations necessary to achieve these economies of scale might in itself be a check on negative developments.

Thirdly, with such powerful mechanisms becoming formalised, crowdsourced R&D methods might be used indiscriminately, without ethical research oversight. It is important to develop ethical frameworks that are robust to accelerated R&D protocols. There already exist longstanding technological risks in the form of genetics, nanotechnology and robotics (Joy, 2000), and the use of these methods can accelerate these risks. As with all powerful technologies and discovery methods with the potential to greatly benefit societies, oversight is important. Further research should seek to develop ethical frameworks for R&D in anticipation of these coming changes.

According to a dystopian scenario, therefore, the mobilisation of large numbers of participants might face substantial difficulties, and the examples discussed here may largely be exceptions rather than the rule. Under the control of elite groups and nations, these methods might result in increasing inequality in R&D outcomes. For a variety of reasons, near real time R&D might simply not be possible, or, if it is, it might only benefit the most powerful few and lead to the reinforcement of existing power. It also needs to be acknowledged that the incentivisation of collaboration and of data and knowledge sharing across global platforms might prove to be impossible, as individuals, groups, firms or nations may not give up valuable knowledge.

Most likely outcomes

Although utopian and dystopian scenarios predict very different outcomes, there are certain outcomes which might be predicted with more certainty. Rothwell’s augmented theory predicts how time and cost efficiencies can be achieved in R&D, and how technological change can be harnessed in support of these ends. Key to achieving these efficiencies is the challenge of mobilising sufficient problem solving inputs to solve important knowledge problems. Project participation akin to Galaxy Zoo’s more than 200 000 participants may not be easily achievable, but to draw this learning together it might be necessary to formalise Rothwell’s augmented theory and to derive a formal methodology from it. Crowdsourced R&D has already been tasked with seeking scale and scope economies in R&D as a methodological approach. Further research might usefully build on this work to deepen crowdsourced R&D as a scientific methodology.

Power relationships in R&D might presently favour wealthier markets, as the uncertainty associated with R&D investments biases decisions in favour of those with more promising returns. The extent to which the theoretical principles and attendant methods proposed here are able to improve the lot of society is perhaps dependent on the extent to which these ideas become formalised and are taken into the scientific system as a complement to existing research methods. These proposed methods are perhaps not well suited to replacing other scientific methods, but rather to creating new value by complementing existing scientific systems. Arguably, reducing the discovery time and costs of R&D is possible, as evidenced by Nielsen’s (2012) examples of how discovery systems are currently being reshaped to take into account principles of economies of scale. Social justice may be better served by lower cost and quicker R&D processes that can serve societal needs more efficiently and effectively. The most likely scenario is that these advances will take longer than expected to manifest, but that when they do, their influence will quickly be felt across the sciences.

The goal of near real time R&D might simply not be attainable at present, but by focusing research efforts on this goal, Rothwell’s framework may be useful, and it is surely just a matter of time until incremental advances towards this goal are realised.

Conclusion

The first objective of this article was to conceptually engage with a typology of innovation relating to production in industrial contexts and to link this typology to more recent literature that relates the rise of novel technologies to the phenomenon of near real time research problem solving. The second objective of the article was to derive certain societal implications from this augmented framework. These societal influences were differentiated according to three different categories of potential outcomes, namely those most beneficial, those most harmful and those most probable.

This article sought to contribute to the literature in the following ways. Firstly, the article contributes through applying Rothwell’s (1994) descriptive theory of industrial production to the research production process itself (conceptualising research production as industrial production). This approach was taken to offer useful insights into how R&D can be reconceptualised in terms of the twin goals of time and cost efficiency, in support of an explicit goal of near real time research productivity. These insights are considered to suggest a useful perspective on how to achieve more effective and efficient research production, with implications for real time research problem solving, without compromising on scientific rigour. Secondly, the article incorporated different literatures and made explicit the way these bodies of literature complement Rothwell’s theory. Examples were drawn from the literature to illustrate the arguments made, and testable propositions were derived for further research. Thirdly, societal implications of the augmented body of theory were identified and discussed, thus making explicit certain potential opportunities and threats associated with the application of this framework.

Further research should test the propositions and theoretical principles outlined here, and render a test of this body of theory across different contexts. The attainment of near real time research capability, and its application to solve important societal problems, may ultimately be dependent on the successful development of a body of theory that can support applications to practice, across different fields of research.

Acknowledgements

Competing interests

The author declares that he has no financial or personal relationships which may have inappropriately influenced him in writing this article.

Author’s contributions

I declare that I am the sole author of this research article.

Funding

This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Data availability statement

Data sharing is not applicable to this article as no new data were created or analysed in this study.

Disclaimer

The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy or position of any affiliated agency of the author.

References

Bonney, R., Cooper, C. B., Dickinson, J., Kelling, S., Phillips, T., Rosenberg, K. V., & Shirk, J. (2009). Citizen science: A developing tool for expanding science knowledge and scientific literacy. BioScience, 59(11), 977–984.

Bonney, R., Shirk, J. L., Phillips, T. B., Wiggins, A., Ballard, H. L., Miller-Rushing, A. J., & Parrish, J. K. (2014). Next steps for citizen science. Science, 343(6178), 1436–1437.

Brynjolfsson, E., & McAfee, A. (2014). The second machine age: Work, progress, and prosperity in a time of brilliant technologies. New York: W.W. Norton & Company.

Callaghan, C. W. (2017). Scientific real-time research problem-solving and pharmaceutical innovation. African Journal of Science, Technology, Innovation and Development, 9(4), 425–435.

Callaghan, C. W. (2018). Surviving a technological future: Technological proliferation and modes of discovery. Futures, 104, 100–116.

Fenton, E., Chillag, K., & Michael, N. L. (2015). Ethics preparedness for public health emergencies: Recommendations from the Presidential Bioethics Commission. The American Journal of Bioethics, 15(7), 77–79.

Gray, J. (2009). Jim Gray on eScience: A transformed scientific method. In T. Hey, S. Tansley & K. Tolle (Eds.), The fourth paradigm (pp. 17–31). Redmond, WA: Microsoft Research.

Hardin, G. (1968). The tragedy of the commons. Science, 162, 1243–1248.

InnoCentive. (2018). Innovate with InnoCentive. Retrieved from https://www.innocentive.com/.

Joy, B. (2000). Why the future doesn’t need us. Wired. Retrieved from https://www.wired.com/2000/04/joy-2/.

Kitchin, R. (2014). Big data, new epistemologies and paradigm shifts. Big Data & Society, 1(1), 1–12.

Loebbecke, C., & Picot, A. (2015). Reflections on societal and business model transformation arising from digitization and big data analytics: A research agenda. The Journal of Strategic Information Systems, 24(3), 149–157.

Nielsen, M. (2012). Reinventing discovery. Princeton, NJ: Princeton University Press.

Reynolds, G. (2006). An army of Davids: How markets and technology empower ordinary people to beat Big Media, Big Government, and other Goliaths. Nashville, TN: Thomas Nelson.

Rifkin, J. (2011). The third industrial revolution: How lateral power is transforming energy, the economy, and the world. New York: Macmillan.

Rothwell, R. (1994). Towards the fifth-generation innovation process. International Marketing Review, 11(1), 7–31.

Schwab, K. (2017). The fourth industrial revolution. London: Portfolio Penguin.

Schwab, K., & Samans, R. (2016). The future of jobs. Retrieved from http://englishbulletin.adapt.it/wp-content/uploads/2016/01/WEF_Future_of_Jobs_embargoed.pdf.

Wallace, M. L., & Ràfols, I. (2018). Institutional shaping of research priorities: A case study on avian influenza. Research Policy, 47(10), 1975–1989.

Zoo, H., De Vries, H. J., & Lee, H. (2017). Interplay of innovation and standardization: Exploring the relevance in developing countries. Technological Forecasting and Social Change, 118(C), 334–348.


 
