
MSc Brain and Cognitive Sciences, Track Cognitive Neuroscience

Literature thesis

Challenging the current publication culture:
The phenomenon of publication pressure within the scientific work hierarchy

by Lara Engelbert (11276789)
December 2017
12 EC, 29.09.2017 – 22.12.2017
Leonid Schneider, 2017

Supervisor/Examiner: Dr B.U. Forstmann
Co-assessor: Dr M.J. Mulder


Contents

Abstract
Introduction
Method
The emergence of publication pressure
    Statistics
    Scientific Organization
    Competition in Science
    Journals
    Publication Bias
    Funding Resources
Consequences of publication pressure
Consequences of Publication Pressure across different career stages
Solutions
    Open Science Network
    Peer-reviewing
    Education
    Scientific Organization
Reflection
Conclusion
Literature


Abstract

For decades, the scientific community has been aware of the issue of publication pressure. Although publication pressure is widely discussed by scientific journalists and bloggers, not all levels within the scientific hierarchy are involved equally in this discussion. The present literature thesis examines the phenomenon of publication pressure throughout the scientific work hierarchy and proposes possible solutions to this problem. Core underlying causes, such as the use of a scientist's publication list as an objective measure of the quality of scientific work or the influence of high-impact journals, as well as the main consequences of publication pressure, for example maladaptive research practices and the replication crisis within science, are discussed. In addition, different solutions for how science can deal with publication pressure are presented. Creating an open science culture within academia might encourage young scientists to speak out more, eventually improving common research practices, funding mechanisms and publication methods and, in the long run, reducing publication pressure.


Introduction

The scientific world confronts researchers with multiple challenges, and publication pressure can be considered one of the most pressing. In a recent study, 72% of the interviewed researchers rated publication pressure as too high (Tijdink, Verbeke & Smulders, 2014). Publication lists are often used as an indicator of a scientist's performance (Rawat & Meena, 2014). Therefore, many scientists, especially young scientists at the beginning of their careers, feel pressured to 'collect' as many publications as possible to improve their CVs. In addition, competition among scientists has increased in recent years. Consequently, publishing in so-called "high-impact" journals has become more important (Albert et al., 2014). These journals are commonly perceived as the best journals within a specific research field (Hoeffel, 1998). The motive for publishing, as well as the reason for experiencing publication pressure, may differ at different career stages. However, the awareness of publication pressure in science is ever-present throughout all stages. Even students at the beginning of their careers are confronted with publication pressure: "I think that publication pressure is a big problem, but we should work towards a system of informative research questions and not towards research results only, by using pre-registration, and pre-submission and through decreasing the influence of journals." (Jurriaan te Koppele, MSc student Research Master Psychology, University of Amsterdam).

Moreover, publication pressure has serious consequences for the quality of published scientific work. For example, Rawat and Meena (2014) state that many articles are never cited in other papers. Hence, those papers are either not appreciated by peers or simply not that important in the specific research area. In addition, science suffers from a replication crisis, which not only affects scientific integrity but also has financial and societal repercussions (Steckler, 2015). This underlines the importance of reviewers, as well as of intellectual honesty as a basic principle shared by scientists at all stages of their careers. Ultimately, it highlights the need to discuss publication pressure and to work towards solutions for this controversial issue in science. In particular, an examination of how publication pressure affects scientists at different stages of their careers is needed.

However, the present thesis is not aimed at accusing any scientist of deliberately practicing bad science, nor do I want to set myself up as an upholder of moral standards. Rather, I want to shed light on the problem of publication pressure, focusing mainly on how publication pressure is experienced at the different stages of a scientific career and how these stages may influence the issue.

First, statistics about publication rates and how they developed over the last century will be presented. Next, I will investigate the emergence of publication pressure and the awareness of this problem in academia. I will then reflect on the consequences of publication pressure for science. For this purpose, I will outline how scientists at different stages of their careers experience, approach and think about publication pressure. This includes (undergraduate) students, PhD students, postdocs as well as senior scientists. In addition, I will elaborate on the issue that different career stages are not represented equally in the discussion about publication pressure. Finally, I will suggest concrete ideas about how academic staff could approach publication pressure and work towards solutions.

Method

For the present thesis, a literature search was conducted on Google Scholar and the UvA Library Catalogue Plus using the following key words: "publication pressure", "competition in science", "publication pressure in science", "publication pressure in neuroscience", "publication bias neuroscience", '"publication pressure" and "competition in neuroscience"', '"publication pressure" and "competition in science"', "solution publication pressure in neuroscience", "open science neuroscience", "publish or perish", '"publish or perish" and "neuroscience"', and "statistics publication rate neuroscience". In addition, the problem of publication pressure was divided into categories and sub-categories: emergence of publication pressure (publication bias, competition in science, funding resources, statistics), consequences of publication pressure, and solutions. Concept maps were developed and designed in Lucidchart (Lucid Software) to gain a better overview of how publication pressure emerges out of the multiple factors discussed in the literature and how these lead to the serious consequences (neuro)science currently faces. Based on these concept maps, possible solutions for reducing publication pressure were elaborated. Finally, the separate parts were combined into a framework of publication pressure consisting of emergence, consequences and proposed solutions, presented in a table, on which I reflect in the final part of the present thesis from a more philosophical point of view.


The emergence of publication pressure

Statistics

When discussing publication pressure, it is interesting to look at the number of publications per individual author and how this relates to the phenomenon of publication pressure. In 2006, approximately 1,350,000 scientific articles were published, of which 4.6% were immediately openly accessible and a further 3.5% became openly available about a year later (Bjork, Roos & Lauri, 2009). Prompted by increasing concerns that publication pressure reduces the quality of science, Fanelli and Larivière (2016) investigated whether the productivity of scientists, measured as the number of papers published, increased over the last century. The authors analysed the publishing patterns of more than 40,000 scientists between 1900 and 2013 using the Web of Science platform, including only researchers who published two or more papers within 15 years. Although the authors report that the number of publications by scientists in their early career stages, i.e. within the first 15 years of their careers, has increased, as has the number of co-authors per publication, they did not find an overall increase in the number of publications per individual scientist over the last century (Fanelli & Larivière, 2016). On the other hand, Larsen and Von Ins (2010) studied the growth rate of published scientific work from 1907 to 2007. The authors conclude that traditional scientific publishing is still increasing, although other ways of publishing, like open-access platforms, home pages and conference proceedings, are becoming more popular among researchers (Larsen & Von Ins, 2010). It therefore seems that the overall number of publications is still increasing (Larsen & Von Ins, 2010), whereas the productivity of individual scientists did not change over the last century (Fanelli & Larivière, 2016). However, it is important to note that the publication rate of scientists at early stages of their careers has increased (Fanelli & Larivière, 2016), which might be due to factors such as publication pressure.

Scientific Organization

First, it is important to outline how the issue of publication pressure emerged in science. Why do scientists feel the pressure to publish their work? In the following paragraphs, different motives of scientists and structures within the scientific organization that eventually lead to the experience of publication pressure are analysed.


In 1989, Wheeler, at the time a senior tutor at the department of physiology and pharmacology at the University of Queensland (Australia), published an article in the magazine "The Scientist" concerning the issue of publication pressure and its consequences for science. This shows, first of all, that publication pressure is not a problem that emerged recently; awareness of this topic has been present for 28 years already. Wheeler (1989) describes prestige, financial support and the length of the personal publication list, used as a measure of a 'scientist's worth', as factors that eventually lead to the experience of pressure to publish. In particular, using an individual scientist's publication list in combination with the number of citations as an objective measure is common practice in academia. Based on these figures, important career decisions, such as promotion or even position security, are made (Hackett, 1990). This method of judging the quality of scientific work, and even the quality of a scientist, has not changed over the years. More recently, Hartshorne and Schachner (2012) state that there are three primary criteria for assessing scientists and their research work: the number of publications, the number of citations (in line with Hackett (1990) and Wheeler (1989)), and the impact factor of the journals in which the scientist has published.

In addition, the journalist Jack Grove writes in his article about publication pressure in India that especially PhD students experience such pressure, because they must publish at least three papers before they can finish and defend their final thesis (Grove, 2017); otherwise, they are at risk of losing their positions. This also applies to PhD students in the Netherlands, who need to publish a certain number of papers within a given period (e.g. one paper per year during a four-year Veni grant research project).

Recently, Hangel and Schmidt-Pfister (2017) interviewed 91 researchers working in the humanities, social and natural sciences at different stages of their careers about their motives to publish. Their findings indicate that motives may differ according to the career stage of a scientist. The authors claim that PhD students as well as postdocs experience publication pressure because they want to advance their careers. At the same time, based on their interviews, they conclude that scientists across all career stages want to publish because they want to contribute to the overall gathering of scientific knowledge (Hangel & Schmidt-Pfister, 2017). Their research is in line with, for example, Grove (2017), since the authors also claim that PhD students feel pressured because they need to comply with the requirement of publishing a certain number of articles within a certain period of time. Moreover, Hackett (1990) interviewed several researchers and found that these scientists all noticed certain pressures due to promotion, small budgets and competition with other researchers. Considering that Hackett (1990) conducted those interviews almost 30 years ago, it becomes obvious that many of the underlying motives of publication pressure have not changed over the last decades.

Furthermore, van Dalen (2012) describes another underlying cause of publication pressure. According to the author, appreciation by other scientists serves as an extrinsic motivator for researchers in all fields of academia to publish. He specifies that the acknowledgment and attention that can be gained through publications contribute to an attitude of publishing "like hell" (van Dalen, 2012, p.1). In addition, van Dalen (2012) emphasizes that the number of publications is still used as the only indicator of a scientist's worth. This is in line with Wheeler (1989) and Hackett (1990), and shows that some of the motives strongly associated with publication pressure have remained the same over time.

The organizational nature of science might thus be crucial for the emergence of publication pressure. This includes the influence of funding resources, career options and the impact of journals, which will be analysed in the following sections. The factors discussed so far (financial issues, the use of the number of publications as an objective measure of quality, the will to contribute to the overall gathering of knowledge, appreciation by other researchers, and the impact of journals) may form the first elements in a line of reasoning that eventually leads to the burden of publication pressure.

Competition in Science

Another important factor which might underlie publication pressure is the competition among scientists and research groups. Already in 1990, Hackett stated that competition in science was increasing, especially concerning research funding and efficiency in producing results. More recently, Fochler, Felt and Müller (2016) analysed how PhD students and postdocs working in the life sciences in Austria experience and plan their future in science. The authors conclude that excessive competition among scientists and the quantitative measures used to evaluate a scientist's quality are displacing innovation and quality of research. Once again, the literature shows that the factors facilitating the emergence of publication pressure have not changed over time (Hackett, 1990; Fochler, Felt & Müller, 2016). In particular, Fochler, Felt and Müller (2016) point out that this development towards a more competitive academic environment endangers the future of young scientists, due to the use of maladaptive valuation practices. This is in line with Hackett (1990), who states that the development of new practices due to increasing competition among scientists eventually leads to a reduction in the educational benefit for young scientists. Fochler, Felt and Müller (2016) link this to, for example, the pressure to publish in prestigious journals. In addition, Albert et al. (2014) explain that publishing in those high-impact journals has become more important and, at the same time, more difficult. The influence of these prestigious, well-known journals will be discussed in more detail later.

Furthermore, the authors specify that the current publication mechanisms lead to more pressure, destructive collaborations in laboratories and poor research practices (Fochler, Felt & Müller, 2016). In addition, Hagstrom (1974) interviewed scientists and found that competition among scientists shifts research priorities and increases secrecy; furthermore, 38% of his sample (n = 1947) stated that careless publication is a serious problem within science. The so-called replication crisis is most likely a symptom of this increasing competition and of careless publication practices, driven by the high pressure to publish in high-impact journals and by poor research practices used to produce results that are as fast and as significant as possible. Moreover, Anderson et al. (2006) examined focus-group discussions in order to analyse how scientists experience competition in science. The authors conclude that competition in science leads to negative outcomes associated with a decline in open communication about information and methods or, even worse, with sabotage and poor research methods. For example, a group of postdocs was asked to explain the personal motives that foster a competitive attitude within science. Their answers clearly focused on the pressure to publish, which they linked to obtaining grants and to publishing in prestigious journals (Anderson et al., 2006). Similar statements can be found in a scientific blog post by Landhuis (2015), who describes the experience of the biologist Lawrence Rajendran. The biologist expresses his doubts about the overall importance awarded to publications and the value of publishing in prestigious journals. On the other hand, a postdoc explained in the Anderson et al. (2006) study: "I think the pressure to publish can cause people to sort of cave in and publish a lesser study in a lesser journal than a better study in a better journal, just because they need the numbers. […] It's just basically the numbers." (Anderson et al., 2006, p. 446). In my opinion, this shows that publication pressure and the importance of the personal publication list, which is instrumentalized as an essential driving factor for scientific careers, can lead to poor research practices. Moreover, the statements of the different scientists reveal that publication pressure is a
two-sided phenomenon. On the one hand, scientists feel the pressure to publish in prestigious journals; appreciation by other scientists, grants and career advancement might be the main underlying motivational factors here. On the other hand, the personal publication list as an indicator of the quality of a scientist's work remains a motive to comply with the pressure to publish as much as possible. Thus, in order to increase the number of publications, scientists might be willing to publish in a less influential journal. The sheer number of publications therefore remains a quantitative standard and motivation for scientists, contributing to the burden of publication pressure.

Additionally, Fang and Casadevall (2015) state that scientific work has always been competitive, but never as competitive as today, since competition for positions and funding resources has increased enormously. This is in line with Hackett (1990) and, more recently, with Flaherty (2017), who describes how competition for positions in science already burdens undergraduate students, for example in the search for internships and PhD positions. Hong and Walsh (2009) examined competition in science from another point of view. According to the authors, science suffers from an increasing commercialization, which leads to less cooperation among scientists and a more secretive attitude (Hong & Walsh, 2009). This increasing commercialization of science can be linked to the influence of prestigious journals and funding resources as described by, for example, Fochler, Felt and Müller (2016). This, in turn, shapes the underlying motivations of scientists, e.g. prestige and more financial support, as already mentioned by Wheeler in 1989.

Journals

Regarding the commercialization of science, Young, Ioannidis and Al-Ubaydli (2008) published an article in which they compare economic mechanisms to scientific publication practices. According to the authors, the scientific and the economic system differ in several respects, e.g. payment and the stakeholders involved. On the other hand, Young, Ioannidis and Al-Ubaydli (2008) emphasize that both systems share the goal of transferring a product (in the case of science, research results) from producers (the researchers) to consumers (other researchers, stakeholders, society). The authors model science according to economic processes and identify several disruptive mechanisms present in science. Among those, the most important might be that a few high-impact journals, which publish only a small number of articles every year, modulate the visibility of science (Young, Ioannidis & Al-Ubaydli, 2008). In addition, they explain that researchers may be influenced by those publication practices in a maladaptive way. In particular, scientists might follow research paths suggested in those journals without critically assessing them, even if they use poor research methods, just because the research was published in a high-impact journal.

As stated by Hartshorne and Schachner (2012), how the work of an individual scientist is assessed by the academic community strongly depends on the impact factor of the journal in which the results are published. Eyre-Walker and Stoletzki (2013) analysed how assessors judged publications and conclude that the assessor score strongly depends on the journal in which the assessed article was published. Moreover, the authors explain that assessors showed a tendency to overrate a publication when it was published in a high-impact journal. This is not surprising, since the quality of a journal is assessed by its impact factor, which in turn is used as an objective measure of the individual performance of a scientist (Chang, McAleer & Oxley, 2011). For example, the Research Excellence Framework (REF) is used in the United Kingdom to assign research funding to suitable candidates. Among other criteria, it is based on the impact factor of the journals in which a scientist has published (Eyre-Walker & Stoletzki, 2013). This highlights the strong interconnection between the factors that lead to the emergence of publication pressure, i.e. publishing in high-impact journals to secure funding and career promotion.
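
For reference, the journal impact factor invoked throughout this section is, in its standard two-year form, conventionally defined as follows (a textbook definition added here for orientation, not a formula taken from the sources cited above):

\[
\mathrm{JIF}_{Y} \;=\; \frac{C_{Y}(Y-1) + C_{Y}(Y-2)}{N_{Y-1} + N_{Y-2}},
\]

where \(C_{Y}(Y-k)\) is the number of citations received in year \(Y\) by items the journal published in year \(Y-k\), and \(N_{Y-k}\) is the number of citable items the journal published in year \(Y-k\). The definition makes plain that the metric describes a journal-level average over a short citation window, not the merit of any individual article or author.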

However, the impact factor as a measure of the quality of a scientist's work is a poor instrument, prone to errors, and research has shown that scientists are poor at assessing the merit of a publication, which is partly due to the importance assigned to the impact factor of a journal (Eyre-Walker & Stoletzki, 2013). In addition, the importance of publishing in a high-impact journal to obtain new research funding, to get promoted and to ensure the survival of one's own research group increases the publication pressure experienced at all levels of academia. Already two decades ago, Hoeffel (1998) stated that the impact factor of a journal is far from being a perfect instrument to assess the quality of research. At that time, the author argued that there was simply no better method and that, since this method already existed, it could be considered sufficient. However, I strongly disagree with this statement. The lack of better alternatives should not mean that the scientific community takes the line of least resistance, especially when awareness of this issue has existed in science for decades. In addition, there are nowadays multiple alternatives for publishing and for assessing the quality of a publication, which will be discussed later.


Publication Bias

The influence of high-impact journals concurs strongly with the phenomenon of publication bias; together, these two factors strongly facilitate the emergence of publication pressure. Publication pressure is thus associated with publication bias, which is defined as the tendency to withhold negative results from publications in order to increase the chances of publishing in a prestigious journal, being recognized and cited by peers, and securing future funding (Joober, Schmitz, Annable & Boksa, 2012). The suggestion that publications in science are not representative of all scientific work was already made in 1995 (Sterling, Rosenbaum & Weinkam, 1995). Sterling, Rosenbaum and Weinkam (1995) further specify that publication bias, and the scientific practices that enhance such a bias towards positive findings, had not changed during the preceding three decades. Moreover, publishing only positive results is also favored by journal editors, since they compete with each other for citation indexes and must guarantee the financial survival of their journals. Sterling, Rosenbaum and Weinkam (1995) illustrate this by quoting an editor of an environmental journal: "The manuscript is very well written and the study was well documented. Unfortunately, the negative results translate into a minimal contribution to the field. We encourage you to continue your work in this area and we will be glad to consider additional manuscripts that you may prepare in the future." (p.109).

Similar reasons underlie the phenomena of publication pressure and publication bias, i.e. high competition among research groups as well as funding and career options (Joober, Schmitz, Annable & Boksa, 2012). However, publication bias endangers the education of young scientists, since they might adopt a publication attitude guided by publication bias in order to promote their career options. Moreover, publishing only positive results will eventually lead to a serious distortion of the literature, impede progress in science, since no one will know about negative results in a research field, and encourage misguided and harmful research practices. In addition, this causes a waste of time and financial resources: scientists might, for example, conduct the same research that was already done by a different research group but never published because of negative results. Pursuing novel ideas, conducting research that is independent of prominent ideas and choosing an innovative research path are thereby endangered (Young, Ioannidis & Al-Ubaydli, 2008).

From another point of view, the flexibility of research designs has increased over the last years, whereas other aspects of scientific work, e.g. the adjustment of sample sizes for the sake of statistical power, have been neglected (Button et al., 2013). There is thus a need to improve reproducibility in neuroscience, since the replication crisis is a serious issue across all scientific fields (Alberts et al., 2014). According to Button et al. (2013), attention needs to shift from the excessive reporting of positive results, i.e. rejections of the null hypothesis, often referred to as publication bias (Joober, Schmitz, Annable & Boksa, 2012), towards sound methodological principles. The authors recommend that scientists handle their methods and findings in a transparent way, pre-register their designs and analysis plans, and make their data and materials available to other researchers (Button et al., 2013). These suggestions are not only applicable to the issue of publication bias but are also suitable for easing the burden of publication pressure, and they will be discussed later.
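
To make concrete why underpowered studies combined with the selective reporting of positive results distort the literature, the following small simulation may help. It is an illustrative sketch only, not an analysis from any of the cited studies; the true effect size, group size, number of studies and the 5% significance threshold are arbitrary assumptions chosen for demonstration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

TRUE_EFFECT = 0.2   # assumed small true effect (Cohen's d)
N_PER_GROUP = 20    # assumed small (underpowered) sample per group
N_STUDIES = 5000    # number of simulated two-group studies

all_effects = []
published_effects = []

for _ in range(N_STUDIES):
    control = rng.normal(0.0, 1.0, N_PER_GROUP)
    treatment = rng.normal(TRUE_EFFECT, 1.0, N_PER_GROUP)
    t, p = stats.ttest_ind(treatment, control)
    # observed standardized effect size (Cohen's d with pooled SD)
    pooled_sd = np.sqrt((control.var(ddof=1) + treatment.var(ddof=1)) / 2)
    d = (treatment.mean() - control.mean()) / pooled_sd
    all_effects.append(d)
    # only 'positive and significant' studies make it into the literature
    if p < 0.05 and d > 0:
        published_effects.append(d)

print(f"true effect:                 {TRUE_EFFECT:.2f}")
print(f"mean effect, all studies:    {np.mean(all_effects):.2f}")
print(f"mean effect, published only: {np.mean(published_effects):.2f}")
print(f"share of studies 'published': {len(published_effects) / N_STUDIES:.1%}")

Under these assumptions, the average effect size across all simulated studies sits close to the true value, whereas the average over only the 'publishable' significant studies is substantially inflated, which is exactly the winner's-curse pattern that Button et al. (2013) warn about.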

The issue of publication bias highlights another aspect of publication pressure, i.e. the perceived need to publish positive results in high-impact journals, which again relates to the previously discussed underlying motives of publication pressure, e.g. ensuring promotion and the appreciation of peer scientists.

Funding Resources

A recurring aspect in discussions of publication pressure and its underlying causes is the importance and influence of funding resources. In this respect, McGrail, Rickard and Jones (2006) not only highlight the importance of the publication list for the individual scientist and the influence of journals, but also state that the number of publications is used as an indicator of the quality of a research institute. To my mind, this is contradictory, because quantity is not necessarily an indicator of quality, neither for the individual scientist nor for the research institute. According to the authors, the Australian government even includes the publication rates of an institute as a concrete measure used to calculate governmental funding. This is in line with Eyre-Walker and Stoletzki (2013), who describe that governmental funding agencies in countries like Canada, the United Kingdom and Australia use the number of publications, in combination with the number of citations and the impact factor of the journals, as an instrument to assign their funds. McGrail, Rickard and Jones (2006), however, argue that there is a need to publish even more. Firstly, they state that not publishing can be considered unethical. Thus, another motive to publish shared among scientists might be the belief that their published work will be beneficial for society. This is in line with Hangel and Schmidt-Pfister (2017), who show that
publishing is also guided by rather positive motives, like contributing to the overall gathering of knowledge within science. One example could be providing new knowledge that can be implemented in the treatment of a disease (Savitz, 2000). On the other hand, according to McGrail, Rickard and Jones (2006), financial reasons can be considered acceptable motivators to publish, i.e. receiving grants and funds based on the number of publications. The authors even recommend structured interventions, to be implemented by universities, aimed at increasing the number of publications. This conforms with the increasing commercialization of academia described by Hong and Walsh (2009), who state that the increasing number of publications in science is due to the growing competition for funding resources. The article by McGrail, Rickard and Jones (2006) also highlights that publication pressure is a controversial topic, since the authors suggest increasing the number of publications in order to enhance research funding instead of criticizing the enormous number of publications per year (1,350,000 published articles worldwide in 2006; Bjork, Roos & Lauri, 2009). Thus, they do not consider the harmful consequences that a focus on the number of publications per scientist or institute might have for science. Furthermore, they seem to argue in favor of the existing funding mechanisms in science, which direct and influence research. In my opinion, however, these funding mechanisms should be considered non-beneficial, since they lead to serious consequences for the practice of science, as will be discussed in the following section.

In the end, there are multiple factors which cause publication pressure in science. Among these, the organizational structure of science, in which career possibilities, promotion and the assignment of research funds strongly depend on the number of publications of an individual scientist, is essential. These aspects appear together with the impact of prestigious journals as another instrument used by the scientific community to assess the worth of a publication. Moreover, the current selection procedures of journals enhance publication bias. This endangers the integrity of science and increases publication pressure and competition among scientists. The reviewed literature has shown that researchers at all career stages experience the burden of publication pressure, with those at the lowest career stages (i.e. (under)graduate and PhD students) being at high risk due to their dependence on mentors and senior scientists. The following illustration is aimed at facilitating the reader's understanding of the previously described line of argumentation for the emergence of publication pressure (Figure 1).


Figure 1 | Main identified causes of publication pressure and their relation to each other. The figure illustrates how the structure of the current scientific organization leads to a highly competitive environment within science (e.g. competition for prestige and promotion). Competition is strongly related to funding resources, which ensure the survival of an individual scientist or even a whole research group. Competition in science is associated with the need to publish in prestigious journals that have a high impact within a certain research field. Publication in such high-impact journals may then facilitate the promotion and prestige of an individual scientist. Both aspects strongly rely on the number of publications of a scientist, which is used as an objective measure of a scientist's worth in academia. However, these journals are highly selective and their selection is often subject to publication bias, i.e. favoring positive results. Publishing positive results will eventually lead to a higher impact of the publication, which once again enhances the promotion and prestige of a researcher. Due to the high impact of a publication (based on a prestigious journal or on favoring positive results), funding agencies are influenced in assigning their grants. In the end, these are the main, strongly interconnected factors out of which publication pressure within science emerges.

Consequences of publication pressure

In the following paragraphs, I will discuss the consequences for science and researchers that evolve out of publication pressure. Already in 1989, Wheeler stated that the pressure to publish within science could lead to despicable, unethical and fraudulent research methods. In addition, publication pressure might change the priorities of scientists, which could eventually force researchers to put aside important work for the sake of adjusting priorities to the requirements of a promoting stakeholder. This is in line with Young, Ioannidis and Al-Ubaydli (2008), who emphasize that researchers might adopt a research practice in which they uncritically follow methods and ideas proposed in high-impact journals instead of being open to novel and innovative research designs. Furthermore, Fanelli (2009) states that fabrication (the invention of data and cases), falsification (the wilful distortion of data and results) and plagiarism (the copying of ideas, data and words without attribution) might be consequences of publication pressure. After reviewing the existing literature on this topic, Fanelli (2009) concludes that up to one third of the scientists involved in the reviewed surveys admitted questionable research practices. These included adjusting the design, methods or results according to pressure imposed by a funding source.
In addition, Tijdink, Verbeke and Smulders (2014) surveyed 315 scientists working in medicine and investigated the association between publication pressure and scientific misconduct, using the validated Publication Pressure Questionnaire and a 12-item questionnaire assessing scientific misconduct. Tijdink, Verbeke and Smulders (2014) report a significant association between the experienced pressure to publish and the scientific misconduct severity score; 15% of the surveyed scientists had committed fabrication, falsification, plagiarism or other manipulations of their data within the previous three years. In addition, Fanelli (2009) states in his systematic review that 2% of the scientists interviewed in the reviewed articles admitted a form of misconduct.

Another consequence and problem for science that is often related to publication pressure is the so-called replication crisis. A recurring theme within this topic is the missing transparency and deficient reporting of the method section (Pashler & Harris, 2012; Barch & Yarkoni, 2013). For example, Vasilevsky et al. (2013) reviewed multiple articles within the biomedical sciences and state that the presentation of the method sections is insufficient due to lacking details about the materials and resources used, which impedes a potential replication of the examined studies. Additionally, Barch and Yarkoni (2013) specifically express their concerns about replication within brain imaging studies, due to conflicts of interest between researchers and stakeholders or funding sources, misaligned motivations and questionable research methods. The authors emphasize that the selection of the statistical analysis can influence the results of those studies. Moreover, they state that small sample sizes and p-hacking, i.e. the manipulation of results and analyses in order to generate a p-value below the 5% significance boundary, might lead to inflated effect sizes as well as a higher incidence of false positives (Barch & Yarkoni, 2013). However, the authors also suggest that we should use the awareness of those issues as an opportunity to change and improve science, instead of discrediting science and putting all scientists under the general suspicion of being dishonest. Furthermore, Pashler and Harris (2012) argue against common defences used by scientists to relativize the serious replicability problems within science. The authors specifically argue against statements which they found to be recurring excuses of scientists who claim that the replication crisis does not exist. One of their arguments is, for example, that science is not self-correcting. The authors describe the notion that science is self-correcting as long-term optimism. After reviewing 40 non-replication articles, they conclude that this notion cannot be upheld, because non-replication articles mostly
targeted recent research (with a median of four years between original and replication study) (Pashler & Harris, 2012). Thus, older research is often not considered in replication studies, and mistakes might never be discovered by a self-correcting mechanism. Therefore, research should always be carried out conscientiously, rather than under the assumption that mistakes (that someone is aware of before publishing!) will be solved and corrected over time via the described self-correcting mechanism.

The pressure from funding resources and scientific authorities in research groups, along with the increased demand for efficiency in producing publications, is leading to new research practices and working conditions in science, eventually causing a change in the organization and culture of academia (Hackett, 1990). Since progress is difficult to measure in terms of quality, because such judgements rest on subjective criteria, science has moved towards a more quantitative measure, i.e. the sheer number of publications as an indicator of quality (Hackett, 1990; Rawat & Meena, 2014). According to Hackett (1990), changes in the organizational structure of science include the practice of salami science, which the author describes as "bean-counting" (p.262), i.e. producing as many publications as possible to lengthen the personal publication list. It is even more worrying that these new standards are eventually more likely to be rewarded than other, more desirable, research and publication methods. For example, it might be more desirable to complete a fuller set of experiments; however, this might lead to a delay or even failure to publish, which then endangers career opportunities (Hackett, 1990). Salami science is a method to prevent such damage to the personal career, a practice which is reinforced by competition for funding and positions.

Besides these consequences, which harm the validity and reliability of science, one can also distinguish different methods that many scientists use to deal with the pressure to publish. Such methods should likewise be regarded as a consequence of publication pressure, since they are a reaction of scientists adapting to the challenges imposed by publication pressure. An overview of the main identified consequences of publication pressure for science is provided in Figure 2.

A method that is often named in relation to publication pressure is 'salami science'. Wheeler (1989) describes 'salami science' as slicing up the results of a study in order to produce multiple publications. The practice of 'salami science' is found at all career stages of science. However, it is obvious that students often have little influence on this practice,
since their internship positions and grades are strongly dependent on their supervisors. Therefore, even before students start to publish themselves, they are confronted with methods used to comply with publication pressure in science. In 2017, The Guardian published an article in which an anonymous student describes his or her experiences with a supervisor who was more concerned about publishing according to the requests of the government (which funded the research) than about following transparent research practices (Anonymous, 2017). In particular, the supervisor instructed the student to delete specific details from the report. The supervisor conducted research with the overall goal of publishing in a prestigious journal, which led to research questions and methods that exactly matched already existing publications, without, however, being aimed at replicating them. Hence, the supervisor shifted the priorities from conducting transparent and responsible research towards an attitude aimed at guaranteeing the funding of the research laboratory. Regarding the increasing competition among laboratories and scientists, this seems at first glance a justifiable choice: funding for science is limited and the competition among research groups is still increasing (Fang & Casadevall, 2015). However, it also reveals that such research practices endanger reliable research as well as the will, shared among many scientists, to publish work that has a beneficial impact on society (McGrail, Rickard & Jones, 2006) and contributes to the overall goal of gathering knowledge (Hangel & Schmidt-Pfister, 2017). Besides, it is important to note that the student published his or her experience anonymously. This indicates that publication pressure and related problems cannot be discussed openly by everyone within the scientific hierarchy. Additionally, this notion links to the finding of Hangel and Schmidt-Pfister (2017), who showed an association between publication pressure and career-oriented thinking within science. Thus, it might be that (graduate) students and PhD students do not want to openly discuss publication pressure and the issues evolving out of it, e.g. poor research practices, because they are afraid that this will endanger their career opportunities. I will elaborate more on this topic in the following section.


Figure 2 | Main consequences of publication pressure for the scientific community. The figure shows the three main identified consequences of publication pressure for science: maladaptive research practices (fabrication, salami science, falsification and plagiarism), adjusted priorities of researchers (neglect of innovative ideas and sticking to 'safe' methods to guarantee publication) and the replication crisis (missing transparency and lack of detail, especially in the documentation of methods).

Consequences of Publication Pressure across different career stages

Flaherty (2017) approaches the problem of publication pressure from the perspective of graduate students. According to the author, publishing has become an accepted requirement for qualifying for a position in academia after graduation. Leon Hilbert (MSc Social and Organisational Psychology and future PhD student at the University of Leiden) explained: "For my master thesis I did not feel any publication pressure because the study and the results did not really qualify for publication. However, it feels like students with publications have a huge headstart over other applicants with regard to PhD programs. This seems unfair to me because a lot of students don't have the chance to publish anything during their studies.". In line with the subjective experience of this student, Clapham (2005) clearly states that nothing enhances career options more than publications. In contrast to, for example, Anderson et al. (2006), the author emphasizes the positive sides of publishing as a researcher; for example, publishing indicates that the researcher is serious about the scientific work. Moreover, the author focuses on the importance of publications for a scientific CV. However, the number of publications has increased to such an extent (500-600 articles per year for some journals!) that the attention peers pay to each publication has decreased enormously (Flaherty, 2017). Competition for academic positions, promotion and the impact of prestigious journals thus already accompanies (undergraduate) students during their studies, as illustrated by the quote from Leon Hilbert. Thus, graduate
students are distinctly aware of this, because academia has raised them according to this concept. On the other hand, there also seems to be a critical movement within science resisting the pressure and requirement to publish. In their article, Vale (Professor of Humanities and Director of the Johannesburg Institute for Advanced Study) and Karataglidis (Professor of Physics at the University of Johannesburg) (2016) describe an increasing resistance among young scientists to comply with the burden of publishing. In line with McGrail, Rickard and Jones (2006), the authors associate publication pressure with the influence of funding agencies, which use the number of publications of an institute as an overall index for distributing their funds, as well as with the importance of this index for the individual scientist seeking promotion.

Furthermore, Flaherty (2017) interviewed Philip Cohen (Professor of Sociology at the University of Maryland), who claims that the existing status hierarchy in science might be a crucial factor making it hard for graduate students to challenge publication pressure. Hence, the question arises to what extent the hierarchy in science, from undergraduate and graduate students to PhD students, postdocs and senior scientists, influences how scientists at different levels of their careers express their opinions, doubts and beliefs about publication pressure.

The existing literature on publication pressure is mainly written by senior scientists or journalists, or describes studies among postdocs. The study by Hangel and Schmidt-Pfister (2017), who interviewed PhD students within focus groups, remains an exception, since statements and perspectives of students and PhD students are difficult to find in the literature on publication pressure. Moreover, the example of the student who anonymously described his or her experiences with the consequences of publication pressure in The Guardian (Anonymous, 2017) reveals the stigma resting on this controversial topic. Even outside the scientific literature, i.e. in blogs and (scientific) newspapers, students and PhD students apparently do not feel free to openly discuss the issue of publication pressure and related methods such as salami science and p-hacking. Most likely, this is due to the importance assigned to publications for career and promotion options, as well as to differences within the scientific hierarchy. In particular, the 'hyper-competition' described by Fochler, Felt and Müller (2016) might add to the problem that scientists at lower career stages speak out less than postdocs or senior scientists do.


Solutions

In the following, I will elaborate on several ideas and solutions for reducing the burden of publication pressure and for how publication pressure may even be used to improve scientific work. In particular, the widely debated concept of an open science network, the importance of peer review, the potential role of education and suggestions for changing the scientific organization will be discussed.

Open Science Network

Recently, the concept of an open science network has become more popular among funding institutes and other stakeholders, scientists and publishers (Molloy, 2011). In their editorial, Spires-Jones, Poirazi and Grubb (2016) argue for an open-access approach within science. The authors state that an attitude of openly sharing and exchanging research is beneficial for scientists at all stages of their careers, and they emphasize that it might be especially beneficial for academics at an early stage of their career in neuroscience. However, the benefits of such an approach are not immediately obvious. Whereas some funding institutes (e.g. the Wellcome Trust and the NIH; Spires-Jones, Poirazi & Grubb, 2016) already require researchers to publish openly, other institutes still require publications in high-impact journals. In addition, senior scientists might hold the attitude that openly shared research is not as important as publications in journals. This notion goes hand in hand with the value of a scientist's publication list as an indicator of the quality of scientific work (Rawat & Meena, 2014). However, publishing in journals takes a lot of time, whereas open publishing offers a fast alternative. In addition, sharing research and data openly would increase transparency and reproducibility; on the other hand, data could be stolen and misused.

According to Molloy (2011), the current system works against a maximum of knowledge distribution. Science as well as society would therefore benefit from openly shared research: everyone would have access, results would be published faster, data sharing would be facilitated and would enhance collaboration, and negative results would be published, which would also decrease publication bias within the body of literature. However, establishing an open science network remains difficult. Although scientists are mostly aware of the benefits of such a concept, there are multiple reasons which keep them from actively supporting the open science movement. Colquhoun (2011) explains that competition within science and the influence of high-impact journals, which serve as a gate for promotion and funding, are important reasons to
adhere to the traditional model of publishing in journals. Nevertheless, the integration and sharing of information and data among different fields is necessary and would facilitate collaboration and progress in science. Poldrack (2012) emphasizes that there is a need for data sharing, especially within the field of brain imaging. However, besides the resistance to engaging in open science publishing, data sharing remains difficult for rather practical reasons, such as the complexity of e.g. fMRI datasets and the need to share metadata for task-dependent fMRI data. Despite this, Poldrack (2012) argues that the "'open science' revolution" (Poldrack, 2012, p.1218) has already started. This is in line with Molloy (2011), who describes the current efforts of the Open Data in Science working group at the Open Knowledge Foundation. The working group focuses on the development of tools and applications to facilitate the sharing of datasets. In addition, the development of guidelines seems inevitable in order to promote a future open science network. For example, the Open Data in Science working group developed the 'Panton Principles'. Among other statements, these principles ask scientists to formulate explicit statements on how their research may be used, and to attach a license to the data (Molloy, 2011). Such regulations are needed in order to prevent misuse of openly shared data, since everyone would have access. Therefore, new possibilities for sharing data openly, mainly through online platforms, introduce new issues compared to the traditional way of publishing scientific data. However, in order to establish an open network culture within science, it might be necessary to change the criteria under which scientists are used to working. This would include highlighting the importance of common goals, in contrast to individual goals, which in turn might reduce the influence of high-impact journals, as suggested by Fang and Casadevall (2015). In addition, the authors suggest that open communication would be necessary and useful in dealing with research groups that work within the same field and are therefore considered rivals (Fang & Casadevall, 2015). For example, open communication would facilitate constructive criticism as well as the exchange of opinions and expertise.

Moreover, Hartshorne and Schachner (2012) emphasize that the whole publication culture within science must change in order to make any of the suggested improvements work. The authors propose their idea of a "Replication Tracker". According to the authors, replicability should be the most important criterion for assessing the quality of a scientific publication. Since the replication crisis is a serious issue within academia (Alberts et al., 2014), there is a need for a systematic way to collect replication studies and make them openly accessible to the
scientific community. Hartshorne and Schachner (2012) explain that such a system would change the way literature is searched, would serve as a motivation for researchers to conduct replication studies, would improve communication among scientists and would eventually help improve scientific practices. An obvious limitation of their proposed system, however, is that it is time-consuming, due to the need for more participants and for learning new statistical analyses. Despite these limitations, such a system would add more transparency and replicability to science and might be a necessary additional element of an open network.

Peer-reviewing

Publishing data openly would change the current peer review culture. Peer review is an important mechanism in science to prevent misconduct and fraud. However, implementing an open science network would require renewing the traditional reviewing process towards a post-publication review culture (Colquhoun, 2011). Publications would be accessible to everyone; therefore, everyone would be able to review them. Colquhoun (2011) explains that differences in the scientific hierarchy might have an influence on how this new reviewing process would work. This is in line with Flaherty (2017), who interviewed Philip Cohen (Professor of Sociology at the University of Maryland), who emphasizes the need for an open discussion within the scientific community. Such openness would create a community which allows for critical and constructive feedback loops, including everyone in the scientific hierarchy. Moreover, Chang, McAleer and Oxley (2011) emphasize that scientists worldwide are more likely to successfully judge the quality of a publication than a small group of editors. In contrast, Garfield (2006) notes that only in an ideal world would evaluators read every publication and judge it personally; unfortunately, this will never be possible due to time and money constraints. In addition, the author explains that those judgements would be influenced by comments of peer scientists and by citations, in the same way as is the case with the current review and evaluation processes.

However, establishing an open feedback culture might reinforce already existing status differences within the scientific hierarchy (Flaherty, 2017). For example, a PhD student might not feel courageous enough to criticize the work of a senior scientist. Conversely, a senior scientist might not want to endanger collaborations with other senior scientists by criticizing their work. The most workable solution for this issue would be to allow anonymous reviews (Colquhoun, 2011), which relates to the current review processes, which are mostly carried out anonymously as well.

Concrete approaches to changing the peer-review process are proposed by Chang, McAleer and Oxley (2011), who state that the importance of a study should be the focus when judging its quality and publishing potential, rather than, for example, the significance of its results. This approach would not only help to decrease publication pressure, but also help to counteract publication bias. Furthermore, the authors suggest that reviewing should already start before the actual study is conducted (Chang, McAleer & Oxley, 2011). This would help to ensure that methods and data collection are carefully conducted and would re-establish transparency within science.

Education

If the scientific community is willing to establish an open science culture in the future, it might be necessary to introduce the ideas and principles of such a scientific culture at early career stages already, i.e. to undergraduate students. Jones et al. (2006) introduce their idea of an undergraduate course aimed at teaching objectives related to publishing and reviewing. According to the authors, the whole process of submitting, reviewing and publishing is widely ignored in the education of young neuroscientists. They propose a course which gives insight into the current practice of publishing in neuroscience, including elaborations on problems related to open access publishing and traditional subscription publishing.

Scientific Organization

Moreover, Wheeler (1989) already suggested that the scientific community needs to change the way in which scientific performance is judged, from quantity (i.e. number of publications and citations) back to quality (e.g. the methods and analyses used, as well as innovation and creativity). In particular, Clapham (2005) states that the scientific community should redefine the importance and value of publications. This includes reconsidering the purpose of publishing, which for many scientists is still the wish to contribute to the overall body of knowledge (Hangel & Schmidt-Pfister, 2017).

Wheeler (1989) recommends that applicants should only be asked for a limited number of their publications instead of including a complete overview of their publications in the CV. This is in line with Colquhoun (2011), who suggests that scientists should limit how much they publish within a given period. In addition, putting a limit on yearly publications per scientist – via funding agencies, research groups or journals – would help to decrease the pressure to publish as much as possible (Wheeler, 1989). Eventually, this would lead to a change within the scientific organization, especially in how scientists apply for positions and research funds. This is in line with Grove (2017), who states that there is a need to develop mechanisms within institutes and funding agencies to counteract the publishing "like hell" (van Dalen, 2012) attitude within academia. In addition to limiting publications, funding institutes could assign 'slow' scholarships to allow for more detailed, deep and innovative research (Vale & Karataglidis, 2016). Moreover, assigning just one grant at a time to a researcher or research group (Colquhoun, 2011) would decrease the pressure on researchers. They could focus on one research project at a time instead of being involved in many different projects aimed at fast publication in order to gain even more research funds.

Since poor research practices are strongly associated with publication pressure, Button et al. (2013) suggest that possible solutions for dealing with publication pressure and its consequences should include pre-registration of the study protocol and analysis plan, transparent reporting of methods and findings, and making the study materials and data available to other researchers. This connects to the principles proposed by the Open Data in Science working group (Molloy, 2011), which recommends that scientists should provide other researchers with a license for their data to promote an open information-sharing system within science.

In the end, it might be crucial to change the scientific organization by reconsidering the value and purpose assigned to publications, decreasing the influence of high impact journals (Young, Ioannidis & Al-Ubaydli, 2008), encouraging critical and constructive peer review before studies are conducted, and promoting the transparency and reproducibility of science, in order to generate the right conditions for establishing a functioning open access network within science. These new principles, guidelines and review mechanisms should already be included in university courses, as proposed, for example, by Jones et al. (2006). An overview of the main identified consequences of publication pressure as well as solutions and possible implementations is provided in Table 1.


Specific consequence: Shift from quality to quantity for appreciating and assessing peer researchers.
Solution: Change the ways in which scientific quality is evaluated. From quantity, back to quality.
Implementation: Reconsidering the value of publishing, raising awareness about this topic in science via scientific blogs, newsletters at universities (such as the Cognito newsletter at the University of Amsterdam) or social media platforms (Facebook groups such as the Cogniton Student Union).

Specific consequence: Adjustment of priorities, e.g. neglecting innovations and sticking to 'safe' methods; replication crisis through missing transparency and lack of details in methods.
Solution: Promoting careful and reasonable research methods instead of methods which are aimed at producing fast results to publish. Allow innovation and creativity.
Implementation: Implement knowledge about reliable method use as well as learning objectives concerning publication (as suggested by Jones et al., 2006) in university courses.

Specific consequence: Experience of 'personal' publication pressure.
Solution: Encourage discussion and awareness about the issue of publication pressure. Limit the number of required publications.
Implementation: Implement learning objectives of how to publish, share data etc. in university courses, as e.g. Jones et al. (2006) state. Limit yearly publications of scientists. Ask for only a few publications during the application process.

Specific consequence: Sabotage / maladaptive research practices, e.g. fabrication, falsification, salami science and plagiarism.
Solution: Reduce competition and encourage collaboration among research groups and individual scientists.
Implementation: Renew the peer-reviewing process through establishing an open science network which is accessible to everyone, e.g. for reviewing publications. Studies are reviewed before they are conducted, i.e. reviewing the methods and study plan.

Specific consequence: Non-representative body of literature. Bias in literature towards positive results. Lack of knowledge about negative results. Replication crisis.
Solution: Share research results and publications openly.
Implementation: Establish popular open access platforms. Methods and data are shared openly. Licenses for data are provided. Studies are reviewed before they are conducted, i.e. reviewing the methods and study plan. In addition, the importance of a study is used as a measure of quality, as suggested by Chang, McAleer and Oxley (2011). Implementing a "Replication Tracker" platform (Hartshorne & Schachner, 2012).

Table 1 | Specific consequences of publication pressure, proposed solutions and possible implementations.

Reflection

While reading multiple blogs, newspaper articles and scientific papers concerning the burden of publication pressure, it was not always easy to stay objective and disentangle the causes and consequences of publication pressure without presenting myself as an upholder of moral standards. Especially reading the experiences of (undergraduate) students who had to comply with questionable research practices of their supervisors made me question whether the way in which young scientists are educated in academia is optimal. In addition, after outlining several causes and consequences of publication pressure, it was difficult for me not to get caught in an argumentation cycle in which I would accuse research and scientists in general of being only interested in prestige, obtaining grants and publishing in the most influential journals, and not in conducting research according to reliable and valid standards. Moreover, during the literature review I noticed that publication pressure is a topic which has been discussed since the 1980s. However, to my mind, it seems that nothing has changed so far and not all levels within the scientific work hierarchy are equally involved in the ongoing discussion. Therefore, I personally think that we as representatives of academia should encourage the discussion even more, by involving every career level – from undergraduate students to PhD students to senior scientists – in an open and ongoing discussion to challenge the problem of publication pressure. Besides the solutions proposed in this thesis – creating an open access network which encourages and changes peer reviewing processes as well as including these developments as learning
objectives in undergraduate education – I think it is necessary to create platforms at universities which allow open communication about controversial topics in academia. Such a platform could easily be established by e.g. using social media like Facebook (building a group for discussion), creating a university-internal debating society which involves teachers, students, research assistants and professors, or discussing the topic of publication pressure in university-internal newspapers (e.g. newspapers at the University of Amsterdam such as the Cognito newsletter or the Toilet times).

Since I am part of the scientific network myself, I have noticed that many students and PhD students have a strong opinion about publication pressure. I think that, at least within my own personal environment, the young generation of scientists is willing to challenge the problem of publication pressure. Creating room for discussion would help to cope with the feeling of powerlessness which many of my fellow students, including myself, experience due to the dependence on supervisors and the "rules" of the academic system.

The article by Young, Ioannidis and Al-Ubaydli (2008) and their concept of modelling the scientific system economically made me question the publishing and funding mechanisms in science. It might be a naive and romantic personal idea of how science should be, but to my mind, knowledge should not be treated and sold like conventional tradable goods. Knowledge is probably the most powerful resource of humanity. Therefore, scientists should not be put under pressure by the various mechanisms described in this thesis. Instead, scientists should be able to walk innovative and creative pathways to solve crucial problems, free from the urge to publish in prestigious high impact journals. Funding and promotion should not depend on such aspects, e.g. whether a publication appeared in a high impact journal. I consider these aspects superficial, whereas grants and governmental research funding should be based on the essence of the research, i.e. reliable methods and potentially useful outcomes of the research for society.

However, I know that development needs time, and the discussion of the issue of publication pressure is ongoing in academia. In addition, I would like to mention that I was always lucky with the supervisors with whom I worked during my past internships. I learned the importance of pre-registering the research protocol, hypotheses and analysis plan, as well as looking for the best statistical options, not in terms of rejecting the null hypothesis but in terms of the techniques used (in my case fMRI). Considering this, I think that science is slowly changing for the better.
