PNN response to interim evaluation of PhD education experiment
The PhD Scholarship Experiment, in which PhD candidates are appointed as students instead of as employees, is taking place at the University of Groningen (850 places) and Erasmus University Rotterdam (15 places). The experiment has already generated considerable controversy. PhD students feel like second-class PhD candidates and believe that the promised benefits of the experiment, such as increased freedom, are not being realized. The University of Groningen (RUG) calls the experiment a great success and dismisses the complaints as "fake news." Given these conflicting signals, a thorough interim evaluation is essential. Such an evaluation is legally required and serves to protect PhD students: its purpose is to determine whether the experiment should be halted because of serious problems. The interim evaluation has now been carried out by the research agency CHEPS. That research, however, shows clear shortcomings. As a result, PhD students remain part of an experiment that may have serious adverse effects on them.
1. Little independent research
Things went wrong already in the way the interim evaluation was carried out. The research agency CHEPS relies largely on data collected by the University of Groningen and did not conduct research of its own, even where the University of Groningen's surveys did not probe sufficiently (see points 9 and 10 below). Moreover, the CHEPS researchers did not have access to the original data, only to a dataset prepared by the University of Groningen.
2. Interference by the University of Groningen
To supplement this data, interviews were conducted with RUG staff. The interviewees were invited and largely selected by a high-ranking RUG official. They were then encouraged to contribute to a positive interim evaluation (see one of the emails here). Some even received promotional material. Supporters of the experiment were given a list of topics to prepare, while potentially more critical interviewees were not.
3. Dishonest reporting
Because the interview invitations were sent by a high-ranking RUG official and accompanied by promotional material, the confidential and safe environment necessary for independent interviews was lacking. This applies in particular to staff in lower-ranking positions, such as PhD candidates. Academic integrity standards require that interference by interested parties be reported. [1] Despite PNN's insistence, the report contains no information about what happened. [2]
4. Little input from PhD students
PhD students were insufficiently involved in the research: 11 policymakers, directors, and deans were interviewed in depth in one-on-one conversations, compared with three PhD student representatives, one member of the participation council, and one student. Other PhD students could only take part in a two-hour roundtable discussion, a group setting in which contributions were visible to peers and which therefore left less room for criticism. We understand from two participants that much of the criticism they did voice is not reflected in the reports.
5. Displacement of employee PhD candidates by PhD students
Minister Bussemaker wanted to use the experiment to determine whether employee PhD candidates would be displaced by PhD students. The Council of State warned that "in the current economic situation, university boards will replace employee PhD programs on a large scale with cheaper PhD student programs." This is exactly what happened at the University of Groningen: 75 employee PhD candidates (paid from the first funding stream) were replaced by PhD students. [3] The researchers take the view that this does not constitute displacement, because the university made a conscious decision to do so. [4] In PNN's view, displacement that is deliberate policy and structural in nature is all the more serious, especially since the University of Groningen indicated at the start of the experiment that the PhD students would supplement, not replace, employee PhD candidates.
6. Juggling the numbers
The researchers conclude that the experiment has created more PhD positions. They fail to take into account, however, that many of these additional positions at the University of Groningen stem from a more than fivefold (!) increase in first-funding-stream investment in PhD positions: from €3.3 million per year previously to €17.9 million now. [5] If this amount had instead been invested in positions for employee PhD candidates, additional PhD positions would also have been created. The interim evaluation thus wrongly creates the impression that the experiment itself is the primary cause of the additional PhD positions.
It is also noteworthy that the researchers state that they did not conduct a cost-benefit analysis, [6] while nevertheless claiming that the experiment resulted in additional PhD positions. Without a thorough cost calculation, however, nothing can be said about this. For example, various additional costs associated with the experiment, which could therefore lead to fewer PhD positions, were not included in the interim evaluation. These include the costs of the training program, the appointment of additional lecturers because PhD students teach less or not at all, [7] compensation schemes that give PhD students extra time compared to employee PhD candidates, and a one-off financial compensation for some PhD students. Furthermore, the report states, without substantiation, that a four-year trajectory for an employee PhD candidate costs €240,000. [8] NWO calculates this at approximately €200,000, and the planning and control department of another university reports, upon inquiry, that it works with €193,000. By inflating the costs of employee PhD candidates, PhD students are made to appear more affordable than they actually are.
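By way of illustration, the following is a back-of-the-envelope calculation (not taken from the interim evaluation) using only the figures cited above: the annual investment of €17.9 million and the two estimates of the cost of a four-year employee trajectory.

\[
\frac{\text{€}17.9\ \text{million per year}}{\text{€}240{,}000 \,/\, 4\ \text{years}} \approx 298\ \text{positions},
\qquad
\frac{\text{€}17.9\ \text{million per year}}{\text{€}193{,}000 \,/\, 4\ \text{years}} \approx 371\ \text{positions}.
\]

The higher the assumed cost per employee PhD candidate, the fewer employee positions the same budget appears to buy, and the cheaper the PhD student model looks by comparison.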
7. More freedom? Impossible to measure
The experiment is intended to give PhD students more research freedom. The interim evaluation shows that, according to PhD students and their supervisors, the results in this respect are disappointing. There does appear to be more freedom in the choice of research topic, but this is not attributable to student status. At the University of Groningen, PhD students are paid from the first funding stream, and they are compared with employee PhD candidates working on externally funded (and tightly defined) projects. The experiment therefore measures differences arising from the source of funding rather than from status. The researchers leave this unmentioned and attribute the greater freedom in the choice of research topic unreservedly to the experiment. [9]
8. Cherry-picking the data
Remarkably little weight is given to the survey among supervisors, [10] which the University of Groningen itself omitted from its self-evaluations. That survey shows that supervisors, by their own admission, do not (on average) give PhD students more freedom, not even in the choice of topic. This is not addressed in the conclusions. It is also striking that the table that clearly showed the poor scores has been replaced by a rather unclear bar chart (p. 89).
9. Satisfaction is not measured
The interim evaluation assumes that PhD students are satisfied with their trajectory because they indicate they would choose a PhD position again. [11] This question (taken from the RUG surveys) is unusable for measuring satisfaction. PhD students were offered this position or no PhD position at all. The choice in the answer is therefore: would I choose a PhD position again, or would I rather have no PhD position at all? That says nothing about satisfaction with the position compared with a position as an employee PhD candidate, and that is precisely the comparison at the heart of the experiment.
10. Pressure to provide unpaid teaching
A common complaint, which is also reflected in the interim evaluation, is that PhD students are pressured into providing unpaid teaching. [12] Because CHEPS bases its findings on the RUG surveys, it only looked at responses to the statement: "I am happy that I do not have to teach or supervise students." [13] Without structured research, and solely on the basis of denials by the responsible Graduate School directors, it then concludes that the problem is not serious. [14]
11. The Netherlands performs well in the European context
The report (p. 17) states that the Netherlands scores rather poorly on the number of PhD graduates among 25–34 year-olds. This conflicts with EU figures, in which the Netherlands has ranked in the top 5 since 2011. It is also incorrect that almost all other European countries have a system of PhD students, as the report (p. 18) claims. Denmark (traditionally), as well as Norway and Sweden, for example, almost exclusively appoint PhD candidates as employees.
12. Incorrectly citing the Bologna Process
Finally, it is suggested (on pages 13 and 18) that the Bologna Process (which led to the Bachelor-Master system) requires the introduction of a system of PhD students. This is incorrect: the Bologna Process leaves the legal status of doctoral candidates to the countries themselves. In fact, the Salzburg Recommendations (part of the Bologna Process) state: "Doctoral candidates, as early-stage researchers, should be recognized as professionals – with commensurate rights – who make a key contribution to the creation of new knowledge."
Call: do good research
An experiment demands independent and fair research. Because of the gaps, errors, and biased representations described above, and the irregularities surrounding the interviews, the interim evaluation says nothing about the actual course of events within the experiment. PNN therefore urgently calls on Minister Van Engelshoven, also in the interest of the PhD students involved, to commission a second opinion and to launch an investigation into what happened. PNN furthermore calls on the Minister not to allow the University of Groningen to participate in a second round of the experiment: firstly because, according to Minister Bussemaker, a second round was intended for universities for which the first round came too soon, [15] and secondly because the University is behaving in a way that compromises the reliability of the experiment's evaluation.
Because of all this, we also ask the Minister to halt the experiment at the University of Groningen. An experiment in which parties attempt to improperly influence the results is pointless: the results are then unreliable.
Call: no new round
Complaints from PhD students and the problems surrounding the interim evaluation call for careful monitoring and for ending the experiment at the University of Groningen. Given all the negative signals, further expansion is irresponsible and puts hundreds more vulnerable young researchers at risk. We ask the Minister to stand up for this vulnerable group of researchers and to uphold her original decision not to expand the experiment. PNN regrets that the Minister has decided to implement the motion by Van der Molen et al. to open a second round. PNN believes that, if a second round does go ahead, only a very limited number of new PhD students will be able to start, because the law requires that the majority of PhD students must have started by 2018 at the latest. [16] Since the final evaluation will take place in 2021, [17] a new round starting in 2020 can hardly produce valid results. A second round would therefore amount to little more than a cost-saving measure for universities, whereas precisely this was named, when the experiment was introduced, as a ground for terminating it prematurely. [18]
Footnotes
[1] Dutch Code of Conduct for Research Integrity, standards 8 and 44.
[2] It is striking that the report does note that GRIN, the local PhD student council, helped organize the roundtable discussion with PhD students (interim evaluation, p. 10).
[3] Interim evaluation, p. 35.
[4] Interim evaluation, p. 104.
[5] Interim evaluation, p. 40.
[6] Interim evaluation, p. 24.
[7] The interim evaluation states on page 98 that employee PhD candidates are only allowed to teach a limited amount, up to a maximum of 20%, according to the collective labor agreement. This is incorrect: this is not stated in the collective labor agreement, and at the University of Groningen, the maximum is 25%.
[8] Interim evaluation, p. 24.
[9] Interim evaluation, p. 104 and pp. 49–51.
[10] Interim evaluation, p. 26.
[11] Interim evaluation, p. 102.
[12] See also p. 99 of the interim evaluation.
[13] Interim evaluation, p. 28.
[14] Interim evaluation, pp. 100 and 104.
[15] Parliamentary document House of Representatives, Session Year 2014–2015, 31288, No. 437, p. 25.
[16] Article 8, paragraph 2, of the PhD Education Experiment Decree.
[17] Article 13 of the PhD Education Experiment Decree.
[18] Parliamentary document House of Representatives, Session Year 2014–2015, 31288, No. 437, p. 23.