Scientific “ecosystem” paper now published in F1000 Research

Biosystems Analytics

I’ve previously blogged about our PeerJ preprint on moving away from the dominant metaphor of the scientific enterprise as a “pipeline” leading to professorial positions in universities, towards a metaphor of a diverse “ecosystem”. The paper has now been published in F1000 Research and has already garnered one peer review:

Lancaster AK, Thessen AE and Virapongse A. A new paradigm for the scientific enterprise: nurturing the ecosystem [version 1; referees: 1 approved]. F1000Research 2018, 7:803
(doi: 10.12688/f1000research.15078.1)

One of the major points of the paper is that we need to move away from the current closed system that emphasizes artificial scarcity (e.g. in journal spots), towards a system that emphasizes abundance. We feel that publishing in journals that use post-publication, transparent peer review (like F1000 Research) helps us “walk the walk” as we build those new ecosystems.

Table 1 from the paper reinforces this point, illustrating the contrasting language between…



Universities’ Most Excellent Adventures

Yours truly, reposted from the Ronin Institute blog

Academics in traditional university environments tend to be keenly aware of where their university ranks, whether they like to admit it or not. Most familiar are college-level rankings like those from the US News &amp; World Report, which weigh the undergraduate experience heavily. In the research world, however, the notion of “excellence” has become the coin of the realm, as evidenced by a proliferation of “excellence frameworks” such as the Research Excellence Framework (UK), the German Universities Excellence Initiative, the Excellence in Research for Australia and the Performance-Based Research Fund (New Zealand). Given that many resources, from capital funds and grants to permanent positions, are doled out in accordance with rankings, where one’s institution stands goes beyond mere bragging rights. Most academics understand the arbitrary nature of such rankings, and despite regular kvetching that they are either “unfair” (usually from those at an institution “lower” in the rankings) or that they have “finally” recognized the true worth of their institution (usually from those rising in the rankings), the existence of the ranking system itself is normally taken as a given. After all, how are we to sort the worthy from the unworthy?

Samuel Moore, Cameron Neylon, Martin Paul Eve, Daniel Paul O’Donnell and Damian Pattinson have published an (ahem) excellent research paper, “Excellence R Us”: university research and the fetishisation of excellence, that comprehensively examines both the notion and the practices of “excellence” in research. Excellence, as most of the research frameworks define it, essentially boils down to some combination of ranking institutions by their scholars’ ability to publish in established prestige journals, their ability to gain external grants, and other easily measured metrics of scholarly output.

Their conclusion, in a nutshell: “excellence” is totally bogus:

…a focus on “excellence” impedes rather than promotes scientific and scholarly activity: it at the same time discourages both the intellectual risk-taking required to make the most significant advances in paradigm-shifting research and the careful “Normal Science” (Kuhn [1962] 2012) that allows us to consolidate our knowledge in the wake of such advances. It encourages researchers to engage in counterproductive conscious and unconscious gamesmanship. And it impoverishes science and scholarship by encouraging concentration rather than distribution of effort.

In other words, in the context of scientific scholarship, focusing on excellence prevents the two things we say we want from science: careful, reproducible science and big breakthroughs. The article covers ground familiar to those who have been following the state of academia, including discussions of the lack of reproducibility in science, the pernicious use of journal prestige to evaluate academics, and the general environment of hypercompetition in research. Many, if not most, academics are aware of these issues, which have been covered extensively in the trade press in recent years, but continue to view them through the lens of their effect on traditional tenure-track (or equivalent) faculty with established research programs. So it is refreshing that the article tackles how the rhetoric of “excellence” can restrict the range of types and styles of scholarship, issues that are close to the heart of the Ronin Institute:

There is, however, another effect of the drive for “excellence”: a restriction in the range of scholars, of the research and scholarship performed by such scholars, and the impact such research and scholarship has on the larger population. Although “excellence” is commonly presented as the most fair or efficient way to distribute scarce resources (Sewitz, 2014), it in fact can have an impoverishing effect on the very practices that it seeks to encourage. A funding programme that looks to improve a nation’s research capacity by differentially rewarding “excellence” can have the paradoxical effect of reducing this capacity by underfunding the very forms of “normal” work that make science function (Kuhn [1962] 2012) or distract attention from national priorities and well-conducted research towards a focus on performance measures of North America and Europe (Vessuri et al., 2014)

The article continues by pointing out that “excellence” is often used as a proxy for academic work that fits certain “standard” modes, which can result in a blander and more conformist world of scholarship:

Given the strong evidence that there is systemic bias within the institutions of research against women, under-represented ethnic groups, non-traditional centres of scholarship, and other disadvantaged groups (for a forthright admission of this bias with regard to non-traditional centres of scholarship, see Goodrich, 1945), it follows that an emphasis on the performance of “excellence”—or, in other words, being able to convince colleagues that one is even more deserving of reward than others in the same field—will create even stronger pressure to conform to unexamined biases and norms within the disciplinary culture: challenging expectations as to what it means to be a scientist is a very difficult way of demonstrating that you are the “best” at science; it is much easier if your appearance, work patterns, and research goals conform to those of which your adjudicators have previous experience. In a culture of “excellence” the quality of work from those who do not work in the expected “normative” fashion run a serious risk of being under-estimated and unrecognised.

As the authors point out, it is common in such pieces to identify:

institutional administrators captured by neo-liberal ideologies, funders over-focussed on delivering measurable returns rather than positive change, governments obsessed with economic growth at the cost of social or community value

as the primary cultural driver of metric-driven “excellence”. This is definitely a huge part of the issue (see the Ronin blog posts “Graeber on the Transformation of Universities” and “Henry Heller on IP-Based Capitalism at Universities”), but it is not the only driver. Attributing these issues purely to external forces lets the academy somewhat off the hook, since:

the roots of the problem in fact lie in the internal narratives of the academy and the nature of “excellence” and “quality” as supposedly shared concepts that researchers have developed into shields of their autonomy. The solution to such problems lies not in arguing for more resources for distribution via existing channels as this will simply lead to further concentration and hypercompetition. Instead, we have argued, these internal narratives of the academy must be reformulated.

In other words, academia probably needs to take a look in the mirror once in a while and question whether current norms really still serve the twin stated goals of encouraging sound “normal” scholarship as well as risky breakthroughs. I would also add a third goal: enabling all scholars to participate in whatever way fits their individual talents. There is much more to the article than space allows here; it’s a good piece for anybody interested in the future of scholarship, and it includes a highly detailed bibliography.

Citation: Moore S, Neylon C, Eve MP, O’Donnell DP and Pattinson D. “Excellence R Us”: university research and the fetishisation of excellence. Palgrave Communications 3, 16105 (2017)

Coda: In a nice example of walking the walk, the authors have this note about “subverting traditional scarce markers of prestige” by adopting:

a redistributive approach to the order of their names in the byline. As an international collaboration of uniformly nice people (cf. Moran et al., 2016; Hoover et al., 1987; see Tartamelia, 2014 for an explanation), lacking access to a croquet field (cf. Hassell and May, 1974), writing as individuals rather than an academic version of the Borg (see Guedj, 2009), and not identifying any excellent pun (cf. Alpher et al., 1948; Lord et al., 1986) or “disarmingly quaint nom de guerre” (cf. Mrs Kinpaisby, 2008, 298 [thanks to Oli Duke-Williams for this reference]) to be made from the ordering of our names, we elected to assign index numbers to our surnames and randomize these using an online tool.

“Future of Careers in Scholarship”: November Unconference

The Ronin Institute is holding an unconference on November 5th on the future of careers in scholarship, in which I’m participating. The Digital Biologist has more about the meeting:

If you’re interested in the future of research and scholarship, and like many in the field, you also subscribe to the consensus view that the current system is broken, you won’t want to miss the Ronin Institute’s “The Future of Careers in Scholarship”, being held in Cambridge MA on November 5th. The unconference format of the meeting will even allow you and other attendees to shape the agenda of the meeting, so come prepared to be an active participant rather than just a spectator. The meeting will be hosted at The Democracy Center in Harvard Square and you will have the chance to meet and network with an eclectic and forward-thinking group of people from a range of fields of study.

If you are really interested in the future of research and scholarship, and would like to get involved in the movement to advance beyond our current broken system and build new models for doing research, take a look at the Ronin Institute website. The Ronin Institute is devoted to facilitating and promoting scholarly research outside the confines of traditional academic research institutions.

More from The Digital Biologist

Open Science and Its Discontents

My first post on the Ronin Institute blog:

Open science has well and truly arrived. Preprints. Research Parasites. Scientific Reproducibility. Citizen science. Mozilla, the producer of the Firefox browser, has started an Open Science initiative. Open science really hit the mainstream in 2016. So what is open science? Depending on whom you ask, it may simply mean more timely and regular releases of data sets and publication in open-access journals. Others imagine a more radical transformation of science and scholarship, advocating “open-notebook” science with a continuous public record of scientific work and concomitant release of open data. In this more expansive vision, science will ultimately be transformed from a series of static snapshots represented by papers and grants into a more supple and real-time practice, in which professionals and citizen scientists blend and co-create a publicly available shared knowledge. Michael Nielsen, author of the 2012 book Reinventing Discovery: The New Era of Networked Science, describes open science less as a set of specific practices than as a process to amplify collective intelligence in order to solve scientific problems more easily:

To amplify collective intelligence, we should scale up collaborations, increasing the cognitive diversity and range of available expertise as much as possible. This broadens the range of problems that can be easily solved … Ideally, the collaboration will achieve designed serendipity, so that a problem that seems hard to the person posing it finds its way to a person with just the right microexpertise to easily solve it.

Read the rest at the Ronin Institute blog

The limitations of Big Data in life science R&D

Biosystems Analytics

Big Data has become an increasingly large presence in the life science R&amp;D world, but as I have blogged about previously, ever larger datasets and better machine-learning algorithms alone will not leverage that data into bankable knowledge, and can lead to erroneous inferences. My Amber Biology colleague Gordon Webster has a great post over on LinkedIn leavening the hype around Big Data, pointing out that analytics and visualizations alone are insufficient for making progress in extracting knowledge from biological datasets:

Applying the standard pantheon of data analytics and data visualization techniques to large biological datasets, and expecting to draw some meaningful biological insight from this approach, is like expecting to learn about the life of an Egyptian pharaoh by excavating his tomb with a bulldozer

“-omics” datasets, such as those produced by transcriptomic and proteomic analyses, are ultimately generated by dynamic processes involving individual genes, proteins and other molecules…


Where is this cancer moonshot aimed?

Biosystems Analytics

Much has been made of the recent announcement of VP Biden’s cancer moonshot program. In these days of ever-tightening research funding, every little bit helps, and the research community is obviously grateful for any infusion of funds. However, large-scale approaches to tackling cancer have been a staple of funding ever since Nixon announced his “War on Cancer” back in the 1970s, and any new approach must grapple with the often complicated history of research funding in this area. Ronin Institute Research Scholar Curt Balch has an interesting post over on LinkedIn breaking down some of these issues.

What seems relatively new in this iteration of the “war”, however, is a greater awareness of the lack of communication among those working on cancer from different approaches. Biden has specifically mentioned this need and has pledged to “break down silos and bring all cancer fighters together”. This…


Quantifying cost-effectiveness of scientific cloud computing in genomics and beyond

Biosystems Analytics

On-demand computing, often known as “cloud computing”, provides access to the computing power of a large data center without the need to maintain an in-house high-performance computing (HPC) cluster, with its attendant management and maintenance costs. As even the most casual observers of the tech world will know, cloud computing is growing in many sectors of the economy, including scientific research. Cheap “computing as a utility” has the potential to bring many large-scale analyses within reach of smaller organizations that may lack the means or infrastructure to run a traditional HPC cluster. These organizations or individuals could include smaller clinics, hospitals, colleges, non-profit organizations and even individual independent researchers or groups of researchers. But beyond the industry enthusiasm, how much can cloud computing really help enable low-cost scientific analyses?
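The cost question here largely comes down to amortization arithmetic. As a rough back-of-the-envelope sketch (all prices and cluster specs below are hypothetical placeholders, not figures from the post or from any vendor), one can compare the amortized per-core-hour cost of an in-house cluster against a pay-as-you-go rate:

```python
# Illustrative comparison of in-house HPC vs. on-demand cloud cost per core-hour.
# All figures are hypothetical placeholders for illustration only.

def hpc_cost_per_core_hour(capital, annual_upkeep, lifetime_years, cores, utilization):
    """Amortized cost per usable core-hour of an in-house cluster."""
    total_cost = capital + annual_upkeep * lifetime_years
    usable_core_hours = cores * 24 * 365 * lifetime_years * utilization
    return total_cost / usable_core_hours

# A small 256-core cluster: $150k up front, $30k/yr for power, cooling and
# administration, 5-year lifetime, 40% average utilization.
in_house = hpc_cost_per_core_hour(150_000, 30_000, 5, 256, 0.40)

# A hypothetical on-demand rate of $0.05 per core-hour.
cloud = 0.05

print(f"in-house: ${in_house:.3f}/core-hour, cloud: ${cloud:.3f}/core-hour")
# At low utilization the pay-as-you-go model wins; as utilization rises,
# the amortized in-house cost falls and eventually undercuts the cloud rate.
```

The crossover depends heavily on utilization: a cluster that sits idle much of the time is far more expensive per useful core-hour than its sticker price suggests, which is precisely why smaller organizations with bursty workloads are the natural beneficiaries of the utility model.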

There is now a veritable smorgasbord of offerings from many different vendors, but the big players are Amazon…
