One of the major points of the paper is that we need to move away from the current closed system that emphasizes artificial scarcity (e.g. in journal spots), towards a system that emphasizes abundance, and we feel that publishing in journals that use post-publication and transparent peer review (like F1000 Research) helps us "walk the walk" as we build those new ecosystems.
Table 1 from the paper reinforces this point, illustrating the contrasting language between…
Open science has well and truly arrived. Preprints. Research parasites. Scientific reproducibility. Citizen science. Mozilla, the producer of the Firefox browser, has started an Open Science initiative. Open science really hit the mainstream in 2016. So what is open science? For some, it simply means more timely and regular releases of data sets, and publication in open-access journals. Others imagine a more radical transformation of science and scholarship and advocate "open-notebook" science, with a continuous public record of scientific work and concomitant release of open data. In this more expansive vision, science will ultimately be transformed from a series of static snapshots represented by papers and grants into a more supple, real-time practice in which professionals and citizen scientists blend together, co-creating a publicly available shared knowledge. Michael Nielsen, author of the 2012 book Reinventing Discovery: The New Era of Networked Science, describes open science less as a set of specific practices and more as a process for amplifying collective intelligence to solve scientific problems more easily:
To amplify collective intelligence, we should scale up collaborations, increasing the cognitive diversity and range of available expertise as much as possible. This broadens the range of problems that can be easily solved … Ideally, the collaboration will achieve designed serendipity, so that a problem that seems hard to the person posing it finds its way to a person with just the right microexpertise to easily solve it.
Much has been made of the recent announcement of VP Biden's cancer moonshot program. In these days of ever-tightening research funding, every little bit helps, and the research community is obviously grateful for any infusion of funds. However, large-scale approaches to tackling cancer have been a staple of funding ever since Nixon announced his "War on Cancer" back in the 1970s, and any new approach must grapple with the often complicated history of research funding in this area. Ronin Institute Research Scholar Curt Balch has an interesting post over on LinkedIn breaking down some of these issues.
What seems relatively new in this iteration of the "war", however, is a greater awareness of the lack of communication among researchers taking different approaches to cancer. Biden has specifically mentioned this need and has pledged to "break down silos and bring all cancer fighters together". This…
The Big Short is just about the best film I’ve seen in quite a while. It’s as if Guy Ritchie and Michael Moore took some coke together and decided to make a film about the almost-complete financial meltdown of the world. Based on Michael Lewis’ 2010 bestseller, it delves deeply into both the mechanics of the crash and the mentality that drove us there. It doesn’t pander, isn’t emotionally overwrought and gives just about the best explanation that I’ve heard of a synthetic CDO thanks to Selena Gomez and behavioural economist Richard Thaler.
The casting is spot on, with Steve Carell giving an amazing, career-defining performance. The film has a fast-paced but not overly hyper-kinetic style, and is leavened throughout with a kind of gallows humour, as you'd expect given director Adam McKay's background in comedy. It also treats the underlying ideas seriously without ever feeling too complicated or plot-driven, no mean feat for a director.
The promise of the Internet as a means to "level the playing field" has seriously gone off the rails. A two-day conference at The New School, which wrapped up this past weekend, explored the emergence of platform cooperativism. Platform cooperativism aims to reclaim the democratic promise of the Internet from the rapacious, heavily leveraged, extractive models of the so-called "sharing economy", such as Uber and AirBnB, and move towards models of true user ownership and governance. As pointed out in a set of five summary essays that appeared in The Nation, these are not (mainly) technical challenges but legal and political ones. An example is FairCoop:
FairCoop is one among a whole slew of new projects attempting to create a more democratic Internet, one that serves as a global commons. These projects include user-owned cooperatives, “open value” companies structured like a wiki, and forms of community-based financing. Part of what distinguishes them from mainstream tech culture is the determination to put real control and ownership in the hands of the users. When you do that, the platform becomes what it always should have been: a tool for those who use it, not a means of exploiting them.
Many of these efforts will face an uphill battle, and as pointed out by Astra Taylor at the conference (she follows Douglas Rushkoff's presentation in the video link), they will probably be fiercely resisted by the newly entrenched platforms of Google, Facebook and the like. But we can say the same thing about those platforms, many of which were just small upstarts back in the 1990s. The real challenge is one familiar to evolutionary biologists from game theory: building systems that prevent "invaders" or "cheaters" (in this case, rapacious VC firms and super-capitalism in general) from swamping a population of mutually beneficial cooperators (or turning those cooperators into cheaters). It doesn't have to be, and could never be, perfect: you'll never reduce the population of cheaters to zero, but you can at least keep them from taking over your population completely.
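The game-theory intuition above can be made concrete with the standard replicator-dynamics model from evolutionary biology (this sketch is mine, not from the conference; the benefit `b`, cost `c`, and assortment parameter `r` are illustrative stand-ins). In a donation game, cooperators pay a cost to benefit others, so in a well-mixed population cheaters always invade. But if the system is structured so that cooperators interact with each other more often than chance — the analogue of cooperative platforms building their own networks and governance — cooperation can resist invasion:

```python
def step(x, b, c, r, dt=0.01):
    """One replicator-dynamics step for a donation game.

    x: current fraction of cooperators in the population
    b: benefit a cooperator confers on its partner
    c: cost the cooperator pays to confer it
    r: assortment -- probability of meeting your own type
       rather than a random member of the population
    """
    # Expected payoffs: with prob r you meet your own type,
    # otherwise a random opponent (cooperator with prob x).
    payoff_coop = r * (b - c) + (1 - r) * (x * b - c)
    payoff_cheat = (1 - r) * x * b  # cheaters never pay the cost
    avg = x * payoff_coop + (1 - x) * payoff_cheat
    # Replicator equation: strategies above the mean payoff grow.
    return x + dt * x * (payoff_coop - avg)

def run(x0, b, c, r, steps=20000):
    """Iterate the dynamics and return the final cooperator fraction."""
    x = x0
    for _ in range(steps):
        x = step(x, b, c, r)
    return x
```

A little algebra shows the cooperators' payoff advantage is `r*b - c`, so cooperation spreads exactly when assortment exceeds the cost–benefit ratio (`r > c/b`). With `b=3, c=1`, a well-mixed population (`r=0.1`) collapses to all-cheaters, while modest clustering (`r=0.5`) drives cheaters out — the "keep invaders from swamping the cooperators" condition in miniature.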
Read more about platform cooperativism at The Nation…