
‘Stop the Avalanche of Low-Quality’ Vs ‘Discovery Deficit’

September 2, 2010

Source: thecriticalcondition.com

Two items came to my attention during the last couple of months, each presenting a diametrically opposed view of the publishing landscape. Both drew a number of responses in the form of blog pieces and interactive comments, and I wanted to take some time to gauge the pulse of the commentators and perform my own analysis (see further below).

Stop the Avalanche of Low-Quality

The first appeared as a commentary titled ‘We Must Stop the Avalanche of Low-Quality Research’ in the well-respected Chronicle of Higher Education. Not recognizing any of the author names, I initially dismissed it as routine rhetoric, but the numerous comments the article attracted changed my mind: 179 in all during the four weeks commenting was open (Chronicle articles rarely attract comments in double digits). Basing their case primarily on a 2009 article in Online Information Review, which found that only 40.6 percent of the articles published in the top science and social-science journals were cited in the period 2002 to 2006, the authors argue that

– Too much publication raises the refereeing load on leading practitioners.
– The productivity climate raises the demand on younger researchers.
– Libraries struggle to pay the notoriously high subscription costs.
– The amount of material one must read to conduct a reasonable review of a topic keeps growing.
– Older literature isn’t properly appreciated, or is needlessly rehashed in a newer, publishable version.
– More isn’t better. At some point, quality gives way to quantity.

And they suggest the following fixes:

1) limit the number of papers a job or promotion candidate can submit to the best three, four, or five;
2) make more use of citation and journal “impact factors”;
3) limit manuscripts to five to six journal-length pages, as Nature and Science do, and put a longer version up on a journal’s Web site.

As familiar as the arguments are, the suggestions are hardly new: #1 is already followed by independent committees and review bodies on their own; for #2, the h-index is widely used where it matters most (in China); and #3 doesn’t even apply to web-centric publishing, although PNAS is opting to take this track (actually for an entirely different reason, that of turning the journal into a magazine format).

Relating to #2, a much better opinion piece was published in Nature earlier this year; it triggered a very healthy discussion and a full-length commentary. A while ago I also noticed Scholarometer, which seems to have some promise (I will review it at a later date).

As someone always trying to gauge the pulse of the community, I found the follow-up comments to the article highly valuable and informative. Although a majority of the comments dealt with the issue raised in the main article (‘stop the avalanche’), a significant number commented on alternate choices (‘publish everything’ and ‘search technologies’). Having no familiarity with an appropriate quantitative methodology, I decided to build my own metrics for the three notions. I first imported the comments into a doc file (which turned out to be around 20 pages) and then deleted entire comments and text pieces that had no relevance to the topic of the article (leaving 9 pages). I then tried to build a word cloud of the comments, but since they are free-form text the resulting cloud didn’t make any sense. So I counted the comments manually to build the following tallies for the three notions (of the 178 comments, only 87 directly addressed the issue raised by the authors):

Stop the avalanche:
– agree: 0
– disagree or implied disagreement: 78
– not clear: 9

Publish everything:
– favor or implied favor: 22
– no mention: 65

Search technology as an option:
– recommend or implied recommendation: 15
– no mention: 72

Very useful, isn’t it? The idea of the main article got almost zero support, while at the same time a significant percentage of the commentators voluntarily expressed alternate viewpoints: 25% suggested the ‘publish everything’ approach and 17% recommended letting ‘search technologies’ take over.
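For what it’s worth, those percentages fall straight out of the hand counts against the 87 relevant comments. Here is a minimal sketch of that arithmetic, assuming the tallies are simply recorded as label counts; the labels and variable names are my own shorthand, and only the totals come from the counts above:

```python
from collections import Counter

# Hand-counted totals from the tallies above; labels are my own shorthand.
tallies = Counter({
    "disagree with 'stop the avalanche'": 78,
    "'stop the avalanche' stance unclear": 9,
    "favor 'publish everything'": 22,
    "recommend 'search technologies'": 15,
})

RELEVANT = 87  # comments that directly addressed the issue raised by the authors

for label, count in tallies.items():
    share = count / RELEVANT
    print(f"{label}: {count} of {RELEVANT} ({share:.0%})")
```

Running this reproduces the figures quoted above: 22/87 rounds to 25% and 15/87 to 17%, while the 78 disagreements amount to roughly 90% of the relevant comments.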

Discovery Deficit

The second item appeared as an entry on Cameron Neylon’s blog. To quote the article:

The great strength of the web is that you can allow publication of anything at very low marginal cost without limiting the ability of people to find what they are interested in, at least in principle. Discovery mechanisms are good enough, while being a long way from perfect, to make it possible to mostly find what you’re looking for while avoiding what you’re not looking for.  Search acts as a remarkable filter over the whole web through making discovery possible for large classes of problem. And high quality search algorithms depend on having a lot of data.

Academic publishers have not yet started semantic enrichment of content. No one doubts that once that trend starts, the real cream of the web will rise to the top, and as a number of commentators on the post pointed out (or implied), this is the ‘ultimate route’.
