The Publish or Perish (PoP) software is a swift and elegant tool that provides the essential output features Google Scholar lacks, generating 18 bibliometric and scientometric indicators from the result set produced by Google Scholar. PoP lets the user edit the result lists, which are presented in a compact, efficient grid format. It facilitates the identification and removal of duplicate entries by offering dynamic sorting of the set by eight metadata elements, un-checking of items, and instant recalculation of the indicators. PoP has a variety of export options and formats, including the comma-separated value (CSV) file format widely used for exporting and importing records to and from many spreadsheet and database programmes.
PoP is also an excellent resource for gauging the scale of some of the errors of commission and omission in Google Scholar's result sets, beyond the duplicates. However, even after editing and refining the result sets, the indicators cannot be taken at face value, because they are computed directly from the often erroneous or dubious publication counts, citation counts and publication years reported by Google Scholar. Compounding the problem, PoP does not distinguish between the master records and the citation records (identified by the [citation] prefix in Google Scholar), which often inflates the publication count significantly.
Some changes are recommended to enhance this useful utility: allowing users to clean and edit the erroneous entries in an exported result set and then load it back into PoP for recalculation of the indicators. It is also suggested that PoP offer the option to upload result lists produced in CSV format by Web of Science and Scopus, which have much more reliable and reproducible data than Google Scholar.
Background
Hirsch developed his h-index and presented his samples using the Web of Science database. Some of the most experienced, influential and productive scientometricians endorsed the concept behind the h-index, suggested several variants based on it, and ran test samples producing and comparing the original h-index and its variants for researchers of different status, in different disciplines, as well as for journals, institutions and countries.
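For readers unfamiliar with the metric discussed above: Hirsch's h-index is the largest h such that the researcher has at least h publications each cited at least h times. A minimal sketch of the computation, using an invented citation list purely for illustration:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:  # the paper at this rank still has enough citations
            h = rank
        else:
            break
    return h

# Example: four papers have >= 4 citations, but not five with >= 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The variants proposed by later scientometricians (g-index, contemporary h-index, and others) adjust this ranking rule, but are computed from the same sorted citation counts, which is why erroneous counts from Google Scholar propagate into all of them.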
Web of Science, and then a...