New Journal Information Page

I’ve gathered information on a new page, linked to in the right sidebar, titled Decade and Longitudinal Journal Information. It updates and improves the citation-based information contained in the page published earlier, titled Philosophy Journal Information: ESF Rankings, Citation Impact, & Rejection Rates.

The new citation-based information is in four columns: the first two use citations in Google Scholar from 2000 to 2010 (data gathered May 25-27, 2010), and the second two have no date restrictions.

One interesting thing to note is when the disparity between longitudinal numbers and decade numbers is unusually large or unusually small. For relatively new journals, an unusually small disparity is expected, but for mature journals, it is interesting to note which journals are on the decline and which on the rise. For such purposes, percentages are needed:
mean percentage of decade vs. longitudinal h-values: .48
mean percentage of decade vs. longitudinal g-values: .41
mean percentage of decade vs. longitudinal h-values for type 1 journals: .45
mean percentage of decade vs. longitudinal h-values for type 2 journals: .54
mean percentage of decade vs. longitudinal h-values for type 3 journals: .41

With just a brief look, I notice these journals above the .7 level and below the .3 level:

ON THE DECLINE: Review of Metaphysics, Journal of the History of Ideas, Kant-Studien, Phronesis, Ancient Philosophy, Mind, Journal of Philosophy, American Philosophical Quarterly, Southern Journal of Philosophy.

ON THE RISE: Mind and Language, Midwest Studies in Philosophy, Religious Studies, Kantian Review, Biology and Philosophy, Economics and Philosophy, Social Philosophy and Policy, Philosophy and Literature, Philosophy and Phenomenological Research, Philosophy.

Of these, perhaps the most telling point is that the journals that have received the most complaints on online sites about their management and refereeing practices are in the “decline” group. Not surprising, but still noteworthy.
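
For concreteness, here is a minimal sketch (in Python, with made-up journal names and h-values rather than the actual dataset) of the decade-to-longitudinal ratio and the .7/.3 cutoffs used above:

    # Sketch of the ratio-and-threshold classification described above.
    # Journal names and h-values are hypothetical placeholders, not the dataset.
    journals = {
        # name: (decade h-value HD, longitudinal h-value H)
        "Journal A": (30, 119),
        "Journal B": (38, 39),
        "Journal C": (21, 22),
    }

    def classify(hd, h, rise=0.7, decline=0.3):
        """Return the HD/H ratio and a label based on the .7/.3 cutoffs."""
        ratio = hd / h
        if ratio >= rise:
            return ratio, "on the rise"
        if ratio <= decline:
            return ratio, "on the decline"
        return ratio, "neither"

    for name, (hd, h) in journals.items():
        ratio, label = classify(hd, h)
        print(f"{name}: HD/H = {ratio:.2f} ({label})")

    # Mean ratio across the set, analogous to the mean percentages reported above.
    print(f"mean HD/H: {sum(hd / h for hd, h in journals.values()) / len(journals):.2f}")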


Comments


  1. Hi Jon, I’m uncertain why you think these deltas indicate a decline in standing or impact for all the journals you’ve mentioned, if that is what you meant to convey by ‘in decline’. Perhaps you could say more about what you had in mind?

    Let me say a bit about why I doubt that a strong case for decline in impact can be made from this data. To conserve space, look at HD (the decade h-value) and H (the longitudinal h-value) for the following type 1 journals, ordered by HD:

    1. Midwest Stud. 38 / 39
    2. Synthese. 35 / 84
    3. Ethics 34 / 76
    4. Studia Logica 31 / 50
    4. Philosophy 31 / 42
    6. J of Phil. 30 / 119
    7. PPR 28 / 38
    7. Phil Review 28 / 94
    9. Analysis 26 / 60
    9. Mind 26 / 88
    9. Phil Studies 26 / 58
    12. J of Phil Logic 21 / 59
    12. European J of Phil 21 / 22
    14. Australasian J of Phil 20 / 43
    14. Phil Quarterly 20 / 44
    16. Proc Aristotelian Soc 19 / 59
    17. The Monist 18 / 40
    17. Erkenntnis 18 / 51
    19. Nous 16 / 52
    19. Ratio 16 / 52
    21. Pacific Phil Quarterly 13 / 25
    22. American Phil Quarterly 13 / 55
    22. Inquiry 13 / 35
    22. Religious Studies 13 / 15

    33. Southern Journal of Philosophy 6 / 22

    Just to help in comparison, if ordered by H we’d have the top 15:

    1. J of P (119)
    2. Phil Review (94)
    3. Mind (88)
    4. Synthese (84)
    5. Ethics (76)
    6. Analysis (60)
    7. J of Phil Logic (59)
    7. Proc of Aristotelian Soc (59)
    7. Phil Studies (59)
    9. Phil Studies (58)
    10. APQ (55)
    11. Nous (52)
    12. Erkenntnis (51)
    13. Studia Logica (50)
    14. Phil Quarterly (44)
    15. Australasian J of Phil (43)

    The first point to observe is that only J of P, Mind, Phil Review, Synthese, Ethics, Analysis, and Phil Studies are in the top 11 on both orderings. (Eleven because of a tie). Note that the members of the top 11 by HD are the same as those by GD, although there is some swapping of positions, and that these same 7 journals appear in both the top ten by GD and G. In other words, their standing as top type 1 journals is robust under all the measures considered.

    Given this observation of robustness, I see no evidence to suggest that J of P or Mind are declining in influence or impact. There are alternative explanations for the difference between decade and longitudinal scores: these journals often publish seminal papers that have an unusually long half-life.

    APQ moves around a bit, from 10 (H) to 20 (HD), and there is a lot of movement for Review of Metaphysics, but the movement of the Southern Journal from 27 (H) to 35 (HD) appears to me mostly churn in this lower grouping.

    On this view, four journals (again in type 1) that are in the top ten (HD) but not in (H) might be candidates for having recent or current influence. These are: Studia Logica, Philosophy, Midwest Studies, and PPR. Note that this designation does not hold for (GD) and (G).
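
    A rough sketch of this kind of robustness check, with hypothetical scores standing in for the actual data, is simply to intersect the top-k sets under each measure:

        # Which journals stay in the top k under every measure (HD, H, GD, G)?
        # The scores below are hypothetical, for illustration only.
        scores = {
            "Journal A": {"HD": 30, "H": 119, "GD": 55, "G": 180},
            "Journal B": {"HD": 35, "H": 84, "GD": 60, "G": 140},
            "Journal C": {"HD": 13, "H": 15, "GD": 20, "G": 25},
        }
        k = 2
        top_sets = [
            set(sorted(scores, key=lambda j: scores[j][m], reverse=True)[:k])
            for m in ("HD", "H", "GD", "G")
        ]
        print("In the top", k, "under every measure:", set.intersection(*top_sets))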

  2. Greg, actually, your ranking provides evidence of the sort I was citing. How significant it is, is another question, but it counts that JP drops from 1st to 6th and Mind drops from 3rd to 9th. This evidence isn’t independent of the percentage numbers I was using. I grant that both are still top tier journals, but there are more discriminations to be made than just at the coarse-grained level of which journals are top 10 or top whatever journals.

  3. Argh, typos. The most serious is that Phil Studies appears twice in the second list. The correct entry is Phil Studies (58). So, strike out Phil Studies (59).

  4. If I’m understanding the quantities you’re comparing, it seems that J of P and Mind are punished for the difference in (*) and (*D), but there are explanations for that difference which suggest that these two journals have more impact rather than less. The half-life of articles in these journals appears to be much longer. (It is tricky to do time-slices, since the number of publications has been increasing over time as well.)

    I don’t see a lot of reason to invest much in position swapping in the top ten by (*) and (*D), particularly since the values are so close. A difference in HD-impact between 30 (J of P) and 35 (Synthese) isn’t that informative on its own, and when you see the various measures simply switch the ordering of the same 7 titles, with various other titles popping in and out of the top 10 depending on the measure one selects, it suggests to me that the difference is mostly uninformative.

  5. It took me a while to understand the numbers, and I am not sure that I do as yet. But here is a question. Gregory suggests a long half-life diagnosis. I assume that in the case of a journal that publishes a lot of immediately discussed articles — in a way that Analysis used to — you’d get a high HD, but not necessarily a high H. As well, young journals could be expected to have a high HD/H. Looking at the top journals in the HD list, however, no pattern emerges for me. Midwest Studies has been going for quite a while — why such a high HD/H? Synthese, Ethics, and Philosophy have been going forever — why high HDs and high HD/H? (It isn’t as if they have suddenly become good.)

    Anyway, maybe what one needs to back up the decline hypothesis is snapshots of both HD and H — what was JPhil’s HD in 2000, 1990, etc? Is this going down? (H may not be quite as significant for various reasons, but maybe snapshots of H/year?)

  6. It is true that Analysis tends to be very current, but it too has a popular back catalogue: people still cite Gettier’s paper and Makinson’s preface paper, to give anecdotal evidence, both of which are from the 60s.

    Synthese publishes roughly 6 volumes a year, with three issues per volume, and the measures are sensitive to number of papers. (Analysis might benefit from this, too, as they have room for quite a number of papers per year.)

    Further, the h-index was originally conceived to compare scientists at a given career stage by their productivity, and this restriction is meant to keep a widely varying number of papers (N) from undermining otherwise reasonable comparisons. Self-citations are also not filtered out, and in the case of a journal like Analysis (again, just as an illustrative example), this is relevant, since it tends to publish threads of papers that ping-pong with references to one another, all in the same journal.

    Although the measure was not designed for historical analysis, it would be interesting to see snapshots like Mohan Matthen has suggested. Even so, those counts will still need to be compared locally, something like I’ve done above, as the volume of publications has been increasing from decade to decade in most disciplines and cross-time comparisons will need to be normalized. All this aside, I doubt that a distance measure of the sort proposed will get at the questions about decline; instead, the data will need to be collected and looked over to see what kind of analysis will sort signal from noise, and then our various hypotheses can be put to the test.
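
    For readers less familiar with the measures under discussion, here is a minimal sketch of how an h-value and a g-value are computed from a list of per-paper citation counts (standard definitions; the sample counts are made up):

        def h_index(citations):
            """Largest h such that at least h papers have h or more citations each."""
            cites = sorted(citations, reverse=True)
            h = 0
            for i, c in enumerate(cites, start=1):
                if c >= i:
                    h = i
                else:
                    break
            return h

        def g_index(citations):
            """Largest g such that the top g papers together have at least g**2 citations."""
            cites = sorted(citations, reverse=True)
            total, g = 0, 0
            for i, c in enumerate(cites, start=1):
                total += c
                if total >= i * i:
                    g = i
            return g

        sample = [50, 40, 18, 12, 9, 7, 3, 3, 1, 0]   # made-up citation counts
        print(h_index(sample), g_index(sample))        # prints 6 10 for this sample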

  7. Gregory, by “local” comparisons, I take it you mean rankings? So we’d want to see how various journals ranked with respect to HD back in 2000, 1990, etc. That seems a sensible way to measure decline.

    I played a bit with H-number divided by square root of number of years (or perhaps number of volumes) taken into account. That may be a little too big a discount factor as we go back in time, but note that JPhil stays roughly constant at 10. (HD = 30, divided by square root of 10 gives ~10; H = 119 divided by square root of 108 gives ~10 again.)

    Call that adjusted H. For Phil Rev again, the two values are roughly 9, and for Analysis roughly 8.
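
    A small sketch of that adjusted H (the H-number discounted by the square root of the number of years counted), using the J Phil figures mentioned above:

        from math import sqrt

        def adjusted_h(h, years):
            """Discount an h-value by the square root of the years it covers."""
            return h / sqrt(years)

        # J Phil figures from above: HD = 30 over 10 years, H = 119 over roughly 108 years.
        print(round(adjusted_h(30, 10), 1))    # ~9.5
        print(round(adjusted_h(119, 108), 1))  # ~11.5, i.e. the same rough neighbourhood of 10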

    Midwest Studies remains a mystery to me. It would be hard pressed to make top twenty in H, yet it is tops in HD. Any idea why?

  8. Midwest was a surprise to me, too. I visited the site and saw that they dedicate each volume to a special topic, and this might attract the “survey article” effect: boosting citations by writing the latest go-to article that sums up what’s happening in a field. (The running joke in my department is that someone should write “A survey of logics for biochemistry”. I digress.)

    So, an explanation for a rather high ten-year rank but a (comparatively) low overall rank is that, well, about every 10 years you get a new batch of survey articles to replace the old ones. (That’s glib, even by my low standards, but something like that might explain what’s going on.)

  9. Good thoughts, Mohan and Greg. Let me point out, though, that “on the decline” and “on the rise” are operationally defined here in terms of the .7 and .3 thresholds mentioned. I take that as evidentially relevant to which journals are doing worse and which are doing better, but certainly not as decisive, even in terms of purely quantitative analyses of the data. So there is room for the sorts of issues being pressed here, and I don’t have an overall opinion about how the quantitative assessment would turn out. But I’d be pleased if journals with high levels of complaint were also journals that were having less influence in the profession and would continue to have less influence. It is for that reason that I used the terminology in question, even though it is operationally defined in a way that bars one from using the ordinary sense of the phrases.

  10. Not to labour the point, but falling under the .3 threshold should be expected in journals such as the top 3, each of which has been a must-read for over a hundred years. On the other hand, the ordinal drop in rank from H to HD may be significant. (I think this was Greg’s point about local comparisons.)

  11. Pingback: Dataset for New Journal Information Page » Certain Doubts

  12. Mohan, you’re right. I haven’t done this, but suppose we look at, say, the top 20 or top 30 journals from 20-3 years ago. If we control in that way, the differences would be more significant, no? (And control for number of articles published…) I still suspect that Mind and JP wouldn’t fare well, but I haven’t checked.
