Accurate Metrics in Digital Commons

Demonstrable, accurate, up-to-the-minute data about your institution’s scholarly impact is more important than ever. It has also never been more challenging to come by.

[Pie chart] Only 39% of downloads are verified human activity.

Spiders, bots, and crawlers

Just as not all bacteria are “bad,” some non-human activity is beneficial to your online content. For example, Google crawlers are instrumental in maintaining high discoverability through Google’s search engine. But a growing number of increasingly sophisticated agents are inflating download counts for open access material. How inflated? Using our filtering methodology, which is the most stringent in the industry, we end up with about 39 intentional human downloads for every 100 unfiltered or “raw” downloads.
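
To make that ratio concrete, here is a rough back-of-the-envelope sketch in Python. It simply applies a flat 39% multiplier, which is an assumption for illustration only; the real filtering works event by event rather than as a blanket discount.

```python
# Illustrative only: assumes the ~39% verified-human ratio cited above
# applies uniformly, which real per-event filtering does not guarantee.
VERIFIED_HUMAN_RATIO = 0.39  # ~39 intentional human downloads per 100 raw downloads

def estimate_human_downloads(raw_downloads: int) -> int:
    """Rough estimate of verified human downloads from a raw (unfiltered) count."""
    return round(raw_downloads * VERIFIED_HUMAN_RATIO)

for raw in (100, 2_500, 48_000):
    print(f"{raw:>7} raw downloads -> ~{estimate_human_downloads(raw)} likely human")
```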

Scalable, real-time filtering

We work with over 600 institutions and have a penchant for building visualization tools fed from live data, so no one has to manually pore over last month’s download report and judge whether a particular article could plausibly have drawn as wide an audience as its download count suggests. To help make sense of the data, we’ve developed a real-time filtering method that we continue to improve on as we (and the pesky bots) learn and grow.

Wheat from chaff

How do we tell human readers from machines? We could, of course, require logins and passwords, but we think that makes open access a little less open. Instead, we track patterns of human and non-human activity to build an ever-evolving process. We build on existing COUNTER standards, which are designed to eliminate erroneous human usage patterns, and extend them to remove machine readers as well. In short (see the illustrative sketch after this list), we:

  • Review downloads for an item when more than a threshold percentage of its readers come from a single IP address, excluding the host IP
  • Remove downloads as per COUNTER standards
  • Remove downloads from known robots and programs
  • Remove downloads from Digital Commons employees (local and remote)
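
The sketch below renders those four rules as plain Python. It is not Digital Commons’ actual implementation: the robot list, the employee and host IP addresses, the 50% concentration threshold, and the 30-second double-click window are all assumed values standing in for the real configuration.

```python
"""Illustrative sketch of the filtering rules described above (not production code)."""
from collections import defaultdict
from dataclasses import dataclass

# Assumed configuration values, chosen for illustration only.
KNOWN_ROBOT_AGENTS = ("googlebot", "bingbot", "ahrefsbot", "python-requests")
EMPLOYEE_IPS = {"203.0.113.10"}        # placeholder employee address
HOST_IP = "198.51.100.1"               # placeholder host/institution address
SINGLE_IP_SHARE_THRESHOLD = 0.50       # review items where one IP drives >50% of downloads
DOUBLE_CLICK_WINDOW_SECONDS = 30       # COUNTER-style double-click window

@dataclass
class Download:
    item_id: str
    ip: str
    user_agent: str
    timestamp: float  # seconds since epoch

def filter_downloads(events: list[Download]) -> list[Download]:
    """Drop robot, employee, and double-click downloads; keep the rest."""
    kept: list[Download] = []
    last_seen: dict[tuple[str, str], float] = {}  # (item_id, ip) -> last kept timestamp
    for e in sorted(events, key=lambda d: d.timestamp):
        # Remove downloads from known robots and programs.
        if any(bot in e.user_agent.lower() for bot in KNOWN_ROBOT_AGENTS):
            continue
        # Remove downloads from employees (local and remote).
        if e.ip in EMPLOYEE_IPS:
            continue
        # COUNTER-style double-click filtering: drop repeats within the window.
        key = (e.item_id, e.ip)
        if key in last_seen and e.timestamp - last_seen[key] < DOUBLE_CLICK_WINDOW_SECONDS:
            continue
        last_seen[key] = e.timestamp
        kept.append(e)
    return kept

def flag_ip_concentration(events: list[Download]) -> set[str]:
    """Flag items for review when a single non-host IP accounts for too
    large a share of that item's downloads."""
    per_item: dict[str, dict[str, int]] = defaultdict(lambda: defaultdict(int))
    for e in events:
        if e.ip != HOST_IP:
            per_item[e.item_id][e.ip] += 1
    flagged: set[str] = set()
    for item_id, by_ip in per_item.items():
        total = sum(by_ip.values())
        if total and max(by_ip.values()) / total > SINGLE_IP_SHARE_THRESHOLD:
            flagged.add(item_id)
    return flagged
```

In practice these checks would run against streaming logs and a maintained robot registry (such as the published COUNTER robots list) rather than hard-coded constants.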

Want to know more?

If you’d like an even deeper dive into what we do, you can watch our Bot Shields: Activate! webinar on download filtering (a reprisal of a presentation at Open Repositories 2016) to learn more about our process.

 

