To Measure Bias in Data, NIST Initiates ‘Fair Ranking’ Research Effort

New initiative has long-term goal of making search technology more even-handed

Credit: N. Hanacek/NIST, Shutterstock
NIST
Wed, 12/18/2019 - 12:02

A new research effort at the National Institute of Standards and Technology (NIST) aims to address a pervasive issue in our data-driven society: a lack of fairness that sometimes turns up in the answers we get from information retrieval software.

A measurably “fair search” would not always return the exact same list of answers to a repeated, identical query. Instead, the software would consider the relative relevance of the answers each time the search runs—thereby allowing different, potentially interesting answers to appear higher in the list at times.
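One simple way to produce rankings with this property (a generic illustration, not necessarily the method NIST is studying) is Plackett-Luce sampling: instead of always sorting by score, each position is filled by drawing an item with probability proportional to its relevance. The item names and scores below are hypothetical.

```python
import random

def stochastic_ranking(items, relevance, rng=None):
    """Sample a ranking where the chance of an item being drawn next
    is proportional to its relevance score (Plackett-Luce sampling).
    Repeated calls yield different orderings, but higher-relevance
    items still tend to land near the top."""
    rng = rng or random.Random()
    remaining = list(zip(items, relevance))
    ranking = []
    while remaining:
        total = sum(score for _, score in remaining)
        pick = rng.uniform(0, total)
        cumulative = 0.0
        for i, (item, score) in enumerate(remaining):
            cumulative += score
            if pick <= cumulative:
                ranking.append(item)
                del remaining[i]
                break
        else:
            # Floating-point edge case: fall back to the last item.
            ranking.append(remaining.pop()[0])
    return ranking

# Hypothetical documents and relevance scores for illustration.
docs = ["A", "B", "C", "D"]
scores = [0.9, 0.8, 0.3, 0.1]
print(stochastic_ranking(docs, scores))
```

Run twice with different random seeds, the same query can return two different orderings, yet across many queries the most relevant documents still dominate the top positions.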

Software of this type is everywhere, from popular search engines to less-known algorithms that help specialists comb through databases. This software usually incorporates forms of artificial intelligence that help it learn to make better decisions over time. But it bases these decisions on the data it receives, and if those data are biased in some way, the software will learn to make decisions that reflect that bias, too. These decisions can have real-world consequences—for instance, influencing which music artists a streaming service suggests, and whether you get recommended for a job interview.

 …

© 2024 Quality Digest. Copyright on content held by Quality Digest or by individual authors. Contact Quality Digest for reprint information.
“Quality Digest" is a trademark owned by Quality Circle Institute Inc.
