Tech Nonprofits to Feds: Don’t Weaponize Procurement to Undermine AI Trust and Safety

Summary

Tech nonprofits are urging the U.S. General Services Administration (GSA) not to weaponize government procurement rules, warning that proposed guidelines could inadvertently make AI tools less safe and useful. The concern stems from requirements that contractors license AI systems to the government for "all lawful purposes," which nonprofits fear could enable increased surveillance and misuse of data by the government.

IFF Assessment

FOE

The proposed procurement rules could push AI systems toward uses that undermine user trust and safety, making it harder for defenders to protect individuals' data and privacy.

Defender Context

This development highlights the growing intersection of AI policy and cybersecurity, particularly around data privacy and government surveillance. Defenders should monitor how these procurement rules evolve: they could dictate the security and privacy features embedded in AI systems used by government contractors, potentially creating new attack vectors or enabling broader data collection.

Read Full Story →