|
Tech Nonprofits to Feds: Don’t Weaponize Procurement to Undermine AI Trust and Safety

While the very public fight continues between the Department of Defense and Anthropic over whether the government can punish a company for refusing to allow its technology to be used for mass surveillance, another branch of the U.S. government is quietly working to ensure that this dispute will never happen again. How? By rewriting government procurement rules.

Using procurement (that is, the processes by which governments acquire goods and services) to accomplish policy goals is a time-honored and often appropriate strategy. The government literally expresses its politics and priorities by deciding where and how it spends its money. To that end, governments can and should give our tax dollars to companies and projects that serve the public interest, such as open-source software development, interoperability, or right to repair. And they should withhold those dollars from those that don’t, like shady contractors with inadequate security systems.

New proposed rules from the General Services Administration, the principal agency in charge of acquiring goods, property, and services for the federal government, are supposed to be primarily an effort to implement one policy priority: steering government funds toward “ideologically neutral” American AI innovation. But the new guidelines do far more than that. As explained in comments filed today with our partners at the Center for Democracy and Technology, the Protect Democracy Project, and the Electronic Privacy Information Center, the GSA’s guidelines include broad provisions that would make AI tools less safe and less useful. If adopted in their current form, these provisions would become standard components of every federal contract. You can read the full comments here.

The most egregious example is a requirement that contractors and government service providers must license their AI systems to the government for “all lawful purposes.” Given the government’s loose interpretations of the law, its ability to find loopholes to surveil you, and its willingness to engage in illegal spying, we need serious and proactive legal restrictions to prevent it from gobbling up all the personal data it can acquire and using even routine bureaucratic data for punitive ends.

Relatedly, the draft rules require that “AI System(s) must not refuse to produce data outputs or conduct analyses based on the Contractor’s or Service Provider’s discretionary policies.” In other words, if a company’s safety guardrails might prevent it from responding to a government request, the company must disable those guardrails. Given widespread public concerns about AI safety, it seems misguided, at best, to limit safeguards a company deems necessary.

There are myriad other problems with the draft rules, such as technologically incoherent “anti-woke” requirements. But the overarching problem is clear: much of this proposal would not serve the overall public interest in using American tax dollars to promote privacy, safety, and responsible technological innovation. The GSA should start over.
|
|