CW Pakistan
Anthropic Users Must Choose To Opt Out Or Share Chats For AI Training

  • August 28, 2025
Anthropic is changing its data usage policies and now requires all Claude users to decide by September 28 whether their conversations can be used to train its AI models. This represents a major shift, as the company previously refrained from using consumer chats for training purposes. Anthropic now intends to use both conversations and coding sessions to improve its systems, and will extend data retention to as long as five years for users who do not opt out.

For years, Anthropic assured users that their prompts and outputs would be automatically deleted within 30 days, with only flagged or policy-violating content retained for up to two years. The extension of retention to five years signals a broader change in approach. Notably, these updates apply only to consumer services: Claude Free, Pro, Max, and Claude Code. Business-focused offerings such as Claude Gov, Claude for Work, Claude for Education, and API access remain unaffected. In this sense, Anthropic is following a strategy similar to OpenAI's, which also exempts enterprise customers from data training requirements.

In its communication, Anthropic has framed the new policy as an effort to provide users with choice while presenting it as a way to improve safety and model accuracy. According to the company, conversations that are shared for training help its systems better detect harmful content and improve overall reasoning, coding, and analytical skills in future model iterations. However, many industry observers believe the real motivation lies in the need for vast amounts of high-quality conversational data. With competitors like OpenAI and Google racing ahead, access to millions of real user interactions gives Anthropic a valuable edge in refining its models.

The policy update also highlights wider tensions across the AI industry as companies struggle with questions of transparency, data usage, and privacy. OpenAI itself is dealing with a court order requiring it to retain all consumer ChatGPT conversations indefinitely, including deleted chats, due to a lawsuit brought by The New York Times and other publishers. OpenAI’s COO Brad Lightcap has called this requirement unnecessary and at odds with the company’s promises of user privacy. Like Anthropic, OpenAI shields its enterprise customers through zero-data-retention agreements, leaving consumer users most exposed to sweeping data policies.

Anthropic’s implementation of the update has drawn further criticism for how it is presented to users. New users are asked their preference during signup, but existing users see a pop-up headlined “Updates to Consumer Terms and Policies.” The screen features a large “Accept” button, while the toggle to opt out of training is smaller, positioned below it, and set to “On” by default. Analysts argue this design could lead users to click accept without realizing they are consenting to share their conversations for training.

Privacy experts have long warned that the complexity of AI systems makes meaningful user consent nearly impossible. Regulators in the United States, including the Federal Trade Commission, have already cautioned AI companies against hiding important policy shifts in fine print or hyperlinks. Although the FTC has taken such positions in the past, its current level of oversight remains unclear, raising doubts about whether stronger enforcement will follow.

Ultimately, Anthropic’s move underscores both the industry’s need for data and the growing challenge of maintaining trust with users. While the company insists the choice is up to its customers, the policy reflects the wider reality that in today’s AI race, user data remains one of the most valuable resources.

Related Topics
  • AI training
  • Anthropic
  • Claude
  • data privacy
  • FTC
  • Google
  • OpenAI
  • tech policy
  • User Choice