Trump Outlines New AI Regulation Plan: What’s in It and What’s Missing


The White House’s new policy framework for regulating generative artificial intelligence, released Friday, covers many areas, but one thing is clear: President Donald Trump wants the federal government to set the rules. And those rules appear to fall far short of what consumer and privacy advocates argue is necessary. 

The generative AI revolution has been underway for years, and US legislation has been slow to catch up. This is despite growing awareness of AI's harms and challenges: chatbots' dangerous impacts on mental health and child development, widespread legal wrangling over copyright protections, and the dangerous spread of deepfakes and AI-powered scams, to name a few. 

Sen. Marsha Blackburn introduced the new policy package, called The Trump America AI Act, in Congress on Thursday. The Tennessee Republican’s bill is an attempt to codify a vision based on Trump’s 2025 AI Action Plan, while delving into more legal specifics and providing guidance on implementing new laws (or changing existing ones). 


Trump has maintained that the federal government should be responsible for regulating the AI industry — and that requiring AI companies to comply with 50 different sets of state laws would prevent the US from “winning” the global AI race. However, a proposal to temporarily ban states from regulating AI failed back in July, when it was removed at the last minute from the massive budget bill, known as the “One Big Beautiful Bill Act.” 

Now, the White House is doubling down on its claim to be in charge, with a few exceptions. The plan addresses some of the biggest concerns people have about AI: job loss, copyright chaos for creators, rapidly expanding infrastructure such as data centers, and the protection of vulnerable groups like children. But critics say it doesn't go far enough to regulate the fast-growing AI industry. 

“It is light on protection and heavy on promotion of dangerous AI systems,” Alan Butler, president and executive director of the Electronic Privacy Information Center, said in a statement. “The American people deserve better, and Congress should do better than this.”

The White House’s new proposed AI laws

The White House’s 2026 AI proposal says Congress should not create a new governing body to oversee AI rules, but should let existing agencies and subject-matter experts regulate as they see fit. 

Protecting children: This is one area where the federal government won’t prevent states from creating laws. And many state governments are already leading the charge, especially in regulating romantic or companion chatbots. 

The plan highlights protecting kids from AI-powered deepfakes, a huge issue given AI's role in creating child sexual abuse material. Shielding young people from the ill effects of AI is an ongoing battle, with several high-profile cases linking teenagers' use of AI to self-harm and suicide.

Blackburn's policy plan includes general language related to kids' online safety. Existing measures like the Kids Online Safety Act and the Children's Online Privacy Protection Rule are, theoretically, designed to protect kids, but advocates and tech experts say they could create a chilling effect on free speech and lead to censorship.

Though Trump’s AI framework addresses censorship, it’s limited to preventing AI companies from including ideological or partisan bias in their products. Trump has previously railed against what he calls “woke” AI, a term the president and his allies have used to attack concepts like diversity, equity and inclusion.

Job loss: It's not just translators and data entry workers who are worried about losing their jobs to AI — legacy tech workers like coders and engineers are, too. There have been widespread concerns about AI disrupting the workforce, with retail giants like Amazon laying off thousands of employees in the name of AI efficiency. The White House says the government should use "nonregulatory" methods to focus on youth development and AI workforce training.

Infrastructure: In line with Trump’s previous AI Action Plan, the framework calls for states and local governments to streamline data center construction and operation. These facilities are increasingly controversial, with nearby residents reporting environmental damage and strain on their existing electrical grids, creating higher electric bills. 

Several big tech companies recently agreed to foot the bill for any higher electricity costs, but there’s no way to enforce the voluntary pledge.

Copyright: Whether the use of copyrighted materials in AI training is fair use or copyright infringement is one of the biggest legal issues of the AI age. The plan reiterates the administration’s position that AI companies are covered by fair use — meaning they wouldn’t have to obtain permission or pay for copyrighted content when creating their models. 

But, given the ever-growing number of lawsuits putting the same question to the judiciary, the plan says the federal government should allow those cases to play out. So far, limited cases involving Anthropic and Meta have carved out narrow victories for tech companies, not authors.

The framework document hints that the federal government could become a future licensing partner for AI companies, stating that it should “provide resources to make federal datasets accessible to industry and academia in AI-ready formats for use in training AI models and systems.”

(Disclosure: Ziff Davis, CNET’s parent company, in 2025 filed a lawsuit against OpenAI, alleging it infringed Ziff Davis copyrights in training and operating its AI systems.) 

Does the White House plan do enough?

Tech industry groups praised the administration’s proposals, while consumer advocacy groups offered skepticism at best. 

In a statement backing the plan, the Consumer Technology Association supported a single set of rules for the entire country. 

“AI can and will make us better, and we agree that children need special protection, First Amendment rights are paramount, harmful deep fakes should be regulated, and Congress should not act to restrict AI platforms from relying on fair use protection,” the tech industry trade group said.

But according to Samir Jain, vice president of policy at the Center for Democracy and Technology, the government’s playbook is rife with internal contradictions. While it calls for the federal government to preempt state rules and laws on AI development, it also says the federal government shouldn’t undermine state authority. 

“The White House’s high-level AI framework contains some sound statements of principles, but its usefulness to lawmakers is limited by its internal contradictions and failure to grapple with key tensions between various approaches to important topics like kids’ online safety,” Jain said in a statement.

Ben Winters, director of AI and data privacy at the Consumer Federation of America, said the proposal prioritizes Big Tech over consumers.

“It’s encouraging to see some stated desires to protect people from AI-generated scams and data abuse of minors, but it’s not enough,” Winters said in a statement. “We need to see money where their mouth is on the protections — more money for consumer protection agencies at both the federal and state levels. So far, they’ve done nothing but cut and hamstring them.”





The internet is changing, and so is the way we search for and find information. Behind every search query is a web crawler.

Yes, the machine that scans the web, retrieves data, and helps search engines such as Google sort information into searchable indexes. Search engines would be nothing without crawlers. But did you know there are now different types of crawlers?

Well, traditional crawlers like Googlebot have used rule-based systems for years to retrieve information, sift through links, and return results for user queries. This method is still effective, but it comes with a few limitations.

Now let's introduce the new age of AI-powered crawlers: a next generation of bots built on artificial intelligence and machine learning. These crawlers don't just scan sites; they comprehend them. By parsing semantics, tone, and context, they go above and beyond in the web-searching landscape.

In this blog, we'll discuss the differences between traditional and AI crawlers, how they will transform search in the future, and practical tips to help your content thrive in today's digital world.

So, let’s get started!

What Are Traditional Crawlers?

Traditional crawlers, namely Googlebot and Bingbot, are built on three principles: scan, copy, and index. They operate like librarians, indexing information using HTML structure, metadata, and keywords.

    • Process: They follow links, analyze code, and store page information in huge search databases.
    • Reliability: Works well with static websites and organized content.
    • Weakness: Struggles with changing websites, dynamic components such as JavaScript-heavy applications, and subtle context.

For example, a traditional crawler might fail to pick up the product information on a page when the site renames its class names or changes the page structure, causing indexing errors. This has pushed the industry toward smarter, AI-assisted approaches.
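The fragility described above is easy to demonstrate. Here's a minimal sketch (not a real crawler) of a rule-based extractor that keys on a hard-coded class name; all page snippets and names are hypothetical:

```python
from html.parser import HTMLParser

# A rule-based extractor that captures text inside elements carrying a
# specific class name -- the kind of brittle rule traditional crawlers rely on.
class PriceExtractor(HTMLParser):
    def __init__(self, target_class):
        super().__init__()
        self.target_class = target_class
        self._capture = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        classes = (dict(attrs).get("class") or "").split()
        if self.target_class in classes:
            self._capture = True

    def handle_data(self, data):
        if self._capture:
            self.prices.append(data.strip())
            self._capture = False

def extract(html, rule="product-price"):
    parser = PriceExtractor(rule)
    parser.feed(html)
    return parser.prices

page_v1 = '<span class="product-price">$19.99</span>'
page_v2 = '<span class="cost-display">$19.99</span>'  # same data, renamed class

print(extract(page_v1))  # ['$19.99']
print(extract(page_v2))  # [] -- the rule silently fails after a redesign
```

The second page carries identical information, but because the rule matches markup rather than meaning, the extractor returns nothing — exactly the indexing error the paragraph above describes.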

What Are AI Crawlers?


Intelligent crawlers go beyond being bots to act more like interpreters. Using natural language processing (NLP), computer vision, and machine learning, they can comprehend content in a way that approximates human understanding.

    • Context Awareness: AI crawlers don't just read the text; they infer meaning, tone, and purpose.
    • Flexibility: AI crawlers can identify and retrieve the right information even when a site's structure changes.
    • Multimedia Intelligence: They can process video, audio, and images, making them far more capable than rule-based bots.

Imagine a crawler that doesn't just read a blog post but knows whether it is a product review, a thought-leadership article, or a how-to guide. That is the promise of AI-assisted crawling.
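To make the idea concrete, here is a deliberately toy illustration. Real AI crawlers use trained NLP models, not keyword lists; this sketch only shows the *concept* of classifying a page by type rather than indexing raw keywords, and the cue words are invented:

```python
# Hypothetical cue words per content type -- a stand-in for a real classifier.
CONTENT_CUES = {
    "product review": ["rating", "pros", "cons", "verdict"],
    "how-to guide": ["step", "tutorial", "instructions"],
    "thought leadership": ["opinion", "believe", "future of"],
}

def classify_page(text):
    """Return the content type whose cue words appear most often."""
    lowered = text.lower()
    scores = {label: sum(lowered.count(cue) for cue in cues)
              for label, cues in CONTENT_CUES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify_page("Step 1: open the menu. This tutorial covers the basics."))
# -> 'how-to guide'
```

A production system would replace the cue lists with a language model, but the output is the same kind of signal: a semantic label attached to the page, rather than a bag of keywords.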

The Rising Dominance of Googlebot

According to recent stats from Cloudflare, Googlebot still dominates even as AI crawlers rise. Googlebot's traffic grew 96 percent from May 2024 to May 2025, peaking in April 2025 at 145 percent of its May 2024 level.

This spike coincided with Google's introduction of AI Overviews, which added generative answers to search results. Google's future lies in this hybrid of old-style crawling and AI enhancements, setting the stage for the two systems to coexist.

How Does Traditional Search Work?

To appreciate the changes, it helps to recall how search engines used to operate:

Crawling/Indexing – Bots scan websites and archive copies of pages on servers.

Ranking Algorithms – Pages are ranked by keyword relevance, backlinks, and content freshness.

Displayed Results – Results pages show ads, organic links, snippets, and panels.
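The crawl → index → rank pipeline above can be sketched in a few lines. This is a minimal in-memory inverted index with simple term-frequency ranking; real engines add link analysis (e.g. PageRank), freshness signals, and much more, and the pages here are made up:

```python
from collections import defaultdict

# Pretend these pages were fetched by a crawler (hypothetical URLs/content).
pages = {
    "a.example": "web crawlers index the web for search engines",
    "b.example": "search engines rank pages by relevance",
    "c.example": "recipes for sourdough bread",
}

# Indexing: map each term to the set of pages containing it.
index = defaultdict(set)
for url, text in pages.items():
    for term in text.lower().split():
        index[term].add(url)

# Ranking: score pages by how many query terms they contain.
def search(query):
    scores = defaultdict(int)
    for term in query.lower().split():
        for url in index.get(term, set()):
            scores[url] += 1
    return sorted(scores, key=scores.get, reverse=True)

print(search("sourdough bread"))      # only c.example matches
print(search("search engines"))       # a.example and b.example match
```

Keyword matching like this is exactly what the next section's AI-driven engines move beyond: the index stores words, not meaning.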

AI-Driven Search: A New Era

AI-based search engines go beyond keywords. They can:

    • Understand natural language – responding to complex, conversational questions.
    • Provide direct answers – eliminating the need to browse through several results.
    • Personalize findings – tailoring suggestions to the user's behavior.
    • Handle multimedia – analyzing videos and podcasts and recognizing voice.

Tools like ChatGPT, Google Gemini, and Microsoft Copilot, built on large language models, can turn search into a conversation instead of a list of results.

AI Crawlers vs Traditional Crawlers: Key Differences

1. Understanding User Intent

Traditional Crawlers: Match queries by keywords, scratching the surface without necessarily understanding what the query means.

AI Crawlers: Go beyond the keyword, interpreting user intent, semantics, and context to deliver more useful information.

2. Scalability and Efficiency

Traditional Crawlers: Can amass huge volumes of data, but may create duplicate or irrelevant records because they have little awareness of context.

AI Crawlers: Filter and prioritize content intelligently, producing leaner, more efficient, and more relevant indexing.

3. Real-Time Adaptation

Traditional Crawlers: Struggle to keep up with new website structures and emerging technologies, requiring manual updates.

AI Crawlers: Learn and adapt in real time, recognizing patterns and evolving without human intervention.

4. Content Depth and Quality

Traditional Crawlers: Typically access visible text and links, often ignoring multimedia, user-generated, and interactive content.

AI Crawlers: Analyze multimedia, dynamic content, and even sentiment to build a more refined picture of overall page quality.

Quick Wins for Crawlability

Technical SEO remains essential even as AI advances. The following are quick fixes to increase crawlability:

    • Serve important pages with server-side rendering (SSR).
    • Keep HTML lean, semantic, and clean.
    • Improve page speed – sluggish sites fall behind.
    • Provide clear, descriptive headings and titles (H1–H3).
    • Avoid blocking AI crawlers in robots.txt or llms.txt.
    • Publish verifiable, factual, well-formatted, and timely information.
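On the robots.txt point, a minimal policy that welcomes AI crawlers rather than blocking them might look like the sketch below. The user-agent tokens shown (GPTBot, Google-Extended, CCBot) are the commonly published ones, but they change over time, so verify each against the vendor's own documentation before deploying:

```text
# robots.txt -- example policy that allows well-known AI crawlers
User-agent: GPTBot
Allow: /

User-agent: Google-Extended
Allow: /

User-agent: CCBot
Allow: /

# Default rules for all other crawlers
User-agent: *
Allow: /
Disallow: /private/
```

More specific user-agent groups take precedence over the `*` group, so the AI crawlers listed here are allowed everywhere while other bots still respect the `/private/` exclusion.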

Conclusion: Preparing for the Future of Indexing

The future of search lies at the intersection of traditional and AI crawling. While rule-based crawlers remain essential, AI-powered crawlers bring a new level of intelligence, adaptability, and context awareness.

For brands, this means rethinking SEO strategies and embracing AI Optimization (AIO) alongside Generative Engine Optimization (GEO). By preparing content for AI-driven indexing today, businesses can ensure long-term visibility, authority, and discoverability in tomorrow’s search ecosystem.

Stay tuned with us for all the latest blog topics!
