OPINION:
I make my living chasing major weather events around the globe and documenting them on video and film. It’s dangerous and messy work, but I’m proud to help people understand the risks of extreme weather and provide images and data that meteorologists and scientists can study. Because my photos and videos are protected by U.S. copyright law, I have the right to get paid by filmmakers, news outlets and collectors who value my work and can support my family doing something I love.
As seems to be happening in every part of American life, the push to bring untested and unregulated artificial intelligence tools to market is pulling the foundation out from underneath storm chasing and creating major risks for American communities.
I am not opposed to responsible uses of AI. The tech can already help drones navigate cluttered airspace, and it holds huge promise in improving weather modeling and prediction using government data. No doubt other beneficial uses will emerge.
For now, those gains are far outweighed by the massive harms that irresponsible and unlicensed uses of AI inflict on storm chasers and every community struck by dangerous weather.
The most immediate problem is the scourge of phony AI-created weather images rushed onto social media platforms during or after emergencies. These images — which often show incorrect locations and storm paths, exaggerated damage and false crises, such as fabricated pictures of trapped civilians or pets — can confuse and frighten the public and misdirect emergency response resources.
Over time, AI falsehoods erode trust in official announcements and undermine legitimate efforts to inform and protect the public.
These artificial storm chase images also encourage recklessness and risk-taking by amateurs. When professionals post weather images, we stress the risks involved, include appropriate warnings and make clear the extensive training, planning, technology and skill it takes to face Mother Nature in her fury and survive. Online hoaxers don’t do any of this; instead, they sensationalize the work and minimize the risks involved.
The unlicensed use of storm chasers’ work to generate AI weather images also creates major economic risks, threatening our entire industry and, ultimately, the weather readiness and prediction it supports.
Here’s the problem: Phony weather images created by AI are all based on real footage that storm chasers risk their lives and spend a lot of money to capture. Yet AI companies don’t ask permission or pay us for our work, even when it is used to generate artificial products that undercut and devalue our originals.
Unsurprisingly, that’s not legal. Copyright law gives me and every other creator the right to decide whether and on what terms our work is used by others, including rejecting uses we find personally objectionable or that don’t make economic sense.
Developers claim they can override these rights because training AI counts as “fair use,” a legal doctrine that allows copyrighted works to be copied or used without permission for things like news reporting and classroom teaching. So far, courts are skeptical. A federal judge wrote in June that fair use “typically doesn’t apply to copying that will significantly diminish the ability of copyright holders to make money from their works,” which is exactly the case with most AI training.
Cases are still working through the legal system to resolve this for good, but recognizing the weakness of their position, the AI giants have unleashed their armies of lobbyists, asking the Trump administration to put a thumb on the scale and declare all AI training to be fair use.
So far, they haven’t had much success. Bedrock conservatives who respect private property, including intellectual property, are deeply skeptical about the world’s richest companies’ plan to steal copyrights. According to a recent poll, nearly 90% of Trump voters believe AI companies should have to get permission and pay before training for-profit models on copyrighted works.
If AI companies are allowed to steal the work of human storm chasers under vastly overreaching theories of fair use, the costs will be immense. Storm chasers will be driven out of business. Future weather events will lack real-time coverage. Property losses and even human casualties will rise. Eventually, the progress we have made on weather modeling and prediction will fizzle as new sources of reliable imagery become scarce.
There is a better way. Reject AI overreach, stick with long-standing copyright principles and respect individual creators, our work and our rights.
• Brandon Clement is the founder of WxChasing.