Turning on an adblocker isn’t piracy: it’s just protecting yourself from unasked-for and unwanted content. If they want to insist that we watch something, then pay us for it.
Also, Perplexity is never going to stop me using Google. Google has done an incredibly efficient and thorough job of that already.
I do actually look at it a couple of times a week for professional reasons just to see what utter shite my clients are shoving down their necks.
Heck, if someone just put an ad as a hyperlinked image in the middle of a webpage, no blocker would stop me from seeing it and honestly I wouldn’t really care. For a while people just kind of accepted that as how the web worked, and it’s the advertisers insisting on tracking more and more that changed that.
Big tech companies typically do not disclose the resources required to run their models, but outside researchers such as Luccioni have come up with estimates (though these numbers are highly variable and depend on an AI’s size and its task). She and her colleagues calculated that the large language model BLOOM emitted greenhouse gases equivalent to 19 kilograms of CO2 per day of use, or the amount generated by driving 49 miles in an average gas-powered car. They also found that generating two images with AI could use as much energy as the average smartphone charge. Others have estimated, in research posted on the preprint server arXiv.org, that every 10 to 50 responses from ChatGPT running GPT-3 evaporate the equivalent of a bottle of water to cool the AI’s servers.
oh, so, it’s about money to feel guilty about when using it?
theft for profit isn’t nice, but compared to destruction of the biosphere for profit (yeah, right, capitalism), it’s really the least of the problems with “ai”.
what would be really helpful, though, would be regulation on ai, its cousin blockchain*, and on greenhouse gas production.
and to get there, people have to acknowledge that large-scale ml usage is problematic.
( * part of the reason the texas power grid failed and so many people died a few winters back was that none of the crypto miners were required to shut down during the storm. ml will have these same issues )
Not just misinterpreting, but misrepresenting basic facts (often in line with the kind of prejudices that get baked into datasets). E.g. the recent Google “AI” summary of the story from Texas where a woman tried to drown two Muslim children got transformed by the “AI” such that in Google’s version, it was the perpetrator that was suddenly Muslim…
While this does a pretty good job generally, the problem with these things is that they present things as facts that just are not true. At least with a Google search you have to put the information together yourself and figure out what is really accurate.
For instance, a search of my name and profession here: https://www.perplexity.ai/search/dave-blair-video-editor-YPwfYHsFQoeuKOiZ75QXyQ brings up truthful information, but also says “Additionally, Blair is a co-founder and CEO of Fanward, a company that provides video production and editing services” which is not true. There is someone by that name there, but it’s not me.
This is the problem: stating things as facts when they are just suppositions.
I don’t disagree, but I suspect most of the incoming AI legislation will be focused on deepfakes, copyright, and what kinds of ML usage should be illegal, rather than the environmental impact. Blockchain was the same, with lawmakers more concerned about tracking and taxing digital assets rather than emissions.
It’s a shame, because I do think comprehensive environmental regulation could go a long way towards solving the overuse and wild carbon footprint issues with both technologies.
Yes, this is exactly the biggest problem. They work by scraping sites and giving you a summary so you never have to go to the site. So people who actually write content, like journalists, won’t get any hits, and eventually won’t get paid. So they’ll stop producing content, and the enshittification of the web will continue, as everything becomes just AI-written crap.
Cool, cool, cool. Except that using AI search results burns through way more power. We need less random AI which doesn’t truly add anything to life, not more.
there really isn’t a way to solve the footprint problem. the technology requires huge amounts of power and water. worse, the only way to improve output quality is by adding more resources into its maw.
that, and the stealing of people’s work, are both externalities. and just like with oil companies, the difference between internalized costs and externalized costs is called profit.
if they were paying the true cost, it wouldn’t be profitable, and these technologies would look very different.
personal gas-powered vehicles, plastic bags, and plastic bottles were all super cool and convenient. these technologies are too. but it’s not going to be good for us.