Opinion

Pay for content, slice for responsible AI



Separate deals that Google and OpenAI have struck with the Wall Street Journal and the Financial Times suggest a way for technology firms to avoid bruising IP battles over training generative AI models on published content. Content creators and AI developers have a symbiotic existence at this stage of tech development. AI trains best on information created by humans, while output quality degenerates when models are trained on synthetic content. This makes the archival material of media organisations a vital resource in tech development, one that Big Tech is ready to pay for. Google, for instance, has agreed to pay WSJ $5-6 mn annually to develop new AI-related content and products. The alternative, of news sites denying access to content crawlers, entrenches bias that AI has the potential to magnify. That would strengthen the case for tighter regulation of a transformative technology.

A negotiated settlement between content creators and tech developers is also needed because AI is expected eventually to supplant human effort. The symbiosis between media and technology will deepen as more AI is used to produce information, making it all the more desirable that the engagement starts out on mutually beneficial terms. Media and technology have had a complicated relationship since the dawn of the internet, with accusations of revenue cannibalisation accompanying staggering increases in distribution reach. Media is a frontline industry for disruption by AI and, this time, media owners are determined to get their due at the outset.

This is good news for the millions of content creators who feed all manner of platforms, from social media to ecommerce. They can now look forward to a monetisation model that encourages their creativity. Individual content creators have negligible bargaining power with Big Tech and are otherwise forced to seek legal redress for copyright violation. Terms that tech companies strike with media organisations can serve as guides to compensating smaller content creators, who also contribute to technology development. For AI to be responsible, it must be responsible at the training stage as well.


