For Christmas I received an intriguing gift from a friend - my very own "best-selling" book.
"Tech-Splaining for Dummies" (excellent title) bears my name and my photo on its cover, and it has glowing reviews.
Yet it was entirely written by AI, with a few simple prompts about me supplied by my friend Janet.
It's an interesting read, and hilarious in parts. But it also meanders quite a lot, and sits somewhere between a self-help book and a stream of anecdotes.
It mimics my chatty style of writing, but it's also a bit repetitive, and very verbose. It may also have gone beyond Janet's prompts in gathering information about me.
Several sentences start "as a leading technology reporter ..." - cringe - which could have been scraped from an online bio.
There's also a mysterious, repeated hallucination in the form of my cat (I have no pets). And there's a metaphor on almost every page - some more random than others.
There are lots of companies online offering AI book-writing services. My book was from BookByAnyone.
When I contacted the chief executive Adir Mashiach, based in Israel, he told me he had sold around 150,000 personalised books, mainly in the US, since pivoting from compiling AI-generated travel guides in June 2024.
A paperback copy of your own 240-page long best-seller costs £26. The company uses its own AI tools to generate them, based on an open-source large language model.
I'm not asking you to buy my book. Actually you can't - only Janet, who created it, can purchase any further copies.
There is currently no barrier to anyone creating one in anyone's name, including celebrities - although Mr Mashiach says there are guardrails around violent content. Each book contains a printed disclaimer stating that it is fictional, created by AI, and designed "entirely to bring humour and pleasure".
Legally, the copyright belongs to the company, but Mr Mashiach stresses that the product is intended as a "personalised gag gift", and the books do not get sold further.
He hopes to broaden his range, producing different genres such as sci-fi, and perhaps offering an autobiography service. It's designed to be a light-hearted form of consumer AI - selling AI-generated products to human consumers.
It's also a bit scary if, like me, you write for a living. Not least because it probably took less than a minute to generate, and it does, certainly in parts, sound just like me.
Musicians, authors, artists and actors worldwide have expressed alarm about their work being used to train generative AI tools that then produce similar content based upon it.
"We ought to be clear, when we are speaking about information here, we actually mean human developers' life works," says Ed Newton Rex, founder of Fairly Trained, funsilo.date which campaigns for AI firms to respect creators' rights.
"This is books, this is short articles, this is pictures. It's artworks. It's records ... The entire point of AI training is to learn how to do something and after that do more like that."
In 2023 a song featuring AI-generated voices of Canadian singers Drake and The Weeknd went viral on social media before being pulled from streaming platforms because it was not their work and they had not consented to it. That didn't stop the track's creator from trying to nominate it for a Grammy award. And although the artists were fake, it was still hugely popular.
"I do not believe using generative AI for imaginative purposes must be banned, but I do think that generative AI for these purposes that is trained on individuals's work without approval must be banned," Mr Newton Rex adds. "AI can be really powerful however let's build it fairly and fairly."
In the UK some organisations - including the BBC - have chosen to block AI developers from trawling their online content for training purposes. Others have decided to collaborate - the Financial Times has partnered with ChatGPT developer OpenAI, for example.
The UK government is considering an overhaul of the law that would allow AI developers to use creators' content on the internet to help develop their models, unless the rights holders opt out.
Ed Newton Rex describes this as "madness".
He points out that AI can make advances in areas like defence, healthcare and logistics without trawling the work of authors, journalists and artists.
"All of these things work without going and changing copyright law and messing up the incomes of the country's creatives," he argues.
Baroness Kidron, a crossbench peer in the House of Lords, is also strongly against removing copyright law for AI.
"Creative markets are wealth developers, 2.4 million jobs and a lot of pleasure," says the Baroness, who is likewise an advisor to the Institute for Ethics in AI at Oxford University.
"The government is undermining one of its finest performing industries on the vague promise of growth."
A government spokesperson said: "No move will be made until we are absolutely confident we have a practical plan that delivers each of our objectives: increased control for right holders to help them license their content, access to high-quality material to train leading AI models in the UK, and more transparency for right holders from AI developers."
Under the UK government's new AI plan, a national data library containing public data from a wide range of sources will also be made available to AI researchers.
In the US the future of federal rules to control AI is now up in the air following President Trump's return to the White House.
In 2023 President Biden signed an executive order that aimed to boost the safety of AI with, among other things, firms in the sector required to share details of the workings of their systems with the US government before they are released.
But this has now been rescinded by Trump. It remains to be seen what Trump will do instead, but he is said to want the AI sector to face less regulation.
This comes as a number of lawsuits against AI firms, and in particular against OpenAI, continue in the US. They have been brought by everyone from the New York Times to authors, music labels, and even a comedian.
They claim that the AI firms broke the law when they took their content from the web without their consent, and used it to train their systems.
The AI companies argue that their actions fall under "fair use" and are therefore exempt. There are a number of factors which can constitute fair use - it's not a straightforward definition. But the AI sector is under increasing scrutiny over how it gathers training data and whether it should be paying for it.
If all of this wasn't enough to ponder, Chinese AI firm DeepSeek has shaken the sector over the past week. It became the most downloaded free app on Apple's US App Store.
DeepSeek claims that it developed its technology for a fraction of the price of the likes of OpenAI. Its success has raised security concerns in the US, and threatens America's current dominance of the sector.
As for me and a career as an author, I think that at the moment, if I really want a "bestseller" I'll still have to write it myself. If anything, Tech-Splaining for Dummies highlights the current weakness of generative AI tools for bigger projects. It is full of inaccuracies and hallucinations, and it can be quite hard to read in parts because it's so long-winded.
But given how quickly the tech is evolving, I'm not sure how long I can remain confident that my considerably slower human writing and editing skills are better.