Last week, OpenAI, the company behind ChatGPT, launched its GPT Store, which has been compared to Apple’s App Store. Why? Essentially, anyone can now create their own customized chatbot (called a GPT) and either keep it private, share it with a few people or share it with the world. On top of that, no coding is needed.
Until now, you could already customize your own GPT on a premium plan, but this marketplace broadens the range of customized GPTs available, letting you get what you need rather than make one yourself. “For example, GPTs can help you learn the rules to any board game, help teach your kids math, or design stickers,” the company writes on its site.
Since then, millions of custom GPTs have been created, and OpenAI announced that early this year it will launch a revenue-sharing program that pays creators based on user engagement with their GPTs.
But by day two of the store launch, rules were already being broken.
When the store launched, OpenAI also updated its policy, banning GPTs “dedicated to fostering romantic companionship or performing regulated activities.” It’s not entirely clear what “regulated activities” means, though. Last week, one quick search for “girlfriend” brought up at least eight AI chatbots, including “Virtual Sweetheart,” “Korean Girlfriend” and “Everyone’s Girlfriend.”
This isn’t exactly a new concept: millions of people already form relationships with artificial intelligence (AI) chatbots amid a loneliness epidemic. In fact, reports show that nearly a quarter of people worldwide say they feel very or fairly lonely.
But some say this just goes to show how difficult these GPTs could be to regulate. In the meantime, OpenAI says it uses a variety of methods, including user reports, automated systems and manual reviews, to determine whether a GPT violates its usage policies.