Using AI to offer analysis of a piece of writing is one thing. Using it to create “tips” from real people without the permission of those people is something else entirely.
Wired on the same behavior:
This is just creepy. It makes me wonder about the reasoning and decision-making process that went into approving it. Did nobody bring up the obvious ethical concerns? I mean, this just screams PLEASE OH PLEASE LITIGATE US RIGHT NOW!
You can’t make this up.
The upside is: weird times add clarity. People and organizations reveal their true colors. If you treat this crap as a warning label, suddenly a lot of companies become “A” companies without the need to research. Like in: “Check out our new A.–”
And it makes you more forgiving of dumb people. At least they’re alive.
Backlash and lawsuit. Apparently impersonating people is illegal in both New York and California.