Is Using AI to Write Morally Wrong?
Photo by Markus Winkler on Unsplash

Artificial intelligence is nothing new, but the past few years have brought increasingly impressive advances. AI can now be used to write content that sounds almost human, and in some cases it produces results that are nearly impossible to distinguish from the work of real human authors. Using AI to write is becoming more common as several tools grow in popularity.
Humans have been using technology to make their work easier since the dawn of time, and there is nothing inherently wrong with that. But using artificial intelligence to write does pose some interesting moral dilemmas that I believe we as writers, and as a society as a whole, will need to address.

Artificial intelligence is not going away.
Just like the invention of the automobile or the lightbulb, artificial intelligence is set to revolutionize how we work, live, and play. It has already brought with it a whole new level of convenience and accessibility.
Artificial intelligence is already being used in many sectors, including but not limited to manufacturing, health care, education, design, programming, retail, and transportation. As it is more widely adopted, AI is making previously tedious and time-consuming tasks simple and quick. Many routine tasks can be handed off to an “intelligent” system instead of requiring the constant, watchful oversight of a human.
Needless to say, AI isn’t going away. It’s here to stay. For better or worse.
What’s interesting to me is that more and more people are now using AI to write.

People are using AI to write?
Numerous tools now exist that leverage artificial intelligence to generate text. The output can be something as simple as an autocomplete suggestion, based on the probability of a given word or phrase appearing next, something as sophisticated as a multi-paragraph, structured chunk of text generated from a prompt, or anything in between.
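To make the simple end of that spectrum concrete, here is a minimal, purely illustrative Python sketch of next-word suggestion based on how often word pairs appear in a tiny text sample. This is my own toy example, not how Jasper, Jenni, or any particular product actually works; commercial tools rely on far larger language models, but the basic idea of predicting a likely next word from observed probabilities is similar.

```python
from collections import Counter, defaultdict
from typing import Optional

# Toy example: count how often each word follows another in a small sample,
# then suggest the most frequently observed next word.
sample_text = "the cat sat on the mat the cat ate the fish"

followers = defaultdict(Counter)
words = sample_text.split()
for current_word, next_word in zip(words, words[1:]):
    followers[current_word][next_word] += 1

def suggest(word: str) -> Optional[str]:
    """Return the most probable next word after `word`, if one was observed."""
    counts = followers.get(word)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

print(suggest("the"))  # -> "cat", since "cat" follows "the" most often in the sample
```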
Like never before, writers can leverage artificial intelligence to generate content with a few keystrokes and the click of a button.
Using a tool like Jasper or Jenni, a person can generate content that sounds close enough to human writing to be passed off as something he or she actually penned.
These tools, while neat, present a unique set of moral dilemmas. And as someone who loves both writing and technology, I’m conflicted.
There are a couple of popular AI writing tools that I hear mentioned frequently: Jenni and Jasper. Kristina God wrote about Jenni recently, and I’ve test-driven Jasper a bit out of curiosity.
These tools and the concepts behind them are fascinating to me. But there are moral implications here that we need to consider.

An AI writer needs to be trained, and mostly not by you.
Computers can’t think on their own. As much as some people may want us to believe otherwise, they are not capable of independent thought. Yet.

AI writing tools aren’t sentient programs that think about a […]