Should We AI or Not?

Jamie Vernon
3/28/2024

AI is really attractive right now for its novelty. You can look at ChatGPT and see that 100 million users logged in yesterday. But to do what? There are people using ChatGPT for a chocolate chip cookie recipe or to write a high school paper. It makes you wonder why someone is consuming AI-based resources when a quick Google search or even a cookbook on the shelf will do. Speaking of papers, it's getting much easier for organizations to see that ChatGPT wrote your paper. I've learned that most college applicants don't use the word "tapestry," which is apparently one of the words that is a dead giveaway for ChatGPT.

Let's talk about the learning experience. The reason people are resorting to ChatGPT is that they don't want to write their papers. Maybe students don't have time to write them, which makes generative AI all the more seductive. But just as the AI tools are evolving, so are the detection tools, and they are catching more and more of these uses.

Business reluctance around AI – Security concerns

Many companies are actually still loath to use AI, and lots of CISOs are out there writing policies that basically say not to use it. One reason is that large language models like ChatGPT consume the prompts you send; once submitted, that input can become part of the model and is effectively impossible to erase. Companies have found that they've accidentally put in intellectual property, sensitive data, or any number of other things that ChatGPT literally can't forget. As a result, the sanctity of that data has been compromised in the attempt to use AI for something cool.

Underestimating the impact of AI on operations

Some people may adopt the perspective that AI is just the next iteration of the last 60 years of continuous automation, from manufacturing through all sorts of business processes. That, however, oversimplifies what's going on. I do think that what we're seeing is revolutionary and has great value for the enterprise. Like any other tool, AI needs to be approached thoughtfully and deliberately. Because it's a tool, at least for now, you have plenty of opportunities not to use it. I generally treat AI almost like a junior intern working for me in a field that I understand. But there are risks. For example, I'm not a sociologist, so if I ask ChatGPT to write a paper on any topic in sociology, I would personally have no basis for judging whether it's any good. That's where people are really getting tripped up.

A better way to use AI

Let's go back to the recent example of lawyers who submitted a brief citing several cases that, it turned out, didn't exist. Later we learned that they had used ChatGPT. They should have read the brief, and they should have researched the cases, at least superficially. On a topic where I can vet the outcome, I absolutely should do that. I owe it to myself and to my organization, because if I'm going to submit something as my work, I at least need to be able to stand by it with some confidence. Here's another, more creative example. If I'm going to generate a picture of some kind, I have a sense of what I want that picture to look like and whether the result is right or wrong. Everybody who's using generative AI for video, image, or audio generation should still check the output.

But is it art?

I had a fight with one of my children about the art contest in Colorado where it turned out the winner had submitted AI-generated art. This widely known incident made a lot of people angry. My now 18-year-old said it was not art, because a computer was used to generate it. My response was that hardly anybody does anything truly from scratch anymore. If I went to an interface, directed it to make me a picture of something, and just submitted the first thing that popped out, I would agree it was not art. The human input would be so minimal as to have no value.

But if the human spent 20 hours, or even 200 hours, doing what's commonly called prompt engineering, dialing in the resulting image to do exactly what they wanted and to evoke an emotional response, that's different. This person just did it with AI instead of Photoshop, instead of watercolors... the level of thought and care that went into the outcome starts to sound like art created through a different medium. In this case it's a tool that even I can use, because when I try to draw, the result isn't beautiful at all.

Expanded capacity, more resources

As a business tool, AI helps me come up with things I haven't considered. I am human and make certain assumptions. I have certain biases that I can't help but bring along. You can go into ChatGPT or something similar, plug in some ideas, and get elements you didn't think about. Functionally, because of the way ChatGPT was trained, you will be getting elements that thousands of other people with all sorts of different backgrounds have thought about. ChatGPT pulls from those and provides what it considers the most relevant answer. I often realize that I hadn't thought of that solution, option, or approach. It's something to include in my model and planning, and an effective way to expand your team and your universe of ideas. AI really is human-assistive. I don't necessarily trust it to do anything on its own, but for helping me create content, consider other options, or surface considerations and problems, absolutely.

Learn more about Altiam Digital Automation Solutions.
