
AI 101 — How to Begin Using AI

October 30, 2023
Photo by Lukas on Unsplash
The best explanation is the one that makes you say, 'Ah! So that's how it works!'
Marie Curie

“I’m sorry, Dave. I’m afraid I can’t do that.”

2001: A red glass eye gleams as HAL 9000 politely refuses to open the pod-bay doors. Psychotic space AI tries to kill clean-shaven, mild-mannered astronaut.

“I’ll be back.”

1984: A time-traveling T-800 arrives from the near future to kill anyone named Sarah Connor.

Every time—in books and movies—humans give machines the power to think, bad things happen. It’s not just Kubrick or Schwarzenegger. When the AI is beautiful (Ex Machina), a child (The Creator), or built to defend humankind (Ultron), the formula is the same—AI plus humans equals disaster.

The horror makes a good movie and a great headline.

February 16, 1999: Headlines screamed: “Armageddon! Year 2000 computer bug will turn machine against man!” 319 days later, calm settled in: “World Rejoices; Y2K Bug Is Quiet.” If you lived through the 90s, you remember the existential threat of technology run amok.

It’s always this way.

From smallpox vaccines rumored to turn you into a cow to radio waves accused of mind control and trains so fast they'd make Victorian ladies swoon—the pattern of hype and horror is all too familiar.

Now, AI: “How Could AI Destroy Humanity?” (The New York Times).

Fiction is almost always more frightening than fact.

Here’s another headline.

“The AI ‘gold rush’ is here. What will it bring?” (The Washington Post)

Businesses are paying attention.

AI’s impact will be massive. Tom Wilson, CEO of Allstate, predicts that AI will “rip through this economy like a tsunami.”

Michael Miebach, CEO of Mastercard, compares AI to electricity: “It powers our fraud prevention and personalization services. It helps us protect over 125 billion – yes, billion – transactions on our network every year. We’re looking to do even more with AI.”

PricewaterhouseCoopers estimates that AI could boost global GDP by $15.7 trillion by 2030, with $3.7 trillion of that impact in North America. That works out to roughly $30 million of GDP per minute, driven by productivity gains and new products.

We’ve noticed. 2023 was AI’s coming-out party, dressed by ChatGPT.

Where will AI show up?

Bala Maddali, head of conversational AI at Verizon, breaks it down.

“AI already operates in the background, unseen and untouched—for example, email filtering or fraud detection. Then, there are AI-powered experiences—traffic prediction in Google Maps or Netflix’s recommendation engine. And finally, at its most visible, where AI itself is the experience. Here, humans interact directly with AI. ChatGPT, Siri, Alexa...”

Gains in productivity come from replacing or augmenting human labor, allowing us to work more effectively or on higher-value tasks. Meanwhile, AI operates behind the scenes in products like Google’s autocomplete, Spotify, and QuickBooks. Yet to come: increasingly sophisticated, not-yet-invented products poised to boost consumer and business demand.

$15.7 trillion reasons to pay attention to AI.

So whAt Is it?

Here’s a definition for non-experts by a non-expert.

AI is about crafting machines to think, speak, move, and see—like (or better than) humans.

AI is a big, geeky topic.

As newcomers, we often glimpse one part of the elephant — ChatGPT, Boston Dynamics robots, self-driving cars — and think we see the whole thing.

What I'll tackle here is the aspect I believe we'll quickly get familiar with—the part where AI isn't just a tool but the experience itself: the subset of machine learning and deep learning that gives us something to think with, an intelligence that can augment our intelligence—Generative AI.

If you’re overwhelmed by all the autonomous, neural, learning, AI-related word-smithing, and word-hacking, here’s a quick primer.

#AI: Artificial Intelligence: the ability of machines to mimic human thinking or actions, often through problem-solving or learning.

#Anthropomorphization: The attribution of human qualities to non-human entities, such as machines or AI models. Try not to say please and thank you.

#ML: Machine Learning, a technique where computers learn from data without being explicitly told how to perform a task.

#DeepLearning: A type of machine learning that uses multi-layered neural networks to analyze various factors of data.

#NeuralNetworks: Computing systems inspired by the human brain, made up of interconnected nodes, used to process information.

#GenerativeAI: AI that creates new content, like images or text, by learning from existing data, recognizing patterns, and making predictions.

#LLM: Large Language Model, a type of deep learning model trained on vast amounts of text to understand and generate language.

#ChatGPT: An AI chatbot by OpenAI that can converse and answer questions.

#OpenAI: A company focused on creating and promoting friendly AI for the benefit of all.

#Bard: Google's AI chatbot designed for conversation; still experimental.

#Claude: A chatbot by Anthropic aiming for safe and ethical AI conversations.

#Anthropic: A company focused on creating AI to be helpful, harmless, and honest.

#ImageGenerationModels: AI models that create new images based on learned data.

#DALL-E: An AI by OpenAI that turns text descriptions into images.

#Factuality: The accuracy or truthfulness of the information generated by a generative AI model.

#Midjourney: An AI tool similar to DALL-E focused on generating artwork.

#Transformers: A type of algorithm in a neural network architecture that's particularly good at handling sequences like text.

#SupervisedLearning: Training AI using labeled data, where the answer is provided.

#UnsupervisedLearning: Training AI using data without explicit answers, letting it find patterns on its own.

#ReinforcementLearning: Training AI through trial and error, rewarding it for the right decisions.

#NLP: Natural Language Processing, making computers understand and generate human language.

#PromptEngineering: The technique of crafting specific inputs to get the best outputs from AI models.

#Robotics: The science of designing, building, and using robots for tasks.

#ComputerVision: Giving machines the ability to interpret and act on visual data.

*All definitions provided by ChatGPT-4
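For the curious, the #SupervisedLearning idea — a machine learning from labeled examples — can be sketched in a few lines of Python. This is a toy illustration (a one-nearest-neighbor classifier on made-up fruit data), not how production AI is built:

```python
# Toy supervised learning: classify a fruit by weight and color,
# using labeled examples (the "training data").
training_data = [
    ((140, 0.9), "apple"),   # (weight in grams, redness 0-1), label
    ((150, 0.8), "apple"),
    ((120, 0.2), "lemon"),
    ((110, 0.1), "lemon"),
]

def predict(features):
    """Return the label of the closest labeled example (1-nearest-neighbor)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(training_data, key=lambda example: distance(example[0], features))
    return nearest[1]

print(predict((145, 0.85)))  # -> apple: closest to the labeled apples
print(predict((115, 0.15)))  # -> lemon
```

The point of the sketch: nobody wrote a rule that says "heavy and red means apple." The program inferred it from labeled data — the essence of supervised learning, whether the model is four fruits or a large language model.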

AI changes everything.

Now that we're fluent in AI-speak, let's explore why it's more than a buzzword—it's a game-changer from retail floors to boardrooms. Sundar Pichai, head of Google, calls AI “more profound than fire.”

That’s big.

Bigger than computers. Even bigger than the internet? Let’s try to size this latest wave of technology.

To do that, I spoke to Rohit Chauhan, Executive Vice President of AI at Mastercard. As he explained it:

“In the 1980s, computing became affordable. The cost of calculation effectively went to zero. Business challenges were reframed to become arithmetic problems characterized by spreadsheets and databases.

In the 1990s, the Internet effectively brought the cost of communication down to zero. Business challenges were reframed as communication problems characterized by APIs, email, and media.

Now, AI will bring the cost of prediction down to zero. Business challenges will be reframed as prediction problems. The scope and impact on productivity of AI will be bigger than computing and the Internet combined.”

Come on in, the water’s lovely.

Inevitably, technology comes at us fast and changes things slowly.

Email was the province of government and academic institutions when it was invented in the 1970s. By the time Tom Hanks and Meg Ryan rom-commed their way through You’ve Got Mail in the 1990s, it had reached the mainstream. In the first and second decades of the twenty-first century, email spammed its way to near-ubiquity.

Waves of technology are crashing faster and faster.

Instead of email’s forty years, social media spread quickly, taking less than twenty years to become part of the fabric of our lives. Technology waves build on each other; we went from the first iPhone as a prized possession to the what-number-are-we-on-now iPhone 14 just fifteen years later.

We don’t know—yet—how big the AI wave will be. Pichai could be right.

The idea that AI paves the way for a spectacular future is shared by Emad Isaac, Chief Data Technology Officer at Allstate, who is championing the responsible use of AI within the company. As he puts it:

“It’s a little like learning to drive a car. When you’re young and you first get behind the wheel, it’s terrifying. A massive, powerful machine that can be dangerous if you can’t operate it properly. So you sit with an expert — a driving instructor — and gradually learn how to operate it. There will be a few minor incidents along the way, but over time and with practice, we get to the point where today, we drive and don’t even think about the act of driving anymore.

AI is like that. In time, it becomes routine to use and interact with. However, there’s a danger in the hype cycle: that we don’t understand its promise and its limitations. We overplay the return and don’t understand the investment.

One thing I am certain about: it’s not going to replace people. But it will change the way people work.”

Ready or not, AI is here, and we—humans—need to get to grips with it.

Gavin McMahon is a founder and Chief Content Officer for fassforward consulting group. He leads Learning Design and Product development across fassforward’s range of services. This crosses diverse topics, including Leadership, Culture, Decision-making, Information design, Storytelling, and Customer Experience. He is also a contributor to Forbes Business Council.

Eugene Yoon is a graphic designer and illustrator at fassforward. She is a crafter of Visual Logic. Eugene is multifaceted and works on various types of projects, including but not limited to product design, UX and web design, data visualization, print design, advertising, and presentation design.
