
AI Online Teachers May Not Be a Good Instructional Tool

By Dr. William Oliver Hedgepeth
Faculty Member, Transportation and Logistics Management, American Public University


Some people think that artificial intelligence (AI) and robots will one day take our jobs or, worse, enslave us. This negative feeling is known as “techlash,” a backlash rooted in the belief that AI is not transparent in its decision-making and not accountable when it errs.

IBM believes that for AI “to be useful and enduring, it needs to go into your core processes.” Other experts caution, however, that artificial intelligence is easily confused.

Operational AI systems consume large amounts of energy to build, train and operate. According to New Scientist, “Training artificial intelligence is an energy intensive process. New estimates suggest that the carbon footprint of training a single AI is as much as 284 tonnes of carbon dioxide equivalent – five times the lifetime emissions of an average car.”

AI systems also require deep learning, whereby large amounts of data are fed into the system so that it can operate properly. Many investors in AI see it as a “nearly boundless landscape.” But as Julia Bossmann warns on the World Economic Forum website, AI “is just as much a new frontier for ethics and risk assessment as it is for emerging technology.” Ethics sits at the center of those risks and suspicions.

Bossmann says we “shouldn’t forget that AI systems are created by humans, who can be biased and judgmental.” So, ethics is a big issue when it comes to building AI systems.

Typical AI Systems Might Require Being Fed Billions of Articles

A typical AI system might require billions of articles to be fed into it before it can understand the meaning of the words and sentences it encounters, whether in student essays or in replies to a student’s query.
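To see the idea in miniature, the toy sketch below (a hypothetical example using the scikit-learn library, not part of any actual AI teaching system) trains a tiny bag-of-words classifier on four labeled sentences. Its “understanding” of words comes entirely from those few examples, which is why a production system needs corpora on the scale of billions of articles.

```python
# Toy illustration only: a tiny text classifier trained on four sentences.
# A real AI tutor would need vastly larger corpora (the "billions of
# articles" discussed above). The library choice and examples are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# A handful of example student sentences, labeled clear (1) or unclear (0).
sentences = [
    "The supply chain delay was caused by a port closure.",
    "Reverse logistics reduces waste by reusing returned goods.",
    "stuff happened and then it was bad maybe",
    "things went wrong because of reasons",
]
labels = [1, 1, 0, 0]

# Bag-of-words model: TF-IDF word weights feeding a logistic regression
# that learns which word patterns predict each label.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(sentences, labels)

# The model "knows" words only through patterns in its training data,
# so four sentences produce a very shallow understanding.
print(model.predict(["The port closure delayed the shipment."]))
```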

A survey I conducted of 31 current students found that they did not believe an AI robot system could replace a human as their online teacher; 21 of them thought an AI online system or robot as a teacher was a bad idea.

What Some Online Students Think of AI as Teachers

The students wanted a human to give them feedback on their work. They felt the need to communicate with a real person when they needed help or a nudge of encouragement.

Another quality the students said was missing from an artificial intelligence system was experience. When they know their instructors have 10 to 20 years of experience, they believe the instructors’ advice is valid.

As for a grading system, many students did not believe an artificial intelligence system could accurately grade their work or respond to their errors. They said a human teacher would know how to respond better.


What Biases Might Be Built into Artificial Intelligence?

We should not embrace AI and robotic systems without examining how and who will build these AI online teachers. What biases might be built into these systems? Would these systems be programmed to treat active-duty military students differently than civilian students?

Are artificial intelligence online teachers as bad as some students think? As one surveyed student complained, some professors do not show up for class or answer their emails, so any AI teacher would be an improvement.

Colleges must consider ethics and risk when deciding whether an AI system can replace an online teacher. Or perhaps that decision should mirror the approach taken with autonomous vehicles, where a human stays behind the wheel “just in case.” An AI online teacher might similarly need a human colleague who shows up for only a few hours a day.
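To picture that arrangement, the hypothetical sketch below routes routine, high-confidence questions to the AI and escalates anything sensitive or uncertain to the human instructor on duty. Every name, topic category and threshold here is an invented illustration, not a description of any existing system.

```python
# Hypothetical human-in-the-loop routing for an AI online teacher.
# All thresholds, categories, and function names are illustrative inventions.
from dataclasses import dataclass

@dataclass
class StudentQuery:
    text: str
    topic: str            # e.g., "course content", "grading dispute", "personal"
    ai_confidence: float  # the AI's own confidence in its draft answer, 0.0-1.0

HUMAN_ONLY_TOPICS = {"grading dispute", "personal"}
CONFIDENCE_FLOOR = 0.80  # below this, escalate to the human instructor

def route(query: StudentQuery) -> str:
    """Decide whether the AI answers or the human colleague steps in."""
    if query.topic in HUMAN_ONLY_TOPICS:
        return "human"    # sensitive matters always go to a person
    if query.ai_confidence < CONFIDENCE_FLOOR:
        return "human"    # the AI is unsure, so a person reviews it
    return "ai"           # routine, high-confidence questions

# A routine content question the AI is confident about stays with the AI.
print(route(StudentQuery("What is reverse logistics?", "course content", 0.93)))
# A grading complaint is escalated to the human regardless of confidence.
print(route(StudentQuery("I think my essay grade is wrong.", "grading dispute", 0.95)))
```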

As we explore ways to redress these artificial intelligence negatives, the answers should become the basic building blocks for the algorithms, data and ethics of these technologies. If the negatives cannot be resolved in ways that aid student learning, the “bad” might remain rather than the “good.”

About the Author

Dr. Oliver Hedgepeth is a full-time professor at American Public University (APU). He was program director of three academic programs: Reverse Logistics Management, Transportation and Logistics Management, and Government Contracting. He was Chair of the Logistics Department at the University of Alaska Anchorage. Dr. Hedgepeth was the founding Director of the Army’s Artificial Intelligence Center for Logistics from 1985 to 1990, Fort Lee, Virginia.
