How AI can help you make better sense of the world

Screenshot of a chess game app – Chess. Chess programs running on smartphones today can play at the grandmaster level, according to the AI Index 2018 Annual Report.

By Xiaoyi Liu
Medill Reports

Artificial intelligence (AI) is on the cusp of materially changing our own intelligence and decision-making ability. Just as machines replaced human labor during the Industrial Revolution, expect a similar upheaval in the modern workforce. AI will also bring economic opportunity, societal disruption – and lots of mixed feelings.

The definition of AI, a buzzword in computer science and digital marketing, can vary depending on who is answering the question. For Kristian Hammond, a professor of electrical engineering and computer science at Northwestern University’s McCormick School of Engineering, if a machine is doing something that we think is intelligent when a human does it, give the machine credit for AI.

With mind-boggling amounts of data drowning people as they try to make decisions, AI offers a cool head and clear analysis. “I was struck by the bad relationship that people have with data,” Hammond said, and that is a motivation for him. “In general, every single day, the data that we generate – that moves one way or the other through the computer – is roughly equivalent to about 500 books,” Hammond said. “Some of them are really valuable; we can get a lot of insight from them.” Hammond makes tools to craft those insights for easy understanding.


The Mechanism
“I think language is probably one of the best examples,” Hammond said. The ability to understand language and produce language is not something that comes naturally to a computer. “If I can ask a machine questions and it tells me answers in natural language, then I never have to think of it as a machine anymore.”

He offers this example: imagine you have friends who were struggling in school, and someone who knows how they were doing last year asks, “Are they doing any better?” You would answer by thinking about how they are performing now, looking at their grades, and checking whether those grades are better than they were before. “All we need to do is teach machines how to do that across a wide variety of ways,” said Hammond.

The machine doesn’t have a model of the world that it can manipulate the way we do. It has to map a question such as “Is something getting better?” onto the analysis it is required to do. It needs to understand the nature of the question and have directions for figuring out what an answer looks like.
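As a rough illustration of that mapping, consider a minimal sketch in Python. The grade figures and the comparison rule below are illustrative assumptions, not part of any system Hammond has built; the point is only that the question “Are they doing any better?” becomes a concrete comparison across two time periods, phrased back in natural language.

def doing_better(previous_grades, current_grades):
    # Compare average grades from two periods and phrase the answer.
    prev_avg = sum(previous_grades) / len(previous_grades)
    curr_avg = sum(current_grades) / len(current_grades)
    if curr_avg > prev_avg:
        return f"Yes - their average rose from {prev_avg:.1f} to {curr_avg:.1f}."
    return f"No - their average went from {prev_avg:.1f} to {curr_avg:.1f}."

# Hypothetical grades from last year and this year.
print(doing_better([68, 72, 70], [78, 81, 75]))
# Prints: Yes - their average rose from 70.0 to 78.0.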

“Whenever I ask a question, there are going to be a whole bunch of different ways in which someone might be able to answer,” Hammond said. “A machine has to know about those different ways and be able to apply those different technologies, techniques, to different kinds of data.”

Narrative Science
Hammond co-founded the startup Narrative Science in 2010 with Larry Birnbaum, a computer science professor at Northwestern’s McCormick School of Engineering. Their focus? “Given all the data we have, how do you get people to understand what’s happening in the world without them all being data scientists?”

“I’m interested in how people interact with machines, so language is part of that,” said Hammond about the reasons why he founded the company.

Quill, Narrative Science’s flagship product, uses computer algorithms to extract the most important information from large sets of data and then crafts those insights into easy-to-scan, understandable narrative reports. Reams of data become bite-sized. The company more recently introduced Narratives for Business Intelligence, a suite of natural language extensions that mine charts, tables and graphs for insights and then convert that information into narrative language.
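Here is a toy Python sketch of the general idea behind generating narrative from structured data: pick the most notable fact, then render it as a sentence. The sales figures, field names and “notability” rule are assumptions invented for illustration; they are not Quill’s actual data or logic.

# Toy data: a few made-up rows of regional sales figures.
sales = [
    {"region": "Midwest", "revenue": 1.8, "change": 0.22},
    {"region": "South",   "revenue": 2.1, "change": -0.05},
    {"region": "West",    "revenue": 1.2, "change": 0.02},
]

# A crude "most notable fact" rule: the region with the largest swing.
top = max(sales, key=lambda row: abs(row["change"]))
direction = "grew" if top["change"] > 0 else "fell"
print(f"{top['region']} revenue {direction} {abs(top['change']):.0%} "
      f"to ${top['revenue']}M, the biggest move of any region.")
# Prints: Midwest revenue grew 22% to $1.8M, the biggest move of any region.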

“We need something else, and that’s machines,” Hammond said. When we move past the limits of human attention and employ machine intelligence to process data at a superhuman scale, structure and patterns emerge.

AI-Supported Jobs
When it comes to the impact of AI on jobs, there is no shortage of angst. A Pew Research Center survey finds Americans are roughly twice as likely to express worry (72 percent) as enthusiasm (33 percent) about a future in which robots and computers can do many jobs currently done by humans.

According to the “AI Index 2018 Annual Report,” produced by the AI Index, a project hosted at Stanford’s Human-Centered AI Institute, AI has become global. (Xiaoyi Liu/MEDILL)

AI has so far replaced mostly menial labor, but there are concerns in the media industry about recruiting it for more analytical roles in the economy, such as those of writers and reporters.

“There was a lot of uproar about the idea of machines writing news stories,” Hammond said. “But I don’t think that’s incredibly impactful, because they are writing stories in those places where the stories are commodity. They are not like investigative reporting.”

Meanwhile, Hammond believes it would be exciting if journalists could get a helping hand investigating story leads against databases from “automated data scientists” – software that automates some of the data analysis now performed by data scientists – so that more of the complexity behind the numbers can be uncovered and written into reports.

“If you don’t have the data skills, you can’t find correlations or relationships in the data,” he said. “But if you actually can do your appropriate analysis, you can find these things and make use of them.”

News organizations are already turning to machines for help with data analysis. Reuters is building an AI tool, Lynx Insight, to help journalists analyze data and suggest story ideas, while the Press Association is working with Urbs Media on “Reporters and Data and Robots,” a project that uses natural-language generation to produce local news.

“Environmental elements and climate changes are largely tracked in quantifiable, longitudinal data,” according to Dr. Kai-Fu Lee, former head of Google China and current CEO of Sinovation Ventures, a technology venture capital firm that runs an AI institute. He believes that combining the efforts of environmental and AI experts could produce practical AI use cases and solutions for environmental problems such as climate change.

While AI is having a lasting impact on certain industries, Hammond believes that in some fields we have to be very careful with its applications. For instance, relying too heavily on predictions in law, criminal cases and policing can be dangerous.

“There are technologies that will predict the likelihood of someone committing a crime again that are being used to make decisions about parole, which really shouldn’t…they shouldn’t be using right now. The technology is not ready.”

Better Decisions
The history of computer science has been a history of trying to get the machine closer and closer to us, according to Hammond, and he has devoted himself to putting the notion of “building companions that pay attention to us” into practice in the AI systems he builds.

“They will be designed to help us, to partner with us,” he said.

How would this affect us? According to Hammond, we, as individuals, will make better decisions as a result of access to AI systems. “We make mistakes, and having a machine actually understand those mistakes and help us with those areas would be incredibly useful.”

Individuals commit the “sunk cost fallacy,” a type of cognitive bias, when they continue a behavior or endeavor because of previously invested resources such as time, money or effort.

Here is an example. Suppose people must choose between committing six more months to a path they have already spent six months pursuing, or restarting on a promising new path that takes only three months in total to finish. “Most people will continue along the six months,” Hammond said. They feel that starting the new path wastes their previous six-month investment.

“But of course it’s faulty reasoning,” he said. “Having systems that actually remind us of that will help us make better decisions” and sort out the pros and cons.
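As a rough, hypothetical sketch of the reminder such a system might give, the comparison only needs to weigh the work still ahead on each option. The numbers and the Python function below are assumptions made up for this example, not a description of any real product.

def better_option(remaining_current, total_new):
    # Only the future work matters; months already spent are sunk.
    return "current path" if remaining_current <= total_new else "new path"

# The six months already invested are ignored: six months remain on the
# current path, while the new path takes three months in total.
print(better_option(remaining_current=6, total_new=3))
# Prints: new path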

“In the long run, machines are going to be partnering with us all over the place,” Hammond said, and we – human beings and machines – will stay together. “Imagine that you could tell Alexa what you are thinking about, what you are worried about and what you are concerned about, and it could have a conversation with you about that,” Hammond said. “That’s what it means.”

Photo at top: Screenshot of a chess game app – Chess. Chess programs running on smartphones today can play at the grandmaster level, according to the AI Index 2018 Annual Report. (Xiaoyi Liu/MEDILL)