Introduction to Artificial Intelligence

In this introductory artificial intelligence (AI) course, you will gain foundational knowledge of the science behind creating computer systems that can perform tasks typically requiring human intelligence.

6 Weeks Access / 24 Course Hrs
  • Details
  • Syllabus
  • Requirements
  • Instructor



For decades, artificial intelligence (AI) has been a staple of science fiction stories, but thanks to modern advances in computational capacity and storage capabilities, it's becoming a reality. Today, there are examples of artificial intelligence all around us. The purpose of this course is to provide you with a practical knowledge foundation in artificial intelligence.

This course will introduce you to various forms of artificial intelligence (AI) and how we interact with AI as consumers in applications like chatbots and recommendation engines. You'll see how AI provides analytics in business and consider industries that may be transformed or even disrupted by AI implementations. You'll go under the hood to see how computers can "learn" using artificial neural networks and various forms of machine learning. You will review AI applications such as natural language processing, forecasting, and robotics. You'll also learn about the AI development process and how AI will affect the workforce. Finally, you'll consider some of the ethical factors in AI deployment.


In this lesson, you will gain a clear understanding of what artificial intelligence is and its three forms: artificial narrow intelligence, artificial general intelligence, and artificial superintelligence. You'll also see how AI is already part of our everyday lives, sometimes in ways you may not even realize, and learn to differentiate real-world AI from its science-fiction counterpart.

Next, you will take a closer look at how we interact with AI as consumers in both pre- and post-purchase applications such as chatbots, recommendation engines, virtual reality, and shopping assistants. You'll see how other AI applications can gather business-related data and use it to inform decisions within an organization, yielding business forecasting and analytics. Finally, you'll consider industries that may be transformed or even disrupted by AI implementations, such as healthcare and the financial and transportation sectors.

This lesson delves into machine learning: how computers can "learn" by mapping input to output using complex mathematical and statistical models. Suitable algorithms plus useful training data can enable computers to improve their performance over time, effectively learning as humans do. With just a little math, you'll find out about supervised learning, including regression and classification, as well as unsupervised learning and reinforcement learning as they apply to computers. You will understand the importance of high-quality data in getting good results and how programmers avoid algorithmic bias.
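As a concrete taste of supervised learning, here is a minimal Python sketch of simple linear regression: fitting a straight line to labeled examples so the model can predict outputs for new inputs. The training data is made up purely for illustration.

```python
# Supervised learning sketch: least-squares fit of y = slope * x + intercept.
# The model "learns" the line that best maps inputs (xs) to outputs (ys).

def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical training data: inputs paired with known outputs.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]

slope, intercept = fit_line(xs, ys)
print(round(slope, 2), round(intercept, 2))
```

Once fitted, the model predicts an output for any new input as `slope * x + intercept`; classification works similarly, but predicts a category label instead of a number.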

In this deeper dive into how AI works, you'll learn about artificial neural networks: computational models that loosely replicate the structure of the biological brain. You'll see how an artificial neuron mimics a biological one and understand the specific training processes, with a little more math. Then, you'll examine deep learning, a specialized subset of machine learning, including convolutional neural networks, recurrent neural networks, and long short-term memory.
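To make the idea concrete, here is a minimal sketch of a single artificial neuron in Python: it multiplies each input by a weight, adds a bias, and passes the sum through an activation function. The inputs, weights, and bias below are hypothetical values chosen only for illustration.

```python
import math

def sigmoid(z):
    """Activation function: squash any real number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs plus bias, then activation."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# Hypothetical inputs and learned parameters.
output = neuron(inputs=[0.5, 0.8], weights=[0.4, -0.6], bias=0.1)
print(round(output, 3))
```

A neural network is, at heart, many such neurons wired in layers; training adjusts the weights and biases so the network's outputs match the training data.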

Computer vision is a subset of artificial intelligence focusing on how computers can extract useful information from digital images or videos—easy for us, hard for them. You'll learn about how computers store and interpret images, along with some of the most advanced AI applications involving facial and object detection and recognition, autonomous vehicles, and triage and early diagnosis in healthcare.
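As a small illustration of how computers store and interpret images, the following Python sketch represents a tiny 3x3 image as a grid of RGB number triples and converts it to grayscale using widely used luminance weights; the pixel values are made up for illustration.

```python
# To a computer, an image is just a grid of numbers. Each pixel here is an
# (R, G, B) triple in the range 0-255.
rgb_image = [
    [(255, 0, 0), (0, 255, 0), (0, 0, 255)],        # red, green, blue
    [(255, 255, 255), (128, 128, 128), (0, 0, 0)],  # white, gray, black
    [(255, 255, 0), (0, 255, 255), (255, 0, 255)],  # yellow, cyan, magenta
]

def to_grayscale(image):
    """Collapse each RGB pixel to one brightness value using luminance weights."""
    return [[round(0.299 * r + 0.587 * g + 0.114 * b) for r, g, b in row]
            for row in image]

gray = to_grayscale(rgb_image)
for row in gray:
    print(row)
```

Vision systems build on exactly this kind of numeric grid, learning to detect edges, shapes, and eventually faces or objects from patterns in the numbers.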

You've probably seen natural language processing in action on your phone or digital home assistants such as Alexa, Google Assistant, or Siri. In this lesson, you'll consider the intricate steps the computer must execute to understand and then carry out your commands, converting words into machine-usable numbers using natural language processing techniques and back into words using natural language generation. You'll get a look at exactly how processes such as one-hot encoding, bag-of-words, term frequency, inverse document frequency, and word embedding work, as well as some applications of NLP in businesses today, including sentiment analysis and AI-powered surveys.
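To illustrate one of these techniques, here is a minimal Python sketch of the bag-of-words representation, which turns each document into a vector of word counts over a shared vocabulary. The two sample sentences are invented for illustration.

```python
def bag_of_words(documents):
    """Build a shared vocabulary and one word-count vector per document."""
    vocabulary = sorted({word for doc in documents for word in doc.lower().split()})
    vectors = [[doc.lower().split().count(word) for word in vocabulary]
               for doc in documents]
    return vocabulary, vectors

docs = ["the cat sat on the mat", "the dog sat"]
vocab, vectors = bag_of_words(docs)
print(vocab)    # alphabetical vocabulary shared by all documents
print(vectors)  # one count vector per document
```

Techniques such as term frequency and inverse document frequency refine these raw counts, down-weighting words (like "the") that appear in nearly every document.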

One beneficial application of AI is forecasting. In this lesson, you will learn about time series analysis, which attempts to find patterns in data. The pattern components are trend, seasonality, cyclic patterns, and randomness (noise). Time series forecasting can involve univariate analysis (a single variable changing over time) or multivariate analysis (multiple variables). Many industries use time series analysis and forecasting, including healthcare, sales, and weather prediction.
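As a tiny illustration of univariate forecasting, the following Python sketch predicts the next point in a series using a simple moving average, which smooths out the noise component; the sales figures are hypothetical.

```python
def moving_average_forecast(series, window):
    """Forecast the next value as the mean of the last `window` observations."""
    recent = series[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly sales with a visible upward trend.
sales = [10, 12, 13, 15, 16, 18]
forecast = moving_average_forecast(sales, window=3)  # mean of 15, 16, 18
print(round(forecast, 2))
```

Real forecasting models go much further, explicitly separating trend and seasonal components, but the moving average shows the basic idea of predicting the future from recent history.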

Robots are a well-known AI application. Unlike humanoid robots from science fiction movies, many real-life robots around us today in factories, warehouses, agriculture, and even in homes don't look much like people at all. You will learn about the kinds of tasks robots excel at, repetitive tasks with limited variability in a well-controlled environment. You will also learn about the challenges robotic projects face, such as high variability in the environment and high failure costs. Finally, you will see how robots are used today in two industries: logistics and agriculture.

In this lesson, you will look at the AI development process and a typical AI project workflow, along with the programming languages commonly used for AI development. You will also learn about machine learning framework software and software suites that can help with AI development. In addition, you will discover pre-made AI services that you can buy ready to use (or nearly so) from vendors such as Amazon, IBM, Google, and Microsoft.

AI has already started to affect employment, and its influence will continue to rise in the future. You will learn what job roles are involved in designing, developing, and deploying AI systems, including various types of engineers and data scientists, business analysts, and computing professionals. Thinking about what AI is good at (and not so good at), we will look at how jobs are being transformed and disrupted by AI and consider how several industries, and a variety of specific careers, might be affected.

Given the power of artificial intelligence, it's unsurprising that ethics is a major concern. You will learn how bias is an issue both for human decision-making and for decisions made by AI systems, some introduced by programmers and some arising from training datasets. You will see how system engineers can develop AI systems in ways that make them more trustworthy by building in explainability and interpretability. You will also examine some of the ethical concerns with AI systems such as facial recognition, including loss of personal privacy and the potential for misuse.

With your increased understanding of artificial intelligence and its capabilities, you will consider what the future will bring with AI. You will learn about areas that AI researchers are working on now, such as natural language processing and interacting with objects, and about neuromorphic computing and its relationship to neural network research. You will preview some up-and-coming technologies in storage and processing that will enable the next generation of AI applications and discover how AI is changing the workplace. Finally, you will look at some possible views of a future that features AI prominently and how AI developers seek to make AI systems safer by carefully creating incentive systems.



Basic computer skills and high school-level mathematics are required.


Hardware Requirements:

  • This course can be taken on either a PC or Mac.

Software Requirements:

  • PC: Windows 8 or later.
  • Mac: macOS 10.6 or later.
  • Browser: The latest version of Google Chrome or Mozilla Firefox is preferred. Microsoft Edge and Safari are also compatible.
  • Adobe Acrobat Reader.
  • Software must be installed and fully operational before the course begins.


  • Email capabilities and access to a personal email account.

Instructional Material Requirements:

The instructional materials required for this course are included in enrollment and will be available online.


David Iseminger

David Iseminger is an author and technology veteran with expertise in computing, networking, wireless and cloud technologies, data and analytics, artificial intelligence, and blockchain. While with Microsoft, David worked on early versions of Windows and its core networking infrastructure, transmission protocols, security, data visualizations, and multiple emerging cloud technologies. David is passionate about education, serving as a School Board director for over ten years, advocating at state and federal levels for increased learning standards, and teaching over 40,000 students through multiple technology courses. He holds a patent in Artificial Intelligence (AI) object detection and social posting methodologies. He is the founder and CEO of the blockchain company that created IronWeave, the unlimited-scale blockchain platform, based on his patent-pending blockchain innovations and inventions.

Self-Guided Course Code: T14257
Instructor-Moderated Course Code: iai