CSI 4106 - Fall 2024
Version: Sep 9, 2024 08:36
Safe Superintelligence Inc. (2024)

We are assembling a lean, cracked team of the world’s best engineers and researchers dedicated to focusing on SSI (safe superintelligence) and nothing else.
Understanding the history of artificial intelligence is crucial, especially now as we find ourselves at the peak of speculative enthusiasm, with widespread claims that the era of general artificial intelligence is imminent.
Aristotle (384-322 BC) laid several foundational concepts for AI, including an informal system of syllogisms that facilitates proper reasoning by mechanically deriving conclusions from given premises.
Utilitarianism is an ethical theory that emphasizes the greatest good for the greatest number.
Complex algorithms have their origins with Euclid around 300 BC, while the term “algorithm” itself is derived from the work of Muhammad ibn Musa al-Khwarizmi in the 9th century.
The Church-Turing thesis posits that any computation that can be performed by a mechanical process can be computed by a Turing machine, essentially equating the concept of algorithmic computation with the capabilities of Turing machines (Church 1936; Alan M. Turing 1936).
The concept of NP-completeness, introduced by Cook and further developed by Karp, establishes a framework for evaluating the tractability of computational problems. (Cook 1971; Karp 1972)
Today, it is universally acknowledged that cognitive functions emerge from electrochemical activity within brain structures, illustrating how assemblies of simple cells can give rise to thought, action, and consciousness.
Large-scale collaborative studies have provided us with extensive data encompassing the anatomy, cell types, connectivity, and gene expression profiles of the brain (Maroso 2023; Conroy 2023).
| | Supercomputer | Personal Computer | Human Brain |
|---|---|---|---|
| Processing units | \(10^6\) CPU+GPU cores | 8 CPU cores | \(10^6\) columns |
| Storage units | \(10^{15}\) transistors | \(10^{15}\) transistors | \(10^{11}\) neurons, \(10^{14}\) synapses |
| Cycle time | \(10^{-9}\) sec | \(10^{-9}\) sec | \(10^{-3}\) sec |
| Operations/sec | \(10^{18}\) | \(10^{10}\) | \(10^{17}\) |
If the organism carries a “small-scale model” of external reality and of its own possible actions within its head, it is able to try out various alternatives, conclude which is the best of them, react to future situations before they arise, utilize the knowledge of past events in dealing with the present and future, and in every way to react in a much fuller, safer, and more competent manner to the emergencies which face it.
Cognitive psychology conceptualizes the brain as an information-processing device.
Knowledge-based agents are conceptualized as receiving inputs (percepts) from their environment, maintaining an internal state, and producing actions (outputs).
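A minimal sketch of this percept–state–action loop (the class and method names are illustrative, not a reference implementation):

```python
class KnowledgeBasedAgent:
    """Toy agent: percepts update an internal state, which drives actions."""

    def __init__(self):
        self.state = []  # internal state, here simply a history of percepts

    def perceive(self, percept):
        self.state.append(percept)  # incorporate the new percept

    def act(self):
        # Trivial policy for illustration: respond to the most recent percept.
        return f"respond to {self.state[-1]}" if self.state else "wait"

agent = KnowledgeBasedAgent()
agent.perceive("obstacle ahead")
print(agent.act())  # -> respond to obstacle ahead
```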
In the same year that the term “artificial intelligence” was introduced, cognitive science emerged as a discipline.
1956 MIT workshop:
Three foundational papers demonstrated how computer models can be applied to the psychology of memory, language, and logical reasoning.
1943–1974
The Turing Test is a measure of a machine’s ability to exhibit intelligent behavior indistinguishable from that of a human.
If a human evaluator cannot reliably distinguish between a machine and a human based solely on their responses to questions, the machine is said to have passed the test.
It’s likely that the Turing Test will become yet another casualty of our shifting conceptions of intelligence. In 1950, Turing intuited that the ability for human-like conversation should be firm evidence of “thinking,” and all that goes with it. That intuition is still strong today. But perhaps what we have learned from ELIZA and Eugene Goostman, and what we may still learn from ChatGPT and its ilk, is that the ability to sound fluent in natural language, like playing chess, is not conclusive proof of general intelligence.
Warren S. McCulloch & Walter Pitts 1943
In 1949, Donald Hebb introduced a straightforward updating rule for adjusting the connection strengths between neurons.
Hebbian learning is a learning mechanism in which the synaptic strength between two neurons is increased if they are activated simultaneously. This principle is often summarized as “cells that fire together, wire together,” and it forms the basis for understanding how neural connections are reinforced through experience.
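Hebb’s rule is often formalized as \(\Delta w_{ij} = \eta\, y_i\, x_j\), where \(x_j\) and \(y_i\) are pre- and post-synaptic activations and \(\eta\) is a small learning rate. A minimal NumPy sketch of one such update (the values are illustrative):

```python
import numpy as np

eta = 0.1                        # learning rate (illustrative value)
x = np.array([1.0, 0.0, 1.0])    # pre-synaptic activations
y = np.array([1.0, 1.0])         # post-synaptic activations
W = np.zeros((y.size, x.size))   # connection strengths w_ij

# Hebbian update: strengthen connections between co-active neurons
# ("cells that fire together, wire together").
W += eta * np.outer(y, x)
print(W)
```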
Dartmouth Summer Research Project on Artificial Intelligence
We propose that a 2-month, 10-man study of artificial intelligence be carried out during the summer of 1956 at Dartmouth College in Hanover, New Hampshire. The study is to proceed on the basis of the conjecture that every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it. An attempt will be made to find how to make machines use language, form abstractions and concepts, solve kinds of problems now reserved for humans, and improve themselves. We think that a significant advance can be made in one or more of these problems if a carefully selected group of scientists work on it together for a summer.
Russell and Norvig (2020)
Newell and Simon presented perhaps the most mature work, a mathematical theorem-proving system called the Logic Theorist (LT). Simon claimed, ‘We have invented a computer program capable of thinking non-numerically, and thereby solved the venerable mind–body problem.’
Arthur Samuel’s work on machine learning using the game of checkers has had a profound impact on the field of artificial intelligence (AI) and computer science at large.
1957 Herbert Simon
It is not my aim to surprise or shock you—but the simplest way I can summarize is to say that there are now in the world machines that think, that learn and that create. Moreover, their ability to do these things is going to increase rapidly until—in a visible future—the range of problems they can handle will be coextensive with the range to which the human mind has been applied.
1958, New York Times, July 8
The Navy revealed the embryo of an electronic computer today that it expects will be able to walk, talk, see, write, reproduce itself, and be conscious of its existence.
1965 Herbert Simon (Mitchell 2019)
(\(\ldots\)) machines will be capable, within 20 years, of doing any work that a man can do.
1966 Marvin Minsky (Mitchell 2019)
(\(\ldots\)) in which they would assign undergraduates to work on “the construction of a significant part of a visual system.” In the words of one AI historian, “Minsky hired a first-year undergraduate and assigned him a problem to solve over the summer: connect a television camera to a computer and get the machine to describe what it sees.”
1967 Marvin Minsky (Strickland 2021)
Within a generation\(\ldots\) the problem of creating ‘artificial intelligence’ will be substantially solved.
1974–1980
In 1957, Newell and Simon, the authors of the Logic Theorist (LT), created the General Problem Solver (GPS), meant to emulate how humans solve problems.
Allen Newell and Simon (1976)
a physical symbol system has the necessary and sufficient means for general intelligent action.
Funding dried up.
Russell and Norvig (2020)
Failure to come to grips with the “combinatorial explosion” was one of the main criticisms of AI contained in the Lighthill report (Lighthill, 1973), which formed the basis for the decision by the British government to end support for AI research in all but two universities.
There were also fundamental limitations on what could be represented: a single-layer perceptron, for instance, can only learn linearly separable data.
1980–1987
Expert systems are programs that emulate the decision-making abilities of a human expert by using a knowledge base and inference rules (typically, if-then rules) to solve complex problems within a specific domain.
Rule 1:
- IF the patient has a fever AND the patient has a sore throat,
- THEN consider the possibility of a streptococcal infection.

Rule 2:
- IF the patient has a rash AND the patient has been in a wooded area recently,
- THEN consider the possibility of Lyme disease.

Rule 3:
- IF the patient is experiencing chest pain AND the patient has a history of heart disease,
- THEN consider the possibility of a myocardial infarction (heart attack).
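A toy forward-chaining sketch of rules like these (the fact strings and rule encodings are illustrative):

```python
# Each rule pairs a set of conditions with a conclusion.
rules = [
    ({"fever", "sore throat"}, "consider streptococcal infection"),
    ({"rash", "recent visit to a wooded area"}, "consider Lyme disease"),
    ({"chest pain", "history of heart disease"}, "consider myocardial infarction"),
]

facts = {"fever", "sore throat"}  # working memory of known facts

# Fire every rule whose conditions are all satisfied by the facts.
for conditions, conclusion in rules:
    if conditions <= facts:  # subset test: all conditions hold
        print(conclusion)
```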
1987–1993
Strickland (2021)
By the 1990s, it was no longer academically fashionable to be working on either symbolic AI or neural networks, because both strategies seemed to have flopped.
1993–2011
2011–
In 2012, AlexNet, a convolutional neural network (CNN) architecture inspired by Yann LeCun’s work, won the ImageNet Large Scale Visual Recognition Challenge.
This marked a pivotal moment in the field: all subsequent leading entries in the competition have been based on deep learning methodologies.
See also:
Proficiency in Python is expected.
For those needing a refresher, the official tutorial on Python.org is a good place to start.
Simultaneously enhance your skills by creating a Jupyter Notebook that incorporates examples and notes from the tutorial.
Other resources include:
A notebook is a shareable document that combines computer code, plain language descriptions, data, rich visualizations like 3D models, charts, graphs and figures, and interactive controls. A notebook, along with an editor (like JupyterLab), provides a fast interactive environment for prototyping and explaining code, exploring and visualizing data, and sharing ideas with others.
Assuming the notebook is in the current directory, execute the following command from the terminal.
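For example, assuming JupyterLab and using one of the course notebooks (the filename is illustrative):

```bash
jupyter lab 04_stock_price.ipynb
```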
Similarly, to create a new notebook from scratch:
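One option, assuming JupyterLab, is to start it without arguments and create the notebook from the Launcher (File > New > Notebook):

```bash
jupyter lab
```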
Ease of Use: The interface is intuitive and conducive to exploratory analysis.
Visualization: The capability to embed rich, interactive visualizations directly within the notebook enhances its utility for data analysis and presentation.
Reproducibility: Jupyter Notebooks have become the de facto standard in many domains for demonstrating code functionality and ensuring reproducibility.
By default, Jupyter Notebooks store the outputs of code cells, including media objects.
Jupyter Notebooks are JSON documents, and images within them are encoded in PNG base64 format.
This encoding can lead to several issues when using version control systems, such as GitHub.
To strip the outputs before committing, run:

jupyter nbconvert --clear-output --inplace 04_stock_price.ipynb
or
jupyter nbconvert 04_stock_price.ipynb --to notebook --ClearOutputPreprocessor.enabled=True --output 04_stock_price_clear
An alternative is to use a diffing tool, such as nbdime, specialized for Jupyter Notebooks.
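A possible workflow, using nbdime’s nbdiff command (the notebook names are illustrative, reusing those from the example above):

```bash
pip install nbdime
nbdiff 04_stock_price.ipynb 04_stock_price_clear.ipynb
```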
These instructions use pip, the recommended installation tool for Python.
The initial step is to verify that you have a functioning Python installation with pip installed.
Installing JupyterLab with pip:
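```bash
pip install jupyterlab
```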
Once installed, run JupyterLab with:
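```bash
jupyter lab
```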
Launching 03_get_youtube_transcript in Colab.

04_stock_price

Launching 04_stock_price in Colab.

05_central_limit

Launching 05_central_limit in Colab.
Important

Do not attempt to install these tools unless you are confident in your technical skills. An incorrect installation could waste significant time or even render your environment unusable. There is nothing wrong with using pip or Google Colab for your coursework. You can develop these installation skills later without impacting your grades.
Package managers, such as conda, facilitate the creation of virtual environments tailored to specific projects.

Anaconda is a comprehensive package management platform for Python and R. It utilizes Conda to manage packages, dependencies, and environments.
Anaconda is advantageous as it comes pre-installed with over 250 popular packages, providing a robust starting point for users.
However, this extensive distribution results in a large file size, which can be a drawback.
Additionally, since Anaconda relies on conda, it also inherits the limitations and issues associated with conda (see subsequent slides).
Miniconda is a minimal version of Anaconda that includes only conda, Python, their dependencies, and a small selection of essential packages.
Conda is an open-source package and environment management system for Python and R. It facilitates the installation and management of software packages and the creation of isolated virtual environments.
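For example, a minimal sketch of this workflow (the environment name and Python version are arbitrary choices):

```bash
conda create --name csi4106 python=3.11   # create an isolated environment
conda activate csi4106                    # switch to it
conda install numpy                       # install packages into it
```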
Dependency conflicts, arising from complex package interdependencies, can force the user to reinstall Anaconda/Conda.

Conda is also plagued by large storage requirements and performance issues during package resolution.
Mamba is a reimplementation of the conda package manager in C++. It is significantly faster than conda and uses the same commands and configuration as conda, making it a viable replacement.

Micromamba is a fully statically-linked, self-contained executable. Its empty base environment ensures that the base is never corrupted, eliminating the need for reinstallation.
Marcel Turcotte
School of Electrical Engineering and Computer Science (EECS)
University of Ottawa