# Online courses directory (90)

The "sense-and-correct" nature of feedback controllers makes them an appealing choice for systems whose actuators, or environments, are highly variable. If the system also requires high performance (e.g. an industrial robot, a car, or an aircraft), the usual approach is to use a state-space feedback controller derived from a physics-based model. When performance is less critical (e.g. for toys and appliances), the traditional choice has been to tune a low-cost proportional-integral-derivative (PID) controller.
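To make the PID alternative concrete, here is a minimal discrete-time sketch of the sense-and-correct loop (illustrative only; the course's own implementations, gains, and plant will differ):

```python
# Minimal discrete-time PID controller sketch (illustrative; the course's
# own implementations and tuned gains will differ).
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        # Sense the error, accumulate it, and estimate its rate of change.
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        # Correct: weighted sum of present, past, and predicted error.
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: drive a crude first-order plant toward a setpoint of 1.0.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
y = 0.0
for _ in range(2000):                 # simulate 20 seconds
    u = pid.update(1.0, y)
    y += (u - y) * 0.01               # toy plant model, not from the course
print(round(y, 3))
```

The integral term is what removes steady-state error here; with only the proportional term, the plant would settle short of the setpoint.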

Over the last few years, much has changed. The dramatic decline in the cost of accurate sensors and fast microcontrollers has made state-space controllers practical even for inexpensive toys. In addition, modeling approaches have become far more reliant on measurement and computation than on physics and analysis. In this course, we examine the theory and application of this arc of alternatives, starting with PID control, then moving to physical modeling and state-space control, and ending with state-space control using measurement-based modeling. In each case, you will design and test controllers with your own copter-levitated arm, to solidify your understanding and to gain insight into the practical issues.

PLEASE NOTE: This is intended to be an advanced course and students should have a background in linear algebra and differential equations, as well as some experience with control systems. IN ADDITION: THIS IS A BETA COURSE, THINGS WILL GO WRONG. We are testing a new type of on-line class, one where students use advanced concepts to design and then examine performance results on their own hardware. There will be difficulties, and we will be updating content and focus in response to student input.

Is my program correct? Will it give the right output for all possible permitted inputs? Computers are now essential in everyday life, and incorrect programs lead to frustration in the best case and disaster in the worst. Thus, knowing how to construct correct programs is a skill that all who program computers must strive to master.

In this computer science course, we will present "goal-oriented programming" the way Edsger Dijkstra, one of the most influential computer scientists, intended. You will learn how to derive programs hand-in-hand with their proofs of correctness: the methodology starts with a formalization of what is to be computed, and then grows the program together with its proof. It demonstrates that, for a broad class of matrix operations, the development, implementation, and establishment of correctness of a program can be made systematic.

Since this technique focuses on program specifications, it often leads to clearer, correct programs in less time. The approach rapidly yields a family of algorithms from which you can then pick the algorithm that has desirable properties, such as attaining better performance on a given architecture.
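The flavor of this approach can be shown with a toy example (a generic illustration, not the course's own notation or matrix-oriented methodology): state the goal first, then maintain a loop invariant that, together with the exit condition, implies the goal.

```python
# Toy illustration of goal-oriented derivation (not the course's notation):
# the specification comes first, and the loop is built around an invariant.
def sum_list(a):
    """Postcondition (the goal): return a[0] + a[1] + ... + a[-1]."""
    s, i = 0, 0
    # Invariant: s == sum of a[0:i]  (trivially true at i == 0)
    while i < len(a):
        assert s == sum(a[:i])       # invariant holds at the top of the loop
        s, i = s + a[i], i + 1       # each step re-establishes the invariant
    # Exit: i == len(a); the invariant then gives s == sum(a) -- the goal.
    return s

print(sum_list([3, 1, 4, 1, 5]))     # → 14
```

The program and its correctness argument were developed together: the invariant dictated the loop body, rather than being reverse-engineered from finished code.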

The audience of this MOOC extends beyond students and scholars interested in the domains of linear algebra algorithms and scientific computing. This course shows how to make the formal derivation of algorithms practical and will leave you pondering how our results might extend to other domains.

As a result of support from MathWorks, learners will be granted access to MATLAB for the duration of the course.

This course covers matrix theory and linear algebra, emphasizing topics useful in other disciplines such as physics, economics and social sciences, natural sciences, and engineering. It parallels the combination of theory and applications in Professor Strang's textbook *Introduction to Linear Algebra*.

#### Course Format

This course has been designed for independent study. It provides everything you will need to understand the concepts covered in the course. The materials include:

- A complete set of **Lecture Videos** by Professor Gilbert Strang.
- **Summary Notes** for all videos, along with suggested readings in Prof. Strang's textbook *Introduction to Linear Algebra*.
- **Problem Solving Videos** on every topic, taught by an experienced MIT Recitation Instructor.
- **Problem Sets** to do on your own, with **Solutions** to check your answers against when you're done.
- A selection of **Java® Demonstrations** to illustrate key concepts.
- A full set of **Exams with Solutions**, including review material to help you prepare.

This course offers a rigorous treatment of linear algebra, including vector spaces, systems of linear equations, bases, linear independence, matrices, determinants, eigenvalues, inner products, quadratic forms, and canonical forms of matrices. Compared with *18.06 Linear Algebra*, more emphasis is placed on theory and proofs.

Matrices, vectors, vector spaces, transformations. Covers all topics in a first year college linear algebra course. This is an advanced course normally taken by science or engineering majors after taking at least two semesters of calculus (although calculus really isn't a prereq) so don't confuse this with regular high school algebra. Introduction to matrices. Matrix multiplication (part 1). Matrix multiplication (part 2). Idea Behind Inverting a 2x2 Matrix. Inverting matrices (part 2). Inverting Matrices (part 3). Matrices to solve a system of equations. Matrices to solve a vector combination problem. Singular Matrices. 3-variable linear equations (part 1). Solving 3 Equations with 3 Unknowns. Introduction to Vectors. Vector Examples. Parametric Representations of Lines. Linear Combinations and Span. Introduction to Linear Independence. More on linear independence. Span and Linear Independence Example. Linear Subspaces. Basis of a Subspace. Vector Dot Product and Vector Length. Proving Vector Dot Product Properties. Proof of the Cauchy-Schwarz Inequality. Vector Triangle Inequality. Defining the angle between vectors. Defining a plane in R3 with a point and normal vector. Cross Product Introduction. Proof: Relationship between cross product and sin of angle. Dot and Cross Product Comparison/Intuition. Matrices: Reduced Row Echelon Form 1. Matrices: Reduced Row Echelon Form 2. Matrices: Reduced Row Echelon Form 3. Matrix Vector Products. Introduction to the Null Space of a Matrix. Null Space 2: Calculating the null space of a matrix. Null Space 3: Relation to Linear Independence. Column Space of a Matrix. Null Space and Column Space Basis. Visualizing a Column Space as a Plane in R3. Proof: Any subspace basis has same number of elements. Dimension of the Null Space or Nullity. Dimension of the Column Space or Rank. Showing relation between basis cols and pivot cols. Showing that the candidate basis does span C(A). A more formal understanding of functions. 
Vector Transformations. Linear Transformations. Matrix Vector Products as Linear Transformations. Linear Transformations as Matrix Vector Products. Image of a subset under a transformation. im(T): Image of a Transformation. Preimage of a set. Preimage and Kernel Example. Sums and Scalar Multiples of Linear Transformations. More on Matrix Addition and Scalar Multiplication. Linear Transformation Examples: Scaling and Reflections. Linear Transformation Examples: Rotations in R2. Rotation in R3 around the X-axis. Unit Vectors. Introduction to Projections. Expressing a Projection on to a line as a Matrix Vector prod. Compositions of Linear Transformations 1. Compositions of Linear Transformations 2. Matrix Product Examples. Matrix Product Associativity. Distributive Property of Matrix Products. Introduction to the inverse of a function. Proof: Invertibility implies a unique solution to f(x)=y. Surjective (onto) and Injective (one-to-one) functions. Relating invertibility to being onto and one-to-one. Determining whether a transformation is onto. Exploring the solution set of Ax=b. Matrix condition for one-to-one trans. Simplifying conditions for invertibility. Showing that Inverses are Linear. Deriving a method for determining inverses. Example of Finding Matrix Inverse. Formula for 2x2 inverse. 3x3 Determinant. nxn Determinant. Determinants along other rows/cols. Rule of Sarrus of Determinants. Determinant when row multiplied by scalar. (correction) scalar multiplication of row. Determinant when row is added. Duplicate Row Determinant. Determinant after row operations. Upper Triangular Determinant. Simpler 4x4 determinant. Determinant and area of a parallelogram. Determinant as Scaling Factor. Transpose of a Matrix. Determinant of Transpose. Transpose of a Matrix Product. Transposes of sums and inverses. Transpose of a Vector. Rowspace and Left Nullspace. Visualizations of Left Nullspace and Rowspace. Orthogonal Complements. Rank(A) = Rank(transpose of A). 
dim(V) + dim(orthogonal complement of V)=n. Representing vectors in Rn using subspace members. Orthogonal Complement of the Orthogonal Complement. Orthogonal Complement of the Nullspace. Unique rowspace solution to Ax=b. Rowspace Solution to Ax=b example. Showing that A-transpose x A is invertible. Projections onto Subspaces. Visualizing a projection onto a plane. A Projection onto a Subspace is a Linear Transforma. Subspace Projection Matrix Example. Another Example of a Projection Matrix. Projection is closest vector in subspace. Least Squares Approximation. Least Squares Examples. Another Least Squares Example. Coordinates with Respect to a Basis. Change of Basis Matrix. Invertible Change of Basis Matrix. Transformation Matrix with Respect to a Basis. Alternate Basis Transformation Matrix Example. Alternate Basis Transformation Matrix Example Part 2. Changing coordinate systems to help find a transformation matrix. Introduction to Orthonormal Bases. Coordinates with respect to orthonormal bases. Projections onto subspaces with orthonormal bases. Finding projection onto subspace with orthonormal basis example. Example using orthogonal change-of-basis matrix to find transformation matrix. Orthogonal matrices preserve angles and lengths. The Gram-Schmidt Process. Gram-Schmidt Process Example. Gram-Schmidt example with 3 basis vectors. Introduction to Eigenvalues and Eigenvectors. Proof of formula for determining Eigenvalues. Example solving for the eigenvalues of a 2x2 matrix. Finding Eigenvectors and Eigenspaces example. Eigenvalues of a 3x3 matrix. Eigenvectors and Eigenspaces for a 3x3 matrix. Showing that an eigenbasis makes for good coordinate systems. Vector Triple Product Expansion (very optional). Normal vector from plane equation. Point distance to plane. Distance Between Planes.

This is a communication-intensive supplement to Linear Algebra (18.06). The main emphasis is on the methods of creating rigorous and elegant proofs and presenting them clearly in writing. The course starts with the standard linear algebra syllabus and eventually develops the techniques to approach a more advanced topic: abstract root systems in a Euclidean space.

Linear Algebra: Foundations to Frontiers (LAFF) is packed full of challenging, rewarding material that is essential for mathematicians, engineers, scientists, and anyone working with large datasets. Students appreciate our unique approach to teaching linear algebra because:

- It’s visual.
- It connects hand calculations, mathematical abstractions, and computer programming.
- It illustrates the development of mathematical theory.
- It’s applicable.

In this course, you will learn all the standard topics that are taught in typical undergraduate linear algebra courses all over the world, but using our unique method, you'll also get more! LAFF was developed following the syllabus of an introductory linear algebra course at The University of Texas at Austin taught by Professor Robert van de Geijn, an expert on high performance linear algebra libraries. Through short videos, exercises, visualizations, and programming assignments, you will study Vector and Matrix Operations, Linear Transformations, Solving Systems of Equations, Vector Spaces, Linear Least-Squares, and Eigenvalues and Eigenvectors. In addition, you will get a glimpse of cutting edge research on the development of linear algebra libraries, which are used throughout computational science.

MATLAB licenses will be made available to the participants free of charge for the duration of the course.

We invite you to LAFF with us!

**This summer version of the course will be released at an accelerated pace. Each of the three releases will consist of four “Weeks” plus an exam. There will be suggested due dates, but only the end of the course is a true deadline.**

**FAQs**

**What is the estimated effort for the course?**

About 8 hrs/week.

**How much does it cost to take the course?**

You can choose! Auditing the course is free. If you want to challenge yourself by earning a Verified Certificate of Achievement, the contributions start at $50.

**Will the text for the videos be available?**

Yes. All of our videos will have transcripts synced to the videos.

**Are notes available for download?**

PDF versions of our notes will be available for free download from the edX platform during the course. Compiled notes are currently available at www.ulaff.net.

**Do I need to watch the videos live?**

No. You can watch the videos at your leisure.

**Can I contact the Instructor or Teaching Assistants?**

Yes, but not directly. The discussion forums are the appropriate venue for questions about the course. The instructors will monitor the discussion forums and try to respond to the most important questions; in many cases, responses from other students and peers will be adequate and faster.

**Is this course related to a campus course of The University of Texas at Austin?**

Yes. This course corresponds to the Division of Statistics and Scientific Computing course titled “SDS329C: Practical Linear Algebra”, one option for satisfying the linear algebra requirement for the undergraduate degree in computer science.

**Is there a certificate available for completion of this course?**

Online learners who successfully complete LAFF can obtain an edX certificate. This certificate indicates that you have successfully completed the course, but does not include a grade.

**Must I work every problem correctly to receive the certificate?**

No, you are neither required nor expected to complete every problem.

**What textbook do I need for the course?**

There is no textbook. PDF versions of our notes will be available for free download from the edX platform during the course. Compiled notes are currently available at www.ulaff.net.

**What are the principles by which assignment due dates are established?**

There is a window of nineteen days between the material release and the due date for the homework of that week. While we encourage you to complete a week's work before the launch of the next week, we realize that life sometimes gets in the way, so we have established a flexible cushion. Please don't procrastinate. The course closes 25 May 2015; this gives you nineteen days from the release of the final exam to complete the course.

**Are there any special system requirements?**

You may need at least 768 MB of RAM and 2-4 GB of free hard drive space. You should be able to access the course successfully using Chrome and Firefox.

This mini-course is intended for students who would like a refresher on the basics of linear algebra. The course attempts to provide the motivation for "why" linear algebra is important in addition to "what" linear algebra is. Students will learn concepts in linear algebra by applying them in computer programs. At the end of the course, you will have coded your own personal library of linear algebra functions that you can use to solve real-world problems.
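Two such functions might look like the following (hypothetical names for illustration; the course defines its own library and exercises):

```python
# Hypothetical flavor of a hand-rolled linear algebra library (the course
# defines its own functions; the names here are illustrative).
def dot(u, v):
    """Dot product of two equal-length vectors."""
    assert len(u) == len(v)
    return sum(x * y for x, y in zip(u, v))

def mat_vec(A, x):
    """Matrix-vector product: one dot product per row of A."""
    return [dot(row, x) for row in A]

A = [[1, 2], [3, 4]]
print(mat_vec(A, [1, 1]))   # → [3, 7]
```

Building each operation by hand like this, rather than calling a library, is what ties the "what" of linear algebra to the "why" the course description promises.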

We explore creating and moving between various coordinate systems. Orthogonal Complements. dim(V) + dim(orthogonal complement of V)=n. Representing vectors in Rn using subspace members. Orthogonal Complement of the Orthogonal Complement. Orthogonal Complement of the Nullspace. Unique rowspace solution to Ax=b. Rowspace Solution to Ax=b example. Showing that A-transpose x A is invertible. Projections onto Subspaces. Visualizing a projection onto a plane. A Projection onto a Subspace is a Linear Transforma. Subspace Projection Matrix Example. Another Example of a Projection Matrix. Projection is closest vector in subspace. Least Squares Approximation. Least Squares Examples. Another Least Squares Example. Coordinates with Respect to a Basis. Change of Basis Matrix. Invertible Change of Basis Matrix. Transformation Matrix with Respect to a Basis. Alternate Basis Transformation Matrix Example. Alternate Basis Transformation Matrix Example Part 2. Changing coordinate systems to help find a transformation matrix. Introduction to Orthonormal Bases. Coordinates with respect to orthonormal bases. Projections onto subspaces with orthonormal bases. Finding projection onto subspace with orthonormal basis example. Example using orthogonal change-of-basis matrix to find transformation matrix. Orthogonal matrices preserve angles and lengths. The Gram-Schmidt Process. Gram-Schmidt Process Example. Gram-Schmidt example with 3 basis vectors. Introduction to Eigenvalues and Eigenvectors. Proof of formula for determining Eigenvalues. Example solving for the eigenvalues of a 2x2 matrix. Finding Eigenvectors and Eigenspaces example. Eigenvalues of a 3x3 matrix. Eigenvectors and Eigenspaces for a 3x3 matrix. Showing that an eigenbasis makes for good coordinate systems.

Understanding how we can map one set of vectors to another set. Matrices used to define linear transformations. A more formal understanding of functions. Vector Transformations. Linear Transformations. Matrix Vector Products as Linear Transformations. Linear Transformations as Matrix Vector Products. Image of a subset under a transformation. im(T): Image of a Transformation. Preimage of a set. Preimage and Kernel Example. Sums and Scalar Multiples of Linear Transformations. More on Matrix Addition and Scalar Multiplication. Linear Transformation Examples: Scaling and Reflections. Linear Transformation Examples: Rotations in R2. Rotation in R3 around the X-axis. Unit Vectors. Introduction to Projections. Expressing a Projection on to a line as a Matrix Vector prod. Compositions of Linear Transformations 1. Compositions of Linear Transformations 2. Matrix Product Examples. Matrix Product Associativity. Distributive Property of Matrix Products. Introduction to the inverse of a function. Proof: Invertibility implies a unique solution to f(x)=y. Surjective (onto) and Injective (one-to-one) functions. Relating invertibility to being onto and one-to-one. Determining whether a transformation is onto. Exploring the solution set of Ax=b. Matrix condition for one-to-one trans. Simplifying conditions for invertibility. Showing that Inverses are Linear. Deriving a method for determining inverses. Example of Finding Matrix Inverse. Formula for 2x2 inverse. 3x3 Determinant. nxn Determinant. Determinants along other rows/cols. Rule of Sarrus of Determinants. Determinant when row multiplied by scalar. (correction) scalar multiplication of row. Determinant when row is added. Duplicate Row Determinant. Determinant after row operations. Upper Triangular Determinant. Simpler 4x4 determinant. Determinant and area of a parallelogram. Determinant as Scaling Factor. Transpose of a Matrix. Determinant of Transpose. Transpose of a Matrix Product. Transposes of sums and inverses. Transpose of a Vector. Rowspace and Left Nullspace. Visualizations of Left Nullspace and Rowspace. Rank(A) = Rank(transpose of A). Showing that A-transpose x A is invertible.

Let's get our feet wet by thinking in terms of vectors and spaces. Introduction to Vectors. Vector Examples. Scaling vectors. Adding vectors. Parametric Representations of Lines. Linear Combinations and Span. Introduction to Linear Independence. More on linear independence. Span and Linear Independence Example. Linear Subspaces. Basis of a Subspace. Vector Dot Product and Vector Length. Proving Vector Dot Product Properties. Proof of the Cauchy-Schwarz Inequality. Vector Triangle Inequality. Defining the angle between vectors. Defining a plane in R3 with a point and normal vector. Cross Product Introduction. Proof: Relationship between cross product and sin of angle. Dot and Cross Product Comparison/Intuition. Vector Triple Product Expansion (very optional). Normal vector from plane equation. Point distance to plane. Distance Between Planes. Matrices: Reduced Row Echelon Form 1. Matrices: Reduced Row Echelon Form 2. Matrices: Reduced Row Echelon Form 3. Matrix Vector Products. Introduction to the Null Space of a Matrix. Null Space 2: Calculating the null space of a matrix. Null Space 3: Relation to Linear Independence. Column Space of a Matrix. Null Space and Column Space Basis. Visualizing a Column Space as a Plane in R3. Proof: Any subspace basis has same number of elements. Dimension of the Null Space or Nullity. Dimension of the Column Space or Rank. Showing relation between basis cols and pivot cols. Showing that the candidate basis does span C(A).

This course provides students with the basic analytical and computational tools of linear partial differential equations (PDEs) for practical applications in science and engineering, including the heat/diffusion, wave, and Poisson equations. The analytical portion emphasizes the viewpoint of linear algebra and the analogy with finite matrix problems. The numerical portion focuses on finite-difference and finite-element techniques that reduce PDEs to matrix problems. The Julia language (a free, open-source environment) is introduced and used in homework for simple examples.
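The finite-difference idea can be sketched in a few lines; this is written in Python for illustration (the course itself uses Julia, and its schemes and notation will differ). Discretizing the 1-D heat equation u_t = u_xx turns each time step into multiplication by a tridiagonal matrix, here applied stencil-by-stencil:

```python
# Explicit finite-difference sketch for the 1-D heat equation u_t = u_xx
# with zero Dirichlet boundaries (illustrative; the course uses Julia and
# covers stability and finite elements in far more depth).
n = 50                      # interior grid points on (0, 1)
h = 1.0 / (n + 1)           # grid spacing
dt = 0.4 * h * h            # explicit Euler is stable for dt <= h^2 / 2
u = [0.0] * n
u[n // 2] = 1.0             # initial spike in the middle

for _ in range(1000):
    u_new = u[:]
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0        # Dirichlet boundary u = 0
        right = u[i + 1] if i < n - 1 else 0.0
        u_new[i] = u[i] + dt * (left - 2 * u[i] + right) / (h * h)
    u = u_new

print(max(u))               # the spike has diffused and decayed
```

The inner update is exactly a matrix-vector product with the tridiagonal second-difference matrix, which is the "reduce PDEs to matrix problems" viewpoint the description mentions.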

In this course, you will study basic algebraic operations and concepts, as well as the structure and use of algebra. This includes solving algebraic equations, factoring algebraic expressions, working with rational expressions, and graphing linear equations. You will apply these skills to solve real-world problems (word problems). Each unit will have its own application problems, depending on the concepts you have been exposed to. This course is also intended to provide you with a strong foundation for intermediate algebra and beyond. It will begin with a review of some math concepts covered in pre-algebra, such as the order of operations and simplifying simple algebraic expressions, to get your feet wet. You will then build on these concepts by learning more about functions, graphing of functions, evaluation of functions, and factorization. You will spend time on the rules of exponents and their applications in the distribution of multiplication over addition/subtraction. This course provides students the opportuni…

Precalculus I is designed to prepare you for Precalculus II, Calculus, Physics, and higher math and science courses. The main focus is on five types of functions: linear, polynomial, rational, exponential, and logarithmic. Alongside these functions, you will learn how to solve equations and inequalities, graph, find domains and ranges, combine functions, and solve a multitude of real-world applications. In this course, you will not only learn new algebraic techniques that are necessary for other math and science courses, but you will also learn to become a critical thinker. You will be able to determine the best approach to take (numerical, graphical, or algebraic) to solve a problem given particular information. Then you will investigate and solve the problem, interpret the answer, and determine whether it is reasonable. A few examples of applications in this course are determining compound interest, growth of bacteria, decay of a radioactive substance, and the…

This course is a continuation of MA001: Beginning Algebra [1]. Algebra allows us to formulate real-world problems in abstract mathematical terms or equations. These equations can then be solved using techniques you will learn in this course. For example, if I can ride my bicycle at 5 miles per hour and I live 12 miles from work, how long will it take me to get to work? Or, suppose I am a pitcher for the St. Louis Cardinals and my fastball is 95 miles per hour; how much time does the hitter have to react to the baseball? And, can you explain why an object thrown up into the air will come back down? If so, can you tell how long it will take for the object to hit the ground? These are all examples of problems that can be stated as an algebraic equation and then solved. In this course you will study compound inequalities and solve systems of linear equations. You will then study radicals and rational exponents, followed by quadratic equations and techniques used to solve these equations. Finally, you will…
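The two travel-time questions above reduce to distance = rate × time arithmetic; a quick check (the pitching distance of 60.5 feet is a standard ballpark figure, not taken from the course):

```python
# Working the two travel-time word problems via distance = rate * time.
commute_hours = 12 / 5                  # 12 miles at 5 mph
print(commute_hours)                    # → 2.4 hours

mound_to_plate_ft = 60.5                # standard pitching distance (assumed)
mph_to_ft_per_s = 5280 / 3600           # unit conversion factor
reaction_s = mound_to_plate_ft / (95 * mph_to_ft_per_s)
print(round(reaction_s, 3))             # roughly 0.434 seconds
```

The point of the course is that the same formulation step (turn the story into an equation, then solve) carries over to the harder inequality, radical, and quadratic problems listed below it.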

The main purpose of this course is to bridge the gap between introductory mathematics courses in algebra, linear algebra, and calculus on the one hand and, on the other, advanced courses like mathematical analysis and abstract algebra, which typically require students to provide proofs of propositions and theorems. Another purpose is to pose interesting problems that require you to learn how to manipulate the fundamental objects of mathematics: sets, functions, sequences, and relations. The topics discussed in this course are the following: mathematical puzzles, propositional logic, predicate logic, elementary set theory, elementary number theory, and principles of counting. The most important aspect of this course is that you will learn what it means to prove a mathematical proposition. We accomplish this by putting you in an environment with mathematical objects whose structure is rich enough to have interesting propositions. The environments we use are propositions and predicates, finite sets and…

This course is an introduction to linear algebra. It has been argued that linear algebra constitutes half of all mathematics. Whether or not everyone would agree with that, it is certainly true that practically every modern technology relies on linear algebra to simplify the computations required for Internet searches, 3-D animation, coordination of safety systems, financial trading, air traffic control, and everything in between. Linear algebra can be viewed either as the study of linear equations or as the study of vectors. It is tied to analytic geometry; practically speaking, this means that almost every fact you will learn in this course has a picture associated with it. Learning to connect the facts with their geometric interpretation will be very useful for you. The book which is used in the course focuses both on the theoretical aspects as well as the applied aspects of linear algebra. As a result, you will be able to learn the geometric interpretations of many of the algebraic concepts…