Online courses directory (36)

28 votes
Khan Academy Free Closed [?] Computer Sciences Advanced Cryptography Applied Math Cryptography Journey into Cryptography

How have humans protected their secret messages through history? What has changed today? What is Cryptography? Probability Space. The Caesar Cipher. Caesar Cipher Exploration. Frequency Fingerprint Exploration. Polyalphabetic Cipher. Polyalphabetic Exploration. The One-Time Pad. Perfect Secrecy Exploration. Frequency Stability. Coin flip sequences. Frequency Stability Exploration. The Enigma Encryption Machine (case study). Perfect Secrecy. Pseudorandom Number Generators. Random Walk Exploration. Ciphers vs. Codes. Shift Cipher. Caesar cipher encryption. Caesar Cipher Decryption. Caesar cipher frequency analysis. Vigenere cipher encryption. XOR Bitwise Operation. XOR & the One-Time Pad. XOR Exploration. Bitwise Operators. What's Next? The Fundamental Theorem of Arithmetic. Public Key Cryptography: what is it? The Discrete Logarithm Problem. Diffie-Hellman Key Exchange. RSA Encryption: step 1. RSA Encryption: step 2. RSA Encryption: step 3. Time Complexity (Exploration). Euler's Totient Function. Euler Totient Exploration. RSA Encryption: step 4. What should we learn next? What is Modular Arithmetic? Modulo Operator. Congruence Modulo. Congruence Relation. Equivalence Relations. The Quotient Remainder Theorem. Modular Addition & Subtraction. Modular Addition. Modular Multiplication. Modular Multiplication. Modular Exponentiation. Fast Modular Exponentiation. Fast Modular Exponentiation. Modular Inverses. Introduction. Primality Test Challenge. Trial Division. Level 1: Primality Test. Running Time. Level 2: measuring running time. Computer Memory (space). Binary Memory Exploration. Algorithmic Efficiency. Level 3: Challenge. Sieve of Eratosthenes. Level 4: Sieve of Eratosthenes. Primality Test with Sieve. Level 5: Trial division using sieve. The Prime Number Theorem. Prime density spiral. Prime Gaps. Time Space Tradeoff. Summary (what's next?). Randomized Algorithms (intro). Conditional Probability (Bayes Theorem) Visualized. Guess the coin. Random Primality Test (warm up). Level 9: Trial Division vs Random Division. Fermat's Little Theorem. Fermat Primality Test. Level 10: Fermat Primality Test. What's Next?
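
The outline above runs from classical ciphers through RSA. As a quick taste of the very first topics, here is a minimal sketch of the Caesar (shift) cipher, assuming the 26-letter English alphabet; it is illustrative only, not the course's own code.

```python
# Illustrative Caesar (shift) cipher: letters shift by a fixed amount,
# non-letter characters pass through unchanged.
def caesar_encrypt(plaintext: str, shift: int) -> str:
    result = []
    for ch in plaintext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            result.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            result.append(ch)
    return ''.join(result)

def caesar_decrypt(ciphertext: str, shift: int) -> str:
    return caesar_encrypt(ciphertext, -shift)

print(caesar_encrypt("ATTACK AT DAWN", 3))   # DWWDFN DW GDZQ
print(caesar_decrypt("DWWDFN DW GDZQ", 3))   # ATTACK AT DAWN
```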

109 votes
Khan Academy Free Closed [?] Computer Sciences Applied Math Primality Testing Qa testing Testing

Why do primes make some problems fundamentally hard? Build algorithms to find out! Primality Test. Running Time. Computer Memory (space). Algorithmic Efficiency. Sieve of Eratosthenes. Primality Test with Sieve. The Prime Number Theorem. Time Space Tradeoff. Conditional Probability Visualized.
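
For a rough feel of the first two topics listed, here is an illustrative sketch (not the course's own code) of trial division and the Sieve of Eratosthenes:

```python
# Trial division: test divisors up to sqrt(n).
def is_prime_trial_division(n):
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

# Sieve of Eratosthenes: mark multiples of each prime as composite.
def sieve_of_eratosthenes(limit):
    is_prime = [True] * (limit + 1)
    is_prime[0:2] = [False, False]
    for p in range(2, int(limit ** 0.5) + 1):
        if is_prime[p]:
            for multiple in range(p * p, limit + 1, p):
                is_prime[multiple] = False
    return [i for i, prime in enumerate(is_prime) if prime]

print(is_prime_trial_division(97))    # True
print(sieve_of_eratosthenes(30))      # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```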

52 votes
Khan Academy Free Closed [?] Computer Sciences Advanced Algorithms Algorithms Applied Math Computer Science Randomized Algorithms Software Engineering

No votes
Udacity Free Closed [?] Georgia Tech Masters in CS

Data science plays an important role in many industries. Faced with massive amounts of heterogeneous data, data scientists need scalable machine learning and data mining algorithms and systems. The growth in data volume, complexity, and speed drives the need for scalable data analytic algorithms and systems. In this course, we study such algorithms and systems in the context of healthcare applications. In healthcare, large amounts of heterogeneous medical data have become available in various healthcare organizations (payers, providers, pharmaceuticals). This data could be an enabling resource for deriving insights for improving care delivery and reducing waste. The enormity and complexity of these datasets present great challenges in analysis and subsequent application to a practical clinical environment.

2 votes
Saylor.org Free Closed [?] Life Sciences Biology

The advent of computers transformed science.  Large, complicated datasets that once took researchers years to manually analyze could suddenly be analyzed within a week using computer software.  Nowadays, scientists can use computers to produce several hypotheses as to how a particular phenomenon works, create computer models using the parameters of each hypothesis, input data, and see which hypothetical model produces an output that most closely mirrors reality. Computational biology refers to the use of computers to automate data analysis or model hypotheses in the field of biology.  With computational biology, researchers apply mathematics to biological phenomena, use computer programming and algorithms to artificially create or model the phenomena, and draw from statistics in order to interpret the findings.  In this course, you will learn the basic principles and procedures of computational biology.  You will also learn various ways in which you can apply computational biology to molecular and cell…

No votes
Udemy $20 Closed [?] Technology

All algorithms and secrets revealed.

No votes
Udemy Free Closed [?] Technology

This course provides a comprehensive overview of the concepts of algorithm analysis and development.

No votes
Udacity Free Closed [?] Georgia Tech Masters in CS

This class is offered as CS6505 at Georgia Tech where it is a part of the [Online Masters Degree (OMS)](http://www.omscs.gatech.edu/). Taking this course here will not earn credit towards the OMS degree. In this course, we will ask the big questions, “What is a computer? What are the limits of computation? Are there problems that no computer will ever solve? Are there problems that can’t be solved quickly? What kinds of problems can we solve efficiently and how do we go about developing these algorithms?” Understanding the power and limitations of algorithms helps us develop the tools to make real-world computers smarter, faster and safer.

No votes
OLI. Carnegie Mellon University Free Computer Sciences Carnegie Mellon University Open Learning Initiative

This course presents material in discrete mathematics and computation theory with a strong emphasis on practical algorithms and experiential learning. Discrete mathematics, also called finite mathematics or decision mathematics, is the study of mathematical structures that are fundamentally discrete in the sense of not supporting or requiring the notion of continuity. Objects studied in finite mathematics are largely countable sets such as integers, finite graphs, and formal languages. Concepts and notations from discrete mathematics are useful for studying or describing objects and problems in computer algorithms and programming languages. The CDM course is still under development, and we are making it available as it is developed. Only one of the planned fifteen modules, the module on Groups, is currently available; it would appear midway through the complete course.

No votes
Udacity Free Closed [?] Georgia Tech Masters in CS Web Development

This class is offered as CS6250 at Georgia Tech where it is a part of the [Online Masters Degree (OMS)](http://www.omscs.gatech.edu/). Taking this course here will not earn credit towards the OMS degree. This course covers advanced topics in Computer Networking such as Software-Defined Networking (SDN), Data Center Networking and Content Distribution. The course is divided into three parts: Part 1 is about the implementation, design principles and goals of a computer network and touches upon the various routing algorithms used in computer networks (such as link-state and distance vector). Part 2 talks about resource control and content distribution in networking applications. It covers Congestion Control and Traffic Shaping. Part 3 deals with the operations and management of computer networks, encompassing SDNs, Traffic Engineering and Network Security.
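
As a hedged illustration of the routing-algorithm topic mentioned in Part 1, a distance-vector (Bellman-Ford style) computation over a made-up four-node network might look like the sketch below; the graph format, link costs, and node names are assumptions for the example, not course material.

```python
# Simplified distance-vector shortest paths: repeatedly relax every link.
def distance_vector(graph, source):
    dist = {node: float('inf') for node in graph}
    dist[source] = 0
    # |V| - 1 rounds of relaxation suffice to converge.
    for _ in range(len(graph) - 1):
        for u in graph:
            for v, cost in graph[u].items():
                if dist[u] + cost < dist[v]:
                    dist[v] = dist[u] + cost
    return dist

toy_network = {
    'A': {'B': 1, 'C': 4},
    'B': {'A': 1, 'C': 2, 'D': 5},
    'C': {'A': 4, 'B': 2, 'D': 1},
    'D': {'B': 5, 'C': 1},
}
print(distance_vector(toy_network, 'A'))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```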

17 votes
Udemy Free Closed [?] Computer Sciences Math and Science

Lecture - 24 Graphs. Lecture Series on Data Structures and Algorithms by Dr. Naveen Garg, Department of Computer Science

12 votes
Udemy Free Closed [?] Computer Sciences Technology

Lecture Series on Design & Analysis of Algorithms by Prof. Abhiram Ranade, Department of Computer Science Engineering

4 votes
Saylor.org Free Closed [?] Mathematics Computer Science Discrete Math

This course has been designed to provide you with a clear, accessible introduction to discrete mathematics. Discrete mathematics describes processes that consist of a sequence of individual steps (as compared to calculus, which describes processes that change in a continuous manner). The principal topics presented in this course are logic and proof, induction and recursion, discrete probability, and finite state machines. As you progress through the units of this course, you will develop the mathematical foundations necessary for more specialized subjects in computer science, including data structures, algorithms, and compiler design. Upon completion of this course, you will have the mathematical know-how required for an in-depth study of the science and technology of the computer age.
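
As one small illustration of the finite state machine topic named above (the states and alphabet below are assumptions chosen for the example, not course material), here is a DFA that accepts binary strings containing an even number of 1s:

```python
# Deterministic finite automaton: transition table keyed by (state, symbol).
transitions = {
    ('even', '0'): 'even', ('even', '1'): 'odd',
    ('odd', '0'): 'odd',   ('odd', '1'): 'even',
}

def accepts(string, start='even', accepting=('even',)):
    state = start
    for symbol in string:
        state = transitions[(state, symbol)]
    return state in accepting

print(accepts('1011'))   # False (three 1s)
print(accepts('1001'))   # True  (two 1s)
```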

4 votes
Saylor.org Free Closed [?] Computer Sciences Computer Science

This course focuses on the fundamentals of computer algorithms, emphasizing methods useful in practice.  We look into algorithm analysis as a way to understand the behavior of computer programs as a function of their input size.  Using the big-O notation, we classify algorithms by their efficiency.  We look into basic algorithm strategies and approaches to problem solving.  Some of these approaches include the divide-and-conquer method, dynamic programming, and greedy paradigms.  Sorting and searching algorithms are discussed in detail as they form part of a solution to a large number of problems solved using computers.  We also provide an introduction to graph theory and graph algorithms as they are also used in many computer-based applications today.  We conclude the course with a look into a special class of problems called the NP-complete problems.
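
As a hedged sketch of the divide-and-conquer strategy mentioned above (illustrative only, not the course's own material), merge sort is the classic example, running in O(n log n) time:

```python
# Divide-and-conquer sorting: split, sort each half recursively, then merge.
def merge_sort(values):
    if len(values) <= 1:               # base case: already sorted
        return values
    mid = len(values) // 2
    left = merge_sort(values[:mid])    # divide
    right = merge_sort(values[mid:])
    merged, i, j = [], 0, 0            # conquer: merge two sorted halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```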

3 votes
Saylor.org Free Closed [?] Computer Sciences Computer Science

This course focuses on the fundamentals of information security that are used in protecting both the information present in computer storage as well as information traveling over computer networks. Interest in information security has been spurred by the pervasive use of computer-based applications such as information systems, databases, and the Internet. Information security has also emerged as a national goal in the United States and in other countries with national defense and homeland security implications. Information security is enabled through securing data, computers, and networks. In this course, we will look into such topics as fundamentals of information security, computer security technology and principles, access control mechanisms, cryptography algorithms, software security, physical security, and security management and risk assessment. By the end of this course, you will be able to describe major information security issues and trends, and advise an individual seeking to protect his or her dat…

No votes
Saylor.org Free Closed [?] Computer Sciences Computer Science

Cryptography is essentially the science of writing in secret code.  In data and telecommunications, cryptography has specific security requirements, such as authentication, privacy or confidentiality, integrity, and non-repudiation.  To meet these security requirements, we employ secret key (or symmetric) cryptography, public-key (or asymmetric) cryptography, and hash functions. In the first part of the course, we will review a number of different ciphers that were used before World War II.  These ciphers would be easily broken nowadays, since cryptography has advanced quickly over the past couple of decades with the advent of modern computers.  We will cover block cipher algorithms and describe the Advanced Encryption Standard for symmetric-key encryption adopted by the U.S. government.  We will also learn about the important MD5 and SHA-1 hash functions as well as the message authentication code. This course will focus on public key cryptography, which is best exemplified by the RSA algorithm (na…
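
For a purely numerical illustration of the public-key idea described above, a toy RSA walk-through with textbook-sized primes might look as follows; real deployments use much larger moduli and padding schemes, so this is only a sketch.

```python
# Toy RSA with tiny primes (insecure, for illustration only).
p, q = 61, 53                 # two small primes
n = p * q                     # modulus: 3233
phi = (p - 1) * (q - 1)       # Euler's totient of n: 3120
e = 17                        # public exponent, coprime with phi
d = pow(e, -1, phi)           # private exponent: modular inverse (Python 3.8+), 2753

message = 65
ciphertext = pow(message, e, n)     # encrypt: m^e mod n
recovered = pow(ciphertext, d, n)   # decrypt: c^d mod n
print(ciphertext, recovered)        # 2790 65
```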

8 votes
Udemy Free Closed [?] Lifestyle

Learn how to control your mind with minimal willpower, using Ben Franklin's 13x4 and some neat algorithms.

No votes
Udacity Free Closed [?] Georgia Tech Masters in CS

The goal of this course is to give you solid foundations for developing, analyzing, and implementing parallel and locality-efficient algorithms. This course focuses on theoretical underpinnings. To give a practical feeling for how algorithms map to and behave on real systems, we will supplement algorithmic theory with hands-on exercises on modern HPC systems, such as Cilk Plus or OpenMP on shared memory nodes, CUDA for graphics co-processors (GPUs), and MPI and PGAS models for distributed memory systems. This course is a graduate-level introduction to scalable parallel algorithms. “Scale” really refers to two things: efficiency as the problem size grows, and efficiency as the system size (measured in numbers of cores or compute nodes) grows. To really scale your algorithm in both of these senses, you need to be smart about reducing asymptotic complexity the way you’ve done for sequential algorithms since CS 101; but you also need to think about reducing communication and data movement. This course is about the basic algorithmic techniques you’ll need to do so. The techniques you’ll encounter cover the main algorithm design and analysis ideas for three major classes of machines: for multicore and many-core shared memory machines, via the work-span model; for distributed memory machines like clusters and supercomputers, via network models; and for sequential or parallel machines with deep memory hierarchies (e.g., caches). You will see these techniques applied to fundamental problems, like sorting, search on trees and graphs, and linear algebra, among others. The practical aspect of this course is implementing the algorithms and techniques you’ll learn to run on real parallel and distributed systems, so you can check whether what appears to work well in theory also translates into practice. (Programming models you’ll use include Cilk Plus, OpenMP, and MPI, and possibly others.)
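
The course itself works in Cilk Plus, OpenMP, MPI, and CUDA; as a rough, language-neutral illustration of the reduction pattern underlying many scalable algorithms, here is a Python sketch that sums an array in parallel chunks (the worker count and data are arbitrary assumptions for the example):

```python
# Parallel reduction sketch: compute partial sums in worker processes,
# then combine the partial results in the parent process.
from concurrent.futures import ProcessPoolExecutor

def chunk_sum(chunk):
    return sum(chunk)

def parallel_sum(values, workers=4):
    size = max(1, len(values) // workers)
    chunks = [values[i:i + size] for i in range(0, len(values), size)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(chunk_sum, chunks)   # partial sums in parallel
    return sum(partials)                         # combine the results

if __name__ == '__main__':
    data = list(range(1_000_000))
    print(parallel_sum(data))   # 499999500000
```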

1 vote
Open.Michigan Initiative, University of Michigan Free Computer Sciences Computing science Cyberinfrastructure Cyberscience Information technology Scientific Computing

In the last half of the 20th century, the role of computation in the sciences grew rapidly, driven by advances in silicon-based processors, fiber-optic networks, a host of numerical algorithms, and sets of standard protocols for processing and exchanging data. Much of this digital technology now permeates everyday life. Building on these and emerging technologies, the 21st century is poised to unleash a new, data-intensive paradigm of scientific discovery that will dramatically enhance the scope and scale of data capture, curation, and analysis. In this new (4th) paradigm, cures for cancer might be found by the collective investigations of agents computing "in the cloud."

100 votes
Udacity Free Closed [?] Computer Sciences Software Engineering

Ever played the Kevin Bacon game? This class will show you how it works by giving you an introduction to the design and analysis of algorithms, enabling you to discover how individuals are connected.
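
The Kevin Bacon game boils down to shortest paths in a co-appearance graph. Below is a minimal breadth-first-search sketch, assuming a toy adjacency-list graph; the names are made up for illustration and are not course material.

```python
# Breadth-first search for degrees of separation in an undirected graph.
from collections import deque

def degrees_of_separation(graph, start, target):
    queue = deque([(start, 0)])
    visited = {start}
    while queue:
        person, distance = queue.popleft()
        if person == target:
            return distance
        for neighbor in graph.get(person, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append((neighbor, distance + 1))
    return None   # not connected

toy_graph = {
    'Alice': ['Bob', 'Carol'],
    'Bob': ['Alice', 'Dave'],
    'Carol': ['Alice', 'Dave'],
    'Dave': ['Bob', 'Carol', 'Erin'],
    'Erin': ['Dave'],
}
print(degrees_of_separation(toy_graph, 'Alice', 'Erin'))   # 3
```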
