An algorithm is a specific
procedure for solving a well-defined computational problem.
The development and analysis of
algorithms are fundamental to all aspects of computer science: artificial
intelligence, databases, graphics, networking, operating systems, security, and
so on. Algorithm development is more than just programming. It requires an
understanding of the alternatives available for solving a computational
problem, including the hardware, networking, programming language, and
performance constraints that accompany any particular solution.
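
For example, binary search locates a value in a sorted list by repeatedly halving the range under consideration. A minimal sketch in Python (the function name binary_search and the sample data are illustrative, not drawn from the text above):

    def binary_search(items, target):
        """Return the index of target in the sorted list items, or -1 if absent."""
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2      # midpoint of the remaining range
            if items[mid] == target:
                return mid            # found it
            if items[mid] < target:
                lo = mid + 1          # target can only lie in the upper half
            else:
                hi = mid - 1          # target can only lie in the lower half
        return -1                     # target is not present

    binary_search([2, 3, 5, 7, 11], 7)    # returns 3

Here the well-defined computational problem is locating a value in a sorted list, and the procedure is specified precisely enough to be carried out mechanically.
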
The (computational) complexity of
an algorithm is a measure of the amount of computing resources (time and space)
it consumes when it runs. Computer scientists use
mathematical measures of complexity that allow them to predict, before writing
the code, how fast an algorithm will run and how much memory it will require.
Such predictions are important guides for programmers implementing and
selecting algorithms for real-world applications.
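
For instance, a linear scan of n items examines up to n elements in the worst case, while binary search examines only about log2(n); that gap can be estimated before any production code exists. A small sketch in Python illustrating such a prediction (the step-count functions are rough worst-case approximations, not a library API):

    import math

    def linear_search_steps(n):
        # Worst case for a linear scan: every one of the n items is examined.
        return n

    def binary_search_steps(n):
        # Worst case for binary search: the range halves at each step,
        # so about log2(n) comparisons suffice.
        return max(1, math.ceil(math.log2(n)))

    for n in (1_000, 1_000_000, 1_000_000_000):
        print(f"n={n:>13,}: linear ~{linear_search_steps(n):>13,} steps, "
              f"binary ~{binary_search_steps(n)} steps")

Even for a billion items, the prediction says binary search needs only about 30 comparisons, which is the kind of guidance a programmer can use when choosing between the two algorithms.
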