
Approximation Algorithms (Vazirani, 2003)


[Book cover image: approx.jpg]

No approximations, this is THE optimal book!, February 16, 2002
By Mosta McKracken (Cambridge, MA USA)
I have been using Dorit Hochbaum's book on approximation algorithms for NP-hard problems as a guideline for my work. Hochbaum's book is, without a doubt, terrific. However, the survey format compromised a smooth flow in favor of bringing together the best people in the field. This book (Vazirani's) corrects this by being smooth and elegant from start to finish. Excellent problem sets, excellent hints for most problems, and there is a section at the end of the book devoted to open problems, which is a really, really cool feature. My favorite chapter (29, I think) deals with hardness of approximation and the PCP theorem. The chapter explains the PCP theorem so vividly that the very next thing I did was read and comprehend the latest papers in this area. If you're a researcher in algorithms and complexity, then this book is highly recommended, especially at this ridiculously low price.
Note on my background: I am a graduate (master's) student in CS.

p.vii Although this may seem a paradox, all exact science is dominated by the idea of approximation. - Bertrand Russell (1872-1970)
 
p.viii Most of the algorithm design effort actually goes into understanding the algorithmically relevant combinatorial structure of the problem. The algorithm exploits this structure in a minimalistic manner. The exposition of algorithms in this book will also follow this analogy, with emphasis on stating the structure offered by problems, and keeping the algorithms minimalistic.
 
p.1 An optimization problem is polynomial time solvable [JLJ - from Wikipedia: in computational complexity theory, polynomial time refers to the computation time of a problem where the run time, m(n), is no greater than a polynomial function of the problem size, n. For example, the quicksort sorting algorithm on n integers performs at most An² operations for some constant A. Thus it runs in time O(n²) and is a polynomial time algorithm.] only if it has the algorithmically relevant combinatorial structure that can be used as "footholds" to efficiently home in on an optimal solution. The process of designing an exact polynomial time algorithm is a two-pronged attack: unraveling this structure in the problem and finding algorithmic techniques that can exploit this structure.
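[JLJ - to make the quadratic bound above concrete, here is a rough illustrative sketch of my own (not from the book or from Wikipedia) that counts the pivot comparisons a simple first-element-pivot quicksort performs on already-sorted input; the count stays within a quadratic polynomial of the input size, roughly n²/2.]

# Illustrative sketch (not from the book): counting quicksort pivot
# comparisons to show the work is bounded by a polynomial, here ~ n^2/2.

def quicksort(items, counter):
    # Simple quicksort using the first element as the pivot.
    # counter[0] accumulates one pivot comparison per remaining element.
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    counter[0] += len(rest)                     # charge one comparison per element partitioned
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left, counter) + [pivot] + quicksort(right, counter)

for n in (100, 200, 400):
    counter = [0]
    quicksort(list(range(n)), counter)          # sorted input: worst case for this pivot rule
    print(n, counter[0], counter[0] / (n * n))  # ratio settles near 0.5, i.e. about n^2/2 comparisons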
 
p.1 So, at a high level, the process of design of approximation algorithms is not very different from that of design of exact algorithms. It still involves unraveling the relevant structure and finding algorithmic techniques to exploit it. Typically, the structure turns out to be more elaborate, and often the algorithmic techniques result from generalizing and extending some of the powerful algorithmic tools developed in the study of exact algorithms.
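[JLJ - if I remember the book correctly, its opening chapter illustrates exactly this with cardinality vertex cover: the relevant structure is that any vertex cover must contain at least one endpoint of every edge in a maximal matching, so taking both endpoints of each matched edge is at most twice optimal. Below is a minimal sketch of my own (not the book's code) of that standard 2-approximation; the tiny test graph is hypothetical.]

# Illustrative sketch: the classic maximal-matching 2-approximation
# for cardinality vertex cover. Structure exploited: every matched edge
# forces at least one of its endpoints into ANY vertex cover, so the
# cover built below has size at most 2 * OPT.

def vertex_cover_2approx(edges):
    # edges: iterable of (u, v) pairs; returns a vertex cover
    # whose size is at most twice the minimum possible.
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:   # edge not yet covered:
            cover.add(u)                        # greedily "match" it and
            cover.add(v)                        # take both endpoints
    return cover

# Hypothetical example graph: the path 1-2-3-4 plus the edge 2-4.
print(vertex_cover_2approx([(1, 2), (2, 3), (3, 4), (2, 4)]))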
 
p.72 if the algorithm had to resort to exhaustive search, does the problem really offer "footholds" to home in on a solution efficiently? [JLJ - a really good question. Perhaps the experts in computer chess search techniques can answer this question]
 
p.249 First, let us recall the fundamental technique of Lagrangian relaxation from combinatorial optimization. This technique consists of relaxing a constraint by moving it into the objective function, together with an associated Lagrangian multiplier.
 
p.250 the global constraint that at most k facilities be opened has been replaced with a penalty for opening facilities, the penalty being the Lagrangian multiplier.
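[JLJ - to make pp. 249-250 concrete, a sketch in my own notation (not necessarily the book's): with x_ij indicating that city j is connected to facility i, y_i indicating that facility i is opened, and c_ij the connection cost, the cardinality constraint of the k-median problem is moved into the objective with a multiplier lambda >= 0, while the remaining constraints (each city connected to an open facility) are kept as they are:]

\[
\min \sum_{i,j} c_{ij}\,x_{ij} \;\;\text{subject to}\;\; \sum_i y_i \le k
\quad\longrightarrow\quad
L(\lambda) \;=\; \min\Big(\sum_{i,j} c_{ij}\,x_{ij} \;+\; \lambda \sum_i y_i\Big) \;-\; \lambda k .
\]

For any fixed lambda, the minimization inside L(lambda) is an uncapacitated facility location problem in which every facility has opening cost lambda (the "penalty" mentioned on p.250), and L(lambda) lower-bounds the k-median optimum, since any feasible k-median solution opens at most k facilities.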
 
From Wikipedia, the free encyclopedia
A relaxation technique is a method in mathematical optimization for relaxing a strict requirement, by either substituting for it another more easily handled requirement or else dropping it completely. Relaxation techniques are commonly used in place of branch and bound algorithms, or to obtain bounds in those algorithms.
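[JLJ - a standard concrete instance of this idea (my own illustrative sketch, not from the Wikipedia article): relaxing the 0/1 requirement of the knapsack problem to allow fractional items makes the problem solvable greedily, and the relaxed optimum is an upper bound of the kind a branch-and-bound search would use to prune.]

# Illustrative sketch: LP relaxation of 0/1 knapsack as a bound.
# Relaxing "x_i in {0, 1}" to "0 <= x_i <= 1" makes the problem easy:
# fill greedily by value/weight density, splitting at most one item.
# The relaxed optimum is an upper bound on the true 0/1 optimum.

def fractional_knapsack_bound(items, capacity):
    # items: list of (value, weight) pairs; returns the relaxation optimum.
    total, remaining = 0.0, capacity
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if weight <= remaining:
            total += value
            remaining -= weight
        else:
            total += value * remaining / weight   # take a fraction of this item
            break
    return total

# Hypothetical instance: the bound is 240.0, while the best 0/1 solution is 220.
print(fractional_knapsack_bound([(60, 10), (100, 20), (120, 30)], 50))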
