The D-Algorithm is a method for propagating an error (a value change, denoted "D") from a fault site forward to the outputs while justifying the required values backward into the design, in order to judge the quality of a test pattern. It was introduced by J. Paul Roth at IBM in 1966, not in the 1990s as I first recalled. The main motivation is to find those test patterns which cause the highest number of faults (like stuck-at 0/1) to show up at the outputs, so the number of test patterns needed to qualify a chip can be cut down.
With better test patterns, tests run quicker and thus become cheaper. AFAIK a program called DALG was released that implemented this.
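The idea of ranking patterns by how many stuck-at faults they expose can be sketched roughly like this (a minimal, hypothetical illustration of exhaustive fault simulation on a toy netlist, not the D-Algorithm itself; the circuit and all names are invented):

```python
# Hypothetical sketch: rank test patterns by how many stuck-at faults
# they make observable at the output of a toy circuit: out = (a AND b) OR c.
from itertools import product

NETS = ["a", "b", "c", "ab", "out"]  # every net can be stuck-at 0 or 1

def simulate(pattern, fault=None):
    """Evaluate the circuit; 'fault' is (net, stuck_value) or None."""
    v = dict(pattern)  # starts with primary inputs a, b, c

    def val(net, computed):
        # A stuck-at fault overrides whatever value the net would carry.
        if fault and fault[0] == net:
            return fault[1]
        return computed

    v["a"] = val("a", v["a"])
    v["b"] = val("b", v["b"])
    v["c"] = val("c", v["c"])
    v["ab"] = val("ab", v["a"] & v["b"])
    v["out"] = val("out", v["ab"] | v["c"])
    return v["out"]

def detected_faults(pattern):
    """Faults whose effect differs from the good-circuit output, i.e. is observable."""
    good = simulate(pattern)
    return {(net, sv) for net in NETS for sv in (0, 1)
            if simulate(pattern, (net, sv)) != good}

# Rank all 8 input patterns by fault coverage; the best ones would be
# kept for the production test, the redundant ones dropped.
patterns = [dict(zip("abc", bits)) for bits in product((0, 1), repeat=3)]
ranked = sorted(patterns, key=lambda p: len(detected_faults(p)), reverse=True)
best = ranked[0]
```

The D-Algorithm avoids this brute-force enumeration: instead of simulating every pattern against every fault, it constructs a pattern for a given fault directly by propagation and justification.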
The other algorithm I don't know; my ASIC days ended more than 15 years ago.