Scientists have developed an algorithm that automates a key step in forensic fingerprint analysis, making the process faster and more reliable. One of the biggest U.S. cases to hinge on fingerprint evidence was the 1911 Chicago murder trial of Thomas Jennings, who was convicted on the basis of fingerprints left at the crime scene; in the courts and in the public imagination, fingerprints were assumed to be a dependable method of identification.
Research has shown, however, that fingerprint examination can produce erroneous results. A 2009 report from the National Academy of Sciences found that examiners' conclusions are not always repeatable: even experienced examiners may contradict their own earlier judgments when they reexamine the same prints at a later date. Such errors can lead to innocent people being falsely accused and to criminals remaining free to commit further crimes. To reduce the chance of human error, many scientists are working to automate parts of the process.
Researchers at the National Institute of Standards and Technology and Michigan State University report that they have developed an algorithm that automates a key step in the fingerprint analysis process.
The researchers built their algorithm using machine learning. Unlike traditional programming, in which exact instructions are written for the computer to follow, machine learning trains the computer to recognize patterns from examples.
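To make the distinction concrete, here is a minimal sketch of the learn-from-examples idea using a toy nearest-neighbor classifier. The feature values and "good"/"bad" quality labels are invented for illustration; this is not the researchers' actual algorithm, only a demonstration of classifying new inputs by comparing them to labeled training examples rather than following hand-written rules.

```python
def nearest_neighbor(train, query):
    """Return the label of the training example closest to the query."""
    def dist(a, b):
        # Squared Euclidean distance between two feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    features, label = min(train, key=lambda ex: dist(ex[0], query))
    return label

# Toy "training" examples: (feature vector, quality label).
# The two hypothetical features might stand for ridge clarity and contrast.
examples = [
    ((0.9, 0.8), "good"),
    ((0.2, 0.3), "bad"),
    ((0.85, 0.7), "good"),
    ((0.1, 0.4), "bad"),
]

# New, unseen inputs are labeled by their closest training example.
print(nearest_neighbor(examples, (0.8, 0.9)))   # good
print(nearest_neighbor(examples, (0.15, 0.2)))  # bad
```

The point of the sketch is that the program never encodes a rule like "clarity above 0.5 means good"; the decision boundary emerges from the examples themselves, which is what lets a trained system generalize to prints it has never seen.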
Standard practice in the forensic community is for human examiners to process fingerprint evidence. Automating this step would reduce backlogs, allow crimes to be solved more quickly, and free examiners to spend their time on the prints that genuinely require expert attention.