"The publishers Springer and IEEE are removing more than 120 papers from their subscription services after a French researcher discovered that the works were computer-generated nonsense.
Over the past two years, computer scientist Cyril Labbe of Joseph Fourier University in Grenoble, France, has catalogued computer-generated papers that made it into more than 30 published conference proceedings between 2008 and 2013. Sixteen appeared in publications by Springer, which is headquartered in Heidelberg, Germany, and more than 100 were published by the Institute of Electrical and Electronics Engineers (IEEE), based in New York. Both publishers, which were privately informed by Labbe, say that they are now removing the papers."
Slightly more to the point (I sat in on a presentation by several Science Fiction & Fantasy Writers of America [sfwa.org] members on this topic): PublishAmerica was recruiting authors with promises of advertising, editing, and other support that it completely failed to provide once it had their money. When authors complained, the company responded in derogatory fashion, at which point the idea for the sting book was born. The result exposed the company's highly touted "publication standards" in pretty epic fashion.
A solution would be for each peer reviewer to sign their name to every publication they reviewed. If we see a reviewer signing off on far more articles than they could possibly read and reasonably understand, analyze, scrutinize, criticize, and investigate, then we know something is suspicious. Reviewers should only be permitted to sign off on a limited, reasonable number of publications per year, as opposed to rubber-stamping 20 a week (made-up number) that they couldn't possibly have gone through (let alone thoroughly) once you account for everything else that takes up their time: eating, sleeping, whether they have another job and how much time it consumes, how much time they allot to reviewing, and whether that reviewing is done on site, where their hours can be audited, or at home.
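The cap-and-flag idea above is easy to check mechanically once sign-offs are public. A minimal sketch, assuming a simple list of (reviewer, publication) records for one year; the names, the record format, and the 50-per-year cap are all hypothetical:

```python
from collections import Counter

# Hypothetical one-year sign-off records: (reviewer, publication id).
signoffs = [("alice", "paper-%d" % i) for i in range(60)]  # 60 sign-offs
signoffs += [("bob", "paper-x"), ("bob", "paper-y")]       # 2 sign-offs

# Assumed cap: roughly how many thorough reviews one person can do in a year.
ANNUAL_CAP = 50

def flag_overcommitted(records, cap):
    """Return reviewers whose yearly sign-off count exceeds the cap."""
    counts = Counter(reviewer for reviewer, _ in records)
    return {name: n for name, n in counts.items() if n > cap}

print(flag_overcommitted(signoffs, ANNUAL_CAP))  # → {'alice': 60}
```

This only flags volume, of course; it says nothing about review quality, which is what the questionnaire idea below tries to get at.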
(Same author; sorry for so many submissions.) Another possible solution could be to require the person peer reviewing the document to answer some basic questions and to write those answers down.
Summarize the article. State its conclusions. Does the article prescribe anything? If so, what? How was that conclusion arrived at? What experiment was done, and what was the procedure? What is the uncertainty? What measurements were made, and how were they made?
I'm sure others can think of more difficult questions.
If they can't even be bothered to answer some basic questions then they don't deserve to be selected to peer review anything.
(And the reviewer doesn't get to choose the questions.)