"The publishers Springer and IEEE are removing more than 120 papers from their subscription services after a French researcher discovered that the works were computer-generated nonsense.
Over the past two years, computer scientist Cyril Labbé of Joseph Fourier University in Grenoble, France, has catalogued computer-generated papers that made it into more than 30 published conference proceedings between 2008 and 2013. Sixteen appeared in publications by Springer, which is headquartered in Heidelberg, Germany, and more than 100 were published by the Institute of Electrical and Electronics Engineers (IEEE), based in New York. Both publishers, which were privately informed by Labbé, say that they are now removing the papers."
Another possible solution would be to require the person peer reviewing a paper to answer some basic questions about it, in writing. The reviewer wouldn't get to choose the questions. For example:
Summarize the article. State its conclusions. Does the article prescribe anything? If so, what? How was that conclusion reached? What experiment was done, and what was the procedure? What is the uncertainty? What measurements were made, and how were they made?
I'm sure others can think of more difficult questions.
If reviewers can't even be bothered to answer some basic questions, then they don't deserve to be selected to peer review anything.