AnonTechie writes:
"The publishers Springer and IEEE are removing more than 120 papers from their subscription services after a French researcher discovered that the works were computer-generated nonsense.
Over the past two years, computer scientist Cyril Labbé of Joseph Fourier University in Grenoble, France, has catalogued computer-generated papers that made it into more than 30 published conference proceedings between 2008 and 2013. Sixteen appeared in publications by Springer, which is headquartered in Heidelberg, Germany, and more than 100 were published by the Institute of Electrical and Electronics Engineers (IEEE), based in New York. Both publishers, which were privately informed by Labbé, say that they are now removing the papers."
(Score: 2, Funny) by gringer on Monday March 03 2014, @11:26PM
Many theorists would agree that, had it not been for Boolean logic, the evaluation of gibberish papers through checksums might never have occurred. Given the current status of autonomous archetypes, scholars dubiously desire the study of the lookaside buffer as an alternative to checksum analysis. Such a hypothesis is generally an appropriate ambition but is derived from known wrong results, namely slashdot posts.
Our algorithm relies on the practical architecture outlined in the recent infamous work by Martinez et al. in the field of steganography; this seems to hold in most cases. We postulate that modular technology can manage atomic comparisons of gibberish without needing to learn stochastic algorithms. Each component of our algorithm, UlmicOrf, runs in O(n) time, independent of all other components. The architecture for our heuristic consists of four independent components: encrypted algorithms, the Internet, collaborative social networking websites, and RFCs. Thusly, despite substantial work in this area, our method is apparently the application of choice among experts.
Ask me about Sequencing DNA in front of Linus Torvalds [youtube.com]