
SoylentNews is people

posted by janrinok on Wednesday March 19 2014, @11:06PM   Printer-friendly
from the did-the-earth-move-for-you? dept.

dotdotdot writes:

"The Los Angeles Times was the first newspaper to publish a story about an earthquake on Monday thanks to a robot writer.

Ken Schwencke, a journalist and programmer for the Los Angeles Times, was jolted awake at 6:25 a.m. on Monday by an earthquake. He rolled out of bed and went straight to his computer, where he found a brief story about the quake already written and waiting in the system. He glanced over the text and hit "publish." And that's how the LAT became the first media outlet to report on this morning's temblor. 'I think we had it up within three minutes,' Schwencke told Slate.

The original report read: 'A shallow magnitude 4.7 earthquake was reported Monday morning five miles from Westwood, California, according to the U.S. Geological Survey. The temblor occurred at 6:25 a.m. Pacific time at a depth of 5.0 miles.'

According to the USGS, the epicenter was six miles from Beverly Hills, California, seven miles from Universal City, California, seven miles from Santa Monica, California and 348 miles from Sacramento, California. In the past ten days, there have been no earthquakes magnitude 3.0 and greater centered nearby.

This information comes from the USGS Earthquake Notification Service and this post was created by an algorithm written by the author."
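The template-filling approach the story describes can be sketched in a few lines. This is a guess at the general technique, not Schwencke's actual code; the field names and the "shallow" depth cutoff are invented for illustration:

```python
# Sketch of a template-based quake story in the style quoted above.
# The dict stands in for whatever parser extracts fields from the
# USGS notification; field names here are made up.

TEMPLATE = (
    "A {depth_class} magnitude {mag:.1f} earthquake was reported {day} "
    "{time_of_day} {dist:.0f} miles from {place}, according to the U.S. "
    "Geological Survey. The temblor occurred at {time} Pacific time at a "
    "depth of {depth:.1f} miles."
)

def write_story(event):
    """Fill the story template from a parsed USGS notification."""
    # Arbitrary cutoff for illustration; not a USGS definition.
    depth_class = "shallow" if event["depth"] < 6.2 else "deep"
    return TEMPLATE.format(depth_class=depth_class, **event)

event = {
    "mag": 4.7, "day": "Monday", "time_of_day": "morning",
    "dist": 5, "place": "Westwood, California",
    "time": "6:25 a.m.", "depth": 5.0,
}
print(write_story(event))
```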

This discussion has been archived. No new comments can be posted.
  • (Score: 5, Funny) by Buck Feta on Wednesday March 19 2014, @11:09PM

    by Buck Feta (958) on Wednesday March 19 2014, @11:09PM (#18721) Journal

    Now even the L.A. Times is claiming FP.

    --
    - fractious political commentary goes here -
    • (Score: 3, Funny) by Nerdfest on Thursday March 20 2014, @03:00AM

      by Nerdfest (80) on Thursday March 20 2014, @03:00AM (#18764)

      I'm still waiting for a "frosty-piss" headline. That's the point where we know we're on the downhill slide for sure.

      • (Score: 1) by Buck Feta on Thursday March 20 2014, @12:34PM

        by Buck Feta (958) on Thursday March 20 2014, @12:34PM (#18855) Journal

        Ask them to send a reporter to McMurdo Station for St. Patrick's Day?

        --
        - fractious political commentary goes here -
  • (Score: -1, Troll) by Anonymous Coward on Wednesday March 19 2014, @11:09PM

    by Anonymous Coward on Wednesday March 19 2014, @11:09PM (#18722)

    quick enough for a first post?

  • (Score: 0) by Anonymous Coward on Wednesday March 19 2014, @11:30PM

    by Anonymous Coward on Wednesday March 19 2014, @11:30PM (#18723)

Isn't this just a couple of GET requests, a regex for a particular area, and a few sentences made up from the variables? Algorithm? Doesn't sound that groundbreaking to me.

    • (Score: 5, Interesting) by hemocyanin on Thursday March 20 2014, @01:06AM

      by hemocyanin (186) on Thursday March 20 2014, @01:06AM (#18743) Journal

      Yeah -- I do a similar thing with the Marine forecast for my area. I just scrape the data straight from UW, which gets it from NOAA, and eliminate everything but my particular area of interest. I use a cron job to check periodically and make a very simple plain text web page so even when I'm only getting 2g on my phone, it loads right up and I don't have to scroll around to see the data I want.

It's an extremely lame three-line script. This is almost it -- I took out the actual path to index.html -- so it generates a simple file covering one specific marine area in whatever directory you run it in (just don't overwrite an index.html file you actually want):

TZ=US/Pacific date > index.html
echo '<br><hr>' >> index.html
curl http://www.atmos.washington.edu/data/marine_report.html | awk '/PZZ133/, /\$\$/' >> index.html

As much as I like Weather Underground -- you can imagine that a page like this doesn't work at 2g: http://www.wunderground.com/MAR/PZ/133.html

As far as earthquakes go -- no need to do any scraping -- the USGS will push info to you 7 different ways: http://earthquake.usgs.gov/earthquakes/feed/v1.0/
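For the pull-instead-of-scrape route: the USGS also publishes the same data as GeoJSON summary feeds, and a minimal consumer reduces to a few lines. The feed snippet below is a trimmed, hand-made sample in the feed's general shape, not live data:

```python
import json

def summarize(feed_text):
    """Return one 'M<mag> -- <place>' line per event in a GeoJSON feed."""
    feed = json.loads(feed_text)
    return [f"M{q['properties']['mag']} -- {q['properties']['place']}"
            for q in feed["features"]]

# Trimmed, hand-made sample in the general shape of the USGS summary feed.
sample = """{"features": [{"properties":
    {"mag": 4.7, "place": "5km from Westwood, CA", "time": 1395235500000}}]}"""

print(summarize(sample))
```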

      • (Score: 2) by hemocyanin on Thursday March 20 2014, @01:15AM

        by hemocyanin (186) on Thursday March 20 2014, @01:15AM (#18746) Journal

        If you are wondering about the TZ bit -- the server this runs on isn't in my time zone and I didn't want to have to convert in my head. Also, I want to have a date independent of the date embedded in the data so I know I'm actually looking at current data rather than old data due to my cron job stopping for some reason. Lastly, I'm best described as a lousy amateur programmer.

    • (Score: 1) by paddym on Thursday March 20 2014, @05:05AM

      by paddym (196) on Thursday March 20 2014, @05:05AM (#18783)

      Takes me back to my early days of learning BASIC:
      Enter a noun?
      Earthquake
      Enter a verb?
      Shakes
      Enter an adjective?
      Vigorously
      Enter a location?
      bed

      A tiger was seen mauling an Earthquake yesterday. For those who don't know, a tiger Shakes its head Vigorously at any adversary. To catch tiger cubs, go into tiger dens in bed.

Ok, so the last sentence wasn't really a madlib, but a fortune from a Chinese restaurant. Since I was reminiscing, I put it in there as well.
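The BASIC madlib above translates almost word for word; a toy version:

```python
# Toy madlib in the spirit of the BASIC one above.
def madlib(noun, verb, adjective, location):
    return (f"A tiger was seen mauling an {noun} yesterday. "
            f"For those who don't know, a tiger {verb} its head "
            f"{adjective} at any adversary. "
            f"To catch tiger cubs, go into tiger dens in {location}.")

print(madlib("Earthquake", "Shakes", "Vigorously", "bed"))
```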

    • (Score: 0) by Anonymous Coward on Thursday March 20 2014, @06:42AM

      by Anonymous Coward on Thursday March 20 2014, @06:42AM (#18789)

      I was doing this with system messages being wrapped up in SMS to the DB admin, back in ... 1999. Shows you just how far behind journalists are...

  • (Score: 5, Funny) by pbnjoe on Wednesday March 19 2014, @11:36PM

    by pbnjoe (313) on Wednesday March 19 2014, @11:36PM (#18724) Journal

    As Janet stated, I can't believe my aunt makes $964/hr working from the computer, just by going to--
    Woah, hey, watch where you point that downmod. Just a joke; we're all friends here.

    • (Score: 3, Insightful) by pbnjoe on Wednesday March 19 2014, @11:44PM

      by pbnjoe (313) on Wednesday March 19 2014, @11:44PM (#18725) Journal

Seriously though, I hope that bot-written articles that get published stick to short, immediately relevant things like earthquakes, where all that's needed is numbers. I wouldn't imagine anything written by bots would be much of an informative/interesting read.
      Then again, my only frame of reference is spambot garbage, so...

      • (Score: 3, Interesting) by cybro on Wednesday March 19 2014, @11:51PM

        by cybro (1144) on Wednesday March 19 2014, @11:51PM (#18728)

        What if a bot did a replay attack:

1) scans popular message boards
2) builds a massive database of all the posts
3) calculates how popular a post was by how often it was replied to / quoted
4) records what keywords existed in the stories or posts that the popular posts replied to
5) regurgitates that content at the "appropriate" times
6) replies to itself if needed (sock puppet) using the same process

Would you always be able to tell it was a bot, when the material was organic in origin?
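Steps 3 and 4 of that scheme are the only even slightly interesting parts, and they reduce to counting. A rough sketch, with all data structures invented for illustration:

```python
from collections import Counter

def score_posts(posts, replies):
    """Popularity = number of replies a post received (step 3)."""
    scores = Counter(r["parent_id"] for r in replies)
    return {p["id"]: scores[p["id"]] for p in posts}

def popular_keywords(posts, replies, threshold=2):
    """Keywords drawn from posts whose score clears a threshold (step 4)."""
    scores = score_posts(posts, replies)
    words = Counter()
    for p in posts:
        if scores[p["id"]] >= threshold:
            words.update(p["text"].lower().split())
    return words

posts = [{"id": 1, "text": "robot writes earthquake story"},
         {"id": 2, "text": "off topic ramble"}]
replies = [{"parent_id": 1}, {"parent_id": 1}, {"parent_id": 2}]
print(popular_keywords(posts, replies).most_common(3))
```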

        • (Score: 5, Funny) by Geotti on Thursday March 20 2014, @02:00AM

          by Geotti (1146) on Thursday March 20 2014, @02:00AM (#18754) Journal

          You forgot the obligatory

          1. ???
          2. Profit!
          • (Score: 3, Informative) by Geotti on Thursday March 20 2014, @02:03AM

            by Geotti (1146) on Thursday March 20 2014, @02:03AM (#18755) Journal

Huh? Someone stole the numbers in my <ol>, but they appear when replying...

      • (Score: 4, Funny) by stderr on Thursday March 20 2014, @12:44AM

        by stderr (11) on Thursday March 20 2014, @12:44AM (#18742) Journal

        I wouldn't imagine anything written by bots would be much of an informative/interesting read.

        So no "constructive"-bots [xkcd.com] for you?

        --
        alias sudo="echo make it yourself #" # ... and get off my lawn!
      • (Score: 3, Informative) by frojack on Thursday March 20 2014, @01:33AM

        by frojack (1554) on Thursday March 20 2014, @01:33AM (#18749) Journal

        But who goes to their newspaper's web site for this kind of breaking news anyway?

Any one of a dozen breaking news apps will work better than waiting for your newspaper's bot to scrape the USGS. If I lived in earthquake country I'd follow @USGS_EQ_CA on twitter or something.

        --
        No, you are mistaken. I've always had this sig.
  • (Score: 2) by bd on Wednesday March 19 2014, @11:47PM

    by bd (2773) on Wednesday March 19 2014, @11:47PM (#18727)

    There were some mad programming skills at work.

    So this guy is in an earthquake-prone region and made a glorified word template for earthquakes? Parsing an automatically sent warning email for formatted data? Oh. my. god. The last paragraph of the original "computer generated" story said:

    This information comes from the USGS Earthquake Notification Service and this post was created by an algorithm written by the author.

    an algorithm... he probably really believes he did something special and ingeniously novel there.

    • (Score: 1) by paulej72 on Thursday March 20 2014, @12:15AM

      by paulej72 (58) on Thursday March 20 2014, @12:15AM (#18734) Journal

I want to know what the original USGS report said. The generated article was so simple, it could have been a copy-paste job.

      --
      Team Leader for SN Development
      • (Score: 2) by bd on Thursday March 20 2014, @08:43AM

        by bd (2773) on Thursday March 20 2014, @08:43AM (#18808)

They will send you an example mail. See: https://sslearthquake.usgs.gov/ens/. Basically a list of time, coordinates, depth, and magnitude.

        It contains all the information, but not the text of the ingenious newspaper article.
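So the whole "algorithm" plausibly reduces to pulling four values out of a mail like that. A sketch, with the message layout invented to imitate such a notification (the real ENS format differs):

```python
import re

# Invented layout loosely imitating an ENS-style notification;
# the real mail format differs.
mail = """\
Magnitude: 4.7
Time: 2014-03-17 06:25:36 PDT
Location: 34.14N 118.49W
Depth: 8.0 km
"""

def parse_notification(text):
    """Pull 'Key: value' lines into a dict and coerce the numeric fields."""
    fields = dict(re.findall(r"^(\w+): (.+)$", text, re.M))
    fields["Magnitude"] = float(fields["Magnitude"])
    fields["Depth"] = float(fields["Depth"].split()[0])
    return fields

print(parse_notification(mail))
```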

    • (Score: 3, Interesting) by AnythingGoes on Thursday March 20 2014, @12:42AM

      by AnythingGoes (3345) on Thursday March 20 2014, @12:42AM (#18741)
I wonder if shaking his bed would cause him to send out the same message :)
In which case, I can think of some wonderful hacks (in the old sense of the word).
  • (Score: 5, Funny) by MrGuy on Thursday March 20 2014, @12:21AM

    by MrGuy (1007) on Thursday March 20 2014, @12:21AM (#18736)

    ....welcome our new robot journalists.

  • (Score: 3, Interesting) by istartedi on Thursday March 20 2014, @12:31AM

    by istartedi (123) on Thursday March 20 2014, @12:31AM (#18738) Journal

This is marginally better than USGS TED [usgs.gov]. The annoying thing about the USGS tweets is that they take the magnitude and convert it into verbiage. IIRC, 5.0-5.9=strong, 6.0-6.9=powerful, 7.0-7.9=major, 8.0-8.9=devastating, and 9+=catastrophic. I hit them back with a request to just put the magnitude in there a while ago, but got no reply.

    --
    Appended to the end of comments you post. Max: 120 chars.
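The verbiage mapping istartedi describes is a few lines to reproduce. This sketch uses the bands quoted in the comment, which are recalled from memory there and are not official USGS terms:

```python
# Bands as quoted in the comment above (IIRC-grade, not official USGS terms).
BANDS = [(9.0, "catastrophic"), (8.0, "devastating"),
         (7.0, "major"), (6.0, "powerful"), (5.0, "strong")]

def verbiage(mag):
    """Map a magnitude to the tweet wording -- or just keep the number."""
    for floor, word in BANDS:
        if mag >= floor:
            return word
    return f"magnitude {mag}"  # what istartedi asked for: just the number

print(verbiage(4.7), "|", verbiage(6.3))
```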
  • (Score: 3, Insightful) by gishzida on Thursday March 20 2014, @02:18AM

    by gishzida (2870) on Thursday March 20 2014, @02:18AM (#18757) Journal

A "But Wait There's More After This Commercial Message from Our Friends at..." script. At this point an advertiser's name will be inserted, with an advertisement specific to the news event: health insurance for an epidemic story, particular types of insurance for a particular disaster, or the four seasons of California: Fire, Flood, Earthquake, or Civil Disturbance... and so on. Then the script will add a whole bunch of boilerplate catch phrases of absolutely zero semantic content... and all of this will be available to advertisers at a penny or two per email address.

    Who needs journalists when the future of news will all be scripted? [Gives a whole new meaning to script kiddie]

  • (Score: 3, Insightful) by tomp on Thursday March 20 2014, @04:25AM

    by tomp (996) on Thursday March 20 2014, @04:25AM (#18776)

So LA Times readers found out about this three minutes after everyone who subscribes to the USGS updates? Better late than never, I guess. Why not get this information from the source? The LA Times sure didn't seem to add anything that was missing from the government report.

Why not cut out the middle man and just get the info from the source? U.S. citizens already pay for it, the government makes it easy, and they actually do a good job of it.

  • (Score: 3, Insightful) by Boxzy on Thursday March 20 2014, @11:14AM

    by Boxzy (742) on Thursday March 20 2014, @11:14AM (#18838) Journal

    Idiot just removed his own job and the whole industry..

    2.. 1..

    --
    Go green, Go Soylent.
    • (Score: 2) by Phoenix666 on Friday March 21 2014, @09:14AM

      by Phoenix666 (552) on Friday March 21 2014, @09:14AM (#19202) Journal

My first programming job was writing a story lead generation program for Crain's Chicago Business. Trust me, the news has been automated for years. That's why every newspaper you read has the same articles -- they're all plucked from the AP wire automatically. The remaining staff aren't there to report or investigate but to spin and sell advertising. I used to sit next to the sales guys and listen to them all day. There were three times more of them than anyone else, and they had the best offices.

      --
      Washington DC delenda est.
  • (Score: 3, Interesting) by McD on Thursday March 20 2014, @04:17PM

    by McD (540) Subscriber Badge on Thursday March 20 2014, @04:17PM (#18937)

    You know what?

    I came here for all the laughter from those of us who realize just how trivial this kind of thing is to write. Hands up if you've ever scraped a web page and emailed it to yourself, you know?

But on reflection... think about this from the reporter's perspective. He wrote "an algorithm" to do something useful, and got some publicity for it. He's justifiably excited -- think about the first program you wrote that worked!

    I think the real headline here is, "non-programmer realizes just how incredibly handy a programmable computer can be."

We all see the shift happening from computers as "something you program" towards "an appliance you operate." Stories like this are a perfect example of why things like Apple's walled gardens are harmful -- would this be possible if the reporter needed approval from Apple before he could write his program?

    This guy defends the general purpose computer. Good for him! Someone go buy him a Perl book.

    Peace,
    -McD