posted by NCommander on Monday December 05 2022, @08:37AM
Hey folks,

Well, it's been a bit of time since I last posted, and I've had to think a fair bit about the comments I received. It's become very clear that while I'm still willing to at least help in technical matters, the effort to reforge SN is much higher than I expected. In addition, given the, shall we say, lukewarm response to my posts and journal entries, I'm clearly not the right person for the job.

I think at this point, it's time to figure out who is going to lead SN going forward. After my de facto stepping down in 2020, the site has, for want of a better word, been a bit listless. At the moment, no one on staff really has the cycles to take that position on. A few people have expressed interest in the position, and I've talked with Matt, who is co-owner of the site, about this. By and large, whoever fills the seat will have to figure out what, if anything, needs to change with regard to moderation policy, content, and more.

If you're interested in potentially fulfilling the role, drop me an email at michael -at- casadevall.pro, with the subject of "SN Project Leader", and include the following:

  • Who you are
  • What you want to do with the site
  • How you intend to do it
  • Why you want to get involved

I'll leave this call for candidates open until December 14th, at which point Matt and I will go through the responses and figure out our short list. I'll then talk to the editors and solicit more comments from the community. I'm hoping to announce a successor in early January and formalize the transition sometime in February, which will be the site's 9th anniversary.

 
  • (Score: 2) by janrinok on Monday December 05 2022, @03:35PM (6 children)

    by janrinok (52) Subscriber Badge on Monday December 05 2022, @03:35PM (#1281266) Journal

    There are problems with off-loading backups to remote locations. We are handling personal data from all over the world and some nations expect (not unreasonably in my view) that it is given adequate protection. A single person's data might not seem important to you but the compromise of a whole database could certainly result in a few legal challenges.

    The major players, such as Linode, AWS, etc., can easily handle such legal requirements, but they become an onerous task for a private individual.

  • (Score: 3, Informative) by RS3 on Monday December 05 2022, @04:50PM (4 children)

    by RS3 (6367) on Monday December 05 2022, @04:50PM (#1281283)

    I have a lot more to say but little time at the moment:

    To OP: you don't need a single line of Perl code to back up the database:

    /usr/bin/mysqldump -q -e -hlocalhost -u(db username) -p(db username's password) (db name) > (some filename, or process like gzip, and/or an encryption utility then > filename, etc.)

    As you can see, you then encrypt the entire backup. Easy. I have cron.daily scripts doing this. They generate a backup filename based on the db_name and date+time at runtime.
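
    A minimal sketch of such a cron.daily script, with placeholder credentials, database name, and paths (the openssl step is just one example of an encryption utility):

    #!/bin/sh
    # Nightly dump: compress, encrypt, and stamp the filename with the date/time
    DB_NAME="some_db"
    DB_USER="backup_user"
    DB_PASS="secret"
    STAMP=$(date +%Y%m%d-%H%M%S)
    OUT="/var/backups/mysql/${DB_NAME}-${STAMP}.sql.gz.enc"

    /usr/bin/mysqldump -q -e -hlocalhost -u"${DB_USER}" -p"${DB_PASS}" "${DB_NAME}" \
        | gzip \
        | openssl enc -aes-256-cbc -pbkdf2 -salt -pass file:/root/.backup_passphrase \
        > "${OUT}"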

    In all fairness, I did not invent this; someone before me, whom I don't know, created the backup system in the early 2000s. Then other types of backup systems involving an external backup server (just a fileserver), tape / optical / yet another hard disk / paper tape / punched cards (jk!) did their thing.

    • (Score: 2) by janrinok on Monday December 05 2022, @06:14PM (1 child)

      by janrinok (52) Subscriber Badge on Monday December 05 2022, @06:14PM (#1281296) Journal

      I use something similar on my own databases.

      • (Score: 2) by RS3 on Monday December 05 2022, @07:31PM

        by RS3 (6367) on Monday December 05 2022, @07:31PM (#1281308)

        Funny you mention that. I used to use "FoxPro" (similar to dBase) for a few personal databases, but I haven't run that software in probably 15 years, and I forget which older system's hard disk has it (maybe one that died!).

        One database I built long ago was simple- my house's circuit breaker panel and various branch circuits. I made several printouts sorted on breaker number, another sorted by floor then room.

        I finally decided to add my own circuit database to one of the MySQL ones I admin. But I neglected (just didn't think of it) to add a backup script for it. No need to keep backing it up; I'll do one manual mysqldump backup and be happy that I don't have to key it in again. Thanks!!

    • (Score: 1) by shrewdsheep on Monday December 05 2022, @07:04PM (1 child)

      by shrewdsheep (5215) on Monday December 05 2022, @07:04PM (#1281301)

      This, plus using either a relay log or the general query log (rotated at the time of the db dump), should also allow reconstruction to points in time between dumps. I have not worked with MySQL and do not know whether the dump is atomic, which might be required to make replaying the log files work.

      • (Score: 3, Interesting) by RS3 on Monday December 05 2022, @07:21PM

        by RS3 (6367) on Monday December 05 2022, @07:21PM (#1281305)

        Yes, really good point. AFAIK, and web search results agree, MySQL is atomic, so you'd want to back up the 2 (3?) log files.
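
        A rough sketch of the dump-plus-log-replay idea from the parent comment, using the binary log; database name, paths, and timestamps below are placeholders:

        # Consistent dump that also rotates the binary log at dump time
        mysqldump --single-transaction --flush-logs --databases some_db > dump.sql

        # Recovery: restore the dump, then replay the binary logs written
        # after it, up to the desired point in time
        mysql < dump.sql
        mysqlbinlog --stop-datetime="2022-12-05 18:00:00" \
            /var/lib/mysql/binlog.000124 /var/lib/mysql/binlog.000125 | mysql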

        It really depends on the time-granularity (resolution) you need. Here, I dunno. It's not a Wall Street high-frequency trading database, so sub-millisecond is probably not necessary.

        It'd be good to do some analysis of the various buffer / cache flushing settings. Big buffers can mean bigger data loss, but too-small buffers would bottleneck a busy system (short of a faster storage system), so some tuning is in order to find the compromise, but that's pretty easy.
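
        For reference, the usual MySQL/InnoDB knobs for that trade-off look something like the following my.cnf excerpt (values are illustrative, not a recommendation):

        [mysqld]
        # 1 = flush the redo log at every commit (safest, slowest)
        # 2 = flush roughly once per second (faster, can lose ~1s of writes on a crash)
        innodb_flush_log_at_trx_commit = 1
        # 1 = sync the binary log at every commit; larger values batch the syncs
        sync_binlog = 1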

        Incremental backup is available within MySQL, so I'd look strongly at fairly frequent incremental backups, which again, compressed text is going to be tiny, and you could auto-delete older ones after a full backup is done.
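
        With stock MySQL, one common way to do that is to treat rotated binary logs as the increments; a sketch, assuming binary logging is enabled and the paths are placeholders:

        # Close out the current binary log so it can be archived
        mysql -e "FLUSH BINARY LOGS"
        # Copy the closed-out logs to the backup area (they compress well)
        rsync -a /var/lib/mysql/binlog.0* /backup/mysql/incremental/
        # After the next full dump, old increments can be dropped, e.g.:
        mysql -e "PURGE BINARY LOGS BEFORE NOW() - INTERVAL 7 DAY"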

  • (Score: 2) by inertnet on Monday December 05 2022, @09:38PM

    by inertnet (4071) on Monday December 05 2022, @09:38PM (#1281325) Journal

    I agree, and I don't even want plain data on my NAS. Someone already mentioned encryption, which would be a logical thing to use with remote backups. I don't need the encryption keys; the site administrators are the only ones who need to be able to rebuild the database from the encrypted backups.
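
    A sketch of that arrangement using GnuPG public-key encryption (the recipient address is hypothetical): the database host only ever holds the public key, so only an administrator holding the private key can decrypt and restore.

    # On the database host: encrypt to the admins' public key before off-loading
    mysqldump -q -e some_db | gzip | gpg --encrypt --recipient admins@example.org > some_db.sql.gz.gpg

    # On an admin machine with the private key: decrypt and restore
    gpg --decrypt some_db.sql.gz.gpg | gunzip | mysql some_db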

    It's nice that you have automated Linode backups, but apparently nobody knows how to restore them; or maybe they're unusable and you should stop paying for them.