Say what you will about netscape/mozilla/firefox, but you've got to admit they've been good for bug-driven development. First, they had so many bugs they had to create a new bugtracker, bugzilla, just to keep track of them all. Then their software leaked so much memory they had to invent a new language, rust, with "guaranteed memory safety". And now they've released rr, which records a program's execution and lets you debug the recording (using gdb), deterministically, as many times as you want. Currently it's linux/x86 only, but linux/x64 support should be here soon.
Related Stories
We've recently had a story on Mozilla's new rr tool, so I thought this might also fit well here. aphyr, aka Kyle Kingsbury, is a backend engineer who had a problem to solve - testing how distributed systems behave under bad network conditions - and no tool to solve it with. So he just made one himself.
Jepsen will simulate a workload on a distributed datastore (a SQL or NoSQL database, a search engine, a queue, ...) and report all the inconsistencies that arise. The tool cooperates with another one, salticid, which helps with setting up the cluster and simulating bad network conditions: high latency, dropped packets and network partitioning. The topic is clearly a complex one, but the rationale and usage are neatly described in this blog entry. He also has a light-hearted, in-depth series on various datastores tested with jepsen.
aphyr is a really cool guy who is also running an easy-to-follow series on clojure, and will help you attend a conference if you can't pay for it yourself.
(Score: 2, Informative) by RoyWard on Thursday January 01 2015, @10:52AM
From the news:
December 12, 2014: rr 3.0.0 released. See announcement. The major feature is x86-64 support.
It looks like an interesting and valuable tool.
(Score: 2) by Arik on Thursday January 01 2015, @02:32PM
It does indeed.
It doesn't replace gdb; it allows you to record an 'execution' and play it back through gdb later, over and over again if necessary. That could make gdb much easier to learn and use.
If laughter is the best medicine, who are the best doctors?
(Score: 3, Insightful) by FatPhil on Thursday January 01 2015, @11:05AM
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 1) by zugedneb on Thursday January 01 2015, @12:56PM
Strange, how this came to be a problem...
I am not an old-timer, but am of oldish school when it comes to programming (hobby level today). The way I was taught, and the way I still work, is to make data structures and hide all the low-level manipulation. The only time I have to work with pointers is when I implement the basic structures; the actual working code is mostly library and function calls...
I am too frightened to look at the code of large projects, but why are there so many(?) strange execution- and memory-related bugs in, for example, Xorg or wine or firefox?
Why does the textbook model of developing fail?
old saying: "a troll is a window into the soul of humanity" + also: https://en.wikipedia.org/wiki/Operation_Ajax
(Score: 2, Insightful) by Anonymous Coward on Thursday January 01 2015, @01:46PM
>Why does the textbook model of developing fail?
Because one person's "great idea" is another's "impossible combination of conditions". :)
And because unfixed bugs breed more bugs.
No one programs internal low-level functions defensively; efficiency is prioritized. No one cares to investigate smallish problems while implementing one's own newest and greatest idea. No one wants to spend time fixing bugs which do not annoy oneself exceedingly.
And thus unexpected behaviours arise and propagate ever deeper into the stack, until they result in a catastrophic failure of some sort. Only then are they investigated and fixed.
It is only to be expected that in any project where known unfixed bugs number in the hundreds of thousands (!), no assumption can be guaranteed valid. Every function in there that does not do a full parameter check is a possible point of failure.
(Score: 2) by FatPhil on Thursday January 01 2015, @04:09PM
Sure, good engineering can minimise the problem, but even good engineers occasionally slip up, and that's where electric-fence and valgrind, et al., come in.
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 1) by DeKO on Thursday January 01 2015, @02:43PM
That's pretty much how any language is born, since the late 90s. "Our language only got the cool bits of C++, and THAT's why it's better!" on release. A couple of versions later, "Look how awesome our language is, now it has MORE features from C++! We are innovating!"
Replace "C++" with just about any established technology (try "CORBA") and it fits the story of just about any "new" tech. It appears that authors nowadays don't realize the importance of features (that exist and are well understood in other tools/environments) right away, and only pressure from the user base will convince them that a feature is needed.
(Score: 2) by darkfeline on Friday January 02 2015, @04:16AM
Having static checking is pretty well established as being beneficial for programmers, not additional complexity that hinders them. Type checking is the big one, but distinguishing pointers for mutable/immutable/whatever memory would go a long way to preventing pesky memory bugs as well as allowing the compiler to optimize for, e.g., threads and parallel programming.
Join the SDF Public Access UNIX System today!
(Score: 0) by Anonymous Coward on Saturday January 03 2015, @09:21PM
Seems like it is a case of "better to have and not need, than not have and need".
http://words.steveklabnik.com/pointers-in-rust-a-guide [steveklabnik.com]
Also, the default kind seems to be a more safe take on pointers than the classic C pointer.
(Score: 5, Interesting) by mtrycz on Thursday January 01 2015, @11:56AM
I'm not all-in with the mozilla bashing. The browser, while it has its issues, is still great, and free as in speech (vs. Chrome, for example, which comes at the hefty price of your soul).
That being said, Mozilla is much *much* more than its browser, and has done more for the web of today than I could count.
Still, I do wonder how far a codebase can go when it's more convenient to write new tools than to refactor it. Cool new tools are always welcome, though.
In capitalist America, ads view YOU!
(Score: 1) by ghost on Friday January 02 2015, @02:12PM
Well, Facebook is a PHP shop. First they ran PHP. Then they transpiled PHP to C++ (HipHop for PHP [facebook.com]). Then they created a new (faster) PHP virtual machine, HHVM [hhvm.com]. Then they added some improvements on top of PHP (Hack [hacklang.org]). And then they created a PHP spec.