I wrote recently asking for advice about how to help a newbie tactfully.
I have been very surprised since then by some of the things our newbie has asked me and by what I have had to explain to him. Is it a "young people today" thing, or am I completely out of touch?
The first thing came up when code reviewing this young guy's C. As you probably know, when you are working in a programming team there are certain things you have to agree on, such as the format of the code. It may sound trivial, but if different people format the code differently, it makes tools like diff harder to use, and it makes it harder to spot structural and functional changes in the code. We use spaces for indentation. Tabs were creeping in!
More seriously, though, I found out that he didn't know about git diff. I found out from someone else that he didn't know about branching in git. I would have thought a trivial search online would have given some clues?
Function prototypes are there for a reason. They define the public interface to the functions in a source module. That's why they go in a separate header, and if your program wants to call the functions in another module, you #include that header. I found redeclared function prototypes in the program source (not the module where the functions were implemented), and they had the wrong argument lists for the functions! Why?
The other strange thing was I had to explain the difference between ASCII and binary. I had to explain, with an ASCII chart (which he didn't understand), that NUL has code 0. I also had to explain why editing binary data in a text editor was not a great idea.
Also, you don't need to terminate a binary array with '\0'. Certainly not outside the array bounds.
I've tried to keep it polite, professional and helpful, but the other day he informed me that he is only used to getting one or two code review comments and they're only about "the functionality."
Kids today? He has nearly 10 years of experience. Am I too harsh? It's a lot of code, so it's got a steep learning curve, and it's for an important job.
(Score: 2, Interesting) by Anonymous Coward on Sunday March 12, @05:40PM (3 children)
It sounds like somebody has zero experience in C. The rot started in the 00s with an influx of careerists and affirmative action recruitment. Then we got the "passionate about tech" types - because every true technician knows transistors switch courtesy of the magical properties of passion. Now universities are getting rid of standardized testing altogether. I'm reliably informed that questioning any of this makes you a terrible bigot. Embrace modern Lysenkoism or risk the wrath of Lineker!
(Score: 3, Touché) by Anonymous Coward on Monday March 13, @12:36AM (2 children)
"careerists and affirmative action recruitment"
Found the cranky racist! Couldn't be the fault of corporations cheaping out on hiring and driving away experienced coders that get too expensive. Nooooo, gotta be "careerists" (lolwut??) and "affirmative action" dog whistle.
(Score: 0) by Anonymous Coward on Monday March 13, @01:58AM (1 child)
Your objections to H1Bs have been noted comrade. Please turn and face the wall!
(Score: 1, Insightful) by Anonymous Coward on Monday March 13, @03:44PM
Um, buddy? H1Bs are a capitalist thing, not communist. Best go get a brain tune-up, offered at most local community colleges at quite reasonable prices!
(Score: 2) by https on Sunday March 12, @05:44PM
It's not age, it's work culture: do you expect to work on someone else's code, or not? Consistent style inculcates the idea that someone you've never met may have to fix your "good enough for now" oversight. And, it allows diff to work. On the other hand, wdiff exists.
If you like the kid, lend them your copy of the blue C book and its errata. And, see if you are allowed to say harsh things about their supervisors. Demanding results now and not asking how hints at a minefield of corner cases.
How important is consistent style when working with other programmers? Consider Ken Thompson's comment on Ritchie:
Offended and laughing about it.
(Score: 1) by khallow on Sunday March 12, @05:59PM (1 child)
Maybe less tact and more homework? A bunch of this is stuff that he can do himself with minor guidance. But you'll need to talk to him straight about it. Also, maybe work with his boss so that this homework counts as a deliverable for him rather than extra work that's not relevant to his job?
(Score: 4, Informative) by DannyB on Monday March 13, @03:03PM
An experienced developer can help younger developers by giving them good advice.
#define while if // speeds up code
#define struct union // uses less memory
The source debugger will be baffling. Even using a machine level debugger will create puzzlement.
How often should I have my memory checked? I used to know but...
(Score: 0) by Anonymous Coward on Sunday March 12, @06:13PM (9 children)
Maybe he just lacks the bit-twiddling experience due to never having to touch assembly or think much about why or how the computer works? I certainly didn't understand much of why things were the way they are until I sat down and read some architecture manuals and wrote some nontrivial programs in assembly. He also could just be the kind of person that's in tech for the money, and has all the right credentials but none of the motivation or personal interest that make a great craftsman.
(Score: 2) by turgid on Sunday March 12, @06:34PM (8 children)
The CV claims many years of embedded C experience, not far short of a decade.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 4, Interesting) by krishnoid on Sunday March 12, @09:26PM (1 child)
+1, Suspicious.
(Score: 1, Touché) by Anonymous Coward on Sunday March 12, @10:01PM
One suspects the many years of C experience were embedded in his ass from whence the claim was produced.
I find the idea of no ASCII familiarity in a language without strings fascinating and would like to read this kid's C23 standards proposal.
(Score: 1, Insightful) by Anonymous Coward on Monday March 13, @07:12AM (5 children)
The thing with a CV is that there is experience and then there is "experience." On the other side of that, they may have real experience in the area, but not in the same way you do. Maybe they used another VCS, which has a different mental model and expectations than git. Maybe that experience wasn't working with others in the same way. Maybe it turns out that the source of his experience was the same ass he pulled his CV from. The whole point of the interview is to discover which it is. For example, I have over a decade of experience in embedded, C, OS development, and various real-time systems, and if you asked me about them in an interview for a job developing a hard real-time OS in C for a bespoke embedded platform, I could bullshit my way through by using my strengths to cover my weaknesses, unless you asked the right questions. But when it came time to actually write that code with a team using SVN in VS Code, I'd definitely need my fair share of time, docs, some web searches, and Stack Overflow (scary, I know) to muddle my way through to the MVP.
I should also point out that some of this is your system's fault too. Your tabs vs. spaces thing, for example, should be caught by a hook. I also suspect that, like many people when they get out of their depth, you may be running into his XY problems. There is also the possibility that he is just out of his depth for various reasons, or just plain on the low end of the bell curve. Was there any sort of formal onboarding, and a period to become familiar with the product before he was expected to code? Where are his boss and team on this? Are there systems to give him what he needs, or to replace him if he is bad? Instead of all the "kids" being bad at everything, it could just be him, in his current environment, with his past experiences and expectations.
(Score: 2) by turgid on Monday March 13, @07:44AM (4 children)
Our onboarding process certainly is deficient, and the guy is new to the industry. It was a little unfair to give him a few thousand lines of code that I had written, plus a bunch of documentation, and tell him, "Here, finish this."
The problem is our company has burnt out every single software engineer on the team, and we have either all left or are leaving. Our boss doesn't know C or embedded development. I'm the "brains" of the outfit, and those of us who are left are all 150% busy fighting fires on other projects, which is why this code was given to the new guy "to finish off."
I'll admit, there are a couple of thousand lines of code there that I wrote in under a week, and it's a bit short on comments. There are unit tests and regression tests, however, and they work. If you follow them, you should understand how the code is supposed to work (it's pretty darn simple).
Yes, we should have hooks to catch the tabs vs. spaces thing, and I believe some of our other repos have them (like I said, I did this in a hurry).
These programs work on binary data. The guy was opening binary output in a text editor, copying it with the mouse and pasting it into another instead of just copying the file with cp. Then he was wondering why there was an extra 0x0a at the end of the new file. This is someone with nearly a decade of "embedded C" experience (RTOS etc.) on their CV. Apparently this guy has previously only used a proprietary embedded C compiler and IDE.
I just don't get it. How can you work for nearly a decade in the industry and not know these things? Am I expecting too much?
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 0) by Anonymous Coward on Monday March 13, @03:49PM (1 child)
If the problem never came up how would they learn about it? Maybe their old workplace had custom text editors that allowed for copy/pasting? Maybe the kid is a big fat liar? Regardless, it is your duty to help them become an effective team member. Or you can quit.
(Score: 2) by turgid on Monday March 13, @05:54PM
Indeed, but you'd expect that with a computing/engineering degree and nearly a decade of industrial experience, it might have come up.
I suspect the quality of code reviews previously was rather lax. I won't say which company this person used to work for, but he is at least trying to learn and do better. I'm just really surprised.
It is indeed.
That is my medium-term goal.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 1, Informative) by Anonymous Coward on Monday March 13, @11:10PM
I cannot answer your question, for two reasons: I don't know what that person claimed in their CV and interviews, and I don't know what your background and experience are. Both are important, because without them it is impossible to know what falls under the umbrella of their claims and what falls under the false consensus effect from your experience. Basically, your complaints seem to be unfamiliarity with POSIX utilities, unfamiliarity with your coding conventions, him being a bit slow to figure your code out, unfamiliarity with git, and unfamiliarity with text encodings. I can come up with a number of scenarios where any candidate with that claimed experience who does not know all of that is obviously a liar making things up. However, I can also come up with a number of scenarios where a candidate with that claimed experience could honestly not know some of it.
With all of those disclaimers laid out, I think some of what you expect probably does fall under things they should know, and some is you expecting too much. In my experience with a block language like C, mixed whitespace is just something that happens. A number of diff utilities and VCSes ignore (or can be told to ignore) whitespace changes precisely for that reason. There are a number of VCSes (including not having one!) that companies use, and they have different models as well as graphical tool abstractions, so you don't have to know git to be a programmer. There are also a number of IDEs for Windows targeting embedded platforms that totally negate the need to know POSIX utils. As for your code, of course it is perfectly easy for you to grok: you wrote it! You bring your innate understanding every time you look at it, plus you could be firmly above average in ability, like the proverbial supercar doing 100+ mph on a familiar interstate, wondering why all the cars from other states drive so slow.
All in all, the truth is probably somewhere in the middle: some of it is fluffing up the CV (everybody fluffs a little) and some of it is you expecting your past to apply to everyone (which I know from my own experience can be a very hard bias to overcome, at least it is for me, especially in the work environment where you don't necessarily have the time).
(Score: 3, Interesting) by istartedi on Monday March 13, @11:29PM
Apparently this guy has previously only used a proprietary embedded C compiler and IDE.
If his CV isn't just BS, that could explain a lot. The "string\0" thing seems particularly idiosyncratic. I've never seen anybody do that, except perhaps eons ago I might have seen strings where somebody stuck a \0 in the middle because it was a "glob" of strings with the indices into it hard-coded. Weird stuff like that was done to save memory; they may have known they only had a 256-byte block to store the strings in, so they were working within that constraint and phrasing things to fit!
(Score: 3, Insightful) by DannyB on Monday March 13, @02:44PM (1 child)
People from my generation have significant trouble understandifying Unicode.
ASCII is too cemented in their brain.
You must let go of the idea that bytes and characters are the same thing. Working in Java, I have had to embrace this. A character is represented by one or more bytes. An object called a Charset knows how to translate between bytes and characters. Whenever you have a String object, you can obtain a byte array encoding its characters in whatever Charset you choose.
If you want to read/write bytes you use implementations of InputStream / OutputStream.
If you want to read/write characters you use implementations of Reader / Writer.
Naturally there are ready made types you can use such as InputStreamReader which is a character reader from an InputStream. It uses a Charset (which you can supply or use the default) to convert incoming bytes into characters. Similarly reverse this for an OutputStreamWriter.
It works pretty slick once you embrace it. Plus you have complete access to the entire process and pipeline of processing. Need to create your own CharacterCounter or ByteCounter classes and interject them into that pipeline? No problem!
However, I continue to be surprised by how difficult some people find it to let go of the idea that characters and bytes are the same thing. How else do you think we are going to represent all of the necessary characters for the world's many writing systems?
Next, we can talk about an amazing form of ancient unreadable encryption called EBCDIC. Unreadable, that is, unless you have a Charset for it. All these pieces hook together neatly like Lego bricks.
How often should I have my memory checked? I used to know but...
(Score: 1, Insightful) by Anonymous Coward on Tuesday March 14, @02:58AM
Variable-length encodings of both types (packing and multibyte) are one thing I am happy I ran into early in my career. Exactly as you have pointed out, I too have noticed that many people my age have problems with them. Even those I know who were around when Unicode didn't exist, when you had to keep your code pages intact or a byte represented more than a single character (packing VLEs), have trouble getting beyond the one-byte-equals-one-character assumption. The idea that you could need multiple bytes for one character just breaks their brain. It also led me to appreciate, as you do, standard library support for text vs. binary IO in more and more languages. It is one less thing to worry about and, I dare say, something the developer shouldn't have to worry about 99% or more of the time.
(Score: 3, Interesting) by Subsentient on Thursday March 16, @05:04AM (3 children)
I've seen that at a lot of companies in the last few years.
I don't get why. I feel like a lot of people just don't care enough to learn more and get better, and I'm really encouraged when I see people who do care. At the place I'm at now, people seem pretty decent, even though a lot of their code is JS and TypeScript, which I've had bad experiences with in the past. But straight C is also one where I've seen some really awful stuff. I once had a guy contracting for the same company as me tell me that his Raspberry Pi kept overheating, so he had to put a heatsink on it. It wasn't running anything but his less-than-500KB C binary. Turns out he was busy-waiting in his code, for everything, and polling, for everything. So even doing nothing, his code was basically using all of the CPU, spinning in while(true); loops.
Thankfully they're keeping me working on systems language stuff. Interesting, fun stuff for the moment. Lots of C++.
"It is no measure of health to be well adjusted to a profoundly sick society." -Jiddu Krishnamurti
(Score: 2) by turgid on Thursday March 16, @04:47PM (2 children)
I believe that's a "performance" technique for multi-threaded software on multi-core systems. The idea is that if you have threads waiting on messages from other threads (semaphores, callbacks, queues...), it can be faster to have them spin-wait than to put them to sleep for a short period. The theory is that if you keep them spinning, they stay on a CPU and in cache rather than being scheduled out while something else runs, so they start processing again faster when the data arrives.
I have no idea what sort of performance characteristics it gives. The last time I worked on big multithreaded realtime systems, they were usually running on single-core CPUs. For testing, we built and ran the code on workstations and PeeCees. As the number of CPUs or cores increased, we found more and more interesting bugs. That's a good way to find bugs in multi-threaded code: vary the number of physical cores available and run the software.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 0) by Anonymous Coward on Thursday March 16, @11:22PM (1 child)
That performance technique really only works in a cooperative multitasking environment, or when a lack of starvation can be guaranteed (e.g. more processing units than runnable+running tasks). In a preemptive multitasking environment, the operating system can preempt you at the next scheduling point, and the problem with spinning is that those preemptions will usually happen right when you actually get the message, because that is the point at which your task becomes preemptable. Spin-waits are usually only worth it in embedded systems or drivers, where it is possible to guarantee that the spinning period is less than your rescheduling interval.
(Score: 2) by turgid on Friday March 17, @02:30PM
Thanks for that explanation. That makes it clear!
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].