The Coder's Lounge

I'm invoking the "if I have to look at it everybody else has to look at it" rule:

Code:
while ( (name[c++] = ch = fgetc(load)) != '~');

That's right, no body, all the instructions are in the condition statement.
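For anyone who wants to know what it's actually doing: it copies characters from the file into name until it hits a '~' (which also gets stored), with no bounds check, no EOF check, and no terminator. A sketch of one possible refactor - the function name, the constant, and the added checks are all mine, not the original program's:

Code:
#include <stdio.h>

#define FIELD_TERMINATOR '~'   /* the hard-coded '~' gets a name */

/* Read characters from `load` into `name` until a '~' or EOF.
 * Unlike the original, this doesn't store the '~' itself, stops
 * at the end of the buffer, and terminates the string. */
static void read_field(FILE *load, char *name, size_t size)
{
    size_t c = 0;
    int ch;

    if (size == 0)
        return;
    while (c < size - 1) {
        ch = fgetc(load);
        if (ch == EOF || ch == FIELD_TERMINATOR)
            break;
        name[c++] = (char) ch;
    }
    name[c] = '\0';
}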

As part of an assignment we have to refactor (in groups of 6-7) a piece of open-source software that works but is badly written/designed (picked out and assigned to us by the prof/TAs). I'm 97 lines into my 230-line section of code, and this is the most baffling thing I've seen so far, though somehow not by much. (Other gems include breaks everywhere, variables with names that tell you nothing and are declared randomly as needed except for where they were declared as global variables in god knows which file, hard-coded values that should have been defined as constants so I'm not spending 10 minutes wondering why a counter variable has inexplicably been started at 8, an 'else' statement that was 40 lines long and badly indented, and exactly zero comments.)

In my 28 years I have never had even the slightest desire to drink but I'm pretty sure this code is going to change that.
Oh dear. I guess it's a different way to teach you to embrace good coding habits...?
 
Among other things lol. It's also to start teaching us how to work in larger groups, and how to, well, deal with and refactor bad code written by someone else (since in the workplace unless you're independent or working for a startup, you're not usually writing code from scratch). The program I'm in (software engineering) is more focused on "soft skills" and workplace preparation (teamwork, design/documentation, writing readable/maintainable code) vs the pure computer science degree which is more focused on technical skills (math/logic, machine language, hardware).
 
I'm learning through Java in one of my university classes now! Pretty interesting language, but too bad it's pretty much dead.

I am as well, though it's actually just a general class on object-oriented programming that happens to use Java as the example language. It's taking me a while to wrap my head around it (after doing almost everything in C up to this point my brain really really wants to find somewhere to declare function prototypes, and it just feels wrong to be able to do stuff like remove something from an array without having to go and manually shift everything to fill in the empty space) but I'm really liking it.
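For anyone who hasn't had the pleasure, this is the manual shifting in question - a minimal sketch in C (names made up for illustration) of what ArrayList.remove() quietly does for you in Java:

Code:
#include <stdio.h>
#include <string.h>

/* Remove the element at `index` from a C array by closing the
 * gap yourself. The caller has to track the new length, too. */
static size_t remove_at(int *arr, size_t len, size_t index)
{
    if (index >= len)
        return len;                     /* nothing to remove */
    memmove(&arr[index], &arr[index + 1],
            (len - index - 1) * sizeof arr[0]);
    return len - 1;
}

int main(void)
{
    int nums[] = { 10, 20, 30, 40 };
    size_t len = remove_at(nums, 4, 1); /* drop the 20 */

    for (size_t i = 0; i < len; i++)
        printf("%d ", nums[i]);         /* prints: 10 30 40 */
    printf("\n");
    return 0;
}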
 
Java is 100% far from dead. :)
Personally speaking, I feel the only thing really keeping Java alive is Android. Java support in browsers is basically nonexistent now, and most applications don't really make use of it; developers would rather just target Windows or OS X exclusively and use native libraries, unless they can do cross-platform without sacrificing quality or control. While I did learn it in school, it's really not the first language I'd go for. I'm hearing C# is one language a lot of employers are looking for, besides Python, C++, or JavaScript, and as I once read, "C# is Java done right" - I'm inclined to agree, having used both. Doing things in Java is a huge pain sometimes; C# feels like a memory-managed version of C++ with lots and lots of padding, while Java does some things in a really absurd way, especially when it comes to syntax.
 
I'm currently dealing with DPI scaling. Interestingly enough, you can tell that it isn't the topmost priority of the Delphi creators, considering how buggy that stuff still is in their implementation. So I end up having to do most of the work myself. Technically it's not even all that hard, but font sizes especially really mess things up...
 
They are completely different. JavaScript is a scripting language used primarily for web development, whereas Java is more of a general-purpose language, aimed at software projects, that claims to be cross-platform.

Is the developer important when considering the two?
They both apparently fell under the Netscape, Inc. banner at one point in their history. "Java" being in "JavaScript" has to mean something, after all.
 
I'd argue the developer doesn't matter so much as what the two languages actually do. From what I can tell, Java has more usage but less capability because of how broad it is. JavaScript has diverged enough that it's sort of its own entity, one that's heavily in use and on the rise, while Java, as has been stated before, is mostly falling out of use (sometimes even to JavaScript, because of web apps).
 
JavaScript, however, is a specialized tool, increasing the division of labor. It is used mainly to animate and code interactions between the site and the user. Also, JavaScript is non-functional outside of an HTML web page. So I could see it if the need for more specialized software were on the rise, but why would Java fall out of favor if it has so much more versatility?
 
It's really not "falling out of favor" as far as I can tell. Look up any article on most-used languages or most-asked-for in job postings for the past year or two, and Java is almost always in the top 3, and if not, it's in the top 5 (I don't think I've seen any articles where it's lower than top 5). It's dropped a place or two according to some of those articles (it all depends on exactly what they're using to measure), but it's still very much in demand (and I believe it's still the most popular object-oriented language, even when it's not listed as the most popular language overall).
 
It's really not "falling out of favor" as far as I can tell.

In the mid-to-late 2000s there was a surge of news stories predicting the demise of Java in the years to come. Doubt over whether or not Java can compete with more recent programming languages, particularly Python, has persisted. I don't know too much about this yet, but from what I have seen, Python is getting high praise, and that should be worrisome at least.

Java vs JavaScript arguments aside (which is frankly apples to oranges), this is entirely false. The JavaScript runtime powers many services today that aren't directly linked with rendering a web page.

Any actual examples? My research suggests otherwise.
 
I still have my gripes with Node, mostly how it introduces A LOT of bloat, and how there's basically a package for literally everything. In fact I made a tweet mocking it earlier, and the sheer accuracy of it is unreal.
 
Yeah, saw that tweet and I loved it. Node is a blight on this world. I've had my share of fights with the developers where I work, because they insist not only on using Node, but also on giving the node package manager full access to the server's systems, overwriting or upgrading system libraries.

I ended up making Backend agree to a waiver last year where I would no longer take any responsibility for maintaining or assisting a server after npm modules were enabled globally. They (the devs) were not too happy about that, and neither were they when I shared your tweet, but some aesops do need to be delivered, and "you are doing it wrong, you refuse to do it right despite my best offerings, so I won't kill you but I also don't need to save you" is one of them.
 
Again, it's very easy to get caught up in comparisons of two very different tools. Python lacks the type safety of a proper systems language, yet it's very effective for quick prototyping. One tool is good for one type of job, and the other for a different kind. Python and Java can perfectly coexist, and neither is a replacement for the other.

The original discussion was over comparing Java and Javascript. I was merely pointing out what the framework of the Python vs. Java crowd has looked like over the years, with special attention to more recent years.

When you say it does not have the safety of a proper language, do you mean that it has plenty of easily manipulated tiny bugs or that it has massive oversight that makes it grossly inefficient? If it is the latter, I do not understand why this type of thing would be getting heralded as the wave of the future.

If I need all of them in my coding belt, why would anybody be focused on being an expert in just one? Would it not just make more sense to splice them all together at this point? There are probably hundreds of different languages from Java, to Argus, to Python, to C++. If they all bring something to the table, then you would constantly need a team of programmers to get any work done.
 
Okay, tone it down a notch. I was merely trying to get you to elaborate. Unsupported method calls, like String.format, seem like tiny bugs to me. Why would you say otherwise?
 
So "unsupported method calls" are just the code telling the programmer that it cannot find anything for the specific input? It is an easy mistake because the programmer should know to cover all of their tracks when designing the code? Or am I still not getting it? This would explain why a complex system would see more of these mistakes because the programmer could easily have forgotten to finish a pathway. Tracking back to find the type of object means finding the mistake in the code?

Thus, Java and C++ lack the ability to create a single path that works for all inputs for a given output?

What is the difference between compiling and translating code then? Would not compiling be building and translating be following pathways to their conclusions/outputs?
 
I was trying to figure out where "translating" code came up, but couldn't - do you mean "interpreting"?

In the most basic sense, a compiled language is one where you have to run a compiler, which creates a separate executable file that you then run. It's slower to work with, but safer: the compiler picks up (syntax) errors before you run the program, and won't even let you build the executable if the error is serious enough.

An interpreted language (like Python) is "compiled" on the fly as the program runs. It's faster (and therefore good for prototyping) because you don't have to recompile the program every time you make a change, and you can even run very simple programs while writing them by typing the code into the command line instead of a text file - say, to check whether a handful of lines will work before adding them to your program. But it's not as safe: if you make a mistake, you won't get a warning ahead of time, and your program could crash - or, worse, it will appear to work, and you won't know there's an error until much later. That's why interpreted languages are often not a good fit for very large codebases or production software.

ETA: realized I should maybe specify that "faster" in the above cases refers to development time, not running time. All else being equal, a compiled program will run faster than an interpreted program
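A toy C example of that compile-time gate (the variable names are invented for illustration):

Code:
#include <stdio.h>

int main(void)
{
    int count = 8;

    /* count = undeclared_variable; */
    /* ^ uncomment that line and the compiler refuses outright: no
       executable is produced, so the typo can never reach a user.
       An interpreter would only complain when the line actually ran. */

    printf("count = %d\n", count);
    return 0;
}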
 
I am not sure why, but whatever you choose to spend time on is none of my business.

You mean that is how an .exe file is created, right? And it is not only safer, but it sounds more practical to use a compiler - going without one would be like using Word without spellcheck. Python being unsafe is presumably why it is usually compiled into bytecode first before it is uploaded, because it just makes more sense to know ahead of time whether you have a syntax error or not. Most people do not sacrifice quality for a quick fix. Which will be smoother when running? Of course, something that is built without checking for accuracy will be completed faster than something that gets checked. Although, I suppose if you were a confident programmer, then you would be more likely to trust yourself and choose an interpreted language every time.
 
You mean that is how an .exe file is created, right?

Basically, yeah, though it doesn't necessarily have to have the .exe extension

And it is not only safer, but it sounds more practical to use a compiler - going without one would be like using Word without spellcheck. Python being unsafe is presumably why it is usually compiled into bytecode first before it is uploaded, because it just makes more sense to know ahead of time whether you have a syntax error or not. Most people do not sacrifice quality for a quick fix. Which will be smoother when running? Of course, something that is built without checking for accuracy will be completed faster than something that gets checked.

It's kind of like using Word without spellcheck, yeah lol - but again, you're generally choosing an interpreted language for different reasons than a compiled one, in situations where it makes sense to trade safety for faster development time. To continue your analogy: if you're just writing a rough draft or a note to yourself, you might not care about spellcheck, and might not even care about formatting and opt for a faster plain-text editor instead.

As well as not needing to spend time compiling, a lot of interpreted languages have a "get more done with fewer lines of code" philosophy, which makes them great for prototyping or making a proof of concept - e.g., showing a client what the finished product might look like without having to actually *make* the finished product. They're also often used to create front-end interfaces, where safety is less of a concern because all the interface does is get input from the user, pass it to the back-end program written in a more secure compiled language, then take the back-end's output and pass it back to the user.

And sometimes it's just the most practical option under the circumstances, especially if the code isn't going out into the world to be used by others. As an example: my sister is an astrophysics student and spent last summer as a research assistant. They used programs written in C to run simulations and analyze data, because there are lots of open-source programs already available that can be easily modified, it's flexible, and it's relatively safe (C will let you do a lot of stupid things, but at least you generally have to explicitly tell it "I'm doing this stupid thing on purpose"). But they used Python to actually take that data and put it in a readable/presentable form, because doing that in C is a huge pain - especially when you're not actually a programmer and C compilers tend to give very cryptic error messages.

Although, I suppose if you were a confident programmer, then you would be more likely to trust yourself and choose an interpreted language every time.

Not necessarily - I know lots of experienced programmers who hate interpreted languages because it's too easy to make a mistake, and personally, of the languages I've learned so far in school, I vastly prefer C and Java over the interpreted ones we've covered (Perl, Python, and JavaScript). And interpreted languages run more slowly because they have to be, well, interpreted *as* they're running, so as soon as you get past the prototype/demo stage or into anything complicated, you're generally going to want a compiled language - not just because it's safer, but because it will run much faster. (Again, to go back to my sister's research: she switched to Python at one point for analyzing data because it's a much easier language, but switched back to C fairly quickly because the programs just took too long to run. And in my own experience, reading in and analyzing several large csv files in Perl was super easy to code, but the run time was significantly longer than for similar programs I've had to write in C - though much less frustrating, because string handling in C is horrendous, and as much as I don't like Perl, I'd take the longer run time in Perl over fighting with segfaults in C any day if all we're talking about is a quick-and-dirty analysis of text/csv files and not a fully functioning program.)
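To give a taste of the string-handling pain: even the "easy" case of splitting one csv line in C needs a mutable buffer, and strtok silently collapses empty fields ("a,,b" comes out as two fields, not three). A minimal sketch, ignoring quoted fields entirely:

Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char line[] = "alice,42,blue";   /* strtok modifies its input,
                                        so no string literals here */
    char *field = strtok(line, ",");

    while (field != NULL) {
        printf("field: %s\n", field);
        field = strtok(NULL, ",");   /* NULL = keep going on same line */
    }
    return 0;
}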
 
Definitely. It's the reason I will always prefer a compiled language: the potential for some kind of run-time error due to one small mistake is always in the back of my mind, whereas a compiled language catches all that before the final executable is even generated. I've literally had a single missing semicolon in a PHP script make my website error out completely - it wasn't until I checked back through the source that I found the missing semicolon, and presto, my website was working again as if by magic.

The one language you don't compile is assembly - you're basically already doing half of the work the compiler does; you just have the assembler put together the final output, since you've already brought everything down to machine instructions. The problem is that what goes from A to B in one line of a higher-level language suddenly turns into A going to A1 to A2 to A3 to A4 to A5 to B. You have to do by hand all the small optimizations a compiler would already do for you, like unrolling loops or optimizing out redundancies. I like my compiled languages for doing all this work for me and pointing out the obvious errors before I even get to runtime, and going down lower gives me even more power to do what I need, as well as see the inner workings of what I'm doing and manage the code in a more effective way.
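To picture the loop-unrolling part: both functions below compute the same sum. A compiler will typically turn the first into something like the second on its own; in assembly you'd be writing the long form by hand. (A sketch with made-up names, unrolling by four.)

Code:
#include <stdio.h>

/* the straightforward loop you'd actually write */
static long sum_plain(const int *a, size_t n)
{
    long total = 0;
    for (size_t i = 0; i < n; i++)
        total += a[i];
    return total;
}

/* roughly what an unrolling compiler produces: four elements per
 * iteration, plus a cleanup loop for the leftovers */
static long sum_unrolled(const int *a, size_t n)
{
    long total = 0;
    size_t i = 0;

    for (; i + 4 <= n; i += 4)
        total += a[i] + a[i + 1] + a[i + 2] + a[i + 3];
    for (; i < n; i++)
        total += a[i];
    return total;
}

int main(void)
{
    int data[] = { 1, 2, 3, 4, 5, 6, 7 };
    printf("%ld %ld\n", sum_plain(data, 7), sum_unrolled(data, 7)); /* 28 28 */
    return 0;
}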

Interpreted languages, on the other hand, don't do any of this. They run your code line by line until it either terminates successfully or errors out. The problem is that with each line the interpreter comes across, it first has to make sure there are no errors on that line and that everything it's attempting to do is legal: Are the arguments valid? Is the syntax correct? Is there any casting that has to be done? Are we missing something important like a semicolon or some kind of terminator? Is what's typed even a valid operation? Once it's done all that, it runs the line, if possible, then continues and repeats the whole process. It becomes absurdly slow that way. Like mentioned, it's best for prototyping, but I'll be damned if I have to write a full application in an interpreted language - the final product would be absurdly bloated and quite slow to use.
 