The building blocks of programs: Why do you speak this?

Alexander Nicholi

what do you know about computing?
    There is an array (heh) of different "high level" programming languages we use to interface with the computer. But which one do you use? Why? Do you use multiple languages? In what circumstances? What are your reasons?

    If you'd like a refresher, here's a short summary of all of the major languages used with x86 PCs.

    Other languages I either forgot or didn't think were worth mentioning. Sorry QBASIC, PASCAL, and FORTRAN. You were old people languages and probably won't be missed. Fuck off, Perl.


    I'll post my preferences and reasons later.
     
    The IT program that I've been in for the past almost 1.5 years has basically drilled Java into our heads for 3 consecutive terms. I understand their preference for object-oriented programming, but could they not have included other programming languages besides Java?

    I guess they mainly chose that one as the basic foundation for future courses, but I would have liked to learn C and C++ in school. Guess I'll teach myself those on my own time.
     
    It's beyond me why on earth universities are laying a computer science foundation with bricks of OOP. o_O As peachy as OOP is, it's not how computers work. They work a lot more like assembly, or better yet C. Computers function procedurally, not in objects. But anyway...


    I love writing in C. C is my favourite language, I love how simple, elegant, fast, and powerful it is. I just wish it had more libraries D:

    I like Python too but it being interpreted is a bit of a turn-off. C# is nice but I don't touch .NET much for the same reasons. Other languages are just ew.
     
    C has no class though. (hah, sorry!)

    OO seems to be all the rage these days. Though I really wish it wasn't, because inheritance gets to be a pain.
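
    For what it's worth, the classic case of that pain is the diamond: one class ends up inheriting the same base along two paths, and you have to reach for virtual inheritance just to keep a single copy of it. A minimal hypothetical C++ sketch (made-up class names, just for illustration):

```cpp
#include <iostream>

// Hypothetical hierarchy to illustrate the "diamond" pain point.
struct Device {
    int id = 0;
};

// Without `virtual`, Printer and Scanner would each carry their own
// Device subobject, and Copier::id below would be ambiguous.
struct Printer : virtual Device { void print() { std::cout << "print " << id << '\n'; } };
struct Scanner : virtual Device { void scan()  { std::cout << "scan "  << id << '\n'; } };

// Copier inherits Device via two paths; virtual inheritance merges them.
struct Copier : Printer, Scanner {};

int main() {
    Copier c;
    c.id = 42;   // unambiguous only because of virtual inheritance
    c.print();
    c.scan();
}
```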

    I'm kinda bummed that we don't look at legacy languages here. So many of the lecturers here tell you ten times per week that they're Java programmers, and it's getting a bit old. Even one or two lectures covering what came before would do me.

    Started off with C++ since that was the big thing for the course I'm doing, but I'm really liking C# at the moment. Sure, it relies on .NET but it's nicer to look at than C++!

    Whilst writing assembly directly is probably better, it confuses the hell out of me. It is nice, though, getting that close to the actual machine code.
     
    Let's see... I had Delphi in school, and C++, C#, Java, Haskell, and MIPS Assembler at uni, but when it comes down to everyday use I always resort to Python 3, or maybe C#. Python is cool because it allows for simple scripts a couple of lines long as well as complex OOP stuff.

    I was also looking into Ada a while ago, because it looks really interesting, but I haven't done much with it, yet.
     
    I have varying levels of experience with Java, C, and PHP (yes, I'm counting PHP)
     
    I have varying levels of experience with Java, C, and PHP (yes, I'm counting PHP)
    PHP was surprisingly easy for me to understand. I started with it just after I began grasping the general C-like parlance, and with it being a giant collection of functions and the usual keywords, there wasn't a whole lot to "get", at least IMO.

    Everything wrong with Java is more or less remedied in Python, I think. I prefer writing in C because I like being close to the machine. For quickie apps though C# is my bae
     
    It's beyond me why on earth universities are laying a computer science foundation with bricks of OOP. o_O As peachy as OOP is, it's not how computers work. They work a lot more like assembly, or better yet C. Computers function procedurally, not in objects. But anyway...
    In computer science, you're not supposed to worry about how a computer functions. Rather, you're supposed to worry about your software: its design, procedures, and implementation. You're already operating in a high-level programming language--you've really no need to learn how the computer distributes memory in its system; rather, you should worry about how much memory your program uses. Once you've drilled what computer science is into your head, perhaps you'll better understand why OOP is arguably one of the best programming paradigms out there. :p

    I myself know C and Java like the back of my hand. I wish Java wasn't so damn slow though :(
     
    In computer science, you're not supposed to worry about how a computer functions. Rather, you're supposed to worry about your software: its design, procedures, and implementation. You're already operating in a high-level programming language--you've really no need to learn how the computer distributes memory in its system; rather, you should worry about how much memory your program uses. Once you've drilled what computer science is into your head, perhaps you'll understand better :p OOP is arguably one of the best programming paradigms out there.
    The relationship between source code and machine code is vital for evaluating a program's object code size, execution speed, and overall efficiency.

    I'm not really suggesting one worry about functional irrelevancies, but rather about how their code plays out once it's compiled. OOP is a good paradigm, but it also requires shifting away from the way computers actually work - this means that object code compiled from OOP often carries more extraneous procedures, wasted resources, and other overhead that naturally comes with writing in a high-level programming language. While procedural languages aren't perfect either, they keep that kind of overhead down a lot better than OOP languages tend to.
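
    To make the kind of gap being argued about concrete (just a hypothetical sketch, not proof either way): a plain function call has a target the compiler knows at compile time, while a virtual call normally goes through a vtable indirection unless the compiler can devirtualise it.

```cpp
#include <cstdio>

// Procedural style: the call target is known at compile time,
// so the compiler can call it directly (or inline it away entirely).
int area_rect(int w, int h) { return w * h; }

// OOP style: the call target depends on the object's dynamic type,
// so a call that isn't devirtualised goes through the vtable at run time.
struct Shape {
    virtual int area() const = 0;
    virtual ~Shape() = default;
};

struct Rect : Shape {
    int w, h;
    Rect(int w, int h) : w(w), h(h) {}
    int area() const override { return w * h; }
};

int main() {
    std::printf("direct:  %d\n", area_rect(3, 4));

    // Called through the base interface; here the compiler can actually
    // see the dynamic type and may devirtualise, which is part of why
    // the overhead argument isn't a blanket rule.
    const Shape& s = Rect(3, 4);
    std::printf("virtual: %d\n", s.area());
    return 0;
}
```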
     
    I stopped reading after machine code. If you're using a high-level programming language then you shouldn't have to worry about machine code AT ALL. That's the compiler's job.

    OOP is a good paradigm, but it also requires shifting away from the way computers actually work - this means that object code compiled from OOP often carries more extraneous procedures, wasted resources, and other overhead that naturally comes with writing in a high-level programming language.
    Uhm, where did you get this information?
     
    I like C++, Java, and C, roughly in that order. Most of my college education consisted of C, so that used to be my favorite, but I've come to like C++ better, because references spare me the need to do a bunch of mind-bending pointer operations. Plus, I'm the type of programmer who likes to strictly define things, so I make heavy use of the "const" keyword and namespaces and all that fun stuff.
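
    As a rough sketch of what that means (hypothetical names, not anyone's real code): the same "append to this container" idea written with a C-style pointer parameter versus a C++ reference, with const and a namespace thrown in.

```cpp
#include <cstddef>
#include <string>
#include <vector>

namespace demo {   // hypothetical namespace, purely for illustration

// C-ish style: the caller passes a pointer, so the callee has to
// think about the null case before doing anything.
void append_twice_ptr(std::vector<std::string>* out, const std::string& s) {
    if (!out) return;          // extra case that references make impossible
    out->push_back(s);
    out->push_back(s);
}

// C++ style: a reference can't be null and reads like a plain variable.
void append_twice_ref(std::vector<std::string>& out, const std::string& s) {
    out.push_back(s);
    out.push_back(s);
}

// `const` documents and enforces that the input is read-only.
std::size_t total_length(const std::vector<std::string>& items) {
    std::size_t n = 0;
    for (const std::string& s : items) n += s.size();
    return n;
}

} // namespace demo

int main() {
    std::vector<std::string> words;
    demo::append_twice_ptr(&words, "ptr");
    demo::append_twice_ref(words, "ref");
    return demo::total_length(words) > 0 ? 0 : 1;
}
```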

    Ironically, given all that, I'm also coming to like JavaScript :P I just like how flexible it is, and dynamic typing has its benefits at times xD
     
    This is absolutely incorrect and completely disregards all the effort that goes into the optimisation and fine-tuning of modern compilers.
    How?

    Yes, it's appropriate to have a fundamental understanding of how software translates into little electric pulses on a circuit board, but claiming that 'shifting away from how computers actually work' is harmful is the oldest yet most ridiculous argument for a number of reasons. Abstraction is good, and key to progress.
    You can tell me that concept is bogus all day long, but where's your proof? All you've supplied is your opinion.

    You mention assembly and C as fundamental layers, but believe it or not these are also very high-level abstractions from how a machine really works. You could apply the same argument to those components and argue that we should really just be shifting binary and electrons all day.
    The point here is that C's abstractions over assembly don't put as much distance between you and the machine as OOP or managed/interpreted languages do.



    Pascal and especially Fortran are still widely used today.
    Where? I'd like to see one widely-used project written in those BASIC bastards. lol
     