21 thoughts on “COBOL”

  1. I supported COBOL applications from 1986 to 1998 in Kansas and Missouri.

    I just asked one of my staffing company friends if he is seeing “quit your job” bill rates in Kansas or Missouri yet.

    He is not.

    1. Don’t know if you mean this in jest or with irony, but there is something to your recommendation.

      Back in the mid 1980s, on microcomputers soon to be dominated by PCs, you had BASIC and assembly language, with clunky, expensive FORTRAN and Pascal compilers. Remember that Apple had a dialect of Object Pascal (what was an object, we asked back then) for Mac development, long before C made inroads.

      Besides being One Language to Rule Them All, Ada was supposed to follow the European “let’s do a programming language right” lineage of Algol 60 and later Pascal, in place of the FORTRAN mishmash, but supplying all of the missing pieces.

      Supply all of the missing pieces it did, with a vengeance. If micro FORTRAN and Pascal were clunky, Ada was ponderous. I actually embarked on the RR Software offering, at the time only a partial implementation so it didn’t count as a “real” Ada compiler, after hearing the sales pitch from R and R themselves before the Madison Computer Club. Imagine building your code with a multi-pass compiler, swapping out 360K floppy disks in a dual-disk “rig.”

      Then Borland Turbo Pascal happened, which for the time was this amazingly fast, compact, cheap(!) editor-compiler-debugger that ran in one pass off a single floppy drive. By the time we all had hard drives on our systems, the bulkier, somewhat pricier Turbo 4 came along, which pretty much had all the features you needed. The authors of “Numerical Recipes in Pascal” (the fact that there was a Numerical Recipes edition in Pascal told you that Pascal was a “thing” on microcomputers at one time) admitted that, warts and all, the Borland dialect had become the de facto Pascal standard.

      This hacky, MS-DOS-centric commercial product wasn’t Ada and it wasn’t even Modula-2, but by Turbo Pascal 4, it had its own version of the features that Ada and Modula-2 were supplying to extend Pascal into something usable. Being a disciple of the European let’s do programming right model, and having learned Pascal from Brinch Hansen himself, and in a Microsoft dominated world, why would I use anything else?

      Fast forward to today. My Pascal/Delphi experience makes me a kind of latter-day COBOL programmer. I haven’t bothered to try any of the current Ada compilers, but considering how massive Delphi and Visual Studio are, and that we are far removed from dual-floppy PCs and swapping disks between compiler passes, how clunky could the Ada implementations be? Given that C++ is as far from the European “let’s do programming right” model as imaginable, why aren’t we giving Ada serious consideration?

      In the spirit that Modula-2 and whatever else Professor Wirth crafted became irrelevant with the Borland Pascal dialects, why are people puttering around with Rust/Swift/Go when there are complete, functioning Ada implementations?

      1. I remember sticking mostly with BASIC, Pascal, and 8080 or 6502 assembly language for most things, because that’s what was available and cheap, aside from the constant ads for things like “Nevada FORTRAN, Nevada COBOL, Nevada Pascal, $29.95!” Back then techies could be generalists and stay on top of almost everything significant in the world of micro-computing, and folks had stacks of BYTE magazine and floppy disks all over the place.

        I really liked SBASIC for CP/M, which was a structured, compiled BASIC that was quite similar to Pascal. On the PC, I used Turbo Pascal for all sorts of things until Turbo-C came out. I ran lots of factory automation with Turbo-C.

        Thankfully the old issues of BYTE are online at archive.org
        Random 1983 issue (PDF)

        It’s perhaps interesting that everyone involved was so focused on building the future that we didn’t give much thought to how we were changing the world so dramatically, and whether anyone would care decades later. So now we have college history classes teaching all kinds of politically correct woke garbage about the history of hippie protesters in papier-mâché puppet heads, and we have computer science departments teaching coding classes, with nary a course to teach young folks where all this amazing tech came from, even though it’s all documented in excruciating detail, perhaps more than almost any historically significant thing mankind has ever done.

      2. I meant it in jest. Back in the 80’s, I was working for TRW at Norton AFB (supporting the Air Force Ballistic Missile Office). One of the many fads BMO fell for was converting everything to Ada. In tandem, our TRW Chief Engineer (whose name I forget) bought a Cyber 205 to beef up our computing center (this was before desktops). The latter had a staggering 7 million (48 bit) words of memory, so much that the Chief Engineer lamented that it encouraged sloppy coding.

        None of that coding was in Ada, however. The only BMO fad that fell flatter than Ada was TQM, which had absolutely no place in that environment whatsoever.

        I bought the first IBM PC in San Bernardino. With it, I bought a UCSD FORTRAN compiler (for $700, in 1982 dollars). It never worked. Eventually, I just learned BASIC. Then Microsoft came out with a BASIC compiler, which sped things up remarkably. I also added an 8087 math coprocessor, and it was like lightning (relatively speaking). I had occasion in the 2004 time frame to do some coding for work, and was amazed at how far Fortran had come. Far friendlier than the Minnesota FORTRAN 77 I had to use at Purdue (on punched cards). I bought Intel Visual Fortran (90/95) as my mid-life crisis purchase. The IDE is just too overwhelming, so these days, I use Simply Fortran ($100, and quite satisfactory), or Maple.

        1. “The latter had a staggering 7 million (48 bit) words of memory, so much that the Chief Engineer lamented that it encouraged sloppy coding.”

          There might be something to large amounts of memory encouraging sloppy programming. It is not that memory is as expensive as it once was, but that allowing programs to bulk up may increase the number of bugs. I say “may” because “tight” coding also has its bugs, but at least one is thinking more about the design.

          With respect to Dr. Dobb’s Journal, it turns out that the fictional Dr. Dobb was a specific kind of doctor. The full title was “Dr. Dobb’s Journal of Computer Calisthenics & Orthodontia: Running Light Without Overbyte” for a magazine aimed at software development on microprocessors and microcomputers with limited memory.

          Back on Rand’s original topic of geezers with once obsolete skill sets now back in demand, the take-home message from a number of job interview talks here at the U is that writing software in insanely small amounts of memory is back in fashion. Love it or hate it, but the impetus is “The Internet of Things” (IoT), where you are trying to implant a microcontroller inside a dental filling and have it run off the power harvested from a person chewing, or some such thing (Oooh! By telling a sarcastic joke, did I just give away an idea for which the U requires me to file a patent disclosure? )

      3. “Ada was ponderous.”

        I learned VMS Ada in college, as they were transitioning away from Pascal. Nothing like a single, multipage error when you forgot to instantiate an IO package for one of your 23 derived Integer subtypes.

    2. According to Alan Turing’s theory, it should be possible to perform any calculation in COBOL, even if the program is an endless series of adds and subtracts on paper tape.
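      The joke holds up: addition, subtraction, and a conditional test really are enough. A minimal sketch (in Python rather than COBOL, for brevity) computing multiplication the adds-and-subtracts way:

```python
def multiply(a: int, b: int) -> int:
    """Multiply two non-negative integers using only addition,
    subtraction, and a loop test -- the paper-tape way."""
    total = 0
    while b > 0:            # conditional branch
        total = total + a   # ADD
        b = b - 1           # SUBTRACT
    return total

print(multiply(6, 7))  # -> 42
```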

  2. Given C++ to be as far from the European let’s do programming right model as imaginable, why aren’t we giving Ada serious consideration?

    Oh I heartily disagree. I’ve seen Bjarne Stroustrup lecture on the origins of C++ in person. C++ was supposed to be “the better C” and that’s a quote.

    So if Data Structures + Algorithms = Programs (Pascal)
    Does Square Pegs + Round Holes = Objects (C++)?

    1. C++ operator overloading has to be the best thing to happen to programming since GO TO. Now anyone can look at a piece of legacy code, scratch their head, and wonder what ‘+’ or ‘=’ does.
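      The complaint is easy to demonstrate. Python permits the same trick via `__add__`; a hypothetical sketch of why an unfamiliar ‘+’ forces you to read the class before you can read the expression:

```python
class Temperature:
    """Hypothetical class whose author overloaded '+' to mean
    'blend' rather than arithmetic addition -- perfectly legal,
    perfectly confusing to the next maintainer."""

    def __init__(self, degrees: float):
        self.degrees = degrees

    def __add__(self, other: "Temperature") -> "Temperature":
        # Surprise: '+' averages the two temperatures.
        return Temperature((self.degrees + other.degrees) / 2)

t = Temperature(10.0) + Temperature(30.0)
print(t.degrees)  # -> 20.0, not the 40.0 a casual reader expects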

          1. OBOL makes a lot of sense. “COmmon Business Oriented Language” no longer really applies, but “Obsolete Business Oriented Language” is spot on. ^_^

          2. Point taken, but we would still have Algol, Pascal, PL/I, Ada, FORTRAN, Perl, Python, Haskell, Swift, Go, Rust, and yes MATLAB, Maple and (wait for it) Mathemati a

          3. Nobody wants to admit to writing a program in P, even though I think those who insist on using every arcane feature of C++ should be forced to follow the BCPL nomenclature.

            Or, in other words, don’t eat the yellow SNOBOL.

  3. I have a friend who thought she’d be clever and sent me a birthday card one year encoded all in hex (hexadecimal numbers). So I quickly grabbed my ASCII chart and decoded it into gibberish.

    Then I remembered, her only exposure to computer programming was a COBOL course she took in college. Since she went to the same school as I did, I knew the COBOL labs were all done on an IBM System/360 mainframe.

    So I put away the ASCII chart and dug up my EBCDIC chart. Voila. A very nice birthday message.

    https://en.wikipedia.org/wiki/EBCDIC
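    Modern tooling makes the same trick a one-liner. A sketch using Python’s built-in code pages (assuming code page 037, the common US EBCDIC variant on System/360-descended machines):

```python
# "HAPPY BIRTHDAY" as an IBM mainframe would store it (EBCDIC, cp037)
card = "HAPPY BIRTHDAY".encode("cp037")

# Read it against an ASCII-compatible chart: gibberish
print(card.decode("latin-1"))   # accented nonsense

# Read it against the EBCDIC chart: the actual message
print(card.decode("cp037"))     # -> HAPPY BIRTHDAY
```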

  4. All software is crap. The bosses don’t know what the “coders” are doing and they don’t care. Until the CIOs go to JAIL for bad software (including security leaks) it will not change.

    Boeing 787 must be powered off every 51 days to reset all the memory leaks.

    From a recovering SW Ada Programmer. After 35 years doing Ada even that did not help with all the design and system problems built into the product. My last job before retiring I gave up programming and just did hardware.

  5. The very last job I had before retiring was maintaining and upgrading an obsolete VB6 infrastructure program to .Net so that it would run under Win10. It was so badly written I wondered if I could make a living just fixing old VB crap. Some of my old apps from 20-30 years ago, including one written in FoxPro (!) are still messaging me to say they’re doing just fine, completely unattended, as they’re ported from server to server. I wish at least one old client would notice their servers are emailing me…

Comments are closed.