I never much enjoyed playing around with assembly for DEC PDP-11 computers. Back then, BASIC or FORTRAN-77 was more my speed. I did love to program Intel 8085 microprocessors in assembly, but only because I am a hardware guy and the software folks where I worked turned their noses up at programming "toy" computers. So I was on my own with learning to program microprocessors in assembly for embedded applications.
In the 1980s digital image processing (which was our company specialty) was in its infancy, and we had to purchase what were then high-resolution color monitors from
Aydin Controls in Fort Washington, Pennsylvania, and make them do fancy things, like trackball-controlled, on-the-fly, dynamic color bit-mapping. To take some of the load off the PDP-11 CPU we used hardware look-up tables to encode the digital video data stream to the monitors. This required that we take a DEC DR-11C general-purpose I/O board and modify its hardware to interface to the Aydin video controller, and re-write the RSX-11M DR-11C driver software in a manner that didn't disturb the operating system.
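For anyone who never met a hardware look-up table: the idea is just a table indexed by the incoming pixel value, so recoloring the whole display means rewriting a small table instead of reprocessing every pixel. A toy Python sketch of the principle (the 8-entry palette and names here are my own illustration, not the actual DR-11C/Aydin hardware):

```python
# A hardware LUT maps each incoming pixel value to an output color word
# without burdening the CPU. Software stand-in for the idea:

# Hypothetical 8-entry LUT: pixel value -> (R, G, B) output word
lut = [(v, 255 - v, (v * 2) % 256) for v in range(8)]

def map_pixels(pixels, lut):
    """Translate a stream of pixel values through the look-up table."""
    return [lut[p] for p in pixels]

stream = [0, 3, 7, 1]
print(map_pixels(stream, lut))
```

Changing the "dynamic color bit-mapping" on the fly then amounts to rewriting `lut` while the video stream keeps flowing through it.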
My supervisor, and best friend today, was a wizard at rolling PDP-11 assembly code. We had the DEC source code for the original drivers to work from, as well as tall stacks of DEC documentation, so the job was doable. I hated it though. I only got involved because I was the "hardware engineer" in a mostly software-driven company. My boss thought it would be a "swell idea" if I assisted him with the Aydin interface to the PDP-11 and learned some DEC MACRO (as the DEC assembler was called) at the same time. For homework, I was given the source code for the RT11 OS to study. Fascinating.
We time-shared a PDP-11/34 with everyone in the company, so if we crashed RSX-11M while testing our driver there was hell to pay. We didn't have to re-write the DR11 driver from scratch, which would have been a nearly impossible task, but we did need to acquire an intimate understanding of how the DEC driver did its thing so we could coax it into doing our thing. We worked mostly at night to avoid the disruptive consequences of driver errors. There were time constraints too, since this was all part of a deliverable product on a government contract. It took us several months to get all the kinks worked out, but eventually it was done. I haven't touched PDP-11 assembly since then.
Then it was time to go back to Aydin with a new video controller design request for the next generation of color monitor hardware. At this time Aydin had a huge business commitment to vector graphic color displays for NASA and the military, so they weren't too thrilled to be pressured to design new hardware for their raster-scanned color displays. LCD color displays were right around the corner, but Aydin didn't know that. Nobody was paying much attention to the progress being made in the PC world... first high-res monochrome, low-res CGA, then VGA, SVGA and beyond. It took a while, but the huge flat-screen (or curved concave, if you have the big bux and want 4K resolution) displays we have today were a direct result of the PC revolution.
I got to buy (on Uncle Sam's dime) an early version of a 1024 x 1024 raster-scanned color display, and the controller to go with it, interfaced to an IBM PC-AT. We needed this to display massive data sets from a hyperspectral aerial imaging camera in human-readable form. Each 2K x 2K x 2K x 8-bit image represented a collection of slices of the electromagnetic spectrum, from mid-infrared (8 to 10 μm) through the ultraviolet. The only way the human eye can make sense of this three-dimensional cube of data is to extract a sub-set of it and display it with false-color mapping. So we busied ourselves learning how to do that. Lots of fun.
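The false-color trick is simply picking three spectral bands out of the cube and routing them to the display's red, green, and blue channels. A minimal Python sketch of the idea (tiny made-up cube, not the real 2K x 2K data or our actual software):

```python
# Toy hyperspectral cube: bands x rows x cols (stand-in for the real thing).
import random
random.seed(0)
BANDS, ROWS, COLS = 4, 3, 3
cube = [[[random.randrange(256) for _ in range(COLS)]
         for _ in range(ROWS)] for _ in range(BANDS)]

def false_color(cube, r_band, g_band, b_band):
    """Map three chosen spectral bands onto the display's R, G, B channels."""
    return [[(cube[r_band][y][x], cube[g_band][y][x], cube[b_band][y][x])
             for x in range(COLS)] for y in range(ROWS)]

img = false_color(cube, 3, 1, 0)   # e.g. an IR band shown as red
```

The eye then sees spectral differences as color differences, even for bands it could never perceive directly.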
The PDP-11s and VAX-11/750s scattered throughout the facility never saw it coming. DEC was obsolete and soon to go out of business by the time I left the company in 1991. DEC tried their hand at microprocessors, but it was too little, too late. Intel and Motorola had the lead, but silicon foundries were popping up like daisies after a spring rain... all over the world. You will take note that PCs killed minicomputers, but PCs are still around and get better and more powerful every year. Viva la Revolucion!
As for "inline parameter passing," I don't even want to remember what that was all about. I was attempting to learn "real" computer programming instead of just hacking my way to solutions. I remember there were several ways (in high-level languages) to pass variables (parameters) from one part of your code to other parts while still guaranteeing the integrity of the data... call by reference or call by value... is the data accessible globally or locally... do I need semaphores to let other threads know when data is valid... yada, yada, yada. IIRC, "real" programmers didn't need to worry about what was going on in other parts of the machine, i.e., what other application threads were running in the time-shared environment. We did.
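For anyone who never wrestled with this: the call-by-value vs. call-by-reference distinction boils down to whether the callee can clobber the caller's data. A quick Python sketch of the two behaviors (the semaphore business is a separate headache I'll leave out):

```python
# Call-by-value vs. call-by-reference, sketched in Python terms:
# rebinding a parameter is invisible to the caller (value-like),
# mutating a shared object is visible to the caller (reference-like).

def by_value(x):
    x = x + 1            # rebinds the local name only
    return x

def by_reference(buf):
    buf[0] = buf[0] + 1  # mutates the caller's object in place

n = 41
by_value(n)
data = [41]
by_reference(data)
print(n, data[0])   # 41 42 -- n untouched, data changed under the caller
```

The data-integrity worry in a time-shared environment comes from the reference-like case: anything shared can be changed out from under you unless access is coordinated.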
So it became very confusing, at least to me, and I eventually gave up any aspirations I might have had to become a "real" programmer. Why would I even want to become a "real" programmer (or analyst), you didn't ask? I will tell you anyway: they were the weenies who made the big bux in the company I was working for at the time. Little did I know that the really big bux were reserved for people who knew how to direct the efforts of a whole team of programmers working on a common goal. When I found that out, I decided not to become a "real" programmer. But I did learn how to do systems design before I left the company, and was even assigned a "programmer" and a technician to help me with one such design. That allowed me to concentrate on the hardware aspects (which I love) while leaving the programming to someone who loved to do that and did it well.
I went on to do other things after that, not many involving "real" programming. I still enjoy writing assembly code for embedded microprocessors though. It's a hoot to make those little buggers stand up and dance to my tunes.
Hop