
Help optimizing byte compare algorithm

Code:
read    movfw   byte            ;perform framecheck: W = byte
        xorwf   framing, w      ;W = byte XOR framing (zero when equal)
        movwf   temp
        incf    temp, f
        decfsz  temp, f         ;decrement; skip the goto when the result is zero
        goto    read

This code is in PIC16 assembly language. It performs a compare between two registers, 'framing' and 'byte'. I put the result of the XOR operation in a temporary register so that I can branch depending on whether it is zero or not. If it is not zero, the loop keeps checking (an interrupt will change the value and get out of this loop).

It feels clunky. What would be a more proper/optimal way to perform this, if any?
 
I haven't written a single line of PIC assembly ever,
but what I see is more than "clunky":

1. You use the "goto" no-no command.
2. The inc followed by dec is not needed.

Here is some "generic" code (translate it to PIC).
Note: if what you need is "loop if equal", replace JNZ with JZ.

[Attached image: PIC-code.jpg (generic compare-and-branch loop)]
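(The attachment no longer loads; judging from the JNZ note above, it presumably showed a generic compare-and-branch loop along these lines. These mnemonics are illustrative, not PIC:)

Code:
read:   CMP  byte, framing      ; compare the two registers
        JNZ  read               ; loop while they are not equal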
 
Yes, they do perform the same
(probably. A drill for you: find out when they may not work).

But all 4 should be replaced with a single line!
And the use of a "goto" command is bad programming,
something every beginner learns in "school" not to use!


EDIT:
I just looked at the PIC instruction set:
they use a "goto" command instead of "JUMP"... talk about educational issues :eek:
 
Code:
read    movfw   byte            ;perform framecheck: W = byte
        subwf   framing, w      ;W = framing - byte, sets the Z flag
        btfss   STATUS, Z       ;skip the goto when the two are equal
        goto    read
The Z bit of the STATUS register is set when an arithmetic instruction results in 0, so you test it after subtracting the two values. XOR would work as well, but it is more obscure; to compare two values, subtract one from the other. This also gives you less-than, greater-than, or equal.
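For what it's worth, subtraction also lets you branch three ways off the same result. A sketch, assuming your include file defines the STATUS bit names; 'equal', 'less', and 'greater' are labels of your own:

Code:
read    movfw   byte            ;W = byte
        subwf   framing, w      ;W = framing - byte, sets Z and C
        btfsc   STATUS, Z
        goto    equal           ;Z set: framing == byte
        btfss   STATUS, C
        goto    less            ;C clear (borrow): framing < byte
        goto    greater         ;C set, Z clear: framing > byte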

Bob
 

hevans1944

Hop - AC8NS
Yeah, the PIC goto instruction is used extensively for loop iteration because the PIC has no JZ or JNZ instructions. A test that skips the next instruction when it succeeds needs a jump (goto) as that next instruction, so that the loop iterates whenever the test fails.
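For anyone following along, the idiom looks like this. A generic sketch, where 'counter' is whatever register you are counting down:

Code:
loop    ; ...loop body...
        decfsz  counter, f      ;decrement; skip the goto when it hits zero
        goto    loop            ;not zero yet, so iterate
        ; execution falls through to here when the count runs out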

I think the Microchip weenies are all hardware dudes and implemented the instruction set this way just to thumb their noses at software weenies. It's true that the use of GOTO is a leftover from BASIC and encourages the writing of spaghetti code, but it is what it is if you want to use a PIC.

Hop
 
Hi Hop,
What higher-level language do people use for PICs? C? Something else?

EDIT: I found this Flowcode tool, looks very interesting.
Is anyone familiar with it?
 

hevans1944

Hop - AC8NS
Hi Hop,
What higher-level language do people use for PICs?
@chopnhack would know more about that than I. He uses a C compiler with his PICs. I have discouraged him from doing this, but to no avail. The C compiler does encourage the use of proper programming techniques. Not a GOTO in sight with C. Plenty of IF, THEN, ELSE and CASE statements sprinkled here and there. The compiler sorts it all out of course and spits out the requisite goto instructions as needed, but that is all hidden from the person writing the C program. Those High Level Language people miss out on all the fun...:rolleyes:
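To illustrate: a one-line C spin loop such as while (byte != framing); comes out of the compiler as roughly the skip-and-goto pattern discussed above. A sketch, not actual compiler output:

Code:
loop    movfw   byte            ;W = byte
        xorwf   framing, w      ;zero when the two registers match
        btfss   STATUS, Z       ;skip the goto once they are equal
        goto    loop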
 
Fun, eh?
Hop, you are probably the only one here who still remembers what "inline parameter passing" was...

If you do need it, a hint: PDP-11 assembly.
That was fun, wasn't it... :confused:
 

hevans1944

Hop - AC8NS
I never much enjoyed playing around with assembly for DEC PDP-11 computers. Back then, BASIC or FORTRAN-77 was more my speed. I did love to program Intel 8085 microprocessors in assembly, but only because I am a hardware guy and the software folks where I worked turned their noses up at programming "toy" computers. So I was on my own, learning to program microprocessors in assembly for embedded applications.

In the 1980s, digital image processing (our company specialty) was in its infancy, and we had to purchase what were then high-resolution color monitors from Aydin Controls in Fort Washington, Pennsylvania, and make them do fancy things, like trackball-controlled, on-the-fly, dynamic color bit-mapping. To take some of the load off the PDP-11 CPU, we used hardware look-up tables to encode the digital video data stream to the monitors. This required that we take a DEC DR-11C general-purpose I/O board, modify its hardware to interface to the Aydin video controller, and rewrite the RSX-11M DR-11C driver software in a manner that didn't disturb the operating system.

My supervisor, and best friend today, was a wizard at rolling PDP-11 assembly code. We had the DEC source code for the original drivers to work from, as well as tall stacks of DEC documentation, so the job was doable. I hated it, though. I only got involved because I was the "hardware engineer" in a mostly software-driven company. My boss thought it would be a "swell idea" if I assisted him with the Aydin interface to the PDP-11 and learned some DEC MACRO (as the DEC assembler was called) at the same time. For homework, I was given the source code for the RT-11 OS to study. Fascinating.

We time-shared a PDP-11/34 with everyone in the company, so if we crashed RSX-11M while testing our driver there was hell to pay. We didn't have to re-write the DR11 driver from scratch, which would have been a nearly impossible task, but we did need to acquire an intimate understanding of how the DEC driver did its thing so we could coax it into doing our thing. We worked mostly at night to avoid the disruptive consequences of driver errors. There were time constraints too, since this was all part of a deliverable product on a government contract. It took us several months to get all the kinks worked out, but eventually it was done. I haven't touched PDP-11 assembly since then.

Then it was time to go back to Aydin with a new video controller design request for the next generation of color monitor hardware. At that time Aydin had a huge business commitment to vector-graphic color displays for NASA and the military, so they weren't too thrilled to be pressured into designing new hardware for their raster-scanned color displays. LCD color displays were right around the corner, but Aydin didn't know that. Nobody was paying much attention to the progress being made in the PC world... first high-res monochrome, then low-res CGA, then VGA, SVGA, and beyond. It took a while, but the huge flat-screen (or curved concave, if you had the big bux and wanted 4K resolution) displays we have today were a direct result of the PC revolution.

I got to buy (on Uncle Sam's dime) an early version of a 1024 x 1024 raster-scanned color display, and the controller to go with it, interfaced to an IBM PC-AT. We needed this to display in human-readable form massive data sets from a hyperspectral aerial imaging camera. Each 2K x 2K x 2K x 8-bit image represented a collection of slices of the electromagnetic spectrum, from mid-infrared (8 to 10 μm) through the ultraviolet. The only way the human eye can make sense of this three-dimensional cube of data is to extract a subset of it and display it with false-color mapping. So we busied ourselves learning how to do that. Lots of fun.

The PDP-11s and VAX-11/750s scattered throughout the facility never saw it coming. DEC was obsolete and soon to go out of business by the time I left the company in 1991. DEC tried their hand at microprocessors, but it was too little, too late. Intel and Motorola had the lead, but silicon foundries were popping up like daisies after a spring rain... all over the world. You will take note that PCs killed minicomputers, but PCs are still around and get better and more powerful every year. Viva la Revolucion!

As for "inline parameter passing" I don't even want to remember what that was all about. I was attempting to learn "real" computer programming instead of just hacking my way to solutions. I remember there were several ways (in high-level languages) to pass variables (parameters) from one part your code to other parts while still guaranteeing the integrity of the data... call by reference or call by value... is the data accessible globally or localy... do I need semaphores to let other threads know when data is valid,,, yada, yada, yada. IIRC, "real" programmers didn't need to worry about what was going on in other parts of the machine, i.e., what other application threads were running in the time-shared environment. We did.

So it became very confusing, at least to me, and I eventually gave up any aspirations I might have had to become a "real" programmer. Why would I even want to become a "real" programmer (or analyst), you didn't ask? I will tell you anyway: they were the weenies who made the big bux in the company I was working for at the time. Little did I know then that the really big bux were reserved for people who knew how to direct the efforts of a whole team of programmers working toward a common goal. When I found that out, I decided not to become a "real" programmer. But I did learn how to do systems design before I left the company, and was even assigned a "programmer" and a technician to help me with one such design. That allowed me to concentrate on the hardware aspects (which I love) while leaving the programming to someone who loved to do that and did it well.

I went on to do other things after that, not many involving "real" programming. I still enjoy writing assembly code for embedded microprocessors though. It's a hoot to make those little buggers stand up and dance to my tunes.

Hop
 
Not sure why people are surprised to find a goto instruction. It is the same thing as a jmp or br instruction in other assembly languages.

Test-and-skip instructions are a little more flexible than conditional branch instructions, since they can test any bit of any register instead of just the processor flag bits. However, they are a PITA for the programmer.
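For example, a skip instruction can poll an I/O pin directly, with no flag register involved. A sketch (PORTB bit 3 is an arbitrary choice):

Code:
wait    btfss   PORTB, 3        ;test any bit of any register directly
        goto    wait            ;spin until the pin reads high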

Bob
 
@chopnhack would know more about that than I. He uses a C compiler with his PICs. ...

My knowledge on this would be anecdotal at best :oops:

I am currently nursing through a free course on edx.org called Systematic Program Design. They have a formulaic way of writing code that many would find tedious and slow, but I have rather enjoyed 'getting' how things go together logically. A bright spot was better comprehending the construction of a function. I am currently working on what they call how to design data.

ASM is too close to the hardware level for me... way too tedious for the simplest things, in my opinion, such as 'time'. I would rather have a ready-made function for that instead of having to waste clock cycles up to the limit of the bit width, rinse and repeat until enough time has elapsed (yes, that is what is done inside the function, but at least I don't have to sit there and code it!). Of course, I say this because my first experience was coding in BASIC on a Commodore 64 ;-) ASM is powerful and lets you take best advantage of the PIC's memory, but without any previous knowledge I find it difficult at best.
 

hevans1944

Hop - AC8NS
@chopnhack: ASM doesn't have to be unstructured programming. And, yes, it is difficult until you get the hang of it, which requires an intimate understanding of the underlying hardware. At some point the Intel microprocessors became so complicated that I found it impossible to program them, even for embedded applications (which are usually simple). So I switched to Visual BASIC with some C sprinkled in here and there and loaded up the Windows programming environment on an "IBM compatible" PC. Still, for most projects, waaay more programming was required than I was comfortable with. It wasn't "fun" any more and instead looked an awful lot like "work". I like those two words to be synonymous if I am going to be involved.

Using macros you can write timer routines for a particular μP that behave exactly the same as their high-level language counterparts. If you store these macros in a library, they only need to be coded once, and then they can be called just as easily from either an ASM program or a C program. Sometimes I think the essence of good programming is having a good library of functions and routines that you write once and call as many times as needed. Microsoft used to publish a huge paper volume, about three inches thick, that described all the application program interface (API) functions and routines you would (probably) need to use in your program to make it play nicely with Windows. You were of course free to write your own versions, but that book was so intimidating that I decided not to write programs for Windows, which was a moving target at best.
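Something along these lines, in MPASM macro syntax. A minimal sketch, where 'dtemp' is a scratch register you are assumed to have reserved, and the count is small enough to fit in one register:

Code:
DELAY   macro   count
        local   dloop
        movlw   count           ;load the iteration count
        movwf   dtemp           ;dtemp: assumed scratch register
dloop   decfsz  dtemp, f        ;burn cycles until dtemp reaches zero
        goto    dloop
        endm

Invoke it as DELAY d'100' (or whatever count you need) wherever the pause belongs.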

AFAIK there are only two ways to code a time-delay interval in a program: count machine cycles with a software loop, or use a programmable hardware counter, driven either by the internal clock oscillator of the μP or externally by a clock of your choice. The devil is in the details, and those will be specific to a particular μP and whatever external hardware (if any) is connected to it, but that is true whether you code in ASM or C. As you mentioned, when coding in C, the compiler and the include file that tells it which μP you are using hide the details by abstracting the timer function.
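The hardware-counter flavor on a mid-range PIC16 with TMR0 looks something like this sketch (prescaler and option-register setup omitted):

Code:
        clrf    TMR0            ;restart timer 0 from zero
        bcf     INTCON, T0IF    ;clear any stale overflow flag
wait    btfss   INTCON, T0IF    ;hardware sets T0IF when TMR0 overflows
        goto    wait            ;spin until the overflow arrives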

I apologize for citing you as the expert for high-level language programming, but you are certainly more qualified than the hacker that I am.:D

Hop
 

Harald Kapp

Moderator
Not sure why people are surprised to find a goto instruction.
GOTO per se isn't evil. It is the unstructured use of GOTO (which, admittedly, the ease of use of this statement does little to discourage) that creates "spaghetti code". Higher-level structured language elements (if...then...else, while(), for(), etc.) enforce a more structured programming style, although with some "creativity" it is still possible to write unintelligible code.
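Even in PIC assembly, goto can be used in a structured way. For example, an if...then...else is just a disciplined diamond (a sketch; 'flags' bit 0 stands in for whatever condition you are testing):

Code:
        btfss   flags, 0        ;condition bit set?
        goto    else_part       ;no: take the else branch
        ; ..."then" code...
        goto    end_if
else_part
        ; ..."else" code...
end_if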

Imho, the structure needs to be in the thinking of the programmer. It will then show in the architecture and design of the software. The programming language, in the end, is only the means of realizing the code. One (general, not specialized) programming language may have certain advantages over another in various details, but when it comes to putting actual code down, do not forget the value of being well acquainted with a language.
 
I apologize for citing you as the expert for high-level language programming, but you are certainly more qualified than the hacker that I am.:D

No worries, mate, I am more of a bodger than an expert (the Urban Dictionary version of bodger, not the skilled woodworker of old!). I am itching to get back into that, though; just too many other projects in the concrete world that need attention...
 