Monthly Archives: February 2020

Marc Andreessen (Netscape) was a CoCo user!

Over in the Facebook Color Computer group, Jerry Stratton discovered that Marc Andreessen, co-founder of Netscape, was a CoCo user! Jerry ran across an April 1987 issue where Marc wrote in looking for CoCo pen-pals!

April 1987 Rainbow Magazine.

Here’s that issue, scanned and searchable, if you want to see who else might have been writing in:

http://www.colorcomputerarchive.com/coco/Documents/Magazines/Rainbow%2C%20The%20%28Searchable%20image%29/The%20Rainbow%20Vol.%2006%20No.%2009%20-%20April%201987.pdf

C compiler bugs that make you go hmmm…

There is a PIC compiler from CCS that I use at work. Every now and then we run across a bug (most of which get quickly fixed by CCS). The latest is a rather bizarre issue where strings are getting mixed up.

Attempting to do something like this:

printf("Value %d", value);

…created output like:

XXXXXX42

…because somewhere else in the code was this:

printf("XXXXXXXXXXXXX");

The compiler was using the correct format string (to know where to put the %d variable on output) but was displaying the string from a different bit of code.

And that other bit of code was unused (not being called at all).

This may be a compiler optimizer bug, but I found it amusing. I haven’t seen a mixed up printf() issue before.

Until next time…

Converting two 8-bit values to one 16-bit value in C

This is a follow-up to an earlier article I wrote about converting 8-bit values to 16-bit values.

There are several common ways to convert two byte (8-bit) values into one word (16-bit) value in C. Here are the ones I see most often:

Math

This is how we learned to do it in BASIC. “C = A * 256 + B”:

uint8_t byte1;
uint8_t byte2;
uint16_t word;

byte1 = 0x12;
byte2 = 0x32;

word = byte1 * 256 + byte2;

Bit Shifting

This way generally makes less code, making use of the processor’s ability to bit shift and perform an OR.

word = (byte1 << 8) | byte2;

Unions

union {
   // First union element contains two bytes.
   struct {
      uint8_t byte1;
      uint8_t byte2;
   } Bytes;
   // Second union element is a 16-bit value.
   uint16_t word;
} MyUnion;

MyUnion.Bytes.byte1 = byte1;
MyUnion.Bytes.byte2 = byte2;

That one looks like much more source code, but it may generate the fewest executable instructions. One caveat: the result depends on the processor’s byte order (endianness). On a big-endian processor (like the CoCo’s 6809), byte1 lands in the high byte of word; on a little-endian processor, it lands in the low byte.

Array Access

And here was one I found at a previous job, which I hadn’t seen before, and haven’t seen since:

uint8_t  bytes[2];
uint16_t value;

value = 0x1234;

bytes[0] = *((uint8_t*)&(value)+1); //high byte (0x12)
bytes[1] = *((uint8_t*)&(value)+0); //low byte  (0x34)

That example is backwards from the topic of this post, but hopefully you can see the approach.

The question today is … which would you choose?

Clean Code would say writing it out in the simplest form is the best, since it would take less time for someone to understand what it was doing. But if you needed every last byte of code space, or every last clock cycle, you might want to do something else.

Which do you think is smallest?

Which do you think is fastest?

Which do you think is easiest to understand?

Comments appreciated.

Until next time…

A C tale of two ampersands

Every now and then, I run across a bug in a C program that makes me say “oh, one of those again?” For instance, a very common one is when you want to compare two values using equal (“==”) but you leave one off and do an assignment by mistake (“=”):

if (a = 10)
{
   DoSomething();
}

When you do this, the result of the assignment is the assigned value, so “a = 10” resolves to 10 and it’s like saying “if (10) …”. It would work for any non-zero value, since “if (0)” would not run. A fun bug to find!

Most modern C compilers will emit a warning about that, because they know it’s probably not what the programmer intended.

But a lot of crappy small embedded compilers don’t. And since I work with a lot of small embedded compilers, I stumble across bugs like this from time to time.

I found a new one to add to my list, this time using a logical “and” comparison (“&&”) versus an accidental bitwise “and” (“&”). Consider this:

if ( (a==100) & (b==42) )
{
   DoSomething();
}

The intent of that code was to only DoSomething() if a was 100 AND b was 42. But, there was only one ampersand, so it was actually taking the result of “a==100” (either true or false) and bitwise ANDing it with the result of “b==42” (either true or false).

If a was 100 and b was 42, it was doing this:

if ( (true) and (true) )
{
   DoSomething();
}

And that worked, since “true” resolves to a 1, and “1 & 1” is 1. (My grade school math teacher never taught it that way…)

But, clearly this was a bug and it should have been the logical AND (“&&”).

Which made me look at the generated code and see what the difference is. And the difference is that using the bitwise AND generates far more code because it has to save the results of two comparisons then AND them together and check that result. It was basically doing this:

result1 = (a == 100); // true or false
result2 = (b == 42); // true or false
if (result1 & result2)
{
   DoSomething();
}

So sure, it works, but it generated much larger code and did much more work and thus was both larger and slower than doing the desired logical AND (“&&”).

And that’s all I have to say about that, today.

Until next time…

BASIC REM memory savings

Over in the CoCo Facebook group, Diego Barizo just posted a note about discovering some BASIC program line comments that had left out the REM keyword at the start. Because those lines were never the target of a GOTO, there was no error. This had the side effect of saving a byte (or two, since most start with “REM ”) for comment lines.

This made me think about what was going on. BASIC will allow you to type in anything, and will try to parse it (to tokenize the line) and produce an error (?SN ERROR, etc.) if there is a problem. But, when you are typing in a line, this BASIC doesn’t check anything until that line is actually run in the program. (Other versions of BASIC will check when you hit enter. Didn’t the Apple 2 do that?)

REM

In an earlier article, I mentioned that “REM” tokenized into one byte, and the apostrophe (‘) turned into two bytes because it was “:REM(space)”, allowing you to do something like this:

10 REM This is a comment.
20 'This is a comment

Since most folks type “REM(space)”, using the apostrophe without a space after it takes the same room. But if you include the space:

10 REM This is a comment
20 ' This is a comment

…that would make line 10 take one byte less (REM + space is two bytes, versus ‘ + space which is three bytes).

Now, if you do not include REM or the apostrophe, you can save a byte or two for each comment. If your program has 50 lines of comments, that’s 50-100 bytes.

BASIC Keywords

However, this made me realize that leaving out the REM would cause it to try to tokenize the line, and turn any BASIC keyword into a token, saving even more memory. If your comment included words like PRINT, OR, ELSE, etc., you’d save even more room!

Leaving out REM can save memory? Thanks, Diego Barizo!

As you can see, because this comment uses the word PRINT (five characters), the version without the REM appears to save six bytes — it saves the apostrophe (two bytes, or would save “REM(space)” for two bytes) plus tokenizes the PRINT keyword down from five bytes to one (saving four more bytes).

Interesting BASIC interpreter abuse, isn’t it? As long as you never run this line, you might save memory by leaving out REM (depending on whether you use any keywords). I could imagine comments saying “SHOW THE USER NAME” changed to “PRINT THE USR NAME” to tokenize two words (PRINT and USR).

You would still need REM at the start of the code for any initial comments, and during the code, but for subroutines you could do this:

10 REM THIS IS MY PROGRAM
20 A$="HELLO, WORLD!":GOSUB 1010
999 END
1000 PRINT THE MESSAGE
1010 PRINT A$:RETURN

The GOSUB jumps to 1010 and can find it, and since the program would END before reaching 1000, that would work.

With great power comes great responsibility. Use this wisely :) and thanks, Diego, for noticing this.

Until next time…

Play CoCo’s “Donkey King” in a web browser.

One of the all-time best ports of Donkey Kong on a 1980s home computer was a clone called Donkey King (later renamed to The King). Although we didn’t know this until years later, it was authored by Chris Latham, who also created the first CoCo game to require 64K – The Sailor Man (a clone of Popeye).

Last week, the gang over at CoCoTalk (the weekly video chat/interview show) started a game contest where everyone is invited to try to set a high score on some CoCo game. The first game chosen was Donkey King, which is quite fitting since it’s one of the greatest CoCo games ever. Unlike most (all?) other versions of the day, it features all four levels as well as the intermissions (see the Donkey King link above for screen shots). It also plays amazingly well and is as frustratingly difficult as the arcade game it was based on.

For those interested in trying it out, you can go to the JS Mocha CoCo Emulator webpage (the JavaScript version of Mocha, which was originally a Java CoCo emulator). It is one of the games available there.

You will find it in the second column. Just select it then click Load Bin. It uses the right joystick, I believe, so you can select “Dual Key R” from the Joystick config and that will map that to the keyboard – Arrow Keys and Space Bar (to jump).

If you want to hear the sound effects, you have to check the “Sound” box in the lower right of the screen.

Give it a try and see what you think.

Until next time…

CoCoFEST! Challenge: An AI camera project?

John Linville recently announced a CoCoFEST! Challenge on the CoCo Crew Podcast Facebook page. The idea is to start and complete a new CoCo project between February 15, 2020 and April 1, 2020 (day after Valentine’s Day through April Fools Day). The project doesn’t even have to be technical (he uses the example of designing a dust cover for a Multi-Pak).

I’ve been thinking about this since I have dozens of past experiments that could easily turn into new projects. But I also thought it might be an excuse to start “yet another” experiment specifically for this challenge.

HuskyLens and case from DFRobot.

Recently I took possession of a HuskyLens from DFRobot. It is an AI camera device that started out as a Kickstarter project last year. The tiny gadget includes a camera, touch screen, and AI software that can do things like:

  • Object Tracking – teach it what an object looks like, and it will track its position when it is in front of the camera.
  • Face Recognition – teach it a face and it will identify when it sees that face.
  • Object Recognition – identify built-in objects (dog, cat, etc.) or teach it to recognize new ones.
  • Line Tracking – identify a line (useful for a line following robot).
  • Color Recognition – teach and identify specific colors.
  • Tag Recognition – identify low-resolution QR-code style tags.

I acquired a HuskyLens specifically for Halloween projects, but since it communicates to a host computer over serial, it could be interfaced to a CoCo using a cheap TTL-to-RS232 adapter like my CoCoWiFi and SirSound projects use.

HuskyLens in case, attached to mount.

Since the slowest speed the HuskyLens firmware communicates with is 9600 baud, I’d have to do this using an RS-232 Pak under NitrOS-9 (so I could easily do it in BASIC09 or C), else I’d have to resort to assembly language under RS-DOS. If I go the assembly route, I’d have to see what code I could find to handle 9600 baud via bitbanger, else the RS-232 Pak would still be required.

Bitbanger is quite out of my wheelhouse, since the only bitbanger code I’ve ever worked with was a remote terminal driver published in Rainbow that I modified to add features to. The RS-232 Pak I could probably handle, since my first commercial program was a MIDI Librarian for a CoCo MIDI Pak and I wrote code for a much faster baud rate (31,250).

It wouldn’t be pretty, but I think I could make it work.

The question is … what does one do with an AI camera hooked to the CoCo? Perhaps…

  • CoCo Face ID – auto-log in to NitrOS-9 by face. (Not a security feature, since a photo would also work, but still neat.)
  • Visual Game Launcher – hold up a ROM-Pak and have it launch the program off of a CoCoSDC drive image.
  • Gestures Game – use gestures (rather than a keyboard/joystick) to interact with a game.

What can you think of? Ideas are appreciated…

Arduino compatible bit Macros

Years ago, I cloned the Arduino IDE “bit” macros for use in a GCC program (for testing generic Arduino code on a different system). I dug them out recently for a work project, and decided to share them here. Maybe they will be useful to someone else. (And apologies for WordPress clobbering the ampersand and turning it to HTML. When I edit and view it, it looks good, then WordPress “helps” — even though I am using a Syntax Highlighter plugin specifically for code examples.)

#ifndef BITMACROS_H
#define BITMACROS_H

// Bit macros, based on Arduino standard.
// https://www.arduino.cc/reference/en/language/functions/bits-and-bytes/bit/
//
// x: the numeric variable to which to write.
// n: which bit of the number to write, starting at 0 for the least-significant (rightmost) bit.
// b: the value to write to the bit (0 or 1).

#define bit(n)              (1 << (n))

#define bitSet(x, n)        ((x) |= bit(n))

#define bitClear(x, n)      ((x) &= ~bit(n))

#define bitRead(x, n)       (((x) & bit(n)) != 0)

#define bitWrite(x, n, b)   ((b) ? bitSet((x), (n)) : bitClear((x), (n)))

/* Verification tests (these need stdio.h, stdint.h, stdbool.h, and stdlib.h):
bool showBits (uint32_t value, unsigned int bits)
{
    bool status;

    //printf ("showBits (%u, %u)\n", value, bits);

    if ((bits == 0) || (bits > sizeof(value)*8))
    {
        status = false;
    }
    else
    {
        status = true;

        // Must use a signed loop variable so "bit >= 0" can fail.
        for (int bit = (bits-1); bit >= 0; bit--)
        {
            printf ("%u", bitRead(value, bit));
        }
        printf ("\n");
    }

    return status;
}

int main()
{
    unsigned int value;

    // Test bit()
    for (unsigned int bit = 0; bit < 8; bit++)
    {
        printf ("bit(%u) = %u\n", bit, bit(bit));
    }

    printf ("\n");

    // Test bitSet()
    for (unsigned int bit = 0; bit < 8; bit++)
    {
        value = 0;
        printf ("bitSet(%u, %u) = ", value, bit);
        bitSet(value, bit);
        showBits (value, 8);
    }

    printf ("\n");

    // Test bitClear()
    for (unsigned int bit = 0; bit < 8; bit++)
    {
        value = 0xff;
        printf ("bitClear(%u, %u) = ", value, bit);
        bitClear (value, bit);
        showBits (value, 8);
    }

    printf ("\n");

    // Test bitRead()
    value = 0b10101111;
    showBits (value, 8);
    for (unsigned int bit = 0; bit < 8; bit++)
    {
        printf ("bitRead(%u, %u) = %u\n", value, bit, bitRead (value, bit));
    }

    printf ("\n");

    // Test bitWrite - 1
    value = 0x00;
    showBits (value, 8);
    for (unsigned int bit = 0; bit < 8; bit++)
    {
        printf ("bitWrite(%u, %u, 1) = ", value, bit);
        bitWrite (value, bit, 1);
        showBits (value, 8);
    }
    // Test bitWrite - 0
    showBits (value, 8);
    for (unsigned int bit = 0; bit < 8; bit++)
    {
        printf ("bitWrite(%u, %u, 0) = ", value, bit);
        bitWrite (value, bit, 0);
        showBits (value, 8);
    }

    printf ("\n");

    return EXIT_SUCCESS;
}
*/

#endif // BITMACROS_H
// End of BitMacros.h