Hacking Home Depot Lethal Lily animatronic – part 1

See Also: intro, part 1, part 2, part 3 and part 4. (Github project.)

IMPORTANT NOTE: This series follows my typical “document as I discover” format, and will be speculating about something that turned out to be incorrect. Rest assured, I do end up showing how things actually work…

Lethal Lily Swamp Witch is (was?) an animated Halloween prop sold by Home Depot in 2023. Although it looks like you cannot currently buy a full Lethal Lily, replacement parts can be ordered from Seasonal Visions International:

Home | Seasonal Visions International – SVI Service

To see if the animated head could be controlled by an Arduino, I ordered a replacement HEAD, SENSOR, ADAPTER 3AMP and CONTROL BOX.

Lethal Lily replacement parts.

CONTROL BOX

The control box has the following:

  • Three-position slide switch to select modes: Sensor, OFF – Try Me, and Lights Only.
  • Barrel connector input jack for the 5.9V DC 3 amp power supply.
  • Volume knob.
  • Speaker.
  • Four cables that go to the sensor, head, and (I assume) lights.
Lethal Lily control box wiring.

The cables are as follows:

  • Three-pin (green-blue-white) connector that goes to the SENSOR.
  • Four-pin (blue-black-red-green) goes to the HEAD.
  • Two-pin (black-red) that goes to … ? (probably lights)
  • Two-pin (blue-red) that goes to … ? (probably lights)

Inside the box is a small circuit board. I have not removed it to see what is on the bottom, but I suspect the processor is under the black blob.

Lethal Lily control circuit board.

POWER ADAPTER

The power adapter is odd — 5.9V DC, 3 amps.

Lethal Lily 5.9V 3A power adapter.

SENSOR

The SENSOR uses three wires, and I assume it is just a normal motion sensor. A PIR (passive infrared) motion sensor has three wires (voltage, ground, and the output).

Lethal Lily sensor.

HEAD

The head has a four-pin cable coming out of it near the metal shaft that is used to connect it to the frame.

Lethal Lily head cable.

How does it work?

I had hoped to find a bunch of different cables that would run to the head, each operating independent servos. Seeing only a four-wire connector made me wonder if they were sending commands (like I2C messages or something) or doing something else to address all eight (?) servos in the head.

The first thing I did was connect the power adapter and sensor, then hook the head cable up to a Saleae Logic Analyzer. I assumed the black wire to be ground, and the red wire to be power, so I looked at what was going on with the blue and green wires on the ends of the connector.

Lethal Lily head control hooked to a Saleae logic analyzer.

I triggered the sensor seven times (since my understanding is there are seven different patterns Lily can play). What I saw looked promising:

Lethal Lily Saleae capture.

The Green Wire

Above, Channel 0 was hooked to the green wire. It appears to stay high except for a set of low pulses at the start of each sequence and another set of low pulses at the end.

As I checked the start and stop pulses for each sequence, I saw that they were different. The pulses varied in width, making it look like it might be I2C or some similar protocol where LOW represents a 0 and HIGH represents a 1 (or vice versa). There are no clock pulses, though, so if this is encoding a number, it must be doing it based on predefined timing, starting to count when it sees the signal drop. That makes me think it could be normal serial data at some baud rate.

If that is the case, I can work out a sequence of numbers being sent by the green wire at the start and end of each sequence. The signal would drop to indicate the start of data (a “start bit”). Below, “D” is a down pulse, and “U” is back up. I can easily “see” eight bits in each of the start/end sequences:

  1. UDDDDUUU / DDDUUUUD
  2. DUDDUDUU / DDDUUUUD
  3. UUDDDDUU / DDDUUUUD
  4. DDUDUUDU / DDDUUUUD
  5. UDUDDUDU / DDDUUUUD
  6. DUUDUDDU / DDDUUUUD
  7. UUUDDDDU / DDDUUUUD

Looking at the pulses this way shows a pattern. The end pulses are the same for each sequence. Using a serial decoder and playing around with the baud rate until the decoded bits match where the pulses are gives me something like this:

That is using a value of 550 baud. I have no idea if this is correct, but if so, it gives the following values for the start of each sequence. Note that the bits are written least significant bit first (so, right to left), so UDDDDUUU becomes 11100001 in binary.

E1 - 11100001
D2 - 11010010
C3 - 11000011
B4 - 10110100
A5 - 10100101
96 - 10010110
87 - 10000111
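The D/U notation above can be decoded in a few lines of Python. This is just a sketch of my guessed framing — U = 1, D = 0, least significant bit first — and nothing about it is confirmed by the hardware:

```python
# Decode the "D"/"U" pulse notation, assuming U = 1, D = 0, and
# least-significant-bit-first ordering (standard async serial).
# These framing assumptions are my guess; nothing is confirmed.

START_PATTERNS = [
    "UDDDDUUU", "DUDDUDUU", "UUDDDDUU", "DDUDUUDU",
    "UDUDDUDU", "DUUDUDDU", "UUUDDDDU",
]

def pulses_to_byte(pattern):
    """Convert an 8-character U/D string to a byte, LSB first."""
    value = 0
    for bit_position, pulse in enumerate(pattern):
        if pulse == "U":
            value |= 1 << bit_position
    return value

for pattern in START_PATTERNS:
    print(f"{pattern} -> {pulses_to_byte(pattern):02X}")

# The shared end-of-sequence pattern decodes the same way:
print(f'DDDUUUUD -> {pulses_to_byte("DDDUUUUD"):02X}')
```

Running that reproduces the seven start values listed above, plus the common ending byte.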

And at the end of each pattern, there is a 0x78 sent (as shown in the screen shot above):

78 - 01111000

I see a pattern! Each set of eight pulses is a 4-bit value, then the value with the bits inverted!

E1 - 11100001 - 1110 and 0001 (1)
D2 - 11010010 - 1101 and 0010 (2)
C3 - 11000011 - 1100 and 0011 (3)
B4 - 10110100 - 1011 and 0100 (4)
A5 - 10100101 - 1010 and 0101 (5)
96 - 10010110 - 1001 and 0110 (6)
87 - 10000111 - 1000 and 0111 (7)

The values go from 1 to 7 — matching the number of patterns the control box can perform. It looks like 0 is not used.

As for the end sequence of 78, that is the same pattern:

78 - 01111000 - 0111 and 1000 (8)
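That value-plus-inverted-nibble pattern can be expressed (and checked) directly. The function names here are my own invention — what the real firmware does with these bytes is still unknown:

```python
def encode_command(value):
    """Pack a 4-bit value as: inverted nibble in the high bits,
    the value itself in the low bits (the pattern observed above)."""
    assert 0 <= value <= 0x0F
    return ((~value & 0x0F) << 4) | value

# Patterns 1-7 plus the 0x78 "end" byte (value 8):
print([f"{encode_command(v):02X}" for v in range(1, 9)])

# A receiver could validate a byte the same way, which may be why
# the redundant inverted copy is sent at all (speculation):
def is_valid_command(byte):
    return (byte >> 4) == (~byte & 0x0F)
```

If the inverted copy really is there for error checking, a byte corrupted in transit would fail the `is_valid_command` test and could be ignored.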

I do not know what these values instruct the head to do, but at least now I should be able to recreate sending those pulses out via an Arduino or something just by writing out a byte at 550 baud.
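If the 550 baud guess holds, recreating a command is standard 8-N-1 framing: a start bit (low), eight data bits LSB first, and a stop bit (high), each lasting 1/550 of a second (about 1.8 ms). Here is a sketch of the levels a bit-banged transmit routine would clock out — the baud rate and framing are my assumptions from the capture:

```python
BAUD = 550                      # guessed from the Saleae capture
BIT_TIME_US = 1_000_000 / BAUD  # ~1818 microseconds per bit

def frame_bits(byte):
    """8-N-1 framing: start bit (0), 8 data bits LSB first, stop bit (1)."""
    data = [(byte >> i) & 1 for i in range(8)]
    return [0] + data + [1]

# On an Arduino this would be digitalWrite() calls separated by
# delayMicroseconds(BIT_TIME_US); here we just show the levels:
print(frame_bits(0xE1))  # -> [0, 1, 0, 0, 0, 0, 1, 1, 1, 1]
```

Alternatively, a hardware or software serial port set to 550 baud should produce the same waveform without any manual timing.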

The Blue Wire

The data on the blue wire looks like pulses of different widths. My understanding is a servo position is set based on how wide of a pulse is sent. The head must be receiving these pulses and passing them to the different servos somehow. Speculation: Perhaps it sends eight pulses that go to servo 1, 2, 3, etc. and then restarts back at servo 1. (Me from the future: It does not…)
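For reference, the usual hobby-servo convention maps a pulse of roughly 1000–2000 µs, repeated every 20 ms, to the servo's angle. Whether Lily's head follows this convention at all is pure speculation on my part, but it is the mental model I am working from:

```python
def pulse_to_angle(pulse_us, min_us=1000, max_us=2000):
    """Map a pulse width to 0-180 degrees using the common
    hobby-servo convention (1.5 ms = center). Whether Lily's
    head actually uses this range is unverified."""
    pulse_us = max(min_us, min(max_us, pulse_us))
    return (pulse_us - min_us) * 180 / (max_us - min_us)

print(pulse_to_angle(1500))  # center position -> 90.0
```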

To figure out how many pulses there are for each of the seven sequences, I used the Saleae Logic Analyzer and highlighted a range of pulses. It showed me that each sequence has this many “rising edges” (where the pulse goes up):

  1. 85 pulses
  2. 87
  3. 75
  4. 87
  5. 100
  6. 96
  7. 79
  8. (back to pattern 1) 84 … ?

Speculation: Seeing that pattern 1 reported 85 pulses in the Saleae Logic Analyzer the first time, then 84 the second time, tells me I may not have things set up properly with my Saleae.

This was not what I hoped to see. I expected that, if there are eight servos, each number would be a multiple of eight. That is clearly not the case. Perhaps the pulses can stop at any time (say, updating 8 servos, 8 servos, 8 servos, then just 3 for a total of 27 pulses)? I would still expect every pattern to end with pulses that set the head back to a default “off” state. Perhaps the head does that on its own when it receives the 0x78 “end” byte? Since the start bytes are different for each pattern, I suspect the head must need to know that for some purpose as well. (Me from the future: This is foreshadowing…)

Also, I only think there are eight servos because of a reference in this Home Depot press release. There is an unofficial wiki that says there are nine servos.

I also do not assume all servos are independently controlled. If each eye has its own left/right servo, one pulse could be sent to each eye so they move together. At some point I may have to take this thing apart and look inside.

Until then, let’s see what eight servos might control:

  1. Head tilt forward and backwards.
  2. Head tilt left and right.
  3. Head turn left and right.
  4. Left eye looks left and right.
  5. Left eye looks up and down.
  6. Right eye looks left and right.
  7. Right eye looks up and down.
  8. Eyelids blink.
  9. Mouth open and close…if this is a servo?

Maybe this is why the wiki says nine?

And, if it is nine, but they tied the eyes together, it might really look like this to the control box signals:

  1. Head tilt forward and backwards.
  2. Head tilt left and right.
  3. Head turn left and right.
  4. Both eyes look left and right.
  5. Both eyes look up and down.
  6. Both eyelids blink.
  7. Mouth open and close…if this is a servo?

That could mean that only seven pulses are needed in the sequence.

Before I can proceed, I need to hook up the head and see what all motions it does.

To be continued…

Hacking Home Depot Lethal Lily animatronic props?

See Also: intro, part 1, part 2, part 3 and part 4. (Github project.)

As a lifelong fan of Halloween, I certainly think we are living in a golden age when it comes to store-bought props and decorations. There are so many animated and light-up props available each year at temporary stores (like Spirit) or, for some reason, home improvement stores like Home Depot and Lowe's.

In 2023, Lethal Lily was introduced. This prop has eight servos to control its head – eyes move left, right, up and down. Eyelids blink. Mouth opens. And the head can turn and tilt forward and backwards (so it seems from the video).

Here was their press release, which contains a link to a video of Lily:

https://corporate.homedepot.com/news/products/innovation-quality-and-value-introducing-home-depots-2023-halloween-product-lineup

I am making this post in case someone else is considering hacking on these to control them via an Arduino, Raspberry Pi or some other keyword I would list here if I could think of it.

More to come… Leave a comment if you found this page while looking for this information.

“You’re not wrong…”

This statement is now on my pet peeve list.

It is great to use when you know someone is right, but cannot tell them that.

Is there a name for this type of negative agreement?


If you agree, you can tell me “you’re not wrong” in the comments… ;-)

After the Rain / The Passage (1988) – the movie I was almost in.

In 1984, I was moved from Houston, Texas (where I grew up) to deep East Texas. I completed high school there, then moved to Lufkin, TX. That is where I lived when I started Sub-Etha Software with Terry Todd.

An interesting moment in my high school years was when a movie production company came to town to film something in San Augustine. My friend Jeremy (a drummer I played keyboards with) was from San Augustine. A group of us went down to audition to be extras in this movie.

Ned Beatty was in this thing! Apparently they turned the old downtown area into a “really old downtown” by covering the streets with dirt and such.

Here is the IMDB listing for the movie:

https://www.imdb.com/title/tt0191764

I do not think any of us saw the movie when it was released in 1988. Just the other day, Jeremy contacted me asking if I knew what was involved in playing a PAL VHS tape. He had located a VHS copy of the movie — from another country. Here is a subtitled trailer, though the movie is called “The Passage” in this trailer:

The Passage (1988)

When he sent me this, I went searching Ned Beatty’s IMDB page to look up more details. “The Passage” was not listed. I soon learned the movie was also called “After the Rain.” With that information, I was able to locate the IMDB entry and find a few other references to this film.

But, I cannot find any source to stream, rent or buy this film. Lost Media! At least in the USA.

I am posting this in case someone else is searching for it. (I did see one review on IMDB from someone who got to watch a premiere of the film in Tyler, TX when it came out.)

Leave a comment if you end up here after a search…

10 minutes of Insta360 X4 VR 360 video

From my Park Hopping site, here is ten minutes of Insta360 VR 360 video.

I set the camera in various places using a Best360 tripod I purchased on Amazon. I set the camera to 8K 360 video mode and just clicked record. No manual settings – just automatic mode.

The only “editing” of the video was putting the clips together in Final Cut Pro’s 360 video editor, adding some transitions, and some overlay text. I did no color corrections or enhancements. These are the files exported out of the Insta360 desktop app and then brought into a Final Cut Pro 360 video timeline in 8K.

YouTube renders the video down to 4K, it seems, so I guess we can’t share 8K video on YouTube yet…

4/28/2024 – Butterfly Palace, Branson MO USA

More to come…

Insta360 X3 versus Insta360 X4 in low light

Updates:

  • 2024-04-26 – When I did this test, I recorded an 8K run, a 5.7K+ run, then 5.7K. I could not tell which video was which from looking at the info inside Insta360 Studio. I now think the #1 pass was in 5.7K+ mode. I will have to redo all of these ;-)

By request, here are comparison videos of the Insta360 X3 and Insta360 X4 cameras recording in low light conditions. The recording was made at sunset, and the light level was low enough that the X4 displays a warning that it is too low for shooting in 8K.

But I did it anyway.

In the first test, I set the X3 to 360 mode and 5.7K. This allows reframing and exporting to HD. For the X4, I set it to 360 and 8K. This allows reframing and exporting to 4K. This obviously should make the X4 side have more detail, but what will it do to the brightness of the video?

X3 5.7K versus X4 8K

X3 5.7K versus X4 5.7K+ (I think)

For the next test, I did two recordings with both cameras set to 360 5.7K+ (I think). In both cases, the reframed video is exported as HD. This was the mode the X4 tells you to use when recording in low light.

Test #1:

Test #2 in normal 5.7K mode (unless I have #1 and #2 mixed up):

Is one better than the other? You can certainly see a lot of stabilization glitching going on at these low light levels.

To be continued…

I also repeated these tests at 24 fps (to see if that really does increase low light performance) and some other frame rates, but one of the files was incomplete from me hitting the button by mistake. I’ll go through the rest of my test clips, including some done in single lens mode, and create more comparison videos soon.

Insta360 Dolby Vision Enhanced comparison

Updates:

  • 2024-04-20 – Added longer single-lens example.

I do not know how long this has been in the Insta360 mobile app, but when I was exporting an X4 clip today I noticed an option to enable “Dolby Vision Enhancement.”

Dolby Vision Enhanced

After transcoding, the details of the surface are significantly enhanced, enhancing the light and shadow effects and the sense of presence of the video, and presenting a more realistic color rendition of the real-life scenes.

I was unfamiliar with this, and looked it up on Wikipedia:

https://en.wikipedia.org/wiki/Dolby_Vision

…and the official website…

https://www.dolby.com/technologies/dolby-vision

Does it really do anything beyond playing with colors? The Insta360 already has Color Plus and Clarity Plus to play with. I decided to do a quick test of the same Skylapse clip with and without Dolby Vision Enhancement:

Well, it’s bluer, at least. Now that I am aware of this option, I will do some more tests with other footage I have shot. Exporting from the mobile app does not offer the higher bitrate that the desktop Insta360 Studio has, but if this feature is in the desktop app, I could not locate it.

Here is a longer test, shot in single-lens mode:

More to come… Please leave a comment if you use this mode and tell us why.

Insta360 X3 versus Insta360 X4 – side by side videos

Updates:

  • 2024-04-18 – You can download the mp4 files I uploaded to YouTube from my Dropbox if you want to see them without YouTube’s compression.

I received my X4 the day after release (thank you, Amazon). That evening, I went out and did a few quick videos with the X3 and X4 mounted side-by-side. For one test, I recorded video in single lens mode using the default 4K settings. For the other test, I recorded in default 360 mode then reframed and exported. Since the X4 comes with plastic lens guards, and since the built-in tutorial shows how to install them as a first step, I put them on my X4. I wanted to recreate what a new user would most likely see if they followed the on-screen instructions.

Single Lens 4K

When comparing Insta360 X3 single lens (4K, 30fps) to Insta360 X4 single lens (4K, 60fps), I think the X4 is noticeably better. In this video, the audio comes from the X4. The X4 also had the Standard Lens Guards installed. You will see some extra lens flare type stuff caused by these lens guards on the X4 that the X3 video does not have.

360 Video Reframed

I shot in default 360 video mode. The 360 footage on the X3 records in 5.7K, and the 360 footage on the X4 records in 8K. The reframe export option from Insta360 Studio is 3840×2160 (4K) for the X4, and 1920×1080 HD for the X3. This video is in 4K. The X4 appears to be a substantial upgrade.

Conclusion

My goal here was to do the simplest test I could, using default settings like a regular user would use. I set video and 150 (single lens) mode and recorded, and I set 360 mode and recorded. I then exported (and reframed/exported) in Insta360 Studio using the highest bitrate it offers (200 Mbits). I edited those videos together in Final Cut Pro X then exported to HEVC format (h.265) for uploading.

What do you think?

More to come…

Insta360 X4 Standard Lens Guards see sun spots – maybe?

Yesterday, I mounted both the X3 and X4 to my Kugoo G5 electric scooter and rode them around some residential streets. To represent the X4 “as shipped,” I attached the “X4 Standard Lens Guards” that come with the unit. One of the first things the X4’s built-in tutorial screens show you is how to install them, so I wanted to follow the “default instructions” like a new user hopefully does.

On playback, I noticed bright sun spots dancing around the screen. You can see one here:

You will notice that the sun is above in this photo. When facing away from the sun, this spot is not there.

Is this from the Standard Lens Guard? Or just from the normal lens? I did not notice this until viewing it later. This may or may not be related to the lens guards. I will do more testing, soon.

Just an FYI for those curious as to how this looks. I will be posting the video soon, but needed a place to post this photo so I can share the link with those asking about it.

More to come…