Monday, February 20, 2017

ThinkPad X200 Tablet as peripheral device Attempt 1

The goal of this project is to turn a "Wacom Penabled" ThinkPad laptop into a peripheral screen and input device for my main Windows PC.

This documents just the first attempt, as not everything is working yet and I need to move on to other projects for a while.

The starting point: I have a Lenovo ThinkPad X200 Tablet with a Wacom Penabled digitizer, pictured below. I read the following articles to figure out how I might go about my goal: article 1, article 2

The instructions from those articles were not sufficient for my setup and included some steps I skipped. Below are the steps I took to get as far as I have so far.

I installed Ubuntu (latest desktop version) on the ThinkPad. It installed cleanly alongside my existing Windows 7 install without the need for any other tools (such as WUBI, mentioned in the articles). The pen worked out of the box on this system.

I verified the port of the pen with the following command (the port is /dev/ttyS4 for me):
    dmesg | grep ttyS 
I installed ser2net:
sudo apt-get install ser2net
Then I edited the configuration file at /etc/ser2net.conf, adding the following line at the bottom:
    7000:telnet:600:/dev/ttyS4:38400 remctl NONE 1STOPBIT 8DATABITS -XONXOFF -RTSCTS -LOCAL 
I added my username to the group dialout with this line:
    sudo adduser myusername dialout
At this point, I was not able to see the output of the pen on /dev/ttyS4 because all attempts to access it manually gave me a resource-busy error. I fixed this by commenting out the second-to-last line in /lib/udev/rules.d/69-wacom.rules. This step requires a restart of the machine afterwards to take effect. The line to comment out reads as follows, with its surrounding comments:
    # comment out the next line if your distribution does not provide systemd
    # If a /dev/ttySx device with the WACf name is detected, start the
    # wacom-inputattach service with the kernel name as parameter
    # SUBSYSTEM=="tty|pnp", KERNEL=="ttyS[0-9]*", ATTRS{id}=="WACf*", TAG+="systemd", ENV{SYSTEMD_WANTS}+="wacom-inputattach@%k.service"
Now it is also necessary to disable the Wacom pen devices in xinput. If this is not done, streaming the data out with the cat command makes the computer behave erratically, and it probably caused the kernel panic I experienced once as well. To list the devices, run "xinput", which returns a list with ID numbers on the right. I found three Wacom entries under Virtual Core Pointer: the stylus, eraser, and touch, with IDs 12, 13, and 14 respectively. I disabled all three with the commands below. These should go in a startup script; otherwise they must be entered manually on each startup of the system, possibly on each login as well.
xinput disable 12
xinput disable 13 
xinput disable 14
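Since the numeric IDs can change between boots, a startup script that looks the Wacom devices up by name is more robust than hard-coding the IDs. This is a sketch, not tested on this exact machine; device names vary by model, and `--name-only` needs a reasonably recent xinput, so check `xinput list` output first:

```
#!/bin/sh
# Disable every xinput device whose name mentions Wacom.
# Names are machine-specific -- verify against `xinput list` output.
xinput list --name-only | grep -i wacom | while read -r name; do
    xinput disable "$name"
done
```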

At this point the pen won't work on Ubuntu anymore, which shouldn't be a problem because the goal is just to get the data over to the Windows machine anyway.

Now I can finally test the device locally. I entered the following command and saw data (displayed as unintelligible gibberish) come out whenever I touched the stylus to the screen.
sudo cat /dev/ttyS4
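With raw bytes confirmed on /dev/ttyS4, the ser2net forwarding can also be sanity-checked from the Ubuntu machine itself before involving Windows. A sketch, assuming ser2net is restarted so it picks up the edited config:

```
# restart ser2net so it re-reads /etc/ser2net.conf
sudo service ser2net restart
# connect to the forwarded port locally; touching the stylus to the
# screen should spill bytes into the terminal (Ctrl+] then 'quit' exits)
telnet localhost 7000
```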
Now I opened a PuTTY session on the Windows PC. I set 'Connection Type' to Telnet, 'Port' to 7000, and the host name to the IP of the Ubuntu machine on the network, which I found by running 'ifconfig' on the Ubuntu machine. The connected network interface, which for me is called wls1 and corresponds to my wireless card, shows the address on its second line. It looks something like the following:
inet addr:
With the address entered into the 'Host Name (or IP address)' field, I hit the "Open" button and saw that whenever I moved the pen to the Ubuntu screen, unintelligible symbols appeared in the window. Excellent. [Note: close the PuTTY window after this test.]

The only remaining steps were to get the modified Wacom tablet drivers working and to install HW VSP. I made only half-successful progress here, so I won't write it up in detail yet. I followed the instructions for the drivers from article 2 and tried this on two Windows machines, each running Windows 7.

On my desktop computer, the Wacom drivers (installed per the instructions in article 2) seem to be the issue. I get a complaint about them every time I boot the machine, and while HW VSP says "connected", all the packets just add to the number in the Queued section, and the VSP fields other than Status remain blank. On the Windows 7 laptop, the Wacom drivers don't complain at me, and the HW VSP window shows values for all the VSP fields, as well as incrementing Rx and Tx packet counts in the VSP column. But moving the pen on the Ubuntu machine still does not move the cursor on the PC.

For now I'm shifting gears away from this project, but hopefully I can get it working in the future. The Windows drivers are clearly the problem, and I've tried a lot of things, including other drivers, as well as messing with the registry and the driver files. I might have more success going through a second Ubuntu machine and using Ubuntu tools and drivers here. But the point is in part to get this working for a Windows PC, just because of my software and OS use patterns (I develop on Ubuntu, but use design and entertainment software on Windows).

If you think you have a solution let me know!

Wednesday, February 15, 2017

Simple Educational Robot Arm with Inverse Kinematics

The point of this project is to test a lesson covering the theory and construction of a simple, two-degree-of-freedom robot arm that moves in a plane and is controlled by specifying the X and Y position of the end of the arm (the 'hand'), rather than by specifying the angles of the two servos directly.

I ran most of this lesson last year at a summer camp, but due to limited time/high variance in math background of the students, we weren't able to reach the point I wanted to reach. (Really what happened is that I volunteered to have another class join my class for three hours for an activity, which meant a lot of my prep time ended up being soldering leads on more potentiometers...)

This activity seemed promising so I went through it again, at a less hectic pace this time, to try and make a list of the tricky 'gotchas' that would be useful to provide as hints if the goal was to have most of the students complete the project in the class time allotted.

Students who have completed a class covering trigonometry should have sufficient mathematical background to cover the math, which is described nicely on this page:

A partial list of the tricky parts students may get stuck on (beginners tend to try to build the whole thing at once; a few compound problems can make it very difficult to debug):
  • Constructing the arm so it is sturdy enough to last until demonstration time.
  • Including math.h and finding documentation for C trig functions
  • Keeping track of radians vs degrees
  • Measuring link lengths accurately and setting up the physical arm to match the model (the arm should form a perfectly straight line when both servos are at their maximum range)
  • Mapping the analog inputs (reading from the potentiometers) to a reasonable range of the X and Y axis, in accordance with the units chosen for specifying the link length in the code
  • Dealing with floating point error--floating point operations are not the optimal solution here but the goal here is to avoid adding more complex C programming concepts onto an already dense lesson. See the code for more detail. A good summary of the issues can be found here:
  • Using a separate power supply or filtering the Arduino power supply. The arm can otherwise get stuck in a cycle where the servo moves, drawing a lot of power momentarily, which causes the analog inputs to jitter (and the onboard LEDs to flicker), which causes a new reading, which causes the servo to move again... Most students will use the separate power supply if it is provided, but it is also common for students to forget and tie everything to a single rail.
Here's an illustration of an issue that will happen if the student mounts the second link so that it cannot fold inwards. The green area on the left shows where the end of the second link (the 'hand') can reach. The shape is awkward to navigate. The right hand side shows the area reached by a better choice of angle limits for the second link. If it can fold inward, similar to how a human forearm/elbow/upper arm works, then it is easier to use. (This image assumes links are equal length).

I did mount the second servo flipped downwards, to reduce overall height of the assembly. That was a mistake in hindsight, because it means the arm can collide with itself.

Here's a video of the test arm I built, demonstrating that it more or less works correctly.

Finally here's the sketch. It isn't clean or commented...but I've decided it is better to share these things than to never get around to it at all.

Arduino Sketch for this project

Tuesday, February 14, 2017

Talking Calipers 1

At a recent Sector67 monthly meeting, a member who teaches at a school for the blind and visually impaired mentioned a need for talking calipers. They are hard to find for sale and they typically cost in the realm of $250, though if you have a link to something significantly cheaper please let me know. My overall goal is to design a cheap little add-on that slots on to standard calipers that have a serial output already. This post is about the first step: a quick talking calipers proof of concept. Here's a picture of the working setup. See bottom of post for a video of the setup in action.

Here's what's in the photo. Everything came from either my collection of prototyping stuff, or from the collection of a fellow Sector member who took interest in this project.

If you add up the total cost of the items listed above, it comes out to about $, already cheaper than the $250 price point. (I'm aiming to get the total cost down a lot further than that, though.)

Here's how it all works together. The calipers are bumped up from their usual 1.5 V coin-cell supply to 1.9 V from the breadboard power supply. This makes the logic level converter work with the signal from the calipers (the minimum signal voltage it supports is 1.8 V, so 1.5 V is too low). The signal from the calipers is easy to access; there's a port on the calipers under a removable plastic cover. On the other end of the logic level converter is the Arduino at 5 V.

The Wave shield comes with its own example code. The calipers' communication protocol has already been reverse engineered by a number of folks online; I got my code snippet from

The main challenge here is that the Wave shield uses all the external interrupt pins on the Arduino, so I needed to use a different kind of interrupt. After a brief read through the forums and the Atmega328P datasheet, I decided to use the analog comparator. Comparing against some code snippets made it a fairly easy job to set all the registers I wanted to the values I needed. I used the internal bandgap reference as the positive input to the comparator and my calipers' CLOCK signal as the negative input. Then I set the interrupt to trigger on rising output edges. [Note: the positive comparator input, Atmega pin AIN0, is Arduino pin D6, and the negative input, Atmega pin AIN1, is Arduino pin D7.]
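A register setup along these lines should do it. This is a sketch based on the Atmega328P datasheet, not the exact code from this project, and the ISR body is a placeholder:

```cpp
#include <avr/interrupt.h>

void setupComparator() {
  DIDR1 |= (1 << AIN1D) | (1 << AIN0D); // disable digital input buffers on D6/D7
  ADCSRB &= ~(1 << ACME);               // negative input = AIN1 pin (Arduino D7)
  ACSR = (1 << ACBG)                    // positive input = internal ~1.1 V bandgap
       | (1 << ACI)                     // clear any pending interrupt flag
       | (1 << ACIE)                    // enable the comparator interrupt
       | (1 << ACIS1) | (1 << ACIS0);   // interrupt on rising comparator output
  sei();
}

// The comparator output goes high when AIN0(+) exceeds AIN1(-), so this
// fires each time the CLOCK line on D7 drops below the bandgap voltage.
ISR(ANALOG_COMP_vect) {
  // sample the caliper DATA line here (placeholder)
}
```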

The code for this project is available here.

Here's the video:

Next up I'll tackle building a prototype of a cheap and portable version.

Sunday, February 12, 2017


Cardboard-i-copter wasn't a project so much as a fortuitous discovery.

A few years ago my uncle gifted me this quadcopter:

Unfortunately I couldn't get it to fly. It mostly chaotically bounced around the room. Friends with many more quadcopter flying hours than I all failed to get it to fly.

One night while hanging around the main Sector67 table, I pulled the roll cage off. Still no dice. So I disassembled it entirely, and then as a joke, taped the electronics and motors to cardboard.

It worked.

Nothing about this configuration was rigid or carefully measured or square. It seemed to self-right the motors so long as they were more than 45 degrees off the plane of the cardboard. It was pretty easy to fly. It was incredibly easy to repair after a crash.

Eventually it developed an issue with the control system that sometimes caused it to go full throttle and unresponsive. Those props are kind of painful to fingers and I was sick of chasing it down and fighting it to get the battery unplugged.

It would be fun to explore cardboard-i-copters more one day, formally (the physics of it), and informally (building a bunch of designs). If you have some cheap mini quadcopters or quadcopter parts lying around, try it and let me know how it goes.

Wednesday, October 19, 2016

Blender Graphic Novel Work Diary #1

I've been working on my Blender Graphic Novel for a while now. Blender is an open source 3d creation suite. The graphic novel has no title yet, but the story has been fleshed out in full (50 or so scenes). The graphic novel will very much resemble a hand-drawn one, but all the visuals are computer generated. Other than Blender, I use Marvelous Designer to make the outfits, and the GIMP to lay out the final panels. Most models come from and full attribution documents will be appended to every chapter release. All humanoids are built with Manuel Bastioni Lab in Blender.

Here are some images and notes from my 'work diary.' Layout mess courtesy Blogger's terrible photo layout tools.

Viewport render of a hairstyle for a minor character
Test of ink shader
Backlight keyboard image concept
Police uniform in Marvelous Designer
Major character clothing and hair test
Prison uniform in Marvelous Designer
Struggles with the hair system for a minor character
Testing Marvelous Designer by making a dress

Outfit pose test

Braid and wig test
Color and style test

Character pose in the viewport

Monday, October 17, 2016

Nefertiti 3D printed

The Nefertiti Hack refers to the release of a high-quality 3d scan of the bust of Nefertiti without the permission of the Neues Museum, which holds the piece. Read more here. I believe that we as a society should feel obligated to digitize, catalog, and make it easy for the public to access art. I'm a big fan of the work done by Cosmo Wenman and his commentaries on these topics.

Anyway, once it was released I printed the Nefertiti head in 80% scale. I split it up into 8 pieces (using the Boolean object modifiers in Blender and a large cube object) that would be manageable for our army of Flashforge Creators here at Sector67. The pieces were the neck, face, left and right head sections with ears, middle left and right hat sections, and top left and right hat sections.

They were printed in ABS black plastic and glued together with acetone. The hat pieces warped a lot and the surfaces were not flat enough, so those were glued together with dowel rod pins and hot glue. I used Tamiya putty to fill in cracks and then did a lot of hand wet sanding to get the creases less noticeable.

Putting Tamiya putty on the surfaces to fill in striations and then sanding worked fairly well. At this scale, filament lines aren't that obvious anyway. It felt like I was rubbing lotion onto her skin. Only it was nasty toxic smelly lotion with fumes that cause some neurological damage. I did most of this outside, but it isn't really worse than the other poisons I use like spray paint or wood stain or whatnot.

I don't have more photos, but I basically left it at a very gold color (after a few more coats of the paint above, which is a brass-containing paint). I was going to patina it, but it is making the rounds at various maker faires and other events, and chips caused by falls are easy to repair when the color is a flat brass but would not be easy to fix if I also had to redo the patina.

Tuesday, September 27, 2016

Robot arm painting

Here's the robot arm at a university-run back-to-school party in the library. The wrist was broken, so I just tore it off and attached a paintbrush. As usual, I left this for the 30 minutes before the party and the first 30 minutes of the party. No time for fine-tuning settings (counterweight, easel position, angle limits, etc.), but it worked alright.

I've done a few of these robot painting things at science fairs. Since this one was for college age students and up, I felt safe handing the controls over. At kids science fairs I can't trust them not to poke out a few eyes, so it can be difficult to get any documentation of the event since I'm always busy with the controller.