Laser cutter/Engraver add-on for 3D printer

A while back I was mulling over the idea of a laser-engraved PCB etching process, and when I found a 2.5 W cutting laser diode on sale cheap on Banggood, I decided to go for it. Many months later I finally got around to testing it, and the laser works great! The obvious thing to do with it was to attach it to my 3D printer as a removable module, so that I could have 3D printing and laser cutting/engraving all in one machine instead of having to build another motion platform.

The 3D printer I’m using is a heavily modified Geeetech i3 Pro C kit that I assembled a few years ago. Since getting the kit I have upgraded the hotend to an E3D V6 using this extruder assembly. The extruder works great and, best of all, it has lots of attachment points built in, which made adapting the laser very simple.

I made up a bracket in Onshape to mount the laser to the front of the extruder, printed it, bolted it on and slid the laser into the mount. I added a connector to the wiring of my layer cooling fan and a mating connector to the laser, so that I could use the fan PWM output (which I recently upgraded to a 5 A MOSFET) to control the laser.

I donned my laser goggles, turned the laser up to 30% power, focused the beam as close to perfect as I could by eye and started checking the new zero point I would have to use for the laser. With the zero point determined, I started on my laser etching process. The first test was a simple “Hello World” engraving into a piece of wood to verify my settings and focal point. I used Inkscape with the J Tech Photonics plugin and it worked with no issues; it’s a great plugin for outlining vector shapes!

With the first test completed, I looked into the PCB etching process. The usual way this is done is to paint the board with a light coat of matte black paint, take the board layout from the CAD software, generate an image of the traces, turn it into a black and white negative, raster burn that onto the board and then etch the result. This works, but it’s neither precise nor fast: the laser ends up doing a lot of extra moves, and diagonal lines tend to end up with a lot of aliasing.

To get around these issues, I decided to take my board design from Eagle, run it through the isolation milling plugin PCB-GCODE, modify the resulting gcode to replace all spindle motion in the Z axis with laser on/off commands, and then go through the normal process of paint, engrave and etch.

Setting up PCB-GCODE was fairly straightforward: I set the machine settings to match my 3D printer’s capabilities, set the cutter diameter to the approximate laser spot size, set the isolation step to half of the spot size, and set the Z down and Z up parameters to 0 and 1 so I could easily parse them later with a script. I ran the resulting gcode through NC Viewer to validate the tool paths and kept tweaking the machine settings until I was satisfied with what I was seeing in the simulation. At that point it was on to scripting.

I decided to use Python for the processing, for no other reason than that I hadn’t done any Python scripting before and wanted to give it a try. I found some simple example code showing how to find and replace data in a file and modified it to take in the gcode file and replace or remove the elements I needed to change. In this case:

G00 Z0.0000 became M106 S255
G01 Z1.0000 F6000.00 became M107
M02, M03 and M05 were removed

and the following code was injected after the last comment in the file header to home the axes and set the new zero location for the file to start at:

G28 ;Home all axes
G0 X0 Y46 Z47.5 ;Move to laser Zero position
G92 X0 Y0 Z0 ;Zero X,Y and Z
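A minimal sketch of what that post-processing might look like in Python (the function name and structure are mine, not my actual script; it assumes the only Z moves PCB-GCODE emits are the two forms shown above, and that the file header is a run of parenthesised comment lines):

```python
HEADER = """G28 ;Home all axes
G0 X0 Y46 Z47.5 ;Move to laser Zero position
G92 X0 Y0 Z0 ;Zero X,Y and Z"""


def convert(gcode: str) -> str:
    """Turn PCB-GCODE spindle output into laser on/off gcode."""
    out = []
    injected = False
    for line in gcode.splitlines():
        s = line.strip()
        # Inject the homing/zeroing block after the header comments end
        if not injected and not s.startswith("("):
            out.extend(HEADER.splitlines())
            injected = True
        if s.startswith("G00 Z0.0000"):
            out.append("M106 S255")   # Z down -> laser on (fan PWM 100%)
        elif s.startswith("G01 Z1.0000"):
            out.append("M107")        # Z up -> laser off
        elif s in ("M02", "M03", "M05"):
            continue                  # strip program-end/spindle commands
        else:
            out.append(line)
    return "\n".join(out)
```

Running the PCB-GCODE output through this gives a file a stock Marlin-style 3D printer firmware will happily execute, with the laser switching in place of the tool retracts.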

After all this I have a basic toolchain assembled for the PCB laser etch process. The only things left to do are a few test runs on paper to validate the beam width settings, some tests on painted copper to determine the cut speed needed to ablate the paint, and then a try on a real board to see what kind of results can be had! So far it looks promising and should be much faster and more precise than raster engraving; it’s also worth mentioning that it should use far less etching chemicals than the conventional methods! Of course, in order to continue I will need to hook up a fan to clear the smoke away from the laser and figure out a ventilation system to vent the fumes outdoors rather than into my office, but we’ll get there in time.

Mapping a small sensor range to full ADC range

Today I decided to look into a more elegant solution to the problem of accurately determining the angle of my rudder pedals. Previously I had solved this with a mechanical solution that converted the limited rotation of the rudder pedal cross arm to the 270° rotation of the potentiometer I’m using, via a timing belt and two 3D printed toothed wheels of the correct ratio.

There are supposed to be some pictures here, but WordPress is currently broken for me 🙁

While this solution did work, it was not particularly elegant. It made calibration difficult, and it ran the risk of damaging the pot or causing the calibration to drift if the axis was forcibly pushed past its limits, which is very likely with a foot operated device!

The electrical solution I came up with was to use an Op Amp to map the output range of the sensor to the larger input range of the Analog to Digital Converter. That way I could couple the potentiometer directly to the rudder pedal cross arm and, despite its limited motion, still get the full 0–5 V range I was looking for, for maximum precision from my ADC.

In my research for this solution I didn’t really know what I was looking for. I’m not an engineer of any description, nor do I possess a particularly strong grasp of mathematics or electronics theory, so I frequently have to look things up, which is difficult when you don’t know how to phrase the question!

I stumbled my way across a thread on the Electronics Stack Exchange where the original poster was looking to do something similar to myself, and it provided a link to a Texas Instruments reference document on calculating the gain and offset of an Op Amp: Designing Gain and Offset in Thirty Seconds.

After parsing the formulas I entered them into a spreadsheet and began testing my results with Tinkercad Circuits, an excellent tool for testing circuits without blowing up any real-world components! The circuit I came up with, with the help of the spreadsheet, was this.

Please note that if you attempt to use the spreadsheet and it gives you crazy values, it may be that I made an error in the formulas; I have only tested the positive m and negative b function, as that is the one pertinent here. If you find any bugs and make a fix, please let me know so I can update the sheet!

This effectively reproduces the scenario of a pot with limited range and swings the values between roughly 0 and 5 V; in this case the endpoints were just a set of arbitrary values that I decided to use for testing the spreadsheet. I had to tweak the value of R1 in order to get the range I wanted, which is described in the TI reference.
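For reference, the math underneath the TI note boils down to fitting a line Vout = m*Vin + b through the two endpoints of the sensor’s range. A quick sketch in Python (the 1.5 V to 3.5 V input swing here is an arbitrary example for illustration, not my actual pot values):

```python
def gain_offset(vin_lo, vin_hi, vout_lo=0.0, vout_hi=5.0):
    """Gain m and offset b such that Vout = m*Vin + b maps the
    input range [vin_lo, vin_hi] onto [vout_lo, vout_hi]."""
    m = (vout_hi - vout_lo) / (vin_hi - vin_lo)  # slope of the line
    b = vout_lo - m * vin_lo                     # intercept
    return m, b


# Example: a pot that only swings 1.5 V to 3.5 V, stretched to 0-5 V
m, b = gain_offset(1.5, 3.5)
print(m, b)  # m = 2.5, b = -3.75: the "positive m, negative b" case
```

Once you have m and b, the TI document walks through picking the resistor values that realise that line with a single Op Amp stage.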

So, to sum it up: if you have a sensor that outputs a non-ideal range of voltages, you can use an Op Amp to map that to a usable range, and TI’s helpful “Designing Gain and Offset in Thirty Seconds” will get you there, though the title is a little optimistic unless you already know what you’re doing!

Capturing a time lapse from a Raspberry Pi camera

After running into a bug with VLC capturing images from an RTSP stream on Linux, I decided to go a different route for capturing time lapses from my 3D printer: a local script on the Pi that can be run with command line parameters to set the interval and duration of the capture, with the images output to a network share. This satisfies the same goal as the RTSP stream by not having the Pi store or write data to the SD card, prolonging its life.

I went about the process in this order:

  • Set up a share on my workstation
  • On the Pi: "sudo apt-get install samba-common samba-common-bin smbclient cifs-utils"
  • Add the share to /etc/fstab by appending "//<server>/Timelapse /mnt/timelapse cifs guest,rw,iocharset=utf8,file_mode=0777,dir_mode=0777 0 0"
  • Reboot the Pi to ensure the mount is permanent
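One detail the list above glosses over: the mount point named in the fstab entry has to exist before the share can mount. A quick sketch of creating it and mounting without waiting for a reboot (paths match the fstab line above):

```shell
#!/bin/sh
# Create the mount point referenced by /etc/fstab, then mount
# everything in fstab immediately instead of rebooting. Needs root.
[ "$(id -u)" -eq 0 ] || { echo "run as root"; exit 0; }
mkdir -p /mnt/timelapse
mount -a
# Confirm the share actually came up
mountpoint /mnt/timelapse
```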

After that I wrote a simple script to allow for timelapse capture:

#!/bin/bash
# Timelapse from an RPi camera
if [ $# -eq 0 ]
then
    echo
    echo "usage: $0 [runtime] [interval] [destination]"
    echo
    echo 'All times are in seconds (3600 seconds per hour)'
    echo
    echo "example:  $0 3600 60 ~/images/timelapse"
    exit
fi
runtime=$1
interval=$2
if [ $# -gt 2 ]
then
    store_path=$3
else
    store_path=/mnt/timelapse
fi
while [ $SECONDS -lt $runtime ]
do
    datetime=$(date +"%FT%H%M%S")
    raspistill -vf -hf -o "$store_path/$datetime.jpg"
    sleep $interval
done
echo 'time lapse complete'

Unfortunately I ran into trouble: whenever I called raspistill I would get an ENOSPC error and it would fail to capture anything. I ran a firmware update, and that led me down a rabbit hole where it turned out rpi-update is broken on older Pis such as my Pi 1B. This required downloading the files manually and forcing the update without a download, as per the below:

curl -L https://github.com/Hexxeh/rpi-firmware/archive/master.tar.gz -o master.tar.gz
cd /root/.rpi-firmware
sudo tar -xvzf /home/pi/master.tar.gz --strip-components=1
sudo SKIP_DOWNLOAD=1 rpi-update

Finally, with all of that complete, I tried again and it failed, the reason being that I still had my streaming script running from the previous article! I went through and disabled the service I had set up previously:

systemctl disable stream-rstp.sh

I tested and successfully got an image off the camera, then ran my script, and it worked!

I can now grab high resolution time lapses of my 3D printer by SSHing into my Pi and running a single command string!


Capturing a time lapse from an RTSP Stream

I was looking for a better way to capture time lapse recordings of my 3D printer; OctoPi on my old Raspberry Pi just wasn’t cutting it, so I decided to set up an RTSP stream as per this post by Chris Carey. That was straightforward enough, but then I needed a way to periodically grab frames from the camera as it streamed, and for this VLC came to the rescue! With a (relatively) simple command line I’m now able to grab frames and encode them in JPG format, and those can in turn be recombined into a time lapse video!

The command I used was as follows:

"C:\Program Files (x86)\VideoLAN\VLC\vlc" rtsp://192.168.254.202:8554/stream -V dummy --intf=dummy --dummy-quiet --video-filter=scene --no-audio --scene-path=C:\temp --scene-format=jpeg --scene-prefix=snap-%datetime% --no-scene-replace --run-time=1 vlc://quit

I used a little batch trickery to get the filename to include the current date and time via this handy little addendum:

for /f "tokens=2 delims==" %%I in ('wmic os get localdatetime /format:list') do set datetime=%%I
set datetime=%datetime:~0,8%-%datetime:~8,6%

And that’s it, just run that as a scheduled task and you have periodic screen grabs!

I’ll be working on converting this to Linux later on, which should be pretty simple, but for now I’m working on a Windows machine, so that’s what I have.
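For what it’s worth, here’s a sketch of what the Linux version might look like (untested on my end; cvlc is VLC’s console-only binary, and date(1) replaces the wmic batch trickery, while the scene-filter options should carry over unchanged):

```shell
#!/bin/sh
# Linux take on the Windows command above: build a YYYYMMDD-HHMMSS
# timestamp for the filename prefix using date(1).
datetime=$(date +%Y%m%d-%H%M%S)

# Guard so the script is harmless on machines without VLC installed
if command -v cvlc >/dev/null 2>&1; then
    cvlc rtsp://192.168.254.202:8554/stream \
        --video-filter=scene --no-audio \
        --scene-path=/tmp --scene-format=jpeg \
        --scene-prefix="snap-$datetime" --no-scene-replace \
        --run-time=1 vlc://quit
fi
```

Dropped into cron instead of the Windows Task Scheduler, that should give the same periodic grabs.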