This is a very cool visualization, and I will continue to play around and explore various parts of the country. One thing I noticed immediately, due to where I live, was an interesting consequence of the algorithm. According to the model, there are several very large buildings nearby (see the outskirts of Forest Grove in the picture).

There is one cluster in the lower left (near Ritchey Rd) and another in the upper right near Schefflin. These are not really buildings; they are hoop-houses: structures made of nothing but plastic sheeting and a few metal pipes. You could argue this is nit-picking, and I’m fine with that characterization. But I’d love to press the issue and ask what it would take to train a neural network better, so it knows the difference between what you and I would call a building and what merely looks like a building from space.

To be more clear, here is a picture of the Ritchey Rd area:

And a full-zoom of the satellite image, corresponding to the top left (northwest) portion of the nursery (the NYT article doesn’t zoom in from here):


Clearly these are structures, but what would be the next step in developing the algorithm to know they aren’t actually buildings (or do you define them as such?). Certainly including some nurseries in the training set would be a first step.
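To make the “add nurseries to the training set” idea concrete, here is a toy sketch of how hard negatives change a classifier’s output. Everything here is invented for illustration (the two features, the data, a plain-numpy logistic regression standing in for a real CNN); it only shows the principle that mislabeled-look-alikes must appear in training with the correct label.

```python
import numpy as np

# Toy illustration of adding "hard negatives" to a training set.
# Features and data are invented; a real building detector learns
# from image pixels, not two hand-made features.
rng = np.random.default_rng(0)

def train_logreg(X, y, lr=0.5, steps=2000):
    """Plain-numpy logistic regression via gradient descent."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w + b)))
        w -= lr * X.T @ (p - y) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def predict(X, w, b):
    return 1 / (1 + np.exp(-(X @ w + b)))

# Invented features: [rectangular-ness, roof signature]
buildings   = rng.normal([0.9, 0.8], 0.05, size=(50, 2))
open_fields = rng.normal([0.1, 0.2], 0.05, size=(50, 2))
hoop_houses = rng.normal([0.9, 0.3], 0.05, size=(50, 2))  # rectangular, but not a building

# Train without any hoop-houses: they score as buildings.
X1 = np.vstack([buildings, open_fields])
y1 = np.concatenate([np.ones(50), np.zeros(50)])
p_before = predict(hoop_houses, *train_logreg(X1, y1)).mean()

# Retrain with hoop-houses labeled as non-buildings (hard negatives).
X2 = np.vstack([buildings, open_fields, hoop_houses])
y2 = np.concatenate([np.ones(50), np.zeros(100)])
p_after = predict(hoop_houses, *train_logreg(X2, y2)).mean()

print(f"mean P(building) for hoop-houses before: {p_before:.2f}")
print(f"mean P(building) for hoop-houses after:  {p_after:.2f}")
```

The “before” model has never seen a hoop-house, so anything rectangular scores as a building; once the look-alikes are in the training set with the right label, the score drops.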

Python with a Cocoa GUI on macOS

I finally had a chance to try writing a native Cocoa interface in python. My test project was based on a post by Adam Wisniewski (though I had to use the cached google version due to a bad gateway error). In that post, Adam laid out the process to put together a basic python app that uses a native Cocoa GUI created in Xcode. Given the updates to Xcode, and the fact that I’m using anaconda python, I figured I’d repost my process with the modifications I made to get it to work, in case it is useful for anyone else (and so I can remember how I got it to work).

First, some background: I have anaconda python installed in my home directory, so I needed to add some packages to let python connect with the Mac Cocoa framework. If you are just running the vanilla python that comes with your Mac, then you shouldn’t need to add anything. Test your setup by trying to import Cocoa in python. In my case, that didn’t work until I installed two packages:
pip install pyobjc-core
pip install pyobjc-framework-cocoa

I didn’t find these with conda on conda-forge, so it was a job for pip. Once these are installed, you should be able to import Cocoa and import Foundation in python. Next, it’s time to check for py2app. This step was new to me (I haven’t used py2app before), but I figured it would work best to have this installed in the python environment I plan to use, so once again, let pip do the work: pip install py2app. After that, you should be able to run py2applet on the command line and it will return a default help message. We’ll use this utility to create a setup.py file for the application and bundle it.
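A small script makes the “try to import Cocoa” test repeatable; the try/except just reports the result either way (on a non-Mac machine, or a Mac without the pyobjc packages, it prints the ImportError instead of crashing):

```python
# Check whether PyObjC can see the Cocoa and Foundation frameworks.
try:
    import Cocoa        # from pyobjc-framework-cocoa
    import Foundation   # from the PyObjC framework packages
    pyobjc_ok = True
    print("PyObjC is available")
except ImportError as err:
    pyobjc_ok = False
    print(f"PyObjC not available: {err}")
```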

First, put the following application code into a python file:

from Cocoa import *
from Foundation import NSObject
import objc

class SimpleXibDemoController(NSWindowController):
    counterTextField = objc.IBOutlet()

    def windowDidLoad(self):
        NSWindowController.windowDidLoad(self)

        # Start the counter
        self.count = 0
        self.updateDisplay()

    @objc.IBAction
    def increment_(self, sender):
        self.count += 1
        self.updateDisplay()

    @objc.IBAction
    def decrement_(self, sender):
        self.count -= 1
        self.updateDisplay()

    def updateDisplay(self):
        self.counterTextField.setStringValue_(str(self.count))

if __name__ == "__main__":
    app = NSApplication.sharedApplication()

    # Initiate the controller with a XIB
    viewController = SimpleXibDemoController.alloc().initWithWindowNibName_("SimpleXibDemo")

    # Show the window
    viewController.showWindow_(viewController)

    # Bring app to top
    NSApp.activateIgnoringOtherApps_(True)

    from PyObjCTools import AppHelper
    AppHelper.runEventLoop()

Next, we can start to package this app by creating a setup.py file using py2applet:

py2applet --make-setup

Notice that you call this with the name of the python application file. The generated setup.py should look like:

"""
This is a setup.py script generated by py2applet

Usage:
    python setup.py py2app
"""

from setuptools import setup

APP = ['']
DATA_FILES = []
OPTIONS = {}

setup(
    app=APP,
    data_files=DATA_FILES,
    options={'py2app': OPTIONS},
    setup_requires=['py2app'],
)

You’ll want to associate the application with a .xib file (i.e. the GUI side of things). Just add the filename to the DATA_FILES list (although we haven’t made the actual file yet):

DATA_FILES = ['SimpleXibDemo.xib']

Just be sure to name your XIB file accordingly when the time comes (that step is coming soon).

Now it’s time to fire up Xcode. I’m using Xcode 8 on macOS Sierra; things should be similar on earlier versions, but Xcode has evolved a bit over the years. I still have the Welcome widget enabled in Xcode, so the quickest way to get what I wanted was to “create a new Xcode project” and choose “Cross-platform” and “Empty”. We don’t need much in this project, but make sure to save it in the same folder where you created the application file and setup.py.

Next, create a new xib file for the app window. The menu path is File→New→File… and under macOS, in the User Interface category, pick Window. For a filename, type in SimpleXibDemo and Xcode will add the .xib extension. Next, add your .py file to the listing for Xcode: File→Add files to “Project”… and select your file. If you open this file in Xcode, you’ll see some decorators that let Xcode know how you will interface with it: @objc.IBAction indicates a function that you want to receive an action. You’ll also notice that the python script instantiates an outlet for the interface and calls it counterTextField (counterTextField = objc.IBOutlet()). This will be the recipient of actions within the script.

To lay out the GUI, drag three buttons onto the window from the Object Library (lower right side by default). You can change the text that appears on the button by double clicking.

Next, we’ll associate the quit button with the terminate action. Ctrl-drag from the button over to the First Responder. In the pulldown, choose terminate; this will associate the quit button with ending the program.

Next, we have to associate the window with the python class that it represents, and link the buttons to the appropriate IBActions. Note that it is important that the class specified in the File’s Owner matches the python class in the .py file. In the code above, that is SimpleXibDemoController, so click on the top icon to the left of the workspace (File’s Owner) and then choose the Identity Inspector (likely third from left in the right-side pane). Enter the class name in the top field (Class). This associates the window with the class that will handle actions. Finally, Ctrl-drag from the “-” button to the File’s Owner and select the decrement action. Do the same from the “+” button but pick the increment action. This connects the two buttons to the corresponding class functions.



Finally (as shown below), add a label to the center of the window and ctrl-drag from the File’s Owner to the label. In the pulldown, choose counterTextField (which should be the only option). This links the label in the GUI to the counterTextField in the python code.


Save the .xib file and we’re ready to build the app. From the command line, run:

python setup.py py2app -A

You can then run the resulting app from the command line out of the ./dist/ folder.

Note: it may not work just yet. I had to change the setup.py file to point to the right python executable (this comes from using a local anaconda python instead of the built-in Framework python).

"""
This is a setup.py script generated by py2applet

Usage:
    python setup.py py2app
"""

from setuptools import setup

APP = ['']
DATA_FILES = ['SimpleXibDemo.xib']

OPTIONS = {'argv_emulation': True,
           'plist': {
               'PyRuntimeLocations': [
                   '/Users/<username>/anaconda/lib/libpython3.6m.dylib'
               ]
           }}

setup(
    app=APP,
    data_files=DATA_FILES,
    options={'py2app': OPTIONS},
    setup_requires=['py2app'],
)

Where I added the plist to the OPTIONS variable and made sure to provide the path to libpython3.6m.dylib (select the path to the dylib that matches your runtime python version). If you use anaconda, then it should be similar to the path above though your username is probably different.
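If you aren’t sure where your libpython lives, you can ask the interpreter itself for its library directory; this is just a convenience for locating the dylib, not part of the build:

```python
# Ask the running interpreter where its library directory is; look in
# this directory for the libpython*.dylib matching your python version.
import sysconfig

libdir = sysconfig.get_config_var('LIBDIR')
print(libdir)
```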

With this change, run
python setup.py py2app -A
once more, and then try the app again:

I’m pretty new to all this, but feel free to let me know if you have trouble getting it to work and I’ll do my best to help.

Teaching Quantum Mechanics with Python

I’m excited to announce that I’ll be giving a talk at PyCon 2017, on May 18 in Portland OR. The talk is based on a set of Jupyter Notebooks that I’ve developed over the past two years for use in my quantum mechanics class. As the talk comes together, and as I clean up and document the resources, I plan to write a series of posts describing how I use the libraries (like QuTiP) that make these lessons possible, the process I used to revise these resources, and the things I learned along the way about teaching students python and teaching students with python.
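As a taste of the kind of exercise in those notebooks, here is a minimal two-level-atom calculation (Rabi oscillations under a resonant drive). The class materials use QuTiP; this sketch uses plain numpy so it stands alone, and the Hamiltonian is just the textbook resonant-drive form.

```python
import numpy as np

# Rabi oscillation of a resonantly driven two-level atom,
# in plain numpy (the class notebooks use QuTiP for this).
hbar = 1.0
Omega = 2 * np.pi  # Rabi frequency: one full cycle per unit time
H = (hbar * Omega / 2) * np.array([[0, 1], [1, 0]], dtype=complex)

psi0 = np.array([1, 0], dtype=complex)  # start in the ground state
evals, evecs = np.linalg.eigh(H)

probs = []
for t in np.linspace(0, 1, 5):
    # Time-evolution operator U = exp(-i H t / hbar) via eigendecomposition
    U = evecs @ np.diag(np.exp(-1j * evals * t / hbar)) @ evecs.conj().T
    psi = U @ psi0
    p_excited = abs(psi[1]) ** 2  # analytically sin^2(Omega t / 2)
    probs.append(p_excited)
    print(f"t = {t:.2f}  P(excited) = {p_excited:.3f}")
```

The excited-state population sweeps from 0 up to 1 and back over one Rabi cycle, matching the analytic sin²(Ωt/2) result.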

Printing on Shims

An emergency hack saves a doomed print job.

I’ve been 3d printing since late 2010 when I built my first kit. One thing I learned early on is that overhangs are tricky in printed objects. Generally that means you either design an object to minimize overhangs or you print with support. Usually you can pick an orientation and part design that work well together and give good results. Sometimes you need to print support. Today, I ran into another issue. I’ve been printing many items for student projects in my electronics class and got a bit casual about sending files to the printer without looking too closely. I had a full print bed worth of parts running when I realized one part was designed with major overhangs: essentially a flat plate with mounting lugs extending up and down from it. The print was already 1/3 of the way through and I didn’t want to kill the job since most of the print would be fine… but I knew that this part of the print would fail. Staring down this impending problem, I figured I’d try a hack and at least see if I could salvage the print job.

I looked through my gcode in octoprint to see where the overhang would kick in (layer 13, it turns out), then grabbed enough index cards to make a stack about 13 × 0.25 mm (≈3.25 mm) high and started cutting. When I had a reasonable set of cards ready to go, I waited for layer 12 and paused the print. I stacked the cards and taped them down with kapton tape. Based on feel, the layer height wasn’t actually 0.25 mm, so I pulled a few cards off the stack until it felt as tall as the existing print. The results are certainly better than if there wasn’t any support, and I’m actually surprised it worked as well as it did. Surface quality is about as good as it is with support: not as nice as it would be if the surface were more even, but I had to have a way to hold the cards in place, so the tape strips show up a bit. In the future, I’d just lay down wide strips of masking tape (i.e. blue tape), since I like the finish it gives and I know PLA sticks to it.
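The shim arithmetic above is trivial but worth writing down; a sketch assuming 0.25 mm cards (real cards vary, which is why the stack still needed tuning by feel):

```python
# How many index cards approximate the height of the layers already printed?
# The 0.25 mm card thickness is an assumption; measure yours with calipers.
def cards_needed(layers, layer_height_mm, card_thickness_mm=0.25):
    target_height_mm = layers * layer_height_mm
    return round(target_height_mm / card_thickness_mm)

print(cards_needed(13, 0.25))  # 13 cards when card thickness matches layer height
print(cards_needed(13, 0.20))  # 10 cards for 0.20 mm layers
```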

An interesting note is that the cards definitely change the heat properties of the bed but that doesn’t seem to have changed the outcome much. I was worried about printing on a cold surface instead of the heated bed but that seems to be an unfounded concern. I suspect ABS may be more picky about this, but the PLA didn’t show any warping.


Shims in place. The printer was paused at this point.


Finished print on the bed. Looks good so far.


Part printed on shims shows minor surface defects.

Overall, this definitely worked as a rescue mission. The easiest approach is to avoid the issue with careful design choices. However, some parts need to break the no overhangs rule. And for those parts, shims may be a solution.

Video of the print in action:

Update: After posting this I found Xiang Chen’s work on print-over and other augmented printing techniques. Very exciting, and I’m going to have to start playing with some possibilities along those lines.

Inkscape clipboard fix on mac

Inkscape is my go-to vector editing program, and I’ve done many publication figures, exam questions, and other general work in inkscape on both mac and linux. I’d always just resigned myself to using duplicate (Ctrl-D) instead of copy/paste, since on the mac the copy/paste cycle results in a pixelated image being pasted into the document. After beating my head against a wall trying to create a pattern-on-path effect, I realized that there must be something wrong with the clipboard implementation on the mac. Sure enough, a quick search took me to the inkscape FAQ, and this section in particular:

Starting with XQuartz 2.3.2, X11 has some functionality to exchange the content of the clipboard with OS X. It currently does not know how to deal with vector images, so it just captures the screen, i.e., creates a bitmap copy, and then pastes that. You need to deactivate this functionality in X11 preferences > Pasteboard: uncheck “Update Pasteboard when CLIPBOARD changes”. However, this will also prevent copying text from any X11 application to Mac OS X ones. It will not prevent copying text from OS X to X11.

When you just want to make a copy of an object within Inkscape, you can also use duplicate (Ctrl-D) rather than copy/paste (Ctrl-C/Ctrl-V) — Duplicate does not interact with the X11/OSX clipboards. For other Inkscape commands involving the system clipboards (e.g. Paste Style, Paste Size, or Paste Path in path effects) there is no alternative workaround other than changing the X11/XQuartz preferences as described above.

On the bright side, I had found the suggested workaround, and was able to effectively duplicate items in my drawings. However, that doesn’t work when the effects expect paths to be in the clipboard (and they weren’t there). Since inkscape is about the only thing I use X11 for on the mac, I went ahead and disabled the X11-to-mac clipboard sync. I will dive into the more nuanced solutions if I need to. It’s been a blessing to have access to many of the path effects and other cool tools in recent inkscape versions.

NSF Grant awarded to Photonics and Quantum Optics Lab at Pacific

I’m pleased to announce that my research group has been awarded a second NSF RUI grant to further support our research. The RUI (research at undergraduate institutions) program specifies resources for scientific research at colleges like Pacific and is a valuable funding mechanism for science research at smaller colleges. I feel very fortunate to continue to offer summer research opportunities to undergraduates for at least the next three years. Below is the public abstract that is posted on the NSF website.

As electronic devices reach their maximum processing speeds, the demand for high speed internet communications and data networks will require new technologies for storing and processing large amounts of data. Electronics are built on the use of the electron to carry and process information and in an analogous way the field of photonics is developing devices that use particles of light called photons to carry and process information. Individual photons obey the laws of quantum mechanics, so in order to fully understand the operation of photonic devices, quantum measurements must be performed on these new devices. One particularly essential component is a memory or information storage device. Many candidates for photonic memory exist but few have been characterized at the quantum (few-photon) level. This research program will apply new techniques for measuring the quantum properties of light to a variety of photonic memory devices. The result will be a deeper understanding of device operation that will lead to optimized devices for future applications.
Photonic memory devices have been demonstrated using slow and stored-light protocols based on electromagnetically induced transparency (EIT) in Rubidium. The goal of this program is to measure the quantum state of light retrieved from several implementations of these devices in both warm and cold Rubidium vapor samples. The light stored and retrieved from such systems will be measured and analyzed using a highly efficient array of low-noise photodetectors. This technique can simultaneously measure multiple optical modes and will be used to correlate multiple modes and determine which modes (or combinations of modes) are most robust under different storage conditions. A full quantum-mechanical understanding of the optical signal retrieved from memory allows complete characterization of the device performance and will inform future work in the development of photonic memory devices.