In a previous post, I played with code and created a Particle function that made LEDs flicker like a candle whenever the function was called. While experimenting, I had the idea of creating a flickering paper circuit fireplace, or a candle, that might be triggered by data connected to a light sensor. Ultimately, I wanted to figure out a way to do this as a step toward creating a wifi-connected book containing artwork brought to life with real-time data.
1. ThingSpeak channel documenting light sensor readings
2. Photon connected to a paper circuit
Unlike the Goofy Photon Servo Notifier, which used the Particle.subscribe feature (linked to data in my Google Calendar), the function illustrated below is triggered via a webhook that I set up in ThingSpeak, which calls the function only if a sensor on another Photon reports a low light level.
PROOF OF CONCEPT (Video demo)
I'm making progress toward my goal of creating a wifi-connected book that uses data to help tell a story. As a proof of concept, I created a ThingHTTP app and a "React" app in ThingSpeak to trigger a Particle function (on one Photon) based upon readings from a photocell (attached to a second Photon). To get started, I created a ThingSpeak channel and a Particle webhook by following this tutorial.
Once I created the webhook, my Photon started logging the photocell data on ThingSpeak. To make the data more useful, I modified the Photon code in the tutorial by adding the following lines to the loop, just above the line containing the Write API key for the ThingSpeak channel.
value = map(value, 0, 4096, 0, 255); // maps values to a 0-255 range
value = constrain(value, 0, 255); // constrains values between 0 and 255
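For readers who haven't used these Wiring helpers before, here's a minimal plain-C++ sketch of what the two calls do. These are hand-rolled stand-ins with my own function names, not the Particle firmware itself, but the math is the same:

```cpp
// Hand-rolled equivalents of the Wiring map()/constrain() helpers,
// shown here so the scaling math is explicit.
long mapRange(long x, long inMin, long inMax, long outMin, long outMax) {
    // Linear interpolation: integer math, so results truncate toward zero.
    return (x - inMin) * (outMax - outMin) / (inMax - inMin) + outMin;
}

long clampRange(long x, long lo, long hi) {
    if (x < lo) return lo;
    if (x > hi) return hi;
    return x;
}

// A raw 12-bit photocell reading is rescaled to a 0-255 value before upload:
long scaleReading(long raw) {
    long v = mapRange(raw, 0, 4096, 0, 255);
    return clampRange(v, 0, 255);
}
```

The constrain step matters because a stray reading outside the expected range would otherwise produce a value that doesn't fit in a 0-255 byte.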
Next, I uploaded this code to a second Photon, connecting the Photon to a paper circuit using alligator clips. Lastly, I created the ThingHTTP and React apps.
The React app allowed me to set up a trigger related to the data. I set it up so that a reaction happens any time the sensor picks up a value of 150 or greater (when it's cloudy or dark and the resistance increases). The ThingHTTP app allowed me to post an HTTP request to Particle, triggering the Particle function that illuminates the owl's eyes and the candle flame.
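To keep the two apps straight, it helps me to see the React rule as code. This is just a plain-C++ restatement of the trigger condition (the 150 cutoff is the setting described above; the function name is mine):

```cpp
// Sketch of the trigger condition the React app applies on ThingSpeak's side.
// With the photocell wired as a voltage divider, less light means more
// resistance and a higher mapped reading, so "dark" shows up as a high value.
const int kDarkThreshold = 150;   // the cutoff set in the React app

bool shouldTriggerCandle(int mappedReading) {
    // The ThingHTTP request (and thus the Particle function) fires
    // only when this returns true.
    return mappedReading >= kDarkThreshold;
}
```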
As my previous posts have explained, I'm in the process of learning about the Photon in hopes of creating a wifi-connected book that will use paper circuitry triggered by data to help tell a story. This won't be completed until the spring. Since we were required to create some sort of project and video for this class, I took a detour from the work I'm most interested in to focus on another application of the Photon.
Rather than creating a formal lesson plan, I created an instructional video and a functioning model of a project that might be adapted for use in a classroom or other venue.
Before I launch into my final project, here's a teaser of what the circuit in my wifi-connected book might look like. This is only a prototype to help me think about where the leads need to go, as I begin to plan my book's layout.
I'll need to overlay the copper tape with conductive fabric in areas where the circuit will continue between the pages of the book. I'll also need to determine precisely where I'll be poking the holes that I'll be sewing through, to bind the signatures between the covers. My first book model will most likely be constructed from binder's board. Once I've received some feedback and determined that everything works the way I want it to, my hope is to create wooden covers with recesses engraved with a laser cutter or CNC router.
FINAL PROJECT:
While thinking about how I might be able to use a servo in my book, I hunkered down and created a gadget that I'm calling the Goofy Photon Servo Notifier. I've designed a basic paper pop-up book structure that folds flat once the servo is removed.
The pop-up structure folds flat.
In theory, I could add something similar to the inside of the pages of the wifi-connected book I'll eventually make and a reader could pop the servo in.
VIDEO DEMO
QUICK OVERVIEW:
Taking inspiration from this "Study Budy" project, I started experimenting with the Particle.subscribe feature. The basic premise of the project is to have a new event on a Google Calendar trigger a servo connected to a Photon, in order to alert a student that there is something they should be working on.
Basically, Particle.subscribe is a piece of code that tells a Photon to listen for a named event; giving the event a unique name reduces the likelihood of other people subscribing to your event (and triggering your device). When the Photon receives word that the event is taking place (a new task added to your Google Calendar, for example), it carries out a function referred to as an "event handler," which uses the incoming data to trigger an action on a physical object. (Note: In my earlier servo exploration, I used a different method to trigger a servo.)
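The publish/subscribe pattern can be illustrated outside of Particle firmware. The toy class below is a plain-C++ stand-in of my own, not the Particle API: handlers are registered under an event name, and "publishing" that event calls every matching handler with the event's data, just as Particle.subscribe dispatches to an event handler:

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// A toy event bus that mimics the shape of Particle.subscribe:
// register a named handler, then dispatch data to it when the event fires.
class EventBus {
public:
    using Handler = std::function<void(const std::string& data)>;

    void subscribe(const std::string& eventName, Handler handler) {
        handlers_[eventName].push_back(handler);
    }

    // Returns how many handlers ran for this event.
    int publish(const std::string& eventName, const std::string& data) {
        int called = 0;
        for (auto& h : handlers_[eventName]) {
            h(data);   // the "event handler" acts on the incoming data
            ++called;
        }
        return called;
    }

private:
    std::map<std::string, std::vector<Handler>> handlers_;
};
```

Subscribing to "New_Event" and then publishing "New_Event" runs the handler; publishing any other name does nothing, which is why a unique event name keeps strangers from wiggling your servo.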
To start, I logged into my If This Then That (IFTTT) account. I created a new Applet, selecting Google Calendar for the IF THIS trigger. I selected the first box depicted below, which reads "Any event starts".
I selected "Particle" for the THEN THAT action.
The trickiest part was figuring out what to put into the configuration boxes. I named my event something rather unoriginal: New_Event. I will be changing this name in my Applet and in my code to make it more unique.
I left the box below this (for optional data) empty. After saving the Applet, I uploaded my code and added a new event to my Google calendar to test it out.
CODE
While the "Study Budy" project got me going in the right direction, helping me to explore a new way to trigger events on my Photon, the code I wrote for my project was completely different.
For starters, my code requires only one Photon. My code also periodically detaches the servo to reduce jitter, and resets the servo to a zero position after 45 seconds. If a user didn't want the servo to reset automatically, they could comment out the code and manually press the reset button. Here is a screenshot of the code I wrote. Here's the link to my code.
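The detach-and-reset behavior can be sketched as a small timing function. This is plain C++ with names of my own, not the actual sketch: after an event fires, the servo waves until a timeout, then returns to zero and detaches so it stops receiving the PWM signal that causes jitter:

```cpp
// Sketch of the timing logic described above: after 45 seconds the servo
// is commanded back to zero and detached (a detached servo gets no PWM,
// so it can't jitter). Names and the wave pattern are placeholders of mine.
const unsigned long kResetAfterMs = 45000;  // reset to 0 degrees after 45 s

struct ServoPlan {
    bool attached;  // whether the servo should currently be attached
    int  angle;     // commanded position in degrees
};

ServoPlan planAt(unsigned long msSinceEvent) {
    if (msSinceEvent >= kResetAfterMs) {
        return {false, 0};                  // reset to zero and detach
    }
    // While active, swing between 0 and 180 degrees once per second.
    bool forward = (msSinceEvent / 1000) % 2 == 0;
    return {true, forward ? 180 : 0};
}
```

Commenting out the timeout branch would leave the servo attached until a manual reset, which is the alternative behavior mentioned above.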
RESOURCES TO LEARN MORE:
One juicy resource that I found this week was Daragh Burn's IoT course outline, a plan for an Internet of Things class that will be offered by Carnegie Mellon starting in January 2017. It was there that I found a useful explanation of the differences between Particle.publish and Particle.subscribe that I'd only just begun to understand.
My next step is to log in to the Particle-Raspberry Pi beta in order to see whether I can get my Photon to interact with data pulled from a Raspberry Pi. I'm not certain that learning how to do this will actually help me create my physical book, but it will help me continue my investigation into the Internet of Things.
Here is a link to the standards I've selected for my experimental bookbinding project. My ultimate goal is to create a wifi-connected book, using a Photon microcontroller, that I'll begin constructing in January.
This week, I continued to experiment with code by adding a function that makes an LED flicker like a candle or fireplace. This function will only be called when a light sensor registers that it's cloudy or dark outside.
Here are a couple of screenshots of the Arduino code that I adapted for use on my Photon. The thing I'm most proud of is that I figured out how to apply a for loop in the Particle function I created, so that an LED flickers for a reasonable amount of time (enough to get the point across) without flickering non-stop. I've since merged the code shown below into the larger program that I've been working on.
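The idea behind the bounded flicker can be sketched in plain C++. A for loop runs a fixed number of cycles, each producing a random brightness inside a candle-like band; the names, seed, and brightness range here are my own illustration, not the code in the screenshots:

```cpp
#include <cstdlib>
#include <vector>

// Sketch of a bounded candle flicker: a for loop generates a fixed number of
// random brightness levels instead of flickering forever. On the Photon each
// value would go to analogWrite(); here they're collected so the idea is clear.
std::vector<int> flickerLevels(int cycles, unsigned seed) {
    std::srand(seed);                        // seeded for repeatability
    std::vector<int> levels;
    for (int i = 0; i < cycles; ++i) {       // bounded, so the effect ends
        int brightness = 120 + std::rand() % 136;  // stay in 120-255: a warm glow
        levels.push_back(brightness);
    }
    return levels;
}
```

Because the loop has a fixed cycle count, the flicker lasts long enough to notice and then stops, rather than running until the next reset.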
I've also been experimenting with conductive fabrics and tapes this week, to find a material that might be useful for a book hinge. I had some mixed results, and I'm not entirely certain that I'll be able to solder copper tape to the fabric without scorching it.
Lastly, I've begun experimenting with Inkscape, in an attempt to start brainstorming what my wooden covers might end up looking like.
The following image was my first attempt, but I've already discovered some problems with the design.
This week, I've continued my exploration of the Photon, as I prepare to implement my Capstone project next trimester. I am happy to report that I have a working prototype that merges all of the functions that I've been working on and writing about in prior posts. I'm not finished by any stretch of the imagination, but I have proven to myself that the concept in my mind can work!
Merging all of the functions into one program took considerable time, experimentation, and troubleshooting. After several failures, and many hours of work, I took a paper copy of my code to a cafe and attempted to debug it. Surprisingly, this worked, even though I wasn't sitting anywhere near a computer. Being away from the computer helped me get clearer about what I was trying to do and home in on potential problem areas.
Debugging in a Cafe
My next step is to work on finding a smaller servo that will work with my Photon.
This week I spent time experimenting with code and playing around with If This Then That (IFTTT), which recently changed its set-up. My goal was to start merging programs on my Photon. So far, I've got a servo and NeoPixels running in one program, but I'd still like to add in code for a buzzer and LEDs that will be triggered by a light sensor. (This is a cross-post from my personal blog.)
My most exciting breakthrough was figuring out how to use IFTTT's "New Tweet From Search" feature, which makes it possible to trigger a web request by filtering a search in Twitter. In the case of my experiment, I created Applets (formerly referred to by IFTTT as recipes) that can control the colors of NeoPixels connected to my Photon, in much the same way that CheerLights work!
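The core of the idea is mapping a color keyword found in a tweet to an RGB triple for the NeoPixels. Here's a plain-C++ sketch of that lookup; the color list is a small sample of my own, not the full CheerLights palette, and the function name is hypothetical:

```cpp
#include <array>
#include <string>
#include <utility>

// Sketch of the idea behind the Applets: scan a tweet's text for a color
// keyword and turn it into an RGB triple for the NeoPixels.
struct Rgb { int r, g, b; };

bool tweetToColor(const std::string& tweet, Rgb& out) {
    static const std::array<std::pair<std::string, Rgb>, 4> kColors{{
        {"red",    {255, 0, 0}},
        {"green",  {0, 255, 0}},
        {"blue",   {0, 0, 255}},
        {"purple", {128, 0, 128}},
    }};
    for (const auto& c : kColors) {
        if (tweet.find(c.first) != std::string::npos) {
            out = c.second;
            return true;      // first matching keyword wins
        }
    }
    return false;             // no recognized color in the tweet
}
```

On the Photon, a matching tweet would end with the triple being written to each pixel; a tweet with no recognized color would simply be ignored.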
This could provide an interesting way to interact with a wifi-connected book. A reader could send a tweet to change the color of LEDs in the book, or scan a QR code to achieve the same effect by triggering a Maker Event (also set up in IFTTT). While I'd already figured out how to do this with my own tweets, I now know how to allow other people's tweets to interact with my Photon. My next step is to add music to this program and call that function in response to data, such as readings from a light sensor or motion detector.
This week, another open exploration week, I unboxed my BBC Microbit and played around with some of the coding exercises for Python. I really like the way that I can practice different coding languages, such as JavaScript and Python, with one device! My favorite lesson so far was this one, where I was able to connect a speaker to my Microbit with alligator clips and play music. I attempted to complete the Dalek Poetry activity, but I couldn't get it to work.
Paper Circuit Triggered Microbit
I also couldn't get the Bluetooth feature to work on my device (I just learned that it's not compatible with Python yet), so the process of uploading the code was a bit cumbersome.
Instead of using the Microsoft iPad app as I'd planned, I ended up having to drag and drop the code from the downloads folder on my PC onto the device, as if it were a flash drive. Aside from that minor annoyance, it was a fun way to get instant gratification while playing with code.
I found an animated book making project that allowed me to tweak some code to create different animations on my Microbit that were triggered by copper traces in a book touching different pin configurations. It works in principle, as you can see in the video below.
Here are a couple of screenshots.
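The logic of the animated book boils down to a lookup: each copper trace closes a different pin, and the combination of touched pins selects an animation. The sketch below shows that lookup in C++ for consistency with the Photon snippets in these posts (the actual project used Python on the Microbit, and the pin names and animation labels here are placeholders of my own):

```cpp
#include <string>

// Sketch of the lookup behind the animated book: each copper trace closes a
// different pin, and the combination of closed pins picks an animation.
std::string animationFor(bool pin0, bool pin1, bool pin2) {
    if (pin0 && pin1) return "heart-beat";  // two traces touched together
    if (pin0)         return "owl-blink";
    if (pin1)         return "candle";
    if (pin2)         return "stars";
    return "idle";                          // no trace touched
}
```

Ordering matters: the combined-pin case has to be checked before the single-pin cases, or touching two traces would fall through to the first single-pin match.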
I'm running a little short on time this week, so this is where my exploration needs to stop for now. Next time I play with this, I'm going to check out a few of these lessons. I think that everyone learning the basics of coding should have a Microbit, because you can immediately practice physical computing.
This was "Open Project Week," where we were invited to work on a project of our choice, as long as it relates to the course syllabus. I continued my exploration of the Photon, which is something that I started prior to this class. Continuing my independent inquiry supports this course's learning objective of setting "personal and professional goals for increasing [my] coding confidence."
Since last week, when I got a servo up and running on my Photon, I've had a couple of major breakthroughs. In particular, I've figured out how to trigger a web request via a QR code. What that means is that I can now use a QR code to trigger a specific function in my code to "control physical objects," which is another learning objective of this course. In this case, I can control a servo and a buzzer. I'm continuing to build upon my understanding of Particle functions.
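One cloud function can drive both devices because the web request behind the QR code carries an argument string, and the handler parses it to decide what to do. Here's a plain-C++ sketch of that dispatch; the command format ("servo:90", "buzzer") and return values are invented for illustration, not my actual code:

```cpp
#include <string>

// Sketch of how one cloud function can drive two devices: the QR code's web
// request carries an argument string, and the handler parses it to pick the
// action. Returns the servo angle commanded, 1 for the buzzer, -1 if unknown.
int handleCommand(const std::string& args) {
    const std::string servoPrefix = "servo:";
    if (args.compare(0, servoPrefix.size(), servoPrefix) == 0) {
        // e.g. "servo:90" -> move the servo to 90 degrees
        return std::stoi(args.substr(servoPrefix.size()));
    }
    if (args == "buzzer") {
        return 1;   // the real handler would sound the buzzer here
    }
    return -1;      // unrecognized command
}
```

On a Photon this function would be registered with the cloud so that the QR code's web request reaches it by name, with the string after the colon arriving as the function's argument.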
I've embedded a couple of videos here, but if you are interested in learning more, please visit my personal blog (linked below).