Using query params, we can instruct Scene7 to construct a dynamic image for us. The example above creates a single composition from two images; you can see parameters like background color, opacity, image source, and size.
In the example above I’m making the middle layer slightly smaller to maintain a safe area/margin. Scene7 scales from the center by default, but there is also an anchor point option if needed.
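The kind of URL behind a composition like this can be assembled with a few lines of JavaScript. This is only a sketch: the host, catalog, and image IDs below are placeholders, and the parameter names (bgc, wid, layer, src, opac, size) follow Scene7’s Image Serving query syntax, with the values chosen to illustrate the slightly-smaller middle layer.

```javascript
// Build a Scene7 Image Serving URL for a layered composition.
// Scene7 relies on repeated, ordered keys (layer=1 followed by that
// layer's src/opac/size), so we join ordered pairs rather than using
// URLSearchParams, which would not preserve duplicates meaningfully.
function scene7Url(base, params) {
  var query = params.map(function (p) {
    return p[0] + "=" + p[1]; // values assumed already URL-safe
  }).join("&");
  return base + "?" + query;
}

// Placeholder host and asset IDs, not real assets.
var url = scene7Url("https://example.scene7.com/is/image/MyCompany/base", [
  ["bgc", "FFFFFF"],           // background color
  ["wid", "1000"],             // overall width
  ["layer", "1"],              // start defining layer 1
  ["src", "MyCompany/flower"], // layer image source
  ["opac", "80"],              // layer opacity (0-100)
  ["size", "900,900"]          // slightly smaller than the canvas: safe margin
]);
```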
Use the resource below to get started. Some key callouts I learned in development:
add a debug refresh button to reload your extension’s content when updating HTML/JS files
when making changes to the .jsx file, disable and re-enable the extension to see changes reflected
use defaults write com.adobe.CSXS.8 PlayerDebugMode 1 to side-load an unsigned extension (make sure you restart, log out, or kill the process for changes to take effect)
put your extension dir in /Users/$user/Library/Application Support/Adobe/CEP/extensions/ – if this path doesn’t exist, you can create it
watch out for ruler units – ensure your logic considers the user’s ruler units before doing arithmetic. Some sample code I’ve seen and have used suggests doing:
var origRulerUnitPrefs = app.preferences.rulerUnits; // save original
app.preferences.rulerUnits = Units.PIXELS; // set
// ...do your pixel-based arithmetic here...
app.preferences.rulerUnits = origRulerUnitPrefs; // restore original
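Since the restore step is easy to forget, a small wrapper makes the pattern hard to get wrong. This is a sketch with a made-up name: `withRulerUnits` takes the preferences object as a parameter (in Photoshop you would pass app.preferences and Units.PIXELS), and a try/finally guarantees the original units come back even if the measurement code throws.

```javascript
// Run fn with prefs.rulerUnits temporarily set to `units`, restoring
// the original value afterward, even if fn throws. `prefs` stands in
// for app.preferences so the pattern can run outside ExtendScript.
function withRulerUnits(prefs, units, fn) {
  var orig = prefs.rulerUnits; // save original
  prefs.rulerUnits = units;    // set
  try {
    return fn();               // do the pixel-based arithmetic
  } finally {
    prefs.rulerUnits = orig;   // restore original, no matter what
  }
}
```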
I’ve been wanting to experiment with epoxy resin for a while, and I finally found some wildflowers from the local forest preserve that I run in frequently. They made the perfect subject for a balanced and visually stunning composition.
I also wanted to add a technical touch, so I embedded an NFC chip in the bottom layer of the epoxy. See the video below!
Fresh off of building PTSExplorer.com, I had some inquiries about building an adaptation for a kiosk or exhibition. I was excited by the challenge, and below is what I built in a night to show my vision of what that could look like.
The iPad app uses UIScreen’s didConnectNotification to detect when a secondary screen is connected via HDMI or AirPlay. Once a second screen is detected at runtime, I programmatically create another window with a root view controller from a storyboard.
One of my favorite things about the model train hobby is getting to combine my love for hardware and code. One day I was playing with a crossing system I bought from a big retailer and quickly became frustrated with its shortcomings. I had an “aha” moment when I realized I could create my own pretty easily.
The video below describes in detail why I decided to build my own, and how I went about doing it. I go over different “activation methods” and why I chose to go with sonar over voltage or light.
One additional thing I did was add a BLE chip so I could control the whole system from my smartphone (luckily I know a good iOS developer to build the app).
Crossing Gates, Turnout, and Signal Bridge
Ultrasonic sensors (HC-SR04)
NJI Crossing Gates (NJI 1164) & NJI dwarf
Adafruit Bluefruit LE UART Friend – Bluetooth Low Energy (BLE)
One day a few of my peers on the creative team came to me with a request/challenge – could I build a Sketch plugin that streamlines keeping image assets up-to-date in their comps? After a few iterations, I believe I settled on a solution for them; see the [narrated] video below for details.
assign local/remote URLs to image layers
data is saved at the document level and persists (see the Sketch file format for more info about this)
upon menu item click, or document open, grab the layer objects via saved layer IDs and update their image data with the associated URLs
no need to iterate through layer hierarchies (slow/wasteful), no need to rename layers, no need to keep directories in a specific structure
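The update step from the list above can be sketched as a plain function. Everything here is a stand-in: `doc` abstracts the document object (only a lookup-by-layer-ID call is assumed, mirroring Sketch’s API), `assignments` is the saved layer-ID-to-URL map, and `fetchImage` is whatever loads the image data for a local or remote URL. The real plugin would persist `assignments` with Sketch’s document-level settings.

```javascript
// Update image layers by saved ID: no tree walking, no name matching,
// no directory conventions. `doc.getLayerWithID`, `assignments`, and
// `fetchImage` are abstractions standing in for the plugin's real APIs.
function updateAssignedLayers(doc, assignments, fetchImage) {
  var updated = [];
  var missing = [];
  Object.keys(assignments).forEach(function (layerID) {
    var layer = doc.getLayerWithID(layerID);
    if (!layer) {
      missing.push(layerID); // layer was deleted since assignment
      return;
    }
    layer.image = fetchImage(assignments[layerID]); // swap in fresh image data
    updated.push(layerID);
  });
  return { updated: updated, missing: missing };
}
```

Returning the missing IDs makes it cheap to prune stale assignments from the saved document data on the next run.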