Check the downloads section for a new downloadable pdf file that covers the process of creating a linearized calibration workflow for positives used in making polymer photogravure plates. This new piece of software from BWMastery.com allows the use of QTR to create perfectly calibrated positives from which to make polymer intaglio plates.
Here is a direct link to that pdf:
Normally, this would be a download link, but something is munged up with my download code and pdf files. I don’t have time to figure it out right now, but it is easy enough to click the link to the pdf and then save it from your browser window if a printed or saved version is wanted.
Panic Inc. has a well-deserved reputation for creating a very speedy, robust and easy-to-use ftp client app for the Mac. I have used the Transmit app for many years and it is everything a Mac app should be: intuitive, elegant, and fast.
Recently, Panic released an iOS version of Transmit that can be used on iPhones and iPads. This is a great thing for users who want to easily add content or edit a website while away from their primary ‘serious’ computer. As evidenced by some of my previous posts, I am now hosting my website on an Amazon EC2 instance, and getting sFTP service up and running can be a little bit of a twitchy process. The following is a very brief outline of the approach to take to enable the Transmit iOS app to access an Amazon EC2-based website.
This step is optional, but I assume that most users who are interested in downloading the Transmit iOS app will likely have the OSX version running on their computer.
Start the Transmit app, and click on Favorites -> Export… in the menu. You will be presented with an export dialog box. Create a folder in your Dropbox account, name it something like ‘EC2’, and then save the TransmitFavorites.exportedFavorites file to this folder.
On your iOS device, open the Dropbox app, navigate to the file you just saved, and tap the ‘upload’ icon. A window will pop up with some options; choose ‘Open in…’. Another window will appear; in this one, tap on the Transmit icon. The iOS device will then switch to Transmit, where you should select the file that appears and tap on the ‘Import…’ selection. Transmit will then ask if you want to import Favorites, and you should respond affirmatively.
This is the part that feels a little more involved to me. Essentially, the .pem private key for your EC2 instance (it is an SSH key, not an SSL certificate) needs to be available inside the Transmit app. The approach recommended on the Panic website is to use iTunes to transfer this file to the ‘keys’ area of the app.
Using a USB connector, connect the iOS device to your computer (iTunes should be running). The iOS device should appear in iTunes. Click on the device name at the top of the iTunes app window (iPad, iPhone or whatever), and then click on the Apps tab. Scroll down to the File Sharing section in the left-hand pane and locate the Transmit app. Select it, and then click the Add… button at the bottom right of the right-side pane. Navigate to the .pem file and import it into the Transmit app.
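Before relying on the iOS app, it can help to confirm the .pem key works at all by trying an SFTP connection from a desktop terminal. This is just a sketch: the key filename and hostname below are placeholders, and the login name varies by AMI (ec2-user for Amazon Linux, ubuntu for Ubuntu images).

```shell
# Placeholders: replace mykey.pem and the hostname with your own values.
chmod 600 mykey.pem   # sftp/ssh refuse to use private keys with loose permissions
sftp -i mykey.pem ec2-user@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
```

If this connects, the key file itself is good, and any remaining trouble is on the Transmit side.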
Note: This assumes you have not imported your EC2 server setup in Step 1. If you have, skip down to Step 3.
Set the remote path to /var/www/html if you have your site configured in a typical fashion. If the iOS app will be used for poking around on your server, for instance, editing unix configuration files, you may want to set the root path to
A quick note on a discovery I had this morning. I decided to bite the bullet and upgrade my old Quicken for Mac 2007 installation to the newish Quicken for Mac 2015. Paid for it, downloaded and installed it per the instructions. But as soon as I clicked on it, it would crash with an error message indicating that I did not have permission to create a folder in the Application Support directory.
Hmmm. This is strange. This is part of my user directory. Why the heck would I not have permission?
I’ll backtrack here. Early in the year, I migrated my whole system from a Retina MacBook Pro to my current MacPro 2015 desktop. I used Apple’s Migration Assistant to move all my applications and user files from the old system to the new. And I noticed that there were a handful of applications that no longer worked. One of them was CodeRunner, which is a great little program to try out little code snippets in a variety of programming languages. Anyhoo - it was annoying, but not something I had time to really track down and figure out.
So this morning’s experience with the new Quicken 2015 provided the impetus to figure out what the hell is going on.
The average Mac OSX user probably is not familiar with a lot of the directory structure of OSX. One confusing aspect of this is that there are multiple Library directories (folders). There is the system Library directory and the user Library folder. And the error message I was getting from Quicken seemed to indicate a problem with the user Library folder.
I opened a terminal window, and typed the following command:

cd ~/Library
The tilde (~) is a shorthand for the user’s home directory.
I then typed in
ls -ald App*
And to my surprise, this unix command listed the owner of my Application Support directory as wheel! The owner of this directory should have been my user id, not wheel.
The fix at this point was obvious - I needed to establish my user id as the correct owner of the Application Support directory.
I was already in the ~/Library directory, so all I had to do was use the unix chown command to change the owner of this directory to my user id.
sudo chown -R myuserid Application\ Support
And Quicken 2015 would finally launch. The side benefit to this was that now all my other balky apps began to work as well!
I guess the moral of this tale is that Migration Assistant can occasionally do some stupid things to ownerships and permissions during the migration process. If problems arise, the ~/Library directory is a good place to begin looking for problems.
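For anyone who wants to check for this kind of damage proactively, a single find invocation will flag anything in a directory that is not owned by the current user. The sketch below runs against a throwaway demo directory so it is self-contained; on a real Mac you would set DIR to ~/Library to audit things after a migration.

```shell
# Flag entries in DIR not owned by the current user.
# DIR here is a demo directory; substitute DIR=~/Library for a real audit.
DIR=$(mktemp -d)
touch "$DIR/owned-by-me"
suspects=$(find "$DIR" -mindepth 1 ! -user "$(id -un)")
if [ -z "$suspects" ]; then
  echo "ownership looks clean"
else
  echo "check these:"
  echo "$suspects"
fi
```

Anything the script lists is a candidate for the same chown fix described above.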
We had a very sudden and fairly violent thunderstorm pop up this evening right at sunset. It moved from the southwest, as they tend to do, and pretty quickly the violent gusts of wind and battering rain had moved on, leaving a fantastic hole in the clouds revealing the remains of the sunset just off in the west and northwest sky. I stood under the eaves of my deck, dialed in 2 stops of underexposure compensation on my camera and just waited for something interesting to happen. It did, the shutter lag did not lose the shot, and I got something I like quite a lot.
My beer did get warm during the wait, however. This falls under the category of ‘first world problem’.
I am still tweaking small things on the Amazon EC2 server that is hosting my site. One of the things that I did not do immediately is enable gzip compression of all the site data when it is served to a browser. What this does is compress all the files down before they are pushed across all those tubes that make up the internet, and the browser then decompresses the files on the other side. ⇒
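For what it is worth, if the site is served by Apache (an assumption on my part; nginx has an equivalent gzip module), enabling compression on a Debian/Ubuntu-style instance looks roughly like this, and curl can confirm it took effect:

```shell
# Assumes a Debian/Ubuntu-style Apache install; commands differ on Amazon Linux.
sudo a2enmod deflate            # enable the compression module
sudo systemctl restart apache2
# A compressed response should report "Content-Encoding: gzip":
curl -sI -H "Accept-Encoding: gzip" http://example.com/ | grep -i content-encoding
```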
And all of a sudden, the built-in apps Textedit and Preview began to crash whenever they were opened. ⇒
If you have had the patience to labor through all the material on this part of my website covering the subject of photo website creation, you will notice that the driving force behind all my modifications over the years has been ease of maintenance and my efforts to come up with methods to streamline the process of updating content, especially photo content. ⇒
My posts lately have tended toward the computer-ish side of things, and this one will be no different. It will be brief and concise, to wit:
I realize that is the third time you have read this in the last 14 seconds, but here is the story behind it:
Most of us photographer types have a lot of data that needs to be stored digitally. The days of ring binders filled with plastic negative and slide sleeves are coming to a close. I still shoot a lot of film, but in just my casual, non-professional digital shooting over the last seven years, I have accumulated about 1 terabyte of digital images on my computer! Naturally, I don’t want to lose any of these photographs. So what is the best strategy for protecting them?
I’ll start by admitting that my job involves using seismic data for oil and gas exploration. We have HUGE amounts of data and interpretation results that we need to access on high-powered graphical workstations. I have learned a few things the hard way over the last thirty years:
It seems like every day I see some offhand comment on a photography forum from someone who has just gotten a new RAID device (often a Drobo or Buffalo-type device) and can now breathe easy. Well, this is a misplaced sense of security. Here is the deal: a RAID device is only as good as the piece of hardware or software that controls it. A RAID uses multiple drives and distributes the data among them in a systematic way, but the hardware or software controller is like the map. Lose the map, and your data is gone, and likely can only be recovered by some very pricey specialists who make quite a good living by being able to reconstruct your map. It is hugely complicated, akin to reconstructing a wine glass that has had an encounter with a concrete floor.
“Oh”, you say, “But I have RAID-5, and I can have one disk go bad and it will rebuild the data if I lose a disk”. Yep, it could work that way. Or not. That assumes the failure point is a bad disk. What if something else fails? Say, for instance, that you have a hardware-based RAID-5 standalone box like the Mercury Elite QX-2, and have it configured in 3+1 mode, which means that it is RAID-5 and can reconstruct the data with the ‘hot spare’ if one of the primary disks goes down. What if the hardware/firmware inside the box goes bad? Where does that leave you? Up a foul-smelling estuary without means of locomotion. This isn’t a disk going bad, this is the hardware controller having a brain aneurysm and not having any recollection of where it left the car keys.
This happened to me about three weeks ago. No disk failures. Just a hardware failure at a level above the disks. Did I freak out? No, because now I don’t trust anything, and I had two spare copies of that data off-site and one live copy on-site that is cloned once a day. I just packed up the box and sent it back for a replacement. But the moral is: don’t trust a RAID as a backup. The only safety you have is many copies of your data in many different places. And for backup purposes, I would prefer two or three copies on high-capacity single drives over a RAID any day.
Another lesson I learned is not to rely on a piece of software to back up your data automatically. You need to have a method of confirming that the backup took place. I like to use Carbon Copy Cloner on my Mac, and I have it send me an email when a backup starts and when it finishes. I also have it use the Growl notification windows to tell me the same thing. If I get up in the morning and I don’t have matching pairs of Growl notifiers on my screen when I log in, I know there is a problem.
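The same trust-but-verify idea can be automated with a tiny script: if nothing under the backup destination has been touched recently, complain loudly. This is only a sketch; the directory and marker file below are stand-ins so it runs anywhere, and in practice you would point BACKUP_DIR at the real clone target and run the check from cron or launchd.

```shell
# Sketch: warn if nothing under the backup destination has changed recently.
# BACKUP_DIR and the marker file are stand-ins; point BACKUP_DIR at a real
# clone target in actual use.
BACKUP_DIR=$(mktemp -d)
touch "$BACKUP_DIR/last-clone-marker"   # demo data standing in for a fresh clone
MAX_AGE_MIN=$((24 * 60))                # complain after 24 hours of silence
recent=$(find "$BACKUP_DIR" -type f -mmin "-$MAX_AGE_MIN" | head -n 1)
if [ -n "$recent" ]; then
  echo "backup OK"
else
  echo "WARNING: no backup activity in the last 24 hours"
fi
```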
Trust, but verify, in short. I learned this the hard way last spring when my backup software failed to run for two weeks and then my system disk went bad. I was all smug that I had a live clone, and then horrified to find out that it was two weeks out of date because SuperDuper! hiccuped and stopped backing up. I hadn’t set all the notification routines in place that I have currently, and I had no clue there was a problem. I switched to Carbon Copy Cloner, and so far (fingers crossed) I have not had a problem.
And while I am on the subject of Carbon Copy Cloner, I want to emphasize that this is a replication method, not a true backup. All the software does is ensure that the files on one piece of hardware are duplicated onto another. If you have a major screwup on the parent copy, all the cloning does is to ensure that you have the same screwup written to your child copy. Backup software like Time Machine allows you to store incremental copies of your data, and if you decide you want to go back to the work you had a week ago, it will make that possible. A replication approach will merely allow you to go back to the most recent copied version, and that is it.
So here are a few pieces of advice:
I have been taking late-summer/early-fall hiking and climbing trips to the Wind River mountain range in Wyoming for the last fifteen years. These trips have ranged from week-long excursions focused on climbing to simple hike-in-and-set-up-camp trips of a few days with larger groups of people with widely varying levels of fitness. This year, I suggested to Doug, one of my very fit friends, that we try a slightly more aggressive loop hike that would cover almost fifty miles of trail over the most scenic spots in the southern Wind River range. ⇒
This group of photographers is part of that hard-core subgroup of photographers whose primary means of photographic expression involves the use of the wet-plate collodion process. ⇒
The html-based iframe tag apparently is one of those embarrassing things that no self-respecting professional code jockey would ever be caught using in a website design. But just as the chimpanzees at the zoo have no idea that they are part of the Hominidae family and go ahead and pleasure themselves in public anyway, I am not a professional code jockey, and any sense of shame is vastly overshadowed by the pleasure I get from a dirt-easy way to embed my Lightroom HTML galleries directly into a web page. ⇒
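For the curious, the embed itself is a one-liner. The src path below is just an example of where a Lightroom HTML gallery export might live after uploading:

```html
<!-- src is an example path; point it at the folder where the Lightroom
     gallery export was uploaded on your server -->
<iframe src="/galleries/summer-storms/index.html"
        width="800" height="600" frameborder="0" scrolling="no"></iframe>
```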
Safari, Chrome and Firefox are all browsers that have implemented many of the features of the upcoming HTML5 and CSS3 specifications. So you can do some very easy styling customizations for getting things like nifty Web 2.0 rounded corners and cool gradients for many of your html elements. No need to go to the trouble of making some slick mockup in photoshop and then slicing it into nine pieces so your shiny button will scale properly. Just add a custom style to your Thesis custom.css file. ⇒
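As a hedged example (the selector name is made up, and browsers of this era often needed the vendor-prefixed forms alongside the standard properties), a rounded, gradient-filled button in a Thesis custom.css might look like:

```css
/* Hypothetical selector; adapt to your own markup. */
.custom .shiny-button {
  border-radius: 8px;                                    /* rounded corners, no image slicing */
  background: -webkit-linear-gradient(#6db3f2, #1e69de); /* older WebKit browsers */
  background: linear-gradient(#6db3f2, #1e69de);         /* standard syntax */
}
```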
The first part of this series of how-to posts will begin with a short discussion of the components I have used to build this site and my reason for using them. ⇒
By the fall of 2009, the web world had moved ahead quite rapidly. Photo-sharing sites were plentiful. And blogging had become about as common as breathing. ⇒
In the fall of 2007, I decided that I needed to make a little more updated version of my website. I had been reading up on the purported benefits of a nice clean HTML/CSS approach to laying out the website. The philosophy behind this approach is solid: it effectively allows you to (in theory) separate the website content from its appearance. All the content is supposed to go into HTML files which then contain a reference to an external CSS file that tells the browser how to display this content. Don’t like how it looks? Then just change the CSS file and leave your data alone. It all sounded good. ⇒
So I decided to suck it up and learn html and css and all the other things you need to roll-your-own. But it was like being the person at the rear of the car trying to push it out of the mud. The car wasn’t going anywhere, and I was getting very muddy. ⇒
The first two posts covered some basic considerations and options for creating a photography related website. Now I want to relate a very quick timeline of the wobbly path I have taken through the maze of website creation possibilities over the last six or seven years. ⇒
The previous post discussed what I view as the essential requirements of a photographer’s website. Now I will discuss some of the pros and cons of each approach. ⇒
This is part one of a series of posts I am going to make about the process of creating a website for displaying photographic work. I have had a live website for about five years, and the one you are currently viewing is about release level 3.1. ⇒