Wednesday, November 20, 2013

Google Glass Q & A

Note: updated 19 December 2013 to reflect changes with XE12: you can now wink to take photos.

Q: Are those Google Glasses?
A: For some reason, the plural sounds so much less cool. Unless you're making fun of me, it’s called Glass.

Q: Are those Google Goggles?
A: Goggles are a different Google product.

Q: What do you see right now?
A: Nothing. The display is usually off unless you're interacting with the device. The big activities that keep the display on are walking navigation and recording a video.

Q: Facial recognition blah blah blah?
A: Nope. Technically you could create a facial recognition app, but it certainly doesn't ship with one. I'm not aware of an app for it either. I’m guessing facial recognition apps would eat some serious battery power or bandwidth.

Q: Can the NSA see me?
A: Yes, it seems like the NSA is spying on a lot of people in the world, but probably not through Glass. I don't think I would have much battery life, storage, or data plan left if I were streaming audio or video to the NSA. I’m not sure how easy it is for the NSA to spy on the audio or video I intentionally record. Google is currently working to make NSA spying more difficult.

Q: Can you see through my clothes?
A: No, I can’t. Not unless you're wearing a cling-wrap kilt.

Q: What does it look like?
A: When the display is on, it feels a lot like looking at the rearview mirror in a car. It’s not blocking much of your vision, and you glance up to see it. A few people who have tried on my Glass have been confused by the optics of the display because they expected to be focusing on something centimeters from the eye. Both the size of the display and its apparent focus distance make it seem like looking at a largish TV from 8 feet or so.

Q: So if you look at something about 6 to 8 feet away, the display on Glass will appear in focus?
A: That’s about right. I think this is a Good Thing because I believe there is a lot less eye strain at that focus distance.

Q: OK Glass, take a picture!
A: Sorry, that trick only works if Glass is active and I’m on the home screen. If that were the case, shouting commands might have worked. Nice try though.

Q: How does it feel? Is it comfortable?
A: I rarely notice Glass when I'm not using it. When I first got it, I tended to get headaches if I wore it for more than a few hours without a break. I’m told that folks who start wearing prescription lenses can experience the same thing if they don't ease into it. Now I'm used to Glass, and it doesn't bother me. I sometimes forget I'm wearing it, or not wearing it.

Q: Is it augmented reality?
A: That depends on your definition of augmented reality. I think most folks associate AR with informational overlays on live video feeds. By that definition, AR involves looking through a device to see the world. You don’t look through Glass’s display to see the world. Glass feels a little more like closed captioning, or picture-in-picture for reality. Glass can provide information based on your location and other cues. I regularly use a Glass app called Field Trip, and it provides information about nearby landmarks as I travel.

Q: How long does the battery last?
A: In my experience, the battery life varies a lot depending on your usage. On my normal day, I have battery left when I get home. When I travel, I use it a lot, and I often have to charge it after 4 hours or so.

Q: How much do you wear it?
A: I wear it 80-90% of the time when I’m not home or at the office. Since I'm surrounded by computers and communications devices there, Glass seems redundant. I also don't wear Glass at Crossfit because I’m afraid of shorting it out with gallons of sweat, or smashing a kettlebell into it. I also don't wear it when I'm feeling extra introverted. Glass is a great way to meet strangers.

Q: What is the biggest change it has made in your life?
A: The biggest change is that a lot of strangers walk up and talk to me. The second biggest change is the ability to respond to text messages and email without touching anything. The ability to spontaneously take photos is a close third.

Q: When will it be released, how much will it cost, and will it do X?
A: I have no idea.

Q: Do you work for Google? Did they give it to you?
A: No, I work for Evernote. I paid for Glass out of my own pocket.

Q: Did you have to tweet to get it?
A: No, I'm part of a different group. I attended Google’s developer conference, Google I/O 2012. Anyone who attended that conference had an opportunity to sign up to purchase Glass.

Q: I thought it had lenses.
A: That’s not a question. Also, it does have two sets of lenses that snap in: a clear shield, and a polarized sun shield. With the polarized shades, Glass looks like an especially intense pair of sunglasses. Fewer folks notice that it is Google Glass.

Q: How do you charge Glass?
A: Glass has a micro-USB port located a bit in front of your right ear, facing down.

Q: Has Glass gotten you in trouble anywhere?
A: Not really. A bouncer at a club in San Francisco was scandalized when I told him that it could take photos. He implied that photography in a nightclub was a huge breach of etiquette. I offered to wear Glass around my neck, and he was OK with that. Another time, a man walked up to me and said I couldn't wear Glass in a pub. When I asked him who he was, he admitted that he was just a patron and that he was messing with me. We both laughed and chatted about Glass.

Q: Do you wear Glass in the restroom?
A: I remove Glass from my face and wear it backwards around my neck in the restroom. Ever since Nick Bilton’s strange blog post and NYT article about Glass in the restroom, I've tried to avoid accusations of urinal photography. We live in strange times, but it seems like good etiquette.

Q: Oh, does Glass take photos when you wink?
A: As of XE12 (released 17 December 2013), the second-generation Glass Explorer Editions can be configured to take a photo when the user winks the right eye. You can read more about the feature in the wink help page. You'll note that Google offers etiquette advice, and that it is considered an experimental feature; it is not enabled when Glass arrives from Google. There is also a piece of third-party software that uses the proximity sensor to detect a wink (I think) and trigger a photo. As convenient as that may be, I think it has a high creepy factor, so I haven't installed it. I think the misconception around wink photography started with Nick Bilton's article, which implies that wink photography is a standard feature of Glass when it was really a hack. I really wish he had asked someone before writing about it in the New York Times.

Q: How do I know you aren't taking a photo or recording this now?
A: Out of the box, a photo or video will activate the display, which you can see from both sides of the prism. But I suppose you don't know for sure; I might have hacked Glass, or I might be wearing a wire or a hidden-camera bow tie.

Q: People standing in front of you can see what you're looking at?
A: Yes. The display looks quite small from the other side, but you can see it. If you were really close, you could probably read it. You would probably know if I was using Glass rather than paying attention to you.

Q: Tell me something ironic.
A: Many people with smart phones try to take covert photos of me wearing Glass. 

Sunday, November 17, 2013

What is Glass?

Note: The last few paragraphs were scratched and replaced on 4 December 2013 to discuss the new GDK.

When Steve Jobs introduced the iPhone, he called it a telephone, an iPod, and an internet communicator. One of the most common questions I get when I wear Google Glass is "What is it?" I wish I had an answer as concise as Steve's.

Glass isn't a phone, or an iPod. I think it qualifies as an internet communicator. But more than that, I claim that it is a tool for simplifying and speeding up the common interactions you might have with a smartphone or computer. Perhaps it will inspire computer interactions that don't yet exist. Like most devices with Apps, what it does depends a lot on the software you use.

The Pieces

If you tear open a Glass device like these folks did, you will find a camera, a display, a bone conduction speaker, a touchpad, a shutter button, an accelerometer / gyroscope, WIFI and Bluetooth transceivers, and a CPU. It comes with a snap-in polarized sun shield too.

The display technology works by projecting an image into the prism which sits above the right eye. The images it creates are translucent; you can see right through them. The positioning of the display above the eye -- not in front of it -- means that you aren't trying to peer through it to see the world. The prism appears to have a photosensitive film on the side away from your eye to create a darker background for the display in bright light.

The bone conduction speaker is a tiny pill-shaped apparatus that touches the head near the right ear. It looks temptingly like a button. One of the curious properties of the bone conduction speaker is that in loud environments you can hear it better by plugging your ears. It also tickles just a little bit when it makes sound.

Using Glass

To begin interacting with Glass, you either need to tap the touchpad near your temple, or perform the Glass head flip. The head flip involves tilting your head up until the screen activates, a configurable behavior. With both the tap and the flip, the display shows the home screen and begins listening for the magic words: “OK Glass”.

If you say “OK Glass”, you can verbally select from a menu of commands: send a message, take a picture, record a video, get directions, make a call, start a video hangout, or take a note. Speak and Glass obeys. Some of the commands appear depending on what services you have connected, or how your phone is configured. The “Take a Note” command, for instance, can be handled by the Evernote Glass app, and it lets you dictate a note.

Using the touchpad, the same options are available, and you can also navigate the card-based interface of Glass. To the left of the home screen there is a collection of cards mostly related to the Google Now service. You might find a card with upcoming events from your Google Calendar, a card for the local weather, a card for stock prices, or a card offering driving times and directions to recently searched destinations or to calendar events. These cards and their contents are contextually sensitive, just like the Google Now cards on an Android device or in the Google Search app on iOS. They appear and vanish depending on what Google thinks is most useful. The card furthest to the left is the settings card, which shows the battery charge and allows the user to configure Glass.

To the right of the home screen, there is a row of cards in reverse-chronological order, starting with the most recent. These are cards that come from your own interactions with the device, from communications, or from third-party services. If you take a photo, you’ll find a card for it in the timeline. Did you search Google? You'll find a card for that. Text messages and emails too. The interface feels like a long strip of film that you can click through one frame at a time, like an old-fashioned slide show.

Typically, each card responds to a tap on the touchpad. Depending on the card, it will either show a menu or another collection of cards that are metaphorically stacked beneath it. Text messages are a good example of stacked cards. In the timeline, only the most recent text is visible. Tapping on that message reveals a card for each message in the conversation. Tapping on any one of those cards offers a menu: reply, read aloud, call, delete.

The image above shows a diagram of the Glass interface stitched together from actual Glass screenshots. Click or tap on it to see a larger version. The home screen is where you usually start an interaction, and it is one of the indications that Glass is ready for a verbal command. If you swipe forward on the touchpad, the next thing you would see is a text message, followed by a photo taken with Glass, and finally a search for "will it rain today."

The triangular dog-ear in the top right corner of the message card is a clue that this is a stack of cards. If you tap on the touchpad while the text message is visible, it will dive into that stack of cards: a list of messages in that conversation. Follow the yellow arrows above. If you tap on any of those message cards, you are offered a menu.

The vertical stacking of the interface is a useful way to think about the UI. Swiping down on the touchpad will return the user to the next level up. From any of the menu cards, you can swipe back to the messages list, and from there you can swipe back to the top level timeline. From there, an additional swipe down will turn off the screen.

Incoming Communications

If you get a new message or interaction from an App, Glass will chime. If you respond immediately by tapping or performing a head flip, you will be shown the card associated with the notification. Depending on the notification, there will often be an “OK Glass” cue on the card indicating that you can address the notification verbally. When I get a new email or text message, I can say “OK Glass, read aloud”. Glass will then read the contents of the message to me. When it’s finished, I can say “OK Glass, reply,” and then dictate a response.

One useful UX trick: Glass displays the text you dictate for a few seconds before performing an action. This gives you an opportunity to cancel a message in case there was a transcription error.

Photo and Video

In addition to the audio commands, you can take photos or a video by using a physical button on top of Glass. Photos and videos can be explicitly shared or pulled off of Glass using USB. In addition, once Glass is on a WIFI network and has a decent battery charge, it will automatically upload the media to a private Google+ album. You can choose to share or download the images from there.

The camera in Glass (at least in the first-generation Explorer Edition I have) is a wide-angle, fixed-focus camera. I don’t really think it compares to what you would find in an iPhone 5. However, Glass uses some computational photography techniques to create better photos than the hardware alone would normally produce.

Navigation

I've only used Glass for walking navigation, but it works really well. Walking navigation seems like a killer app to me. The map is continuously projected in the display, and is oriented in real time with your head motion. Since the map spins so that it orients where your head is aimed, there is no need to look at street signs. Just line up the arrow with the path and walk.

Using walking navigation on Glass, you look like a normal, purposeful human being. Compare that to folks trying to navigate with their smartphones: they walk with their heads down or scanning for street signs, and they walk ten feet in one direction before making a U-turn to go the right way. Glass is a nice improvement.

Apps

Glass offers two main paths for developing apps. The first is the Mirror API. To use the Mirror API, you don't write code that runs on Glass; instead, you write server code. Your server interacts with Google's servers, which then act as a proxy for a user’s Glass device. The server and the Mirror API exchange JSON and HTML representations of timeline items through RESTful endpoints.

Since the app runs on your server, you can use whatever technology you want to implement your side. Google has example code written in a variety of different technologies.
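To make that flow a little more concrete, here is a minimal sketch of the kind of server-side call involved: it pushes a simple text card into a user's timeline over the REST endpoint. I'm assuming you already have an OAuth access token for the user (the MIRROR_ACCESS_TOKEN variable below is just my placeholder), and the endpoint and "text" field come from the Mirror API documentation, so treat the details as illustrative rather than gospel.

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    // Sketch: insert a text card into a user's Glass timeline via the Mirror API.
    public class TimelineInsert {
        public static void main(String[] args) throws Exception {
            // Placeholder: an OAuth 2.0 access token your server obtained when the
            // user authorized your Glassware. How you obtain it is up to your stack.
            String accessToken = System.getenv("MIRROR_ACCESS_TOKEN");

            URL url = new URL("https://www.googleapis.com/mirror/v1/timeline");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Authorization", "Bearer " + accessToken);
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);

            // A timeline item is plain JSON; the "text" field becomes the card body.
            String item = "{\"text\": \"Hello from my server\"}";
            try (OutputStream out = conn.getOutputStream()) {
                out.write(item.getBytes(StandardCharsets.UTF_8));
            }

            System.out.println("Mirror API responded: " + conn.getResponseCode());
        }
    }

In a real Glassware you would probably use one of Google's client libraries rather than raw HTTP, but the shape of the interaction is the same: your server POSTs JSON, and Google relays the card to the user's Glass.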

Users enable an app that uses the Mirror API by authorizing it with a familiar Google authentication flow. If Google has approved an app, it can be switched on through the MyGlass Android app or the Glass dashboard.

The second path for creating Glassware is writing an Android app. You can build apps using the traditional Android development tools and load them onto Glass with a USB cable. Developers will want to heavily customize their apps for Glass, since there is no touch screen and most apps aren't quite ready for a 640 x 360 display. At the moment there is no simple distribution method for apps created this way; Glass doesn't yet come with the equivalent of the Play Store.

Update: Google has released a preview of what they're calling the Glass Development Kit (GDK). The GDK is an Android library that gives developers direct access to the Glass-specific elements of the device: the timeline, the cards, menus, and so on. It also enables Glass apps with real-time interaction.
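For a flavor of what GDK code looks like, here is a minimal sketch based on my reading of the GDK preview documentation: an ordinary Android Activity that displays a single card in the standard Glass look. The Card class and its methods are as I understand them from the preview docs, and they may well change while the GDK is still a preview.

    import android.app.Activity;
    import android.os.Bundle;
    import com.google.android.glass.app.Card;

    // A minimal GDK sketch (as I understand the preview docs): a plain Android
    // Activity that renders one Glass-styled card. Class and method names are
    // from the GDK preview and may change in later releases.
    public class HelloGlassActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            Card card = new Card(this);             // card with the Glass look and feel
            card.setText("Hello through the prism");
            card.setFootnote("my first GDK card");
            setContentView(card.toView());          // show it full-screen on Glass
        }
    }

Because this code runs on the device itself, it keeps working without a data connection and can react instantly to input and sensors, which is exactly what a Mirror API app can't do.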

Apps developed using the GDK are installed the same way the Mirror API apps are: through the MyGlass web interface or the MyGlass Android app. Flip the switch, and the package is pushed onto the device and installed. Glass requires an internet connection to retrieve the APK.

Along with the GDK, Google and several third-party companies released GDK-based apps. Google released a compass app. Word Lens released an impressive translation app that replaces text in a live video feed from the camera. And there are several other apps in categories like sports.

The GDK opens up a lot of new possibilities for Glass development. It offers more challenges too, since it takes careful work to get smooth performance and low energy usage from code run on the device itself. That's just the sort of thing I enjoy. I've already started having fun with the GDK.

Saturday, November 9, 2013

Sony NEX-6 Long Term Review

The time seems right for a long-term followup review of the Sony NEX-6*. Sony recently announced two new high-end cameras using the same lens mount as the NEX-6, but featuring larger full-frame sensors. The new cameras are called the Sony Alpha a7 and the Sony Alpha a7r, and they appear to be targeting the professional photographer. Both have large, high-resolution sensors mounted in durable, splash-proof bodies.
Colors of Burano
The Colors of Burano
I expect that the addition of these impressive-looking cameras on the high end of the spectrum will increase interest in the midrange E-Mount camera bodies. Why? I imagine that some folks will want to test out the NEX system with the expectation of later growing into the new full-frame cousins. Although the new a7 and a7r seem to have reasonable prices for their capabilities, they are still expensive, high end cameras.

The crop-frame NEX line of cameras also has a few minor advantages over the new a7 besides price. There are many more lenses on the market designed for the NEX than for the full-frame a7 and a7r. Sure, you can use a crop-frame E-mount lens (most of the lenses launched before the Alpha a7 and a7r qualify) with any E-mount camera, but those lenses won't let you take advantage of the full sensor area of an a7 or a7r. Depending on what kind of photography you enjoy, you might discover that the most suitable lens isn't designed to fill an entire full-frame sensor.
Venetian Cafe
Cafe in Venice
On the other hand, there is no reason you can’t use a full-frame lens on a crop-sensor NEX camera. I personally never owned a full-frame Canon camera even though most of my lenses were designed for the full frame. I tell anyone who asks to spend more on great lenses rather than great camera bodies. You will probably find that you use a given lens much longer than a given body. The technologies in camera sensors change much more rapidly than the technology in the lens: just ask the folks who are using ancient Pentax, Leica, and other manual-focus lenses on modern cameras. You don't have to buy the best body to enjoy and get value from a fantastic lens.

Read on to see how the NEX-6 has treated me over the past ten months. All of the photos you'll see in this review were made with the NEX-6. Click on them to get a larger view.

Smart Phone Integration

In my first review of the Sony NEX-6, I noted how terrible the PlayMemories Android and iOS (iPhone / iPad) apps for the camera were. Maybe Sony heard my whining, because several new revisions of the PlayMemories app have been released since then. Now I would say the apps are at least mediocre. That might sound bad, but it is a big upgrade over terrible. At least the apps work now. I can go to an event, take photos with my NEX, send them to my iPhone, iPad, or Android phone, edit them, and share them with just a few minutes of work. Like I mentioned in my last review, the camera makes fantastic photos. The extra steps involved in using the NEX instead of the iPhone 5’s built-in camera are often worth it.
Big Shade
Big Shade
So how does sharing to Android work? In one method, you first go to the photo you want to share, hit the menu button, select “Playback”, select “View on smart phone,” and then select “This image”. At this point you can open the PlayMemories app, enter the password for your camera (for the first use only), and then wait a few seconds for the phone to connect to the camera’s wifi network.

For the iPhone, you need to use the phone’s settings app to connect to the camera’s wifi network before launching the PlayMemories app. For both platforms, the process takes maybe thirty seconds if it goes well. The Android app has crashed for me several times. I have also had issues when more than one of my devices connects to the same NEX-6: if more than one smartphone or tablet is connected, the sharing functionality seems to fail. It took me a while to realize what was happening. Bummer!

If all goes well, you will see a thumbnail of the image you were viewing on the camera. Tap the thumbnail to get a larger preview, tap it again to select it, and then hit the copy button to copy the photo to your phone's gallery. The app has a share button, but in my limited testing, it doesn’t seem reliable. Unless you can persuade the share button to work, you’ll have to go hunting in the gallery to find the photo you just copied over.

Lens Adapters

Another new development since my initial review is the RJ Camera "Electronic Aperture Canon EOS (EF, EF-S) mount lens adapter to Sony E mount", which allows me to attach my big beautiful Canon glass to the NEX. The adapter I ordered allows the camera to control the aperture, capture the EXIF data from the lens (focal length, and possibly some other data), and to perform autofocus. I purchased this adapter so I could use the variety of Canon EF mount lenses I already own. Note that there are several different competing adapters that allow you to connect a Canon lens to an E-mount body.
Harvest Tools
Harvester
Sadly, the autofocus using this adapter seems limited to contrast-based methods (the usual NEX-6 phase detection seems to be disabled), and it feels really slow. Occasionally, the autofocus just gives up. The autofocus feature of the RJ Camera adapter doesn’t provide a great experience. It is useful in certain situations, but I mostly focus my Canon lenses manually, relying on focus peaking and the ability to zoom in on the live view.

I mentioned in my initial review that the Sigma 30mm lens only focuses using contrast detection, and the Sigma feels reasonably fast to focus. Don't count on getting similar autofocus performance from the RJ Camera adapter. With the adapter attached, the focus jumps through what seems like a series of coarse focus points before dialing in at a finer level, and sometimes the camera gives up before the process completes. I've mostly been using this functionality with my Canon 100mm IS L Macro lens; different lenses seem likely to have different results. The camera has a very difficult time focusing with my Canon 8-15mm L fisheye lens. Your mileage may vary, but it seems unlikely you will get the kind of focus speeds needed for action photography.

Luckily, the NEX-6 has two nice innovations to assist with manual focus. First, it offers focus peaking. Focus peaking highlights areas in the viewfinder which are in focus with a colored fringe. It seems to work by highlighting high-contrast transitions at the pixel level. If you don't have an extremely fast aperture, it is an easy way to verify focus — as long as there is an area of high contrast for it to identify.
Troll Door
Troll Door
The other tool the NEX-6 offers is the ability to zoom in on the live view from the viewfinder or rear LCD display. Press a button, zoom in, focus. This takes more time than focus peaking, but it does offer higher precision. Note that with the RJ Camera adapter, the in-lens image stabilization of my Canon lenses is disabled while using the zoomed-in live preview. Unless you have a very wide lens, you’ll find this behavior disappointing; IS would really help keep things steady while focusing my Canon EF 100mm f2.8L IS Macro lens.

My one final comment is that the tripod collar which came with the RJ Camera adapter didn’t have a long enough screw to securely tighten it to the adapter. I purchased a pile of washers and a longer screw at Home Depot for a few bucks. Also, once you remove the body from the adapter, nothing but friction holds the collar on. That’s OK, unless you get confused and try to hold the adapter by the tripod collar. Do that without the body attached, and you might drop your lens. Be careful!

Durability

In my original review of the NEX-6, I complained about losing the viewfinder eyecup while walking around the Magic Kingdom. I’m still a bit disappointed in that experience, but since switching my carrying system away from the Black Rapid, I haven’t lost an eyecup again. If you plan to hang your camera upside-down from the tripod mount, you might start losing $12 eye cups too.
Speaker Food
Elegant Food
In case you’re curious, I now use a PeakDesign Leash and a PeakDesign CapturePRO. The Leash is a very versatile shoulder strap that can be rapidly removed or reconfigured to become a tether that attaches to your belt. The Leash can also be made quite long, which lets you carry the camera’s weight across your shoulders rather than just around your neck like a tourist.

The Capture is a system that securely holds a camera to a belt or strap using a specialized plate (compatible with Arca-Swiss tripod heads). It’s a great way to completely remove the weight of a camera from the shoulders, and you can tighten it so that the camera doesn’t bounce around when you walk or run.

Like my Canons, my NEX-6 has had some rough treatment from me. It has been bumped around in bags with just a light neoprene padding. It has swung from my shoulders and been bumped into people. It has been drizzled on and had various tasty sauces spilled on it. And it has pretty much survived without complaint. In fact, there is very little visible evidence that it has had a tough life at all. The only exception was the dang viewfinder eyecup that I lost about ten months ago.
Daily Driver
Daily Driver
Occasionally, the kit lens will fail to register on the camera. It’s always been an easy fix though: turn off the camera, remove the lens, re-attach the lens. The poor plastic lens housing has been abused enough to have a bit of an excuse. The 16-50mm kit lens spends more time on my camera than any other lens I own. It isn’t the best glass in the world, but it sure is light and versatile. Adobe Lightroom does a fine job of correcting many of its flaws.

Battery Life

The battery life on the NEX-6 hasn’t improved, but I did buy an inexpensive set of two Wasabi batteries with a wall charger and even a cigarette lighter adapter. That means I have a total of three batteries for the NEX, and that I can charge them rapidly without worrying about the finicky USB charging on the camera body (see previous review). The three batteries have been more than plenty to get me through any day. When I travel for a three- or four-day weekend, I sometimes leave the charger at home. I can always use USB to charge the camera in an emergency.

If you’re in a situation where you need the camera ready to take photos instantly, DSLR style, you will probably burn through batteries more quickly than I usually do. If you plan to use it extensively throughout a day, I suggest having at least one extra battery on hand. Luckily, the Wasabi batteries are not too expensive.

The Keeper Rate

I’m still convinced that my NEX-6 has a much higher keeper rate than my Canon 7D, or my Canon 40D. That is, I feel that more of the photos I take with the NEX-6 are sharp. With the Canon 7D, I seemed to capture a certain percentage of my photos slightly blurry, usually due to camera shake. I’m almost convinced that the dang swinging mirror in the Canon is what ruined so many photos for me. I suppose that it’s also possible that the additional weight of the dSLR somehow contributed to shaky photos too.
Sunday Exercise
Sunday Exercise
Either way, I worry less about motion blur in my photos on the NEX-6. It still can happen, but I don’t feel the need to take 3 photos of every beautiful scene.

Conclusions

For the most part, I really like the Sony NEX-6. Yes, the apps for Android and iPhone still make me weep. I develop iOS apps for a living, so I might be more picky than the average user. That said, I feel much better mentioning the feature than I did in January. The apps work much better now, even if they still frustrate me.

Also, I sometimes miss the lightning-quick focus of the Canon 7D, especially when I’m manually focusing my old Canon Glass on the NEX-6. The battery life of the Canon was far better too. And I can’t complain about the easy access to the most commonly used settings through buttons on the camera body. Every time I have to navigate a menu to change a basic setting, I miss my 7D.
Crossing Dark Waters
Crossing Dark Waters - the Bay Bridge in San Francisco
I don’t miss the size and weight of a dSLR though. The automatic modes on my NEX-6 are far more advanced and far more useful than on my Canon 7D. The NEX does a great job of picking shutter speed and aperture without my help in most situations (as long as I'm not using an adapter), and that’s great.

Likewise, auto-ISO is perfectly acceptable on the NEX-6. In my opinion the camera makes the correct tradeoff in terms of shutter speed and ISO. And I rarely feel like I need to disable automatic ISO to get a shot — something I can’t say about my Canon 7D.

I also love the electronic viewfinder, which is usable even in the dark and allows you to zoom in before taking a photo. When I use an optical viewfinder, I feel like I’m using an antique. How will I know what a photo will look like without knowing how the sensor sees the world? I can’t believe I was ever concerned about the lack of an optical viewfinder.

Over the next ten years, I see no reason for the traditional dSLR to stick around. At the moment, they have a few advantages, but I think we will see the same capabilities in mirrorless cameras in a few years. Cameras with moving mirrors will seem just as quaint as those antique cameras with bellows and flash powder do today.


[Article updated November 10th to include a link to the Wasabi batteries]

*Moving Average Inc. is a participant in the Amazon Services LLC Associates Program, an affiliate advertising program designed to provide a means for sites to earn advertising fees by advertising and linking to amazon.com. Buying items through this link helps sustain my outrageous camera addiction and is much appreciated!