Thursday 10 November 2016

Ultimate Phone Photos: Part 2


In the second part of this series, Mark Pickavance looks at different photographic techniques and how to best achieve them using your phone

Last week, I covered generally taking better pictures and video with your phone. Building on that, I’ll be talking about special situations this week.

What’s really interesting about some of these methods is that, thanks to the software technology in the phone, some of them are actually easier to do on a phone than with an actual camera.

If you’ve ever wanted to move beyond the occasional family snap and selfie stage of phone photography, then you might want to consider trying some of these alternatives.


Time-Lapse


This is a type of photography where the phone really shines, though you’ll realistically have to invest in a tripod mount if you want the best outcomes.

Some phones, like the Huawei P9, have time-lapse built into their custom camera application, but on the vast majority you’ll need to install an app.

On my Nexus 5X, I use one called Time Lapse Video Recorder, which is available in both free and paid versions. There are others, and many people like one called Lapse It. This software lets you configure a capture rate – say, a frame every second – and then record either for a set amount of time or until the phone’s storage is full or the battery exhausted.
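The arithmetic behind a capture rate is worth understanding before you start, and can be sketched in a few lines of Python (purely illustrative – this isn’t taken from any of these apps, and the function name is my own):

```python
# Illustrative sketch: the capture interval and shoot duration determine how
# many frames you get, and the playback frame rate determines how long the
# finished clip runs.

def timelapse_length(capture_interval_s, shoot_duration_s, playback_fps=30):
    """Return (frames_captured, playback_seconds) for a time-lapse."""
    frames = shoot_duration_s // capture_interval_s
    return frames, frames / playback_fps

# One frame every second for an hour plays back as a two-minute clip at 30fps.
frames, seconds = timelapse_length(1, 3600)
# frames == 3600, seconds == 120.0
```

The same sums tell you how much storage and battery a long shoot will need, which is worth doing before committing the phone to an afternoon on a tripod.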

Obviously, hooking the phone up to a charger could solve that last problem, or alternatively you could use the charging port to add a memory stick for longer recordings.

The only issue I’ve had with Time Lapse Video Recorder is that on the Nexus 5X LG mounted the photo sensor the opposite way up to most phones, and as a result the video it makes is inverted. This can be fixed easily in Movie Maker 2 or similar, but it’s yet another job to do in post-production.

I’ve had some good results with this, and if you want to see one then check out youtu.be/fVrT6PrQ7bE.

However, there are a few things about shooting time-lapse that you need to be aware of, most notably to do with locking the exposure and white balance.

When you take individual frames on the phone, they are calibrated on a per-shot basis, and when the sun comes out from behind a cloud or goes behind one, the lighting situation can drastically alter.

If you don’t use a tool that can lock exposure and white balance, then in the video you might get frames that appear to flash, because they’re exposed differently from the ones before. The problem I ran into was that even if you lock exposure you can get some odd results, because it is locked at the point you start shooting, and that might be unsuitable for very dark or bright scenes later on. This happened when I was shooting a storm, which was dark to begin with and then brighter as it moved through.

If you don’t want to use special software, you can just shoot a long sequence using a tripod, and then on the PC remove frames to speed up the video. It’s more work, but it can be done on almost any phone with a camera.
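As a rough sketch of what that PC step does (the function here is my own illustration, not taken from any particular editor), keeping every Nth frame speeds playback up N times:

```python
# Illustrative sketch of the "remove frames on the PC" approach: keeping
# every Nth frame of a normally shot sequence makes playback N times faster.
# The list of numbers here simply stands in for a sequence of video frames.

def speed_up(frames, factor):
    """Keep every `factor`-th frame, making playback `factor` times faster."""
    return frames[::factor]

clip = list(range(300))    # 10 seconds of footage at 30fps
fast = speed_up(clip, 10)  # keeps 30 frames: plays in 1 second at 30fps
# len(fast) == 30
```

Most video editors expose this as a speed or retiming control, so you never manipulate frames by hand; this just shows why a 10x speed-up discards nine frames in ten.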

Slow-motion


Normally video is shot at 25 or 30fps, but many phones these days allow you to increase that to 120 or even 240fps. At 120fps, an event that took a single second at 30fps now takes four seconds in the video, and eight at 240fps.

That’s nothing compared with the high-speed cameras that shows like Mythbusters use, but equally you can get some terrific results if you pick the right subject.

The trade-off is usually that the video resolution is reduced, down to 720p at the highest speed, as the processor in the phone can only handle so much data within a given time. And because the video effectively expands time the real-time length of recording is correspondingly reduced.

As you’re also grabbing lots of frames per second, you need really good lighting.

On the default Android app, there is also a little post-capture fun to be had, because you don’t need the whole recording to play back in slow motion.

After you’ve shot the video, a slider appears that allows you to define the part that plays in slow motion, while the rest plays at normal speed.

These are things you can also do on the PC if you transfer the footage there, and you can use frame interpolation to stretch the sequence even further if the movement isn’t too extreme.
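For the curious, frame interpolation can be sketched in miniature. Real tools estimate motion between frames as well, but this purely illustrative blend (my own code, not any editor’s) shows the principle of synthesising an in-between frame:

```python
# Illustrative sketch of frame interpolation: a new frame is synthesised
# between each pair of real ones by blending pixel values 50/50, doubling
# the frame count. Each frame here is a flat list of pixel intensities.

def interpolate(frames):
    """Insert a 50/50 blend between each pair of consecutive frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append([(pa + pb) / 2 for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out

clip = [[0, 0], [10, 20], [20, 40]]
# interpolate(clip) == [[0, 0], [5.0, 10.0], [10, 20], [15.0, 30.0], [20, 40]]
```

This is also why the technique struggles with extreme movement: a plain blend between two very different frames produces a ghostly double image rather than a convincing in-between.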

This is a feature that Apple phones do well, I’m told. The iPhone 6S Plus can record 1080p video at 120fps and 720p at 240fps, and the new iPhone 7 is equally capable. Android phones that are good for this include the Nexus 6P, which can do 240fps at 720p, and the new Samsung Galaxy S7, which has the same capability.

There is also a wide selection of useful slow-motion focused capture apps available, should you really become interested in this technique.

Slow-motion is something many phones can now do, and it’s mostly a matter of realising when would be a good time to deploy it.

Macro Photography


Ever seen one of those close-up shots of an insect and fancied capturing something so amazing yourself? What those photos never reveal is the equipment, patience and capacity for failure you require to get them.

On a phone you are usually limited by how close the optics will focus, which is typically about 10cm away. That’s not close enough to make an insect fill the screen, unless it’s a scarily huge one.

The answer, as I discussed last week, is a macro lens that clips on to the phone allowing the optics to focus at a much closer range. These don’t have to cost much, though obviously the more you spend the better crafted lenses you’ll get, and the pictures will show the investment.

Because the amount of light bouncing off the subject is going to be relatively small, it’s also a good idea to use a tripod and to use a timer-fired shot.

Macro photography is challenging for another reason, and that’s the very narrow portion of the image that is in focus.

This is very apparent if you try to shoot a subject like a coin that isn’t exactly perpendicular to the lens. If you focus on the centre of the coin, you’ll notice that both the top and bottom edges are out of focus.

This is an effect that can nicely blur the background, but it makes shooting objects with any depth very difficult indeed. On a DSLR you can mess with the aperture and exposure times to increase (or decrease) the depth of field, but on the phone you’ll generally be limited to what the software decides.

Being so close to the subject can also reduce what natural light you’ve got, so shooting either outside or with light to help is also a good idea.

You can also run into difficulties depending on how the phone focuses, because not all phones focus through the lens, and those that have two lenses don’t work well with a clip-on lens.

If you can use one, then my best advice is to take plenty of shots, refocusing all the time, because when you examine them on a bigger display you might well find that not all of them are exactly how you imagined.

Panoramic


Many years ago I bought a panoramic disposable camera to go on holiday to an exotic location, and loved the wide scenic views that it captured.

These days most cameras have a panoramic mode or you can simply place the camera on a tripod, take multiple images and then get PC software to stitch them into a single shot once you’re back home.

If you’ve ever done this, and I have, the snag you can easily run into is that while it seemed your shots were all at the same level, humans aren’t natural spirit levels.

Even with a tripod it is possible to get a slope as you pan the camera, and if you forget to lock the exposure settings and the sun comes out as you do it that can also mess up the whole exercise.

It is therefore not a surprise that I’ve created much more successful panoramic images using the phone, where software processing is on hand to dynamically help you line up the shots and take just the right number.

On Android phones, depending on whether they use the default camera app or not, there are two ways to create a panorama.

Some phones, like the aforementioned Huawei P9, use a ‘sweep’ method, where you initiate a photo and then sweep the view from left to right. The problem with this technique is that it assumes you know exactly how fast to turn, and where the centre of the image will be when you’re finished.

For lots of reasons I therefore prefer the standard Google camera app method where you start with the very centre of the view and then can expand that shot sideways using target markers that the software provides you. Using this mode you can make a photo as wide as you like, even going a full 360 degrees around if you wish.

Using this I’ve had much better results, and this idea has been expanded in the latest software to create sphere shots which work brilliantly with a Google Cardboard if you want to place someone inside the viewpoint using VR.

When taking panoramic images using the phone there are a few things that you need to consider for achieving the best results.

Avoid situations where things are moving across the frame, because when you take the multiple images, they’ll be clipped at the boundaries between the originating shots. People moving across an open space or boats on a river are equally problematic.

Keep things distant. When objects are very close in a panoramic they can easily become distorted. Ideally objects should be 20ft or more away from the camera or odd things can happen when you see the finished capture. For this reason indoor shooting is generally a bad plan.

Use a tripod if you can. While it is possible to do this by hand, the end result is often better with a tripod.

Don’t use landscape mode. While the software generally lets you shoot in landscape, don’t do it. If you use portrait (upright) shooting the final panorama will have many more pixels vertically, making for a higher quality shot.

Rotate the phone, not yourself. This might seem obvious, but it is easier to accurately rotate the phone than you.

Try different software. While I get on well with the default app, there are plenty of alternatives, and one might suit you better.

Think about the goal. If you’ve a purpose in mind for the image, think about that when you create the panorama. Where Facebook and Flickr are fine at presenting them, for obvious reasons Instagram isn’t. Equally, if you want to print them out, you’ll need either a printer that can take paper rolls or a professional service with this capability.

Lightning


I’ve been trying to record a lightning strike using my DSLR for years without any success. Much of the problem is that you never generally know exactly when one is about to occur, and the exact direction can be largely guesswork too.

Location is also a significant factor, as being low down can obscure the best strikes behind nearby buildings, and also mask when an approaching storm is about to arrive.

The best place to shoot, without doubt, is high up in a tall building with an unobscured view of the storm as it moves in, and to help you assess the direction and likelihood of strikes, I use an application called Weather Radar by Netweather.tv. This not only gives me the precipitation map in five-minute snapshots, it also shows where strikes have occurred in the last hour. From this you can see the direction of movement, and if it is coming your way you should be able to work out roughly when it’s going to reach your location.

With the event predicted it’s time to solve the not insignificant photography problem, which boils down to taking a photo of something that’s already happened.

The DSLR approach to this is to reduce the amount of light entering the camera dramatically and then to do a long exposure where a release cable is used to complete the exposure after the flash.

As that statement implies, there are so many things that can go wrong it’s hard to know where to start, and very often the successful captures you do get end up with either multiple strikes overlaid on them or exposure issues.

On the phone you’ve got two options, both of them significantly easier than the DSLR method. It was by using one of these that I actually managed to capture my first lightning strike last summer.

My method was to use an app called Lightning Camera v2.1 by Pluto Applications. This is a relatively simple tool that creates a buffer of captured frames and it continually adds and deletes images from the buffer while you are shooting. The beauty of this solution is that you can press to stop the recording after a lightning flash, and then look through the frames to find the ones that contain the strike.
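To illustrate how a rolling buffer of that sort might work (this is my own sketch, not the app’s actual code), a fixed-size queue keeps only the most recent frames, with old ones dropping off as new ones arrive:

```python
# Illustrative sketch of a rolling capture buffer: new frames push the
# oldest out, so when you stop recording just after a flash, the last few
# seconds of frames are still available to search through.
from collections import deque

BUFFER_FRAMES = 90                    # e.g. roughly 3 seconds at 30fps

buffer = deque(maxlen=BUFFER_FRAMES)  # a full deque discards from the front

for frame_number in range(1000):      # frame numbers stand in for images
    buffer.append(frame_number)

# After stopping, only the 90 most recent frames remain to be reviewed.
# list(buffer)[0] == 910 and list(buffer)[-1] == 999
```

The design choice is the key to the whole trick: because the buffer is always a few seconds behind the present, the photo of the strike already exists by the time you react to the flash.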

The downside is that the pictures are only at screen resolution, not the best your phone can take. And you need to sit with the phone on a tripod and wait for the flash before stopping the recording.

An alternative technique that might get you better results is to just record 4K video, if your phone can do that, and then take that into movie editing software on the PC and pull out the relevant frames.

Video isn’t recorded at the same quality as a still image, but you don’t need to monitor the capture in the way other techniques require. And you get a video out of the exercise for good measure. In short, you can get good lightning shots, even with a phone.


Night Photography


Low light conditions provide the greatest challenge to a phone with a very small lens through which to capture all the light it needs for the image or video.

We’ve all seen enough grainy, out-of-focus shots that people took on a night out to realise that some phones aren’t good in these situations. However, some are better than others, and a few, like the Huawei P9, actually have modes specifically for recording stars, light painting or long exposure shots.

Whatever it is you want to do, the first rule of proper night photography is to use a tripod, because these exposures take long enough that hand movement will be an issue.

For exactly the same reason, a timer shutter release is imperative, because poking the screen with your finger will almost certainly introduce movement. You can use a Bluetooth clicker, but all phones have a timer, so use that instead.

The other problem I ran into at night is that the phone can find it hard to focus, and the auto-focus will repeatedly hunt for the right place. If you’ve got a phone on which you can set focus to ‘infinity’, that helps, but it can be annoying.

Exposure is the critical aspect, and often the camera will work against you if you don’t have specific night modes available to choose.

That’s because the phone will try to expose for the darkest part of the image, and that will make brighter areas far too light. Phones that have a manual mode, like the Huawei P9, are a good answer to that snag.

To help those phones that don’t have special night modes, there are some excellent apps available, like Camera FV-5 Lite on the Google Play store, and LongExpo on iTunes for the Apple iPhone.

Oddities


There are a few types of photos that I’ve not mentioned, mostly because they’re a bit specialist. In this grouping I’d put fish-eye photos, because as interesting as they can be, they have very limited usage.

Most clip-on lens kits come with a fish-eye lens that allows you to effectively take a circular image with a very wide field of view.

Even with a good quality lens there are lots of unwanted optical artefacts and distortions in these images, and that’s something you learn to live with if you like using this effect.

Another feature I’ve seen on a few phones is one that attempts to give you control over the focus, so that you can blur the background or foreground for effect. I’m not a huge fan of this, because what you generally do is take an image and then tilt the camera so extra picture data can be captured to process. The results are highly unpredictable, so you’d be better off just taking a good shot and then processing it in a picture editor later.

Another feature I’ve not yet experimented with is light painting. This is a method where a very long exposure is created for a night shoot, and you then introduce the light using torches, small lasers or even hand-held fireworks.

The trick here is that you have an allotted amount of time to paint with the light, and you use audio cues from the phone as to when this starts and when it is about to end. Obviously this requires the phone to be on a tripod, and lots of experimentation is required to understand how best to work in the dark.

Given some of the results I’ve seen this is certainly something I’ll be trying, as there are a few apps that provide this functionality if the phone doesn’t inherently offer it.

Final Thoughts


The ramp up in quality of phone cameras is really inspiring many people to take up photography where they wouldn’t have previously. That’s marvellous, and hopefully even those that don’t want to go that far will just take better quality pictures to share with others.

A smartphone with a decent camera can offer a very wide range of possibilities, so when you take yours out next to snap something take a moment to consider how you can best use it. And you might turn the mundane into something much more interesting.

Happy snapping!


Is HDR A Good Or Bad Thing?


Lots of phones these days have the capability to take HDR images, and in many respects this is generally a good thing. What should happen with HDR is that more than one image is taken (at least two), and from those usually differently exposed shots a unified image is created that merges the best bits of both. Or that’s the theory.
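To make the merging idea concrete, here’s a toy sketch (my own illustration, not how any phone actually implements HDR) that picks, per pixel, whichever of two exposures sits closer to mid-grey – so blown highlights come from the darker shot and crushed shadows from the brighter one:

```python
# Illustrative toy sketch of exposure merging: for each pixel, favour the
# exposure whose 8-bit value is nearer mid-grey (128), i.e. better exposed.
# Real HDR pipelines weight and blend far more carefully than this.

MID = 128  # mid-grey for 8-bit pixel values

def merge_hdr(dark, bright):
    """Pick, per pixel, whichever exposure is nearer mid-grey."""
    return [d if abs(d - MID) <= abs(b - MID) else b
            for d, b in zip(dark, bright)]

dark   = [10, 120, 200]   # underexposed frame: shadows crushed, sky kept
bright = [90, 255, 255]   # overexposed frame: shadows open, sky blown
# merge_hdr(dark, bright) == [90, 120, 200]
```

Even this crude version shows why alignment matters: the two lists are combined pixel by pixel, so any movement between the exposures pairs up the wrong pixels and smears the result.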

However, some of the results I’ve seen of HDR on phones are so poor that I’m inclined to believe that only a single image was taken and then processed to achieve an HDR effect.

The other issue is that if more than one image is taken then any movement between the two will cause blurriness on the image, so it isn’t suitable for shooting anything that isn’t relatively static. Water in particular can produce very strange results if shot in HDR mode. Obviously, using a tripod to hold the phone still while taking shots will help reduce movement from the process, and produce better results.

I’ve noticed that the best results are often created by shooting with HDR mode disabled, and then processing the image back on the PC to bring out the extra detail. It’s a shame that phones don’t offer RAW capture mode, because the HDR results using that data could be really impressive.

What you need to accept with HDR is that it isn’t always the best solution for any given situation, and if the first results don’t look good then you need to disable it.

It is also worth considering that processing the image away from the phone can give you much better results and control.

Here is an example of an image that was captured on the Nexus 5X and then processed on the PC to get more detail and colour out of it.